Joelsson, Daniel; Moravec, Phil; Troutman, Matthew; Pigeon, Joseph; DePhillips, Pete
2008-08-20
Transferring manual ELISAs to automated platforms requires optimizing the assays for each particular robotic platform. These optimization experiments are often time-consuming and difficult to perform using a traditional one-factor-at-a-time strategy. In this manuscript we describe the development of an automated process using statistical design of experiments (DOE) to quickly optimize immunoassays for precision and robustness on the Tecan EVO liquid handler. By using fractional factorials and a split-plot design, five incubation time variables and four reagent concentration variables can be optimized in a short period of time.
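As an illustration of the design machinery the abstract invokes, the sketch below builds a two-level fractional factorial (2^(5-2)) screening design for five time variables in eight runs, with two columns aliased to interaction products. Factor names and generator choices are hypothetical, not taken from the paper.

```python
# Minimal sketch: a 2^(5-2) fractional factorial screening design
# (5 two-level factors in 8 runs) of the kind used to optimize assay
# variables such as incubation times. Factor names are illustrative.
from itertools import product

base = ["coat_time", "block_time", "primary_time"]   # full 2^3 factorial
generators = {"conjugate_time": (0, 1),              # D = A*B (aliased)
              "substrate_time": (0, 2)}              # E = A*C (aliased)

runs = []
for levels in product((-1, +1), repeat=len(base)):
    run = dict(zip(base, levels))
    for name, parents in generators.items():
        value = 1
        for p in parents:
            value *= levels[p]                       # product of parent columns
        run[name] = value
    runs.append(run)

for r in runs:
    print(r)   # -1 = low level, +1 = high level for each incubation time
```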
Automated Inspection And Precise Grinding Of Gears
NASA Technical Reports Server (NTRS)
Frint, Harold; Glasow, Warren
1995-01-01
Method of precise grinding of spiral bevel gears involves automated inspection of gear-tooth surfaces followed by adjustments of machine-tool settings to minimize differences between actual and nominal surfaces. Similar to method described in "Computerized Inspection of Gear-Tooth Surfaces" (LEW-15736). Yields gears of higher quality, with significant reduction in manufacturing and inspection time.
NASA Astrophysics Data System (ADS)
Iles, E. J.; McCallum, L.; Lovell, J. E. J.; McCallum, J. N.
2018-02-01
As we move into the next era of geodetic VLBI, the scheduling process is one focus for improvement in terms of increased flexibility and the ability to react to changing conditions. A range of simulations were conducted to ascertain the impact of scheduling on geodetic results such as Earth Orientation Parameters (EOPs) and station coordinates. The potential capabilities of new automated scheduling modes were also simulated, using the so-called 'dynamic scheduling' technique. The primary aim was to improve efficiency in both cost and time without losing geodetic precision, particularly to maximise the use of the Australian AuScope VLBI array. We show that short breaks in observation will not significantly degrade the results of a typical 24 h experiment, whereas simply shortening observing time degrades precision exponentially. We also confirm the new automated, dynamic scheduling mode is capable of producing the same standard of result as a traditional schedule, with close to real-time flexibility. Further, it is possible to use the dynamic scheduler to augment the 3-station Australian AuScope array and thereby attain EOPs of the current global precision with only intermittent contributions from 2 additional stations. We thus confirm that automated, dynamic scheduling bears great potential for flexibility and automation in line with aims for future continuous VLBI operations.
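A minimal sketch of the idea behind a dynamic scheduler (not the AuScope software): rather than fixing a 24 h schedule in advance, the scheduler repeatedly picks the next scan from currently visible sources to maximize a quality metric such as sky coverage. The source names, visibility cut, and scoring function here are all hypothetical.

```python
# Illustrative greedy "dynamic scheduling" loop: choose each next scan
# on the fly from sources currently above the elevation cut, favoring
# directions far from those already observed (a crude coverage proxy).
import math
import random

sources = [{"name": f"Q{i}", "az": random.uniform(0, 360),
            "el": random.uniform(10, 80)} for i in range(50)]

def coverage_gain(candidate, observed):
    """Score a candidate by its angular distance to already-observed
    directions -- a stand-in for a real sky-coverage metric."""
    if not observed:
        return 1.0
    return min(math.hypot(candidate["az"] - o["az"],
                          candidate["el"] - o["el"]) for o in observed)

observed = []
for scan in range(10):                    # schedule ten scans on the fly
    visible = [s for s in sources
               if s["el"] > 15.0 and s not in observed]
    best = max(visible, key=lambda s: coverage_gain(s, observed))
    observed.append(best)
    print(f"scan {scan}: {best['name']}")
```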
Precision Departure Release Capability (PDRC) Overview and Results: NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Tom
2013-01-01
NASA researchers developed the Precision Departure Release Capability (PDRC) concept to improve the tactical departure scheduling process. The PDRC system comprises: 1) a surface automation system that computes ready time predictions and departure runway assignments, 2) an en route scheduling automation tool that uses this information to estimate ascent trajectories to the merge point and computes release times, and 3) an interface that provides two-way communication between the two systems. To minimize technology transfer issues and facilitate its adoption by Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs), NASA developed the PDRC prototype using the Surface Decision Support System (SDSS) as the Tower surface automation tool, a research version of the FAA TMA (RTMA) as the en route automation tool, and a digital interface between the two DSTs to facilitate coordination.
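To make the three-component architecture concrete, here is a hedged sketch of the two-way exchange it describes: the surface tool publishes a ready-time prediction and the en route tool returns a release time backed out from a merge-point slot. The message fields and helper function are illustrative assumptions, not the actual SDSS/RTMA interface.

```python
# Sketch of the PDRC-style coordination messages (field names assumed).
from dataclasses import dataclass

@dataclass
class ReadyTimePrediction:          # surface automation -> en route tool
    flight_id: str
    runway: str
    ready_time_utc: float           # predicted wheels-off readiness (s)

@dataclass
class ReleaseTime:                  # en route tool -> surface automation
    flight_id: str
    release_time_utc: float         # time that meters the merge point
    window_s: float                 # tolerance around the release time

def compute_release(pred: ReadyTimePrediction,
                    merge_slot_utc: float,
                    climb_time_s: float) -> ReleaseTime:
    """Back out a release time from a scheduled merge-point slot by
    subtracting the estimated ascent time to the merge point."""
    return ReleaseTime(pred.flight_id, merge_slot_utc - climb_time_s, 60.0)

msg = ReadyTimePrediction("AAL123", "17R", 1_700_000_000.0)
print(compute_release(msg, merge_slot_utc=1_700_001_200.0, climb_time_s=900.0))
```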
Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin
2016-10-01
Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the temperature, pressure, and volume conditions required in typical element- and structure-specific measurements, such automated platforms are not suitable for these analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaptation and high performance flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality was confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise than the manual procedure, and as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.
[Automated analyzer of enzyme immunoassay].
Osawa, S
1995-09-01
Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, the detection method, the number of tests per unit time, and the analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeled antibodies and enzymes, detection methods, number of tests per unit time, and analytical time and speed per test.
Automation of Precise Time Reference Stations (PTRS)
NASA Astrophysics Data System (ADS)
Wheeler, P. J.
1985-04-01
The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile mini-computer controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art standard industrial high-reliability modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation, and printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas, and Florida.
An automated field phenotyping pipeline for application in grapevine research.
Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard
2015-02-26
Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.
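As a rough illustration of what such an automated berry-analysis step can look like (this is not the BIVcolor implementation), the sketch below segments dark berries by HSV color thresholding and reports per-berry pixel area as a size proxy; the file name and threshold values are assumptions that would need tuning.

```python
# Minimal color-threshold berry segmentation sketch (OpenCV).
import cv2
import numpy as np

img = cv2.imread("vine_row.jpg")                 # hypothetical field image
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

lower = np.array([100, 40, 20])                  # assumed blue-black berry hue
upper = np.array([140, 255, 160])
mask = cv2.inRange(hsv, lower, upper)            # binary berry mask

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
berries = [c for c in contours if cv2.contourArea(c) > 50]   # drop speckle
areas = [cv2.contourArea(c) for c in berries]
print(f"{len(berries)} berries, median area {np.median(areas):.0f} px")
```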
Drilling Precise Orifices and Slots
NASA Technical Reports Server (NTRS)
Richards, C. W.; Seidler, J. E.
1983-01-01
Reaction control thrustor injector requires precisely machined orifices and slots. Tooling setup consists of rotary table, numerical control system and torque sensitive drill press. Components used to drill oxidizer orifices. Electric discharge machine drills fuel-feed orifices. Device automates production of identical parts so several are completed in less time than previously.
Automated Detection of HONcode Website Conformity Compared to Manual Detection: An Evaluation.
Boyer, Célia; Dolamic, Ljiljana
2015-06-02
To earn HONcode certification, a website must conform to the 8 principles of the HONcode of Conduct. In the current manual process of certification, a HONcode expert assesses the candidate website using precise guidelines for each principle. In the scope of the European project KHRESMOI, the Health on the Net (HON) Foundation has developed an automated system to assist in detecting a website's HONcode conformity. Automated assistance in conducting HONcode reviews can expedite the current time-consuming tasks of HONcode certification and ongoing surveillance. Additionally, an automated tool used as a plugin to a general search engine might help to detect health websites that respect HONcode principles but have not yet been certified. The goal of this study was to determine whether the automated system is capable of performing as well as human experts for the task of identifying HONcode principles on health websites. Using manual evaluation by HONcode senior experts as a baseline, this study compared the capability of the automated HONcode detection system to that of the HONcode senior experts. A set of 27 health-related websites was manually assessed for compliance with each of the 8 HONcode principles by senior HONcode experts. The same set of websites was processed by the automated system for HONcode compliance detection based on supervised machine learning. The results obtained by these two methods were then compared. For the privacy criterion, the automated system obtained the same results as the human expert for 17 of 27 sites (14 true positives and 3 true negatives) without noise (0 false positives). The remaining 10 false negative instances for the privacy criterion represented tolerable behavior because it is important that all automatically detected principle conformities are accurate (ie, specificity [100%] is preferred over sensitivity [58%] for the privacy criterion). In addition, the automated system had precision of at least 75%, with a recall of more than 50%, for contact details (100% precision, 69% recall), authority (85% precision, 52% recall), and reference (75% precision, 56% recall). The results also revealed issues for some criteria such as date. Changing the "document" definition (ie, using the sentence instead of the whole document as a unit of classification) within the automated system resolved some but not all of them. Study results indicate concordance between automated and expert manual compliance detection for authority, privacy, reference, and contact details. Results also indicate that using the same general parameters for automated detection of each criterion produces suboptimal results. Future work to configure optimal system parameters for each HONcode principle would improve results. The potential utility of integrating automated detection of HONcode conformity into future search engines is also discussed.
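The headline numbers above follow from standard precision/recall arithmetic; the sketch below reproduces the privacy-criterion figures from the counts reported in the abstract (14 true positives, 0 false positives, 10 false negatives).

```python
# Precision/recall arithmetic for per-principle conformity detection.
def precision_recall(tp: int, fp: int, fn: int) -> tuple[float, float]:
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Privacy criterion from the abstract: 14 TP, 0 FP, 10 FN over 27 sites.
p, r = precision_recall(tp=14, fp=0, fn=10)
print(f"privacy: precision {p:.0%}, recall {r:.0%}")   # 100%, 58%
```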
Precision Relative Positioning for Automated Aerial Refueling from a Stereo Imaging System
2015-03-01
Thesis by Kyle P. Werner, 2Lt, USAF (AFIT-ENG-MS-15-M-048), presented to the Faculty, Department of Electrical and Computer Engineering, Graduate School, Air Force Institute of Technology. Approved for public release; distribution unlimited.
NASA Technical Reports Server (NTRS)
Swenson, Harry N.; Zelenka, Richard E.; Dearing, Munro G.; Hardy, Gordon H.; Clark, Raymond; Davis, Tom; Amatrudo, Gary; Zirkler, Andre
1994-01-01
NASA and the U.S. Army have designed, developed, and flight evaluated a Computer Aiding for Low Altitude Helicopter Flight (CALAHF) guidance system. This system provides guidance to the pilot for near terrain covert helicopter operations. It automates the processing of precision navigation information, helicopter mission requirements, and terrain flight guidance. The automation is presented to the pilot through symbology on a helmet-mounted display. The symbology is a 'pilot-centered' design which preserves pilot flexibility and authority over the CALAHF system's automation. An extensive flight evaluation of the system has been conducted using the U.S. Army's NUH-60 STAR (Systems Testbed for Avionics Research) research helicopter. The evaluations were flown over a multiwaypoint helicopter mission in rugged mountainous terrain, at terrain clearance altitudes from 300 to 125 ft and airspeeds from 40 to 110 knots. The results of these evaluations showed that the pilots could precisely follow the automation symbology while maintaining a high degree of situational awareness.
Ma, Junlong; Wang, Chengbin; Yue, Jiaxin; Li, Mianyang; Zhang, Hongrui; Ma, Xiaojing; Li, Xincui; Xue, Dandan; Qing, Xiaoyan; Wang, Shengjiang; Xiang, Daijun; Cong, Yulong
2013-01-01
Several automated urine sediment analyzers have been introduced to clinical laboratories. Automated microscopic pattern recognition is a new technique for urine particle analysis. We evaluated the analytical and diagnostic performance of the UriSed automated microscopic analyzer and compared it with manual microscopy for urine sediment analysis. Precision, linearity, carry-over, and method comparison were carried out. A total of 600 urine samples sent for urinalysis were assessed using the UriSed automated microscopic analyzer and manual microscopy. Within-run and between-run precision of the UriSed for red blood cells (RBC) and white blood cells (WBC) was acceptable at all levels (CV < 20%). Within-run and between-run precision of the UriSed for casts, squamous epithelial cells (EPI), and bacteria (BAC) was good at the middle and high levels (CV < 20%). The linearity analysis revealed substantial agreement between the measured value and the theoretical value of the UriSed for RBC, WBC, casts, EPI, and BAC (r > 0.95). There was no carry-over. RBC, WBC, and squamous epithelial cells showed sensitivities and specificities of more than 80% in this study. There is substantial agreement between the UriSed automated microscopic analyzer and manual microscopy. The UriSed also provides a rapid turnaround time.
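For readers unfamiliar with the acceptance rule used above, this sketch computes a within-run coefficient of variation from replicate counts and applies the CV < 20% criterion; the replicate values are invented for illustration.

```python
# Within-run CV from replicate particle counts, with the <20% rule.
from statistics import mean, stdev

def within_run_cv(replicates: list[float]) -> float:
    """Coefficient of variation in percent."""
    return 100.0 * stdev(replicates) / mean(replicates)

rbc_counts = [52, 49, 55, 51, 50, 54, 48, 53, 52, 50]  # counts/uL, one run
cv = within_run_cv(rbc_counts)
print(f"RBC within-run CV = {cv:.1f}% -> {'pass' if cv < 20 else 'fail'}")
```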
2010-01-01
Background Cell motility is a critical parameter in many physiological as well as pathophysiological processes. In time-lapse video microscopy, manual cell tracking remains the most common method of analyzing migratory behavior of cell populations. In addition to being labor-intensive, this method is susceptible to user-dependent errors regarding the selection of "representative" subsets of cells and manual determination of precise cell positions. Results We have quantitatively analyzed these error sources, demonstrating that manual cell tracking of pancreatic cancer cells led to miscalculations of migration rates of up to 410%. In order to provide for objective measurements of cell migration rates, we have employed multi-target tracking technologies commonly used in radar applications to develop a fully automated cell identification and tracking system suitable for high-throughput screening of video sequences of unstained living cells. Conclusion We demonstrate that our automated multi-target tracking system identifies cell objects, follows individual cells and computes migration rates with high precision, clearly outperforming manual procedures. PMID:20377897
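One core ingredient of the radar-style multi-target tracking mentioned above is globally optimal assignment of per-frame detections to existing tracks. The sketch below shows that step with the Hungarian algorithm on a Euclidean-distance cost matrix; the positions are synthetic, and a full tracker would add gating and track birth/death handling.

```python
# Frame-to-frame track/detection assignment via the Hungarian algorithm.
import numpy as np
from scipy.optimize import linear_sum_assignment

tracks = np.array([[10.0, 12.0], [40.0, 41.0], [70.0, 15.0]])  # last positions
detections = np.array([[11.5, 13.0], [69.0, 16.0], [41.0, 40.0]])

# Pairwise Euclidean distances: cost[i, j] = |track_i - detection_j|.
cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
row, col = linear_sum_assignment(cost)          # minimizes total distance
for t, d in zip(row, col):
    print(f"track {t} -> detection {d} (cost {cost[t, d]:.2f})")
```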
Assessment of an automated capillary system for Plasmodium vivax microsatellite genotyping.
Manrique, Paulo; Hoshi, Mari; Fasabi, Manuel; Nolasco, Oscar; Yori, Pablo; Calderón, Martiza; Gilman, Robert H; Kosek, Margaret N; Vinetz, Joseph M; Gamboa, Dionicia
2015-08-21
Several platforms have been used to generate the primary data for microsatellite analysis of malaria parasite genotypes. Each has relative advantages, but all share the limitation of being time- and cost-intensive. A commercially available automated capillary gel cartridge system was assessed in the microsatellite analysis of Plasmodium vivax diversity in the Peruvian Amazon. The reproducibility and accuracy of a commercially available automated capillary system, QIAxcel, was assessed using a sequenced PCR product of 227 base pairs. This product was measured 42 times; then 27 P. vivax samples from Peruvian Amazon subjects were analyzed with this instrument using five informative microsatellites. Results from the QIAxcel system were compared with those from a Sanger-type sequencing machine, the ABI PRISM® 3100 Genetic Analyzer. Significant differences were seen between the sequenced amplicons and the results from the QIAxcel instrument. Different runs, plates, and cartridges yielded significantly different results. Additionally, allele size decreased by 0.045 bp with each run, or roughly 1 bp every three plates. QIAxcel results thus differed from those obtained by the ABI PRISM, and too many (i.e., inaccurate) alleles per locus were also seen with the automated instrument. While P. vivax diversity could generally be estimated using an automated capillary gel cartridge system, the data demonstrate that this system is not sufficiently precise for reliably identifying parasite strains via microsatellite analysis. This conclusion, reached after systematic analysis, was due both to inadequate precision and to poor reproducibility in measuring PCR product size.
Johnson-Chavarria, Eric M.; Agrawal, Utsav; Tanyeri, Melikhan; Kuhlman, Thomas E.
2014-01-01
We report an automated microfluidic-based platform for single cell analysis that allows for cell culture in free solution with the ability to control the cell growth environment. Using this approach, cells are confined by the sole action of gentle fluid flow, thereby enabling non-perturbative analysis of cell growth away from solid boundaries. In addition, the single cell microbioreactor allows for precise and time-dependent control over cell culture media, with the combined ability to observe the dynamics of non-adherent cells over long time scales. As a proof-of-principle demonstration, we used the platform to observe dynamic cell growth, gene expression, and intracellular diffusion of repressor proteins while precisely tuning the cell growth environment. Overall, this microfluidic approach enables the direct observation of cellular dynamics with exquisite control over environmental conditions, which will be useful for quantifying the behaviour of single cells in well-defined media. PMID:24836754
Development and application of an automated precision solar radiometer
NASA Astrophysics Data System (ADS)
Qiu, Gang-gang; Li, Xin; Zhang, Quan; Zheng, Xiao-bing; Yan, Jing
2016-10-01
Automated field vicarious calibration is becoming a growing trend for satellite remote sensors, requiring a solar radiometer that can automatically acquire reliable data over long periods, whatever the weather conditions, and transfer the measurement data to the user's office. An automated precision solar radiometer has been developed. It is used to measure the solar spectral irradiance received at the Earth's surface. The instrument consists of 8 parallel, separate silicon-photodiode-based channels with narrow band-pass filters spanning the visible to near-IR regions. Each channel has a 2.0° full-angle Field of View (FOV). The detectors and filters are temperature stabilized using a Thermal Energy Converter at 30 +/- 0.2 °C. The instrument is pointed toward the sun via an auto-tracking system that actively tracks the sun to within +/-0.1°. It collects data automatically and communicates with the user terminal through BDS (China's BeiDou Navigation Satellite System), while recording data redundantly in internal memory, including working state and errors. The solar radiometer is automated in the sense that it requires no supervision throughout the whole working process. It calculates start and stop times every day to match sunrise and sunset, and stops working when precipitation begins. Calibrated via Langley curves and observed simultaneously with a CE318, the difference in Aerosol Optical Depth (AOD) is within 5%. The solar radiometer has run under all kinds of harsh weather conditions in the Gobi Desert at Dunhuang and obtained AODs nearly continuously for eight months. This paper presents the instrument design analysis and atmospheric optical depth retrievals, as well as the experimental results.
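The Langley calibration mentioned above rests on Beer-Lambert attenuation: ln V = ln V0 - m·τ, so regressing the log signal on airmass m yields the total optical depth τ (from which AOD follows after removing Rayleigh and gas contributions) and the extraterrestrial constant V0. The sketch below demonstrates the fit on synthetic, noise-free readings.

```python
# Langley-plot retrieval: tau = -slope, V0 = exp(intercept).
import numpy as np

airmass = np.array([1.2, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0])
true_tau, true_v0 = 0.12, 1.85
signal = true_v0 * np.exp(-true_tau * airmass)   # ideal channel readings

slope, intercept = np.polyfit(airmass, np.log(signal), 1)
print(f"tau = {-slope:.3f}, V0 = {np.exp(intercept):.3f}")
```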
Limberg, Brian J; Johnstone, Kevin; Filloon, Thomas; Catrenich, Carl
2016-09-01
Using United States Pharmacopeia-National Formulary (USP-NF) general method <1223> guidance, the Soleris® automated system and reagents (Nonfermenting Total Viable Count for bacteria and Direct Yeast and Mold for yeast and mold) were validated, using a performance equivalence approach, as an alternative to plate counting for total microbial content analysis using five representative microbes: Staphylococcus aureus, Bacillus subtilis, Pseudomonas aeruginosa, Candida albicans, and Aspergillus brasiliensis. Detection times (DTs) in the alternative automated system were linearly correlated to CFU/sample (R² = 0.94-0.97) with ≥70% accuracy per USP General Chapter <1223> guidance. The LOD and LOQ of the automated system were statistically similar to those of the traditional plate count method. This system was significantly more precise than plate counting (RSD 1.2-2.9% for DT, 7.8-40.6% for plate counts), was statistically comparable to plate counting with respect to variations in analyst, vial lots, and instruments, and was robust when variations in the operating detection thresholds (dTs; ±2 units) were used. The automated system produced accurate results, was more precise and less labor-intensive, and met or exceeded criteria for a valid alternative quantitative method, consistent with USP-NF general method <1223> guidance.
Measuring signal-to-noise ratio automatically
NASA Technical Reports Server (NTRS)
Bergman, L. A.; Johnston, A. R.
1980-01-01
Automated method of measuring signal-to-noise ratio in digital communication channels is more precise and 100 times faster than previous methods used. Method based on bit-error-rate (BER) measurement can be used with cable, microwave radio, or optical links.
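The link between BER and SNR can be sketched as follows, assuming BPSK signaling, where BER = ½·erfc(√(Eb/N0)); a measured error rate is then inverted numerically to an SNR estimate. This is an illustrative model, not necessarily the channel model of the original NASA method.

```python
# Invert a measured BER to an Eb/N0 estimate (BPSK assumed).
import math

def ber_bpsk(ebn0_linear: float) -> float:
    """Theoretical BPSK bit-error rate for a given linear Eb/N0."""
    return 0.5 * math.erfc(math.sqrt(ebn0_linear))

def snr_from_ber(ber: float, lo: float = 1e-4, hi: float = 100.0) -> float:
    """Bisection on the monotonically decreasing BER curve; returns dB."""
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if ber_bpsk(mid) > ber:
            lo = mid
        else:
            hi = mid
    return 10.0 * math.log10((lo + hi) / 2.0)

print(f"BER 1e-5 -> Eb/N0 = {snr_from_ber(1e-5):.2f} dB")   # about 9.6 dB
```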
Hardware fault insertion and instrumentation system: Mechanization and validation
NASA Technical Reports Server (NTRS)
Benson, J. W.
1987-01-01
Automated test capability for extensive low-level hardware fault insertion testing is developed. The test capability is used to calibrate fault detection coverage and associated latency times as relevant to projecting overall system reliability. Described are modifications made to the NASA Ames Reconfigurable Flight Control System (RDFCS) Facility to fully automate the total test loop involving the Draper Laboratories' Fault Injector Unit. The automated capability provided included the application of sequences of simulated low-level hardware faults, the precise measurement of fault latency times, the identification of fault symptoms, and bulk storage of test case results. A PDP-11/60 served as a test coordinator, and a PDP-11/04 as an instrumentation device. The fault injector was controlled by applications test software in the PDP-11/60, rather than by manual commands from a terminal keyboard. The time base was especially developed for this application to use a variety of signal sources in the system simulator.
van der Heijden, Suzanne; de Oliveira, Susanne Juel; Kampmann, Marie-Louise; Børsting, Claus; Morling, Niels
2017-11-01
The Precision ID Identity Panel was used to type 109 Somali individuals in order to obtain allele frequencies for the Somali population. These frequencies were used to establish a Somali HID-SNP database, which will be used for biostatistical calculations in family and immigration cases. Genotypes obtained with the Precision ID Identity Panel were found to be almost in complete concordance with genotypes obtained with the SNPforID PCR-SBE-CE assay. In seven SNP loci, silent alleles were identified, most of which were previously described in the literature. The project also set out to compare different AmpliSeq™ workflows to investigate the possibility of using automated library building in forensic genetic casework. In order to do so, the SNP typing of the Somalis was performed using three different workflows: 1) manual library building and sequencing on the Ion PGM™, 2) automated library building using the Biomek® 3000 and sequencing on the Ion PGM™, and 3) automated library building using the Ion Chef™ and sequencing on the Ion S5™. AmpliSeq™ workflows were compared based on coverage, locus balance, noise, and heterozygote balance. Overall, the Ion Chef™/Ion S5™ workflow was found to give the best results and required the least hands-on time in the laboratory. However, the Ion Chef™/Ion S5™ workflow was also the most expensive. The number of libraries that may be constructed in one Ion Chef™ library building run was limited to eight, which is too few for high-throughput workflows. The Biomek® 3000/Ion PGM™ workflow was found to perform similarly to the manual/Ion PGM™ workflow. This argues for the use of automated library building in forensic genetic casework. Automated library building decreases the workload of the laboratory staff, decreases the risk of pipetting errors, and simplifies the daily workflow in forensic genetic laboratories. Copyright © 2017 Elsevier B.V. All rights reserved.
Photogrammetry-Based Automated Measurements for Tooth Shape and Occlusion Analysis
NASA Astrophysics Data System (ADS)
Knyaz, V. A.; Gaboutchian, A. V.
2016-06-01
Tooth measurements (odontometry) are performed for various scientific and practical applications, including dentistry. Present-day techniques are increasingly based on the use of 3D models, which offer wider prospects than measurements on real objects: teeth or their plaster copies. The main advantages emerge through the application of new measurement methods which provide the needed degree of non-invasiveness, precision, convenience and detail. Tooth measurement has always been regarded as time-consuming research, all the more so with new methods, given their wider capabilities. This is where automation becomes essential for the further development and implementation of measurement techniques. In our research, automation in obtaining 3D models and automation of measurements provided essential data that were analysed to suggest recommendations for tooth preparation - one of the most exacting clinical procedures in prosthetic dentistry - within a comparatively short period of time. The original photogrammetric 3D reconstruction system allows the generation of 3D models of dental arches, the reproduction of their closure, or occlusion, and the performance of a set of standard measurements in automated mode.
Keller, Mark; Naue, Jana; Zengerle, Roland; von Stetten, Felix; Schmidt, Ulrike
2015-01-01
Nested PCR remains a labor-intensive and error-prone biomolecular analysis. Laboratory workflow automation by precise control of minute liquid volumes in centrifugal microfluidic Lab-on-a-Chip systems holds great potential for such applications. However, the majority of these systems require costly custom-made processing devices. Our idea is to augment a standard laboratory device, here a centrifugal real-time PCR thermocycler, with inbuilt liquid handling capabilities for automation. We have developed a microfluidic disk segment enabling an automated nested real-time PCR assay for identification of common European animal groups adapted to forensic standards. For the first time we utilize a novel combination of fluidic elements, including pre-storage of reagents, to automate the assay at constant rotational frequency of an off-the-shelf thermocycler. It provides a universal duplex pre-amplification of short fragments of the mitochondrial 12S rRNA and cytochrome b genes, animal-group-specific main-amplifications, and melting curve analysis for differentiation. The system was characterized with respect to assay sensitivity, specificity, risk of cross-contamination, and detection of minor components in mixtures. 92.2% of the performed tests were recognized as fluidically failure-free sample handling and used for evaluation. Altogether, augmentation of the standard real-time thermocycler with a self-contained centrifugal microfluidic disk segment resulted in an accelerated and automated analysis reducing hands-on time, and circumventing the risk of contamination associated with regular nested PCR protocols.
Design Through Manufacturing: The Solid Model-Finite Element Analysis Interface
NASA Technical Reports Server (NTRS)
Rubin, Carol
2002-01-01
State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts reflecting every detail of the finished product. Ideally, in the aerospace industry, these models should fulfill two very important functions: (1) provide numerical control information for automated manufacturing of precision parts, and (2) enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in aircraft and space vehicles. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. Presently, the process of preparing CAD models for FEA consumes a great deal of the analyst's time.
Design automation techniques for custom LSI arrays
NASA Technical Reports Server (NTRS)
Feller, A.
1975-01-01
The standard cell design automation technique is described as an approach for generating random logic PMOS, CMOS or CMOS/SOS custom large scale integration arrays with low initial nonrecurring costs and quick turnaround time or design cycle. The system is composed of predesigned circuit functions or cells and computer programs capable of automatic placement and interconnection of the cells in accordance with an input data net list. The program generates a set of instructions to drive an automatic precision artwork generator. A series of support design automation and simulation programs are described, including programs for verifying correctness of the logic on the arrays, performing dc and dynamic analysis of MOS devices, and generating test sequences.
Toni Antikainen; Anti Rohumaa; Christopher G. Hunt; Mari Levirinne; Mark Hughes
2015-01-01
In plywood production, human operators find it difficult to precisely monitor the spread rate of adhesive in real-time. In this study, macroscopic fluorescence was used to estimate spread rate (SR) of urea formaldehyde adhesive on birch (Betula pendula Roth) veneer. This method could be an option when developing automated real-time SR measurement for...
NASA Astrophysics Data System (ADS)
Harms, Justin D.; Bachmann, Charles M.; Ambeau, Brittany L.; Faulring, Jason W.; Ruiz Torres, Andres J.; Badura, Gregory; Myers, Emily
2017-10-01
Field-portable goniometers are created for a wide variety of applications. Many of these applications require specific types of instruments and measurement schemes and must operate in challenging environments. Therefore, designs are based on the requirements that are specific to the application. We present a field-portable goniometer that was designed for measuring the hemispherical-conical reflectance factor (HCRF) of various soils and low-growing vegetation in austere coastal and desert environments and biconical reflectance factors in laboratory settings. Unlike some goniometers, this system features a requirement for "target-plane tracking" to ensure that measurements can be collected on sloped surfaces, without compromising angular accuracy. The system also features a second upward-looking spectrometer to measure the spatially dependent incoming illumination, an integrated software package to provide full automation, an automated leveling system to ensure a standard frame of reference, a design that minimizes the obscuration due to self-shading to measure the opposition effect, and the ability to record a digital elevation model of the target region. This fully automated and highly mobile system obtains accurate and precise measurements of HCRF in a wide variety of terrain and in less time than most other systems while not sacrificing consistency or repeatability in laboratory environments.
Urinalysis: The Automated Versus Manual Techniques; Is It Time To Change?.
Ahmed, Asmaa Ismail; Baz, Heba; Lotfy, Sarah
2016-01-01
Urinalysis is the third major test in the clinical laboratory. The imprecision of the manual technique drives the need for a rapid, reliable automated test. We evaluated the H800-FUS100 automatic urine sediment analyzer and compared it to the manual urinalysis technique to determine whether it may be a competitive substitute in the laboratories of central hospitals. 1000 urine samples were examined by the two methods in parallel. Agreement, precision, carryover, drift, sensitivity, specificity, and practicability criteria were tested. Agreement ranged from excellent to good for all urine semi-quantitative components (K > 0.4, p = 0.000), except for granular casts (K = 0.317, p = 0.000). Specific gravity results correlated well between the two methods (r = 0.884, p = 0.000). RBCs and WBCs showed moderate correlation (r = 0.42, p = 0.000 and r = 0.44, p = 0.000, respectively). The auto-analyzer's within-run precision was > 75% for all semi-quantitative components except for proteins (50% precision). This finding, in addition to the poor agreement for granular casts, indicates the necessity of operator interference at the critical cutoff values. As regards quantitative contents, RBCs showed a mean of 69.8 +/- 3.95 (C.V. = 5.7) and WBCs a mean of 38.9 +/- 1.9 (C.V. = 4.9). Specific gravity, pH, microalbumin, and creatinine also showed good precision, with C.V.s of 0.000, 2.6, 9.1, and 0.00, respectively. In the between-run precision, the positive control showed good precision (C.V. = 2.9), while the negative control's C.V. was strikingly high (C.V. = 127). Carryover and drift studies were satisfactory. Manual examination of inter-observer results showed major discrepancies (< 60% similar readings), while intra-observer results correlated well with each other (r = 0.99, p = 0.000). Automation of urinalysis decreases observer-associated variation and offers prompt, competitive results when standardized for screening away from the borderline cutoffs.
Comparison of in vivo 3D cone-beam computed tomography tooth volume measurement protocols.
Forst, Darren; Nijjar, Simrit; Flores-Mir, Carlos; Carey, Jason; Secanell, Marc; Lagravere, Manuel
2014-12-23
The objective of this study is to analyze a set of previously developed and proposed image segmentation protocols for intra- and inter-rater precision in in vivo tooth volume measurements using cone-beam computed tomography (CBCT) images. Six 3D volume segmentation procedures were proposed and tested for intra- and inter-rater reliability to quantify maxillary first molar volumes. Ten randomly selected maxillary first molars were measured in vivo, in random order, three times, with 10 days separating the measurements. Intra- and inter-rater agreement for all segmentation procedures was assessed using the intra-class correlation coefficient (ICC). The highest precision was found for automated thresholding with manual refinements. A tooth volume measurement protocol for CBCT images employing automated segmentation with manual human refinement on a 2D slice-by-slice basis in all three planes of space possessed excellent intra- and inter-rater reliability. Three-dimensional volume measurements of the entire tooth structure are more precise than 3D volume measurements of only the dental roots apical to the cemento-enamel junction (CEJ).
Gokce, Sertan Kutal; Guo, Samuel X.; Ghorashian, Navid; Everett, W. Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C.; Ben-Yakar, Adela
2014-01-01
Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner. PMID:25470130
Whitter, P D; Cary, P L; Leaton, J I; Johnson, J E
1999-01-01
An automated extraction scheme for the analysis of 11-nor-delta-9-tetrahydrocannabinol-9-carboxylic acid using the Hamilton Microlab 2200, which was modified for gravity-flow solid-phase extraction, has been evaluated. The Hamilton was fitted with a six-head probe, a modular valve positioner, and a peristaltic pump. The automated method significantly increased sample throughput, improved assay consistency, and reduced the time spent performing the extraction. Extraction recovery for the automated method was > 90%. The limit of detection, limit of quantitation, and upper limit of linearity were equivalent to the manual method: 1.5, 3.0, and 300 ng/mL, respectively. Precision at the 15-ng/mL cut-off was as follows: mean = 14.4, standard deviation = 0.5, coefficient of variation = 3.5%. Comparison of 38 patient samples, extracted by the manual and automated extraction methods, demonstrated the following correlation statistics: r = .991, slope 1.029, and y-intercept -2.895. Carryover was < 0.3% at 1000 ng/mL. Aliquoting/extraction time for the automated method (48 urine samples) was 50 min, and the manual procedure required approximately 2.5 h. The automated aliquoting/extraction method on the Hamilton Microlab 2200 and its use in forensic applications are reviewed.
Automated geographic registration and radiometric correction for UAV-based mosaics
NASA Astrophysics Data System (ADS)
Thomasson, J. Alex; Shi, Yeyin; Sima, Chao; Yang, Chenghai; Cope, Dale A.
2017-05-01
Texas A&M University has been operating a large-scale, UAV-based, agricultural remote-sensing research project since 2015. To use UAV-based images in agricultural production, many high-resolution images must be mosaicked together to create an image of an agricultural field. Two key difficulties to science-based utilization of such mosaics are geographic registration and radiometric calibration. In our current research project, image files are taken to the computer laboratory after the flight, and semi-manual pre-processing is implemented on the raw image data, including ortho-mosaicking and radiometric calibration. Ground control points (GCPs) are critical for high-quality geographic registration of images during mosaicking. Applications requiring accurate reflectance data also require radiometric-calibration references so that reflectance values of image objects can be calculated. We have developed a method for automated geographic registration and radiometric correction with targets that are installed semi-permanently at distributed locations around fields. The targets are a combination of black (≈5% reflectance), dark gray (≈20% reflectance), and light gray (≈40% reflectance) sections that provide for a transformation of pixel-value to reflectance in the dynamic range of crop fields. The exact spectral reflectance of each target is known, having been measured with a spectrophotometer. At the time of installation, each target is measured for position with a real-time kinematic GPS receiver to give its precise latitude and longitude. Automated location of the reference targets in the images is required for precise, automated, geographic registration; and automated calculation of the digital-number to reflectance transformation is required for automated radiometric calibration. To validate the system for radiometric calibration, a calibrated UAV-based image mosaic of a field was compared to a calibrated single image from a manned aircraft. Reflectance values in selected zones of each image were strongly linearly related, and the average error of UAV-mosaic reflectances was 3.4% in the red band, 1.9% in the green band, and 1.5% in the blue band. Based on these results, the proposed physical system and automated software for calibrating UAV mosaics show excellent promise.
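The digital-number-to-reflectance transformation described above amounts to an empirical-line fit per band: a linear map through the three targets of known reflectance. The sketch below shows the arithmetic; the digital numbers are invented placeholders for values extracted from the located targets in a mosaic.

```python
# Empirical-line calibration for one band from three reference targets.
import numpy as np

known_reflectance = np.array([0.05, 0.20, 0.40])   # black, dark, light gray
dn_red = np.array([310.0, 1240.0, 2485.0])         # hypothetical mean DNs

# Linear fit: reflectance = gain * DN + offset.
gain, offset = np.polyfit(dn_red, known_reflectance, 1)

pixel_dn = 1800.0                                  # any crop pixel in the band
print(f"red-band reflectance = {gain * pixel_dn + offset:.3f}")
```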
Applicability of two automated disintegration apparatuses for rapidly disintegrating (mini)tablets.
Sieber, Daniel; Lazzari, Alessia; Quodbach, Julian; Pein, Miriam
2017-03-01
Orally disintegrating (mini)tablets (OD(M)Ts) are of interest in the field of pharmaceutics. Their orodispersible character is defined by the disintegration time, which is measured with a basket apparatus according to the European Pharmacopoeia. This method, however, lacks applicability for ODTs and especially ODMTs. New disintegration apparatuses have been described in the literature, but a qualification to assess their applicability has not. A qualification procedure for two automated disintegration apparatuses, the OD-mate and the Hermes apparatus, is introduced. Aspects of the operational qualification (OQ) as well as precision and accuracy regarding a performance qualification were evaluated for both apparatuses, analogous to ICH guideline Q2. While the OQ study was performed separately for each apparatus, accuracy and precision were assessed following the same protocol for both testers. Small RSDs (16.9% OD-mate; 15.2% Hermes, compared to 32.3% for the pharmacopeial method) were found despite very fast disintegration times (1.5 s for both apparatuses). By comparing these RSDs to practical examples, the authors propose threshold values for repeatability depending on the mean disintegration time. The results obtained from the qualification were used to assess the applicability of both apparatuses.
Dotette: Programmable, high-precision, plug-and-play droplet pipetting.
Fan, Jinzhen; Men, Yongfan; Hao Tseng, Kuo; Ding, Yi; Ding, Yunfeng; Villarreal, Fernando; Tan, Cheemeng; Li, Baoqing; Pan, Tingrui
2018-05-01
Manual micropipettes are the most heavily used liquid handling devices in biological and chemical laboratories; however, they suffer from low precision for volumes under 1 μl and inevitable human errors. For a manual device, the human errors introduced pose potential risks of failed experiments, inaccurate results, and financial costs. Meanwhile, low precision under 1 μl can cause severe quantification errors and high heterogeneity of outcomes, becoming a bottleneck of reaction miniaturization for quantitative research in biochemical labs. Here, we report Dotette, a programmable, plug-and-play microfluidic pipetting device based on nanoliter liquid printing. With automated control, protocols designed on computers can be directly downloaded into Dotette, enabling programmable operation processes. Utilizing continuous nanoliter droplet dispensing, the precision of the volume control has been successfully improved from the traditional 20%-50% to less than 5% in the range of 100 nl to 1000 nl. Such a highly automated, plug-and-play add-on to existing pipetting devices not only improves precise quantification in low-volume liquid handling and reduces chemical consumption but also facilitates and automates a variety of biochemical and biological operations.
Portable Automation of Static Chamber Sample Collection for Quantifying Soil Gas Flux
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Morgan P.; Groh, Tyler A.; Parkin, Timothy B.
Quantification of soil gas flux using the static chamber method is labor intensive. The number of chambers that can be sampled is limited by the spacing between chambers and the availability of trained research technicians. An automated system for collecting gas samples from chambers in the field would eliminate the need for personnel to return to the chamber during a flux measurement period and would allow a single technician to sample multiple chambers simultaneously. This study describes Chamber Automated Sampling Equipment (FluxCASE) to collect and store chamber headspace gas samples at assigned time points for the measurement of soil gas flux. The FluxCASE design and operation is described, and the accuracy and precision of the FluxCASE system is evaluated. In laboratory measurements of nitrous oxide (N2O), carbon dioxide (CO2), and methane (CH4) concentrations of a standardized gas mixture, coefficients of variation associated with automated and manual sample collection were comparable, indicating no loss of precision. In the field, soil gas fluxes measured from FluxCASEs were in agreement with manual sampling for both N2O and CO2. Slopes of regression equations were 1.01 for CO2 and 0.97 for N2O. The 95% confidence limits of the slopes of the regression lines included the value of one, indicating no bias. Additionally, an expense analysis found a cost-recovery time ranging from 0.6 to 2.2 yr. Implementing the FluxCASE system is an alternative to improve the efficiency of the static chamber method for measuring soil gas flux while maintaining the accuracy and precision of manual sampling.
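The headspace samples the FluxCASE collects feed the standard static-chamber flux calculation: the slope of concentration versus time, scaled by the chamber volume-to-area ratio and the gas molar density. A minimal sketch, with illustrative numbers rather than FluxCASE data:

```python
# Static-chamber flux: regress headspace concentration on time, then
# convert the slope (ppb/min) to a molar areal flux.
import numpy as np

t_min = np.array([0.0, 15.0, 30.0, 45.0])           # sampling times (min)
n2o_ppb = np.array([330.0, 348.0, 365.0, 384.0])    # headspace N2O (ppb)

slope_ppb_per_min = np.polyfit(t_min, n2o_ppb, 1)[0]

volume_m3, area_m2 = 0.012, 0.049                   # chamber geometry
molar_volume = 0.0245                               # m3/mol at ~25 C, 1 atm
flux_mol = (slope_ppb_per_min * 1e-9) * (volume_m3 / area_m2) / molar_volume
print(f"N2O flux = {flux_mol * 60 * 1e9:.0f} nmol m^-2 h^-1")
```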
Assessing the difficulty and time cost of de-identification in clinical narratives.
Dorr, D A; Phillips, W F; Phansalkar, S; Sims, S A; Hurdle, J F
2006-01-01
To characterize the difficulty confronting investigators in removing protected health information (PHI) from cross-discipline, free-text clinical notes, an important challenge to clinical informatics research as recalibrated by the introduction of the US Health Insurance Portability and Accountability Act (HIPAA) and similar regulations. Randomized selection of clinical narratives from complete admissions written by diverse providers, reviewed using a two-tiered rater system and simple automated regular expression tools. For manual review, two independent reviewers used simple search and replace algorithms and visual scanning to find PHI as defined by HIPAA, followed by an independent second review to detect any missed PHI. Simple automated review was also performed for the "easy" PHI that are number- or date-based. From 262 notes, 2074 PHI, or 7.9 +/- 6.1 per note, were found. The average recall (or sensitivity) was 95.9% while precision was 99.6% for single reviewers. Agreement between individual reviewers was strong (ICC = 0.99), although some asymmetry in errors was seen between reviewers (p = 0.001). The automated technique had better recall (98.5%) but worse precision (88.4%) for its subset of identifiers. Manually de-identifying a note took 87.3 +/- 61 seconds on average. Manual de-identification of free-text notes is tedious and time-consuming, but even simple PHI is difficult to automatically identify with the exactitude required under HIPAA.
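The "simple automated regular expression tools" for number- and date-based PHI can be sketched as below; the patterns are illustrative and deliberately narrow, since full HIPAA de-identification requires much broader coverage than dates, phone numbers, and record numbers.

```python
# Regex-based scrubbing of number- and date-based PHI (illustrative only).
import re

PATTERNS = {
    "date":  re.compile(r"\b\d{1,2}[/-]\d{1,2}[/-]\d{2,4}\b"),
    "phone": re.compile(r"\b\d{3}[-. ]\d{3}[-. ]\d{4}\b"),
    "mrn":   re.compile(r"\bMRN[:# ]*\d{5,10}\b", re.IGNORECASE),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scrub(note: str) -> str:
    """Replace each matched identifier with a bracketed category tag."""
    for label, pattern in PATTERNS.items():
        note = pattern.sub(f"[{label.upper()}]", note)
    return note

print(scrub("Seen 03/14/2005, MRN: 8675309, call 555-867-5309."))
```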
Campillo-Gimenez, Boris; Garcelon, Nicolas; Jarno, Pascal; Chapplain, Jean Marc; Cuggia, Marc
2013-01-01
The surveillance of Surgical Site Infections (SSI) contributes to the management of risk in French hospitals. Manual identification of infections is costly, time-consuming, and limits the promotion of preventive procedures by the dedicated teams. The introduction of alternative methods using automated detection strategies is promising for improving this surveillance. The present study describes an automated detection strategy for SSI in neurosurgery, based on textual analysis of medical reports stored in a clinical data warehouse. The method consists, first, of enrichment and concept extraction from full-text reports using NOMINDEX and, second, of text similarity measurement using a vector space model. The text detection was compared to the conventional strategy based on self-declaration and to automated detection using the diagnosis-related group database. The text-mining approach showed the best detection accuracy, with recall and precision equal to 92% and 40%, respectively, and confirmed the value of reusing full-text medical reports to perform automated detection of SSI.
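The vector-space similarity step can be sketched as below (the NOMINDEX enrichment and concept extraction are not reproduced): candidate reports are scored by cosine similarity to reports of confirmed infections in TF-IDF space. All texts are toy examples.

```python
# TF-IDF vector-space scoring of reports against confirmed-SSI reports.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

confirmed_ssi = ["wound dehiscence purulent drainage revision surgery",
                 "fever elevated crp surgical site erythema reoperation"]
new_reports = ["routine follow up wound healing well no drainage",
               "purulent drainage from craniotomy site started antibiotics"]

vectorizer = TfidfVectorizer()
ref = vectorizer.fit_transform(confirmed_ssi)       # reference vectors
query = vectorizer.transform(new_reports)           # reports to score

scores = cosine_similarity(query, ref).max(axis=1)  # best match per report
for text, s in zip(new_reports, scores):
    print(f"{s:.2f}  {text[:45]}")
```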
Kitchen, Steve; Woolley, Anita
2013-01-01
The Q analyzer is a recently launched fully automated photo-optical analyzer equipped with primary tube cap-piercing and capable of clotting, chromogenic, and immunoturbidimetric tests. The purpose of the present study was to evaluate the performance characteristics of the Q analyzer with reagents from the instrument manufacturer. We assessed precision and throughput when performing coagulation screening tests: prothrombin time (PT)/international normalized ratio (INR), activated partial thromboplastin time (APTT), and fibrinogen by the Clauss assay. We compared results with established reagent-instrument combinations in widespread use. Precision of PT/INR and APTT was acceptable as indicated by total precision of around 3%. The time to first result was 3 min for an INR and 5 min for PT/APTT. The system produced 115 completed samples per hour when processing only INRs and 60 samples (120 results) per hour for PT/APTT combined. The sensitivity of the DG-APTT Synth/Q method to mild deficiency of factor VIII (FVIII), IX, and XI was excellent (as indicated by APTTs being prolonged above the upper limit of the reference range). The Q analyzer was associated with high precision, acceptable throughput, and good reliability. When used in combination with DG-PT reagent and the manufacturer's instrument-specific international sensitivity index, the INRs obtained were accurate. The Q analyzer with DG-APTT Synth reagent demonstrated good sensitivity to isolated mild deficiency of FVIII, IX, and XI and had the advantage of relative insensitivity to mild FXII deficiency. Taken together, our data indicate that the Q hemostasis analyzer was suitable for routine use in combination with the reagents evaluated.
Automation of the ELISpot assay for high-throughput detection of antigen-specific T-cell responses.
Almeida, Coral-Ann M; Roberts, Steven G; Laird, Rebecca; McKinnon, Elizabeth; Ahmed, Imran; Pfafferott, Katja; Turley, Joanne; Keane, Niamh M; Lucas, Andrew; Rushton, Ben; Chopra, Abha; Mallal, Simon; John, Mina
2009-05-15
The enzyme linked immunospot (ELISpot) assay is a fundamental tool in cellular immunology, providing both quantitative and qualitative information on cellular cytokine responses to defined antigens. It enables the comprehensive screening of patient derived peripheral blood mononuclear cells to reveal the antigenic restriction of T-cell responses and is an emerging technique in clinical laboratory investigation of certain infectious diseases. As with all cellular-based assays, the final results of the assay are dependent on a number of technical variables that may impact precision if not highly standardised between operators. When studies that are large scale or using multiple antigens are set up manually, these assays may be labour intensive, have many manual handling steps, are subject to data and sample integrity failure and may show large inter-operator variability. Here we describe the successful automated performance of the interferon (IFN)-gamma ELISpot assay from cell counting through to electronic capture of cytokine quantitation and present the results of a comparison between automated and manual performance of the ELISpot assay. The mean number of spot forming units enumerated by both methods for limiting dilutions of CMV, EBV and influenza (CEF)-derived peptides in six healthy individuals were highly correlated (r>0.83, p<0.05). The precision results from the automated system compared favourably with the manual ELISpot and further ensured electronic tracking, increased through-put and reduced turnaround time.
Proof of Concept of Automated Collision Detection Technology in Rugby Sevens.
Clarke, Anthea C; Anson, Judith M; Pyne, David B
2017-04-01
Clarke, AC, Anson, JM, and Pyne, DB. Proof of concept of automated collision detection technology in rugby sevens. J Strength Cond Res 31(4): 1116-1120, 2017-Developments in microsensor technology allow for automated detection of collisions in various codes of football, removing the need for time-consuming postprocessing of video footage. However, little research is available on the ability of microsensor technology to be used across various sports or genders. Game video footage was matched with microsensor-detected collisions (GPSports) in one men's (n = 12 players) and one women's (n = 12) rugby sevens match. True-positive, false-positive, and false-negative events between video and microsensor-detected collisions were used to calculate recall (ability to detect a collision) and precision (ability to accurately identify a collision). The precision was similar between the men's and women's rugby sevens games (∼0.72; scale 0.00-1.00); however, the recall in the women's game (0.45) was less than that for the men's game (0.69). This resulted in 45% of collisions for men and 62% of collisions for women being incorrectly labeled. Currently, the automated collision detection system in GPSports microtechnology units has only modest utility in rugby sevens, and it seems that a rugby sevens-specific algorithm is needed. Differences in measures between the men's and women's games may be a result of physical size, strength, and physicality, as well as technical and tactical factors.
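Recall and precision here follow directly from the true-positive, false-positive, and false-negative event counts. A small sketch with illustrative tallies chosen to roughly reproduce the reported values (per-event counts were not published in the abstract):

```python
def recall_precision(true_positives, false_positives, false_negatives):
    """Recall = ability to detect a collision; precision = fraction of
    detected events that were real collisions (0.00-1.00 scale)."""
    recall = true_positives / (true_positives + false_negatives)
    precision = true_positives / (true_positives + false_positives)
    return recall, precision

# Illustrative counts only, tuned to approximate the reported metrics.
men = recall_precision(true_positives=83, false_positives=32, false_negatives=37)
women = recall_precision(true_positives=45, false_positives=18, false_negatives=55)
print("men   recall=%.2f precision=%.2f" % men)
print("women recall=%.2f precision=%.2f" % women)
```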
Eichhold, Thomas H; McCauley-Myers, David L; Khambe, Deepa A; Thompson, Gary A; Hoke, Steven H
2007-01-17
A method for the simultaneous determination of dextromethorphan (DEX), dextrorphan (DET), and guaifenesin (GG) in human plasma was developed, validated, and applied to determine plasma concentrations of these compounds in samples from six clinical pharmacokinetic (PK) studies. Semi-automated liquid handling systems were used to perform the majority of the sample manipulation including liquid/liquid extraction (LLE) of the analytes from human plasma. Stable-isotope-labeled analogues were utilized as internal standards (ISTDs) for each analyte to facilitate accurate and precise quantification. Extracts were analyzed using gradient liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). Use of semi-automated LLE with LC-MS/MS proved to be a very rugged and reliable approach for analysis of more than 6200 clinical study samples. The lower limit of quantification was validated at 0.010, 0.010, and 1.0 ng/mL of plasma for DEX, DET, and GG, respectively. Accuracy and precision of quality control (QC) samples for all three analytes met FDA Guidance criteria of +/-15% for average QC accuracy with coefficients of variation less than 15%. Data from the thorough evaluation of the method during development, validation, and application are presented to characterize selectivity, linearity, over-range sample analysis, accuracy, precision, autosampler carry-over, ruggedness, extraction efficiency, ionization suppression, and stability. Pharmacokinetic data are also provided to illustrate improvements in systemic drug and metabolite concentration-time profiles that were achieved by formulation optimization.
Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco
2016-02-09
Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96 well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids improving precision from ±8 to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability, as well as experimental throughput, simultaneously minimizing the needed hands-on time to a third. Thereby, the presented protocol meets the demands for the analysis of samples generated by the upcoming generation of devices for higher throughput phototrophic cultivation and thereby contributes to boosting the time efficiency for setting up algae lipid production processes.
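The gravimetric calibration step amounts to regressing extractive lipid measurements on the Nile red signal and reusing the fitted line for absolute quantification of new screening samples. A sketch with invented paired values, not the published calibration data:

```python
import numpy as np

# Paired measurements: Nile red fluorescence (a.u.) from the staining assay
# and gravimetric lipid content (mg/L) from an extractive reference protocol.
fluorescence = np.array([120, 480, 950, 1900, 3800])
gravimetric_mg_l = np.array([25, 98, 205, 410, 820])

# Least-squares calibration line: lipid = a * fluorescence + b
a, b = np.polyfit(fluorescence, gravimetric_mg_l, 1)

def lipid_from_fluorescence(signal):
    """Absolute lipid quantification of a new sample via the calibration."""
    return a * signal + b

print(lipid_from_fluorescence(1500))  # mg/L for a new screening sample
```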
Jipp, Meike
2012-12-01
The extent to which individual differences in fine motor abilities affect indoor safety and efficiency of human-wheelchair systems was examined. To reduce the currently large number of indoor wheelchair accidents, assistance systems with a high level of automation were developed. It was proposed to adapt the wheelchair's level of automation to the user's ability to steer the device to avoid drawbacks of highly automated wheelchairs. The state of the art, however, lacks an empirical identification of those abilities. A study with 23 participants is described. The participants drove through various sections of a course with a powered wheelchair. Repeatedly measured criteria were safety (numbers of collisions) and efficiency (times required for reaching goals). As covariates, the participants' fine motor abilities were assessed. A random coefficient modeling approach was conducted to analyze the data, which were available on two levels as course sections were nested within participants. The participants' aiming, precision, and arm-hand speed contributed significantly to both criteria: Participants with lower fine motor abilities had more collisions and required more time for reaching goals. Adapting the wheelchair's level of automation to these fine motor abilities can improve indoor safety and efficiency. In addition, the results highlight the need to further examine the impact of individual differences on the design of automation features for powered wheelchairs as well as other applications of automation. The results facilitate the improvement of current wheelchair technology.
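A random coefficient model of this kind, with course sections nested within participants, can be sketched with statsmodels; the data below are simulated stand-ins, not the study's measurements:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: repeated course sections nested within participants.
rng = np.random.default_rng(0)
n_participants, n_sections = 23, 6
participant = np.repeat(np.arange(n_participants), n_sections)
aiming = np.repeat(rng.normal(0, 1, n_participants), n_sections)
collisions = 2.0 - 0.8 * aiming + rng.normal(0, 0.5, len(participant))

data = pd.DataFrame(
    {"participant": participant, "aiming": aiming, "collisions": collisions}
)

# Random-intercept model: section-level collision counts predicted by a
# participant-level fine motor ability, grouping by participant.
model = smf.mixedlm("collisions ~ aiming", data, groups=data["participant"])
print(model.fit().summary())
```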
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ievlev, Anton V.; Belianinov, Alexei; Jesse, Stephen; ...
2017-12-06
Time of flight secondary ion mass spectrometry (ToF SIMS) is one of the most powerful characterization tools for imaging the chemical properties of various systems and materials. It allows precise studies of chemical composition with sub-100-nm lateral and nanometer depth resolution. However, comprehensive interpretation of ToF SIMS results is challenging because of the volume and multidimensionality of the data. Furthermore, investigation of samples with pronounced topographical features is complicated by the spectral shift. In this work we developed an approach for comprehensive ToF SIMS data interpretation based on data analytics and automated extraction of the sample topography from the time-of-flight shift. We further applied this approach to investigate the correlation between biological function and chemical composition in Arabidopsis roots.
On the precision of automated activation time estimation
NASA Technical Reports Server (NTRS)
Kaplan, D. T.; Smith, J. M.; Rosenbaum, D. S.; Cohen, R. J.
1988-01-01
We examined how the assignment of local activation times in epicardial and endocardial electrograms is affected by sampling rate, ambient signal-to-noise ratio, and sin(x)/x waveform interpolation. Algorithms used for the estimation of fiducial point locations included dV/dt_max and a matched-filter detection algorithm. Test signals included epicardial and endocardial electrograms overlying both normal and infarcted regions of dog myocardium. Signal-to-noise levels were adjusted by combining known data sets with white noise "colored" to match the spectral characteristics of experimentally recorded noise. For typical signal-to-noise ratios and sampling rates, the template-matching algorithm provided the greatest precision in reproducibly estimating fiducial point location, and sin(x)/x interpolation allowed for an additional significant improvement. With few restrictions, combining these two techniques may allow for use of digitization rates below the Nyquist rate without significant loss of precision.
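The two techniques compared, matched-filter (template) detection and sin(x)/x interpolation, can be sketched as follows on a synthetic electrogram; the template shape, noise level, and refinement grid are illustrative choices, not the study's parameters:

```python
import numpy as np

def matched_filter_fiducial(signal, template):
    """Return the sample index where the template best matches the signal."""
    corr = np.correlate(signal, template, mode="valid")
    return int(np.argmax(corr))

def sinc_interpolate(samples, t_fine):
    """Reconstruct a band-limited sequence at fractional sample times t_fine
    using sin(x)/x interpolation of the uniformly sampled data."""
    n = np.arange(len(samples))
    return np.array([np.sum(samples * np.sinc(t - n)) for t in t_fine])

# Synthetic electrogram: a biphasic deflection buried in noise (illustrative).
rng = np.random.default_rng(1)
template = np.diff(np.exp(-0.5 * ((np.arange(41) - 20) / 4.0) ** 2))
signal = rng.normal(0, 0.05, 400)
signal[180:220] += template

idx = matched_filter_fiducial(signal, template)
# Refine around the coarse match on a 10x finer grid via sinc interpolation.
corr = np.correlate(signal, template, mode="valid")
fine_t = np.arange(idx - 2, idx + 2, 0.1)
refined = fine_t[np.argmax(sinc_interpolate(corr, fine_t))]
print(f"coarse index {idx}, refined activation time {refined:.2f} samples")
```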
Screening for Learning and Memory Mutations: A New Approach.
Gallistel, C R; King, A P; Daniel, A M; Freestone, D; Papachristos, E B; Balci, F; Kheifets, A; Zhang, J; Su, X; Schiff, G; Kourtev, H
2010-01-30
We describe a fully automated, live-in 24/7 test environment, with experimental protocols that measure the accuracy and precision with which mice match the ratio of their expected visit durations to the ratio of the incomes obtained from two hoppers, the progress of instrumental and classical conditioning (trials-to-acquisition), the accuracy and precision of interval timing, the effect of relative probability on the choice of a timed departure target, and the accuracy and precision of memory for the times of day at which food is available. The system is compact; it obviates the handling of the mice during testing; it requires negligible amounts of experimenter/technician time; and it delivers clear and extensive results from 3 protocols within a total of 7-9 days after the mice are placed in the test environment. Only a single 24-hour period is required for the completion of the first protocol (the matching protocol), which is a strong test of temporal and spatial estimation and memory mechanisms. Thus, the system permits the extensive screening of many mice in a short period of time and in limited space. The software is publicly available.
High-Precision Phenotyping of Grape Bunch Architecture Using Fast 3D Sensor and Automation.
Rist, Florian; Herzog, Katja; Mack, Jenny; Richter, Robert; Steinhage, Volker; Töpfer, Reinhard
2018-03-02
Wine growers prefer cultivars with looser bunch architecture because of the decreased risk for bunch rot. As a consequence, grapevine breeders have to select seedlings and new cultivars with regard to appropriate bunch traits. Bunch architecture is a mosaic of different single traits which makes phenotyping labor-intensive and time-consuming. In the present study, a fast and high-precision phenotyping pipeline was developed. The optical sensor Artec Spider 3D scanner (Artec 3D, L-1466, Luxembourg) was used to generate dense 3D point clouds of grapevine bunches under lab conditions and an automated analysis software called 3D-Bunch-Tool was developed to extract different single 3D bunch traits, i.e., the number of berries, berry diameter, single berry volume, total volume of berries, convex hull volume of grapes, bunch width and bunch length. The method was validated on whole bunches of different grapevine cultivars and phenotypic variable breeding material. Reliable phenotypic data were obtained which show highly significant correlations (up to r² = 0.95 for berry number) compared to ground truth data. Moreover, it was shown that the Artec Spider can be used directly in the field, where the data achieved show precision comparable to the lab application. This non-invasive and non-contact field application facilitates the first high-precision phenotyping pipeline based on 3D bunch traits in large plant sets.
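One of the listed traits, the convex hull volume of the grapes, follows directly from the 3D point cloud. A sketch using SciPy on a synthetic ellipsoidal stand-in for a scanned bunch (the dimensions are invented):

```python
import numpy as np
from scipy.spatial import ConvexHull

# Illustrative stand-in for a bunch point cloud from a 3D scan:
# random points on a grape-bunch-sized ellipsoid (mm units).
rng = np.random.default_rng(2)
directions = rng.normal(size=(2000, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
points = directions * np.array([40.0, 40.0, 90.0])  # half-axes in mm

hull = ConvexHull(points)
print(f"convex hull volume of grapes: {hull.volume / 1000:.1f} cm^3")
print(f"bunch length (z extent): {np.ptp(points[:, 2]):.1f} mm")
```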
System Performance of an Integrated Airborne Spacing Algorithm with Ground Automation
NASA Technical Reports Server (NTRS)
Swieringa, Kurt A.; Wilson, Sara R.; Baxley, Brian T.
2016-01-01
The National Aeronautics and Space Administration's (NASA's) first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the Terminal airspace; Controller Managed Spacing (CMS), which provides controllers with decision support tools to enable precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain precise spacing behind another aircraft. Recent simulations and IM algorithm development at NASA have focused on trajectory-based IM operations where aircraft equipped with IM avionics are expected to achieve a spacing goal, assigned by air traffic controllers, at the final approach fix. The recently published IM Minimum Operational Performance Standards describe five types of IM operations. This paper discusses the results and conclusions of a human-in-the-loop simulation that investigated three of those IM operations. The results presented in this paper focus on system performance and integration metrics. Overall, the IM operations conducted in this simulation integrated well with ground-based decision support tools, and certain types of IM operations were able to provide improved spacing precision at the final approach fix; however, some issues were identified that should be addressed prior to implementing IM procedures into real-world operations.
Fong, Erika J.; Huang, Chao; Hamilton, Julie; ...
2015-11-23
Here, a major advantage of microfluidic devices is the ability to manipulate small sample volumes, thus reducing reagent waste and preserving precious sample. However, to achieve robust sample manipulation it is necessary to address device integration with the macroscale environment. To realize repeatable, sensitive particle separation with microfluidic devices, this protocol presents a complete automated and integrated microfluidic platform that enables precise processing of 0.15-1.5 ml samples using microfluidic devices. Important aspects of this system include modular device layout and robust fixtures resulting in reliable and flexible world to chip connections, and fully-automated fluid handling which accomplishes closed-loop sample collection, system cleaning and priming steps to ensure repeatable operation. Different microfluidic devices can be used interchangeably with this architecture. Here we incorporate an acoustofluidic device, detail its characterization, performance optimization, and demonstrate its use for size-separation of biological samples. By using real-time feedback during separation experiments, sample collection is optimized to conserve and concentrate sample. Although requiring the integration of multiple pieces of equipment, advantages of this architecture include the ability to process unknown samples with no additional system optimization, ease of device replacement, and precise, robust sample processing.
An evaluation of the automated assay of urinary oestrogens in pregnant women
Muir, G. G.; Ryan, M.; Conaill, D. U.
1970-01-01
An automated assay suitable for estimating urinary oestrogens in pregnant women has been investigated. Fluorimetry was found to have considerable advantages over colorimetry. The fluorimetric assay was simpler, more precise, more sensitive, and eliminated the need for correction for non-specific chromogens; in the assay of oestriol in pregnant women there was no need for correction for non-specific fluorescence. Spectrofluorimetric and photometric analyses, recoveries, and reproducibility show that the method offers a robust means of providing values for urinary oestrogen in pregnant women on a scale of up to 100 tests a day, the time of the assay being one and a half hours. PMID:5476876
Toward Automated Intraocular Laser Surgery Using a Handheld Micromanipulator
Yang, Sungwook; MacLachlan, Robert A.; Riviere, Cameron N.
2014-01-01
This paper presents a technique for automated intraocular laser surgery using a handheld micromanipulator known as Micron. The novel handheld manipulator enables the automated scanning of a laser probe within a cylinder 4 mm long and 4 mm in diameter. For the automation, the surface of the retina is reconstructed using a stereomicroscope, and then preplanned targets are placed on the surface. The laser probe is precisely located on the target via visual servoing of the aiming beam, while maintaining a specific distance above the surface. In addition, the system is capable of tracking the surface of the eye in order to compensate for any eye movement introduced during the operation. We compared the performance of the automated scanning using various control thresholds, in order to find the most effective threshold in terms of accuracy and speed. Given the selected threshold, we conducted the handheld operation above a fixed target surface. The average error and execution time are reduced by 63.6% and 28.5%, respectively, compared to the unaided trials. Finally, the automated laser photocoagulation was also demonstrated in an eye phantom, including compensation for the eye movement. PMID:25893135
Aubrey, Wayne; Riley, Michael C; Young, Michael; King, Ross D; Oliver, Stephen G; Clare, Amanda
2015-01-01
Many advances in synthetic biology require the removal of a large number of genomic elements from a genome. Most existing deletion methods leave behind markers, and as there are a limited number of markers, such methods can only be applied a fixed number of times. Deletion methods that recycle markers generally are either imprecise (remove untargeted sequences), or leave scar sequences which can cause genome instability and rearrangements. No existing marker recycling method is automation-friendly. We have developed a novel openly available deletion tool that consists of: 1) a method for deleting genomic elements that can be repeatedly used without limit, is precise, scar-free, and suitable for automation; and 2) software to design the method's primers. Our tool is sequence agnostic and could be used to delete large numbers of coding sequences, promoter regions, transcription factor binding sites, terminators, etc. in a single genome. We have validated our tool on the deletion of non-essential open reading frames (ORFs) from S. cerevisiae. The tool is applicable to arbitrary genomes, and we provide primer sequences for the deletion of: 90% of the ORFs from the S. cerevisiae genome, 88% of the ORFs from the S. pombe genome, and 85% of the ORFs from the L. lactis genome.
IoT for Real-Time Measurement of High-Throughput Liquid Dispensing in Laboratory Environments.
Shumate, Justin; Baillargeon, Pierre; Spicer, Timothy P; Scampavia, Louis
2018-04-01
Critical to maintaining quality control in high-throughput screening is the need for constant monitoring of liquid-dispensing fidelity. Traditional methods involve operator intervention with gravimetric analysis to monitor the gross accuracy of full plate dispenses, visual verification of contents, or dedicated weigh stations on screening platforms that introduce potential bottlenecks and increase the plate-processing cycle time. We present a unique solution using open-source hardware, software, and 3D printing to automate dispenser accuracy determination by providing real-time dispense weight measurements via a network-connected precision balance. This system uses an Arduino microcontroller to connect a precision balance to a local network. By integrating the precision balance as an Internet of Things (IoT) device, it gains the ability to provide real-time gravimetric summaries of dispensing, generate timely alerts when problems are detected, and capture historical dispensing data for future analysis. All collected data can then be accessed via a web interface for reviewing alerts and dispensing information in real time or remotely for timely intervention of dispense errors. The development of this system also leveraged 3D printing to rapidly prototype sensor brackets, mounting solutions, and component enclosures.
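On the host side, such a network-connected balance reduces to reading a weight stream and alerting when a dispense drifts out of tolerance. A sketch assuming a serial-attached bridge that prints one cumulative weight per dispense; the port name, message format, and tolerance are illustrative assumptions, not the published design:

```python
import serial  # pyserial

TARGET_UL = 5.0           # programmed dispense volume per well (assumed)
DENSITY_G_PER_UL = 0.001  # water, approximately 1 g/mL
TOLERANCE = 0.10          # flag dispenses more than 10% off target

def monitor(port="/dev/ttyACM0", n_dispenses=96):
    """Read per-dispense cumulative weights streamed by the balance bridge
    and flag out-of-tolerance dispenses in real time."""
    with serial.Serial(port, 9600, timeout=5) as link:
        previous_g = None
        for i in range(n_dispenses):
            line = link.readline().decode().strip()  # e.g. "12.3456"
            weight_g = float(line)
            if previous_g is not None:
                delivered_ul = (weight_g - previous_g) / DENSITY_G_PER_UL
                error = abs(delivered_ul - TARGET_UL) / TARGET_UL
                if error > TOLERANCE:
                    print(f"ALERT well {i}: {delivered_ul:.2f} uL dispensed")
            previous_g = weight_g

if __name__ == "__main__":
    monitor()
```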
Development of the automated circulating tumor cell recovery system with microcavity array.
Negishi, Ryo; Hosokawa, Masahito; Nakamura, Seita; Kanbara, Hisashige; Kanetomo, Masafumi; Kikuhara, Yoshihito; Tanaka, Tsuyoshi; Matsunaga, Tadashi; Yoshino, Tomoko
2015-05-15
Circulating tumor cells (CTCs) are well recognized as a useful biomarker for cancer diagnosis and a potential target of drug discovery for metastatic cancer. Efficient and precise recovery of extremely low concentrations of CTCs from blood has been required to increase the detection sensitivity. Here, an automated system equipped with a microcavity array (MCA) was demonstrated for highly efficient and reproducible CTC recovery. The use of MCA allows selective recovery of cancer cells from whole blood on the basis of differences in size between tumor and blood cells. Intra- and inter-assays revealed that the automated system achieved high efficiency and reproducibility equal to the assay manually performed by a well-trained operator. Under the optimized assay workflow, the automated system allows efficient and precise cell recovery for non-small cell lung cancer cells spiked in whole blood. The automated CTC recovery system will contribute to high-throughput analysis in further clinical studies on large cohorts of cancer patients.
A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*
Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing
2016-01-01
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569
Straight-Pore Microfilter with Efficient Regeneration
NASA Technical Reports Server (NTRS)
Liu, Han; LaConti, Anthony B.; McCallum. Thomas J.; Schmitt, Edwin W.
2010-01-01
A novel, high-efficiency gas particulate filter has precise particle size screening, low pressure drop, and a simple and fast regeneration process. The regeneration process, which requires minimal material and energy consumption, can be completely automated, and the filtration performance can be restored within a very short period of time. The filter is made of a novel composite material comprising the support structure and a novel coating.
Python Leap Second Management and Implementation of Precise Barycentric Correction (barycorrpy)
NASA Astrophysics Data System (ADS)
Kanodia, Shubham; Wright, Jason
2018-01-01
We announce barycorrpy (BCPy), a Python implementation to calculate precise barycentric corrections well below the 1 cm/s level, following the algorithm of Wright and Eastman (2014). This level of precision is required in the search for 1 Earth mass planets in the Habitable Zones of Sun-like stars by the Radial Velocity (RV) method, where the maximum semi-amplitude is about 9 cm/s. We have developed BCPy to be used in the pipeline for the next-generation Doppler spectrometers, the Habitable-zone Planet Finder (HPF) and NEID. In this work, we also develop an automated leap second management routine to improve upon the one available in Astropy. It checks for and downloads a new leap second file before converting from the UT time scale to TDB.
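A minimal usage sketch of the package's documented entry point; the target, epoch, and observatory below are illustrative, and network access is assumed for resolving the star and refreshing the leap second file:

```python
from barycorrpy import get_BC_vel

# Barycentric correction for an illustrative observation; leap_update=True
# engages the automated leap second management described above, checking
# for and downloading a fresh leap second file when needed.
result = get_BC_vel(
    JDUTC=2458000.5,      # observation time as a UTC Julian date
    starname="Tau Ceti",  # resolved online for coordinates and proper motion
    obsname="KPNO",       # observatory name resolved to lat/lon/alt
    leap_update=True,
)
print(result[0])  # barycentric correction in m/s
```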
Apollo experience report: Real-time display system
NASA Technical Reports Server (NTRS)
Sullivan, C. J.; Burbank, L. W.
1976-01-01
The real time display system used in the Apollo Program is described; the systematic organization of the system, which resulted from hardware/software trade-offs and the establishment of system criteria, is emphasized. Each basic requirement of the real time display system was met by a separate subsystem. The computer input multiplexer subsystem, the plotting display subsystem, the digital display subsystem, and the digital television subsystem are described. Also described are the automated display design and the generation of precision photographic reference slides required for the three display subsystems.
Automation of ⁹⁹Tc extraction by LOV prior ICP-MS detection: application to environmental samples.
Rodríguez, Rogelio; Leal, Luz; Miranda, Silvia; Ferrer, Laura; Avivar, Jessica; García, Ariel; Cerdà, Víctor
2015-02-01
A new, fast, automated and inexpensive sample pre-treatment method for (99)Tc determination by inductively coupled plasma-mass spectrometry (ICP-MS) is presented. The miniaturized approach is based on a lab-on-valve (LOV) system, allowing automatic separation and preconcentration of (99)Tc. Selectivity is provided by the solid phase extraction system used (TEVA resin), which selectively retains the pertechnetate ion in diluted nitric acid solution. The proposed system has some advantages such as minimization of sample handling, reduction of reagent volumes, and improvement of intermediate precision and sample throughput, offering a significant decrease of both time and cost per analysis in comparison to other flow techniques and batch methods. The proposed LOV system has been successfully applied to different samples of environmental interest (water and soil) with satisfactory recoveries, between 94% and 98%. The detection limit (LOD) of the developed method is 0.005 ng. The high durability of the resin and its low amount (32 mg), its good intermediate precision (RSD 3.8%) and repeatability (RSD 2%) and its high extraction frequency (up to 5 h(-1)) make this method an inexpensive, high-precision and fast tool for monitoring (99)Tc in environmental samples.
Degrande, Céline; Fuks, Benjamin; Hirschi, Valentin; ...
2015-05-05
We present for the first time the full automation of collider predictions matched with parton showers at the next-to-leading accuracy in QCD within nontrivial extensions of the standard model. The sole inputs required from the user are the model Lagrangian and the process of interest. As an application of the above, we explore scenarios beyond the standard model where new colored scalar particles can be pair produced in hadron collisions. Using simplified models to describe the new field interactions with the standard model, we present precision predictions for the LHC within the MadGraph5_aMC@NLO framework.
Bessemans, Laurent; Jully, Vanessa; de Raikem, Caroline; Albanese, Mathieu; Moniotte, Nicolas; Silversmet, Pascal; Lemoine, Dominique
2016-01-01
High-throughput screening technologies are increasingly integrated into the formulation development process of biopharmaceuticals. The performance of liquid handling systems is dependent on the ability to deliver accurate and precise volumes of specific reagents to ensure process quality. We have developed an automated gravimetric calibration procedure to adjust the accuracy and evaluate the precision of the TECAN Freedom EVO liquid handling system. Volumes from 3 to 900 µL using calibrated syringes and fixed tips were evaluated with various solutions, including aluminum hydroxide and phosphate adjuvants, β-casein, sucrose, sodium chloride, and phosphate-buffered saline. The methodology to set up liquid class pipetting parameters for each solution was to split the process in three steps: (1) screening of predefined liquid class, including different pipetting parameters; (2) adjustment of accuracy parameters based on a calibration curve; and (3) confirmation of the adjustment. The run of appropriate pipetting scripts, data acquisition, and reports until the creation of a new liquid class in EVOware was fully automated. The calibration and confirmation of the robotic system was simple, efficient, and precise and could accelerate data acquisition for a wide range of biopharmaceutical applications. PMID:26905719
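The accuracy-adjustment step described (step 2) reduces to fitting a calibration curve of measured versus programmed volume and inverting it for future dispenses. A sketch with invented gravimetric data, not the published calibration values:

```python
import numpy as np

# Gravimetric calibration pairs: programmed volume vs. volume actually
# delivered (measured by weight and liquid density). Values are illustrative.
programmed_ul = np.array([3, 10, 50, 150, 450, 900], dtype=float)
measured_ul = np.array([2.6, 9.1, 47.8, 146.2, 444.0, 893.5])

# Least-squares calibration line: measured = slope * programmed + intercept
slope, intercept = np.polyfit(programmed_ul, measured_ul, 1)

def corrected_setpoint(target_ul):
    """Programmed volume required for the handler to deliver target_ul."""
    return (target_ul - intercept) / slope

for target in (5.0, 100.0, 500.0):
    print(f"target {target:6.1f} uL -> program {corrected_setpoint(target):6.1f} uL")
```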
Generating a Magellanic star cluster catalog with ASteCA
NASA Astrophysics Data System (ADS)
Perren, G. I.; Piatti, A. E.; Vázquez, R. A.
2016-08-01
An increasing number of software tools have been employed in recent years for the automated or semi-automated processing of astronomical data. The main advantages of using these tools over a standard by-eye analysis include: speed (particularly for large databases), homogeneity, reproducibility, and precision. At the same time, they enable a statistically correct study of the uncertainties associated with the analysis, in contrast with manually set errors, or the still widespread practice of simply not assigning errors. We present a catalog comprising 210 star clusters located in the Large and Small Magellanic Clouds, observed with Washington photometry. Their fundamental parameters were estimated through a homogeneous, automated, and completely unassisted process via the Automated Stellar Cluster Analysis package (ASteCA). Our results are compared with two types of studies on these clusters: one where the photometry is the same, and another where the photometric system is different from that employed by ASteCA.
NASA Astrophysics Data System (ADS)
Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.
2014-05-01
MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces the laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Nebraska, Omaha). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability, and recovery, with low blanks and carry-over for samples in a variety of different matrices, have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Nebraska, Omaha) as a powerful tool in radiogenic and non-traditional isotope research.
Validation of an automated system for aliquoting of HIV-1 Env-pseudotyped virus stocks.
Schultz, Anke; Germann, Anja; Fuss, Martina; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A; Montefiori, David C; Zimmermann, Heiko; von Briesen, Hagen
2018-01-01
The standardized assessments of HIV-specific immune responses are of main interest in the preclinical and clinical stage of HIV-1 vaccine development. In this regard, HIV-1 Env-pseudotyped viruses play a central role for the evaluation of neutralizing antibody profiles and are produced according to Good Clinical Laboratory Practice- (GCLP-) compliant manual and automated procedures. To further improve and complete the automated production cycle, an automated system for aliquoting HIV-1 pseudovirus stocks has been implemented. The automation platform consists of a modified Tecan-based system including a robot platform for handling racks containing 48 cryovials, a Decapper, a tubing pump and a safety device consisting of ultrasound sensors for online liquid level detection of each individual cryovial. With the aim of aliquoting the HIV-1 pseudoviruses in an automated manner under GCLP-compliant conditions, a validation plan was developed in which the acceptance criteria (accuracy, precision, specificity, and robustness) were defined and summarized. By passing the validation experiments described in this article, the automated system for aliquoting has been successfully validated. This allows the standardized and operator-independent distribution of small-scale and bulk amounts of HIV-1 pseudovirus stocks with a precise and reproducible outcome to support upcoming clinical vaccine trials.
Self-optimizing approach for automated laser resonator alignment
NASA Astrophysics Data System (ADS)
Brecher, C.; Schmitt, R.; Loosen, P.; Guerrero, V.; Pyschny, N.; Pavim, A.; Gatej, A.
2012-02-01
Nowadays, the assembly of laser systems is dominated by manual operations, involving elaborate alignment by means of adjustable mountings. From a competition perspective, the most challenging problem in laser source manufacturing is price pressure, a result of cost competition exerted mainly from Asia. From an economical point of view, an automated assembly of laser systems defines a better approach to produce more reliable units at lower cost. However, the step from today's manual solutions towards an automated assembly requires parallel developments regarding product design, automation equipment and assembly processes. This paper introduces briefly the idea of self-optimizing technical systems as a new approach towards highly flexible automation. Technically, the work focuses on the precision assembly of laser resonators, which is one of the final and most crucial assembly steps in terms of beam quality and laser power. The paper presents a new design approach for miniaturized laser systems and new automation concepts for a robot-based precision assembly, as well as passive and active alignment methods, which are based on a self-optimizing approach. Very promising results have already been achieved, considerably reducing the duration and complexity of the laser resonator assembly. These results as well as future development perspectives are discussed.
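An active alignment loop of the general kind described can be sketched as a coordinate hill-climb that maximizes measured output power over mirror tilt; the power map below is a synthetic stand-in for a photodiode reading, and the step sizes are illustrative:

```python
import numpy as np

def measure_power(tilt_x, tilt_y):
    """Stand-in for the photodiode reading of resonator output power:
    a smooth peak at an (unknown to the optimizer) optimal mirror tilt."""
    return np.exp(-((tilt_x - 0.12) ** 2 + (tilt_y + 0.05) ** 2) / 0.01)

def align(step=0.02, shrink=0.5, min_step=1e-4):
    """Coordinate hill-climb: nudge each tilt axis, keep moves that raise
    output power, and shrink the step when no move helps."""
    x = y = 0.0
    best = measure_power(x, y)
    while step > min_step:
        improved = False
        for dx, dy in ((step, 0), (-step, 0), (0, step), (0, -step)):
            p = measure_power(x + dx, y + dy)
            if p > best:
                x, y, best, improved = x + dx, y + dy, p, True
        if not improved:
            step *= shrink
    return x, y, best

print(align())  # converges near the simulated optimum (0.12, -0.05)
```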
Timeliner: Automating Procedures on the ISS
NASA Technical Reports Server (NTRS)
Brown, Robert; Braunstein, E.; Brunet, Rick; Grace, R.; Vu, T.; Zimpfer, Doug; Dwyer, William K.; Robinson, Emily
2002-01-01
Timeliner has been developed as a tool to automate procedural tasks. These tasks may be sequential tasks that would typically be performed by a human operator, or precisely ordered sequencing tasks that allow autonomous execution of a control process. The Timeliner system includes elements for compiling and executing sequences that are defined in the Timeliner language. The Timeliner language was specifically designed to allow easy definition of scripts that provide sequencing and control of complex systems. The execution environment provides real-time monitoring and control based on the commands and conditions defined in the Timeliner language. The Timeliner sequence control may be preprogrammed, compiled from Timeliner "scripts," or it may consist of real-time, interactive inputs from system operators. In general, the Timeliner system lowers the workload for mission or process control operations. In a mission environment, scripts can be used to automate spacecraft operations including autonomous or interactive vehicle control, performance of preflight and post-flight subsystem checkouts, or handling of failure detection and recovery. Timeliner may also be used for mission payload operations, such as stepping through pre-defined procedures of a scientific experiment.
Harder, Nathalie; Mora-Bermúdez, Felipe; Godinez, William J; Wünsche, Annelie; Eils, Roland; Ellenberg, Jan; Rohr, Karl
2009-11-01
Live-cell imaging allows detailed dynamic cellular phenotyping for cell biology and, in combination with small molecule or drug libraries, for high-content screening. Fully automated analysis of live cell movies has been hampered by the lack of computational approaches that allow tracking and recognition of individual cell fates over time in a precise manner. Here, we present a fully automated approach to analyze time-lapse movies of dividing cells. Our method dynamically categorizes cells into seven phases of the cell cycle and five aberrant morphological phenotypes over time. It reliably tracks cells and their progeny and can thus measure the length of mitotic phases and detect cause and effect if mitosis goes awry. We applied our computational scheme to annotate mitotic phenotypes induced by RNAi gene knockdown of CKAP5 (also known as ch-TOG) or by treatment with the drug nocodazole. Our approach can be readily applied to comparable assays aiming at uncovering the dynamic cause of cell division phenotypes.
Campestrini, J; Lecaillon, J B; Godbillon, J
1997-12-19
An automated high-performance liquid chromatography (HPLC) method for the determination of formoterol in human plasma with improved sensitivity has been developed and validated. Formoterol and CGP 47086, the internal standard, were extracted from plasma (1 ml) using a cation-exchange solid-phase extraction (SPE) cartridge. The compounds were eluted with pH 6 buffer solution-methanol (70:30, v/v) and the eluate was further diluted with water. An aliquot of the extract solution was injected and analyzed by HPLC. The extraction, dilution, injection and chromatographic analysis were combined and automated using the ASPEC automated sample-preparation system. The chromatographic separations were achieved on a 5 microm, Hypersil ODS analytical column (200 mm x 3 mm I.D.), using (pH 6 phosphate buffer, 0.035 M + 20 mg/l EDTA)-MeOH-CH3CN (70:25:5, v/v/v) as the mobile phase at a flow-rate of 0.4 ml/min. The analytes were detected with electrochemical detection at an operating potential of +0.63 V. Intra-day accuracy and precision were assessed from the relative recoveries of calibration/quality control plasma samples in the concentration range of 7.14 to 238 pmol/l of formoterol base. The accuracy over the entire concentration range varied from 81 to 105%, and the precision (C.V.) ranged from 3 to 14%. Inter-day accuracy and precision were assessed in the concentration range of 11.9 to 238 pmol/l of formoterol base in plasma. The accuracy over the entire concentration range varied from 98 to 109%, and precision ranged from 8 to 19%. At the limit of quantitation (LOQ) of 11.9 pmol/l for inter-day measurements, the recovery value was 109% and C.V. was 19%. As shown from intra-day accuracy and precision results, favorable conditions (a newly used column, a newly washed detector cell and a moderate residual cell current level) allowed us to reach a LOQ of 7.14 pmol/l of formoterol base (3 pg/ml of formoterol fumarate dihydrate). Improvement of the limit of detection by a factor of about 10 was reached as compared to the previously described methods. The method has been applied for quantifying formoterol in plasma after 120 microg drug inhalation to volunteers. Formoterol was still measurable at 24 h post-dosing in most subjects and a slow elimination of formoterol from plasma beyond 6-8 h after inhalation was demonstrated for the first time thanks to the sensitivity of the method.
Ding, Huiyang; Shi, Chaoyang; Ma, Li; Yang, Zhan; Wang, Mingyu; Wang, Yaqiong; Chen, Tao; Sun, Lining; Toshio, Fukuda
2018-04-08
The maneuvering and electrical characterization of nanotubes inside a scanning electron microscope (SEM) has historically been time-consuming and laborious for operators. Before the development of automated nanomanipulation techniques for pick-and-place and characterization of nanoobjects, these functions were incomplete and largely performed manually. In this paper, a dual-probe nanomanipulation system with vision-based feedback was demonstrated to automatically perform 3D nanomanipulation tasks, to investigate the electrical characterization of nanotubes. The XY-position of Atomic Force Microscope (AFM) cantilevers and individual carbon nanotubes (CNTs) were precisely recognized via a series of image processing operations. A coarse-to-fine positioning strategy in the Z-direction was applied through the combination of the sharpness-based depth estimation method and the contact-detection method. The use of nanorobotic magnification-regulated speed aided in improving working efficiency and reliability. Additionally, we proposed automated alignment of manipulator axes by visually tracking the movement trajectory of the end effector. The experimental results indicate the system's capability for automated measurement of the electrical characteristics of CNTs. Furthermore, the automated nanomanipulation system has the potential to be extended to other nanomanipulation tasks.
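Sharpness-based depth estimation for coarse Z positioning is commonly implemented as a focus measure such as the variance of the Laplacian; whether the authors used this exact measure is not stated, so the sketch below is only illustrative:

```python
import cv2
import numpy as np

def sharpness(image_gray):
    """Focus measure: variance of the Laplacian. The in-focus Z position
    maximizes this score as the focal plane sweeps past the probe tip."""
    return cv2.Laplacian(image_gray, cv2.CV_64F).var()

def coarse_z_estimate(image_stack, z_positions):
    """Coarse positioning step: pick the Z with the sharpest image; a
    contact-detection routine would then refine the final approach."""
    scores = [sharpness(img) for img in image_stack]
    return z_positions[int(np.argmax(scores))]

# Synthetic stand-in stack: the same frame progressively blurred,
# mimicking defocus on either side of the focal plane.
rng = np.random.default_rng(3)
focused = (rng.random((128, 128)) * 255).astype(np.uint8)
stack = [cv2.GaussianBlur(focused, (0, 0), s) for s in (5, 2, 0.01, 2, 5)]
print(coarse_z_estimate(stack, z_positions=[-20, -10, 0, 10, 20]))
```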
Automated characterization and assembly of individual nanowires for device fabrication.
Yu, Kaiyan; Yi, Jingang; Shan, Jerry W
2018-05-15
The automated sorting and positioning of nanowires and nanotubes is essential to enabling the scalable manufacturing of nanodevices for a variety of applications. However, two fundamental challenges still remain: (i) automated placement of individual nanostructures in precise locations, and (ii) the characterization and sorting of highly variable nanomaterials to construct well-controlled nanodevices. Here, we propose and demonstrate an integrated, electric-field based method for the simultaneous automated characterization, manipulation, and assembly of nanowires (ACMAN) with selectable electrical conductivities into nanodevices. We combine contactless and solution-based electro-orientation spectroscopy and electrophoresis-based motion-control, planning and manipulation strategies to simultaneously characterize and manipulate multiple individual nanowires. These nanowires can be selected according to their electrical characteristics and precisely positioned at different locations in a low-conductivity liquid to form functional nanodevices with desired electrical properties. We validate the ACMAN design by assembling field-effect transistors (FETs) with silicon nanowires of selected electrical conductivities. The design scheme provides a key enabling technology for the scalable, automated sorting and assembly of nanowires and nanotubes to build functional nanodevices.
Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu
2014-01-01
As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248
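The realignment step, recognizing a fabricated alignment mark and shifting the chamber back to its preset position, can be sketched with normalized cross-correlation template matching; the mark geometry and coordinates below are invented:

```python
import cv2
import numpy as np

def chamber_offset(frame_gray, mark_template, expected_xy):
    """Locate the fabricated alignment mark by normalized cross-correlation
    and return the (dx, dy) shift needed to restore the preset position."""
    result = cv2.matchTemplate(frame_gray, mark_template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)  # (x, y) of the best match
    return tuple(np.array(expected_xy) - np.array(max_loc))

# Synthetic stand-in frame with a bright cross as the alignment mark.
frame = np.zeros((200, 200), np.uint8)
frame[95:105, 140:170] = 255   # horizontal bar of the mark
frame[80:120, 152:158] = 255   # vertical bar of the mark
template = frame[75:125, 135:175].copy()

dx, dy = chamber_offset(frame, template, expected_xy=(120, 60))
print(f"move stage by dx={dx}, dy={dy} before capturing the image")
```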
Janiszewski, J; Schneider, P; Hoffmaster, K; Swyden, M; Wells, D; Fouda, H
1997-01-01
The development and application of membrane solid phase extraction (SPE) in 96-well microtiter plate format is described for the automated analysis of drugs in biological fluids. The small bed volume of the membrane allows elution of the analyte in a very small solvent volume, permitting direct HPLC injection and negating the need for the time-consuming solvent evaporation step. A programmable liquid handling station (Quadra 96) was modified to automate all SPE steps. To avoid drying of the SPE bed and to enhance the analytical precision, a novel protocol for performing the condition, load and wash steps in rapid succession was utilized. A block of 96 samples can now be extracted in 10 min, about 30 times faster than manual solvent extraction or single cartridge SPE methods. This processing speed complements the high-throughput speed of contemporary high performance liquid chromatography mass spectrometry (HPLC/MS) analysis. The quantitative analysis of a test analyte (Ziprasidone) in plasma demonstrates the utility and throughput of membrane SPE in combination with HPLC/MS. The results obtained with the current automated procedure compare favorably with those obtained using solvent and traditional solid phase extraction methods. The method has been used for the analysis of numerous drug prototypes in biological fluids to support drug discovery efforts.
NASA Astrophysics Data System (ADS)
Gong, X.; Wu, Q.
2017-12-01
Network virtual instrument (VI) technology is a new direction in automated test. Based on LabVIEW, a software and hardware VI system for the emission spectrum of pulsed high-voltage direct current (DC) discharge was developed and applied to investigate pulsed high-voltage DC discharge of nitrogen. The system realizes various functions, including real-time collection of the nitrogen emission spectrum, monitoring of instrument operating states, and real-time analysis and processing of data. Using shared variables and DataSocket technology in LabVIEW, a network VI system based on the field VI was established. The system can acquire the emission spectrum of nitrogen at the test site, monitor the operating states of field instruments, enable real-time interchange between the two sites, and analyze data remotely from the network terminal. Using the network VI system, staff at the two sites acquired the same nitrogen emission spectrum and communicated in real time. Comparison with previous results shows that the experimental data obtained with the system are highly precise, implying that the system offers reliable network stability and safety and satisfies the requirements for studying the emission spectrum of pulsed high-voltage discharge in high-precision settings or from network terminals. The proposed architecture is described and should be instructive to remote engineering users, specifically in control- and automation-related tasks.
Automated GC-MS analysis of free amino acids in biological fluids.
Kaspar, Hannelore; Dettmer, Katja; Gronwald, Wolfram; Oefner, Peter J
2008-07-15
A gas chromatography-mass spectrometry (GC-MS) method was developed for the quantitative analysis of free amino acids as their propyl chloroformate derivatives in biological fluids. Derivatization with propyl chloroformate is carried out directly in the biological samples without prior protein precipitation or solid-phase extraction of the amino acids, thereby allowing automation of the entire procedure, including addition of reagents, extraction and injection into the GC-MS. The total analysis time was 30 min and 30 amino acids could be reliably quantified using 19 stable isotope-labeled amino acids as internal standards. Limits of detection (LOD) and lower limits of quantification (LLOQ) were in the range of 0.03-12 microM and 0.3-30 microM, respectively. The method was validated using a certified amino acid standard and reference plasma, and its applicability to different biological fluids was shown. Intra-day precision for the analysis of human urine, blood plasma, and cell culture medium was 2.0-8.8%, 0.9-8.3%, and 2.0-14.3%, respectively, while the inter-day precision for human urine was 1.5-14.1%.
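As a worked illustration of quantification against stable isotope-labeled internal standards, the following minimal sketch back-calculates a concentration from an analyte-to-internal-standard peak-area ratio using a linear calibration; all numbers are invented, not the paper's data.

```python
# Minimal sketch of isotope-dilution calibration: the analyte/IS area ratio is
# fit against known standard concentrations, then used for back-calculation.
import numpy as np

cal_conc  = np.array([1.0, 5.0, 25.0, 100.0])   # calibration standards (µM)
cal_ratio = np.array([0.05, 0.26, 1.24, 5.10])  # analyte/IS peak-area ratios

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)  # linear response model

def quantify(sample_ratio: float) -> float:
    """Back-calculate concentration (µM) from an analyte/IS area ratio."""
    return (sample_ratio - intercept) / slope

print(round(quantify(0.51), 1))  # ~10 µM for these synthetic values
```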
The Automation and Exoplanet Orbital Characterization from the Gemini Planet Imager Exoplanet Survey
NASA Astrophysics Data System (ADS)
Wang, Jason Jinfei; Graham, James; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry; Kalas, Paul; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Ruffio, Jean-Baptiste; Sivaramakrishnan, Anand; Gemini Planet Imager Exoplanet Survey Collaboration
2018-01-01
The Gemini Planet Imager (GPI) Exoplanet Survey (GPIES) is a multi-year 600-star survey to discover and characterize young Jovian exoplanets and their planet forming environments. For large surveys like GPIES, it is critical to have a uniform dataset processed with the latest techniques and calibrations. I will describe the GPI Data Cruncher, an automated data processing framework that is able to generate fully reduced data minutes after the data are taken and can also reprocess the entire campaign in a single day on a supercomputer. The Data Cruncher integrates into a larger automated data processing infrastructure which syncs, logs, and displays the data. I will discuss the benefits of the GPIES data infrastructure, including optimizing observing strategies, finding planets, characterizing instrument performance, and constraining giant planet occurrence. I will also discuss my work in characterizing the exoplanets we have imaged in GPIES through monitoring their orbits. Using advanced data processing algorithms and GPI's precise astrometric calibration, I will show that GPI can achieve one milliarcsecond astrometry on the extensively-studied planet Beta Pic b. With GPI, we can confidently rule out a possible transit of Beta Pic b, but have precise timings on a Hill sphere transit, and I will discuss efforts to search for transiting circumplanetary material this year. I will also discuss the orbital monitoring of other exoplanets as part of GPIES.
Survey and Method for Determination of Trajectory Predictor Requirements
NASA Technical Reports Server (NTRS)
Rentas, Tamika L.; Green, Steven M.; Cate, Karen Tung
2009-01-01
A survey of air-traffic-management researchers, representing a broad range of automation applications, was conducted to document trajectory-predictor requirements for future decision-support systems. Results indicated that the researchers were unable to articulate a basic set of trajectory-prediction requirements for their automation concepts. Survey responses showed the need to establish a process to help developers determine the trajectory-predictor-performance requirements for their concepts. Two methods for determining trajectory-predictor requirements are introduced. A fast-time simulation method is discussed that captures the sensitivity of a concept to the performance of its trajectory-prediction capability. A characterization method is proposed to provide quicker, yet less precise, results, based on analysis and simulation to characterize the trajectory-prediction errors associated with key modeling options for a specific concept. Concept developers can then identify the relative sizes of errors associated with key modeling options, and qualitatively determine which options lead to significant errors. The characterization method is demonstrated for a case study involving future airport surface traffic management automation. Of the top four sources of error, results indicated that the error associated with accelerations to and from turn speeds was unacceptable, the error associated with the turn path model was acceptable, and the error associated with taxi-speed estimation was of concern and needed a higher-fidelity concept simulation to obtain a more precise result.
The BAARA (Biological AutomAted RAdiotracking) System: A New Approach in Ecological Field Studies
Řeřucha, Šimon; Bartonička, Tomáš; Jedlička, Petr; Čížek, Martin; Hlouša, Ondřej; Lučan, Radek; Horáček, Ivan
2015-01-01
Radiotracking is an important and often the only possible method to explore specific habits and the behaviour of animals, but it has proven to be very demanding and time-consuming, especially when frequent positioning of a large group is required. Our aim was to address this issue by making the process partially automated, to mitigate the demands and related costs. This paper presents a novel automated tracking system that consists of a network of automated tracking stations deployed within the target area. Each station reads the signals from telemetry transmitters, estimates the bearing and distance of the tagged animals and records their position. The station is capable of tracking a theoretically unlimited number of transmitters on different frequency channels with a period of 5–15 seconds per channel. An ordinary transmitter that fits within the supported frequency band might be used with BAARA (Biological AutomAted RAdiotracking); an extra option is the use of a custom-programmable transmitter with configurable operational parameters, such as the precise frequency channel or the transmission parameters. This new approach to a tracking system was tested for its applicability in a series of field and laboratory tests. BAARA has been tested within fieldwork explorations of Rousettus aegyptiacus during field trips to the Dakhla oasis in Egypt. The results illustrate the novel perspective which automated radiotracking opens for the study of spatial behaviour, particularly in addressing topics in the domain of population ecology. PMID:25714910
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierce, Karisa M.; Wright, Bob W.; Synovec, Robert E.
2007-02-02
First, simulated chromatographic separations with declining retention time precision were used to study the performance of the piecewise retention time alignment algorithm and to demonstrate an unsupervised parameter optimization method. The average correlation coefficient between the first chromatogram and every other chromatogram in the data set was used to optimize the alignment parameters. This correlation method does not require a training set, so it is unsupervised and automated. This frees the user from needing to provide class information and makes the alignment algorithm more generally applicable to classifying completely unknown data sets. For a data set of simulated chromatograms where the average chromatographic peak was shifted past two neighboring peaks between runs, the average correlation coefficient of the raw data was 0.46 ± 0.25. After automated, optimized piecewise alignment, the average correlation coefficient was 0.93 ± 0.02. Additionally, a relative shift metric and principal component analysis (PCA) were used to independently quantify and categorize the alignment performance, respectively. The relative shift metric was defined as four times the standard deviation of a given peak’s retention time in all of the chromatograms, divided by the peak-width-at-base. The raw simulated data sets that were studied contained peaks with average relative shifts ranging between 0.3 and 3.0. Second, a “real” data set of gasoline separations was gathered using three different GC methods to induce severe retention time shifting. In these gasoline separations, retention time precision improved ~8 fold following alignment. Finally, piecewise alignment and the unsupervised correlation optimization method were applied to severely shifted GC separations of reformate distillation fractions. The effect of piecewise alignment on peak heights and peak areas is also reported. Piecewise alignment either did not change the peak height, or caused it to slightly decrease. The average relative difference in peak height after piecewise alignment was –0.20%. Piecewise alignment caused the peak areas to either stay the same, slightly increase, or slightly decrease. The average absolute relative difference in area after piecewise alignment was 0.15%.
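The two figures of merit described above are simple to state in code. The sketch below computes the unsupervised alignment-quality score (the mean correlation of every chromatogram with the first) and the relative shift metric; array shapes and values are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of the two alignment metrics: average correlation with the
# first chromatogram, and relative shift = 4*std(retention time)/base width.
import numpy as np

def avg_correlation(chroms: np.ndarray) -> float:
    """Mean Pearson r between chromatogram 0 and every other chromatogram.

    chroms: array of shape (n_runs, n_points)."""
    ref = chroms[0]
    return float(np.mean([np.corrcoef(ref, c)[0, 1] for c in chroms[1:]]))

def relative_shift(rt_per_run: np.ndarray, peak_width_at_base: float) -> float:
    """Four times the std of one peak's retention times across runs,
    divided by the peak width at base."""
    return 4.0 * float(np.std(rt_per_run)) / peak_width_at_base
```

The unsupervised optimization then simply evaluates `avg_correlation` for each candidate alignment parameter set and keeps the maximizer.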
NASA Astrophysics Data System (ADS)
Lu, Yiqing; Xi, Peng; Piper, James A.; Huo, Yujing; Jin, Dayong
2012-11-01
We report a new development of orthogonal scanning automated microscopy (OSAM) incorporating time-gated detection to locate rare-event organisms regardless of autofluorescent background. The necessity of using long-lifetime (hundreds of microseconds) luminescent biolabels for time-gated detection implies long integration (dwell) time, resulting in slow scan speed. However, here we achieve high scan speed using a new 2-step orthogonal scanning strategy to realise on-the-fly time-gated detection and precise location of 1-μm lanthanide-doped microspheres with signal-to-background ratio of 8.9. This enables analysis of a 15 mm × 15 mm slide area in only 3.3 minutes. We demonstrate that detection of only a few hundred photoelectrons within 100 μs is sufficient to distinguish a target event in a prototype system using ultraviolet LED excitation. Cytometric analysis of lanthanide labelled Giardia cysts achieved a signal-to-background ratio of two orders of magnitude. Results suggest that time-gated OSAM represents a new opportunity for high-throughput background-free biosensing applications.
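The decision logic of time-gated detection can be summarized in a few lines. The following minimal sketch assumes photon arrival times relative to the excitation pulse are available; gate positions and the threshold are placeholders, though the paper reports that a few hundred photoelectrons within 100 µs sufficed in the prototype.

```python
# Minimal sketch of time-gated event detection: count photons in a delayed
# gate (suppressing short-lifetime autofluorescence) and threshold the count.
import numpy as np

def gated_count(arrival_us: np.ndarray,
                gate_start_us: float = 10.0,
                gate_end_us: float = 110.0) -> int:
    """Photons arriving inside the delayed gate after the excitation pulse."""
    in_gate = (arrival_us >= gate_start_us) & (arrival_us <= gate_end_us)
    return int(np.sum(in_gate))

def is_target(arrival_us: np.ndarray, threshold_counts: int = 300) -> bool:
    """Flag a target event when the gated count exceeds the threshold."""
    return gated_count(arrival_us) >= threshold_counts
```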
SONG-China Project: A Global Automated Observation Network
NASA Astrophysics Data System (ADS)
Yang, Z. Z.; Lu, X. M.; Tian, J. F.; Zhuang, C. G.; Wang, K.; Deng, L. C.
2017-09-01
Driven by advancements in technology and scientific objectives, data acquisition in observational astronomy has changed greatly in recent years. Fully automated or even autonomous ground-based telescope networks have now become a trend for time-domain observing projects. The Stellar Observations Network Group (SONG) is an international collaboration with the participation and contribution of the Chinese astronomy community. The scientific goal of SONG is time-domain astrophysics such as asteroseismology and open cluster research. The SONG project aims to build a global network of 1 m telescopes equipped with high-precision and high-resolution spectrographs, and two-channel lucky-imaging cameras. A Chinese initiative is to install a 50 cm binocular photometric telescope at each SONG node, sharing the network platform and infrastructure. This work focuses on the design and implementation, in technology and methodology, of SONG/50BiN, a typical ground-based network composed of multiple sites and a variety of instruments.
van der Laak, Jeroen A W M; Dijkman, Henry B P M; Pahlplatz, Martin M M
2006-03-01
The magnification factor in transmission electron microscopy is not very precise, hampering for instance quantitative analysis of specimens. Calibration of the magnification is usually performed interactively using replica specimens, containing line or grating patterns with known spacing. In the present study, a procedure is described for automated magnification calibration using digital images of a line replica. This procedure is based on analysis of the power spectrum of Fourier transformed replica images, and is compared to interactive measurement in the same images. Images were used with magnification ranging from 1,000 x to 200,000 x. The automated procedure deviated on average 0.10% from interactive measurements. Especially for catalase replicas, the coefficient of variation of automated measurement was considerably smaller (average 0.28%) compared to that of interactive measurement (average 3.5%). In conclusion, calibration of the magnification in digital images from transmission electron microscopy may be performed automatically, using the procedure presented here, with high precision and accuracy.
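The core of such a Fourier-based calibration is locating the dominant spatial frequency of the line pattern. A minimal sketch follows, assuming the replica lines run vertically in the image; function and parameter names are illustrative, not the published procedure.

```python
# Minimal sketch: measure the line-replica period from the power spectrum of
# a 1D projection, then convert pixel spacing to a magnification factor.
import numpy as np

def line_spacing_pixels(img: np.ndarray) -> float:
    """Dominant line period (pixels) from the projected power spectrum."""
    profile = img.mean(axis=0) - img.mean()      # project along the lines
    power = np.abs(np.fft.rfft(profile)) ** 2
    freqs = np.fft.rfftfreq(profile.size)        # cycles per pixel
    k = 1 + int(np.argmax(power[1:]))            # skip the DC term
    return 1.0 / freqs[k]

def magnification(img: np.ndarray,
                  replica_spacing_nm: float,
                  detector_pixel_nm: float) -> float:
    """Magnification = image-plane spacing / certified object spacing."""
    return line_spacing_pixels(img) * detector_pixel_nm / replica_spacing_nm
```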
Performance evaluation of the automated nucleated red blood cell enumeration on Sysmex XN analyser.
Tantanate, C; Klinbua, C
2015-06-01
The presence of nucleated red blood cells (NRBCs) in peripheral blood is associated with pathological conditions and leads to overestimation of the white blood cell count in automated haematology analysers (HA). The authors evaluated NRBC enumeration by a new HA, the Sysmex XN (XN), to demonstrate its precision and comparability to the manual count (MC) at various NRBC values. Specimens that initially tested NRBC-positive were included. For precision assessment, 8 levels of NRBCs were repeatedly analysed. For the comparison study, 234 specimens were analysed by both XN and MC. In the precision study, the coefficient of variation ranged from 14% to 45.6% for MC and from 1.2% to 4.4% for XN. In the comparison study between XN and MC, NRBCs ranged from 0% to 612.5%. Regression analysis demonstrated an r(2) of 0.98. A mean bias of 14.1% with 95% limits of agreement between -48.76% and 76.95% was found. The NRBC counts from XN were more in accordance with MC when the NRBCs were lower than 200%, with a concordance rate of 94.2%. The automated NRBC enumeration by XN was precise and could replace the traditional MC, especially for specimens with NRBCs lower than 200%. © 2014 John Wiley & Sons Ltd.
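For readers unfamiliar with the comparison statistics quoted above, the following minimal sketch computes the mean bias and 95% limits of agreement (Bland-Altman) between paired automated and manual counts; inputs are assumed NumPy arrays, not the study's data.

```python
# Minimal sketch of Bland-Altman agreement statistics for paired counts.
import numpy as np

def bland_altman(automated: np.ndarray, manual: np.ndarray):
    """Return (mean bias, (lower, upper) 95% limits of agreement)."""
    diff = automated - manual
    bias = float(np.mean(diff))
    sd = float(np.std(diff, ddof=1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```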
Development and operation of a high-throughput accurate-wavelength lens-based spectrometer
Bell, Ronald E.
2014-07-11
A high-throughput spectrometer for the 400-820 nm wavelength range has been developed for charge exchange recombination spectroscopy or general spectroscopy. A large 2160 mm(-1) grating is matched with fast f/1.8 200 mm lenses, which provide stigmatic imaging. A precision optical encoder measures the grating angle with an accuracy ≤ 0.075 arc seconds. A high quantum efficiency, low-etaloning CCD detector allows operation at longer wavelengths. A patch panel allows input fibers to interface with interchangeable fiber holders that attach to a kinematic mount behind the entrance slit. The computer-controlled hardware allows automated control of wavelength, timing, and f-number, as well as automated data collection and wavelength calibration.
Graphic overlays in high-precision teleoperation: Current and future work at JPL
NASA Technical Reports Server (NTRS)
Diner, Daniel B.; Venema, Steven C.
1989-01-01
In space teleoperation additional problems arise, including signal transmission time delays. These can greatly reduce operator performance. Recent advances in graphics open new possibilities for addressing these and other problems. Currently a multi-camera system with normal 3-D TV and video graphics capabilities is being developed. Trained and untrained operators will be tested for high precision performance using two force reflecting hand controllers and a voice recognition system to control two robot arms and up to 5 movable stereo or non-stereo TV cameras. A number of new techniques of integrating TV and video graphics displays to improve operator training and performance in teleoperation and supervised automation are evaluated.
Kuepper, Claus; Kallenbach-Thieltges, Angela; Juette, Hendrik; Tannapfel, Andrea; Großerueschkamp, Frederik; Gerwert, Klaus
2018-05-16
A feasibility study using a quantum cascade laser-based infrared microscope for the rapid and label-free classification of colorectal cancer tissues is presented. Infrared imaging is a reliable, robust, automated, and operator-independent tissue classification method that has been used for differential classification of tissue thin sections, identifying tumorous regions. However, the long acquisition times of the FT-IR-based microscopes used so far have hampered the clinical translation of this technique. Here, the quantum cascade laser-based microscope provides infrared images for precise tissue classification within a few minutes. We analyzed 110 patients with UICC-Stage II and III colorectal cancer, showing 96% sensitivity and 100% specificity of this label-free method as compared to histopathology, the gold standard in routine clinical diagnostics. The main hurdle for the clinical translation of IR imaging is now overcome by the short acquisition time for high-quality diagnostic images, which is in the same time range as frozen sections by pathologists.
Flight Test Results from Real-Time Relative Global Positioning System Flight Experiment on STS-69
NASA Technical Reports Server (NTRS)
Park, Young W.; Brazzel, Jack P., Jr.; Carpenter, J. Russell; Hinkel, Heather D.; Newman, James H.
1996-01-01
A real-time global positioning system (GPS) Kalman filter has been developed to support automated rendezvous with the International Space Station (ISS). The filter is integrated with existing Shuttle rendezvous software running on a 486 laptop computer under Windows. In this work, we present real-time and postflight results achieved with the filter on STS-69. The experiment used GPS data from an Osborne/Jet Propulsion Laboratory TurboRogue receiver carried on the Wake Shield Facility (WSF) free flyer and a Rockwell Collins 3M receiver carried on the Orbiter. Real-time filter results, processed onboard the Shuttle and replayed in near real-time on the ground, are based on single-vehicle mode operation and on 5 to 20 minute snapshots of telemetry provided by WSF for dual-vehicle mode operation. The Orbiter and WSF state vectors calculated using our filter compare favorably with precise reference orbits determined by the University of Texas Center for Space Research. The lessons learned from this experiment will be used in conjunction with future experiments to mitigate the technology risk posed by automated rendezvous and docking to the ISS.
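The heart of such a filter is the standard Kalman measurement update. The sketch below shows that update in generic form; the flight filter's actual state vector, dynamics model, and tuning are not public here, so everything beyond the textbook equations is an assumption.

```python
# Minimal sketch of the linear Kalman filter measurement update.
import numpy as np

def kalman_update(x, P, z, H, R):
    """x: state estimate, P: state covariance, z: measurement vector,
    H: measurement model, R: measurement noise covariance."""
    y = z - H @ x                            # innovation
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ y                        # updated state
    P_new = (np.eye(len(x)) - K @ H) @ P     # updated covariance
    return x_new, P_new
```

In a relative-navigation setting, the state would typically hold both vehicles' positions and velocities (or their difference), with GPS pseudoranges or position fixes entering through `z` and `H`.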
Baig, Ayaz; Siddiqui, Imran; Jabbar, Abdul; Azam, Syed Iqbal; Sabir, Salman; Alam, Shahryar; Ghani, Farooq
2007-01-01
To determine the accuracy, turnaround time and cost effectiveness of bedside monitoring of blood glucose levels by non-laboratory health care workers and centralized testing of blood glucose by automated analyzer in a tertiary care hospital. The study was conducted in the Section of Chemical Pathology, Department of Pathology and Microbiology, and the Section of Endocrinology, Department of Medicine, Aga Khan University Hospital, Karachi, from April 2005 to March 2006. One hundred and ten patients were included in the study. The blood glucose levels were analyzed on a glucometer (Precision, Abbott) by finger stick, using biosensor technology. At the same time venous blood was obtained to analyze glucose in the clinical laboratory on an automated analyzer (SYNCHRON CX7) by the glucose oxidase method. We observed good correlation between the bedside glucometer and the laboratory automated analyzer for glucose values between 3.3 mmol/L (60 mg/dl) and 16.7 mmol/L (300 mg/dl). A significant difference was observed for glucose values less than 3.3 mmol/L (p = 0.002) and glucose values more than 16.67 mmol/L (p = 0.049). Mean turnaround times for the glucometer and the automated analyzer were 0.08 hours and 2.49 hours, respectively. The cost of glucose testing with the glucometer was 48.8% lower than centralized lab-based testing. Bedside glucometer testing, though less expensive, does not have good accuracy in acutely ill patients with either very high or very low blood glucose levels.
Some Automated Cartography Developments at the Defense Mapping Agency.
1981-01-01
on a pantographic router, creating a laminate step model which was moulded in plaster for carving into a terrain model. This section will trace DMA's...offering economical automation. Precision flatbed Concord plotters were brought into DMA with sufficiently programmable control computers to perform these
Minifactory: a precision assembly system adaptable to the product life cycle
NASA Astrophysics Data System (ADS)
Muir, Patrick F.; Rizzi, Alfred A.; Gowdy, Jay W.
1997-12-01
Automated product assembly systems are traditionally designed with the intent that they will be operated with few significant changes for as long as the product is being manufactured. This approach to factory design and programming has many undesirable qualities which have motivated the development of more 'flexible' systems. In an effort to improve agility, different types of flexibility have been integrated into factory designs. Specifically, automated assembly systems have been endowed with the ability to assemble differing products by means of computer-controlled robots, and to accommodate variations in parts locations and dimensions by means of sensing. The product life cycle (PLC) is a standard four-stage model of the performance of a product from the time that it is first introduced in the marketplace until the time that it is discontinued. Manufacturers can improve their return on investment by adapting the production process to the PLC. We are developing two concepts to enable manufacturers to more readily achieve this goal: the agile assembly architecture (AAA), an abstract framework for distributed modular automation; and minifactory, our physical instantiation of this architecture for the assembly of precision electro-mechanical devices. By examining the requirements which each PLC stage places upon the production system, we identify characteristics of factory design and programming which are appropriate for that stage. As the product transitions from one stage to the next, the factory design and programming should also transition from one embodiment to the next in order to achieve the best return on investment. Modularity of the factory components, highly flexible product transport mechanisms, and a high level of distributed intelligence are key characteristics of minifactory that enable this adaptation.
Precision and Disclosure in Text and Voice Interviews on Smartphones.
Schober, Michael F; Conrad, Frederick G; Antoun, Christopher; Ehlen, Patrick; Fail, Stefanie; Hupp, Andrew L; Johnston, Michael; Vickers, Lucas; Yan, H Yanna; Zhang, Chan
2015-01-01
As people increasingly communicate via asynchronous non-spoken modes on mobile devices, particularly text messaging (e.g., SMS), longstanding assumptions and practices of social measurement via telephone survey interviewing are being challenged. In the study reported here, 634 people who had agreed to participate in an interview on their iPhone were randomly assigned to answer 32 questions from US social surveys via text messaging or speech, administered either by a human interviewer or by an automated interviewing system. 10 interviewers from the University of Michigan Survey Research Center administered voice and text interviews; automated systems launched parallel text and voice interviews at the same time as the human interviews were launched. The key question was how the interview mode affected the quality of the response data, in particular the precision of numerical answers (how many were not rounded), variation in answers to multiple questions with the same response scale (differentiation), and disclosure of socially undesirable information. Texting led to higher quality data-fewer rounded numerical answers, more differentiated answers to a battery of questions, and more disclosure of sensitive information-than voice interviews, both with human and automated interviewers. Text respondents also reported a strong preference for future interviews by text. The findings suggest that people interviewed on mobile devices at a time and place that is convenient for them, even when they are multitasking, can give more trustworthy and accurate answers than those in more traditional spoken interviews. The findings also suggest that answers from text interviews, when aggregated across a sample, can tell a different story about a population than answers from voice interviews, potentially altering the policy implications from a survey.
The NANOGrav 11-year Data Set: High-precision Timing of 45 Millisecond Pulsars
NASA Astrophysics Data System (ADS)
Arzoumanian, Zaven; Brazier, Adam; Burke-Spolaor, Sarah; Chamberlin, Sydney; Chatterjee, Shami; Christy, Brian; Cordes, James M.; Cornish, Neil J.; Crawford, Fronefield; Thankful Cromartie, H.; Crowter, Kathryn; DeCesar, Megan E.; Demorest, Paul B.; Dolch, Timothy; Ellis, Justin A.; Ferdman, Robert D.; Ferrara, Elizabeth C.; Fonseca, Emmanuel; Garver-Daniels, Nathan; Gentile, Peter A.; Halmrast, Daniel; Huerta, E. A.; Jenet, Fredrick A.; Jessup, Cody; Jones, Glenn; Jones, Megan L.; Kaplan, David L.; Lam, Michael T.; Lazio, T. Joseph W.; Levin, Lina; Lommen, Andrea; Lorimer, Duncan R.; Luo, Jing; Lynch, Ryan S.; Madison, Dustin; Matthews, Allison M.; McLaughlin, Maura A.; McWilliams, Sean T.; Mingarelli, Chiara; Ng, Cherry; Nice, David J.; Pennucci, Timothy T.; Ransom, Scott M.; Ray, Paul S.; Siemens, Xavier; Simon, Joseph; Spiewak, Renée; Stairs, Ingrid H.; Stinebring, Daniel R.; Stovall, Kevin; Swiggum, Joseph K.; Taylor, Stephen R.; Vallisneri, Michele; van Haasteren, Rutger; Vigeland, Sarah J.; Zhu, Weiwei; The NANOGrav Collaboration
2018-04-01
We present high-precision timing data over time spans of up to 11 years for 45 millisecond pulsars observed as part of the North American Nanohertz Observatory for Gravitational Waves (NANOGrav) project, aimed at detecting and characterizing low-frequency gravitational waves. The pulsars were observed with the Arecibo Observatory and/or the Green Bank Telescope at frequencies ranging from 327 MHz to 2.3 GHz. Most pulsars were observed with approximately monthly cadence, and six high-timing-precision pulsars were observed weekly. All were observed at widely separated frequencies at each observing epoch in order to fit for time-variable dispersion delays. We describe our methods for data processing, time-of-arrival (TOA) calculation, and the implementation of a new, automated method for removing outlier TOAs. We fit a timing model for each pulsar that includes spin, astrometric, and (for binary pulsars) orbital parameters; time-variable dispersion delays; and parameters that quantify pulse-profile evolution with frequency. The timing solutions provide three new parallax measurements, two new Shapiro delay measurements, and two new measurements of significant orbital-period variations. We fit models that characterize sources of noise for each pulsar. We find that 11 pulsars show significant red noise, with generally smaller spectral indices than typically measured for non-recycled pulsars, possibly suggesting a different origin. A companion paper uses these data to constrain the strength of the gravitational-wave background.
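As a simplified illustration of automated outlier removal from timing data, the sketch below applies a robust median-absolute-deviation cut to timing residuals. NANOGrav's actual method is Bayesian and model-based; this only conveys the flavor of an automated cut, and all parameters are illustrative.

```python
# Minimal sketch: flag outlier TOA residuals with a robust MAD-based cut.
import numpy as np

def flag_outliers(residuals_us: np.ndarray, n_sigma: float = 5.0) -> np.ndarray:
    """Boolean mask of residuals lying more than n_sigma robust-sigmas out."""
    med = np.median(residuals_us)
    mad = np.median(np.abs(residuals_us - med))
    robust_sigma = 1.4826 * mad          # MAD -> Gaussian-equivalent sigma
    return np.abs(residuals_us - med) > n_sigma * robust_sigma
```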
Take-over performance in evasive manoeuvres.
Happee, Riender; Gold, Christian; Radlmayr, Jonas; Hergeth, Sebastian; Bengler, Klaus
2017-09-01
We investigated aftereffects of automation in take-over scenarios in a high-end moving-base driving simulator. Drivers performed evasive manoeuvres encountering a blocked lane in highway driving. We compared the performance of drivers 1) during manual driving, 2) after automated driving with eyes on the road while performing the cognitively demanding n-back task, and 3) after automated driving with eyes off the road performing the visually demanding SuRT task. Both minimum time to collision (TTC) and minimum clearance towards the obstacle disclosed a substantial number of near-miss events and are regarded as valuable surrogate safety metrics in evasive manoeuvres. TTC proved highly sensitive to the applied definition of colliding paths, and we prefer robust solutions using lane position while disregarding heading. The extended time to collision (ETTC), which takes into account acceleration, was close to the more robust conventional TTC. In line with other publications, the initial steering or braking intervention was delayed after using automation compared to manual driving. This resulted in lower TTC values and stronger steering and braking actions. Using automation, effects of cognitive distraction were similar to visual distraction for the intervention time, with effects on the surrogate safety metric TTC being larger with visual distraction. However, the precision of the evasive manoeuvres was hardly affected, with a similar clearance towards the obstacle, similar overshoots and similar excursions to the hard shoulder. Further research is needed to validate and complement the current simulator-based results with human behaviour in real-world driving conditions. Experiments with real vehicles can disclose possible systematic differences in behaviour, and naturalistic data can serve to validate surrogate safety measures like TTC and obstacle clearance in evasive manoeuvres. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
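The two surrogate safety metrics can be made concrete with a short sketch. Below, conventional TTC assumes a constant closing speed, while ETTC solves the constant-acceleration kinematics; variable names and edge-case handling are choices of this illustration, not the study's code.

```python
# Minimal sketch of time to collision (TTC) and extended TTC (ETTC).
import numpy as np

def ttc(gap_m: float, closing_speed_ms: float) -> float:
    """Conventional TTC; infinite if the gap is not closing."""
    return gap_m / closing_speed_ms if closing_speed_ms > 0 else np.inf

def ettc(gap_m: float, closing_speed_ms: float, closing_accel_ms2: float) -> float:
    """Smallest positive root of gap - v*t - 0.5*a*t^2 = 0."""
    if abs(closing_accel_ms2) < 1e-9:
        return ttc(gap_m, closing_speed_ms)
    disc = closing_speed_ms**2 + 2.0 * closing_accel_ms2 * gap_m
    if disc < 0:
        return np.inf  # closing motion reverses before contact
    roots = (-closing_speed_ms + np.array([1.0, -1.0]) * np.sqrt(disc)) \
            / closing_accel_ms2
    pos = roots[roots > 0]
    return float(pos.min()) if pos.size else np.inf

# Example: 20 m gap, closing at 10 m/s while braking at 2 m/s^2.
print(ttc(20.0, 10.0), round(ettc(20.0, 10.0, -2.0), 2))  # 2.0 vs 2.76 s
```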
Automated tumor analysis for molecular profiling in lung cancer
Boyd, Clinton; James, Jacqueline A.; Loughrey, Maurice B.; Hougton, Joseph P.; Boyle, David P.; Kelly, Paul; Maxwell, Perry; McCleary, David; Diamond, James; McArt, Darragh G.; Tunstall, Jonathon; Bankhead, Peter; Salto-Tellez, Manuel
2015-01-01
The discovery and clinical application of molecular biomarkers in solid tumors increasingly relies on nucleic acid extraction from FFPE tissue sections and subsequent molecular profiling. This in turn requires pathological review of haematoxylin & eosin (H&E) stained slides to ensure sample quality and tumor DNA sufficiency, by visually estimating the percentage of tumor nuclei, and tumor annotation for manual macrodissection. In this study on NSCLC, we demonstrate considerable variation in tumor nuclei percentage between pathologists, potentially undermining the precision of NSCLC molecular evaluation and emphasising the need for quantitative tumor evaluation. We subsequently describe the development and validation of a system called TissueMark for automated tumor annotation and percentage tumor nuclei measurement in NSCLC using computerized image analysis. Evaluation of 245 NSCLC slides showed precise automated tumor annotation of cases using TissueMark, strong concordance with manually drawn boundaries and identical EGFR mutational status following manual macrodissection from the image analysis generated tumor boundaries. Automated analysis of cell counts for % tumor measurements by TissueMark showed reduced variability and significant correlation (p < 0.001) with benchmark tumor cell counts. This study demonstrates a robust image analysis technology that can facilitate the automated quantitative analysis of tissue samples for molecular profiling in discovery and diagnostics. PMID:26317646
Quantitative analysis of cardiovascular MR images.
van der Geest, R J; de Roos, A; van der Wall, E E; Reiber, J H
1997-06-01
The diagnosis of cardiovascular disease requires the precise assessment of both morphology and function. Nearly all aspects of cardiovascular function and flow can be quantified nowadays with fast magnetic resonance (MR) imaging techniques. Conventional and breath-hold cine MR imaging allow the precise and highly reproducible assessment of global and regional left ventricular function. During the same examination, velocity encoded cine (VEC) MR imaging provides measurements of blood flow in the heart and great vessels. Quantitative image analysis often still relies on manual tracing of contours in the images. Reliable automated or semi-automated image analysis software would be very helpful to overcome the limitations associated with the manual and tedious processing of the images. Recent progress in MR imaging of the coronary arteries and myocardial perfusion imaging with contrast media, along with the further development of faster imaging sequences, suggest that MR imaging could evolve into a single technique ('one stop shop') for the evaluation of many aspects of heart disease. As a result, it is very likely that the need for automated image segmentation and analysis software algorithms will further increase. In this paper the developments directed towards the automated image analysis and semi-automated contour detection for cardiovascular MR imaging are presented.
Scheduling Mission-Critical Flows in Congested and Contested Airborne Network Environments
2018-03-01
precision agriculture [64–71]. However, designing, implementing, and testing UAV networks poses numerous interdisciplinary challenges because the...applications including search and rescue, disaster relief, precision agriculture, environmental monitoring, and surveillance. Many of these applications...monitoring enabling precision agriculture,” in Automation Science and Engineering (CASE), 2015 IEEE International Conference on. IEEE, 2015, pp. 462–469. [65
Harvester-based sensing system for cotton fiber-quality mapping
USDA-ARS?s Scientific Manuscript database
Precision agriculture in cotton production attempts to maximize profitability by exploiting information on field spatial variability to optimize the fiber yield and quality. For precision agriculture to be economically viable, collection of spatial variability data within a field must be automated a...
Rigo, Vincent; Graas, Estelle; Rigo, Jacques
2012-07-01
Selected optimal respiratory cycles should allow calculation of respiratory mechanic parameters focusing on patient-ventilator interaction. New computer software automatically selecting optimal breaths, and the respiratory mechanics derived from those cycles, are evaluated. Retrospective study. University level III neonatal intensive care unit. Ten-minute synchronized intermittent mandatory ventilation and assist/control ventilation recordings from ten newborns. The ventilator provided respiratory mechanic data (ventilator respiratory cycles) every 10 s. Pressure, flow, and volume waves and pressure-volume, pressure-flow, and volume-flow loops were reconstructed from continuous pressure-volume recordings. Visual assessment determined assisted leak-free optimal respiratory cycles (selected respiratory cycles). New software graded the quality of cycles (automated respiratory cycles). Respiratory mechanic values were derived from both sets of optimal cycles. We evaluated quality selection and compared mean values and their variability according to ventilatory mode and respiratory mechanic provenance. To assess discriminating power, all 45 "t" values obtained from interpatient comparisons were compared for each respiratory mechanic parameter. A total of 11,724 breaths were evaluated. Agreement between the automated respiratory cycle and selected respiratory cycle selections is high: 88% of maximal κ with linear weighting. Specificity and positive predictive values are 0.98 and 0.96, respectively. Averaged values are similar between automated and ventilator respiratory cycles. C20/C alone is markedly decreased in automated respiratory cycles (1.27 ± 0.37 vs. 1.81 ± 0.67). The apparent similarity in tidal volume disappears in assist/control: automated respiratory cycle tidal volume (4.8 ± 1.0 mL/kg) is significantly lower than that of ventilator respiratory cycles (5.6 ± 1.8 mL/kg). Coefficients of variation decrease for all automated respiratory cycle parameters in all infants. "t" values from automated respiratory cycle data are two to three times higher than those from ventilator respiratory cycles. Automated selection is highly specific. Automated respiratory cycles best reflect the interaction of both ventilator and patient. Improving the discriminating power of ventilator monitoring will likely help in assessing disease status and following trends. Averaged parameters derived from automated respiratory cycles are more precise and could be displayed by ventilators to improve real-time fine tuning of ventilator settings.
Long-term Behavioral Tracking of Freely Swimming Weakly Electric Fish
Jun, James J.; Longtin, André; Maler, Leonard
2014-01-01
Long-term behavioral tracking can capture and quantify natural animal behaviors, including those occurring infrequently. Behaviors such as exploration and social interactions can be best studied by observing unrestrained, freely behaving animals. Weakly electric fish (WEF) display readily observable exploratory and social behaviors by emitting electric organ discharge (EOD). Here, we describe three effective techniques to synchronously measure the EOD, body position, and posture of a free-swimming WEF for an extended period of time. First, we describe the construction of an experimental tank inside of an isolation chamber designed to block external sources of sensory stimuli such as light, sound, and vibration. The aquarium was partitioned to accommodate four test specimens, and automated gates remotely control the animals' access to the central arena. Second, we describe a precise and reliable real-time EOD timing measurement method from freely swimming WEF. Signal distortions caused by the animal's body movements are corrected by spatial averaging and temporal processing stages. Third, we describe an underwater near-infrared imaging setup to observe unperturbed nocturnal animal behaviors. Infrared light pulses were used to synchronize the timing between the video and the physiological signal over a long recording duration. Our automated tracking software measures the animal's body position and posture reliably in an aquatic scene. In combination, these techniques enable long term observation of spontaneous behavior of freely swimming weakly electric fish in a reliable and precise manner. We believe our method can be similarly applied to the study of other aquatic animals by relating their physiological signals with exploratory or social behaviors. PMID:24637642
Brudvig, Jean M; Swenson, Cheryl L
2015-12-01
Rapid and precise measurement of total and differential nucleated cell counts is a crucial diagnostic component of cavitary and synovial fluid analyses. The objectives of this study included (1) evaluation of reliability and precision of canine and equine fluid total nucleated cell count (TNCC) determined by the benchtop Abaxis VetScan HM5, in comparison with the automated reference instruments ADVIA 120 and the scil Vet abc, respectively, and (2) comparison of automated with manual canine differential nucleated cell counts. The TNCC and differential counts in canine pleural and peritoneal, and equine synovial fluids were determined on the Abaxis VetScan HM5 and compared with the ADVIA 120 and Vet abc analyzer, respectively. Statistical analyses included correlation, least squares fit linear regression, Passing-Bablok regression, and Bland-Altman difference plots. In addition, precision of the total cell count generated by the VetScan HM5 was determined. Agreement was excellent without significant constant or proportional bias for canine cavitary fluid TNCC. Automated and manual differential counts had R(2) < .5 for individual cell types (least squares fit linear regression). Equine synovial fluid TNCC agreed but with some bias due to the VetScan HM5 overestimating TNCC compared to the Vet abc. Intra-assay precision of the VetScan HM5 in 3 fluid samples was 2-31%. The Abaxis VetScan HM5 provided rapid, reliable TNCC for canine and equine fluid samples. The differential nucleated cell count should be verified microscopically as counts from the VetScan HM5 and also from the ADVIA 120 were often incorrect in canine fluid samples. © 2015 American Society for Veterinary Clinical Pathology.
Design and real-time control of a robotic system for fracture manipulation.
Dagnino, G; Georgilas, I; Tarassoli, P; Atkins, R; Dogramadzi, S
2015-08-01
This paper presents the design, development and control of a new robotic system for fracture manipulation. The objective is to improve the precision, ergonomics and safety of the traditional surgical procedure to treat joint fractures. The achievements toward this direction are reported here and include the design, the real-time control architecture and the evaluation of the new robotic manipulator system. The robotic manipulator is a 6-DOF parallel robot with the struts developed as linear actuators. The control architecture is also described. The high-level controller implements a host-target structure composed of a host computer (PC), a real-time controller, and an FPGA. A graphical user interface was designed allowing the surgeon to comfortably automate and monitor the robotic system. The real-time controller guarantees the determinism of the control algorithms, adding an extra level of safety for the robotic automation. The system's positioning accuracy and repeatability have been demonstrated, showing a maximum positioning RMSE of 1.18 ± 1.14 mm (translations) and 1.85 ± 1.54° (rotations).
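For a 6-DOF parallel (Stewart-Gough type) manipulator such as this, the inverse kinematics the controller evaluates in its real-time loop reduces to six point-to-point distances. A minimal sketch follows, with placeholder anchor coordinates; it illustrates the standard construction, not the authors' controller.

```python
# Minimal sketch of Stewart-platform inverse kinematics: each strut length is
# the distance between a base anchor and the pose-transformed platform anchor.
import numpy as np

def strut_lengths(base_pts: np.ndarray, plat_pts: np.ndarray,
                  R: np.ndarray, p: np.ndarray) -> np.ndarray:
    """base_pts, plat_pts: (6, 3) anchor points; R: 3x3 rotation; p: (3,)
    translation of the platform. Returns the six strut lengths."""
    platform_world = plat_pts @ R.T + p          # platform anchors in world frame
    return np.linalg.norm(platform_world - base_pts, axis=1)
```

A commanded fracture-fragment pose (R, p) is thus converted into six linear-actuator set points each control cycle.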
Highly multiplexed targeted proteomics using precise control of peptide retention time.
Gallien, Sebastien; Peterman, Scott; Kiyonami, Reiko; Souady, Jamal; Duriez, Elodie; Schoen, Alan; Domon, Bruno
2012-04-01
Large-scale proteomics applications using SRM analysis on triple quadrupole mass spectrometers present new challenges to LC-MS/MS experimental design. Despite the automation of building large-scale LC-SRM methods, the increased number of targeted peptides can compromise the balance between sensitivity and selectivity. To accommodate large target numbers, time-scheduled SRM transition acquisition is performed. Previously published results demonstrated that incorporation of a well-characterized set of synthetic peptides enables chromatographic characterization of the elution profile of most endogenous peptides. We have extended this application of peptide trainer kits not only to build SRM methods but to facilitate real-time elution profile characterization that enables automated adjustment of the scheduled detection windows. Incorporation of dynamic retention time adjustment better facilitates targeted assays lasting several days without the need for constant supervision. This paper provides an overview of how the dynamic retention correction approach identifies and corrects for commonly observed LC variations. This adjustment dramatically improves robustness in targeted discovery experiments as well as routine quantification experiments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
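A minimal sketch of the dynamic adjustment idea: regress the observed elution times of the trainer peptides against their library values, then re-center each target's scheduled window with the fitted drift model. The retention times below are invented, and the linear drift model is an assumption of this illustration.

```python
# Minimal sketch of dynamic retention-time window adjustment.
import numpy as np

lib_rt = np.array([8.2, 15.7, 24.1, 33.9])   # library RTs of trainer peptides (min)
obs_rt = np.array([8.9, 16.6, 25.3, 35.4])   # RTs observed in the current run

drift = np.polyfit(lib_rt, obs_rt, 1)         # linear RT drift model

def corrected_window(target_lib_rt: float, half_width: float = 2.0):
    """Re-centered scheduled detection window for one target peptide."""
    center = np.polyval(drift, target_lib_rt)
    return center - half_width, center + half_width

print(corrected_window(20.0))  # window shifted to track the observed drift
```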
High-speed laser microsurgery of alert fruit flies for fluorescence imaging of neural activity
Sinha, Supriyo; Liang, Liang; Ho, Eric T. W.; Urbanek, Karel E.; Luo, Liqun; Baer, Thomas M.; Schnitzer, Mark J.
2013-01-01
Intravital microscopy is a key means of monitoring cellular function in live organisms, but surgical preparation of a live animal for microscopy often is time-consuming, requires considerable skill, and limits experimental throughput. Here we introduce a spatially precise (<1-µm edge precision), high-speed (<1 s), largely automated, and economical protocol for microsurgical preparation of live animals for optical imaging. Using a 193-nm pulsed excimer laser and the fruit fly as a model, we created observation windows (12- to 350-µm diameters) in the exoskeleton. Through these windows we used two-photon microscopy to image odor-evoked Ca2+ signaling in projection neuron dendrites of the antennal lobe and Kenyon cells of the mushroom body. The impact of a laser-cut window on fly health appears to be substantially less than that of conventional manual dissection, for our imaging durations of up to 18 h were ∼5–20 times longer than prior in vivo microscopy studies of hand-dissected flies. This improvement will facilitate studies of numerous questions in neuroscience, such as those regarding neuronal plasticity or learning and memory. As a control, we used phototaxis as an exemplary complex behavior in flies and found that laser microsurgery is sufficiently gentle to leave it intact. To demonstrate that our techniques are applicable to other species, we created microsurgical openings in nematodes, ants, and the mouse cranium. In conjunction with emerging robotic methods for handling and mounting flies or other small organisms, our rapid, precisely controllable, and highly repeatable microsurgical techniques should enable automated, high-throughput preparation of live animals for optical experimentation. PMID:24167298
Kloefkorn, Heidi E.; Pettengill, Travis R.; Turner, Sara M. F.; Streeter, Kristi A.; Gonzalez-Rothi, Elisa J.; Fuller, David D.; Allen, Kyle D.
2016-01-01
While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns. PMID:27554674
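The frame-rate sensitivity reported above follows directly from temporal quantization: gait events are resolved only to whole frames, as the short sketch below illustrates with invented numbers.

```python
# Minimal sketch of frame-rate-limited gait timing: stance time is counted in
# whole frames, so touch-down and lift-off each carry up to half a frame of
# quantization error, i.e. roughly one frame in total per stance measurement.
def stance_time_s(contact_frames: int, fps: float) -> float:
    """Stance duration from the number of frames the paw is in contact."""
    return contact_frames / fps

def worst_case_error_s(fps: float) -> float:
    """Approximate worst-case timing error: one full frame period."""
    return 1.0 / fps

print(worst_case_error_s(125.0), worst_case_error_s(1000.0))  # 0.008 s vs 0.001 s
```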
Remote Sensing and Information Technology for Large Farms
NASA Technical Reports Server (NTRS)
Williams, John E.; Ramsay, Jimmie A.
2003-01-01
A method of applying remote sensing (RS) and information-management technology to help large farms produce at maximum efficiency is undergoing development. The novelty of the method does not lie in the concept of precision agriculture, which involves variation of seeding, of application of chemicals, and of irrigation according to the spatially and temporally local variations in the growth stages and health of crops and in the chemical and physical conditions of soils. The novelty also does not lie in the use of RS data registered with other data in a geographic information system (GIS) to guide the use of precise agricultural techniques. Instead, the novelty lies in a systematic approach to overcoming obstacles that, heretofore, have impeded the timely distribution of reliable, relevant, and sufficient GIS data to support day-to-day, acre-to-acre decisions concerning the application of precise agricultural techniques to increase production and decrease cost. The development and promotion of the method are inspired in part by a vision of equipping farm machinery to accept GIS (including RS) data and using the data for automated or semi-automated implementation of precise agricultural techniques. Primary examples of relevant GIS data include information on plant stress, soil moisture, and effects of applied chemicals, all derived by automated computational analysis of measurements taken by one or more airborne spectroradiometers. Proper management and timeliness of the large amount of GIS information are of paramount concern in agriculture. Information on stresses and changes in crops is especially perishable and important to farmers. The need for timeliness and management of information is satisfied by use of computing hardware and software capable of (1) rapid geo-rectification and other processing of RS data, (2) packaging the output data in the form of GIS plots, and (3) making the data available to farmers and other subscribers by Internet password access. It is a goal of this development program to make RS data available no later than the day after an aerial survey. In addition, data from prior surveys are kept in the data base. Farmers can, for example, use current and prior data to analyze changes.
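As one concrete example of the kind of automated spectroradiometer analysis described, the sketch below computes the normalized-difference vegetation index (NDVI), a standard plant-stress indicator; the document does not name NDVI specifically, so this is an assumed stand-in for its "plant stress" products.

```python
# Minimal sketch: per-pixel NDVI from red and near-infrared band images.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI in [-1, 1]; higher values generally indicate healthier canopy."""
    return (nir - red) / (nir + red + 1e-12)
```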
High-throughput accurate-wavelength lens-based visible spectrometer.
Bell, Ronald E; Scotti, Filippo
2010-10-01
A scanning visible spectrometer has been prototyped to complement fixed-wavelength transmission grating spectrometers for charge exchange recombination spectroscopy. Fast f/1.8 200 mm commercial lenses are used with a large 2160 mm(-1) grating for high throughput. A stepping-motor controlled sine drive positions the grating, which is mounted on a precision rotary table. A high-resolution optical encoder on the grating stage allows the grating angle to be measured with an absolute accuracy of 0.075 arc sec, corresponding to a wavelength error ≤0.005 Å. At this precision, changes in grating groove density due to thermal expansion and variations in the refractive index of air are important. An automated calibration procedure determines all the relevant spectrometer parameters to high accuracy. Changes in bulk grating temperature, atmospheric temperature, and pressure are monitored between the time of calibration and the time of measurement to ensure a persistent wavelength calibration.
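Converting the encoder-measured grating angle to wavelength follows the plane-grating equation m·lambda = d·(sin alpha + sin beta). The sketch below uses placeholder geometry (included angle, diffraction order) rather than the instrument's actual configuration.

```python
# Minimal sketch: grating angle -> wavelength for a plane grating in a
# fixed-included-angle mount. Geometry constants here are assumptions.
import numpy as np

GROOVE_D_NM = 1e6 / 2160.0             # groove spacing for a 2160 mm^-1 grating
INCLUDED_ANGLE = np.radians(30.0)      # assumed incident/diffracted included angle
ORDER = 1                              # assumed diffraction order

def wavelength_nm(grating_angle_deg: float) -> float:
    theta = np.radians(grating_angle_deg)      # grating rotation angle
    alpha = theta + INCLUDED_ANGLE / 2.0       # incidence angle
    beta = theta - INCLUDED_ANGLE / 2.0        # diffraction angle
    return GROOVE_D_NM * (np.sin(alpha) + np.sin(beta)) / ORDER

# With these placeholder values, d(lambda)/d(theta) is a few hundred nm/rad,
# so the 0.075 arcsec encoder accuracy maps to roughly 0.003 Å of wavelength
# error, consistent with the ≤0.005 Å figure quoted above.
```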
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, Erika J.; Huang, Chao; Hamilton, Julie
Here, a major advantage of microfluidic devices is the ability to manipulate small sample volumes, thus reducing reagent waste and preserving precious sample. However, to achieve robust sample manipulation it is necessary to address device integration with the macroscale environment. To realize repeatable, sensitive particle separation with microfluidic devices, this protocol presents a complete automated and integrated microfluidic platform that enables precise processing of 0.15–1.5 ml samples using microfluidic devices. Important aspects of this system include modular device layout and robust fixtures resulting in reliable and flexible world-to-chip connections, and fully-automated fluid handling which accomplishes closed-loop sample collection, system cleaning and priming steps to ensure repeatable operation. Different microfluidic devices can be used interchangeably with this architecture. Here we incorporate an acoustofluidic device, detail its characterization, performance optimization, and demonstrate its use for size-separation of biological samples. By using real-time feedback during separation experiments, sample collection is optimized to conserve and concentrate sample. Although requiring the integration of multiple pieces of equipment, advantages of this architecture include the ability to process unknown samples with no additional system optimization, ease of device replacement, and precise, robust sample processing.
Automated micromanipulation desktop station based on mobile piezoelectric microrobots
NASA Astrophysics Data System (ADS)
Fatikow, Sergej
1996-12-01
One of the main problems of present-day research on microsystem technology (MST) is to assemble a whole microsystem from different microcomponents. This paper presents a new concept of an automated micromanipulation desktop station including piezoelectrically driven microrobots placed on a high-precision x-y stage of a light microscope, a CCD camera as a local sensor subsystem, a laser sensor unit as a global sensor subsystem, a parallel computer system with C167 microcontrollers, and a Pentium PC equipped additionally with an optical grabber. The microrobots can perform high-precision manipulations (with an accuracy of up to 10 nm) and nondestructive transport (at a speed of about 3 cm/sec) of very small objects under the microscope. To control the desktop station automatically, an advanced control system that includes a task planning level and a real-time execution level is being developed. The main function of the task planning subsystem is to interpret the implicit action plan and to generate a sequence of explicit operations which are sent to the execution level of the control system. The main functions of the execution control level are object recognition, image processing, and feedback position control of the microrobot and the microscope stage.
Vogeser, Michael; Spöhrer, Ute
2006-01-01
Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent are performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 µg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.
Highly Automated Arrival Management and Control System Suitable for Early NextGen
NASA Technical Reports Server (NTRS)
Swenson, Harry N.; Jung, Jaewoo
2013-01-01
This is a presentation of previously published work conducted in the development of the Terminal Area Precision Scheduling and Spacing (TAPSS) system. Included are concept and technical descriptions of the TAPSS system and results from human-in-the-loop simulations conducted at Ames Research Center. The Terminal Area Precision Scheduling and Spacing system has been demonstrated, through research and extensive high-fidelity simulation studies, to provide benefits in airport arrival throughput, to support efficient arrival descents, and to enable mixed aircraft navigation capability operations during periods of high congestion. NASA is currently porting the TAPSS system into the FAA TBFM and STARS system prototypes to ensure its ability to operate in the FAA automation infrastructure. The NASA ATM Demonstration Project is using the TAPSS technologies to provide the ground-based automation tools to enable airborne Interval Management (IM) capabilities. NASA and the FAA have initiated a Research Transition Team to enable potential TAPSS and IM technology transfer.
Rack Insertion End Effector (RIEE) automation
NASA Technical Reports Server (NTRS)
Malladi, Narasimha
1993-01-01
NASA is developing a mechanism to manipulate Racks and insert them into the Space Station Logistic modules. The mechanism consists of a base with three motorized degrees of freedom, a three-section motorized boom that extends from 15 to 44 feet in length, and a Rack Insertion End Effector (RIEE) with five hand wheels for precise alignment. The robotics section was tasked with automating the RIEE unit. In this report, an application of the Perceptics Vision System is conceptually developed to determine the position and orientation of the RIEE relative to the logistic module, and a MathCad program is written to display the displacements needed for precise alignment and final insertion of the Rack. This report is unique in that it is itself a MathCad program, including text, derivations, and executable equations with example inputs and outputs.
Medical Device for Automated Prick Test Reading.
Justo, Xabier; Diaz, Inaki; Gil, Jorge Juan; Gastaminza, Gabriel
2018-05-01
Allergy tests are routinely performed in most hospitals every day. However, measuring the outcomes of these tests is still a very laborious manual task. Current methods and systems lack precision and repeatability. This paper presents a novel mechatronic system that is able to scan a patient's entire arm and provide allergists with precise measures of wheals for diagnosis. The device is based on 3-D laser technology, and specific algorithms have been developed to process the information gathered. This system aims to automate the reading of skin prick tests and make gains in speed, accuracy, and reliability. Several experiments have been performed to evaluate the performance of the system.
Evaluation of SAPHIRE: an automated approach to indexing and retrieving medical literature.
Hersh, W.; Hickam, D. H.; Haynes, R. B.; McKibbon, K. A.
1991-01-01
An analysis of SAPHIRE, an experimental information retrieval system featuring automated indexing and natural language retrieval, was performed on MEDLINE references using data previously generated for a MEDLINE evaluation. Compared with searches performed by novice and expert physicians using MEDLINE, SAPHIRE achieved comparable recall and precision. While its combined recall and precision performance did not equal the level of librarians, SAPHIRE did achieve a significantly higher level of absolute recall. SAPHIRE has other potential advantages over existing MEDLINE systems. Its natural language interface does not require knowledge of MeSH, and it provides relevance ranking of retrieved references. PMID:1807718
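For reference, the recall and precision figures of merit used throughout this evaluation have standard set-based definitions; the sketch below is generic, not SAPHIRE code:

    def recall_precision(retrieved, relevant):
        """Recall and precision for a retrieved set judged against a relevant set."""
        retrieved, relevant = set(retrieved), set(relevant)
        hits = len(retrieved & relevant)
        recall = hits / len(relevant) if relevant else 0.0
        precision = hits / len(retrieved) if retrieved else 0.0
        return recall, precision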
Gallistel, C R; Tucci, Valter; Nolan, Patrick M; Schachner, Melitta; Jakovcevski, Igor; Kheifets, Aaron; Barboza, Luendro
2014-03-05
We used a fully automated system for the behavioural measurement of physiologically meaningful properties of basic mechanisms of cognition to test two strains of heterozygous mutant mice, Bfc (batface) and L1, and their wild-type littermate controls. Both of the target genes are involved in the establishment and maintenance of synapses. We find that the Bfc heterozygotes show reduced precision in their representation of interval duration, whereas the L1 heterozygotes show increased precision. These effects are functionally specific, because many other measures made on the same mice are unaffected, namely: the accuracy of matching temporal investment ratios to income ratios in a matching protocol, the rate of instrumental and classical conditioning, the latency to initiate a cued instrumental response, the trials on task and the impulsivity in a switch paradigm, the accuracy with which mice adjust timed switches to changes in the temporal constraints, the days to acquisition, and mean onset time and onset variability in the circadian anticipation of food availability.
Impact of assay design on test performance: lessons learned from 25-hydroxyvitamin D.
Farrell, Christopher-John L; Soldo, Joshua; McWhinney, Brett; Bandodkar, Sushil; Herrmann, Markus
2014-11-01
Current automated immunoassays vary significantly in many aspects of their design. This study sought to establish if the theoretical advantages and disadvantages associated with different design formats of automated 25-hydroxyvitamin D (25-OHD) assays are translated into variations in assay performance in practice. 25-OHD was measured in 1236 samples using automated assays from Abbott, DiaSorin, Roche and Siemens. A subset of 362 samples had up to three liquid chromatography-tandem mass spectrometry 25-OHD analyses performed. 25-OHD₂ recovery, dilution recovery, human anti-animal antibody (HAAA) interference, 3-epi-25-OHD₃ cross-reactivity and precision of the automated assays were evaluated. The assay that combined release of 25-OHD with analyte capture in a single step showed the most accurate 25-OHD₂ recovery and the best dilution recovery. The use of vitamin D binding protein (DBP) as the capture moiety was associated with 25-OHD₂ under-recovery, a trend consistent with 3-epi-25-OHD₃ cross-reactivity and immunity to HAAA interference. Assays using animal-derived antibodies did not show 3-epi-25-OHD₃ cross-reactivity but were variably susceptible to HAAA interference. Not combining 25-OHD release and capture in one step and use of biotin-streptavidin interaction for solid phase separation were features of the assays with inferior accuracy for diluted samples. The assays that used a backfill assay format showed the best precision at high concentrations but this design did not guarantee precision at low 25-OHD concentrations. Variations in design among automated 25-OHD assays influence their performance characteristics. Consideration of the details of assay design is therefore important when selecting and validating new assays.
Reliability of Semi-Automated Segmentations in Glioblastoma.
Huber, T; Alber, G; Bette, S; Boeckh-Behrens, T; Gempt, J; Ringel, F; Alberts, E; Zimmer, C; Bauer, J S
2017-06-01
In glioblastoma, quantitative volumetric measurements of contrast-enhancing or fluid-attenuated inversion recovery (FLAIR) hyperintense tumor compartments are needed for an objective assessment of therapy response. The aim of this study was to evaluate the reliability of a semi-automated, region-growing segmentation tool for determining tumor volume in patients with glioblastoma among different users of the software. A total of 320 segmentations of tumor-associated FLAIR changes and contrast-enhancing tumor tissue were performed by different raters (neuroradiologists, medical students, and volunteers). All patients underwent high-resolution magnetic resonance imaging including a 3D-FLAIR and a 3D-MPRage sequence. Segmentations were done using a semi-automated, region-growing segmentation tool. Intra- and inter-rater reliability were assessed by intra-class correlation (ICC). Root-mean-square error (RMSE) was used to determine the precision error. The Dice score was calculated to measure the overlap between segmentations. Semi-automated segmentation showed a high ICC (>0.985) for all groups, indicating excellent intra- and inter-rater reliability. Significantly smaller precision errors and higher Dice scores were observed for FLAIR segmentations compared with segmentations of contrast enhancement. Single-rater segmentations showed the lowest RMSE for FLAIR, 3.3% (MPRage: 8.2%). Both single raters and neuroradiologists had the lowest precision error for longitudinal evaluation of FLAIR changes. Semi-automated volumetry of glioblastoma was reliably performed by all groups of raters, even without neuroradiologic expertise. Interestingly, segmentations of tumor-associated FLAIR changes were more reliable than segmentations of contrast enhancement. In longitudinal evaluations, an experienced rater can reliably detect progressive FLAIR changes of less than 15% in a quantitative way, which could help to detect progressive disease earlier.
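The Dice score reported here has a standard definition; a minimal numpy sketch, independent of the (unnamed) segmentation software used in the study:

    import numpy as np

    def dice(mask_a, mask_b):
        """Dice overlap between two binary segmentation masks of equal shape."""
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0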
Huang, Shu-Hong; Chang, Yu-Shin; Juang, Jyh-Ming Jimmy; Chang, Kai-Wei; Tsai, Mong-Hsun; Lu, Tzu-Pin; Lai, Liang-Chuan; Chuang, Eric Y; Huang, Nien-Tsu
2018-03-12
In this study, we developed an automated microfluidic DNA microarray (AMDM) platform for point mutation detection of genetic variants in inherited arrhythmic diseases. The platform allows for automated and programmable reagent sequencing under precise conditions of hybridization flow and temperature control. It is composed of a commercial microfluidic control system, a microfluidic microarray device, and a temperature control unit. The automated and rapid hybridization process can be performed in the AMDM platform using Cy3-labeled oligonucleotide exons of SCN5A genomic DNA, the gene encoding sodium channel proteins abundant in heart (cardiac) muscle cells. We then introduce a graphene oxide (GO)-assisted DNA microarray hybridization protocol to enable point mutation detection. In this protocol, a GO solution is added after the staining step to quench dyes bound to single-stranded DNA or non-perfectly matched DNA, which can improve point mutation specificity. As proof of concept, we extracted the wild-type and mutant forms of exon 12 and exon 17 of SCN5A genomic DNA from patients with long QT syndrome or Brugada syndrome by touchdown PCR and performed successful point mutation discrimination in the AMDM platform. Overall, the AMDM platform can greatly reduce laborious and time-consuming hybridization steps and prevent potential contamination. Furthermore, by introducing reciprocating flow into the microchannel during the hybridization process, the total assay time can be reduced to 3 hours, which is 6 times faster than the conventional DNA microarray. Given the automatic assay operation, shorter assay time, and high point mutation discrimination, we believe that the AMDM platform has potential for low-cost, rapid and sensitive genetic testing in a simple and user-friendly manner, which may benefit gene screening in medical practice.
Collaborative real-time motion video analysis by human observer and image exploitation algorithms
NASA Astrophysics Data System (ADS)
Hild, Jutta; Krüger, Wolfgang; Brüstle, Stefan; Trantelle, Patrick; Unmüßig, Gabriel; Heinze, Norbert; Peinsipp-Byma, Elisabeth; Beyerer, Jürgen
2015-05-01
Motion video analysis is a challenging task, especially in real-time applications. In most safety- and security-critical applications, a human observer is an obligatory part of the overall analysis system. Over the past several years, substantial progress has been made in the development of automated image exploitation algorithms. Hence, we investigate how the benefits of automated video analysis can be integrated suitably into current video exploitation systems. In this paper, a system design is introduced which strives to combine the qualities of the human observer's perception and the automated algorithms, thus aiming to improve the overall performance of a real-time video analysis system. The system design builds on prior work where we showed the benefits for the human observer by means of a user interface which utilizes the human visual focus of attention, revealed by the eye gaze direction, for interaction with the image exploitation system; eye tracker-based interaction allows much faster, more convenient, and equally precise moving-target acquisition in video images than traditional computer mouse selection. The system design also builds on prior work we did on automated target detection, segmentation, and tracking algorithms. Besides the system design, a first pilot study is presented, in which we investigated how the participants (all non-experts in video analysis) performed in initializing an object tracking subsystem by selecting a target for tracking. Preliminary results show that the gaze + key press technique is an effective, efficient, and easy-to-use interaction technique for performing selection operations on moving targets in videos in order to initialize an object tracking function.
A completely automated flow, heat-capacity, calorimeter for use at high temperatures and pressures
NASA Astrophysics Data System (ADS)
Rogers, P. S. Z.; Sandarusi, Jamal
1990-11-01
An automated flow calorimeter has been constructed to measure the isobaric heat capacities of concentrated aqueous electrolyte solutions using a differential calorimetry technique. The calorimeter is capable of operation to 700 K and 40 MPa with a measurement accuracy of 0.03% relative to the heat capacity of the pure reference fluid (water). A novel design encloses the calorimeter within a double set of separately controlled copper adiabatic shields that minimize calorimeter heat losses and precisely control the temperature of the inlet fluids. A multistage preheat train, used to efficiently heat the flowing fluid, includes a counter-current heat exchanger for the inlet and outlet fluid streams in tandem with two calorimeter preheaters. Complete system automation is accomplished with a distributed control scheme using multiple processors, allowing the major control tasks of calorimeter operation and control, data logging and display, and pump control to be performed simultaneously. A sophisticated pumping strategy for the two separate syringe pumps allows continuous fluid delivery. This automation system enables the calorimeter to operate unattended except for the reloading of sample fluids. In addition, automation has allowed the development and implementation of an improved heat-loss calibration method that provides calorimeter calibration with absolute accuracy comparable to the overall measurement precision, even for very concentrated solutions.
NASA Astrophysics Data System (ADS)
Marchitto, T. M., Jr.; Mitra, R.; Zhong, B.; Ge, Q.; Kanakiya, B.; Lobaton, E.
2017-12-01
Identification and picking of foraminifera from sediment samples is often a laborious and repetitive task. Previous attempts to automate this process have met with limited success, but we show that recent advances in machine learning can be brought to bear on the problem. As a proof of concept, we have developed a system that is capable of recognizing six species of extant planktonic foraminifera that are commonly used in paleoceanographic studies. Our pipeline begins with digital photographs taken under 16 different illuminations using an LED ring, which are then fused into a single 3D image. Labeled image sets were used to train various types of image classification algorithms, and performance on unlabeled image sets was measured in terms of precision (whether IDs are correct) and recall (what fraction of the target species are found). We find that Convolutional Neural Network (CNN) approaches achieve precision and recall values between 80 and 90%, which is similar precision and better recall than human expert performance using the same type of photographs. We have also trained a CNN to segment the 3D images into individual chambers and apertures, which can not only improve identification performance but also automate the measurement of foraminifera for morphometric studies. Given that there are only 35 species of extant planktonic foraminifera larger than 150 μm, we suggest that a fully automated characterization of this assemblage is attainable. This is the first step toward the realization of a foram-picking robot.
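The abstract does not disclose the network architecture; the following PyTorch sketch shows a minimal six-class image classifier in the spirit described, with all layer sizes hypothetical:

    import torch
    import torch.nn as nn

    class ForamCNN(nn.Module):
        """Toy six-class classifier for foraminifera images (sizes illustrative)."""
        def __init__(self, n_classes=6):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.head = nn.Sequential(
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes),
            )

        def forward(self, x):
            return self.head(self.features(x))

    logits = ForamCNN()(torch.randn(1, 3, 128, 128))  # one image -> 6 class scores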
Deleger, Louise; Brodzinski, Holly; Zhai, Haijun; Li, Qi; Lingren, Todd; Kirkendall, Eric S; Alessandrini, Evaline; Solti, Imre
2013-12-01
To evaluate a proposed natural language processing (NLP) and machine-learning based automated method to risk stratify abdominal pain patients by analyzing the content of the electronic health record (EHR). We analyzed the EHRs of a random sample of 2100 pediatric emergency department (ED) patients with abdominal pain, including all with a final diagnosis of appendicitis. We developed an automated system to extract relevant elements from ED physician notes and lab values and to automatically assign a risk category for acute appendicitis (high, equivocal, or low), based on the Pediatric Appendicitis Score. We evaluated the performance of the system against a manually created gold standard (chart reviews by ED physicians) for recall, specificity, and precision. The system achieved an average F-measure of 0.867 (0.869 recall and 0.863 precision) for risk classification, which was comparable to physician experts. Recall/precision were 0.897/0.952 in the low-risk category, 0.855/0.886 in the high-risk category, and 0.854/0.766 in the equivocal-risk category. The information that the system required as input to achieve high F-measure was available within the first 4 h of the ED visit. Automated appendicitis risk categorization based on EHR content, including information from clinical notes, shows comparable performance to physician chart reviewers as measured by their inter-annotator agreement and represents a promising new approach for computerized decision support to promote application of evidence-based medicine at the point of care.
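As a consistency check of the reported figures, the F-measure is the harmonic mean of precision and recall:

\[
F = \frac{2PR}{P+R} = \frac{2 \times 0.863 \times 0.869}{0.863 + 0.869} \approx 0.866,
\]

which agrees with the reported average F-measure of 0.867 to within rounding and per-category averaging.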
NASA Astrophysics Data System (ADS)
Lu, Hong; Gargesha, Madhusudhana; Wang, Zhao; Chamie, Daniel; Attizani, Guilherme F.; Kanaya, Tomoaki; Ray, Soumya; Costa, Marco A.; Rollins, Andrew M.; Bezerra, Hiram G.; Wilson, David L.
2013-02-01
Intravascular OCT (iOCT) is an imaging modality with ideal resolution and contrast to provide accurate in vivo assessments of tissue healing following stent implantation. Our Cardiovascular Imaging Core Laboratory has served >20 international stent clinical trials with >2000 stents analyzed. Each stent requires 6-16 hrs of manual analysis time, and we are developing highly automated software to reduce this extreme effort. Using classification techniques, physically meaningful image features, forward feature selection to limit overtraining, and leave-one-stent-out cross validation, we detected stent struts. To determine tissue coverage areas, we estimated stent "contours" by fitting detected struts and interpolation points from linearly interpolated tissue depths to a periodic cubic spline. Tissue coverage area was obtained by subtracting lumen area from the stent area. Detection was compared against manual analysis of 40 pullbacks. We obtained recall = 90 ± 3% and precision = 89 ± 6%. When taking struts deemed not bright enough for manual analysis into consideration, precision improved to 94 ± 6%. This approached inter-observer variability (recall = 93%, precision = 96%). Differences in stent and tissue coverage areas are 0.12 ± 0.41 mm² and 0.09 ± 0.42 mm², respectively. We are developing software which will enable visualization, review, and editing of automated results, so as to provide a comprehensive stent analysis package. This should enable better and cheaper stent clinical trials, so that manufacturers can optimize the myriad of parameters (drug, coverage, bioresorbable versus metal, etc.) for stent design.
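The stent-contour step can be illustrated with a hedged sketch using scipy's periodic cubic spline and the shoelace polygon area (the authors' actual fitting code is not given in the abstract):

    import numpy as np
    from scipy.interpolate import CubicSpline

    def enclosed_area(theta, r, n=720):
        """Area enclosed by a periodic cubic spline fit to contour points given
        in polar form (theta strictly increasing in [0, 2*pi), radius r)."""
        theta = np.append(theta, theta[0] + 2 * np.pi)  # close the curve
        r = np.append(r, r[0])                          # periodic boundary condition
        spline = CubicSpline(theta, r, bc_type="periodic")
        t = np.linspace(0, 2 * np.pi, n, endpoint=False)
        x, y = spline(t) * np.cos(t), spline(t) * np.sin(t)
        # Shoelace formula for the polygon traced by the sampled spline.
        return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

Tissue coverage area then follows as in the abstract: stent area (from the fitted contour) minus lumen area.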
D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D
2011-01-01
Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation is available for download.
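A hedged sketch of the 'learn by example' configuration search, using scikit-learn stand-ins for the unnamed open-source NLP and classifier components:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV
    from sklearn.pipeline import Pipeline

    pipe = Pipeline([("tfidf", TfidfVectorizer()),
                     ("clf", LogisticRegression(max_iter=1000))])
    grid = {"tfidf__ngram_range": [(1, 1), (1, 2)], "clf__C": [0.1, 1.0, 10.0]}
    search = GridSearchCV(pipe, grid, scoring="f1_micro", cv=5)
    # search.fit(snippets, labels) iterates the candidate configurations and
    # keeps the top performer, mirroring the automatic, iterative evaluation.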
Automated Cutting And Drilling Of Composite Parts
NASA Technical Reports Server (NTRS)
Warren, Charles W.
1993-01-01
Proposed automated system precisely cuts and drills large, odd-shaped parts made of composite materials. System conceived for manufacturing lightweight composite parts to replace heavier parts in Space Shuttle. Also useful in making large composite parts for other applications. Includes robot that locates part to be machined, positions cutter, and positions drill. Gantry-type robot best suited for task.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webber, Nels W.
Los Alamos National Laboratory's J-1 DARHT Operations Group uses 6-ft spherical vessels to contain hazardous materials produced in a hydrodynamic experiment. These contaminated vessels must be analyzed by means of a worker entering the vessel to locate, measure, and document every penetration mark on the vessel. If the worker can be replaced by a highly automated robotic system with a high-precision scanner, it will eliminate the risks to the worker and provide management with an accurate 3D model of the vessel presenting the existing damage, with the flexibility to manipulate the model for better and more in-depth assessment. The project was successful in meeting the primary goal of installing an automated system which scanned a 6-ft vessel with an elapsed time of 45 minutes. This robotic system reduces the total time for the original scope of work by 75 minutes and results in excellent data accumulation and transmission to the 3D model imaging program.
Brandes, Susanne; Mokhtari, Zeinab; Essig, Fabian; Hünniger, Kerstin; Kurzai, Oliver; Figge, Marc Thilo
2015-02-01
Time-lapse microscopy is an important technique to study the dynamics of various biological processes. The labor-intensive manual analysis of microscopy videos is increasingly replaced by automated segmentation and tracking methods. These methods are often limited to certain cell morphologies and/or cell stainings. In this paper, we present an automated segmentation and tracking framework that does not have these restrictions. In particular, our framework handles highly variable cell shapes and does not rely on any cell stainings. Our segmentation approach is based on a combination of spatial and temporal image variations to detect moving cells in microscopy videos. This method yields a sensitivity of 99% and a precision of 95% in object detection. The tracking of cells consists of different steps, starting from single-cell tracking based on a nearest-neighbor approach, detection of cell-cell interactions and splitting of cell clusters, and finally combining tracklets using methods from graph theory. The segmentation and tracking framework was applied to synthetic as well as experimental datasets with varying cell densities implying different numbers of cell-cell interactions. We established a validation framework to measure the performance of our tracking technique. The cell tracking accuracy was found to be >99% for all datasets, indicating a high accuracy for connecting the detected cells between different time points.
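The single-cell tracking step is nearest-neighbor based; a minimal sketch of frame-to-frame association (the distance gate is a hypothetical parameter):

    import numpy as np
    from scipy.spatial.distance import cdist

    def link_frames(prev_xy, next_xy, max_dist=20.0):
        """Greedy nearest-neighbor matching of cell centroids between frames.
        Returns (prev_index, next_index) pairs within max_dist pixels."""
        d = cdist(prev_xy, next_xy)
        matches = []
        while d.size and d.min() <= max_dist:
            i, j = np.unravel_index(np.argmin(d), d.shape)
            matches.append((i, j))
            d[i, :] = np.inf  # each detection may be used at most once
            d[:, j] = np.inf
        return matches

Cluster splitting and the graph-theoretic joining of tracklets described in the paper would build on top of such pairwise links.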
Precision Departure Release Capability (PDRC) Final Report
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin Brian; Kistler, Matthew Stephen; Gaither, Frank; Juro, Greg
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows that may be subject to constraints that create localized demand/capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool, based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas/Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents research results from the PDRC research activity. Companion papers present the Concept of Operations and a Technology Description.
Precision Departure Release Capability (PDRC): NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Thomas J.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations.
Automation of a suturing device for minimally invasive surgery.
Göpel, Tobias; Härtl, Felix; Schneider, Armin; Buss, Martin; Feussner, Hubertus
2011-07-01
In minimally invasive surgery, hand suturing is categorized as a challenge in technique as well as in its duration. This calls for an easily manageable tool, permitting an all-purpose, cost-efficient, and secure viscerosynthesis. Such a tool for this field already exists: the Autosuture EndoStitch(®). In a series of studies the potential for the EndoStitch to accelerate suturing has been proven. However, its ergonomics still limits its applicability. The goal of this study was twofold: propose an optimized and partially automated EndoStitch and compare the conventional EndoStitch to the optimized and partially automated EndoStitch with respect to the speed and precision of suturing. Based on the EndoStitch, a partially automated suturing tool has been developed. With the aid of a DC motor, triggered by a button, one can suture by one-fingered handling. Using the partially automated suturing manipulator, 20 surgeons with different levels of laparoscopic experience successfully completed a continuous suture with 10 stitches using the conventional and the partially automated suture manipulator. Before that, each participant was given 1 min of instruction and 1 min for training. Absolute suturing time and stitch accuracy were measured. The quality of the automated EndoStitch with respect to manipulation was tested with the aid of a standardized questionnaire. To compare the two instruments, t tests were used for suturing accuracy and time. Of the 20 surgeons with laparoscopic experience (fewer than 5 laparoscopic interventions, n=9; fewer than 20 laparoscopic interventions, n=7; more than 20 laparoscopic interventions, n=4), there was no significant difference between the two tested systems with respect to stitching accuracy. However, the suturing time was significantly shorter with the Autostitch (P=0.01). The difference in accuracy and speed was not statistically significant considering the laparoscopic experience of the surgeons. The weight and size of the Autostitch have been criticized as well as its cable. However, the comfortable handhold, automatic needle change, and ergonomic manipulation have been rated positive. Partially automated suturing in minimally invasive surgery offers advantages with respect to the speed of operation and ergonomics. Ongoing work in this field has to concentrate on minimization, implementation in robotic systems, and development of new operation methods (NOTES).
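The comparison relied on t tests; a minimal example of how such a two-sample comparison can be run (the numbers are synthetic, not the study's measurements):

    from scipy import stats

    conventional = [412, 398, 455, 430, 441]        # suturing times in seconds (synthetic)
    partially_automated = [375, 362, 371, 380, 368]
    t, p = stats.ttest_ind(conventional, partially_automated)
    print(f"t = {t:.2f}, p = {p:.3f}")  # p < 0.05 would indicate a significant difference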
Spötl, Christoph
2005-09-01
The stable carbon isotopic composition of dissolved inorganic carbon (δ13C(DIC)) is traditionally determined using either direct precipitation or gas evolution methods in conjunction with offline gas preparation and measurement in a dual-inlet isotope ratio mass spectrometer. A gas evolution method based on continuous-flow technology is described here, which is easy to use and robust. Water samples (100-1500 µl depending on the carbonate alkalinity) are injected into He-filled autosampler vials in the field and analysed on an automated continuous-flow gas preparation system interfaced to an isotope ratio mass spectrometer. Sample analysis time including online preparation is 10 min and overall precision is 0.1‰. This method is thus fast and can easily be automated for handling large sample batches.
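For context, the δ notation in which the 0.1‰ precision is quoted expresses the isotope ratio relative to a standard (VPDB for carbon):

\[
\delta^{13}\mathrm{C} = \left(\frac{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{sample}}}{(^{13}\mathrm{C}/^{12}\mathrm{C})_{\mathrm{standard}}} - 1\right) \times 1000\ \text{‰}
\]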
Automation of Die and Mold Polishing: A Preliminary Investigation
1990-02-02
[Only fragmented front matter and citation entries survive extraction for this record: a Master's thesis by Brittenham (B.S.M.E., The Ohio State University, 1989; examination committee including advisor Taylan Altan and Gary P. Maul, College of Engineering), with references including the Showa Precision Machinery Robo-Polisher, Model SMR-100 (Catalog No. August 88-4000, Amagasaki, Japan) and an article in Wear, 18, No. 2 (1971), pp. 169-170.]
Design of automatic leveling and centering system of theodolite
NASA Astrophysics Data System (ADS)
Liu, Chun-tong; He, Zhen-Xin; Huang, Xian-xiang; Zhan, Ying
2012-09-01
To realize theodolite automation and improve azimuth angle measurement, an automatic leveling and centering system for the theodolite, with leveling error compensation, is designed; the design covers the system solution, selection of key components, the mechanical structure for leveling and centering, and the system software. The redesigned leveling feet are driven by DC servo motors, and an electronic centering control device is installed. Using high-precision tilt sensors for horizontal skew detection ensures the effectiveness of the leveling error compensation. The center of the aiming mark is located by digital image processing with an area-array CCD, and leveling measurement precision can reach the pixel level, which makes accurate centering of the theodolite possible. Finally, experiments are conducted using the automatic leveling and centering system of the theodolite. The results show the leveling and centering system can operate automatically with a high centering accuracy of 0.04 mm. The measurement precision of the orientation angle after leveling error compensation is improved compared with the traditional method. The automatic leveling and centering system of the theodolite can satisfy the requirements of measuring precision and automation.
Precision Departure Release Capability (PDRC) Technology Description
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin; Robinson, Corissia; Null, Jody R.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Technology Description. Companion papers include the Final Report and a Concept of Operations.
Precision Departure Release Capability (PDRC) Concept of Operations
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Capps, Richard A.; Day, Kevin Brian
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand/capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas/Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Concept of Operations. Companion papers include the Final Report and a Technology Description.
Automating High-Precision X-Ray and Neutron Imaging Applications with Robotics
Hashem, Joseph Anthony; Pryor, Mitch; Landsberger, Sheldon; ...
2017-03-28
Los Alamos National Laboratory and the University of Texas at Austin recently implemented a robotically controlled nondestructive testing (NDT) system for X-ray and neutron imaging. This system is intended to address the need for accurate measurements for a variety of parts, is able to track measurement geometry at every imaging location, and is designed for high-throughput applications. This system was deployed in a beam port at a nuclear research reactor and in an operational inspection X-ray bay. The nuclear research reactor system consisted of a precision industrial seven-axis robot, 1.1-MW TRIGA research reactor, and a scintillator-mirror-camera-based imaging system. The X-ray bay system incorporated the same robot, a 225-keV microfocus X-ray source, and a custom flat panel digital detector. The robotic positioning arm is programmable and allows imaging in multiple configurations, including planar, cylindrical, as well as other user-defined geometries that provide enhanced engineering evaluation capability. The imaging acquisition device is coupled with the robot for automated image acquisition. The robot can achieve target positional repeatability within 17 μm in 3-D space. Flexible automation with nondestructive imaging saves costs, reduces dosage, adds imaging techniques, and achieves better quality results in less time. Specifics regarding the robotic system and imaging acquisition and evaluation processes are presented. In conclusion, this paper reviews the comprehensive testing and system evaluation to affirm the feasibility of robotic NDT, presents the system configuration, and reviews results for both X-ray and neutron radiography imaging applications.
Crew/Automation Interaction in Space Transportation Systems: Lessons Learned from the Glass Cockpit
NASA Technical Reports Server (NTRS)
Rudisill, Marianne
2000-01-01
The progressive integration of automation technologies in commercial transport aircraft flight decks - the 'glass cockpit' - has had a major, and generally positive, impact on flight crew operations. Flight deck automation has provided significant benefits, such as economic efficiency, increased precision and safety, and enhanced functionality within the crew interface. These enhancements, however, may have been accrued at a price, such as complexity added to crew/automation interaction that has been implicated in a number of aircraft incidents and accidents. This report briefly describes 'glass cockpit' evolution. Some relevant aircraft accidents and incidents are described, followed by a more detailed description of human/automation issues and problems (e.g., crew error, monitoring, modes, command authority, crew coordination, workload, and training). This paper concludes with example principles and guidelines for considering 'glass cockpit' human/automation integration within space transportation systems.
Automated Solvent Seaming of Large Polyimide Membranes
NASA Technical Reports Server (NTRS)
Rood, Robert; Moore, James D.; Talley, Chris; Gierow, Paul A.
2006-01-01
A solvent-based welding process enables the joining of precise, cast polyimide membranes at their edges to form larger precise membranes. The process creates a homogeneous, optical-quality seam between abutting membranes, with no overlap and with only a very localized area of figure disturbance. The seam retains 90 percent of the strength of the parent material. The process was developed for original use in the fabrication of wide-aperture membrane optics, with areal densities of less than 1 kg/m2, for lightweight telescopes, solar concentrators, antennas, and the like to be deployed in outer space. The process is just as well applicable to the fabrication of large precise polyimide membranes for flat or inflatable solar concentrators and antenna reflectors for terrestrial applications. The process is applicable to cast membranes made of CP1 (or equivalent) polyimide. The process begins with the precise fitting together and fixturing of two membrane segments. The seam is formed by applying a metered amount of a doped solution of the same polyimide along the abutting edges of the membrane segments. After the solution has been applied, the fixtured films are allowed to dry and are then cured by convective heating. The weld material is the same as the parent material, so that what is formed is a homogeneous, strong joint that is almost indistinguishable from the parent material. The success of the process is highly dependent on formulation of the seaming solution from the correct proportion of the polyimide in a suitable solvent. In addition, the formation of reliable seams depends on the deposition of a precise amount of the seaming solution along the seam line. To ensure the required precision, deposition is performed by use of an automated apparatus comprising a modified commercially available, large-format, ink-jet print head on an automated positioning table. The printing head jets the seaming solution into the seam area at a rate controlled in coordination with the movement of the positioning table.
Kukhareva, Polina V; Kawamoto, Kensaku; Shields, David E; Barfuss, Darryl T; Halley, Anne M; Tippetts, Tyler J; Warner, Phillip B; Bray, Bruce E; Staes, Catherine J
2014-01-01
Electronic quality measurement (QM) and clinical decision support (CDS) are closely related but are typically implemented independently, resulting in significant duplication of effort. While it seems intuitive that technical approaches could be re-used across these two related use cases, such reuse is seldom reported in the literature, especially for standards-based approaches. Therefore, we evaluated the feasibility of using a standards-based CDS framework aligned with anticipated EHR certification criteria to implement electronic QM. The CDS-QM framework was used to automate a complex national quality measure (SCIP-VTE-2) at an academic healthcare system which had previously relied on time-consuming manual chart abstractions. Compared with 305 manually-reviewed reference cases, the recall of automated measurement was 100%. The precision was 96.3% (CI:92.6%-98.5%) for ascertaining the denominator and 96.2% (CI:92.3%-98.4%) for the numerator. We therefore validated that a standards-based CDS-QM framework can successfully enable automated QM, and we identified benefits and challenges with this approach. PMID:25954389
EDTA analysis on the Roche MODULAR analyser.
Davidson, D F
2007-05-01
Patient specimens can be subject to subtle interference from cross contamination by liquid-based, potassium-containing EDTA anticoagulant, leading to misinterpretation of results. A rapid method for EDTA analysis to detect such contamination is described. An in-house EDTA assay on the Roche MODULAR analyser was assessed for accuracy and precision by comparison with an adjusted calcium difference measurement (atomic absorption and o-cresolphthalein complexone colorimetry). EDTA method versus adjusted calcium difference showed: slope = 1.038 (95% confidence interval [CI] 0.949-1.131); intercept = 0.073 (95% CI 0.018-0.132) mmol/L; r = 0.914; n = 94. However, inter-assay precision of the calcium difference method was estimated to be poorer (coefficient of variation 24.8% versus 3.4% for the automated colorimetric method at an EDTA concentration of 0.25 mmol/L). Unequivocal contamination was observed at an EDTA concentration of ≥0.2 mmol/L. The automated method showed positive interference from haemolysis and negative interference from oxalate. The method was unaffected by lipaemia (triglycerides <20 mmol/L), icterus (bilirubin <500 μmol/L), glucose (<100 mmol/L), iron (<100 μmol/L), and citrate, phosphate or fluoride (all <2.5 mmol/L). The automated colorimetric assay described is an accurate, precise and rapid (3 min) means of detecting EDTA contamination of unhaemolysed biochemistry specimens.
Trajectory Specification for Automation of Terminal Air Traffic Control
NASA Technical Reports Server (NTRS)
Paielli, Russell A.
2016-01-01
"Trajectory specification" is the explicit bounding and control of aircraft tra- jectories such that the position at each point in time is constrained to a precisely defined volume of space. The bounding space is defined by cross-track, along-track, and vertical tolerances relative to a reference trajectory that specifies position as a function of time. The tolerances are dynamic and will be based on the aircraft nav- igation capabilities and the current traffic situation. A standard language will be developed to represent these specifications and to communicate them by datalink. Assuming conformance, trajectory specification can guarantee safe separation for an arbitrary period of time even in the event of an air traffic control (ATC) sys- tem or datalink failure, hence it can help to achieve the high level of safety and reliability needed for ATC automation. As a more proactive form of ATC, it can also maximize airspace capacity and reduce the reliance on tactical backup systems during normal operation. It applies to both enroute airspace and the terminal area around airports, but this paper focuses on arrival spacing in the terminal area and presents ATC algorithms and software for achieving a specified delay of runway arrival time.
Prospects for high-precision pulsar timing with the new Effelsberg PSRIX backend
NASA Astrophysics Data System (ADS)
Lazarus, P.; Karuppusamy, R.; Graikou, E.; Caballero, R. N.; Champion, D. J.; Lee, K. J.; Verbiest, J. P. W.; Kramer, M.
2016-05-01
The PSRIX backend has been the primary pulsar timing instrument of the Effelsberg 100-m radio telescope since early 2011. This new ROACH-based system enables bandwidths up to 500 MHz to be recorded, significantly more than what was possible with its predecessor, the Effelsberg-Berkeley Pulsar Processor (EBPP). We review the first four years of PSRIX timing data for 33 pulsars collected as part of the monthly European Pulsar Timing Array (EPTA) observations. We describe the automated data analysis pipeline, COASTGUARD, that we developed to reduce these observations. We also introduce TOASTER, the EPTA timing data base, used to store timing results, processing information and observation metadata. Using these new tools, we measure the phase-averaged flux densities at 1.4 GHz of all 33 pulsars. For seven of these pulsars, our flux density measurements are the first values ever reported. For the other 26 pulsars, we compare our flux density measurements with previously published values. By comparing PSRIX data with EBPP data, we find an improvement of ~2-5 times in signal-to-noise ratio, which translates to an increase of ~2-5 times in pulse time-of-arrival (TOA) precision. We show that such an improvement in TOA precision will improve the sensitivity to the stochastic gravitational wave background. Finally, we showcase the flexibility of the new PSRIX backend by observing several millisecond-period pulsars (MSPs) at 5 and 9 GHz. Motivated by our detections, we discuss the potential for complementing existing pulsar timing array data sets with MSP monitoring campaigns at these higher frequencies.
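The quoted link between S/N and TOA precision follows from the standard first-order template-matching relation (a textbook scaling, not specific to this paper), where W is the effective pulse width:

$$ \sigma_{\mathrm{TOA}} \;\approx\; \frac{W}{\mathrm{S/N}} $$

so at fixed pulse shape, a ~2-5 times gain in S/N yields the same factor in TOA precision.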
The color bar phase meter: A simple and economical method for calibrating crystal oscillators
NASA Technical Reports Server (NTRS)
Davis, D. D.
1973-01-01
Comparison of crystal oscillators to the rubidium stabilized color burst is made easy and inexpensive by use of the color bar phase meter. Required equipment consists of an unmodified color TV receiver, a color bar synthesizer and a stop watch (a wrist watch or clock with sweep second hand may be used with reduced precision). Measurement precision of 1×10⁻¹⁰ can be realized in measurement times of less than two minutes. If the color bar synthesizer were commercially available, user cost should be less than $200.00, exclusive of the TV receiver. Parts cost for the color bar synthesizer, which translates the crystal oscillator frequency to 3.579 MHz and modulates the received RF signal before it is fed to the receiver antenna terminals, is about $25.00. A more sophisticated automated version, with precision of 1×10⁻¹¹, would cost about twice as much.
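The quoted precision is consistent with simple phase-drift arithmetic: for a phase (time) offset ΔT accumulated over a measurement interval τ, the fractional frequency offset is

$$ \frac{\Delta f}{f_0} \;=\; \frac{\Delta T}{\tau} $$

For example, resolving ΔT ≈ 12 ns against the 3.579545 MHz color burst (about 4% of its 279 ns period) over τ = 120 s gives Δf/f₀ ≈ 1×10⁻¹⁰. This worked example is mine; the report states only the resulting precision.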
Automated extraction of radiation dose information from CT dose report images.
Li, Xinhua; Zhang, Da; Liu, Bob
2011-06-01
The purpose of this article is to describe the development of an automated tool for retrieving texts from CT dose report images. Optical character recognition was adopted to perform text recognitions of CT dose report images. The developed tool is able to automate the process of analyzing multiple CT examinations, including text recognition, parsing, error correction, and exporting data to spreadsheets. The results were precise for total dose-length product (DLP) and were about 95% accurate for CT dose index and DLP of scanned series.
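A minimal sketch of the recognition-then-parse step described here (the field patterns and the O-to-0 correction are assumptions; the article does not publish its rules):

```python
import re

def parse_dose_report(ocr_text: str) -> dict:
    """Extract CTDIvol and DLP values from OCR'd dose-report text."""
    # Correct a common OCR confusion: letter O following a digit is a zero.
    cleaned = re.sub(r"(?<=\d)[Oo]", "0", ocr_text)
    values = {}
    for field, pattern in [("CTDIvol_mGy", r"CTDIvol\D*([\d.]+)"),
                           ("DLP_mGycm", r"DLP\D*([\d.]+)")]:
        m = re.search(pattern, cleaned, flags=re.IGNORECASE)
        if m:
            values[field] = float(m.group(1))
    return values

print(parse_dose_report("Series 2  CTDIvol: 12.4O mGy  DLP: 561 mGy-cm"))
# {'CTDIvol_mGy': 12.4, 'DLP_mGycm': 561.0}
```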
Automated Absorber Attachment for X-ray Microcalorimeter Arrays
NASA Technical Reports Server (NTRS)
Moseley, S.; Allen, Christine; Kilbourne, Caroline; Miller, Timothy M.; Costen, Nick; Schulte, Eric; Moseley, Samuel J.
2007-01-01
Our goal is to develop a method for the automated attachment of large numbers of absorber tiles to large format detector arrays. This development includes the fabrication of high quality, closely spaced HgTe absorber tiles that are properly positioned for pick-and-place by our FC150 flip chip bonder. The FC150 also transfers the appropriate minute amount of epoxy to the detectors for permanent attachment of the absorbers. The success of this development will replace an arduous, risky and highly manual task with a reliable, high-precision automated process.
NMRNet: A deep learning approach to automated peak picking of protein NMR spectra.
Klukowski, Piotr; Augoff, Michal; Zieba, Maciej; Drwal, Maciej; Gonczarek, Adam; Walczak, Michal J
2018-03-14
Automated selection of signals in protein NMR spectra, known as peak picking, has been studied for over 20 years; nevertheless, existing peak picking methods are still largely deficient. Accurate and precise automated peak picking would accelerate structure calculation and the analysis of dynamics and interactions of macromolecules. Recent advances in handling big data, together with an outburst of machine learning techniques, offer an opportunity to tackle the peak picking problem substantially faster than manual picking and on par with human accuracy. In particular, deep learning has proven to systematically achieve human-level performance in various recognition tasks, and thus emerges as an ideal tool to address automated identification of NMR signals. We have applied a convolutional neural network for visual analysis of multidimensional NMR spectra. A comprehensive test on 31 manually-annotated spectra has demonstrated top-tier average precision (AP) of 0.9596, 0.9058 and 0.8271 for backbone, side-chain and NOESY spectra, respectively. Furthermore, a combination of extracted peak lists with the automated assignment routine FLYA outperformed other methods, including the manual one, and led to correct resonance assignment at the levels of 90.40%, 89.90% and 90.20% for three benchmark proteins. The proposed model is part of the Dumpling software (platform for protein NMR data analysis), available at https://dumpling.bio/. Contact: michaljerzywalczak@gmail.com, piotr.klukowski@pwr.edu.pl. Supplementary data are available at Bioinformatics online.
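For reference, the average precision reported for a scored peak detector can be computed as below; the candidate labels and scores are placeholders, not Dumpling's actual pipeline:

```python
from sklearn.metrics import average_precision_score

# 1 = true peak, 0 = artifact/noise, paired with the detector's confidence scores.
y_true = [1, 1, 0, 1, 0, 0, 1, 0]
y_score = [0.95, 0.90, 0.80, 0.75, 0.40, 0.30, 0.85, 0.20]
print(f"AP = {average_precision_score(y_true, y_score):.3f}")  # AP = 0.950
```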
Wang, Ying; Coiera, Enrico; Runciman, William; Magrabi, Farah
2017-06-12
Approximately 10% of admissions to acute-care hospitals are associated with an adverse event. Analysis of incident reports helps to understand how and why incidents occur and can inform policy and practice for safer care. Unfortunately, our capacity to monitor and respond to incident reports in a timely manner is limited by the sheer volumes of data collected. In this study, we aim to evaluate the feasibility of using multiclass classification to automate the identification of patient safety incidents in hospitals. Text-based classifiers were applied to identify 10 incident types and 4 severity levels. Using the one-versus-one (OvsO) and one-versus-all (OvsA) ensemble strategies, we evaluated regularized logistic regression, linear support vector machine (SVM) and SVM with a radial-basis function (RBF) kernel. Classifiers were trained and tested with "balanced" datasets (n_Type = 2860, n_SeverityLevel = 1160) from a state-wide incident reporting system. Testing was also undertaken with imbalanced "stratified" datasets (n_Type = 6000, n_SeverityLevel = 5950) from the state-wide system and an independent hospital reporting system. Classifier performance was evaluated using a confusion matrix, as well as F-score, precision and recall. The most effective combination was an OvsO ensemble of binary SVM RBF classifiers with binary count feature extraction. For incident type, classifiers performed well on balanced and stratified datasets (F-score: 78.3% and 73.9%), but were worse on independent datasets (68.5%). Reports about falls, medications, pressure injury, aggression and blood products were identified with high recall and precision. "Documentation" was the hardest type to identify. For severity level, the F-score was 87.3% for severity assessment code (SAC) 1 (extreme risk) and 64% for SAC4 (low risk) on balanced data. With stratified data, high recall was achieved for SAC1 (82.8-84%) but precision was poor (6.8-11.2%). High risk incidents (SAC2) were confused with medium risk incidents (SAC3). Binary classifier ensembles appear to be a feasible method for identifying incidents by type and severity level. Automated identification should enable safety problems to be detected and addressed in a more timely manner. Multi-label classifiers may be necessary for reports that relate to more than one incident type.
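A minimal sketch of the best-performing configuration as described, binary token counts feeding a one-vs-one ensemble of RBF-kernel SVMs; the toy reports, labels, and default parameters are placeholders:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.multiclass import OneVsOneClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

reports = ["patient fell out of bed overnight",
           "wrong dose of medication administered",
           "stage II pressure injury noted on heel",
           "patient fall in bathroom, no injury"]
labels = ["falls", "medications", "pressure injury", "falls"]

clf = make_pipeline(
    CountVectorizer(binary=True),            # binary count features
    OneVsOneClassifier(SVC(kernel="rbf")),   # OvsO ensemble of RBF SVMs
)
clf.fit(reports, labels)
print(clf.predict(["medication given at wrong time"]))
```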
Peirone, Laura S; Pereyra Irujo, Gustavo A; Bolton, Alejandro; Erreguerena, Ignacio; Aguirrezábal, Luis A N
2018-01-01
Conventional field phenotyping for drought tolerance, the most important factor limiting yield at a global scale, is labor-intensive and time-consuming. Automated greenhouse platforms can increase the precision and throughput of plant phenotyping and contribute to a faster release of drought tolerant varieties. The aim of this work was to establish a framework of analysis to identify early traits which could be efficiently measured in a greenhouse automated phenotyping platform, for predicting the drought tolerance of field grown soybean genotypes. A group of genotypes was evaluated, which showed variation in their drought susceptibility index (DSI) for final biomass and leaf area. A large number of traits were measured before and after the onset of a water deficit treatment, which were analyzed under several criteria: the significance of the regression with the DSI, phenotyping cost, earliness, and repeatability. The most efficient trait was found to be transpiration efficiency measured at 13 days after emergence. This trait was further tested in a second experiment with different water deficit intensities, and validated using a different set of genotypes against field data from a trial network in a third experiment. The framework applied in this work for assessing traits under different criteria could be helpful for selecting those most efficient for automated phenotyping.
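The drought susceptibility index used for ranking genotypes is conventionally the Fischer-Maurer form, computed from yield (or biomass) under water deficit, Y_d, and under well-watered conditions, Y_p, normalized by the trial-wide deficit intensity D; the abstract does not spell the formula out, so this is the standard definition rather than the authors' notation:

$$ \mathrm{DSI}_i \;=\; \frac{1 - Y_{d,i}/Y_{p,i}}{D}, \qquad D \;=\; 1 - \frac{\overline{Y_d}}{\overline{Y_p}} $$

Genotypes with DSI < 1 are less susceptible than the trial average.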
NASA Astrophysics Data System (ADS)
Obersteiner, F.; Bönisch, H.; Engel, A.
2016-01-01
We present the characterization and application of a new gas chromatography time-of-flight mass spectrometry instrument (GC-TOFMS) for the quantitative analysis of halocarbons in air samples. The setup comprises three fundamental enhancements compared to our earlier work (Hoker et al., 2015): (1) full automation, (2) a mass resolving power R = m/Δm of the TOFMS (Tofwerk AG, Switzerland) increased up to 4000 and (3) a fully accessible data format of the mass spectrometric data. Automation in combination with the accessible data allowed an in-depth characterization of the instrument. Mass accuracy was found to be approximately 5 ppm on average after automatic recalibration of the mass axis in each measurement. A TOFMS configuration giving R = 3500 was chosen to provide an R-to-sensitivity ratio suitable for our purpose. Calculated detection limits are as low as a few femtograms by means of the accurate mass information. The precision for substance quantification was 0.15 % at best for an individual measurement and in general mainly determined by the signal-to-noise ratio of the chromatographic peak. Detector non-linearity was found to be insignificant up to a mixing ratio of roughly 150 ppt at 0.5 L sampled volume. At higher concentrations, non-linearities of a few percent were observed (precision level: 0.2 %) but could be attributed to a potential source within the detection system. A straightforward correction for those non-linearities was applied in data processing, again by exploiting the accurate mass information. Based on the overall characterization results, the GC-TOFMS instrument was found to be very well suited for the task of quantitative halocarbon trace gas observation and a big step forward compared to scanning quadrupole MS with low mass resolving power and a TOFMS technique reported to be non-linear and restricted by a small dynamic range.
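For orientation, the two figures of merit quoted above are related by the generic definitions

$$ R \;=\; \frac{m}{\Delta m}, \qquad \delta_{\mathrm{ppm}} \;=\; \frac{m_{\mathrm{meas}} - m_{\mathrm{exact}}}{m_{\mathrm{exact}}} \times 10^{6} $$

so at m/z 300 the chosen R = 3500 corresponds to a peak width Δm ≈ 0.086 u, while the ~5 ppm mass accuracy corresponds to only about 0.0015 u. The worked numbers are illustrative, not taken from the paper.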
Elrod, JoAnn Broeckel; Merchant, Raina; Daya, Mohamud; Youngquist, Scott; Salcido, David; Valenzuela, Terence; Nichol, Graham
2017-01-01
Introduction Lay use of automated external defibrillators (AEDs) before the arrival of emergency medical services (EMS) providers on scene increases survival after out-of-hospital cardiac arrest (OHCA). However, AEDs placed in public locations may not be ready for use when needed. We describe a protocol for AED surveillance that tracks these devices through time and space to improve public health and survival, as well as to facilitate research. Methods and analysis Included AEDs are installed in public locations for use by laypersons to treat patients with OHCA before the arrival of EMS providers on scene. Included cases of OHCA are patients evaluated by organised EMS personnel and treated for OHCA. Enrolment of 10 000 AEDs annually will yield precision of 0.4% in the estimate of readiness for use. Enrolment of 2500 patients annually will yield precision of 1.9% in the estimate of survival to hospital discharge. Recruitment began on 21 Mar 2014 and is ongoing. AEDs are found by using multiple methods. Each AED is then tagged with a label bearing a unique two-dimensional (2D) matrix code; the 2D matrix code is recorded and the location and status of the AED tracked using a smartphone; these elements are automatically passed via the internet to a secure and confidential database in real time. Whenever the 2D matrix code is rescanned for any non-clinical or clinical use of an AED, the user is queried to answer a finite set of questions about the device status. The primary outcome of any clinical use of an AED is survival to hospital discharge. Results are summarised descriptively. Ethics and dissemination These activities are conducted under a grant of authority for public health surveillance from the Food and Drug Administration. Results are provided periodically to participating sites and sponsors to improve public health and quality of care. PMID:28360255
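The stated precision targets match the normal-approximation half-width of a binomial proportion; the proportions below are round numbers of mine chosen to reproduce the quoted figures, not values from the protocol:

```python
from math import sqrt

def ci_half_width(p: float, n: int, z: float = 1.96) -> float:
    """95% normal-approximation half-width for a binomial proportion."""
    return z * sqrt(p * (1 - p) / n)

print(f"{ci_half_width(0.96, 10_000):.3%}")  # readiness-for-use: ~0.4%
print(f"{ci_half_width(0.35, 2_500):.3%}")   # survival to discharge: ~1.9%
```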
Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas
2014-10-21
In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow the high-throughput screening of drugs and/or their metabolites in complex body fluids like plasma. Liquid-liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples using direct-infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure assisted mixing followed by passive phase separation, coupled online to nanoelectrospray-DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked to human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R(2) ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from small sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots, showing excellent linearity (R(2) of 0.998) and precision (RSD of 9%). In conclusion, we present the proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated micro-LLE method, coupled online to a commercially available nano-ESI-DI-MS.
NASA Technical Reports Server (NTRS)
1996-01-01
In order to more easily join the huge sections of the Space Shuttle external tank, Marshall Space Flight Center initiated development of the existing concept of Variable Polarity Plasma Arc (VPPA) welding. VPPA welding employs a variable current waveform that allows the system to operate for preset time increments in either of two polarity modes for effective joining of light alloys. Marshall awarded the torch contract to B & B Precision Machine, which produced a torch for the Shuttle, then automated the system, and eventually delivered a small torch used by companies such as Whirlpool for sheet metal welding of appliance parts and other applications. The dependability of the torch offers cost and time advantages.
NASA Astrophysics Data System (ADS)
Lynch, John A.; Zaim, Souhil; Zhao, Jenny; Peterfy, Charles G.; Genant, Harry K.
2001-07-01
In osteoarthritis, articular cartilage loses integrity and becomes thinned. This usually occurs at sites which bear weight during normal use. Measurement of such loss from MRI scans requires precise and reproducible techniques, which can overcome the difficulties of patient repositioning within the scanner. In this study, we combine a previously described technique for segmentation of cartilage from MRI of the knee with a technique for 3D image registration that matches localized regions of interest at followup and baseline. Two patients, who had recently undergone meniscal surgery and developed lesions during the 12 month followup period, were examined. Image registration matched regions of interest (ROI) between baseline and followup, and changes within the cartilage lesions were estimated to be about a 16% reduction in cartilage volume within each ROI. This was more than 5 times the reproducibility of the measurement, but only represented a change of between 1 and 2% in total femoral cartilage volume. Changes in total cartilage volume may therefore be insensitive for quantifying changes in cartilage morphology. A combined use of automated image segmentation with 3D image registration could be a useful tool for the precise and sensitive measurement of localized changes in cartilage from MRI of the knee.
NASA Astrophysics Data System (ADS)
Qi, Yulin; Müller, Miriam; Stokes, Caroline S.; Volmer, Dietrich A.
2018-04-01
LC-MS/MS is widely utilized today for quantification of vitamin D in biological fluids. Mass spectrometric assays for vitamin D, however, require very careful method optimization for precise, accurate and interference-free analyses. Here, we explore chemical derivatization and matrix-assisted laser desorption/ionization (MALDI) as a rapid alternative for quantitative measurement of 25-hydroxyvitamin D3 in human serum, and compare it to results from LC-MS/MS. The method implemented an automated imaging step of each MALDI spot to locate areas of high intensity, avoid sweet-spot phenomena, and thus improve precision. There was no statistically significant difference in vitamin D quantification between MALDI-MS/MS and LC-MS/MS: the mean ± standard deviation for the sum of the 25-hydroxyvitamin D epimers was 29.4 ± 10.3 ng/mL for MALDI-MS/MS versus 30.3 ± 11.2 ng/mL for LC-MS/MS (P = 0.128). The MALDI-based assay avoided time-consuming chromatographic separation steps and was thus much faster than the LC-MS/MS assay. It also consumed less sample, required no organic solvents, and was readily automated. In this proof-of-concept study, MALDI-MS readily demonstrated its potential for mass spectrometric quantification of vitamin D compounds in biological fluids.
Gröschel, J; Philipp, F; Skonetzki, St; Genzwürker, H; Wetter, Th; Ellinger, K
2004-02-01
Precise documentation of medical treatment in emergency medical missions and for resuscitation is essential from a medical, legal and quality assurance point of view [Anästhesiologie und Intensivmedizin, 41 (2000) 737]. All conventional methods of time recording are either too inaccurate or too elaborate for routine application. Automated speech recognition may offer a solution. A dedicated program for documenting all time events was developed. Standard speech recognition software (IBM ViaVoice 7.0) was adapted and installed on two different computer systems. One was a stationary PC (500 MHz Pentium III, 128 MB RAM, Soundblaster PCI 128 soundcard, Win NT 4.0); the other was a mobile pen-PC that had already proven its value during emergency missions [Der Notarzt 16, p. 177] (Fujitsu Stylistic 2300, 230 MHz MMX processor, 160 MB RAM, embedded soundcard ESS 1879 chipset, Win98 2nd ed.). On both computers two different microphones were tested: the standard headset that came with the recognition software, and a small microphone (Lavalier-Kondensatormikrofon EM 116 from Vivanco) that could be attached to the operator's collar. Seven women and 15 men spoke a text with 29 phrases to be recognised. Two emergency physicians tested the system in a simulated emergency setting using the collar microphone and the pen-PC with an analogue wireless connection. Overall recognition was best for the PC with headset (89%), followed by the pen-PC with headset (85%), the PC with collar microphone (84%) and the pen-PC with collar microphone (80%); the differences were not statistically significant. Recognition became significantly worse (89.5% versus 82.3%, P < 0.0001) when numbers had to be recognised. The gender of the speaker and the number of words in a sentence had no influence. Average recognition in the simulated emergency setting was 75%. At no time did false recognition occur. Time recording with automated speech recognition appears feasible in emergency medical missions. Although the results show an average recognition of only 75%, it is possible that missing elements can be reconstructed more precisely than with conventional methods. Future technology should integrate a secure wireless connection between microphone and mobile computer. The system could then prove its value in real out-of-hospital emergencies.
A manic-depressive symptom self-report in optical scanable format.
Glick, Henry A; McBride, Linda; Bauer, Mark S
2003-10-01
The Internal State Scale (ISS) is a self-report instrument that allows the simultaneous assessment of both manic and depressive symptoms in individuals with manic-depressive disorder. Prior work indicates that subscales are highly correlated with clinician ratings of mania and depression and provide a discriminant function that identifies individuals in manic/hypomanic, mixed, depressed, and euthymic mood states. A drawback to the ISS is that its items were developed in the visual analogue scale (VAS) format, which is labor-intensive to score, particularly with repeat (e.g. daily) administration. A Likert-based format would allow quick and easy optical scanning for which scoring could be automated. To compare discriminating properties in Likert versus VAS format, we re-analyzed previously collected data and collected new data: (a) VAS-based ISS scores from 86 subjects from a prior four-site study were re-analyzed by collapsing scores into 20 and then 10 Likert-based bins to assess loss of precision from collapsing scores, and (b) 24 additional subjects were administered the ISS in VAS and Likert formats to assess loss of precision due to instrument completion factors. Discriminant ability, including kappas and receiver operating characteristic curves, was unchanged across the two formats. Within-subjects reliability was uniformly high across formats. Likert-based scoring of the ISS can be used without loss of precision, thus making automated scoring of the ISS feasible. This format will be particularly useful for studies that require processing of large numbers of ISSs, such as those that collect frequent ratings over long periods of time and/or those that utilize large samples.
Software for Automated Image-to-Image Co-registration
NASA Technical Reports Server (NTRS)
Benkelman, Cody A.; Hughes, Heidi
2007-01-01
The project objectives are to: a) develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) provide automated testing for quantitative analysis; and d) develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.
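The abstract does not name the algorithm; one standard route to subpixel image-to-image co-registration is upsampled phase correlation, sketched here with scikit-image as an illustrative stand-in, not the SDK's actual method:

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage import data
from skimage.registration import phase_cross_correlation

reference = data.camera().astype(float)
moving = nd_shift(reference, (3.25, -1.60))   # apply a known subpixel offset

# upsample_factor=100 resolves the shift to ~1/100 of a pixel
shift, error, _ = phase_cross_correlation(reference, moving,
                                          upsample_factor=100)
print(shift)  # recovers the applied offset (sign per skimage's convention)
```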
Automated cell counts on CSF samples: A multicenter performance evaluation of the GloCyte system.
Hod, E A; Brugnara, C; Pilichowska, M; Sandhaus, L M; Luu, H S; Forest, S K; Netterwald, J C; Reynafarje, G M; Kratz, A
2018-02-01
Automated cell counters have replaced manual enumeration of cells in blood and most body fluids. However, due to the unreliability of automated methods at very low cell counts, most laboratories continue to perform labor-intensive manual counts on many or all cerebrospinal fluid (CSF) samples. This multicenter clinical trial investigated whether the GloCyte System (Advanced Instruments, Norwood, MA), a recently FDA-approved automated cell counter which concentrates and enumerates red blood cells (RBCs) and total nucleated cells (TNCs), is sufficiently accurate and precise at very low cell counts to replace all manual CSF counts. The GloCyte System concentrates CSF and stains RBCs with fluorochrome-labeled antibodies and TNCs with nucleic acid dyes. RBCs and TNCs are then counted by digital image analysis. Residual adult and pediatric CSF samples obtained for clinical analysis at five different medical centers were used for the study. Cell counts were performed by the manual hemocytometer method and with the GloCyte System following the same protocol at all sites. The limits of blank, detection, and quantitation, as well as the precision and accuracy of the GloCyte, were determined. The GloCyte detected as few as 1 TNC/μL and 1 RBC/μL, and reliably counted as low as 3 TNCs/μL and 2 RBCs/μL. The total coefficient of variation was less than 20%. Comparison with cell counts obtained with a hemocytometer showed good correlation (>97%) between the GloCyte and the hemocytometer, including at very low cell counts. The GloCyte instrument is a precise, accurate, and stable system to obtain red cell and nucleated cell counts in CSF samples. It allows for the automated enumeration of even very low cell numbers, which is crucial for CSF analysis. These results suggest that GloCyte is an acceptable alternative to the manual method for all CSF samples, including those with normal cell counts. © 2017 John Wiley & Sons Ltd.
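For context, limits of blank and detection are conventionally derived per CLSI EP17 from blank and low-level replicates; a sketch with invented replicate data, not the trial's:

```python
import numpy as np

blank = np.array([0.0, 0.4, 0.2, 0.0, 0.3, 0.1])   # cells/uL, hypothetical blanks
low = np.array([2.1, 1.6, 2.4, 1.8, 2.0, 2.3])     # hypothetical near-limit sample

lob = blank.mean() + 1.645 * blank.std(ddof=1)      # limit of blank
lod = lob + 1.645 * low.std(ddof=1)                 # limit of detection
print(f"LoB = {lob:.2f}, LoD = {lod:.2f} cells/uL")
```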
Simulation Test Of Descent Advisor
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Green, Steven M.
1991-01-01
Report describes piloted-simulation test of Descent Advisor (DA), subsystem of larger automation system being developed to assist human air-traffic controllers and pilots. Focuses on results of piloted simulation, in which airline crews executed controller-issued descent advisories along standard curved-path arrival routes. Crews able to achieve arrival-time precision of plus or minus 20 seconds at metering fix. Analysis of errors generated in turns resulted in further enhancements of algorithm to increase accuracies of its predicted trajectories. Evaluations by pilots indicate general support for DA concept and provide specific recommendations for improvement.
Determining Tooth Occlusal Surface Relief Indicator by Means of Automated 3D Shape Analysis
NASA Astrophysics Data System (ADS)
Gaboutchian, A. V.; Knyaz, V. A.
2017-05-01
Determining the occlusal surface relief indicator plays an important role in odontometric tooth shape analysis. Analysis of the parameters of surface relief indicators provides valuable information about the closure of dental arches (occlusion) and lifetime changes in tooth structure. Such data are relevant for dentistry and anthropology applications. Descriptive techniques commonly used for surface relief evaluation have limited precision, which does not provide for reliable conclusions about the structure and functioning of teeth. Parametric techniques developed for such applications need special facilities and are time-consuming, which limits their adoption and accessibility. Nevertheless, the use of 3D models obtained by photogrammetric techniques allows attaining the required measurement accuracy and has potential for process automation. We introduce new approaches for determining the tooth occlusal surface relief indicator and provide data on the efficiency of different indicators in evaluating natural attrition.
Microfluidic Exosome Analysis toward Liquid Biopsy for Cancer.
He, Mei; Zeng, Yong
2016-08-01
Assessment of a tumor's molecular makeup using biofluid samples, known as liquid biopsy, is a prominent research topic in precision medicine for cancer, due to its noninvasive property allowing repeat sampling for monitoring molecular changes of tumors over time. Circulating exosomes recently have been recognized as promising tumor surrogates because they deliver enriched biomarkers, such as proteins, RNAs, and DNA. However, purification and characterization of these exosomes are technically challenging. Microfluidic lab-on-a-chip technology effectively addresses these challenges owing to its inherent advantages in integration and automation of multiple functional modules, enhancing sensing performance, and expediting analysis processes. In this article, we review the state-of-the-art development of microfluidic technologies for exosome isolation and molecular characterization with emphasis on their applications toward liquid biopsy-based analysis of cancer. Finally, we share our perspectives on current challenges and future directions of microfluidic exosome analysis. © 2016 Society for Laboratory Automation and Screening.
Monazzam, Azita; Razifar, Pasha; Lindhe, Örjan; Josephsson, Raymond; Långström, Bengt; Bergström, Mats
2005-01-01
Background Considering the widespread use and importance of Multicellular Tumor Spheroids (MTS) in oncology research, size determination of MTSs by an accurate and fast method is essential. In the present study an effective, fast and semi-automated method, SASDM, was developed to determine the size of MTSs. The method was applied and tested in MTSs of three different cell-lines. Frozen section autoradiography and Hematoxylin and Eosin (H&E) staining were used for further confirmation. Results SASDM was shown to be effective, user-friendly, and time efficient, and to be more precise than the traditional methods, and it was applicable for MTSs of different cell-lines. Furthermore, the results of image analysis showed high correspondence to the results of autoradiography and staining. Conclusion The combination of assessment of metabolic condition and image analysis in MTSs provides a good model to evaluate the effect of various anti-cancer treatments. PMID:16283948
NASA Astrophysics Data System (ADS)
Ni, Guangming; Liu, Lin; Zhang, Jing; Liu, Juanxiu; Liu, Yong
2018-01-01
With the development of the liquid crystal display (LCD) module industry, LCD modules are becoming larger and more precise, which imposes stringent imaging requirements for automated optical inspection (AOI). Here, we report a high-resolution, clearly focused imaging optomechatronic system for precise LCD module bonding AOI inspection. It achieves high-resolution imaging for LCD module bonding AOI inspection using a line scan camera (LSC) triggered by a linear optical encoder, with self-adaptive focusing over the whole large imaging region via the LSC and a laser displacement sensor, which reduces the requirements on machining, assembly, and motion control of AOI devices. Results show that this system can directly achieve clearly focused imaging for AOI inspection of large LCD module bonding with 0.8 μm image resolution and a 2.65-mm scan imaging width, with no theoretical limit on imaging width. All of these are significant for AOI inspection in the LCD module industry and other fields that require imaging large regions with high resolution.
Precision manufacturing for clinical-quality regenerative medicines.
Williams, David J; Thomas, Robert J; Hourd, Paul C; Chandra, Amit; Ratcliffe, Elizabeth; Liu, Yang; Rayment, Erin A; Archer, J Richard
2012-08-28
Innovations in engineering applied to healthcare make a significant difference to people's lives. Market growth is guaranteed by demographics. Regulation and requirements for good manufacturing practice-extreme levels of repeatability and reliability-demand high-precision process and measurement solutions. Emerging technologies using living biological materials add complexity. This paper presents some results of work demonstrating the precision automated manufacture of living materials, particularly the expansion of populations of human stem cells for therapeutic use as regenerative medicines. The paper also describes quality engineering techniques for precision process design and improvement, and identifies the requirements for manufacturing technology and measurement systems evolution for such therapies.
Schneider, George J; Kuper, Kevin G; Abravaya, Klara; Mullen, Carolyn R; Schmidt, Marion; Bunse-Grassmann, Astrid; Sprenger-Haussels, Markus
2009-04-01
Automated sample preparation systems must meet the demands of routine diagnostics laboratories with regard to performance characteristics and compatibility with downstream assays. In this study, the performance of the QIAGEN EZ1 DSP Virus Kit on the BioRobot EZ1 DSP was evaluated in combination with the Abbott RealTime HIV-1, HCV, and HBV assays, followed by thermal cycling and detection on the Abbott m2000rt platform. The following performance characteristics were evaluated: linear range and precision, sensitivity, cross-contamination, effects of interfering substances and correlation. Linearity was observed within the tested ranges (for HIV-1: 2.0-6.0 log copies/ml, HCV: 1.3-6.9 log IU/ml, HBV: 1.6-7.6 log copies/ml). Excellent precision was obtained (inter-assay standard deviation for HIV-1: 0.06-0.17 log copies/ml (>2.17 log copies/ml), HCV: 0.05-0.11 log IU/ml (>2.09 log IU/ml), HBV: 0.03-0.07 log copies/ml (>2.55 log copies/ml)), with good sensitivity (95% hit rates for HIV-1: 50 copies/ml, HCV: 12.5 IU/ml, HBV: 10 IU/ml). No cross-contamination was observed, as well as no negative impact of elevated levels of various interfering substances. In addition, HCV and HBV viral load measurements after BioRobot EZ1 DSP extraction correlated well with those obtained after Abbott m2000sp extraction. This evaluation demonstrates that the QIAGEN EZ1 DSP Virus Kit provides an attractive solution for fully automated, low throughput sample preparation for use with the Abbott RealTime HIV-1, HCV, and HBV assays.
Piloted simulation of a ground-based time-control concept for air traffic control
NASA Technical Reports Server (NTRS)
Davis, Thomas J.; Green, Steven M.
1989-01-01
A concept for aiding air traffic controllers in efficiently spacing traffic and meeting scheduled arrival times at a metering fix was developed and tested in a real-time simulation. The automation aid, referred to as the ground-based 4-D descent advisor (DA), is based on accurate models of aircraft performance and weather conditions. The DA generates suggested clearances, including both top-of-descent-point and speed-profile data, for one or more aircraft in order to achieve specific time or distance separation objectives. The DA algorithm is used by the air traffic controller to resolve conflicts and issue advisories to arrival aircraft. A joint simulation was conducted using a piloted simulator and an advanced concept air traffic control simulation to study the acceptability and accuracy of the DA automation aid from both the pilot's and the air traffic controller's perspectives. The results of the piloted simulation are examined. In the piloted simulation, airline crews executed controller-issued descent advisories along standard curved-path arrival routes, and were able to achieve an arrival time precision of ±20 sec at the metering fix. An analysis of errors generated in turns resulted in further enhancements of the algorithm to improve the predictive accuracy. Evaluations by pilots indicate general support for the concept and provide specific recommendations for improvement.
Brand, Andrew; Bradley, Michael T
2016-02-01
Confidence interval (CI) widths were calculated for reported Cohen's d standardized effect sizes and examined in two automated surveys of published psychological literature. The first survey reviewed 1,902 articles from Psychological Science. The second survey reviewed a total of 5,169 articles from across the following four APA journals: Journal of Abnormal Psychology, Journal of Applied Psychology, Journal of Experimental Psychology: Human Perception and Performance, and Developmental Psychology. The median CI width for d was greater than 1 in both surveys. Hence, CI widths were, as Cohen (1994) speculated, embarrassingly large. Additional exploratory analyses revealed that CI widths varied across psychological research areas and that CI widths were not discernibly decreasing over time. The theoretical implications of these findings are discussed along with ways of reducing the CI widths and thus improving precision of effect size estimation.
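As a worked illustration of why d-based CIs are so wide at common sample sizes, using the standard large-sample approximation to the standard error of d (the sample sizes below are hypothetical):

```python
from math import sqrt

def ci_width_d(d: float, n1: int, n2: int, z: float = 1.96) -> float:
    """Approximate 95% CI width for Cohen's d (large-sample SE)."""
    se = sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return 2 * z * se

print(f"{ci_width_d(0.5, 20, 20):.2f}")    # ~1.26: wider than the effect itself
print(f"{ci_width_d(0.5, 200, 200):.2f}")  # ~0.40: narrows with n
```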
DOE Office of Scientific and Technical Information (OSTI.GOV)
Britten, J
WET-ETCH FIGURING (WEF) is an automated method of precisely figuring optical materials by the controlled application of aqueous etchant solution. This technology uses surface-tension-gradient-driven flow to confine and stabilize a wetted zone of an etchant solution or other aqueous processing fluid on the surface of an object. This wetted zone can be translated on the surface in a computer-controlled fashion for precise spatial control of the surface reactions occurring (e.g. chemical etching). WEF is particularly suitable for figuring very thin optical materials because it applies no thermal or mechanical stress to the material. Also, because the process is stress-free, the workpiece can be monitored during figuring using interferometric metrology, and the measurements obtained can be used to control the figuring process in real time, something that cannot be done with traditional figuring methods.
Zhang, Yu Shrike; Aleman, Julio; Shin, Su Ryon; Kim, Duckjin; Mousavi Shaegh, Seyed Ali; Massa, Solange; Riahi, Reza; Chae, Sukyoung; Hu, Ning; Avci, Huseyin; Zhang, Weijia; Silvestri, Antonia; Sanati Nezhad, Amir; Manbohi, Ahmad; De Ferrari, Fabio; Polini, Alessandro; Calzone, Giovanni; Shaikh, Noor; Alerasool, Parissa; Budina, Erica; Kang, Jian; Bhise, Nupura; Pourmand, Adel; Skardal, Aleksander; Shupe, Thomas; Bishop, Colin E.; Dokmeci, Mehmet Remzi; Atala, Anthony; Khademhosseini, Ali
2017-01-01
Organ-on-a-chip systems are miniaturized microfluidic 3D human tissue and organ models designed to recapitulate the important biological and physiological parameters of their in vivo counterparts. They have recently emerged as a viable platform for personalized medicine and drug screening. These in vitro models, featuring biomimetic compositions, architectures, and functions, are expected to replace the conventional planar, static cell cultures and bridge the gap between the currently used preclinical animal models and the human body. Multiple organoid models may be further connected together through the microfluidics in a manner similar to the way they are arranged in vivo, providing the capability to analyze multiorgan interactions. Although a wide variety of human organ-on-a-chip models have been created, there are limited efforts on the integration of multisensor systems. However, continual in situ measurement is critical for precise assessment of the microenvironment parameters and the dynamic responses of the organs to pharmaceutical compounds over extended periods of time. In addition, automated and noninvasive capability is strongly desired for long-term monitoring. Here, we report a fully integrated modular physical, biochemical, and optical sensing platform through a fluidics-routing breadboard, which operates organ-on-a-chip units in a continual, dynamic, and automated manner. We believe that this platform technology has paved a potential avenue to promote the performance of current organ-on-a-chip models in drug screening by integrating a multitude of real-time sensors to achieve automated in situ monitoring of biophysical and biochemical parameters. PMID:28265064
Automated time activity classification based on global positioning system (GPS) tracking data
Wu, Jun; Jiang, Chengsheng; Houston, Douglas; Baker, Dean; Delfino, Ralph
2011-11-14
Background Air pollution epidemiological studies are increasingly using global positioning system (GPS) to collect time-location data because they offer continuous tracking, high temporal resolution, and minimum reporting burden for participants. However, substantial uncertainties in the processing and classifying of raw GPS data create challenges for reliably characterizing time activity patterns. We developed and evaluated models to classify people's major time activity patterns from continuous GPS tracking data. Methods We developed and evaluated two automated models to classify major time activity patterns (i.e., indoor, outdoor static, outdoor walking, and in-vehicle travel) based on GPS time activity data collected under free living conditions for 47 participants (N = 131 person-days) from the Harbor Communities Time Location Study (HCTLS) in 2008 and supplemental GPS data collected from three UC-Irvine research staff (N = 21 person-days) in 2010. Time activity patterns used for model development were manually classified by research staff using information from participant GPS recordings, activity logs, and follow-up interviews. We evaluated two models: (a) a rule-based model that developed user-defined rules based on time, speed, and spatial location, and (b) a random forest decision tree model. Results Indoor, outdoor static, outdoor walking and in-vehicle travel activities accounted for 82.7%, 6.1%, 3.2% and 7.2% of manually-classified time activities in the HCTLS dataset, respectively. The rule-based model classified indoor and in-vehicle travel periods reasonably well (Indoor: sensitivity > 91%, specificity > 80%, and precision > 96%; in-vehicle travel: sensitivity > 71%, specificity > 99%, and precision > 88%), but the performance was moderate for outdoor static and outdoor walking predictions. No striking differences in performance were observed between the rule-based and the random forest models. The random forest model was fast and easy to execute, but was likely less robust than the rule-based model under the condition of biased or poor quality training data. Conclusions Our models can successfully identify indoor and in-vehicle travel points from the raw GPS data, but challenges remain in developing models to distinguish outdoor static points and walking. Accurate training data are essential in developing reliable models in classifying time-activity patterns. PMID:22082316
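A stripped-down sketch of the rule-based branch; the thresholds are invented placeholders, and the paper's actual rules also use spatial location and dwell time:

```python
def classify_point(speed_kmh: float, indoors_fix: bool) -> str:
    """Toy rule-based time-activity label for a single GPS point."""
    if indoors_fix:                 # e.g., degraded satellite signal at a known address
        return "indoor"
    if speed_kmh < 1.0:
        return "outdoor static"
    if speed_kmh < 6.0:
        return "outdoor walking"
    return "in-vehicle travel"

track = [(0.4, True), (0.8, False), (4.2, False), (45.0, False)]
print([classify_point(s, ind) for s, ind in track])
# ['indoor', 'outdoor static', 'outdoor walking', 'in-vehicle travel']
```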
Applications of an automated stem measurer for precision forestry
N. Clark
2001-01-01
Accurate stem measurements are required for the determination of many silvicultural prescriptions, i.e., what are we going to do with a stand of trees. This would only be amplified in a precision forestry context. Many methods have been proposed for optimal ways to evaluate stems for a variety of characteristics. These methods usually involve the acquisition of total...
NASA Astrophysics Data System (ADS)
Chow, Yu Ting; Chen, Shuxun; Wang, Ran; Liu, Chichi; Kong, Chi-Wing; Li, Ronald A.; Cheng, Shuk Han; Sun, Dong
2016-04-01
Cell transfection is a technique wherein foreign genetic molecules are delivered into cells. To elucidate distinct responses during cell genetic modification, methods to achieve transfection at the single-cell level are of great value. Herein, we developed an automated micropipette-based quantitative microinjection technology that can deliver precise amounts of materials into cells. The developed microinjection system achieved precise single-cell microinjection by pre-patterning cells in an array and controlling the amount of substance delivered based on injection pressure and time. The precision of the proposed injection technique was examined by comparing the fluorescence intensities of fluorescent dye droplets with a standard concentration and water droplets with a known injection amount of the dye in oil. Injection of synthetic modified mRNA (modRNA) encoding green fluorescence proteins or a cocktail of plasmids encoding green and red fluorescence proteins into human foreskin fibroblast cells demonstrated that the resulting green fluorescence intensity or green/red fluorescence intensity ratio were well correlated with the amount of genetic material injected into the cells. Single-cell transfection via the developed microinjection technique will be of particular use in cases where cell transfection is challenging and genetic modification of selected cells is desired.
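The pressure-and-time dosing described above is often modeled, to first order, as Hagen-Poiseuille flow through the pipette tip; this relation is an orientation aid of mine, not the authors' stated calibration (which was fluorometric):

$$ V \;\approx\; Q\,t \;=\; \frac{\pi r^{4}\,\Delta P}{8\,\mu\,L}\,t $$

where r and L are the tip radius and channel length, ΔP the applied injection pressure, μ the fluid viscosity, and t the injection time; the strong r⁴ dependence is why tip geometry dominates dose repeatability.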
Note: Automated electrochemical etching and polishing of silver scanning tunneling microscope tips.
Sasaki, Stephen S; Perdue, Shawn M; Rodriguez Perez, Alejandro; Tallarida, Nicholas; Majors, Julia H; Apkarian, V Ara; Lee, Joonhee
2013-09-01
Fabrication of sharp and smooth Ag tips is crucial in optical scanning probe microscope experiments. To ensure reproducible tip profiles, the polishing process is fully automated using a closed-loop laminar flow system to deliver the electrolytic solution to moving electrodes mounted on a motorized translational stage. The repetitive translational motion is controlled precisely on the μm scale with a stepper motor and screw-thread mechanism. The automated setup allows reproducible control over the tip profile and improves smoothness and sharpness of tips (radius 27 ± 18 nm), as measured by ultrafast field emission.
Shuttle Repair Tools Automate Vehicle Maintenance
NASA Technical Reports Server (NTRS)
2013-01-01
Successfully building, flying, and maintaining the space shuttles was an immensely complex job that required a high level of detailed, precise engineering. After each shuttle landed, it entered a maintenance, repair, and overhaul (MRO) phase. Each system was thoroughly checked and tested, and worn or damaged parts replaced, before the shuttle was rolled out for its next mission. During the MRO period, workers needed to record exactly what needed replacing and why, as well as follow precise guidelines and procedures in making their repairs. That meant traceability, and with it lots of paperwork. In 2007, the number of reports generated during electrical system repairs was getting out of hand, placing among the top three systems in terms of paperwork volume. Repair specialists at Kennedy Space Center were unhappy spending so much time at a desk and so little time actually working on the shuttle. "Engineers weren't spending their time doing technical work," says Joseph Schuh, an electrical engineer at Kennedy. "Instead, they were busy with repetitive, time-consuming processes that, while important in their own right, provided a low return on time invested." The strain of such inefficiency was bad enough that slow electrical repairs jeopardized rollout on several occasions. Knowing there had to be a way to streamline operations, Kennedy asked Martin Belson, a project manager with 30 years' experience as an aerospace contractor, to co-lead a team in developing software that would reduce the effort required to document shuttle repairs. The result was System Maintenance Automated Repair Tasks (SMART) software. SMART is a tool for aggregating and applying information on every aspect of repairs, from procedures and instructions to a vehicle's troubleshooting history. Drawing on that data, SMART largely automates the processes of generating repair instructions and post-repair paperwork. In the case of the space shuttle, this meant that SMART had 30 years' worth of operations that it could apply to ongoing maintenance work. According to Schuh, "SMART standardized and streamlined many shuttle repair processes, saving time and money while increasing safety and the quality of repairs." Maintenance technicians and engineers now had a tool that kept them in the field, and because SMART is capable of continually evolving, each time an engineer put it to use, it would enrich the Agency-wide knowledge base. "If an engineer sees something in the work environment that they could improve, a repair process or a procedure, SMART can incorporate that data for use in future operations," says Belson.
The Effect of Training Data Set Composition on the Performance of a Neural Image Caption Generator
2017-09-01
Performance on the trained objects was compared using the Metric for Evaluation of Translation with Explicit Ordering (METEOR) and Consensus-Based Image Description Evaluation (CIDEr) automated scoring systems. Many such systems exist, including Bilingual Evaluation Understudy (BLEU), METEOR, and CIDEr; consensus-based scoring has been shown to correlate highly with human judgments of precision. CIDEr uses a system of consensus among the reference captions...
Robandt, P V; Klette, K L; Sibum, M
2009-10-01
An automated solid-phase extraction coupled with liquid chromatography and tandem mass spectrometry (SPE-LC-MS-MS) method for the analysis of 11-nor-Delta(9)-tetrahydrocannabinol-9-carboxylic acid (THC-COOH) in human urine specimens was developed. The method was linear (R(2) = 0.9986) to 1000 ng/mL with no carryover evidenced at 2000 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision was evaluated at the 15 ng/mL level over nine batches spanning 15 days (n = 45). The coefficient of variation (%CV) was found to be 5.5% over the course of the validation. Intrarun precision of a 15 ng/mL control (n = 5) ranged from 0.58% CV to 7.4% CV for the same set of analytical batches. Interference was tested using (+/-)-11-hydroxy-Delta(9)-tetrahydrocannabinol, cannabidiol, (-)-Delta(8)-tetrahydrocannabinol, and cannabinol. One hundred and nineteen specimens previously found to contain THC-COOH by a validated gas chromatography-mass spectrometry (GC-MS) procedure were compared to the SPE-LC-MS-MS method. Excellent agreement was found (R(2) = 0.9925) for the parallel comparison study. The automated SPE procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. Additionally, method runtime is greatly reduced (e.g., during parallel studies the SPE-LC-MS-MS instrument was often finished with analysis by the time the technician finished the offline SPE and derivatization procedure prior to the GC-MS analysis).
Automated extraction and validation of children's gait parameters with the Kinect.
Motiian, Saeid; Pergami, Paola; Guffey, Keegan; Mancinelli, Corrie A; Doretto, Gianfranco
2015-12-02
Gait analysis for therapy regimen prescription and monitoring requires patients to physically access clinics with specialized equipment. The timely availability of such infrastructure at the right frequency is especially important for small children. Besides being very costly, this is a challenge for many children living in rural areas. Therefore, this work develops a low-cost, portable, and automated approach for in-home gait analysis, based on the Microsoft Kinect. A robust and efficient method for extracting gait parameters is introduced, which copes with the high variability of noisy Kinect skeleton tracking data experienced across the population of young children. This is achieved by temporally segmenting the data with an approach based on coupling a probabilistic matching of stride template models, learned offline, with the estimation of their global and local temporal scaling. A preliminary study conducted on healthy children between 2 and 4 years of age is performed to analyze the accuracy, precision, repeatability, and concurrent validity of the proposed method against the GAITRite when measuring several spatial and temporal children's gait parameters. The method has excellent accuracy and good precision in segmenting temporal sequences of body joint locations into stride and step cycles. Also, the spatial and temporal gait parameters, estimated automatically, exhibit good concurrent validity with those provided by the GAITRite, as well as very good repeatability. In particular, on a range of nine gait parameters, the relative and absolute agreements were found to be good and excellent, and the overall agreements were found to be good and moderate. This work enables and validates the automated use of the Kinect for children's gait analysis in healthy subjects. In particular, the approach takes a step forward towards developing a low-cost, portable, parent-operated in-home tool for clinicians assisting young children.
Mann, David L; Abernethy, Bruce; Farrow, Damian; Davis, Mark; Spratford, Wayne
2010-05-01
This article describes a new automated method for the controlled occlusion of vision during natural tasks. The method permits the time course of the presence or absence of visual information to be linked to identifiable events within the task of interest. An example application is presented in which the method is used to examine the ability of cricket batsmen to pick up useful information from the prerelease movement patterns of the opposing bowler. Two key events, separated by a consistent within-action time lag, were identified in the cricket bowling action sequence-namely, the penultimate foot strike prior to ball release (Event 1), and the subsequent moment of ball release (Event 2). Force-plate registration of Event 1 was then used as a trigger to facilitate automated occlusion of vision using liquid crystal occlusion goggles at time points relative to Event 2. Validation demonstrated that, compared with existing approaches that are based on manual triggering, this method of occlusion permitted considerable gains in temporal precision and a reduction in the number of unusable trials. A more efficient and accurate protocol to examine anticipation is produced, while preserving the important natural coupling between perception and action.
An automated 3D reconstruction method of UAV images
NASA Astrophysics Data System (ADS)
Liu, Jun; Wang, He; Liu, Xiaoyang; Li, Feng; Sun, Guangtong; Song, Ping
2015-10-01
In this paper a novel fully automated 3D reconstruction approach based on low-altitude unmanned aerial vehicle (UAV) images is presented, which requires neither prior camera calibration nor any other external prior knowledge. Dense 3D point clouds are generated by integrating orderly feature extraction, image matching, structure from motion (SfM) and multi-view stereo (MVS) algorithms, overcoming many of the cost and time limitations of rigorous photogrammetry techniques. An image topology analysis strategy is introduced to speed up large scene reconstruction by taking advantage of the flight-control data acquired by the UAV. The image topology map can significantly reduce the running time of feature matching by limiting the combinations of images to be matched. A high-resolution digital surface model of the study area is produced based on the UAV point clouds by constructing a triangular irregular network. Experimental results show that the proposed approach is robust and feasible for automatic 3D reconstruction of low-altitude UAV images, and has great potential for the acquisition of spatial information for large-scale mapping, especially rapid response and precise modelling in disaster emergencies.
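The topology analysis described above prunes the otherwise quadratic cost of exhaustive pairwise feature matching. A minimal sketch of the underlying idea, pairing only images whose camera positions (e.g., from the UAV's GPS flight log) fall within a neighborhood radius; the coordinate frame and threshold are illustrative assumptions, not the paper's parameters:

```python
import math
from itertools import combinations

def candidate_pairs(positions, radius_m=50.0):
    """Return index pairs of images whose camera centers lie within
    radius_m of each other, so feature matching is attempted only on
    plausibly overlapping images (positions: list of (x, y) in meters)."""
    pairs = []
    for (i, p), (j, q) in combinations(enumerate(positions), 2):
        if math.dist(p, q) <= radius_m:
            pairs.append((i, j))
    return pairs

# Hypothetical camera centers from the flight log (local metric frame)
cams = [(0, 0), (30, 0), (60, 0), (500, 500)]
print(candidate_pairs(cams))  # [(0, 1), (1, 2)] -- the distant image is excluded
```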
NASA Astrophysics Data System (ADS)
Wollman, Adam J. M.; Miller, Helen; Foster, Simon; Leake, Mark C.
2016-10-01
Staphylococcus aureus is an important pathogen, giving rise to antimicrobial resistance in strains such as Methicillin-Resistant S. aureus (MRSA). Here we report an image analysis framework for automated detection and image segmentation of cells in S. aureus cell clusters, and explicit identification of their cell division planes. We use a new combination of several existing image analysis tools to detect cellular and subcellular morphological features relevant to cell division from millisecond-time-scale sampled images of live pathogens, at single-molecule detection precision. We demonstrate this approach using a fluorescent reporter GFP fused to the protein EzrA that localises to a mid-cell plane during division and is involved in regulation of cell size and division. This image analysis framework presents a valuable platform from which to study candidate new antimicrobials which target the cell division machinery, but may also have more general application in detecting morphologically complex structures of fluorescently labelled proteins present in clusters of other types of cells.
sFIDA automation yields sub-femtomolar limit of detection for Aβ aggregates in body fluids.
Herrmann, Yvonne; Kulawik, Andreas; Kühbach, Katja; Hülsemann, Maren; Peters, Luriano; Bujnicki, Tuyen; Kravchenko, Kateryna; Linnartz, Christina; Willbold, Johannes; Zafiu, Christian; Bannach, Oliver; Willbold, Dieter
2017-03-01
Alzheimer's disease (AD) is a neurodegenerative disorder for which no therapeutics yet exist and diagnostic options remain limited. Reliable biomarker-based AD diagnostics are of utmost importance for the development and application of therapeutic substances. We have previously introduced a platform technology designated 'sFIDA' for the quantitation of amyloid β peptide (Aβ) aggregates as AD biomarker. In this study we implemented the sFIDA assay on an automated platform to enhance robustness and performance of the assay. In sFIDA (surface-based fluorescence intensity distribution analysis) Aβ species are immobilized by a capture antibody to a glass surface. Aβ aggregates are then multiply loaded with fluorescent antibodies and quantitated by high resolution fluorescence microscopy. As a model system for Aβ aggregates, we used Aβ-conjugated silica nanoparticles (Aβ-SiNaPs) diluted in PBS buffer and cerebrospinal fluid, respectively. Automation of the assay was realized on a liquid handling system in combination with a microplate washer. The automation of the sFIDA assay results in improved intra-assay precision, linearity and sensitivity in comparison to the manual application, and achieved a limit of detection in the sub-femtomolar range. Automation improves the precision and sensitivity of the sFIDA assay, which is a prerequisite for high-throughput measurements and future application of the technology in routine AD diagnostics. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
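The sub-femtomolar limit of detection cited above is a standard assay figure of merit. One common way to estimate an LoD from blank replicates is the mean-plus-three-SD criterion mapped through a calibration line; the numbers below are invented for illustration, and the sFIDA readout itself may use a different cutoff convention:

```python
import statistics

# Hypothetical blank (buffer-only) readouts and an assumed linear calibration
blanks = [120.0, 131.0, 118.0, 125.0, 122.0]
slope, intercept = 8.0e16, 115.0   # readout units per mol/L; assumed fit values

cutoff = statistics.mean(blanks) + 3 * statistics.stdev(blanks)
lod_molar = (cutoff - intercept) / slope
print(f"readout cutoff: {cutoff:.1f}")
print(f"LoD: {lod_molar * 1e15:.2f} fM")  # ~0.29 fM, i.e. sub-femtomolar
```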
Simulation evaluation of TIMER, a time-based, terminal air traffic, flow-management concept
NASA Technical Reports Server (NTRS)
Credeur, Leonard; Capron, William R.
1989-01-01
A description of a time-based, extended terminal area ATC concept called Traffic Intelligence for the Management of Efficient Runway scheduling (TIMER) and the results of a fast-time evaluation are presented. The TIMER concept is intended to bridge the gap between today's ATC system and a future automated time-based ATC system. The TIMER concept integrates en route metering, fuel-efficient cruise and profile descents, terminal time-based sequencing and spacing together with computer-generated controller aids, to improve delivery precision for fuller use of runway capacity. Simulation results identify and show the effects and interactions of such key variables as horizon of control location, delivery time error at both the metering fix and runway threshold, aircraft separation requirements, delay discounting, wind, aircraft heading and speed errors, and knowledge of final approach speed.
NASA Astrophysics Data System (ADS)
Griffith, D. W.; Bryant, G. R.; Deutscher, N. M.; Wilson, S. R.; Kettlewell, G.; Riggenbach, M.
2007-12-01
We describe a portable Fourier Transform InfraRed (FTIR) analyser capable of simultaneous high precision analysis of CO2, CH4, N2O and CO in air, as well as δ13C in CO2 and δD in water vapour. The instrument is based on a commercial 1 cm-1 resolution FTIR spectrometer fitted with a mid-IR globar source, 26 m multipass White cell and thermoelectrically-cooled MCT detector operating between 2000 and 7500 cm-1. Air is passed through the cell and analysed in real time without any pre-treatment except for (optional) drying. An inlet selection manifold allows automated sequential analysis of samples from one or more inlet lines, with typical measurement times of 1-10 minutes per sample. The spectrometer, inlet sampling sequence, real-time quantitative spectrum analysis, data logging and display are all under the control of a single program running on a laptop PC, and can be left unattended for continuous measurements over periods of weeks to months. Selected spectral regions of typically 100-200 cm-1 width are analysed by a least squares fitting technique to retrieve concentrations of trace gases, 13CO2 and HDO. Typical precision is better than 0.1% without the need for calibration gases. Accuracy is similar if measurements are referenced to calibration standard gases. δ13C precision is typically around 0.1‰, and for δD it is 1‰. Applications of the analyser include clean and polluted air monitoring, tower-based flux measurements such as flux gradient or integrated horizontal flux measurements, automated soil chambers, and field-based measurements of isotopic fractionation in soil-plant-atmosphere systems. The simultaneous multi-component advantages can be exploited in tracer-type emission measurements, for example of CH4 from livestock using a co-released tracer gas and downwind measurement. We have also developed an open path variant especially suited to tracer release studies and measurements of NH3 emissions from agricultural sources. An illustrative selection of applications will be presented.
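The concentration retrieval described above is, at its core, a least-squares fit of a measured spectrum against reference spectra over a selected window. A stripped-down linear version of that idea follows; real FTIR retrievals are nonlinear and account for instrument line shape, and the reference spectra here are synthetic stand-ins:

```python
import numpy as np

# Synthetic reference absorption spectra (absorbance per unit concentration)
# for two gases over a narrow spectral window of 200 points.
wavenumber = np.linspace(2100.0, 2300.0, 200)
ref_gas1 = np.exp(-0.5 * ((wavenumber - 2150.0) / 8.0) ** 2)
ref_gas2 = np.exp(-0.5 * ((wavenumber - 2250.0) / 12.0) ** 2)

true_conc = np.array([0.4, 2.0])               # hypothetical concentrations
A = np.column_stack([ref_gas1, ref_gas2])      # design matrix of references
measured = A @ true_conc + np.random.default_rng(0).normal(0, 0.005, 200)

# Retrieve concentrations by linear least squares over the window
retrieved, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(f"retrieved concentrations: {retrieved.round(3)}")  # close to [0.4, 2.0]
```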
Computerized liver volumetry on MRI by using 3D geodesic active contour segmentation.
Huynh, Hieu Trung; Karademir, Ibrahim; Oto, Aytekin; Suzuki, Kenji
2014-01-01
Our purpose was to develop an accurate automated 3D liver segmentation scheme for measuring liver volumes on MRI. Our scheme for MRI liver volumetry consisted of three main stages. First, the preprocessing stage was applied to T1-weighted MRI of the liver in the portal venous phase to reduce noise and produce the boundary-enhanced image. This boundary-enhanced image was used as a speed function for a 3D fast-marching algorithm to generate an initial surface that roughly approximated the shape of the liver. A 3D geodesic-active-contour segmentation algorithm refined the initial surface to precisely determine the liver boundaries. The liver volumes determined by our scheme were compared with those manually traced by a radiologist, used as the reference standard. The two volumetric methods reached excellent agreement (intraclass correlation coefficient, 0.98) without statistical significance (p = 0.42). The average (± SD) accuracy was 99.4% ± 0.14%, and the average Dice overlap coefficient was 93.6% ± 1.7%. The mean processing time for our automated scheme was 1.03 ± 0.13 minutes, whereas that for manual volumetry was 24.0 ± 4.4 minutes (p < 0.001). The MRI liver volumetry based on our automated scheme agreed excellently with reference-standard volumetry, and it required substantially less completion time.
Sensors Enable Plants to Text Message Farmers
NASA Technical Reports Server (NTRS)
2013-01-01
Long-term human spaceflight means long-term menu planning. Since every pound of cargo comes with a steep price tag, NASA has long researched technologies and techniques to allow astronauts to grow their own food, both on the journey and in some cases at their destination. Sustainable food technologies designed for space have resulted in spinoffs that improve the nutrition, safety, and durability of food on Earth. There are of course tradeoffs involved in making astronauts part-time farmers. Any time spent tending plants is time that can't be spent elsewhere: collecting data, exploring, performing routine maintenance, or sleeping. And as scarce as time is for astronauts, resources are even more limited. It is highly practical, therefore, to ensure that farming in space is as automated and precise as possible.
Scare, J A; Slusarewicz, P; Noel, M L; Wielgus, K M; Nielsen, M K
2017-11-30
Fecal egg counts are emphasized for guiding equine helminth parasite control regimens due to the rise of anthelmintic resistance. This, however, poses further challenges, since egg counting results are prone to issues such as operator dependency, method variability, equipment requirements, and time commitment. The use of image analysis software for performing fecal egg counts is promoted in recent studies to reduce the operator dependency associated with manual counts. In an attempt to remove operator dependency associated with current methods, we developed a diagnostic system that utilizes a smartphone and employs image analysis to generate automated egg counts. The aims of this study were (1) to determine precision of the first smartphone prototype, the modified McMaster and ImageJ; (2) to determine precision, accuracy, sensitivity, and specificity of the second smartphone prototype, the modified McMaster, and Mini-FLOTAC techniques. Repeated counts on fecal samples naturally infected with equine strongyle eggs were performed using each technique to evaluate precision. Triplicate counts on 36 egg-count-negative samples and 36 samples spiked with strongyle eggs at 5, 50, 500, and 1000 eggs per gram were performed using a second smartphone system prototype, Mini-FLOTAC, and McMaster to determine technique accuracy. Precision across the techniques was evaluated using the coefficient of variation. Regarding the first aim of the study, the McMaster technique performed with significantly less variance than the first smartphone prototype and ImageJ (p<0.0001). The smartphone and ImageJ performed with equal variance. Regarding the second aim of the study, the second smartphone system prototype had significantly better precision than the McMaster (p<0.0001) and Mini-FLOTAC (p<0.0001) methods, and the Mini-FLOTAC was significantly more precise than the McMaster (p=0.0228). Mean accuracies for the Mini-FLOTAC, McMaster, and smartphone system were 64.51%, 21.67%, and 32.53%, respectively. The Mini-FLOTAC was significantly more accurate than the McMaster (p<0.0001) and the smartphone system (p<0.0001), while the smartphone and McMaster counts did not have statistically different accuracies. Overall, the smartphone system compared favorably to manual methods with regard to precision, and reasonably with regard to accuracy. With further refinement, this system could become useful in veterinary practice. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Weinheimer, Oliver; Wielpütz, Mark O.; Konietzke, Philip; Heussel, Claus P.; Kauczor, Hans-Ulrich; Brochhausen, Christoph; Hollemann, David; Savage, Dasha; Galbán, Craig J.; Robinson, Terry E.
2017-02-01
Cystic Fibrosis (CF) results in severe bronchiectasis in nearly all cases. Bronchiectasis is a disease in which parts of the airways are permanently dilated. The development and progression of bronchiectasis are not evenly distributed over the entire lungs; rather, individual functional units are affected differently. We developed a fully automated method for the precise calculation of lobe-based airway taper indices. To calculate taper indices, some preparatory algorithms are needed. The airway tree is segmented, skeletonized and transformed to a rooted acyclic graph. This graph is used to label the airways. Then a modified version of the previously validated integral-based method (IBM) for airway geometry determination is utilized. The rooted graph and the airway lumen and wall information are then used to calculate the airway taper indices. Using a computer-generated phantom simulating 10 cross sections of airways, we present results showing a high accuracy of the modified IBM. The new taper index calculation method was applied to 144 volumetric inspiratory low-dose MDCT scans. The scans were acquired from 36 children with mild CF at 4 time-points (baseline, 3 months, 1 year, 2 years). We found a moderate correlation with the visual lobar Brody bronchiectasis scores by three raters (r2 = 0.36, p < .0001). The taper index has the potential to be a precise imaging biomarker, but further improvements are needed. In combination with other imaging biomarkers, taper index calculation can be an important tool for monitoring the progression and the individual treatment of patients with bronchiectasis.
An automated procedure to identify biomedical articles that contain cancer-associated gene variants.
McDonald, Ryan; Scott Winters, R; Ankuda, Claire K; Murphy, Joan A; Rogers, Amy E; Pereira, Fernando; Greenblatt, Marc S; White, Peter S
2006-09-01
The proliferation of biomedical literature makes it increasingly difficult for researchers to find and manage relevant information. However, identifying research articles containing mutation data, a requisite first step in integrating large and complex mutation data sets, is currently tedious, time-consuming and imprecise. More effective mechanisms for identifying articles containing mutation information would be beneficial both for the curation of mutation databases and for individual researchers. We developed an automated method that uses information extraction, classifier, and relevance ranking techniques to determine the likelihood of MEDLINE abstracts containing information regarding genomic variation data suitable for inclusion in mutation databases. We targeted the CDKN2A (p16) gene and the procedure for document identification currently used by CDKN2A Database curators as a measure of feasibility. A set of abstracts was manually identified from a MEDLINE search as potentially containing specific CDKN2A mutation events. A subset of these abstracts was used as a training set for a maximum entropy classifier to identify text features distinguishing "relevant" from "not relevant" abstracts. Each document was represented as a set of indicative word, word pair, and entity tagger-derived genomic variation features. When applied to a test set of 200 candidate abstracts, the classifier predicted 88 articles as being relevant; of these, 29 of 32 manuscripts in which manual curation found CDKN2A sequence variants were positively predicted. Thus, the set of potentially useful articles that a manual curator would have to review was reduced by 56%, maintaining 91% recall (sensitivity) and more than doubling precision (positive predictive value). Subsequent expansion of the training set to 494 articles yielded similar precision and recall rates, and comparison of the original and expanded trials demonstrated that the average precision improved with the larger data set. Our results show that automated systems can effectively identify article subsets relevant to a given task and may prove to be powerful tools for the broader research community. This procedure can be readily adapted to any or all genes, organisms, or sets of documents. Published 2006 Wiley-Liss, Inc.
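A maximum entropy classifier over word and word-pair features, as used above, is mathematically equivalent to logistic regression on bag-of-words vectors. A toy sketch with scikit-learn; the training texts and labels are fabricated placeholders, not the CDKN2A corpus:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score

# Fabricated stand-ins for abstracts labeled relevant (1) / not relevant (0)
texts = [
    "CDKN2A germline mutation c.301G>T identified in melanoma kindred",
    "p16 promoter variants segregate with familial melanoma",
    "review of cell cycle regulation and kinase inhibitors",
    "expression profiling of unrelated genes in yeast",
]
labels = [1, 1, 0, 0]

vec = CountVectorizer(ngram_range=(1, 2))       # word and word-pair features
X = vec.fit_transform(texts)
clf = LogisticRegression(max_iter=1000).fit(X, labels)  # maxent-style model

pred = clf.predict(X)
print("precision:", precision_score(labels, pred))
print("recall:   ", recall_score(labels, pred))
```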
Automated microdensitometer for digitizing astronomical plates
NASA Technical Reports Server (NTRS)
Angilello, J.; Chiang, W. H.; Elmegreen, D. M.; Segmueller, A.
1984-01-01
A precision microdensitometer was built under the control of an IBM S/1 time-sharing computer system. The instrument's spatial resolution is better than 20 microns. A raster scan of an area of 10x10 sq mm (500x500 raster points) takes 255 minutes. The reproducibility is excellent and the stability is good over a period of 30 hours, which is significantly longer than the time required for most scans. The intrinsic accuracy of the instrument was tested using Kodak standard filters and found to be better than 3%. Comparative accuracy was tested by measuring astronomical plates of galaxies for which absolute photoelectric photometry data were available. The results showed an accuracy that is excellent for astronomical applications.
Biocoder: A programming language for standardizing and automating biology protocols
2010-01-01
Background Published descriptions of biology protocols are often ambiguous and incomplete, making them difficult to replicate in other laboratories. However, there is increasing benefit to formalizing the descriptions of protocols, as laboratory automation systems (such as microfluidic chips) are becoming increasingly capable of executing them. Our goal in this paper is to improve both the reproducibility and automation of biology experiments by using a programming language to express the precise series of steps taken. Results We have developed BioCoder, a C++ library that enables biologists to express the exact steps needed to execute a protocol. In addition to being suitable for automation, BioCoder converts the code into a readable, English-language description for use by biologists. We have implemented over 65 protocols in BioCoder; the most complex of these was successfully executed by a biologist in the laboratory using BioCoder as the only reference. We argue that BioCoder exposes and resolves ambiguities in existing protocols, and could provide the software foundations for future automation platforms. BioCoder is freely available for download at http://research.microsoft.com/en-us/um/india/projects/biocoder/. Conclusions BioCoder represents the first practical programming system for standardizing and automating biology protocols. Our vision is to change the way that experimental methods are communicated: rather than publishing a written account of the protocols used, researchers will simply publish the code. Our experience suggests that this practice is tractable and offers many benefits. We invite other researchers to leverage BioCoder to improve the precision and completeness of their protocols, and also to adapt and extend BioCoder to new domains. PMID:21059251
Kudella, Patrick Wolfgang; Moll, Kirsten; Wahlgren, Mats; Wixforth, Achim; Westerhausen, Christoph
2016-04-18
Rosetting is associated with severe malaria and a primary cause of death in Plasmodium falciparum infections. Detailed understanding of this adhesive phenomenon may enable the development of new therapies interfering with rosette formation. For this, it is crucial to determine parameters such as rosetting and parasitaemia of laboratory strains or patient isolates, a bottleneck in malaria research due to the time-consuming and error-prone manual analysis of specimens. Here, the automated, free, stand-alone analysis software automated rosetting analyzer for micrographs (ARAM) to determine rosetting rate, rosette size distribution as well as parasitaemia with a convenient graphical user interface is presented. Automated rosetting analyzer for micrographs is an executable with two operation modes for automated identification of objects on images. The default mode detects red blood cells and fluorescently labelled parasitized red blood cells by combining an intensity-gradient with a threshold filter. The second mode determines object location and size distribution from a single contrast method. The obtained results are compared with standardized manual analysis. Automated rosetting analyzer for micrographs calculates statistical confidence probabilities for rosetting rate and parasitaemia. Automated rosetting analyzer for micrographs analyses 25 cell objects per second, reliably delivering results identical to manual analysis. For the first time rosette size distribution is determined in a precise and quantitative manner employing ARAM in combination with established inhibition tests. Additionally ARAM measures the essential observables parasitaemia, rosetting rate and size as well as location of all detected objects, and provides confidence intervals for the determined observables. No other existing software solution offers this range of function. The second, non-malaria-specific analysis mode of ARAM offers the functionality to detect arbitrary objects. Automated rosetting analyzer for micrographs has the capability to push malaria research to a more quantitative and statistically significant level with increased reliability due to operator independence. As an installation file for Windows 7, 8.1 and 10 is available for free, ARAM offers a novel open and easy-to-use platform for the malaria community to elucidate rosetting.
Haug, M; Reischl, B; Prölß, G; Pollmann, C; Buckert, T; Keidel, C; Schürmann, S; Hock, M; Rupitsch, S; Heckel, M; Pöschel, T; Scheibel, T; Haynl, C; Kiriaev, L; Head, S I; Friedrich, O
2018-04-15
We engineered an automated biomechatronics system, MyoRobot, for robust, objective and versatile assessment of muscle or polymer materials (bio-)mechanics. It covers multiple levels of muscle biosensor assessment, e.g. membrane voltage or contractile apparatus Ca2+ ion responses (force resolution 1 µN, 0-10 mN for the given sensor; [Ca2+] range ~100 nM-25 µM). It replaces previously tedious manual protocols to obtain exhaustive information on active/passive biomechanical properties across various morphological tissue levels. Deciphering mechanisms of muscle weakness requires sophisticated force protocols, dissecting contributions from altered Ca2+ homeostasis, electro-chemical, chemico-mechanical biosensors or visco-elastic components. From whole organ to single fibre levels, experimental demands and hardware requirements increase, limiting biomechanics research potential, as reflected by the few commercial biomechatronics systems that can address resolution, experimental versatility and, most of all, automation of force recordings. Our MyoRobot combines optical force transducer technology with high-precision 3D actuation (e.g. voice coil, 1 µm encoder resolution; stepper motors, 4 µm feed motion), and customized control software, enabling modular experimentation packages and automated data pre-analysis. In small bundles and single muscle fibres, we demonstrate automated recordings of (i) caffeine-induced force, (ii) electrical field stimulation (EFS)-induced force, (iii) pCa-force, (iv) slack-tests and (v) passive length-tension curves. The system easily reproduces results from manual systems (twofold larger stiffness in slow over fast muscle) and provides novel insights into unloaded shortening velocities (declining with increasing slack lengths). The MyoRobot enables automated complex biomechanics assessment in muscle research. Applications also extend to material sciences, exemplarily shown here for spider silk and collagen biopolymers. Copyright © 2017 Elsevier B.V. All rights reserved.
Automated Microfluidic Instrument for Label-Free and High-Throughput Cell Separation.
Zhang, Xinjie; Zhu, Zhixian; Xiang, Nan; Long, Feifei; Ni, Zhonghua
2018-03-20
Microfluidic technologies for cell separation have been reported frequently in recent years. However, a compact microfluidic instrument enabling thoroughly automated cell separation is still rarely reported, owing to the difficulty of integrating the macro-scale fluidic control system with the micro-scale microfluidic device. In this work, we propose a novel and automated microfluidic instrument to realize size-based separation of cancer cells in a label-free and high-throughput manner. Briefly, the instrument is equipped with a fully integrated microfluidic device and a set of robust fluid-driving and control units, and the instrument functions of precise fluid infusion and high-throughput cell separation are guaranteed by a flow regulatory chip and two cell separation chips, which are the key components of the microfluidic device. With optimized control programs, the instrument is successfully applied to automatically sort the human breast adenocarcinoma cell line MCF-7 from 5 mL of diluted human blood with a high recovery ratio of ∼85% within a rapid processing time of ∼23 min. We envision that our microfluidic instrument will be potentially useful in many biomedical applications, especially cell separation, enrichment, and concentration for the purpose of cell culture and analysis.
Kolocouri, Filomila; Dotsikas, Yannis; Apostolou, Constantinos; Kousoulos, Constantinos; Soumelas, Georgios-Stefanos; Loukas, Yannis L
2011-01-01
An HPLC/MS/MS method characterized by complete automation and high throughput was developed for the determination of cilazapril and its active metabolite cilazaprilat in human plasma. All sample preparation and analysis steps were performed by using 2.2 mL 96 deep-well plates, while robotic liquid handling workstations were utilized for all liquid transfer steps, including liquid-liquid extraction. The whole procedure was very fast compared to a manual procedure with vials and no automation. The method also had a very short chromatographic run time of 1.5 min. Sample analysis was performed by RP-HPLC/MS/MS with positive electrospray ionization using multiple reaction monitoring. The calibration curve was linear in the range of 0.500-300 and 0.250-150 ng/mL for cilazapril and cilazaprilat, respectively. The proposed method was fully validated and proved to be selective, accurate, precise, reproducible, and suitable for the determination of cilazapril and cilazaprilat in human plasma. Therefore, it was applied to a bioequivalence study after per os administration of 2.5 mg tablet formulations of cilazapril.
Siotto, Mariacristina; Pasqualetti, Patrizio; Marano, Massimo; Squitti, Rosanna
2014-10-01
Ceruloplasmin (Cp) is a serum ferroxidase that plays an essential role in iron metabolism. It is routinely tested by immunoturbidimetric assays that quantify the concentration of the protein in both its active and inactive forms. Cp activity is generally analyzed manually; the process is time-consuming, has limited repeatability, and is not suitable for a clinical setting. To overcome these inconveniences, we automated the o-dianisidine Cp activity assay on a Cobas Mira Plus apparatus. The automated assay was rapid and repeatable, and the data were provided in IU/L. The assay was adapted for human sera and showed good precision [coefficient of variation (CV) 3.7%] and a low limit of detection (LoD 11.58 IU/L). The simultaneous analysis of Cp concentration and activity in the same run allowed us to calculate the Cp-specific activity, which provides a better index of the overall Cp status. To test the usefulness of this automation, we applied the assay to 104 healthy volunteers and 36 patients with Wilson's disease, hepatic encephalopathy, and chronic liver disease. Cp activity and specific activity distinguished between patient groups better than Cp concentration alone, providing support for the clinical investigation of neurological diseases in which liver failure is one of the clinical hallmarks.
Fully automated analysis of multi-resolution four-channel micro-array genotyping data
NASA Astrophysics Data System (ADS)
Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.
2006-03-01
We present a fully-automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining genotypes of multiple genetic markers in individuals. It plays an important role in the ongoing shift from traditional medical treatments toward personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around timeline compatible with clinical decision-making. In this paper we have developed a fully-automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.
Peirone, Laura S.; Pereyra Irujo, Gustavo A.; Bolton, Alejandro; Erreguerena, Ignacio; Aguirrezábal, Luis A. N.
2018-01-01
Conventional field phenotyping for drought tolerance, the most important factor limiting yield at a global scale, is labor-intensive and time-consuming. Automated greenhouse platforms can increase the precision and throughput of plant phenotyping and contribute to a faster release of drought tolerant varieties. The aim of this work was to establish a framework of analysis to identify early traits which could be efficiently measured in a greenhouse automated phenotyping platform, for predicting the drought tolerance of field grown soybean genotypes. A group of genotypes was evaluated, which showed variation in their drought susceptibility index (DSI) for final biomass and leaf area. A large number of traits were measured before and after the onset of a water deficit treatment, which were analyzed under several criteria: the significance of the regression with the DSI, phenotyping cost, earliness, and repeatability. The most efficient trait was found to be transpiration efficiency measured at 13 days after emergence. This trait was further tested in a second experiment with different water deficit intensities, and validated using a different set of genotypes against field data from a trial network in a third experiment. The framework applied in this work for assessing traits under different criteria could be helpful for selecting those most efficient for automated phenotyping. PMID:29774042
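The drought susceptibility index (DSI) used above as the field reference is commonly computed with the Fischer-Maurer formulation, in which a genotype's relative yield loss under stress is scaled by the trial-wide stress intensity. A sketch under that assumption, with invented biomass values:

```python
def dsi(y_stress, y_control, mean_stress, mean_control):
    """Fischer-Maurer drought susceptibility index:
    DSI = (1 - Ys/Yc) / D, with stress intensity D = 1 - mean(Ys)/mean(Yc)."""
    D = 1.0 - mean_stress / mean_control
    return (1.0 - y_stress / y_control) / D

# Hypothetical genotype biomass under water deficit vs. well-watered conditions
genotypes = {"G1": (6.0, 10.0), "G2": (8.5, 10.0), "G3": (4.0, 10.0)}
mean_s = sum(v[0] for v in genotypes.values()) / len(genotypes)
mean_c = sum(v[1] for v in genotypes.values()) / len(genotypes)

for name, (ys, yc) in genotypes.items():
    print(name, round(dsi(ys, yc, mean_s, mean_c), 2))  # >1 = more susceptible
```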
Application of automated multispectral analysis to Delaware's coastal vegetation mapping
NASA Technical Reports Server (NTRS)
Klemas, V.; Daiber, F.; Bartlett, D.; Crichton, O.; Fornes, A.
1973-01-01
A baseline mapping project was undertaken in Delaware's coastal wetlands as a prelude to an evaluation of the relative value of different parcels of marsh and the setting of priorities for use of these marshes. A description of Delaware's wetlands is given and a mapping approach is discussed together with details concerning an automated analysis. The precision and resolution of the analysis was limited primarily by the quality of the imagery used.
Precision and negative predictive value of links between ClinicalTrials.gov and PubMed.
Huser, Vojtech; Cimino, James J
2012-01-01
One of the goals of translational science is to shorten the time from discovery to clinical use. Clinical trial registries were established to increase transparency in completed and ongoing clinical trials, and they support linking trials with resulting publications. We set out to investigate the precision and negative predictive value (NPV) of links between ClinicalTrials.gov (CT.gov) and PubMed. CT.gov has been established to increase transparency in clinical trials, and the link to PubMed is crucial for supporting a number of important functions, including ascertaining publication bias. We drew a random sample of trials downloaded from CT.gov and performed manual review of retrieved publications. We characterize two types of links between trials and publications (NCT-links originating from MEDLINE and PMID-links originating from CT.gov). Link precision differed by type (NCT-link: 100%; PMID-link: 63% to 96%). In trials with no linked publication, we were able to find publications 44% of the time (NPV=56%) by searching PubMed. This low NPV shows that there are potentially numerous publications that should have been formally linked with the trials. Our results indicate that existing trial registry and publisher policies may not be fully enforced. We suggest some automated methods for improving link quality.
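For link validation of this kind, precision and NPV follow directly from the 2x2 confusion counts. A minimal sketch; the counts are placeholders chosen to echo the reported NPV, not the study's actual tallies:

```python
def precision(tp, fp):
    """Fraction of asserted links that manual review confirms."""
    return tp / (tp + fp)

def npv(tn, fn):
    """Fraction of 'no linked publication' trials that truly have none."""
    return tn / (tn + fn)

# Placeholder counts from manual review of a random sample of trials
tp, fp = 90, 10   # asserted links confirmed / refuted by review
tn, fn = 56, 44   # "no publication" confirmed / publication found by searching
print(f"precision: {precision(tp, fp):.2f}")  # 0.90
print(f"NPV:       {npv(tn, fn):.2f}")        # 0.56, i.e. many publications
                                              # were never formally linked
```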
Saving Space and Time: The Tractor That Einstein Built
NASA Technical Reports Server (NTRS)
2006-01-01
In 1984, NASA initiated the Gravity Probe B (GP-B) program to test two unverified predictions of Albert Einstein's theory of general relativity, hypotheses about the ways space, time, light, and gravity relate to each other. To test these predictions, the Space Agency and researchers at Stanford University developed an experiment that would check, with extreme precision, tiny changes in the spin direction of four gyroscopes contained in an Earth satellite orbiting at a 400-mile altitude directly over the Earth's poles. When the program first began, the researchers assessed the use of Global Positioning System (GPS) technology to control the attitude of the GP-B spacecraft accurately. At that time, the best GPS receivers could only provide accuracy to nearly 1 meter, but the GP-B spacecraft required a system 100 times more accurate. To address this concern, researchers at Stanford designed high-performance, attitude-determining hardware that used GPS signals, perfecting a high-precision form of GPS called Carrier-Phase Differential GPS that could provide continuous real-time position, velocity, time, and attitude sensor information for all axes of a vehicle. The researchers came to the realization that controlling the GP-B spacecraft with this new system was essentially no different than controlling an airplane. Their thinking took a new direction: if this technology proved successful, the airlines and the Federal Aviation Administration (FAA) were ready commercial markets. They set out to test the new technology, the "Integrity Beacon Landing System," using it to automatically land a commercial Boeing 737 over 100 times successfully through Real-Time Kinematic (RTK) GPS technology. The thinking of the researchers shifted again, from automatically landing aircraft, to automating precision farming and construction equipment.
Garbuglia, Anna Rosa; Bibbò, Angela; Sciamanna, Roberta; Pisciotta, Marina; Capobianchi, Maria Rosaria
2017-07-01
The Aptima HCV Quant Dx assay (Aptima) is a real-time transcription-mediated amplification assay CE-approved for the diagnosis and monitoring of hepatitis C virus (HCV) infection. Aptima's analytical performance was compared to the Abbott RealTime HCV assay (RealTime) in a clinical routine setting. Overall 295 clinical plasma samples (117 prospective/fresh; 178 retrospective/frozen) from HCV-infected patients were tested in Aptima and RealTime to determine concordance on qualitative and quantitative results. Linearity and precision at low viral loads (VLs; 0.8-3.3LogIU/mL) was tested using dilutions of the 5th WHO standard, in 10 and 20 replicates in the two assays, respectively. The ability to measure different HCV genotypes and accuracy were analyzed using the Seracare EQA panel. Inter-assay agreement for qualitative results (prospective samples) was 88% (kappa=0.78). For the 127 samples with quantitative results in both assays, Aptima yielded on average slightly higher values (by 0.24LogIU/mL; Bland-Altman method) than RealTime. Concordance between assay results was excellent (R=0.98). At low VLs (0.8-3.3LogIU/mL), Aptima demonstrated good linearity and precision, similar to RealTime. Aptima detected and accurately quantified all main HCV genotypes. Aptima demonstrated excellent precision, linearity, and accuracy in all genotypes tested. Good concordance was observed between Aptima and RealTime assays in clinical samples. The performance of the Aptima assay, on the fully automated Panther platform, makes it an excellent candidate for the detection and monitoring of HCV RNA in plasma and serum samples. Copyright © 2017 Elsevier B.V. All rights reserved.
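The mean quantitative offset reported above (0.24 Log IU/mL by the Bland-Altman method) is simply the mean of paired differences; the limits of agreement add ±1.96 SD around that bias. A sketch with invented paired viral loads, not the study's specimens:

```python
import numpy as np

# Hypothetical paired viral loads (Log IU/mL) for the same specimens
aptima   = np.array([3.10, 4.25, 5.02, 2.85, 6.10])
realtime = np.array([2.90, 4.00, 4.80, 2.60, 5.85])

diff = aptima - realtime
bias = diff.mean()                   # mean offset between the two assays
half_width = 1.96 * diff.std(ddof=1) # half-width of the limits of agreement
print(f"bias: {bias:+.2f} Log IU/mL")
print(f"95% limits of agreement: {bias - half_width:+.2f} "
      f"to {bias + half_width:+.2f}")
```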
Automated peak picking and peak integration in macromolecular NMR spectra using AUTOPSY.
Koradi, R; Billeter, M; Engeli, M; Güntert, P; Wüthrich, K
1998-12-01
A new approach for automated peak picking of multidimensional protein NMR spectra with strong overlap is introduced, which makes use of the program AUTOPSY (automated peak picking for NMR spectroscopy). The main elements of this program are a novel function for local noise level calculation, the use of symmetry considerations, and the use of lineshapes extracted from well-separated peaks for resolving groups of strongly overlapping peaks. The algorithm generates peak lists with precise chemical shift and integral intensities, and a reliability measure for the recognition of each peak. The results of automated peak picking of NOESY spectra with AUTOPSY were tested in combination with the combined automated NOESY cross peak assignment and structure calculation routine NOAH implemented in the program DYANA. The quality of the resulting structures was found to be comparable with those from corresponding data obtained with manual peak picking. Copyright 1998 Academic Press.
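Two of AUTOPSY's ingredients, a local noise estimate and a peak threshold derived from it, can be illustrated in one dimension with scipy; AUTOPSY itself works on multidimensional spectra with symmetry and lineshape logic that this sketch omits:

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 2000)
spectrum = (np.exp(-((x - 3) ** 2) / 0.005) +
            0.6 * np.exp(-((x - 7) ** 2) / 0.005) +
            rng.normal(0, 0.03, x.size))

# Local noise level: robust SD (1.4826 * median absolute deviation)
# computed in a sliding window around each point
win = 200
noise = np.array([
    1.4826 * np.median(np.abs(seg - np.median(seg)))
    for seg in (spectrum[max(0, i - win): i + win] for i in range(x.size))
])

# Accept only peaks rising well above their local noise floor
peaks, _ = find_peaks(spectrum, height=5 * noise, distance=50)
print(x[peaks])  # approximately [3.0, 7.0]
```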
Patterson, John P; Markgraf, Carrie G; Cirino, Maria; Bass, Alan S
2005-01-01
A series of experiments were undertaken to evaluate the accuracy, precision, specificity, and sensitivity of an automated, infrared photo beam-based open field motor activity system, the MotorMonitor v. 4.01, Hamilton-Kinder, LLC, for use in a good laboratory practices (GLP) Safety Pharmacology laboratory. This evaluation consisted of two phases: (1) system validation, employing known inputs using the EM-100 Controller Photo Beam Validation System, a robotically controlled vehicle representing a rodent and (2) biologic validation, employing groups of rats treated with the standard pharmacologic agents diazepam or D-amphetamine. The MotorMonitor's parameters that described the open-field activity of a subject were: basic movements, total distance, fine movements, x/y horizontal ambulations, rearing, and total rest time. These measurements were evaluated over a number of zones within each enclosure. System validation with the EM-100 Controller Photo Beam Validation System showed that all the parameters accurately and precisely measured what they were intended to measure, with the exception of fine movements and x/y ambulations. Biologic validation using the central nervous system depressant diazepam at 1, 2, or 5 mg/kg, i.p. produced the expected dose-dependent reduction in rat motor activity. In contrast, the central nervous system stimulant D-amphetamine produced the expected increases in rat motor activity at 0.1 and 1 mg/kg, i.p, demonstrating the specificity and sensitivity of the system. Taken together, these studies of the accuracy, precision, specificity, and sensitivity show the importance of both system and biologic validation in the evaluation of an automated open field motor activity system for use in a GLP compliant laboratory.
An Evaluation of a Flight Deck Interval Management Algorithm Including Delayed Target Trajectories
NASA Technical Reports Server (NTRS)
Swieringa, Kurt A.; Underwood, Matthew C.; Barmore, Bryan; Leonard, Robert D.
2014-01-01
NASA's first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature air traffic management technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the terminal airspace; Controller Managed Spacing (CMS), which provides controllers with decision support tools enabling precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain precise in-trail spacing. During high demand operations, TMA-TM may produce a schedule and corresponding aircraft trajectories that include delay to ensure that a particular aircraft will be properly spaced from other aircraft at each schedule waypoint. These delayed trajectories are not communicated to the automation onboard the aircraft, forcing the IM aircraft to use the published speeds to estimate the target aircraft's estimated time of arrival. As a result, the aircraft performing IM operations may follow an aircraft whose TMA-TM generated trajectories have substantial speed deviations from the speeds expected by the spacing algorithm. Previous spacing algorithms were not designed to handle this magnitude of uncertainty. A simulation was conducted to examine a modified spacing algorithm with the ability to follow aircraft flying delayed trajectories. The simulation investigated the use of the new spacing algorithm with various delayed speed profiles and wind conditions, as well as several other variables designed to simulate real-life variability. The results and conclusions of this study indicate that the new spacing algorithm generally exhibits good performance; however, some types of target aircraft speed profiles can cause the spacing algorithm to command less than optimal speed control behavior.
Hosseini, Mohammad-Parsa; Nazem-Zadeh, Mohammad R.; Pompili, Dario; Soltanian-Zadeh, Hamid
2015-01-01
Hippocampus segmentation is a key step in the evaluation of mesial Temporal Lobe Epilepsy (mTLE) by MR images. Several automated segmentation methods have been introduced for medical image segmentation. Because of multiple edges, missing boundaries, and shape changes along its longitudinal axis, manual outlining still remains the benchmark for hippocampus segmentation, which, however, is impractical for large datasets due to time constraints. In this study, four automatic methods, namely FreeSurfer, Hammer, Automatic Brain Structure Segmentation (ABSS), and LocalInfo segmentation, are evaluated to find the most accurate and applicable method that best resembles the manual benchmark. Results from these four methods are compared against those obtained using manual segmentation for T1-weighted images of 157 symptomatic mTLE patients. For performance evaluation of automatic segmentation, the Dice coefficient, Hausdorff distance, precision, and root mean square (RMS) distance are extracted and compared. Among these four automated methods, ABSS generates the most accurate results, and its reproducibility is closest to expert manual outlining by statistical validation. Considering p-value<0.05, the performance measurements for ABSS reveal that Dice is 4%, 13%, and 17% higher, Hausdorff is 23%, 87%, and 70% lower, precision is 5%, -5%, and 12% higher, and RMS is 19%, 62%, and 65% lower compared to LocalInfo, FreeSurfer, and Hammer, respectively. PMID:25571043
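The evaluation metrics named above are standard for comparing segmentations. A compact sketch of Dice and symmetric Hausdorff on binary masks, using toy arrays rather than the study's MR data:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    """Dice coefficient between two boolean masks."""
    return 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff(a, b):
    """Symmetric Hausdorff distance between mask point sets (in voxels)."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

auto = np.zeros((32, 32), bool); auto[8:20, 8:20] = True       # automated mask
manual = np.zeros((32, 32), bool); manual[10:22, 9:21] = True  # manual outline
print(f"Dice: {dice(auto, manual):.3f}")
print(f"Hausdorff: {hausdorff(auto, manual):.2f} voxels")
```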
Design and realization of sort manipulator of crystal-angle sort machine
NASA Astrophysics Data System (ADS)
Wang, Ming-shun; Chen, Shu-ping; Guan, Shou-ping; Zhang, Yao-wei
2005-12-01
It is a current tendency in automation technology to replace manpower with manipulators in workplaces where dangerous, harmful, heavy or repetitive work is involved. The sort manipulator is installed in a crystal-angle sort machine to take the place of manpower, engaged in unloading and sorting work. It is the outcome of combining mechanics, electric transmission, pneumatic elements and microcontroller control. The step motor makes the sort manipulator operate precisely. The pneumatic elements make the sort manipulator more dexterous. The microcontroller's software bestows some simple artificial intelligence on the sort manipulator, so that it can precisely repeat its unloading and sorting work. The combination of the manipulator's zero position and step-motor counting control eliminates accumulated error in long-time operation. The sort manipulator's design has been proven correct and reliable in engineering practice.
High accuracy wavelength calibration for a scanning visible spectrometer.
Scotti, Filippo; Bell, Ronald E
2010-10-01
Spectroscopic applications for plasma velocity measurements often require wavelength accuracies ≤0.2 Å. An automated calibration, which is stable over time and environmental conditions without the need to recalibrate after each grating movement, was developed for a scanning spectrometer to achieve high wavelength accuracy over the visible spectrum. This method fits all relevant spectrometer parameters using multiple calibration spectra. With a stepping-motor controlled sine drive, an accuracy of ∼0.25 Å has been demonstrated. With the addition of a high resolution (0.075 arc sec) optical encoder on the grating stage, greater precision (∼0.005 Å) is possible, allowing absolute velocity measurements within ∼0.3 km/s. This level of precision requires monitoring of atmospheric temperature and pressure and of grating bulk temperature to correct for changes in the refractive index of air and the groove density, respectively.
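A sine-drive spectrometer maps motor position to wavelength through a sine relation, so calibration reduces to fitting a few model parameters to known reference lines. A simplified sketch of that fit; the model form, parameter values, and synthetic "reference lines" are illustrative assumptions, not the authors' exact parameterization:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(steps, scale, steps_per_rad, phase):
    """Simplified sine-drive dispersion: wavelength = scale * sin(angle)."""
    return scale * np.sin(steps / steps_per_rad + phase)

# Synthetic calibration data: line positions generated from assumed "true"
# parameters plus 0.05 A of measurement noise
true_params = (1.25e4, 5.2e4, 0.05)
steps = np.array([12000.0, 15500.0, 21000.0, 26500.0, 30000.0])
lam = model(steps, *true_params) + \
      np.random.default_rng(2).normal(0, 0.05, steps.size)

# Recover the spectrometer parameters from the observed line positions
params, _ = curve_fit(model, steps, lam, p0=[1.2e4, 5.0e4, 0.0])
residuals = lam - model(steps, *params)
print("fit parameters:", params)
print("residuals (A):", residuals.round(3))  # at the ~0.05 A noise level
```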
NASA Technical Reports Server (NTRS)
Hess, Ronald A.
1990-01-01
A collection of technical papers is presented that covers modeling pilot interaction with automated digital avionics systems and guidance and control algorithms for contour and nap-of-the-earth flight. The titles of the papers presented are as follows: (1) Automation effects in a multiloop manual control system; (2) A qualitative model of human interaction with complex dynamic systems; (3) Generalized predictive control of dynamic systems; (4) An application of generalized predictive control to rotorcraft terrain-following flight; (5) Self-tuning generalized predictive control applied to terrain-following flight; and (6) Precise flight path control using a predictive algorithm.
NASA Astrophysics Data System (ADS)
Burba, George; Avenson, Tom; Burkart, Andreas; Gamon, John; Guan, Kaiyu; Julitta, Tommaso; Pastorello, Gilberto; Sakowska, Karolina
2017-04-01
Multiple hundreds of flux towers are presently operational as standalone projects and as parts of larger networks. However, the vast majority of these towers do not allow straight-forward coupling with satellite data, and even fewer have optical sensors for validation of satellite products and upscaling from field to regional levels. In 2016, new tools to collect, process, and share time-synchronized flux data from multiple towers were developed and deployed globally. Originally designed to automate site and data management, these new tools can also be effective in coupling tower data with satellite data due to the following present capabilities: (1) the fully automated FluxSuite system combines hardware, software and web-services, and does not require an expert to run it; (2) it can be incorporated into a new flux station or added to a present station, using a weatherized, remotely-accessible microcomputer, SmartFlux2; (3) it utilizes EddyPro software to calculate fully-processed fluxes and footprints in near-realtime, alongside radiation, optical, weather and soil data; (4) all site data are merged into a single quality-controlled file timed using the PTP time protocol; (5) data from optical sensors can be integrated into this complete dataset via compatible dataloggers; (6) multiple stations can be linked into a time-synchronized network with automated reports and email alerts visible to PIs in real-time; (7) remote sensing researchers without stations can form "virtual networks" of stations by collaborating with tower PIs from different physical networks. The present system can then be utilized to couple ground data with satellite data via the following proposed concept: (1) the GPS-driven PTP protocol will synchronize instrumentation within the station, different stations with each other, and all of these to satellite data, to precisely align optical and flux data in time; (2) footprint size and coordinates computed and stored with flux data will help correctly align footprints and satellite motion, to precisely align optical and flux data in space; (3) current flux towers can be augmented with ground optical sensors and use standard routines to deliver continuous products (e.g. SIF, PRI, NDVI, etc.) based on automated field spectrometers (e.g., FloX and RoX, etc.) and other optical systems; (4) a schedule can be developed to point the ground optical sensor into the footprint, or to run leaf chamber measurements in the footprint, at the same time as the satellite or UAV is above the footprint; (5) a full snapshot of the satellite pixel can then be constructed, including leaf-level, ground optical sensor, and flux measurements from the same footprint area, closely coupled with the satellite measurements, to help interpret satellite data, validate models, and improve upscaling. Several dozens of new towers already operational globally can be readily adapted for the proposed concept. In addition, over 500 active traditional towers can be updated to synchronize their data with satellite measurements. This presentation will show how the FluxSuite system is used by major networks, and describe the concept of how this approach can be utilized to couple satellite and tower data.
NASA Technical Reports Server (NTRS)
Axdahl, Erik L.
2015-01-01
Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high fidelity numerical analysis tools into an automated framework and applying that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.
CT liver volumetry using geodesic active contour segmentation with a level-set algorithm
NASA Astrophysics Data System (ADS)
Suzuki, Kenji; Epstein, Mark L.; Kohlbrenner, Ryan; Obajuluwa, Ademola; Xu, Jianwu; Hori, Masatoshi; Baron, Richard
2010-03-01
Automatic liver segmentation on CT images is challenging because the liver often abuts other organs of similar density. Our purpose was to develop an accurate automated liver segmentation scheme for measuring liver volumes. We developed an automated volumetry scheme for the liver in CT based on a five-step schema. First, an anisotropic smoothing filter was applied to portal-venous-phase CT images to remove noise while preserving the liver structure, followed by an edge enhancer to enhance the liver boundary. By using the boundary-enhanced image as a speed function, a fast-marching algorithm generated an initial surface that roughly estimated the liver shape. A geodesic-active-contour segmentation algorithm coupled with level-set contour evolution refined the initial surface so as to more precisely fit the liver boundary. The liver volume was calculated based on the refined liver surface. Hepatic CT scans of eighteen prospective liver donors were obtained under a liver transplant protocol with a multi-detector CT system. Automated liver volumes obtained were compared with those manually traced by a radiologist, used as the "gold standard." The mean liver volume obtained with our scheme was 1,520 cc, whereas the mean manual volume was 1,486 cc, with a mean absolute difference of 104 cc (7.0%). Automated CT liver volumetry agreed excellently with gold-standard manual volumetry (intra-class correlation coefficient: 0.95), with no statistically significant difference (p(F<=f)=0.32), and required substantially less completion time. Our automated scheme provides an efficient and accurate way of measuring liver volumes.
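As an illustration of the refinement stage, here is a brief sketch using scikit-image's morphological variant of geodesic active contours; it only approximates the authors' pipeline (the anisotropic smoothing, edge enhancement, and fast-marching initialization are replaced by an inverse-gradient speed map and a box seed), and all parameter values are assumptions.

```python
# Illustrative refinement step on a synthetic 2D "organ" image.
import numpy as np
from skimage.segmentation import (inverse_gaussian_gradient,
                                  morphological_geodesic_active_contour)

def refine_surface(image, seed_mask, iterations=200):
    # Speed/stopping map: small values near strong edges, so the
    # evolving contour slows down at the organ boundary.
    gimage = inverse_gaussian_gradient(image.astype(float))
    refined = morphological_geodesic_active_contour(
        gimage, iterations, init_level_set=seed_mask,
        smoothing=2, balloon=1)          # balloon > 0 inflates the seed
    return refined.astype(bool)

# Toy usage: a bright disk in a noisy image, seeded with a small box
yy, xx = np.mgrid[:128, :128]
img = 100.0 * ((yy - 64) ** 2 + (xx - 64) ** 2 < 40 ** 2)
img += np.random.default_rng(0).normal(0, 5, img.shape)
seed = (np.abs(yy - 64) < 10) & (np.abs(xx - 64) < 10)
mask = refine_surface(img, seed)
print(mask.sum(), "pixels inside the refined surface")
```

In a volumetric setting, the volume follows by multiplying the voxel count inside the refined surface by the voxel volume.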
Fink, Christine; Uhlmann, Lorenz; Klose, Christina; Haenssle, Holger A
2018-05-17
Reliable and accurate assessment of severity in psoriasis is very important in order to meet indication criteria for initiation of systemic treatment or to evaluate treatment efficacy. The most acknowledged tool for measuring the extent of psoriatic skin changes is the Psoriasis Area and Severity Index (PASI). However, the calculation of PASI can be tedious and subjective, and high intraobserver and interobserver variability is an important concern. Therefore, there is a great need for a standardised and objective method that guarantees a reproducible PASI calculation. Within this study we will investigate the precision and reproducibility of automated, computer-guided PASI measurements in comparison with trained physicians to address these limitations. Non-interventional analyses of PASI calculations by either physicians in a prospective versus retrospective setting or an automated computer-guided algorithm will be performed in 120 patients with plaque psoriasis. All retrospective PASI calculations by physicians or by the computer algorithm are based on total-body digital images. The primary objective of this study is comparison of automated computer-guided PASI measurements by means of digital image analysis versus conventional, prospective or retrospective physicians' PASI assessments. Secondary endpoints include (1) the assessment of physicians' interobserver variance in PASI calculations, (2) the assessment of physicians' intraobserver variance in PASI assessments of the same patients' images after a time interval of at least 4 weeks, (3) the assessment of the deviation between physicians' prospective versus retrospective PASI calculations, and (4) the reproducibility of automated computer-guided PASI measurements by assessment of two sets of total-body digital images of the same patients taken at one time point. Ethical approval was provided by the Ethics Committee of the Medical Faculty of the University of Heidelberg (ethics approval number S-379/2016). Trial registration number: DRKS00011818.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perrine, Kenneth A.; Hopkins, Derek F.; Lamarche, Brian L.
2005-09-01
Biologists and computer engineers at Pacific Northwest National Laboratory have specified, designed, and implemented a hardware/software system for performing real-time, multispectral image processing on a confocal microscope. This solution is intended to extend the capabilities of the microscope, enabling scientists to conduct advanced experiments on cell signaling and other kinds of protein interactions. FRET (fluorescence resonance energy transfer) techniques are used to locate and monitor protein activity. In FRET, it is critical that spectral images be precisely aligned with each other despite disturbances in the physical imaging path caused by imperfections in lenses and cameras, and expansion and contraction of materials due to temperature changes. The central importance of this work is therefore automatic image registration. This runs in a framework that guarantees real-time performance (processing pairs of 1024x1024, 8-bit images at 15 frames per second) and enables the addition of other types of advanced image processing algorithms such as image feature characterization. The supporting system architecture consists of a Visual Basic front-end containing a series of on-screen interfaces for controlling various aspects of the microscope and a script engine for automation. One of the controls is an ActiveX component written in C++ for handling the control and transfer of images. This component interfaces with a pair of LVDS image capture boards and a PCI board containing a 6-million gate Xilinx Virtex-II FPGA. Several types of image processing are performed on the FPGA in a pipelined fashion, including the image registration. The FPGA offloads work that would otherwise need to be performed by the main CPU and has a guaranteed real-time throughput. Image registration is performed in the FPGA by applying a cubic warp on one image to precisely align it with the other image. Before each experiment, an automated calibration procedure is run in order to set up the cubic warp. During image acquisitions, the cubic warp is evaluated by way of forward differencing. Unwanted pixelation artifacts are minimized by bilinear sampling. The resulting system is state-of-the-art for biological imaging. Precisely registered images enable the reliable use of FRET techniques. In addition, real-time image processing performance allows computed images to be fed back and displayed to scientists immediately, and the pipelined nature of the FPGA allows additional image processing algorithms to be incorporated into the system without slowing throughput.
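The warp-and-resample core of such a pipeline can be sketched in a few lines of NumPy; this is an illustrative software analogue, not the FPGA implementation, and the polynomial coefficient layout is an assumption.

```python
# Sketch: warp one spectral image with a cubic coordinate mapping and
# bilinear sampling. Real coefficients would come from calibration.
import numpy as np

def bilinear_sample(img, xs, ys):
    """Sample img at fractional coordinates (xs, ys) with bilinear weights."""
    h, w = img.shape
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    fx, fy = xs - x0, ys - y0
    return ((1 - fy) * (1 - fx) * img[y0, x0] + (1 - fy) * fx * img[y0, x0 + 1]
            + fy * (1 - fx) * img[y0 + 1, x0] + fy * fx * img[y0 + 1, x0 + 1])

def cubic_warp(img, cx, cy):
    """Map each output pixel through cubic polynomials in (x, y)."""
    h, w = img.shape
    yy, xx = np.mgrid[:h, :w].astype(float)
    # Polynomial terms up to third order; cx and cy each hold 10 coefficients.
    terms = np.stack([np.ones_like(xx), xx, yy, xx * yy, xx ** 2, yy ** 2,
                      xx ** 2 * yy, xx * yy ** 2, xx ** 3, yy ** 3])
    xs = np.tensordot(cx, terms, axes=1)
    ys = np.tensordot(cy, terms, axes=1)
    return bilinear_sample(img, np.clip(xs, 0, w - 1), np.clip(ys, 0, h - 1))

# Identity warp plus a half-pixel shift in x, applied to random test data
cx = np.zeros(10); cx[1] = 1.0; cx[0] = 0.5      # x' = x + 0.5
cy = np.zeros(10); cy[2] = 1.0                   # y' = y
img = np.random.default_rng(1).random((64, 64))
print(cubic_warp(img, cx, cy).shape)
```

The FPGA version avoids evaluating the polynomials per pixel by forward differencing, i.e., propagating the warp coordinates across a scanline with additions only.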
Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou
2013-03-01
An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and a short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1% ammonia (25%) in water/0.1% ammonia (25%) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96% of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.
Munksgaard, Niels C; Cheesman, Alexander W; Gray-Spence, Andrew; Cernusak, Lucas A; Bird, Michael I
2018-06-30
Continuous measurement of stable O and H isotope compositions in water vapour requires automated calibration for remote field deployments. We developed a new low-cost device for calibration of both water vapour mole fraction and isotope composition. We coupled a commercially available dew point generator (DPG) to a laser spectrometer and developed hardware for water and air handling along with software for automated operation and data processing. We characterised isotopic fractionation in the DPG, conducted a field test and assessed the influence of critical parameters on the performance of the device. An analysis time of 1 hour was sufficient to achieve memory-free analysis of two water vapour standards, and the δ18O and δ2H values were found to be independent of water vapour concentration over a range of ≈20,000-33,000 ppm. The reproducibility of the standard vapours over a 10-day period was better than 0.14 ‰ and 0.75 ‰ for δ18O and δ2H values, respectively (1σ, n = 11), prior to drift correction and calibration. The analytical accuracy was confirmed by the analysis of a third independent vapour standard. The DPG distillation process requires that isotope calibration take account of DPG temperature, analysis time, injected water volume and air flow rate. The automated calibration system provides high accuracy and precision and is a robust, cost-effective option for long-term field measurements of water vapour isotopes. The necessary modifications to the DPG are minor and easily reversible.
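The calibration arithmetic this implies can be sketched as a linear drift correction followed by a two-point mapping onto the reference scale; the δ-values, times, and standard assignments below are invented for illustration.

```python
# Hedged sketch of drift correction plus two-point isotope calibration.
import numpy as np

def drift_correct(times_h, deltas, std_times_h, std_measured):
    """Remove linear drift fitted to repeated analyses of one standard."""
    slope, _ = np.polyfit(std_times_h, std_measured, 1)
    return np.asarray(deltas, float) - slope * (np.asarray(times_h, float)
                                                - std_times_h[0])

def two_point_calibrate(measured, std1_meas, std2_meas, std1_true, std2_true):
    """Linear mapping through two standards (measured -> reference scale)."""
    gain = (std2_true - std1_true) / (std2_meas - std1_meas)
    return std1_true + gain * (np.asarray(measured, float) - std1_meas)

# Example with hypothetical delta-18O numbers (per mil)
drifted = drift_correct([0, 5, 10], [-10.2, -10.1, -9.9],
                        np.array([0.0, 5.0, 10.0]),
                        np.array([-12.0, -11.9, -11.8]))
print(two_point_calibrate(drifted, -12.0, -4.0, -11.5, -3.5))
```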
Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef
2016-01-01
We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SD(SE)): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SD(SE): 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice.
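For illustration, the core comparison step, deriving relative stem-bone motion from two pairwise rigid registrations, might look like the sketch below; the 4x4 transform convention and the Euler-angle parameterization are assumptions, not necessarily the authors' exact formulation.

```python
# Sketch: relative stem-bone migration from two rigid registrations.
import numpy as np
from scipy.spatial.transform import Rotation

def relative_migration(T_stem, T_bone):
    """3 translations (mm) and 3 rotations (deg), stem taken as reference."""
    T_rel = np.linalg.inv(T_stem) @ T_bone        # motion in the stem frame
    translation_mm = T_rel[:3, 3]
    rotation_deg = Rotation.from_matrix(T_rel[:3, :3]).as_euler("xyz",
                                                                degrees=True)
    return translation_mm, rotation_deg

# Toy example: bone rotated 0.1 deg about z and shifted 0.05 mm in x
T_stem = np.eye(4)
T_bone = np.eye(4)
T_bone[:3, :3] = Rotation.from_euler("z", 0.1, degrees=True).as_matrix()
T_bone[0, 3] = 0.05
print(relative_migration(T_stem, T_bone))
```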
Automated semantic indexing of figure captions to improve radiology image retrieval.
Kahn, Charles E; Rubin, Daniel L
2009-01-01
We explored automated concept-based indexing of unstructured figure captions to improve retrieval of images from radiology journals. The MetaMap Transfer program (MMTx) was used to map the text of 84,846 figure captions from 9,004 peer-reviewed, English-language articles to concepts in three controlled vocabularies from the UMLS Metathesaurus, version 2006AA. Sampling procedures were used to estimate the standard information-retrieval metrics of precision and recall, and to evaluate the degree to which concept-based retrieval improved image retrieval. Precision was estimated based on a sample of 250 concepts. Recall was estimated based on a sample of 40 concepts. The authors measured the impact of concept-based retrieval to improve upon keyword-based retrieval in a random sample of 10,000 search queries issued by users of a radiology image search engine. Estimated precision was 0.897 (95% confidence interval, 0.857-0.937). Estimated recall was 0.930 (95% confidence interval, 0.838-1.000). In 5,535 of 10,000 search queries (55%), concept-based retrieval found results not identified by simple keyword matching; in 2,086 searches (21%), more than 75% of the results were found by concept-based search alone. Concept-based indexing of radiology journal figure captions achieved very high precision and recall, and significantly improved image retrieval.
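The sampling-based evaluation reduces to simple proportion estimates; the sketch below uses the abstract's sample sizes but invented counts, and a normal-approximation confidence interval rather than whatever interval method the authors used.

```python
# Sketch: precision/recall point estimates with 95% normal-approx CIs.
import math

def proportion_ci(successes, n, z=1.96):
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, (p - half, p + half)

precision, p_ci = proportion_ci(224, 250)   # 224/250 relevant (invented)
recall, r_ci = proportion_ci(37, 40)        # 37/40 found (invented)
print(f"precision {precision:.3f} {p_ci}, recall {recall:.3f} {r_ci}")
```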
Modular multiapertures for light sensors
NASA Technical Reports Server (NTRS)
Rizzo, A. A.
1977-01-01
Process involves electroplating multiaperture masks as a unit, eliminating alignment and assembly difficulties previously encountered. Technique may be applied to masks in automated and surveillance light systems when a precise, wide-angle field of view is needed.
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Alan
2011-01-01
Current aircraft departure release times are based on manual estimates of aircraft takeoff times. Uncertainty in takeoff time estimates may result in missed opportunities to merge into constrained en route streams and lead to lost throughput. However, technology exists to improve takeoff time estimates by using the aircraft surface trajectory predictions that enable air traffic control tower (ATCT) decision support tools. NASA's Precision Departure Release Capability (PDRC) is designed to use automated surface trajectory-based takeoff time estimates to improve en route tactical departure scheduling. This is accomplished by integrating an ATCT decision support tool with an en route tactical departure scheduling decision support tool. The PDRC concept and prototype software have been developed, and an initial test was completed at air traffic control facilities in Dallas/Fort Worth. This paper describes the PDRC operational concept, system design, and initial observations.
Automatic pickup of arrival time of channel wave based on multi-channel constraints
NASA Astrophysics Data System (ADS)
Wang, Bao-Li
2018-03-01
Accurately detecting the arrival time of a channel wave in a coal seam is very important for in-seam seismic data processing. The arrival time greatly affects the accuracy of channel wave inversion and the computed tomography (CT) result. However, because the signal-to-noise ratio of in-seam seismic data is reduced by the long wavelength and strong frequency dispersion, accurately picking the arrival time of channel waves is extremely difficult. For this purpose, we propose a method that automatically picks the arrival time of channel waves based on multi-channel constraints. We first estimate the Jaccard similarity coefficient of two ray paths, then apply it as a weight coefficient for stacking the multichannel dispersion spectra. The reasonableness and effectiveness of the proposed method are verified in an application to field data. Most importantly, the method increases the degree of automation and the picking precision of the channel-wave arrival time.
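A compact sketch of the weighting idea follows: ray paths are discretized into grid cells, their Jaccard similarity is computed, and the multichannel dispersion spectra are stacked with those similarities as weights. The discretization and the synthetic spectra are assumptions for illustration.

```python
# Sketch: Jaccard-weighted stacking of multichannel dispersion spectra.
import numpy as np

def jaccard(cells_a, cells_b):
    a, b = set(cells_a), set(cells_b)
    return len(a & b) / len(a | b) if a | b else 0.0

def weighted_stack(spectra, ray_cells, ref_idx):
    """Stack spectra, weighting each trace by the Jaccard similarity
    between its ray path and the reference trace's ray path."""
    w = np.array([jaccard(ray_cells[ref_idx], c) for c in ray_cells])
    w = w / w.sum()
    return np.tensordot(w, np.asarray(spectra), axes=1)

# Three traces whose rays share varying numbers of grid cells
rays = [[(0, 0), (1, 1), (2, 2)], [(0, 0), (1, 1), (2, 3)], [(5, 5), (6, 6)]]
spectra = np.random.default_rng(2).random((3, 64, 32))  # (trace, freq, vel)
print(weighted_stack(spectra, rays, ref_idx=0).shape)
```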
Assecondi, Sara; Bianchi, A M; Hallez, H; Staelens, S; Casarotto, S; Lemahieu, I; Chiarenza, G A
2009-10-01
This article proposes a method to automatically identify and label event-related potential (ERP) components with high accuracy and precision. We present a framework, referred to as peak-picking Dynamic Time Warping (ppDTW), where a priori knowledge about the ERPs under investigation is used to define a reference signal. We developed a combination of peak-picking and Dynamic Time Warping (DTW) that makes the temporal intervals for peak-picking adaptive on the basis of the morphology of the data. We tested the procedure on experimental data recorded from a control group and from children diagnosed with developmental dyslexia. We compared our results with the traditional peak-picking. We demonstrated that our method achieves better performance than peak-picking, with an overall precision, recall and F-score of 93%, 86% and 89%, respectively, versus 93%, 80% and 85% achieved by peak-picking. We showed that our hybrid method outperforms peak-picking, when dealing with data involving several peaks of interest. The proposed method can reliably identify and label ERP components in challenging event-related recordings, thus assisting the clinician in an objective assessment of amplitudes and latencies of peaks of clinical interest.
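For reference, the DTW machinery underlying ppDTW reduces to the textbook O(nm) recursion sketched below; this is plain DTW on synthetic signals, not the authors' peak-picking hybrid.

```python
# Minimal dynamic time warping distance between two 1D signals.
import numpy as np

def dtw_distance(x, y):
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, 1, 200)
reference = np.exp(-((t - 0.30) / 0.05) ** 2)   # template ERP-like peak
recording = np.exp(-((t - 0.38) / 0.05) ** 2)   # same peak, shifted in time
print(f"DTW distance: {dtw_distance(reference, recording):.3f}")
```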
NASA Astrophysics Data System (ADS)
Villanueva, Steven; Gaudi, B. Scott; Pogge, Richard; Stassun, Keivan G.; Eastman, Jason; Trueblood, Mark; Trueblood, Pat
2018-01-01
The DEdicated MONitor of EXotransits and Transients (DEMONEXT) is a 20 inch (0.5-m) robotic telescope that has been in operation since May 2016. Fully automated, DEMONEXT has observed over 150 transits of exoplanet candidates for the KELT survey, including confirmation observations of KELT-20b. DEMONEXT achieves 2-4 mmag precision with unbinned, 20-120 second exposures, on targets orbiting V<13 host stars. Millimagnitude precision can be achieved by binning the transits on 5-6 minute timescales. During observations of 8 hours with hundreds of consecutive exposures, DEMONEXT maintains sub-pixel (<0.5 pixels) target position stability on the CCD during good observing conditions, with degraded performance during poor observing conditions (<1 pixel). DEMONEXT achieves 1% photometry on targets with V<17 in 5 minute exposures, with detection limits of V~21. In addition to the 150 transits observed by DEMONEXT, 50 supernovae and transients have been observed for the ASAS-SN supernovae group, as well as time-series observations of Galactic microlensing, active galactic nuclei, stellar variability, and stellar rotation.
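The binning arithmetic behind the quoted millimagnitude precision can be sketched directly; the light curve below is simulated white noise at an assumed 3 mmag single-exposure level with 60 s cadence.

```python
# Sketch: bin a light curve into 5-minute bins and report the scatter.
import numpy as np

def binned_rms_mmag(times_s, mags, bin_s=300.0):
    bins = (times_s // bin_s).astype(int)
    binned = np.array([mags[bins == b].mean() for b in np.unique(bins)])
    return 1000.0 * binned.std()

rng = np.random.default_rng(3)
t = np.arange(0, 8 * 3600, 60.0)                 # 8 h at 60 s cadence
m = rng.normal(0.0, 0.003, t.size)               # 3 mmag single-exposure rms
print(f"{binned_rms_mmag(t, m):.2f} mmag after 5-minute binning")
```

For uncorrelated noise the binned scatter falls roughly as the square root of the number of exposures per bin, so five 60 s exposures per bin take ~3 mmag down to ~1.3 mmag, consistent with the quoted behavior.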
Automated compound classification using a chemical ontology.
Bobach, Claudia; Böhme, Timo; Laube, Ulf; Püschel, Anett; Weber, Lutz
2012-12-29
Classification of chemical compounds into compound classes by using structure-derived descriptors is a well-established method to aid the evaluation and abstraction of compound properties in chemical compound databases. MeSH and, more recently, ChEBI are examples of chemical ontologies that provide a hierarchical classification of compounds into general compound classes of biological interest based on their structural as well as property or use features. In these ontologies, compounds have been assigned manually to their respective classes. However, with the ever-increasing possibilities to extract new compounds from text documents using name-to-structure tools, and considering the large number of compounds deposited in databases, automated and comprehensive chemical classification methods are needed to avoid the error-prone and time-consuming manual classification of compounds. In the present work we implement principles and methods to construct a chemical ontology of classes that shall support the automated, high-quality compound classification in chemical databases or text documents. While SMARTS expressions have already been used to define chemical structure class concepts, in the present work we have extended the expressive power of such class definitions by expanding their structure-based reasoning logic. Thus, to achieve the required precision and granularity of chemical class definitions, sets of SMARTS class definitions are connected by OR and NOT logical operators. In addition, AND logic has been implemented to allow the concomitant use of flexible atom lists and stereochemistry definitions. The resulting chemical ontology is a multi-hierarchical taxonomy of concept nodes connected by directed, transitive relationships. A proposal for a rule-based definition of chemical classes has been made that allows chemical compound classes to be defined more precisely than before. The proposed structure-based reasoning logic allows chemistry expert knowledge to be translated into a computer-interpretable form, preventing erroneous compound assignments and allowing automatic compound classification. The automated assignment of compounds in databases, compound structure files or text documents to their related ontology classes is possible through integration with a chemical structure search engine. As an application example, the annotation of chemical structure files with a prototypic ontology is demonstrated.
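In the same spirit, a rule combining SMARTS matches with logical operators can be sketched with RDKit; the class definition below (a primary aliphatic amine that is not an amide) is a made-up example, not one of the ontology's actual classes.

```python
# Sketch: SMARTS class definitions combined with AND/NOT logic (RDKit).
from rdkit import Chem

RULES = {
    "primary_amine": Chem.MolFromSmarts("[NX3;H2][CX4]"),
    "amide":         Chem.MolFromSmarts("[NX3][CX3](=O)"),
}

def is_primary_aliphatic_amine(smiles):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return False
    # AND / NOT logic over individual SMARTS definitions
    return (mol.HasSubstructMatch(RULES["primary_amine"])
            and not mol.HasSubstructMatch(RULES["amide"]))

print(is_primary_aliphatic_amine("CCN"))        # ethylamine -> True
print(is_primary_aliphatic_amine("CC(=O)N"))    # acetamide  -> False
```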
Elrod, JoAnn Broeckel; Merchant, Raina; Daya, Mohamud; Youngquist, Scott; Salcido, David; Valenzuela, Terence; Nichol, Graham
2017-03-29
Lay use of automated external defibrillators (AEDs) before the arrival of emergency medical services (EMS) providers on scene increases survival after out-of-hospital cardiac arrest (OHCA). However, AEDs placed in public locations may not be ready for use when needed. We describe a protocol for AED surveillance that tracks these devices through time and space to improve public health and survival, as well as to facilitate research. Included AEDs are installed in public locations for use by laypersons to treat patients with OHCA before the arrival of EMS providers on scene. Included cases of OHCA are patients evaluated by organised EMS personnel and treated for OHCA. Enrolment of 10 000 AEDs annually will yield precision of 0.4% in the estimate of readiness for use. Enrolment of 2500 patients annually will yield precision of 1.9% in the estimate of survival to hospital discharge. Recruitment began on 21 Mar 2014 and is ongoing. AEDs are found by using multiple methods. Each AED is then tagged with a label bearing a unique two-dimensional (2D) matrix code; the 2D matrix code is recorded and the location and status of the AED tracked using a smartphone; these elements are automatically passed via the internet to a secure and confidential database in real time. Whenever the 2D matrix code is rescanned for any non-clinical or clinical use of an AED, the user is queried to answer a finite set of questions about the device status. The primary outcome of any clinical use of an AED is survival to hospital discharge. Results are summarised descriptively. These activities are conducted under a grant of authority for public health surveillance from the Food and Drug Administration. Results are provided periodically to participating sites and sponsors to improve public health and quality of care.
Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C
2018-04-14
Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.
[Medical imaging in tumor precision medicine: opportunities and challenges].
Xu, Jingjing; Tan, Yanbin; Zhang, Minming
2017-05-25
Tumor precision medicine is an emerging approach for tumor diagnosis, treatment and prevention that takes account of individual variability in environment, lifestyle and genetic information. Tumor precision medicine is built upon the medical imaging innovations developed during the past decades, including new hardware, new imaging agents, standardized protocols, image analysis and multimodal imaging fusion technology. The development of automated and reproducible analysis algorithms has also allowed large amounts of information to be extracted from image-based features. With the continuous development and mining of tumor clinical and imaging databases, radiogenomics, radiomics and artificial intelligence have been flourishing. These new technological advances bring new opportunities and challenges to the application of imaging in tumor precision medicine.
Subaperture metrology technologies extend capabilities in optics manufacturing
NASA Astrophysics Data System (ADS)
Tricard, Marc; Forbes, Greg; Murphy, Paul
2005-10-01
Subaperture polishing technologies have radically changed the landscape of precision optics manufacturing and enabled the production of higher precision optics with increasingly difficult figure requirements. However, metrology is a critical piece of the optics fabrication process, and the dependence on interferometry is especially acute for computer-controlled, deterministic finishing. Without accurate full-aperture metrology, figure correction using subaperture polishing technologies would not be possible. QED Technologies has developed the Subaperture Stitching Interferometer (SSI), which extends the effective aperture and dynamic range of a phase-measuring interferometer. The SSI's novel developments in software and hardware improve the capacity and accuracy of traditional interferometers, overcoming many of the limitations previously faced. The SSI performs high-accuracy automated measurements of spheres, flats, and mild aspheres up to 200 mm in diameter by stitching subaperture data. The system combines a six-axis precision workstation, a commercial Fizeau interferometer of 4" or 6" aperture, and dedicated software. QED's software automates the measurement design, data acquisition, and mathematical reconstruction of the full-aperture phase map. The stitching algorithm incorporates a general framework for compensating several types of errors introduced by the interferometer and stage mechanics. These include positioning errors, viewing system distortion, the system reference wave error, etc. The SSI has been proven to deliver the accurate and flexible metrology that is vital to precision optics fabrication. This paper will briefly review the capabilities of the SSI as a production-ready metrology system that enables cost-effective manufacturing of precision optical surfaces.
Topics in Chemical Instrumentation. Robots in the Laboratory--An Overview.
ERIC Educational Resources Information Center
Strimaitis, Janet R.
1990-01-01
Discussed are applications of robotics in the chemistry laboratory. Highlighted are issues of precision, accuracy, and system integration. Emphasized are the potential benefits of the use of robots to automate laboratory procedures. (CW)
Automated parton-shower variations in PYTHIA 8
Mrenna, S.; Skands, P.
2016-10-03
In the era of precision physics measurements at the LHC, efficient and exhaustive estimations of theoretical uncertainties play an increasingly crucial role. In the context of Monte Carlo (MC) event generators, the estimation of such uncertainties traditionally requires independent MC runs for each variation, for a linear increase in total run time. In this work, we report on an automated evaluation of the dominant (renormalization-scale and nonsingular) perturbative uncertainties in the pythia 8 event generator, with only a modest computational overhead. Each generated event is accompanied by a vector of alternative weights (one for each uncertainty variation), with each set separately preserving the total cross section. Explicit scale-compensating terms can be included, reflecting known coefficients of higher-order splitting terms and reducing the effect of the variations. The formalism also allows for the enhancement of rare partonic splittings, such as g→bb¯ and q→qγ, to obtain weighted samples enriched in these splittings while preserving the correct physical Sudakov factors.
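Downstream, such per-event weight vectors are typically consumed by filling one histogram per variation from a single generated sample and taking the envelope as the uncertainty band; the sketch below fakes the events and weights with random numbers rather than calling the generator.

```python
# Sketch: consuming per-event variation weights from a single sample.
import numpy as np

rng = np.random.default_rng(4)
n_events, n_variations = 10_000, 5
observable = rng.exponential(20.0, n_events)             # e.g. a pT spectrum
weights = rng.normal(1.0, 0.05, (n_events, n_variations))
weights[:, 0] = 1.0                                      # nominal weight

edges = np.linspace(0.0, 100.0, 21)
hists = np.stack([np.histogram(observable, edges, weights=weights[:, k])[0]
                  for k in range(n_variations)])

nominal = hists[0]
band_lo, band_hi = hists.min(axis=0), hists.max(axis=0)  # uncertainty envelope
print(nominal[:3], band_lo[:3], band_hi[:3])
```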
Carbohydrate structure: the rocky road to automation.
Agirre, Jon; Davies, Gideon J; Wilson, Keith S; Cowtan, Kevin D
2017-06-01
With the introduction of intuitive graphical software, structural biologists who are not experts in crystallography are now able to build complete protein or nucleic acid models rapidly. In contrast, carbohydrates are in a wholly different situation: scant automation exists, with manual building attempts being sometimes toppled by incorrect dictionaries or refinement problems. Sugars are the most stereochemically complex family of biomolecules and, as pyranose rings, have clear conformational preferences. Despite this, all refinement programs may produce high-energy conformations at medium to low resolution, without any support from the electron density. This problem renders the affected structures unusable in glyco-chemical terms. Bringing structural glycobiology up to 'protein standards' will require a total overhaul of the methodology. Time is of the essence, as the community is steadily increasing the production rate of glycoproteins, and electron cryo-microscopy has just started to image them in precisely that resolution range where crystallographic methods falter most.
Xu, Yupeng; Yan, Ke; Kim, Jinman; Wang, Xiuying; Li, Changyang; Su, Li; Yu, Suqin; Xu, Xun; Feng, Dagan David
2017-09-01
Worldwide, polypoidal choroidal vasculopathy (PCV) is a common vision-threatening exudative maculopathy, and pigment epithelium detachment (PED) is an important clinical characteristic. Thus, precise and efficient PED segmentation is necessary for PCV clinical diagnosis and treatment. We propose a dual-stage learning framework via deep neural networks (DNN) for automated PED segmentation in PCV patients to avoid issues associated with manual PED segmentation (subjectivity, manual segmentation errors, and high time consumption). The optical coherence tomography scans of fifty patients were quantitatively evaluated with different algorithms and clinicians. Dual-stage DNN outperformed existing PED segmentation methods for all segmentation accuracy parameters, including true positive volume fraction (85.74 ± 8.69%), dice similarity coefficient (85.69 ± 8.08%), positive predictive value (86.02 ± 8.99%) and false positive volume fraction (0.38 ± 0.18%). Dual-stage DNN achieves accurate PED quantitative information, works with multiple types of PEDs and agrees well with manual delineation, suggesting that it is a potential automated assistant for PCV management.
NASA Astrophysics Data System (ADS)
Hermens, Ulrike; Pothen, Mario; Winands, Kai; Arntz, Kristian; Klocke, Fritz
2018-02-01
Laser-induced periodic surface structures (LIPSS), which have found applications in the field of surface functionalization, have been investigated for many years. The direction of these ripple structures, which have a periodicity on the nanoscale, can be manipulated by changing the laser polarization. For industrial use, it is desirable to manipulate the direction of these structures automatically and to obtain smooth changes of their orientation without any visible inhomogeneity. However, no system solution currently exists that controls the polarization direction in a fully automated way within a single software environment. In this paper, a system solution is presented that includes a liquid crystal polarizer to control the polarization direction. It is synchronized with a scanner, a dynamic beam expander and a five-axis system. It provides fast switching times and small step sizes. First results of fabricated structures are also presented. In a systematic study, the conjunction of LIPSS with different orientations in two parallel line scans has been investigated.
Hung, Lien-Yu; Wang, Chih-Hung; Fu, Chien-Yu; Gopinathan, Priya; Lee, Gwo-Bin
2016-08-07
Microfluidic technologies have miniaturized a variety of biomedical applications, and these chip-based systems have several significant advantages over their large-scale counterparts. Recently, this technology has been used for automating labor-intensive and time-consuming screening processes, whereby affinity reagents, including aptamers, peptides, antibodies, polysaccharides, glycoproteins, and a variety of small molecules, are used to probe for molecular biomarkers. When compared to conventional methods, the microfluidic approaches are faster, more compact, require considerably smaller quantities of samples and reagents, and can be automated. Furthermore, they allow for more precise control of reaction conditions (e.g., pH, temperature, and shearing forces) such that more efficient screening can be performed. A variety of affinity reagents for targeting cancer cells or cancer biomarkers are now available and will likely replace conventional antibodies. In this review article, the selection of affinity reagents for cancer cells or cancer biomarkers on microfluidic platforms is reviewed with the aim of highlighting the utility of such approaches in cancer diagnostics.
Software-defined Radio Based Measurement Platform for Wireless Networks
Chao, I-Chun; Lee, Kang B.; Candell, Richard; Proctor, Frederick; Shen, Chien-Chung; Lin, Shinn-Yan
2015-01-01
End-to-end latency is critical to many distributed applications and services that are based on computer networks. There has been a dramatic push to adopt wireless networking technologies and protocols (such as WiFi, ZigBee, WirelessHART, Bluetooth, ISA100.11a, etc.) into time-critical applications. Examples of such applications include industrial automation, telecommunications, power utility, and financial services. While performance measurement of wired networks has been extensively studied, measuring and quantifying the performance of wireless networks face new challenges and demand different approaches and techniques. In this paper, we describe the design of a measurement platform based on the technologies of software-defined radio (SDR) and IEEE 1588 Precision Time Protocol (PTP) for evaluating the performance of wireless networks.
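With both endpoints disciplined by IEEE 1588, one-way latency reduces to a timestamp difference; the sketch below illustrates the arithmetic with invented packet timestamps.

```python
# Sketch: one-way latency and jitter from PTP-synchronized timestamps.
import statistics

tx_times_s = [0.000000, 0.100000, 0.200000, 0.300000]   # sender clock
rx_times_s = [0.001812, 0.101704, 0.202115, 0.301990]   # receiver clock

latencies_us = [(rx - tx) * 1e6 for tx, rx in zip(tx_times_s, rx_times_s)]
print(f"mean one-way latency: {statistics.mean(latencies_us):.1f} us, "
      f"jitter (stdev): {statistics.stdev(latencies_us):.1f} us")
```

Without PTP (or equivalent synchronization), only round-trip time can be measured reliably, which is why the platform pairs SDR capture with IEEE 1588.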
Fully automated MR liver volumetry using watershed segmentation coupled with active contouring.
Huynh, Hieu Trung; Le-Trong, Ngoc; Bao, Pham The; Oto, Aytek; Suzuki, Kenji
2017-02-01
Our purpose is to develop a fully automated scheme for liver volume measurement in abdominal MR images, without requiring any user input or interaction. The proposed scheme is fully automatic for liver volumetry from 3D abdominal MR images, and it consists of three main stages: preprocessing, rough liver shape generation, and liver extraction. The preprocessing stage reduced noise and enhanced the liver boundaries in 3D abdominal MR images. The rough liver shape was revealed fully automatically by using watershed segmentation, a thresholding transform, morphological operations, and statistical properties of the liver. An active contour model was applied to refine the rough liver shape so as to precisely obtain the liver boundaries. The liver volumes calculated by the proposed scheme were compared to "gold standard" references estimated by an expert abdominal radiologist. The liver volumes computed by our scheme agreed excellently (intra-class correlation coefficient: 0.94) with the gold-standard manual volumes in an evaluation with 27 cases from multiple medical centers. The running time was 8.4 min per case on average. We developed a fully automated liver volumetry scheme for MR that does not require any interaction by users. It was evaluated with cases from multiple medical centers. The liver volumetry performance of our system was comparable to that of gold-standard manual volumetry, saving radiologists 24.7 min per case compared with manual volumetry.
NASA Astrophysics Data System (ADS)
York, Andrew M.
2000-11-01
The ever increasing sophistication of reconnaissance sensors reinforces the importance of timely, accurate, and equally sophisticated mission planning capabilities. Precision targeting and zero-tolerance for collateral damage and civilian casualties, stress the need for accuracy and timeliness. Recent events have highlighted the need for improvement in current planning procedures and systems. Annotating printed maps takes time and does not allow flexibility for rapid changes required in today's conflicts. We must give aircrew the ability to accurately navigate their aircraft to an area of interest, correctly position the sensor to obtain the required sensor coverage, adapt missions as required, and ensure mission success. The growth in automated mission planning system capability and the expansion of those systems to include dedicated and integrated reconnaissance modules, helps to overcome current limitations. Mission planning systems, coupled with extensive integrated visualization capabilities, allow aircrew to not only plan accurately and quickly, but know precisely when they will locate the target and visualize what the sensor will see during its operation. This paper will provide a broad overview of the current capabilities and describe how automated mission planning and visualization systems can improve and enhance the reconnaissance planning process and contribute to mission success. Think about the ultimate objective of the reconnaissance mission as we consider areas that technology can offer improvement. As we briefly review the fundamentals, remember where and how TAC RECCE systems will be used. Try to put yourself in the mindset of those who are on the front lines, working long hours at increasingly demanding tasks, trying to become familiar with new operating areas and equipment, while striving to minimize risk and optimize mission success. Technical advancements that can reduce the TAC RECCE timeline, simplify operations and instill Warfighter confidence, ultimately improve the desired outcome.
Interpretation of Blood Microbiology Results - Function of the Clinical Microbiologist.
Kristóf, Katalin; Pongrácz, Júlia
2016-04-01
The proper use and interpretation of blood microbiology results may be one of the most challenging and one of the most important functions of clinical microbiology laboratories. Effective implementation of this function requires careful consideration of specimen collection and processing, pathogen detection techniques, and prompt and precise reporting of identification and susceptibility results. The responsibility of the treating physician is proper formulation of the analytical request and provision of complete and precise patient information to the laboratory, which are indispensable prerequisites for proper testing and interpretation. The clinical microbiologist can offer advice concerning the differential diagnosis, sampling techniques and detection methods to facilitate diagnosis. Rapid detection methods are essential, since the sooner a pathogen is detected, the better the patient's chance of being cured. Besides the gold-standard blood culture technique, microbiologic methods that decrease the time to a relevant result are increasingly utilized today. In the case of certain pathogens, the pathogen can be identified directly from the blood culture bottle after propagation, with serological or automated/semi-automated systems, molecular methods, or MALDI-TOF MS (matrix-assisted laser desorption-ionization time of flight mass spectrometry). Molecular biology methods are also suitable for the rapid detection and identification of pathogens from aseptically collected blood samples. Another important duty of the microbiology laboratory is to notify the treating physician immediately about all relevant information if a positive sample is detected. The clinical microbiologist may provide important guidance regarding the clinical significance of blood isolates, since one-third to one-half of blood culture isolates are contaminants or isolates of unknown clinical significance. To fully exploit the benefits of blood culture and other (non-culture-based) diagnoses, the microbiologist and the clinician should interact directly.
Park, Yongjung; Park, Younhee; Joo, Shin Young; Park, Myoung Hee; Kim, Hyon-Suk
2011-11-01
We evaluated analytic performances of an automated treponemal test and compared this test with the Venereal Disease Research Laboratory test (VDRL) and fluorescent treponemal antibody absorption test (FTA-ABS). Precision performance of the Architect Syphilis TP assay (TP; Abbott Japan, Tokyo, Japan) was assessed, and 150 serum samples were assayed with the TP before and after heat inactivation to estimate the effect of heat inactivation. A total of 616 specimens were tested with the FTA-ABS and TP, and 400 were examined with the VDRL. The TP showed good precision performance with total imprecision of less than a 10% coefficient of variation. An excellent linear relationship between results before and after heat inactivation was observed (R(2) = 0.9961). The FTA-ABS and TP agreed well with a κ coefficient of 0.981. The concordance rate between the FTA-ABS and TP was the highest (99.0%), followed by the rates between FTA-ABS and VDRL (85.0%) and between TP and VDRL (83.8%). The automated TP assay may be adequate for screening for syphilis in a large volume of samples and can be an alternative to FTA-ABS.
The String Stability of a Trajectory-Based Interval Management Algorithm in the Midterm Airspace
NASA Technical Reports Server (NTRS)
Swieringa, Kurt A.
2015-01-01
NASA's first Air Traffic Management (ATM) Technology Demonstration (ATD-1) was created to facilitate the transition of mature ATM technologies from the laboratory to operational use. The technologies selected for demonstration are the Traffic Management Advisor with Terminal Metering (TMA-TM), which provides precise time-based scheduling in the terminal airspace; Controller Managed Spacing (CMS), which provides terminal controllers with decision support tools enabling precise schedule conformance; and Interval Management (IM), which consists of flight deck automation that enables aircraft to achieve or maintain a precise spacing interval behind a target aircraft. As the percentage of IM-equipped aircraft increases, controllers may provide IM clearances to sequences, or strings, of IM-equipped aircraft. It is important for these strings to maintain stable performance. This paper describes an analytical study of the string stability of the latest version of NASA's IM algorithm and a fast-time simulation designed to characterize the algorithm's string performance. The analytical study showed that the spacing algorithm has stable poles, indicating that a spacing-error perturbation will be reduced as a function of string position. The fast-time simulation investigated IM operations at two airports using constraints associated with the midterm airspace, including limited information about the target aircraft's intended speed profile and limited information about the wind forecast on the target aircraft's route. The results of the fast-time simulation demonstrated that the performance of the spacing algorithm is acceptable for strings of moderate length; however, there is some degradation in IM performance as a function of string position.
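The pole test for string stability can be illustrated in a few lines: if every pole of the discrete spacing-error transfer function lies strictly inside the unit circle, perturbations decay along the string. The denominator below is an invented second-order example, not NASA's IM algorithm.

```python
# Sketch: check discrete-time string stability via pole magnitudes.
import numpy as np

def is_string_stable(denominator_coeffs):
    """True if every pole of H(z) = num/den has magnitude < 1."""
    poles = np.roots(denominator_coeffs)
    return bool(np.all(np.abs(poles) < 1.0)), poles

stable, poles = is_string_stable([1.0, -1.2, 0.35])   # z^2 - 1.2 z + 0.35
print(stable, np.abs(poles))                          # poles at 0.7 and 0.5
```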
Auto-FPFA: An Automated Microscope for Characterizing Genetically Encoded Biosensors.
Nguyen, Tuan A; Puhl, Henry L; Pham, An K; Vogel, Steven S
2018-05-09
Genetically encoded biosensors function by linking structural change in a protein construct, typically tagged with one or more fluorescent proteins, to changes in a biological parameter of interest (such as calcium concentration, pH, phosphorylation-state, etc.). Typically, the structural change triggered by alterations in the bio-parameter is monitored as a change in either fluorescent intensity, or lifetime. Potentially, other photo-physical properties of fluorophores, such as fluorescence anisotropy, molecular brightness, concentration, and lateral and/or rotational diffusion could also be used. Furthermore, while it is likely that multiple photo-physical attributes of a biosensor might be altered as a function of the bio-parameter, standard measurements monitor only a single photo-physical trait. This limits how biosensors are designed, as well as the accuracy and interpretation of biosensor measurements. Here we describe the design and construction of an automated multimodal-microscope. This system can autonomously analyze 96 samples in a micro-titer dish and for each sample simultaneously measure intensity (photon count), fluorescence lifetime, time-resolved anisotropy, molecular brightness, lateral diffusion time, and concentration. We characterize the accuracy and precision of this instrument, and then demonstrate its utility by characterizing three types of genetically encoded calcium sensors as well as a negative control.
ICSH guidelines for the verification and performance of automated cell counters for body fluids.
Bourner, G; De la Salle, B; George, T; Tabe, Y; Baum, H; Culp, N; Keng, T B
2014-12-01
One of the many challenges facing laboratories is the verification of their automated Complete Blood Count cell counters for the enumeration of body fluids. These analyzers offer improved accuracy, precision, and efficiency in performing the enumeration of cells compared with manual methods. A patterns-of-practice survey was distributed to laboratories that participate in proficiency testing in Ontario, Canada, the United States, the United Kingdom, and Japan to determine the number of laboratories that are testing body fluids on automated analyzers and the performance specifications that were evaluated. Based on the results of this questionnaire, an International Working Group for the Verification and Performance of Automated Cell Counters for Body Fluids was formed by the International Council for Standardization in Hematology (ICSH) to prepare a set of guidelines to help laboratories plan and execute the verification of their automated cell counters to provide accurate and reliable results for automated body fluid counts. These guidelines were discussed at the ICSH General Assemblies and reviewed by an international panel of experts to achieve further consensus.
Ercan, Ertuğrul; Kırılmaz, Bahadır; Kahraman, İsmail; Bayram, Vildan; Doğan, Hüseyin
2012-11-01
Flow-mediated dilation (FMD) is used to evaluate endothelial function. Computer-assisted analysis utilizing edge detection permits continuous measurements along the vessel wall. We have developed a new fully automated software program to allow accurate and reproducible measurement. FMD was measured and analyzed in 18 coronary artery disease (CAD) patients and 17 controls, both manually and with the newly developed (computer-supported) software. The agreement between methods was assessed by Bland-Altman analysis. The mean age, body mass index and cardiovascular risk factors were higher in the CAD group. Automated FMD% was 18.3±8.5 for the control subjects and 6.8±6.5 for the CAD group (p=0.0001). The intraobserver and interobserver correlations for automated measurement were high (r=0.974, r=0.981, r=0.937, r=0.918, respectively). Manual FMD% at the 60th second was correlated with automated FMD% (r=0.471, p=0.004). The new fully automated software can be used for precise measurement of FMD, with lower intra- and interobserver variability than manual assessment.
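The Bland-Altman computation reduces to the bias and 95% limits of agreement of the paired differences; the sketch below uses fabricated FMD% pairs.

```python
# Sketch: Bland-Altman bias and 95% limits of agreement.
import numpy as np

def bland_altman(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, (bias - loa, bias + loa)

manual    = [6.1, 7.4, 18.2, 5.0, 16.9, 8.3]   # fabricated FMD% values
automated = [6.8, 7.1, 17.5, 5.9, 17.8, 7.9]
bias, limits = bland_altman(manual, automated)
print(f"bias {bias:.2f} FMD%, 95% limits of agreement {limits}")
```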
Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing
Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang
2018-01-01
Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, feature extraction algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system is applied in a project of hot water pipeline installation, with accuracy controlled within 2 mm in 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system.
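The deflection-detection step can be illustrated by thresholding a frame of the LED target, taking its intensity centroid, and scaling the pixel offset from a calibrated reference point; the camera scale, reference point, and threshold below are assumptions, not the paper's calibration values.

```python
# Sketch: lateral deviation from the centroid of a bright LED target.
import numpy as np

def target_centroid(gray, threshold=200):
    ys, xs = np.nonzero(gray >= threshold)
    if xs.size == 0:
        raise ValueError("no target pixels above threshold")
    return xs.mean(), ys.mean()

def deviation_mm(gray, ref_xy_px=(320.0, 240.0), mm_per_px=0.05):
    cx, cy = target_centroid(gray)
    return ((cx - ref_xy_px[0]) * mm_per_px, (cy - ref_xy_px[1]) * mm_per_px)

# Synthetic frame with a bright 5x5 LED blob offset from the reference
frame = np.zeros((480, 640), dtype=np.uint8)
frame[238:243, 330:335] = 255
print(deviation_mm(frame))   # ~ (+0.6 mm, 0.0 mm)
```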
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
Quality specification in haematology: the automated blood cell count.
Buttarello, Mauro
2004-08-02
Quality specifications for automated blood cell counts include topics that go beyond the traditional analytic stage (imprecision, inaccuracy, quality control) and extend to the pre- and post-analytic phases. In this review, pre-analytic aspects concerning the choice of anticoagulant, maximum storage times, and differences between storage at room temperature and at 4 °C are considered. For the analytic phase, goals for imprecision and bias obtained with various approaches (ratio to biologic variation, state of the art, specific clinical situations) are evaluated. For the post-analytic phase, medical review criteria (algorithms, decision limits, and delta checks) and the structure of the report (general part and comments), which constitutes the formal act through which a laboratory communicates with clinicians, are considered. K2EDTA is considered the anticoagulant of choice for automated cell counts. Regarding storage, specimens should be analyzed as soon as possible; storage at 4 °C may stabilize specimens for 24 to 72 h when a complete blood count (CBC) and differential leucocyte count (DLC) are performed. For precision, analytical goals based on the state of the art are acceptable, while for bias this is satisfactory only for some parameters. In haematology, quality specifications for the pre-analytic and analytic phases are important, but the review criteria and the quality of the report play a central role in assuring definite clinical value.
Effects of alcohol on automated and controlled driving performances.
Berthelon, Catherine; Gineyt, Guy
2014-05-01
Alcohol is the most frequently detected substance in fatal automobile crashes, but its precise mode of action is not always clear. The present study was designed to establish the influence of blood alcohol concentration as a function of the complexity of the driving scenario. Road scenarios involving automated or controlled driving performance were manipulated in order to identify which behavioral parameters were degraded. A single-blind, counterbalanced experiment was conducted on a driving simulator. Sixteen experienced drivers (25.3 ± 2.9 years old, 8 men and 8 women) were tested with 0, 0.3, 0.5, and 0.8 g/l of alcohol. Driving scenarios varied: road tracking, car following, and an urban scenario including events inspired by real accidents. Statistical analyses were performed on driving parameters as a function of alcohol level. Automated driving parameters, such as the standard deviation of lateral position measured in the road tracking and car following scenarios, were impaired by alcohol, notably at the highest dose. More controlled parameters, such as response time to braking and number of crashes when confronted with specific events (urban scenario), were less affected by alcohol level. Performance decrement was greater in driving scenarios involving automated processes than in scenarios involving controlled processes.
Detection of lobular structures in normal breast tissue.
Apou, Grégory; Schaadt, Nadine S; Naegel, Benoît; Forestier, Germain; Schönmeyer, Ralf; Feuerhake, Friedrich; Wemmert, Cédric; Grote, Anne
2016-07-01
Ongoing research into inflammatory conditions raises an increasing need to evaluate immune cells in histological sections in biologically relevant regions of interest (ROIs). Herein, we compare different approaches to automatically detect lobular structures in human normal breast tissue in digitized whole slide images (WSIs). This automation is required to perform objective and consistent quantitative studies on large data sets. In normal breast tissue from nine healthy patients immunohistochemically stained for different markers, we evaluated and compared three different image analysis methods to automatically detect lobular structures in WSIs: (1) a bottom-up approach using cell-based data for subsequent tissue-level classification, (2) a top-down method starting with texture classification at the tissue level, followed by analysis of cell densities in specific ROIs, and (3) direct texture classification using deep learning technology. All three methods result in comparable overall quality, allowing automated detection of lobular structures with a minor advantage in sensitivity (approach 3), specificity (approach 2), or processing time (approach 1). Combining the outputs of the approaches further improved the precision. Different approaches to automated ROI detection are feasible and should be selected according to the individual needs of biomarker research. Additionally, detected ROIs could be used as a basis for quantification of immune infiltration in lobular structures. Copyright © 2016 Elsevier Ltd. All rights reserved.
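A minimal sketch of one way such outputs can be combined: majority voting over binary ROI masks, which typically raises precision relative to any single detector. The abstract does not specify the authors' combination rule, so this is purely illustrative:

```python
# Illustrative combination of three binary lobule masks by majority vote.
# A pixel is kept only if at least two of the three approaches agree,
# which suppresses false positives unique to a single detector.
import numpy as np

def majority_vote(masks):
    """masks: list of boolean arrays of identical shape, one per approach."""
    stack = np.stack(masks).astype(int)
    return stack.sum(axis=0) >= 2
```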
A semi-automated algorithm for hypothalamus volumetry in 3 Tesla magnetic resonance images.
Wolff, Julia; Schindler, Stephanie; Lucas, Christian; Binninger, Anne-Sophie; Weinrich, Luise; Schreiber, Jan; Hegerl, Ulrich; Möller, Harald E; Leitzke, Marco; Geyer, Stefan; Schönknecht, Peter
2018-07-30
The hypothalamus, a small diencephalic gray matter structure, is part of the limbic system. Volumetric changes of this structure occur in psychiatric diseases; therefore, there is increasing interest in precise volumetry. Based on our detailed volumetry algorithm for 7 Tesla magnetic resonance imaging (MRI), we developed a method for 3 Tesla MRI, adopting anatomical landmarks and working in triplanar view. We overlaid T1-weighted MR images with gray matter tissue probability maps to combine anatomical information with tissue class segmentation. We then outlined regions of interest (ROIs) that covered potential hypothalamus voxels. Within these ROIs, a seed-growing technique helped define the hypothalamic volume using gray matter probabilities from the tissue probability maps. This yielded a semi-automated method with short processing times of 20-40 min per hypothalamus. In the MRIs of ten subjects, reliabilities were determined as intraclass correlations (ICC) and volume overlaps in percent. Three raters achieved very good intra-rater reliabilities (ICC 0.82-0.97) and good inter-rater reliabilities (ICC 0.78 and 0.82). Overlaps of intra- and inter-rater runs were very good (≥ 89.7%). We present a fast, semi-automated method for in vivo hypothalamus volumetry in 3 Tesla MRI. Copyright © 2018 Elsevier B.V. All rights reserved.
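The seed-growing step can be sketched as a flood fill over the gray-matter probability map, restricted to the outlined ROI. This is an illustrative Python rendering under assumed conventions (6-connectivity, a probability cutoff p_min), not the authors' implementation:

```python
# Sketch of seed growing on a 3D gray-matter probability map within an ROI.
import numpy as np
from collections import deque

def grow_region(prob, roi, seed, p_min=0.5):
    """Flood-fill from `seed` (a voxel index tuple), keeping 6-connected voxels
    inside `roi` whose gray-matter probability is at least `p_min`."""
    grown = np.zeros(prob.shape, dtype=bool)
    queue = deque([seed])
    offsets = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    while queue:
        v = queue.popleft()
        if grown[v] or not roi[v] or prob[v] < p_min:
            continue
        grown[v] = True
        for d in offsets:
            n = tuple(np.add(v, d))
            if all(0 <= n[i] < prob.shape[i] for i in range(3)) and not grown[n]:
                queue.append(n)
    return grown   # hypothalamic volume = grown.sum() * voxel_volume
```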
Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S
2015-03-02
A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor, followed by measurement of the luminescence of the bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri, and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. In general, aromatic cations and fluorine-containing anions displayed a higher impact on V. fischeri, as evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
Toward automated formation of microsphere arrangements using multiplexed optical tweezers
NASA Astrophysics Data System (ADS)
Rajasekaran, Keshav; Bollavaram, Manasa; Banerjee, Ashis G.
2016-09-01
Optical tweezers offer certain advantages such as multiplexing using a programmable spatial light modulator, flexibility in the choice of the manipulated object and the manipulation medium, precise control, easy object release, and minimal object damage. However, automated manipulation of multiple objects in parallel, which is essential for efficient and reliable formation of micro-scale assembly structures, poses a difficult challenge. There are two primary research issues in addressing this challenge. First, the stochastic Langevin force giving rise to Brownian motion requires motion control of all manipulated objects at fast rates of several Hz. Second, the object dynamics are non-linear and difficult to represent analytically due to the interaction of multiple optical traps manipulating neighboring objects. As a result, automated controllers have not been realized for tens of objects, particularly for three-dimensional motions with guaranteed collision avoidance. In this paper, we model the effect of interacting optical traps on microspheres with significant Brownian motion in stationary fluid media, and develop simplified state-space representations. These representations are used to design a model predictive controller that coordinates the motions of several spheres in real time. Preliminary experiments demonstrate the utility of the controller in automatically forming desired arrangements of varying configurations starting from randomly dispersed microspheres.
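To make the control idea concrete, here is a toy single-bead version: an overdamped linear trap model discretized in time, with a receding-horizon controller that picks the trap position minimizing predicted tracking error. All parameter values are hypothetical placeholders, and the real system is multi-bead, three-dimensional, and coupled through trap interactions:

```python
# Toy sketch: one overdamped bead in a linear trap, x_{k+1} = a*x_k + b*u_k + noise,
# controlled by a brute-force receding-horizon (MPC-style) search over trap positions.
import numpy as np

k, gamma, dt, sigma = 1e-6, 1e-8, 1e-3, 5e-9   # assumed stiffness, drag, step, noise (SI)
a = 1.0 - (k / gamma) * dt                      # bead relaxes toward the trap centre
b = (k / gamma) * dt                            # influence of the trap position u

def mpc_step(x, target, horizon=10):
    """Choose the trap position minimizing the sum of squared deviations from
    `target` over `horizon` steps of the deterministic linear model."""
    best_u, best_cost = target, np.inf
    for u in np.linspace(target - 1e-6, target + 1e-6, 41):   # coarse grid search
        xi, cost = x, 0.0
        for _ in range(horizon):
            xi = a * xi + b * u
            cost += (xi - target) ** 2
        if cost < best_cost:
            best_u, best_cost = u, cost
    return best_u

rng = np.random.default_rng(0)
x = 0.0
for step in range(200):                          # closed loop with Brownian noise
    u = mpc_step(x, target=2e-6)
    x = a * x + b * u + sigma * np.sqrt(dt) * rng.standard_normal()
```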
Automating lexical cross-mapping of ICNP to SNOMED CT.
Kim, Tae Youn
2016-01-01
The purpose of this study was to examine the feasibility of automating lexical cross-mapping of a logic-based nursing terminology (ICNP) to SNOMED CT using the Unified Medical Language System (UMLS) maintained by the U.S. National Library of Medicine. A two-stage approach included pattern identification, followed by application and evaluation of an automated term matching procedure. The performance of the automated procedure was evaluated using a test set against a gold standard (i.e., a concept equivalency table) created independently by terminology experts. There were lexical similarities between ICNP diagnostic concepts and SNOMED CT. The automated term matching procedure was reliable, with a recall of 65%, precision of 79%, accuracy of 82%, F-measure of 0.71, and an area under the receiver operating characteristic (ROC) curve of 0.78 (95% CI 0.73-0.83). When the automated procedure was unable to retrieve lexically matched concepts, terminology experts were also unlikely to identify a matched SNOMED CT concept. Although further research is warranted to enhance the automated matching procedure, the combination of cross-maps from the UMLS and the automated procedure is useful for generating candidate mappings and thus assisting the ongoing maintenance of mappings, which is a significant burden for terminology developers.
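For reference, the reported evaluation metrics follow directly from the confusion counts of the matching procedure against the gold standard; a small sketch with hypothetical variable names:

```python
# Metrics of a term-matching procedure from its confusion counts:
# tp = correct matches found, fp = spurious matches, fn = missed matches,
# tn = correctly reported non-matches.
def matching_metrics(tp: int, fp: int, fn: int, tn: int):
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f_measure = 2 * precision * recall / (precision + recall)
    return recall, precision, accuracy, f_measure
```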
Atlas-based automatic measurements of the morphology of the tibiofemoral joint
NASA Astrophysics Data System (ADS)
Brehler, M.; Thawait, G.; Shyr, W.; Ramsay, J.; Siewerdsen, J. H.; Zbijewski, W.
2017-03-01
Purpose: Anatomical metrics of the tibiofemoral joint support assessment of joint stability and surgical planning. We propose an automated, atlas-based algorithm to streamline the measurements in 3D images of the joint and reduce user-dependence of the metrics arising from manual identification of the anatomical landmarks. Methods: The method is initialized with coarse registrations of a set of atlas images to the fixed input image. The initial registrations are then refined separately for the tibia and femur, and the best matching atlas is selected. Finally, the anatomical landmarks of the best matching atlas are transformed onto the input image by deforming a surface model of the atlas to fit the shape of the tibial plateau in the input image (a mesh-to-volume registration). We apply the method to weight-bearing volumetric images of the knee obtained from 23 subjects using an extremity cone-beam CT system. Results of the automated algorithm were compared to an expert radiologist for measurements of Static Alignment (SA), Medial Tibial Slope (MTS) and Lateral Tibial Slope (LTS). Results: Intra-reader variability as high as 10% for LTS and 7% for MTS (ratio of standard deviation to the mean in repeated measurements) was found for the expert radiologist, illustrating the potential benefits of an automated approach in improving the precision of the metrics. The proposed method achieved excellent registration of the atlas mesh to the input volumes. The resulting automated measurements yielded high correlations with the expert radiologist, as indicated by correlation coefficients of 0.72 for MTS, 0.8 for LTS, and 0.89 for SA. Conclusions: The automated method for measurement of anatomical metrics of the tibiofemoral joint achieves high correlation with an expert radiologist without the need for time-consuming and error-prone manual selection of landmarks.
Prüller, Florian; Wagner, Jasmin; Raggam, Reinhard B; Hoenigl, Martin; Kessler, Harald H; Truschnig-Wilders, Martie; Krause, Robert
2014-07-01
Testing for (1→3)-beta-D-glucan (BDG) is used for detection of invasive fungal infection. However, current assays lack automation and the ability to conduct rapid single-sample testing. The Fungitell assay was adapted for automation and evaluated using clinical samples from patients with culture-proven candidemia and from culture-negative controls, in duplicate. A comparison with the standard assay protocol was made in order to establish analytical specifications. With the automated protocol, the analytical measuring range was 8-2500 pg/ml of BDG, and precision testing resulted in coefficients of variation that ranged from 3.0% to 5.5%. Samples from 15 patients with culture-proven candidemia and 94 culture-negative samples were evaluated. All culture-proven samples showed BDG values >80 pg/ml (mean 1247 pg/ml; range, 116-2990 pg/ml), which were considered positive. Of the 94 culture-negative samples, 92 had BDG values <60 pg/ml (mean, 28 pg/ml), which were considered negative, and 2 samples were false-positive (≥80 pg/ml; up to 124 pg/ml). Results could be obtained within 45 min and showed excellent agreement with results obtained with the standard assay protocol. The automated Fungitell assay proved to be reliable and rapid for diagnosis of candidemia. It was demonstrated to be feasible and cost efficient for both single-sample and large-scale testing of serum BDG. Its 1-h time-to-result will allow better support for clinicians in the management of antifungal therapy. © The Author 2014. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved.
Cordero-Vaca, María; Trujillo-Rodríguez, María J; Zhang, Cheng; Pino, Verónica; Anderson, Jared L; Afonso, Ana M
2015-06-01
Four different crosslinked polymeric ionic liquid (PIL)-based sorbent coatings were evaluated in an automated direct-immersion solid-phase microextraction method (automated DI-SPME) in combination with gas chromatography (GC). The crosslinked PIL coatings were based on vinyl-alkylimidazolium- (ViCnIm-) or vinylbenzyl-alkylimidazolium- (ViBzCnIm-) IL monomers, and di-(vinylimidazolium)dodecane ((ViIm)2C12-) or di-(vinylbenzylimidazolium)dodecane ((ViBzIm)2C12-) dicationic IL crosslinkers. In addition, a PIL-based hybrid coating containing multi-walled carbon nanotubes (MWCNTs) was also studied. The studied PIL coatings were covalently attached to derivatized nitinol wires and mounted onto the Supelco assembly to ensure automation when acting as SPME coatings. Their behavior was evaluated in the determination of a group of water pollutants, after proper optimization. A comparison was carried out with three common commercial SPME fibers. It was observed that those PILs containing a benzyl group in their structures, either in the IL monomer and crosslinker (PIL-1-1) or only in the crosslinker (PIL-0-1), were the most efficient sorbents for the selected analytes. The validation of the overall automated DI-SPME-GC-flame ionization detector (FID) method gave limits of detection down to 135 μg · L(-1) for p-cresol when using the PIL-1-1 and down to 270 μg · L(-1) when using the PIL-0-1; despite their coating thickness: ~2 and ~5 μm, respectively. Average relative recoveries with waters were of 85 ± 14 % and 87 ± 15 % for PIL-1-1 and PIL-0-1, respectively. Precision values as relative standard deviation were always lower than 4.9 and 7.6 % (spiked level between 10 and 750 μg · L(-1), as intra-day precision). Graphical Abstract Automated DI-SPME-GC-FID using crosslinked-PILs sorbent coatings for the determination of waterpollutants.
Evaluation of automated assays for immunoglobulin G, M, and A measurements in dog and cat serum.
Tvarijonaviciute, Asta; Martínez-Subiela, Silvia; Caldin, Marco; Tecles, Fernando; Ceron, Jose J
2013-09-01
Measurements of immunoglobulins (Igs) in companion animals can be useful to detect deficiencies of the humoral immune system that can be associated with opportunistic or chronic infections, or with other immune-mediated disorders including B-cell neoplasms. The purpose of this study was to evaluate commercially available automated immunoturbidimetric assays designed for human IgG, M, and A measurements in canine and feline serum using species-specific calibrators. Canine and feline serum samples with different IgG, M, and A concentrations were used for the analytical validation of the assays. Intra- and inter-assay precision, linearity under dilution, spiking recovery, and limit of detection were determined. In addition, the effects of lipemia, hemolysis, and bilirubinemia were evaluated. Finally, Ig concentrations were determined in small groups of diseased dogs and cats and compared with healthy groups. Spiking recovery and linearity under dilution tests showed that the assays measured Igs in canine and feline serum samples precisely and accurately. Intra- and inter-assay imprecision was lower than 15% in all cases. Significantly higher IgG, IgM, and IgA levels were observed in dogs with leishmaniasis, while dogs with pyometra showed a statistically significant increase in IgM and IgA concentrations in comparison with healthy dogs. Significantly higher IgG and IgM levels were observed in FIV-infected cats compared with healthy ones. The automated human Ig assays showed adequate precision and accuracy with serum samples from dogs and cats. They were also able to discriminate different concentrations of Igs in healthy and diseased animals. © 2013 American Society for Veterinary Clinical Pathology.
Automated and model-based assembly of an anamorphic telescope
NASA Astrophysics Data System (ADS)
Holters, Martin; Dirks, Sebastian; Stollenwerk, Jochen; Loosen, Peter
2018-02-01
Since the first use of optical glasses there has been an increasing demand for optical systems that are highly customized for a wide field of applications. To meet the challenge of producing so many unique systems, the development of new techniques and approaches has risen in importance. However, the assembly of precision optical systems with lot sizes of one up to a few tens of systems is still dominated by manual labor. In contrast, highly adaptive and model-based approaches may offer a solution for manufacturing with a high degree of automation and high throughput while maintaining high precision. In this work, a model-based automated assembly approach based on ray tracing is presented. The process runs autonomously and covers a wide range of functionality: it first identifies the sequence for an optimized assembly and then generates and matches intermediate figures of merit to predict the overall optical functionality of the optical system. The process also generates a digital twin of the optical system by mapping key performance indicators, such as the first and second moments of intensity, into the optical model. This approach is verified by the automatic assembly of an anamorphic telescope within an assembly cell. By continuously measuring and mapping the key performance indicators into the optical model, the quality of the digital twin is determined. Moreover, by measuring the optical quality and geometrical parameters of the telescope, the precision of this approach is determined. Finally, the productivity of the process is evaluated by monitoring the speed of the different steps of the process.
Design and implementation of a compliant robot with force feedback and strategy planning software
NASA Technical Reports Server (NTRS)
Premack, T.; Strempek, F. M.; Solis, L. A.; Brodd, S. S.; Cutler, E. P.; Purves, L. R.
1984-01-01
Force-feedback robotics techniques are being developed for automated precision assembly and servicing of NASA space flight equipment. Design and implementation of a prototype robot that provides compliance and monitors forces is in progress. Computer software that specifies assembly steps and makes force-feedback adjustments during assembly has been coded and tested for three generically different precision mating problems. A model program demonstrates that a suitably autonomous robot can plan its own strategy.
Automated assembly of fast-axis collimation (FAC) lenses for diode laser bar modules
NASA Astrophysics Data System (ADS)
Miesner, Jörn; Timmermann, Andre; Meinschien, Jens; Neumann, Bernhard; Wright, Steve; Tekin, Tolga; Schröder, Henning; Westphalen, Thomas; Frischkorn, Felix
2009-02-01
Laser diodes and diode laser bars are key components in high power semiconductor lasers and solid state laser systems. During manufacture, the assembly of the fast axis collimation (FAC) lens is a crucial step. The goal of our activities is to design an automated assembly system for high volume production. In this paper the results of an intermediate milestone are reported: a demonstration system was designed, realized and tested to prove the feasibility of all of the system components and process features. The demonstration system consists of a high precision handling system, metrology for process feedback, a powerful digital image processing system and tooling for glue dispensing, UV curing and laser operation. The system components as well as their interaction with each other were tested in an experimental system in order to glean design knowledge for the fully automated assembly system. The adjustment of the FAC lens is performed by a series of predefined steps monitored by two cameras concurrently imaging the far field and the near field intensity distributions. Feedback from these cameras, processed by a powerful and efficient image processing algorithm, controls a five-axis precision motion system to optimize the fast axis collimation of the laser beam. Automated cementing of the FAC to the diode bar completes the process. The presentation will show the system concept, the adjustment algorithm, and experimental results; a critical discussion of the results will close the talk.
NASA Astrophysics Data System (ADS)
Venkataraman, Sankar; Li, Wenjing
2008-03-01
Image analysis for automated diagnosis of cervical cancer has attained high prominence in the last decade. Automated image analysis at all levels requires a basic segmentation of the region of interest (ROI) within a given image. The precision of the diagnosis is often reflected by the precision in detecting the initial region of interest, especially when features outside the ROI mimic those within it. The work described here discusses algorithms used to improve the cervical region of interest as part of automated cervical image diagnosis. A vital visual aid in diagnosing cervical cancer is the aceto-whitening of the cervix after the application of acetic acid. Color and texture are used to segment acetowhite regions within the cervical ROI. Vaginal walls along with cotton swabs sometimes mimic these essential features, leading to several false positives. The work presented here focuses on detecting in-focus vaginal wall boundaries and then extrapolating them to exclude vaginal walls from the cervical ROI. In addition, a marker-controlled watershed segmentation is used to detect cotton swabs within the cervical ROI. A dataset comprising 50 high-resolution images of the cervix acquired after 60 seconds of acetic acid application was used to test the algorithm. Of the 50 images, 27 benefited from a new cervical ROI. Significant improvement in overall diagnosis was observed in these images, as false positives caused by features outside the actual ROI mimicking the acetowhite region were eliminated.
Egger, Robert; Narayanan, Rajeevan T.; Helmstaedter, Moritz; de Kock, Christiaan P. J.; Oberlaender, Marcel
2012-01-01
The three-dimensional (3D) structure of neural circuits is commonly studied by reconstructing individual or small groups of neurons in separate preparations. Investigation of structural organization principles or quantification of dendritic and axonal innervation thus requires integration of many reconstructed morphologies into a common reference frame. Here we present a standardized 3D model of the rat vibrissal cortex and introduce an automated registration tool that allows for precise placement of single neuron reconstructions. We (1) developed an automated image processing pipeline to reconstruct 3D anatomical landmarks, i.e., the barrels in Layer 4, the pia and white matter surfaces and the blood vessel pattern from high-resolution images, (2) quantified these landmarks in 12 different rats, (3) generated an average 3D model of the vibrissal cortex and (4) used rigid transformations and stepwise linear scaling to register 94 neuron morphologies, reconstructed from in vivo stainings, to the standardized cortex model. We find that anatomical landmarks vary substantially across the vibrissal cortex within an individual rat. In contrast, the 3D layout of the entire vibrissal cortex remains remarkably preserved across animals. This allows for precise registration of individual neuron reconstructions with approximately 30 µm accuracy. Our approach could be used to reconstruct and standardize other anatomically defined brain areas and may ultimately lead to a precise digital reference atlas of the rat brain. PMID:23284282
Automated Semantic Indexing of Figure Captions to Improve Radiology Image Retrieval
Kahn, Charles E.; Rubin, Daniel L.
2009-01-01
Objective We explored automated concept-based indexing of unstructured figure captions to improve retrieval of images from radiology journals. Design The MetaMap Transfer program (MMTx) was used to map the text of 84,846 figure captions from 9,004 peer-reviewed, English-language articles to concepts in three controlled vocabularies from the UMLS Metathesaurus, version 2006AA. Sampling procedures were used to estimate the standard information-retrieval metrics of precision and recall, and to evaluate the degree to which concept-based retrieval improved image retrieval. Measurements Precision was estimated based on a sample of 250 concepts. Recall was estimated based on a sample of 40 concepts. The authors measured the impact of concept-based retrieval to improve upon keyword-based retrieval in a random sample of 10,000 search queries issued by users of a radiology image search engine. Results Estimated precision was 0.897 (95% confidence interval, 0.857–0.937). Estimated recall was 0.930 (95% confidence interval, 0.838–1.000). In 5,535 of 10,000 search queries (55%), concept-based retrieval found results not identified by simple keyword matching; in 2,086 searches (21%), more than 75% of the results were found by concept-based search alone. Conclusion Concept-based indexing of radiology journal figure captions achieved very high precision and recall, and significantly improved image retrieval. PMID:19261938
Fusing Continuous-Valued Medical Labels Using a Bayesian Model.
Zhu, Tingting; Dunkley, Nic; Behar, Joachim; Clifton, David A; Clifford, Gari D
2015-12-01
With the rapid increase in the volume of time series medical data available through wearable devices, there is a need to employ automated algorithms to label data. Examples of labels include interventions, changes in activity (e.g. sleep) and changes in physiology (e.g. arrhythmias). However, automated algorithms tend to be unreliable, resulting in lower quality care. Expert annotations are scarce, expensive, and prone to significant inter- and intra-observer variance. To address these problems, a Bayesian Continuous-valued Label Aggregator (BCLA) is proposed to provide a reliable estimate of the aggregated label while accurately inferring the precision and bias of each algorithm. The BCLA was applied to QT interval (a pro-arrhythmic indicator) estimation from the electrocardiogram using labels from the 2006 PhysioNet/Computing in Cardiology Challenge database. It was compared to mean and median voting and to a previously proposed Expectation Maximization (EM) label aggregation approach. While accurately predicting each labelling algorithm's bias and precision, the root-mean-square error of the BCLA was 11.78 ± 0.63 ms, significantly outperforming the best Challenge entry (15.37 ± 2.13 ms) as well as the EM, mean, and median voting strategies (14.76 ± 0.52, 17.61 ± 0.55, and 14.43 ± 0.57 ms respectively, with p < 0.0001). The BCLA can therefore provide accurate estimation for continuous-valued medical labelling tasks in an unsupervised manner, even when the ground truth is not available.
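The BCLA model itself is not reproduced here, but the following simplified, illustrative aggregator captures the spirit of the approach: iteratively estimate each algorithm's bias and variance against the current consensus, then re-weight its labels by inverse variance:

```python
# Simplified, illustrative continuous-label aggregator (not the authors' BCLA):
# alternate between (1) estimating each algorithm's bias and variance against
# the consensus and (2) recomputing the consensus as a precision-weighted mean.
import numpy as np

def aggregate(labels, n_iter=50):
    """labels: (n_algorithms, n_records) continuous annotations (e.g. QT in ms)."""
    estimate = labels.mean(axis=0)                      # start from the plain mean
    for _ in range(n_iter):
        bias = (labels - estimate).mean(axis=1)         # per-algorithm bias
        var = (labels - estimate - bias[:, None]).var(axis=1) + 1e-12
        w = 1.0 / var                                   # inverse-variance weights
        estimate = ((labels - bias[:, None]) * w[:, None]).sum(axis=0) / w.sum()
    return estimate, bias, var
```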
Searching for Variables in one of the WHAT Fields
NASA Astrophysics Data System (ADS)
Shporer, A.; Mazeh, T.; Moran, A.; Bakos, G.; Kovacs, G.
2007-07-01
We present preliminary results for a single field observed by WHAT, a small-aperture, short-focal-length automated telescope with an 8.2° × 8.2° field of view, located at the Wise Observatory. The system is similar to the members of HATNet (http://cfa-www.harvard.edu/~gbakos/HAT/) and is aimed at searching for transiting extrasolar planets and variable objects. With 5 min integration times, the telescope achieved a precision of a few mmag for the brightest objects. We detect variables with amplitudes below 0.01 mag. All 152 periodic variables are presented at http://wise-obs.tau.ac.il/~amit/236/.
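As an illustration of the kind of period search used on such photometry, here is a sketch using astropy's Lomb-Scargle periodogram; the WHAT pipeline's actual detection method is not described in this abstract, and the period bounds are assumed:

```python
# Sketch of a periodicity search on an unevenly sampled light curve (t, mag).
import numpy as np
from astropy.timeseries import LombScargle

def best_period(t, mag, min_p=0.05, max_p=10.0):
    """Return the most significant period (same units as t) and its
    false-alarm probability."""
    ls = LombScargle(t, mag)
    freq, power = ls.autopower(minimum_frequency=1.0 / max_p,
                               maximum_frequency=1.0 / min_p)
    i = int(np.argmax(power))
    return 1.0 / freq[i], ls.false_alarm_probability(power[i])
```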
Automated tagging of pharmaceutically active thiols under flow conditions using monobromobimane.
Tzanavaras, Paraskevas D; Karakosta, Theano D
2011-03-25
The thiol-specific derivatization reagent monobromobimane (MBB) is applied, for the first time, under flow conditions. Sequential injection analysis allows the handling of precise volumes of the reagent in the microliter range. The effects of the main chemical and instrumental variables were investigated using captopril (CAP), N-acetylcysteine (NAC) and penicillamine (PEN) as representative pharmaceutically active thiols. Previously reported hydrolysis of MBB due to interaction with nucleophilic components of the buffers was avoided kinetically under flow conditions. The proposed analytical scheme is suitable for the fluorimetric determination of thiols at a sampling rate of 36 h(-1). Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wyers, G. P.; Otjes, R. P.; Slanina, J.
A new diffusion denuder is described for the continuous measurement of atmospheric ammonia. Ammonia is collected in an absorption solution in a rotating denuder, separated from interfering compounds by diffusion through a semi-permeable membrane and detected by conductometry. The method is free from interferences by other atmospheric gases, with the exception of volatile amines. The detection limit is 6 ng m-3 for a 30-min integration time. This compact instrument is fully automated and suited for routine deployment in field studies. The precision is sufficiently high for micrometeorological studies of air-surface exchange of ammonia.
Study on the Automatic Detection Method and System of Multifunctional Hydrocephalus Shunt
NASA Astrophysics Data System (ADS)
Sun, Xuan; Wang, Guangzhen; Dong, Quancheng; Li, Yuzhong
2017-07-01
Addressing the difficulties of micro-pressure detection and micro-flow control in the testing of hydrocephalus shunts, the principles of shunt performance testing were analyzed. In this study, the authors analyzed the detection principles of several shunt performance test items and used an advanced micro-pressure sensor and a micro-flow peristaltic pump to overcome the technical challenges of micro-pressure detection and micro-flow control. The study also integrated many common test items and successfully developed an automatic detection system for shunt performance testing, achieving tests with high precision, high efficiency, and full automation.
NASA Technical Reports Server (NTRS)
Nesthus, Thomas E.; Schiflett, Sammuel G.
1993-01-01
Hypobaric decompression sickness (DCS) research presents the medical monitor with the difficult task of assessing the onset and progression of DCS largely on the basis of subjective symptoms. Even with the introduction of precordial Doppler ultrasound techniques for the detection of venous gas emboli (VGE), correct prediction of DCS can be made only about 65 percent of the time, according to data from the Armstrong Laboratory's (AL's) hypobaric DCS database. An AL research protocol concerned with exercise and its effects on denitrogenation efficiency includes implementation of a performance assessment test battery to evaluate cognitive functioning during a 4-h simulated 30,000 ft (9144 m) exposure. Information gained from such a test battery may assist the medical monitor in identifying early signs of DCS and subtle neurologic dysfunction related to cases of asymptomatic, but advanced, DCS. This presentation concerns the selection and integration of a test battery and the timely graphic display of subject test results for the principal investigator and medical monitor. A subset of the Automated Neuropsychological Assessment Metrics (ANAM), developed through the Office of Military Performance Assessment Technology (OMPAT), was selected. The ANAM software provides a library of simple tests designed for precise measurement of processing efficiency in a variety of cognitive domains. For our application and time constraints, two tests requiring high levels of cognitive processing and memory were chosen, along with one test requiring fine psychomotor performance. Accuracy, speed, and processing throughput variables, as well as RMS error, were collected. An automated mood survey provided 'state' information on six scales: anger, happiness, fear, depression, activity, and fatigue. An integrated and interactive LOTUS 1-2-3 macro was developed to import and display past and present task performance and mood-change information.
Farber, Joshua M; Totterman, Saara M S; Martinez-Torteya, Antonio; Tamez-Peña, Jose G
2016-02-01
Subchondral bone (SCB) undergoes changes in the shape of the articulating bone surfaces and is currently recognized as a key target in osteoarthritis (OA) treatment. The aim of this study was to present an automated system that determines the curvature of the SCB regions of the knee and to evaluate its cross-sectional and longitudinal scan-rescan precision. Six subjects with OA and six control subjects were selected from the Osteoarthritis Initiative (OAI) pilot study database. As per the OAI protocol, these subjects underwent 3T MRI at baseline and every twelve months thereafter, including a 3D DESS WE sequence. We analyzed the baseline and twenty-four-month images. Each subject was scanned twice at these visits, thus generating scan-rescan information. Images were segmented with an automated multi-atlas framework platform, and 3D renderings of the bone structure were created from the segmentations. Curvature maps were extracted from the 3D renderings and morphed into a reference atlas to determine precision, generate population statistics, and visualize cross-sectional and longitudinal curvature changes. The baseline scan-rescan root mean square error values ranged from 0.006 mm(-1) to 0.013 mm(-1) for the SCB of the femur and from 0.007 mm(-1) to 0.018 mm(-1) for that of the tibia. The standardized response mean of the longitudinal changes in curvature in these regions ranged from -0.09 to 0.02 and from -0.016 to 0.015, respectively. The fully automated system produces accurate and precise curvature maps of femoral and tibial SCB, and will provide a valuable tool for the analysis of curvature changes of articulating bone surfaces during the course of knee OA. Copyright © 2015 Elsevier Ltd. All rights reserved.
Semi-automated 96-well liquid-liquid extraction for quantitation of drugs in biological fluids.
Zhang, N; Hoffman, K L; Li, W; Rossi, D T
2000-02-01
A semi-automated liquid-liquid extraction (LLE) technique for biological fluid sample preparation was introduced for the quantitation of four drugs in rat plasma. All liquid transferring during sample preparation was automated using a Tomtec Quadra 96 Model 320 liquid handling robot, which processed up to 96 samples in parallel. The samples were in either 96-deep-well plate or tube-rack format. One plate of samples can be prepared in approximately 1.5 h, and the 96-well plate is directly compatible with the autosampler of an LC/MS system. Selection of organic solvents and recoveries are discussed. In addition, precision, relative error, linearity and quantitation of the semi-automated LLE method are estimated for four example drugs using LC/MS/MS with a multiple reaction monitoring (MRM) approach. The applicability of this method and future directions are evaluated.
Islam, Asef; Oldham, Michael J; Wexler, Anthony S
2017-11-01
Mammalian lungs are comprised of large numbers of tracheobronchial airways that transition from the trachea to the alveoli. Studies as wide ranging as pollutant deposition and lung development rely on accurate characterization of these airways. Advancements in CT imaging and the value of computational approaches in eliminating the burden of manual measurement are providing increased efficiency in obtaining these geometric data. In this study, we compare an automated method to a manual one for the first six generations of three Balb/c mouse lungs. We find good agreement between the manual and automated methods, and much of the disagreement can be attributed to method precision. Using the automated method, we then provide anatomical data for the entire tracheobronchial airway tree of three Balb/c mice. Anat Rec, 300:2046-2057, 2017. © 2017 Wiley Periodicals, Inc.
Flexible manufacturing for photonics device assembly
NASA Technical Reports Server (NTRS)
Lu, Shin-Yee; Pocha, Michael D.; Strand, Oliver T.; Young, K. David
1994-01-01
The assembly of photonics devices such as laser diodes, optical modulators, and opto-electronic multi-chip modules (OEMCM) usually requires the placement of micron-size devices such as laser diodes, and sub-micron-precision attachment between optical fibers and diodes or waveguide modulators (usually referred to as pigtailing). This is a very labor-intensive process. Studies by the opto-electronics (OE) industry have shown that 95 percent of the cost of a pigtailed photonic device is due to the use of manual alignment and bonding techniques, which is the current practice in industry. At Lawrence Livermore National Laboratory, we are working to reduce the cost of packaging OE devices through the use of automation. Our efforts are concentrated on several areas directly related to an automated process. This paper focuses on our progress in two of those areas: an automated fiber pigtailing machine and silicon micro-technology compatible with an automated process.
Plouchart, Diane; Guizard, Guillaume; Latrille, Eric
2018-01-01
Continuous cultures in chemostats have proven their value in microbiology, microbial ecology, systems biology and bioprocess engineering, among others. In these systems, microbial growth and ecosystem performance can be quantified under stable and defined environmental conditions. This is essential when linking microbial diversity to ecosystem function. Here, a new system to test this link in anaerobic, methanogenic microbial communities is introduced. Rigorously replicated experiments or a suitable experimental design typically require operating several chemostats in parallel. However, this is labor intensive, especially when measuring biogas production. Commercial solutions for multiplying reactors performing continuous anaerobic digestion exist but are expensive and use comparably large reactor volumes, requiring the preparation of substantial amounts of media. Here, a flexible Lab-scale Automated and Multiplexed Anaerobic Chemostat system (LAMACs) with a working volume of 200 mL is introduced. Sterile feeding, biomass wasting and pressure monitoring are automated. One module containing six reactors fits the typical dimensions of a lab bench. Thanks to automation, the time required for reactor operation and maintenance is reduced compared to traditional lab-scale systems. Several modules can be used together; so far, the parallel operation of 30 reactors has been demonstrated. The chemostats are autoclavable. Parameters such as reactor volume, flow rates and operating temperature can be freely set. The robustness of the system was tested in a two-month-long experiment in which three inocula in four replicates, i.e., twelve continuous digesters, were monitored. Statistically significant differences in biogas production between inocula were observed. In anaerobic digestion, biogas production, and consequently pressure development in a closed environment, is a proxy for ecosystem performance. The precision of the pressure measurement is thus crucial. The measured maximum and minimum rates of gas production could be determined at the same precision. The LAMACs is a tool that enables us to put into practice the often-demanded need for replication and rigorous testing in microbial ecology as well as bioprocess engineering. PMID:29518106
Cluet, David; Spichty, Martin; Delattre, Marie
2014-01-01
The mitotic spindle is a microtubule-based structure that elongates to accurately segregate chromosomes during anaphase. Its position within the cell also dictates the future cell cleavage plane, thereby determining daughter cell orientation within a tissue or cell fate adoption for polarized cells. The mitotic spindle therefore ensures both proper cell division and developmental precision, and consequently spindle dynamics is the subject of intensive research. Among the different cellular models that have been explored, the one-cell stage C. elegans embryo has been an essential and powerful system for dissecting the molecular and biophysical basis of spindle elongation and positioning. Indeed, in this large and transparent cell, spindle poles (or centrosomes) can easily be detected by eye from simple DIC microscopy. To perform quantitative and high-throughput analysis of spindle motion, we developed a computer program, ACT (Automated Centrosome Tracking), for DIC movies of C. elegans embryos. We therefore offer an alternative to the image acquisition and processing of transgenic lines expressing fluorescent spindle markers. Consequently, experiments on large sets of cells can be performed with a simple setup using inexpensive microscopes. Moreover, analysis of any mutant or wild-type background is accessible because laborious rounds of crosses with transgenic lines become unnecessary. Last, our program allows spindle detection in other nematode species, which offer the same quality of DIC images but for which transgenesis techniques are not available. Our program thus also opens the way towards a quantitative evolutionary approach to spindle dynamics. Overall, our computer program is a unique macro for the image- and movie-processing platform ImageJ. It is user-friendly and freely available under an open-source licence. ACT allows batch-wise analysis of large sets of mitosis events: within 2 minutes a single movie is processed, and the accuracy of the automated tracking matches the precision of the human eye. PMID:24763198
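ACT itself is an ImageJ macro; the sketch below is only a schematic Python rendering of the same idea, detecting the two darkest blobs per DIC frame and linking them across frames by nearest-neighbour assignment. The smoothing scale and percentile are assumed values, and both poles are assumed visible in every frame:

```python
# Schematic centrosome tracking in a DIC movie (illustrative, not the ACT macro).
import numpy as np
from scipy import ndimage

def track_centrosomes(movie, percentile=0.5):
    """movie: (n_frames, h, w) grayscale array. Returns (n_frames, 2, 2) positions."""
    tracks = []
    for frame in movie:
        smooth = ndimage.gaussian_filter(frame.astype(float), sigma=3)
        mask = smooth <= np.percentile(smooth, percentile)   # darkest pixels
        labels, n = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        keep = np.argsort(sizes)[-2:] + 1                    # two largest dark blobs
        pts = np.array(ndimage.center_of_mass(mask, labels, keep))
        if tracks:                                           # keep pole identity stable
            prev = tracks[-1]
            if (np.linalg.norm(pts[0] - prev[0]) + np.linalg.norm(pts[1] - prev[1]) >
                np.linalg.norm(pts[0] - prev[1]) + np.linalg.norm(pts[1] - prev[0])):
                pts = pts[::-1]
        tracks.append(pts)
    return np.array(tracks)
```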
Remote Sensing and Information Technology for Large Farms
NASA Technical Reports Server (NTRS)
Williams, John E.; Ramsay, Jimmie A.
2002-01-01
A method of applying remote sensing (RS) and information management technology to help large farms produce at maximum efficiency is undergoing development. The novelty of the method does not lie in the concept of "precision agriculture," which involves varying seeding, application of chemicals, and irrigation according to spatially and temporally local variations in the growth stages and health of crops and in the chemical and physical conditions of soils. The novelty also does not lie in the use of RS data registered with other data in a geographic information system (GIS) to guide the use of precision agricultural techniques. Instead, the novelty lies in a systematic approach to overcoming obstacles that have heretofore impeded the timely distribution of reliable, relevant, and sufficient GIS data to support day-to-day, acre-to-acre decisions concerning the application of precision agricultural techniques to increase production and decrease cost. The development and promotion of the method are inspired in part by a vision of equipping farm machinery to accept GIS (including RS) data and using the data for automated or semiautomated implementation of precision agricultural techniques. Primary examples of relevant GIS data include information on plant stress, soil moisture, and effects of applied chemicals, all derived by automated computational analysis of measurements taken by one or more airborne spectroradiometers. Proper management and timeliness of the large amount of GIS information are of paramount concern in agriculture; information on stresses and changes in crops is especially perishable and important to farmers. The need for timeliness and management of information is satisfied by computing hardware and software capable of (1) rapid georectification and other processing of RS data, (2) packaging the output data in the form of GIS plots, and (3) making the data available to farmers and other subscribers by password-protected Internet access. It is a goal of this development program to make RS data available no later than the day after an aerial survey. In addition, data from prior surveys are kept in the database, so farmers can, for example, use current and prior data to analyze changes.
An automated approach to the design of decision tree classifiers
NASA Technical Reports Server (NTRS)
Argentiero, P.; Chin, R.; Beaudet, P.
1982-01-01
An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
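The by-product mentioned above reduces to simple arithmetic: assuming statistically independent node decision rules, the global probability of correct classification is a prior-weighted sum, over classes, of the product of the per-node accuracies along each class's root-to-leaf path. A worked sketch with made-up numbers:

```python
# Global probability of correct classification for a decision tree whose node
# decision rules are assumed statistically independent.
import numpy as np

def global_p_correct(path_node_accuracies, class_priors):
    """path_node_accuracies: per-class list of per-node P(correct) along its path."""
    per_class = [np.prod(accs) for accs in path_node_accuracies]
    return float(np.dot(class_priors, per_class))

# Hypothetical 3-class tree: class 0 decided at the root,
# classes 1 and 2 decided one level deeper.
p = global_p_correct([[0.95], [0.95, 0.90], [0.95, 0.88]], [0.5, 0.3, 0.2])
# p == 0.5*0.95 + 0.3*(0.95*0.90) + 0.2*(0.95*0.88) ~= 0.899
```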
NASA Technical Reports Server (NTRS)
Srivastava, Sandanand; Dwivedi, Suren N.; Soon, Toh Teck; Bandi, Reddy; Banerjee, Soumen; Hughes, Cecilia
1989-01-01
The installation of robots and their use for assembly in space will create an exciting and promising future for the U.S. Space Program. The concept of assembly in space is very complicated and error prone, and it is not possible unless the various parts and modules are suitably designed for automation. Certain guidelines are developed for part design and for easy precision assembly. Major design problems associated with automated assembly are considered, and solutions to these problems are evaluated in the guidelines format. Methods for gripping and for part feeding are developed with regard to the absence of gravity in space. Guidelines for part orientation, adjustments, compliances and various assembly constructions are discussed. Design modifications of various fasteners and fastening methods are also investigated.
Attenello, Frank J; Lee, Brian; Yu, Cheng; Liu, Charles Y; Apuzzo, Michael L J
2014-01-01
A central concept of scientific advancement in the medical and surgical fields is the incorporation of successful emerging ideas and technologies throughout the scope of human endeavors. The field of automation and robotics is a pivotal representation of this concept. Arising in the mythology of Homer, the concept of automation and robotics grew exponentially over the millennia to provide the substrate for a paradigm shift in the current and future practice of neurosurgery. We trace the growth of this field from the seminal concepts of Homer and Aristotle to early incorporation into neurosurgical practice. Resulting changes provide drastic and welcome advances in areas of visualization, haptics, acoustics, dexterity, tremor reduction, motion scaling, and surgical precision. Published by Elsevier Inc.
15 CFR 200.103 - Consulting and advisory services.
Code of Federal Regulations, 2013 CFR
2013-01-01
...., details of design and construction, operational aspects, unusual or extreme conditions, methods of statistical control of the measurement process, automated acquisition of laboratory data, and data reduction... group seminars on the precision measurement of specific types of physical quantities, offering the...
15 CFR 200.103 - Consulting and advisory services.
Code of Federal Regulations, 2011 CFR
2011-01-01
...., details of design and construction, operational aspects, unusual or extreme conditions, methods of statistical control of the measurement process, automated acquisition of laboratory data, and data reduction... group seminars on the precision measurement of specific types of physical quantities, offering the...
NASA Astrophysics Data System (ADS)
Lary, D. J.
2013-12-01
A BigData case study is described in which multiple datasets from several satellites, high-resolution global meteorological data, social media, and in-situ observations are combined using machine learning on a distributed cluster with an automated workflow. The global particulate dataset is relevant to global public health studies and would not be possible to produce without the use of multiple big datasets, in-situ data, and machine learning. To greatly reduce development time and enhance functionality, a high-level language capable of parallel processing (Matlab) was used. Key considerations for the system are high-speed access due to the large data volume, persistence of the large data volumes, and a precise process-time scheduling capability.
NASA Astrophysics Data System (ADS)
Lattin, Frank G.; Paul, Donald G.
1996-11-01
A sorbent-based gas chromatographic method provides continuous quantitative measurement of phosgene, hydrogen cyanide, and cyanogen chloride in ambient air. These compounds are subject to workplace exposure limits as well as regulation under terms of the Chemical Arms Treaty and Title III of the 1990 Clean Air Act amendments. The method was developed for on-site use in a mobile laboratory during remediation operations. Incorporated into the method are automated multi-level calibrations at time-weighted-average concentrations, or lower. Gaseous standards are prepared in fused-silica-lined air sampling canisters, then transferred to the analytical system through dynamic spiking. Precision and accuracy studies performed to validate the method are described. Also described are system deactivation and passivation techniques critical to optimum method performance.
NASA Astrophysics Data System (ADS)
Song, B.; Antoun, B. R.; Boston, M.
2012-05-01
We modified the design originally developed by Kuokkala's group to develop an automated high-temperature Kolsky compression bar for characterizing the high-rate properties of 304L stainless steel at elevated temperatures. Additional features have been implemented in this high-temperature Kolsky compression bar for recrystallization investigation. The new features ensure a single loading on the specimen and precise time and temperature control for quenching the specimen after dynamic loading. Dynamic compressive stress-strain curves of 304L stainless steel were obtained at 21, 204, 427, 649, and 871 °C (70, 400, 800, 1200, and 1600 °F) at the same constant strain rate of 332 s-1. Specimens subjected to specific time and temperature control for quenching after a single dynamic loading were preserved for investigating microstructure recrystallization.
Robotics in space-age manufacturing
NASA Technical Reports Server (NTRS)
Jones, Chip
1991-01-01
Robotics technologies are developed to improve manufacturing of space hardware. The following applications of robotics are covered: (1) welding for the space shuttle and space station Freedom programs; (2) manipulation of high-pressure water for shuttle solid rocket booster refurbishment; (3) automating the application of insulation materials; (4) precision application of sealants; and (5) automation of inspection procedures. Commercial robots are used for these development programs, but they are teamed with advanced sensors, process controls, and computer simulation to form highly productive manufacturing systems. Many of the technologies are also being actively pursued in private sector manufacturing operations.
Automated detection of lung nodules with three-dimensional convolutional neural networks
NASA Astrophysics Data System (ADS)
Pérez, Gustavo; Arbeláez, Pablo
2017-11-01
Lung cancer is the cancer type with the highest mortality rate worldwide. It has been shown that early detection with computed tomography (CT) scans can reduce deaths caused by this disease. Manual detection of cancer nodules is costly and time-consuming. We present a general framework for the detection of nodules in lung CT images. Our method consists of pre-processing of a patient's CT with filtering and lung extraction from the entire volume using a previously calculated mask for each patient. From the extracted lungs, we perform a candidate generation stage using morphological operations, followed by the training of a three-dimensional convolutional neural network for feature representation and classification of the extracted candidates for false positive reduction. We perform experiments on the publicly available LIDC-IDRI dataset. Our candidate extraction approach is effective, producing precise candidates with a recall of 99.6%. In addition, the false positive reduction stage successfully classifies candidates and increases precision by a factor of 7.000.
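The candidate-generation stage can be sketched as thresholding plus morphological clean-up and connected-component analysis, as below; the threshold and size values are assumed for illustration, and the 3D CNN classifier that follows is not reproduced:

```python
# Illustrative nodule-candidate generation on a CT volume (assumed parameters).
import numpy as np
from scipy import ndimage

def nodule_candidates(ct, lung_mask, hu_min=-400, min_vox=30):
    """ct: 3D volume in Hounsfield units; lung_mask: boolean lung segmentation.
    Returns centroids of soft-tissue components large enough to be candidates."""
    mask = (ct >= hu_min) & lung_mask                   # soft tissue inside the lungs
    mask = ndimage.binary_opening(mask, iterations=1)   # remove speckle
    labels, n = ndimage.label(mask)
    centers = []
    for i in range(1, n + 1):                           # per-component loop; fine for a sketch
        component = labels == i
        if component.sum() >= min_vox:                  # discard tiny components
            centers.append(ndimage.center_of_mass(component))
    return centers   # cube patches around these centroids feed the 3D CNN
```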
NASA Astrophysics Data System (ADS)
Stanley, Kieran M.; Grant, Aoife; O'Doherty, Simon; Young, Dickon; Manning, Alistair J.; Stavert, Ann R.; Spain, T. Gerard; Salameh, Peter K.; Harth, Christina M.; Simmonds, Peter G.; Sturges, William T.; Oram, David E.; Derwent, Richard G.
2018-03-01
A network of three tall tower measurement stations was set up in 2012 across the United Kingdom to expand measurements made at the long-term background northern hemispheric site, Mace Head, Ireland. Reliable and precise in situ greenhouse gas (GHG) analysis systems were developed and deployed at three sites in the UK with automated instrumentation measuring a suite of GHGs. The UK Deriving Emissions linked to Climate Change (UK DECC) network uses tall (165-230 m) open-lattice telecommunications towers, which provide a convenient platform for boundary layer trace gas sampling. In this paper we describe the automated measurement system and first results from the UK DECC network for CO2, CH4, N2O, SF6, CO and H2. CO2 and CH4 are measured at all of the UK DECC sites by cavity ring-down spectroscopy (CRDS), with multiple inlet heights at two of the three tall tower sites to assess boundary layer stratification. The short-term precision (1σ on 1 min means) of CRDS measurements at background mole fractions from January 2012 to September 2015 is < 0.05 µmol mol⁻¹ for CO2 and < 0.3 nmol mol⁻¹ for CH4. Repeatability of standard injections (1σ) is < 0.03 µmol mol⁻¹ for CO2 and < 0.3 nmol mol⁻¹ for CH4 over the same time period. N2O and SF6 are measured at three of the sites, and CO and H2 measurements are made at two of the sites, from a single inlet height using gas chromatography (GC) with an electron capture detector (ECD), flame ionisation detector (FID) or reduction gas analyser (RGA). Repeatability of individual injections (1σ) on GC and RGA instruments between January 2012 and September 2015 for CH4, N2O, SF6, CO and H2 measurements was < 2.8 nmol mol⁻¹, < 0.4 nmol mol⁻¹, < 0.07 pmol mol⁻¹, < 2 nmol mol⁻¹ and < 3 nmol mol⁻¹, respectively. Instrumentation in the network is fully automated and includes sensors for measuring a variety of instrumental parameters such as flows, pressures, and sampling temperatures. Automated alerts are generated and emailed to site operators when instrumental parameters are not within defined set ranges. Automated instrument shutdowns occur for critical errors such as carrier gas flow rate deviations. Results from the network give good spatial and temporal coverage of atmospheric mixing ratios within the UK since early 2012. Results also show that all measured GHGs are increasing in mole fraction over the selected reporting period and, except for SF6, exhibit a seasonal trend. CO2 and CH4 also show strong diurnal cycles, with night-time maxima and daytime minima in mole fractions.
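A minimal sketch of the automated range check behind the alerts and shutdowns described above; the parameter names, limits, and readings are illustrative assumptions (the real system emails site operators):

```python
# Assumed instrument parameters and acceptable ranges (illustrative only).
LIMITS = {
    "cavity_pressure_torr": (139.9, 140.1),
    "sample_flow_sccm": (80.0, 120.0),
    "carrier_gas_flow_sccm": (25.0, 35.0),
}
CRITICAL = {"carrier_gas_flow_sccm"}  # deviations here force a shutdown

def check_parameters(readings):
    """Return (alert_messages, shutdown_required) for one set of readings."""
    alerts, shutdown = [], False
    for name, value in readings.items():
        lo, hi = LIMITS[name]
        if not lo <= value <= hi:
            alerts.append(f"{name}={value} outside [{lo}, {hi}]")
            shutdown |= name in CRITICAL
    return alerts, shutdown

alerts, shutdown = check_parameters(
    {"cavity_pressure_torr": 140.0,
     "sample_flow_sccm": 95.0,
     "carrier_gas_flow_sccm": 12.0})
print(alerts, shutdown)  # the carrier-gas deviation triggers a shutdown
```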
Ramos, Inês I; Gregório, Bruno J R; Barreiros, Luísa; Magalhães, Luís M; Tóth, Ildikó V; Reis, Salette; Lima, José L F C; Segundo, Marcela A
2016-04-01
An automated oxygen radical absorbance capacity (ORAC) method based on programmable flow injection analysis was developed for the assessment of antioxidant reactivity. The method relies on real-time spectrophotometric monitoring (540 nm) of pyrogallol red (PGR) bleaching mediated by peroxyl radicals in the presence of antioxidant compounds within the first minute of reaction, providing information about their initial reactivity against this type of radical. The ORAC-PGR assay under programmable flow format affords strict control of reaction conditions, namely reagent mixing, temperature and reaction timing, which are critical parameters for in situ generation of peroxyl radicals from 2,2'-azobis(2-amidinopropane) dihydrochloride (AAPH). The influence of reagent concentrations and programmable flow conditions on reaction development was studied, with application of 37.5 µM PGR and 125 mM AAPH in the flow cell, guaranteeing first-order kinetics towards peroxyl radicals and pseudo-zero-order kinetics towards PGR. The peroxyl-scavenging reactivity of antioxidants, bioactive compounds and phenolic-rich beverages was estimated using the proposed methodology. Recovery assays using synthetic saliva provided values of 90 ± 5% for reduced glutathione. The detection limit, calculated using the standard antioxidant compound Trolox, was 8 μM. RSD values were <3.4% and <4.9% for intra- and inter-assay precision, respectively. Compared to previous batch automated ORAC assays, the developed system also achieved a high sampling frequency (29 h⁻¹), low operating costs and low generation of waste. Copyright © 2015 Elsevier B.V. All rights reserved.
Marinova, Mariela; Artusi, Carlo; Brugnolo, Laura; Antonelli, Giorgia; Zaninotto, Martina; Plebani, Mario
2013-11-01
Although, due to its high specificity and sensitivity, LC-MS/MS is an efficient technique for the routine determination of immunosuppressants in whole blood, it involves time-consuming manual sample preparation. The aim of the present study was therefore to develop an automated sample-preparation protocol for the quantification of sirolimus, everolimus and tacrolimus by LC-MS/MS using a liquid handling platform. Six-level commercially available blood calibrators were used for assay development, while four quality control materials and three blood samples from patients under immunosuppressant treatment were employed for the evaluation of imprecision. Barcode reading, sample re-suspension, transfer of whole blood samples into 96-well plates, addition of internal standard solution, mixing, and protein precipitation were performed with a liquid handling platform. After plate filtration, the deproteinised supernatants were submitted to on-line SPE. The only manual steps in the entire process were de-capping of the tubes and transfer of the well plates to the HPLC autosampler. Calibration curves were linear throughout the selected ranges. The imprecision and accuracy data for all analytes were highly satisfactory. The agreement between the results obtained with manual and with automated sample preparation was optimal (n=390, r=0.96). In daily routine (100 patient samples) the typical overall turnaround time was less than 6 h. Our findings indicate that the proposed analytical system is suitable for routine analysis, since it is straightforward and precise. Furthermore, it incurs less manual workload and less risk of error in the quantification of whole blood immunosuppressant concentrations than conventional methods. © 2013.
Johnson, Kenneth L; Mason, Christopher J; Muddiman, David C; Eckel, Jeanette E
2004-09-01
This study quantifies the experimental uncertainty for LC retention time, mass measurement precision, and ion abundance obtained from replicate nLC-dual ESI-FT-ICR analyses of the low molecular weight fraction of serum. We used ultrafiltration to enrich the < 10-kDa fraction of components from the high-abundance proteins in a pooled serum sample derived from ovarian cancer patients. The THRASH algorithm for isotope cluster detection was applied to five replicate nLC-dual ESI-FT-ICR chromatograms. A simple two-level grouping algorithm was applied to the more than 7000 isotope clusters found in each replicate and identified 497 molecular species that appeared in at least four of the replicates. In addition, a representative set of 231 isotope clusters, corresponding to 188 unique molecular species, was manually interpreted to verify the automated algorithm and to set its tolerances. For nLC retention time reproducibility, 95% of the 497 species had a 95% confidence interval of the mean of ±0.9 min or less without the use of chromatographic alignment procedures. Furthermore, 95% of the 497 species had a mass measurement precision of ≤3.2 and ≤6.3 ppm for internally and externally calibrated spectra, respectively. Moreover, 95% of replicate ion abundance measurements, covering an ion abundance range of approximately 3 orders of magnitude, had a coefficient of variation of less than 62% without using any normalization functions. The variability of ion abundance was independent of LC retention time, mass, and ion abundance quartile. These measures of analytical reproducibility establish a statistical rationale for differentiating healthy and disease patient populations for the elucidation of biomarkers in the low molecular weight fraction of serum. Copyright 2004 American Chemical Society
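The replicate statistics reported above (95% confidence interval of mean retention time, coefficient of variation of ion abundance) reduce to a few lines of NumPy; a sketch assuming a species-by-replicate array layout, with synthetic data standing in for the 497 species:

```python
import numpy as np
from scipy import stats

def replicate_stats(retention_min, abundance):
    """Per-species 95% CI half-width of mean RT and CV (%) of abundance."""
    n = retention_min.shape[1]
    # Half-width of the 95% CI of the mean, using the t distribution.
    sem = retention_min.std(axis=1, ddof=1) / np.sqrt(n)
    ci_half = stats.t.ppf(0.975, df=n - 1) * sem
    # Coefficient of variation (%) of ion abundance per species.
    cv = 100.0 * abundance.std(axis=1, ddof=1) / abundance.mean(axis=1)
    return ci_half, cv

rt = np.random.normal(42.0, 0.3, size=(497, 5))      # minutes, synthetic
ab = np.random.lognormal(10.0, 0.5, size=(497, 5))   # arbitrary units
ci, cv = replicate_stats(rt, ab)
print(np.percentile(ci, 95), np.percentile(cv, 95))  # cf. the 95% figures above
```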
Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.
Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter
2017-09-01
Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, the neutral atmospheric gas and the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, automatic data analysis software, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak-finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields results with an accuracy of isotope ratios up to 100 ppm for a signal-to-noise ratio (SNR) of 10⁴. We show that the accuracy of isotope ratios is in fact proportional to SNR⁻¹. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width Ts, scaling as Ts^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
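A rough sketch of the two core steps named above, peak finding followed by numerical integration, using generic SciPy routines rather than the authors' published algorithm; the noise estimate, SNR cutoff, and integration window are illustrative assumptions:

```python
import numpy as np
from scipy.signal import find_peaks

def integrate_peaks(time_ns, intensity, snr_min=10.0, half_window=30):
    """Find peaks above an SNR threshold and integrate each numerically."""
    # Median absolute deviation as a robust proxy for the noise level.
    noise = np.median(np.abs(intensity - np.median(intensity)))
    peaks, _ = find_peaks(intensity, height=snr_min * noise, distance=half_window)
    areas = []
    for p in peaks:
        lo, hi = max(p - half_window, 0), min(p + half_window, len(intensity) - 1)
        areas.append(np.trapz(intensity[lo:hi + 1], time_ns[lo:hi + 1]))
    return peaks, np.asarray(areas)

# Synthetic spectrum: a major isotope peak and a 1% minor isotope peak.
t = np.linspace(0, 2000, 20001)
y = 1e3 * np.exp(-0.5 * ((t - 800) / 2) ** 2) + 10 * np.exp(-0.5 * ((t - 950) / 2) ** 2)
idx, areas = integrate_peaks(t, y + np.random.normal(0, 0.5, t.size))
print(areas[1] / areas[0] if len(areas) > 1 else "n/a")  # ~0.01 isotope ratio
```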
Kimoto, Hideshi; Nozaki, Ken; Kudo, Setsuko; Kato, Ken; Negishi, Akira; Kayanne, Hajime
2002-03-01
A fully automated, continuous-flow-through type analyzer was developed to observe rapid changes in the concentration of total inorganic carbon (CT) in coastal zones. Seawater and an H3PO4 solution were fed into the analyzer's mixing coil by two high-precision valveless piston pumps. The CO2 was stripped from the seawater and moved into a carrier gas using a newly developed continuous-flow-through CO2 extractor. A mass flow controller was used to assure a precise flow rate of the carrier gas. The CO2 concentration was then determined with a nondispersive infrared gas analyzer. This analyzer achieved a time resolution as good as 1 min. In field experiments on a shallow reef flat of Shiraho (Ishigaki Island, Southwest Japan), the analyzer detected short-term yet extreme variations in CT which manual sampling missed. Analytical values obtained by the analyzer on the boat were compared with those determined by potentiometric titration with a closed cell in a laboratory: CT(flow-through) = 0.980 × CT(titration) + 38.8, with r² = 0.995 (n = 34; September 1998).
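The method-comparison fit quoted above is an ordinary least-squares regression of paired CT values; a minimal sketch with synthetic data standing in for the 34 field samples:

```python
import numpy as np

rng = np.random.default_rng(0)
ct_titration = rng.uniform(1800, 2100, size=34)                 # µmol/kg, synthetic
ct_flow = 0.980 * ct_titration + 38.8 + rng.normal(0, 3, 34)    # plus measurement noise

slope, intercept = np.polyfit(ct_titration, ct_flow, 1)         # least-squares line
r2 = np.corrcoef(ct_titration, ct_flow)[0, 1] ** 2
print(f"CT(flow-through) = {slope:.3f} x CT(titration) + {intercept:.1f}, r2 = {r2:.3f}")
```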
Automated survey of pavement distress based on 2D and 3D laser images.
DOT National Transportation Integrated Search
2011-11-01
Despite numerous efforts in recent decades, currently most information on pavement surface distresses cannot be obtained automatically, at high-speed, and at acceptable precision and bias levels. This research provided seed funding to produce a funct...
A Concept for Airborne Precision Spacing for Dependent Parallel Approaches
NASA Technical Reports Server (NTRS)
Barmore, Bryan E.; Baxley, Brian T.; Abbott, Terence S.; Capron, William R.; Smith, Colin L.; Shay, Richard F.; Hubbs, Clay
2012-01-01
The Airborne Precision Spacing concept of operations has been previously developed to support the precise delivery of aircraft landing successively on the same runway. The high-precision, consistent delivery of inter-aircraft spacing allows for increased runway throughput and the use of energy-efficient arrival routes such as Continuous Descent Arrivals and Optimized Profile Descents. This paper describes an extension of the Airborne Precision Spacing concept to enable dependent parallel approach operations, where the spacing aircraft must manage their in-trail spacing from a leading aircraft on approach to the same runway as well as spacing from an aircraft on approach to a parallel runway. Functionality for supporting automation is discussed, as are procedures for pilots and controllers. An analysis is performed to identify the required information, and a new ADS-B report is proposed to support these information needs. Finally, several scenarios are described in detail.
A Novel ImageJ Macro for Automated Cell Death Quantitation in the Retina
Maidana, Daniel E.; Tsoka, Pavlina; Tian, Bo; Dib, Bernard; Matsumoto, Hidetaka; Kataoka, Keiko; Lin, Haijiang; Miller, Joan W.; Vavvas, Demetrios G.
2015-01-01
Purpose: TUNEL assay is widely used to evaluate cell death. Quantification of TUNEL-positive (TUNEL+) cells in tissue sections is usually performed manually, ideally by two masked observers. This process is time consuming, prone to measurement errors, and not entirely reproducible. In this paper, we describe an automated quantification approach to address these difficulties. Methods: We developed an ImageJ macro to quantitate cell death by TUNEL assay in retinal cross-section images. The script was coded using the IJ1 programming language. To validate this tool, we selected a dataset of TUNEL assay digital images, calculated layer area and cell count manually (done by two observers), and compared measurements between observers and macro results. Results: The automated macro segmented the outer nuclear layer (ONL) and inner nuclear layer (INL) successfully. Automated TUNEL+ cell counts were in between the counts of inexperienced and experienced observers. The intraobserver coefficient of variation (COV) ranged from 13.09% to 25.20%. The COV between both observers was 51.11 ± 25.83% for the ONL and 56.07 ± 24.03% for the INL. Comparing observers' results with macro results, the COV was 23.37 ± 15.97% for the ONL and 23.44 ± 18.56% for the INL. Conclusions: We developed and validated an ImageJ macro that can be used as an accurate and precise quantitative tool for retina researchers to achieve repeatable, unbiased, fast, and accurate cell death quantitation. We believe that this standardized measurement tool could be advantageous for comparing results across different research groups, as it is freely available as open source. PMID:26469755
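The published tool is an IJ1 ImageJ macro; for illustration only, the core counting step (segment the TUNEL channel, restrict to a layer mask, count objects) might look like this in Python with scikit-image, with the threshold and size filter as assumed parameters:

```python
import numpy as np
from skimage.filters import threshold_otsu, gaussian
from skimage.measure import label
from skimage.morphology import remove_small_objects

def count_tunel_positive(tunel_channel, layer_mask, min_area=20):
    """Count TUNEL+ objects inside a layer mask (e.g. ONL or INL)."""
    smoothed = gaussian(tunel_channel, sigma=1.0)          # suppress pixel noise
    binary = smoothed > threshold_otsu(smoothed)           # global threshold
    binary = remove_small_objects(binary & layer_mask, min_size=min_area)
    return int(label(binary).max())                        # number of labelled objects

# layer_mask would come from ONL/INL segmentation; here a dummy example:
img = np.random.rand(512, 512)
mask = np.zeros((512, 512), dtype=bool)
mask[100:200] = True
print(count_tunel_positive(img, mask))
```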
Development of a UAV system for VNIR-TIR acquisitions in precision agriculture
NASA Astrophysics Data System (ADS)
Misopolinos, L.; Zalidis, Ch.; Liakopoulos, V.; Stavridou, D.; Katsigiannis, P.; Alexandridis, T. K.; Zalidis, G.
2015-06-01
Adoption of precision agriculture techniques requires the development of specialized tools that provide spatially distributed information. Both flying platforms and airborne sensors are continuously evolving to cover the needs of plant and soil sensing at affordable costs. Due to payload restrictions, flying platforms are usually limited to carrying a single sensor on board. The aim of this work is to present the development of a vertical take-off and landing autonomous unmanned aerial vehicle (VTOL UAV) system for the simultaneous acquisition of high-resolution vertical images at visible, near-infrared (VNIR) and thermal infrared (TIR) wavelengths. A system was developed that can trigger two cameras simultaneously in a fully automated process with no pilot intervention. A commercial unmanned hexacopter UAV platform was optimized to increase reliability, ease of operation and automation. The designed system's communication platform is based on a reduced instruction set computing (RISC) processor running Linux OS with custom-developed drivers, implemented efficiently while keeping cost and weight to a minimum. Special software was also developed for automated image capture, data processing and on-board data and metadata storage. The system was tested over a kiwifruit field in northern Greece at flying heights of 70 and 100 m above the ground. The acquired images were mosaicked and geo-corrected. Images from both flying heights were of good quality and revealed unprecedented detail within the field. The normalized difference vegetation index (NDVI) was calculated along with the thermal image in order to provide information on the accurate location of stressors and other parameters related to crop productivity. Compared to other available sources of data, this system can provide low-cost, high-resolution and easily repeatable information to cover the requirements of precision agriculture.
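The NDVI mentioned above is computed per pixel from the red and near-infrared bands; a minimal sketch, with the band arrays as placeholders for the mosaicked imagery:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps guards against zero division

nir_band = np.random.randint(0, 255, (100, 100))  # stand-in for the VNIR mosaic
red_band = np.random.randint(0, 255, (100, 100))
index = ndvi(nir_band, red_band)
print(index.min(), index.max())  # values in [-1, 1]; stressed canopy trends low
```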
Preparation of Partial-Thickness Burn Wounds in Rodents Using a New Experimental Burning Device.
Sakamoto, Michiharu; Morimoto, Naoki; Ogino, Shuichi; Jinno, Chizuru; Kawaguchi, Atsushi; Kawai, Katsuya; Suzuki, Shigehiko
2016-06-01
The manual application of hot water or hot metal to an animal's skin surface is often used to prepare burn wound models. However, manual burn creation is subject to human variability. We developed a new device that can control the temperature, time, and pressure of contact to produce precise and reproducible animal burn wounds, and we investigated the conditions required to prepare various burn wounds using our new device. We prepared burn wounds on F344 rats using three contact times (2, 4, and 10 seconds) with a stamp heated to 80°C. We observed the wound-healing process macroscopically and histologically and evaluated the burn depth using a laser speckle contrast-imaging device, which assessed the blood flow of the wound. The changes in the burned area over time, tissue perfusion of the burn wounds, histological evaluation of the burn depth by hematoxylin-eosin and azocarmine and aniline blue staining, and the epithelialization rate (the ratio of the epithelialized area to the wound length) were evaluated on histological sections. Results indicated that the burn wounds prepared with contact times of 2, 4, and 10 seconds corresponded to superficial dermal burns, deep dermal burns, and full-thickness burns, respectively. We demonstrated that partial- and full-thickness burn wounds can be precisely and reproducibly created with our new automated burning device.
NASA Astrophysics Data System (ADS)
Burba, G. G.; Avenson, T.; Burkart, A.; Gamon, J. A.; Guan, K.; Julitta, T.; Pastorello, G.; Sakowska, K.
2017-12-01
Many hundreds of flux towers are presently operational as standalone projects and as parts of regional networks. However, the vast majority of these towers do not allow straightforward coupling with remote sensing (drone, aircraft, satellite, etc.) data, and even fewer have optical sensors for validation of remote sensing products and upscaling from field to regional levels. In 2016-2017, new tools to collect, process, and share time-synchronized flux data from multiple towers were developed and deployed globally. Originally designed to automate site and data management, and to streamline flux data analysis, these tools allow relatively easy matching of tower data with remote sensing data:
- A GPS-driven PTP time protocol synchronizes instrumentation within the station, different stations with each other, and all of these with remote sensing data, to precisely align remote sensing and flux data in time
- Footprint size and coordinates computed and stored with flux data help correctly align tower flux footprints with drone, aircraft or satellite motion, to precisely align optical and flux data in space
- A full snapshot of the remote sensing pixel can then be constructed, including leaf-level, ground optical sensor, and flux tower measurements from the same footprint area, closely coupled with the remote sensing measurements, to help interpret remote sensing data, validate models, and improve upscaling
Additionally, current flux towers can be augmented with advanced ground optical sensors and can use standard routines to deliver continuous products (e.g. SIF, PRI, NDVI, etc.) based on automated field spectrometers (e.g., FloX and RoX) and other optical systems. Several dozen new towers already operational globally can readily support the proposed workflow. Over 500 active traditional flux towers can be updated to synchronize their data with remote sensing measurements. This presentation will show how the new tools are used by major networks, and describe how this approach can be utilized for matching remote sensing and tower data to aid in ground truthing, improve scientific interactions, and promote joint grant writing and other forms of collaboration between the flux and remote sensing communities.
Automated SEM Modal Analysis Applied to the Diogenites
NASA Technical Reports Server (NTRS)
Bowman, L. E.; Spilde, M. N.; Papike, James J.
1996-01-01
Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
NASA Astrophysics Data System (ADS)
Ehrhart, Matthias; Lienhart, Werner
2017-09-01
The importance of automated prism tracking is increasingly driven by the rising automation of total station measurements in machine control, monitoring and one-person operation. In this article we summarize and explain the different techniques that are used to coarsely search for a prism, to precisely aim at a prism, and to identify whether the correct prism is being tracked. Along with the state-of-the-art review, we discuss and experimentally evaluate possible improvements based on the image data of an additional wide-angle camera, which is available on many total stations today. In cases in which the total station's fine-aiming module loses the prism, the tracked object may still be visible to the wide-angle camera because of its larger field of view. The theodolite angles towards the target can then be derived from its image coordinates, which facilitates a fast reacquisition of the prism. In experimental measurements we demonstrate that our image-based approach for the coarse target search is 4 to 10 times faster than conventional approaches.
High-throughput behavioral screening method for detecting auditory response defects in zebrafish.
Bang, Pascal I; Yelick, Pamela C; Malicki, Jarema J; Sewell, William F
2002-08-30
We have developed an automated, high-throughput behavioral screening method for detecting hearing defects in zebrafish. Our assay monitors a rapid escape reflex in response to a loud sound. With this approach, 36 adult zebrafish, restrained in visually isolated compartments, can be simultaneously assessed for responsiveness to near-field 400 Hz sinusoidal tone bursts. Automated, objective determinations of responses are achieved with a computer program that obtains images at precise times relative to the acoustic stimulus. Images taken with a CCD video camera before and after stimulus presentation are subtracted to reveal a response to the sound. Up to 108 fish can be screened per hour. Over 6500 fish were tested to validate the reliability of the assay. We found that 1% of these animals displayed hearing deficits. The phenotypes of non-responders were further assessed with radiological analysis for defects in the gross morphology of the auditory system. Nearly all of those showed abnormalities in conductive elements of the auditory system: the swim bladder or Weberian ossicles. Copyright 2002 Elsevier Science B.V.
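The image-subtraction step described above amounts to frame differencing over a grid of compartments; a hedged sketch, with the grid layout, threshold, and `detect_responders` helper as assumptions rather than the authors' values:

```python
import numpy as np

def detect_responders(pre, post, grid=(6, 6), threshold=50.0):
    """Flag compartments whose mean absolute pixel change exceeds a threshold."""
    diff = np.abs(post.astype(np.float64) - pre.astype(np.float64))
    rows = np.array_split(np.arange(diff.shape[0]), grid[0])
    cols = np.array_split(np.arange(diff.shape[1]), grid[1])
    return [(i, j)
            for i, r in enumerate(rows)
            for j, c in enumerate(cols)
            if diff[np.ix_(r, c)].mean() > threshold]

# 6 x 6 grid matches the 36 isolated compartments; synthetic frames here.
pre = np.random.randint(0, 255, (480, 640)).astype(np.uint8)
post = pre.copy()
post[100:180, 220:300] = 255   # one fish moved after the tone burst
print(detect_responders(pre, post))  # e.g. [(1, 2)] for the compartment that changed
```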
Solti, Imre; Cooke, Colin R; Xia, Fei; Wurfel, Mark M
2009-11-01
This paper compares the performance of keyword and machine learning-based chest x-ray report classification for Acute Lung Injury (ALI). ALI mortality is approximately 30 percent. High mortality is, in part, a consequence of delayed manual chest x-ray classification. An automated system could reduce the time to recognize ALI and lead to reductions in mortality. For our study, 96 and 857 chest x-ray reports in two corpora were labeled by domain experts for ALI. We developed a keyword and a Maximum Entropy-based classification system. Word unigram and character n-grams provided the features for the machine learning system. The Maximum Entropy algorithm with character 6-gram achieved the highest performance (Recall=0.91, Precision=0.90 and F-measure=0.91) on the 857-report corpus. This study has shown that for the classification of ALI chest x-ray reports, the machine learning approach is superior to the keyword based system and achieves comparable results to highest performing physician annotators.
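Maximum Entropy classification over character n-grams is equivalent to regularized multinomial logistic regression; a sketch of the best-performing configuration (character 6-grams) using scikit-learn as a stand-in for the authors' toolkit, with placeholder reports and labels:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder training data; real inputs would be the labeled report corpora.
reports = ["diffuse bilateral airspace opacities consistent with edema",
           "clear lungs, no effusion or consolidation"]
labels = [1, 0]  # 1 = consistent with ALI, 0 = not

clf = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(6, 6)),  # character 6-grams
    LogisticRegression(max_iter=1000),                      # MaxEnt equivalent
)
clf.fit(reports, labels)
print(clf.predict(["bilateral opacities with normal heart size"]))
```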
DeChant, Chad; Wiesner-Hanks, Tyr; Chen, Siyuan; Stewart, Ethan L; Yosinski, Jason; Gore, Michael A; Nelson, Rebecca J; Lipson, Hod
2017-11-01
Northern leaf blight (NLB) can cause severe yield loss in maize; however, scouting large areas to accurately diagnose the disease is time consuming and difficult. We demonstrate a system capable of automatically identifying NLB lesions in field-acquired images of maize plants with high reliability. This approach uses a computational pipeline of convolutional neural networks (CNNs) that addresses the challenges of limited data and the myriad irregularities that appear in images of field-grown plants. Several CNNs were trained to classify small regions of images as containing NLB lesions or not; their predictions were combined into separate heat maps, then fed into a final CNN trained to classify the entire image as containing diseased plants or not. The system achieved 96.7% accuracy on test set images not used in training. We suggest that such systems mounted on aerial- or ground-based vehicles can help in automated high-throughput plant phenotyping, precision breeding for disease resistance, and reduced pesticide use through targeted application across a variety of plant and disease categories.
An Automated Summarization Assessment Algorithm for Identifying Summarizing Strategies
Abdi, Asad; Idris, Norisma; Alguliyev, Rasim M.; Aliguliyev, Ramiz M.
2016-01-01
Background: Summarization is a process to select important information from a source text. Summarizing strategies are the core cognitive processes in summarization activity. Since summarization can be important as a tool to improve comprehension, it has attracted the interest of teachers for teaching summary writing through direct instruction. To do this, they need to review and assess the students' summaries, and these tasks are very time-consuming. Thus, computer-assisted assessment can be used to help teachers conduct this task more effectively. Design/Results: This paper aims to propose an algorithm based on the combination of semantic relations between words and their syntactic composition to identify summarizing strategies employed by students in summary writing. An innovative aspect of our algorithm lies in its ability to identify summarizing strategies at the syntactic and semantic levels. The efficiency of the algorithm is measured in terms of Precision, Recall and F-measure. We then implemented the algorithm for the automated summarization assessment system that can be used to identify the summarizing strategies used by students in summary writing. PMID:26735139
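For reference, the three scores named above reduce to simple counts of true positives, false positives, and false negatives; a minimal helper, with illustrative counts:

```python
def precision_recall_f1(tp, fp, fn):
    """Standard Precision, Recall and F-measure from raw counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

print(precision_recall_f1(tp=85, fp=10, fn=15))  # counts are illustrative only
```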
Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd
2018-04-01
Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hard- and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.
Wong, Stephen; Hargreaves, Eric L; Baltuch, Gordon H; Jaggi, Jurg L; Danish, Shabbar F
2012-01-01
Microelectrode recording (MER) is necessary for precision localization of target structures such as the subthalamic nucleus during deep brain stimulation (DBS) surgery. Attempts to automate this process have produced quantitative temporal trends (feature activity vs. time) extracted from mobile MER data. Our goal was to evaluate computational methods of generating spatial profiles (feature activity vs. depth) from temporal trends that would decouple automated MER localization from the clinical procedure and enhance functional localization in DBS surgery. We evaluated two methods of interpolation (standard vs. kernel) that generated spatial profiles from temporal trends. We compared interpolated spatial profiles to true spatial profiles that were calculated with depth windows, using correlation coefficient analysis. Excellent approximation of true spatial profiles is achieved by interpolation. Kernel-interpolated spatial profiles produced superior correlation coefficient values at optimal kernel widths (r = 0.932-0.940) compared to standard interpolation (r = 0.891). The choice of kernel function and kernel width resulted in trade-offs in smoothing and resolution. Interpolation of feature activity to create spatial profiles from temporal trends is accurate and can standardize and facilitate MER functional localization of subcortical structures. The methods are computationally efficient, enhancing localization without imposing additional constraints on the MER clinical procedure during DBS surgery. Copyright © 2012 S. Karger AG, Basel.
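A sketch of the kernel interpolation contrasted with standard interpolation above: Gaussian-kernel (Nadaraya-Watson) smoothing of feature activity sampled at recording depths, compared against linear interpolation via a correlation coefficient, as in the study; the depths, activity values, and kernel width here are synthetic assumptions:

```python
import numpy as np

def kernel_profile(depths_mm, activity, grid_mm, width_mm=0.3):
    """Gaussian-kernel estimate of feature activity on a uniform depth grid."""
    w = np.exp(-0.5 * ((grid_mm[:, None] - depths_mm[None, :]) / width_mm) ** 2)
    return (w @ activity) / w.sum(axis=1)   # kernel-weighted average per depth

depths = np.sort(np.random.uniform(-10, 5, 200))   # electrode depths (mm), synthetic
activity = np.random.rand(200)                     # e.g. a normalized MER feature
grid = np.linspace(-10, 5, 151)

smooth = kernel_profile(depths, activity, grid)    # kernel interpolation
linear = np.interp(grid, depths, activity)         # "standard" interpolation
print(np.corrcoef(smooth, linear)[0, 1])           # compare profiles, as in the study
```

The kernel width trades smoothing against resolution, mirroring the trade-off the abstract describes.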
Subunit mass analysis for monitoring antibody oxidation.
Sokolowska, Izabela; Mo, Jingjie; Dong, Jia; Lewis, Michael J; Hu, Ping
2017-04-01
Methionine oxidation is a common posttranslational modification (PTM) of monoclonal antibodies (mAbs). Oxidation can reduce the in-vivo half-life, efficacy and stability of the product. Peptide mapping is commonly used to monitor the levels of oxidation, but it is a relatively time-consuming method. A high-throughput, automated subunit mass analysis method was developed to monitor antibody methionine oxidation. In this method, samples were treated with IdeS, EndoS and dithiothreitol to generate three individual IgG subunits (light chain, Fd' and single-chain Fc). These subunits were analyzed by reversed-phase ultra-performance liquid chromatography coupled with an online quadrupole time-of-flight mass spectrometer, and the levels of oxidation on each subunit were quantitated from the deconvoluted mass spectra using the UNIFI software. The oxidation results obtained by subunit mass analysis correlated well with the results obtained by peptide mapping. Method qualification demonstrated that this subunit method had excellent repeatability and intermediate precision. In addition, the UNIFI software used in this application allows automated data acquisition and processing, which makes this method suitable for high-throughput process monitoring and product characterization. Finally, subunit mass analysis revealed the different patterns of Fc methionine oxidation induced by chemical and photo stress, which makes it attractive for investigating the root cause of oxidation.
Lippi, Giuseppe; Ippolito, Luigi; Favaloro, Emmanuel J
2013-10-01
Automation in hemostasis testing is entering an exciting and unprecedented phase. This study was planned to assess the performance of the new preanalytical module on the Instrumentation Laboratory ACL TOP hemostasis testing system. The evaluation included interference studies to define reliable thresholds for rejecting samples with significant concentrations of interfering substances; within-run imprecision studies of plasma indices at four different interference levels for each index; comparison studies with reference measures of hemolysis index, bilirubin, and triglycerides on clinical chemistry analyzers; and calculation of turnaround time with and without automatic performance of the preanalytical check. The upper limits for sample rejection according to our interference studies were 3.6 g/L for hemoglobin, 13.6 mg/dL for bilirubin, and 1454 mg/dL for triglycerides. We found optimal precision for all indices (0.6% to 3.1% at clinically relevant thresholds) and highly significant correlations with reference measures on clinical chemistry analyzers (from 0.985 to 0.998). The limited increase in turnaround time (i.e., +3% and +5% with or without cap-piercing), coupled with no additional costs over the performance of normal coagulation assays, makes the automatic check of plasma indices on the ACL TOP a reliable and practical approach for improving testing quality and safeguarding patient safety.
Computer Technology for Industry
NASA Technical Reports Server (NTRS)
1979-01-01
In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC®), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.
Poma, Alessandro; Guerreiro, Antonio; Whitcombe, Michael J; Piletska, Elena V; Turner, Anthony P F; Piletsky, Sergey A
2013-06-13
Molecularly Imprinted Polymers (MIPs) are generic alternatives to antibodies in sensors, diagnostics and separations. To displace biomolecules without radical changes in infrastructure in device manufacture, MIPs should share their characteristics (solubility, size, specificity and affinity, localized binding domain) whilst maintaining the advantages of MIPs (low cost, short development time and high stability), hence the interest in MIP nanoparticles. Herein we report a reusable solid-phase template approach (fully compatible with automation) for the synthesis of MIP nanoparticles and their precise manufacture using a prototype automated UV photochemical reactor. Batches of nanoparticles (30-400 nm) with narrow size distributions imprinted with melamine (d = 60 nm, Kd = 6.3 × 10−8 m), vancomycin (d = 250 nm, Kd = 3.4 × 10−9 m), a peptide (d = 350 nm, Kd = 4.8 × 10−8 m) and proteins have been produced. Our instrument uses a column packed with glass beads bearing the template. Process parameters are under computer control, requiring minimal manual intervention. For the first time we demonstrate the reliable re-use of molecular templates in the synthesis of MIPs (≥ 30 batches of nanoMIPs without loss of performance). NanoMIPs are produced template-free and the solid phase acts both as template and affinity separation medium.
Advance innovations of an intelligent sprayer for nursery and fruit tree crops
USDA-ARS?s Scientific Manuscript database
Conventional spray application technology requires excessive amounts of pesticide use to achieve effective pest control in floral, nursery, and other specialty crop productions. This onerous challenge is now overcome by our newly developed automated variable-rate, air-assisted precision sprayer. Thi...
Integration and Evaluation of Automated Pavement Distress Data in INDOT’s Pavement Management System
DOT National Transportation Integrated Search
2017-05-01
This study was in two parts. The first part established and demonstrated a framework for pavement data integration. This is critical for fulfilling the QC/QA needs of INDOT's pavement management system, because the precision of the physical location re...
A comparison of five methods for monitoring the precision of automated x-ray film processors.
Nickoloff, E L; Leo, F; Reese, M
1978-11-01
Five different methods for preparing sensitometric strips used to monitor the precision of automated film processors are compared. A method for determining the sensitivity of each system to processor variations is presented; the observed statistical variability is multiplied by the system response to temperature or chemical changes. Pre-exposed sensitometric strips required the use of accurate densitometers and stringent control limits to be effective. X-ray exposed sensitometric strips demonstrated large variations in the x-ray output (2ω ≈ 8.0%) over a period of one month. Some light sensitometers were capable of detecting ±1.0 °F (±0.6 °C) variations in developer temperature in the processor and/or about 10.0 ml of chemical contamination in the processor. Nevertheless, even the light sensitometers were susceptible to problems, e.g. film emulsion selection, line voltage variations, and latent image fading. Advantages and disadvantages of the various sensitometric methods are discussed.
Bou Chakra, Elie; Hannes, Benjamin; Vieillard, Julien; Mansfield, Colin D.; Mazurczyk, Radoslav; Bouchard, Aude; Potempa, Jan; Krawczyk, Stanislas; Cabrera, Michel
2009-01-01
A novel approach to integrating biochip and microfluidic devices is reported, in which microcontact printing is a key fabrication technique. The process is performed using an automated microcontact printer that has been developed as an application-specific tool. As proof of concept, the instrument is used to consecutively and selectively graft patterns of antibodies at the bottom of a glass channel for use in microfluidic immunoassays. Importantly, feature collapse due to over-compression of the PDMS stamp is avoided by fine control of the stamp's compression during contact. The precise alignment of biomolecules at the intersection of the microfluidic channel and integrated optical waveguides has been achieved, with antigen detection performed via fluorescence excitation. Thus, it has been demonstrated that this technology permits sequential microcontact printing of isolated features consisting of functional biomolecules at any position along a microfluidic channel, and also that it is possible to precisely align these features with existing components. PMID:20161128
A Practical Approach for Recognizing Eating Moments with Wrist-Mounted Inertial Sensing
Thomaz, Edison; Essa, Irfan; Abowd, Gregory D.
2018-01-01
Recognizing when eating activities take place is one of the key challenges in automated food intake monitoring. Despite progress over the years, most proposed approaches have been largely impractical for everyday usage, requiring multiple on-body sensors or specialized devices such as neck collars for swallow detection. In this paper, we describe the implementation and evaluation of an approach for inferring eating moments based on 3-axis accelerometry collected with a popular off-the-shelf smartwatch. Trained with data collected in a semi-controlled laboratory setting with 20 subjects, our system recognized eating moments in two free-living condition studies (7 participants, 1 day; 1 participant, 31 days), with F-scores of 76.1% (66.7% Precision, 88.8% Recall) and 71.3% (65.2% Precision, 78.6% Recall). This work represents a contribution towards the implementation of a practical, automated system for everyday food intake monitoring, with applicability in areas ranging from health research to food journaling. PMID:29520397
Interfacing An Intelligent Decision-Maker To A Real-Time Control System
NASA Astrophysics Data System (ADS)
Evers, D. C.; Smith, D. M.; Staros, C. J.
1984-06-01
This paper discusses some of the practical aspects of implementing expert systems in a real-time environment. There is a conflict between the needs of a process control system and the computational load imposed by intelligent decision-making software. The computation required to manage a real-time control problem is primarily concerned with routine calculations which must be executed in real time. On most current hardware, non-trivial AI software should not be forced to operate under real-time constraints. In order for the system to work efficiently, the two processes must be separated by a well-defined interface. Although the precise nature of the task separation will vary with the application, the definition of the interface will need to follow certain fundamental principles in order to provide functional separation. This interface was successfully implemented in the expert scheduling software currently running the automated chemical processing facility at Lockheed-Georgia. Potential applications of this concept in the areas of airborne avionics and robotics will be discussed.
Assessing mouse behaviour throughout the light/dark cycle using automated in-cage analysis tools.
Bains, Rasneer S; Wells, Sara; Sillito, Rowland R; Armstrong, J Douglas; Cater, Heather L; Banks, Gareth; Nolan, Patrick M
2018-04-15
An important factor in reducing variability in mouse test outcomes has been the development of assays that can be used for continuous automated home cage assessment. Our experience has shown that this is most evident in the long-term assessment of wheel-running activity in mice. Historically, wheel-running in mice and other rodents has been used as a robust assay to determine, with precision, the inherent period of circadian rhythms in mice. Furthermore, this assay has been instrumental in dissecting the molecular genetic basis of mammalian circadian rhythms. In teasing out the elements of this test that have determined its robustness - automated assessment of an unforced behaviour in the home cage over long time intervals - we and others have been investigating whether similar test apparatus could be used to accurately discriminate differences in distinct behavioural parameters in mice. Firstly, using these systems, we explored behaviours in a number of mouse inbred strains to determine whether we could extract biologically meaningful differences. Secondly, we tested a number of relevant mutant lines to determine how discriminative these parameters were. Our findings show that, when compared to conventional out-of-cage phenotyping, a far deeper understanding of a mouse mutant's phenotype can be established by monitoring behaviour in the home cage over one or more light:dark cycles. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
Levin, Jennifer B; Sams, Johnny; Tatsuoka, Curtis; Cassidy, Kristin A; Sajatovic, Martha
2015-04-01
Medication nonadherence occurs in 20-60% of persons with bipolar disorder (BD) and is associated with serious negative outcomes, including relapse, hospitalization, incarceration, suicide and high healthcare costs. Various strategies have been developed to measure adherence in BD. This descriptive paper summarizes challenges and workable strategies using electronic medication monitoring in a randomized clinical trial (RCT) in patients with BD. Descriptive data from 57 nonadherent individuals with BD enrolled in a prospective RCT evaluating a novel customized adherence intervention versus control were analyzed. Analyses focused on whole group data and did not assess intervention effects. Adherence was assessed with the self-reported Tablets Routine Questionnaire and the Medication Event Monitoring System (MEMS). The majority of participants were women (74%), African American (69%), with type I BD (77%). Practical limitations of MEMS included misuse in conjunction with pill minders, polypharmacy, cost, failure to bring to research visits, losing the device, and the device impacting baseline measurement. The advantages were more precise measurement, less biased recall, and collecting data from past time periods for missed interim visits. Automated devices such as MEMS can assist investigators in evaluating adherence in patients with BD. Knowing the anticipated pitfalls allows study teams to implement preemptive procedures for successful implementation in BD adherence studies and can help pave the way for future refinements as automated adherence assessment technologies become more sophisticated and readily available.
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing are very important for accurately determining these characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, testing three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to easily analyze plant growth via large-scale plant image data.
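The superpixel-plus-Random-Forest segmentation step might be sketched as follows, using SLIC superpixels and simple per-superpixel colour statistics as features; the feature set, labels, and parameters are assumptions, not the authors' published configuration:

```python
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def superpixel_features(image, segments):
    """Mean and standard deviation of RGB values per superpixel."""
    feats = []
    for s in np.unique(segments):
        region = image[segments == s]                       # (n_pixels, 3)
        feats.append(np.concatenate([region.mean(axis=0), region.std(axis=0)]))
    return np.asarray(feats)

image = np.random.rand(240, 320, 3)                 # stand-in for a plant image
segments = slic(image, n_segments=400, start_label=0)
X = superpixel_features(image, segments)
y = np.random.randint(0, 2, X.shape[0])             # 1 = plant, 0 = background (labels)

clf = RandomForestClassifier(n_estimators=100).fit(X, y)
plant_mask = clf.predict(X)[segments]               # map predictions back to pixels
print(plant_mask.shape, plant_mask.mean())          # fraction of plant pixels, cf. area
```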
Design and Research of the Sewage Treatment Control System
NASA Astrophysics Data System (ADS)
Chu, J.; Hu, W. W.
Due to the rapid development of China's economy, water pollution has become a problem that must be faced; in particular, how to treat industrial wastewater has become a top priority. In wastewater treatment, PLC-based control systems meet the design requirements for real-time response, reliability, precision and so on. The integration of sequence control and process control in a PLC offers high reliability, a simple network structure, and convenient, flexible use; the PLC is a powerful tool for small and medium-sized industrial automation. Therefore, a sewage treatment control system with a PLC at its core can, to a certain extent, effectively solve the problem of industrial wastewater.
Burnishing Systems: a Short Survey of the State-of-the-art
NASA Astrophysics Data System (ADS)
Bobrovskij, I. N.
2018-01-01
Modern technological solutions for implementing a new surface plastic deformation technology are considered. A technological device is presented that implements hyper-productive surface plastic deformation, or wide burnishing (machining time of only 2-3 workpiece revolutions). The device provides a constant tool force regardless of runout, out-of-roundness and other surface shape defects; convenient and easily controlled force adjustment; precise positioning of tools and holders along the workpiece axis; and automated tool feed and retraction. A device implementing nanostructuring burnishing technology is also presented; its design eliminates the effect of auto-oscillations.
NASA Astrophysics Data System (ADS)
Chen, Andrew A.; Meng, Frank; Morioka, Craig A.; Churchill, Bernard M.; Kangarloo, Hooshang
2005-04-01
Managing pediatric patients with neurogenic bladder (NGB) involves regular laboratory, imaging, and physiologic testing. Using input from domain experts and current literature, we identified specific data points from these tests to develop the concept of an electronic disease vector for NGB. An information extraction engine was used to extract the desired data elements from free-text and semi-structured documents retrieved from the patient's medical record. Finally, a Java-based presentation engine created graphical visualizations of the extracted data. After precision, recall, and timing evaluation, we conclude that these tools may enable clinically useful, automatically generated, and diagnosis-specific visualizations of patient data, potentially improving compliance and, ultimately, outcomes.
On optimal scheduling and air traffic control in the near terminal area. M.S. Thesis
NASA Technical Reports Server (NTRS)
Sarris, A. H.
1971-01-01
A scheme is proposed for automated air traffic control of landing aircraft in the vicinity of the airport. Each aircraft is put under the control of an airport-based computer as soon as it enters the near-terminal area (NTA). Scheduling is done immediately thereafter. The aircraft is given a flight plan which, if followed precisely, will lead it to the runway at a prespecified time. The geometry of the airspace in the NTA is chosen so that delays are executed far from the outer marker, and violations of minimum altitude and lateral separations are avoided. Finally, a solution to the velocity mix problem is proposed.
Astro-geodetic platform for high accuracy geoid determination
NASA Astrophysics Data System (ADS)
Bǎdescu, Octavian; Nedelcu, Dan Alin; Cǎlin, Alexandru; Dumitru, Paul Daniel; Cǎlin, Lavinia A.; Popescu, Marcel
The paper presents the first technical realizations of a mobile platform for vertical deviation determination at satisfactory precision and low cost. The platform was conceived within the framework of a project on CCD astro-geodetic vertical deviation measurement for geoid determination and geoid modeling. The project, with the acronym A-GEO, is a collaboration between the Technical University of Civil Engineering Bucharest - Faculty of Geodesy (TUCEB-FG), the Astronomical Institute of the Romanian Academy (AIRA), and a private partner, GeoGIS Proiect S.R.L. The paper presents hardware and software aspects of the design, development, and automation of the platform, which is based on an electro-optical geodetic instrument, CCD observations and satellite time synchronization for astro-geodetic measurements.
Robust adhesive precision bonding in automated assembly cells
NASA Astrophysics Data System (ADS)
Müller, Tobias; Haag, Sebastian; Bastuck, Thomas; Gisler, Thomas; Moser, Hansruedi; Uusimaa, Petteri; Axt, Christoph; Brecher, Christian
2014-03-01
Diode lasers are gaining importance, making their way to higher output powers along with improved beam parameter product (BPP). The assembly of micro-optics for diode laser systems entails the highest requirements on assembly precision. Assembly costs for micro-optics are driven by the requirements for alignment in the submicron range and the corresponding challenges induced by adhesive bonding. For micro-optic assembly tasks, a major challenge in adhesive bonding at the highest precision level is the fact that the bonding process is irreversible. Accordingly, the first bonding attempt needs to be successful. Today's UV-curing adhesives exhibit shrinkage effects that are crucial for submicron tolerances of, e.g., FACs. The impact of the shrinkage effects can be tackled by a suitable bonding area design, such as minimal adhesive gaps and an adapted shrinkage offset value for the specific assembly parameters. Compensating shrinkage effects is difficult, as the shrinkage of UV-curing adhesives is not constant between two different lots and varies over the storage period even under ideal circumstances, as first test results indicate. An up-to-date characterization of the adhesive appears necessary for maximum precision in optics assembly to reach the highest output yields, minimal tolerances and ideal beam-shaping results. Therefore, a measurement setup to precisely determine the current level of shrinkage has been set up. The goal is to provide the necessary information on current shrinkage to the operator or assembly cell so that the compensation offset can be adjusted on a daily basis. The expected impacts of this information are improved beam-shaping results and first-time-right production.
Jaccard, Nicolas; Griffin, Lewis D; Keser, Ana; Macown, Rhys J; Super, Alexandre; Veraitch, Farlan S; Szita, Nicolas
2014-03-01
The quantitative determination of key adherent cell culture characteristics such as confluency, morphology, and cell density is necessary for the evaluation of experimental outcomes and to provide a suitable basis for the establishment of robust cell culture protocols. Automated processing of images acquired using phase contrast microscopy (PCM), an imaging modality widely used for the visual inspection of adherent cell cultures, could enable the non-invasive determination of these characteristics. We present an image-processing approach that accurately detects cellular objects in PCM images through a combination of local contrast thresholding and post hoc correction of halo artifacts. The method was thoroughly validated using a variety of cell lines, microscope models and imaging conditions, demonstrating consistently high segmentation performance in all cases and very short processing times (<1 s per 1,208 × 960 pixels image). Based on the high segmentation performance, it was possible to precisely determine culture confluency, cell density, and the morphology of cellular objects, demonstrating the wide applicability of our algorithm for typical microscopy image processing pipelines. Furthermore, PCM image segmentation was used to facilitate the interpretation and analysis of fluorescence microscopy data, enabling the determination of temporal and spatial expression patterns of a fluorescent reporter. We created a software toolbox (PHANTAST) that bundles all the algorithms and provides an easy to use graphical user interface. Source-code for MATLAB and ImageJ is freely available under a permissive open-source license. © 2013 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc.
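The local contrast thresholding step can be pictured with a short sketch. The following Python fragment is our illustration, not the PHANTAST source; the window size and threshold are invented. It flags pixels whose local standard deviation relative to the local mean exceeds a threshold, which is the core idea behind separating textured cells from a flat phase contrast background:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_contrast_mask(img, window=15, threshold=0.05):
    """Flag pixels whose local contrast (std/mean) exceeds a threshold.

    A rough sketch of local contrast thresholding for phase contrast
    images; window size and threshold are illustrative, not PHANTAST's.
    """
    img = img.astype(float)
    mean = uniform_filter(img, window)
    mean_sq = uniform_filter(img ** 2, window)
    std = np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))
    contrast = std / np.maximum(mean, 1e-12)
    return contrast > threshold

# Example on synthetic data: a textured "cell" region on a flat background
rng = np.random.default_rng(0)
image = np.full((96, 96), 100.0)
image[30:60, 30:60] += rng.normal(0, 15, (30, 30))  # textured region
mask = local_contrast_mask(image)
print("segmented fraction:", mask.mean())
```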
Methods for semi-automated indexing for high precision information retrieval.
Berrios, Daniel C; Cucina, Russell J; Fagan, Lawrence M
2002-01-01
To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65%, with means of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in the three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.
Robandt, P V; Bui, H M; Scancella, J M; Klette, K L
2010-10-01
An automated solid-phase extraction-liquid chromatography- tandem mass spectrometry (SPE-LC-MS-MS) method using the Spark Holland Symbiosis Pharma SPE-LC coupled to a Waters Quattro Micro MS-MS was developed for the analysis of 6-acetylmorphine (6-AM) in human urine specimens. The method was linear (R² = 0.9983) to 100 ng/mL, with no carryover at 200 ng/mL. Limits of quantification and detection were found to be 2 ng/mL. Interrun precision calculated as percent coefficient of variation (%CV) and evaluated by analyzing five specimens at 10 ng/mL over nine batches (n = 45) was 3.6%. Intrarun precision evaluated from 0 to 100 ng/mL ranged from 1.0 to 4.4%CV. Other opioids (codeine, morphine, oxycodone, oxymorphone, hydromorphone, hydrocodone, and norcodeine) did not interfere in the detection, quantification, or chromatography of 6-AM or the deuterated internal standard. The quantified values for 41 authentic human urine specimens previously found to contain 6-AM by a validated gas chromatography (GC)-MS method were compared to those obtained by the SPE-LC-MS-MS method. The SPE-LC-MS-MS procedure eliminates the human factors of specimen handling, extraction, and derivatization, thereby reducing labor costs and rework resulting from human error or technique issues. The time required for extraction and analysis was reduced by approximately 50% when compared to a validated 6-AM procedure using manual SPE and GC-MS analysis.
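The interrun and intrarun precision figures above are percent coefficients of variation. For readers reproducing such numbers, the computation is a one-liner; the replicate values below are invented:

```python
import numpy as np

def percent_cv(values):
    """Percent coefficient of variation (%CV) of replicate measurements."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical replicates of a 10 ng/mL control analyzed across batches
replicates = [9.8, 10.2, 10.1, 9.7, 10.4, 9.9, 10.0, 10.3]
print(f"interrun precision: {percent_cv(replicates):.1f} %CV")
```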
Sivakamasundari, J; Natarajan, V
2015-01-01
Diabetic Retinopathy (DR) is a disorder that affects the structure of retinal blood vessels due to long-standing diabetes mellitus. Automated segmentation of blood vessels is vital for periodic screening and timely diagnosis. An attempt has been made to generate continuous retinal vasculature for the design of a Content Based Image Retrieval (CBIR) application. Typical normal and abnormal retinal images are preprocessed to improve vessel contrast. The blood vessels are segmented using the evolutionary Harmony Search Algorithm (HSA) combined with the Otsu Multilevel Thresholding (MLT) method, using the best-performing objective functions. The segmentation results are validated against corresponding ground truth images using binary similarity measures. Statistical, textural and structural features are obtained from the segmented images of normal and DR-affected retinas and analyzed. CBIR systems in medical image retrieval are used to assist physicians in clinical decision support and research. A CBIR system is developed using the HSA-based Otsu MLT segmentation technique and the features obtained from the segmented images. Similarity matching between the features of query and database images is carried out using the Euclidean distance measure; similar images are ranked and retrieved. The retrieval performance of the CBIR system is evaluated in terms of precision and recall. The CBIR systems developed using HSA-based Otsu MLT and conventional Otsu MLT methods are compared: precision and recall are found to be 96% and 58%, respectively, for the CBIR system using HSA-based Otsu MLT segmentation. This automated CBIR system could be recommended for use in computer-assisted diagnosis for diabetic retinopathy screening.
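The retrieval step described above reduces to distance ranking followed by precision/recall scoring. A minimal sketch, with random vectors standing in for the paper's statistical, textural and structural features:

```python
import numpy as np

def retrieve(query, database, k=5):
    """Rank database feature vectors by Euclidean distance to the query."""
    dists = np.linalg.norm(database - query, axis=1)
    return np.argsort(dists)[:k]

def precision_recall(retrieved, relevant, total_relevant):
    hits = len(set(retrieved) & set(relevant))
    return hits / len(retrieved), hits / total_relevant

# Hypothetical 8-image database in which 3 images are relevant to the query
rng = np.random.default_rng(1)
db = rng.normal(size=(8, 4))
query = db[2] + rng.normal(scale=0.1, size=4)  # query resembles image 2
relevant = [2, 5, 7]
top = retrieve(query, db, k=5)
p, r = precision_recall(top.tolist(), relevant, total_relevant=len(relevant))
print(f"precision={p:.2f} recall={r:.2f}")
```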
Performance of the SIR-B digital image processing subsystem
NASA Technical Reports Server (NTRS)
Curlander, J. C.
1986-01-01
A ground-based system to generate digital SAR image products has been developed and implemented in support of the SIR-B mission. The system is designed to achieve maximum throughput while meeting strict image fidelity criteria. Its capabilities include: automated radiometric and geometric correction of the output imagery; high-precision absolute location without tiepoint registration; filtering of the raw data to remove spurious signals from alien radars; and automated cataloging to maintain a full set of radar and image parameters. The image production facility, in support of the SIR-B science investigators, routinely produces over 80 image frames per week.
Continuous Improvements to East Coast Abort Landings for Space Shuttle Aborts
NASA Technical Reports Server (NTRS)
Butler, Kevin D.
2003-01-01
Improvement initiatives in the areas of guidance, flight control, and mission operations provide increased capability for successful East Coast Abort Landings (ECAL). Automating manual crew procedures in the Space Shuttle's onboard guidance allows faster and more precise commanding of flight control parameters needed for successful ECALs. Automation also provides additional capability in areas not possible with manual control. Operational changes in the mission concept allow for the addition of new landing sites and different ascent trajectories that increase the regions of a successful landing. The larger regions of ECAL capability increase the safety of the crew and Orbiter.
Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J
2015-10-01
Automated high performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between people who are carriers and people who are not, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed. © 2015 John Wiley & Sons Ltd.
Automated solid-phase extraction and liquid chromatography for assay of cyclosporine in whole blood.
Kabra, P M; Wall, J H; Dimson, P
1987-12-01
In this rapid, precise, accurate, cost-effective, automated liquid-chromatographic procedure for determining cyclosporine in whole blood, the cyclosporine is extracted from 0.5 mL of whole blood, together with 300 micrograms of cyclosporin D per liter added as internal standard, by using an Advanced Automated Sample Processing unit. The on-line solid-phase extraction is performed on an octasilane sorbent cartridge, which is interfaced with an RP-8 guard column and an octyl analytical column packed with 5-micron material. Both columns are eluted with a mobile phase containing acetonitrile/methanol/water (53/20/27 by vol) at a flow rate of 1.5 mL/min and a column temperature of 70 degrees C. Absolute recovery of cyclosporine exceeded 85% and the standard curve was linear to 5000 micrograms/L. Within-run and day-to-day CVs were less than 8%. Correlation between automated and manual Bond-Elut extraction methods was excellent (r = 0.987). None of 18 drugs and four steroids tested interfered.
Savini, Vincenzo; Marrollo, Roberta; Coclite, Eleonora; Fusilli, Paola; D'Incecco, Carmine; Fazii, Paolo
2014-01-01
We report a case of late-onset neonatal meningitis caused by Streptococcus agalactiae (group B Streptococcus, GBS) that was diagnosed with a latex agglutination assay on cerebrospinal fluid (CSF), as well as by using, for the first time, Xpert GBS (Cepheid, US) on CSF. Because empirical antibiotics had been given before sampling, both CSF and blood cultures were negative, so the abovementioned diagnostics were crucial. Moreover, the Xpert GBS assay, performed according to an off-label, modified protocol (the system is designed for intrapartum GBS-carriage screening, based on a fully automated real-time polymerase chain reaction), quickly detected the organism's genome target. Although further investigation of this test's performance on CSF is required, we trust it may be a promising, quick and precise diagnostic method for infections in newborns.
An environmental chamber system for prolonged metabolic studies on small animals
NASA Technical Reports Server (NTRS)
Jordan, J. P.; Huston, L. J.; Simmons, J. B., II; Clarkson, D. P.; Martz, W. W.; Schatte, C. L.
1973-01-01
Measurement of metabolic adaptation to marginally stressful environments requires both precise regulation of a variety of atmospheric factors for extended periods of time and the capacity to employ sensitive parameters in an undisturbed subject. This paper describes a metabolic chamber system which can simultaneously maintain groups of small animals in two completely separate closed environments having different pressures, temperatures and gas compositions for an indefinite period. Oxygen consumption, carbon dioxide production, food and water consumption and animal activity cycles can be continuously monitored and quantified 24 h per day while the animals are in an unrestrained state. Each chamber can be serviced and the animals handled, injected and sacrificed without subjecting them to barometric stress. Several unique electrical and mechanical components allow semi-automated data collection on a continuous basis for indefinite periods of time.
Network Design in Close-Range Photogrammetry with Short Baseline Images
NASA Astrophysics Data System (ADS)
Barazzetti, L.
2017-08-01
The availability of automated software for image-based 3D modelling has changed the way people acquire images for photogrammetric applications. Short baseline images are required to match image points with SIFT-like algorithms, yielding more images than necessary for "old fashioned" photogrammetric projects based on manual measurements. This paper describes some considerations on network design for short baseline image sequences, especially regarding the precision and reliability of bundle adjustment. Simulated results reveal that the large number of 3D points used for image orientation has very limited impact on network precision.
Nasso, Sara; Goetze, Sandra; Martens, Lennart
2015-09-04
Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a MATLAB software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improves Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
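The external calibration curve method mentioned above amounts to fitting a line through standards of known amount and inverting it for unknowns. A minimal sketch with invented dilution points:

```python
import numpy as np

# Hypothetical dilution series: known amounts (fmol) vs measured peak areas
amounts = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0])
areas = np.array([210.0, 395.0, 1010.0, 1985.0, 4050.0, 9950.0])

slope, intercept = np.polyfit(amounts, areas, 1)  # linear calibration fit

def quantify(area):
    """Back-calculate the analyte amount from a measured peak area."""
    return (area - intercept) / slope

print(f"estimated amount for area 3000: {quantify(3000.0):.2f} fmol")
```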
A Low Cost Automated Monitoring System for Landslides Using Dual Frequency GPS
NASA Astrophysics Data System (ADS)
Mills, H.; Edwards, S.
2006-12-01
Landslides are an existing and permanent threat to societies across the globe, generating financial and human losses whenever and wherever they occur. Drawing together the strands of science that provide increased understanding of landslide triggers through accurate modelling is therefore vital for the development of mitigation and management strategies. Together with climatic and geomorphological data, a key input here is information on the precise location and timing of landslide events. However, the detailed monitoring of landslides and precursor movements is generally limited to episodic campaigns, where limiting factors include equipment and mobilisation costs, time constraints and spatial resolution. This research has developed a geodetic tool of benefit to scientists developing closely coupled models that seek to relate trigger mechanisms, such as rainfall duration and intensity and changes in groundwater pressure, to actual land movements. A fully automated low cost dual frequency GPS station for the continuous in-situ monitoring of landslide sites has been developed. The system configuration combines a dual frequency GPS receiver, a PC board with a GPRS modem and a power supply to deliver 24hr/365day operation. Individual components have been chosen to provide the highest accuracies while minimising power consumption, resulting in a system cost around half that of equivalent commercial systems. Measurement point-costs can be further reduced through the use of antenna switching and multi-antenna arrays. Continuous data is delivered via mobile phone uplink and processed automatically using geodetic software. The developed system has been extensively tested on a purpose-built platform capable of simulating ground movements. Co-mounted antennas have allowed direct comparisons with more expensive geodetic GPS receivers. The system is capable of delivering precise 3D coordinates with a 9 mm rms. The system can be up-scaled, increasing the spatial density of monitoring and yielding more detailed information on landslide movements for improved downstream modelling and monitoring.
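The 9 mm rms figure is the root-mean-square of the 3D position residuals about a reference position. A short sketch of the computation on synthetic daily solutions:

```python
import numpy as np

def rms_3d(solutions, reference):
    """RMS of 3D position residuals (same length unit as the inputs)."""
    residuals = np.asarray(solutions) - np.asarray(reference)
    return np.sqrt((residuals ** 2).sum(axis=1).mean())

# Synthetic daily solutions scattered ~9 mm (3D rms) around a fixed
# monument; coordinates in metres, values invented for illustration
rng = np.random.default_rng(2)
ref = np.array([0.0, 0.0, 0.0])
sols = ref + rng.normal(scale=0.009 / np.sqrt(3), size=(365, 3))
print(f"3D rms: {rms_3d(sols, ref) * 1000:.1f} mm")
```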
Ring, P R; Bostick, J M
2000-04-01
A sensitive and selective high-performance liquid chromatography (HPLC) method was developed for the determination of zolpidem in human plasma. Zolpidem and the internal standard (trazodone) were extracted from human plasma that had been made basic. The basic sample was loaded onto a conditioned Bond Elut C18 cartridge, rinsed with water and eluted with methanol. Forty microliters were then injected onto the LC system. Separation was achieved on a C18 column (150 x 4.6 mm, 5 microm) with a mobile phase composed of acetonitrile:50 mM potassium phosphate monobasic at pH 6.0 (4:6, v/v). Detection was by fluorescence, with excitation at 254 nm and emission at 400 nm. The retention times of zolpidem and the internal standard were approximately 4.7 and 5.3 min, respectively. The LC run time was 8 min. The assay was linear over the concentration range 1-400 ng/ml for zolpidem in human plasma. The analysis of quality control samples for zolpidem (3, 30, and 300 ng/ml) demonstrated excellent precision, with relative standard deviations (RSD) of 3.7, 4.6, and 3.0%, respectively (n = 18). The method was accurate, with all intraday (n = 6) and overall (n = 18) mean concentrations within 5.8% of nominal at all quality control sample concentrations. This method was also performed using a Gilson Aspec XL automated sample processor and autoinjector. The samples were manually fortified with internal standard and made basic. The Aspec then performed the solid-phase extraction and made injections of the samples onto the LC system. Using the automated procedure for analysis, quality control samples for zolpidem (3, 30, and 300 ng/ml) demonstrated acceptable precision, with RSD values of 9.0, 4.9, and 5.1%, respectively (n = 12). The method was accurate, with all intracurve (n = 4) and overall (n = 12) mean values being less than 10.8% from nominal at all quality control sample concentrations.
Neumann, M; Breton, E; Cuvillon, L; Pan, L; Lorenz, C H; de Mathelin, M
2012-01-01
In this paper, an original workflow is presented for MR image plane alignment based on tracking in real-time MR images. A test device consisting of two resonant micro-coils and a passive marker is proposed for detection using image-based algorithms. Micro-coils allow for automated initialization of the object detection in dedicated low flip angle projection images; then the passive marker is tracked in clinical real-time MR images, with alternation between two oblique orthogonal image planes along the test device axis; in case the passive marker is lost in real-time images, the workflow is reinitialized. The proposed workflow was designed to minimize dedicated acquisition time to a single dedicated acquisition in the ideal case (no reinitialization required). First experiments have shown promising results for test-device tracking precision, with a mean position error of 0.79 mm and a mean orientation error of 0.24°.
Li, Xiangpeng; Brooks, Jessica C; Hu, Juan; Ford, Katarena I; Easley, Christopher J
2017-01-17
A fully automated, 16-channel microfluidic input/output multiplexer (μMUX) has been developed for interfacing to primary cells and to improve understanding of the dynamics of endocrine tissue function. The device utilizes pressure driven push-up valves for precise manipulation of nutrient input and hormone output dynamics, allowing time resolved interrogation of the cells. The ability to alternate any of the 16 channels from input to output, and vice versa, provides for high experimental flexibility without the need to alter microchannel designs. 3D-printed interface templates were custom designed to sculpt the above-channel polydimethylsiloxane (PDMS) in microdevices, creating millimeter scale reservoirs and confinement chambers to interface primary murine islets and adipose tissue explants to the μMUX sampling channels. This μMUX device and control system was first programmed for dynamic studies of pancreatic islet function to collect ∼90 minute insulin secretion profiles from groups of ∼10 islets. The automated system was also operated in temporal stimulation and cell imaging mode. Adipose tissue explants were exposed to a temporal mimic of post-prandial insulin and glucose levels, while simultaneous switching between labeled and unlabeled free fatty acid permitted fluorescent imaging of fatty acid uptake dynamics in real time over a ∼2.5 hour period. Application with varying stimulation and sampling modes on multiple murine tissue types highlights the inherent flexibility of this novel, 3D-templated μMUX device. The tissue culture reservoirs and μMUX control components presented herein should be adaptable as individual modules in other microfluidic systems, such as organ-on-a-chip devices, and should be translatable to different tissues such as liver, heart, skeletal muscle, and others.
iGAS: A framework for using electronic intraoperative medical records for genomic discovery.
Levin, Matthew A; Joseph, Thomas T; Jeff, Janina M; Nadukuru, Rajiv; Ellis, Stephen B; Bottinger, Erwin P; Kenny, Eimear E
2017-03-01
Design and implement a HIPAA and Integrating the Healthcare Enterprise (IHE) profile compliant automated pipeline, the integrated Genomics Anesthesia System (iGAS), linking genomic data from the Mount Sinai Health System (MSHS) BioMe biobank to electronic anesthesia records, including physiological data collected during the perioperative period. The resulting repository of multi-dimensional data can be used for precision medicine analysis of physiological readouts, acute medical conditions, and adverse events that can occur during surgery. A structured pipeline was developed atop our existing anesthesia data warehouse using open-source tools. The pipeline is automated using scheduled tasks. The pipeline runs weekly, and finds and identifies all new and existing anesthetic records for BioMe participants. The pipeline went live in June 2015 with 49.2% (n=15,673) of BioMe participants linked to 40,947 anesthetics. The pipeline runs weekly in minimal time. After eighteen months, an additional 3671 participants were enrolled in BioMe and the number of matched anesthetic records grew 21% to 49,545. Overall percentage of BioMe patients with anesthetics remained similar at 51.1% (n=18,128). Seven patients opted out during this time. The median number of anesthetics per participant was 2 (range 1-144). Collectively, there were over 35 million physiologic data points and 480,000 medication administrations linked to genomic data. To date, two projects are using the pipeline at MSHS. Automated integration of biobank and anesthetic data sources is feasible and practical. This integration enables large-scale genomic analyses that might inform variable physiological response to anesthetic and surgical stress, and examine genetic factors underlying adverse outcomes during and after surgery. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Bala, John L.
1995-08-01
Automation and polymer science represent fundamental new technologies which can be directed toward realizing the goal of establishing a domestic, world-class, commercial optics business. Use of innovative optical designs based on precision polymer optics will enable the US to play a vital role in the next generation of commercial optical products. The cost savings inherent in the utilization of optical-grade polymers outweigh almost every advantage of using glass in high volume situations. Optical designers must gain experience with combined refractive/diffractive designs and broaden their knowledge base regarding polymer technology beyond a cursory intellectual exercise. Implementation of a fully automated assembly system, combined with utilization of polymer optics, constitutes the type of integrated manufacturing process which will enable the US to compete successfully with the low-cost labor employed in the Far East, as well as to produce an equivalent product.
Hillarp, A; Friedman, K D; Adcock-Funk, D; Tiefenbacher, S; Nichols, W L; Chen, D; Stadler, M; Schwartz, B A
2015-11-01
The ability of von Willebrand factor (VWF) to bind platelet GP Ib and promote platelet plug formation is measured in vitro using the ristocetin cofactor (VWF:RCo) assay. Automated assay systems make testing more accessible for diagnosis, but do not necessarily improve sensitivity and accuracy. We assessed the performance of a modified automated VWF:RCo assay protocol for the Behring Coagulation System (BCS®) compared to other available assay methods. Results from different VWF:RCo assays in a number of specialized commercial and research testing laboratories were compared using plasma samples with varying VWF:RCo activities (0-1.2 IU/mL). Samples were prepared by mixing VWF concentrate or plasma standard into VWF-depleted plasma. Commercially available lyophilized standard human plasma was also studied. Emphasis was put on the low measuring range. VWF:RCo accuracy was calculated based on the expected values, whereas precision was obtained from repeated measurements. In the physiological concentration range, most of the automated tests resulted in acceptable accuracy, with varying reproducibility depending on the method. However, several assays were inaccurate in the low measuring range. Only the modified BCS protocol showed acceptable accuracy over the entire measuring range with improved reproducibility. A modified BCS® VWF:RCo method can improve sensitivity and thus extend the measuring range. Furthermore, the modified BCS® assay displayed good precision. This study indicates that the specific modifications - namely the combination of increased ristocetin concentration, reduced platelet content, VWF-depleted plasma as on-board diluent and a two-curve calculation mode - reduce the issues seen with current VWF:RCo activity assays. © 2015 John Wiley & Sons Ltd.
Automated identification of molecular effects of drugs (AIMED)
Fathiamini, Safa; Johnson, Amber M; Zeng, Jia; Araya, Alejandro; Holla, Vijaykumar; Bailey, Ann M; Litzenburger, Beate C; Sanchez, Nora S; Khotskaya, Yekaterina; Xu, Hua; Meric-Bernstam, Funda; Bernstam, Elmer V
2016-01-01
Introduction Genomic profiling information is frequently available to oncologists, enabling targeted cancer therapy. Because clinically relevant information is rapidly emerging in the literature and elsewhere, there is a need for informatics technologies to support targeted therapies. To this end, we have developed a system for Automated Identification of Molecular Effects of Drugs, to help biomedical scientists curate this literature to facilitate decision support. Objectives To create an automated system to identify assertions in the literature concerning drugs targeting genes with therapeutic implications and characterize the challenges inherent in automating this process in rapidly evolving domains. Methods We used subject-predicate-object triples (semantic predications) and co-occurrence relations generated by applying the SemRep Natural Language Processing system to MEDLINE abstracts and ClinicalTrials.gov descriptions. We applied customized semantic queries to find drugs targeting genes of interest. The results were manually reviewed by a team of experts. Results Compared to a manually curated set of relationships, recall, precision, and F2 were 0.39, 0.21, and 0.33, respectively, which represents a 3- to 4-fold improvement over a publicly available set of predications (SemMedDB) alone. Upon review of ostensibly false positive results, 26% were considered relevant additions to the reference set, and an additional 61% were considered to be relevant for review. Adding co-occurrence data improved results for drugs in early development, but not their better-established counterparts. Conclusions Precision medicine poses unique challenges for biomedical informatics systems that help domain experts find answers to their research questions. Further research is required to improve the performance of such systems, particularly for drugs in development. PMID:27107438
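The F2 measure quoted above is the Fβ score with β = 2, which weights recall more heavily than precision; the reported value follows directly from the definition:

```python
def f_beta(precision, recall, beta=2.0):
    """F-beta score; beta > 1 weights recall more heavily than precision."""
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Values reported for AIMED: precision 0.21, recall 0.39
print(f"F2 = {f_beta(0.21, 0.39):.2f}")  # ~0.33, matching the abstract
```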
Robandt, Paul P; Reda, Louis J; Klette, Kevin L
2008-10-01
A fully automated system utilizing a liquid handler and an online solid-phase extraction (SPE) device coupled with liquid chromatography-tandem mass spectrometry (LC-MS-MS) was designed to process, detect, and quantify benzoylecgonine (BZE), meta-hydroxybenzoylecgonine (m-OH BZE), para-hydroxybenzoylecgonine (p-OH BZE), and norbenzoylecgonine (nor-BZE) metabolites in human urine. The method was linear for BZE, m-OH BZE, and p-OH BZE from 1.2 to 10,000 ng/mL with limits of detection (LOD) and quantification (LOQ) of 1.2 ng/mL. Nor-BZE was linear from 5 to 10,000 ng/mL with an LOD and LOQ of 1.2 and 5 ng/mL, respectively. The intrarun precision measured as the coefficient of variation of 10 replicates of a 100 ng/mL control was less than 2.6%, and the interrun precision for 5 replicates of the same control across 8 batches was less than 4.8% for all analytes. No assay interference was noted from controls containing cocaine, cocaethylene, and ecgonine methyl ester. Excellent data concordance (R2 > 0.994) was found for direct comparison of the automated SPE-LC-MS-MS procedure and an existing gas chromatography-MS procedure using 94 human urine samples previously determined to be positive for BZE. The automated specimen handling and SPE procedure, when compared to the traditional extraction schema, eliminates the human factors of specimen handling, processing, extraction, and derivatization, thereby reducing labor costs and rework resulting from batch handling issues, and may reduce the number of fume hoods required in the laboratory.
Raith, Stefan; Vogel, Eric Per; Anees, Naeema; Keul, Christine; Güth, Jan-Frederik; Edelhoff, Daniel; Fischer, Horst
2017-01-01
Chairside manufacturing based on digital image acquisition is gaining increasing importance in dentistry. For the standardized application of these methods, it is paramount to have highly automated digital workflows that can process acquired 3D image data of dental surfaces. Artificial Neural Networks (ANNs) are numerical methods primarily used to mimic the complex networks of neural connections in the natural brain. Our hypothesis is that an ANN can be developed that is capable of classifying dental cusps with sufficient accuracy. This bears enormous potential for an application in chairside manufacturing workflows in the dental field, as it closes the gap between digital acquisition of dental geometries and modern computer-aided manufacturing techniques. Three-dimensional surface scans of dental casts representing natural full dental arches were transformed to range image data. These data were processed using an automated algorithm to detect candidates for tooth cusps according to salient geometrical features. These candidates were classified following common dental terminology and used as training data for a tailored ANN. For the actual cusp feature description, two different approaches were developed and applied to the available data: the first uses the relative location of the detected cusps as input data, and the second directly takes the image information given in the range images. In addition, a combination of both was implemented and investigated. Both approaches showed high performance, with correct classifications of 93.3% and 93.5%, respectively; improvements from the combination were shown to be minor. This article presents for the first time a fully automated method for the classification of teeth that was confirmed to work with sufficient precision to exhibit the potential for use in clinical practice, which is a prerequisite for automated computer-aided planning of prosthetic treatments with subsequent automated chairside manufacturing. Copyright © 2016 Elsevier Ltd. All rights reserved.
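As an illustration of the location-based variant of such a cusp classifier (not the authors' network; the coordinates, labels and layer size below are invented), a feed-forward ANN can be trained on relative cusp positions with scikit-learn:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Synthetic training data: (x, y) cusp positions relative to the tooth
# centre, labelled 0 = buccal cusp, 1 = lingual cusp (illustrative only)
rng = np.random.default_rng(3)
buccal = rng.normal([0.0, 1.0], 0.2, size=(200, 2))
lingual = rng.normal([0.0, -1.0], 0.2, size=(200, 2))
X = np.vstack([buccal, lingual])
y = np.array([0] * 200 + [1] * 200)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
print("cusp at (0.1, 0.9) classified as:", clf.predict([[0.1, 0.9]])[0])
```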
ERIC Educational Resources Information Center
Epstein, A. H.; And Others
The first phase of an ongoing library automation project at Stanford University is described. Project BALLOTS (Bibliographic Automation of Large Library Operations Using a Time-Sharing System) seeks to automate the acquisition and cataloging functions of a large library using an on-line time-sharing computer. The main objectives are to control…
Fast automated online xylanase activity assay using HPAEC-PAD.
Cürten, Christin; Anders, Nico; Juchem, Niels; Ihling, Nina; Volkenborn, Kristina; Knapp, Andreas; Jaeger, Karl-Erich; Büchs, Jochen; Spiess, Antje C
2018-01-01
In contrast to biochemical reactions, which are often carried out under automatic control and maintained overnight, the automation of chemical analysis is usually neglected. Samples are either analyzed in a rudimentary fashion using in situ techniques, or aliquots are withdrawn and stored to facilitate more precise offline measurements, which can result in sampling and storage errors. Therefore, in this study, we implemented automated reaction control, sampling, and analysis. As an example, the activities of xylanases on xylotetraose and soluble xylan were examined using high-performance anion exchange chromatography with pulsed amperometric detection (HPAEC-PAD). The reaction was performed in HPLC vials inside a temperature-controlled Dionex™ AS-AP autosampler. It was started automatically when the autosampler pipetted substrate and enzyme solution into the reaction vial. Afterwards, samples from the reaction vial were injected repeatedly for 60 min onto a CarboPac™ PA100 column for analysis. Due to the rapidity of the reaction, the analytical method and the gradient elution of 200 mM sodium hydroxide solution and 100 mM sodium hydroxide with 500 mM sodium acetate were adapted to allow for an overall separation time of 13 min and a detection limit of 0.35-1.83 mg/L (depending on the xylooligomer). This analytical method was applied to measure the soluble short-chain products (xylose, xylobiose, xylotriose, xylotetraose, xylopentaose, and longer xylooligomers) that arise during enzymatic hydrolysis. Based on that, the activities of three endoxylanases (EX) were determined as 294 U/mg for EX from Aspergillus niger, 1.69 U/mg for EX from Bacillus stearothermophilus, and 0.36 U/mg for EX from Bacillus subtilis. Graphical abstract Xylanase activity assay automation.
NASA Astrophysics Data System (ADS)
Liu, Jiamin; Chang, Kevin; Kim, Lauren; Turkbey, Evrim; Lu, Le; Yao, Jianhua; Summers, Ronald
2015-03-01
The thyroid gland plays an important role in clinical practice, especially for radiation therapy treatment planning. For patients with head and neck cancer, radiation therapy requires a precise delineation of the thyroid gland to be spared on the pre-treatment planning CT images to avoid thyroid dysfunction. In the current clinical workflow, the thyroid gland is normally delineated manually by radiologists or radiation oncologists, which is time consuming and error prone. Therefore, a system for automated segmentation of the thyroid is desirable. However, automated segmentation of the thyroid is challenging because the thyroid is inhomogeneous and surrounded by structures that have similar intensities. In this work, the thyroid gland segmentation is initially estimated by a multi-atlas label fusion algorithm. The segmentation is then refined by supervised statistical-learning-based voxel labeling with a random forest algorithm. Multi-atlas label fusion (MALF) transfers expert-labeled thyroids from atlases to a target image using deformable registration. Errors produced by label transfer are reduced by label fusion, which combines the results produced by all atlases into a consensus solution. The random forest (RF) stage then employs an ensemble of decision trees trained on labeled thyroids to recognize features. The trained forest classifier is applied to the thyroid estimated from the MALF by voxel scanning to assign the class-conditional probability. Voxels from the expert-labeled thyroids in CT volumes are treated as positive classes; background non-thyroid voxels as negatives. We applied this automated thyroid segmentation system to CT scans of 20 patients. The results showed that the MALF achieved an overall Dice Similarity Coefficient (DSC) of 0.75 and the RF classification further improved the DSC to 0.81.
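The Dice Similarity Coefficient used to score both stages is twice the overlap of the two masks divided by their total size; a minimal sketch:

```python
import numpy as np

def dice(seg, truth):
    """Dice Similarity Coefficient between two binary masks."""
    seg, truth = np.asarray(seg, bool), np.asarray(truth, bool)
    denom = seg.sum() + truth.sum()
    return 2.0 * np.logical_and(seg, truth).sum() / denom if denom else 1.0

# Toy example: two overlapping square masks
a = np.zeros((10, 10), bool); a[2:7, 2:7] = True
b = np.zeros((10, 10), bool); b[3:8, 3:8] = True
print(f"DSC = {dice(a, b):.2f}")
```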
NASA Astrophysics Data System (ADS)
Savant, Vaibhav; Smith, Niall
2016-07-01
We report on the current status in the development of a pilot automated data acquisition and reduction pipeline based around the operation of two nodes of remotely operated robotic telescopes based in California, USA and Cork, Ireland. The observatories are primarily used as a testbed for automation and instrumentation and as a tool to facilitate STEM (Science Technology Engineering Mathematics) promotion. The Ireland node is situated at Blackrock Castle Observatory (operated by Cork Institute of Technology) and consists of two optical telescopes - 6" and 16" OTAs housed in two separate domes - while the node in California is its 6" replica. Together they form a pilot Telescope ARrAy known as TARA. QuickPhot is an automated data reduction pipeline designed primarily to throw more light on the microvariability of blazars, employing precision optical photometry and using data from the TARA telescopes as they constantly monitor predefined targets whenever observing conditions are favourable. After carrying out aperture photometry, if any variability above a given threshold is observed, the reporting telescope will communicate the source concerned and the other nodes will follow up with multi-band observations, taking advantage of the fact that they are located in strategically separated time zones. Ultimately we wish to investigate the applicability of Shock-in-Jet and Geometric models. These try to explain the processes at work in AGNs which result in the formation of jets, by looking for temporal and spectral variability in TARA multi-band observations. We are also experimenting with a Two-channel Optical PHotometric Imaging CAMera (TOΦCAM) that we have developed and optimised for simultaneous two-band photometry on our 16" OTA.
Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng
2017-12-01
Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
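One simple way to quantify proportionality in such a dilution series is a through-origin regression of counts on dilution fraction, with the residual scatter as a precision indicator. The sketch below uses synthetic counts and is far simpler than the paper's full statistical analysis:

```python
import numpy as np

# Synthetic dilution series: target dilution fractions, three replicates each
fractions = np.repeat([1.0, 0.75, 0.5, 0.25], 3)
rng = np.random.default_rng(4)
counts = 1.0e6 * fractions * rng.normal(1.0, 0.03, fractions.size)

# Proportional (through-origin) least squares fit: slope = sum(xy)/sum(x^2)
slope = (fractions * counts).sum() / (fractions ** 2).sum()
residuals = counts - slope * fractions
cv_resid = 100 * residuals.std(ddof=1) / counts.mean()
print(f"proportional slope: {slope:.3e} cells at stock concentration")
print(f"residual scatter: {cv_resid:.1f}% of mean count")
```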
Astronomical algorithms for automated analysis of tissue protein expression in breast cancer
Ali, H R; Irwin, M; Morris, L; Dawson, S-J; Blows, F M; Provenzano, E; Mahler-Araujo, B; Pharoah, P D; Walton, N A; Brenton, J D; Caldas, C
2013-01-01
Background: High-throughput evaluation of tissue biomarkers in oncology has been greatly accelerated by the widespread use of tissue microarrays (TMAs) and immunohistochemistry. Although TMAs have the potential to facilitate protein expression profiling on a scale to rival experiments of tumour transcriptomes, the bottleneck and imprecision of manually scoring TMAs has impeded progress. Methods: We report image analysis algorithms adapted from astronomy for the precise automated analysis of IHC in all subcellular compartments. The power of this technique is demonstrated using over 2000 breast tumours and comparing quantitative automated scores against manual assessment by pathologists. Results: All continuous automated scores showed good correlation with their corresponding ordinal manual scores. For oestrogen receptor (ER), the correlation was 0.82, P<0.0001, for BCL2 0.72, P<0.0001 and for HER2 0.62, P<0.0001. Automated scores showed excellent concordance with manual scores for the unsupervised assignment of cases to 'positive' or 'negative' categories with agreement rates of up to 96%. Conclusion: The adaptation of astronomical algorithms, coupled with their application to large annotated study cohorts, constitutes a powerful tool for the realisation of the enormous potential of digital pathology. PMID:23329232
Precision shape modification of nanodevices with a low-energy electron beam
Zettl, Alex; Yuzvinsky, Thomas David; Fennimore, Adam
2010-03-09
Methods of shape modifying a nanodevice by contacting it with a low-energy focused electron beam are disclosed here. In one embodiment, a nanodevice may be permanently reformed to a different geometry through an application of a deforming force and a low-energy focused electron beam. With the addition of an assist gas, material may be removed from the nanodevice through application of the low-energy focused electron beam. The independent methods of shape modification and material removal may be used either individually or simultaneously. Precision cuts with accuracies as high as 10 nm may be achieved through the use of precision low-energy Scanning Electron Microscope scan beams. These methods may be used in an automated system to produce nanodevices of very precise dimensions. These methods may be used to produce nanodevices of carbon-based, silicon-based, or other compositions by varying the assist gas.
Automated optical testing of LWIR objective lenses using focal plane array sensors
NASA Astrophysics Data System (ADS)
Winters, Daniel; Erichsen, Patrik; Domagalski, Christian; Peter, Frank; Heinisch, Josef; Dumitrescu, Eugen
2012-10-01
The image quality of today's state-of-the-art IR objective lenses is constantly improving, while at the same time the market for thermography and vision grows strongly. Because of increasing demands on the quality of IR optics and increasing production volumes, the standards for image quality testing rise and tests need to be performed in shorter time. Most high-precision MTF testing equipment for the IR spectral bands in use today relies on the scanning slit method, which scans a 1D detector over a pattern in the image generated by the lens under test, followed by image analysis to extract performance parameters. The disadvantages of this approach are that it is relatively slow, it requires highly trained operators for aligning the sample, and the number of parameters that can be extracted is limited. In this paper we present lessons learned from the R&D process of using focal plane array (FPA) sensors for testing of long-wave IR (LWIR, 8-12 μm) optics. Factors that need to be taken into account when switching from scanning slit to FPAs include: the thermal background from the environment, the low scene contrast in the LWIR, the need for advanced image processing algorithms to pre-process camera images for analysis, and camera artifacts. Finally, we discuss two measurement systems for LWIR lens characterization that we recently developed for different target applications: 1) a fully automated system suitable for production testing and metrology that uses uncooled microbolometer cameras to automatically measure MTF (on-axis and at several off-axis positions) and parameters like EFL, FFL, autofocus curves, image plane tilt, etc., for LWIR objectives with an EFL between 1 and 12 mm; the measurement cycle time for one sample is typically between 6 and 8 s. 2) A high-precision research-grade system, again using an uncooled LWIR camera as detector, that is very simple to align and operate. A wide range of lens parameters (MTF, EFL, astigmatism, distortion, etc.) can be easily and accurately measured with this system.
Laboratory automation: trajectory, technology, and tactics.
Markin, R S; Whalen, S A
2000-05-01
Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.
An automated real-time free phenytoin assay to replace the obsolete Abbott TDx method.
Williams, Christopher; Jones, Richard; Akl, Pascale; Blick, Kenneth
2014-01-01
Phenytoin is a commonly used anticonvulsant that is highly protein bound with a narrow therapeutic range. The unbound fraction, free phenytoin (FP), is responsible for pharmacologic effects; therefore, it is essential to measure both FP and total serum phenytoin levels. Historically, the Abbott TDx method has been widely used for the measurement of FP and was the method used in our laboratory. However, the FP TDx assay was recently discontinued by the manufacturer, so we had to develop an alternative methodology. We evaluated the Beckman-Coulter DxC800 based FP method for linearity, analytical sensitivity, and precision. The analytical measurement range of the method was 0.41 to 5.30 microg/mL. Within-run and between-run precision studies yielded CVs of 3.8% and 5.5%, respectively. The method compared favorably with the TDx method, yielding the following regression equation: DxC800 = 0.9 × TDx + 0.10; r² = 0.97 (n = 97). The new FP assay appears to be an acceptable alternative to the TDx method.
Automated Text Markup for Information Retrieval from an Electronic Textbook of Infectious Disease
Berrios, Daniel C.; Kehler, Andrew; Kim, David K.; Yu, Victor L.; Fagan, Lawrence M.
1998-01-01
The information needs of practicing clinicians frequently require textbook or journal searches. Making these sources available in electronic form improves the speed of these searches, but precision (i.e., the fraction of relevant to total documents retrieved) remains low. Improving the traditional keyword search by transforming search terms into canonical concepts does not improve search precision greatly. Kim et al. have designed and built a prototype system (MYCIN II) for computer-based information retrieval from a forthcoming electronic textbook of infectious disease. The system requires manual indexing by experts in the form of complex text markup. However, this mark-up process is time consuming (about 3 person-hours to generate, review, and transcribe the index for each of 218 chapters). We have designed and implemented a system to semiautomate the markup process. The system, information extraction for semiautomated indexing of documents (ISAID), uses query models and existing information-extraction tools to provide support for any user, including the author of the source material, to mark up tertiary information sources quickly and accurately.
Automated coregistration of MTI spectral bands
NASA Astrophysics Data System (ADS)
Theiler, James P.; Galbraith, Amy E.; Pope, Paul A.; Ramsey, Keri A.; Szymanski, John J.
2002-08-01
In the focal plane of a pushbroom imager, a linear array of pixels is scanned across the scene, building up the image one row at a time. For the Multispectral Thermal Imager (MTI), each of fifteen different spectral bands has its own linear array. These arrays are pushed across the scene together, but since each band's array is at a different position on the focal plane, a separate image is produced for each band. The standard MTI data products (LEVEL1B_R_COREG and LEVEL1B_R_GEO) resample these separate images to a common grid and produce coregistered multispectral image cubes. The coregistration software employs a direct 'dead reckoning' approach. Every pixel in the calibrated image is mapped to an absolute position on the surface of the earth, and these are resampled to produce an undistorted coregistered image of the scene. To do this requires extensive information regarding the satellite position and pointing as a function of time, the precise configuration of the focal plane, and the distortion due to the optics. These must be combined with knowledge about the position and altitude of the target on the rotating ellipsoidal earth. We will discuss the direct approach to MTI coregistration, as well as more recent attempts to tweak the precision of the band-to-band registration using correlations in the imagery itself.
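Where band-to-band registration is tweaked using correlations in the imagery itself, the underlying operation is a shift estimate between two bands of the same scene. A minimal integer-pixel phase correlation sketch (illustrative only, not the MTI production code):

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the integer shift d such that b ~ np.roll(a, d)."""
    F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    corr = np.fft.ifft2(F / np.maximum(np.abs(F), 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks past the midpoint to negative shifts
    return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

# Synthetic pair: "band 2" is "band 1" shifted by (3, -2) pixels
rng = np.random.default_rng(5)
band1 = rng.normal(size=(64, 64))
band2 = np.roll(band1, shift=(3, -2), axis=(0, 1))
print("estimated (row, col) shift:", phase_correlation_shift(band1, band2))
```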
LabVIEW-based control software for para-hydrogen induced polarization instrumentation.
Agraz, Jose; Grunfeld, Alexander; Li, Debiao; Cunningham, Karl; Willey, Cindy; Pozos, Robert; Wagner, Shawn
2014-04-01
The elucidation of cell metabolic mechanisms is the modern underpinning of the diagnosis, treatment, and in some cases the prevention of disease. Para-hydrogen induced polarization (PHIP) enhances magnetic resonance imaging (MRI) signals over 10,000-fold, allowing for the MRI of cell metabolic mechanisms. This signal enhancement is the result of hyperpolarizing endogenous substances used as contrast agents during imaging. PHIP instrumentation hyperpolarizes carbon-13 (13C) based substances using a process requiring control of a number of factors: chemical reaction timing, gas flow, monitoring of a static magnetic field (B0), radio frequency (RF) irradiation timing, reaction temperature, and gas pressures. Current PHIP instruments control the hyperpolarization process manually, without precise control of the factors listed above, resulting in non-reproducible results. We discuss the design and implementation of a LabVIEW-based computer program that automatically and precisely controls the delivery and manipulation of gases and samples, monitoring gas pressures, environmental temperature, and RF sample irradiation. We show that the automated control over the hyperpolarization process results in the hyperpolarization of hydroxyethylpropionate. The implementation of this software provides the fast prototyping of PHIP instrumentation for the evaluation of a myriad of 13C-based endogenous contrast agents used in molecular imaging.
A diabetic retinopathy detection method using an improved pillar K-means algorithm.
Gogula, Susmitha Valli; Divakar, Ch; Satyanarayana, Ch; Rao, Allam Appa
2014-01-01
The paper presents a new approach for medical image segmentation. Exudates are a visible sign of diabetic retinopathy, the major cause of vision loss in patients with diabetes. If the exudates extend into the macular area, blindness may occur. Automated detection of exudates will assist ophthalmologists in early diagnosis. This segmentation process includes a new mechanism for clustering the elements of high-resolution images in order to improve precision and reduce computation time. The system applies K-means clustering to the image segmentation after optimization by the Pillar algorithm; pillars are constructed in such a way that they can withstand the pressure. The improved Pillar algorithm can optimize K-means clustering for image segmentation in terms of precision and computation time. The proposed approach is evaluated by comparing it with K-means and Fuzzy C-means on a medical image. Using this method, identification of dark spots in the retina becomes easier, and the proposed algorithm is applied to diabetic retinal images of all stages to identify hard and soft exudates, whereas the existing Pillar K-means is more appropriate for brain MRI images. This proposed system helps doctors identify the problem at an early stage and can suggest better drugs for preventing further retinal damage.
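The pillar idea, loosely, is to seed K-means with centroids spread far apart, the way pillars are placed to bear a load. A rough sketch of that initialization (our simplification, with invented grey-level data, not the paper's algorithm):

```python
import numpy as np
from sklearn.cluster import KMeans

def pillar_seeds(X, k):
    """Pick k well-separated seeds: each new seed is the sample farthest
    (in summed distance) from the seeds chosen so far."""
    seeds = [X[np.argmax(np.linalg.norm(X - X.mean(0), axis=1))]]
    while len(seeds) < k:
        d = sum(np.linalg.norm(X - s, axis=1) for s in seeds)
        seeds.append(X[np.argmax(d)])
    return np.array(seeds)

# Toy "retinal image" pixels: grey levels as 1-D features
rng = np.random.default_rng(6)
pixels = np.concatenate([rng.normal(0.2, 0.05, 500),   # dark background
                         rng.normal(0.5, 0.05, 300),   # vessels/retina
                         rng.normal(0.9, 0.03, 50)])   # bright exudates
X = pixels.reshape(-1, 1)
km = KMeans(n_clusters=3, init=pillar_seeds(X, 3), n_init=1).fit(X)
print("cluster centres:", np.sort(km.cluster_centers_.ravel()))
```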
Creating Ruddlesden-Popper phases by hybrid molecular beam epitaxy
NASA Astrophysics Data System (ADS)
Haislmaier, Ryan C.; Stone, Greg; Alem, Nasim; Engel-Herbert, Roman
2016-07-01
The synthesis of a 50 unit cell thick n = 4 Srn+1TinO3n+1 (Sr5Ti4O13) Ruddlesden-Popper (RP) phase film is demonstrated by sequentially depositing SrO and TiO2 layers in an alternating fashion using hybrid molecular beam epitaxy (MBE), where Ti was supplied using titanium tetraisopropoxide (TTIP). A detailed calibration procedure is outlined for determining the shuttering times needed to deposit SrO and TiO2 layers with precise monolayer doses, using in-situ reflection high energy electron diffraction (RHEED) as feedback. Using optimized Sr and TTIP shuttering times, a fully automated growth of the n = 4 RP phase was carried out over a period of >4.5 h. Very stable RHEED intensity oscillations were observed over the entire growth period. Structural characterization by X-ray diffraction and high resolution transmission electron microscopy revealed that a constant periodicity of four SrTiO3 perovskite unit cell blocks separating the double SrO rocksalt layer was maintained throughout the entire film thickness, with very few planar faults oriented perpendicular to the growth front direction. These results illustrate that hybrid MBE is capable of layer-by-layer growth with atomic level precision and excellent flux stability.
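To make the layer sequencing concrete, here is a small sketch that generates the shutter schedule for an n = 4 RP phase: four SrO/TiO2 perovskite pairs followed by one extra SrO layer per formula unit, so that adjacent units form the double-SrO rocksalt layer. The per-monolayer shutter times are hypothetical stand-ins for the RHEED-calibrated doses.

```python
# Shutter schedule for automated growth of an n = 4 Ruddlesden-Popper
# phase (Sr5Ti4O13): per formula unit, 5 SrO and 4 TiO2 monolayers.
T_SRO, T_TIO2 = 30.0, 45.0   # s per monolayer; illustrative values only

def rp_sequence(n=4, repeats=50):
    """One unit = n (SrO, TiO2) pairs + one extra SrO (rocksalt fault)."""
    unit = [("Sr", T_SRO), ("TTIP", T_TIO2)] * n + [("Sr", T_SRO)]
    return unit * repeats

seq = rp_sequence()
total_s = sum(t for _, t in seq)
print(len(seq), f"shutter events, ~{total_s / 3600:.1f} h growth")
# 450 shutter events, ~4.6 h growth, consistent in scale with the
# >4.5 h automated run reported above.
```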
Fast estimation of space-robots inertia parameters: A modular mathematical formulation
NASA Astrophysics Data System (ADS)
Nabavi Chashmi, Seyed Yaser; Malaek, Seyed Mohammad-Bagher
2016-10-01
This work proposes a new technique that considerably improves the time and precision needed to identify the "Inertia Parameters" (IPs) of a typical Autonomous Space-Robot (ASR). Operations might include capturing an unknown Target Space-Object (TSO), "active space-debris removal", or "automated in-orbit assemblies". In these operations, generating precise successive commands is essential to the success of the mission. We show how a generalized, repeatable estimation process could play an effective role in managing the operation. With the help of the well-known force-based approach, a new "modular formulation" has been developed to simultaneously identify the IPs of an ASR while it captures a TSO. The idea is to reorganize the equations so that the IPs are associated with a "modular set" of matrices instead of a single matrix representing the overall system dynamics. The devised modular matrix set then facilitates the estimation process, providing a model that is conjugate linear in the mass and inertia terms. The new formulation is therefore well suited for "simultaneous estimation processes" using recursive algorithms such as RLS. Further enhancements would be needed for cases where the effect of the center-of-mass location becomes important. Extensive case studies reveal that estimation time is drastically reduced, which in turn paves the way to better results.
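Since the formulation is linear in the unknown parameters, a generic recursive least squares (RLS) loop conveys the estimation step. The sketch below is not the paper's modular formulation: the regressor matrix Phi is synthetic, and only the RLS mechanics are shown.

```python
# Generic RLS for a model linear in the unknown inertia parameters,
# y_t = Phi_t @ theta. The paper's contribution is the modular
# construction of Phi_t from the captured stack's dynamics; here
# Phi_t is random, so only the estimator itself is illustrated.
import numpy as np

def rls_update(theta, P, Phi, y, lam=1.0):
    """One RLS step with forgetting factor lam; returns (theta, P)."""
    S = lam * np.eye(len(y)) + Phi @ P @ Phi.T
    K = P @ Phi.T @ np.linalg.inv(S)        # gain
    theta = theta + K @ (y - Phi @ theta)   # innovation correction
    P = (P - K @ Phi @ P) / lam
    return theta, P

rng = np.random.default_rng(2)
theta_true = np.array([120.0, 4.5, 3.2, 7.8])  # e.g., mass/inertia terms
theta, P = np.zeros(4), 1e3 * np.eye(4)

for _ in range(200):
    Phi = rng.standard_normal((3, 4))          # stand-in regressor rows
    y = Phi @ theta_true + 0.01 * rng.standard_normal(3)
    theta, P = rls_update(theta, P, Phi, y)

print(np.round(theta, 2))  # converges to theta_true
```

A forgetting factor lam < 1 would let the estimate track slowly varying parameters, at the cost of noisier estimates.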
LabVIEW-based control software for para-hydrogen induced polarization instrumentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agraz, Jose, E-mail: joseagraz@ucla.edu; Grunfeld, Alexander; Li, Debiao
2014-04-15
The elucidation of cell metabolic mechanisms is the modern underpinning of the diagnosis, treatment, and in some cases the prevention of disease. Para-Hydrogen induced polarization (PHIP) enhances magnetic resonance imaging (MRI) signals over 10,000 fold, allowing for the MRI of cell metabolic mechanisms. This signal enhancement is the result of hyperpolarizing endogenous substances used as contrast agents during imaging. PHIP instrumentation hyperpolarizes Carbon-13 ((13)C) based substances using a process requiring control of a number of factors: chemical reaction timing, gas flow, monitoring of a static magnetic field (Bo), radio frequency (RF) irradiation timing, reaction temperature, and gas pressures. Current PHIP instruments control the hyperpolarization process manually, which precludes precise control of the factors listed above and leads to non-reproducible results. We discuss the design and implementation of a LabVIEW-based computer program that automatically and precisely controls the delivery and manipulation of gases and samples, the monitoring of gas pressures and environmental temperature, and RF sample irradiation. We show that automated control over the hyperpolarization process results in the hyperpolarization of hydroxyethylpropionate. The implementation of this software enables fast prototyping of PHIP instrumentation for the evaluation of a myriad of (13)C-based endogenous contrast agents used in molecular imaging.
Applied Augmented Reality for High Precision Maintenance
NASA Astrophysics Data System (ADS)
Dever, Clark
Augmented Reality had a major consumer breakthrough this year with Pokemon Go. The underlying technologies that made that app a success with gamers can be applied to improve the efficiency and efficacy of workers. This session will explore some of the use cases for augmented reality in an industrial environment. In doing so, the environmental impacts and human factors that must be considered will be explored. Additionally, the sensors, algorithms, and visualization techniques used to realize augmented reality will be discussed. The benefits of augmented reality solutions in industrial environments include automated data recording, improved quality assurance, reduction in training costs and improved mean-time-to-resolution. As technology continues to follow Moore's law, more applications will become feasible as performance-per-dollar increases across all system components.
Dupuy, Anne Marie; Hurstel, Rémy; Bargnoux, Anne Sophie; Badiou, Stéphanie; Cristol, Jean Paul
2014-01-01
Rheumatoid factor (RF) consists of autoantibodies and, because of its heterogeneity, its determination is not easy. Currently, nephelometry and ELISA are considered reference methods. Due to consolidation, many laboratories have fully automated turbidimetric instruments, and specific nephelometric systems are not always available. In addition, nephelometry is more accurate but time consuming and expensive, and it requires a specific device, resulting in lower efficiency. Turbidimetry could be an attractive alternative. The turbidimetric RF test from Diagam meets the requirements of accuracy and precision for optimal clinical use, with an acceptable measuring range, and could be an alternative for the determination of RF without the cost of a dedicated instrument, making consolidation and blood savings possible.
Optimization of the tungsten oxide technique for measurement of atmospheric ammonia
NASA Technical Reports Server (NTRS)
Brown, Kenneth G.
1987-01-01
Hollow tubes coated with tungstic acid have been shown to be of value in the determination of ammonia and nitric acid in ambient air. Practical application of this technique was demonstrated using an automated sampling system for in-flight collection and analysis of atmospheric samples. Due to time constraints, these previous measurements were performed on tubes that had not been well characterized in the laboratory, so the experimental precision could not be accurately estimated. Since the technique was being compared with other techniques for measuring these compounds, it became necessary to perform laboratory tests to establish its reliability. This report summarizes these laboratory experiments as applied to the determination of ambient ammonia concentration.
Prototype space station automation system delivered and demonstrated at NASA
NASA Technical Reports Server (NTRS)
Block, Roger F.
1987-01-01
The Automated Subsystem Control for Life Support System (ASCLSS) program has successfully developed and demonstrated a generic approach to the automation and control of Space Station subsystems. The hierarchical and distributed real-time control system places the required control authority at every level of the automation system architecture. As a demonstration of the automation technique, the ASCLSS system automated the Air Revitalization Group (ARG) of the Space Station regenerative Environmental Control and Life Support System (ECLSS) using real-time, high-fidelity simulators of the ARG processes. This automation system represents an early flight prototype and an important test bed for evaluating Space Station controls technology, including future application of Ada software in real-time control and the development and demonstration of embedded artificial intelligence and expert systems (AI/ES) in distributed automation and control systems.
Development, history, and future of automated cell counters.
Green, Ralph; Wachsmann-Hogiu, Sebastian
2015-03-01
Modern automated hematology instruments use either optical methods (light scatter), impedance-based methods based on the Coulter principle (changes in electrical current induced by blood cells flowing through an electrically charged opening), or a combination of both optical and impedance-based methods. Progressive improvement in these instruments has allowed the enumeration and evaluation of blood cells with great accuracy, precision, and speed at very low cost. Future directions of hematology instrumentation include the addition of new parameters and the development of point-of-care instrumentation. In the future, in-vivo analysis of blood cells may allow noninvasive and near-continuous measurements. Copyright © 2015 Elsevier Inc. All rights reserved.
Computer vision applications for coronagraphic optical alignment and image processing.
Savransky, Dmitry; Thomas, Sandrine J; Poyneer, Lisa A; Macintosh, Bruce A
2013-05-10
Modern coronagraphic systems require very precise alignment between optical components and can benefit greatly from automated image processing. We discuss three techniques commonly employed in the fields of computer vision and image analysis as applied to the Gemini Planet Imager, a new facility instrument for the Gemini South Observatory. We describe how feature extraction and clustering methods can be used to aid in automated system alignment tasks, and also present a search algorithm for finding regular features in science images used for calibration and data processing. Along with discussions of each technique, we present our specific implementation and show results of each one in operation.
Object detection in cinematographic video sequences for automatic indexing
NASA Astrophysics Data System (ADS)
Stauder, Jurgen; Chupeau, Bertrand; Oisel, Lionel
2003-06-01
This paper presents an object detection framework applied to cinematographic post-processing of video sequences, which is done after production and before editing. At the beginning of each shot of a video, a slate (also called a clapperboard) is shown. The slate notably contains an electronic audio timecode that is necessary for audio-visual synchronization. The framework detects slates in video sequences for automatic indexing and post-processing and is based on five steps. The first two steps drastically reduce the video data to be analyzed; they ensure a high recall rate but have low precision. The first step detects images at the beginning of a shot that possibly show a slate, while the second step searches these images for candidate regions with a color distribution similar to slates. The objective is not to miss any slate while eliminating long parts of video without slate appearance. The third and fourth steps apply statistical classification and pattern matching to detect and precisely locate slates in the candidate regions. These steps ensure a high recall rate and high precision. The objective is to detect slates with very few false alarms to minimize interactive corrections. In a last step, electronic timecodes are read from the slates to automate audio-visual synchronization. The presented slate detector has a recall rate of 89% and a precision of 97.5%. By temporal integration, much more than 89% of shots in dailies are detected, and by timecode coherence analysis the precision can be raised further. Future work will accelerate the system to run faster than real time and extend the framework to several slate types.
MICRONERVA: A Novel Approach to Large Aperture Astronomical Spectroscopy
NASA Astrophysics Data System (ADS)
Hall, Ryan; Plavchan, Peter; Geneser, Claire; Giddens, Frank; Spangler, Sophia
2016-06-01
MICRONERVA (MICRO Novel Exoplanet Radial Velocity Array) is a project to measure precise spectroscopic radial velocities. The cost of a telescope is a strong function of its diameter, and light-gathering power, rather than angular resolution, is the fundamental driver of telescope design for many spectroscopic science applications. By sacrificing angular resolution, multiple smaller fiber-fed telescopes can be combined to synthesize the light-gathering power of a larger-diameter telescope at a lower effective cost. For our MICRONERVA prototype, based upon the larger MINERVA project, we will attempt to demonstrate that an array of four 8-inch CPC Celestron telescopes can be automated with sufficient active guiding precision for robust nightly robotic operations. The light from each telescope is coupled into single-mode fibers, which are conveniently matched to the point spread function of 8-inch telescopes, since such telescopes can be diffraction limited at red wavelengths in typical seeing at good observing sites. Additionally, the output from an array of single-mode fibers provides stable output illumination of a spectrograph, which is a critical requirement of future precise radial velocity instrumentation. All of the hardware in the system is automated using Python programs and ASCOM and MaxIm DL software drivers. We will present an overview of the current status of the project and plans for future work. The detection of exoplanets using the techniques of MICRONERVA could potentially enable cost reductions for many types of spectroscopic research.
Automatic evidence retrieval for systematic reviews.
Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G; Tsafnat, Guy
2014-10-01
Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. It is an alternative approach to discovering additional evidence that was not retrieved through conventional search, and its effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Our goal was to evaluate the capacity of an automatic citation snowballing method to identify and retrieve the full text and/or abstracts of cited articles. Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. The automatic method correctly identified 633 citations (as a proportion of included citations: recall=66.7%, F1 score=79.3%; as a proportion of citations in MAS: recall=85.5%, F1 score=91.2%) with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly identified citations. The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews.
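As a quick consistency check, the reported F1 scores follow directly from the harmonic mean of the stated precision and recall:

```python
# F1 is the harmonic mean of precision and recall; the figures in the
# abstract are self-consistent.
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

print(round(100 * f1(0.977, 0.667), 1))  # 79.3, matching the abstract
print(round(100 * f1(0.977, 0.855), 1))  # 91.2, matching the abstract
```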
Automated MRI segmentation for individualized modeling of current flow in the human head.
Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C
2013-12-01
High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials.
Quantitative structure-activity relationship models that stand the test of time.
Davis, Andrew M; Wood, David J
2013-04-01
The pharmaceutical industry is in a period of intense change. While this has many drivers, attrition through the development process continues to be an important pressure. The emerging definitions of "compound quality" that are based on retrospective analyses of developmental attrition have highlighted a new direction for medicinal chemistry and the paradigm of "quality at the point of design". The time has come for retrospective analyses to catalyze prospective action. Quality at the point of design places pressure on the quality of our predictive models. Empirical QSAR models when built with care provide true predictive control, but their accuracy and precision can be improved. Here we describe AstraZeneca's experience of automation in QSAR model building and validation, and how an informatics system can provide a step-change in predictive power to project design teams, if they choose to use it.
Collection, transport and general processing of clinical specimens in Microbiology laboratory.
Sánchez-Romero, M Isabel; García-Lechuz Moya, Juan Manuel; González López, Juan José; Orta Mira, Nieves
2018-02-06
The interpretation and accuracy of microbiological results still depend to a great extent on the quality of the samples and their processing within the Microbiology laboratory. The type of specimen, the appropriate time to obtain the sample, the way of sampling, and the storage and transport are critical points in the diagnostic process. The availability of new laboratory techniques for unusual pathogens makes it necessary to review and update all the steps involved in the processing of samples. Nowadays, laboratory automation and the availability of rapid techniques allow the precision and turnaround time necessary to help clinicians in decision making. To be efficient, it is very important to obtain clinical information so that the best diagnostic tools can be used. Copyright © 2018 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.
Greenlee, Eric T; DeLucia, Patricia R; Newton, David C
2018-06-01
The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task: drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill equipped to detect. Given the similarity with traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes, during which their task was to monitor the roadway for hazards. As predicted, the hazard detection rate declined precipitously and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing and that it is a challenge to maintain task engagement. Monitoring the roadway for potential hazards during automated driving thus results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.
Toward Automated International Law Compliance Monitoring (TAILCM)
2014-07-01
Morgenstern, Leora
[Only figure residue is recoverable from this report record: precision charts titled "Corrected and Uncorrected Precision for Each Category" (categories: Regulation Type, Action, Agent, Patient, Condition, Exception) and "Precision of Each Category for Each Adjudicator" (adjudicators A1-A3); the numeric axis data are not recoverable.]
Jayakody, Chatura; Hull-Ryde, Emily A
2016-01-01
Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
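A minimal sketch of the gravimetric side of such a QC check, assuming replicate dispenses are weighed and converted to volume with a nominal water density; the example masses and the density constant are illustrative, and evaporation and temperature corrections are omitted for brevity:

```python
# Gravimetric QC: weigh n replicate dispenses of a nominal volume,
# convert mass to volume, and report relative accuracy and CV%.
import numpy as np

def gravimetric_qc(masses_mg, target_ul, density_mg_per_ul=0.998):
    vols = np.asarray(masses_mg) / density_mg_per_ul   # mg -> uL (water)
    accuracy = 100 * (vols.mean() - target_ul) / target_ul
    cv = 100 * vols.std(ddof=1) / vols.mean()          # precision
    return accuracy, cv

masses = [9.91, 10.02, 9.95, 10.08, 9.87, 9.99]  # example 10 uL dispenses
acc, cv = gravimetric_qc(masses, target_ul=10.0)
print(f"relative accuracy {acc:+.2f}%, precision CV {cv:.2f}%")
```

Photometric QC complements this by confirming that the correct volume reached each individual well, using the absorbance of a dye of known concentration.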
Automated mosaicking of sub-canopy video incorporating ancillary data
E. Kee; N.E. Clark; A.L. Abbott
2002-01-01
This work investigates the process of mosaicking overlapping video frames of individual tree stems in sub-canopy scenes captured with a portable multisensor instrument. The robust commercial computer vision systems that are in use today typically rely on precisely controlled conditions. Inconsistent lighting as well as image distortion caused by varying interior and...
Topic categorisation of statements in suicide notes with integrated rules and machine learning.
Kovačević, Aleksandar; Dehghan, Azad; Keane, John A; Nenadic, Goran
2012-01-01
We describe and evaluate an automated approach used as part of the i2b2 2011 challenge to identify and categorise statements in suicide notes into one of 15 topics, including Love, Guilt, Thankfulness, Hopelessness and Instructions. The approach combines a set of lexico-syntactic rules with a set of models derived by machine learning from a training dataset. The machine learning models rely on named entities, lexical, lexico-semantic and presentation features, as well as the rules that are applicable to a given statement. On a testing set of 300 suicide notes, the approach showed the overall best micro F-measure of up to 53.36%. The best precision achieved was 67.17% when only rules are used, whereas best recall of 50.57% was with integrated rules and machine learning. While some topics (eg, Sorrow, Anger, Blame) prove challenging, the performance for relatively frequent (eg, Love) and well-scoped categories (eg, Thankfulness) was comparatively higher (precision between 68% and 79%), suggesting that automated text mining approaches can be effective in topic categorisation of suicide notes.
Kelly, Mary T; Blaise, Alain; Larroque, Michel
2010-11-19
This paper reports a new, simple, rapid, and economical method for the routine determination of 24 amino acids and biogenic amines in grapes and wine. No sample clean-up is required, and the total run time including column re-equilibration is less than 40 min. Following automated in-loop pre-column derivatisation with an o-phthaldialdehyde/N-acetyl-L-cysteine reagent, compounds were separated on a 3 mm × 25 cm C(18) column using a binary mobile phase. The method was validated in the range 0.25-10 mg/l; repeatability was less than 3% RSD and the intermediate precision ranged from 2 to 7% RSD. The method was shown to be linear by the 'lack of fit' test, and the accuracy was between 97 and 101%. The LLOQ varied between 10 μg/l for aspartic and glutamic acids, ethanolamine, and GABA, and 100 μg/l for tyrosine, phenylalanine, putrescine, and cadaverine. The method was applied to grapes, white wine, red wine, honey, and three species of physalis fruit. Grapes and physalis fruit were crushed, sieved, centrifuged, and diluted 1/20 and 1/100, respectively, for analysis; wines and honeys were simply diluted 10-fold. Using this method, it was shown that the amino acid content of grapes was strongly correlated with berry volume, moderately correlated with sugar concentration, and inversely correlated with total acidity. Copyright © 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Hill, Jason E.; Fernandez-Del-Valle, Maria; Hayden, Ryan; Mitra, Sunanda
2017-02-01
Magnetic Resonance Imaging (MRI) and Magnetic Resonance Spectroscopy (MRS) together have become the gold standard in the precise quantification of body fat. The study of the quantification of fat in the human body has matured in recent years from a simplistic interest in whole-body fat content to detailing regional fat distributions. Body fat, or adipose tissue (AT), is now understood to be far from a mere aggregate mass or deposit: it is a biologically active organ in and of itself that may play a role in the association between obesity and the various pathologies that are among the biggest health issues of our time. Furthermore, a major bottleneck in most medical image assessments of adipose tissue content and distribution is the lack of automated image analysis. This motivated us to develop a proper and at least partially automated methodology to accurately and reproducibly determine both body fat content and distribution in the human body, to be applied to cross-sectional and longitudinal studies. The AT considered here is located beneath the skin (subcutaneous) as well as around the internal organs and between muscles (visceral and inter-muscular). There are also special fat depots on and around the heart (pericardial) as well as around the aorta (peri-aortic). Our methods focus on measuring and classifying these various AT deposits in the human body in an intervention study that involves the acquisition of thoracic and abdominal MR images via a Dixon technique.
Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca
2015-04-01
In line with current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of the highly toxic and widespread mycotoxins aflatoxins (AFs) and ochratoxin A (OTA) in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE) of the slurried dried fruit with aqueous methanol (30%) at 110 °C, followed by online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation for dried fruit intended for direct human consumption. Recoveries (83-103%) and repeatability (RSD < 8%, n = 3) also meet the performance criteria required by EU regulation for the determination of mycotoxin levels in foodstuffs. The main advantage of the proposed method is the full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation, and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.
Kawalilak, C E; Johnston, J D; Cooper, D M L; Olszynski, W P; Kontulainen, S A
2016-02-01
Precision errors of cortical bone micro-architecture from high-resolution peripheral quantitative computed tomography (HR-pQCT) ranged from 1 to 16% and did not differ between automatic and manually modified endocortical contour methods in postmenopausal women or young adults. In postmenopausal women, manually modified contours led to generally higher cortical bone properties when compared with the automated method. The first objective of the study was to define in vivo precision errors (coefficient of variation root mean square, CV%RMS) and the least significant change (LSC) for cortical bone micro-architecture from HR-pQCT scans using two endocortical contouring methods, automatic (AUTO) and manually modified (MOD), in two groups (postmenopausal women and young adults). The second objective was to compare precision errors and bone outcomes obtained with both methods within and between groups. Using HR-pQCT, we scanned the distal radius and tibia of 34 postmenopausal women (mean age ± SD 74 ± 7 years) and 30 young adults (27 ± 9 years) twice. Cortical micro-architecture was determined using the AUTO and MOD contour methods, and CV%RMS and LSC were calculated. Repeated-measures and multivariate ANOVA were used to compare mean CV% and bone outcomes between the methods within and between the groups. Significance was accepted at P < 0.05. CV%RMS ranged from 0.9 to 16.3%. Within-group precision did not differ between evaluation methods. Compared with young adults, postmenopausal women had better precision for radial cortical porosity (precision difference 9.3%) and pore volume (7.5%) with MOD. Young adults had better precision for cortical thickness (0.8%, MOD) and tibial cortical density (0.2%, AUTO). In postmenopausal women, MOD resulted in 0.2-54% higher values for most cortical outcomes, as well as 6-8% lower radial and tibial cortical BMD and 2% lower tibial cortical thickness. The results suggest that the AUTO and MOD endocortical contour methods provide comparable repeatability. In postmenopausal women, manual modification of endocortical contours led to generally higher cortical bone properties when compared with the automated method, while no between-method differences were observed in young adults.
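For reference, here is a minimal sketch of the precision statistics used in such studies, assuming the common root-mean-square CV formulation and the 2.77 factor for LSC at 95% confidence with duplicate scans; the data below are synthetic, not the study's.

```python
# Short-term precision from duplicate scans: per-subject CV, then the
# root-mean-square CV across subjects; LSC = 2.77 x precision error.
import numpy as np

def cv_rms(scan1, scan2):
    x = np.column_stack([scan1, scan2])
    cv = x.std(axis=1, ddof=1) / x.mean(axis=1)   # per-subject CV
    return 100 * np.sqrt(np.mean(cv ** 2))        # CV%RMS

rng = np.random.default_rng(3)
true_vals = rng.normal(1.0, 0.15, 34)             # e.g., cortical porosity
s1 = true_vals * (1 + 0.04 * rng.standard_normal(34))
s2 = true_vals * (1 + 0.04 * rng.standard_normal(34))

precision = cv_rms(s1, s2)
print(f"CV%RMS = {precision:.1f}%, LSC = {2.77 * precision:.1f}%")
```

A change in an individual patient smaller than the LSC cannot be distinguished from measurement noise at 95% confidence.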
Hellmuth, Christian; Weber, Martina; Koletzko, Berthold; Peissner, Wolfgang
2012-02-07
Despite their central importance for lipid metabolism, straightforward quantitative methods for the determination of nonesterified fatty acid (NEFA) species are still missing. The protocol presented here provides unbiased quantitation of plasma NEFA species by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Simple deproteination of plasma in an organic solvent solution yields high accuracy, including both the unbound and initially protein-bound fractions, while avoiding interference from hydrolysis of esterified fatty acids from other lipid classes. Sample preparation is fast and inexpensive, hence well suited for automation and high-throughput applications. Separation of isotopologic NEFA is achieved using ultrahigh-performance liquid chromatography (UPLC) coupled to triple quadrupole LC-MS/MS detection. In combination with automated liquid handling, total assay time per sample is less than 15 min. The analytical spectrum extends beyond readily available NEFA standard compounds through a regression model predicting all the relevant analytical parameters (retention time, ion path settings, and response factor) of NEFA species based on chain length and number of double bonds. Detection of 50 NEFA species and accurate quantification of 36 NEFA species in human plasma is described, the highest numbers ever reported for an LC-MS application. Accuracy and precision are within widely accepted limits. The use of qualifier ions supports unequivocal analyte verification. © 2012 American Chemical Society
Determination of phenolic priority pollutants utilizing permeation sampling method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Guozheng.
1990-01-01
A passive permeation sampling method for the determination of phenolic priority pollutants in water was developed. Phenols in an aqueous solution permeate a polymeric membrane and are collected on a solid adsorbent in a sampling device. Both solvent and thermal desorption techniques were employed to recover the collected phenolic pollutants. In solvent desorption, phenolic compounds were collected on XAD-7 resin and then desorbed with acetonitrile. In thermal desorption, phenolic compounds collected on Tenax-TA were recovered thermally. Separation and quantification were achieved by capillary gas chromatography on an SPB-5 column using a flame ionization detector. There are linear relationships between the amount of phenolic compounds collected and the product of exposure time and concentration over the range from 5 ppb to 20 ppm, with precisions no worse than 13%. The permeation rates of the phenolic pollutants depend upon the exposure temperature, solution pH, and membrane area. Collected samples can be stored for up to two weeks without loss. This method provides a simple, convenient, and inexpensive way to monitor the time-weighted-average concentration without the use of a pumping system. An automated sampler that combines the permeation and thermal desorption techniques was also developed for water samples obtained by grab sampling. The on-line setup provides a high degree of automation, and detection limits of 10 ppb can be achieved using this sampler.
Schorling, Stefan; Schalasta, Gunnar; Enders, Gisela; Zauke, Michael
2004-01-01
The COBAS AmpliPrep instrument (Roche Diagnostics GmbH, D-68305 Mannheim, Germany) automates the entire sample preparation process of nucleic acid isolation from serum or plasma for polymerase chain reaction analysis. We report the analytical performance of the LightCycler Parvovirus B19 Quantification Kit (Roche Diagnostics) using nucleic acids isolated with the COBAS AmpliPrep instrument. Nucleic acids were extracted using the Total Nucleic Acid Isolation Kit (Roche Diagnostics) and amplified with the LightCycler Parvovirus B19 Quantification Kit. The kit combination processes 72 samples per 8-hour shift. The lower detection limit is 234 IU/ml at a 95% hit-rate, the linear range approximately 10(4)-10(10) IU/ml, and overall precision 16 to 40%. Relative sensitivity and specificity in routine samples from pregnant women are 100% and 93%, respectively. Identification of a persistent parvovirus B19-infected individual by the polymerase chain reaction among 51 anti-parvovirus B19 IgM-negative samples underlines the importance of additional nucleic acid testing in pregnancy and its superiority to serology in identifying the risk of parvovirus B19 transmission via blood or blood products. Combination of the Total Nucleic Acid Isolation Kit on the COBAS AmpliPrep instrument with the LightCycler Parvovirus B19 Quantification Kit provides a reliable and time-saving tool for sensitive and accurate detection of parvovirus B19 DNA. PMID:14736825
NASA Astrophysics Data System (ADS)
Brocks, Sebastian; Bendig, Juliane; Bareth, Georg
2016-10-01
Crop surface models (CSMs) representing plant height above ground level are a useful tool for monitoring in-field crop growth variability and enabling precision agriculture applications. A semiautomated system for generating CSMs was implemented. It combines an Android application running on a set of smart cameras for image acquisition and transmission with a set of Python scripts that automate the structure-from-motion (SfM) software package Agisoft Photoscan and ArcGIS. Only ground-control-point (GCP) marking was performed manually. This system was set up on a barley field experiment with nine different barley cultivars in the growing period of 2014. Images were acquired three times a day for a period of two months, and CSMs were successfully generated for 95 of 98 acquisitions between May 2 and June 30. The best linear regressions of the CSM-derived, plot-wise averaged plant heights against manual plant height measurements taken at four dates resulted in a coefficient of determination R2 of 0.87 and a root-mean-square error (RMSE) of 0.08 m, with Willmott's refined index of model performance dr equaling 0.78. In total, 103 mean plot heights were used in the regression based on the noon acquisition time. The presented system succeeded in semiautomated monitoring of crop height at the plot to field scale.
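A short sketch of the agreement statistics reported above (R2, RMSE, and Willmott's refined index dr with the usual c = 2 scaling), computed here on synthetic plot heights rather than the study's data:

```python
# Agreement between CSM-derived and manually measured plot heights.
import numpy as np

def willmott_dr(pred, obs, c=2.0):
    """Willmott's refined index of agreement (ranges -1 to 1)."""
    obs, pred = np.asarray(obs), np.asarray(pred)
    num = np.abs(pred - obs).sum()
    den = c * np.abs(obs - obs.mean()).sum()
    return 1 - num / den if num <= den else den / num - 1

rng = np.random.default_rng(4)
manual = rng.uniform(0.3, 1.0, 103)        # manual heights (m)
csm = manual + rng.normal(0, 0.08, 103)    # CSM heights (m)

r2 = np.corrcoef(manual, csm)[0, 1] ** 2
rmse = np.sqrt(np.mean((csm - manual) ** 2))
print(f"R2={r2:.2f}, RMSE={rmse:.2f} m, dr={willmott_dr(csm, manual):.2f}")
```

Unlike R2, dr penalizes systematic offsets between predicted and observed heights, which is why it is a useful complement for growth-monitoring validation.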
Nestola, Marco; Schmidt, Torsten C
2017-07-07
The determination of mineral oil aromatic hydrocarbons (MOAH) in foodstuffs has gained importance in recent years, as carcinogenicity cannot be excluded for certain MOAH. The presence of olefins in foodstuffs such as edible oils and fats can be problematic for the determination of MOAH by LC-GC-FID. Removal of these interfering substances by HPLC based on polarity differences is not possible: during gas chromatographic separation, heavily overloaded peaks are observed, rendering the detection of small mineral oil contaminations almost impossible. Therefore, these olefins must be removed before the sample is subjected to LC-GC-FID. Epoxidation of olefins to increase their polarity proved a valuable tool in the past; as shown in a collaborative trial, however, the precision and trueness of the results rely on exact reaction conditions. Additionally, it is known that certain MOAH are oxidized during epoxidation and are therefore lost. In the scope of this work, hydroboration, the bromohydrin reaction, and epoxidation were examined for their potential to derivatize unsaturated hydrocarbons with increased robustness and higher recovery of MOAH. Epoxidation by meta-chloroperoxybenzoic acid (mCPBA) delivered the best removal of olefins, and the factors influencing this reaction were elucidated. Adapting the reaction conditions and time-controlled automation increased the recovery of polycyclic MOAH. Good precision (RSDr < 1.5%) and recovery (95-102%) for MOAH were also observed for sunflower and olive oils spiked with a lubricating mineral oil (at 24.5 mg/kg of MOAH). The trueness of the method was verified by analyzing collaborative trial samples. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Daughtrey, E. Hunter; Adams, Jeffrey R.; Oliver, Karen D.; Kronmiller, Keith G.; McClenny, William A.
1998-09-01
A trailer-deployed automated gas chromatograph-mass spectrometer (autoGC-MS) system capable of making continuous hourly measurements was used to determine volatile organic compounds (VOCs) in ambient air at New Hendersonville, Tennessee, and Research Triangle Park, North Carolina, in 1995. The system configuration, including the autoGC-MS, trailer and transfer line, siting, and sampling plan and schedule, is described. The autoGC-MS system employs a pair of matched sorbent traps to allow simultaneous sampling and desorption. Desorption is followed by Stirling engine cryofocusing and subsequent GC separation and mass spectral identification and quantification. Quality control measurements described include evaluating precision and accuracy of replicate analyses of independently supplied audit and round-robin canisters and determining the completeness of the data sets taken in Tennessee. Data quality objectives for precision (±10%) and accuracy (±20%) of 10- to 20-ppbv audit canisters and a completeness of >75% data capture were met. Quality assurance measures used in reviewing the data set include retention time stability, calibration checks, frequency distribution checks, and checks of the mass spectra. Special procedures and tests were used to minimize sorbent trap artifacts, to verify the quality of a standard prepared in our laboratory, and to prove the integrity of the insulated, heated transfer line. A rigorous determination of total system blank concentration levels using humidified scientific air spiked with ozone allowed estimation of method detection limits, ranging from 0.01 to 1.0 ppb C, for most of the 100 target compounds, which were a composite list of the target compounds for the Photochemical Assessment Monitoring Station network, those for Environmental Protection Agency method TO-14, and selected oxygenated VOCs.
Methods for semi-automated indexing for high precision information retrieval
NASA Technical Reports Server (NTRS)
Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.
2002-01-01
OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.
Automated single cell sorting and deposition in submicroliter drops
NASA Astrophysics Data System (ADS)
Salánki, Rita; Gerecsei, Tamás; Orgovan, Norbert; Sándor, Noémi; Péter, Beatrix; Bajtay, Zsuzsa; Erdei, Anna; Horvath, Robert; Szabó, Bálint
2014-08-01
Automated manipulation and sorting of single cells are challenging when intact cells are needed for further investigations, e.g., RNA or DNA sequencing. We applied a computer-controlled micropipette on a microscope, allowing 80 PCR (polymerase chain reaction) tubes to be filled with single cells in a cycle. Due to the Laplace pressure, fluid starts to flow out from the micropipette only above a critical pressure, preventing precise control of drop volume in the submicroliter range. We found an anomalous pressure additive to the Laplace pressure that we attribute to evaporation of the drop. We overcame the problem of the critical dropping pressure with sequentially operated fast fluidic valves timed with millisecond precision. The minimum drop volume was 0.4-0.7 μl with a sorting speed of 15-20 s per cell. After picking NE-4C neuroectodermal mouse stem cells and human primary monocytes from a standard plastic Petri dish, we could gently deposit single cells inside tiny drops. 94 ± 3% and 54 ± 7% of the deposited drops contained single cells for NE-4C cells and monocytes, respectively; 7.5 ± 4% of the drops contained multiple cells in the case of monocytes, and the remaining drops were empty. The number of cells deposited in a drop could be documented by imaging the Petri dish before and after sorting. We tuned the adhesion force of the cells to make the manipulation successful without applying microstructures to trap cells on the surface. We propose that our straightforward and flexible setup opens an avenue for single-cell isolation, critically needed for the rapidly growing field of single-cell biology.
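The critical dropping pressure follows from the Laplace pressure of the meniscus at the tip, dP = 2*gamma/r. A back-of-envelope check with illustrative values (the tip radius is an assumption, not the paper's):

```python
# Laplace pressure of the meniscus at a micropipette tip: dP = 2*gamma/r.
gamma = 0.072   # N/m, water-air surface tension
r_tip = 35e-6   # m, hypothetical micropipette tip radius
dP = 2 * gamma / r_tip
print(f"Laplace pressure ~ {dP:.0f} Pa (~{dP / 100:.0f} mbar)")
# ~4100 Pa (~41 mbar): below this threshold no fluid flows out, which
# is why millisecond-timed valves, rather than static pressure control,
# are needed for reproducible sub-microliter drops.
```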
Methods for Semi-automated Indexing for High Precision Information Retrieval
Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.
2002-01-01
Objective. To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. Design. Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. Participants. Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. Measurements. Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. Results. Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). Summary. Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy. PMID:12386114
PyDBS: an automated image processing workflow for deep brain stimulation surgery.
D'Albis, Tiziano; Haegelen, Claire; Essert, Caroline; Fernández-Vidal, Sara; Lalys, Florent; Jannin, Pierre
2015-02-01
Deep brain stimulation (DBS) is a surgical procedure for treating motor-related neurological disorders. DBS clinical efficacy hinges on precise surgical planning and accurate electrode placement, which in turn call upon several image processing and visualization tasks, such as image registration, image segmentation, image fusion, and 3D visualization. These tasks are often performed by a heterogeneous set of software tools, which adopt differing formats and geometrical conventions and require patient-specific parameterization or interactive tuning. To overcome these issues, we introduce in this article PyDBS, a fully integrated and automated image processing workflow for DBS surgery. PyDBS consists of three image processing pipelines and three visualization modules assisting clinicians through the entire DBS surgical workflow, from the preoperative planning of electrode trajectories to the postoperative assessment of electrode placement. The system's robustness, speed, and accuracy were assessed by means of a retrospective validation, based on 92 clinical cases. The complete PyDBS workflow achieved satisfactory results in 92 % of tested cases, with a median processing time of 28 min per patient. The results obtained are compatible with the adoption of PyDBS in clinical practice.
NASA Astrophysics Data System (ADS)
Shim, Hackjoon; Lee, Soochan; Kim, Bohyeong; Tao, Cheng; Chang, Samuel; Yun, Il Dong; Lee, Sang Uk; Kwoh, Kent; Bae, Kyongtae
2008-03-01
Knee osteoarthritis is the most common debilitating health condition affecting the elderly population. MR imaging of the knee is highly sensitive for diagnosis and evaluation of the extent of knee osteoarthritis. Quantitative analysis of the progression of osteoarthritis is commonly based on segmentation and measurement of articular cartilage from knee MR images. Segmentation of the knee articular cartilage, however, is extremely laborious and technically demanding, because the cartilage is geometrically complex, thin, and small. To improve the precision and efficiency of cartilage segmentation, we have applied a semi-automated segmentation method based on an s/t graph cut algorithm. The cost function is defined by integrating regional and boundary cues: regional cues can encode any intensity distributions of the two regions, "object" (cartilage) and "background" (the rest), while boundary cues are based on the intensity differences between neighboring pixels. For three-dimensional (3-D) segmentation, hard constraints are also specified in a 3-D manner, facilitating user interaction. When our proposed semi-automated method was tested on clinical patients' MR images (160 slices, 0.7 mm slice thickness), a considerable amount of segmentation time was saved, with improved efficiency compared to a manual segmentation approach.
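Here is a hedged 2-D sketch of such an s/t graph cut with regional and boundary terms, using the PyMaxflow library on a toy image. The Gaussian intensity models and the uniform edge weight are assumptions; the paper's 3-D version with hard constraints follows the same energy structure.

```python
# s/t graph cut segmentation: regional costs are negative log-
# likelihoods under per-class intensity models; boundary costs
# penalize cutting between similar neighboring pixels.
import numpy as np
import maxflow

rng = np.random.default_rng(5)
img = np.where(rng.random((64, 64)) < 0.5, 0.3, 0.7)  # toy two-class image
img += 0.05 * rng.standard_normal(img.shape)

mu_obj, mu_bkg, sigma = 0.7, 0.3, 0.05
r_obj = (img - mu_obj) ** 2 / (2 * sigma ** 2)  # cost of "object" label
r_bkg = (img - mu_bkg) ** 2 / (2 * sigma ** 2)  # cost of "background" label

g = maxflow.Graph[float]()
nodes = g.add_grid_nodes(img.shape)
g.add_grid_edges(nodes, weights=10.0, symmetric=True)  # boundary smoothness
# Source caps = cost of the background label, sink caps = cost of object.
g.add_grid_tedges(nodes, r_bkg, r_obj)
g.maxflow()
seg = np.logical_not(g.get_grid_segments(nodes))  # True = "cartilage"
print(seg.mean())  # fraction of pixels labeled object
```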
A population MRI brain template and analysis tools for the macaque.
Seidlitz, Jakob; Sponheim, Caleb; Glen, Daniel; Ye, Frank Q; Saleem, Kadharbatcha S; Leopold, David A; Ungerleider, Leslie; Messinger, Adam
2018-04-15
The use of standard anatomical templates is common in human neuroimaging, as it facilitates data analysis and comparison across subjects and studies. For non-human primates, previous in vivo templates have lacked sufficient contrast to reliably validate known anatomical brain regions and have not provided tools for automated single-subject processing. Here we present the "National Institute of Mental Health Macaque Template", or NMT for short. The NMT is a high-resolution in vivo MRI template of the average macaque brain generated from 31 subjects, as well as a neuroimaging tool for improved data analysis and visualization. From the NMT volume, we generated maps of tissue segmentation and cortical thickness. Surface reconstructions and transformations to previously published digital brain atlases are also provided. We further provide an analysis pipeline using the NMT that automates and standardizes the time-consuming processes of brain extraction, tissue segmentation, and morphometric feature estimation for anatomical scans of individual subjects. The NMT and associated tools thus provide a common platform for precise single-subject data analysis and for characterizations of neuroimaging results across subjects and studies. Copyright © 2017 Elsevier. All rights reserved.
Tracking C. elegans and its neuromuscular activity using NemaFlex
NASA Astrophysics Data System (ADS)
van Bussel, Frank; Rahman, Mizanur; Hewitt, Jennifer; Blawzdziewicz, Jerzy; Driscoll, Monica; Szewczyk, Nathaniel; Vanapalli, Siva
Recently, a novel platform, NemaFlex, was developed by the Vanapalli group at Texas Tech University for studying the behavior and physical characteristics of the nematode C. elegans by analyzing the movement and muscular strength of crawling worms. NemaFlex is a microfluidic device consisting of an array of deformable PDMS pillars with which the C. elegans interacts in the course of moving through the system. Deflection measurements then allow us to calculate the force exerted by the worm via Euler-Bernoulli beam theory. For the procedure to be fully automated, fairly sophisticated software analysis has to be developed in tandem with the physical device. In particular, the usefulness of the force calculations is highly dependent on the accuracy and volume of the deflection measurements, which would be prohibitively time-consuming if carried out by hand and eye. To correlate the force results with muscle activations, the C. elegans itself has to be tracked simultaneously, and pillar deflections precisely associated with mechanical contact on the worm's body. Here we outline the data processing and analysis routines that have been implemented to automate the calculation of these forces and muscular activations.
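For a pillar loaded at its tip, Euler-Bernoulli cantilever theory gives F = 3*E*I*delta/L^3 with I = pi*d^4/64 for a circular cross-section. A small worked example with illustrative PDMS pillar values follows; NemaFlex applies the load at the worm's contact height and uses calibrated geometry, so treat these numbers as assumptions.

```python
# Force on a cylindrical PDMS pillar from its measured tip deflection,
# modeled as a cantilever with a point load at the tip.
import math

E = 2.0e6      # Pa, assumed PDMS Young's modulus
d = 40e-6      # m, pillar diameter (illustrative)
L = 90e-6      # m, pillar height (illustrative)
delta = 4e-6   # m, measured tip deflection

I = math.pi * d ** 4 / 64          # second moment of a circular section
F = 3 * E * I * delta / L ** 3     # Euler-Bernoulli cantilever formula
print(f"F = {F * 1e6:.2f} uN")     # ~4.1 uN, a typical C. elegans scale
```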
In-situ thermography of automated fiber placement parts
NASA Astrophysics Data System (ADS)
Gregory, Elizabeth D.; Juarez, Peter D.
2018-04-01
Automated fiber placement (AFP) provides precise and repeatable manufacturing of both simple and complex-geometry composite parts. However, AFP also introduces the possibility of unique flaws such as overlapping tows, gaps between tows, tow twists, lack of layer adhesion, and foreign object debris. These types of flaws can all result in a significant loss of performance in the final part. The current inspection method for these flaws is a costly and time-intensive visual inspection of each ply layer. This work describes initial efforts to incorporate thermal inspection on the AFP head and analysis of the data to identify the previously mentioned flaws. Previous bench-top laboratory experiments demonstrated that laps, gaps, and twists can be identified from a thermal image. The AFP head uses an on-board lamp to preheat the surface of the part during layup to increase ply consolidation, and the preheated surface is used as a thermal source to observe the state of the new material after compaction. We will present data collected with the Integrated Structural Assembly of Advanced Composites (ISAAC) AFP machine at Langley Research Center showing that changes to the temperature profile are sufficient for identifying all types of flaws.
Mayer, Horst; Brümmer, Jens; Brinkmann, Thomas
2011-01-01
To implement Lean Six Sigma in our central laboratory, we conducted a project to measure the single pre-analytical steps influencing the turnaround time (TAT) of emergency department (ED) serum samples. The traditional approach of extracting data from the Laboratory Information System (LIS) for a retrospective calculation of a mean TAT is not suitable for this. Therefore, we used radiofrequency identification (RFID) chips for real-time tracking of individual samples at every pre-analytical step. 1,200 serum tubes were labelled with RFID chips and provided to the emergency department. Three RFID receivers were installed in the laboratory: at the outlet of the pneumatic tube system, at the centrifuge, and in the analyser area. In addition, time stamps of sample entry at the automated sample distributor and of result communication from the analyser were collected from the LIS. 1,023 labelled serum tubes arrived at our laboratory, and 899 RFID tags were used for TAT calculation. The following transfer times were determined (median/95th percentile in min:sec): pneumatic tube system --> centrifuge (01:25/04:48), centrifuge --> sample distributor (14:06/5:33), sample distributor --> analysis system zone (02:39/15:07), analysis system zone --> result communication (12:42/22:21). Total TAT was calculated at 33:19/57:40 min:sec. Manual processes around centrifugation were identified as a major part of TAT, accounting for 44%/60% (median/95th percentile). RFID is a robust, easy-to-use, and error-free technology that is not susceptible to interference in the laboratory environment. With this study design we were able to measure significant variations in a single manual sample transfer process. We showed that TAT is mainly influenced by manual steps around the centrifugation process, and we concluded that centrifugation should be integrated into solutions for total laboratory automation.
Does bacteriology laboratory automation reduce time to results and increase quality management?
Dauwalder, O; Landrieve, L; Laurent, F; de Montclos, M; Vandenesch, F; Lina, G
2016-03-01
Due to reductions in financial and human resources, many microbiological laboratories have merged to build very large clinical microbiology laboratories, which allow the use of fully automated laboratory instruments. For clinical chemistry and haematology, automation has reduced the time to results and improved the management of laboratory quality. The aim of this review was to examine whether fully automated laboratory instruments for microbiology can reduce time to results and impact quality management. This study focused on solutions that are currently available, including the BD Kiestra™ Work Cell Automation and Total Lab Automation and the Copan WASPLab®.
Li, Yuanyao; Huang, Jinsong; Jiang, Shui-Hua; Huang, Faming; Chang, Zhilu
2017-12-07
It is important to monitor the displacement time series and to explore the failure mechanism of reservoir landslides for early warning. Traditionally, it is a challenge to monitor landslide displacements in real time and automatically. The Global Positioning System (GPS) is considered the best real-time monitoring technology; however, the accuracies of landslide displacements monitored by GPS have not been effectively assessed. In this study, a web-based GPS system is developed to monitor landslide displacements in real time and automatically, and the discrete wavelet transform (DWT) is proposed to assess the accuracy of the GPS monitoring displacements. The Wangmiao landslide in the Three Gorges Reservoir area in China is used as a case study. The results show that the web-based GPS system has the advantages of high precision, real-time operation, remote control and automation for landslide monitoring; the Root Mean Square Errors of the monitored landslide displacements are less than 5 mm. Meanwhile, the results also show that a rapidly falling reservoir water level can trigger the reactivation of the Wangmiao landslide. Heavy rainfall is also an important factor, but not a crucial one.
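The DWT-based accuracy assessment is not specified in detail in the abstract; one plausible reading, sketched below with PyWavelets, treats the low-frequency wavelet approximation as the true displacement trend and scores the removed detail as monitoring noise (the wavelet choice and decomposition level are assumptions):

    import numpy as np
    import pywt  # PyWavelets

    def dwt_noise_rmse(displacement_mm, wavelet="db4", level=3):
        coeffs = pywt.wavedec(displacement_mm, wavelet, level=level)
        # Keep only the approximation; zero all detail (high-frequency) bands
        trend = pywt.waverec(
            [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]], wavelet
        )[: len(displacement_mm)]
        noise = np.asarray(displacement_mm) - trend
        return np.sqrt(np.mean(noise**2))   # RMSE in mm; the paper reports < 5 mm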
Trajectory Specification for Terminal Air Traffic: Pairwise Conflict Detection and Resolution
NASA Technical Reports Server (NTRS)
Paielli, Russell A.; Erzberger, Heinz
2017-01-01
Trajectory Specification is the explicit bounding and control of aircraft trajectories such that the position at any point in time is constrained to a precisely defined volume of space. The bounding space is defined by cross-track, along-track, and vertical tolerances relative to a reference trajectory that specifies position as a function of time. The tolerances are dynamic and will be based on the aircraft navigation capabilities and the current traffic situation. Assuming conformance, Trajectory Specification can guarantee safe separation for an arbitrary period of time even in the event of an air traffic control (ATC) system or datalink failure; hence it can help to achieve the high level of safety and reliability needed for ATC automation. It can also reduce the reliance on tactical backup systems during normal operation. This paper applies it to the terminal area around a major airport and presents algorithms and software for detecting and resolving conflicts. A representative set of pairwise conflicts was generated, and a fast-time simulation was run on them. All conflicts were successfully resolved in real time, demonstrating the computational feasibility of the concept.
An Automated and Continuous Plant Weight Measurement System for Plant Factory.
Chen, Wei-Tai; Yeh, Yu-Hui F; Liu, Ting-Yu; Lin, Ta-Te
2016-01-01
In plant factories, plants are usually cultivated in nutrient solution under a controllable environment. Plant quality and growth are closely monitored and precisely controlled. For plant growth evaluation, plant weight is an important and commonly used indicator. Traditional plant weight measurements are destructive and laborious. In order to measure and record the plant weight during plant growth, an automated measurement system was designed and developed herein. The weight measurement system comprises a weight measurement device and an imaging system. The weight measurement device consists of a top disk, a bottom disk, a plant holder and a load cell. The load cell with a resolution of 0.1 g converts the plant weight on the plant holder disk to an analog electrical signal for a precise measurement. The top disk and bottom disk are designed to be durable for different plant sizes, so plant weight can be measured continuously throughout the whole growth period, without hindering plant growth. The results show that plant weights measured by the weight measurement device are highly correlated with the weights estimated by the stereo-vision imaging system; hence, plant weight can be measured by either method. The weight growth of selected vegetables growing in the National Taiwan University plant factory was monitored and measured using our automated plant growth weight measurement system. The experimental results demonstrate the functionality, stability and durability of this system. The information gathered by this weight system can be valuable and beneficial for hydroponic plant monitoring research and agricultural research applications.
An automated approach for mapping persistent ice and snow cover over high latitude regions
Selkowitz, David J.; Forster, Richard R.
2016-01-01
We developed an automated approach for mapping persistent ice and snow cover (PISC; glaciers and perennial snowfields) from Landsat TM and ETM+ data across a variety of topography, glacier types, and climatic conditions at high latitudes (above ~65°N). Our approach exploits all available Landsat scenes acquired during the late summer (1 August–15 September) over a multi-year period and employs an automated cloud masking algorithm optimized for snow- and ice-covered mountainous environments. Pixels from individual Landsat scenes were classified as snow/ice covered or snow/ice free based on the Normalized Difference Snow Index (NDSI), and pixels consistently identified as snow/ice covered over a five-year period were classified as persistent ice and snow cover. The same NDSI and ratio of snow/ice-covered days to total days thresholds applied consistently across eight study regions resulted in persistent ice and snow cover maps that agreed closely in most areas with glacier area mapped for the Randolph Glacier Inventory (RGI), with a mean accuracy (agreement with the RGI) of 0.96, a mean precision (user’s accuracy of the snow/ice cover class) of 0.92, a mean recall (producer’s accuracy of the snow/ice cover class) of 0.86, and a mean F-score (a measure that considers both precision and recall) of 0.88. We also compared results from our approach to glacier area mapped from high spatial resolution imagery at four study regions and found similar results. Accuracy was lowest in regions with substantial areas of debris-covered glacier ice, suggesting that manual editing would still be required in these regions to achieve reasonable results. The similarity of our results to those from the RGI as well as glacier area mapped from high spatial resolution imagery suggests it should be possible to apply this approach across large regions to produce updated 30-m resolution maps of persistent ice and snow cover. In the short term, automated PISC maps can be used to rapidly identify areas where substantial changes in glacier area have occurred since the most recent conventional glacier inventories, highlighting areas where updated inventories are most urgently needed. From a longer term perspective, the automated production of PISC maps represents an important step toward fully automated glacier extent monitoring using Landsat or similar sensors.
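The per-pixel logic reduces to an NDSI threshold followed by a persistence-ratio threshold; a schematic numpy version is below (the thresholds shown are illustrative defaults, not the paper's tuned values):

    import numpy as np

    def persistent_ice_snow(green, swir, ndsi_min=0.4, ratio_min=0.5):
        # green, swir: (n_scenes, rows, cols) late-summer reflectance stacks,
        # with np.nan wherever the cloud mask removed a pixel
        ndsi = (green - swir) / (green + swir)
        snow = ndsi > ndsi_min                      # NaN compares as False
        valid = ~np.isnan(ndsi)
        ratio = snow.sum(axis=0) / np.maximum(valid.sum(axis=0), 1)
        return ratio >= ratio_min                   # persistent ice/snow mask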
Automated 3D bioassembly of micro-tissues for biofabrication of hybrid tissue engineered constructs.
Mekhileri, N V; Lim, K S; Brown, G C J; Mutreja, I; Schon, B S; Hooper, G J; Woodfield, T B F
2018-01-12
Bottom-up biofabrication approaches combining micro-tissue fabrication techniques with extrusion-based 3D printing of thermoplastic polymer scaffolds are emerging strategies in tissue engineering. These biofabrication strategies support native self-assembly mechanisms observed in developmental stages of tissue or organoid growth as well as promoting cell-cell interactions and cell differentiation capacity. Few technologies have been developed to automate the precise assembly of micro-tissues or tissue modules into structural scaffolds. We describe an automated 3D bioassembly platform capable of fabricating simple hybrid constructs via a two-step bottom-up bioassembly strategy, as well as complex hybrid hierarchical constructs via a multistep bottom-up bioassembly strategy. The bioassembly system consisted of a fluidic-based singularisation and injection module incorporated into a commercial 3D bioprinter. The singularisation module delivers individual micro-tissues to an injection module for insertion into precise locations within a 3D plotted scaffold. To demonstrate applicability for cartilage tissue engineering, human chondrocytes were isolated and micro-tissues of 1 mm diameter were generated utilising a high throughput 96-well plate format. Micro-tissues were singularised with an efficiency of 96.0 ± 5.1%. There was no significant difference in size, shape or viability of micro-tissues before and after automated singularisation and injection. A layer-by-layer approach, the aforementioned bottom-up bioassembly strategy, was employed to fabricate a bilayered construct by alternately 3D plotting a thermoplastic (PEGT/PBT) polymer scaffold and inserting pre-differentiated chondrogenic micro-tissues or cell-laden gelatin-based (GelMA) hydrogel micro-spheres, both formed via high-throughput fabrication techniques. No significant difference in viability was observed between the construct assembled utilising the automated bioassembly system and a manually assembled construct. Bioassembly of pre-differentiated micro-tissues as well as chondrocyte-laden hydrogel micro-spheres demonstrated the flexibility of the platform while supporting tissue fusion, long-term cell viability, and deposition of cartilage-specific extracellular matrix proteins. This technology provides an automated and scalable pathway for bioassembly of both simple and complex 3D tissue constructs of clinically relevant shape and size, with demonstrated capability to facilitate direct spatial organisation and hierarchical 3D assembly of micro-tissue modules, ranging from biomaterial-free cell pellets to cell-laden hydrogel formulations.
A new method for automated dynamic calibration of tipping-bucket rain gauges
Humphrey, M.D.; Istok, J.D.; Lee, J.Y.; Hevesi, J.A.; Flint, A.L.
1997-01-01
Existing methods for dynamic calibration of tipping-bucket rain gauges (TBRs) can be time consuming and labor intensive. A new automated dynamic calibration system has been developed to calibrate TBRs with minimal effort. The system consists of a programmable pump, datalogger, digital balance, and computer. Calibration is performed in two steps: 1) pump calibration and 2) rain gauge calibration. Pump calibration ensures precise control of water flow rates delivered to the rain gauge funnel; rain gauge calibration ensures precise conversion of bucket tip times to actual rainfall rates. Calibration of the pump and one rain gauge for 10 selected pump rates typically requires about 8 h. Data files generated during rain gauge calibration are used to compute rainfall intensities and amounts from a record of bucket tip times collected in the field. The system was tested using 5 types of commercial TBRs (15.2-, 20.3-, and 30.5-cm diameters; 0.1-, 0.2-, and 1.0-mm resolutions) and using 14 TBRs of a single type (20.3-cm diameter; 0.1-mm resolution). Ten pump rates ranging from 3 to 154 mL min⁻¹ were used to calibrate the TBRs and represented rainfall rates between 6 and 254 mm h⁻¹ depending on the rain gauge diameter. All pump calibration results were very linear with R² values greater than 0.99. All rain gauges exhibited large nonlinear underestimation errors (between 5% and 29%) that decreased with increasing rain gauge resolution and increased with increasing rainfall rate, especially for rates greater than 50 mm h⁻¹. Calibration curves of bucket tip time against the reciprocal of the true pump rate for all rain gauges also were linear with R² values of 0.99. Calibration data for the 14 rain gauges of the same type were very similar, as indicated by slope values that were within 14% of each other and ranged from about 367 to 417 s mm h⁻¹. The developed system can calibrate TBRs efficiently, accurately, and virtually unattended and could be modified for use with other rain gauge designs. The system is now in routine use to calibrate TBRs in a large rainfall collection network at Yucca Mountain, Nevada.
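The calibration curve described, bucket tip time against the reciprocal of true pump rate, inverts neatly for field use. A sketch with invented numbers for a 0.1 mm gauge (the slopes reported above fall in the 367 to 417 s mm h⁻¹ range, so the synthetic fit below is in a plausible regime):

    import numpy as np

    # Invented calibration pairs for one 0.1 mm resolution gauge:
    true_rate = np.array([6.0, 25.0, 50.0, 100.0, 150.0, 254.0])  # mm/h
    tip_time  = np.array([60.5, 14.7, 7.4, 3.8, 2.6, 1.6])        # s per tip

    # Linear model: tip_time = m * (1 / true_rate) + b
    m, b = np.polyfit(1.0 / true_rate, tip_time, 1)

    def field_intensity(seconds_between_tips):
        # Invert the calibration to turn a logged inter-tip interval
        # into a corrected rainfall intensity in mm/h
        return m / (seconds_between_tips - b)

    print(field_intensity(7.4))   # ~50 mm/h for this synthetic fit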
Gómez, Ana M; Marín Sánchez, Alejandro; Muñoz, Oscar M; Colón Peña, Christian Alejandro
2015-12-01
Insulin pump therapy associated with continuous glucose monitoring has shown a positive clinical impact on diabetes control and reduction of hypoglycemia episodes. There are descriptions of the performance of this device in other populations, but its precision and accuracy in Colombia and Latin America are unknown, especially in the routine outpatient setting. Data from 33 type 1 and type 2 diabetes patients with sensor-augmented pump therapy with threshold suspend automation, MiniMed Paradigm® Veo™ (Medtronic, Northridge, California), managed at Hospital Universitario San Ignacio (Bogotá, Colombia) and receiving outpatient treatment, were analyzed. Simultaneous data from continuous glucose monitoring and capillary blood glucose were compared, and their precision and accuracy were calculated with different methods, including the Clarke error grid. Analyses included 2,262 continuous glucose monitoring-reference paired glucose values. A mean absolute relative difference of 20.1% was found for all measurements, with a value higher than 23% for glucose levels ≤ 75 mg/dL. Global compliance with the ISO criteria was 64.9%. It was higher for values > 75 mg/dL (68.3%, 1,308 of 1,916 readings) than for those ≤ 75 mg/dL (49.4%, 171 of 346 readings). Clinical accuracy, as assessed by the Clarke error grid, showed that 91.77% of data were within the A and B zones (75.6% in hypoglycemia). Good numerical accuracy was found for continuous glucose monitoring in normo- and hyperglycemia, with low precision in hypoglycemia. The clinical accuracy of the device was adequate, with no significant safety concerns for patients.
Matsui, Takemi; Shinba, Toshikazu; Sun, Guanghao
2018-02-01
Given that 12.6% of major depressive disorder (MDD) patients have suicide intent, while a reported 43% of patients do not consult their doctors about MDD, automated MDD screening is eagerly anticipated. Recently, in order to achieve automated screening of MDD, biomarkers such as multiplex DNA methylation profiles and physiological methods using near-infrared spectroscopy (NIRS) have been studied; however, these require blood sampling and inspection with a 96-well DNA ELISA kit, or incur significant cost. Using a single-lead electrocardiography (ECG), we developed a novel high-precision MDD screening system based on transient autonomic responses induced by dual mental tasks. The system is composed of a single-lead ECG monitor, an analogue-to-digital (AD) converter and a personal computer with a measurement and analysis program written in the LabVIEW programming language. The system discriminates MDD patients from normal subjects using heart rate variability (HRV)-derived transient autonomic responses induced by dual mental tasks, i.e. a verbal fluency task and a random number generation task, via linear discriminant analysis (LDA) adopting HRV-related predictor variables (heart rate (HR), high frequency (HF), low frequency (LF)/HF). The proposed system was tested on 12 MDD patients (32 ± 15 years) under antidepressant treatment from the Shizuoka Saiseikai General Hospital outpatient unit and 30 normal volunteers (37 ± 17 years) from Tokyo Metropolitan University. It achieved 100% sensitivity and 100% specificity in classifying the 42 examinees into 12 MDD patients and 30 normal subjects. The proposed system appears promising for future HRV-based high-precision and low-cost screening of MDD using only a single-lead ECG.
Chu, Byoung-Sun; Ngo, Thao P T; Cheng, Brian B; Dain, Stephen J
2014-07-01
The accuracy and precision of any instrument should not be taken for granted. While there is an international standard for checking focimeters, there is no report of any study on their performance. A sample set of 51 focimeters (11 brands) was used to measure the spherical power of a set of lenses and the prismatic power of two lenses complying with ISO 9342-1:2005, as well as other calibrated prismatic lenses and the spherical power of some grey filters. The mean measured spherical power corresponded very closely with the calibrated values; however, the spread of results was substantial and 10 focimeters did not comply with ISO 8598:1996. The measurement of prism was much more accurate and precise and all the focimeters complied easily. With the grey filters, about one-third of the focimeters either showed erratic readings or an error equivalent to category 4 sunglasses. On the other hand, nine focimeters had stable and accurate readings on a filter with a luminous transmittance of 0.5 per cent. These results confirm that, in common with all other measurement instruments, there is a need to ensure that a focimeter is reading accurately and precisely over the range of refractive powers and luminous transmittances. The accurate and precise performance of an automated focimeter over its working life cannot be assumed. Checking before purchase with a set of calibrated lenses and some dark sunglass tints will indicate the suitability of a focimeter. Routine checking with the calibrated lenses will inform users whether a focimeter continues to indicate accurately.
Clark, Hallie; Feng, Jing
2017-09-01
High-level vehicle automation has been proposed as a valuable means to enhance the mobility of older drivers, as older drivers experience age-related declines in many cognitive functions that are vital for safe driving. Recent research attempted to examine age differences in how engagement in non-driving-related activities impacts driving performance, by instructing drivers to engage in mandatory pre-designed activities. While the mandatory engagement method allows precise control of the timing and mental workload of the non-driving-related activities, it is different from how a driver would naturally engage in these activities. This study allowed younger (age 18-35, mean age = 19.9 years) and older drivers (age 62-81, mean age = 70.4 years) to freely decide when and how to engage in voluntarily chosen non-driving-related activities during simulated driving with conditional automation. We coded video recordings of participants' engagement in non-driving-related activities. We examined the effects of age, level of activity-engagement and takeover notification interval on vehicle control performance during the takeover, by comparing the high and low engagement groups in younger and older drivers across two takeover notification interval conditions. We found that both younger and older drivers engaged in various non-driving-related activities during the automated driving portion, with distinct preferences on the type of activity for each age group (i.e., while younger drivers mostly used an electronic device, older drivers tended to converse). There were also significant differences between the two age groups and between the two notification intervals on various driving performance measures. Older drivers benefited more than younger drivers from the longer interval in terms of response time to notifications. Voluntary engagement in non-driving-related activities did not impair takeover performance in general, although there was a trend for older drivers who were more engaged in non-driving-related activities to brake harder during the takeover than those with low activity-engagement.
Tafe, Laura J; Allen, Samantha F; Steinmetz, Heather B; Dokus, Betty A; Cook, Leanne J; Marotti, Jonathan D; Tsongalis, Gregory J
2014-08-01
HER2 fluorescence in-situ hybridization (FISH) is used in breast and gastro-esophageal carcinoma for determining HER2 gene amplification and patients' eligibility for HER2-targeted therapeutics. Traditional manual processing of the FISH slides is labor intensive because of multiple steps that require hands-on manipulation of the slides and specifically timed intervals between steps. This highly manual processing also introduces inter-run and inter-operator variability that may affect the quality of the FISH result. Therefore, we sought to incorporate an automated processing instrument into our FISH workflow. Twenty-six cases including breast (20) and gastro-esophageal (6) cancer, comprising 23 biopsies and three excision specimens, were tested for HER2 FISH (Pathvysion, Abbott) using the Thermobrite Elite (TBE) system (Leica). Up to 12 slides can be run simultaneously. All cases were previously tested by the Pathvysion HER2 FISH assay with manual preparation. Twenty cells were counted by two observers for each case; five cases were tested on three separate runs by different operators to evaluate the precision and inter-operator variability. There was 100% concordance in scoring between the manual and TBE methods as well as among the five cases that were tested on three runs. Only one case failed, due to poor probe hybridization. In total, seven cases were positive for HER2 amplification (HER2:CEP17 ratio >2.2) and the remaining 19 were negative (HER2:CEP17 ratio <1.8) utilizing the 2007 ASCO/CAP scoring criteria. Due to the automated denaturation and hybridization, each run saved 3.5 h of labor which could then be dedicated to other lab functions. The TBE is a walk-away pre- and post-hybridization system that automates FISH slide processing, improves workflow and consistency, and saves approximately 3.5 h of technologist time. The instrument has a small footprint, thus occupying minimal counter space. TBE-processed slides performed exceptionally well in comparison with the manual technique, with no disagreement in HER2 amplification status.
NASA Technical Reports Server (NTRS)
Bryant, Nevin A.; Logan, Thomas L.; Zobrist, Albert L.
2006-01-01
Improvements to the automated co-registration and change detection software package AFIDS (Automatic Fusion of Image Data System) have recently completed development for, and validation by, NGA/GIAT. The improvements involve the integration of the AFIDS ultra-fine gridding technique for horizontal displacement compensation with the recently evolved use of Rational Polynomial Functions/Coefficients (RPFs/RPCs) for image raster pixel position to latitude/longitude indexing. Mapping and orthorectification (correction for elevation effects) of satellite imagery defies exact projective solutions because the data are not obtained from a single point (like a camera), but as a continuous process from the orbital path. Standard image processing techniques can apply approximate solutions, but advances in the state of the art had to be made for precision change-detection and time-series applications where relief offsets become a controlling factor. The earlier AFIDS procedure required the availability of a camera model and knowledge of the satellite platform ephemerides. The recent design advances connect the spacecraft sensor Rational Polynomial Function, a deductively developed model, with the AFIDS ultra-fine grid, an inductively developed representation of the relationship of raster pixel position to latitude/longitude. As a result, RPCs can be updated by AFIDS, a situation often necessary due to the accuracy limits of spacecraft navigation systems. An example of precision change detection from Quickbird will be presented.
Automatic Telescope Search for Extrasolar Planets
NASA Technical Reports Server (NTRS)
Henry, Gregory W.
1998-01-01
We are using automatic photoelectric telescopes at the Tennessee State University Center for Automated Space Science to search for planets around nearby stars in our galaxy. Over the past several years, we have developed the capability to make extremely precise measurements of brightness changes in Sun-like stars with automatic telescopes. Extensive quality control and calibration measurements result in a precision of 0.1% for a single nightly observation and 0.02% for yearly means, far better than previously thought possible with ground-based observations. We are able, for the first time, to trace brightness changes in Sun-like stars that are of similar amplitude to brightness changes in the Sun, whose changes can be observed only with space-based radiometers. Recently, exciting discoveries of the first extrasolar planets have been announced, based on the detection of very small radial-velocity variations that imply the existence of planets in orbit around several Sun-like stars. Our precise brightness measurements have been crucial for the confirmation of these discoveries by helping to eliminate alternative explanations for the radial-velocity variations. With our automatic telescopes, we are also searching for transits of these planets across the disks of their stars in order to conclusively verify their existence. The detection of transits would provide the first direct measurements of the sizes, masses, and densities of these planets and, hence, information on their compositions and origins.
Benn, Neil; Turlais, Fabrice; Clark, Victoria; Jones, Mike; Clulow, Stephen
2007-03-01
The authors describe a system for collecting usage metrics from widely distributed automation systems. An application that records and stores usage data centrally, calculates run times, and charts the data was developed. Data were collected over 20 months from at least 28 workstations. The application was used to plot bar charts of date versus run time for individual workstations, the automation in a specific laboratory, or automation of a specified type. The authors show that revised user training, redeployment of equipment, and running complementary processes on one workstation can increase the average number of runs by up to 20-fold and run times by up to 450%. Active monitoring of usage leads to more effective use of automation. Usage data could be used to determine whether purchasing particular automation was a good investment.
Image-guided smart laser system for precision implantation of cells in cartilage
NASA Astrophysics Data System (ADS)
Katta, Nitesh; Rector, John A.; Gardner, Michael R.; McElroy, Austin B.; Choy, Kevin C.; Crosby, Cody; Zoldan, Janet; Milner, Thomas E.
2017-03-01
State-of-the-art treatments for joint diseases like osteoarthritis focus on articular cartilage repair/regeneration by stem cell implantation therapy. However, the technique is limited by a lack of precision in the physician's imaging and cell deposition toolkit. We describe a novel combination of high-resolution, rapid scan-rate optical coherence tomography (OCT) alongside a short-pulsed nanosecond thulium (Tm) laser for precise cell seeding in cartilage. The superior beam quality of thulium lasers and their operating wavelength of 1940 nm offer high volumetric tissue removal rates and minimize the residual thermal footprint. OCT imaging enables targeted micro-well placement, precise cell deposition, and feature contrast. A bench-top system is constructed using a 15 W, 1940 nm, nanosecond-pulsed Tm fiber laser (500 μJ pulse energy, 100 ns pulse duration, 30 kHz repetition rate) for removing tissue, and a swept-source laser (1310 ± 70 nm, 100 kHz sweep rate) for OCT imaging, forming a combined Tm/OCT system - a "smart laser knife". OCT assists the smart laser knife user in characterizing cartilage to inform micro-well placement. The Tm laser creates micro-wells (2.35 mm length, 1.5 mm width, 300 μm depth) and micro-incisions (1 mm wide, 200 μm deep), while OCT image guidance provides real-time feedback for this precision cutting and cell deposition. To test the micro-well creation and cell deposition protocol, gelatin phantoms are constructed mimicking cartilage optical properties and physiological structure. Cell viability is then assessed to illustrate the efficacy of the hydrogel deposition. Automated OCT feedback is demonstrated for cutting procedures to avoid important surface/subsurface structures. The bench-top smart laser knife system described here offers a new image-guided approach to precise stem cell seeding that can enhance the efficacy of articular cartilage repair.
Alpert, Bruce S
2018-04-06
The aim of this report is to describe a new device that can validate, by automated auscultation, individual blood pressure (BP) readings taken by automated sphygmomanometers. The Accutension Stetho utilizes a smartphone application in conjunction with a specially designed stethoscope that interfaces directly into the smartphone via the earphone jack. The Korotkoff sounds are recorded by the application and are analyzed by the operator on the screen of the smartphone simultaneously with the images from the sphygmomanometer screen during BP estimation. Current auscultatory validation standards require at least 85 subjects and strict statistical criteria for passage. A device that passes can make no guarantee of accuracy on individual patients. The Accutension Stetho is an inexpensive smartphone/stethoscope kit that estimates precise BP values by auscultation to confirm the accuracy of an automated sphygmomanometer's readings on individual patients. This should be of great value for both professional and, in certain circumstances, self-measurement of BP. Patients will avoid both unnecessary treatment and errors of underestimation of BP in which the patient requires therapy. The Stetho's software has been validated in an independent ANSI/AAMI/ISO standard study. The Stetho has been shown to perform without difficulty with multiple deflation-based devices from many manufacturers.
Ficheur, Grégoire; Chazard, Emmanuel; Beuscart, Jean-Baptiste; Merlin, Béatrice; Luyckx, Michel; Beuscart, Régis
2014-09-12
Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. We used a set of complex detection rules to take account of the patient's clinical and biological context and the chronological relationship between the causes and the expected outcome. The dataset consisted of 3,444 inpatient stays in a French general hospital. An automated review was performed for all data and the results were compared with those of an expert chart review. The complex detection rules' analytical quality was evaluated for ADEs. In terms of recall, 89.5% of ADEs with hyperkalaemia "with or without an abnormal symptom" were automatically identified (including all three serious ADEs). In terms of precision, 63.7% of the automatically identified ADEs with hyperkalaemia were true ADEs. The use of context-sensitive rules appears to improve the automated detection of ADEs with hyperkalaemia. This type of tool may have an important role in pharmacoepidemiology via the routine analysis of large inter-hospital databases.
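The detection rules themselves are not reproduced in the abstract; the toy rule below conveys the general shape of a context-sensitive, chronology-aware check (the threshold, drug classes, time window, and record layout are all invented for illustration, and the study's actual rules are more complex):

    from datetime import timedelta

    HYPERK_DRUGS = {"potassium_sparing_diuretic", "ACE_inhibitor",
                    "potassium_supplement"}

    def flag_hyperkalaemia_ade(stay):
        # stay = {"labs":  [(datetime, potassium_mmol_per_L), ...],
        #         "drugs": [(datetime, drug_class), ...]}
        for t_lab, k in stay["labs"]:
            if k <= 5.3:                 # illustrative hyperkalaemia cut-off
                continue
            for t_drug, drug in stay["drugs"]:
                # the suspected cause must precede the outcome within a
                # plausible window for the pair to count as an ADE signal
                if drug in HYPERK_DRUGS and \
                   timedelta(0) < t_lab - t_drug <= timedelta(days=5):
                    return True
        return False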
Contingency Management with Human Autonomy Teaming
NASA Technical Reports Server (NTRS)
Shively, Robert J.; Lachter, Joel B.
2018-01-01
Automation is playing an increasingly important role in many operations. It is often cheaper, faster, and more precise than human operators. However, automation is not perfect. There are many situations in which a human operator must step in. We refer to these instances as contingencies, and to the act of stepping in as contingency management. Here we propose coupling Human Autonomy Teaming (HAT) with contingency management. We describe two aspects of HAT: bi-directional communication and working agreements (or plays). Bi-directional communication, like Crew Resource Management in traditional aviation, allows all parties to contribute to a decision. Working agreements specify roles and responsibilities. Importantly, working agreements allow for the possibility of roles and responsibilities changing depending on environmental factors (e.g., situations the automation was not designed for, workload, risk, or trust). This allows the automation to "automatically" become more autonomous as it becomes more trusted and/or is updated to deal with a more complete set of possible situations. We present a concrete example using a prototype contingency management station one might find in a future airline operations center. Automation proposes reroutes for aircraft that encounter bad weather or are forced to divert for environmental or systems reasons. If specific conditions are met, these recommendations may be autonomously datalinked to the affected aircraft.
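As a concrete, entirely invented illustration, a working agreement for the station described above might be encoded as data that gates the autonomous datalink step (thresholds and field names are hypothetical, not from the paper):

    # Hypothetical "play" for the contingency-management station: the
    # automation may uplink a reroute without human approval only when
    # every agreed condition holds; otherwise it proposes the reroute.
    REROUTE_PLAY = {
        "max_added_flight_min": 10,
        "min_fuel_margin_min": 45,
        "min_trust_score": 0.8,   # trust can grow as the automation proves out
    }

    def disposition(reroute, trust_score, play=REROUTE_PLAY):
        ok = (reroute["added_flight_min"] <= play["max_added_flight_min"]
              and reroute["fuel_margin_min"] >= play["min_fuel_margin_min"]
              and trust_score >= play["min_trust_score"])
        return "datalink_to_aircraft" if ok else "propose_to_operator"

Encoding the play as data rather than code is one way to let roles and responsibilities shift at runtime, as the abstract describes, without redeploying the automation.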
Developing mobile BIM/2D barcode-based automated facility management system.
Lin, Yu-Cheng; Su, Yu-Chih; Chen, Yen-Pei
2014-01-01
Facility management (FM) has become an important topic in research on the operation and maintenance phase. Managing the work of FM effectively is extremely difficult owing to the variety of environments. One of the difficulties is the performance of two-dimensional (2D) graphics when depicting facilities. Building information modeling (BIM) uses precise geometry and relevant data to support the facilities depicted in three-dimensional (3D) object-oriented computer-aided design (CAD). This paper proposes a new and practical methodology with application to FM that uses an integrated 2D barcode and the BIM approach. Using 2D barcode and BIM technologies, this study proposes a mobile automated BIM-based facility management (BIMFM) system for FM staff in the operation and maintenance phase. The mobile automated BIMFM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FM practice. The combined results demonstrate that a BIMFM-like system can be an effective mobile automated FM tool. The advantage of the mobile automated BIMFM system lies not only in improving FM work efficiency for the FM staff but also in facilitating FM updates and transfers in the BIM environment.
Determination of carbonate carbon in geological materials by coulometric titration
Engleman, E.E.; Jackson, L.L.; Norton, D.R.
1985-01-01
A coulometric titration is used for the determination of carbonate carbon in geological materials. Carbon dioxide is evolved from the sample by the addition of 2 M perchloric acid, with heating, and is determined by automated coulometric titration. The coulometric titration showed improved speed and precision with comparable accuracy to gravimetric and gasometric techniques.
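The coulometric arithmetic follows Faraday's law; a small sketch, assuming the usual one-electron-per-CO2 equivalence of commercial CO2 coulometers (an assumption, as the paper does not state the cell chemistry here):

    F_CONST = 96485.332   # Faraday constant, C/mol
    M_C = 12.011          # molar mass of carbon, g/mol

    def carbonate_carbon_wt_pct(charge_C, sample_mass_g, n_electrons=1):
        # Moles of carbon from the integrated titration charge: Q = n * F * mol
        mol_c = charge_C / (n_electrons * F_CONST)
        return 100.0 * mol_c * M_C / sample_mass_g

    # e.g. 8.0 C of charge from a 100 mg sample -> ~1.0 wt % carbonate carbon
    print(carbonate_carbon_wt_pct(8.0, 0.100))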
ERIC Educational Resources Information Center
Economou, A.; Tzanavaras, P. D.; Themelis, D. G.
2005-01-01
Sequential-injection analysis (SIA) is an approach to sample handling that enables the automation of manual wet-chemistry procedures in a rapid, precise and efficient manner. The experiments using SIA fit well into the course of Instrumental Chemical Analysis and especially in the section on Automatic Methods of analysis provided by chemistry…
A Hot-Wire Method Based Thermal Conductivity Measurement Apparatus for Teaching Purposes
ERIC Educational Resources Information Center
Alvarado, S.; Marin, E.; Juarez, A. G.; Calderon, A.; Ivanov, R.
2012-01-01
The implementation of an automated system based on the hot-wire technique is described for the measurement of the thermal conductivity of liquids using equipment easily available in modern physics laboratories at high schools and universities (basically a precision current source and a voltage meter, a data acquisition card, a personal computer…
Effects of imperfect automation on decision making in a simulated command and control task.
Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja
2007-02-01
Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Under the higher automation reliability condition, unreliable automation led to a greater cost in decision-making accuracy for three different forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.
Automation of a DXA-based finite element tool for clinical assessment of hip fracture risk.
Luo, Yunhua; Ahmed, Sharif; Leslie, William D
2018-03-01
Finite element analysis of medical images is a promising tool for assessing hip fracture risk. Although a number of finite element models have been developed for this purpose, none of them has been routinely used in the clinic. The main reason is that the computer programs that implement the finite element models have not been completely automated, and heavy training is required before clinicians can use them effectively. By using information embedded in clinical dual energy X-ray absorptiometry (DXA), we completely automated a DXA-based finite element (FE) model that we previously developed for predicting hip fracture risk. The automated FE tool can be run as a standalone computer program with the subject's raw hip DXA image as input. The automated FE tool had greatly improved short-term precision compared with the semi-automated version. To validate the automated FE tool, a clinical cohort consisting of 100 prior hip fracture cases and 300 matched controls was obtained from a local community clinical center. Both the automated FE tool and femoral bone mineral density (BMD) were applied to discriminate the fracture cases from the controls. Femoral BMD is the gold standard reference recommended by the World Health Organization for screening osteoporosis and for assessing hip fracture risk. The accuracy was measured by the area under the ROC curve (AUC) and the odds ratio (OR). Compared with femoral BMD (AUC = 0.71, OR = 2.07), the automated FE tool had considerably improved accuracy (AUC = 0.78, OR = 2.61 at the trochanter). This work is a large step toward applying our DXA-based FE model as a routine clinical tool for the assessment of hip fracture risk. Furthermore, the automated computer program can be embedded into a website as an internet application.
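The AUC comparison reported above is straightforward to reproduce in principle; the snippet below shows the computation on synthetic stand-in scores (the effect sizes are invented and do not reflect the study's data):

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    y = np.r_[np.ones(100), np.zeros(300)]    # 100 cases, 300 controls
    fe_risk = rng.normal(loc=1.1 * y)         # FE-tool risk score (synthetic)
    neg_bmd = rng.normal(loc=0.8 * y)         # negated femoral BMD (synthetic)

    print(roc_auc_score(y, fe_risk))   # higher AUC expected for the FE score
    print(roc_auc_score(y, neg_bmd))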
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.
2003-01-01
Adaptive automation represents an advanced form of human-centered automation design. This approach to automation provides for real-time and model-based assessments of human-automation interaction, determines whether the human has entered into a hazardous state of awareness, and then modulates the task environment to keep the operator in the loop while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not matured, numerous challenges remain, including what the criteria are for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures, including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and an operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.
Force-controlled automatic microassembly of tissue engineering scaffolds
NASA Astrophysics Data System (ADS)
Zhao, Guoyong; Teo, Chee Leong; Hutmacher, Dietmar Werner; Burdet, Etienne
2010-03-01
This paper presents an automated system for 3D assembly of tissue engineering (TE) scaffolds made from biocompatible microscopic building blocks with relatively large fabrication error. It focuses on the pin-into-hole force control developed for this demanding microassembly task. A beam-like gripper with integrated force sensing at a 3 mN resolution with a 500 mN measuring range is designed, and is used to implement an admittance force-controlled insertion using commercial precision stages. Visual-based alignment followed by an insertion is complemented by a haptic exploration strategy using force and position information. The system demonstrates fully automated construction of TE scaffolds with 50 microparts whose dimension error is larger than 5%.
Precision Medicine for Heart Failure with Preserved Ejection Fraction: An Overview.
Shah, Sanjiv J
2017-06-01
There are few proven therapies for heart failure with preserved ejection fraction (HFpEF). The lack of therapies, along with increased recognition of the disorder and its underlying pathophysiology, has led to the acknowledgement that HFpEF is heterogeneous and is not likely to respond to a one-size-fits-all approach. Thus, HFpEF is a prime candidate to benefit from a precision medicine approach. For this reason, we have assembled a compendium of papers on the topic of precision medicine in HFpEF in the Journal of Cardiovascular Translational Research. These papers cover a variety of topics relevant to precision medicine in HFpEF, including automated identification of HFpEF patients; machine learning, novel molecular approaches, genomics, and deep phenotyping of HFpEF; and clinical trial designs that can be used to advance precision medicine in HFpEF. In this introductory article, we provide an overview of precision medicine in HFpEF with the hope that the work described here and in the other papers in this special theme issue will stimulate investigators and clinicians to advance a more targeted approach to HFpEF classification and treatment.
Danker, Timm; Braun, Franziska; Silbernagl, Nikole; Guenther, Elke
2016-03-01
Manual patch clamp, the gold standard of electrophysiology, represents a powerful and versatile toolbox to stimulate, modulate, and record ion channel activity from membrane fragments and whole cells. The electrophysiological readout can be combined with fluorescent or optogenetic methods and allows for ultrafast solution exchanges using specialized microfluidic tools. A hallmark of manual patch clamp is the intentional selection of individual cells for recording, often an essential prerequisite to generate meaningful data. So far, available automation solutions rely on random cell usage in the closed environment of a chip and thus sacrifice much of this versatility by design. To parallelize and automate the traditional patch clamp technique while perpetuating the full versatility of the method, we developed an approach to automation, which is based on active cell handling and targeted electrode placement rather than on random processes. This is achieved through an automated pipette positioning system, which guides the tips of recording pipettes with micrometer precision to a microfluidic cell handling device. Using a patch pipette array mounted on a conventional micromanipulator, our automated patch clamp process mimics the original manual patch clamp as closely as possible, yet achieving a configuration where recordings are obtained from many patch electrodes in parallel. In addition, our implementation is extensible by design to allow the easy integration of specialized equipment such as ultrafast compound application tools. The resulting system offers fully automated patch clamp on purposely selected cells and combines high-quality gigaseal recordings with solution switching in the millisecond timescale.
Shayanfar, Noushin; Tobler, Ulrich; von Eckardstein, Arnold; Bestmann, Lukas
2007-01-01
Automated analysis of insoluble urine components can reduce the workload of conventional microscopic examination of urine sediment and is possibly helpful for standardization. We compared the diagnostic performance of two automated urine sediment analyzers and combined dipstick/automated urine analysis with that of the traditional dipstick/microscopy algorithm. A total of 332 specimens were collected and analyzed for insoluble urine components by microscopy and automated analyzers, namely the Iris iQ200 (Iris Diagnostics) and the UF-100 flow cytometer (Sysmex). The coefficients of variation for day-to-day quality control of the iQ200 and UF-100 analyzers were 6.5% and 5.5%, respectively, for red blood cells. We reached accuracy ranging from 68% (bacteria) to 97% (yeast) for the iQ200 and from 42% (bacteria) to 93% (yeast) for the UF-100. The combination of dipstick and automated urine sediment analysis increased the sensitivity of screening to approximately 98%. We conclude that automated urine sediment analysis is sufficiently precise and improves the workflow in a routine laboratory. In addition, it allows sediment analysis of all urine samples and thereby helps to detect pathological samples that would have been missed in the conventional two-step procedure according to the European guidelines. Although it is not a substitute for microscopic sediment examination, it can, when combined with dipstick testing, reduce the number of specimens submitted to microscopy. Visual microscopy is still required for some samples, namely, dysmorphic erythrocytes, yeasts, Trichomonas, oval fat bodies, differentiation of casts and certain crystals.
Li, Wei; Abram, François; Pelletier, Jean-Pierre; Raynauld, Jean-Pierre; Dorais, Marc; d'Anjou, Marc-André; Martel-Pelletier, Johanne
2010-01-01
Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application.
Workload Capacity: A Response Time-Based Measure of Automation Dependence.
Yamani, Yusuke; McCarley, Jason S
2016-05-01
An experiment used the workload capacity measure C(t) to quantify the processing efficiency of human-automation teams and identify operators' automation usage strategies in a speeded decision task. Although response accuracy rates and related measures are often used to measure the influence of an automated decision aid on human performance, aids can also influence response speed. Mean response times (RTs), however, conflate the influence of the human operator and the automated aid on team performance and may mask changes in the operator's performance strategy under aided conditions. The present study used a measure of parallel processing efficiency, or workload capacity, derived from empirical RT distributions as a novel gauge of human-automation performance and automation dependence in a speeded task. Participants performed a speeded probabilistic decision task with and without the assistance of an automated aid. RT distributions were used to calculate two variants of a workload capacity measure, C_OR(t) and C_AND(t). Capacity measures gave evidence that a diagnosis from the automated aid speeded human participants' responses, and that participants did not moderate their own decision times in anticipation of diagnoses from the aid. Workload capacity provides a sensitive and informative measure of human-automation performance and operators' automation dependence in speeded tasks.
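For reference, C_OR(t) here presumably denotes the standard OR-gate capacity coefficient of the workload-capacity literature (our reading; the abstract does not spell out the definition), which compares the cumulative hazard of the aided team against the sum of the two single-channel conditions:

    C_OR(t) = H_AB(t) / (H_A(t) + H_B(t)),   where H_i(t) = -ln(1 - F_i(t))

Here F_i(t) is the empirical RT distribution function for channel i alone (e.g. unaided human, aid alone) and H_AB(t) is the cumulative hazard for the human-aid team; C_OR(t) > 1 at time t indicates better-than-parallel (super-capacity) use of the aid. C_AND(t) is defined analogously from cumulative reverse hazard functions.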
Zheng, Xianlin; Lu, Yiqing; Zhao, Jiangbo; Zhang, Yuhai; Ren, Wei; Liu, Deming; Lu, Jie; Piper, James A; Leif, Robert C; Liu, Xiaogang; Jin, Dayong
2016-01-19
Compared with routine microscopy imaging of a few analytes at a time, rapid scanning through the whole sample area of a microscope slide to locate every single target object offers many advantages in terms of simplicity, speed, throughput, and potential for robust quantitative analysis. Existing techniques that accommodate solid-phase samples incorporating individual micrometer-sized targets generally rely on digital microscopy and image analysis, with intrinsically low throughput and reliability. Here, we report an advanced on-the-fly stage scanning method to achieve high-precision target location across the whole slide. By integrating X- and Y-axis linear encoders to a motorized stage as the virtual "grids" that provide real-time positional references, we demonstrate an orthogonal scanning automated microscopy (OSAM) technique which can search a coverslip area of 50 × 24 mm² in just 5.3 min and locate individual 15 μm lanthanide luminescent microspheres with standard deviations of 1.38 and 1.75 μm in X and Y directions. Alongside implementation of an autofocus unit that compensates the tilt of a slide in the Z-axis in real time, we increase the luminescence detection efficiency by 35% with an improved coefficient of variation. We demonstrate the capability of advanced OSAM for robust quantification of luminescence intensities and lifetimes for a variety of micrometer-scale luminescent targets, specifically single down-shifting and upconversion microspheres, crystalline microplates, and color-barcoded microrods, as well as quantitative suspension array assays of biotinylated-DNA functionalized upconversion nanoparticles.
Epilepsy Treatment Simplified through Mobile Ketogenic Diet Planning.
Li, Hanzhou; Jauregui, Jeffrey L; Fenton, Cagla; Chee, Claire M; Bergqvist, A G Christina
2014-07-01
The Ketogenic Diet (KD) is an effective, alternative treatment for refractory epilepsy. This high fat, low protein and carbohydrate diet mimics the metabolic and hormonal changes that are associated with fasting. To maximize the effectiveness of the KD, each meal is precisely planned, calculated, and weighed to within 0.1 gram for the average three-year duration of treatment. Managing the KD is time-consuming and may deter caretakers and patients from pursuing or continuing this treatment. Thus, we investigated methods of planning KD meals faster and making the process more portable through mobile applications. Nutritional data were gathered from the United States Department of Agriculture (USDA) Nutrient Database. User-selected foods are converted into linear equations with n variables and three constraints: prescribed fat content, prescribed protein content, and prescribed carbohydrate content. Techniques are applied to derive solutions to the underdetermined system depending on the number of foods chosen. The method was implemented on an iOS device and tested with a variety of foods and different numbers of selected foods. In each case, the application's constructed meal plan was within 95% precision of the KD requirements. In this study, we attempt to reduce the time needed to calculate a meal by automating the computation of the KD via a linear algebra model. We improve upon previous KD calculators by offering optimal suggestions and incorporating the USDA database. We believe this mobile application will help make the KD and other dietary treatment preparations less time consuming and more convenient.
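A minimal sketch of the linear-algebra formulation described above, assuming hypothetical per-gram macronutrient values; with more foods than the three macronutrient constraints, the system A x = b is underdetermined, and a minimum-norm least-squares solution is one simple way to pick gram amounts (the app's actual solution techniques are not specified here).

```python
import numpy as np

# Each column is one selected food; rows are grams of fat, protein and
# carbohydrate per gram of food. All values below are invented placeholders.
A = np.array([
    [0.80, 0.10, 0.05, 0.95],  # fat
    [0.02, 0.05, 0.25, 0.00],  # protein
    [0.03, 0.70, 0.01, 0.00],  # carbohydrate
])
b = np.array([60.0, 15.0, 10.0])  # prescribed fat, protein, carbohydrate (g)

# With n > 3 foods the system is underdetermined; np.linalg.lstsq returns the
# minimum-norm solution. Negative amounts would signal an infeasible food mix.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(dict(zip(["food_A", "food_B", "food_C", "food_D"], np.round(x, 1))))
```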
Closed-loop control of renal perfusion pressure in physiological experiments.
Campos-Delgado, D U; Bonilla, I; Rodríguez-Martínez, M; Sánchez-Briones, M E; Ruiz-Hernández, E
2013-07-01
This paper presents the design, experimental modeling, and control of a pump-driven renal perfusion pressure (RPP)-regulatory system to implement precise and relatively fast RPP regulation in rats. The mechatronic system is a simple, low-cost, and reliable device to automate the RPP regulation process based on flow-mediated occlusion. Hence, the regulated signal is the RPP measured in the left femoral artery of the rat, and the manipulated variable is the voltage applied to a dc motor that controls the occlusion of the aorta. The control system is implemented in a PC through the LabView software, and a data acquisition board NI USB-6210. A simple first-order linear system is proposed to approximate the dynamics in the experiment. The parameters of the model are chosen to minimize the error between the predicted and experimental output averaged from eight input/output datasets at different RPP operating conditions. A closed-loop servocontrol system based on a pole-placement PD controller plus dead-zone compensation was proposed for this purpose. First, the feedback structure was validated in simulation by considering parameter uncertainty, and constant and time-varying references. Several experimental tests were also conducted to validate in real time the closed-loop performance for stepwise and fast switching references, and the results show the effectiveness of the proposed automatic system to regulate the RPP in the rat, in a precise, accurate (mean error less than 2 mmHg) and relatively fast mode (10-15 s of response time).
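The toy simulation below illustrates the control structure named in the abstract: a PD law with dead-zone compensation driving an assumed motor-plus-first-order pressure model (the dc motor integrates its voltage command into occluder position). All gains, time constants and the dead-zone width are invented for illustration, not the values identified from the experimental datasets.

```python
import numpy as np

K, tau, y0 = 0.5, 1.0, 80.0   # assumed plant: tau*dy/dt = -(y - y0) + K*x
kp, kd, dz = 2.0, 0.5, 0.2    # assumed PD gains and actuator dead-zone width
dt, ref = 0.01, 100.0         # time step (s), target RPP (mmHg)
x, y = 0.0, y0                # x: occluder position (motor integrates voltage)
e_prev = ref - y              # initialize to avoid a derivative kick

for _ in range(int(30 / dt)):          # simulate 30 s
    e = ref - y
    u = kp * e + kd * (e - e_prev) / dt
    u += np.sign(u) * dz               # push the command past the dead band
    x += dt * u                        # dc motor: position rate ~ voltage
    y += dt * (-(y - y0) + K * x) / tau
    e_prev = e
# y settles toward ref within a few seconds in this toy model, in the same
# spirit as the 10-15 s response times reported in the paper.
```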
NASA Astrophysics Data System (ADS)
Zhou, Shudao; Ma, Zhongliang; Wang, Min; Peng, Shuling
2018-05-01
This paper proposes a novel alignment system based on the measurement of optical path using a light beam scanning mode in a transmissometer. The system controls both the probe beam and the receiving field of view while scanning in two vertical directions. The system then calculates the azimuth angle of the transmitter and the receiver to determine the precise alignment of the optical path. Experiments show that this method can determine the alignment angles in less than 10 min with errors smaller than 66 μrad in the azimuth. This system also features high collimation precision, process automation and simple installation.
NASA Astrophysics Data System (ADS)
Sopharak, Akara; Uyyanonvara, Bunyarit; Barman, Sarah; Williamson, Thomas
To prevent blindness from diabetic retinopathy, periodic screening and early diagnosis are necessary. Due to the lack of expert ophthalmologists in rural areas, automated detection of early exudates (one visible sign of diabetic retinopathy) could help to reduce the number of cases of blindness among diabetic patients. Traditional automatic exudate detection methods are based on specific parameter configurations, while machine learning approaches, which seem more flexible, may be computationally expensive. A comparative analysis of traditional and machine-learning methods for exudate detection, namely mathematical morphology, fuzzy c-means clustering, a naive Bayesian classifier, a Support Vector Machine and a Nearest Neighbor classifier, is presented. Detected exudates are validated against expert ophthalmologists' hand-drawn ground truths. The sensitivity, specificity, precision, accuracy and time complexity of each method are also compared.
Automated inspection and precision grinding of spiral bevel gears
NASA Technical Reports Server (NTRS)
Frint, Harold
1987-01-01
The results of a four-phase MM&T program to define, develop, and evaluate an improved inspection system for spiral bevel gears are presented. The improved method utilizes a multi-axis coordinate measuring machine which maps the working flank of the tooth and compares it to nominal reference values stored in the machine's computer. A unique feature of the system is that corrective grinding machine settings can be automatically calculated and printed out when necessary to correct an errant tooth profile. This new method eliminates most of the subjective decision making involved in the present method, which compares contact patterns obtained when the gear set is run under light load in a rolling test machine. It produces a higher quality gear with significant savings in inspection time and cost.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bromenshenk, J.J.; Smith, G.C.
Honey bees (Apis mellifera L.) have been shown to be multi-media monitors of chemical exposures and resultant effects. This five-year project has developed an automated system to assess in real-time colony behavioral responses to stressors, both anthropogenic and natural, including inclement weather. Field trials at the Aberdeen Proving Ground-Edgewood included the Old O Field and J Field landfills, the Canal Creek and Bush River areas, and a Churchville, MD reference site. Preliminary results show varying concentrations of bioavailable inorganic elements and chlorinated hydrocarbons in bee colonies from all Maryland sites. Industrial solvents in the air inside beehives exhibited the greatest between-site differences, with the highest levels occurring in hives near landfills at Old O Field, J Field, and at some sites in the Bush River and Canal Creek areas. Compared to 1996, the 1997 levels of solvents in Old O Field hives decreased by an order of magnitude, and colony performance significantly improved, probably as a consequence of capping the landfill. Recent chemical monitoring accomplishments include development of a new apparatus to quantitatively calibrate TD/GC/MS analysis, a QA/QC assessment of factors that limit the precision of these analyses, and confirmation of transport of aqueous contaminants into the hive. Real-time effects monitoring advances include development of an extensive array of software tools for automated data display, inspection, and numerical analysis and the ability to deliver data from remote locations in real time through Internet or Intranet connections.
Automated analysis of brain activity for seizure detection in zebrafish models of epilepsy.
Hunyadi, Borbála; Siekierska, Aleksandra; Sourbron, Jo; Copmans, Daniëlle; de Witte, Peter A M
2017-08-01
Epilepsy is a chronic neurological condition, with over 30% of cases unresponsive to treatment. Zebrafish larvae show great potential to serve as an animal model of epilepsy in drug discovery. Thanks to their high fecundity and relatively low cost, they are amenable to high-throughput screening. However, the assessment of seizure occurrences in zebrafish larvae remains a bottleneck, as visual analysis is subjective and time-consuming. For the first time, we present an automated algorithm to detect epileptic discharges in single-channel local field potential (LFP) recordings in zebrafish. First, candidate seizure segments are selected based on their energy and length. Afterwards, discriminative features are extracted from each segment. Using a labeled dataset, a support vector machine (SVM) classifier is trained to learn an optimal feature mapping. Finally, this SVM classifier is used to detect seizure segments in new signals. We tested the proposed algorithm both in a chemically-induced seizure model and a genetic epilepsy model. In both cases, the algorithm delivered similar results to visual analysis and found a significant difference in number of seizures between the epileptic and control group. Direct comparison with multichannel techniques or methods developed for different animal models is not feasible. Nevertheless, a literature review shows that our algorithm outperforms state-of-the-art techniques in terms of accuracy, precision and specificity, while maintaining a reasonable sensitivity. Our seizure detection system is a generic, time-saving and objective method to analyze zebrafish LFP, which can replace visual analysis and facilitate true high-throughput studies. Copyright © 2017 Elsevier B.V. All rights reserved.
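A schematic re-implementation of the pipeline's two stages (energy-based candidate selection, then feature extraction and SVM classification); the window length, energy threshold and feature set are illustrative stand-ins for the paper's choices, not its actual code.

```python
import numpy as np
from sklearn.svm import SVC

def candidate_segments(lfp, fs, win_s=1.0, energy_factor=5.0):
    """Select fixed-length windows whose energy exceeds a multiple of the median."""
    w = int(win_s * fs)
    windows = lfp[: len(lfp) // w * w].reshape(-1, w)
    energy = (windows ** 2).sum(axis=1)
    return windows[energy > energy_factor * np.median(energy)]

def features(seg):
    """Simple per-segment features: energy, line length, peak amplitude."""
    return [np.sum(seg ** 2), np.sum(np.abs(np.diff(seg))), np.max(np.abs(seg))]

# Smoke test on synthetic data; real use would pass the recorded LFP trace.
segs = candidate_segments(np.random.randn(60_000), fs=1000)

# With a labeled training set (X = [features(s) for s in segments], y = labels):
# clf = SVC(kernel="rbf").fit(X, y)
# detections = [s for s in candidate_segments(lfp, fs=1000)
#               if clf.predict([features(s)])[0] == 1]
```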
Generating disease-pertinent treatment vocabularies from MEDLINE citations.
Wang, Liqin; Del Fiol, Guilherme; Bray, Bruce E; Haug, Peter J
2017-01-01
Healthcare communities have identified a significant need for disease-specific information. Disease-specific ontologies are useful in assisting the retrieval of disease-relevant information from various sources. However, building these ontologies is labor intensive. Our goal is to develop a system for the automated generation of disease-pertinent concepts from a popular knowledge resource for the building of disease-specific ontologies. A pipeline system was developed with an initial focus on generating disease-specific treatment vocabularies. It comprised components for disease-specific citation retrieval, predication extraction, treatment predication extraction, treatment concept extraction, and relevance ranking. A semantic schema was developed to support the extraction of treatment predications and concepts. Four ranking approaches (i.e., occurrence, interest, degree centrality, and weighted degree centrality) were proposed to measure the relevance of treatment concepts to the disease of interest. We measured the performance of the four rankings in terms of mean precision over the top 100 concepts for five diseases, as well as precision-recall curves against two reference vocabularies. The performance of the system was also compared to two baseline approaches. The pipeline system achieved a mean precision of 0.80 for the top 100 concepts with ranking by interest. There was no significant difference among the four ranking approaches (p=0.53). However, the pipeline-based system had significantly better performance than the two baselines. The pipeline system can be useful for the automated generation of disease-relevant treatment concepts from the biomedical literature. Copyright © 2016 Elsevier Inc. All rights reserved.
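To make the ranking step concrete, the sketch below builds a co-occurrence graph from a few hypothetical treatment predications and ranks concepts by degree centrality, one of the four approaches named above; the triples and weights are invented examples, not output of the authors' predication extractor.

```python
import networkx as nx

# Hypothetical (subject, predicate, object) treatment predications
predications = [("metformin", "TREATS", "type 2 diabetes"),
                ("insulin", "TREATS", "type 2 diabetes"),
                ("metformin", "COEXISTS_WITH", "insulin")]

G = nx.Graph()
for subj, _pred, obj in predications:
    w = G[subj][obj]["weight"] + 1 if G.has_edge(subj, obj) else 1
    G.add_edge(subj, obj, weight=w)

# Plain degree centrality; a weighted variant would use G.degree(weight="weight")
ranked = sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])
print(ranked)
```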
Determination of selected anions in water by ion chromatography
Fishman, Marvin J.; Pyen, Grace
1979-01-01
Ion chromatography is a rapid, sensitive, precise, and accurate method for the determination of major anions in rain water and surface waters. Simultaneous analyses of a single sample for bromide, chloride, fluoride, nitrate, nitrite, orthophosphate, and sulfate require approximately 20 minutes to obtain a chromatogram. Minimum detection limits range from 0.01 milligrams per liter for fluoride to 0.20 milligrams per liter for chloride and sulfate. Percent relative standard deviations were less than nine percent for all anions except nitrite in Standard Reference Water Samples. Only one reference sample contained nitrite and its concentration was near the minimum level of detection. Similar precision was found for chloride, nitrate, and sulfate at concentrations less than 5 milligrams per liter in rainfall samples. Precision for fluoride ranged from 12 to 22 percent, but is attributed to the low concentrations in these samples. The other anions were not detected. To determine accuracy of results, several samples were spiked with known concentrations of fluoride, chloride, nitrate, and sulfate; recoveries ranged from 96 to 103 percent. Known amounts of bromide and phosphate were added, separately, to several other waters, which contained bromide or phosphate. Recovery of added bromide and phosphate ranged from approximately 95 to 104 percent. No recovery data were obtained for nitrite. Chloride, nitrate, nitrite, orthophosphate, and sulfate, in several samples, were also determined independently by automated colorimetric procedures. An automated ion-selective electrode method was used to determine fluoride. Results are in agreement with results obtained by ion chromatography.
Schoeller, D A; Colligan, A S; Shriver, T; Avak, H; Bartok-Olson, C
2000-09-01
The doubly labeled water method is commonly used to measure total energy expenditure in free-living subjects. The method, however, requires accurate and precise deuterium abundance determinations, which can be laborious. The aim of this study was to evaluate a fully automated, high-throughput, chromium reduction technique for the measurement of deuterium abundances in physiological fluids. The chromium technique was compared with an off-line zinc bomb reduction technique and also subjected to test-retest analysis. Analysis of international water standards demonstrated that the chromium technique was accurate and had a within-day precision of <1 per thousand. Addition of organic matter to water samples demonstrated that the technique was sensitive to interference at levels between 2 and 5 g l(-1). Physiological samples could be analyzed without this interference, plasma by 10000 Da exclusion filtration, saliva by sedimentation and urine by decolorizing with carbon black. Chromium reduction of urine specimens from doubly labeled water studies indicated no bias relative to zinc reduction with a mean difference in calculated energy expenditure of -0.2 +/- 3.9%. Blinded reanalysis of urine specimens from a second doubly labeled water study demonstrated a test-retest coefficient of variation of 4%. The chromium reduction method was found to be a rapid, accurate and precise method for the analysis of urine specimens from doubly labeled water. Copyright 2000 John Wiley & Sons, Ltd.
Automated method for determining Instron Residual Seal Force of glass vial/rubber closure systems.
Ludwig, J D; Nolan, P D; Davis, C W
1993-01-01
Instron Residual Seal Force (IRSF) of glass vial/rubber closure systems was determined using an Instron 4501 Materials Testing System. Computer programs were written to process raw data and calculate IRSF values. Preliminary experiments indicated both the appearance of the stress-deformation curves and precision of the derived IRSF values were dependent on the internal dimensions and top surface geometry of the cap anvil. Therefore, a series of five cap anvils varying in shape and dimensions were machined to optimize performance and precision. Vials capped with West 4416/50 PURCOAT button closures or Helvoet compound 6207 lyophilization closures were tested with each cap anvil. Cap anvils with spherical top surfaces and narrow internal dimensions produced more precise results and more uniform stress-deformation curves than cap anvils with flat top surfaces and wider internal dimensions.
NASA Technical Reports Server (NTRS)
2000-01-01
The Automated Endoscopic System for Optimal Positioning, or AESOP, was developed by Computer Motion, Inc. under an SBIR contract from the Jet Propulsion Lab. AESOP is a robotic endoscopic positioning system used to control the motion of a camera during endoscopic surgery. The camera, which is mounted at the end of a robotic arm, previously had to be held in place by the surgical staff. With AESOP the robotic arm can make more precise and consistent movements. AESOP is also voice controlled by the surgeon. It is hoped that this technology can be used in space repair missions which require precision beyond human dexterity. A new generation of the same technology, the ZEUS Robotic Surgical System, can make endoscopic procedures even more successful. ZEUS allows the surgeon to control various instruments held in its robotic arms, providing the precision the procedure requires.
All-optical patterning of Au nanoparticles on surfaces using optical traps.
Guffey, Mason J; Scherer, Norbert F
2010-11-10
The fabrication of nanoscale devices would be greatly enhanced by "nanomanipulators" that can position single and few objects rapidly with nanometer precision and without mechanical damage. Here, we demonstrate the feasibility and precision of an optical laser tweezer, or optical trap, approach to place single gold (Au) nanoparticles on surfaces with high precision (approximately 100 nm standard deviation). The error in the deposition process is rather small but is determined to be larger than the thermal fluctuations of single nanoparticles within the optical trap. Furthermore, areas of tens of square micrometers could be patterned in a matter of minutes. Since the method does not rely on lithography, scanning probes or a specialized surface, it is versatile and compatible with a variety of systems. We discuss active feedback methods to improve positioning accuracy and the potential for multiplexing and automation.
Image Processing for Bioluminescence Resonance Energy Transfer Measurement-BRET-Analyzer.
Chastagnier, Yan; Moutin, Enora; Hemonnot, Anne-Laure; Perroy, Julie
2017-01-01
A growing number of tools now allow live recordings of various signaling pathways and protein-protein interaction dynamics in time and space by ratiometric measurements, such as Bioluminescence Resonance Energy Transfer (BRET) imaging. Accurate and reproducible analysis of ratiometric measurements has thus become mandatory to interpret quantitative imaging. To fulfill this need, we have developed an open source toolset for Fiji - BRET-Analyzer - allowing a systematic analysis, from image processing to ratio quantification. We share this open source solution and a step-by-step tutorial at https://github.com/ychastagnier/BRET-Analyzer. This toolset proposes (1) image background subtraction, (2) image alignment over time, (3) a composite thresholding method for the image used as the denominator of the ratio to refine the precise limits of the sample, (4) pixel-by-pixel division of the images and efficient distribution of the ratio intensity on a pseudocolor scale, and (5) quantification of the mean ratio intensity and its standard deviation among pixels in chosen areas. In addition to systematizing the analysis process, we show that BRET-Analyzer allows proper reconstitution and quantification of the ratiometric image in time and space, even from heterogeneous subcellular volumes. Indeed, analyzing the same images twice, we demonstrate that, compared to standard analysis, BRET-Analyzer precisely defines the limits of the luminescent specimen, resolving signal from both small and large ensembles over time. For example, we followed and quantified, live, the interaction dynamics of scaffold proteins in neuronal subcellular compartments, including dendritic spines, for half an hour. In conclusion, BRET-Analyzer provides a complete, versatile and efficient toolset for automated, reproducible and meaningful image ratio analysis.
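A compact numpy sketch of the ratiometric core of such a pipeline (steps 1, 3 and 4: background subtraction, denominator thresholding and pixel-by-pixel division); this is a schematic stand-in, not the Fiji plugin itself, and the simple intensity threshold here simplifies the composite thresholding method described above.

```python
import numpy as np

def bret_ratio(acceptor, donor, bg_a, bg_d, thresh):
    """Return the acceptor/donor ratio image, NaN outside the specimen."""
    a = acceptor.astype(float) - bg_a      # background subtraction
    d = donor.astype(float) - bg_d
    valid = d > thresh                     # keep pixels inside the specimen
    ratio = np.full(a.shape, np.nan)
    ratio[valid] = a[valid] / d[valid]     # pixel-by-pixel division
    return ratio

# Display with a pseudocolor LUT, e.g. matplotlib's imshow(ratio, cmap="jet");
# mean and standard deviation in a region give the step-(5) quantification.
```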
NASA Technical Reports Server (NTRS)
Ross, Kenton W.; McKellip, Rodney D.
2005-01-01
Topics covered include: Implementation and Validation of Sensor-Based Site-Specific Crop Management; Enhanced Management of Agricultural Perennial Systems (EMAPS) Using GIS and Remote Sensing; Validation and Application of Geospatial Information for Early Identification of Stress in Wheat; Adapting and Validating Precision Technologies for Cotton Production in the Mid-Southern United States - 2004 Progress Report; Development of a System to Automatically Geo-Rectify Images; Economics of Precision Agriculture Technologies in Cotton Production-AG 2020 Prescription Farming Automation Algorithms; Field Testing a Sensor-Based Applicator for Nitrogen and Phosphorus Application; Early Detection of Citrus Diseases Using Machine Vision and DGPS; Remote Sensing of Citrus Tree Stress Levels and Factors; Spectral-based Nitrogen Sensing for Citrus; Characterization of Tree Canopies; In-field Sensing of Shallow Water Tables and Hydromorphic Soils with an Electromagnetic Induction Profiler; Maintaining the Competitiveness of Tree Fruit Production Through Precision Agriculture; Modeling and Visualizing Terrain and Remote Sensing Data for Research and Education in Precision Agriculture; Thematic Soil Mapping and Crop-Based Strategies for Site-Specific Management; and Crop-Based Strategies for Site-Specific Management.
Hello darkness my old friend: the fading of the nearby TDE ASASSN-14ae
NASA Astrophysics Data System (ADS)
Brown, Jonathan S.; Shappee, Benjamin J.; Holoien, T. W.-S.; Stanek, K. Z.; Kochanek, C. S.; Prieto, J. L.
2016-11-01
We present late-time optical spectroscopy taken with the Large Binocular Telescope's Multi-Object Double Spectrograph, an improved All-Sky Automated Survey for SuperNovae pre-discovery non-detection, and late-time Swift observations of the nearby (d = 193 Mpc, z = 0.0436) tidal disruption event (TDE) ASASSN-14ae. Our observations span from ˜20 d before to ˜750 d after discovery. The proximity of ASASSN-14ae allows us to study the optical evolution of the flare and the transition to a host-dominated state with exceptionally high precision. We measure very weak Hα emission 300 d after discovery (LH α ≃ 4 × 1039 erg s-1) and the most stringent upper limit to date on the Hα luminosity ˜750 d after discovery (LH α ≲ 1039 erg s-1), suggesting that the optical emission arising from a TDE can vanish on a time-scale as short as 1 yr. Our results have important implications for both spectroscopic detection of TDE candidates at late times, as well as the nature of TDE host galaxies themselves.
Human-rating Automated and Robotic Systems - (How HAL Can Work Safely with Astronauts)
NASA Technical Reports Server (NTRS)
Baroff, Lynn; Dischinger, Charlie; Fitts, David
2009-01-01
Long duration human space missions, as planned in the Vision for Space Exploration, will not be possible without applying unprecedented levels of automation to support the human endeavors. The automated and robotic systems must carry the load of routine housekeeping for the new generation of explorers, as well as assist their exploration science and engineering work with new precision. Fortunately, the state of automated and robotic systems is sophisticated and sturdy enough to do this work - but the systems themselves have never been human-rated as all other NASA physical systems used in human space flight have. Our intent in this paper is to provide perspective on requirements and architecture for the interfaces and interactions between human beings and the astonishing array of automated systems; and the approach we believe necessary to create human-rated systems and implement them in the space program. We will explain our proposed standard structure for automation and robotic systems, and the process by which we will develop and implement that standard as an addition to NASA's Human Rating requirements. Our work here is based on real experience with both human system and robotic system designs, for surface operations as well as for in-flight monitoring and control, and on the necessities we have discovered for human-systems integration in NASA's Constellation program. We hope this will be an invitation to dialog and to consideration of a new issue facing new generations of explorers and their outfitters.
Automatic Evidence Retrieval for Systematic Reviews
Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G
2014-01-01
Background: Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing's effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Objective: Our goal was to evaluate an automatic method for citation snowballing's capacity to identify and retrieve the full text and/or abstracts of cited articles. Methods: Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. Results: The automatic method was able to correctly identify 633 citations (as a proportion of included citations: recall=66.7%, F1 score=79.3%; as a proportion of citations in MAS: recall=85.5%, F1 score=91.2%) with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. Conclusions: The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews. PMID:25274020
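For readers who want to check the arithmetic, the reported precision/recall/F1 figures are mutually consistent; the snippet below recomputes them from the stated counts (the total of roughly 648 returned matches is back-calculated from 633 true positives at 97.7% precision, an assumption rather than a number stated in the abstract).

```python
def prf(tp, fp, fn):
    """Precision, recall, F1 from true/false positives and false negatives."""
    p = tp / (tp + fp)
    r = tp / (tp + fn)
    return p, r, 2 * p * r / (p + r)

p, r, f1 = prf(tp=633, fp=648 - 633, fn=949 - 633)
print(f"precision={p:.3f} recall={r:.3f} F1={f1:.3f}")  # ~0.977 / 0.667 / 0.793
```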
Creating Ruddlesden-Popper phases by hybrid molecular beam epitaxy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haislmaier, Ryan C.; Stone, Greg; Alem, Nasim
2016-07-25
The synthesis of a 50 unit cell thick n = 4 Srn+1TinO3n+1 (Sr5Ti4O13) Ruddlesden-Popper (RP) phase film is demonstrated by sequentially depositing SrO and TiO2 layers in an alternating fashion using hybrid molecular beam epitaxy (MBE), where Ti was supplied using titanium tetraisopropoxide (TTIP). A detailed calibration procedure is outlined for determining the shuttering times needed to deposit SrO and TiO2 layers with precise monolayer doses, using in-situ reflection high energy electron diffraction (RHEED) as feedback. Using optimized Sr and TTIP shuttering times, a fully automated growth of the n = 4 RP phase was carried out over a period of >4.5 h. Very stable RHEED intensity oscillations were observed over the entire growth period. Structural characterization by X-ray diffraction and high resolution transmission electron microscopy revealed that a constant periodicity of four SrTiO3 perovskite unit cell blocks separating the double SrO rocksalt layers was maintained throughout the entire film thickness, with very few planar faults oriented perpendicular to the growth front direction. These results illustrate that hybrid MBE is capable of layer-by-layer growth with atomic level precision and excellent flux stability.
NASA Astrophysics Data System (ADS)
Spaulding, R. S.; Hales, B.; Beck, J. C.; Degrandpre, M. D.
2008-12-01
The four inorganic carbon parameters commonly measured as part of oceanic carbon cycle studies are total dissolved inorganic carbon (DIC), total alkalinity (AT), hydrogen ion concentration (pH) and partial pressure of CO2 (pCO2). AT determination is critical for anthropogenic CO2 inventory calculations and for quantifying CaCO3 saturation. Additionally, measurement of AT in combination with one other carbonate parameter can be used to describe the inorganic carbon equilibria. Current methods for measuring AT require calibrated volumetric flasks and burettes, gravimetry, or precise flow measurements. These methods also require analysis times of ˜15 min and sample volumes of ˜200 mL, and sample introduction is not automated, resulting in labor-intensive measurements and low temporal resolution. The Tracer Monitored Titration (TMT) system was previously developed at the University of Montana for AT measurements. The TMT is not dependent on accurate gravimetric, volumetric or flow rate measurements because it relies on a pH-sensitive indicator (tracer) to track the amount of titrant added to the sample. Sample and a titrant-indicator mixture are mechanically stirred in an optical flow cell, and pH is calculated using the indicator equilibrium constant and the spectrophotometrically determined concentrations of the acid and base forms of the indicator. AT is then determined using these data in a non-linear least squares regression of the AT mass and proton balances. The precision and accuracy of the TMT are 2 and 4 micromol per kg, respectively, in 16 min using 110 mL of sample. The TMT is dependent on complete mixing of titrant with the sample and accurate absorbance measurements. We have developed the segmented-flow TMT (SF-TMT) to improve on these aspects and decrease sample analysis time. The SF-TMT uses segmented flow instead of active mixing and a white LED instead of a tungsten-halogen light source. Air is added to the liquid flow stream, producing segments of liquid separated by air bubbles. Because liquid is not transferred between flow segments, there is rapid flushing, which reduces sample volume to <10 mL. Additionally, the slower movement of liquid at the tube walls compared to that at the tube center creates circulation within each liquid segment, mixing the sample and eliminating the need for mechanical stirring. The white LED has higher output at the wavelengths of interest, thus improving the precision of absorbance measurements. These improvements result in a faster, simpler method for measuring AT.
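A small sketch of the spectrophotometric pH step that underlies tracer-monitored titration: the acid (HIn) and base (In) forms of the indicator are quantified from absorbances via Beer's law and combined through the indicator equilibrium constant, pH = pK_ind + log10([In]/[HIn]). The molar absorptivities, path length and pK below are placeholders, and overlapping-band corrections used in practice are omitted.

```python
import numpy as np

def indicator_ph(A_acid_band, A_base_band, eps_acid, eps_base, path_cm, pK_ind):
    """Spectrophotometric pH from the two indicator absorbance bands."""
    HIn = A_acid_band / (eps_acid * path_cm)  # Beer's law, acid form (mol/L)
    In = A_base_band / (eps_base * path_cm)   # base form (mol/L)
    return pK_ind + np.log10(In / HIn)

# Placeholder numbers for illustration only
print(indicator_ph(A_acid_band=0.21, A_base_band=0.35,
                   eps_acid=2.2e4, eps_base=1.8e4, path_cm=1.0, pK_ind=7.5))
```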
ROx3: Retinal oximetry utilizing the blue-green oximetry method
NASA Astrophysics Data System (ADS)
Parsons, Jennifer Kathleen Hendryx
The ROx is a retinal oximeter under development with the purpose of non-invasively and accurately measuring oxygen saturation (SO2) in vivo. It is novel in that it utilizes the blue-green oximetry technique with on-axis illumination. ROx calibration tests were performed by inducing hypoxia in live anesthetized swine and comparing ROx measurements to SO2 values measured by a CO-Oximeter. Calibration was not achieved to the precision required for clinical use, but limiting factors were identified and improved. The ROx was used in a set of sepsis experiments on live pigs with the intention of tracking retinal SO2 during the development of sepsis. Though conclusions are qualitative due to insufficient calibration of the device, retinal venous SO2 is shown to trend generally with central venous SO2 as sepsis develops. The novel sepsis model developed in these experiments is also described. The method of cecal ligation and perforation with additional soiling of the abdomen consistently produced controllable severe sepsis/septic shock in a matter of hours. In addition, the ROx was used to collect retinal images from a healthy human volunteer. These experiments served as a bench test for several of the additions/modifications made to the ROx. This set of experiments specifically served to illuminate problems with various light paths and image acquisition. The analysis procedure for the ROx is under development, particularly automating the process for consistency, accuracy, and time efficiency. The current stage of automation is explained, including data acquisition processes and the automated vessel fit routine. Suggestions for the next generation of device minimization are also described.
Alegro, Maryana; Theofilas, Panagiotis; Nguy, Austin; Castruita, Patricia A; Seeley, William; Heinsen, Helmut; Ushizima, Daniela M; Grinberg, Lea T
2017-04-15
Immunofluorescence (IF) plays a major role in quantifying protein expression in situ and understanding cell function. It is widely applied in assessing disease mechanisms and in drug discovery research. Automation of IF analysis can transform studies using experimental cell models. However, IF analysis of postmortem human tissue relies mostly on manual interaction, is often low-throughput and prone to error, and suffers from low inter- and intra-observer reproducibility. Human postmortem brain samples challenge neuroscientists because of the high level of autofluorescence caused by accumulation of lipofuscin pigment during aging, hindering systematic analyses. We propose a method for automating cell counting and classification in IF microscopy of human postmortem brains. Our algorithm speeds up the quantification task while improving reproducibility. Dictionary learning and sparse coding allow for constructing improved cell representations using IF images. These models are input for detection and segmentation methods. Classification occurs by means of color distances between cells and a learned set. Our method successfully detected and classified cells in 49 human brain images. We evaluated our results in terms of true positive, false positive, false negative, precision, recall, false positive rate and F1 score metrics. We also measured user experience and the time saved compared to manual counting. We compared our results to four open-access IF-based cell-counting tools available in the literature. Our method showed improved accuracy for all data samples. The proposed method satisfactorily detects and classifies cells from human postmortem brain IF images, with the potential to be generalized to other counting tasks. Copyright © 2017 Elsevier B.V. All rights reserved.
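The representation-learning step can be sketched with scikit-learn's dictionary learning on image patches, as below; the patch size, dictionary size and the random stand-in image are illustrative, and the downstream detection/segmentation and color-distance classification are not shown.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

image = np.random.rand(256, 256)  # stand-in for one IF channel of a brain section
patches = extract_patches_2d(image, (8, 8), max_patches=5000, random_state=0)
X = patches.reshape(len(patches), -1)
X -= X.mean(axis=1, keepdims=True)  # remove the per-patch mean (DC component)

dico = MiniBatchDictionaryLearning(n_components=64, alpha=1.0,
                                   batch_size=256, random_state=0)
codes = dico.fit_transform(X)  # sparse coefficients feeding the cell detector
```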
Saha, Sajib Kumar; Fernando, Basura; Cuadros, Jorge; Xiao, Di; Kanagasingam, Yogesan
2018-04-27
Fundus images obtained in a telemedicine program are acquired at different sites by people with varying levels of experience. This results in a relatively high percentage of images being later marked as unreadable by graders. Unreadable images require a recapture, which is time and cost intensive. An automated method that determines image quality during acquisition is an effective alternative. Here we describe such an automated method for the assessment of image quality in the context of diabetic retinopathy (DR). The method applies machine learning techniques to assess each image and assign it to an 'accept' or 'reject' category, where a 'reject' image requires recapture. A deep convolutional neural network was trained to grade the images automatically. A large representative set of 7000 colour fundus images was used for the experiment, obtained from EyePACS and made available by the California Healthcare Foundation. Three retinal image analysis experts categorised these images into 'accept' and 'reject' classes based on a precise definition of image quality in the context of DR. The network was trained using 3428 images. The method categorises 'accept' and 'reject' images with an accuracy of 100%, about 2% higher than a traditional machine learning method. In a clinical trial, the proposed method showed 97% agreement with a human grader. The method can easily be incorporated into the fundus image capturing system at the acquisition centre and can guide the photographer on whether a recapture is necessary.
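A minimal Keras sketch of a binary accept/reject quality classifier of the general kind described; the architecture, input size and training call are illustrative placeholders rather than the network the authors trained on the 3428 EyePACS images.

```python
import tensorflow as tf

# Toy CNN: a 'reject' probability per fundus image (placeholder architecture)
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # P(reject)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_images, train_labels, validation_split=0.1, epochs=20)
```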
NASA Astrophysics Data System (ADS)
Yin, Yin; Fotin, Sergei V.; Periaswamy, Senthil; Kunz, Justin; Haldankar, Hrishikesh; Muradyan, Naira; Cornud, François; Turkbey, Baris; Choyke, Peter
2012-02-01
Manual delineation of the prostate is a challenging task for a clinician due to its complex and irregular shape. Furthermore, the need for precisely targeting the prostate boundary continues to grow. Planning for radiation therapy, MR-ultrasound fusion for image-guided biopsy, multi-parametric MRI tissue characterization, and context-based organ retrieval are examples where accurate prostate delineation can play a critical role in a successful patient outcome. Therefore, a robust automated full prostate segmentation system is desired. In this paper, we present an automated prostate segmentation system for 3D MR images. In this system, the prostate is segmented in two steps: the prostate displacement and size are first detected, and then the boundary is refined by a shape model. The detection approach is based on normalized gradient fields cross-correlation. This approach is fast, robust to intensity variation and provides good accuracy to initialize a prostate mean shape model. The refinement model is based on a graph-search framework, which incorporates both shape and topology information during deformation. We generated the graph cost using trained classifiers and used coarse-to-fine search and region-specific classifier training. The proposed algorithm was developed using 261 training images and tested on another 290 cases. The segmentation performance, with mean DSC ranging from 0.89 to 0.91 depending on the evaluation subset, is state of the art. Running time for the system is about 20 to 40 seconds depending on image size and resolution.
Quantitative high-throughput population dynamics in continuous-culture by automated microscopy.
Merritt, Jason; Kuehn, Seppe
2016-09-12
We present a high-throughput method to measure abundance dynamics in microbial communities sustained in continuous-culture. Our method uses custom epi-fluorescence microscopes to automatically image single cells drawn from a continuously-cultured population while precisely controlling culture conditions. For clonal populations of Escherichia coli our instrument reveals history-dependent resilience and growth rate dependent aggregation.
Method for 3D noncontact measurements of cut trees package area
NASA Astrophysics Data System (ADS)
Knyaz, Vladimir A.; Vizilter, Yuri V.
2001-02-01
Progress in imaging sensors and computers has created the foundation for numerous 3D imaging applications across a wide variety of manufacturing activities. The timber industry presents many demands for automated precision measurement. One of them is accurate volume determination for cut trees carried on a truck. The key to volume estimation is determining the front area of the cut-tree package. To eliminate the slow and inaccurate manual measurements currently in practice, an experimental system for automated non-contact wood measurement was developed. The system includes two non-metric CCD video cameras, a PC as the central processing unit, frame grabbers and original software for image processing and 3D measurement. The proposed measurement method is based on capturing a stereo pair of the front of the tree package and orthorectifying the image into the front plane. The transformed image is then processed to recognize circular shapes and calculate their total area. The metric characteristics of the system are provided by a special camera calibration procedure. The paper presents the developed method of 3D measurement, describes the hardware used for image acquisition and the software implementing the developed algorithms, and gives the productivity and precision characteristics of the system.
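The circle-recognition step on the orthorectified front-plane image could be prototyped with OpenCV's Hough transform, as in the hedged sketch below; the file name and all detector parameters are hypothetical, and the pixel-to-metre scale would come from the system's camera calibration, not from this snippet.

```python
import cv2
import numpy as np

img = cv2.imread("front_plane.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
assert img is not None, "orthorectified front-plane image not found"
img = cv2.medianBlur(img, 5)  # suppress bark texture before circle detection

circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1.2, minDist=20,
                           param1=100, param2=30, minRadius=8, maxRadius=80)
area_px = 0.0
if circles is not None:
    area_px = float(np.sum(np.pi * circles[0, :, 2] ** 2))  # sum of pi * r^2
# Multiply by the calibrated (metres per pixel)^2 scale to get the front area.
```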
Hadimioglu, Babur; Stearns, Richard; Ellson, Richard
2016-02-01
Liquid handling instruments for life science applications based on droplet formation with focused acoustic energy or acoustic droplet ejection (ADE) were introduced commercially more than a decade ago. While the idea of "moving liquids with sound" was known in the 20th century, the development of precise methods for acoustic dispensing to aliquot life science materials in the laboratory began in earnest in the 21st century with the adaptation of the controlled "drop on demand" acoustic transfer of droplets from high-density microplates for high-throughput screening (HTS) applications. Robust ADE implementations for life science applications achieve excellent accuracy and precision by using acoustics first to sense the liquid characteristics relevant for its transfer, and then to actuate transfer of the liquid with customized application of sound energy to the given well and well fluid in the microplate. This article provides an overview of the physics behind ADE and its central role in both acoustical and rheological aspects of robust implementation of ADE in the life science laboratory and its broad range of ejectable materials. © 2015 Society for Laboratory Automation and Screening.
Rehman, Amjad; Abbas, Naveed; Saba, Tanzila; Mahmood, Toqeer; Kolivand, Hoshang
2018-04-10
Separating rouleaux (chains of RBCs) from single RBCs and further subdividing them is a challenging area in computer-assisted diagnosis of blood. This step is applied in complete blood count, anemia, leukemia, and malaria tests. Several automated techniques are reported in the state of the art for this task, but they suffer from either under- or over-splitting. The current research presents a novel approach to precisely split rouleaux red blood cells, which are frequently observed in thin blood smear images. Accordingly, this research addresses the rouleaux splitting problem in a realistic, efficient and automated way by considering the distance transform and the local maxima of the rouleaux RBCs. Rouleaux RBCs are split by taking their local maxima as the centres of circles drawn by the midpoint circle algorithm. The resulting circles are further mapped to the single RBCs in the rouleaux to preserve their original shape. The results of the proposed approach on a standard data set are presented and analyzed statistically, achieving an average recall of 0.059, an average precision of 0.067 and an F-measure of 0.063 against ground truth with visual inspection. © 2018 Wiley Periodicals, Inc.
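A schematic re-implementation of the splitting idea with SciPy/scikit-image: the distance transform of the binary rouleaux mask peaks near cell centres, local maxima serve as circle centres, and the local distance value approximates the cell radius. Parameters are illustrative, and the shape-preserving mapping back to single RBCs is not reproduced here.

```python
import numpy as np
from scipy import ndimage
from skimage.draw import circle_perimeter
from skimage.feature import peak_local_max

def split_rouleaux(mask, min_sep=5):
    """mask: binary image of a rouleaux (chain of overlapping RBCs)."""
    dist = ndimage.distance_transform_edt(mask)
    centres = peak_local_max(dist, min_distance=min_sep)  # ~one peak per cell
    outline = np.zeros(mask.shape, dtype=np.uint8)
    for r, c in centres:
        radius = max(int(round(dist[r, c])), 1)  # distance to background ~ radius
        rr, cc = circle_perimeter(int(r), int(c), radius, shape=mask.shape)
        outline[rr, cc] = 1                      # midpoint-circle-style perimeter
    return centres, outline
```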
Synergies between exoplanet surveys and variable star research
NASA Astrophysics Data System (ADS)
Kovacs, Geza
2017-09-01
With the discovery of the first transiting extrasolar planetary system back in 1999, a great number of projects started to hunt for other similar systems. Because the incidence rate of such systems was unknown and the length of the shallow transit events is only a few percent of the orbital period, the goal was to monitor continuously as many stars as possible for at least a period of a few months. Small aperture, large field of view automated telescope systems have been installed, with a parallel development of new data reduction and analysis methods, leading to better than 1% per data point precision for thousands of stars. With the successful launch of the photometric satellites CoRoT and Kepler, the precision increased further by one to two orders of magnitude. Millions of stars have been analyzed and searched for transits. In the history of variable star astronomy this is the biggest undertaking so far, resulting in photometric time series inventories immensely valuable for the whole field. In this review we briefly discuss the methods of data analysis that were inspired by the main science driver of these surveys and highlight some of the most interesting variable star results that impact the field of variable star astronomy.
Gibb, Stuart W.; Wood, John W.; Fauzi, R.; Mantoura, C.
1995-01-01
The automation and improved design and performance of Flow Injection Gas Diffusion-Ion Chromatography (FIGD-IC), a novel technique for the simultaneous analysis of trace ammonia (NH3) and methylamines (MAs) in aqueous media, is presented. Automated Flow Injection Gas Diffusion (FIGD) promotes the selective transmembrane diffusion of MAs and NH3 from an aqueous sample under strongly alkaline (pH > 12, NaOH), chelated (EDTA) conditions into a recycled acidic acceptor stream. The acceptor is then injected onto an ion chromatograph, where NH3 and the MAs are fully resolved as their cations and detected conductimetrically. A versatile PC-interfaced control unit and data capture unit (DCU) are employed in series to direct the solenoid valve switching sequence, IC operation and collection of data. Automation, together with other modifications, improved both the linearity (R2 > 0.99; MAs 0-100 nM, NH3 0-1000 nM) and precision (<8%) of FIGD-IC at nanomolar concentrations, compared with the manual procedure. The system was successfully applied to the determination of MAs and NH3 in seawater and in trapped particulate and gaseous atmospheric samples during an oceanographic research cruise. PMID:18925047
Nohara, L L; Lema, C; Bader, J O; Aguilera, R J; Almeida, I C
2010-12-01
Chagas disease affects 8-11 million people, mostly in Latin America. Sequelae include cardiac, peripheral nervous and/or gastrointestinal disorders, thus placing a large economic and social burden on endemic countries. The pathogenesis and clinical evolution of the disease are not fully understood. Moreover, available drugs are only partially effective and toxic, and there is no vaccine. Therefore, there is an urgent need to speed up basic and translational research in the field. Here, we applied automated high-content imaging to generate multiparametric data on a cell-by-cell basis to precisely and quickly determine several parameters associated with in vitro infection of host cells by Trypanosoma cruzi, the causative agent of Chagas disease. Automated and manual quantifications were used to determine the percentage of T. cruzi-infected cells in a 96-well microplate format, and the data generated were statistically evaluated. Most importantly, this automated approach can be widely applied for the discovery of potential drugs as well as molecular pathway elucidation, not only in T. cruzi but also in other human intracellular pathogens. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
[Morphometry of pulmonary tissue: From manual to high throughput automation].
Sallon, C; Soulet, D; Tremblay, Y
2017-12-01
Weibel's research has shown that any alteration of pulmonary structure has effects on function. This demonstration required a quantitative analysis of lung structures called morphometry, which is possible thanks to stereology, a set of methods based on principles of geometry and statistics. His work has helped to better understand the morphological harmony of the lung, which is essential for its proper functioning. An imbalance leads to pathophysiology such as chronic obstructive pulmonary disease in adults and bronchopulmonary dysplasia in neonates. It is by studying this imbalance that new therapeutic approaches can be developed. These advances are achievable only through morphometric analytical methods, which are increasingly precise and focused, in particular thanks to the high-throughput automation of these methods. This review compares an automated method that we developed in the laboratory with semi-manual methods of morphometric analysis. The automation of morphometric measurements is a fundamental asset in the study of pulmonary pathophysiology because it ensures robustness, reproducibility and speed. This tool will thus contribute significantly to accelerating the race for the development of new drugs. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
Automated Processing of Plasma Samples for Lipoprotein Separation by Rate-Zonal Ultracentrifugation.
Peters, Carl N; Evans, Iain E J
2016-12-01
Plasma lipoproteins are the primary means of lipid transport among tissues. Defining alterations in lipid metabolism is critical to our understanding of disease processes. However, lipoprotein measurement is limited to specialized centers. Preparation for ultracentrifugation involves the formation of complex density gradients that is both laborious and subject to handling errors. We created a fully automated device capable of forming the required gradient. The design has been made freely available for download by the authors. It is inexpensive relative to commercial density gradient formers, which generally create linear gradients unsuitable for rate-zonal ultracentrifugation. The design can easily be modified to suit user requirements and any potential future improvements. Evaluation of the device showed reliable peristaltic pump accuracy and precision for fluid delivery. We also demonstrate accurate fluid layering with reduced mixing at the gradient layers when compared to usual practice by experienced laboratory personnel. Reduction in layer mixing is of critical importance, as it is crucial for reliable lipoprotein separation. The automated device significantly reduces laboratory staff input and reduces the likelihood of error. Overall, this device creates a simple and effective solution to formation of complex density gradients. © 2015 Society for Laboratory Automation and Screening.
Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes
NASA Astrophysics Data System (ADS)
Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao
2010-06-01
To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed the PAM (PF Automated Mounting system) sample exchange robots at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchange, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) of the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes compatible with both PAM and SPACE are being developed as part of the TPRP.
Model-centric distribution automation: Capacity, reliability, and efficiency
Onen, Ahmet; Jung, Jaesung; Dilek, Murat; ...
2016-02-26
A series of analyses along with field validations that evaluate efficiency, reliability, and capacity improvements of model-centric distribution automation are presented. With model-centric distribution automation, the same model is used from design to real-time control calculations. A 14-feeder system with 7 substations is considered. The analyses involve hourly time-varying loads and annual load growth factors. Phase balancing and capacitor redesign modifications are used to better prepare the system for distribution automation, where the designs are performed considering time-varying loads. Coordinated control of load tap changing transformers, line regulators, and switched capacitor banks is considered. In evaluating distribution automation versus traditional system design and operation, quasi-steady-state power flow analysis is used. In evaluating distribution automation performance for substation transformer failures, reconfiguration for restoration analysis is performed. In evaluating distribution automation for storm conditions, Monte Carlo simulations coupled with reconfiguration for restoration calculations are used. As a result, the evaluations demonstrate that model-centric distribution automation has positive effects on system efficiency, capacity, and reliability.
Automated MRI Segmentation for Individualized Modeling of Current Flow in the Human Head
Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.
2013-01-01
Objective: High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography (HD-EEG) require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images (MRI) requires labor-intensive manual segmentation, even when leveraging available automated segmentation tools. Also, accurate placement of many high-density electrodes on individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach: A fully automated segmentation technique based on Statistical Parametric Mapping 8 (SPM8), including an improved tissue probability map (TPM) and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on 4 healthy subjects and 7 stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results: The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view (FOV) extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance: Fully automated individualized modeling may now be feasible for large-sample EEG research studies and tDCS clinical trials. PMID:24099977
Old and new techniques mixed up into optical photomask measurement method
NASA Astrophysics Data System (ADS)
Fukui, Jumpei; Tachibana, Yusaku; Osanai, Makoto
2017-07-01
Cost-efficient, easy-to-operate, fully automated CD measurement of line widths from about 500 nm up to 5 μm on photomasks remains in high demand, because such photomasks are frequently used in manufacturing MEMS sensors for IoT and devices made in BCD (Bipolar-CMOS-DMOS) processes. In response to this demand from the photomask manufacturing field, we incorporated recently developed low-noise digital camera technology and an i-line LED light source into a new measuring tool, in order to achieve 1 nm (3σ) repeatability for line-width measurement between 300 nm and 10 μm. In addition, fully automated operation requires locating the initial target line within a dense pattern. To achieve such automatic line detection precisely, we improved the accuracy of the high-precision stage (20 nm, 3σ) and combined the alignment algorithm of our MEMS stepper with this tool. As for the user interface, Windows-based software supports not only operation but also recipe creation and editing in Excel. In the MEMS manufacturing process, many different photomasks must be checked and measured frequently, so recipe files must also be created and edited frequently. To meet this requirement in photomask management, we mixed old and new techniques into one system, yielding a fully automated, cost-efficient tool with 1 nm repeatability in CD measurement.
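The core of such a CD measurement is extracting a line width from an intensity profile across the feature. The sketch below shows one standard approach: sub-pixel edge localization by linear interpolation at a 50% threshold, with repeatability reported as 3σ over repeated profiles. The pixel scale and threshold are illustrative assumptions, not the tool's actual algorithm.

```python
# Sub-pixel line-width measurement from a 1-D intensity profile.
import numpy as np

def edge_positions(profile, threshold=0.5):
    """Sub-pixel crossings of the normalized profile at the threshold."""
    p = (profile - profile.min()) / (profile.max() - profile.min())
    crossings = []
    for i in range(len(p) - 1):
        if (p[i] - threshold) * (p[i + 1] - threshold) < 0:
            frac = (threshold - p[i]) / (p[i + 1] - p[i])
            crossings.append(i + frac)  # linear interpolation between pixels
    return crossings

def line_width_nm(profile, nm_per_pixel=50.0):
    edges = edge_positions(profile)
    if len(edges) < 2:
        raise ValueError("line edges not found")
    return (edges[-1] - edges[0]) * nm_per_pixel

def repeatability_3sigma(profiles, nm_per_pixel=50.0):
    """3-sigma spread of repeated width measurements."""
    widths = [line_width_nm(p, nm_per_pixel) for p in profiles]
    return 3.0 * np.std(widths, ddof=1)
```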
Kinect, a Novel Cutting Edge Tool in Pavement Data Collection
NASA Astrophysics Data System (ADS)
Mahmoudzadeh, A.; Firoozi Yeganeh, S.; Golroo, A.
2015-12-01
Pavement roughness and surface distress detection is of interest to decision makers because of vehicle safety, user satisfaction, and cost savings. Data collection, the core of pavement management systems, is required for these detections. There are two major types of data collection: traditional/manual and automated/semi-automated. This paper studies different non-destructive tools for detecting cracks and potholes. For this purpose, recently adopted automated data collection tools are discussed and their applications critically reviewed. Their main drawback is the significant capital investment needed to buy the survey vehicle. The main scope of this paper is therefore an approach, and related tools, that are not only cost-effective but also precise and accurate. A new sensor, the Kinect, meets these specifications. It captures both RGB images and depth, which are of significant use in measuring cracks and potholes. The sensor images surfaces with adequate resolution to detect cracks, and measures the distance between the sensor and obstacles in front of it, yielding the depth of defects. This technology has very recently been studied by a few researchers in fields such as project management and biomedical engineering, but pavement management has not paid enough attention to the use of the Kinect for monitoring and detecting distresses. This paper provides a thorough literature review of Kinect use in pavement management and proposes the most cost-effective and precise approach.
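A hedged sketch of the measurement idea: fit a reference plane to the pavement surface in a Kinect depth frame, then flag pixels whose residual depth exceeds a threshold as pothole candidates. The depth units and the 10 mm threshold are assumptions for illustration, not values from the paper.

```python
# Pothole detection from a Kinect-style depth frame (values in mm).
import numpy as np

def fit_plane(depth_mm):
    """Least-squares road plane z = a*x + b*y + c over the frame."""
    h, w = depth_mm.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.column_stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    coef, *_ = np.linalg.lstsq(A, depth_mm.ravel(), rcond=None)
    return (A @ coef).reshape(h, w)

def pothole_mask(depth_mm, threshold_mm=10.0):
    """Pixels farther from the sensor than the fitted plane by > threshold."""
    residual = depth_mm - fit_plane(depth_mm)
    return residual > threshold_mm

def max_pothole_depth_mm(depth_mm, threshold_mm=10.0):
    residual = depth_mm - fit_plane(depth_mm)
    mask = residual > threshold_mm
    return float(residual[mask].max()) if mask.any() else 0.0
```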
Da Rin, G; Vidali, M; Balboni, F; Benegiamo, A; Borin, M; Ciardelli, M L; Dima, F; Di Fabio, A; Fanelli, A; Fiorini, F; Francione, S; Germagnoli, L; Gioia, M; Lari, T; Lorubbio, M; Marini, A; Papa, A; Seghezzi, M; Solarino, L; Pipitone, S; Tilocca, E; Buoro, S
2017-12-01
Recent automated hematology analyzers (HAs) can identify and report the nucleated red blood cell (NRBC) count as a population separate from white blood cells (WBC). The aim of this study was to investigate the analytical performance of NRBC enumeration on five top-of-the-range HAs. We evaluated the within-run and between-day precision, limit of blank (LoB), limit of detection (LoD), and limit of quantitation (LoQ) of the XE-2100 and XN-module (Sysmex), ADVIA 2120i (Siemens), BC-6800 (Mindray), and UniCel DxH 800 (Beckman Coulter). Automated NRBC counts were also compared with optical microscopy (OM). The limits of detection for NRBC of the BC-6800, XN-module, XE-2100, UniCel DxH 800, and ADVIA 2120i are 0.035×10⁹/L, 0.019×10⁹/L, 0.067×10⁹/L, 0.038×10⁹/L, and 0.167×10⁹/L, respectively. Our data indicated excellent precision. Agreement with OM was excellent for the BC-6800, XN-module, and XE-2100 (bias 0.023, 0.019, and 0.033×10⁹/L, respectively). The ADVIA 2120i displayed a significant constant error, and the UniCel DxH 800 both a proportional and a small constant error. With regard to NRBC counting, the performance of the BC-6800, XN-module, and XE-2100 is excellent even at low counts; the ADVIA 2120i and UniCel DxH 800 need improvement. © 2017 John Wiley & Sons Ltd.
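The figures of merit evaluated above can be computed as in the sketch below, which follows the usual CLSI EP17-style definitions of LoB and LoD plus a simple mean bias against microscopy; the study's exact protocol may differ, and the function names are mine.

```python
# LoB/LoD and bias versus optical microscopy for NRBC counts (x10^9/L).
import numpy as np

def limit_of_blank(blank_counts):
    """LoB = mean(blank) + 1.645 * SD(blank)."""
    b = np.asarray(blank_counts, dtype=float)
    return b.mean() + 1.645 * b.std(ddof=1)

def limit_of_detection(blank_counts, low_sample_counts):
    """LoD = LoB + 1.645 * SD(low-concentration sample)."""
    low = np.asarray(low_sample_counts, dtype=float)
    return limit_of_blank(blank_counts) + 1.645 * low.std(ddof=1)

def bias_vs_microscopy(analyzer_counts, microscopy_counts):
    """Mean analyzer-minus-OM difference, as reported above."""
    a = np.asarray(analyzer_counts, dtype=float)
    m = np.asarray(microscopy_counts, dtype=float)
    return float(np.mean(a - m))
```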
Ba, B B; Corniot, A G; Ducint, D; Breilh, D; Grellet, J; Saux, M C
1999-03-05
An isocratic high-performance liquid chromatographic method with automated solid-phase extraction has been developed to determine foscarnet in calf and human serum. Extraction was performed with an anion exchanger (SAX), from which the analyte was eluted with a 50 mM potassium pyrophosphate buffer, pH 8.4. The mobile phase consisted of methanol-40 mM disodium hydrogenphosphate, pH 7.6, containing 0.25 mM tetrahexylammonium hydrogensulphate (25:75, v/v). The analyte was separated on a polyether ether ketone (PEEK) column (150×4.6 mm I.D.) packed with Kromasil 100 C18, 5 μm. Amperometric detection allowed a quantification limit of 15 μM. The assay was linear from 15 to 240 μM. The recovery of foscarnet from calf serum ranged from 60.65±1.89% at 15 μM to 67.45±1.24% at 200 μM. The coefficient of variation was ≤3.73% for intra-assay precision and ≤7.24% for inter-assay precision for calf serum concentrations ranging from 15 to 800 μM. For the same samples, the deviation from the nominal value ranged from -8.97% to +5.40% for same-day accuracy and from -4.50% to +2.77% for day-to-day accuracy. Selectivity was satisfactory towards potential co-medications. Replacement of human serum by calf serum for calibration standards and quality control samples was validated. Automation brought greater protection against biohazards and increased productivity for routine monitoring and pharmacokinetic studies.
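The validation statistics reported above (linearity, back-calculated accuracy, intra-/inter-assay CV) reduce to a few lines of post-processing, sketched below under the assumption of an unweighted linear calibration; the published method may have used a different regression model, and all names are illustrative.

```python
# Calibration and precision post-processing for an HPLC assay.
import numpy as np

def calibrate(concentrations_um, peak_areas):
    """Least-squares calibration line; returns (slope, intercept, r)."""
    x = np.asarray(concentrations_um, dtype=float)
    y = np.asarray(peak_areas, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return slope, intercept, r

def back_calculate(peak_area, slope, intercept):
    """Concentration from a measured peak area via the calibration line."""
    return (peak_area - intercept) / slope

def cv_percent(replicates):
    """Coefficient of variation, as used for intra/inter-assay precision."""
    r = np.asarray(replicates, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()

def accuracy_percent(measured_um, nominal_um):
    """Deviation from the nominal value, in percent."""
    return 100.0 * (np.mean(measured_um) - nominal_um) / nominal_um
```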
A Microfluidic Platform for Correlative Live-Cell and Super-Resolution Microscopy
Tam, Johnny; Cordier, Guillaume Alan; Bálint, Štefan; Sandoval Álvarez, Ángel; Borbely, Joseph Steven; Lakadamyali, Melike
2014-01-01
Recently, super-resolution microscopy methods such as stochastic optical reconstruction microscopy (STORM) have enabled visualization of subcellular structures below the optical resolution limit. Due to the poor temporal resolution, however, these methods have mostly been used to image fixed cells or dynamic processes that evolve on slow time-scales. In particular, fast dynamic processes and their relationship to the underlying ultrastructure or nanoscale protein organization cannot be discerned. To overcome this limitation, we have recently developed a correlative and sequential imaging method that combines live-cell and super-resolution microscopy. This approach adds dynamic background to ultrastructural images providing a new dimension to the interpretation of super-resolution data. However, currently, it suffers from the need to carry out tedious steps of sample preparation manually. To alleviate this problem, we implemented a simple and versatile microfluidic platform that streamlines the sample preparation steps in between live-cell and super-resolution imaging. The platform is based on a microfluidic chip with parallel, miniaturized imaging chambers and an automated fluid-injection device, which delivers a precise amount of a specified reagent to the selected imaging chamber at a specific time within the experiment. We demonstrate that this system can be used for live-cell imaging, automated fixation, and immunostaining of adherent mammalian cells in situ followed by STORM imaging. We further demonstrate an application by correlating mitochondrial dynamics, morphology, and nanoscale mitochondrial protein distribution in live and super-resolution images. PMID:25545548
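The automated fluid-injection step described above amounts to delivering named reagents to a selected chamber at scheduled times between the live-cell and STORM imaging phases. The toy scheduler below illustrates that control flow; the hardware call is stubbed, and the protocol steps, volumes, and delays are hypothetical examples, not the authors' protocol.

```python
# Toy reagent-injection scheduler for a multi-chamber microfluidic chip.
import time

PROTOCOL = [  # (delay_s after previous step, chamber, reagent, volume_ul)
    (0,   1, "fixative",         50),
    (600, 1, "blocking buffer",  50),
    (300, 1, "primary antibody", 25),
]

def inject(chamber, reagent, volume_ul):
    # Placeholder for the valve/pump command of a real fluidics controller.
    print(f"chamber {chamber}: inject {volume_ul} uL of {reagent}")

def run_protocol(steps):
    """Wait out each incubation, then deliver the next reagent."""
    for delay_s, chamber, reagent, volume_ul in steps:
        time.sleep(delay_s)
        inject(chamber, reagent, volume_ul)

if __name__ == "__main__":
    run_protocol(PROTOCOL)
```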