Sample records for automated multiple pass

  1. Initial Correction versus Negative Marking in Multiple Choice Examinations

    ERIC Educational Resources Information Center

    Van Hecke, Tanja

    2015-01-01

    Optimal assessment tools should measure in a limited time the knowledge of students in a correct and unbiased way. A method for automating the scoring is multiple choice scoring. This article compares scoring methods from a probabilistic point of view by modelling the probability to pass: the number right scoring, the initial correction (IC) and…
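
    As a hedged illustration of the scoring rules named above (notation assumed here, not taken from the article): with m answer options per item, R right answers and W wrong answers, number right scoring and negative marking give

        S_{\mathrm{NR}} = R, \qquad S_{\mathrm{neg}} = R - \frac{W}{m-1}

    so under negative marking a blind guess has expected gain (1/m)(1) - ((m-1)/m)(1/(m-1)) = 0, leaving a pure guesser's expected score at the number of items actually known. Modelling the probability to pass then reduces to comparing the distribution of such scores against the cut score.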

  2. Spacecraft control center automation using the generic inferential executor (GENIE)

    NASA Technical Reports Server (NTRS)

    Hartley, Jonathan; Luczak, Ed; Stump, Doug

    1996-01-01

    The increasing requirement to dramatically reduce the cost of mission operations has led to an increased emphasis on automation technology. The expert system technology used at the Goddard Space Flight Center (MD) is currently being applied to the automation of spacecraft control center activities. The generic inferential executor (GENIE) is a tool that allows pass automation applications to be constructed. The pass script templates encode the tasks necessary to mimic flight operations team interactions with the spacecraft during a pass, and can be configured with data specific to a particular pass. Animated graphical displays illustrate progress during the pass. The first GENIE application automates passes of the solar, anomalous and magnetospheric particle explorer (SAMPEX) spacecraft.

  3. Using paradata to investigate food reporting patterns in AMPM

    USDA-ARS's Scientific Manuscript database

    The USDA Automated Multiple Pass Method (AMPM) Blaise instrument collects 24-hour dietary recalls for the What We Eat In America, National Health and Nutrition Examination Survey. Each year it is used in approximately 10,000 interviews which ask individuals to recall the foods and beverages that we...

  4. The U.S. Department of Agriculture Automated Multiple-Pass Method accurately assesses sodium intakes

    USDA-ARS's Scientific Manuscript database

    Accurate and practical methods to monitor sodium intake of the U.S. population are critical given current sodium reduction strategies. While the gold standard for estimating sodium intake is the 24 hour urine collection, few studies have used this biomarker to evaluate the accuracy of a dietary ins...

  5. Passing in Command Line Arguments and Parallel Cluster/Multicore Batching in R with batch.

    PubMed

    Hoffmann, Thomas J

    2011-03-01

    It is often useful to rerun a command line R script with some slight change in the parameters used to run it - a new set of parameters for a simulation, a different dataset to process, etc. The R package batch provides a means to easily pass multiple command line options into R, including vectors of values in the usual R format. The same script can be set up to run things in parallel via different command line arguments. The R package batch also simplifies this parallel batching by allowing one to use R and an R-like syntax for arguments to spread a script across a cluster or local multicore/multiprocessor computer, with automated syntax for several popular cluster types. Finally, it provides a means to aggregate the results of multiple processes run on a cluster.
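
    As a rough analogue of that batching pattern (a sketch in Python, not the R package's API; the parameter names and workload here are invented):

        # Python sketch of the command-line batching pattern (NOT the R
        # package's API): parse parameters from the command line, fan a
        # simulation out over a local process pool, then aggregate results.
        import argparse
        import random
        from multiprocessing import Pool

        def simulate(seed, n):
            # hypothetical workload: mean of n pseudo-random draws
            rng = random.Random(seed)
            return sum(rng.random() for _ in range(n)) / n

        def main():
            parser = argparse.ArgumentParser()
            parser.add_argument("--seeds", type=int, nargs="+", default=[1, 2, 3])
            parser.add_argument("--n", type=int, default=1000)
            args = parser.parse_args()
            with Pool() as pool:
                results = pool.starmap(simulate, [(s, args.n) for s in args.seeds])
            # aggregation step, analogous to collecting results across cluster jobs
            print(dict(zip(args.seeds, results)))

        if __name__ == "__main__":
            main()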

  6. Automated carbon dioxide cleaning system

    NASA Technical Reports Server (NTRS)

    Hoppe, David T.

    1991-01-01

    Solidified CO2 pellets are an effective blast medium for the cleaning of a variety of materials. CO2 is obtained from the waste gas streams generated from other manufacturing processes and therefore does not contribute to the greenhouse effect, depletion of the ozone layer, or the environmental burden of hazardous waste disposal. The system is capable of removing as much as 90 percent of the contamination from a surface in one pass or to a high cleanliness level after multiple passes. Although the system is packaged and designed for manual hand held cleaning processes, the nozzle can easily be attached to the end effector of a robot for automated cleaning of predefined and known geometries. Specific tailoring of cleaning parameters is required to optimize the process for each individual geometry. Using optimum cleaning parameters, the CO2 systems were shown to be capable of cleaning to molecular levels below 0.7 mg/sq ft. The systems were effective for removing a variety of contaminants such as lubricating oils, cutting oils, grease, alcohol residue, biological films, and silicone. The system was effective on steel, aluminum, and carbon phenolic substrates.

  7. Data fusion for automated non-destructive inspection

    PubMed Central

    Brierley, N.; Tippetts, T.; Cawley, P.

    2014-01-01

    In industrial non-destructive evaluation (NDE), it is increasingly common for data acquisition to be automated, driving a recent substantial increase in the availability of data. The collected data need to be analysed, typically necessitating the painstaking manual labour of a skilled operator. Moreover, in automated NDE a region of an inspected component is typically interrogated several times, be it within a single data channel due to multiple probe passes, across several channels acquired simultaneously or over the course of repeated inspections. The systematic combination of these diverse readings is recognized to offer an opportunity to improve the reliability of the inspection, but is not achievable in a manual analysis. This paper describes a data-fusion-based software framework providing a partial automation capability, allowing component regions to be declared defect-free to a very high probability while readily identifying defect indications, thereby optimizing the use of the operator's time. The system is designed to be applicable to a wide range of automated NDE scenarios, but the processing is exemplified using the industrial ultrasonic immersion inspection of aerospace turbine discs. Results obtained for industrial datasets demonstrate an orders-of-magnitude reduction in false-call rates, for a given probability of detection, achievable using the developed software system. PMID:25002828
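
    A minimal sketch of the fusion idea, assuming independent readings with known detection and false-call rates; this is illustrative Bayes updating, not the framework's actual processing:

        # Fuse repeated NDE readings of one region via Bayes' rule, assuming
        # per-reading probability of detection (pod) and false-call rate (pfa).
        def fuse(prior, readings, pod=0.9, pfa=0.01):
            """Posterior probability of a defect after independent readings.
            readings: list of booleans (True = indication called)."""
            p = prior
            for called in readings:
                like_def = pod if called else (1 - pod)
                like_ok = pfa if called else (1 - pfa)
                num = like_def * p
                p = num / (num + like_ok * (1 - p))
            return p

        # Three clean passes drive the defect probability far below the prior,
        # supporting a confident "defect-free" declaration for that region.
        print(fuse(prior=0.001, readings=[False, False, False]))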

  8. Cost-Effective Telemetry and Command Ground Systems Automation Strategy for the Soil Moisture Active Passive (SMAP) Mission

    NASA Technical Reports Server (NTRS)

    Choi, Josh; Sanders, Antonio

    2012-01-01

    Soil Moisture Active Passive (SMAP) is an Earth-orbiting, remote-sensing NASA mission slated for launch in 2014. The ground data system (GDS) being developed for SMAP is composed of many heterogeneous subsystems, ranging from those that support planning and sequencing to those used for real-time operations, and even further to those that enable science data exchange. A full end-to-end automation of the GDS may result in cost savings during mission operations, but it would require a significant upfront investment to develop such a comprehensive automation. As demonstrated by the Jason-1 and Wide-field Infrared Survey Explorer (WISE) missions, a measure of "lights-out" automation for routine, orbital pass, ground operations can still reduce mission costs through smaller staffing of operators and limiting their working hours. The challenge, then, for the SMAP GDS engineering team, is to formulate an automated operations strategy--and corresponding system architecture--to minimize operator intervention during routine operations, while balancing the development costs associated with the scope and complexity of automation. This paper discusses the automated operations approach being developed for the SMAP GDS. The focus is on automating the activities involved in routine passes, which limits the scope to real-time operations. A key subsystem of the SMAP GDS--NASA's AMMOS Mission Data Processing and Control System (AMPCS)--provides a set of capabilities that enable such automation. Also discussed are the lights-out pass automations of the Jason-1 and WISE missions and how they informed the automation strategy for SMAP. The paper aims to provide insights into what is necessary in automating the GDS operations for Earth satellite missions.

  9. Cost-Effective Telemetry and Command Ground Systems Automation Strategy for the Soil Moisture Active Passive (SMAP) Mission

    NASA Technical Reports Server (NTRS)

    Choi, Joshua S.; Sanders, Antonio L.

    2012-01-01

    Soil Moisture Active Passive (SMAP) is an Earth-orbiting, remote-sensing NASA mission slated for launch in 2014. The ground data system (GDS) being developed for SMAP is composed of many heterogeneous subsystems, ranging from those that support planning and sequencing to those used for real-time operations, and even further to those that enable science data exchange. A full end-to-end automation of the GDS may result in cost savings during mission operations, but it would require a significant upfront investment to develop such comprehensive automation. As demonstrated by the Jason-1 and Wide-field Infrared Survey Explorer (WISE) missions, a measure of "lights-out" automation for routine, orbital pass ground operations can still reduce mission cost through smaller staffing of operators and limited work hours. The challenge, then, for the SMAP GDS engineering team is to formulate an automated operations strategy--and corresponding system architecture--to minimize operator intervention during operations, while balancing the development cost associated with the scope and complexity of automation. This paper discusses the automated operations approach being developed for the SMAP GDS. The focus is on automating the activities involved in routine passes, which limits the scope to real-time operations. A key subsystem of the SMAP GDS--NASA's AMMOS Mission Data Processing and Control System (AMPCS)--provides a set of capabilities that enable such automation. Also discussed are the lights-out pass automations of the Jason-1 and WISE missions and how they informed the automation strategy for SMAP. The paper aims to provide insights into what is necessary in automating the GDS operations for Earth satellite missions.

  10. PipeOnline 2.0: automated EST processing and functional data sorting.

    PubMed

    Ayoubi, Patricia; Jin, Xiaojing; Leite, Saul; Liu, Xianghui; Martajaja, Jeson; Abduraham, Abdurashid; Wan, Qiaolan; Yan, Wei; Misawa, Eduardo; Prade, Rolf A

    2002-11-01

    Expressed sequence tags (ESTs) are generated and deposited in the public domain, as redundant, unannotated, single-pass reactions, with virtually no biological content. PipeOnline automatically analyses and transforms large collections of raw DNA-sequence data from chromatograms or FASTA files by calling the quality of bases, screening and removing vector sequences, assembling and rewriting consensus sequences of redundant input files into a unigene EST data set and finally through translation, amino acid sequence similarity searches, annotation of public databases and functional data. PipeOnline generates an annotated database, retaining the processed unigene sequence, clone/file history, alignments with similar sequences, and proposed functional classification, if available. Functional annotation is automatic and based on a novel method that relies on homology of amino acid sequence multiplicity within GenBank records. Records are examined through a function ordered browser or keyword queries with automated export of results. PipeOnline offers customization for individual projects (MyPipeOnline), automated updating and alert service. PipeOnline is available at http://stress-genomics.org.
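
    A skeleton of the staged flow described above, with each stage as a small function; the stage rules below are toy placeholders, not PipeOnline's implementations:

        # Staged sequence-cleanup skeleton: quality trimming, then vector
        # screening, chained as pure functions. The cutoff and the "vector"
        # string are hypothetical illustrations only.
        def trim_low_quality(seq, quals, cutoff=20):
            # keep the longest prefix whose bases all meet the quality cutoff
            for i, q in enumerate(quals):
                if q < cutoff:
                    return seq[:i]
            return seq

        def screen_vector(seq, vector="ACGTACGT"):
            # crude vector screen: drop a known vector substring if present
            return seq.replace(vector, "")

        def pipeline(seq, quals):
            seq = trim_low_quality(seq, quals)
            seq = screen_vector(seq)
            return seq

        print(pipeline("ACGTACGTTTGCA", [30] * 10 + [10] * 3))  # -> "TT"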

  11. The Accutension Stetho, an automated auscultatory device to validate automated sphygmomanometer readings in individual patients.

    PubMed

    Alpert, Bruce S

    2018-04-06

    The aim of this report is to describe a new device that can validate, by automated auscultation, individual blood pressure (BP) readings taken by automated sphygmomanometers. The Accutension Stetho utilizes a smartphone application in conjunction with a specially designed stethoscope that interfaces directly into the smartphone via the earphone jack. The Korotkoff sounds are recorded by the application and are analyzed by the operator on the screen of the smartphone simultaneously with the images from the sphygmomanometer screen during BP estimation. Current auscultatory validation standards require at least 85 subjects and strict statistical criteria for passage. A device that passes can make no guarantee of accuracy in individual patients. The Accutension Stetho is an inexpensive smartphone/stethoscope kit that estimates precise BP values by auscultation to confirm the accuracy of an automated sphygmomanometer's readings on individual patients. This should be of great value for both professional and, in certain circumstances, self-measurement BP. Patients will avoid both unnecessary treatment and errors of underestimation of BP in which the patient requires therapy. The Stetho's software has been validated in an independent ANSI/AAMI/ISO standard study. The Stetho has been shown to perform without difficulty with multiple deflation-based devices from many manufacturers.

  12. Automated mask and wafer defect classification using a novel method for generalized CD variation measurements

    NASA Astrophysics Data System (ADS)

    Verechagin, V.; Kris, R.; Schwarzband, I.; Milstein, A.; Cohen, B.; Shkalim, A.; Levy, S.; Price, D.; Bal, E.

    2018-03-01

    Over the years, mask and wafer defect dispositioning has become an increasingly challenging and time-consuming task. With design rules getting smaller, OPC getting more complex, and scanner illumination taking on free-form shapes, the probability that a user can perform accurate and repeatable classification of defects detected by mask inspection tools into pass/fail bins is decreasing. The critical challenges of mask defect metrology for small nodes (<30 nm) were reviewed in [1]. While Critical Dimension (CD) variation measurement is still the method of choice for determining a mask defect's future impact on the wafer, the high complexity of OPCs combined with high variability in pattern shapes poses a challenge for any automated CD variation measurement method. In this study, a novel approach for measurement generalization is presented. CD variation assessment performance is evaluated on multiple complex-shape patterns and is benchmarked against an existing qualified measurement methodology.

  13. Advancing haemostasis automation--successful implementation of robotic centrifugation and sample processing in a tertiary service hospital.

    PubMed

    Sédille-Mostafaie, Nazanin; Engler, Hanna; Lutz, Susanne; Korte, Wolfgang

    2013-06-01

    Laboratories today face increasing pressure to automate operations due to increasing workloads and the need to reduce expenditure. Few studies to date have focussed on the laboratory automation of preanalytical coagulation specimen processing. In the present study, we examined whether a clinical chemistry automation protocol meets the preanalytical requirements for the analyses of coagulation. During the implementation of laboratory automation, we began to operate a pre- and postanalytical automation system. The preanalytical unit processes blood specimens for chemistry, immunology and coagulation by automated specimen processing. As the production of platelet-poor plasma is highly dependent on optimal centrifugation, we examined specimen handling under different centrifugation conditions in order to produce optimal platelet deficient plasma specimens. To this end, manually processed models centrifuged at 1500 g for 5 and 20 min were compared to an automated centrifugation model at 3000 g for 7 min. For analytical assays that are performed frequently enough to be targets for full automation, Passing-Bablok regression analysis showed close agreement between different centrifugation methods, with a correlation coefficient between 0.98 and 0.99 and a bias between -5% and +6%. For seldom performed assays that do not mandate full automation, the Passing-Bablok regression analysis showed acceptable to poor agreement between different centrifugation methods. A full automation solution is suitable and can be recommended for frequent haemostasis testing.
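
    For reference, a simplified sketch of the Passing-Bablok slope/intercept estimate used in such method comparisons (the shifted median of pairwise slopes, without the confidence-interval machinery; a validated implementation should be used for real comparisons):

        # Simplified Passing-Bablok estimator: the slope is a shifted median
        # of all pairwise slopes (slopes of exactly -1 are discarded by
        # convention); confidence intervals and tie handling are omitted.
        import numpy as np

        def passing_bablok(x, y):
            x, y = np.asarray(x, float), np.asarray(y, float)
            n = len(x)
            slopes = []
            for i in range(n):
                for j in range(i + 1, n):
                    dx = x[j] - x[i]
                    if dx != 0:
                        s = (y[j] - y[i]) / dx
                        if s != -1:
                            slopes.append(s)
            slopes = np.sort(np.array(slopes))
            k = int(np.sum(slopes < -1))   # offset for scale invariance
            m = len(slopes)
            if m % 2:                      # odd number of slopes
                b = slopes[(m - 1) // 2 + k]
            else:
                b = 0.5 * (slopes[m // 2 + k - 1] + slopes[m // 2 + k])
            a = np.median(y - b * x)       # intercept: median residual
            return b, a

        # toy comparison of two methods measuring the same 5 samples
        print(passing_bablok([1, 2, 3, 4, 5], [1.1, 1.9, 3.2, 3.8, 5.1]))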

  14. Concurrent ultrasonic weld evaluation system

    DOEpatents

    Hood, Donald W.; Johnson, John A.; Smartt, Herschel B.

    1987-01-01

    A system for concurrent, non-destructive evaluation of partially completed welds for use in conjunction with an automated welder. The system utilizes real time, automated ultrasonic inspection of a welding operation as the welds are being made by providing a transducer which follows a short distance behind the welding head. Reflected ultrasonic signals are analyzed utilizing computer based digital pattern recognition techniques to discriminate between good and flawed welds on a pass by pass basis. The system also distinguishes between types of weld flaws.

  15. Concurrent ultrasonic weld evaluation system

    DOEpatents

    Hood, D.W.; Johnson, J.A.; Smartt, H.B.

    1985-09-04

    A system for concurrent, non-destructive evaluation of partially completed welds for use in conjunction with an automated welder. The system utilizes real time, automated ultrasonic inspection of a welding operation as the welds are being made by providing a transducer which follows a short distance behind the welding head. Reflected ultrasonic signals are analyzed utilizing computer based digital pattern recognition techniques to discriminate between good and flawed welds on a pass by pass basis. The system also distinguishes between types of weld flaws.

  16. Concurrent ultrasonic weld evaluation system

    DOEpatents

    Hood, D.W.; Johnson, J.A.; Smartt, H.B.

    1987-12-15

    A system for concurrent, non-destructive evaluation of partially completed welds for use in conjunction with an automated welder is disclosed. The system utilizes real time, automated ultrasonic inspection of a welding operation as the welds are being made by providing a transducer which follows a short distance behind the welding head. Reflected ultrasonic signals are analyzed utilizing computer based digital pattern recognition techniques to discriminate between good and flawed welds on a pass by pass basis. The system also distinguishes between types of weld flaws. 5 figs.

  17. Regional Myocardial Blood Volume and Flow: First-Pass MR Imaging with Polylysine-Gd-DTPA

    PubMed Central

    Wilke, Norbert; Kroll, Keith; Merkle, Hellmut; Wang, Ying; Ishibashi, Yukata; Xu, Ya; Zhang, Jiani; Jerosch-Herold, Michael; Mühler, Andreas; Stillman, Arthur E.; Bassingthwaighte, James B.; Bache, Robert; Ugurbil, Kamil

    2010-01-01

    The authors investigated the utility of an intravascular magnetic resonance (MR) contrast agent, poly-L-lysine-gadolinium diethylenetriaminepentaacetic acid (DTPA), for differentiating acutely ischemic from normally perfused myocardium with first-pass MR imaging. Hypoperfused regions, identified with microspheres, on the first-pass images displayed significantly decreased signal intensities compared with normally perfused myocardium (P < .0007). Estimates of regional myocardial blood content, obtained by measuring the ratio of areas under the signal intensity-versus-time curves in tissue regions and the left ventricular chamber, averaged 0.12 mL/g ± 0.04 (n = 35), compared with a value of 0.11 mL/g ± 0.05 measured with radiolabeled albumin in the same tissue regions. To obtain MR estimates of regional myocardial blood flow, in situ calibration curves were used to transform first-pass intensity-time curves into content-time curves for analysis with a multiple-pathway, axially distributed model. Flow estimates, obtained by automated parameter optimization, averaged 1.2 mL/min/g ± 0.5 (n = 29), compared with 1.3 mL/min/g ± 0.3 obtained with tracer microspheres in the same tissue specimens at the same time. The results represent a combination of T1-weighted first-pass imaging, intravascular relaxation agents, and a spatially distributed perfusion model to obtain absolute regional myocardial blood flow and volume. PMID:7766986
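
    A hedged restatement of the area-ratio estimate described above (symbols assumed here, not taken from the paper), with ΔS the baseline-corrected first-pass signal enhancement:

        V_b \approx \frac{\int_0^T \Delta S_{\mathrm{tissue}}(t)\,dt}{\int_0^T \Delta S_{\mathrm{LV}}(t)\,dt}

    The 0.12 mL/g figure quoted above is such a ratio, in close agreement with the 0.11 mL/g obtained from radiolabeled albumin in the same tissue regions.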

  18. Automating the process for locating no-passing zones using georeferencing data.

    DOT National Transportation Integrated Search

    2012-08-01

    This research created a method of using global positioning system (GPS) coordinates to identify the location of no-passing zones in two-lane highways. Analytical algorithms were developed for analyzing the availability of sight distance along the ali...

  19. Automated calculation of passing sight distance using GPS data

    DOT National Transportation Integrated Search

    2006-07-01

    Most of the rural highways in the United States of America are two-lane, two-way highways. In order to ensure smooth flow of traffic, maximum-passing opportunities must be provided on these highways, where the fast moving vehicles can overtake slow m...

  20. Automated calculation of passing sight distance using global positioning system data

    DOT National Transportation Integrated Search

    2006-07-01

    Most of the rural highways in the United States of America are two-lane, two-way highways. In order to ensure smooth flow of traffic, maximum-passing opportunities must be provided on these highways, where the fast moving vehicles can overtake slow m...

  1. Computer-Based Algorithmic Determination of Muscle Movement Onset Using M-Mode Ultrasonography

    DTIC Science & Technology

    2017-05-01

    contraction images were analyzed visually and with three different classes of algorithms: pixel standard deviation (SD), high-pass filter and Teager Kaiser... Linear relationships and agreements between computed and visual muscle onset were calculated. The top algorithms were high-pass filtered with a 30 Hz... suggest that computer automated determination using high-pass filtering is a potential objective alternative to visual determination in human

  2. 78 FR 18479 - Drawbridge Operation Regulations; Pass Manchac, LA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-27

    ... governing the operation of the Canadian National (CN) Railroad automated bascule span drawbridge across Pass... necessary to upgrade the electrical drive system and replace the seals on the gear drive unit that operates... Operations, telephone 202-366-9826. SUPPLEMENTARY INFORMATION: The CN Railroad has requested a temporary...

  3. Challenges in converting an interviewer-administered food probe database to self-administration in the National Cancer Institute Automated Self-administered 24-Hour Recall (ASA24).

    PubMed

    Zimmerman, Thea Palmer; Hull, Stephen G; McNutt, Suzanne; Mittl, Beth; Islam, Noemi; Guenther, Patricia M; Thompson, Frances E; Potischman, Nancy A; Subar, Amy F

    2009-12-01

    The National Cancer Institute (NCI) is developing an automated, self-administered 24-hour dietary recall (ASA24) application to collect and code dietary intake data. The goal of the ASA24 development is to create a web-based dietary interview based on the US Department of Agriculture (USDA) Automated Multiple Pass Method (AMPM) instrument currently used in the National Health and Nutrition Examination Survey (NHANES). The ASA24 food list, detail probes, and portion probes were drawn from the AMPM instrument; portion-size pictures from Baylor College of Medicine's Food Intake Recording Software System (FIRSSt) were added; and the food code/portion code assignments were linked to the USDA Food and Nutrient Database for Dietary Studies (FNDDS). The requirements that the interview be self-administered and fully auto-coded presented several challenges as the AMPM probes and responses were linked with the FNDDS food codes and portion pictures. This linking was accomplished through a "food pathway," or the sequence of steps that leads from a respondent's initial food selection, through the AMPM probes and portion pictures, to the point at which a food code and gram weight portion size are assigned. The ASA24 interview database that accomplishes this contains more than 1,100 food probes and more than 2 million food pathways and will include about 10,000 pictures of individual foods depicting up to 8 portion sizes per food. The ASA24 will make the administration of multiple days of recalls in large-scale studies economical and feasible.
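
    A toy illustration of the "food pathway" idea: an answer sequence leads from an initial food selection to a food code and gram weight. All probe wording, codes, and weights below are invented, not ASA24/FNDDS content:

        # Toy "food pathway" lookup: the respondent's answers to the probes
        # select a (food code, gram weight) pair for auto-coding.
        pathways = {
            "milk": {
                "probes": ["What kind was it?", "How much did you drink?"],
                "outcomes": {
                    ("whole", "1 cup"): ("11111000", 244.0),
                    ("2%", "1 cup"): ("11112110", 244.0),
                },
            },
        }

        def resolve(food, answers):
            """Walk one pathway: the answers select a (food code, grams) pair."""
            return pathways[food]["outcomes"].get(tuple(answers), ("unmatched", None))

        print(resolve("milk", ["whole", "1 cup"]))  # ('11111000', 244.0)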

  4. Ultrastructural evaluation of multiple pass low energy versus single pass high energy radio-frequency treatment.

    PubMed

    Kist, David; Burns, A Jay; Sanner, Roth; Counters, Jeff; Zelickson, Brian

    2006-02-01

    The radio-frequency (RF) device is a system capable of volumetric heating of the mid to deep dermis and selective heating of the fibrous septa strands and fascia layer. Clinically, these effects promote dermal collagen production and tightening of these deep subcutaneous structures. A new technique of using multiple low energy passes has been described which results in lower patient discomfort and fewer side effects. This technique has also been anecdotally described as giving more reproducible and reliable clinical results of tissue tightening and contouring. This study compares ultrastructural changes in collagen between a single high energy pass and up to five passes of a multiple pass lower energy treatment. Three subjects were consented and treated in the preauricular region with the RF device using single or multiple passes (three or five) in the same 1.5 cm² treatment area with a slight delay between passes to allow tissue cooling. Biopsies from each treatment region and a control biopsy were taken immediately, 24 hours or 6 months post treatment for electron microscopic examination of the 0-1 mm and 1-2 mm levels. Sections of tissue 1 mm × 1 mm × 80 nm were examined with an RCA EMU-4 Transmission Electron Microscope. Twenty sections from 6 blocks from each 1 mm depth were examined by 2 blinded observers. The morphology and degree of collagen change in relation to area examined was compared to the control tissue and estimated using a quantitative scale. Ultrastructural examination of tissue showed an increasing amount of collagen fibril change with increasing passes, at energies of 97 J (three passes) and 122 J (five passes). The changes seen after five multiple passes were similar to those detected after much more painful single pass high-energy treatments. This ultrastructural study shows changes in collagen fibril morphology with an increased effect demonstrated at greater depths of the skin with multiple low-fluence passes and at lesser depths with single pass higher fluence settings. Findings suggest that similar collagen fibril alteration can occur with multiple pass low-energy treatments and single pulse high-energy treatments. The lower fluence multiple pass approach is associated with less patient discomfort, fewer side effects, and more consistent clinical results. Copyright 2005 Wiley-Liss, Inc.

  5. Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-08-01

    To target bacterial pathogens that invade and proliferate inside host cells, it is necessary to design intervention strategies directed against bacterial attachment, cellular invasion and intracellular proliferation. We present an automated microscopy-based, fast, high-throughput method for analyzing size and number of intracellular bacterial colonies in infected tissue culture cells. Cells are seeded in 48-well plates and infected with a GFP-expressing bacterial pathogen. Following gentamicin treatment to remove extracellular pathogens, cells are fixed and cell nuclei stained. This is followed by automated microscopy and subsequent semi-automated spot detection to determine the number of intracellular bacterial colonies, their size distribution, and the average number per host cell. Multiple 48-well plates can be processed sequentially and the procedure can be completed in one working day. As a model we quantified intracellular bacterial colonies formed by uropathogenic Escherichia coli (UPEC) during infection of human kidney cells (HKC-8). Urinary tract infections caused by UPEC are among the most common bacterial infectious diseases in humans. UPEC can colonize tissues of the urinary tract and is responsible for acute, chronic, and recurrent infections. In the bladder, UPEC can form intracellular quiescent reservoirs, thought to be responsible for recurrent infections. In the kidney, UPEC can colonize renal epithelial cells and pass to the blood stream, either via epithelial cell disruption or transcellular passage, to cause sepsis. Intracellular colonies are known to be clonal, originating from single invading UPEC. In our experimental setup, we found UPEC CFT073 intracellular bacterial colonies to be heterogeneous in size and present in nearly one third of the HKC-8 cells. This high-throughput experimental format substantially reduces experimental time and enables fast screening of the intracellular bacterial load and cellular distribution of multiple bacterial isolates. This will be a powerful experimental tool facilitating the study of bacterial invasion, drug resistance, and the development of new therapeutics. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Scheduling with Automatic Resolution of Conflicts

    NASA Technical Reports Server (NTRS)

    Clement, Bradley; Schaffer, Steve

    2006-01-01

    DSN Requirement Scheduler is a computer program that automatically schedules, reschedules, and resolves conflicts for allocations of resources of NASA's Deep Space Network (DSN) on the basis of ever-changing project requirements for DSN services. As used here, "resources" signifies, primarily, DSN antennas, ancillary equipment, and the times during which they are available. Examples of project-required DSN services include arraying, segmentation, very-long-baseline interferometry, and multiple spacecraft per aperture. Requirements can include periodic reservations of specific or optional resources during specific time intervals or within ranges specified in terms of starting times and durations. This program is built on the Automated Scheduling and Planning Environment (ASPEN) software system (aspects of which have been described in previous NASA Tech Briefs articles), with customization to reflect requirements and constraints involved in allocation of DSN resources. Unlike prior DSN-resource-scheduling programs that make single passes through the requirements and require human intervention to resolve conflicts, this program makes repeated passes in a continuing search for all possible allocations, provides a best-effort solution at any time, and presents alternative solutions among which users can choose.

  7. Human performance in a multiple-task environment: effects of automation reliability on visual attention allocation.

    PubMed

    Cullen, Ralph H; Rogers, Wendy A; Fisk, Arthur D

    2013-11-01

    Diagnostic automation has been posited to alleviate the high demands of multiple-task environments; however, mixed effects have been found pertaining to performance aid success. To better understand these effects, attention allocation must be studied directly. We developed a multiple-task environment to study the effects of automation on visual attention. Participants interacted with a system providing varying levels of automation and automation reliability and then were transferred to a system with no support. Attention allocation was measured by tracking the number of times each task was viewed. We found that participants receiving automation allocated their time according to the task frequency and that tasks that benefited most from automation were most harmed when it was removed. The results suggest that the degree to which automation affects multiple-task performance is dependent on the relative attributes of the tasks involved. Moreover, there is an inverse relationship between support and cost when automation fails. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  8. Automated Point Cloud Correspondence Detection for Underwater Mapping Using AUVs

    NASA Technical Reports Server (NTRS)

    Hammond, Marcus; Clark, Ashley; Mahajan, Aditya; Sharma, Sumant; Rock, Stephen

    2015-01-01

    An algorithm for automating correspondence detection between point clouds composed of multibeam sonar data is presented. This allows accurate initialization for point cloud alignment techniques even in cases where accurate inertial navigation is not available, such as iceberg profiling or vehicles with low-grade inertial navigation systems. Techniques from computer vision literature are used to extract, label, and match keypoints between "pseudo-images" generated from these point clouds. Image matches are refined using RANSAC and information about the vehicle trajectory. The resulting correspondences can be used to initialize an iterative closest point (ICP) registration algorithm to estimate accumulated navigation error and aid in the creation of accurate, self-consistent maps. The results presented use multibeam sonar data obtained from multiple overlapping passes of an underwater canyon in Monterey Bay, California. Using strict matching criteria, the method detects 23 between-swath correspondence events in a set of 155 pseudo-images with zero false positives. Using less conservative matching criteria doubles the number of matches but introduces several false positive matches as well. Heuristics based on known vehicle trajectory information are used to eliminate these.
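
    A hedged sketch of the matching step, using ORB keypoints and a RANSAC-fitted homography as stand-ins for whatever detector and geometric model the authors used (requires OpenCV and NumPy; the inlier threshold is an assumption):

        # Detect/describe keypoints in two grayscale "pseudo-images" and keep
        # only matches consistent with a single geometric transform (RANSAC).
        import cv2
        import numpy as np

        def consistent_matches(img_a, img_b, min_inliers=10):
            orb = cv2.ORB_create()
            kp_a, des_a = orb.detectAndCompute(img_a, None)
            kp_b, des_b = orb.detectAndCompute(img_b, None)
            if des_a is None or des_b is None:
                return 0
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = matcher.match(des_a, des_b)
            if len(matches) < 4:
                return 0
            src = np.float32([kp_a[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([kp_b[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            # RANSAC rejects geometrically inconsistent candidate matches
            _, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
            inliers = int(mask.sum()) if mask is not None else 0
            return inliers if inliers >= min_inliers else 0

        # demo on a synthetic noise image matched against itself
        rng = np.random.default_rng(1)
        a = (rng.random((200, 200)) * 255).astype(np.uint8)
        print(consistent_matches(a, a))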

  9. Statistically Comparing the Performance of Multiple Automated Raters across Multiple Items

    ERIC Educational Resources Information Center

    Kieftenbeld, Vincent; Boyer, Michelle

    2017-01-01

    Automated scoring systems are typically evaluated by comparing the performance of a single automated rater item-by-item to human raters. This presents a challenge when the performance of multiple raters needs to be compared across multiple items. Rankings could depend on specifics of the ranking procedure; observed differences could be due to…

  10. The impact of injector-based contrast agent administration in time-resolved MRA.

    PubMed

    Budjan, Johannes; Attenberger, Ulrike I; Schoenberg, Stefan O; Pietsch, Hubertus; Jost, Gregor

    2018-05-01

    Time-resolved contrast-enhanced MR angiography (4D-MRA), which allows the simultaneous visualization of the vasculature and blood-flow dynamics, is widely used in clinical routine. In this study, the impact of two different contrast agent injection methods on 4D-MRA was examined in a controlled, standardized setting in an animal model. Six anesthetized Goettingen minipigs underwent two identical 4D-MRA examinations at 1.5 T in a single session. The contrast agent (0.1 mmol/kg body weight gadobutrol, followed by 20 ml saline) was injected using either manual injection or an automated injection system. A quantitative comparison of vascular signal enhancement and quantitative renal perfusion analyses were performed. Analysis of signal enhancement revealed higher peak enhancements and shorter time to peak intervals for the automated injection. Significantly different bolus shapes were found: automated injection resulted in a compact first-pass bolus shape clearly separated from the recirculation while manual injection resulted in a disrupted first-pass bolus with two peaks. In the quantitative perfusion analyses, statistically significant differences in plasma flow values were found between the injection methods. The results of both qualitative and quantitative 4D-MRA depend on the contrast agent injection method, with automated injection providing more defined bolus shapes and more standardized examination protocols. • Automated and manual contrast agent injection result in different bolus shapes in 4D-MRA. • Manual injection results in an undefined and interrupted bolus with two peaks. • Automated injection provides more defined bolus shapes. • Automated injection can lead to more standardized examination protocols.

  11. Automated four color CD4/CD8 analysis of leukocytes by scanning fluorescence microscopy using Quantum dots

    NASA Astrophysics Data System (ADS)

    Bocsi, Jozsef; Mittag, Anja; Varga, Viktor S.; Molnar, Bela; Tulassay, Zsolt; Sack, Ulrich; Lenz, Dominik; Tarnok, Attila

    2006-02-01

    Scanning Fluorescence Microscope (SFM) is a new technique for automated motorized microscopes to measure multiple fluorochrome labeled cells (Bocsi et al. Cytometry 2004, 61A:1). The ratio of CD4+/CD8+ cells is an important in immune diagnostics in immunodeficiency and HIV. Therefor a four-color staining protocol (DNA, CD3, CD4 and CD8) for automated SFM analysis of lymphocytes was developed. EDTA uncoagulated blood was stained with organic and inorganic (Quantum dots) fluorochromes in different combinations. Aliquots of samples were measured by Flow Cytometry (FCM) and SFM. By SFM specimens were scanned and digitized using four fluorescence filter sets. Automated cell detection (based on Hoechst 33342 fluorescence), CD3, CD4 and CD8 detection were performed, CD4/CD8 ratio was calculated. Fluorescence signals were well separable on SFM and FCM. Passing and Bablok regression of all CD4/CD8 ratios obtained by FCM and SFM (F(X)=0.0577+0.9378x) are in the 95% confidence interval. Cusum test did not show significant deviation from linearity (P>0.10). This comparison indicates that there is no systemic bias between the two different methods. In SFM analyses the inorganic Quantum dot staining was very stable in PBS in contrast to the organic fluorescent dyes, but bleached shortly after mounting with antioxidant and free radical scavenger mounting media. This shows the difficulty of combinations of organic dyes and Quantum dots. Slide based multi-fluorescence labeling system and automated SFM are applicable tools for the CD4/CD8 ratio determination in peripheral blood samples. Quantum Dots are stable inorganic fluorescence labels that may be used as reliable high resolution dyes for cell labeling.

  12. Automation Framework for Flight Dynamics Products Generation

    NASA Technical Reports Server (NTRS)

    Wiegand, Robert E.; Esposito, Timothy C.; Watson, John S.; Jun, Linda; Shoan, Wendy; Matusow, Carla

    2010-01-01

    XFDS provides an easily adaptable automation platform. To date it has been used to support flight dynamics operations. It coordinates the execution of other applications such as Satellite TookKit, FreeFlyer, MATLAB, and Perl code. It provides a mechanism for passing messages among a collection of XFDS processes, and allows sending and receiving of GMSEC messages. A unified and consistent graphical user interface (GUI) is used for the various tools. Its automation configuration is stored in text files, and can be edited either directly or using the GUI.

  13. Multiple pass gas absorption cell utilizing a spherical mirror opposite one or more pair of obliquely disposed flat mirrors

    NASA Technical Reports Server (NTRS)

    Pearson, Richard (Inventor); Lynch, Dana H. (Inventor); Gunter, William D. (Inventor)

    1995-01-01

    A method and apparatus for passing light bundles through a multiple pass sampling cell is disclosed. The multiple pass sampling cell includes a sampling chamber having first and second ends positioned along a longitudinal axis of the sampling cell. The sampling cell further includes an entrance opening, located adjacent the first end of the sampling cell at a first azimuthal angular position. The entrance opening permits a light bundle to pass into the sampling cell. The sampling cell also includes an exit opening at a second azimuthal angular position. The light exit permits a light bundle to pass out of the sampling cell after the light bundle has followed a predetermined path.

  14. Small cities face greater impact from automation.

    PubMed

    Frank, Morgan R; Sun, Lijun; Cebrian, Manuel; Youn, Hyejin; Rahwan, Iyad

    2018-02-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. © 2018 The Authors.

  15. Small cities face greater impact from automation

    PubMed Central

    Sun, Lijun; Cebrian, Manuel; Rahwan, Iyad

    2018-01-01

    The city has proved to be the most successful form of human agglomeration and provides wide employment opportunities for its dwellers. As advances in robotics and artificial intelligence revive concerns about the impact of automation on jobs, a question looms: how will automation affect employment in cities? Here, we provide a comparative picture of the impact of automation across US urban areas. Small cities will undertake greater adjustments, such as worker displacement and job content substitutions. We demonstrate that large cities exhibit increased occupational and skill specialization due to increased abundance of managerial and technical professions. These occupations are not easily automatable, and, thus, reduce the potential impact of automation in large cities. Our results pass several robustness checks including potential errors in the estimation of occupational automation and subsampling of occupations. Our study provides the first empirical law connecting two societal forces: urban agglomeration and automation's impact on employment. PMID:29436514

  16. A Short Note on the Relationship between Pass Rate and Multiple Attempts

    ERIC Educational Resources Information Center

    Cheng, Ying; Liu, Cheng

    2016-01-01

    For a certification, licensure, or placement exam, allowing examinees to take multiple attempts at the test could effectively change the pass rate. Change in the pass rate can occur without any change in the underlying latent trait, and can be an artifact of multiple attempts and imperfect reliability of the test. By deriving formulae to compute…
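
    One such formula, under the simplifying assumption of independent attempts with per-attempt pass probability p:

        P(\text{pass within } k \text{ attempts}) = 1 - (1 - p)^k

    For example, an examinee below the cut score with p = 0.10 per attempt passes within three attempts with probability 1 - 0.9^3 ≈ 0.27, so the observed pass rate rises with the number of attempts even though the latent trait is unchanged.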

  17. Multiple-pass high-pressure homogenization of milk for the development of pasteurization-like processing conditions.

    PubMed

    Ruiz-Espinosa, H; Amador-Espejo, G G; Barcenas-Pozos, M E; Angulo-Guerrero, J O; Garcia, H S; Welti-Chanes, J

    2013-02-01

    Multiple-pass ultrahigh pressure homogenization (UHPH) was used for reducing the microbial population of both indigenous spoilage microflora in whole raw milk and a baroresistant pathogen (Staphylococcus aureus) inoculated in whole sterile milk, to define pasteurization-like processing conditions. Response surface methodology was followed and multiple-response optimization of UHPH operating pressure (OP) (100, 175, 250 MPa) and number of passes (N) (1-5) was conducted through overlaid contour plot analysis. Increasing OP and N had a significant effect (P < 0.05) on microbial reduction of both spoilage microflora and Staph. aureus in milk. Optimized UHPH processes (five 202-MPa passes; four 232-MPa passes) defined a region where a 5-log10 reduction of the total bacterial count of milk and a baroresistant pathogen is attainable, as a requisite parameter for establishing an alternative method of pasteurization. Multiple-pass UHPH optimized conditions might help in producing safe milk without the detrimental effects associated with thermal pasteurization. © 2012 The Society for Applied Microbiology.
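
    A sketch of the second-order response-surface fit such a two-factor design implies, in Python; the (OP, N) settings and log-reduction responses below are invented placeholders, not the study's data:

        # Fit a standard second-order RSM model in two factors (OP, N) by
        # least squares: y ~ 1 + OP + N + OP^2 + N^2 + OP*N.
        import numpy as np

        def fit_quadratic_surface(op, n, y):
            op = np.asarray(op, float)
            n = np.asarray(n, float)
            y = np.asarray(y, float)
            X = np.column_stack([np.ones_like(op), op, n, op**2, n**2, op * n])
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coef

        # hypothetical log-reduction responses at a few (OP, N) settings
        op = [100, 100, 175, 175, 250, 250]
        n = [1, 5, 1, 5, 1, 5]
        y = [1.2, 2.8, 2.0, 4.6, 3.1, 5.9]
        print(fit_quadratic_surface(op, n, y))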

  18. Paperless Procurement: The Impact of Advanced Automation

    DTIC Science & Technology

    1992-09-01

    System. POPS = Paperless Order Processing System; RADMIS = Research and Development Management Information System; SAACONS = Standard Army Automated... order processing system, which then updates the contractor’s production (or delivery) scheduling and contract accounting applications. In return, the... used by the DLA’s POPS... into an EDI delivery order and pass it directly to the distributor’s or manufacturer’s order processing system. That

  19. Automated X-ray quality control of catalytic converters

    NASA Astrophysics Data System (ADS)

    Shashishekhar, N.; Veselitza, D.

    2017-02-01

    Catalytic converters are devices attached to the exhaust system of automobile or other engines to eliminate or substantially reduce polluting emissions. They consist of coated substrates enclosed in a stainless steel housing. The substrate is typically made of ceramic honeycombs; however stainless steel foil honeycombs are also used. The coating is usually a slurry of alumina, silica, rare earth oxides and platinum group metals. The slurry also known as the wash coat is applied to the substrate in two doses, one on each end of the substrate; in some cases multiple layers of coating are applied. X-ray imaging is used to inspect the applied coating depth on a substrate to confirm compliance with quality requirements. Automated image analysis techniques are employed to measure the coating depth from the X-ray image. Coating depth is assessed by analysis of attenuation line profiles in the image. Edge detection algorithms with noise reduction and outlier rejection are used to calculate the coating depth at a specified point along an attenuation line profile. Quality control of the product is accomplished using several attenuation line profile regions for coating depth measurements, with individual pass or fail criteria specified for each region.
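
    A toy version of the profile-based measurement: smooth an attenuation line profile, then place the coating edge at the steepest gradient. The synthetic profile, window size, and noise level are illustrative assumptions, not the production algorithm:

        # Locate a coating edge on a 1-D attenuation profile: moving-average
        # smoothing for noise reduction, then the sharpest-gradient position.
        import numpy as np

        def coating_edge(profile, window=5):
            kernel = np.ones(window) / window
            smooth = np.convolve(profile, kernel, mode="same")  # noise reduction
            grad = np.gradient(smooth)
            return int(np.argmax(np.abs(grad)))  # index of sharpest transition

        # synthetic profile: the coated region attenuates more (lower signal)
        profile = np.concatenate([np.full(40, 0.9), np.full(60, 0.4)])
        profile += np.random.default_rng(0).normal(0, 0.02, profile.size)
        print(coating_edge(profile))  # expected near sample index 40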

  20. An automated hand hygiene training system improves hand hygiene technique but not compliance.

    PubMed

    Kwok, Yen Lee Angela; Callard, Michelle; McLaws, Mary-Louise

    2015-08-01

    The hand hygiene technique that the World Health Organization recommends for cleansing hands with soap and water or alcohol-based handrub consists of 7 poses. We used an automated training system to improve clinicians' hand hygiene technique and test whether this affected hospitalwide hand hygiene compliance. Seven hundred eighty-nine medical and nursing staff volunteered to participate in a self-directed training session using the automated training system. The proportion of successful first attempts was reported for each of the 7 poses. Hand hygiene compliance was collected according to the national requirement and rates for 2011-2014 were used to determine the effect of the training system on compliance. The highest pass rate was for pose 1 (palm to palm) at 77% (606 out of 789), whereas pose 6 (clean thumbs) had the lowest pass rate at 27% (216 out of 789). One hundred volunteers provided feedback to 8 items related to satisfaction with the automated training system and most (86%) expressed a high degree of satisfaction and all reported that this method was time-efficient. There was no significant change in compliance rates after the introduction of the automated training system. Observed compliance during the posttraining period declined but increased to 82% in response to other strategies. Technology for training clinicians in the 7 poses played an important education role but did not affect compliance rates. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  1. Automated electric valve for electrokinetic separation in a networked microfluidic chip.

    PubMed

    Cui, Huanchun; Huang, Zheng; Dutta, Prashanta; Ivory, Cornelius F

    2007-02-15

    This paper describes an automated electric valve system designed to reduce dispersion and sample loss into a side channel when an electrokinetically mobilized concentration zone passes a T-junction in a networked microfluidic chip. One way to reduce dispersion is to control current streamlines since charged species are driven along them in the absence of electroosmotic flow. Computer simulations demonstrate that dispersion and sample loss can be reduced by applying a constant additional electric field in the side channel to straighten current streamlines in linear electrokinetic flow (zone electrophoresis). This additional electric field was provided by a pair of platinum microelectrodes integrated into the chip in the vicinity of the T-junction. Both simulations and experiments of this electric valve with constant valve voltages were shown to provide unsatisfactory valve performance during nonlinear electrophoresis (isotachophoresis). On the basis of these results, however, an automated electric valve system was developed with improved valve performance. Experiments conducted with this system showed decreased dispersion and increased reproducibility as protein zones isotachophoretically passed the T-junction. Simulations of the automated electric valve offer further support that the desired shape of current streamlines was maintained at the T-junction during isotachophoresis. Valve performance was evaluated at different valve currents based on statistical variance due to dispersion. With the automated control system, two integrated microelectrodes provide an effective way to manipulate current streamlines, thus acting as an electric valve for charged species in electrokinetic separations.

  2. Advanced I&C for Fault-Tolerant Supervisory Control of Small Modular Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Daniel G.

    In this research, we have developed a supervisory control approach to enable automated control of SMRs. By design, the supervisory control system has a hierarchical, interconnected, adaptive control architecture. A considerable advantage of this architecture is that it allows subsystems to communicate at different/finer granularity, facilitates monitoring of processes at the modular and plant levels, and enables supervisory control. We have investigated the deployment of automation, monitoring, and data collection technologies to enable operation of multiple SMRs. Each unit's controller collects and transfers information from local loops and optimizes that unit's parameters. Information is passed from each SMR unit controller to the supervisory controller, which supervises the actions of SMR units and manages plant processes. The information processed at the supervisory level provides operators the information needed for reactor, unit, and plant operation. In conjunction with the supervisory effort, we have investigated techniques for fault-tolerant networks, over which information is transmitted between local loops and the supervisory controller to maintain a safe level of operational normalcy in the presence of anomalies. The fault tolerance of the supervisory control architecture, the network that supports it, and the impact of fault tolerance on multi-unit SMR plant control has been a second focus of this research. To this end, we have investigated the deployment of advanced automation, monitoring, and data collection and communications technologies to enable operation of multiple SMRs. We have created a fault-tolerant multi-unit SMR supervisory controller that collects and transfers information from local loops, supervises their actions, and adaptively optimizes the controller parameters. The goal of this research has been to develop the methodologies and procedures for fault-tolerant supervisory control of small modular reactors. To achieve this goal, we have identified the following objectives, which form an ordered approach to the research: I) development of a supervisory digital I&C system; II) fault tolerance of the supervisory control architecture; III) automated decision making and online monitoring.
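
    A minimal sketch of the hierarchical pattern described above, with unit controllers reporting upward and a supervisor issuing setpoints; all class and field names are invented for illustration:

        # Two-level supervisory pattern: the supervisor collects unit reports,
        # then dispatches setpoints back down (here, an even demand split).
        class UnitController:
            def __init__(self, name, power=50.0):
                self.name, self.power = name, power

            def report(self):
                return {"unit": self.name, "power": self.power}

            def apply_setpoint(self, power):
                self.power = power

        class Supervisor:
            def __init__(self, units):
                self.units = units

            def step(self, plant_demand):
                reports = [u.report() for u in self.units]  # upward flow
                share = plant_demand / len(self.units)
                for u in self.units:                        # downward flow
                    u.apply_setpoint(share)
                return reports

        units = [UnitController(f"SMR-{i}") for i in range(4)]
        print(Supervisor(units).step(plant_demand=360.0))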

  3. Automated DNA mutation detection using universal conditions direct sequencing: application to ten muscular dystrophy genes

    PubMed Central

    2009-01-01

    Background: One of the most common and efficient methods for detecting mutations in genes is PCR amplification followed by direct sequencing. Until recently, the process of designing PCR assays has been to focus on individual assay parameters rather than concentrating on matching conditions for a set of assays. Primers for each individual assay were selected based on location and sequence concerns. The two primer sequences were then iteratively adjusted to make the individual assays work properly. This generally resulted in groups of assays with different annealing temperatures that required the use of multiple thermal cyclers or multiple passes in a single thermal cycler making diagnostic testing time-consuming, laborious and expensive. These factors have severely hampered diagnostic testing services, leaving many families without an answer for the exact cause of a familial genetic disease. A search of GeneTests for sequencing analysis of the entire coding sequence for genes that are known to cause muscular dystrophies returns only a small list of laboratories that perform comprehensive gene panels. The hypothesis for the study was that a complete set of universal assays can be designed to amplify and sequence any gene or family of genes using computer aided design tools. If true, this would allow automation and optimization of the mutation detection process resulting in reduced cost and increased throughput. Results: An automated process has been developed for the detection of deletions, duplications/insertions and point mutations in any gene or family of genes and has been applied to ten genes known to bear mutations that cause muscular dystrophy: DMD; CAV3; CAPN3; FKRP; TRIM32; LMNA; SGCA; SGCB; SGCG; SGCD. Using this process, mutations have been found in five DMD patients and four LGMD patients (one in the FKRP gene, one in the CAV3 gene, and two likely causative heterozygous pairs of variations in the CAPN3 gene of two other patients). Methods and assay sequences are reported in this paper. Conclusion: This automated process allows laboratories to discover DNA variations in a short time and at low cost. PMID:19835634

  4. Automated DNA mutation detection using universal conditions direct sequencing: application to ten muscular dystrophy genes.

    PubMed

    Bennett, Richard R; Schneider, Hal E; Estrella, Elicia; Burgess, Stephanie; Cheng, Andrew S; Barrett, Caitlin; Lip, Va; Lai, Poh San; Shen, Yiping; Wu, Bai-Lin; Darras, Basil T; Beggs, Alan H; Kunkel, Louis M

    2009-10-18

    One of the most common and efficient methods for detecting mutations in genes is PCR amplification followed by direct sequencing. Until recently, the process of designing PCR assays has been to focus on individual assay parameters rather than concentrating on matching conditions for a set of assays. Primers for each individual assay were selected based on location and sequence concerns. The two primer sequences were then iteratively adjusted to make the individual assays work properly. This generally resulted in groups of assays with different annealing temperatures that required the use of multiple thermal cyclers or multiple passes in a single thermal cycler making diagnostic testing time-consuming, laborious and expensive. These factors have severely hampered diagnostic testing services, leaving many families without an answer for the exact cause of a familial genetic disease. A search of GeneTests for sequencing analysis of the entire coding sequence for genes that are known to cause muscular dystrophies returns only a small list of laboratories that perform comprehensive gene panels. The hypothesis for the study was that a complete set of universal assays can be designed to amplify and sequence any gene or family of genes using computer aided design tools. If true, this would allow automation and optimization of the mutation detection process resulting in reduced cost and increased throughput. An automated process has been developed for the detection of deletions, duplications/insertions and point mutations in any gene or family of genes and has been applied to ten genes known to bear mutations that cause muscular dystrophy: DMD; CAV3; CAPN3; FKRP; TRIM32; LMNA; SGCA; SGCB; SGCG; SGCD. Using this process, mutations have been found in five DMD patients and four LGMD patients (one in the FKRP gene, one in the CAV3 gene, and two likely causative heterozygous pairs of variations in the CAPN3 gene of two other patients). Methods and assay sequences are reported in this paper. This automated process allows laboratories to discover DNA variations in a short time and at low cost.

  5. Automated processing of first-pass radioisotope ventriculography data to determine essential central circulation parameters

    NASA Astrophysics Data System (ADS)

    Krotov, Aleksei; Pankin, Victor

    2017-09-01

    The assessment of central circulation (including heart function) parameters is vital in the preventive diagnostics of inherent and acquired heart failures and during polychemotherapy. The protocols currently applied in Russia do not fully utilize first-pass assessment (FPRNA), which results in poor data formalization, even though FPRNA is one of the fastest, most affordable and most compact of the radioisotope diagnostics protocols. A non-imaging algorithm based on existing protocols has been designed that uses the readings of an additional detector above the vena subclavia to determine the total blood volume (TBV), without the blood sampling required by current protocols. An automated processing of precordial detector readings is presented to determine the heart stroke volume (SV). Two techniques to estimate the ejection fraction (EF) of the heart are discussed.
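
    For context, the standard count-based ejection-fraction estimate used in first-pass studies reduces to a few lines; the paper's actual processing chain is more involved, and the count values below are invented.

        # Minimal sketch: EF from background-corrected end-diastolic (ED) and
        # end-systolic (ES) counts, the usual count-based first-pass formula.
        def ejection_fraction(ed_counts: float, es_counts: float, background: float) -> float:
            ed = ed_counts - background
            es = es_counts - background
            return (ed - es) / ed

        # Invented counts for illustration only.
        print(f"EF = {ejection_fraction(ed_counts=12500, es_counts=6400, background=1500):.2f}")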

  6. Multiple pass laser amplifier system

    DOEpatents

    Brueckner, Keith A.; Jorna, Siebe; Moncur, N. Kent

    1977-01-01

    A laser amplification method for increasing the energy extraction efficiency of laser amplifiers while reducing the energy flux that passes through a flux-limited system. The apparatus decomposes a linearly polarized light beam into multiple components, passes the components through an amplifier in delayed time sequence, and recombines the amplified components into an in-phase linearly polarized beam.

  7. Multiple pass and multiple layer friction stir welding and material enhancement processes

    DOEpatents

    Feng, Zhili [Knoxville, TN; David, Stan A [Knoxville, TN; Frederick, David Alan [Harriman, TN

    2010-07-27

    Processes for friction stir welding, typically for comparatively thick plate materials, using multiple passes and multiple layers of a friction stir welding tool. In some embodiments a first portion of a fabrication preform and a second portion of the fabrication preform are placed adjacent to each other to form a joint, and there may be a groove adjacent the joint. The joint is welded and then, where a groove exists, a filler may be disposed in the groove, and the seams between the filler and the first and second portions of the fabrication preform may be friction stir welded. In some embodiments two portions of a fabrication preform are abutted to form a joint, where the joint may, for example, be a lap joint, a bevel joint or a butt joint. In some embodiments a plurality of passes of a friction stir welding tool may be used, with some passes welding from one side of a fabrication preform and other passes welding from the other side of the fabrication preform.

  8. Automated saliva processing for LC-MS/MS: Improving laboratory efficiency in cortisol and cortisone testing.

    PubMed

    Antonelli, Giorgia; Padoan, Andrea; Artusi, Carlo; Marinova, Mariela; Zaninotto, Martina; Plebani, Mario

    2016-04-01

    The aim of this study was to implement in our routine practice an automated saliva preparation protocol for quantification of cortisol (F) and cortisone (E) by LC-MS/MS using a liquid handling platform, maintaining the reference intervals previously defined with the manual preparation. Addition of internal standard solution to saliva samples and calibrators and SPE on a μ-elution 96-well plate were performed by the liquid handling platform. After extraction, the eluates were submitted to LC-MS/MS analysis. The manual steps within the entire process were transferring the saliva samples into suitable tubes, placing the cap mat, and transferring the collection plate to the LC autosampler. Transfer of the reference intervals from the manual to the automated procedure was established by Passing-Bablok regression on 120 saliva samples analyzed simultaneously with the two procedures. Calibration curves were linear throughout the selected ranges. The imprecision ranged from 2 to 10%, with recoveries from 95 to 116%. Passing-Bablok regression demonstrated no significant bias. The liquid handling platform translates the manual steps into automated operations, saving hands-on time while maintaining assay reproducibility and ensuring reliability of results, making it implementable in our routine with the previously established reference intervals. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
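
    For readers unfamiliar with the comparison method, a simplified Passing-Bablok fit is sketched below (shifted median of pairwise slopes, median-residual intercept); it omits the confidence bands actually used to judge bias, and the data values are invented.

        import numpy as np

        def passing_bablok(x, y):
            """Simplified Passing-Bablok fit: slope from the shifted median of all
            pairwise slopes (slopes equal to -1 excluded; the shift K counts slopes
            below -1), intercept as the median residual. Assumes the usual case of
            positively correlated methods, where K stays small."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            slopes = []
            n = len(x)
            for i in range(n):
                for j in range(i + 1, n):
                    dx, dy = x[j] - x[i], y[j] - y[i]
                    if dx != 0 and dy / dx != -1:
                        slopes.append(dy / dx)
            s = np.sort(slopes)
            m, k = len(s), int(np.sum(s < -1))
            if m % 2:
                b = s[(m - 1) // 2 + k]
            else:
                b = 0.5 * (s[m // 2 - 1 + k] + s[m // 2 + k])
            return b, np.median(y - b * x)

        # Invented paired measurements (manual vs. automated preparation).
        manual = [2.1, 4.0, 5.9, 8.2, 10.1, 12.0]
        automated = [2.0, 4.1, 6.1, 8.0, 10.3, 12.2]
        slope, intercept = passing_bablok(manual, automated)
        print(f"automated = {slope:.3f} * manual + {intercept:+.3f}")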

  9. Disposable and removable nucleic acid extraction and purification cartridges for automated flow-through systems

    DOEpatents

    Regan, John Frederick

    2014-09-09

    Removable cartridges are used on automated flow-through systems for the purpose of extracting and purifying genetic material from complex matrices. Different types of cartridges are paired with specific automated protocols to concentrate, extract, and purify pathogenic or human genetic material. Their flow-through nature allows large quantities of sample to be processed. Matrices may be filtered using size exclusion and/or affinity filters to concentrate the pathogen of interest. Lysed material is ultimately passed through a filter to remove the insoluble material before the soluble genetic material is delivered past a silica-like membrane that binds the genetic material, where it is washed, dried, and eluted. Cartridges are inserted into the housing areas of flow-through automated instruments, which are equipped with sensors to ensure proper placement and usage of the cartridges. Properly inserted cartridges create fluid- and air-tight seals with the flow lines of an automated instrument.

  10. Arcnet(R) On-Fiber -- A Viable Factory Automation Alternative

    NASA Astrophysics Data System (ADS)

    Karlin, Geof; Tucker, Carol S.

    1987-01-01

    Manufacturers need to improve their operating methods and increase their productivity so they can compete successfully in the marketplace. This goal can be achieved through factory automation, and the key to this automation is successful data base management and factory integration. However, large scale factory automation and integration requires effective communications, and this has given rise to an interest in various Local Area Networks or LANs. In a completely integrated and automated factory, the entire organization must have access to the data base, and all departments and functions must be able to communicate with each other. Traditionally, these departments and functions use incompatible equipment, and the ability to make such equipment communicate presents numerous problems. ARCNET, a token-passing LAN which has a significant presence in the office environment today, coupled with fiber optic cable, the cable of the future, provides an effective, low-cost solution to a number of these problems.

  11. LIDT test coupled with gamma radiation degraded optics

    NASA Astrophysics Data System (ADS)

    IOAN, M.-R.

    2016-06-01

    A laser can operate not only in regular environments but also in nuclear ionizing radiation environments. This paper presents the results of a real-time measuring method used to detect the laser-induced damage threshold (LIDT) in the optical surfaces/volumes of TEMPAX borosilicate glasses operating in high gamma-ray fields. The laser damage quantification technique is applied using an automated station intended to measure the damage threshold of optical components, according to the International Standard ISO 21254. Single- and multiple-pulse laser damage thresholds were determined. For an optical material, the lifetime under multiple pulses of high-power laser radiation can be predicted. A laser emitting pulses of a few nanoseconds, operating in regular conditions, damages a target mainly through its intense electric-field component and, to a lesser extent, through local absorption of the thermal energy it transports. When the beam passes through optical glass elements affected by ionizing radiation fields, the thermal component plays a more important role because of the increased thermal absorption in the material's volume caused by radiation-induced color centers. LIDT results on TEMPAX optical glass windows, with the contribution due to gamma radiation effects (ionization mainly by the Compton effect in this case), are presented. This contribution was highlighted and quantified. Energetic, temporal and spatial beam characterizations (according to ISO 11554 standards) and LIDT tests were performed using a high-power Nd:YAG laser (1064 nm), before passing the beam through each irradiated glass sample (0 kGy, 1.3 kGy and 21.2 kGy).

  12. Highly automated on-orbit operations of the NuSTAR telescope

    NASA Astrophysics Data System (ADS)

    Roberts, Bryce; Bester, Manfred; Dumlao, Renee; Eckert, Marty; Johnson, Sam; Lewis, Mark; McDonald, John; Pease, Deron; Picard, Greg; Thorsness, Jeremy

    2014-08-01

    UC Berkeley's Space Sciences Laboratory (SSL) currently operates a fleet of seven NASA satellites, which conduct research in the fields of space physics and astronomy. The newest addition to this fleet is a high-energy X-ray telescope called the Nuclear Spectroscopic Telescope Array (NuSTAR). Since 2012, SSL has conducted on-orbit operations for NuSTAR on behalf of the lead institution, principal investigator, and Science Operations Center at the California Institute of Technology. NuSTAR operations benefit from a truly multi-mission ground system architecture design focused on automation and autonomy that has been honed by over a decade of continual improvement and ground network expansion. This architecture has made flight operations possible with nominal 40 hours per week staffing, while not compromising mission safety. The remote NuSTAR Science Operation Center (SOC) and Mission Operations Center (MOC) are joined by a two-way electronic interface that allows the SOC to submit automatically validated telescope pointing requests, and also to receive raw data products that are automatically produced after downlink. Command loads are built and uploaded weekly, and a web-based timeline allows both the SOC and MOC to monitor the state of currently scheduled spacecraft activities. Network routing and the command and control system are fully automated by MOC's central scheduling system. A closed-loop data accounting system automatically detects and retransmits data gaps. All passes are monitored by two independent paging systems, which alert staff of pass support problems or anomalous telemetry. NuSTAR mission operations now require less than one attended pass support per workday.

  13. Automated GMA welding of austenitic stainless steel pipe

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tahash, G.J.

    1996-12-31

    The study focused on reducing weld cycle times of rotatable subassemblies (spools) using automated welding equipment. A unique automatic Gas Metal Arc Welding (GMAW) system was used to produce a series of pipe-to-pipe welds on 141 mm (5 in.) schedule 80 seamless stainless steel pipe. After manual tack welding, the adaptive control system welded the root pass of the argon gas backed open-vee-groove circumferential butt joints in the 1G rotated position with short-circuiting transfer GMAW. The fill and cover passes were welded automatically with spray transfer GMAW. Automatic welding cycle times were found to be 50--80 percent shorter than the current techniques of roll welding with Shielded Metal Arc Welding and manual Gas Tungsten Arc Welding. Weld costs (£/m), including amortization, for the various systems were compared. The cost of automated GMA welds was virtually equivalent to the most competitive methods while depositing 75% more filler metal per year. Also investigated were metallurgical effects generated by weld thermal cycling, and the associated effects on mechanical properties of the weld joint. Mechanical properties of the welds met or exceeded those of the base metal. Sensitization of the pipe did not occur in the heat affected zone (HAZ), based on the absence of evidence of intergranular attack in modified Strauss corrosion tests and despite interpass temperatures well above recommended maximums. Cooling rates of 3--5 C/s in the heat affected zone of the four-pass welds were measured by thermocouple technique and found to be within the non-sensitizing range for this alloy.

  14. Multiple Robots Localization Via Data Sharing

    DTIC Science & Technology

    2015-09-01

    …multiple humans, each with specialized skills complementing each other, work to create the solution. Hence, there is a motivation to think in terms of… The automate.py file is a helper file to assist in running multiple simulations.

  15. Toward Abstracting the Communication Intent in Applications to Improve Portability and Productivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mintz, Tiffany M; Hernandez, Oscar R; Kartsaklis, Christos

    Programming with communication libraries such as the Message Passing Interface (MPI) obscures the high-level intent of the communication in an application and makes static communication analysis difficult to do. Compilers are unaware of communication library specifics, leading to the exclusion of communication patterns from any automated analysis and optimizations. To overcome this, communication patterns can be expressed at higher levels of abstraction and incrementally added to existing MPI applications. In this paper, we propose the use of directives to clearly express the communication intent of an application in a way that is not specific to a given communication library. Our communication directives allow programmers to express communication among processes in a portable way, giving hints to the compiler on regions of computations that can be overlapped with communication and relaxing communication constraints on the ordering, completion and synchronization of the communication imposed by specific libraries such as MPI. The directives can then be translated by the compiler into message passing calls that efficiently implement the intended pattern and be targeted to multiple communication libraries. Thus far, we have used the directives to express point-to-point communication patterns in C, C++ and Fortran applications, and have translated them to MPI and SHMEM.
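
    A rough illustration of the idea: the directive below is invented, and the lowering shown is the ordinary nonblocking point-to-point overlap a compiler could emit; it is written with mpi4py here for brevity rather than the C/C++/Fortran targets the paper uses.

        # What a compiler might emit for a hypothetical halo-exchange directive like
        #   #pragma comm send(boundary) to(right) overlap(interior_update)
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        left, right = (rank - 1) % size, (rank + 1) % size

        field = np.full(1024, float(rank))
        halo = np.empty(1, dtype=field.dtype)

        # Post the communication early...
        recv = comm.Irecv(halo, source=left)
        send = comm.Isend(field[-1:], dest=right)

        # ...overlap it with computation that does not touch the boundary...
        field[1:-1] = 0.5 * (field[:-2] + field[2:])

        # ...and complete the exchange only where the result is needed.
        MPI.Request.Waitall([recv, send])
        field[0] = 0.5 * (halo[0] + field[1])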

  16. OrthoMCL: Identification of Ortholog Groups for Eukaryotic Genomes

    PubMed Central

    Li, Li; Stoeckert, Christian J.; Roos, David S.

    2003-01-01

    The identification of orthologous groups is useful for genome annotation, studies on gene/protein evolution, comparative genomics, and the identification of taxonomically restricted sequences. Methods successfully exploited for prokaryotic genome analysis have proved difficult to apply to eukaryotes, however, as larger genomes may contain multiple paralogous genes, and sequence information is often incomplete. OrthoMCL provides a scalable method for constructing orthologous groups across multiple eukaryotic taxa, using a Markov Cluster algorithm to group (putative) orthologs and paralogs. This method performs similarly to the INPARANOID algorithm when applied to two genomes, but can be extended to cluster orthologs from multiple species. OrthoMCL clusters are coherent with groups identified by EGO, but improved recognition of “recent” paralogs permits overlapping EGO groups representing the same gene to be merged. Comparison with previously assigned EC annotations suggests a high degree of reliability, implying utility for automated eukaryotic genome annotation. OrthoMCL has been applied to the proteome data set from seven publicly available genomes (human, fly, worm, yeast, Arabidopsis, the malaria parasite Plasmodium falciparum, and Escherichia coli). A Web interface allows queries based on individual genes or user-defined phylogenetic patterns (http://www.cbil.upenn.edu/gene-family). Analysis of clusters incorporating P. falciparum genes identifies numerous enzymes that were incompletely annotated in first-pass annotation of the parasite genome. PMID:12952885
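
    For orientation, the core Markov Cluster iteration that OrthoMCL builds on fits in a few lines; the similarity matrix below is a toy example, and the production algorithm adds sparse-matrix handling and pruning.

        import numpy as np

        def mcl(similarity: np.ndarray, inflation: float = 2.0, iters: int = 50) -> np.ndarray:
            """Alternate expansion (matrix squaring) and inflation (elementwise
            power, then column normalization) until the matrix settles."""
            M = similarity + np.eye(len(similarity))   # add self-loops
            M = M / M.sum(axis=0)                      # make column-stochastic
            for _ in range(iters):
                M = M @ M                              # expansion
                M = M ** inflation                     # inflation
                M = M / M.sum(axis=0)
            return M

        # Toy similarities: two obvious groups {0,1} and {2,3} with a weak cross edge.
        S = np.array([[0, 5, 0, 0],
                      [5, 0, 1, 0],
                      [0, 1, 0, 5],
                      [0, 0, 5, 0]], dtype=float)
        print(np.round(mcl(S), 2))  # nonzero rows mark the cluster attractors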

  17. Teaching basic life support with an automated external defibrillator using the two-stage or the four-stage teaching technique.

    PubMed

    Bjørnshave, Katrine; Krogh, Lise Q; Hansen, Svend B; Nebsbjerg, Mette A; Thim, Troels; Løfgren, Bo

    2018-02-01

    Laypersons often hesitate to perform basic life support (BLS) and use an automated external defibrillator (AED) because of self-perceived lack of knowledge and skills. Training may reduce the barrier to intervene. Reduced training time and costs may allow training of more laypersons. The aim of this study was to compare BLS/AED skills' acquisition and self-evaluated BLS/AED skills after instructor-led training with a two-stage versus a four-stage teaching technique. Laypersons were randomized to either two-stage or four-stage teaching technique courses. Immediately after training, the participants were tested in a simulated cardiac arrest scenario to assess their BLS/AED skills. Skills were assessed using the European Resuscitation Council BLS/AED assessment form. The primary endpoint was passing the test (17 of 17 skills adequately performed). A prespecified noninferiority margin of 20% was used. The two-stage teaching technique (n=72, pass rate 57%) was noninferior to the four-stage technique (n=70, pass rate 59%), with a difference in pass rates of -2%; 95% confidence interval: -18 to 15%. Neither were there significant differences between the two-stage and four-stage groups in the chest compression rate (114±12 vs. 115±14/min), chest compression depth (47±9 vs. 48±9 mm) and number of sufficient rescue breaths between compression cycles (1.7±0.5 vs. 1.6±0.7). In both groups, all participants believed that their training had improved their skills. Teaching laypersons BLS/AED using the two-stage teaching technique was noninferior to the four-stage teaching technique, although the pass rate was -2% (95% confidence interval: -18 to 15%) lower with the two-stage teaching technique.
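
    The abstract's noninferiority arithmetic can be reproduced with the usual normal-approximation interval for a difference in proportions; whether the authors used exactly this method is an assumption, and the rounded input rates shift the bounds by about a percentage point.

        import math

        def diff_ci(p1, n1, p2, n2, z=1.96):
            """Wald 95% CI for the difference of two independent proportions."""
            diff = p1 - p2
            se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
            return diff, diff - z * se, diff + z * se

        diff, lo, hi = diff_ci(p1=0.57, n1=72, p2=0.59, n2=70)
        margin = -0.20  # prespecified noninferiority margin of 20%
        print(f"difference {diff:.0%}, 95% CI {lo:.0%} to {hi:.0%}")
        print("noninferior" if lo > margin else "inconclusive")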

  18. An automated spring-loaded needle for endoscopic ultrasound-guided abdominal paracentesis in cancer patients

    PubMed Central

    Suzuki, Rei; Irisawa, Atsushi; Bhutani, Manoop S; Hikichi, Takuto; Takagi, Tadayuki; Shibukawa, Goro; Sato, Ai; Sato, Masaki; Ikeda, Tsunehiko; Watanabe, Ko; Nakamura, Jun; Annangi, Srinadh; Tasaki, Kazuhiro; Obara, Katsutoshi; Ohira, Hiromasa

    2014-01-01

    AIM: To evaluate the feasibility of using an automated spring-loaded needle device for endoscopic ultrasound (EUS)-guided abdominal paracentesis (EUS-P) and to see if this would make it easier to puncture the mobile and lax gastric wall. METHODS: The EUS database and electronic medical records at Fukushima Medical University Hospital were searched from January 2001 to April 2011. Patients with a history of cancer and who underwent EUS-P using an automated spring-loaded needle device with a 22-gauge puncture needle were included. The needle was passed through the instrument channel and advanced through the gastrointestinal wall under EUS guidance into the echo-free space in the abdominal cavity and ascitic fluid was collected. The confirmed diagnosis of malignant ascites included positive cytology and results from careful clinical observation for at least 6 mo in patients with negative cytology. The technical success rate, cytology results and complications were evaluated. RESULTS: We found 11 patients who underwent EUS-P with an automated spring-loaded needle device. In 4 cases, ascites was revealed only with EUS but not with other imaging modalities. EUS-P was done in 7 other cases because there was minimal ascitic fluid and no safe window for percutaneous abdominal aspiration. Ascitic fluid was obtained in all cases by EUS-P. The average amount aspirated was 14.1 mL (range 0.5-38 mL) and was sent for cytological exam. The etiology of ascitic fluid was benign in 5 patients and malignant in 6. In all cases, ascitic fluid was obtained with the first needle pass. No procedure-related adverse effects occurred. CONCLUSION: EUS-P with an automated spring-loaded needle device is a feasible and safe method for ascites evaluation.

  19. Invention and validation of an automated camera system that uses optical character recognition to identify patient name mislabeled samples.

    PubMed

    Hawker, Charles D; McCarthy, William; Cleveland, David; Messinger, Bonnie L

    2014-03-01

    Mislabeled samples are a serious problem in most clinical laboratories. Published error rates range from 0.39/1000 to as high as 1.12%. Standardization of bar codes and label formats has not yet achieved the needed improvement. The mislabel rate in our laboratory, although low compared with published rates, prompted us to seek a solution to achieve zero errors. To reduce or eliminate our mislabeled samples, we invented an automated device using 4 cameras to photograph the outside of a sample tube. The system uses optical character recognition (OCR) to look for discrepancies between the patient name in our laboratory information system (LIS) vs the patient name on the customer label. All discrepancies detected by the system's software then require human inspection. The system was installed on our automated track and validated with production samples. We obtained 1 009 830 images during the validation period, and every image was reviewed. OCR passed approximately 75% of the samples, and no mislabeled samples were passed. The 25% failed by the system included 121 samples actually mislabeled by patient name and 148 samples with spelling discrepancies between the patient name on the customer label and the patient name in our LIS. Only 71 of the 121 mislabeled samples detected by OCR were found through our normal quality assurance process. We have invented an automated camera system that uses OCR technology to identify potential mislabeled samples. We have validated this system using samples transported on our automated track. Full implementation of this technology offers the possibility of zero mislabeled samples in the preanalytic stage.
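
    A sketch of the comparison-and-triage step described above follows; the normalization rules and the near-miss threshold are assumptions for illustration, not the system's actual parameters.

        from difflib import SequenceMatcher

        def normalize(name: str) -> str:
            """Uppercase, drop commas, collapse whitespace."""
            return " ".join(name.upper().replace(",", " ").split())

        def triage(ocr_name: str, lis_name: str, near_miss: float = 0.85) -> str:
            """Pass exact matches; route every discrepancy to human review."""
            a, b = normalize(ocr_name), normalize(lis_name)
            if a == b:
                return "pass"
            ratio = SequenceMatcher(None, a, b).ratio()
            return ("review: possible misspelling" if ratio >= near_miss
                    else "review: possible mislabel")

        print(triage("SMITH, JOHN", "Smith John"))  # pass
        print(triage("SMYTH, JOHN", "Smith John"))  # review: possible misspelling
        print(triage("DOE, JANE", "Smith John"))    # review: possible mislabel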

  20. Automated feature extraction for retinal vascular biometry in zebrafish using OCT angiography

    NASA Astrophysics Data System (ADS)

    Bozic, Ivan; Rao, Gopikrishna M.; Desai, Vineet; Tao, Yuankai K.

    2017-02-01

    Zebrafish have been identified as an ideal model for angiogenesis because of anatomical and functional similarities with other vertebrates. The scale and complexity of zebrafish assays are limited by the need to manually treat and serially screen animals, and recent technological advances have focused on automation and improving throughput. Here, we use optical coherence tomography (OCT) and OCT angiography (OCT-A) to perform noninvasive, in vivo imaging of retinal vasculature in zebrafish. OCT-A summed voxel projections were low pass filtered and skeletonized to create an en face vascular map prior to connectivity analysis. Vascular segmentation was referenced to the optic nerve head (ONH), which was identified by automatically segmenting the retinal pigment epithelium boundary on the OCT structural volume. The first vessel branch generation was identified as skeleton segments with branch points closest to the ONH, and subsequent generations were found iteratively by expanding the search space outwards from the ONH. Biometric parameters, including length, curvature, and branch angle of each vessel segment were calculated and grouped by branch generation. Despite manual handling and alignment of each animal over multiple time points, we observe distinct qualitative patterns that enable unique identification of each eye from individual animals. We believe this OCT-based retinal biometry method can be applied for automated animal identification and handling in high-throughput organism-level pharmacological assays and genetic screens. In addition, these extracted features may enable high-resolution quantification of longitudinal vascular changes as a method for studying zebrafish models of retinal neovascularization and vascular remodeling.
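
    The iterative generation labeling lends itself to a breadth-first expansion over the skeleton graph; the sketch below uses an invented toy adjacency and is not the authors' implementation.

        from collections import deque

        def label_generations(adjacency, first_generation):
            """Assign branch generations by expanding outward from the
            segments whose branch points lie closest to the ONH."""
            generation = {seg: 1 for seg in first_generation}
            queue = deque(first_generation)
            while queue:
                seg = queue.popleft()
                for child in adjacency.get(seg, []):
                    if child not in generation:
                        generation[child] = generation[seg] + 1
                        queue.append(child)
            return generation

        # Toy skeleton: segments A and B branch directly off the ONH.
        vessel_tree = {"A": ["A1", "A2"], "A1": ["A1a"], "B": ["B1"]}
        print(label_generations(vessel_tree, first_generation=["A", "B"]))
        # {'A': 1, 'B': 1, 'A1': 2, 'A2': 2, 'B1': 2, 'A1a': 3}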

  1. "Intelligences That Plants Can Pass On": Play Dough, Fun and Teaching Strategies with Insights to Multiple Intelligences

    ERIC Educational Resources Information Center

    Laughlin, Kevin; Foley, Andi

    2012-01-01

    The "Intelligences That Plants Can Pass On" is an activity that involves several of Gardner's Multiple Intelligences and was designed for demonstrating the practical use of Multiple Intelligences in delivering education programs to all ages of learners. Instructions are provided for how to implement this activity, and the activity is linked to…

  2. Inspection of thick welded joints using laser-ultrasonic SAFT.

    PubMed

    Lévesque, D; Asaumi, Y; Lord, M; Bescond, C; Hatanaka, H; Tagami, M; Monchalin, J-P

    2016-07-01

    The detection of defects in thick butt joints in the early phase of multi-pass arc welding would be very valuable for reducing the cost and time of reworking. As a non-contact method, the laser-ultrasonic technique (LUT) has the potential for the automated inspection of welds, ultimately online during manufacturing. In this study, testing has been carried out using LUT combined with the synthetic aperture focusing technique (SAFT) on 25 and 50 mm thick butt-welded joints of steel, both completed and partially welded. EDM slits of 2 or 3 mm height were inserted at different depths in the multi-pass welding process to simulate a lack of fusion. Line scans transverse to the weld are performed with the generation and detection laser spots superimposed directly on the surface of the weld bead. A CCD line camera is used to simultaneously acquire the surface profile for correction in the SAFT processing. All artificial defects, but also real defects, are visualized in the investigated thick butt weld specimens, either completed or partially welded after a given number of passes. The results obtained clearly show the potential of using LUT with SAFT for the automated inspection of arc welds or hybrid laser-arc welds during manufacturing. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
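
    For readers unfamiliar with SAFT, a minimal delay-and-sum reconstruction is sketched below; the array sizes, scan pitch, sampling interval, and velocity are illustrative stand-ins, and the real processing adds the surface-profile correction described above.

        import numpy as np

        def saft(bscan, pitch_mm, dt_us, velocity_mm_us):
            """For each image pixel, sum the A-scan samples whose round-trip
            travel time from each scan position matches the pixel location."""
            n_scans, n_samples = bscan.shape
            xs = np.arange(n_scans) * pitch_mm                  # scan positions
            depths = np.arange(n_samples) * dt_us * velocity_mm_us / 2.0
            image = np.zeros((n_scans, n_samples))
            for ix, x in enumerate(xs):                         # image column
                for iz, z in enumerate(depths):                 # image depth
                    dist = np.hypot(xs - x, z)                  # one-way path per scan
                    idx = np.round(2.0 * dist / velocity_mm_us / dt_us).astype(int)
                    valid = idx < n_samples
                    image[ix, iz] = bscan[valid, idx[valid]].sum()
            return image

        rng = np.random.default_rng(0)
        bscan = rng.normal(size=(32, 256))  # stand-in for measured laser-ultrasonic data
        print(saft(bscan, pitch_mm=0.5, dt_us=0.01, velocity_mm_us=5.9).shape)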

  3. Automated brush plating process for solid oxide fuel cells

    DOEpatents

    Long, Jeffrey William

    2003-01-01

    A method of depositing a metal coating (28) on the interconnect (26) of a tubular, hollow fuel cell (10) contains the steps of providing the fuel cell (10) having an exposed interconnect surface (26); contacting the inside of the fuel cell (10) with a cathode (45) without use of any liquid materials; passing electrical current through a contacting applicator (46) which contains a metal electrolyte solution; passing the current from the applicator (46) to the cathode (45) and contacting the interconnect (26) with the applicator (46) and coating all of the exposed interconnect surface.

  4. Automated macromolecular crystallization screening

    DOEpatents

    Segelke, Brent W.; Rupp, Bernhard; Krupka, Heike I.

    2005-03-01

    An automated macromolecular crystallization screening system wherein a multiplicity of reagent mixes are produced. A multiplicity of analysis plates is produced utilizing the reagent mixes combined with a sample. The analysis plates are incubated to promote growth of crystals. Images of the crystals are made. The images are analyzed with regard to suitability of the crystals for analysis by x-ray crystallography. A design of reagent mixes is produced based upon the expected suitability of the crystals for analysis by x-ray crystallography. A second multiplicity of mixes of the reagent components is produced utilizing the design and a second multiplicity of reagent mixes is used for a second round of automated macromolecular crystallization screening. In one embodiment the multiplicity of reagent mixes are produced by a random selection of reagent components.

  5. Recycling isotachophoresis - A novel approach to preparative protein fractionation

    NASA Technical Reports Server (NTRS)

    Sloan, Jeffrey E.; Thormann, Wolfgang; Bier, Milan; Twitty, Garland E.; Mosher, Richard A.

    1986-01-01

    The concept of automated recycling isotachophoresis (RITP) as a purification methodology is discussed, in addition to a description of the apparatus. In the present automated RITP, the computer system follows the achievement of steady state using arrays of universal and specific sensors, monitors the position of the front edge of the zone structure, activates the counterflow when the leading boundary passes a specified position along the separation axis, or adjusts the applied current accordingly. The system demonstrates high resolution, in addition to higher processing rates than are possible in zone electrophoresis or isoelectric focusing.

  6. Examining single- and multiple-process theories of trust in automation.

    PubMed

    Rice, Stephen

    2009-07-01

    The author examined the effects of human responses to automation alerts and nonalerts. Previous research has shown that automation false alarms and misses have differential effects on human trust (i.e., automation false alarms tend to affect operator compliance, whereas automation misses tend to affect operator reliance). Participants performed a simulated combat task, whereby they examined aerial photographs for the presence of enemy targets. A diagnostic aid provided a recommendation during each trial. The author manipulated the reliability and response bias of the aid to provide appropriate data for state-trace analyses. The analyses provided strong evidence that only a multiple-process theory of operator trust can explain the effects of automation errors on human dependence behaviors. The author discusses the theoretical and practical implications of this finding.

  7. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy -Major Accomplishments and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Orr, James K.

    2010-01-01

    This presentation has shown the accomplishments of the PASS project over three decades and highlighted the lessons learned. Over the entire time, our goal has been to continuously improve our process, implement automation for both quality and increased productivity, and identify and remove all defects due to prior execution of a flawed process in addition to improving our processes following identification of significant process escapes. Morale and workforce instability have been issues, most significantly during 1993 to 1998 (period of consolidation in aerospace industry). The PASS project has also consulted with others, including the Software Engineering Institute, so as to be an early evaluator, adopter, and adapter of state-of-the-art software engineering innovations.

  8. Delivering real-time status and arrival information to commuter rail passengers at complex stations

    DOT National Transportation Integrated Search

    2003-08-01

    Software was developed for calculating real-time train status in an Automated Train Information Display System (ATIDS) at NJ Transit. Interfaces were developed for passing schedules and real-time train position and routing data from a rail traffic co...

  9. High-throughput biological small-angle X-ray scattering with a robotically loaded capillary cell

    PubMed Central

    Nielsen, S. S.; Møller, M.; Gillilan, R. E.

    2012-01-01

    With the rise in popularity of biological small-angle X-ray scattering (BioSAXS) measurements, synchrotron beamlines are confronted with an ever-increasing number of samples from a wide range of solution conditions. To meet these demands, an increasing number of beamlines worldwide have begun to provide automated liquid-handling systems for sample loading. This article presents an automated sample-loading system for BioSAXS beamlines, which combines single-channel disposable-tip pipetting with a vacuum-enclosed temperature-controlled capillary flow cell. The design incorporates an easily changeable capillary to reduce the incidence of X-ray window fouling and cross contamination. Both the robot-control and the data-processing systems are written in Python. The data-processing code, RAW, has been enhanced with several new features to form a user-friendly BioSAXS pipeline for the robot. The flow cell also supports efficient manual loading and sample recovery. An effective rinse protocol for the sample cell is developed and tested. Fluid dynamics within the sample capillary reveals a vortex ring pattern of circulation that redistributes radiation-damaged material. Radiation damage is most severe in the boundary layer near the capillary surface. At typical flow speeds, capillaries below 2 mm in diameter are beginning to enter the Stokes (creeping flow) regime in which mixing due to oscillation is limited. Analysis within this regime shows that single-pass exposure and multiple-pass exposure of a sample plug are functionally the same with regard to exposed volume when plug motion reversal is slow. The robot was tested on three different beamlines at the Cornell High-Energy Synchrotron Source, with a variety of detectors and beam characteristics, and it has been used successfully in several published studies as well as in two introductory short courses on basic BioSAXS methods. PMID:22509071

  10. Multiple high voltage output DC-to-DC power converter

    NASA Technical Reports Server (NTRS)

    Cronin, Donald L. (Inventor); Farber, Bertrand F. (Inventor); Gehm, Hartmut K. (Inventor); Goldin, Daniel S. (Inventor)

    1977-01-01

    Disclosed is a multiple output DC-to-DC converter. The DC input power is filtered and passed through a chopper preregulator. The chopper output is then passed through a current source inverter controlled by a squarewave generator. The resultant AC is passed through the primary winding of a transformer, with high voltages induced in a plurality of secondary windings. The high voltage secondary outputs are each solid-state rectified for passage to individual output loads. Multiple feedback loops control the operation of the chopper preregulator, one being responsive to the current through the primary winding and another responsive to the DC voltage level at a selected output.

  11. Increased Resistance to Flow and Ventilator Failure Secondary to Faulty CO2 Absorbent Insert Not Detected During Automated Anesthesia Machine Check: A Case Report.

    PubMed

    Moreno-Duarte, Ingrid; Montenegro, Julio; Balonov, Konstantin; Schumann, Roman

    2017-04-15

    Most modern anesthesia workstations provide automated checkout, which indicates the readiness of the anesthesia machine. In this case report, an anesthesia machine passed the automated machine checkout. Minutes after the induction of general anesthesia, we observed a mismatch between the selected and delivered tidal volumes in the volume auto flow mode with increased inspiratory resistance during manual ventilation. Endotracheal tube kinking, circuit obstruction, leaks, and patient-related factors were ruled out. Further investigation revealed a broken internal insert within the CO2 absorbent canister that allowed absorbent granules to cause a partial obstruction to inspiratory and expiratory flow triggering contradictory alarms. We concluded that even when the automated machine checkout indicates machine readiness, unforeseen equipment failure due to unexpected events can occur and require providers to remain vigilant.

  12. Automation of Ocean Product Metrics

    DTIC Science & Technology

    2008-09-30

    Presented in: Ocean Sciences 2008 Conf., 5 Mar 2008. Shriver, J., J. D. Dykes, and J. Fabre: Automation of Operational Ocean Product Metrics. Presented in 2008 EGU General Assembly, 14 April 2008. …processing (multiple data cuts per day) and multiple-nested models. Routines for generating automated evaluations of model forecast statistics will be developed and pre-existing tools will be collected to create a generalized tool set, which will include user-interface tools to the metrics data.

  13. Multiple-rotor-cycle 2D PASS experiments with applications to (207)Pb NMR spectroscopy.

    PubMed

    Vogt, F G; Gibson, J M; Aurentz, D J; Mueller, K T; Benesi, A J

    2000-03-01

    The two-dimensional phase-adjusted spinning sidebands (2D PASS) experiment is a useful technique for simplifying magic-angle spinning (MAS) NMR spectra that contain overlapping or complicated spinning sideband manifolds. The pulse sequence separates spinning sidebands by their order in a two-dimensional experiment. The result is an isotropic/anisotropic correlation experiment, in which a sheared projection of the 2D spectrum effectively yields an isotropic spectrum with no sidebands. The original 2D PASS experiment works best at lower MAS speeds (1-5 kHz). At higher spinning speeds (8-12 kHz) the experiment requires higher RF power levels so that the pulses do not overlap. In the case of nuclei such as (207)Pb, a large chemical shift anisotropy often yields too many spinning sidebands to be handled by a reasonable 2D PASS experiment unless higher spinning speeds are used. Performing the experiment at these speeds requires fewer 2D rows and a correspondingly shorter experimental time. Therefore, we have implemented PASS pulse sequences that occupy multiple MAS rotor cycles, thereby avoiding pulse overlap. These multiple-rotor-cycle 2D PASS sequences are intended for use in high-speed MAS situations such as those required by (207)Pb. A version of the multiple-rotor-cycle 2D PASS sequence that uses composite pulses to suppress spectral artifacts is also presented. These sequences are demonstrated on (207)Pb test samples, including lead zirconate, a perovskite-phase compound that is representative of a large class of interesting materials. Copyright 2000 Academic Press.

  14. Peer Assisted Study Sessions for Postgraduate International Students in Australia

    ERIC Educational Resources Information Center

    Zaccagnini, Melissa; Verenikina, Irina

    2013-01-01

    Peer Assisted Study Sessions (PASS), a peer led academic support program that has multiple documented academic, social, and transition benefits, is increasingly being utilised in Australian institutions. Whilst PASS has been evaluated from multiple angles in regard to the undergraduate cohort, there is limited research regarding the benefits of…

  15. Aquarius's Instrument Science Data System (ISDS) Automated to Acquire, Process, Trend Data and Produce Radiometric System Assessment Reports

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Aquarius Radiometer, a subsystem of the Aquarius Instrument, required a data acquisition ground system to support calibration and radiometer performance assessment. To support calibration and compose performance assessments, we developed an automated system which uploaded raw data to an FTP server and saved raw and processed data to a database. This paper details the overall functionalities of the Aquarius Instrument Science Data System (ISDS) and the individual electrical ground support equipment (EGSE) which produced data files that were infused into the ISDS. Real-time EGSEs include an ICDS Simulator, a Calibration GSE, a LabVIEW-controlled power supply, and a chamber data acquisition system. The ICDS Simulator serves as the test conductor's primary workstation, collecting radiometer housekeeping (HK) and science data and passing commands and HK telemetry collection requests to the radiometer. The Calibration GSE (Radiometer Active Test Source) provides a choice of multiple source targets for the radiometer's external calibration. The Power Supply GSE, controlled by LabVIEW, provides real-time voltage and current monitoring of the radiometer. Finally, the chamber data acquisition system produces data reflecting chamber vacuum pressure, thermistor temperatures, AVG and watts. Each GSE system produces text-based data files every two to six minutes and automatically copies them to the Central Archiver PC. The Archiver PC stores the data files, schedules automated uploads of these files to an external FTP server, and accepts requests to copy all data files to the ISDS for offline data processing and analysis. The Aquarius Radiometer ISDS contains PHP and MATLab programs to parse, process and save all data to a MySQL database. Analysis tools (MATLab programs) in the ISDS system are capable of displaying radiometer science, telemetry and auxiliary data in near real time as well as performing data analysis and producing automated performance assessment reports of the Aquarius Radiometer.
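
    The parse-and-store step can be pictured with the toy sketch below; the paper's ISDS uses PHP and MATLab programs with a MySQL database, so the Python/SQLite version and the file format shown here are stand-ins for illustration only.

        import sqlite3

        def ingest(path: str, db: sqlite3.Connection) -> int:
            """Parse invented 'timestamp,channel,value' lines from one EGSE file."""
            rows = []
            with open(path) as fh:
                for line in fh:
                    line = line.strip()
                    if not line or line.startswith("#"):
                        continue
                    ts, channel, value = line.split(",")
                    rows.append((ts, channel, float(value)))
            db.executemany("INSERT INTO telemetry VALUES (?, ?, ?)", rows)
            db.commit()
            return len(rows)

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE telemetry (ts TEXT, channel TEXT, value REAL)")
        # ingest("hk_dump_001.txt", db)  # hypothetical two-to-six-minute EGSE dump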

  16. Fully Automated RNAscope In Situ Hybridization Assays for Formalin‐Fixed Paraffin‐Embedded Cells and Tissues

    PubMed Central

    Anderson, Courtney M.; Zhang, Bingqing; Miller, Melanie; Butko, Emerald; Wu, Xingyong; Laver, Thomas; Kernag, Casey; Kim, Jeffrey; Luo, Yuling; Lamparski, Henry; Park, Emily; Su, Nan

    2016-01-01

    ABSTRACT Biomarkers such as DNA, RNA, and protein are powerful tools in clinical diagnostics and therapeutic development for many diseases. Identifying RNA expression at the single cell level within the morphological context by RNA in situ hybridization provides a great deal of information on gene expression changes over conventional techniques that analyze bulk tissue, yet widespread use of this technique in the clinical setting has been hampered by the dearth of automated RNA ISH assays. Here we present an automated version of the RNA ISH technology RNAscope that is adaptable to multiple automation platforms. The automated RNAscope assay yields a high signal‐to‐noise ratio with little to no background staining and results comparable to the manual assay. In addition, the automated duplex RNAscope assay was able to detect two biomarkers simultaneously. Lastly, assay consistency and reproducibility were confirmed by quantification of TATA‐box binding protein (TBP) mRNA signals across multiple lots and multiple experiments. Taken together, the data presented in this study demonstrate that the automated RNAscope technology is a high performance RNA ISH assay with broad applicability in biomarker research and diagnostic assay development. J. Cell. Biochem. 117: 2201–2208, 2016. © 2016 The Authors. Journal of Cellular Biochemistry Published by Wiley Periodicals, Inc. PMID:27191821

  17. ARTIP: Automated Radio Telescope Image Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging for radio-interferometric data. ARTIP starts with raw data, i.e. a measurement set and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts and logs. It is written using standard python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and also multiple target sources which may have arbitrary combinations of flux/bandpass/phase calibrators.

  18. Macrophage Responses to Epithelial Dysfunction Promote Lung Fibrosis in Aging

    DTIC Science & Technology

    2017-10-01

    …and Christman, 2016, AJRCMB) and at the time of this report listed among the highly accessed articles on the AJRCMB website. Importantly, our protocol and findings… …seq were prepared using a high-throughput automated robotic platform (Agilent Bravo) to minimize a batch effect; all libraries have passed the QC

  19. Beacon Spacecraft Operations: Lessons in Automation

    NASA Technical Reports Server (NTRS)

    Sherwood, R.; Schlutsmeyer, A.; Sue, M.; Szijjarto, J.; Wyatt, E. J.

    2000-01-01

    A new approach to mission operations has been flight validated on NASA's Deep Space One (DS1) mission that launched in October 1998. The beacon monitor operations technology is aimed at decreasing the total volume of downlinked engineering telemetry by reducing the frequency of downlink and the volume of data received per pass.

  20. Evaluating single-pass catch as a tool for identifying spatial pattern in fish distribution

    USGS Publications Warehouse

    Bateman, Douglas S.; Gresswell, Robert E.; Torgersen, Christian E.

    2005-01-01

    We evaluate the efficacy of single-pass electrofishing without blocknets as a tool for collecting spatially continuous fish distribution data in headwater streams. We compare spatial patterns in abundance, sampling effort, and length-frequency distributions from single-pass sampling of coastal cutthroat trout (Oncorhynchus clarki clarki) to data obtained from a more precise multiple-pass removal electrofishing method in two mid-sized (500–1000 ha) forested watersheds in western Oregon. Abundance estimates from single- and multiple-pass removal electrofishing were positively correlated in both watersheds, r = 0.99 and 0.86. There were no significant trends in capture probabilities at the watershed scale (P > 0.05). Moreover, among-sample variation in fish abundance was higher than within-sample error in both streams indicating that increased precision of unit-scale abundance estimates would provide less information on patterns of abundance than increasing the fraction of habitat units sampled. In the two watersheds, respectively, single-pass electrofishing captured 78 and 74% of the estimated population of cutthroat trout with 7 and 10% of the effort. At the scale of intermediate-sized watersheds, single-pass electrofishing exhibited a sufficient level of precision to be effective in detecting spatial patterns of cutthroat trout abundance and may be a useful tool for providing the context for investigating fish-habitat relationships at multiple scales.

  1. Variable pathlength cavity spectroscopy development of an automated prototype

    NASA Astrophysics Data System (ADS)

    Schmeling, Ryan Andrew

    Spectroscopy is the study of the interaction of electromagnetic radiation (EMR) with matter to probe the chemical and physical properties of atoms and molecules. The primary types of analytical spectroscopy are absorption, emission, and scattering methods. Absorption spectroscopy can quantitatively determine the chemical concentration of a given species in a sample by the relationship described by Beer's Law. Upon inspection of Beer's Law, it becomes apparent that for a given analyte concentration, the only experimental variable is the pathlength. Over the past ˜75 years, several approaches to physically increasing the pathlength have been reported in the literature. These have included not only larger cuvettes and novel techniques such as Differential Optical Absorption Spectroscopy, but also numerous designs that are based upon the creation of an optical cavity in which multiple reflections through the sample are made possible. The cavity-based designs range from the White Cell (1942) to Cavity Ring-Down Spectroscopy (O'Keefe and Deacon, 1988). In the White Cell approach, the incident beam is directed off-axis to repeatedly reflect off concave mirror surfaces. Numerous variations of the White Cell design have been reported, and it has found wide application in infrared absorption spectroscopy in what have come to be known as "light pipes". In the CRDS design, on the other hand, highly reflective dielectric mirrors situated for on-axis reflections result in the measurement of the exponential decay of trapped light that passes through the exit mirror. CRDS has proven over the past two decades to be a powerful technique for ultra-trace analysis (< 10⁻¹⁵ g), with practical applications ranging from atmospheric monitoring of greenhouse gases to biomedical "breath screening" as a means to identify disease states. In this thesis, a novel approach to ultra-trace analysis by absorption spectroscopy is described. In this approach, known as Variable Pathlength Cavity Spectroscopy (VPCS), a high finesse optical cavity is created by two flat, parallel, dielectric mirrors -- one of which is rotating. Source light from a pulsed dye laser (488 nm) enters the optical cavity in the same manner as in Cavity Ring-Down Spectroscopy (CRDS), i.e., by passing through the cavity entrance mirror. However, unlike CRDS in which the mirrors are fixed, concave, and mechanically unaltered, the cavity exit mirror contains a slit (1.0 mm diameter) that is rotated at high speed on an axle, thereby transmitting a small fraction of the trapped light to a photomultiplier tube detector. In this approach, unlike CRDS, absorbance is measured directly. In previous prototype designs of the VPCS instrument, instrument control (alignment) and data acquisition and reduction were performed manually; these functions were both inefficient and tedious. Despite this, the VPCS was validated in "proof of concept" testing, as described with a previous prototype (Frost, 2011). Frost demonstrated that the pathlength enhancement increased 53-fold compared to single-pass absorption measurements in monitoring NO2 (g) at part-per-billion levels. The goal of the present work is to improve upon the previous prototype ("P4") that required manual alignment, data collection, and data reduction by creating a completely automated version of VPCS -- i.e., the "P5" prototype. By developing source code in LabVIEW(TM), it is demonstrated that the VPCS can be completely controlled in an automated fashion.
Computationally, a Field-Programmable Gate Array is used to automate the process of data collection and reduction in real time. It is shown that the inputs and outputs of the P5 instrument can be continuously monitored, allowing for real-time triggering of the source laser, collection of all data, and reduction of the data to report absorbance. Furthermore, it is shown that the VPCS can be automatically aligned -- also in real time, on the order of microseconds -- to a high degree of precision by using servo-actuators that adjust the beam position based upon the input from a sensitive CCD camera. With the implementation of this hardware and LabVIEW code, more precise data collection and reduction are achieved. With this new fully automated design, the instrument characteristics (e.g., rotation speed, offset angle, and pathlength variation) can improve the enhancement to ˜130-fold vs. single-pass absorption measurements.
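
    The Beer's Law arithmetic behind such pathlength-enhancement figures is simple to state; in the sketch below, the absorbance noise floor and molar absorptivity are placeholders, not measured VPCS values.

        # With A = epsilon * l * c, multiplying the effective pathlength l by N
        # lowers the minimum detectable concentration by the same factor N.
        def min_detectable_conc(a_min: float, epsilon: float, pathlength_cm: float) -> float:
            """Smallest concentration giving absorbance a_min under Beer's Law."""
            return a_min / (epsilon * pathlength_cm)

        a_min = 1e-4      # assumed absorbance noise floor
        epsilon = 150.0   # assumed molar absorptivity, L mol^-1 cm^-1
        base = min_detectable_conc(a_min, epsilon, pathlength_cm=10.0)
        enhanced = min_detectable_conc(a_min, epsilon, pathlength_cm=10.0 * 130)
        print(f"single pass: {base:.2e} M, with ~130-fold enhancement: {enhanced:.2e} M")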

  2. System-wide versus component-specific trust using multiple aids.

    PubMed

    Keller, David; Rice, Stephen

    2010-01-01

    Previous research in operator trust toward automated aids has focused primarily on single aids. The current study focuses on how operator trust is affected by the presence of multiple aids. Two competing theories of multiple-trust are presented. A component-specific trust theory predicts that operators will differentially place their trust in automated aids that vary in reliability. A system-wide trust theory predicts that operators will treat multiple imperfect aids as one "system" and merge their trust across aids despite differences in the aids' reliability. A simulated flight task was used to test these theories, whereby operators performed a pursuit tracking task while concurrently monitoring multiple system gauges that were augmented with perfect or imperfect automated aids. The data revealed that a system-wide trust theory best predicted the data; operators merged their trust across both aids, behaving toward a perfectly reliable aid in the same manner as they did towards unreliable aids.

  3. The Chandra Source Catalog : Automated Source Correlation

    NASA Astrophysics Data System (ADS)

    Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed a different number of times at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).

  4. Introducing passive acoustic filter in acoustic based condition monitoring: Motor bike piston-bore fault identification

    NASA Astrophysics Data System (ADS)

    Jena, D. P.; Panigrahi, S. N.

    2016-03-01

    The requirement of designing a sophisticated digital band-pass filter in acoustic-based condition monitoring has been eliminated by introducing a passive acoustic filter in the present work. So far, no one has attempted to explore the possibility of implementing passive acoustic filters in acoustic-based condition monitoring as a pre-conditioner. In order to enhance acoustic-based condition monitoring, a passive acoustic band-pass filter has been designed and deployed. Towards achieving an efficient band-pass acoustic filter, a generalized design methodology has been proposed to design and optimize the desired acoustic filter using multiple filter components in series. An appropriate objective function has been identified for genetic algorithm (GA) based optimization with multiple design constraints. In addition, the robustness of the proposed method has been demonstrated by designing a band-pass filter using an n-branch Quincke tube, a high-pass filter and multiple Helmholtz resonators. The performance of the designed acoustic band-pass filter has been demonstrated by investigating the piston-bore defect of a motorbike using its engine noise signature. Introducing a passive acoustic filter into acoustic-based condition monitoring significantly enhances machine-learning-based fault identification. This is also a first attempt of its kind.

  5. Validation of an automated system for aliquoting of HIV-1 Env-pseudotyped virus stocks.

    PubMed

    Schultz, Anke; Germann, Anja; Fuss, Martina; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A; Montefiori, David C; Zimmermann, Heiko; von Briesen, Hagen

    2018-01-01

    The standardized assessments of HIV-specific immune responses are of main interest in the preclinical and clinical stage of HIV-1 vaccine development. In this regard, HIV-1 Env-pseudotyped viruses play a central role for the evaluation of neutralizing antibody profiles and are produced according to Good Clinical Laboratory Practice- (GCLP-) compliant manual and automated procedures. To further improve and complete the automated production cycle, an automated system for aliquoting HIV-1 pseudovirus stocks has been implemented. The automation platform consists of a modified Tecan-based system including a robot platform for handling racks containing 48 cryovials, a Decapper, a tubing pump and a safety device consisting of ultrasound sensors for online liquid level detection of each individual cryovial. With the aim to aliquot the HIV-1 pseudoviruses in an automated manner under GCLP-compliant conditions, a validation plan was developed in which the acceptance criteria (accuracy, precision, specificity and robustness) were defined and summarized. By passing the validation experiments described in this article, the automated system for aliquoting has been successfully validated. This allows the standardized and operator-independent distribution of small-scale and bulk amounts of HIV-1 pseudovirus stocks with a precise and reproducible outcome to support upcoming clinical vaccine trials.

  7. Best-Quality Vessel Identification Using Vessel Quality Measure in Multiple-Phase Coronary CT Angiography.

    PubMed

    Hadjiiski, Lubomir; Liu, Jordan; Chan, Heang-Ping; Zhou, Chuan; Wei, Jun; Chughtai, Aamer; Kuriakose, Jean; Agarwal, Prachi; Kazerooni, Ella

    2016-01-01

    The detection of stenotic plaques strongly depends on the quality of the coronary arterial tree imaged with coronary CT angiography (cCTA). However, it is time consuming for the radiologist to select the best-quality vessels from the multiple-phase cCTA for interpretation in clinical practice. We are developing an automated method for selection of the best-quality vessels from coronary arterial trees in multiple-phase cCTA to facilitate radiologist's reading or computerized analysis. Our automated method consists of vessel segmentation, vessel registration, corresponding vessel branch matching, vessel quality measure (VQM) estimation, and automatic selection of best branches based on VQM. For every branch, the VQM was calculated as the average radial gradient. An observer preference study was conducted to visually compare the quality of the selected vessels. 167 corresponding branch pairs were evaluated by two radiologists. The agreement between the first radiologist and the automated selection was 76% with kappa of 0.49. The agreement between the second radiologist and the automated selection was also 76% with kappa of 0.45. The agreement between the two radiologists was 81% with kappa of 0.57. The observer preference study demonstrated the feasibility of the proposed automated method for the selection of the best-quality vessels from multiple cCTA phases.
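
    A sketch of the VQM step in Python, taking the abstract's definition literally (average radial gradient across the vessel wall); the sampling geometry, the 2-D slice, and the lumen-brighter-than-background convention are assumptions:

        import numpy as np

        def vessel_quality_measure(image, centerline, radius, n_dirs=16):
            """Average radial gradient: intensity drop from just inside to just
            outside the nominal wall, averaged over directions and points."""
            angles = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
            grads = []
            for (y, x) in centerline:
                for a in angles:
                    dy, dx = np.sin(a), np.cos(a)
                    inner = image[int(round(y + 0.8 * radius * dy)),
                                  int(round(x + 0.8 * radius * dx))]
                    outer = image[int(round(y + 1.2 * radius * dy)),
                                  int(round(x + 1.2 * radius * dx))]
                    grads.append(float(inner) - float(outer))
            return float(np.mean(grads))

        def select_best_phase(images, centerlines, radius):
            """Pick the cCTA phase whose matched branch scores the highest VQM."""
            return int(np.argmax([vessel_quality_measure(im, cl, radius)
                                  for im, cl in zip(images, centerlines)]))

        # Toy check: a bright disk of radius 5 centred in a dark 64x64 frame
        yy, xx = np.mgrid[:64, :64]
        img = ((yy - 32) ** 2 + (xx - 32) ** 2 <= 25).astype(float) * 100.0
        print(vessel_quality_measure(img, [(32, 32)], radius=5))   # ~100: sharp wall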

  8. PASS2: an automated database of protein alignments organised as structural superfamilies.

    PubMed

    Bhaduri, Anirban; Pugalenthi, Ganesan; Sowdhamini, Ramanathan

    2004-04-02

    The functional selection and three-dimensional structural constraints of proteins in nature often relate to the retention of significant sequence similarity between proteins of similar fold and function despite poor sequence identity. Organization of structure-based sequence alignments for distantly related proteins provides a map of the conserved and critical regions of the protein universe that is useful for the analysis of folding principles, for the evolutionary unification of protein families and for maximizing the information return from experimental structure determination. The Protein Alignment organised as Structural Superfamily (PASS2) database represents continuously updated structural alignments for evolutionarily related, sequentially distant proteins. The automated and updated version of PASS2 is in direct correspondence with SCOP 1.63 and consists of sequences having identity below 40% among themselves. Protein domains have been grouped into 628 multi-member superfamilies and 566 single-member superfamilies. Structure-based sequence alignments for the superfamilies have been obtained using COMPARER, while initial equivalencies have been derived from a preliminary superposition using LSQMAN or STAMP 4.0. The final sequence alignments have been annotated for structural features using JOY4.0. The database is supplemented with sequence relatives belonging to different genomes, conserved spatially interacting and structural motifs, probabilistic hidden Markov models of superfamilies based on the alignments, and useful links to other databases. Probabilistic models and sensitive position-specific profiles obtained from reliable superfamily alignments aid annotation of remote homologues and are useful tools in structural and functional genomics. PASS2 presents the phylogeny of its members based on both sequence and structural dissimilarities. Clustering of members allows us to understand the diversification of the family members. The search engine has been improved for simpler browsing of the database. The database resolves alignments among structural domains consisting of evolutionarily diverged sets of sequences. The availability of reliable sequence alignments of distantly related proteins despite poor sequence identity, together with single-member superfamilies, permits better sampling of structures in libraries for fold recognition of new sequences and for the understanding of protein structure-function relationships of individual superfamilies. PASS2 is accessible at http://www.ncbs.res.in/~faculty/mini/campass/pass2.html

  9. Influencing Trust for Human-Automation Collaborative Scheduling of Multiple Unmanned Vehicles.

    PubMed

    Clare, Andrew S; Cummings, Mary L; Repenning, Nelson P

    2015-11-01

    We examined the impact of priming on operator trust and system performance when supervising a decentralized network of heterogeneous unmanned vehicles (UVs). Advances in autonomy have enabled a future vision of single-operator control of multiple heterogeneous UVs. Real-time scheduling for multiple UVs in uncertain environments requires the computational ability of optimization algorithms combined with the judgment and adaptability of human supervisors. Because of system and environmental uncertainty, appropriate operator trust will be instrumental in maintaining high system performance and preventing cognitive overload. Three groups of operators experienced different levels of trust priming prior to conducting simulated missions in an existing, multiple-UV simulation environment. Participants who play computer and video games frequently were found to have a higher propensity to overtrust automation. By priming gamers to lower their initial trust to a more appropriate level, system performance was improved by 10% as compared to gamers who were primed to have higher trust in the automation. Priming was successful at adjusting the operator's initial and dynamic trust in the automated scheduling algorithm, which had a substantial impact on system performance. These results have important implications for personnel selection and training for futuristic multi-UV systems under human supervision. Although gamers may bring valuable skills, they may also be potentially prone to automation bias. Priming during training and regular priming throughout missions may be one potential method for overcoming this propensity to overtrust automation. © 2015, Human Factors and Ergonomics Society.

  10. The role of human-automation consensus in multiple unmanned vehicle scheduling.

    PubMed

    Cummings, M L; Clare, Andrew; Hart, Christin

    2010-02-01

    This study examined the impact of increasing automation replanning rates on operator performance and workload when supervising a decentralized network of heterogeneous unmanned vehicles. Futuristic unmanned vehicles systems will invert the operator-to-vehicle ratio so that one operator can control multiple dissimilar vehicles connected through a decentralized network. Significant human-automation collaboration will be needed because of automation brittleness, but such collaboration could cause high workload. Three increasing levels of replanning were tested on an existing multiple unmanned vehicle simulation environment that leverages decentralized algorithms for vehicle routing and task allocation in conjunction with human supervision. Rapid replanning can cause high operator workload, ultimately resulting in poorer overall system performance. Poor performance was associated with a lack of operator consensus for when to accept the automation's suggested prompts for new plan consideration as well as negative attitudes toward unmanned aerial vehicles in general. Participants with video game experience tended to collaborate more with the automation, which resulted in better performance. In decentralized unmanned vehicle networks, operators who ignore the automation's requests for new plan consideration and impose rapid replans both increase their own workload and reduce the ability of the vehicle network to operate at its maximum capacity. These findings have implications for personnel selection and training for futuristic systems involving human collaboration with decentralized algorithms embedded in networks of autonomous systems.

  11. Secure Key Storage with PUFs

    NASA Astrophysics Data System (ADS)

    Skoric, Boris; Schrijen, Geert-Jan; Tuyls, Pim; Ignatenko, Tanya; Willems, Frans

    Nowadays, people carry around devices (cell phones, PDAs, bank passes, etc.) that have a high value. That value is often contained in the data stored in the device or lies in the services the device can grant access to (by using secret identification information stored in it). These devices often operate in hostile environments and their protection level is not adequate to deal with that situation. Bank passes and credit cards contain a magnetic stripe where identification information is stored. In the case of bank passes, a PIN is additionally required to withdraw money from an ATM (Automated Teller Machine). On various occasions, it has been shown that by placing a small coil in the reader, the magnetic information stored in the stripe can easily be copied and used to produce a cloned card. Together with eavesdropping the PIN (by listening to the keypad or recording it with a camera), an attacker can easily impersonate the legitimate owner of the bank pass by using the cloned card in combination with the eavesdropped PIN.

  12. The Use of Gamma-Ray Imaging to Improve Portal Monitor Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ziock, Klaus-Peter; Collins, Jeff; Fabris, Lorenzo

    2008-01-01

    We have constructed a prototype, rapid-deployment portal monitor that uses visible-light and gamma-ray imaging to allow simultaneous monitoring of multiple lanes of traffic from the side of a roadway. Our Roadside Tracker uses automated target acquisition and tracking (TAT) software to identify and track vehicles in visible-light images. The field of view of the visible camera overlaps with and is calibrated to that of a one-dimensional gamma-ray imager. The TAT code passes information on when vehicles enter and exit the system field of view and when they cross gamma-ray pixel boundaries. Based on this information, the gamma-ray imager "harvests" the gamma-ray data specific to each vehicle, integrating its radiation signature for the entire time that it is in the field of view. In this fashion we are able to generate vehicle-specific radiation signatures and avoid the source confusion problems that plague nonimaging approaches to the same problem.

  13. Utilizing ORACLE tools within Unix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferguson, R.

    1995-07-01

    Large databases, by their very nature, often serve as repositories of data which may be needed by other systems. The transmission of this data to other systems has in the past involved several layers of human intervention. The Integrated Cargo Data Base (ICDB) developed by Martin Marietta Energy Systems for the Military Traffic Management Command as part of the Worldwide Port System provides data integration and worldwide tracking of cargo that passes through common-user ocean cargo ports. One of the key functions of ICDB is data distribution of a variety of data files to a number of other systems. Development of automated data distribution procedures had to deal with the following constraints: (1) variable generation time for data files, (2) use of only current data for data files, (3) use of a minimum number of select statements, (4) creation of unique data files for multiple recipients, (5) automatic transmission of data files to recipients, and (6) avoidance of extensive and long-term data storage.

  14. Automated Test Case Generator for Phishing Prevention Using Generative Grammars and Discriminative Methods

    ERIC Educational Resources Information Center

    Palka, Sean

    2015-01-01

    This research details a methodology designed for creating content in support of various phishing prevention tasks including live exercises and detection algorithm research. Our system uses probabilistic context-free grammars (PCFG) and variable interpolation as part of a multi-pass method to create diverse and consistent phishing email content on…
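
    A compact illustration of the PCFG-plus-interpolation idea in Python; the grammar, weights, and variable names are invented for the example and are not from the dissertation:

        import random
        from string import Template

        # Hypothetical grammar: each nonterminal maps to weighted productions.
        GRAMMAR = {
            "EMAIL":    [(1.0, ["GREETING", "BODY", "SIGNOFF"])],
            "GREETING": [(0.6, ["Dear $name,"]), (0.4, ["Hello $name,"])],
            "BODY":     [(0.5, ["Your $service account requires verification."]),
                         (0.5, ["We detected unusual activity on $service."])],
            "SIGNOFF":  [(1.0, ["Regards,", "$sender"])],
        }

        def expand(symbol, grammar):
            """Recursively expand a nonterminal, sampling productions by weight;
            anything not in the grammar is a terminal line."""
            if symbol not in grammar:
                return [symbol]
            weights, options = zip(*grammar[symbol])
            chosen = random.choices(options, weights=weights, k=1)[0]
            out = []
            for s in chosen:
                out.extend(expand(s, grammar))
            return out

        def generate_email(grammar, variables):
            """Expand the grammar, then interpolate campaign-specific variables."""
            lines = expand("EMAIL", grammar)
            return "\n".join(Template(l).safe_substitute(variables) for l in lines)

        print(generate_email(GRAMMAR, {"name": "Alex", "service": "ExampleBank",
                                       "sender": "Support Team"}))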

  15. 75 FR 21250 - Privacy Act of 1974; Systems of Records

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-23

    ... information of the current tenants of NSA/CSS facilities; to create and track the status of visit requests and... facility; to track inside the NSA/CSS facility authorized NSA/CSS employee and visitor badges as they are used to pass through automated turnstile system, access office suites and other work areas; to track...

  16. Two antenna, two pass interferometric synthetic aperture radar

    DOEpatents

    Martinez, Ana; Doerry, Armin W.; Bickel, Douglas L.

    2005-06-28

    A multi-antenna, multi-pass IFSAR mode utilizing data driven alignment of multiple independent passes can combine the scaling accuracy of a two-antenna, one-pass IFSAR mode with the height-noise performance of a one-antenna, two-pass IFSAR mode. A two-antenna, two-pass IFSAR mode can accurately estimate the larger antenna baseline from the data itself and reduce height-noise, allowing for more accurate information about target ground position locations and heights. The two-antenna, two-pass IFSAR mode can use coarser IFSAR data to estimate the larger antenna baseline. Multi-pass IFSAR can be extended to more than two (2) passes, thereby allowing true three-dimensional radar imaging from stand-off aircraft and satellite platforms.

  17. Workflow Automation: A Collective Case Study

    ERIC Educational Resources Information Center

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  18. The Incidence of Clueing in Multiple Choice Testbank Questions in Accounting: Some Evidence from Australia

    ERIC Educational Resources Information Center

    Ibbett, Nicole L.; Wheldon, Brett J.

    2016-01-01

    In 2014 Central Queensland University (CQU) in Australia banned the use of multiple choice questions (MCQs) as an assessment tool. One of the reasons given for this decision was that MCQs provide an opportunity for students to "pass" by merely guessing their answers. The mathematical likelihood of a student passing by guessing alone can…

  19. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... invalid test condition, unsafe conditions, fast pass/fail algorithms, or, in the case of the on-board... using approved fast pass or fast fail algorithms and multiple pass/fail algorithms may be used during the test cycle to eliminate false failures. The transient test procedure, including algorithms and...

  20. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... invalid test condition, unsafe conditions, fast pass/fail algorithms, or, in the case of the on-board... using approved fast pass or fast fail algorithms and multiple pass/fail algorithms may be used during the test cycle to eliminate false failures. The transient test procedure, including algorithms and...

  1. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... invalid test condition, unsafe conditions, fast pass/fail algorithms, or, in the case of the on-board... using approved fast pass or fast fail algorithms and multiple pass/fail algorithms may be used during the test cycle to eliminate false failures. The transient test procedure, including algorithms and...

  2. 40 CFR 51.357 - Test procedures and standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... invalid test condition, unsafe conditions, fast pass/fail algorithms, or, in the case of the on-board... using approved fast pass or fast fail algorithms and multiple pass/fail algorithms may be used during the test cycle to eliminate false failures. The transient test procedure, including algorithms and...

  3. Description and theory of operation of the computer by-pass system for the NASA F-8 digital fly-by-wire control system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A triplex digital flight control system was installed in a NASA F-8C airplane to provide fail-operate, full-authority control. The triplex digital computers and interface circuitry process the pilot commands and aircraft motion feedback parameters according to the selected control laws, and they output the surface commands as an analog signal to the servoelectronics for position control of the aircraft's power actuators. The system and theory of operation of the computer by-pass and servoelectronics are described, and an automated ground test for each axis is included.
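
    The abstract does not spell out the redundancy-management logic, but triplex channels are commonly voted by mid-value selection; a generic Python sketch under that assumption, not the actual F-8 implementation:

        def mid_value_select(a, b, c):
            """Mid-value selection for a triplex channel: the median command
            masks any single failed channel."""
            return sorted((a, b, c))[1]

        def monitor(channels, selected, threshold):
            """Flag channels whose output deviates from the voted value by more
            than a hypothetical tolerance."""
            return [i for i, v in enumerate(channels) if abs(v - selected) > threshold]

        cmds = [2.01, 1.98, 9.7]           # one channel has failed hard-over
        voted = mid_value_select(*cmds)     # -> 2.01
        print(voted, monitor(cmds, voted, threshold=0.5))   # -> 2.01 [2]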

  4. Polarization Control with Piezoelectric and LiNbO3 Transducers

    NASA Astrophysics Data System (ADS)

    Bradley, E.; Miles, E.; Loginov, B.; Vu, N.

    Several polarization control transducers have appeared on the market, and automated, endless polarization control systems using these transducers are now becoming available. Unfortunately, it is not entirely clear what benchmark performance tests a polarization control system must pass, and the polarization disturbances a system must handle are open to some debate. We present quantitative measurements of realistic polarization disturbances and two benchmark tests we have successfully used to evaluate the performance of an automated, endless polarization control system. We use these tests to compare the performance of a system using piezoelectric transducers to that of a system using LiNbO3 transducers.

  5. Automated process for solvent separation of organic/inorganic substance

    DOEpatents

    Schweighardt, F.K.

    1986-07-29

    There is described an automated process for the solvent separation of organic/inorganic substances that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In the process, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control apparatus. The mixture in the filter is agitated by ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process. 4 figs.
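
    Purely as an illustration of the described sequence (the patent specifies no software interface, so every device name below is hypothetical), the microprocessor-controlled cycle can be pictured as:

        from dataclasses import dataclass

        @dataclass
        class Step:
            solvent: str
            volume_ml: float
            sonicate_s: float

        class Stub:
            """Prints each device action instead of driving real hardware."""
            def __init__(self, name):
                self.name = name
            def __getattr__(self, op):
                return lambda *args, **kw: print(self.name, op, args, kw)

        def run_extraction(steps, pump, sonicator, collector):
            """Sequentially meter each solvent through the filter, agitate by
            timed ultrasonic cavitation, and collect each filtrate individually,
            finishing with the residue on the filter element."""
            for step in steps:
                pump.meter(step.solvent, step.volume_ml)
                sonicator.run(step.sonicate_s)
                collector.collect(label=step.solvent)
            collector.collect_residue()

        run_extraction([Step("toluene", 25.0, 120.0), Step("methanol", 25.0, 120.0)],
                       Stub("pump"), Stub("sonicator"), Stub("collector"))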

  6. Automated process for solvent separation of organic/inorganic substance

    DOEpatents

    Schweighardt, Frank K.

    1986-01-01

    There is described an automated process for the solvent separation of organic/inorganic substances that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In the process, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control apparatus. The mixture in the filter is agitated by ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process.

  7. Automated apparatus for solvent separation of a coal liquefaction product stream

    DOEpatents

    Schweighardt, Frank K.

    1985-01-01

    An automated apparatus for the solvent separation of a coal liquefaction product stream that operates continuously and unattended and eliminates potential errors resulting from subjectivity and the aging of the sample during analysis. In use of the apparatus, metered amounts of one or more solvents are passed sequentially through a filter containing the sample under the direction of a microprocessor control means. The mixture in the filter is agitated by means of ultrasonic cavitation for a timed period and the filtrate is collected. The filtrate of each solvent extraction is collected individually and the residue on the filter element is collected to complete the extraction process.

  8. Automated acoustic localization and call association for vocalizing humpback whales on the Navy's Pacific Missile Range Facility.

    PubMed

    Helble, Tyler A; Ierley, Glenn R; D'Spain, Gerald L; Martin, Stephen W

    2015-01-01

    Time difference of arrival (TDOA) methods for acoustically localizing multiple marine mammals have been applied to recorded data from the Navy's Pacific Missile Range Facility in order to localize and track humpback whales. Modifications to established methods were necessary in order to simultaneously track multiple animals on the range faster than real-time and in a fully automated way, while minimizing the number of incorrect localizations. The resulting algorithms were run with no human intervention at computational speeds faster than the data recording speed on over forty days of acoustic recordings from the range, spanning multiple years. Spatial localizations based on correlating sequences of units originating from within the range produce estimates having a standard deviation typically 10 m or less (due primarily to TDOA measurement errors), and a bias of 20 m or less (due primarily to sound speed mismatch). An automated method for associating units to individual whales is presented, enabling automated humpback song analyses to be performed.
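
    A minimal TDOA sketch in Python using a brute-force grid search rather than the paper's production algorithms; the sensor layout, sound speed, and grid bounds are illustrative assumptions:

        import numpy as np

        def tdoa_locate(sensors, tdoas, c=1500.0,
                        xlim=(-5000, 5000), ylim=(-5000, 5000), step=50.0):
            """Find the 2-D position whose predicted arrival-time differences
            (relative to sensor 0) best match the measured ones, least-squares."""
            xs = np.arange(xlim[0], xlim[1], step)
            ys = np.arange(ylim[0], ylim[1], step)
            X, Y = np.meshgrid(xs, ys)
            d = [np.hypot(X - sx, Y - sy) for sx, sy in sensors]
            cost = np.zeros_like(X)
            for i, tau in enumerate(tdoas, start=1):
                cost += ((d[i] - d[0]) / c - tau) ** 2
            j = np.unravel_index(np.argmin(cost), cost.shape)
            return X[j], Y[j]

        # Synthetic check: 4 hydrophones, a source at (1200, -800)
        sensors = [(0, 0), (4000, 0), (0, 4000), (4000, 4000)]
        src = np.array([1200.0, -800.0])
        dists = [np.hypot(src[0] - sx, src[1] - sy) for sx, sy in sensors]
        tdoas = [(di - dists[0]) / 1500.0 for di in dists[1:]]
        print(tdoa_locate(sensors, tdoas))   # ~ (1200.0, -800.0)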

  9. Aeroassisted orbit transfer vehicle trajectory analysis

    NASA Technical Reports Server (NTRS)

    Braun, Robert D.; Suit, William T.

    1988-01-01

    The emphasis in this study was on the use of multiple pass trajectories for aerobraking. However, for comparison, single pass trajectories, trajectories using ballutes, and trajectories corrupted by atmospheric anomalies were run. A two-pass trajectory was chosen to determine the relation between sensitivity to errors and payload to orbit. Trajectories that used only aerodynamic forces for maneuvering could put more weight into the target orbits but were very sensitive to variations from the planned trajectories. Using some thrust control resulted in less payload to orbit, but greatly reduced the sensitivity to variations from nominal trajectories. When compared to the non-thrusting trajectories investigated, the judicious use of thrusting resulted in multiple pass trajectories that gave 97 percent of the payload to orbit with almost none of the sensitivity to variations from the nominal.

  10. Smart Phase Tuning in Microwave Photonic Integrated Circuits Toward Automated Frequency Multiplication by Design

    NASA Astrophysics Data System (ADS)

    Nabavi, N.

    2018-07-01

    The author investigates monitoring methods for fine adjustment of the previously proposed on-chip architecture for frequency multiplication and translation of harmonics by design. Digital signal processing (DSP) algorithms are utilized to create an optimized microwave photonic integrated circuit functionality toward automated frequency multiplication. The implemented DSP algorithms are based on the discrete Fourier transform and on optimization algorithms (greedy and gradient-based), which are analytically derived and numerically compared in terms of accuracy and speed of convergence.
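
    The DFT-plus-greedy idea can be pictured with a toy model in which two paths contribute the same harmonic and their relative phase is tuned to maximize it; the signal model and all parameters are assumptions, not the chip's physics:

        import numpy as np

        def harmonic_power(phase, n_target=3, f0=1.0, fs=64.0, n=1024):
            """Toy measurement: magnitude of the target harmonic in the DFT of
            two interfering contributions, which add coherently at phase = 0."""
            t = np.arange(n) / fs
            sig = (np.sin(2 * np.pi * n_target * f0 * t)
                   + np.sin(2 * np.pi * n_target * f0 * t + phase))
            spec = np.abs(np.fft.rfft(sig)) / n
            return spec[int(round(n_target * f0 * n / fs))]

        def greedy_tune(measure, lo=-np.pi, hi=np.pi, iters=20):
            """Derivative-free greedy refinement: sample the interval, keep the
            best phase, shrink the interval around it, repeat."""
            best = 0.0
            for _ in range(iters):
                phis = np.linspace(lo, hi, 9)
                best = phis[int(np.argmax([measure(p) for p in phis]))]
                span = (hi - lo) / 4.0
                lo, hi = best - span, best + span
            return best

        print(greedy_tune(harmonic_power))   # converges near 0 (coherent addition)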

  11. Consolidation of Surface Coatings by Friction Stir Techniques

    DTIC Science & Technology

    2010-09-01

    alloy samples were plasma sprayed with a Titanium-Nickel-Chrome coating or a Titanium coating. Single and multiple pass experiments were performed...based coatings onto the Aluminum alloy surface. Results showed that the most successful results were accomplished using a flat, pinless tool, with...properties.

  12. Electrical characterization of standard and radiation-hardened RCA CDP1856D 4-BIT, CMOS, bus buffer/separator

    NASA Technical Reports Server (NTRS)

    Stokes, R. L.

    1979-01-01

    Tests performed to determine accuracy and efficiency of bus separators used in microprocessors are presented. Functional, AC parametric, and DC parametric tests were performed in a Tektronix S-3260 automated test system. All the devices passed the functional tests and yielded nominal values in the parametric test.

  13. Thermal effectiveness of multiple shell and tube pass TEMA E heat exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pignotti, A.; Tamborenea, P.I.

    1988-02-01

    The thermal effectiveness of a TEMA E shell-and-tube heat exchanger, with one shell pass and an arbitrary number of tube passes, is determined under the usual simplifying assumptions of perfect transverse mixing of the shell fluid, no phase change, and temperature independence of the heat capacity rates and the heat transfer coefficient. A purely algebraic solution is obtained for the effectiveness as a function of the heat capacity rate ratio and the number of heat transfer units. The case with M shell passes and N tube passes is easily expressed in terms of the single-shell-pass case.
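
    The closed-form relations behind such an algebraic solution are the standard epsilon-NTU results for one shell pass with an even number of tube passes, and for N identical shell passes combined in overall counterflow (see, e.g., Kays and London); a Python transcription, offered as a sketch consistent with, not copied from, the paper:

        import math

        def eff_one_shell_pass(ntu, c):
            """Effectiveness of one TEMA E shell pass with 2, 4, ... tube passes
            (standard epsilon-NTU relation); c is the heat capacity rate ratio."""
            g = math.sqrt(1.0 + c * c)
            e = math.exp(-ntu * g)
            return 2.0 / ((1.0 + c) + g * (1.0 + e) / (1.0 - e))

        def eff_n_shell_passes(ntu_total, c, n_shells):
            """Combine identical shell passes in an overall counterflow arrangement."""
            e1 = eff_one_shell_pass(ntu_total / n_shells, c)
            if abs(c - 1.0) < 1e-12:
                return n_shells * e1 / (1.0 + (n_shells - 1) * e1)
            r = ((1.0 - e1 * c) / (1.0 - e1)) ** n_shells
            return (r - 1.0) / (r - c)

        print(eff_n_shell_passes(ntu_total=4.0, c=0.75, n_shells=2))   # ~0.80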

  14. STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.

    PubMed

    Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X

    2009-08-01

    This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.

  15. Automated assembly of oligosaccharides containing multiple cis-glycosidic linkages

    NASA Astrophysics Data System (ADS)

    Hahm, Heung Sik; Hurevich, Mattan; Seeberger, Peter H.

    2016-09-01

    Automated glycan assembly (AGA) has advanced from a concept to a commercial technology that rapidly provides access to diverse oligosaccharide chains as long as 30-mers. To date, AGA was mainly employed to incorporate trans-glycosidic linkages, where C2 participating protecting groups ensure stereoselective couplings. Stereocontrol during the installation of cis-glycosidic linkages cannot rely on C2-participation and anomeric mixtures are typically formed. Here, we demonstrate that oligosaccharides containing multiple cis-glycosidic linkages can be prepared efficiently by AGA using monosaccharide building blocks equipped with remote participating protecting groups. The concept is illustrated by the automated syntheses of biologically relevant oligosaccharides bearing various cis-galactosidic and cis-glucosidic linkages. This work provides further proof that AGA facilitates the synthesis of complex oligosaccharides with multiple cis-linkages and other biologically important oligosaccharides.

  16. Automated Historical and Real-Time Cyclone Discovery With Multimodal Remote Satellite Measurements

    NASA Astrophysics Data System (ADS)

    Ho, S.; Talukder, A.; Liu, T.; Tang, W.; Bingham, A.

    2008-12-01

    Existing cyclone detection and tracking solutions involve extensive manual analysis of modeled data and field campaign data by teams of experts. We have developed a novel automated global cyclone detection and tracking system by assimilating and sharing information from multiple remote satellites. This unprecedented solution of combining multiple remote satellite measurements in an autonomous manner allows leveraging the strengths of each individual satellite. Use of multiple satellite data sources also results in significantly improved temporal tracking accuracy for cyclones. Our solution involves an automated feature extraction and machine learning technique based on an ensemble classifier and Kalman filter for cyclone detection and tracking from multiple heterogeneous satellite data sources. Our feature-based methodology that focuses on automated cyclone discovery is fundamentally different from, and actually complements, the well-known Dvorak technique for cyclone intensity estimation (which often relies on manual detection of cyclonic regions) from field and remote data. Our solution currently employs the QuikSCAT wind measurement and the merged level 3 TRMM precipitation data for automated cyclone discovery. Assimilation of other types of remote measurements is ongoing and planned in the near future. Experimental results of our automated solution on historical cyclone datasets demonstrate the superior performance of our automated approach compared to previous work. Performance of our detection solution compares favorably against the list of cyclones occurring in the North Atlantic Ocean for the 2005 calendar year reported by the National Hurricane Center (NHC) in our initial analysis. We have also demonstrated the robustness of our cyclone tracking methodology in other regions of the world by using multiple heterogeneous satellite data for detection and tracking of three arbitrary historical cyclones in other regions. Our cyclone detection and tracking methodology can be applied to (i) historical data, to support Earth scientists in climate modeling and cyclone-climate interactions and to obtain a better understanding of the causes and effects of cyclones (e.g., cyclogenesis), and (ii) automatic cyclone discovery in near real time using streaming satellite data, to support and improve the planning of global cyclone field campaigns. Additional satellite data from GOES and other orbiting satellites can be easily assimilated and integrated into our automated cyclone detection and tracking module to improve the temporal tracking accuracy of cyclones down to ½ hr and reduce the incidence of false alarms.
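
    The tracking stage can be pictured as a constant-velocity Kalman filter over successive detections; the state layout and noise levels below are illustrative assumptions, not the system's tuned values:

        import numpy as np

        def kalman_track(detections, dt=1.0, q=1e-3, r=0.25):
            """Constant-velocity Kalman filter over (lat, lon) detections from
            successive satellite passes. State: [lat, lon, vlat, vlon]."""
            F = np.eye(4); F[0, 2] = F[1, 3] = dt          # state transition
            H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0  # observe position only
            Q, R = q * np.eye(4), r * np.eye(2)
            x = np.array([*detections[0], 0.0, 0.0])
            P = np.eye(4)
            track = [x[:2].copy()]
            for z in detections[1:]:
                x = F @ x; P = F @ P @ F.T + Q              # predict
                y = np.asarray(z) - H @ x                   # innovation
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
                x = x + K @ y; P = (np.eye(4) - K @ H) @ P  # update
                track.append(x[:2].copy())
            return np.array(track)

        obs = [(10.0, -40.0), (10.3, -41.1), (10.7, -42.0), (11.2, -43.2)]
        print(kalman_track(obs))   # smoothed storm-centre track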

  17. Corner heating in rectangular solid oxide electrochemical cell generators

    DOEpatents

    Reichner, Philip

    1989-01-01

    Disclosed is an improvement in a solid oxide electrochemical cell generator 1 having a rectangular design with four sides that meet at corners, and containing a multiplicity of electrically connected fuel cells 11, where a fuel gas is passed over one side of said cells and an oxygen containing gas is passed into said cells, and said fuel is burned to form heat, electricity, and an exhaust gas. The improvement comprises passing the exhaust gases over the multiplicity of cells 11 in such a way that more of the heat in said exhaust gases flows at the corners of the generator, such as through channels 19.

  18. Multiplexed detection of mycotoxins in foods with a regenerable array.

    PubMed

    Ngundi, Miriam M; Shriver-Lake, Lisa C; Moore, Martin H; Ligler, Frances S; Taitt, Chris R

    2006-12-01

    The occurrence of different mycotoxins in cereal products calls for the development of a rapid, sensitive, and reliable detection method that is capable of analyzing samples for multiple toxins simultaneously. In this study, we report the development and application of a multiplexed competitive assay for the simultaneous detection of ochratoxin A (OTA) and deoxynivalenol (DON) in spiked barley, cornmeal, and wheat, as well as in naturally contaminated maize samples. Fluoroimmunoassays were performed with the Naval Research Laboratory array biosensor, by both a manual and an automated version of the system. This system employs evanescent-wave fluorescence excitation to probe binding events as they occur on the surface of a waveguide. Methanolic extracts of the samples were diluted threefold with buffer containing a mixture of fluorescent antibodies and were then passed over the arrays of mycotoxins immobilized on a waveguide. Fluorescent signals of the surface-bound antibody-antigen complexes decreased with increasing concentrations of free mycotoxins in the extract. After sample analysis was completed, surfaces were regenerated with 6 M guanidine hydrochloride in 50 mM glycine, pH 2.0. The limits of detection determined by the manual biosensor system were as follows: 1, 180, and 65 ng/g for DON and 1, 60, and 85 ng/g for OTA in cornmeal, wheat, and barley, respectively. The limits of detection in cornmeal determined with the automated array biosensor were 15 and 150 ng/g for OTA and DON, respectively.

  19. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a framework for fully automated safety verification of Stateflow models. Our approach is two-fold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing Mathworks Simulink Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial scale Stateflow models.

  20. Sphere based fluid systems

    NASA Technical Reports Server (NTRS)

    Elleman, Daniel D. (Inventor); Wang, Taylor G. (Inventor)

    1989-01-01

    Systems are described for using multiple closely-packed spheres. In one system for passing fluid, a multiplicity of spheres lie within a container, with all of the spheres having the same outside diameter and with the spheres being closely nested in one another to create multiple interstitial passages of a known size and configuration and smooth walls. The container has an inlet and outlet for passing fluid through the interstitial passages formed between the nested spheres. The small interstitial passages can be used to filter out material, especially biological material such as cells in a fluid, where the cells can be easily destroyed if passed across sharp edges. The outer surface of the spheres can contain a material that absorbs a constituent in the flowing fluid, such as a particular contamination gas, or can contain a catalyst to chemically react the fluid passing therethrough, the use of multiple small spheres assuring a large area of contact between the surfaces of the spheres and the fluid. In a system for storing and releasing a fluid such as hydrogen as a fuel, the spheres can include a hollow shell containing the fluid to be stored, and be located within a compressible container that can be compressed to break the shells and release the stored fluid.
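
    The claim that the interstitial passages have a known size can be made concrete for the tightest constriction in such a packing; as a hedged illustration, assuming the triangular throat between three mutually touching equal spheres, the sphere centres form an equilateral triangle of side 2R with circumradius 2R/sqrt(3), so the widest circular channel through the throat has radius

        \[
          r_{\mathrm{throat}} = \frac{2R}{\sqrt{3}} - R
                              = \left(\frac{2}{\sqrt{3}} - 1\right) R
                              \approx 0.155\,R,
        \]

    which is why a bed of equal spheres of radius R acts as a gentle filter for particles larger than roughly 0.155 R.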

  1. Three-Dimensional Echocardiographic Assessment of Left Heart Chamber Size and Function with Fully Automated Quantification Software in Patients with Atrial Fibrillation.

    PubMed

    Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki

    2016-10-01

    Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.

  2. Quantification of the first-order high-pass filter's influence on the automatic measurements of the electrocardiogram.

    PubMed

    Isaksen, Jonas; Leber, Remo; Schmid, Ramun; Schmid, Hans-Jakob; Generali, Gianluca; Abächerli, Roger

    2017-02-01

    The first-order high-pass filter (AC coupling) has previously been shown to affect the ECG at higher cut-off frequencies. We seek to find a systematic deviation in computer measurements of the electrocardiogram when AC coupling with a 0.05 Hz first-order high-pass filter is used. The standard 12-lead electrocardiograms from 1248 patients and the automated measurements of their DC- and AC-coupled versions were used. We expect a large unipolar QRS-complex to produce a deviation in the opposite direction in the ST-segment. We found a strong correlation between the QRS integral and the offset throughout the ST-segment. The coefficient for the J amplitude deviation was found to be -0.277 µV/(µV⋅s). Potentially dangerous alterations to the diagnostically important ST-segment were found. Medical professionals and software developers for electrocardiogram interpretation programs should be aware of such high-pass filter effects, since they could be misinterpreted as pathophysiology, or some pathophysiology could be masked by these effects. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
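
    A self-contained Python sketch of the mechanism: a first-order 0.05 Hz high-pass applied to a crude rectangular stand-in for a QRS complex leaves a post-pulse baseline offset proportional to the pulse integral, of the same order as the coefficient reported above (the pulse shape and sampling rate are assumptions):

        import numpy as np

        def highpass_first_order(x, fs, fc=0.05):
            """AC coupling as the standard RC high-pass difference equation
            y[n] = a * (y[n-1] + x[n] - x[n-1]),  a = RC / (RC + 1/fs)."""
            rc = 1.0 / (2.0 * np.pi * fc)
            a = rc / (rc + 1.0 / fs)
            y = np.zeros_like(x)
            for n in range(1, len(x)):
                y[n] = a * (y[n - 1] + x[n] - x[n - 1])
            return y

        fs = 500.0                                # Hz, assumed sampling rate
        t = np.arange(int(2 * fs)) / fs
        ecg = np.zeros_like(t)
        ecg[(t > 0.5) & (t < 0.6)] = 1000.0       # ~1 mV, 100 ms rectangular "QRS"
        filt = highpass_first_order(ecg, fs)
        area = ecg.sum() / fs                     # QRS integral in uV*s
        offset = filt[int(0.7 * fs)]              # baseline shortly after the pulse
        print(area, offset, offset / area)        # ratio ~ -0.3 uV per uV*s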

  3. Assessing the efficacy of single-pass backpack electrofishing to characterize fish community structure

    USGS Publications Warehouse

    Meador, M.R.; McIntyre, J.P.; Pollock, K.H.

    2003-01-01

    Two-pass backpack electrofishing data collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program were analyzed to assess the efficacy of single-pass backpack electrofishing. A two-capture removal model was used to estimate, within 10 river basins across the United States, proportional fish species richness from one-pass electrofishing and probabilities of detection for individual fish species. Mean estimated species richness from first-pass sampling (ps1) ranged from 80.7% to 100% of estimated total species richness for each river basin, based on at least seven samples per basin. However, ps1 values for individual sites ranged from 40% to 100% of estimated total species richness. Additional species unique to the second pass were collected in 50.3% of the samples. Of these, cyprinids and centrarchids were collected most frequently. Proportional fish species richness estimated for the first pass increased significantly with decreasing stream width for 1 of the 10 river basins. When used to calculate probabilities of detection of individual fish species, the removal model failed 48% of the time because the number of individuals of a species was greater in the second pass than in the first pass. Single-pass backpack electrofishing data alone may make it difficult to determine whether characterized fish community structure data are real or spurious. The two-pass removal model can be used to assess the effectiveness of sampling species richness with a single electrofishing pass. However, the two-pass removal model may have limited utility to determine probabilities of detection of individual species and, thus, limit the ability to assess the effectiveness of single-pass sampling to characterize species relative abundances. Multiple-pass (at least three passes) backpack electrofishing at a large number of sites may not be cost-effective as part of a standardized sampling protocol for large-geographic-scale studies. However, multiple-pass electrofishing at some sites may be necessary to better evaluate the adequacy of single-pass electrofishing and to help make meaningful interpretations of fish community structure.
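
    The two-capture removal model referenced here has a standard closed form (Moran-Zippin, as given by Seber and Le Cren): with first- and second-pass catches c1 and c2, the capture probability is p = 1 - c2/c1 and the abundance estimate is N = c1^2/(c1 - c2), which fails exactly when c2 >= c1, the failure mode noted above. A small Python sketch:

        def two_pass_removal(c1, c2):
            """Two-capture removal estimates: capture probability p and
            abundance N. Raises when the model fails (c2 >= c1)."""
            if c2 >= c1:
                raise ValueError("removal model fails: second-pass catch >= first-pass")
            p = 1.0 - c2 / c1
            n_hat = c1 * c1 / (c1 - c2)
            return p, n_hat

        print(two_pass_removal(40, 10))   # p = 0.75, N ~ 53.3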

  4. CNN for breaking text-based CAPTCHA with noise

    NASA Astrophysics Data System (ADS)

    Liu, Kaixuan; Zhang, Rong; Qing, Ke

    2017-07-01

    A CAPTCHA ("Completely Automated Public Turing test to tell Computers and Human Apart") system is a program that most humans can pass but current computer programs could hardly pass. As the most common type of CAPTCHAs , text-based CAPTCHA has been widely used in different websites to defense network bots. In order to breaking textbased CAPTCHA, in this paper, two trained CNN models are connected for the segmentation and classification of CAPTCHA images. Then base on these two models, we apply sliding window segmentation and voting classification methods realize an end-to-end CAPTCHA breaking system with high success rate. The experiment results show that our method is robust and effective in breaking text-based CAPTCHA with noise.

  5. Addressing Drop-Out and Sustained Effort Issues with Large Practical Groups Using an Automated Delivery and Assessment System

    ERIC Educational Resources Information Center

    de-la-Fuente-Valentin, Luis; Pardo, Abelardo; Kloos, Carlos Delgado

    2013-01-01

    The acquisition of programming skills, especially in introductory programming courses, poses an important challenge for freshman students of engineering programs. These courses require students to devote a sustained effort during the whole course, and a failure to do so may contribute to not passing the course. However, it is difficult for the…

  6. Automated platform for designing multiple robot work cells

    NASA Astrophysics Data System (ADS)

    Osman, N. S.; Rahman, M. A. A.; Rahman, A. A. Abdul; Kamsani, S. H.; Bali Mohamad, B. M.; Mohamad, E.; Zaini, Z. A.; Rahman, M. F. Ab; Mohamad Hatta, M. N. H.

    2017-06-01

    Designing multiple robot work cells is a knowledge-intensive, intricate, and time-consuming process. This paper elaborates the development process of a computer-aided design program for generating multiple robot work cells that offers a user-friendly interface. The primary purpose of this work is to provide a fast and easy platform with lower cost and human involvement and minimal trial-and-error adjustment. The automated platform is constructed based on the variant-shaped configuration concept with its mathematical model. A robot work cell layout, system components, and the construction procedure of the automated platform are discussed in this paper, where the integration of these items will be able to automatically provide the optimum robot work cell design according to the information set by the user. This system is implemented on top of CATIA V5 software and utilises its Part Design, Assembly Design, and Macro tools. The current outcomes of this work provide a basis for future investigation in developing a flexible configuration system for multiple robot work cells.

  7. Prototype automated post-MECO ascent I-load Verification Data Table

    NASA Technical Reports Server (NTRS)

    Lardas, George D.

    1990-01-01

    A prototype automated processor for quality assurance of Space Shuttle post-Main Engine Cut Off (MECO) ascent initialization parameters (I-loads) is described. The processor incorporates CLIPS rules adapted from the quality assurance criteria for the post-MECO ascent I-loads. Specifically, the criteria are implemented for nominal and abort targets, as given in the 'I-load Verification Data Table, Part 3, Post-MECO Ascent, Version 2.1, December 1989.' This processor, ivdt, compares a given I-load set with the stated mission design and quality assurance criteria. It determines which I-loads violate the stated criteria, and presents a summary of I-loads that pass or fail the tests.

  8. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    NASA Astrophysics Data System (ADS)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

    The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) versus 2D acquisitions are well-known. Nonetheless, the amount of acquired data for such 3D acquisitions does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low impact mini-trench" technique (addressed in ITU - International Telecommunication Union - recommendation L.83) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays, multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high quality subsurface images, taking full advantage of 3D data: the development of a fully automated and real-time 3D GPR processing system plays a key role in overall optical network deployment profitability. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets and which can be applied in real time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency). The proposed processing scheme takes advantage of 3D data multiplicity by continuous real-time data focusing. Pre-stack reflection angle gathers G(x, θ; v) are computed at nv different velocities (by means of Kirchhoff depth-migration kernels, which can naturally cope with any acquisition pattern and handle irregular sampling issues). It must be noted that the analysis of pre-stack reflection angle gathers plays a key role in automated detection: targets are identified and the best local propagation velocities are recovered through a correlation estimate computed for all the nv reflection angle gathers. Indeed, the data redundancy of 3D GPR acquisitions greatly improves the reliability of the proposed automatic detection. The goal of real-time automated processing has been pursued without the need for specific high-performance processing hardware (a simple laptop is sufficient). Moreover, the automation of the entire surveying process allows high-quality and repeatable results to be obtained without the need for skilled interpreters. The proposed acquisition procedure has been extensively tested: more than 100 km of acquired data prove the feasibility of the proposed approach.

  9. Integrating Automation into a Multi-Mission Operations Center

    NASA Technical Reports Server (NTRS)

    Surka, Derek M.; Jones, Lori; Crouse, Patrick; Cary, Everett A, Jr.; Esposito, Timothy C.

    2007-01-01

    NASA Goddard Space Flight Center's Space Science Mission Operations (SSMO) Project is currently tackling the challenge of minimizing ground operations costs for multiple satellites that have surpassed their prime mission phase and are well into extended mission. These missions are being reengineered into a multi-mission operations center built around modern information technologies and a common ground system infrastructure. The effort began with the integration of four SMEX missions into a similar architecture that provides command and control capabilities and demonstrates fleet automation and control concepts as a pathfinder for additional mission integrations. The reengineered ground system, called the Multi-Mission Operations Center (MMOC), is now undergoing a transformation to support other SSMO missions, which include SOHO, Wind, and ACE. This paper presents the automation principles and lessons learned to date for integrating automation into an existing operations environment for multiple satellites.

  10. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .../Automated Credit Account File Structure This is the structure of the data file to provide information to the... remainder of the data fields defined below should be populated. For data provided in the Sweep/Automated... number. The Account Identifier may be composed of more than one physical data element. If multiple fields...

  11. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .../Automated Credit Account File Structure This is the structure of the data file to provide information to the... remainder of the data fields defined below should be populated. For data provided in the Sweep/Automated... number. The Account Identifier may be composed of more than one physical data element. If multiple fields...

  12. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .../Automated Credit Account File Structure This is the structure of the data file to provide information to the... remainder of the data fields defined below should be populated. For data provided in the Sweep/Automated... number. The Account Identifier may be composed of more than one physical data element. If multiple fields...

  13. 12 CFR Appendix D to Part 360 - Sweep/Automated Credit Account File Structure

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .../Automated Credit Account File Structure This is the structure of the data file to provide information to the... remainder of the data fields defined below should be populated. For data provided in the Sweep/Automated... number. The Account Identifier may be composed of more than one physical data element. If multiple fields...

  14. Fundamental Use of Surgical Energy (FUSE) certification: validation and predictors of success.

    PubMed

    Robinson, Thomas N; Olasky, Jaisa; Young, Patricia; Feldman, Liane S; Fuchshuber, Pascal R; Jones, Stephanie B; Madani, Amin; Brunt, Michael; Mikami, Dean; Jackson, Gretchen P; Mischna, Jessica; Schwaitzberg, Steven; Jones, Daniel B

    2016-03-01

    The Fundamental Use of Surgical Energy (FUSE) program includes a Web-based didactic curriculum and a high-stakes multiple-choice question examination with the goal to provide certification of knowledge on the safe use of surgical energy-based devices. The purpose of this study was (1) to set a passing score through a psychometrically sound process and (2) to determine what pretest factors predicted passing the FUSE examination. Beta-testing of multiple-choice questions on 62 topics of importance to the safe use of surgical energy-based devices was performed. Eligible test takers were physicians with a minimum of 1 year of surgical training who were recruited by FUSE task force members. A pretest survey collected baseline information. A total of 227 individuals completed the FUSE beta-test, and 208 completed the pretest survey. The passing/cut score for the first test form of the FUSE multiple-choice examination was determined using the modified Angoff methodology and for the second test form was determined using a linear equating methodology. The overall passing rate across the two examination forms was 81.5%. Self-reported time studying the FUSE Web-based curriculum for a minimum of >2 h was associated with a passing examination score (p < 0.001). Performance was not different based on increased years of surgical practice (p = 0.363), self-reported expertise on one or more types of energy-based devices (p = 0.683), participation in the FUSE postgraduate course (p = 0.426), or having reviewed the FUSE manual (p = 0.428). Logistic regression found that studying the FUSE didactics for >2 h predicted a passing score (OR 3.61; 95% CI 1.44-9.05; p = 0.006) independent of the other baseline characteristics recorded. The development of the FUSE examination, including the passing score, followed a psychometrically sound process. Self-reported time studying the FUSE curriculum predicted a passing score independent of other pretest characteristics such as years in practice and self-reported expertise.

  16. Trellis Tone Modulation Multiple-Access for Peer Discovery in D2D Networks.

    PubMed

    Lim, Chiwoo; Jang, Min; Kim, Sang-Hyo

    2018-04-17

    In this paper, a new non-orthogonal multiple-access scheme, trellis tone modulation multiple-access (TTMMA), is proposed for peer discovery of distributed device-to-device (D2D) communication. The range and capacity of discovery are important performance metrics in peer discovery. The proposed trellis tone modulation uses single-tone transmission and achieves a long discovery range due to its low Peak-to-Average Power Ratio (PAPR). The TTMMA also exploits non-orthogonal resource assignment to increase the discovery capacity. For the multi-user detection of superposed multiple-access signals, a message-passing algorithm with supplementary schemes is proposed. With TTMMA and its message-passing demodulation, approximately 1.5 times as many devices are discovered as with conventional frequency division multiple-access (FDMA)-based discovery.

  17. Poster - 09: A MATLAB-based Program for Automated Quality Assurance of a Prostate Brachytherapy Ultrasound System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poon, Justin; Sabondjian, Eric; Sankreacha, Raxa

    Purpose: A robust Quality Assurance (QA) program is essential for prostate brachytherapy ultrasound systems due to the importance of imaging accuracy during treatment and planning. Task Group 128 of the American Association of Physicists in Medicine has recommended a set of QA tests covering grayscale visibility, depth of penetration, axial and lateral resolution, distance measurement, area measurement, volume measurement, and template/electronic grid alignment. Making manual measurements on the ultrasound system can be slow and inaccurate, so a MATLAB program was developed for automation of the described tests. Methods: Test images were acquired using a BK Medical Flex Focus 400 ultrasound scanner and 8848 transducer with the CIRS Brachytherapy QA Phantom – Model 045A. For each test, the program automatically segments the inputted image(s), makes the appropriate measurements, and indicates if the test passed or failed. The program was tested by analyzing two sets of images, where the measurements from the first set were used as baseline values. Results: The program successfully analyzed the images for each test and determined if any action limits were exceeded. All tests passed – the measurements made by the program were consistent and met the requirements outlined by Task Group 128. Conclusions: The MATLAB program we have developed can be used for automated QA of an ultrasound system for prostate brachytherapy. The GUI provides a user-friendly way to analyze images without the need for any manual measurement, potentially removing intra- and inter-user variability for more consistent results.
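
    As a concrete illustration of the pass/fail logic this abstract describes, here is a minimal Python sketch (not the authors' MATLAB code): each measured value is compared against a stored baseline within an action limit. The test names, baseline values, and tolerances below are illustrative assumptions.

      # Hedged sketch: compare QA measurements against baselines within action limits.
      # All names, baselines, and limits are illustrative assumptions.
      BASELINE = {"distance_mm": 40.0, "axial_res_mm": 0.5, "lateral_res_mm": 1.0}
      ACTION_LIMIT = {"distance_mm": 0.02, "axial_res_mm": 0.20, "lateral_res_mm": 0.20}  # relative

      def qa_check(measurements):
          """Return {test: (measured, baseline, passed)} for each QA test."""
          results = {}
          for test, measured in measurements.items():
              base = BASELINE[test]
              tol = ACTION_LIMIT[test] * base
              results[test] = (measured, base, abs(measured - base) <= tol)
          return results

      for test, (m, b, ok) in qa_check({"distance_mm": 40.3, "axial_res_mm": 0.7,
                                        "lateral_res_mm": 1.05}).items():
          print(f"{test}: {m} vs baseline {b} -> {'PASS' if ok else 'FAIL'}")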

  18. Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification

    DOT National Transportation Integrated Search

    2011-04-29

    For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities, quantifying the performance of multiple AUT...

  19. BEHAVIORAL COACHING TO IMPROVE OFFENSIVE LINE PASS-BLOCKING SKILLS OF HIGH SCHOOL FOOTBALL ATHLETES

    PubMed Central

    Stokes, John V; Luiselli, James K; Reed, Derek D; Fleming, Richard K

    2010-01-01

    We evaluated several behavioral coaching procedures for improving offensive line pass-blocking skills with 5 high school varsity football players. Pass blocking was measured during practice drills and games, and our intervention included descriptive feedback with and without video feedback and teaching with acoustical guidance (TAG). Intervention components and pass blocking were evaluated in a multiple baseline design, which showed that video feedback and TAG were the most effective procedures. For all players, improved pass blocking matched a standard derived by observing more experienced linemen and was evident in games. Additional intervention was required to maintain pass-blocking proficiency. Issues pertinent to behavioral coaching and sport psychology research are discussed. PMID:21358905

  20. Automated biosurveillance data from England and Wales, 1991-2011.

    PubMed

    Enki, Doyo G; Noufaily, Angela; Garthwaite, Paul H; Andrews, Nick J; Charlett, André; Lane, Chris; Farrington, C Paddy

    2013-01-01

    Outbreak detection systems for use with very large multiple surveillance databases must be suited both to the data available and to the requirements of full automation. To inform the development of more effective outbreak detection algorithms, we analyzed 20 years of data (1991-2011) from a large laboratory surveillance database used for outbreak detection in England and Wales. The data relate to 3,303 distinct types of infectious pathogens, with a frequency range spanning 6 orders of magnitude. Several hundred organism types were reported each week. We describe the diversity of seasonal patterns, trends, artifacts, and extra-Poisson variability to which an effective multiple laboratory-based outbreak detection system must adjust. We provide empirical information to guide the selection of simple statistical models for automated surveillance of multiple organisms, in the light of the key requirements of such outbreak detection systems, namely, robustness, flexibility, and sensitivity.
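
    For illustration, a minimal Python exceedance detector in the spirit the abstract describes (a sketch, not the algorithm actually used on these data): the expected weekly count is taken from historical baseline weeks, the variance is inflated by a quasi-Poisson dispersion factor to absorb the extra-Poisson variability the authors note, and an alarm fires when the current count exceeds the resulting threshold.

      # Sketch of a historical-limits detector with quasi-Poisson variance inflation.
      # The z value and baseline window are illustrative assumptions.
      import math

      def exceedance_alarm(current, baseline_counts, z=3.0):
          n = len(baseline_counts)
          mu = sum(baseline_counts) / n
          # quasi-Poisson dispersion: variance/mean ratio, floored at 1 (pure Poisson)
          var = sum((c - mu) ** 2 for c in baseline_counts) / max(n - 1, 1)
          phi = max(var / mu, 1.0) if mu > 0 else 1.0
          threshold = mu + z * math.sqrt(phi * mu)
          return current > threshold, threshold

      alarm, thr = exceedance_alarm(current=19, baseline_counts=[4, 6, 5, 7, 3, 6, 5, 4])
      print(f"threshold={thr:.1f}, alarm={alarm}")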

  1. Automated Biosurveillance Data from England and Wales, 1991–2011

    PubMed Central

    Enki, Doyo G.; Noufaily, Angela; Garthwaite, Paul H.; Andrews, Nick J.; Charlett, André; Lane, Chris

    2013-01-01

    Outbreak detection systems for use with very large multiple surveillance databases must be suited both to the data available and to the requirements of full automation. To inform the development of more effective outbreak detection algorithms, we analyzed 20 years of data (1991–2011) from a large laboratory surveillance database used for outbreak detection in England and Wales. The data relate to 3,303 distinct types of infectious pathogens, with a frequency range spanning 6 orders of magnitude. Several hundred organism types were reported each week. We describe the diversity of seasonal patterns, trends, artifacts, and extra-Poisson variability to which an effective multiple laboratory-based outbreak detection system must adjust. We provide empirical information to guide the selection of simple statistical models for automated surveillance of multiple organisms, in the light of the key requirements of such outbreak detection systems, namely, robustness, flexibility, and sensitivity. PMID:23260848

  2. Rotor noise due to blade-turbulence interaction

    NASA Astrophysics Data System (ADS)

    Ishimaru, K.

    1983-01-01

    The time-averaged intensity density function of the acoustic radiation from rotating blades is derived by replacing blades with rotating dipoles. This derivation is done under the following turbulent inflow conditions: turbulent ingestion with no inlet strut wakes, inflow turbulence elongation and contraction with no inlet strut wakes, and inlet strut wakes. Dimensional analysis reveals two non-dimensional parameters which play important roles in generating the blade-passing frequency tone and its multiples. The elongation and contraction of inflow turbulence has a strong effect on the generation of the blade-passing frequency tone and its multiples. Increasing the number of rotor blades widens the peak at the blade-passing frequency and its multiples. Increasing the rotational speed widens the peak under the condition that the non-dimensional parameter involving the rotational speed is fixed. The number of struts and blades should be chosen so that (the least common multiple of them) × (rotational speed) is in the cutoff range of Sears' function, in order to minimize the effect of the mean flow deficit on the time-averaged intensity density function.

  3. Public health surveillance of automated external defibrillators in the USA: protocol for the dynamic automated external defibrillator registry study

    PubMed Central

    Elrod, JoAnn Broeckel; Merchant, Raina; Daya, Mohamud; Youngquist, Scott; Salcido, David; Valenzuela, Terence; Nichol, Graham

    2017-01-01

    Introduction Lay use of automated external defibrillators (AEDs) before the arrival of emergency medical services (EMS) providers on scene increases survival after out-of-hospital cardiac arrest (OHCA). However, AEDs placed in public locations may not be ready for use when needed. We describe a protocol for AED surveillance that tracks these devices through time and space to improve public health and survival, as well as to facilitate research. Methods and analysis Included AEDs are installed in public locations for use by laypersons to treat patients with OHCA before the arrival of EMS providers on scene. Included cases of OHCA are patients evaluated by organised EMS personnel and treated for OHCA. Enrolment of 10 000 AEDs annually will yield precision of 0.4% in the estimate of readiness for use. Enrolment of 2500 patients annually will yield precision of 1.9% in the estimate of survival to hospital discharge. Recruitment began on 21 Mar 2014 and is ongoing. AEDs are found by using multiple methods. Each AED is then tagged with a label which is a unique two-dimensional (2D) matrix code; the 2D matrix code is recorded and the location and status of the AED tracked using a smartphone; these elements are automatically passed via the internet to a secure and confidential database in real time. Whenever the 2D matrix code is rescanned for any non-clinical or clinical use of an AED, the user is queried to answer a finite set of questions about the device status. The primary outcome of any clinical use of an AED is survival to hospital discharge. Results are summarised descriptively. Ethics and dissemination These activities are conducted under a grant of authority for public health surveillance from the Food and Drug Administration. Results are provided periodically to participating sites and sponsors to improve public health and quality of care. PMID:28360255
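
    The quoted precision targets are consistent with the usual normal-approximation half-width of a 95% binomial confidence interval, 1.96·sqrt(p(1−p)/n). A quick check in Python; the assumed proportions below are illustrative, not values reported in the protocol.

      # Half-width of a 95% binomial CI; p values are illustrative assumptions.
      import math

      def half_width(p, n, z=1.96):
          return z * math.sqrt(p * (1.0 - p) / n)

      print(f"{half_width(0.96, 10000):.2%}")  # ~0.4% if AED readiness is near 96%
      print(f"{half_width(0.30, 2500):.2%}")   # ~1.8%, near the protocol's 1.9% figure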

  4. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    USGS Publications Warehouse

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.
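
    The photogrammetric length measurement reduces to a scale calibration. Below is a Python sketch of the assumed workflow (not the authors' software): a reference object of known size in the camera's field of view fixes a mm-per-pixel scale, which converts an eel's digitized pixel length to total length (TL). All numbers are toy values.

      # Hedged sketch: pixel-to-millimeter conversion for photogrammetric TL.
      def mm_per_pixel(ref_length_mm, ref_length_px):
          return ref_length_mm / ref_length_px

      def total_length_mm(eel_length_px, scale_mm_per_px):
          return eel_length_px * scale_mm_per_px

      scale = mm_per_pixel(ref_length_mm=100.0, ref_length_px=250.0)  # 0.4 mm/px
      print(f"TL = {total_length_mm(1150.0, scale):.0f} mm")          # -> TL = 460 mm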

  5. Facilitating relational framing in children and individuals with developmental delay using the relational completion procedure.

    PubMed

    Walsh, Sinead; Horgan, Jennifer; May, Richard J; Dymond, Simon; Whelan, Robert

    2014-01-01

    The Relational Completion Procedure is effective for establishing same, opposite and comparative derived relations in verbally able adults, but to date it has not been used to establish relational frames in young children or those with developmental delay. In Experiment 1, the Relational Completion Procedure was used with the goal of establishing two 3-member sameness networks in nine individuals with Autism Spectrum Disorder (eight with language delay). A multiple exemplar intervention was employed to facilitate derived relational responding when required. Seven of nine participants in Experiment 1 passed tests for derived relations. In Experiment 2, eight participants (all of whom, except one, had a verbal repertoire) were given training with the aim of establishing two 4-member sameness networks. Three of these participants were typically developing young children aged between 5 and 6 years old, all of whom demonstrated derived relations, as did four of the five participants with developmental delay. These data demonstrate that it is possible to reliably establish derived relations in young children and those with developmental delay using an automated procedure. © Society for the Experimental Analysis of Behavior.

  6. Two pass method and radiation interchange processing when applied to thermal-structural analysis of large space truss structures

    NASA Technical Reports Server (NTRS)

    Warren, Andrew H.; Arelt, Joseph E.; Lalicata, Anthony L.; Rogers, Karen M.

    1993-01-01

    A method of efficient and automated thermal-structural processing of very large space structures is presented. The method interfaces the finite element and finite difference techniques. It also results in a pronounced reduction of the quantity of computations, computer resources and manpower required for the task, while assuring the desired accuracy of the results.

  7. Xenon International Automated Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-08-05

    The Xenon International Automated Control software monitors, displays status, and allows for manual operator control as well as fully automatic control of multiple commercial and PNNL designed hardware components to generate and transmit atmospheric radioxenon concentration measurements every six hours.

  8. Mission control of multiple unmanned aerial vehicles: a workload analysis.

    PubMed

    Dixon, Stephen R; Wickens, Christopher D; Chang, Dervon

    2005-01-01

    Thirty-six licensed pilots flew both single-UAV and dual-UAV (unmanned aerial vehicle) simulated military missions. Pilots were required to navigate each UAV through a series of mission legs in one of the following three conditions: a baseline condition, an auditory autoalert condition, and an autopilot condition. Pilots were responsible for (a) mission completion, (b) target search, and (c) systems monitoring. Results revealed that both the autoalert and the autopilot automation improved overall performance by reducing task interference and alleviating workload. The autoalert system benefited performance both in the automated task and mission completion task, whereas the autopilot system benefited performance in the automated task, the mission completion task, and the target search task. Practical implications for the study include the suggestion that reliable automation can help alleviate task interference and reduce workload, thereby allowing pilots to better handle concurrent tasks during single- and multiple-UAV flight control.

  9. Change in methodology for collection of drinking water intake in What We Eat in America/National Health and Nutrition Examination Survey: implications for analysis.

    PubMed

    Sebastian, Rhonda S; Wilkinson Enns, Cecilia; Goldman, Joseph D; Moshfegh, Alanna J

    2012-07-01

    To provide updated estimates of drinking water intake (total, tap, plain bottled) for groups aged ≥1 year in the USA and to determine whether intakes collected in 2005-2006 using the Automated Multiple-Pass Method for the 24 h recall differ from intakes collected in 2003-2004 via post-recall food-frequency type questions. Cross-sectional, observational study. What We Eat in America (WWEIA), the dietary intake component of the US National Health and Nutrition Examination Survey (NHANES). Individuals aged ≥1 year in 2003-2004 (n 8249) and 2005-2006 (n 8437) with one complete 24 h recall. The estimate for the percentage of individuals who reported total drinking water in 2005-2006 was significantly (P < 0·0000) smaller (76·9 %) than that for 2003-2004 (87·1 %), attributable to a lower percentage reporting tap water (54·1 % in 2005-2006 v. 67·0 % in 2003-2004; P = 0·0001). Estimates of mean tap water intake differed between the survey cycles for men aged ≥71 years. Survey variables must be examined before combining or comparing data from multiple WWEIA/NHANES release cycles. For at least some age/gender groups, drinking water intake data from NHANES cycles prior to 2005-2006 should not be considered comparable to more recent data.

  10. Intel NX to PVM 3.2 message passing conversion library

    NASA Technical Reports Server (NTRS)

    Arthur, Trey; Nelson, Michael L.

    1993-01-01

    NASA Langley Research Center has developed a library that allows Intel NX message passing codes to be executed under the more popular and widely supported Parallel Virtual Machine (PVM) message passing library. PVM was developed at Oak Ridge National Labs and has become the de facto standard for message passing. This library will allow the many programs that were developed on the Intel iPSC/860 or Intel Paragon in a Single Program Multiple Data (SPMD) design to be ported to the numerous architectures that PVM (version 3.2) supports. Also, the library adds global operations capability to PVM. A familiarity with Intel NX and PVM message passing is assumed.

  11. Catalytic reactor for promoting a chemical reaction on a fluid passing therethrough

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Subir (Inventor); Pfefferle, William C. (Inventor)

    2001-01-01

    A catalytic reactor with an auxiliary heating structure for raising the temperature of a fluid passing therethrough, whereby the catalytic reaction is promoted. The invention is an apparatus employing multiple electrical heating elements electrically isolated from one another by insulators that are an integral part of the flow path. The invention provides step heating of a fluid as the fluid passes through the reactor.

  12. A Real-Time Rejection Circuit to Automatically Reject Multiple Interfering Hopping Signals While Passing a Lower Level Desired Signal.

    DTIC Science & Technology

    …contain the low-level desired frequency components that are passed through an inverse transform device for producing a frequency domain signal of the desired signal uncorrupted by unwanted signals. Patent applications. (RRH)

  13. A chandelier-illuminated anterior chamber maintainer for use during descemet stripping automated endothelial keratoplasty in patients with advanced bullous keratopathy.

    PubMed

    Inoue, Tomoyuki; Oshima, Yusuke; Hori, Yuich; Maeda, Naoyuki

    2010-08-01

    A new 25-gauge illuminated anterior chamber maintainer composed of a 25-gauge infusion cannula through which a 29-gauge chandelier fiber probe passes was developed for use during Descemet stripping automated endothelial keratoplasty to treat patients with advanced bullous keratopathy. This device, which is compatible with a xenon or mercury vapor illuminator to generate powerful wide-angle illumination from the cone-shaped chandelier fiber tip, is self-retained at the corneal limbus after insertion of the infusion cannula through a corneal side port. Because of its bifunctionality, that is, bright illumination and adequate irrigation flow, excellent visibility with stable anterior chamber maintenance can be concurrently obtained for Descemet stripping, endothelial graft insertion, and subsequent intraocular manipulations without the need for use of a biologic staining technique or ophthalmic viscosurgical products even in patients with severe corneal haze. This new device facilitates safe and simple intraocular manipulation during Descemet stripping automated endothelial keratoplasty.

  14. A Complexity Metric for Automated Separation

    NASA Technical Reports Server (NTRS)

    Aweiss, Arwa

    2009-01-01

    A metric is proposed to characterize airspace complexity with respect to an automated separation assurance function. The Maneuver Option metric is a function of the number of conflict-free trajectory change options the automated separation assurance function is able to identify for each aircraft in the airspace at a given time. By aggregating the metric for all aircraft in a region of airspace, a measure of the instantaneous complexity of the airspace is produced. A six-hour simulation of Fort Worth Center air traffic was conducted to assess the metric. Results showed aircraft were twice as likely to be constrained in the vertical dimension than the horizontal one. By application of this metric, situations found to be most complex were those where level overflights and descending arrivals passed through or merged into an arrival stream. The metric identified high complexity regions that correlate well with current air traffic control operations. The Maneuver Option metric did not correlate with traffic count alone, a result consistent with complexity metrics for human-controlled airspace.
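
    One plausible reading of the aggregation, as a Python sketch (the paper's exact formula is not reproduced here): count conflict-free trajectory-change options per aircraft, convert scarcity of options into a per-aircraft score, and average over the airspace. The candidate-maneuver count and example numbers are assumptions.

      # Hedged sketch of a Maneuver Option-style complexity aggregation.
      def aircraft_complexity(n_options, n_candidates):
          """0 when every candidate maneuver is conflict-free, 1 when none are."""
          return 1.0 - n_options / n_candidates

      def airspace_complexity(option_counts, n_candidates=12):
          scores = [aircraft_complexity(n, n_candidates) for n in option_counts]
          return sum(scores) / len(scores)

      # e.g. 5 aircraft, 12 candidate maneuvers each, varying numbers conflict-free
      print(f"{airspace_complexity([12, 9, 4, 1, 7]):.2f}")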

  15. Development of a Web-Based 24-h Dietary Recall for a French-Canadian Population.

    PubMed

    Jacques, Simon; Lemieux, Simone; Lamarche, Benoît; Laramée, Catherine; Corneau, Louise; Lapointe, Annie; Tessier-Grenier, Maude; Robitaille, Julie

    2016-11-15

    Twenty-four-hour dietary recalls can provide high-quality dietary intake data, but are considered expensive, as they rely on trained professionals for both their administration and coding. The objective of this study was to develop an automated, self-administered web-based 24-h recall (R24W) for a French-Canadian population. The development of R24W was inspired by the United States Department of Agriculture (USDA) Automated Multiple-Pass Method. Questions about the context of meals/snacks were included. Toppings, sauces and spices frequently added to each food/dish were suggested systematically. A list of frequently forgotten food was also suggested. An interactive summary allows the respondent to track the progress of the questionnaire and to modify or remove food as needed. The R24W prototype was pre-tested for usability and functionality in a convenience sample of 29 subjects between the ages of 23 and 65 years, who had to complete one recall, as well as a satisfaction questionnaire. R24W includes a list of 2865 food items, distributed into 16 categories and 98 subcategories. A total of 687 recipes were created for mixed dishes, including 336 ethnic recipes. Pictures of food items illustrate up to eight servings per food item. The pre-test demonstrated that R24W is easy to complete and to understand. This new dietary assessment tool is a simple and inexpensive tool that will facilitate diet assessment of individuals in large-scale studies, but validation studies are needed prior to the utilization of the R24W.

  16. Development of a Web-Based 24-h Dietary Recall for a French-Canadian Population

    PubMed Central

    Jacques, Simon; Lemieux, Simone; Lamarche, Benoît; Laramée, Catherine; Corneau, Louise; Lapointe, Annie; Tessier-Grenier, Maude; Robitaille, Julie

    2016-01-01

    Twenty-four-hour dietary recalls can provide high-quality dietary intake data, but are considered expensive, as they rely on trained professionals for both their administration and coding. The objective of this study was to develop an automated, self-administered web-based 24-h recall (R24W) for a French-Canadian population. The development of R24W was inspired by the United States Department of Agriculture (USDA) Automated Multiple-Pass Method. Questions about the context of meals/snacks were included. Toppings, sauces and spices frequently added to each food/dish were suggested systematically. A list of frequently forgotten food was also suggested. An interactive summary allows the respondent to track the progress of the questionnaire and to modify or remove food as needed. The R24W prototype was pre-tested for usability and functionality in a convenience sample of 29 subjects between the ages of 23 and 65 years, who had to complete one recall, as well as a satisfaction questionnaire. R24W includes a list of 2865 food items, distributed into 16 categories and 98 subcategories. A total of 687 recipes were created for mixed dishes, including 336 ethnic recipes. Pictures of food items illustrate up to eight servings per food item. The pre-test demonstrated that R24W is easy to complete and to understand. This new dietary assessment tool is a simple and inexpensive tool that will facilitate diet assessment of individuals in large-scale studies, but validation studies are needed prior to the utilization of the R24W. PMID:27854276

  17. High throughput screening using acoustic droplet ejection to combine protein crystals and chemical libraries on crystallization plates at high density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teplitsky, Ella; Joshi, Karan; Ericson, Daniel L.

    We describe a high throughput method for screening up to 1728 distinct chemicals with protein crystals on a single microplate. Acoustic droplet ejection (ADE) was used to co-position 2.5 nL of protein, precipitant, and chemicals on a MiTeGen in situ-1 crystallization plate™ for screening by co-crystallization or soaking. ADE-transferred droplets follow a precise trajectory which allows all components to be transferred through small apertures in the microplate lid. The apertures were large enough for 2.5 nL droplets to pass through them, but small enough so that they did not disrupt the internal environment created by the mother liquor. Using this system, thermolysin and trypsin crystals were efficiently screened for binding to a heavy-metal mini-library. Fluorescence and X-ray diffraction were used to confirm that each chemical in the heavy-metal library was correctly paired with the intended protein crystal. A fragment mini-library was screened to observe two known lysozyme ligands using both co-crystallization and soaking. A similar approach was used to identify multiple, novel thaumatin binding sites for ascorbic acid. This technology pushes towards a faster, automated, and more flexible strategy for high throughput screening of chemical libraries (such as fragment libraries) using as little as 2.5 nL of each component.

  18. High throughput screening using acoustic droplet ejection to combine protein crystals and chemical libraries on crystallization plates at high density

    DOE PAGES

    Teplitsky, Ella; Joshi, Karan; Ericson, Daniel L.; ...

    2015-07-01

    We describe a high throughput method for screening up to 1728 distinct chemicals with protein crystals on a single microplate. Acoustic droplet ejection (ADE) was used to co-position 2.5 nL of protein, precipitant, and chemicals on a MiTeGen in situ-1 crystallization plate™ for screening by co-crystallization or soaking. ADE-transferred droplets follow a precise trajectory which allows all components to be transferred through small apertures in the microplate lid. The apertures were large enough for 2.5 nL droplets to pass through them, but small enough so that they did not disrupt the internal environment created by the mother liquor. Using this system, thermolysin and trypsin crystals were efficiently screened for binding to a heavy-metal mini-library. Fluorescence and X-ray diffraction were used to confirm that each chemical in the heavy-metal library was correctly paired with the intended protein crystal. A fragment mini-library was screened to observe two known lysozyme ligands using both co-crystallization and soaking. A similar approach was used to identify multiple, novel thaumatin binding sites for ascorbic acid. This technology pushes towards a faster, automated, and more flexible strategy for high throughput screening of chemical libraries (such as fragment libraries) using as little as 2.5 nL of each component.

  19. Apparatus and method to achieve high-resolution microscopy with non-diffracting or refracting radiation

    DOEpatents

    Tobin, Jr., Kenneth W.; Bingham, Philip R.; Hawari, Ayman I.

    2012-11-06

    An imaging system employing a coded aperture mask having multiple pinholes is provided. The coded aperture mask is placed at a radiation source to pass the radiation through. The radiation impinges on, and passes through, an object, which alters the radiation by absorption and/or scattering. Upon passing through the object, the radiation is detected at a detector plane to form an encoded image, which includes information on the absorption and/or scattering caused by the material and structural attributes of the object. The encoded image is decoded to provide a reconstructed image of the object. Because the coded aperture mask includes multiple pinholes, the radiation intensity is greater than that of a comparable system employing a single pinhole, thereby enabling a higher resolution. Further, the decoding of the encoded image can be performed to generate multiple images of the object at different distances from the detector plane. Methods and programs for operating the imaging system are also disclosed.
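
    The decode step is classically a correlation. Below is a Python sketch of correlation decoding for a multi-pinhole mask, illustrative only: a random mask is used here, whereas URA/MURA masks are designed so that their autocorrelation is a delta function, and the patent's own decoding procedure is not reproduced.

      # Hedged sketch: encoded image = object convolved with mask; decoding by
      # circular cross-correlation with a zero-mean copy of the mask.
      import numpy as np

      rng = np.random.default_rng(0)
      N = 64
      obj = np.zeros((N, N)); obj[20, 30] = 1.0; obj[40, 12] = 0.5   # point sources
      mask = (rng.random((N, N)) < 0.5).astype(float)                # pinhole pattern

      F = np.fft.fft2
      encoded = np.real(np.fft.ifft2(F(obj) * F(mask)))              # circular convolution
      decode = mask - mask.mean()                                    # simple decoding array
      recon = np.real(np.fft.ifft2(F(encoded) * np.conj(F(decode)))) # cross-correlation

      print("brightest reconstructed pixel:", np.unravel_index(recon.argmax(), recon.shape))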

  20. Office automation: The administrative window into the integrated DBMS

    NASA Technical Reports Server (NTRS)

    Brock, G. H.

    1985-01-01

    In parallel to the evolution of Management Information Systems from simple data files to complex data bases, the stand-alone computer systems have been migrating toward fully integrated systems serving the work force. The next major productivity gain may very well be to make these highly sophisticated working level Data Base Management Systems (DMBS) serve all levels of management with reports of varying levels of detail. Most attempts by the DBMS development organization to provide useful information to management seem to bog down in the quagmire of competing working level requirements. Most large DBMS development organizations possess three to five year backlogs. Perhaps Office Automation is the vehicle that brings to pass the Management Information System that really serves management. A good office automation system manned by a team of facilitators seeking opportunities to serve end-users could go a long way toward defining a DBMS that serves management. This paper will briefly discuss the problems of the DBMS organization, alternative approaches to solving some of the major problems, a debate about problems that may have no solution, and finally how office automation fits into the development of the Manager's Management Information System.

  1. Automating Nearshore Bathymetry Extraction from Wave Motion in Satellite Optical Imagery

    DTIC Science & Technology

    2012-03-01

    [Abstract not available in this record; what remains is front-matter extraction residue. Recoverable items: Figure 9 caption, "STK snap shot of WorldView-2 collection pass"; acronym-list entries SNR (Signal-to-Noise Ratio), STK (Satellite Tool Kit), UTM (Universal Transverse Mercator), WKB (Wave Kinematics Bathymetry); and, from the Method section, that the imagery was collected at about 2200Z.]

  2. Residual Stress Distribution and Microstructure of a Multiple Laser-Peened Near-Alpha Titanium Alloy

    NASA Astrophysics Data System (ADS)

    Umapathi, A.; Swaroop, S.

    2018-04-01

    Laser peening without coating (LPwC) was performed on a Ti-2.5 Cu alloy with multiple passes (1, 3 and 5), using a Nd:YAG laser (1064 nm) at a constant overlap rate of 70% and power density of 6.7 GW cm⁻². Hardness and residual stress profiles indicated thermal softening near the surface (< 100 μm) and bulk softening due to adiabatic heating. Maximum hardness (235 HV at 500 μm) and maximum residual stress (−890 MPa at 100 μm) were observed for LPwC with 1 pass. Surface roughness and surface 3-D topography imaging showed that the surface roughness increased with the number of passes. XRD results indicated no significant β phases. However, peak shifts, broadening and asymmetry were observed and interpreted based on dislocation activity. Microstructures indicated no melting, resolidification or grain refinement at the surface. Twin density was found to increase with the number of passes.

  3. Residual Stress Distribution and Microstructure of a Multiple Laser-Peened Near-Alpha Titanium Alloy

    NASA Astrophysics Data System (ADS)

    Umapathi, A.; Swaroop, S.

    2018-05-01

    Laser peening without coating (LPwC) was performed on a Ti-2.5 Cu alloy with multiple passes (1, 3 and 5), using a Nd:YAG laser (1064 nm) at a constant overlap rate of 70% and power density of 6.7 GW cm⁻². Hardness and residual stress profiles indicated thermal softening near the surface (< 100 μm) and bulk softening due to adiabatic heating. Maximum hardness (235 HV at 500 μm) and maximum residual stress (−890 MPa at 100 μm) were observed for LPwC with 1 pass. Surface roughness and surface 3-D topography imaging showed that the surface roughness increased with the number of passes. XRD results indicated no significant β phases. However, peak shifts, broadening and asymmetry were observed and interpreted based on dislocation activity. Microstructures indicated no melting, resolidification or grain refinement at the surface. Twin density was found to increase with the number of passes.

  4. Design requirements for SRB production control system. Volume 2: System requirements and conceptual description

    NASA Technical Reports Server (NTRS)

    1981-01-01

    In the development of the business system for the SRB automated production control system, special attention had to be paid to the unique environment posed by the space shuttle. The issues posed by this environment, and the means by which they were addressed, are reviewed. The change in management philosophy which will be required as NASA switches from one-of-a-kind launches to multiple launches is discussed. The implications of the assembly process on the business system are described. These issues include multiple missions, multiple locations and facilities, maintenance and refurbishment, multiple sources, and multiple contractors. The implications of these aspects on the automated production control system are reviewed, including an assessment of the six major subsystems as well as four other subsystems. Some general system requirements which flow through the entire business system are described.

  5. Performance Measurement, Visualization and Modeling of Parallel and Distributed Programs

    NASA Technical Reports Server (NTRS)

    Yan, Jerry C.; Sarukkai, Sekhar R.; Mehra, Pankaj; Lum, Henry, Jr. (Technical Monitor)

    1994-01-01

    This paper presents a methodology for debugging the performance of message-passing programs on both tightly coupled and loosely coupled distributed-memory machines. The AIMS (Automated Instrumentation and Monitoring System) toolkit, a suite of software tools for measurement and analysis of performance, is introduced and its application illustrated using several benchmark programs drawn from the field of computational fluid dynamics. AIMS includes (i) Xinstrument, a powerful source-code instrumentor, which supports both Fortran77 and C as well as a number of different message-passing libraries including Intel's NX, Thinking Machines' CMMD, and PVM; (ii) Monitor, a library of timestamping and trace-collection routines that run on supercomputers (such as Intel's iPSC/860, Delta, and Paragon, and Thinking Machines' CM5) as well as on networks of workstations (including Convex Cluster and SparcStations connected by a LAN); (iii) Visualization Kernel, a trace-animation facility that supports source-code clickback, simultaneous visualization of computation and communication patterns, as well as analysis of data movements; (iv) Statistics Kernel, an advanced profiling facility that associates a variety of performance data with various syntactic components of a parallel program; (v) Index Kernel, a diagnostic tool that helps pinpoint performance bottlenecks through the use of abstract indices; (vi) Modeling Kernel, a facility for automated modeling of message-passing programs that supports both simulation-based and analytical approaches to performance prediction and scalability analysis; (vii) Intrusion Compensator, a utility for recovering true performance from observed performance by removing the overheads of monitoring and their effects on the communication pattern of the program; and (viii) Compatibility Tools, which convert AIMS-generated traces into formats used by other performance-visualization tools, such as ParaGraph, Pablo, and certain AVS/Explorer modules.
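
    Illustrative only: the timestamping idea behind a trace-collection monitor, expressed as a Python decorator that logs entry/exit events with wall-clock timestamps. AIMS itself instruments Fortran and C message-passing codes; nothing below is AIMS code.

      # Hedged sketch of entry/exit event tracing, the core of trace collection.
      import functools, time

      def traced(fn):
          @functools.wraps(fn)
          def wrapper(*args, **kwargs):
              print(f"{time.time():.6f} ENTER {fn.__name__}")
              try:
                  return fn(*args, **kwargs)
              finally:
                  print(f"{time.time():.6f} EXIT  {fn.__name__}")
          return wrapper

      @traced
      def exchange_halo():
          time.sleep(0.01)     # stand-in for a message-passing call

      exchange_halo()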

  6. Automation for deep space vehicle monitoring

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.

    1991-01-01

    Information on automation for deep space vehicle monitoring is given in viewgraph form. Information is given on automation goals and strategy; the Monitor Analyzer of Real-time Voyager Engineering Link (MARVEL); intelligent input data management; decision theory for making tradeoffs; dynamic tradeoff evaluation; evaluation of anomaly detection results; evaluation of data management methods; system level analysis with cooperating expert systems; the distributed architecture of multiple expert systems; and event driven response.

  7. Modeling Multiple Human-Automation Distributed Systems using Network-form Games

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume

    2012-01-01

    The paper describes at a high-level the network-form game framework (based on Bayes net and game theory), which can be used to model and analyze safety issues in large, distributed, mixed human-automation systems such as NextGen.

  8. Development and validation of inexpensive, automated, dynamic flux chambers

    EPA Science Inventory

    We developed and validated an automated, inexpensive, and continuous multiple-species gas-flux monitoring system that can provide data for a variety of relevant atmospheric pollutants, including O3, CO2, and NOx. Validation consisted of conducting concurrent gas-phase dry deposit...

  9. Reconstruction of the esophagojejunostomy by double stapling method using EEA™ OrVil™ in laparoscopic total gastrectomy and proximal gastrectomy

    PubMed Central

    2011-01-01

    Here we report a method of anastomosis based on the double stapling technique (hereinafter, DST) using a trans-oral anvil delivery system (EEA™ OrVil™) for reconstructing the esophagus and lifted jejunum following laparoscopic total gastrectomy or proximal gastric resection. As a basic technique, laparoscopic total gastrectomy employed Roux-en-Y reconstruction, laparoscopic proximal gastrectomy employed double tract reconstruction, and end-to-side anastomosis was used for the cut-off stump of the esophagus and lifted jejunum. We used EEA™ OrVil™, a device that permits a mechanical purse-string suture similar to a conventional EEA, together with an endo-Surgitie. After the gastric lymph node dissection, the esophagus was cut off using an automated stapler. The EEA™ OrVil™ was orally and slowly inserted from the valve tip, and a small hole was created at the tip of the obliquely cut-off stump with scissors to let the valve tip pass through. The yarn was cut to disconnect the anvil from its tube, and the anvil head was retained in the esophagus. The endo-Surgitie was inserted at the right subcostal margin, and after the loop-shaped thread was wrapped around the esophageal stump opening, assisting Maryland forceps inserted at the left subcostal margin and left abdomen were used to grasp the left and right esophageal stump. The surgeon inserted anvil-grasping forceps into the right abdomen and, after grasping the esophagus with the forceps, tightened the endo-Surgitie, thereby completing the purse-string suture on the esophageal stump. The main unit of the automated stapler was inserted from the cut-off stump of the lifted jejunum, and a trocar was made to pass through. To prevent dropout of the small intestine from the automated stapler, the automated stapler and the lifted jejunum were fastened with silk thread, the abdomen was again inflated, and the lifted jejunum was led into the abdominal cavity. When it was confirmed that the automated stapler and center rod were completely linear, the anvil and the main unit were connected with each other and firing was carried out. DST-based anastomosis was thus completed with no dog-ear. The method may facilitate safe laparoscopic anastomosis between the esophagus and the reconstructed intestine. It may also serve as a useful anastomosis technique for upper levels of the esophagus in laparotomy. PMID:21599911

  10. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    NASA Astrophysics Data System (ADS)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transductions, including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI), play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) we propose can reduce the processing time and sample volume because the system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we will present an automated mMAPS including an integrated microfluidic device, automated stage, and electrical relay for high-throughput clinical screening. Based on this result, we estimated that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application to analyze tissue samples in a clinical setting.

  11. Petri net modelling of buffers in automated manufacturing systems.

    PubMed

    Zhou, M; Dicesare, F

    1996-01-01

    This paper presents Petri net models of buffers and a methodology by which buffers can be included in a system without introducing deadlocks or overflows. The context is automated manufacturing. The buffers and models are classified as random order or order preserved (first-in-first-out or last-in-first-out), single-input-single-output or multiple-input-multiple-output, part type and/or space distinguishable or indistinguishable, and bounded or safe. Theoretical results for the development of Petri net models which include buffer modules are developed. This theory provides the conditions under which the system properties of boundedness, liveness, and reversibility are preserved. The results are illustrated through two manufacturing system examples: a multiple machine and multiple buffer production line and an automatic storage and retrieval system in the context of flexible manufacturing.
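
    For a flavor of the models, here is a minimal Python sketch of a bounded single-input-single-output buffer as a Petri net (illustrative, not the paper's formalism): tokens in "free" and "used" places gate the put/get transitions, so overflow and underflow transitions simply cannot fire, which is how boundedness is preserved.

      # Hedged sketch: a two-place Petri net model of a bounded buffer.
      class BufferNet:
          def __init__(self, capacity):
              self.places = {"free": capacity, "used": 0}

          def enabled(self, transition):
              need = {"put": "free", "get": "used"}[transition]
              return self.places[need] > 0

          def fire(self, transition):
              if not self.enabled(transition):
                  return False
              src, dst = ("free", "used") if transition == "put" else ("used", "free")
              self.places[src] -= 1
              self.places[dst] += 1
              return True

      net = BufferNet(capacity=2)
      for t in ["put", "put", "put", "get", "get", "get"]:
          print(t, "->", "fired" if net.fire(t) else "blocked", net.places)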

  12. Spectrometer capillary vessel and method of making same

    DOEpatents

    Linehan, John C.; Yonker, Clement R.; Zemanian, Thomas S.; Franz, James A.

    1995-01-01

    The present invention is an arrangement of a glass capillary tube for use in spectroscopy. In particular, the invention is a capillary arranged in a manner permitting a plurality or multiplicity of passes of a sample material through a spectroscopic measurement zone. In a preferred embodiment, the multi-pass capillary is insertable within a standard NMR sample tube. The present invention further includes a method of making the multi-pass capillary tube and an apparatus for spinning the tube.

  13. Improved multiple-pass Raman spectrometer

    NASA Astrophysics Data System (ADS)

    Kc, Utsav; Silver, Joel A.; Hovde, David C.; Varghese, Philip L.

    2011-08-01

    An improved Raman gain spectrometer for flame measurements of gas temperature and species concentrations is described. This instrument uses a multiple-pass optical cell to enhance the incident light intensity in the measurement volume. The Raman signal is 83 times larger than from a single pass, and the Raman signal-to-noise ratio (SNR) in room-temperature air of 153 is an improvement over that from a single-pass cell by a factor of 9.3 when the cell is operated with 100 passes and the signal is integrated over 20 laser shots. The SNR improvement with the multipass cell is even higher for flame measurements at atmospheric pressure, because detector readout noise is more significant for single-pass measurements when the gas density is lower. Raman scattering is collected and dispersed in a spectrograph with a transmission grating and recorded with a fast gated CCD array detector to help eliminate flame interferences. The instrument is used to record spontaneous Raman spectra from N2, CO2, O2, and CO in a methane-air flame. Curve fits of the recorded Raman spectra to detailed simulations of nitrogen spectra are used to determine the flame temperature from the shapes of the spectral signatures and from the ratio of the total intensities of the Stokes and anti-Stokes signals. The temperatures measured are in good agreement with radiation-corrected thermocouple measurements for a range of equivalence ratios.
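
    The Stokes/anti-Stokes ratio method mentioned at the end inverts a Boltzmann factor. A worked Python sketch under standard textbook assumptions (fourth-power frequency prefactor, N2 shift of 2331 cm⁻¹, 532 nm excitation; the paper's actual excitation wavelength and measured ratios are not reproduced here):

      # Hedged sketch: R = ((nu0+nu_m)/(nu0-nu_m))**4 * exp(-h*c*nu_m/(kB*T)),
      # inverted for T. Wavenumbers in cm^-1; the example ratio is illustrative.
      import math

      H, C_CM, KB = 6.626e-34, 2.998e10, 1.381e-23   # SI units, c in cm/s

      def temperature_K(ratio_as_to_s, nu0_cm, num_cm):
          prefactor = ((nu0_cm + num_cm) / (nu0_cm - num_cm)) ** 4
          return H * C_CM * num_cm / (KB * math.log(prefactor / ratio_as_to_s))

      # N2 shift ~2331 cm^-1, 532 nm excitation (~18797 cm^-1), assumed ratio 0.51
      print(f"T = {temperature_K(0.51, 18797.0, 2331.0):.0f} K")   # ~2000 K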

  14. Influence of multiple-passes on microstructure and mechanical properties of Al-Mg/SiC surface composites fabricated via underwater friction stir processing

    NASA Astrophysics Data System (ADS)

    Srivastava, Manu; Rathee, Sandeep; Maheshwari, Sachin; Siddiquee, Arshad Noor

    2018-06-01

    Friction stir processing (FSP) is a relatively newly developed solid-state process involving surface modification for fabricating metal matrix surface composites. Obtaining metal matrix nano-composites with uniform dispersion of reinforcement particles via the FSP route is an intricate task. In this work, AA5059/SiC nano surface composites (SCs) were developed, and the effect of multiple FSP passes and SiC addition on the microstructure and mechanical properties of SCs fabricated under water was investigated. Results showed that the average microhardness increases from 85 Hv for the base metal (BM) to 159 Hv in the stir zone of the four-pass underwater friction stir processed (FSPed) SC. The highest ultimate tensile strength (UTS), achieved in the four-pass FSPed sample, was 377 MPa, which is higher than the UTS of the BM (321 MPa) and of the four-pass FSPed sample produced in ambient air (347 MPa). An appreciably narrower heat-affected zone is obtained owing to fast cooling and reduced heat conduction during underwater FSP, accounting for the higher UTS compared to the BM and the SC processed at ambient conditions. Thus, it can be concluded that the surrounding medium and the number of FSP passes have a significant impact on the mechanical properties of the fabricated SCs. The microstructures and the distribution of SiC particles in the fabricated SCs were studied by optical microscopy and FESEM, respectively, and were found to corroborate the mechanical properties.

  15. SpcAudace: Spectroscopic processing and analysis package of Audela software

    NASA Astrophysics Data System (ADS)

    Mauclaire, Benjamin

    2017-11-01

    SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines carry out all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed, for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: pdf and png plots or annotated time series plots. Astrophysical quantities can be derived from individual spectra or large numbers of spectra with advanced functions: from line profile characteristics to equivalent widths and periodograms. More than 300 documented functions are available and can be used in TCL scripts for automation. SpcAudace is based on the Audela open source software.

  16. Apparatus for the concurrent ultrasonic inspection of partially completed welds

    DOEpatents

    Johnson, John A.

    2000-01-01

    An apparatus for the concurrent nondestructive evaluation of partially completed welds is described. Used in combination with an automated welder, it includes an ultrasonic signal generator mounted on the welder, which generates an ultrasonic signal directed toward one side of the partially completed weld; an ultrasonic signal receiver mounted on the automated welder for detecting ultrasonic signals which are transmitted by the ultrasonic signal generator and which are reflected or diffracted from one side of the partially completed weld or which pass through a given region of the partially completed weld; and an analysis assembly coupled with the ultrasonic signal receiver, which processes the ultrasonic signals received by the ultrasonic signal receiver to identify welding flaws in the partially completed weld.

  17. The Impact of Automation Reliability and Operator Fatigue on Performance and Reliance

    DTIC Science & Technology

    2016-09-23

    [Report-documentation-page extraction residue; no abstract survives intact. Recoverable fragments: reliability of automation is a key factor in an operator's reliance on automation, and prior work examined performance in a complex multiple-task environment during a laboratory-based simulation of occasional night work.]

  18. Using Simultaneous Prompting Procedure to Promote Recall of Multiplication Facts by Middle School Students with Cognitive Impairment

    ERIC Educational Resources Information Center

    Rao, Shaila; Mallow, Lynette

    2009-01-01

    This study examined the effectiveness of a simultaneous prompting system in teaching students with cognitive impairment to automate recall of multiplication facts. A multiple-probe design with multiple sets of math facts, replicated across multiple subjects, was used to assess the effectiveness of simultaneous prompting on recall of basic multiplication…

  19. Automated matching of multiple terrestrial laser scans for stem mapping without the use of artificial references

    NASA Astrophysics Data System (ADS)

    Liu, Jingbin; Liang, Xinlian; Hyyppä, Juha; Yu, Xiaowei; Lehtomäki, Matti; Pyörälä, Jiri; Zhu, Lingli; Wang, Yunsheng; Chen, Ruizhi

    2017-04-01

    Terrestrial laser scanning has been widely used to analyze the 3D structure of a forest in detail and to generate data at the level of a reference plot for forest inventories without destructive measurements. Multi-scan terrestrial laser scanning is more commonly applied to collect plot-level data so that all of the stems can be detected and analyzed. However, it is necessary to match the point clouds of multiple scans to yield a point cloud with automated processing. Mismatches between datasets will lead to errors during the processing of multi-scan data. Classic registration methods based on flat surfaces cannot be directly applied in forest environments; therefore, artificial reference objects have conventionally been used to assist with scan matching. The use of artificial references requires additional labor and expertise, as well as greatly increasing the cost. In this study, we present an automated processing method for plot-level stem mapping that matches multiple scans without artificial references. In contrast to previous studies, the registration method developed in this study exploits the natural geometric characteristics among a set of tree stems in a plot and combines the point clouds of multiple scans into a unified coordinate system. Integrating multiple scans improves the overall performance of stem mapping in terms of the correctness of tree detection, as well as the bias and the root-mean-square errors of forest attributes such as diameter at breast height and tree height. In addition, the automated processing method makes stem mapping more reliable and consistent among plots, reduces the costs associated with plot-based stem mapping, and enhances the efficiency.
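
    The plot-level matching step lends itself to a standard least-squares rigid alignment once stems detected in two scans have been paired by their mutual geometric pattern. A Python sketch of that final step, using the SVD/Kabsch solution with toy coordinates (the pairing itself, and the paper's own registration procedure, are not reproduced here):

      # Hedged sketch: rigid 2D alignment of paired stem positions via SVD.
      import numpy as np

      def rigid_transform(src, dst):
          """Least-squares R, t such that dst ~ src @ R.T + t."""
          cs, cd = src.mean(axis=0), dst.mean(axis=0)
          Hm = (src - cs).T @ (dst - cd)
          U, _, Vt = np.linalg.svd(Hm)
          R = Vt.T @ U.T
          if np.linalg.det(R) < 0:          # guard against a reflection
              Vt[-1] *= -1
              R = Vt.T @ U.T
          return R, cd - R @ cs

      src = np.array([[0.0, 0.0], [4.0, 1.0], [2.0, 5.0], [6.0, 4.0]])  # stems, scan A
      theta = np.deg2rad(30)
      R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
      dst = src @ R_true.T + np.array([10.0, -3.0])                     # same stems, scan B
      R, t = rigid_transform(src, dst)
      print(np.allclose(src @ R.T + t, dst))                            # True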

  20. Long range targeting for space based rendezvous

    NASA Technical Reports Server (NTRS)

    Everett, Louis J.; Redfield, R. C.

    1995-01-01

    The work performed under this grant supported the Dexterous Flight Experiment on STS-62. The project required developing hardware and software for automating a TRAC sensor on orbit. The hardware developed for the flight has been documented through standard NASA channels, since it had to pass safety, environmental, and other reviews. The software has not been documented previously; therefore, this report provides a software manual for the TRAC code developed for the grant.

  1. Adaptive function allocation reduces performance costs of static automation

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja; Mouloua, Mustapha; Molloy, Robert; Hilburn, Brian

    1993-01-01

    Adaptive automation offers the option of flexible function allocation between the pilot and on-board computer systems. One of the important claims for the superiority of adaptive over static automation is that such systems do not suffer from some of the drawbacks associated with conventional function allocation. Several experiments designed to test this claim are reported in this article. The efficacy of adaptive function allocation was examined using a laboratory flight-simulation task involving multiple functions of tracking, fuel-management, and systems monitoring. The results show that monitoring inefficiency represents one of the performance costs of static automation. Adaptive function allocation can reduce the performance cost associated with long-term static automation.

  2. Low-pass sequencing for microbial comparative genomics

    PubMed Central

    Goo, Young Ah; Roach, Jared; Glusman, Gustavo; Baliga, Nitin S; Deutsch, Kerry; Pan, Min; Kennedy, Sean; DasSarma, Shiladitya; Victor Ng, Wailap; Hood, Leroy

    2004-01-01

    Background We studied four extremely halophilic archaea by low-pass shotgun sequencing: (1) the metabolically versatile Haloarcula marismortui; (2) the non-pigmented Natrialba asiatica; (3) the psychrophile Halorubrum lacusprofundi and (4) the Dead Sea isolate Halobaculum gomorrense. Approximately one thousand single-pass genomic sequences per genome were obtained. The data were analyzed by comparative genomic analyses using the completed Halobacterium sp. NRC-1 genome as a reference. Low-pass shotgun sequencing is a simple, inexpensive, and rapid approach that can readily be performed on any cultured microbe. Results As expected, the four archaeal halophiles analyzed exhibit both bacterial and eukaryotic characteristics as well as uniquely archaeal traits. All five halophiles exhibit greater than sixty percent GC content and low isoelectric points (pI) for their predicted proteins. Multiple insertion sequence (IS) elements, often involved in genome rearrangements, were identified in H. lacusprofundi and H. marismortui. The core biological functions that govern cellular and genetic mechanisms of H. sp. NRC-1 appear to be conserved in these four other halophiles. Multiple TATA box binding protein (TBP) and transcription factor IIB (TFB) homologs were identified in most of the four shotgunned halophiles. The reconstructed molecular tree of all five halophiles shows a large divergence between these species, with the closest relationship being between H. sp. NRC-1 and H. lacusprofundi. Conclusion Despite the diverse habitats of these species, all five halophiles share (1) high GC content and (2) low protein isoelectric points, which are characteristics associated with environmental exposure to UV radiation and hypersalinity, respectively. Identification of multiple IS elements in the genomes of H. lacusprofundi and H. marismortui suggests that genome structure and dynamic genome reorganization might be similar to those previously observed in the IS-element-rich genome of H. sp. NRC-1. Identification of multiple TBP and TFB homologs in these four halophiles is consistent with the hypothesis that different types of complex transcriptional regulation may occur through multiple TBP-TFB combinations in response to rapidly changing environmental conditions. Low-pass shotgun sequence analyses of genomes permit extensive and diverse analyses, and should be generally useful for comparative microbial genomics. PMID:14718067

  3. SEAHT: A computer program for the use of intersecting arcs of altimeter data for sea surface height refinement

    NASA Technical Reports Server (NTRS)

    Allen, C. P.; Martin, C. F.

    1977-01-01

    The SEAHT program is designed to process multiple passes of altimeter data with intersecting ground tracks, estimating corrections for orbital errors for each pass such that the data have the best overall agreement at the crossover points. Orbit error for each pass is modeled as a polynomial in time, with optional orders of 0, 1, or 2. One or more passes may be constrained in the adjustment process, thus allowing the passes with the best orbits to set the overall level and orientation of the estimated sea surface heights. Intersections that disagree by more than an input edit level are not used in the error parameter estimation. In the program implementation, passes are grouped into South-North passes and North-South passes, with the North-South passes partitioned out for the estimation of orbit error parameters. Computer core utilization is thus dependent on the number of parameters estimated for the set of South-North arcs, but is independent of the number of North-South passes. Estimated corrections for each pass are applied to the data at its input data rate, and an output tape is written which contains the corrected data.
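
    The adjustment described above can be pictured as one least-squares problem in which each crossover contributes an equation relating the polynomial corrections of the two passes that intersect there. The sketch below is hypothetical (data layout and names invented), assuming crossover differences have already been computed and that constrained passes are simply held at zero correction.

        import numpy as np

        def crossover_adjust(crossovers, n_passes, order=1, fixed=()):
            """Least-squares polynomial orbit-error corrections from crossovers.

            crossovers: list of (i, ti, j, tj, dh), where dh is the height of
            pass i at time ti minus the height of pass j at time tj at a
            ground-track intersection. fixed: passes constrained to zero
            correction, anchoring the overall level and orientation.
            Returns coeffs (n_passes, order+1); the corrected height is
            h - sum_k coeffs[p][k] * t**k for pass p.
            """
            ncoef = order + 1
            free = [p for p in range(n_passes) if p not in set(fixed)]
            col = {p: k * ncoef for k, p in enumerate(free)}
            A = np.zeros((len(crossovers), len(free) * ncoef))
            b = np.zeros(len(crossovers))
            for r, (i, ti, j, tj, dh) in enumerate(crossovers):
                b[r] = dh                      # model: dh ~= p_i(ti) - p_j(tj)
                if i in col:
                    A[r, col[i]:col[i] + ncoef] = [ti ** k for k in range(ncoef)]
                if j in col:
                    A[r, col[j]:col[j] + ncoef] = [-(tj ** k) for k in range(ncoef)]
            x, *_ = np.linalg.lstsq(A, b, rcond=None)
            coeffs = np.zeros((n_passes, ncoef))
            for p in free:
                coeffs[p] = x[col[p]:col[p] + ncoef]
            return coeffs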

  4. Coaxial charged particle energy analyzer

    NASA Technical Reports Server (NTRS)

    Kelly, Michael A. (Inventor); Bryson, III, Charles E. (Inventor); Wu, Warren (Inventor)

    2011-01-01

    A non-dispersive electrostatic energy analyzer for electrons and other charged particles having a generally coaxial structure of sequentially arranged sections: an electrostatic lens to focus the beam through an iris, preferably including an ellipsoidally shaped input grid for collimating a wide-acceptance beam from a charged-particle source; an electrostatic high-pass filter including a planar exit grid; and an electrostatic low-pass filter. The low-pass filter is configured to reflect low-energy particles back towards a charged particle detector located within the low-pass filter. Each section comprises multiple tubular or conical electrodes arranged about the central axis. The voltages on the lens are scanned to place a selected energy band of the accepted beam at a selected energy at the iris. Voltages on the high-pass and low-pass filters remain substantially fixed during the scan.

  5. Human mixed lymphocyte cultures. Evaluation of microculture technique utilizing the multiple automated sample harvester (MASH)

    PubMed Central

    Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.

    1973-01-01

    Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the number of cultures that are feasible, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568

  6. Operations management system advanced automation: Fault detection isolation and recovery prototyping

    NASA Technical Reports Server (NTRS)

    Hanson, Matt

    1990-01-01

    The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.

  7. Spectrometer capillary vessel and method of making same

    DOEpatents

    Linehan, J.C.; Yonker, C.R.; Zemanian, T.S.; Franz, J.A.

    1995-11-21

    The present invention is an arrangement of a glass capillary tube for use in spectroscopy. In particular, the invention is a capillary arranged in a manner permitting a plurality or multiplicity of passes of a sample material through a spectroscopic measurement zone. In a preferred embodiment, the multi-pass capillary is insertable within a standard NMR sample tube. The present invention further includes a method of making the multi-pass capillary tube and an apparatus for spinning the tube. 13 figs.

  8. Fiber sensor network with multipoint sensing using double-pass hybrid LPFG-FBG sensor configuration

    NASA Astrophysics Data System (ADS)

    Yong, Yun-Thung; Lee, Sheng-Chyan; Rahman, Faidz Abd

    2017-03-01

    This is a study on a double-pass intensity-based hybrid Long Period Fiber Grating (LPFG) and Fiber Bragg Grating (FBG) sensor configuration in which a fiber sensor network was constructed with multiple sensing capability. The sensing principle is based on interrogation of intensity changes of the reflected signal from an FBG caused by the LPFG spectral response to the surrounding perturbations. The sensor network developed was tested in monitoring diesel adulteration at up to a distance of 8 km. Kerosene at concentrations from 0% to 50% was added as an adulterant into diesel. The sensitivity of the double-pass hybrid LPFG-FBG sensor over multiple points was >0.21 dB/% (for an adulteration range of 0-30%) and >0.45 dB/% from 30% to 50% adulteration. It was found that the sensitivity can drop by up to 35% when the fiber length increases from 0 km to 8 km (for the case of adulteration of 0-30%). With the multiple sensing capabilities, the normalized reflected power of the FBGs can be demodulated at the same time for comparison of sensitivity performance across various fiber sensors.

  9. Speciation analysis of arsenic in biological matrices by automated hydride generation-cryotrapping-atomic absorption spectrometry with multiple microflame quartz tube atomizer (multiatomizer).

    EPA Science Inventory

    This paper describes an automated system for the oxidation state specific speciation of inorganic and methylated arsenicals by selective hydride generation - cryotrapping- gas chromatography - atomic absorption spectrometry with the multiatomizer. The corresponding arsines are ge...

  10. Cest Analysis: Automated Change Detection from Very-High Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Ehlers, M.; Klonus, S.; Jarmer, T.; Sofina, N.; Michel, U.; Reinartz, P.; Sirmacek, B.

    2012-08-01

    A fast detection, visualization and assessment of change in areas of crisis or catastrophe are important requirements for the coordination and planning of help. Through the availability of new satellite and/or airborne sensors with very high spatial resolutions (e.g., WorldView, GeoEye), new remote sensing data are available for better detection, delineation and visualization of change. For automated change detection, a large number of algorithms have been proposed and developed. From previous studies, however, it is evident that to date no single algorithm has the potential for being a reliable change detector for all possible scenarios. This paper introduces the Combined Edge Segment Texture (CEST) analysis, a decision-tree based cooperative suite of algorithms for automated change detection that is especially designed for the new generation of satellites with very high spatial resolution. The method incorporates frequency based filtering, texture analysis, and image segmentation techniques. For the frequency analysis, different band pass filters can be applied to identify the relevant frequency information for change detection. After transforming the multitemporal images via a fast Fourier transform (FFT) and applying the most suitable band pass filter, different methods are available to extract changed structures: differencing and correlation in the frequency domain and correlation and edge detection in the spatial domain. Best results are obtained using edge extraction. For the texture analysis, different 'Haralick' parameters can be calculated (e.g., energy, correlation, contrast, inverse distance moment), with 'energy' so far providing the most accurate results. These algorithms are combined with a prior segmentation of the image data as well as with morphological operations for a final binary change result. A rule-based combination (CEST) of the change algorithms is applied to calculate the probability of change for a particular location. CEST was tested with high-resolution satellite images of the crisis areas of Darfur (Sudan). CEST results are compared with a number of standard algorithms for automated change detection such as image difference, image ratio, principal component analysis, the delta cue technique and post-classification change detection. The new combined method shows superior results, with improvements in accuracy averaging between 15% and 45%.
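
    As a rough illustration of the frequency-domain stage only, the sketch below band-pass filters two co-registered images with an FFT and thresholds the residual difference. The cut-offs and threshold are invented for illustration; CEST itself combines this stage with texture analysis, segmentation, morphology, and rule-based fusion.

        import numpy as np

        def bandpass_change(img1, img2, lo=0.05, hi=0.35, thresh=2.0):
            """Toy frequency-domain change detector for two co-registered
            grayscale images of equal shape; returns a boolean change mask."""
            fy = np.fft.fftfreq(img1.shape[0])[:, None]
            fx = np.fft.fftfreq(img1.shape[1])[None, :]
            r = np.hypot(fy, fx) / 0.5          # radius normalised to Nyquist
            mask = (r >= lo) & (r <= hi)        # keep only the selected band

            def filt(img):
                return np.real(np.fft.ifft2(np.fft.fft2(img) * mask))

            diff = np.abs(filt(img1) - filt(img2))
            return diff > thresh * diff.std()   # simple threshold on residual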

  11. Developing image processing meta-algorithms with data mining of multiple metrics.

    PubMed

    Leung, Kelvin; Cunha, Alexandre; Toga, A W; Parker, D Stott

    2014-01-01

    People often use multiple metrics in image processing, but here we take a novel approach of mining the values of batteries of metrics on image processing results. We present a case for extending image processing methods to incorporate automated mining of multiple image metric values. Here by a metric we mean any image similarity or distance measure, and in this paper we consider intensity-based and statistical image measures and focus on registration as an image processing problem. We show how it is possible to develop meta-algorithms that evaluate different image processing results with a number of different metrics and mine the results in an automated fashion so as to select the best results. We show that the mining of multiple metrics offers a variety of potential benefits for many image processing problems, including improved robustness and validation.

  12. A no-key-exchange secure image sharing scheme based on Shamir's three-pass cryptography protocol and the multiple-parameter fractional Fourier transform.

    PubMed

    Lang, Jun

    2012-01-30

    In this paper, we propose a novel secure image sharing scheme based on Shamir's three-pass protocol and the multiple-parameter fractional Fourier transform (MPFRFT), which can safely exchange information with no advance distribution of either secret keys or public keys between users. The image is encrypted directly by the MPFRFT spectrum without the use of phase keys, and information can be shared by transmitting the encrypted image (or message) three times between users. Numerical simulation results are given to verify the performance of the proposed algorithm.
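
    The three-pass idea requires a commutative cipher so that the two "locks" can be removed in a different order than they were applied. The toy sketch below uses modular exponentiation (the classic Shamir/Massey-Omura construction) purely to illustrate the message flow; the scheme proposed in the paper encrypts with the MPFRFT instead.

        import secrets
        from math import gcd

        P = 0xFFFFFFFFFFFFFFC5            # the prime 2**64 - 59

        def keypair(p):
            """Random exponent e coprime to p-1, plus its inverse d mod p-1."""
            while True:
                e = secrets.randbelow(p - 3) + 2
                if gcd(e, p - 1) == 1:
                    return e, pow(e, -1, p - 1)

        m = 123456789                     # message, m < P
        ea, da = keypair(P)               # Alice's lock/unlock pair
        eb, db = keypair(P)               # Bob's lock/unlock pair

        x1 = pow(m, ea, P)                # pass 1: Alice locks, sends to Bob
        x2 = pow(x1, eb, P)               # pass 2: Bob adds his lock, returns
        x3 = pow(x2, da, P)               # pass 3: Alice removes her lock, sends
        assert pow(x3, db, P) == m        # Bob removes his lock and reads m

    No key ever travels in the clear and none is distributed in advance; each party only ever applies and removes its own lock.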

  13. Voxel-based morphometry and automated lobar volumetry: The trade-off between spatial scale and statistical correction

    PubMed Central

    Voormolen, Eduard H.J.; Wei, Corie; Chow, Eva W.C.; Bassett, Anne S.; Mikulis, David J.; Crawley, Adrian P.

    2011-01-01

    Voxel-based morphometry (VBM) and automated lobar region of interest (ROI) volumetry are comprehensive and fast methods to detect differences in overall brain anatomy on magnetic resonance images. However, VBM and automated lobar ROI volumetry have detected dissimilar gray matter differences within identical image sets in our own experience and in previous reports. To gain more insight into how diverging results arise and to attempt to establish whether one method is superior to the other, we investigated how differences in spatial scale and in the need to statistically correct for multiple spatial comparisons influence the relative sensitivity of either technique to group differences in gray matter volumes. We assessed the performance of both techniques on a small dataset containing simulated gray matter deficits and additionally on a dataset of 22q11-deletion syndrome patients with schizophrenia (22q11DS-SZ) vs. matched controls. VBM was more sensitive to simulated focal deficits compared to automated ROI volumetry, and could detect global cortical deficits equally well. Moreover, theoretical calculations of VBM and ROI detection sensitivities to focal deficits showed that at increasing ROI size, ROI volumetry suffers more from loss in sensitivity than VBM. Furthermore, VBM and automated ROI found corresponding GM deficits in 22q11DS-SZ patients, except in the parietal lobe. Here, automated lobar ROI volumetry found a significant deficit only after a smaller subregion of interest was employed. Thus, sensitivity to focal differences is impaired relatively more by averaging over larger volumes in automated ROI methods than by the correction for multiple comparisons in VBM. These findings indicate that VBM is to be preferred over automated lobar-scale ROI volumetry for assessing gray matter volume differences between groups. PMID:19619660

  14. LHCb Conditions database operation assistance systems

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Shapoval, I.; Cattaneo, M.; Degaudenzi, H.; Santinelli, R.

    2012-12-01

    The Conditions Database (CondDB) of the LHCb experiment provides versioned, time dependent geometry and conditions data for all LHCb data processing applications (simulation, high level trigger (HLT), reconstruction, analysis) in a heterogeneous computing environment ranging from user laptops to the HLT farm and the Grid. These different use cases impose front-end support for multiple database technologies (Oracle and SQLite are used). Sophisticated distribution tools are required to ensure timely and robust delivery of updates to all environments. The content of the database has to be managed to ensure that updates are internally consistent and externally compatible with multiple versions of the physics application software. In this paper we describe three systems that we have developed to address these issues. The first system is a CondDB state tracking extension to the Oracle 3D Streams replication technology, to trap cases in which the CondDB replication was corrupted. The second is an automated distribution system for the SQLite-based CondDB, which also provides smart backup and checkout mechanisms for the CondDB managers and LHCb users, respectively. The third is a system to verify and monitor the internal (CondDB self-consistency) and external (LHCb physics software vs. CondDB) compatibility. The former two systems are used in production in the LHCb experiment and have achieved the desired goal of higher flexibility and robustness for the management and operation of the CondDB. The latter has been fully designed and is currently moving to the implementation stage.

  15. DIDBase: Intelligent, Interactive Archiving Technology for Ionogram Data

    NASA Astrophysics Data System (ADS)

    Reinisch, B. W.; Khmyrov, G.; Galkin, I. A.; Kozlov, A.

    2004-12-01

    Vertical ionospheric sounding data have been used in a variety of scenarios for ionospheric now-casting. The growing need for an accurate real-time specification of the vertical electron density distribution at multiple locations stimulates interest in intelligent data management systems that can arrange concurrent, remote access to the acquired data. This type of data access requires a high level of interaction and organization to support routing of data between ionosondes, data analysts, quality validation experts, end user applications, data managers, and online data repositories such as the World Data Centers. Digital Ionogram Database (DIDBase) is a pilot project started at UMASS Lowell in 2001, sponsored in part by the Air Force Research Laboratory, for management of real-time and retrospective data from a network of 50 digisondes. The DIDBase archives hold both raw and derived digisonde data under management of a commercial-strength DBMS, providing convenient means for automated ingestion of real-time data from online digisondes (40 locations worldwide as of September 2004), remote read access to the data over the HTTP Web protocol (http://ulcar.uml.edu/DIDBase/), remote read/write access from SAO Explorer workstations used for data visualization and interactive editing, and an ADRES subsystem for automated management of data requests. DIDBase and ADRES employ cross-platform solutions for all involved software, exchange protocols, and data. The paper briefly describes the DIDBase operations during a recent Cal/Val campaign for the SSUSI/SSULI instruments on the DMSP F16 spacecraft. Here 26 online digisondes provided ground-truth NmF2 data for the overhead and limb passes of the spacecraft. Since the start of the campaign in December 2003, the total number of ADRES requests exceeded 9,000 by summer 2004.

  16. System for evaluating weld quality using eddy currents

    DOEpatents

    Todorov, Evgueni I.; Hay, Jacob

    2017-12-12

    Electromagnetic and eddy current techniques for fast automated real-time and near real-time inspection and monitoring systems for high production rate joining processes. An eddy current system, array, and method for the fast examination of welds to detect anomalies such as missed seam (MS) and lack of penetration (LOP); the system, array, and method are capable of detecting and sizing surface and slightly subsurface flaws at various orientations in connection with at least the first and second weld passes.

  17. The Joint Modular Intermodal Container, is this the Future of Naval Logistics?

    DTIC Science & Technology

    2005-06-01

    pallet size. Contrast this with the commercial shipping industry, which for the last 40 years has been moving non-bulk goods in hyper-efficient container... a Heavy UNREP station than a current STREAM station. Figure 4: Heavy UNREP Enables New Loads to be Passed Between Ships... man-hours are being spent on inefficient and relatively inaccurate paper-based accounting methods. The industry standard for automated accounting

  18. Fast, Automated, Scalable Generation of Textured 3D Models of Indoor Environments

    DTIC Science & Technology

    2014-12-18

    expensive travel and on-site visits. Different applications require models of different complexities, both with and without furniture geometry. The... environment and to localize the system in the environment over time. The datasets shown in this paper were generated by a backpack-mounted system that uses 2D... voxel is found to intersect the line segment from a scanner to a corresponding scan point. If a laser passes through a voxel, that voxel is considered

  19. Development of a multiple-step process for the microbial decontamination of beef trim.

    PubMed

    Kang, D H; Koohmaraie, M; Dorsa, W J; Siragusa, G R

    2001-01-01

    A multiple-hurdle antimicrobial process for beef trim was developed. The microbial profiles of inoculated lean beef trim tissue (BTL) and fat-covered lean beef trim (BTF) were monitored during prolonged refrigerated storage following the application of successive multiple antimicrobial treatments applied to inoculated beef trim on a processing conveyor belt set at a belt speed of 1 cm/s. Beef trim (meat size approximately 15 by 15 cm) was preinoculated with bovine feces before all treatments that included the following: control, no treatment; water wash at 65 psi for five passes; water plus lactic acid (2% [vol/vol] room temperature lactic acid wash at 30 psi for three passes); combination treatment 1 (water plus 65 degrees C hot water at 30 psi for one pass plus hot air at 510 degrees C for four passes plus lactic acid), combination treatment 2 (water plus hot water at 82 degrees C for one pass plus hot air at 510 degrees C for five passes plus lactic acid), and combination treatment 3 (water plus hot water at 82 degrees C for three passes plus hot air at 510 degrees C for six passes plus lactic acid). The effects of treatments on bacterial populations were monitored by enumerating mesophilic aerobic bacteria (APC), presumptive lactic acid bacteria (PLAB), psychrotrophic bacteria (PCT), coliforms, and Escherichia coli biotype 1 on product stored for up to 7 days at 4 degrees C. In the case of BTL, the numbers of APC, PCT, and PLAB increased during storage at 5 degrees C, whereas the numbers of coliform and E. coli decreased on average by 1.8 log CFU/cm2, then remained constant following the initial reduction. Negligible effects on color quality were observed from multihurdle treatment combination 1. In the case of the BTF, the microbial reductions by treatments were much greater than the reduction on BTL. The pH of treated BTF increased more slowly than the pH of treated BTL, resulting in further reduction of the microflora on BTF. Except for control and water treatments, all sample treatments involving lactic acid resulted in continuously decreasing microbial populations. Based on microbial reduction and quality aspects, it was concluded that successively applied combination antimicrobial treatments for meat trim could offer potential food safety benefits.

  20. Validation of the G.LAB MD2200 wrist blood pressure monitor according to the European Society of Hypertension, the British Hypertension Society, and the International Organization for Standardization Protocols.

    PubMed

    Liu, Ze-Yu; Zhang, Qing-Han; Ye, Xiao-Lei; Liu, Da-Peng; Cheng, Kang; Zhang, Chun-Hai; Wan, Yi

    2017-04-01

    To validate the G.LAB MD2200 automated wrist blood pressure (BP) monitor according to the European Society of Hypertension International Protocol (ESH-IP) revision 2010, the British Hypertension Society (BHS), and the International Organization for Standardization (ISO) 81060-2:2013 protocols. The device was assessed on 33 participants according to the ESH requirements and was then tested on 85 participants according to the BHS and ISO 81060-2:2013 criteria. The validation procedures and data analysis followed the protocols precisely. The G.LAB MD2200 devices passed all parts of ESH-IP revision 2010 for both systolic and diastolic BP, with device-observer differences of 2.15±5.51 and 1.51±5.16 mmHg, respectively. The device achieved A/A grading for the BHS protocol, and it also fulfilled the criteria of ISO 81060-2:2013, with mean differences in systolic and diastolic BP between the device and the observer of 2.19±5.21 and 2.11±4.70 mmHg, respectively. The G.LAB MD2200 automated wrist BP monitor passed the ESH-IP revision 2010 and the ISO 81060-2:2013 protocol and achieved the A/A grade of the BHS protocol; it can therefore be recommended for self-measurement in the general population.

  1. Biosonar-inspired technology: goals, challenges and insights.

    PubMed

    Müller, Rolf; Kuc, Roman

    2007-12-01

    Bioinspired engineering based on biosonar systems in nature is reviewed and discussed in terms of the merits of different approaches and their results: biosonar systems are attractive technological paragons because of their capabilities, built-in task-specific knowledge, intelligent system integration and diversity. Insights from the diverse set of sensing tasks solved by bats are relevant to a wide range of application areas such as sonar, biomedical ultrasound, non-destructive testing, sensors for autonomous systems and wireless communication. Challenges in the design of bioinspired sonar systems are posed by transducer performance, actuation for sensor mobility, design, actuation and integration of beamforming baffle shapes, echo encoding for signal processing, estimation algorithms and their implementations, as well as system integration and feedback control. The discussed examples of experimental systems have capabilities that include localization and tracking using binaural and multiple-band hearing as well as self-generated dynamic cues, classification of small deterministic and large random targets, beamforming with bioinspired baffle shapes, neuromorphic spike processing, artifact rejection in sonar maps and passing range estimation. In future research, bioinspired engineering could capitalize on some of its strengths to serve as a model system for basic automation methodologies for the bioinspired engineering process.

  2. No wisdom in the crowd: genome annotation in the era of big data - current status and future prospects.

    PubMed

    Danchin, Antoine; Ouzounis, Christos; Tokuyasu, Taku; Zucker, Jean-Daniel

    2018-07-01

    Science and engineering rely on the accumulation and dissemination of knowledge to make discoveries and create new designs. Discovery-driven genome research rests on knowledge passed on via gene annotations. In response to the deluge of sequencing big data, standard annotation practice employs automated procedures that rely on majority rules. We argue this hinders progress through the generation and propagation of errors, leading investigators into blind alleys. More subtly, this inductive process discourages the discovery of novelty, which remains essential in biological research and reflects the nature of biology itself. Annotation systems, rather than being repositories of facts, should be tools that support multiple modes of inference. By combining deduction, induction and abduction, investigators can generate hypotheses when accurate knowledge is extracted from model databases. A key stance is to depart from 'the sequence tells the structure tells the function' fallacy, placing function first. We illustrate our approach with examples of critical or unexpected pathways, using MicroScope to demonstrate how tools can be implemented following the principles we advocate. We end with a challenge to the reader. © 2018 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.

  3. A simple algorithm for identifying periods of snow accumulation on a radiometer

    NASA Astrophysics Data System (ADS)

    Lapo, Karl E.; Hinkelman, Laura M.; Landry, Christopher C.; Massmann, Adam K.; Lundquist, Jessica D.

    2015-09-01

    Downwelling solar, Qsi, and longwave, Qli, irradiances at the earth's surface are the primary energy inputs for many hydrologic processes, and uncertainties in measurements of these two terms confound evaluations of estimated irradiances and negatively impact hydrologic modeling. Observations of Qsi and Qli in cold environments are subject to conditions that create additional uncertainties not encountered in other climates, specifically the accumulation of snow on uplooking radiometers. To address this issue, we present an automated method for estimating these periods of snow accumulation. Our method is based on forest interception of snow and uses common meteorological observations. In this algorithm, snow accumulation must exceed a threshold to obscure the sensor and is only removed through scouring by wind or melting. The algorithm is evaluated at two sites representing different mountain climates: (1) Snoqualmie Pass, Washington (maritime) and (2) the Senator Beck Basin Study Area, Colorado (continental). The algorithm agrees well with time-lapse camera observations at the Washington site and with multiple measurements at the Colorado site, with 70-80% of observed snow accumulation events correctly identified. We suggest using the method for quality controlling irradiance observations in snow-dominated climates where regular, daily maintenance is not possible.
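
    The accumulation logic described above lends itself to a simple bucket model. The sketch below is a hypothetical rendering of that logic with invented thresholds; the published algorithm's actual parameterization follows the forest-interception formulation in the paper.

        import numpy as np

        def flag_snow_on_dome(precip, tair, wind,
                              t_snow=0.5, t_melt=0.0, wind_scour=6.0,
                              cap=5.0, obscure=0.5):
            """Flag time steps when accumulated snow likely obscures the sensor.

            precip (mm per step), tair (deg C) and wind (m/s) are equal-length
            series; every threshold here is hypothetical, chosen only to
            illustrate the logic: snow accumulates during sub-freezing
            precipitation, must exceed a depth threshold to obscure the dome,
            and is removed only by wind scour or melt.
            """
            store = 0.0
            flagged = np.zeros(len(precip), dtype=bool)
            for k in range(len(precip)):
                if precip[k] > 0.0 and tair[k] <= t_snow:
                    store = min(store + precip[k], cap)   # new snow builds up
                if wind[k] >= wind_scour or tair[k] > t_melt:
                    store = 0.0                           # scoured off or melted
                flagged[k] = store > obscure
            return flagged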

  4. A Multiple Agent Model of Human Performance in Automated Air Traffic Control and Flight Management Operations

    NASA Technical Reports Server (NTRS)

    Corker, Kevin; Pisanich, Gregory; Condon, Gregory W. (Technical Monitor)

    1995-01-01

    A predictive model of human operator performance (flight crew and air traffic control (ATC)) has been developed and applied in order to evaluate the impact of automation developments in flight management and air traffic control. The model is used to predict the performance of a two person flight crew and the ATC operators generating and responding to clearances aided by the Center TRACON Automation System (CTAS). The purpose of the modeling is to support evaluation and design of automated aids for flight management and airspace management and to predict required changes in procedure both air and ground in response to advancing automation in both domains. Additional information is contained in the original extended abstract.

  5. Complacency and bias in human use of automation: an attentional integration.

    PubMed

    Parasuraman, Raja; Manzey, Dietrich H

    2010-06-01

    Our aim was to review empirical studies of complacency and bias in human interaction with automated and decision support systems and provide an integrated theoretical model for their explanation. Automation-related complacency and automation bias have typically been considered separately and independently. Studies on complacency and automation bias were analyzed with respect to the cognitive processes involved. Automation complacency occurs under conditions of multiple-task load, when manual tasks compete with the automated task for the operator's attention. Automation complacency is found in both naive and expert participants and cannot be overcome with simple practice. Automation bias results in making both omission and commission errors when decision aids are imperfect. Automation bias occurs in both naive and expert participants, cannot be prevented by training or instructions, and can affect decision making in individuals as well as in teams. While automation bias has been conceived of as a special case of decision bias, our analysis suggests that it also depends on attentional processes similar to those involved in automation-related complacency. Complacency and automation bias represent different manifestations of overlapping automation-induced phenomena, with attention playing a central role. An integrated model of complacency and automation bias shows that they result from the dynamic interaction of personal, situational, and automation-related characteristics. The integrated model and attentional synthesis provides a heuristic framework for further research on complacency and automation bias and design options for mitigating such effects in automated and decision support systems.

  6. Knowledge-based approaches to the maintenance of a large controlled medical terminology.

    PubMed Central

    Cimino, J J; Clayton, P D; Hripcsak, G; Johnson, S B

    1994-01-01

    OBJECTIVE: Develop a knowledge-based representation for a controlled terminology of clinical information to facilitate creation, maintenance, and use of the terminology. DESIGN: The Medical Entities Dictionary (MED) is a semantic network, based on the Unified Medical Language System (UMLS), with a directed acyclic graph to represent multiple hierarchies. Terms from four hospital systems (laboratory, electrocardiography, medical records coding, and pharmacy) were added as nodes in the network. Additional knowledge about terms, added as semantic links, was used to assist in integration, harmonization, and automated classification of disparate terminologies. RESULTS: The MED contains 32,767 terms and is in active clinical use. Automated classification was successfully applied to terms for laboratory specimens, laboratory tests, and medications. One benefit of the approach has been the automated inclusion of medications into multiple pharmacologic and allergenic classes that were not present in the pharmacy system. Another benefit has been the reduction of maintenance efforts by 90%. CONCLUSION: The MED is a hybrid of terminology and knowledge. It provides domain coverage, synonymy, consistency of views, explicit relationships, and multiple classification while preventing redundancy, ambiguity (homonymy) and misclassification. PMID:7719786
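
    A directed acyclic graph with multiple parents per node is what allows a single term to be classified under several hierarchies at once. A minimal sketch of that data structure (terms and links hypothetical, not taken from the MED):

        from collections import defaultdict

        # Parent links form a directed acyclic graph: a node may have several
        # parents, so one term can sit in multiple hierarchies at once.
        parents = defaultdict(set)
        parents["ibuprofen"] |= {"NSAID", "propionic acid derivative"}
        parents["NSAID"] |= {"anti-inflammatory agent"}
        parents["propionic acid derivative"] |= {"carboxylic acid"}

        def ancestors(term):
            """Every class the term belongs to, following all parent links."""
            seen, stack = set(), list(parents[term])
            while stack:
                p = stack.pop()
                if p not in seen:
                    seen.add(p)
                    stack.extend(parents[p])
            return seen

        assert {"NSAID", "anti-inflammatory agent"} <= ancestors("ibuprofen")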

  7. KAMO: towards automated data processing for microcrystals.

    PubMed

    Yamashita, Keitaro; Hirata, Kunio; Yamamoto, Masaki

    2018-05-01

    In protein microcrystallography, radiation damage often hampers complete and high-resolution data collection from a single crystal, even under cryogenic conditions. One promising solution is to collect small wedges of data (5-10°) separately from multiple crystals. The data from these crystals can then be merged into a complete reflection-intensity set. However, data processing of multiple small-wedge data sets is challenging. Here, a new open-source data-processing pipeline, KAMO, which utilizes existing programs, including the XDS and CCP4 packages, has been developed to automate whole data-processing tasks in the case of multiple small-wedge data sets. Firstly, KAMO processes individual data sets and collates those indexed with equivalent unit-cell parameters. The space group is then chosen and any indexing ambiguity is resolved. Finally, clustering is performed, followed by merging with outlier rejections, and a report is subsequently created. Using synthetic and several real-world data sets collected from hundreds of crystals, it was demonstrated that merged structure-factor amplitudes can be obtained in a largely automated manner using KAMO, which greatly facilitated the structure analyses of challenging targets that only produced microcrystals.
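
    One early step, collating data sets indexed with equivalent unit-cell parameters, can be pictured as tolerance-based grouping. The sketch below is a crude stand-in with hypothetical inputs; KAMO itself compares full cells from XDS output and additionally resolves space-group and indexing ambiguity.

        def group_by_cell(datasets, tol=0.02):
            """Greedily collate data sets whose cell edges agree within tol.

            datasets: list of (name, (a, b, c)) unit-cell edge lengths in
            angstroms; a crude stand-in for the collation step.
            """
            groups = []
            for name, cell in datasets:
                for g in groups:
                    ref = g[0][1]              # compare against group founder
                    if all(abs(x - y) / y < tol for x, y in zip(cell, ref)):
                        g.append((name, cell))
                        break
                else:
                    groups.append([(name, cell)])   # start a new group
            return groups

        # e.g. group_by_cell([("w1", (78.1, 78.3, 37.0)),
        #                     ("w2", (78.4, 78.0, 36.9))]) -> one group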

  8. Generation of X-rays by electrons recycling through thin internal targets of cyclic accelerators

    NASA Astrophysics Data System (ADS)

    Kaplin, V.; Kuznetsov, S.; Uglov, S.

    2018-05-01

    The use of thin (<10^-3 radiation length) internal targets in cyclic accelerators leads to multiple passes (recycling effect) of electrons through them. The multiplicity of electron passes (M) is determined by the electron energy, the accelerator parameters, and the thickness, structure and material of a target, and leads to an increase in the effective target thickness and the efficiency of radiation generation. The increase of M leads to an increase in the emittance of the electron beam, which can change the characteristics of radiation processes. The experimental results obtained using the Tomsk synchrotron and betatron showed the possibility of increasing the yield and brightness of coherent X-rays generated by electrons passing (recycling) through thin crystals and periodic multilayers placed into the chambers of the accelerators, in cases where the recycling effect did not influence the spectral and angular characteristics of the generated X-rays.

  9. Numerical simulation of aerodynamic performance of a couple multiple units high-speed train

    NASA Astrophysics Data System (ADS)

    Niu, Ji-qiang; Zhou, Dan; Liu, Tang-hong; Liang, Xi-feng

    2017-05-01

    In order to determine how the coupling region affects the aerodynamic performance of coupled multiple-unit trains when they run and pass each other in the open air, enter a tunnel, and pass each other in the tunnel, these scenarios were simulated in Fluent 14.0. The numerical algorithm employed in this study was verified against data from scaled and full-scale train tests, and the differences lie within an acceptable range. The results demonstrate that the distribution of aerodynamic forces on the train cars is altered by the coupling region; however, the coupling region has a marginal effect on the drag and lateral force on the whole train under crosswind, and the lateral force on the train cars is more sensitive to coupling than the other two force coefficients. The coupling region also increases the fluctuation of the aerodynamic coefficients of each train car under crosswind. The coupling region introduced a positive pressure pulse into the alternating pressure produced by trains passing each other in the open air, and it significantly decreased the amplitude of the alternating pressure on both the train and the tunnel. This phenomenon did not alter the distribution law of pressure on the train and tunnel; moreover, the effect of the coupling region on trains passing each other in the tunnel is stronger than that on a single train passing through the tunnel.

  10. Improving liquid chromatography-tandem mass spectrometry determinations by modifying noise frequency spectrum between two consecutive wavelet-based low-pass filtering procedures.

    PubMed

    Chen, Hsiao-Ping; Liao, Hui-Ju; Huang, Chih-Min; Wang, Shau-Chun; Yu, Sung-Nien

    2010-04-23

    This paper employs a chemometric technique that modifies the noise spectrum of a liquid chromatography-tandem mass spectrometry (LC-MS/MS) chromatogram between two consecutive wavelet-based low-pass filtering procedures to improve the enhancement of the peak signal-to-noise (S/N) ratio. Although similar techniques using other sets of low-pass procedures, such as matched filters, have been published, the procedures developed in this work avoid the peak-broadening disadvantages inherent in matched filters. In addition, unlike Fourier transform-based low-pass filters, wavelet-based filters efficiently reject noise in the chromatograms directly in the time domain without distorting the original signals. In this work, the low-pass filtering procedures sequentially convolve the original chromatograms against each set of low-pass filters to produce the approximation coefficients, representing the low-frequency wavelets, of the first five resolution levels. The tedious trials of setting threshold values to properly shrink each wavelet are therefore no longer required. The noise-modification technique multiplies one wavelet-based low-pass filtered LC-MS/MS chromatogram by an artificial chromatogram with added thermal noise before applying the second wavelet-based low-pass filter. Because a low-pass filter cannot eliminate frequency components below its cut-off frequency, consecutive low-pass filtering alone cannot achieve more efficient peak S/N ratio improvement for LC-MS/MS chromatograms. In contrast, when the low-pass filtered LC-MS/MS chromatogram is conditioned with this multiplication before the second low-pass filter, much better improvement is achieved: the noise spectrum of the filtered chromatogram, which originally contains only frequency components below the filter cut-off, is altered by the multiplication to span a broader range, and as it shifts toward the high-frequency regime the second low-pass filter provides better filtering efficiency and hence higher peak S/N ratios. For real LC-MS/MS chromatograms, two consecutive wavelet-based low-pass filters typically achieve less than 6-fold peak S/N ratio improvement, no better than a one-step wavelet-based low-pass filter; with the noise spectrum modified between the two filters, the improvement typically reaches 25-fold to 40-fold. The linear standard curves using the filtered LC-MS/MS signals were validated, and the filtered signals are reproducible. Determinations of very low concentration samples (S/N ratio about 7-9) are more accurate using the filtered signals than using the original signals. Copyright 2010 Elsevier B.V. All rights reserved.
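
    A hedged sketch of the two-stage procedure using the PyWavelets package: keep only the approximation coefficients (the low-frequency wavelets), multiply by an artificial noisy trace, then low-pass filter again. The wavelet choice, noise level, and decomposition depth are illustrative assumptions, not the paper's settings.

        import numpy as np
        import pywt

        def wavelet_lowpass(sig, wavelet="db4", level=5):
            """Keep only the approximation coefficients (low-frequency part)."""
            coeffs = pywt.wavedec(sig, wavelet, level=level)
            coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]  # drop details
            return pywt.waverec(coeffs, wavelet)[: len(sig)]

        def two_stage_filter(chrom, noise_sd=0.05, seed=0):
            """First low-pass, multiply by a noisy trace, then low-pass again."""
            first = wavelet_lowpass(chrom)
            rng = np.random.default_rng(seed)
            artificial = 1.0 + rng.normal(0.0, noise_sd, len(first))
            # multiplication pushes residual noise into higher bands, where the
            # second low-pass stage can remove it
            return wavelet_lowpass(first * artificial)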

  11. A post-processing system for automated rectification and registration of spaceborne SAR imagery

    NASA Technical Reports Server (NTRS)

    Curlander, John C.; Kwok, Ronald; Pang, Shirley S.

    1987-01-01

    An automated post-processing system has been developed that interfaces with the raw image output of the operational digital SAR correlator. This system is designed for optimal efficiency by using advanced signal processing hardware and an algorithm that requires no operator interaction, such as the determination of ground control points. The standard output is a geocoded image product (i.e. resampled to a specified map projection). The system is capable of producing multiframe mosaics for large-scale mapping by combining images in both the along-track direction and adjacent cross-track swaths from ascending and descending passes over the same target area. The output products have absolute location uncertainty of less than 50 m and relative distortion (scale factor and skew) of less than 0.1 per cent relative to local variations from the assumed geoid.

  12. The Careful Puppet Master: Reducing risk and fortifying acceptance testing with Jenkins CI

    NASA Astrophysics Data System (ADS)

    Smith, Jason A.; Richman, Gabriel; DeStefano, John; Pryor, James; Rao, Tejas; Strecker-Kellogg, William; Wong, Tony

    2015-12-01

    Centralized configuration management, including the use of automation tools such as Puppet, can greatly increase provisioning speed and efficiency when configuring new systems or making changes to existing systems, reduce duplication of work, and improve automated processes. However, centralized management also brings with it a level of inherent risk: a single change in just one file can quickly be pushed out to thousands of computers and, if that change is not properly and thoroughly tested and contains an error, could result in catastrophic damage to many services, potentially bringing an entire computer facility offline. Change management procedures can—and should—be formalized in order to prevent such accidents. However, like the configuration management process itself, if such procedures are not automated, they can be difficult to enforce strictly. Therefore, to reduce the risk of merging potentially harmful changes into our production Puppet environment, we have created an automated testing system, which includes the Jenkins CI tool, to manage our Puppet testing process. This system includes the proposed changes and runs Puppet on a pool of dozens of RedHat Enterprise Virtualization (RHEV) virtual machines (VMs) that replicate most of our important production services for the purpose of testing. This paper describes our automated test system and how it hooks into our production approval process for automatic acceptance testing. All pending changes that have been pushed to production must pass this validation process before they can be approved and merged into production.
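
    The gatekeeping step can be approximated by a dry run of the proposed configuration on the test pool before approval. A minimal sketch with hypothetical host names and manifest path; it relies only on Puppet's documented --noop and --detailed-exitcodes behaviour (exit code 0: no changes; 2: changes would have been applied; failures yield other codes).

        import subprocess

        TEST_HOSTS = ["rhev-vm01", "rhev-vm02"]        # hypothetical test pool

        def noop_ok(host, manifest="/etc/puppetlabs/code/site.pp"):
            """Dry-run the proposed manifest on a host; True if no errors."""
            cmd = ["ssh", host, "puppet", "apply", "--noop",
                   "--detailed-exitcodes", manifest]
            rc = subprocess.run(cmd, capture_output=True).returncode
            return rc in (0, 2)   # 0: no changes; 2: changes would be applied

        if not all(noop_ok(h) for h in TEST_HOSTS):
            raise SystemExit("no-op validation failed; change must not be merged")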

  13. A Comparison of Two Scoring Methods for an Automated Speech Scoring System

    ERIC Educational Resources Information Center

    Xi, Xiaoming; Higgins, Derrick; Zechner, Klaus; Williamson, David

    2012-01-01

    This paper compares two alternative scoring methods--multiple regression and classification trees--for an automated speech scoring system used in a practice environment. The two methods were evaluated on two criteria: construct representation and empirical performance in predicting human scores. The empirical performance of the two scoring models…

  14. NCLEX-RN Examination Performance by BSN Graduates of Four Historically Black Colleges and Universities

    ERIC Educational Resources Information Center

    Chesney, Anita M.

    2010-01-01

    This qualitative multiple-case study research explored and described differences as well as NCLEX-RN preparation strategies used by Historically Black College and University (HBCU) baccalaureate nursing programs with consistent NCLEX pass rates versus those with inconsistent pass rates. Two of the four selected programs had a history of consistent…

  15. A new automated multiple allergen simultaneous test-chemiluminescent assay (MAST-CLA) using an AP720S analyzer.

    PubMed

    Lee, Sungsil; Lim, Hwan Sub; Park, Jungyong; Kim, Hyon Suk

    2009-04-01

    In the diagnosis of atopic diseases, allergen detection is a crucial step. The multiple allergen simultaneous test-chemiluminescent assay (MAST-CLA) is a simple and noninvasive method for in vitro screening of allergen-specific IgE antibodies. The Korean Inhalant Panel test on 20 patients and the Food Panel test on 19 patients were performed using the conventional manual MAST-CLA kit and the new automated MAST-CLA method (automated AP720S system for the Optigen Assay; Hitachi Chemical Diagnostics, Inc., USA) simultaneously. The results were evaluated for positive reactivity and concordance. The Inhalant Panel gave relatively higher class level results than the Food Panel. Eight of the 20 patients (40%) in the Inhalant Panel and 9 of the 19 patients (47.4%) in the Food Panel showed 100% concordance between the 2 systems. Eighteen patients (90%) of the Inhalant Panel and sixteen patients (84.2%) of the Food Panel showed more than 91% concordance. These results suggest that the MAST-CLA assay using the new, automated AP720S analyzer performs well, showing a high concordance rate with conventional MAST-CLA. Compared to manual MAST-CLA, the automated AP720S system has a shorter assay time and uses a smaller serum volume (500 microl), along with other conveniences.

  16. Ultrascalable petaflop parallel supercomputer

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Chiu, George [Cross River, NY; Cipolla, Thomas M [Katonah, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Hall, Shawn [Pleasantville, NY; Haring, Rudolf A [Cortlandt Manor, NY; Heidelberger, Philip [Cortlandt Manor, NY; Kopcsay, Gerard V [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Salapura, Valentina [Chappaqua, NY; Sugavanam, Krishnan [Mahopac, NY; Takken, Todd [Brewster, NY

    2010-07-20

    A massively parallel supercomputer of petaOPS-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC) having up to four processing elements. The ASIC nodes are interconnected by multiple independent networks that optimally maximize the throughput of packet communications between nodes with minimal latency. The multiple networks may include three high-speed networks for parallel algorithm message passing including a Torus, collective network, and a Global Asynchronous network that provides global barrier and notification functions. These multiple independent networks may be collaboratively or independently utilized according to the needs or phases of an algorithm for optimizing algorithm processing performance. The use of a DMA engine is provided to facilitate message passing among the nodes without the expenditure of processing resources at the node.
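
    The torus interconnect gives every node six nearest neighbours with wraparound links. A small sketch of the addressing this implies (coordinates and machine dimensions hypothetical):

        def torus_neighbors(x, y, z, dims=(8, 8, 8)):
            """The six nearest neighbours of node (x, y, z) on a 3D torus."""
            X, Y, Z = dims
            return [((x + 1) % X, y, z), ((x - 1) % X, y, z),
                    (x, (y + 1) % Y, z), (x, (y - 1) % Y, z),
                    (x, y, (z + 1) % Z), (x, y, (z - 1) % Z)]

        # Wraparound links keep every node's degree at six, so a message can
        # always hop along the shorter direction around each dimension.
        assert torus_neighbors(0, 0, 0)[1] == (7, 0, 0)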

  17. Illumina Unamplified Indexed Library Construction: An Automated Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hack, Christopher A.; Sczyrba, Alexander; Cheng, Jan-Fang

    Manual library construction is a limiting factor in Illumina sequencing. Constructing libraries by hand is costly, time-consuming, low-throughput, and ergonomically hazardous, and constructing multiple libraries introduces risk of library failure due to pipetting errors. The ability to construct multiple libraries simultaneously in automated fashion represents significant cost and time savings. Here we present a strategy to construct up to 96 unamplified indexed libraries using Illumina TruSeq reagents and a Biomek FX robotic platform. We also present data to indicate that this library construction method has little or no risk of cross-contamination between samples.

  18. Developing Image Processing Meta-Algorithms with Data Mining of Multiple Metrics

    PubMed Central

    Cunha, Alexandre; Toga, A. W.; Parker, D. Stott

    2014-01-01

    People often use multiple metrics in image processing, but here we take a novel approach of mining the values of batteries of metrics on image processing results. We present a case for extending image processing methods to incorporate automated mining of multiple image metric values. Here by a metric we mean any image similarity or distance measure, and in this paper we consider intensity-based and statistical image measures and focus on registration as an image processing problem. We show how it is possible to develop meta-algorithms that evaluate different image processing results with a number of different metrics and mine the results in an automated fashion so as to select the best results. We show that the mining of multiple metrics offers a variety of potential benefits for many image processing problems, including improved robustness and validation. PMID:24653748

  19. Performance of the Automated Self-Administered 24-hour Recall relative to a measure of true intakes and to an interviewer-administered 24-h recall.

    PubMed

    Kirkpatrick, Sharon I; Subar, Amy F; Douglass, Deirdre; Zimmerman, Thea P; Thompson, Frances E; Kahle, Lisa L; George, Stephanie M; Dodd, Kevin W; Potischman, Nancy

    2014-07-01

    The Automated Self-Administered 24-hour Recall (ASA24), a freely available Web-based tool, was developed to enhance the feasibility of collecting high-quality dietary intake data from large samples. The purpose of this study was to assess the criterion validity of ASA24 through a feeding study in which the true intake for 3 meals was known. True intake and plate waste from 3 meals were ascertained for 81 adults by inconspicuously weighing foods and beverages offered at a buffet before and after each participant served him- or herself. Participants were randomly assigned to complete an ASA24 or an interviewer-administered Automated Multiple-Pass Method (AMPM) recall the following day. With the use of linear and Poisson regression analysis, we examined the associations between recall mode and 1) the proportions of items consumed for which a match was reported and that were excluded, 2) the number of intrusions (items reported but not consumed), and 3) differences between energy, nutrient, food group, and portion size estimates based on true and reported intakes. Respondents completing ASA24 reported 80% of items truly consumed compared with 83% in AMPM (P = 0.07). For both ASA24 and AMPM, additions to or ingredients in multicomponent foods and drinks were more frequently omitted than were main foods or drinks. The number of intrusions was higher in ASA24 (P < 0.01). Little evidence of differences by recall mode was found in the gap between true and reported energy, nutrient, and food group intakes or portion sizes. Although the interviewer-administered AMPM performed somewhat better relative to true intakes for matches, exclusions, and intrusions, ASA24 performed well. Given the substantial cost savings that ASA24 offers, it has the potential to make important contributions to research aimed at describing the diets of populations, assessing the effect of interventions on diet, and elucidating diet and health relations. This trial was registered at clinicaltrials.gov as NCT00978406. © 2014 American Society for Nutrition.

  20. Public health surveillance of automated external defibrillators in the USA: protocol for the dynamic automated external defibrillator registry study.

    PubMed

    Elrod, JoAnn Broeckel; Merchant, Raina; Daya, Mohamud; Youngquist, Scott; Salcido, David; Valenzuela, Terence; Nichol, Graham

    2017-03-29

    Lay use of automated external defibrillators (AEDs) before the arrival of emergency medical services (EMS) providers on scene increases survival after out-of-hospital cardiac arrest (OHCA). AEDs that have been placed in public locations may not be ready for use when needed. We describe a protocol for AED surveillance that tracks these devices through time and space to improve public health and survival, as well as to facilitate research. Included AEDs are installed in public locations for use by laypersons to treat patients with OHCA before the arrival of EMS providers on scene. Included cases of OHCA are patients evaluated by organised EMS personnel and treated for OHCA. Enrolment of 10 000 AEDs annually will yield precision of 0.4% in the estimate of readiness for use. Enrolment of 2500 patients annually will yield precision of 1.9% in the estimate of survival to hospital discharge. Recruitment began on 21 Mar 2014 and is ongoing. AEDs are found by using multiple methods. Each AED is then tagged with a label bearing a unique two-dimensional (2D) matrix code; the 2D matrix code is recorded and the location and status of the AED are tracked using a smartphone; these elements are automatically passed via the internet to a secure and confidential database in real time. Whenever the 2D matrix code is rescanned for any non-clinical or clinical use of an AED, the user is queried to answer a finite set of questions about the device status. The primary outcome of any clinical use of an AED is survival to hospital discharge. Results are summarised descriptively. These activities are conducted under a grant of authority for public health surveillance from the Food and Drug Administration. Results are provided periodically to participating sites and sponsors to improve public health and quality of care. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  1. Demonstration of improvement in the signal-to-noise ratio of Thomson scattering signal obtained by using a multi-pass optical cavity on the Tokyo Spherical Tokamak-2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Togashi, H., E-mail: togashi@fusion.k.u-tokyo.ac.jp; Ejiri, A.; Nakamura, K.

    2014-11-15

The multi-pass Thomson scattering (TS) scheme enables obtaining many photons by accumulating multiple TS signals. The signal-to-noise ratio (SNR) depends on the accumulation number. In this study, we performed multi-pass TS measurements for ohmically heated plasmas, and the relationship between SNR and the accumulation number was investigated. The SNR improvement observed in this experiment showed a tendency similar to that calculated for the background-noise-dominant situation.
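    A minimal numerical sketch of why SNR grows with the accumulation number N in the background-noise-dominant case: the accumulated signal scales as N while independent noise adds in quadrature and scales as sqrt(N), so SNR scales as sqrt(N). The per-pass signal and noise levels below are arbitrary.

    ```python
    # SNR vs. accumulation number under background-noise-dominant conditions
    import numpy as np

    rng = np.random.default_rng(0)
    signal_per_pass = 1.0   # TS photoelectrons per pass (arbitrary units)
    noise_sigma = 5.0       # dominant background noise per pass

    for n_passes in [1, 4, 16, 64]:
        # accumulate N passes; independent noise grows only as sqrt(N)
        trials = signal_per_pass * n_passes + rng.normal(
            0.0, noise_sigma * np.sqrt(n_passes), size=100_000)
        snr = trials.mean() / trials.std()
        prediction = signal_per_pass * np.sqrt(n_passes) / noise_sigma
        print(f"N={n_passes:3d}  SNR~{snr:5.2f}  sqrt(N) prediction: {prediction:5.2f}")
    ```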

  2. Effects of automation of information-processing functions on teamwork.

    PubMed

    Wright, Melanie C; Kaber, David B

    2005-01-01

    We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.

  3. Development of an integrated semi-automated system for in vitro pharmacodynamic modelling.

    PubMed

    Wang, Liangsu; Wismer, Michael K; Racine, Fred; Conway, Donald; Giacobbe, Robert A; Berejnaia, Olga; Kath, Gary S

    2008-11-01

The aim of this study was to develop an integrated system for in vitro pharmacodynamic modelling of antimicrobials with greater flexibility, easier control and better accuracy than existing in vitro models. Custom-made bottle caps, fittings, valve controllers and a modified bench-top shaking incubator were used. A temperature-controlled automated sample collector was built. Computer software was developed to manage experiments and to control the entire system including solenoid pinch valves, peristaltic pumps and the sample collector. The system was validated by pharmacokinetic simulations of linezolid 600 mg infusion. The antibacterial effect of linezolid against multiple Staphylococcus aureus strains was also studied in this system. An integrated semi-automated bench-top system was built and validated. The temperature-controlled automated sample collector allowed unattended collection and temporary storage of samples. The system software reduced the labour necessary for many tasks and also improved the timing accuracy for performing simultaneous actions in multiple parallel experiments. The system was able to simulate human pharmacokinetics of linezolid 600 mg intravenous infusion accurately. A pharmacodynamic study of linezolid against multiple S. aureus strains with a range of MICs showed that the required 24 h free drug AUC/MIC ratio was approximately 30 in order to keep the organism counts at the same level as their initial inoculum and was ≥68 in order to achieve a >2 log10 cfu/mL reduction in the in vitro model. The integrated semi-automated bench-top system provided the ability to overcome many of the drawbacks of existing in vitro models. It can be used for various simple or complicated pharmacokinetic/pharmacodynamic studies efficiently and conveniently.
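    The kind of concentration-time profile such a system is validated against can be sketched with a one-compartment intravenous infusion model. The PK parameters below are illustrative placeholders, not values from the study.

    ```python
    # Hedged one-compartment model of a 600 mg infusion; parameters assumed.
    import numpy as np

    dose_mg = 600.0
    t_inf_h = 0.5          # infusion duration (assumed)
    half_life_h = 5.0      # elimination half-life (assumed)
    vd_l = 40.0            # volume of distribution (assumed)

    k = np.log(2) / half_life_h    # elimination rate constant (1/h)
    cl = k * vd_l                  # clearance (L/h)
    rate = dose_mg / t_inf_h       # infusion rate (mg/h)

    def conc(t):
        """Plasma concentration (mg/L) at t hours after infusion start."""
        if t <= t_inf_h:
            return rate / cl * (1 - np.exp(-k * t))           # during infusion
        c_end = rate / cl * (1 - np.exp(-k * t_inf_h))        # end of infusion
        return c_end * np.exp(-k * (t - t_inf_h))             # mono-exponential decay

    for t in [0.25, 0.5, 1, 2, 6, 12, 24]:
        print(f"t = {t:5.2f} h  C = {conc(t):5.2f} mg/L")
    ```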

  4. Long-term maintenance of human induced pluripotent stem cells by automated cell culture system.

    PubMed

    Konagaya, Shuhei; Ando, Takeshi; Yamauchi, Toshiaki; Suemori, Hirofumi; Iwata, Hiroo

    2015-11-17

Pluripotent stem cells, such as embryonic stem cells and induced pluripotent stem (iPS) cells, are regarded as new sources for cell replacement therapy. These cells can expand without limit in the undifferentiated state and can be differentiated into multiple cell types. Automated culture systems enable the large-scale production of cells. In addition to reducing the time and effort of researchers, an automated culture system improves the reproducibility of cell cultures. In the present study, we newly designed a fully automated cell culture system for human iPS maintenance. Using the automated culture system, hiPS cells maintained their undifferentiated state for 60 days. hiPS cells prepared by the automated system retained the potential to differentiate into cells of all three germ layers, including dopaminergic neurons and pancreatic cells.

  5. A multiple-drawer medication layout problem in automated dispensing cabinets.

    PubMed

    Pazour, Jennifer A; Meller, Russell D

    2012-12-01

    In this paper we investigate the problem of locating medications in automated dispensing cabinets (ADCs) to minimize human selection errors. We formulate the multiple-drawer medication layout problem and show that the problem can be formulated as a quadratic assignment problem. As a way to evaluate various medication layouts, we develop a similarity rating for medication pairs. To solve industry-sized problem instances, we develop a heuristic approach. We use hospital ADC transaction data to conduct a computational experiment to test the performance of our developed heuristics, to demonstrate how our approach can aid in ADC design trade-offs, and to illustrate the potential improvements that can be made when applying an analytical process to the multiple-drawer medication layout problem. Finally, we present conclusions and future research directions.
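    The paper formulates layout as a quadratic assignment problem; the toy below brute-forces a tiny instance of a QAP-flavoured objective (pairwise similarity weighted by slot closeness) so that highly similar medications land far apart. The similarity matrix and the linear drawer geometry are invented, and the paper uses a heuristic rather than brute force for industry-sized instances.

    ```python
    # Toy QAP-style layout: keep similar-looking medications far apart.
    import itertools
    import numpy as np

    n = 6                                   # medications and drawer slots
    rng = np.random.default_rng(1)
    sim = rng.random((n, n))
    sim = (sim + sim.T) / 2                 # symmetric similarity ratings
    np.fill_diagonal(sim, 0)
    slot_pos = np.arange(n)                 # slots on a line; distance = |i - j|

    best_cost, best_perm = np.inf, None
    for perm in itertools.permutations(range(n)):   # exhaustive only for tiny n
        # objective: similarity weighted by closeness 1/(1+distance)
        cost = sum(sim[a, b] / (1 + abs(slot_pos[perm[a]] - slot_pos[perm[b]]))
                   for a in range(n) for b in range(a + 1, n))
        if cost < best_cost:
            best_cost, best_perm = cost, perm
    print("best layout:", best_perm, "cost:", round(best_cost, 3))
    ```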

  6. A data-driven multiplicative fault diagnosis approach for automation processes.

    PubMed

    Hao, Haiyang; Zhang, Kai; Ding, Steven X; Chen, Zhiwen; Lei, Yaguo

    2014-09-01

    This paper presents a new data-driven method for diagnosing multiplicative key performance degradation in automation processes. Different from the well-established additive fault diagnosis approaches, the proposed method aims at identifying those low-level components which increase the variability of process variables and cause performance degradation. Based on process data, features of multiplicative fault are extracted. To identify the root cause, the impact of fault on each process variable is evaluated in the sense of contribution to performance degradation. Then, a numerical example is used to illustrate the functionalities of the method and Monte-Carlo simulation is performed to demonstrate the effectiveness from the statistical viewpoint. Finally, to show the practical applicability, a case study on the Tennessee Eastman process is presented. Copyright © 2013. Published by Elsevier Ltd.

  7. Automated hardwood lumber grading utilizing a multiple sensor machine vision technology

    Treesearch

    D. Earl Kline; Chris Surak; Philip A. Araman

    2003-01-01

    Over the last 10 years, scientists at the Thomas M. Brooks Forest Products Center, the Bradley Department of Electrical and Computer Engineering, and the USDA Forest Service have been working on lumber scanning systems that can accurately locate and identify defects in hardwood lumber. Current R&D efforts are targeted toward developing automated lumber grading...

  8. Applying State-of-the-Art Technologies to Reduce Escape Times from Fires Using Environmental Sensing, Improved Occupant Egress Guidance, and Multiple Communication Protocols

    DTIC Science & Technology

    2009-02-06

that could monitor sensors, evaluate environmental conditions, and control visual and sound devices was conducted. The home automation products used...the prototype system. Use of off-the-shelf home automation products allowed the implementation of an egress control prototype suitable for test and

  9. Standards for space automation and robotics

    NASA Technical Reports Server (NTRS)

    Kader, Jac B.; Loftin, R. B.

    1992-01-01

    The AIAA's Committee on Standards for Space Automation and Robotics (COS/SAR) is charged with the identification of key functions and critical technologies applicable to multiple missions that reflect fundamental consideration of environmental factors. COS/SAR's standards/practices/guidelines implementation methods will be based on reliability, performance, and operations, as well as economic viability and life-cycle costs, simplicity, and modularity.

  10. An Evaluation of the Effectiveness of an Automated Observation and Feedback System on Safe Sitting Postures

    ERIC Educational Resources Information Center

    Yu, Eunjeong; Moon, Kwangsu; Oah, Shezeen; Lee, Yohaeng

    2013-01-01

    This study evaluated the effectiveness of an automated observation and feedback system in improving safe sitting postures. Participants were four office workers. The dependent variables were the percentages of time participants spent in five safe body positions during experimental sessions. We used a multiple-baseline design counterbalanced across…

  11. Differentiating Instruction through Multiple Intelligences in a Middle School Mathematics Classroom

    ERIC Educational Resources Information Center

    Jones, Marcella

    2017-01-01

    Eighth grade students at a middle school in a southern state were required a mathematics pass rate of 67.6% to meet annual yearly progress (AYP). Black and Hispanic students performed below the required pass rate on state assessments; thus, the school did not make AYP from 2007-2010. In an attempt to address low test scores in mathematics, the…

  12. FCAT Retakes: Trends in Multiple Attempts at Satisfying FCAT Graduation Requirements. Research Brief. Volume 0805

    ERIC Educational Resources Information Center

    Froman, Terry; Brown, Shelly

    2009-01-01

    According to Florida Law, students must pass the Grade 10 FCAT, among other academic requirements, in order to receive a standard high school diploma. Specifically, students must achieve a "passing" score of 300 or above on both the FCAT SSS Reading and the FCAT SSS Mathematics tests. Technically, students can retake the FCAT as many…

  13. Automated extraction of radiation dose information from CT dose report images.

    PubMed

    Li, Xinhua; Zhang, Da; Liu, Bob

    2011-06-01

    The purpose of this article is to describe the development of an automated tool for retrieving texts from CT dose report images. Optical character recognition was adopted to perform text recognitions of CT dose report images. The developed tool is able to automate the process of analyzing multiple CT examinations, including text recognition, parsing, error correction, and exporting data to spreadsheets. The results were precise for total dose-length product (DLP) and were about 95% accurate for CT dose index and DLP of scanned series.
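    A hedged sketch of the pipeline described: OCR the report image, then parse dose values with regular expressions. pytesseract stands in for whichever OCR engine the article used, and the line format the regex assumes is invented.

    ```python
    # OCR a CT dose report image and extract CTDIvol/DLP pairs (assumed layout).
    import re
    import pytesseract
    from PIL import Image

    def parse_dose_report(image_path):
        text = pytesseract.image_to_string(Image.open(image_path))
        records = []
        # assumed line format: "... CTDIvol <mGy> ... DLP <mGy-cm>"
        for line in text.splitlines():
            m = re.search(r"CTDIvol\s*[:=]?\s*([\d.]+).*?DLP\s*[:=]?\s*([\d.]+)", line)
            if m:
                records.append({"ctdi_vol": float(m.group(1)),
                                "dlp": float(m.group(2))})
        return records

    # records = parse_dose_report("dose_report.png")  # then export to a spreadsheet
    ```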

  14. Community lay rescuer automated external defibrillation programs: key state legislative components and implementation strategies: a summary of a decade of experience for healthcare providers, policymakers, legislators, employers, and community leaders from the American Heart Association Emergency Cardiovascular Care Committee, Council on Clinical Cardiology, and Office of State Advocacy.

    PubMed

    Aufderheide, Tom; Hazinski, Mary Fran; Nichol, Graham; Steffens, Suzanne Smith; Buroker, Andrew; McCune, Robin; Stapleton, Edward; Nadkarni, Vinay; Potts, Jerry; Ramirez, Raymond R; Eigel, Brian; Epstein, Andrew; Sayre, Michael; Halperin, Henry; Cummins, Richard O

    2006-03-07

Cardiovascular disease is a leading cause of death for adults ≥40 years of age. The American Heart Association (AHA) estimates that sudden cardiac arrest is responsible for about 250,000 out-of-hospital deaths annually in the United States. Since the early 1990s, the AHA has called for innovative approaches to reduce time to cardiopulmonary resuscitation (CPR) and defibrillation and improve survival from sudden cardiac arrest. In the mid-1990s, the AHA launched a public health initiative to promote early CPR and early use of automated external defibrillators (AEDs) by trained lay responders in community (lay rescuer) AED programs. Between 1995 and 2000, all 50 states passed laws and regulations concerning lay rescuer AED programs. In addition, the Cardiac Arrest Survival Act (CASA, Public Law 106-505) was passed and signed into federal law in 2000. The variations in state and federal legislation and regulations have complicated efforts to promote lay rescuer AED programs and in some cases have created impediments to such programs. Since 2000, most states have reexamined lay rescuer AED statutes, and many have passed legislation to remove impediments and encourage the development of lay rescuer AED programs. The purpose of this statement is to help policymakers develop new legislation or revise existing legislation to remove barriers to effective community lay rescuer AED programs. Important areas that should be considered in state legislation and regulations are highlighted, and sample legislation sections are included. Potential sources of controversy and the rationale for proposed legislative components are noted. This statement will not address legislation to support home AED programs. Such recommendations may be made after the conclusion of a large study of home AED use.

  15. Multiphysical FE-analysis of a front-end bending phenomenon in a hot strip mill

    NASA Astrophysics Data System (ADS)

    Ilmola, Joonas; Seppälä, Oskari; Leinonen, Olli; Pohjonen, Aarne; Larkiola, Jari; Jokisaari, Juha; Putaansuu, Eero

    2018-05-01

In hot steel rolling processes, a slab is generally rolled to a transfer bar in a roughing process and to a strip in a hot strip rolling process. Over several rolling passes the front-end may bend upward or downward due to asymmetrical rolling conditions, causing entry problems in the next rolling pass. Many different factors may affect the front-end bending phenomenon, and they are very challenging to measure. Thus, a customized finite element model is designed and built to simulate the front-end bending phenomenon in a hot strip rolling process. To simulate the functioning of the hot strip mill precisely, the automated control logic of the mill must be considered. In this paper we studied the effect of roll bite friction conditions and the amount of reduction on the front-end bending phenomenon in a hot strip rolling process.

  16. Universal explosive detection system for homeland security applications

    NASA Astrophysics Data System (ADS)

    Lee, Vincent Y.; Bromberg, Edward E. A.

    2010-04-01

L-3 Communications CyTerra Corporation has developed a high throughput universal explosive detection system (PassPort) to automatically screen passengers in airports without requiring them to remove their shoes. The technical approach is based on the patented energetic material detection (EMD) technology. By analyzing the results of sample heating with an infrared camera, one can distinguish the deflagration or decomposition of an energetic material from other clutter such as flammables and general background substances. This becomes the basis of a universal explosive detection system that does not require a library and is capable of detecting trace levels of explosives with a low false alarm rate. The PassPort is a simple turnstile-type device and integrates a non-intrusive aerodynamic sampling scheme that has been shown capable of detecting trace levels of explosives on shoes. A detailed description of the detection theory and the automated sampling techniques, as well as the field test results, will be presented.

  17. Inventory management and reagent supply for automated chemistry.

    PubMed

    Kuzniar, E

    1999-08-01

    Developments in automated chemistry have kept pace with developments in HTS such that hundreds of thousands of new compounds can be rapidly synthesized in the belief that the greater the number and diversity of compounds that can be screened, the more successful HTS will be. The increasing use of automation for Multiple Parallel Synthesis (MPS) and the move to automated combinatorial library production is placing an overwhelming burden on the management of reagents. Although automation has improved the efficiency of the processes involved in compound synthesis, the bottleneck has shifted to ordering, collating and preparing reagents for automated chemistry resulting in loss of time, materials and momentum. Major efficiencies have already been made in the area of compound management for high throughput screening. Most of these efficiencies have been achieved with sophisticated library management systems using advanced engineering and data handling for the storage, tracking and retrieval of millions of compounds. The Automation Partnership has already provided many of the top pharmaceutical companies with modular automated storage, preparation and retrieval systems to manage compound libraries for high throughput screening. This article describes how these systems may be implemented to solve the specific problems of inventory management and reagent supply for automated chemistry.

  18. Physiological Self-Regulation and Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode that had a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
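    A sketch of the closed-loop idea: an EEG-derived engagement index drives which mode the tracking task runs in. The beta/(alpha+theta) ratio follows the index commonly used in this line of work; the band powers and mode thresholds below are placeholders.

    ```python
    # Engagement-index-driven task mode selection (thresholds assumed).
    def engagement_index(beta_power, alpha_power, theta_power):
        # ratio of "alert" beta activity to "relaxed" alpha+theta activity
        return beta_power / (alpha_power + theta_power)

    def choose_mode(index, low=0.4, high=0.7):
        if index < low:
            return "manual"            # low engagement: hand the task back
        if index > high:
            return "automatic"         # high engagement: offload the task
        return "adaptive_aiding"

    for idx in [0.3, 0.55, 0.9]:
        print(f"index={idx:.2f} -> {choose_mode(idx)}")
    ```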

  19. 3D Imaging and Automated Ice Bottom Tracking of Canadian Arctic Archipelago Ice Sounding Data

    NASA Astrophysics Data System (ADS)

    Paden, J. D.; Xu, M.; Sprick, J.; Athinarapu, S.; Crandall, D.; Burgess, D. O.; Sharp, M. J.; Fox, G. C.; Leuschen, C.; Stumpf, T. M.

    2016-12-01

The basal topography of the Canadian Arctic Archipelago ice caps is unknown for a number of the glaciers which drain the ice caps. The basal topography is needed for calculating present sea level contribution using the surface mass balance and discharge method and to understand future sea level contributions using ice flow model studies. During the NASA Operation IceBridge 2014 arctic campaign, the Multichannel Coherent Radar Depth Sounder (MCoRDS) used a three-transmit-beam setting (left beam, nadir beam, right beam) to illuminate a wide swath across the glacier in a single pass during three flights over the archipelago. In post processing we have used a combination of 3D imaging methods to produce images for each of the three beams, which are then merged to produce a single digitally formed wide swath beam. Because of the high volume of data produced by 3D imaging, manual tracking of the ice bottom is impractical on a large scale. To solve this problem, we propose an automated technique for extracting ice bottom surfaces by viewing the task as an inference problem on a probabilistic graphical model. We first estimate layer boundaries to generate a seed surface, and then incorporate additional sources of evidence, such as ice masks, surface digital elevation models, and feedback from human users, to refine the surface in a discrete energy minimization formulation. We investigate the performance of the imaging and tracking algorithms using flight crossovers, since crossing lines should produce consistent maps of the terrain beneath the ice surface, and compare manually tracked "ground truth" to the automated tracking algorithms. We found the swath width at the nominal flight altitude of 1000 m to be approximately 3 km. Since many of the glaciers in the archipelago are narrower than this, the radar imaging, in these instances, was able to measure the full glacier cavity in a single pass.
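    A toy dynamic-programming version of the discrete energy minimization idea: pick one ice-bottom row per radar column, trading echo strength (unary cost) against smoothness between adjacent columns (pairwise cost). The real model also folds in ice masks, DEMs, and user feedback; everything below is illustrative.

    ```python
    # Viterbi-style layer tracking over a cost image.
    import numpy as np

    def track_layer(cost, smooth_weight=1.0):
        """cost[r, c]: unary cost of placing the layer at row r in column c."""
        n_rows, n_cols = cost.shape
        total = cost.copy()
        back = np.zeros((n_rows, n_cols), dtype=int)
        for c in range(1, n_cols):
            for r in range(n_rows):
                # penalty grows with the row jump between adjacent columns
                trans = total[:, c - 1] + smooth_weight * (np.arange(n_rows) - r) ** 2
                back[r, c] = int(np.argmin(trans))
                total[r, c] += trans[back[r, c]]
        path = [int(np.argmin(total[:, -1]))]
        for c in range(n_cols - 1, 0, -1):
            path.append(back[path[-1], c])
        return path[::-1]   # one layer row per column

    rng = np.random.default_rng(2)
    print(track_layer(rng.random((50, 40)))[:10])
    ```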

  20. Automated identification of best-quality coronary artery segments from multiple-phase coronary CT angiography (cCTA) for vessel analysis

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Hadjiiski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A.

    2016-03-01

We are developing an automated method to identify the best quality segment among the corresponding segments in multiple-phase cCTA. The coronary artery trees are automatically extracted from different cCTA phases using our multi-scale vessel segmentation and tracking method. An automated registration method is then used to align the multiple-phase artery trees. The corresponding coronary artery segments are identified in the registered vessel trees and are straightened by curved planar reformation (CPR). Four features are extracted from each segment in each phase as quality indicators in the original CT volume and the straightened CPR volume. Each quality indicator is used as a voting classifier to vote on the corresponding segments. A newly designed weighted voting ensemble (WVE) classifier is finally used to determine the best-quality coronary segment. An observer preference study is conducted with three readers to visually rate the quality of the vessels on a 1-to-6 ranking scale. Six and 10 cCTA cases are used as the training and test sets in this preliminary study. For the 10 test cases, the agreement between automatically identified best-quality (AI-BQ) segments and the radiologist's top 2 rankings is 79.7%; the agreement between AI-BQ and the other two readers is 74.8% and 83.7%, respectively. The results demonstrated that the performance of our automated method was comparable to that of experienced readers for identification of the best-quality coronary segments.
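    A hedged sketch of a weighted voting ensemble over per-phase quality indicators: each indicator votes for the phase whose segment it scores best, and votes are combined with weights. The four actual indicators and their learned weights are not reproduced here; the numbers are invented.

    ```python
    # Weighted voting across quality indicators to pick the best phase.
    import numpy as np

    def best_quality_phase(scores, weights):
        """scores[i, p]: quality indicator i for phase p (higher = better)."""
        votes = np.zeros(scores.shape[1])
        for i in range(scores.shape[0]):
            votes[np.argmax(scores[i])] += weights[i]   # indicator i votes once
        return int(np.argmax(votes))

    scores = np.array([[0.7, 0.9, 0.6],    # indicator 1 across 3 phases
                       [0.8, 0.7, 0.5],
                       [0.6, 0.9, 0.7],
                       [0.5, 0.8, 0.9]])
    weights = np.array([0.3, 0.2, 0.3, 0.2])
    print("best phase:", best_quality_phase(scores, weights))
    ```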

  1. Automated centrifugal-microfluidic platform for DNA purification using laser burst valve and coriolis effect.

    PubMed

    Choi, Min-Seong; Yoo, Jae-Chern

    2015-04-01

    We report a fully automated DNA purification platform with a micropored membrane in the channel utilizing centrifugal microfluidics on a lab-on-a-disc (LOD). The microfluidic flow in the LOD, into which the reagents are injected for DNA purification, is controlled by a single motor and laser burst valve. The sample and reagents pass successively through the micropored membrane in the channel when each laser burst valve is opened. The Coriolis effect is used by rotating the LOD bi-directionally to increase the purity of the DNA, thereby preventing the mixing of the waste and elution solutions. The total process from the lysed sample injection into the LOD to obtaining the purified DNA was finished within 7 min with only one manual step. The experimental result for Salmonella shows that the proposed microfluidic platform is comparable to the existing devices in terms of the purity and yield of DNA.

  2. AutoLock: a semiautomated system for radiotherapy treatment plan quality control

    PubMed Central

    Lowe, Matthew; Hardy, Mark J.; Boylan, Christopher J.; Whitehurst, Philip; Rowbottom, Carl G.

    2015-01-01

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock. PACS number: 87.55.Qr PMID:26103498
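    A minimal sketch of the integration pattern described, with invented check names and plan fields: finalization succeeds only when every automated check passes and every checklist item has been acknowledged by the planner.

    ```python
    # Automated checks plus an acknowledged checklist gate plan finalization.
    from dataclasses import dataclass, field

    @dataclass
    class Plan:
        dose_per_fraction: float
        fractions: int
        checklist_acks: dict = field(default_factory=dict)

    def automated_checks(plan):
        # computationally checkable aspects of the plan (examples invented)
        yield "positive dose", plan.dose_per_fraction > 0
        yield "fraction count sane", 1 <= plan.fractions <= 40

    CHECKLIST = ["target volumes reviewed", "organ-at-risk doses reviewed"]

    def can_finalize(plan):
        checks_ok = all(ok for _, ok in automated_checks(plan))
        acks_ok = all(plan.checklist_acks.get(item) for item in CHECKLIST)
        return checks_ok and acks_ok

    plan = Plan(2.0, 25, {item: True for item in CHECKLIST})
    print("finalize?", can_finalize(plan))   # True only if both gates pass
    ```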

  4. Automation of Coordinated Planning Between Observatories: The Visual Observation Layout Tool (VOLT)

    NASA Technical Reports Server (NTRS)

    Maks, Lori; Koratkar, Anuradha; Kerbel, Uri; Pell, Vince

    2002-01-01

    Fulfilling the promise of the era of great observatories, NASA now has more than three space-based astronomical telescopes operating in different wavebands. This situation provides astronomers with the unique opportunity of simultaneously observing a target in multiple wavebands with these observatories. Currently scheduling multiple observatories simultaneously, for coordinated observations, is highly inefficient. Coordinated observations require painstaking manual collaboration among the observatory staff at each observatory. Because they are time-consuming and expensive to schedule, observatories often limit the number of coordinated observations that can be conducted. In order to exploit new paradigms for observatory operation, the Advanced Architectures and Automation Branch of NASA's Goddard Space Flight Center has developed a tool called the Visual Observation Layout Tool (VOLT). The main objective of VOLT is to provide a visual tool to automate the planning of coordinated observations by multiple astronomical observatories. Four of NASA's space-based astronomical observatories - the Hubble Space Telescope (HST), Far Ultraviolet Spectroscopic Explorer (FUSE), Rossi X-ray Timing Explorer (RXTE) and Chandra - are enthusiastically pursuing the use of VOLT. This paper will focus on the purpose for developing VOLT, as well as the lessons learned during the infusion of VOLT into the planning and scheduling operations of these observatories.

  5. SeqFIRE: a web application for automated extraction of indel regions and conserved blocks from protein multiple sequence alignments.

    PubMed

    Ajawatanawong, Pravech; Atkinson, Gemma C; Watson-Haigh, Nathan S; Mackenzie, Bryony; Baldauf, Sandra L

    2012-07-01

    Analyses of multiple sequence alignments generally focus on well-defined conserved sequence blocks, while the rest of the alignment is largely ignored or discarded. This is especially true in phylogenomics, where large multigene datasets are produced through automated pipelines. However, some of the most powerful phylogenetic markers have been found in the variable length regions of multiple alignments, particularly insertions/deletions (indels) in protein sequences. We have developed Sequence Feature and Indel Region Extractor (SeqFIRE) to enable the automated identification and extraction of indels from protein sequence alignments. The program can also extract conserved blocks and identify fast evolving sites using a combination of conservation and entropy. All major variables can be adjusted by the user, allowing them to identify the sets of variables most suited to a particular analysis or dataset. Thus, all major tasks in preparing an alignment for further analysis are combined in a single flexible and user-friendly program. The output includes a numbered list of indels, alignments in NEXUS format with indels annotated or removed and indel-only matrices. SeqFIRE is a user-friendly web application, freely available online at www.seqfire.org/.
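    A simple illustration of the core task (not SeqFIRE's actual algorithm): scan an aligned protein block for columns containing gaps and report contiguous gap-containing runs as candidate indel regions. The alignment is a made-up example.

    ```python
    # Find contiguous gap-containing column runs in a protein alignment.
    alignment = [
        "MKV--LLIAG",
        "MKVTALLI-G",
        "MKV--LLIAG",
    ]

    def indel_regions(seqs):
        n_cols = len(seqs[0])
        gappy = [any(s[c] == "-" for s in seqs) for c in range(n_cols)]
        regions, start = [], None
        for c, g in enumerate(gappy):
            if g and start is None:
                start = c
            elif not g and start is not None:
                regions.append((start, c - 1))
                start = None
        if start is not None:
            regions.append((start, n_cols - 1))
        return regions   # 0-based inclusive column ranges

    print(indel_regions(alignment))   # [(3, 4), (8, 8)]
    ```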

  6. Repetition rate multiplication of frequency comb using all-pass fiber resonator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Lijun; Yang, Honglei; Zhang, Hongyuan

    2016-09-15

We propose a stable method for repetition rate multiplication of a 250-MHz Er-fiber frequency comb by a phase-locked all-pass fiber ring resonator, whose phase-locking configuration is simple. The optical path length of the fiber ring resonator is automatically controlled to be accurately an odd multiple of half of the original cavity length using an electronic phase-locking unit with an optical delay line. For a comb with a shorter cavity length, a high-order odd multiple is preferable. Because the power loss depends only on the net attenuation of the fiber ring resonator, the energetic efficiency of the proposed method is high. The input and output optical spectra show that the spectral width of the frequency comb is clearly preserved. In addition, experimental results show less pulse intensity fluctuation and a 35 dB side-mode suppression ratio while providing good long-term and short-term frequency stability. Higher-order repetition rate multiplication to several GHz can be obtained by using several fiber ring resonators in a cascade configuration.

  7. Modeling and deadlock avoidance of automated manufacturing systems with multiple automated guided vehicles.

    PubMed

    Wu, Naiqi; Zhou, MengChu

    2005-12-01

An automated manufacturing system (AMS) contains a number of versatile machines (or workstations), buffers, an automated material handling system (MHS), and is computer-controlled. An effective and flexible alternative for implementing MHS is to use an automated guided vehicle (AGV) system. The deadlock issue in AMS is very important in its operation and has been studied extensively. The deadlock problems were treated separately for parts in production and transportation, and many techniques were developed for each problem. However, such treatment does not take advantage of the flexibility offered by multiple AGVs. In general, it is intractable to obtain a maximally permissive control policy for either problem. Instead, this paper investigates these two problems in an integrated way. First we model an AGV system and part processing processes by resource-oriented Petri nets, respectively. Then the two models are integrated by using macro transitions. Based on the combined model, a novel control policy for deadlock avoidance is proposed. It is shown to be maximally permissive with computational complexity of O(n^2), where n is the number of machines in the AMS, if the complexity of controlling the part transportation by AGVs is not considered. Thus, the complexity of deadlock avoidance for the whole system is bounded by the complexity of controlling the AGV system. An illustrative example shows its application and power.

  8. Effective application of multiple locus variable number of tandem repeats analysis to tracing Staphylococcus aureus in food-processing environment.

    PubMed

    Rešková, Z; Koreňová, J; Kuchta, T

    2014-04-01

    A total of 256 isolates of Staphylococcus aureus were isolated from 98 samples (34 swabs and 64 food samples) obtained from small or medium meat- and cheese-processing plants in Slovakia. The strains were genotypically characterized by multiple locus variable number of tandem repeats analysis (MLVA), involving multiplex polymerase chain reaction (PCR) with subsequent separation of the amplified DNA fragments by an automated flow-through gel electrophoresis. With the panel of isolates, MLVA produced 31 profile types, which was a sufficient discrimination to facilitate the description of spatial and temporal aspects of contamination. Further data on MLVA discrimination were obtained by typing a subpanel of strains by multiple locus sequence typing (MLST). MLVA coupled to automated electrophoresis proved to be an effective, comparatively fast and inexpensive method for tracing S. aureus contamination of food-processing factories. Subspecies genotyping of microbial contaminants in food-processing factories may facilitate identification of spatial and temporal aspects of the contamination. This may help to properly manage the process hygiene. With S. aureus, multiple locus variable number of tandem repeats analysis (MLVA) proved to be an effective method for the purpose, being sufficiently discriminative, yet comparatively fast and inexpensive. The application of automated flow-through gel electrophoresis to separation of DNA fragments produced by multiplex PCR helped to improve the accuracy and speed of the method. © 2013 The Society for Applied Microbiology.

  9. A satellite-based personal communication system for the 21st century

    NASA Technical Reports Server (NTRS)

    Sue, Miles K.; Dessouky, Khaled; Levitt, Barry; Rafferty, William

    1990-01-01

    Interest in personal communications (PCOMM) has been stimulated by recent developments in satellite and terrestrial mobile communications. A personal access satellite system (PASS) concept was developed at the Jet Propulsion Laboratory (JPL) which has many attractive user features, including service diversity and a handheld terminal. Significant technical challenges addressed in formulating the PASS space and ground segments are discussed. PASS system concept and basic design features, high risk enabling technologies, an optimized multiple access scheme, alternative antenna coverage concepts, the use of non-geostationary orbits, user terminal radiation constraints, and user terminal frequency reference are covered.

  10. A sorting system with automated gates permits individual operant experiments with mice from a social home cage.

    PubMed

    Winter, York; Schaefers, Andrea T U

    2011-03-30

Behavioral experiments based on operant procedures can be time-consuming for small amounts of data. While individual testing and handling of animals can influence attention, emotion, and behavior, and interfere with experimental outcome, many operant protocols require individual testing. We developed an RFID-technology- and transponder-based sorting system that allows removing the human factor for longer-term experiments. Identity detectors and automated gates route mice individually from their social home cage to an adjacent operant compartment with 24/7 operation. CD1 mice quickly learnt to pass individually through the sorting system. At no time did more than a single mouse enter the operant compartment. After 3 days of adjusting to the sorting system, groups of 4 mice completed about 50 experimental trials per day in the operant compartment without experimenter intervention. The automated sorting system eliminates handling, isolation, and disturbance of the animals, eliminates experimenter-induced variability, saves experimenter time, and is economical. It makes possible a new approach for high-throughput experimentation, and is a viable tool for increasing quality and efficiency of many behavioral and neurobiological investigations. It can connect a social home cage, through individual sorting automation, to diverse setups including classical operant chambers, mazes, or arenas with video-based behavior classification. Such highly automated systems will permit efficient high-throughput screening even for transgenic animals with only subtle neurological or psychiatric symptoms where elaborate or longer-term protocols are required for behavioral diagnosis. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Automated UAV-based mapping for airborne reconnaissance and video exploitation

    NASA Astrophysics Data System (ADS)

    Se, Stephen; Firoozfam, Pezhman; Goldstein, Norman; Wu, Linda; Dutkiewicz, Melanie; Pace, Paul; Naud, J. L. Pierre

    2009-05-01

Airborne surveillance and reconnaissance are essential for successful military missions. Such capabilities are critical for force protection, situational awareness, mission planning, damage assessment and others. UAVs gather huge amounts of video data, but it is extremely labour-intensive for operators to analyse hours and hours of received data. At MDA, we have developed a suite of tools towards automated video exploitation including calibration, visualization, change detection and 3D reconstruction. The on-going work is to improve the robustness of these tools and automate the process as much as possible. Our calibration tool extracts and matches tie-points in the video frames incrementally to recover the camera calibration and poses, which are then refined by bundle adjustment. Our visualization tool stabilizes the video, expands its field-of-view and creates a geo-referenced mosaic from the video frames. It is important to identify anomalies in a scene, which may include detecting any improvised explosive devices (IED). However, it is tedious and difficult to compare video clips to look for differences manually. Our change detection tool allows the user to load two video clips taken from two passes at different times and flags any changes between them. 3D models are useful for situational awareness, as it is easier to understand the scene by visualizing it in 3D. Our 3D reconstruction tool creates calibrated photo-realistic 3D models from video clips taken from different viewpoints, using both semi-automated and automated approaches. The resulting 3D models also allow distance measurements and line-of-sight analysis.

  12. High peak-power kilohertz laser system employing single-stage multi-pass amplification

    DOEpatents

    Shan, Bing; Wang, Chun; Chang, Zenghu

    2006-05-23

    The present invention describes a technique for achieving high peak power output in a laser employing single-stage, multi-pass amplification. High gain is achieved by employing a very small "seed" beam diameter in gain medium, and maintaining the small beam diameter for multiple high-gain pre-amplification passes through a pumped gain medium, then leading the beam out of the amplifier cavity, changing the beam diameter and sending it back to the amplifier cavity for additional, high-power amplification passes through the gain medium. In these power amplification passes, the beam diameter in gain medium is increased and carefully matched to the pump laser's beam diameter for high efficiency extraction of energy from the pumped gain medium. A method of "grooming" the beam by means of a far-field spatial filter in the process of changing the beam size within the single-stage amplifier is also described.

  13. Using Vision and Speech Features for Automated Prediction of Performance Metrics in Multimodal Dialogs. Research Report. ETS RR-17-20

    ERIC Educational Resources Information Center

    Ramanarayanan, Vikram; Lange, Patrick; Evanini, Keelan; Molloy, Hillary; Tsuprun, Eugene; Qian, Yao; Suendermann-Oeft, David

    2017-01-01

    Predicting and analyzing multimodal dialog user experience (UX) metrics, such as overall call experience, caller engagement, and latency, among other metrics, in an ongoing manner is important for evaluating such systems. We investigate automated prediction of multiple such metrics collected from crowdsourced interactions with an open-source,…

  14. Investigating the Human Computer Interaction Problems with Automated Teller Machine Navigation Menus

    ERIC Educational Resources Information Center

    Curran, Kevin; King, David

    2008-01-01

    Purpose: The automated teller machine (ATM) has become an integral part of our society. However, using the ATM can often be a frustrating experience as people frequently reinsert cards to conduct multiple transactions. This has led to the research question of whether ATM menus are designed in an optimal manner. This paper aims to address the…

  15. Automating quantum dot barcode assays using microfluidics and magnetism for the development of a point-of-care device.

    PubMed

    Gao, Yali; Lam, Albert W Y; Chan, Warren C W

    2013-04-24

    The impact of detecting multiple infectious diseases simultaneously at point-of-care with good sensitivity, specificity, and reproducibility would be enormous for containing the spread of diseases in both resource-limited and rich countries. Many barcoding technologies have been introduced for addressing this need as barcodes can be applied to detecting thousands of genetic and protein biomarkers simultaneously. However, the assay process is not automated and is tedious and requires skilled technicians. Barcoding technology is currently limited to use in resource-rich settings. Here we used magnetism and microfluidics technology to automate the multiple steps in a quantum dot barcode assay. The quantum dot-barcoded microbeads are sequentially (a) introduced into the chip, (b) magnetically moved to a stream containing target molecules, (c) moved back to the original stream containing secondary probes, (d) washed, and (e) finally aligned for detection. The assay requires 20 min, has a limit of detection of 1.2 nM, and can detect genetic targets for HIV, hepatitis B, and syphilis. This study provides a simple strategy to automate the entire barcode assay process and moves barcoding technologies one step closer to point-of-care applications.

  16. Passing messages between biological networks to refine predicted interactions.

    PubMed

    Glass, Kimberly; Huttenhower, Curtis; Quackenbush, John; Yuan, Guo-Cheng

    2013-01-01

    Regulatory network reconstruction is a fundamental problem in computational biology. There are significant limitations to such reconstruction using individual datasets, and increasingly people attempt to construct networks using multiple, independent datasets obtained from complementary sources, but methods for this integration are lacking. We developed PANDA (Passing Attributes between Networks for Data Assimilation), a message-passing model using multiple sources of information to predict regulatory relationships, and used it to integrate protein-protein interaction, gene expression, and sequence motif data to reconstruct genome-wide, condition-specific regulatory networks in yeast as a model. The resulting networks were not only more accurate than those produced using individual data sets and other existing methods, but they also captured information regarding specific biological mechanisms and pathways that were missed using other methodologies. PANDA is scalable to higher eukaryotes, applicable to specific tissue or cell type data and conceptually generalizable to include a variety of regulatory, interaction, expression, and other genome-scale data. An implementation of the PANDA algorithm is available at www.sourceforge.net/projects/panda-net.
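    A toy update loop in the spirit of passing messages between networks, not PANDA's published equations: a motif-derived prior network is repeatedly nudged toward agreement with co-expression evidence until it stabilizes. The matrices, sizes, and damping factor are all invented.

    ```python
    # Toy network refinement by iterative agreement with co-expression evidence.
    import numpy as np

    rng = np.random.default_rng(3)
    n_tfs, n_genes = 4, 10
    prior = (rng.random((n_tfs, n_genes)) > 0.7).astype(float)   # motif prior
    expr = rng.random((n_genes, 30))                             # expression data
    coexpr = np.corrcoef(expr)                                   # gene-gene evidence

    w = prior.copy()
    for step in range(20):
        # a TF->gene edge gains support when the TF's current targets
        # are co-expressed with that gene
        support = w @ coexpr / n_genes
        w = 0.8 * w + 0.2 * support        # damped update (alpha assumed)
    print(np.round(w[:2, :5], 3))
    ```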

  17. PEGASUS 5: An Automated Pre-Processor for Overset-Grid CFD

    NASA Technical Reports Server (NTRS)

    Suhs, Norman E.; Rogers, Stuart E.; Dietz, William E.; Kwak, Dochan (Technical Monitor)

    2002-01-01

An all-new, automated version of the PEGASUS software has been developed and tested. PEGASUS provides the hole-cutting and connectivity information between overlapping grids, and is used as the final part of the grid generation process for overset-grid computational fluid dynamics approaches. The new PEGASUS code (Version 5) has many new features: automated hole cutting; a projection scheme for fixing gaps in overset surfaces; more efficient interpolation search methods using an alternating digital tree; hole-size optimization based on adding additional layers of fringe points; and an automatic restart capability. The new code has also been parallelized using the Message Passing Interface standard. Parallelization speeds up execution by an order of magnitude, and by up to a factor of 30 for very large problems. The results of three example cases are presented: a three-element high-lift airfoil, a generic business jet configuration, and a complete Boeing 777-200 aircraft in a high-lift landing configuration. Comparisons of the computed flow fields for the airfoil and 777 test cases between the old and new versions of the PEGASUS codes show excellent agreement with each other and with experimental results.
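    The production PEGASUS code is not written in Python; the mpi4py sketch below only illustrates the message-passing pattern implied above, distributing independent grid blocks across ranks and gathering the results. All names and the per-block work are invented.

    ```python
    # Scatter work units across MPI ranks, process locally, gather at rank 0.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    if rank == 0:
        blocks = list(range(16))                      # stand-ins for grid blocks
        chunks = [blocks[i::size] for i in range(size)]
    else:
        chunks = None
    my_blocks = comm.scatter(chunks, root=0)

    def process(block):
        # stand-in for per-block work such as hole cutting or donor search
        return block * block

    results = comm.gather([process(b) for b in my_blocks], root=0)
    if rank == 0:
        print(sorted(x for chunk in results for x in chunk))
    ```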

  18. Laser Spiderweb Sensor Used with Portable Handheld Devices

    NASA Technical Reports Server (NTRS)

    Scott, David C. (Inventor); Ksendzov, Alexander (Inventor); George, Warren P. (Inventor); Smith, James A. (Inventor); Steinkraus, Joel M. (Inventor); Hofmann, Douglas C. (Inventor); Aljabri, Abdullah S. (Inventor); Bendig, Rudi M. (Inventor)

    2017-01-01

    A portable spectrometer, including a smart phone case storing a portable spectrometer, wherein the portable spectrometer includes a cavity; a source for emitting electromagnetic radiation that is directed on a sample in the cavity, wherein the electromagnetic radiation is reflected within the cavity to form multiple passes of the electromagnetic radiation through the sample; a detector for detecting the electromagnetic radiation after the electromagnetic radiation has made the multiple passes through the sample in the cavity, the detector outputting a signal in response to the detecting; and a device for communicating the signal to a smart phone, wherein the smart phone executes an application that performs a spectral analysis of the signal.
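    In Beer-Lambert terms, the benefit of the multiple passes claimed above is that absorbance grows with path length, so reflecting the beam N times through the sample multiplies the effective path. The numbers below are arbitrary illustration values, not from the patent.

    ```python
    # Absorbance and transmittance vs. number of passes through the sample.
    epsilon = 120.0   # molar absorptivity, L/(mol*cm) (assumed)
    conc = 1e-4       # sample concentration, mol/L (assumed)
    cell_cm = 0.5     # single-pass path length through the cavity (assumed)

    for n_passes in (1, 4, 16):
        absorbance = epsilon * conc * cell_cm * n_passes   # A = eps * c * (N*L)
        transmitted = 10 ** (-absorbance)
        print(f"{n_passes:2d} passes: A = {absorbance:.4f}, T = {transmitted:.4f}")
    ```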

  19. Mitigation of tropospheric InSAR phase artifacts through differential multisquint processing

    NASA Technical Reports Server (NTRS)

    Chen, Curtis W.

    2004-01-01

    We propose a technique for mitigating tropospheric phase errors in repeat-pass interferometric synthetic aperture radar (InSAR). The mitigation technique is based upon the acquisition of multisquint InSAR data. On each satellite pass over a target area, the radar instrument will acquire images from multiple squint (azimuth) angles, from which multiple interferograms can be formed. The diversity of viewing angles associated with the multisquint acquisition can be used to solve for two components of the 3-D surface displacement vector as well as for the differential tropospheric phase. We describe a model for the performance of the multisquint technique, and we present an assessment of the performance expected.
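    A hedged sketch of the inversion idea: each squint angle measures a different projection of the surface displacement plus a common tropospheric phase term, so with three or more squints a least-squares solve recovers all three unknowns. The projection geometry below is invented for illustration only.

    ```python
    # Least-squares separation of two displacement components and troposphere.
    import numpy as np

    squints = np.deg2rad(np.array([-10.0, 0.0, 10.0]))
    # columns: sensitivity to along-track disp., radial disp., troposphere
    A = np.column_stack([np.sin(squints), np.cos(squints), np.ones_like(squints)])

    truth = np.array([0.02, -0.05, 0.30])   # the quantities to be recovered
    phase = A @ truth + np.random.default_rng(4).normal(0, 1e-3, size=3)

    est, *_ = np.linalg.lstsq(A, phase, rcond=None)
    print("estimated [d_along, d_radial, tropo]:", np.round(est, 4))
    ```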

  20. Harmonic generation with multiple wiggler schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonifacio, R.; De Salvo, L.; Pierini, P.

    1995-02-01

    In this paper the authors give a simple theoretical description of the basic physics of the single pass high gain free electron laser (FEL), describing in some detail the FEL bunching properties and the harmonic generation technique with a multiple-wiggler scheme or a high gain optical klystron configuration.

  1. Dust Tolerant Commodity Transfer Interface Mechanisms for Planetary Surfaces

    NASA Technical Reports Server (NTRS)

    Townsend, Ivan I.; Mueller, Robert P.; Tamasy, Gabor J.

    2014-01-01

Regolith is present on most planetary surfaces such as Earth's moon, Mars, and asteroids. If human crews and robotic machinery are to operate on these regolith-covered surfaces, they must face the consequences of interacting with regolith fines, which consist of particles below 100 microns in diameter down to the submicron scale. Such fine dust will intrude into mechanisms and interfaces, causing a variety of problems such as contamination of clean fluid lines, jamming of mechanisms, and damage to connector seals and couplings. Since multiple elements must be assembled in space for system-level functionality, interfaces will inevitably be necessary for structural connections and to pass commodities such as cryogenic liquid propellants, purge and buffer gases, water, breathing air, pressurizing gases, heat exchange fluids, power and data. When fine regolith dust is present in the environment it can be lofted into interfaces, where it can compromise the utility of the interface by preventing the connections from being successfully mated, or by inducing fluid leaks or degradation of power and data transmission. A dust tolerant, hand held "quick-disconnect" cryogenic fluids connector housing has been developed at NASA KSC which can be used by astronaut crews to connect flex lines that will transfer propellants and other useful fluids to the end user. In addition, a dust tolerant, automated, cryogenic fluid, multiple connector, power and data interface mechanism prototype has been developed, fabricated and demonstrated by NASA at Kennedy Space Center (KSC). The design and operation of these prototypes are explained and discussed.

  2. An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery

    PubMed Central

    Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing

    2010-01-01

    The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897

  3. The cobas® 6800/8800 System: a new era of automation in molecular diagnostics.

    PubMed

    Cobb, Bryan; Simon, Christian O; Stramer, Susan L; Body, Barbara; Mitchell, P Shawn; Reisch, Natasa; Stevens, Wendy; Carmona, Sergio; Katz, Louis; Will, Stephen; Liesenfeld, Oliver

    2017-02-01

    Molecular diagnostics is a key component of laboratory medicine. Here, the authors review key triggers of ever-increasing automation in nucleic acid amplification testing (NAAT) with a focus on specific automated Polymerase Chain Reaction (PCR) testing and platforms such as the recently launched cobas® 6800 and cobas® 8800 Systems. The benefits of such automation for different stakeholders including patients, clinicians, laboratory personnel, hospital administrators, payers, and manufacturers are described. Areas Covered: The authors describe how molecular diagnostics has achieved total laboratory automation over time, rivaling clinical chemistry to significantly improve testing efficiency. Finally, the authors discuss how advances in automation decrease the development time for new tests enabling clinicians to more readily provide test results. Expert Commentary: The advancements described enable complete diagnostic solutions whereby specific test results can be combined with relevant patient data sets to allow healthcare providers to deliver comprehensive clinical recommendations in multiple fields ranging from infectious disease to outbreak management and blood safety solutions.

  4. Automated Pathogenesis-Based Diagnosis of Lumbar Neural Foraminal Stenosis via Deep Multiscale Multitask Learning.

    PubMed

    Han, Zhongyi; Wei, Benzheng; Leung, Stephanie; Nachum, Ilanit Ben; Laidley, David; Li, Shuo

    2018-02-15

Pathogenesis-based diagnosis is a key step to prevent and control lumbar neural foraminal stenosis (LNFS). It conducts both early diagnosis and comprehensive assessment by drawing crucial pathological links between pathogenic factors and LNFS. Automated pathogenesis-based diagnosis would simultaneously localize and grade multiple spinal organs (neural foramina, vertebrae, intervertebral discs) to diagnose LNFS and discover pathogenic factors. The automated way facilitates planning optimal therapeutic schedules and relieving clinicians from laborious workloads. However, no successful work has been achieved yet because of three extreme challenges: 1) multiple targets: each lumbar spine has at least 17 target organs; 2) multiple scales: each type of target organ has structural complexity and various scales across subjects; and 3) multiple tasks: simultaneous localization and diagnosis of all lumbar organs is far more difficult than the individual tasks. To address these huge challenges, we propose a deep multiscale multitask learning network (DMML-Net) integrating a multiscale multi-output learning and a multitask regression learning into a fully convolutional network. 1) DMML-Net merges semantic representations to reinforce the salience of numerous target organs. 2) DMML-Net extends multiscale convolutional layers as multiple output layers to boost the scale-invariance for various organs. 3) DMML-Net joins a multitask regression module and a multitask loss module to prompt the mutual benefit between tasks. Extensive experimental results demonstrate that DMML-Net achieves high performance (0.845 mean average precision) on T1/T2-weighted MRI scans from 200 subjects. This makes our method an efficient tool for clinical LNFS diagnosis.

  5. Improving Grid Resilience through Informed Decision-making (IGRID)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnham, Laurie; Stamber, Kevin L.; Jeffers, Robert Fredric

The transformation of the distribution grid from a centralized to decentralized architecture, with bi-directional power and data flows, is made possible by a surge in network intelligence and grid automation. While changes are largely beneficial, the interface between grid operator and automated technologies is not well understood, nor are the benefits and risks of automation. Quantifying and understanding the latter is an important facet of grid resilience that needs to be fully investigated. The work described in this document represents the first empirical study aimed at identifying and mitigating the vulnerabilities posed by automation for a grid that for the foreseeable future will remain a human-in-the-loop critical infrastructure. Our scenario-based methodology enabled us to conduct a series of experimental studies to identify causal relationships between grid-operator performance and automated technologies and to collect measurements of human performance as a function of automation. Our findings, though preliminary, suggest there are predictive patterns in the interplay between human operators and automation, patterns that can inform the rollout of distribution automation and the hiring and training of operators, and contribute in multiple and significant ways to the field of grid resilience.

  6. Rotor Noise due to Blade-Turbulence Interaction.

    NASA Astrophysics Data System (ADS)

    Ishimaru, Kiyoto

    The time-averaged intensity density function of the acoustic radiation from rotating blades is derived by replacing blades with rotating dipoles. This derivation is done under the following turbulent inflow conditions: turbulent ingestion with no inlet strut wakes, inflow turbulence elongation and contraction with no inlet strut wakes, and inlet strut wakes. Dimensional analysis reveals two non-dimensional parameters which play important roles in generating the blade-passing frequency tone and its multiples. The elongation and contraction of inflow turbulence has a strong effect on the generation of the blade-passing frequency tone and its multiples. Increasing the number of rotor blades widens the peak at the blade-passing frequency and its multiples. Increasing the rotational speed widens the peak under the condition that the non-dimensional parameter involving the rotational speed is fixed. The number of struts and blades should be chosen so that (the least common multiple of them) × (rotational speed) is in the cutoff range of Sears' function, in order to minimize the effect of the mean flow deficit on the time-averaged intensity density function. The acoustic intensity density function is not necessarily stationary even if the inflow turbulence is homogeneous and isotropic. The time variation of the propagation path due to the rotation should be considered in the computation of the intensity density function; for instance, in the present rotor specification, the rotor radius is about 0.3 m and the rotational speed Mach number is about 0.2.
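
    The strut/blade selection rule in this abstract is a simple arithmetic criterion. A minimal sketch, assuming Python 3.9+ for math.lcm; the rotational speed and candidate counts are invented for illustration.

    ```python
    from math import lcm

    def interaction_frequency(n_struts: int, n_blades: int, rev_per_sec: float) -> float:
        """(Least common multiple of strut and blade counts) * rotational speed,
        the quantity that should fall in the cutoff range of Sears' function."""
        return lcm(n_struts, n_blades) * rev_per_sec

    # Compare candidate strut/blade counts at an assumed 50 rev/s
    for struts, blades in [(5, 18), (6, 18), (7, 18)]:
        print(struts, blades, interaction_frequency(struts, blades, 50.0))
    # 5/18 -> 4500 Hz, 6/18 -> 900 Hz, 7/18 -> 6300 Hz: coprime counts push the
    # interaction frequency far higher at the same rotational speed
    ```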

  7. Management of radioactive waste gases from PET radiopharmaceutical synthesis using cost effective capture systems integrated with a cyclotron safety system.

    PubMed

    Stimson, D H R; Pringle, A J; Maillet, D; King, A R; Nevin, S T; Venkatachalam, T K; Reutens, D C; Bhalla, R

    2016-09-01

    The emphasis on the reduction of gaseous radioactive effluent associated with PET radiochemistry laboratories has increased. Various radioactive gas capture strategies have been employed historically including expensive automated compression systems. We have implemented a new cost-effective strategy employing gas capture bags with electronic feedback that are integrated with the cyclotron safety system. Our strategy is suitable for multiple automated 18F radiosynthesis modules and individual automated 11C radiosynthesis modules. We describe novel gas capture systems that minimize the risk of human error and are routinely used in our facility.

  8. Automated measurement of zebrafish larval movement

    PubMed Central

    Cario, Clinton L; Farrell, Thomas C; Milanese, Chiara; Burton, Edward A

    2011-01-01

    The zebrafish is a powerful vertebrate model that is readily amenable to genetic, pharmacological and environmental manipulations to elucidate the molecular and cellular basis of movement and behaviour. We report software enabling automated analysis of zebrafish movement from video recordings captured with cameras ranging from a basic camcorder to more specialized equipment. The software, which is provided as open-source MATLAB functions, can be freely modified and distributed, and is compatible with multiwell plates under a wide range of experimental conditions. Automated measurement of zebrafish movement using this technique will be useful for multiple applications in neuroscience, pharmacology and neuropsychiatry. PMID:21646414

  9. Automated measurement of zebrafish larval movement.

    PubMed

    Cario, Clinton L; Farrell, Thomas C; Milanese, Chiara; Burton, Edward A

    2011-08-01

    The zebrafish is a powerful vertebrate model that is readily amenable to genetic, pharmacological and environmental manipulations to elucidate the molecular and cellular basis of movement and behaviour. We report software enabling automated analysis of zebrafish movement from video recordings captured with cameras ranging from a basic camcorder to more specialized equipment. The software, which is provided as open-source MATLAB functions, can be freely modified and distributed, and is compatible with multiwell plates under a wide range of experimental conditions. Automated measurement of zebrafish movement using this technique will be useful for multiple applications in neuroscience, pharmacology and neuropsychiatry.

  10. OR automation systems.

    PubMed

    2002-12-01

    An operating room (OR) automation system is a combination of hardware and software designed to address efficiency issues in the OR by controlling multiple devices via a common interface. Systems range from the relatively basic--allowing control of a few devices within a single OR--to advanced designs that are capable of not only controlling a wide range of devices within the OR but also exchanging information with remote locations.

  11. Software for Automated Image-to-Image Co-registration

    NASA Technical Reports Server (NTRS)

    Benkelman, Cody A.; Hughes, Heidi

    2007-01-01

    The project objectives are: a) develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) provide automated testing for quantitative analysis; and d) develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.
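
    As a rough illustration of objective d), subpixel translational co-registration can be estimated by upsampled phase correlation; the sketch below uses scikit-image as an assumed stand-in, not the SDK described in this record.

    ```python
    import numpy as np
    from scipy.ndimage import shift as nd_shift
    from skimage.registration import phase_cross_correlation

    # Synthetic pair: a reference image and a copy offset by a subpixel shift
    rng = np.random.default_rng(0)
    reference = rng.random((256, 256))
    moving = nd_shift(reference, shift=(3.25, -1.75))

    # upsample_factor=20 resolves the offset to 1/20 of a pixel
    offset, error, _ = phase_cross_correlation(reference, moving, upsample_factor=20)
    print("estimated (row, col) shift:", offset)   # approximately (-3.25, 1.75)
    aligned = nd_shift(moving, shift=offset)       # apply the correction
    ```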

  12. Lab-on-a-Chip Proteomic Assays for Psychiatric Disorders.

    PubMed

    Peter, Harald; Wienke, Julia; Guest, Paul C; Bistolas, Nikitas; Bier, Frank F

    2017-01-01

    Lab-on-a-chip assays allow rapid identification of multiple parameters on an automated user-friendly platform. Here we describe a fully automated multiplex immunoassay and readout in less than 15 min using the Fraunhofer in vitro diagnostics (ivD) platform to enable inexpensive point-of-care profiling of sera or a single drop of blood from patients with various diseases such as psychiatric disorders.

  13. The reliability of the pass/fail decision for assessments comprised of multiple components.

    PubMed

    Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana

    2015-01-01

    The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high quality assessments, just as the measurement reliability of the point values. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When "conjunctively" combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg's Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt to pass each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. Frequently, when complex logical connections exist between the individual pass/fail decisions in the case of low failure rates, only a very low reliability for the overall decision to grant graded course credit can be achieved, even if high reliabilities exist for the various components. For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts is relatively low with κ=0.49 or κ=0.47, despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. The method put forth by Douglas and Mislevy allows the analysis of the decision accuracy and consistency for complex combinations of scores from different components. Even in the case of highly reliable components, it is not necessarily so that a reliable pass/fail decision has been reached - for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees that do not fulfill the minimum requirements.
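
    To see why retakes combined with a conjunctive rule can let deficient candidates through, the following simulation sketch counts how often examinees below a minimum-ability threshold still pass three components when each may be attempted up to three times. It is a simplified illustration with invented parameters, not the Douglas and Mislevy procedure applied in the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    ability = rng.normal(0.0, 1.0, n)       # latent ability
    deficient = ability < -1.0              # assumed "below minimum requirements"
    cutoff = -0.5                           # assumed component passing score

    def passes_with_retakes(ability, attempts=3, noise_sd=0.6):
        """A component is passed if any of `attempts` noisy scores clears the cutoff."""
        scores = ability[:, None] + rng.normal(0.0, noise_sd, (ability.size, attempts))
        return (scores > cutoff).any(axis=1)

    # Conjunctive rule: all three components must be passed (each with retakes)
    overall_pass = np.logical_and.reduce([passes_with_retakes(ability) for _ in range(3)])
    print(f"deficient candidates who still pass: {overall_pass[deficient].mean():.1%}")
    ```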

  14. The reliability of the pass/fail decision for assessments comprised of multiple components

    PubMed Central

    Möltner, Andreas; Tımbıl, Sevgi; Jünger, Jana

    2015-01-01

    Objective: The decision having the most serious consequences for a student taking an assessment is the one to pass or fail that student. For this reason, the reliability of the pass/fail decision must be determined for high quality assessments, just as the measurement reliability of the point values. Assessments in a particular subject (graded course credit) are often composed of multiple components that must be passed independently of each other. When “conjunctively” combining separate pass/fail decisions, as with other complex decision rules for passing, adequate methods of analysis are necessary for estimating the accuracy and consistency of these classifications. To date, very few papers have addressed this issue; a generally applicable procedure was published by Douglas and Mislevy in 2010. Using the example of an assessment comprised of several parts that must be passed separately, this study analyzes the reliability underlying the decision to pass or fail students and discusses the impact of an improved method for identifying those who do not fulfill the minimum requirements. Method: The accuracy and consistency of the decision to pass or fail an examinee in the subject cluster Internal Medicine/General Medicine/Clinical Chemistry at the University of Heidelberg’s Faculty of Medicine was investigated. This cluster requires students to separately pass three components (two written exams and an OSCE), whereby students may reattempt to pass each component twice. Our analysis was carried out using the method described by Douglas and Mislevy. Results: Frequently, when complex logical connections exist between the individual pass/fail decisions in the case of low failure rates, only a very low reliability for the overall decision to grant graded course credit can be achieved, even if high reliabilities exist for the various components. For the example analyzed here, the classification accuracy and consistency when conjunctively combining the three individual parts is relatively low with κ=0.49 or κ=0.47, despite the good reliability of over 0.75 for each of the three components. The option to repeat each component twice leads to a situation in which only about half of the candidates who do not satisfy the minimum requirements would fail the overall assessment, while the other half is able to continue their studies despite having deficient knowledge and skills. Conclusion: The method put forth by Douglas and Mislevy allows the analysis of the decision accuracy and consistency for complex combinations of scores from different components. Even in the case of highly reliable components, it is not necessarily so that a reliable pass/fail decision has been reached – for instance in the case of low failure rates. Assessments must be administered with the explicit goal of identifying examinees that do not fulfill the minimum requirements. PMID:26483855

  15. NASA Systems Autonomy Demonstration Project - Development of Space Station automation technology

    NASA Technical Reports Server (NTRS)

    Bull, John S.; Brown, Richard; Friedland, Peter; Wong, Carla M.; Bates, William

    1987-01-01

    A 1984 Congressional expansion of the 1958 National Aeronautics and Space Act mandated that NASA conduct programs, as part of the Space Station program, that will yield material benefits to the U.S., particularly in the areas of advanced automation and robotics systems. Demonstration programs are scheduled for automated systems such as thermal control, expert system coordination of Station subsystems, and automation of multiple subsystems. The programs focus the R&D efforts and provide a gateway for transfer of technology to industry. The NASA Office of Aeronautics and Space Technology is responsible for directing, funding and evaluating the Systems Autonomy Demonstration Project, which will include simulated interactions between novice personnel and astronauts and several automated, expert subsystems to explore the effectiveness of the man-machine interface being developed. Features and progress on the TEXSYS prototype thermal control system expert system are outlined.

  16. Fabrication of glass microspheres with conducting surfaces

    DOEpatents

    Elsholz, William E.

    1984-01-01

    A method for making hollow glass microspheres with conducting surfaces by adding a conducting vapor to a region of the glass fabrication furnace. As droplets or particles of glass forming material pass through multiple zones of different temperature in a glass fabrication furnace, and are transformed into hollow glass microspheres, the microspheres pass through a region of conducting vapor, forming a conducting coating on the surface of the microspheres.

  17. Fabrication of glass microspheres with conducting surfaces

    DOEpatents

    Elsholz, W.E.

    1982-09-30

    A method for making hollow glass microspheres with conducting surfaces by adding a conducting vapor to a region of the glass fabrication furnace. As droplets or particles of glass forming material pass through multiple zones of different temperature in a glass fabrication furnace, and are transformed into hollow glass microspheres, the microspheres pass through a region of conducting vapor, forming a conducting coating on the surface of the microspheres.

  18. Sensor-Augmented Insulin Pumps and Hypoglycemia Prevention in Type 1 Diabetes.

    PubMed

    Steineck, Isabelle; Ranjan, Ajenthen; Nørgaard, Kirsten; Schmidt, Signe

    2017-01-01

    Hypoglycemia can lead to seizures, unconsciousness, or death. Insulin pump treatment reduces the frequency of severe hypoglycemia compared with multiple daily injections treatment. The addition of a continuous glucose monitor, so-called sensor-augmented pump (SAP) treatment, has the potential to further limit the duration and severity of hypoglycemia as the system can detect and in some systems act on impending and prevailing low blood glucose levels. In this narrative review we summarize the available knowledge on SAPs with and without automated insulin suspension, in relation to hypoglycemia prevention. We present evidence from randomized trials, observational studies, and meta-analyses including nonpregnant individuals with type 1 diabetes mellitus. We also outline concerns regarding SAPs with and without automated insulin suspension. There is evidence that SAP treatment reduces episodes of moderate and severe hypoglycemia compared with multiple daily injections plus self-monitoring of blood glucose. There is some evidence that SAPs both with and without automated suspension reduces the frequency of severe hypoglycemic events compared with insulin pumps without continuous glucose monitoring.

  19. High-speed autoverifying technology for printed wiring boards

    NASA Astrophysics Data System (ADS)

    Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi

    1996-10-01

    We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms. Verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method, which uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory. It verified 1,500 defective samples and detected all significant defects with a false-alarm rate of only 0.1 percent.

  20. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations

    PubMed Central

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite: a single cellulose sponge samples multiple coupons with multiple passes across each coupon (SM-MPC); and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p< 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event of clean or dirty surfaces. PMID:27736999

  1. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations.

    PubMed

    Hess, Becky M; Amidan, Brett G; Anderson, Kevin K; Hutchison, Janine R

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite: a single cellulose sponge samples multiple coupons with multiple passes across each coupon (SM-MPC); and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p< 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event of clean or dirty surfaces.

  2. The Operational Use of an Automated High Frequency Radio System Incorporating Automatic Link Establishment and Single-Tone Serial Modem Technology for U.S. Navy Ship-Shore Communications

    DTIC Science & Technology

    1993-10-01

    [Fragmentary DTIC abstract; only partial text is recoverable.] The recoverable content describes a link quality analysis (LQA) score measured by automatic link establishment (ALE) together with single-tone serial modem performance: stations link in turn and, propagation permitting, pass traffic; the combined LQA score is displayed to the operator as a number on an arbitrary scale of 0 to 120.

  3. 3D Tracking of Mating Events in Wild Swarms of the Malaria Mosquito Anopheles gambiae

    PubMed Central

    Butail, Sachit; Manoukis, Nicholas; Diallo, Moussa; Yaro, Alpha S.; Dao, Adama; Traoré, Sekou F.; Ribeiro, José M.; Lehmann, Tovi; Paley, Derek A.

    2013-01-01

    We describe an automated tracking system that allows us to reconstruct the 3D kinematics of individual mosquitoes in swarms of Anopheles gambiae. The inputs to the tracking system are video streams recorded from a stereo camera system. The tracker uses a two-pass procedure to automatically localize and track mosquitoes within the swarm. A human-in-the-loop step verifies the estimates and connects broken tracks. The tracker performance is illustrated using footage of mating events filmed in Mali in August 2010. PMID:22254411

  4. Parallel File System I/O Performance Testing On LANL Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiens, Isaac Christian; Green, Jennifer Kathleen

    2016-08-18

    These are slides from a presentation on parallel file system I/O performance testing on LANL clusters. I/O is a known bottleneck for HPC applications. Performance optimization of I/O is often required. This summer project entailed integrating IOR under Pavilion and automating the results analysis. The slides cover the following topics: scope of the work, tools utilized, IOR-Pavilion test workflow, build script, IOR parameters, how parameters are passed to IOR, run_ior functionality, Python IOR-output parser, Splunk data format, Splunk dashboard and features, and future work.

  5. VizieR Online Data Catalog: OGLE-II DIA microlensing events (Wozniak+, 2001)

    NASA Astrophysics Data System (ADS)

    Wozniak, P. R.; Udalski, A.; Szymanski, M.; Kubiak, M.; Pietrzynski, G.; Soszynski, I.; Zebrun, K.

    2002-11-01

    We present a sample of microlensing events discovered in the Difference Image Analysis (DIA) of the OGLE-II images collected during three observing seasons, 1997-1999. 4424 light curves pass our criteria on the presence of a brightening episode on top of a constant baseline. Among those, 512 candidate microlensing events were selected visually. We designed an automated procedure, which unambiguously selects up to 237 best events. Including eight candidate events recovered by other means, a total of 520 light curves are presented in this work. (4 data files).
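
    A toy version of such an automated selection step flags light curves showing a sustained brightening episode above a constant baseline; the thresholds below are invented and do not reproduce the OGLE-II pipeline criteria.

    ```python
    import numpy as np

    def has_brightening_episode(flux, n_sigma=3.0, min_consecutive=5):
        """Flag a light curve whose flux rises n_sigma above its baseline
        for at least min_consecutive consecutive epochs."""
        baseline = np.median(flux)
        sigma = 1.4826 * np.median(np.abs(flux - baseline))  # robust scatter (MAD)
        above = flux > baseline + n_sigma * sigma
        run = 0
        for flag in above:
            run = run + 1 if flag else 0
            if run >= min_consecutive:
                return True
        return False

    t = np.linspace(0.0, 100.0, 500)
    flux = 1.0 + 0.01 * np.random.default_rng(1).normal(size=t.size)
    flux += 0.5 * np.exp(-0.5 * ((t - 50.0) / 3.0) ** 2)  # injected brightening
    print(has_brightening_episode(flux))  # True
    ```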

  6. High-density grids for efficient data collection from multiple crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.

  7. High-density grids for efficient data collection from multiple crystals

    PubMed Central

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; Barnes, Christopher O.; Bonagura, Christopher A.; Brehmer, Winnie; Brunger, Axel T.; Calero, Guillermo; Caradoc-Davies, Tom T.; Chatterjee, Ruchira; Degrado, William F.; Fraser, James S.; Ibrahim, Mohamed; Kern, Jan; Kobilka, Brian K.; Kruse, Andrew C.; Larsson, Karl M.; Lemke, Heinrik T.; Lyubimov, Artem Y.; Manglik, Aashish; McPhillips, Scott E.; Norgren, Erik; Pang, Siew S.; Soltis, S. M.; Song, Jinhu; Thomaston, Jessica; Tsai, Yingssu; Weis, William I.; Woldeyes, Rahel A.; Yachandra, Vittal; Yano, Junko; Zouni, Athina; Cohen, Aina E.

    2016-01-01

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. Crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures. PMID:26894529

  8. High-density grids for efficient data collection from multiple crystals

    DOE PAGES

    Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; ...

    2015-11-03

    Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.

  9. Synchrotron X-ray microbeam diffraction measurements of full elastic long range internal strain and stress tensors in commercial-purity aluminum processed by multiple passes of equal-channel angular pressing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phan, Thien Q.; Levine, Lyle E.; Lee, I-Fang

    Synchrotron X-ray microbeam diffraction was used to measure the full elastic long range internal strain and stress tensors of low dislocation density regions within the submicrometer grain/subgrain structure of equal-channel angular pressed (ECAP) aluminum alloy AA1050 after 1, 2, and 8 passes using route BC. This is the first time that full tensors were measured in plastically deformed metals at this length scale. The maximum (most tensile or least compressive) principal elastic strain directions for the unloaded 1 pass sample for the grain/subgrain interiors align well with the pressing direction, and are more random for the 2 and 8 pass samples. The measurements reported here indicate that the local stresses and strains become increasingly isotropic (homogenized) with increasing ECAP passes using route BC. The average maximum (in magnitude) LRISs are -0.43 σa for 1 pass, -0.44 σa for 2 passes, and 0.14 σa for the 8 pass sample. Furthermore, these LRISs are larger than those reported previously because those earlier measurements were unable to measure the full stress tensor. Significantly, the measured stresses are inconsistent with the two-component composite model.

  10. Synchrotron X-ray microbeam diffraction measurements of full elastic long range internal strain and stress tensors in commercial-purity aluminum processed by multiple passes of equal-channel angular pressing

    DOE PAGES

    Phan, Thien Q.; Levine, Lyle E.; Lee, I-Fang; ...

    2016-04-23

    Synchrotron X-ray microbeam diffraction was used to measure the full elastic long range internal strain and stress tensors of low dislocation density regions within the submicrometer grain/subgrain structure of equal-channel angular pressed (ECAP) aluminum alloy AA1050 after 1, 2, and 8 passes using route BC. This is the first time that full tensors were measured in plastically deformed metals at this length scale. The maximum (most tensile or least compressive) principal elastic strain directions for the unloaded 1 pass sample for the grain/subgrain interiors align well with the pressing direction, and are more random for the 2 and 8 pass samples. The measurements reported here indicate that the local stresses and strains become increasingly isotropic (homogenized) with increasing ECAP passes using route BC. The average maximum (in magnitude) LRISs are -0.43 σa for 1 pass, -0.44 σa for 2 passes, and 0.14 σa for the 8 pass sample. Furthermore, these LRISs are larger than those reported previously because those earlier measurements were unable to measure the full stress tensor. Significantly, the measured stresses are inconsistent with the two-component composite model.

  11. Nutritional quality of major meals consumed away from home in Brazil and its association with the overall diet quality.

    PubMed

    Gorgulho, Bartira Mendes; Fisberg, Regina Mara; Marchioni, Dirce Maria Lobo

    2013-08-01

    The objective of the study is to evaluate the nutritional quality of meals consumed away from home and its association with overall diet quality. Data were obtained from 834 participants of a Health Survey in São Paulo, Brazil. Food intake was measured by a 24-hour dietary recall applied telephonically using the Automated Multiple-Pass Method. Overall dietary quality was assessed by the Brazilian Healthy Eating Index Revised (B-HEIR) and the Meal Quality Index (MQI) was used to evaluate the dietary quality of the main meals. The association between the B-HEIR and the MQI was assessed by linear regression analysis. The consumption of at least one of the three main meals away from home was reported by 32% of respondents (70 adolescents, 156 adults and 40 elderly). The average MQI score of lunch consumed away from home was lower than that of lunch consumed at home, with higher amounts of total and saturated fats. The average score of the B-HEIR was 58 points and was associated with the MQI score, energy, meal consumption location and gender. Lunch consumed away from home presented the worst quality, being higher in total and saturated fat. However, the meals consumed at home also need improvement.

  12. Integrating LPR with CCTV systems: problems and solutions

    NASA Astrophysics Data System (ADS)

    Bissessar, David; Gorodnichy, Dmitry O.

    2011-06-01

    A new generation of high-resolution surveillance cameras makes it possible to apply video processing and recognition techniques to live video feeds for the purpose of automatically detecting and identifying objects and events of interest. This paper addresses a particular application of detecting and identifying vehicles passing through a checkpoint. This application is of interest to border services agencies and is also related to many other applications. With many commercial automated License Plate Recognition (LPR) systems available on the market, some of which are available as a plug-in for surveillance systems, this application still poses many unresolved technological challenges, the two main ones being: i) multiple and often noisy license plate readings generated for the same vehicle, and ii) failure to detect a vehicle or license plate altogether when the license plate is occluded or not visible. This paper presents a solution to both of these problems. A data fusion technique based on the Levenshtein distance is used to resolve the first problem. An integration of a commercial LPR system with the in-house built Video Analytic Platform is used to solve the latter. The developed solution has been tested in field environments and has been shown to yield a substantial improvement over standard off-the-shelf LPR systems.
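
    A minimal sketch of the first idea: fuse multiple noisy plate reads by choosing the reading with the smallest total Levenshtein distance to all the others. The edit-distance function is the standard dynamic-programming version; the example reads are invented.

    ```python
    def levenshtein(a: str, b: str) -> int:
        """Classic dynamic-programming edit distance."""
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            curr = [i]
            for j, cb in enumerate(b, 1):
                curr.append(min(prev[j] + 1,                # deletion
                                curr[j - 1] + 1,            # insertion
                                prev[j - 1] + (ca != cb)))  # substitution
            prev = curr
        return prev[-1]

    def fuse_readings(reads):
        """Pick the reading closest (in total edit distance) to all the others."""
        return min(reads, key=lambda r: sum(levenshtein(r, other) for other in reads))

    print(fuse_readings(["ABC123", "A8C123", "ABC1Z3", "ABC123"]))  # ABC123
    ```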

  13. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation

    PubMed Central

    Pujar, Shashikant; O’Leary, Nuala A; Farrell, Catherine M; Mudge, Jonathan M; Wallin, Craig; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bult, Carol J; Frankish, Adam; Pruitt, Kim D

    2018-01-01

    The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. PMID:29126148

  14. Mission Planning and Scheduling System for NASA's Lunar Reconnaissance Mission

    NASA Technical Reports Server (NTRS)

    Garcia, Gonzalo; Barnoy, Assaf; Beech, Theresa; Saylor, Rick; Cosgrove, Jennifer Sager; Ritter, Sheila

    2009-01-01

    In the framework of NASA's return to the Moon efforts, the Lunar Reconnaissance Orbiter (LRO) is the first step. It is an unmanned mission to create a comprehensive atlas of the Moon's features and resources necessary to design and build a lunar outpost. LRO is scheduled for launch in April, 2009. LRO carries a payload comprised of six instruments and one technology demonstration. In addition to its scientific mission LRO will use new technologies, systems and flight operations concepts to reduce risk and increase productivity of future missions. As part of the effort to achieve robust and efficient operations, the LRO Mission Operations Team (MOT) will use its Mission Planning System (MPS) to manage the operational activities of the mission during the Lunar Orbit Insertion (LOI) and operational phases of the mission. The MPS, based on GMV's flexplan tool and developed for NASA with Honeywell Technology Solutions (prime contractor), will receive activity and slew maneuver requests from multiple science operations centers (SOC), as well as from the spacecraft engineers. flexplan will apply scheduling rules to all the requests received and will generate conflict free command schedules in the form of daily stored command loads for the orbiter and a set of daily pass scripts that help automate nominal real-time operations.

  15. An evaluation of demographic factors affecting performance in a paediatric membership multiple-choice examination.

    PubMed

    Menzies, Lara; Minson, Susan; Brightwell, Alexandra; Davies-Muir, Anna; Long, Andrew; Fertleman, Caroline

    2015-02-01

    To determine if demographic factors are associated with outcome in a multiple-choice, electronically marked paediatric postgraduate examination. Retrospective analysis of pass rates of UK trainees sitting Membership of the Royal College of Paediatrics and Child Health (MRCPCH) part 1B from 2007 to 2011. Data collected by the RCPCH from examination candidates were analysed to assess the effects of gender, age, and country and university of medical qualification on examination outcome. At first attempt at MRCPCH part 1B, the overall pass rate from 2007 to 2011 was 843/2056 (41.0%). In univariate analysis, passing the examination was associated with being a UK graduate (649/1376 (47.2%)) compared with being an international medical graduate (130/520 (25.0%)) (OR 2.68 (95% CI 2.14 to 3.36), p<0.001). There was strong evidence that the proportion of candidates passing the examination differed for graduates of the 19 different UK medical schools (Fisher's exact test p<0.001). In multivariate logistic regression analysis, after adjustment for age, sex and whether the part 1A examination was taken concurrently, being a UK graduate was still strongly associated with passing the examination (OR 3.17 (95% CI 2.41 to 4.17), p<0.001). UK graduates performed best at 26-27 years of age (52.4% pass rate), whereas overseas graduates performed best at ≥38 years of age (50.8% pass rate). MRCPCH part 1B outcome was related to place of primary medical qualification, with a significantly lower pass rate for international medical graduates compared with UK graduates, as well as significant variation in examination outcome between graduates from different UK medical schools. These data may be used to guide new initiatives to improve support and education for these trainees and to inform development of undergraduate curricula and help trainees prepare more successfully for postgraduate examinations.

  16. Data registration for automated non-destructive inspection with multiple data sets

    NASA Astrophysics Data System (ADS)

    Tippetts, T.; Brierley, N.; Cawley, P.

    2013-01-01

    In many NDE applications, multiple sources of data are available covering the same region of a part under inspection. These overlapping data can come from intersecting scan patterns, sensors in an array configuration, or repeated inspections at different times. In many cases these data sets are analysed independently, with separate assessments for each channel or data file. It should be possible to improve the overall reliability of the inspection by combining multiple sources of information, simultaneously increasing the Probability of Detection (POD) and decreasing the Probability of False Alarm (PFA). Data registration, i.e. mapping the data to matching coordinates in space, is both an essential prerequisite and a challenging obstacle to this type of data fusion. This paper describes optimization techniques for matching and aligning features in NDE data. Examples from automated ultrasound inspection of aircraft engine discs illustrate the approach.

  17. Single-Pass Percutaneous Liver Biopsy for Diffuse Liver Disease Using an Automated Device: Experience in 154 Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivera-Sanfeliz, Gerant, E-mail: gerantrivera@ucsd.edu; Kinney, Thomas B.; Rose, Steven C.

    2005-06-15

    Purpose: To describe our experience with ultrasound (US)-guided percutaneous liver biopsies using the INRAD 18G Express core needle biopsy system. Methods: One hundred and fifty-four consecutive percutaneous core liver biopsy procedures were performed in 153 men in a single institution over 37 months. The medical charts, pathology reports, and radiology files were retrospectively reviewed. The number of needle passes, type of guidance, change in hematocrit level, and adequacy of specimens for histologic analysis were evaluated. Results: All biopsies were performed for histologic staging of chronic liver diseases. The majority of patients had hepatitis C (134/153, 90.2%). All patients were discharged to home after 4 hr of postprocedural observation. In 145 of 154 (94%) biopsies, a single needle pass was sufficient for diagnosis. US guidance was utilized in all but one of the procedures (153/154, 99.4%). The mean hematocrit decrease was 1.2% (44.1-42.9%). Pain requiring narcotic analgesia, the most frequent complication, occurred in 28 of 154 procedures (18.2%). No major complications occurred. The specimens were diagnostic in 152 of 154 procedures (98.7%). Conclusions: Single-pass percutaneous US-guided liver biopsy with the INRAD 18G Express core needle biopsy system is safe and provides definitive pathologic diagnosis of chronic liver disease. It can be performed on an outpatient basis. Routine post-biopsy monitoring of hematocrit level in stable, asymptomatic patients is probably not warranted.

  18. Tools for Coordinated Planning Between Observatories

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Fishman, Mark; Grella, Vince; Kerbel, Uri; Maks, Lori; Misra, Dharitri; Pell, Vince; Powers, Edward I. (Technical Monitor)

    2001-01-01

    With the realization of NASA's era of great observatories, there are now more than three space-based telescopes operating in different wavebands. This situation provides astronomers with a unique opportunity to simultaneously observe with multiple observatories. Yet scheduling multiple observatories simultaneously is highly inefficient when compared to observations using only one single observatory. Thus, programs using multiple observatories are limited not due to scientific restrictions, but due to operational inefficiencies. At present, multi-observatory programs are conducted by submitting observing proposals separately to each concerned observatory. To assure that the proposed observations can be scheduled, each observatory's staff has to check that the observations are valid and meet all the constraints for their own observatory; in addition, they have to verify that the observations satisfy the constraints of the other observatories. Thus, coordinated observations require painstaking manual collaboration among the observatory staff at each observatory. Due to the lack of automated tools for coordinated observations, this process is time consuming, error-prone, and the outcome of the requests is not certain until the very end. To increase observatory operations efficiency, such manpower intensive processes need to undergo re-engineering. To overcome this critical deficiency, Goddard Space Flight Center's Advanced Architectures and Automation Branch is developing a prototype effort called the Visual Observation Layout Tool (VOLT). The main objective of the VOLT project is to provide visual tools to help automate the planning of coordinated observations by multiple astronomical observatories, as well as to increase the scheduling probability of all observations.

  19. Predicting Robust Vocabulary Growth from Measures of Incremental Learning

    ERIC Educational Resources Information Center

    Frishkoff, Gwen A.; Perfetti, Charles A.; Collins-Thompson, Kevyn

    2011-01-01

    We report a study of incremental learning of new word meanings over multiple episodes. A new method called MESA (Markov Estimation of Semantic Association) tracked this learning through the automated assessment of learner-generated definitions. The multiple word learning episodes varied in the strength of contextual constraint provided by…

  20. Systems and Algorithms for Automated Collaborative Observation Using Networked Robotic Cameras

    ERIC Educational Resources Information Center

    Xu, Yiliang

    2011-01-01

    The development of telerobotic systems has evolved from Single Operator Single Robot (SOSR) systems to Multiple Operator Multiple Robot (MOMR) systems. The relationship between human operators and robots follows the master-slave control architecture and the requests for controlling robot actuation are completely generated by human operators. …

  1. Using support vector machines to identify literacy skills: Evidence from eye movements.

    PubMed

    Lou, Ya; Liu, Yanping; Kaakinen, Johanna K; Li, Xingshan

    2017-06-01

    Is it possible to infer readers' literacy skills by analyzing their eye movements during text reading? This study used Support Vector Machines (SVM) to analyze eye movement data from 61 undergraduate students who read a multiple-paragraph, multiple-topic expository text. Forward fixation time, first-pass rereading time, second-pass fixation time, and regression path reading time on different regions of the text were provided as features. The SVM classification algorithm assisted in distinguishing high-literacy-skilled readers from low-literacy-skilled readers with 80.3% accuracy. Results demonstrate the effectiveness of combining eye tracking and machine learning techniques to detect readers with low literacy skills, and suggest that such approaches can potentially be used to predict other cognitive abilities.
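
    The analysis pipeline is conceptually simple: one feature vector of region-level reading times per reader, fed to an SVM with cross-validation. A schematic sketch assuming scikit-learn, with random stand-in data rather than the study's eye-tracking records:

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    n_readers, n_regions = 61, 10      # region count is an assumption
    # Four measures per region: forward fixation time, first-pass rereading time,
    # second-pass fixation time, regression path reading time
    X = rng.random((n_readers, 4 * n_regions))
    y = rng.integers(0, 2, n_readers)  # 0 = low, 1 = high literacy skill (placeholder)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
    ```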

  2. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    NASA Technical Reports Server (NTRS)

    Axdahl, Erik L.

    2015-01-01

    Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high fidelity numerical analysis tools into an automated framework and applying that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.

  3. Policy-based secure communication with automatic key management for industrial control and automation systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chernoguzov, Alexander; Markham, Thomas R.; Haridas, Harshal S.

    A method includes generating at least one access vector associated with a specified device in an industrial process control and automation system. The specified device has one of multiple device roles. The at least one access vector is generated based on one or more communication policies defining communications between one or more pairs of device roles in the industrial process control and automation system, where each pair of device roles includes the device role of the specified device. The method also includes providing the at least one access vector to at least one of the specified device and one or more other devices in the industrial process control and automation system in order to control communications to or from the specified device.
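
    A highly simplified sketch of the idea, with invented data structures (the patent itself does not publish code): derive a device's access vector from communication policies defined over pairs of device roles.

    ```python
    # Assumed policy table: (source role, destination role) -> allowed protocols
    POLICIES = {
        ("controller", "sensor"): {"modbus-read"},
        ("controller", "actuator"): {"modbus-write"},
        ("historian", "controller"): {"opc-ua"},
    }

    def access_vector(device_role, peer_roles):
        """Collect every permitted (direction, peer role, protocol) entry whose
        role pair involves the specified device's role."""
        vector = []
        for (src, dst), protocols in POLICIES.items():
            if src == device_role and dst in peer_roles:
                vector += [("->", dst, p) for p in sorted(protocols)]
            if dst == device_role and src in peer_roles:
                vector += [("<-", src, p) for p in sorted(protocols)]
        return vector

    print(access_vector("controller", {"sensor", "actuator", "historian"}))
    ```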

  4. Sci—Thur PM: Planning and Delivery — 03: Automated delivery and quality assurance of a modulated electron radiation therapy plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Connell, T; Papaconstadopoulos, P; Alexander, A

    2014-08-15

    Modulated electron radiation therapy (MERT) offers the potential to improve healthy tissue sparing through increased dose conformity. Challenges remain, however, in accurate beamlet dose calculation, plan optimization, collimation method and delivery accuracy. In this work, we investigate the accuracy and efficiency of an end-to-end MERT plan and automated-delivery workflow for the electron boost portion of a previously treated whole breast irradiation case. Dose calculations were performed using Monte Carlo methods and beam weights were determined using a research-based treatment planning system capable of inverse optimization. The plan was delivered to radiochromic film placed in a water equivalent phantom for verification, using an automated motorized tertiary collimator. The automated delivery, which covered 4 electron energies, 196 subfields and 6183 total MU, was completed in 25.8 minutes, including 6.2 minutes of beam-on time, with the remainder of the delivery time spent on collimator leaf motion and the automated interfacing with the accelerator in service mode. The delivery time could be reduced by 5.3 minutes with minor electron collimator modifications, and the beam-on time could be reduced by an estimated factor of 2–3 through redesign of the scattering foils. Comparison of the planned and delivered film dose gave 3%/3 mm gamma pass rates of 62.1, 99.8, 97.8, 98.3, and 98.7 percent for the 9, 12, 16, 20 MeV, and combined energy deliveries respectively. Good results were also seen in the delivery verification performed with a MapCHECK 2 device. The results showed that accurate and efficient MERT delivery is possible with current technologies.

  5. Disposable world-to-chip interface for digital microfluidics

    DOEpatents

    Van Dam, R. Michael; Shah, Gaurav; Keng, Pei-Yuin

    2017-05-16

    The present disclosure sets forth microfluidic chip interfaces for use with digital microfluidic processes. Methods and devices according to the present disclosure utilize compact, integrated platforms that interface with a chip upstream and downstream of the reaction, as well as between intermediate reaction steps if needed. In some embodiments these interfaces are automated, including automation of a multiple-reagent process. Various reagent delivery systems and methods are also disclosed.

  6. Organizational principles of cloud storage to support collaborative biomedical research.

    PubMed

    Kanbar, Lara J; Shalish, Wissam; Robles-Rubio, Carlos A; Precup, Doina; Brown, Karen; Sant'Anna, Guilherme M; Kearney, Robert E

    2015-08-01

    This paper describes organizational guidelines and an anonymization protocol for the management of sensitive information in interdisciplinary, multi-institutional studies with multiple collaborators. This protocol is flexible, automated, and suitable for use in cloud-based projects as well as for publication of supplementary information in journal papers. A sample implementation of the anonymization protocol is illustrated for an ongoing study dealing with Automated Prediction of EXtubation readiness (APEX).
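
    One common building block of such an anonymization protocol is deterministic pseudonymization, so that the same patient maps to the same code at every collaborating site without exposing the identifier. A minimal sketch using keyed hashing; the secret handling shown here is illustrative, not the APEX project's actual protocol.

    ```python
    import hashlib
    import hmac

    SITE_SECRET = b"replace-with-a-shared-project-secret"  # distributed out of band

    def pseudonymize(patient_id: str, length: int = 12) -> str:
        """Map an identifier to a stable, non-reversible code via HMAC-SHA256."""
        digest = hmac.new(SITE_SECRET, patient_id.encode(), hashlib.sha256)
        return digest.hexdigest()[:length]

    # The same ID always yields the same pseudonym wherever the key is held
    print(pseudonymize("MRN-00421"))
    print(pseudonymize("MRN-00421") == pseudonymize("MRN-00421"))  # True
    ```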

  7. Automated Long-Term Monitoring of Parallel Microfluidic Operations Applying a Machine Vision-Assisted Positioning Method

    PubMed Central

    Yip, Hon Ming; Li, John C. S.; Cui, Xin; Gao, Qiannan; Leung, Chi Chiu

    2014-01-01

    As microfluidics has been applied extensively in many cell and biochemical applications, monitoring the related processes is an important requirement. In this work, we design and fabricate a high-throughput microfluidic device which contains 32 microchambers to perform automated parallel microfluidic operations and monitoring on an automated stage of a microscope. Images are captured at multiple spots on the device during the operations for monitoring samples in microchambers in parallel; yet the device positions may vary at different time points throughout operations as the device moves back and forth on a motorized microscopic stage. Here, we report an image-based positioning strategy to realign the chamber position before every recording of microscopic image. We fabricate alignment marks at defined locations next to the chambers in the microfluidic device as reference positions. We also develop image processing algorithms to recognize the chamber positions in real-time, followed by realigning the chambers to their preset positions in the captured images. We perform experiments to validate and characterize the device functionality and the automated realignment operation. Together, this microfluidic realignment strategy can be a platform technology to achieve precise positioning of multiple chambers for general microfluidic applications requiring long-term parallel monitoring of cell and biochemical activities. PMID:25133248
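
    A minimal sketch of such image-based realignment, assuming OpenCV: locate a fabricated alignment mark by normalized cross-correlation template matching, then translate the image so the chamber returns to its preset position. The synthetic frame and preset coordinates are invented, not the authors' implementation.

    ```python
    import cv2
    import numpy as np

    # Synthetic frame: a bright square stands in for a fabricated alignment mark
    frame = np.zeros((480, 640), np.uint8)
    frame[200:220, 310:330] = 255

    # Template of the mark, with dark margin so the match has contrast
    template = frame[190:230, 300:340].copy()

    # Locate the mark by normalized cross-correlation
    scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, found_xy = cv2.minMaxLoc(scores)     # (x, y) of the best match

    preset_xy = (295, 185)                        # assumed preset mark position
    dx, dy = preset_xy[0] - found_xy[0], preset_xy[1] - found_xy[1]

    # Translate the frame so the chambers return to their preset positions
    M = np.float32([[1, 0, dx], [0, 1, dy]])
    realigned = cv2.warpAffine(frame, M, (frame.shape[1], frame.shape[0]))
    print("mark found at", found_xy, "-> correction", (dx, dy))
    ```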

  8. Feasibility of using a reliable automated Doppler flow velocity measurements for research and clinical practices

    NASA Astrophysics Data System (ADS)

    Zolgharni, Massoud; Dhutia, Niti M.; Cole, Graham D.; Willson, Keith; Francis, Darrel P.

    2014-03-01

    Echocardiographers are often reluctant to make the considerable time investment required to take additional multiple measurements of Doppler velocity. The main hurdle to obtaining multiple measurements is the time required to manually trace a series of Doppler traces. To make it easier to analyse more beats, we present an automated system for Doppler envelope quantification. It analyses long Doppler strips, spanning many heartbeats, and does not require the electrocardiogram to isolate individual beats. We tested its measurement of velocity-time-integral and peak-velocity against the reference standard, defined as the average of three experts who each made three separate measurements. The automated measurements of velocity-time-integral showed strong correspondence (R2 = 0.94) and good Bland-Altman agreement (SD = 6.92%) with the reference consensus expert values, and indeed performed as well as the individual experts (R2 = 0.90 to 0.96, SD = 5.66% to 7.64%). The same performance was observed for peak-velocities (R2 = 0.98, SD = 2.95%) and (R2 = 0.93 to 0.98, SD = 2.94% to 5.12%). This automated technology allows >10 times as many beats to be acquired and analysed compared to the conventional manual approach, with each beat maintaining its accuracy.
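
    The envelope-tracing step can be pictured with a short sketch: for each time frame of a Doppler strip, take the fastest velocity bin above a noise floor, then integrate the envelope to obtain the velocity-time integral. This is a schematic reading of the approach, assuming a precomputed spectrogram; the threshold rule is an invented placeholder, not the authors' algorithm.

        import numpy as np

        def envelope_and_vti(strip, velocities, frame_dt, thresh_frac=0.2):
            # strip: intensity[velocity_bin, time_frame];
            # velocities: ascending bin centres in cm/s
            noise_floor = thresh_frac * strip.max()   # invented threshold rule
            env = np.zeros(strip.shape[1])
            for t in range(strip.shape[1]):
                above = np.nonzero(strip[:, t] > noise_floor)[0]
                env[t] = velocities[above[-1]] if above.size else 0.0  # fastest bin
            peak_velocity = env.max()
            vti = env.sum() * frame_dt    # rectangle-rule velocity-time integral
            return env, peak_velocity, vti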

  9. Automated determination of arterial input function for DCE-MRI of the prostate

    NASA Astrophysics Data System (ADS)

    Zhu, Yingxuan; Chang, Ming-Ching; Gupta, Sandeep

    2011-03-01

    Prostate cancer is one of the most common cancers in the world. Dynamic contrast enhanced MRI (DCE-MRI) provides an opportunity for non-invasive diagnosis, staging, and treatment monitoring. Quantitative analysis of DCE-MRI relies on determination of an accurate arterial input function (AIF). Although several methods for automated AIF detection have been proposed in the literature, none are optimized for use in prostate DCE-MRI, which is particularly challenging due to large spatial signal inhomogeneity. In this paper, we propose a fully automated method for determining the AIF from prostate DCE-MRI. Our method is based on modeling pixel uptake curves as gamma variate functions (GVF). First, we analytically compute bounds on the GVF parameters for more robust fitting. Next, we approximate a GVF for each pixel based on local time-domain information, and eliminate pixels with falsely estimated AIFs using the deduced upper and lower bounds. This makes the algorithm robust to signal inhomogeneity. After that, using spatial information such as similarity and distance between pixels, we formulate the global AIF selection as an energy minimization problem and solve it using a message passing algorithm to further rule out the weak pixels and optimize the detected AIF. Our method is fully automated without training or a priori setting of parameters. Experimental results on clinical data have shown that our method obtains promising detection accuracy (all detected pixels inside major arteries) and a very good match with expert-traced manual AIFs.
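
    For reference, a gamma variate has the form C(t) = A·(t-t0)^alpha·exp(-(t-t0)/beta) for t > t0 and zero before. The sketch below fits one pixel's uptake curve with scipy; the starting values and bounds are illustrative examples, not the analytically derived bounds of the paper.

        import numpy as np
        from scipy.optimize import curve_fit

        def gvf(t, A, t0, alpha, beta):
            dt = np.clip(t - t0, 0.0, None)       # zero before bolus arrival t0
            return A * dt**alpha * np.exp(-dt / beta)

        def fit_pixel(t, c):
            p0 = (c.max(), t[np.argmax(c)] / 2.0, 2.0, 2.0)  # rough initial guess
            lo = (0.0, 0.0, 0.5, 0.1)                        # example bounds only
            hi = (10 * c.max(), t[-1], 10.0, 20.0)
            popt, _ = curve_fit(gvf, t, c, p0=p0, bounds=(lo, hi))
            # a pixel would be rejected if popt violated the deduced bounds
            return popt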

  10. Optimization of Spiral-Based Pulse Sequences for First Pass Myocardial Perfusion Imaging

    PubMed Central

    Salerno, Michael; Sica, Christopher T.; Kramer, Christopher M.; Meyer, Craig H.

    2010-01-01

    While spiral trajectories have multiple attractive features such as their isotropic resolution, acquisition efficiency, and robustness to motion, there has been limited application of these techniques to first pass perfusion imaging because of potential off-resonance and inconsistent data artifacts. Spiral trajectories may also be less sensitive to dark-rim artifacts (DRA) that are caused, at least in part, by cardiac motion. By careful consideration of the spiral trajectory readout duration, flip angle strategy, and image reconstruction strategy, spiral artifacts can be abated to create high quality first pass myocardial perfusion images with high SNR. The goal of this paper was to design interleaved spiral pulse sequences for first-pass myocardial perfusion imaging, and to evaluate them clinically for image quality and the presence of dark-rim, blurring, and dropout artifacts. PMID:21590802

  11. Playing with the Multiple Intelligences: How Play Helps Them Grow

    ERIC Educational Resources Information Center

    Eberle, Scott G.

    2011-01-01

    Howard Gardner first posited a list of "multiple intelligences" as a liberating alternative to the assumptions underlying traditional IQ testing in his widely read study "Frames of Mind" (1983). Play has appeared only in passing in Gardner's thinking about intelligence, however, even though play instructs and trains the verbal, interpersonal,…

  12. Multiple-Ring Digital Communication Network

    NASA Technical Reports Server (NTRS)

    Kirkham, Harold

    1992-01-01

    Optical-fiber digital communication network to support data-acquisition and control functions of electric-power-distribution networks. Optical-fiber links of communication network follow power-distribution routes. Since fiber crosses open power switches, communication network includes multiple interconnected loops with occasional spurs. At each intersection a node is needed. Nodes of communication network include power-distribution substations and power-controlling units. In addition to serving data acquisition and control functions, each node acts as repeater, passing on messages to next node(s). Multiple-ring communication network operates on new AbNET protocol and features fiber-optic communication.

  13. Automated ensemble assembly and validation of microbial genomes.

    PubMed

    Koren, Sergey; Treangen, Todd J; Hill, Christopher M; Pop, Mihai; Phillippy, Adam M

    2014-05-03

    The continued democratization of DNA sequencing has sparked a new wave of development of genome assembly and assembly validation methods. As individual research labs, rather than centralized centers, begin to sequence the majority of new genomes, it is important to establish best practices for genome assembly. However, recent evaluations such as GAGE and the Assemblathon have concluded that there is no single best approach to genome assembly. Instead, it is preferable to generate multiple assemblies and validate them to determine which is most useful for the desired analysis; this is a labor-intensive process that is often impossible or unfeasible. To encourage best practices supported by the community, we present iMetAMOS, an automated ensemble assembly pipeline; iMetAMOS encapsulates the process of running, validating, and selecting a single assembly from multiple assemblies. iMetAMOS packages several leading open-source tools into a single binary that automates parameter selection and execution of multiple assemblers, scores the resulting assemblies based on multiple validation metrics, and annotates the assemblies for genes and contaminants. We demonstrate the utility of the ensemble process on 225 previously unassembled Mycobacterium tuberculosis genomes as well as a Rhodobacter sphaeroides benchmark dataset. On these real data, iMetAMOS reliably produces validated assemblies and identifies potential contamination without user intervention. In addition, intelligent parameter selection produces assemblies of R. sphaeroides comparable to or exceeding the quality of those from the GAGE-B evaluation, affecting the relative ranking of some assemblers. Ensemble assembly with iMetAMOS provides users with multiple, validated assemblies for each genome. Although computationally limited to small or mid-sized genomes, this approach is the most effective and reproducible means for generating high-quality assemblies and enables users to select an assembly best tailored to their specific needs.
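
    To make the ensemble idea concrete, the toy sketch below ranks candidate assemblies by contig N50, one standard validation metric. iMetAMOS combines many metrics and annotations, so this is an illustration only, with hypothetical assembler names and contig lengths.

        def n50(contig_lengths):
            # length of the contig at which half the total assembly is reached
            total = sum(contig_lengths)
            running = 0
            for length in sorted(contig_lengths, reverse=True):
                running += length
                if running >= total / 2:
                    return length

        assemblies = {                        # hypothetical contig-length sets
            "assembler_a": [120_000, 80_000, 5_000],
            "assembler_b": [60_000, 55_000, 50_000, 40_000],
        }
        best = max(assemblies, key=lambda name: n50(assemblies[name]))
        print(best, n50(assemblies[best]))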

  14. Automated Big Data Analysis in Bottom-up and Targeted Proteomics

    PubMed Central

    van der Plas-Duivesteijn, Suzanne; Domański, Dominik; Smith, Derek; Borchers, Christoph; Palmblad, Magnus; Mohammed, Yassene

    2014-01-01

    Similar to other data-intensive sciences, analyzing mass spectrometry-based proteomics data involves multiple steps and diverse software using different algorithms and data formats and sizes. Besides the distributed and evolving nature of the data in online repositories, another challenge is that scientists have to deal with many steps of analysis pipelines. Documented data processing is also becoming an essential part of the overall reproducibility of the results. Thanks to different e-Science initiatives, scientific workflow engines have become a means for automated, sharable and reproducible data processing. While these are designed as general tools, they can be employed to solve different challenges that we are facing in handling our Big Data. Here we present three use cases: improving the performance of different spectral search engines by decomposing input data and recomposing the resulting files, building spectral libraries from more than 20 million spectra, and integrating information from multiple resources to select the most appropriate peptides for targeted proteomics analyses. The three use cases demonstrate different challenges in exploiting proteomics data analysis. In the first we integrate local and cloud processing resources in order to obtain better performance, resulting in a more than 30-fold speed improvement. By considering search engines as legacy software, our solution is applicable to multiple search algorithms. The second use case is an example of automated processing of many data files of different sizes and locations, starting with raw data and ending with the final, ready-to-use library. This demonstrates the robustness and fault tolerance when dealing with huge amounts of data stored in multiple files. The third use case demonstrates retrieval and integration of information and data from multiple online repositories. In addition to the diversity of data formats and Web interfaces, this use case also illustrates how to deal with incomplete data.

  15. Combined process automation for large-scale EEG analysis.

    PubMed

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
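
    Steps (2) and (3) of such a pipeline can be sketched in a few lines: a zero-phase Butterworth band-pass produces a user-defined band frequency waveform, and a robust threshold rule flags spikes. This is a generic illustration assuming scipy, not the authors' code; real spike-sorting is considerably richer.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def band_waveform(eeg, fs, lo_hz, hi_hz, order=4):
            # step 2: zero-phase band-pass for a user-defined frequency band
            b, a = butter(order, [lo_hz, hi_hz], btype="bandpass", fs=fs)
            return filtfilt(b, a, eeg)

        def detect_spikes(trace, k=4.0):
            # step 3 (simplified): robust threshold from median absolute value
            thresh = k * np.median(np.abs(trace)) / 0.6745
            above = np.nonzero(np.abs(trace) > thresh)[0]
            if above.size == 0:
                return above
            # keep the first sample of each suprathreshold run as one spike
            return above[np.insert(np.diff(above) > 1, 0, True)]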

  16. Passing Messages between Biological Networks to Refine Predicted Interactions

    PubMed Central

    Glass, Kimberly; Huttenhower, Curtis; Quackenbush, John; Yuan, Guo-Cheng

    2013-01-01

    Regulatory network reconstruction is a fundamental problem in computational biology. There are significant limitations to such reconstruction using individual datasets, and increasingly people attempt to construct networks using multiple, independent datasets obtained from complementary sources, but methods for this integration are lacking. We developed PANDA (Passing Attributes between Networks for Data Assimilation), a message-passing model using multiple sources of information to predict regulatory relationships, and used it to integrate protein-protein interaction, gene expression, and sequence motif data to reconstruct genome-wide, condition-specific regulatory networks in yeast as a model. The resulting networks were not only more accurate than those produced using individual data sets and other existing methods, but they also captured information regarding specific biological mechanisms and pathways that were missed using other methodologies. PANDA is scalable to higher eukaryotes, applicable to specific tissue or cell type data and conceptually generalizable to include a variety of regulatory, interaction, expression, and other genome-scale data. An implementation of the PANDA algorithm is available at www.sourceforge.net/projects/panda-net. PMID:23741402

  17. Multiple point mutations in a shuttle vector propagated in human cells: evidence for an error-prone DNA polymerase activity.

    PubMed

    Seidman, M M; Bredberg, A; Seetharam, S; Kraemer, K H

    1987-07-01

    Mutagenesis was studied at the DNA-sequence level in human fibroblast and lymphoid cells by use of a shuttle vector plasmid, pZ189, containing a suppressor tRNA marker gene. In a series of experiments, 62 plasmids were recovered that had two to six base substitutions in the 160-base-pair marker gene. Approximately 20-30% of the mutant plasmids that were recovered after passing ultraviolet-treated pZ189 through a repair-proficient human fibroblast line contained these multiple mutations. In contrast, passage of ultraviolet-treated pZ189 through an excision-repair-deficient (xeroderma pigmentosum) line yielded only 2% multiple base substitution mutants. Introducing a single-strand nick in otherwise unmodified pZ189 adjacent to the marker, followed by passage through the xeroderma pigmentosum cells, resulted in about 66% multiple base substitution mutants. The multiple mutations were found in a 160-base-pair region containing the marker gene but were rarely found in an adjacent 170-base-pair region. Passing ultraviolet-treated or nicked pZ189 through a repair-proficient human B-cell line also yielded multiple base substitution mutations in 20-33% of the mutant plasmids. An explanation for these multiple mutations is that they were generated by an error-prone polymerase while filling gaps. These mutations share many of the properties displayed by mutations in the immunoglobulin hypervariable regions.

  18. Classification of Automated Search Traffic

    NASA Astrophysics Data System (ADS)

    Buehrer, Greg; Stokes, Jack W.; Chellapilla, Kumar; Platt, John C.

    As web search providers seek to improve both relevance and response times, they are challenged by the ever-increasing tax of automated search query traffic. Third party systems interact with search engines for a variety of reasons, such as monitoring a web site’s rank, augmenting online games, or maliciously altering click-through rates. In this paper, we investigate automated traffic (sometimes referred to as bot traffic) in the query stream of a large search engine provider. We define automated traffic as any search query not generated by a human in real time. We first provide examples of different categories of query logs generated by automated means. We then develop many different features that distinguish between queries generated by people searching for information and those generated by automated processes. We categorize these features into two classes, either as an interpretation of the physical model of human interactions or as behavioral patterns of automated interactions. Using these detection features, we next classify the query stream using multiple binary classifiers. In addition, a multiclass classifier is then developed to identify subclasses of both normal and automated traffic. An active learning algorithm is used to suggest which user sessions to label to improve the accuracy of the multiclass classifier, while also seeking to discover new classes of automated traffic. A performance analysis is then provided. Finally, the multiclass classifier is used to predict the subclass distribution for the search query stream.
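
    A minimal sketch of the binary-classification stage follows, with invented session features (query rate, inter-query gap, click-through rate) standing in for the paper's physical-model and behavioral features; the data are synthetic.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        # columns: queries/minute, mean inter-query gap (s), click-through rate
        humans = np.column_stack([rng.normal(2, 0.5, 500),
                                  rng.normal(30, 8, 500),
                                  rng.uniform(0.3, 0.9, 500)])
        bots = np.column_stack([rng.normal(40, 10, 500),
                                rng.normal(1.5, 0.5, 500),
                                rng.uniform(0.0, 0.1, 500)])
        X = np.vstack([humans, bots])
        y = np.array([0] * 500 + [1] * 500)        # 0 = human, 1 = automated

        clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
        print(clf.predict([[35.0, 2.0, 0.05]]))    # likely flagged as automated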

  19. Data Streams: An Overview and Scientific Applications

    NASA Astrophysics Data System (ADS)

    Aggarwal, Charu C.

    In recent years, advances in hardware technology have facilitated the ability to collect data continuously. Simple transactions of everyday life such as using a credit card, a phone, or browsing the web lead to automated data storage. Similarly, advances in information technology have led to large flows of data across IP networks. In many cases, these large volumes of data can be mined for interesting and relevant information in a wide variety of applications. When the volume of the underlying data is very large, it leads to a number of computational and mining challenges: With increasing volume of the data, it is no longer possible to process the data efficiently by using multiple passes. Rather, one can process a data item at most once. This leads to constraints on the implementation of the underlying algorithms. Therefore, stream mining algorithms typically need to be designed so that the algorithms work with one pass of the data. In most cases, there is an inherent temporal component to the stream mining process. This is because the data may evolve over time. This behavior of data streams is referred to as temporal locality. Therefore, a straightforward adaptation of one-pass mining algorithms may not be an effective solution to the task. Stream mining algorithms need to be carefully designed with a clear focus on the evolution of the underlying data. Another important characteristic of data streams is that they are often mined in a distributed fashion. Furthermore, the individual processors may have limited processing and memory. Examples of such cases include sensor networks, in which it may be desirable to perform in-network processing of data streams with limited processing and memory [1, 2]. This chapter will provide an overview of the key challenges in stream mining algorithms which arise from the unique setup in which these problems are encountered. This chapter is organized as follows. In the next section, we will discuss the generic challenges that stream mining poses to a variety of data management and data mining problems. The next section also deals with several issues which arise in the context of data stream management. In Sect. 3, we discuss several mining algorithms on the data stream model. Section 4 discusses various scientific applications of data streams. Section 5 discusses the research directions and conclusions.
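
    Reservoir sampling is a classic example of the one-pass constraint described above: it maintains a uniform random sample of k items while touching each stream element exactly once. A minimal sketch:

        import random

        def reservoir_sample(stream, k, seed=0):
            rng = random.Random(seed)
            reservoir = []
            for i, item in enumerate(stream):
                if i < k:
                    reservoir.append(item)     # fill the reservoir first
                else:
                    j = rng.randint(0, i)      # keep item with probability k/(i+1)
                    if j < k:
                        reservoir[j] = item
            return reservoir

        print(reservoir_sample(range(10**6), k=5))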

  20. Automatically updating predictive modeling workflows support decision-making in drug design.

    PubMed

    Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O

    2016-09-01

    Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound optimization related questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles the performance deteriorates within weeks. Frequent automated updates of predictive models ensure best predictions. Consensus between multiple modeling approaches increases the prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
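
    Two of these recommendations, 2-bin classification and consensus between multiple modeling approaches, can be illustrated with a synthetic sketch; the data, the pair of models, and the confidence band are arbitrary examples, not the authors' setup.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 8))                  # synthetic descriptors
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # 2-bin label: active or not

        models = [RandomForestClassifier(random_state=0).fit(X, y),
                  LogisticRegression(max_iter=1000).fit(X, y)]
        probs = np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)
        confident = np.abs(probs - 0.5) > 0.3          # consensus confidence band
        print(f"{confident.mean():.0%} of compounds predicted with high confidence")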

  1. Enabling a systems biology knowledgebase with gaggle and firegoose

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baliga, Nitin S.

    The overall goal of this project was to extend the existing Gaggle and Firegoose systems to develop an open-source technology that runs over the web and links desktop applications with many databases and software applications. This technology would enable researchers to incorporate workflows for data analysis that can be executed from this interface to other online applications. The four specific aims were to (1) provide one-click mapping of genes, proteins, and complexes across databases and species; (2) enable multiple simultaneous workflows; (3) expand sophisticated data analysis for online resources; and (4) enhance open-source development of the Gaggle-Firegoose infrastructure. Gaggle is an open-source Java software system that integrates existing bioinformatics programs and data sources into a user-friendly, extensible environment to allow interactive exploration, visualization, and analysis of systems biology data. Firegoose is an extension to the Mozilla Firefox web browser that enables data transfer between websites and desktop tools including Gaggle. In the last phase of this funding period, we made substantial progress on development and application of the Gaggle integration framework. We implemented the workspace in the Network Portal. Users can capture data from Firegoose and save them to the workspace. Users can create workflows to start multiple software components programmatically and pass data between them. Results of analysis can be saved to the cloud so that they can be easily restored on any machine. We also developed the Gaggle Chrome Goose, a plugin for the Google Chrome browser, in tandem with an opencpu server in the Amazon EC2 cloud. This allows users to interactively perform data analysis on a single web page using the R packages deployed on the opencpu server. The cloud-based framework facilitates collaboration between researchers from multiple organizations. We have made a number of enhancements to the cmonkey2 application to enable and improve the integration within different environments, and we have created a new tools pipeline for generating EGRIN2 models in a largely automated way.

  2. Performance analysis of distributed applications using automatic classification of communication inefficiencies

    DOEpatents

    Vetter, Jeffrey S.

    2005-02-01

    The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
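
    The classification idea can be sketched with scikit-learn: events measured by microbenchmarks are labeled efficient or inefficient and used to train a decision tree that then classifies traced MPI events. The features (message size, achieved bandwidth) and values below are invented placeholders, not the patent's feature set.

        from sklearn.tree import DecisionTreeClassifier

        # (bytes, MB/s) pairs labeled by hypothetical microbenchmarks
        X_train = [[1_000, 80.0], [1_000, 2.0],
                   [1_000_000, 900.0], [1_000_000, 40.0]]
        y_train = ["efficient", "inefficient", "efficient", "inefficient"]

        tree = DecisionTreeClassifier(max_depth=3).fit(X_train, y_train)
        print(tree.predict([[500_000, 25.0]]))   # classify a traced MPI event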

  3. Reliability Assessment of the Defense Automated Neurobehavioral Assessment (DANA) in Extreme Environments

    DTIC Science & Technology

    2015-05-01

    multiple automated cognitive tests, data management and reporting capabilities, and an executive menu. The DANA battery was given on a Trimble NOMAD ... handheld computing device using a stylus for consistency. The NOMAD runs a custom version of the Android Operating System and has a color 3.5 inch ... digit pairs are shown below a key, and the participant indicates if it matches the one in the key. PRO: this test targets decision-making capabilities

  4. Demonstration of Human-Autonomy Teaming Principles

    NASA Technical Reports Server (NTRS)

    Shively, Robert Jay

    2016-01-01

    Known problems with automation include lack of mode awareness, automation brittleness, and risk of miscalibrated trust. Human-Autonomy Teaming (HAT) is essential for improving these problems. We have identified some critical components of HAT and ran a part-task study to introduce these components to a ground station that supports flight following of multiple aircraft. Our goal was to demonstrate, evaluate, and refine HAT principles. This presentation provides a brief summary of the study and initial findings.

  5. Spectral Analysis of Breast Cancer on Tissue Microarrays: Seeing Beyond Morphology

    DTIC Science & Technology

    2005-04-01

    Harvey N., Szymanski J.J., Bloch J.J., Mitchell M. Investigation of image feature extraction by a genetic algorithm. Proc. SPIE 1999;3812:24-31. ... Automated feature extraction using multiple data sources. Proc. SPIE 2003;5099:190-200. ... Spectral-Spatial Analysis of Urine Cytology, Angeletti et al. ... Appendix contents: 1. Harvey, N.R., Levenson, R.M., Rimm, D.L. (2003) Investigation of Automated Feature Extraction Techniques for Applications in

  6. Increasing capacity of baseband digital data communication networks

    DOEpatents

    Frankel, Robert S.; Herman, Alexander

    1985-01-01

    This invention provides broadband network capabilities for baseband digital collision detection transceiver equipment for communication between a plurality of data stations by affording simultaneous transmission of multiple channels over a broadband pass transmission link such as a coaxial cable. Thus, a fundamental carrier wave is transmitted on said link, received at local data stations and used to detect signals on different baseband channels for reception. For transmission the carrier wave typically is used for segregating a plurality of at least two transmission channels into typically single sideband upper and lower pass bands of baseband bandwidth capability adequately separated with guard bands to permit simple separation for receiving by means of pass band filters, etc.

  7. Increasing capacity of baseband digital data communication networks

    DOEpatents

    Frankel, R.S.; Herman, A.

    This invention provides broadband network capabilities for baseband digital collision detection transceiver equipment for communication between a plurality of data stations by affording simultaneous transmission of multiple channels over a broadband pass transmission link such as a coaxial cable. Thus, a fundamental carrier wave is transmitted on said link, received at local data stations and used to detect signals on different baseband channels for reception. For transmission the carrier wave typically is used for segregating a plurality of at least two transmission channels into typically single sideband upper and lower pass bands of baseband bandwidth capability adequately separated with guard bands to permit simple separation for receiving by means of pass band filters, etc.

  8. Large eccentric laser angioplasty catheter

    NASA Astrophysics Data System (ADS)

    Taylor, Kevin D.; Reiser, Christopher

    1997-05-01

    In response to recent demand for increased debulking of large-diameter coronary vascular segments, a large eccentric catheter for excimer laser coronary angioplasty has been developed. The tip has an outer diameter of 2.0 mm and incorporates approximately 300 fibers of 50 micron diameter in a monorail-type percutaneous catheter. The basic function of the device is to ablate a coronary atherosclerotic lesion with 308 nm excimer laser pulses while passing the tip of the catheter through the lesion. By employing multiple passes through the lesion, rotating the catheter 90 degrees after each pass, we expect to create luminal diameters close to 3 mm with this device. Design characteristics, in-vitro testing, and initial clinical experience are presented.

  9. Ultrashort pulse amplification in cryogenically cooled amplifiers

    DOEpatents

    Backus, Sterling J.; Kapteyn, Henry C.; Murnane, Margaret Mary

    2004-10-12

    A laser amplifier system amplifies pulses in a single "stage" from ~10⁻⁹ joules to more than 10⁻³ joules, with average power of 1-10 watts and beam quality M² < 2. The laser medium is cooled substantially below room temperature as a means to improve the optical and thermal characteristics of the medium. This is done with the medium inside a sealed, evacuated or purged cell to avoid moisture or other materials condensing on the surface. A "seed" pulse from a separate laser is passed through the laser medium, one or more times, in any of a variety of configurations including single-pass, multiple-pass, and regenerative amplifier configurations.

  10. A Relational Frame Training Intervention to Raise Intelligence Quotients: A Pilot Study

    ERIC Educational Resources Information Center

    Cassidy, Sarah; Roche, Bryan; Hayes, Steven C.

    2011-01-01

    The current research consisted of 2 studies designed to test the effectiveness of automated multiple-exemplar relational training in raising children's general intellectual skills. In Study 1, 4 participants were exposed to multiple exemplar training in stimulus equivalence and the relational frames of SAME, OPPOSITE, MORE THAN, and LESS THAN…

  11. Guidelines on ergonomic aspects of control rooms

    NASA Technical Reports Server (NTRS)

    Mitchell, C. M.; Bocast, A. K.; Stewart, L. J.

    1983-01-01

    The anthropometry, workstation design, and environmental design of control rooms are outlined. The automated interface, VDTs and displays, and various modes of communication between the system and the human operator using VDTs are discussed. The man in the loop is examined, and both the single-controller/single-task framework and multiple-controller/multiple-task issues are considered.

  12. Architecting Human Operator Trust in Automation to Improve System Effectiveness in Multiple Unmanned Aerial Vehicles (UAV) Control

    DTIC Science & Technology

    2009-03-01

    like to extend our appreciation to our research sponsor Dr. Janet Miller from the Air Force Research Labs, and her colleague Dr. Cheryl Batchelor, for ... for single-operator control of multiple UAVs. Drs. Brian Tsou, Lamar Warfield, Justin Estepp and Benjamin Knott, meanwhile, contributed to our

  13. The Noise of a Forward Swept Fan

    NASA Technical Reports Server (NTRS)

    Dittmar, James H.; Elliott, David M.; Fite, E. Brian

    2003-01-01

    A forward swept fan, designated the Quiet High Speed Fan (QHSF), was tested in the NASA Glenn 9-by-15-foot Low Speed Wind Tunnel to investigate its noise reduction relative to a baseline fan of the same aerodynamic performance. The objective of the Quiet High Speed Fan was a 6 decibel reduction in the Effective Perceived Noise relative to the baseline fan at the takeoff condition. The intent of the Quiet High Speed Fan design was to provide both a multiple pure tone noise reduction from the forward sweep of the fan rotor and a rotor-stator interaction blade passing tone noise reduction from a leaned stator. The tunnel noise data indicated that the Quiet High Speed Fan was quieter than the baseline fan for a significant portion of the operating line and was 6 dB quieter near the takeoff condition. Although reductions in the multiple pure tones were observed, the vast majority of the EPNdB reduction was a result of the reduction in the blade passing tone and its harmonics. The baseline fan's blade passing tone was dominated by the rotor-strut interaction mechanism. The observed blade passing tone reduction could be the result of either the redesign of the Quiet High Speed Fan rotor or the redesigned stator. The exact cause of this rotor-strut noise reduction, whether from the rotor or the stator redesign, was not discernible from this experiment.

  14. Automated Method of Frequency Determination in Software Metric Data Through the Use of the Multiple Signal Classification (MUSIC) Algorithm

    DTIC Science & Technology

    1998-06-26

    Method of frequency determination in software metric data through the use of the Multiple Signal Classification (MUSIC) algorithm. Statement of ... graph showing the estimated power spectral density (PSD) generated by the multiple signal classification (MUSIC) algorithm from the data set used ... implemented in this module; however, it is preferred to use the Multiple Signal Classification (MUSIC) algorithm. The MUSIC algorithm is
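
    For orientation, a generic MUSIC pseudospectrum estimator for a one-dimensional series (such as a software metric over time) is sketched below: build a correlation matrix from sliding snapshots, take the eigendecomposition, and scan steering vectors against the noise subspace. This is not the report's implementation; the snapshot length and frequency grid are arbitrary choices.

        import numpy as np

        def music_spectrum(x, n_sources, m=20, n_freqs=512):
            # correlation matrix from overlapping length-m snapshots
            snaps = np.lib.stride_tricks.sliding_window_view(x, m)
            R = snaps.T @ snaps / snaps.shape[0]
            w, v = np.linalg.eigh(R)                # eigenvalues ascending
            noise = v[:, : m - n_sources]           # noise subspace
            freqs = np.linspace(0.0, 0.5, n_freqs)  # cycles per sample
            spec = np.empty(n_freqs)
            for i, f in enumerate(freqs):
                a = np.exp(-2j * np.pi * f * np.arange(m))   # steering vector
                spec[i] = 1.0 / np.linalg.norm(noise.conj().T @ a) ** 2
            return freqs, spec

        t = np.arange(400)
        x = np.sin(2 * np.pi * 0.1 * t) \
            + 0.5 * np.random.default_rng(1).normal(size=400)
        freqs, spec = music_spectrum(x, n_sources=2)  # real sinusoid = 2 exponentials
        print(freqs[np.argmax(spec)])                 # peaks near 0.1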

  15. BCube: Building a Geoscience Brokering Framework

    NASA Astrophysics Data System (ADS)

    Jodha Khalsa, Siri; Nativi, Stefano; Duerr, Ruth; Pearlman, Jay

    2014-05-01

    BCube is addressing the need for effective and efficient multi-disciplinary collaboration and interoperability through the advancement of brokering technologies. As a prototype "building block" for NSF's EarthCube cyberinfrastructure initiative, BCube is demonstrating how a broker can serve as an intermediary between information systems that implement well-defined interfaces, thereby providing a bridge between communities that employ different specifications. Building on the GEOSS Discover and Access Broker (DAB), BCube will develop new modules and services including: • Expanded semantic brokering capabilities • Business model support for workflows • Automated metadata generation • Automated linking to services discovered via web crawling • Credential passing for seamless access to data • Ranking of search results from brokered catalogs. Because facilitating cross-discipline research involves cultural as well as technical challenges, BCube is also addressing the sociological and educational components of infrastructure development. We are working, initially, with four geoscience disciplines: hydrology, oceans, polar and weather, with an emphasis on connecting existing domain infrastructure elements to facilitate cross-domain communications.

  16. Automated Performance Prediction of Message-Passing Parallel Programs

    NASA Technical Reports Server (NTRS)

    Block, Robert J.; Sarukkai, Sekhar; Mehra, Pankaj; Woodrow, Thomas S. (Technical Monitor)

    1995-01-01

    The increasing use of massively parallel supercomputers to solve large-scale scientific problems has generated a need for tools that can predict scalability trends of applications written for these machines. Much work has been done to create simple models that represent important characteristics of parallel programs, such as latency, network contention, and communication volume. But many of these methods still require substantial manual effort to represent an application in the model's format. The NIK toolkit described in this paper is the result of an on-going effort to automate the formation of analytic expressions of program execution time, with a minimum of programmer assistance. In this paper we demonstrate the feasibility of our approach by extending previous work to detect and model communication patterns automatically, with and without overlapped computations. The predictions derived from these models agree, within reasonable limits, with execution times of programs measured on the Intel iPSC/860 and Paragon. Further, we demonstrate the use of NIK in selecting optimal computational grain size and studying various scalability metrics.

  17. Enhancing sensitivity of biconical tapered fiber sensors with multiple passes through the taper

    NASA Astrophysics Data System (ADS)

    Cohoon, Gregory; Boyter, Chris; Errico, Michael; Vandervoort, Kurt; Salik, Ertan

    2010-03-01

    A single biconical fiber taper is a simple and low-cost yet powerful sensor. With a distinct strength in refractive index (RI) sensing, biconical tapered fiber sensors can find their place in handheld sensor platforms, especially as biosensors that are greatly needed in health care, environmental protection, food safety, and biodefense. We report doubling of sensitivity for these sensors with two passes through the tapered region, which becomes possible through the use of sensitive and high-dynamic-range photodetectors. In a proof-of-principle experiment, we measured transmission through the taper when it was immersed in isopropyl alcohol-water mixtures of varying concentrations, in which a thin gold layer at the tip of the fiber acted as a mirror enabling two passes through the tapered region. This improved the sensitivity from 0.43 dB/vol % in the single-pass case to 0.78 dB/vol % with two passes through the taper. The refractive index detection limit was estimated to be ~1.2×10⁻⁵ RI units (RIU) and ~0.6×10⁻⁵ RIU in the single- and double-pass schemes, respectively. We predict that further enhancement of sensitivity may be achieved with a higher number of passes through the taper.

  18. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yung, J; Stefan, W; Reeve, D

    2015-06-15

    Purpose: Phantom measurements allow for the performance of magnetic resonance (MR) systems to be evaluated. American Association of Physicists in Medicine (AAPM) Report No. 100, Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities; American College of Radiology (ACR) MR Accreditation Program phantom testing; and ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allowed for easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens manufacturers. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open source tools, employing Linux, Apache, the MySQL database, and the Python programming language for the front and back ends. Results: Processing time for each image is <2 seconds. Figures are produced to show regions of interest (ROIs) for analysis. Historical data can be reviewed to compare previous year data and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high quality, ACR MRI Accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring of large datasets. Longitudinal data can reveal trends that, although within passing criteria, indicate underlying system issues.

  19. NASA Tech Briefs, December 2007

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Topics include: Ka-Band TWT High-Efficiency Power Combiner for High-Rate Data Transmission; Reusable, Extensible High-Level Data-Distribution Concept; Processing Satellite Imagery To Detect Waste Tire Piles; Monitoring by Use of Clusters of Sensor-Data Vectors; Circuit and Method for Communication Over DC Power Line; Switched Band-Pass Filters for Adaptive Transceivers; Noncoherent DTTLs for Symbol Synchronization; High-Voltage Power Supply With Fast Rise and Fall Times; Waveguide Calibrator for Multi-Element Probe Calibration; Four-Way Ka-Band Power Combiner; Loss-of-Control-Inhibitor Systems for Aircraft; Improved Underwater Excitation-Emission Matrix Fluorometer; Metrology Camera System Using Two-Color Interferometry; Design and Fabrication of High-Efficiency CMOS/CCD Imagers; Foam Core Shielding for Spacecraft; CHEM-Based Self-Deploying Planetary Storage Tanks; Sequestration of Single-Walled Carbon Nanotubes in a Polymer; PPC750 Performance Monitor; Application-Program-Installer Builder; Using Visual Odometry to Estimate Position and Attitude; Design and Data Management System; Simple, Script-Based Science Processing Archive; Automated Rocket Propulsion Test Management; Online Remote Sensing Interface; Fusing Image Data for Calculating Position of an Object; Implementation of a Point Algorithm for Real-Time Convex Optimization; Handling Input and Output for COAMPS; Modeling and Grid Generation of Iced Airfoils; Automated Identification of Nucleotide Sequences; Balloon Design Software; Rocket Science 101 Interactive Educational Program; Creep Forming of Carbon-Reinforced Ceramic-Matrix Composites; Dog-Bone Horns for Piezoelectric Ultrasonic/Sonic Actuators; Benchtop Detection of Proteins; Recombinant Collagenlike Proteins; Remote Sensing of Parasitic Nematodes in Plants; Direct Coupling From WGM Resonator Disks to Photodetectors; Using Digital Radiography To Image Liquid Nitrogen in Voids; Multiple-Parameter, Low-False-Alarm Fire-Detection Systems; Mosaic-Detector-Based Fluorescence Spectral Imager; Plasmoid Thruster for High Specific-Impulse Propulsion; Analysis Method for Quantifying Vehicle Design Goals; Improved Tracking of Targets by Cameras on a Mars Rover; Sample Caching Subsystem; Multistage Passive Cooler for Spaceborne Instruments; GVIPS Models and Software; Stowable Energy-Absorbing Rocker-Bogie Suspensions.

  20. Automated extraction of natural drainage density patterns for the conterminous United States through high performance computing

    USGS Publications Warehouse

    Stanislawski, Larry V.; Falgout, Jeff T.; Buttenfield, Barbara P.

    2015-01-01

    Hydrographic networks form an important data foundation for cartographic base mapping and for hydrologic analysis. Drainage density patterns for these networks can be derived to characterize local landscape, bedrock and climate conditions, and further inform hydrologic and geomorphological analysis by indicating areas where too few headwater channels have been extracted. But natural drainage density patterns are not consistently available in existing hydrographic data for the United States because compilation and capture criteria historically varied, along with climate, during the period of data collection over the various terrain types throughout the country. This paper demonstrates an automated workflow that is being tested in a high-performance computing environment by the U.S. Geological Survey (USGS) to map natural drainage density patterns at the 1:24,000-scale (24K) for the conterminous United States. Hydrographic network drainage patterns may be extracted from elevation data to guide corrections for existing hydrographic network data. The paper describes three stages in this workflow including data pre-processing, natural channel extraction, and generation of drainage density patterns from extracted channels. The workflow is concurrently implemented by executing procedures on multiple subbasin watersheds within the U.S. National Hydrography Dataset (NHD). Pre-processing defines parameters that are needed for the extraction process. Extraction proceeds in standard fashion: filling sinks, developing flow direction and weighted flow accumulation rasters. Drainage channels with assigned Strahler stream order are extracted within a subbasin and simplified. Drainage density patterns are then estimated with 100-meter resolution and subsequently smoothed with a low-pass filter. The extraction process is found to be of better quality in higher slope terrains. Concurrent processing through the high performance computing environment is shown to facilitate and refine the choice of drainage density extraction parameters and more readily improve extraction procedures than conventional processing.
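
    The final stage can be pictured with a short sketch: approximate channel length per cell, estimate density in a moving window, and apply a low-pass filter. The cell size and window width below are illustrative placeholders, not the USGS settings.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def drainage_density(channel_mask, cell_m=100.0, window=11):
            # approximate channel length per cell as one cell width wherever
            # an extracted channel is present, then average in a moving window
            length_km = channel_mask.astype(float) * (cell_m / 1000.0)
            area_km2 = (cell_m / 1000.0) ** 2
            density = uniform_filter(length_km, size=window) / area_km2  # km/km^2
            return uniform_filter(density, size=window)  # final low-pass smoothing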

  1. Simple setup for gas-phase H/D exchange mass spectrometry coupled to electron transfer dissociation and ion mobility for analysis of polypeptide structure on a liquid chromatographic time scale.

    PubMed

    Mistarz, Ulrik H; Brown, Jeffery M; Haselmann, Kim F; Rand, Kasper D

    2014-12-02

    Gas-phase hydrogen/deuterium exchange (HDX) is a fast and sensitive, yet unharnessed analytical approach for providing information on the structural properties of biomolecules, in a complementary manner to mass analysis. Here, we describe a simple setup for ND3-mediated millisecond gas-phase HDX inside a mass spectrometer immediately after ESI (gas-phase HDX-MS) and show utility for studying the primary and higher-order structure of peptides and proteins. HDX was achieved by passing N2-gas through a container filled with aqueous deuterated ammonia reagent (ND3/D2O) and admitting the saturated gas immediately upstream or downstream of the primary skimmer cone. The approach was implemented on three commercially available mass spectrometers and required no or minor fully reversible reconfiguration of gas-inlets of the ion source. Results from gas-phase HDX-MS of peptides using the aqueous ND3/D2O as HDX reagent indicate that labeling is facilitated exclusively through gaseous ND3, yielding similar results to the infusion of purified ND3-gas, while circumventing the complications associated with the use of hazardous purified gases. Comparison of the solution-phase- and gas-phase deuterium uptake of Leu-Enkephalin and Glu-Fibrinopeptide B, confirmed that this gas-phase HDX-MS approach allows for labeling of sites (heteroatom-bound non-amide hydrogens located on side-chains, N-terminus and C-terminus) not accessed by classical solution-phase HDX-MS. The simple setup is compatible with liquid chromatography and a chip-based automated nanoESI interface, allowing for online gas-phase HDX-MS analysis of peptides and proteins separated on a liquid chromatographic time scale at increased throughput. Furthermore, online gas-phase HDX-MS could be performed in tandem with ion mobility separation or electron transfer dissociation, thus enabling multiple orthogonal analyses of the structural properties of peptides and proteins in a single automated LC-MS workflow.

  2. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing

    PubMed Central

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-01-01

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, feature extraction algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system is applied in a project of hot water pipeline installation, with accuracy controlled within 2 mm in 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system. PMID:29462855
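
    One deviation computation can be illustrated as follows: segment the bright LED target, locate its centroid with image moments, and report the offset from the image center as the bore-path deflection in pixels. The threshold and function are illustrative, assuming OpenCV; the paper's five algorithms are more elaborate.

        import cv2

        def deflection_px(bgr_frame):
            gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
            _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)  # bright LEDs
            m = cv2.moments(mask, binaryImage=True)
            if m["m00"] == 0:
                raise RuntimeError("LED target not visible")
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]   # target centroid
            h, w = gray.shape
            return cx - w / 2.0, cy - h / 2.0   # horizontal/vertical deviation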

  3. Clarity: An Open Source Manager for Laboratory Automation

    PubMed Central

    Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.

    2013-01-01

    Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169

  4. Enhanced Automated Guidance System for Horizontal Auger Boring Based on Image Processing.

    PubMed

    Wu, Lingling; Wen, Guojun; Wang, Yudan; Huang, Lei; Zhou, Jiang

    2018-02-15

    Horizontal auger boring (HAB) is a widely used trenchless technology for the high-accuracy installation of gravity or pressure pipelines on line and grade. Differing from other pipeline installations, HAB requires a more precise and automated guidance system for use in a practical project. This paper proposes an economic and enhanced automated optical guidance system, based on optimization research of light-emitting diode (LED) light target and five automated image processing bore-path deviation algorithms. An LED light target was optimized for many qualities, including light color, filter plate color, luminous intensity, and LED layout. The image preprocessing algorithm, direction location algorithm, angle measurement algorithm, deflection detection algorithm, and auto-focus algorithm, compiled in MATLAB, are used to automate image processing for deflection computing and judging. After multiple indoor experiments, this guidance system is applied in a project of hot water pipeline installation, with accuracy controlled within 2 mm in 48-m distance, providing accurate line and grade controls and verifying the feasibility and reliability of the guidance system.

  5. Use of an automated drug distribution cabinet system in a disaster response mobile emergency department.

    PubMed

    Morchel, Herman; Ogedegbe, Chinwe; Desai, Nilesh; Faley, Brian; Mahmood, Nasir; Moro, Gary Del; Feldman, Joseph

    2015-01-01

    This article describes the innovative use of an automated drug distribution cabinet system for medication supply in a disaster response mobile Emergency Department vehicle. Prior to the use of the automated drug distribution cabinet system described in this article, the mobile hospitals were stocked as needed with drugs in individual boxes and drawers. Experience with multiple deployments found this method to be very cumbersome and labor intensive in preparation, operational use, and demobilization. For a recent deployment to provide emergency medical care at the 2014 Super Bowl football event, the automated drug distribution cabinet system in the institution's main campus Emergency Department was duplicated and incorporated into the mobile Emergency Department. This method of drug stocking and dispensing was found to be far more efficient than gathering and placing drugs in onboard drawers and racks. Automated drug distribution cabinet systems can be used to significantly improve patient care and overall efficiency in mobile hospital deployments.

  6. CERES AuTomAted job Loading SYSTem (CATALYST): An automated workflow manager for satellite data production

    NASA Astrophysics Data System (ADS)

    Gleason, J. L.; Hillyer, T. N.; Wilkins, J.

    2012-12-01

    The CERES Science Team integrates data from 5 CERES instruments onboard the Terra, Aqua and NPP missions. The processing chain fuses CERES observations with data from 19 other unique sources. The addition of CERES Flight Model 5 (FM5) onboard NPP, coupled with ground processing system upgrades further emphasizes the need for an automated job-submission utility to manage multiple processing streams concurrently. The operator-driven, legacy-processing approach relied on manually staging data from magnetic tape to limited spinning disk attached to a shared memory architecture system. The migration of CERES production code to a distributed, cluster computing environment with approximately one petabyte of spinning disk containing all precursor input data products facilitates the development of a CERES-specific, automated workflow manager. In the cluster environment, I/O is the primary system resource in contention across jobs. Therefore, system load can be maximized with a throttling workload manager. This poster discusses a Java and Perl implementation of an automated job management tool tailored for CERES processing.
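
    A toy version of such a throttling workload manager is sketched below, capping the number of concurrently running jobs so that I/O-bound tasks do not swamp the system. The job body, input names, and throttle value are placeholders, not CATALYST's actual interfaces.

        from concurrent.futures import ThreadPoolExecutor, as_completed

        MAX_CONCURRENT = 4          # assumed throttle, tuned to I/O capacity

        def run_job(granule):
            # placeholder for staging inputs, running science code,
            # and archiving the output product
            return f"processed {granule}"

        granules = [f"granule_{i:03d}" for i in range(20)]   # hypothetical inputs
        with ThreadPoolExecutor(max_workers=MAX_CONCURRENT) as pool:
            futures = [pool.submit(run_job, g) for g in granules]
            for fut in as_completed(futures):
                print(fut.result())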

  7. Agile based "Semi-"Automated Data ingest process : ORNL DAAC example

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.

    2015-12-01

    The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: (1) provide the ability to track a data set from acceptance to publication; (2) automate steps that can be automated to improve efficiencies and reduce redundancy; (3) update legacy ingest infrastructure; and (4) provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.

  8. A swath across the great divide: Kelp forests across the Samalga Pass biogeographic break

    USGS Publications Warehouse

    Konar, Brenda H.; Edwards, Matthew S.; Bland, Aaron; Metzger, Jacob; Ravelo, Alexandra; Traiger, Sarah; Weitzman, Ben P.

    2017-01-01

    Biogeographic breaks are often described as locations where a large number of species reach their geographic range limits. Samalga Pass, in the eastern Aleutian Archipelago, is a known biogeographic break for the spatial distribution of several species of offshore-pelagic communities, including numerous species of cold-water corals, zooplankton, fish, marine mammals, and seabirds. However, it remains unclear whether Samalga Pass also serves as a biogeographic break for nearshore benthic communities. The occurrence of biogeographic breaks across multiple habitats has not often been described. In this study, we examined whether the biogeographic break for offshore-pelagic communities applies to nearshore kelp forests. To examine whether Samalga Pass serves as a biogeographic break for kelp forest communities, this study compared abundance, biomass and percent bottom cover of species associated with kelp forests on either side of the pass. We observed marked differences in kelp forest community structure, with some species reaching their geographic range limits on opposing sides of the pass. In particular, the habitat-forming kelp Nereocystis luetkeana and the predatory sea stars Pycnopodia helianthoides and Orthasterias koehleri all occurred on the eastern side of Samalga Pass but were not observed west of the pass. In contrast, the sea star Leptasterias camtschatica dispar was observed only on the western side of the pass. We also observed differences in overall abundance and biomass of numerous associated fish, invertebrate and macroalgal species on opposing sides of the pass. We conclude that Samalga Pass is an important biogeographic break for kelp forest communities in the Aleutian Archipelago and may demarcate the geographic range limits of several ecologically important species.

  9. Acoustic performance of inlet multiple-pure-tone suppressors installed on NASA quiet engine C

    NASA Technical Reports Server (NTRS)

    Bloomer, H. E.; Schaefer, J. W.; Rice, E. J.; Feiler, C. E.

    1977-01-01

    The length of multiple-pure-tone (MPT) treatment required to reasonably suppress the MPTs produced by a supersonic-tip-speed fan was defined. The additional suppression of broadband noise and of the blade-passing frequency that might be accomplished was also determined. The experimental results are presented in terms of both far-field and duct acoustic data.

  10. Increasing Reading Motivation in Elementary and Middle School Students through the Use of Multiple Intelligences

    ERIC Educational Resources Information Center

    Buschick, Mary E.; Shipton, Tracey A.; Winner, Laurie M.; Wise, Melissa D.

    2007-01-01

    The problem is that with each passing year it becomes increasingly harder to maintain student motivation to read and improve reading comprehension. The purpose of this project was to increase reading motivation in elementary and middle school students through the use of multiple intelligences. This project was conducted by four teacher researchers…

  11. Sensor-Augmented Insulin Pumps and Hypoglycemia Prevention in Type 1 Diabetes

    PubMed Central

    Steineck, Isabelle; Ranjan, Ajenthen; Nørgaard, Kirsten; Schmidt, Signe

    2016-01-01

    Hypoglycemia can lead to seizures, unconsciousness, or death. Insulin pump treatment reduces the frequency of severe hypoglycemia compared with multiple daily injections. The addition of a continuous glucose monitor, so-called sensor-augmented pump (SAP) treatment, has the potential to further limit the duration and severity of hypoglycemia, as the system can detect, and in some systems act on, impending and prevailing low blood glucose levels. In this narrative review we summarize the available knowledge on SAPs with and without automated insulin suspension in relation to hypoglycemia prevention. We present evidence from randomized trials, observational studies, and meta-analyses including nonpregnant individuals with type 1 diabetes mellitus. We also outline concerns regarding SAPs with and without automated insulin suspension. There is evidence that SAP treatment reduces episodes of moderate and severe hypoglycemia compared with multiple daily injections plus self-monitoring of blood glucose. There is some evidence that SAPs both with and without automated suspension reduce the frequency of severe hypoglycemic events compared with insulin pumps without continuous glucose monitoring. PMID:28264173

  12. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering, and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  13. Slide Set: Reproducible image analysis and batch processing with ImageJ.

    PubMed

    Nanes, Benjamin A

    2015-11-01

    Most imaging studies in the biological sciences rely on analyses that are relatively simple. However, manual repetition of analysis tasks across multiple regions in many images can complicate even the simplest analysis, making record keeping difficult, increasing the potential for error, and limiting reproducibility. While fully automated solutions are necessary for very large data sets, they are sometimes impractical for the small- and medium-sized data sets common in biology. Here we present the Slide Set plugin for ImageJ, which provides a framework for reproducible image analysis and batch processing. Slide Set organizes data into tables, associating image files with regions of interest and other relevant information. Analysis commands are automatically repeated over each image in the data set, and multiple commands can be chained together for more complex analysis tasks. All analysis parameters are saved, ensuring transparency and reproducibility. Slide Set includes a variety of built-in analysis commands and can be easily extended to automate other ImageJ plugins, reducing the manual repetition of image analysis without the set-up effort or programming expertise required for a fully automated solution.

  14. Improving automatic peptide mass fingerprint protein identification by combining many peak sets.

    PubMed

    Rögnvaldsson, Thorsteinn; Häkkinen, Jari; Lindberg, Claes; Marko-Varga, György; Potthast, Frank; Samuelsson, Jim

    2004-08-05

    An automated peak picking strategy is presented where several peak sets with different signal-to-noise levels are combined to form a more reliable statement on the protein identity. The strategy is compared against both manual peak picking and industry-standard automated peak picking on a set of mass spectra obtained after tryptic in-gel digestion of 2D-gel samples from human fetal fibroblasts. The set of spectra contains samples ranging from strong to weak spectra, and the proposed multiple-scale method is shown to be much better on weak spectra than the industry-standard method and a human operator, and equal in performance to these on strong and medium-strong spectra. It is also demonstrated that peak sets selected by a human operator display considerable variability and that it is impossible to speak of a single "true" peak set for a given spectrum. The described multiple-scale strategy both avoids time-consuming parameter tuning and exceeds the human operator in protein identification efficiency. The strategy therefore promises reliable automated user-independent protein identification using peptide mass fingerprints.
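
    The core idea of combining peak sets picked at several signal-to-noise levels can be sketched as a simple voting scheme: peaks confirmed at many thresholds are weighted more heavily than peaks that appear only at the most permissive setting. This is an illustration of the general idea, not the authors' algorithm; the threshold values and the 0.3 Da merge tolerance are assumptions.

        # Sketch: vote-based combination of peak sets picked at several S/N levels.
        import numpy as np
        from scipy.signal import find_peaks

        def multiscale_peaks(mz, intensity, snr_levels=(2, 4, 8), tol=0.3):
            mz = np.asarray(mz)
            # robust noise estimate: median absolute deviation of the trace
            noise = np.median(np.abs(intensity - np.median(intensity)))
            votes = {}                      # peak m/z -> number of confirming levels
            for snr in snr_levels:
                idx, _ = find_peaks(intensity, height=snr * noise)
                for m in mz[idx]:
                    # merge with an already-seen peak if within tol (assumed 0.3 Da)
                    key = next((k for k in votes if abs(k - m) < tol), m)
                    votes[key] = votes.get(key, 0) + 1
            # reliability = fraction of S/N levels at which the peak was found
            return sorted((m, votes[m] / len(snr_levels)) for m in votes)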

  15. Direction-dependent waist-shift-difference of Gaussian beam in a multiple-pass zigzag slab amplifier and geometrical optics compensation method.

    PubMed

    Li, Zhaoyang; Kurita, Takashi; Miyanaga, Noriaki

    2017-10-20

    Zigzag and non-zigzag beam waist shifts in a multiple-pass zigzag slab amplifier are investigated based on the propagation of a Gaussian beam. Different incident angles in the zigzag and non-zigzag planes would introduce a direction-dependent waist-shift-difference, which distorts the beam quality in both the near- and far-fields. The theoretical model and analytical expressions of this phenomenon are presented, and intensity distributions in the two orthogonal planes are simulated and compared. A geometrical optics compensation method by a beam with 90° rotation is proposed, which not only could correct the direction-dependent waist-shift-difference but also possibly average the traditional thermally induced wavefront-distortion-difference between the horizontal and vertical beam directions.

  16. Learning Multiple Band-Pass Filters for Sleep Stage Estimation: Towards Care Support for Aged Persons

    NASA Astrophysics Data System (ADS)

    Takadama, Keiki; Hirose, Kazuyuki; Matsushima, Hiroyasu; Hattori, Kiyohiko; Nakajima, Nobuo

    This paper proposes a sleep stage estimation method that can provide an accurate estimate for each person without attaching any devices to the body. In particular, our method learns appropriate multiple band-pass filters to extract the specific wave pattern of the heartbeat, which is required to estimate the sleep stage. For an accurate estimation, this paper employs a Learning Classifier System (LCS) as the data-mining technique and extends it to estimate the sleep stage. Extensive experiments on five subjects in mixed health confirm the following implications: (1) the proposed method provides more accurate sleep stage estimation than the conventional method, and (2) the sleep stage estimation calculated by the proposed method is robust regardless of the physical condition of the subject.

  17. Elucidating Grinding Mechanism by Theoretical and Experimental Investigations

    PubMed Central

    Kubo, Akihiko; Chowdhury, M. A. K.

    2018-01-01

    Grinding is one of the essential manufacturing processes for producing brittle or hard materials-based precision parts (e.g., optical lenses). In grinding, a grinding wheel removes the desired amount of material by passing the same area on the workpiece surface multiple times. How the topography of a workpiece surface evolves with these passes is thus an important research issue, which has not yet been addressed elaborately. The present paper tackles this issue from both the theoretical and the experimental points of view. In particular, this paper presents the results of experimental and theoretical investigations on the multi-pass surface grinding operations where the workpiece surface is made of glass and the grinding wheel consists of cBN abrasive grains. Both investigations confirm that a great deal of stochasticity is involved in the grinding mechanism, and the complexity of the workpiece surface gradually increases along with the number of passes. PMID:29425160

  18. Elucidating Grinding Mechanism by Theoretical and Experimental Investigations.

    PubMed

    Ullah, Amm Sharif; Caggiano, Alessandra; Kubo, Akihiko; Chowdhury, M A K

    2018-02-09

    Grinding is one of the essential manufacturing processes for producing brittle or hard materials-based precision parts (e.g., optical lenses). In grinding, a grinding wheel removes the desired amount of material by passing the same area on the workpiece surface multiple times. How the topography of a workpiece surface evolves with these passes is thus an important research issue, which has not yet been addressed elaborately. The present paper tackles this issue from both the theoretical and the experimental points of view. In particular, this paper presents the results of experimental and theoretical investigations on the multi-pass surface grinding operations where the workpiece surface is made of glass and the grinding wheel consists of cBN abrasive grains. Both investigations confirm that a great deal of stochasticity is involved in the grinding mechanism, and the complexity of the workpiece surface gradually increases along with the number of passes.

  19. Optical Magnetometry using Multipass Cells with overlapping beams

    NASA Astrophysics Data System (ADS)

    McDonough, Nathaniel David; Lucivero, Vito Giovanni; Dural, Nezih; Romalis, Michael

    2017-04-01

    In recent years, multipass cells with cylindrical mirrors have proven to be a successful way of making highly sensitive atomic magnetometers. In such cells a small laser beam makes 40 to 100 passes within the cell without significant overlap with itself. Here we describe a new multi-pass geometry which uses spherical mirrors to reflect the probe beam multiple times over the same cell region. Such geometry reduces the effects of atomic diffusion while preserving the advantages of multi-pass cells over standing-wave cavities, namely a deterministic number of passes and absence of interference. We have fabricated several cells with this geometry and obtained good agreement between the measured and calculated levels of quantum spin noise. We will report on our effort to characterize the diffusion spin-correlation function in these cells and operation of the cell as a magnetometer. This work is supported by DARPA.

  20. Linear fixed-field multipass arcs for recirculating linear accelerators

    DOE PAGES

    Morozov, V. S.; Bogacz, S. A.; Roblin, Y. R.; ...

    2012-06-14

    Recirculating Linear Accelerators (RLA's) provide a compact and efficient way of accelerating particle beams to medium and high energies by reusing the same linac for multiple passes. In the conventional scheme, after each pass, the different energy beams coming out of the linac are separated and directed into appropriate arcs for recirculation, with each pass requiring a separate fixed-energy arc. In this paper we present a concept of an RLA return arc based on linear combined-function magnets, in which two and potentially more consecutive passes with very different energies are transported through the same string of magnets. By adjusting the dipole and quadrupole components of the constituting linear combined-function magnets, the arc is designed to be achromatic and to have zero initial and final reference orbit offsets for all transported beam energies. We demonstrate the concept by developing a design for a droplet-shaped return arc for a dog-bone RLA capable of transporting two beam passes with momenta different by a factor of two. Finally, we present the results of tracking simulations of the two passes and lay out the path to end-to-end design and simulation of a complete dog-bone RLA.

  1. Gas formation in ground beef chubs due to Hafnia alvei is reduced by multiple applications of antimicrobial interventions to artificially inoculated beef trim stock.

    PubMed

    Kang, Dong-Hyun; Arthur, Terrance M; Siragusa, Gregory R

    2002-10-01

    Gas-forming microorganisms were isolated from gas-swollen ground beef chubs obtained from a commercial source and were phenotypically identified as Hafnia alvei. In in situ experiments, the isolated H. alvei strains produced gas in inoculated irradiation-sterilized ground beef chubs. A five-strain cocktail of H. alvei isolates was inoculated on beef trim. The inoculated beef trim samples were treated with either a water wash (W) at 65 psi for five passes (a pass refers to the application of successive multiple antimicrobial treatments to inoculated beef trim on a moving processing conveyor belt at a speed of 1 cm/s under heat ducts or oscillating spray nozzles), W plus a 2% (vol/vol) lactic acid wash (L) at room temperature at 30 psi for three passes (W/L), or a combination treatment (COMB) consisting of W plus 82 degrees C water for three passes plus 510 degrees C hot air for six passes plus L, or were not treated (control). After treatment, the beef trim was ground and vacuum packaged. The numbers of H. alvei were reduced with water alone and with the aforementioned antimicrobial intervention treatments. For the untreated and inoculated control samples, the numbers of H. alvei increased from 7.03 to 8.40 log CFU/g after 7 days of incubation at 4 degrees C. However, the numbers of H. alvei treated by successive antimicrobial interventions (COMB) were initially reduced to 5.25 log CFU/g and increased to just 6.9 log CFU/g after 7 days of incubation at 4 degrees C. Gas was produced in untreated control samples after 3 days at 15 degrees C (15 of 15 inoculated chubs). However, in meat treated with W, W/L, and COMB, gas was produced after 4 to 5, 7 to 8, and 9 to 10 days of storage at 15 degrees C, respectively. These results demonstrate the effectiveness of multiple antimicrobial interventions in reducing H. alvei numbers on beef trim and subsequently delaying gas formation in the resulting ground beef chubs.

  2. Eddy-Current Inspection Of Tab Seals On Beverage Cans

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph

    1994-01-01

    Eddy-current inspection system monitors tab seals on beverage cans. Device inspects all cans at usual production rate of 1,500 to 2,000 cans per minute. Automated inspection of all units replaces visual inspection by microscope aided by mass spectrometry. System detects defects in real time. Sealed cans on conveyor pass near one of two coils in differential eddy-current probe. Other coil in differential eddy-current probe positioned near stationary reference can on which tab seal is known to be of acceptable quality. Signal of certain magnitude at output of probe indicates defective can, automatically ejected from conveyor.

  3. Design, Manufacture and Deliver a Fully Automated Instrument to Measure, Record and Analyze the Oxygen Equilibrium Curve of Blood. Phase 2

    DTIC Science & Technology

    1994-06-20

    1040 Spruce Street, Trenton, New Jersey 08648. It is a square 1.56 in. on a side by 0.19 in. thick. It is a low current, moderate capacity module ...The module requires a d.c. voltage for its operation. We use a pulsating d.c. voltage and alter its duty cycle to control the amount of heating or...voltages that saturate the D/A output modules that pass the signal from the computer to the power electronics. The range can be extended, but with some

  4. Quantification of fibre polymerization through Fourier space image analysis

    PubMed Central

    Nekouzadeh, Ali; Genin, Guy M.

    2011-01-01

    Quantification of changes in the total length of randomly oriented and possibly curved lines appearing in an image is a necessity in a wide variety of biological applications. Here, we present an automated approach based upon Fourier space analysis. Scaled, band-pass filtered power spectral densities of greyscale images are integrated to provide a quantitative measurement of the total length of lines of a particular range of thicknesses appearing in an image. A procedure is presented to correct for changes in image intensity. The method is most accurate for two-dimensional processes with fibres that do not occlude one another. PMID:24959096
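
    A minimal sketch of this kind of measurement, assuming the band-pass is an annulus in the 2-D power spectral density whose radii select the fibre thicknesses of interest; the scaling constants and the intensity correction described in the paper are omitted.

        # Sketch: relative band-pass energy of the 2-D PSD as a proxy for total
        # fibre length within a chosen thickness (spatial-frequency) range.
        import numpy as np

        def bandpass_psd_measure(img, r_lo, r_hi):
            f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
            psd = np.abs(f) ** 2
            ny, nx = img.shape
            y, x = np.indices((ny, nx))
            r = np.hypot(x - nx / 2, y - ny / 2)   # radial spatial frequency
            annulus = (r >= r_lo) & (r < r_hi)     # pass band for fibre widths
            return psd[annulus].sum() / psd.sum()  # relative band-pass energy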

  5. Aid for the Medical Laboratory

    NASA Technical Reports Server (NTRS)

    1986-01-01

    A process for separating chemical compounds in fluids resulted from a Jet Propulsion Laboratory (JPL)/LAPD project. The technique involves pouring a blood or urine sample into an extraction tube where packing material contained in a disposable tube called an "extraction column" absorbs water and spreads the specimen as a thin film, making it easy to identify specific components. When a solvent passes through the packing material, the desired compound dissolves and exits through the tube's bottom stem and is collected. Called AUDRI, Automated Drug Identification, it is commercially produced by Analytichem International which has successfully advanced the original technology.

  6. Simultaneous real-time data collection methods

    NASA Technical Reports Server (NTRS)

    Klincsek, Thomas

    1992-01-01

    This paper describes the development of electronic test equipment which executes, supervises, and reports on various tests. This validation process uses computers to analyze test results and report conclusions. The test equipment consists of an electronics component and the data collection and reporting unit. The PC software, display screens, and real-time database are described. Pass-fail procedures and data replay are discussed. The OS/2 operating system and Presentation Manager user interface were used to create a highly interactive automated system. The system outputs are hardcopy printouts and MS-DOS format files which may be used as input for other PC programs.

  7. Integrating Research, Quality Improvement, and Medical Education for Better Handoffs and Safer Care: Disseminating, Adapting, and Implementing the I-PASS Program.

    PubMed

    Starmer, Amy J; Spector, Nancy D; West, Daniel C; Srivastava, Rajendu; Sectish, Theodore C; Landrigan, Christopher P

    2017-07-01

    In 2009 the I-PASS Study Group was formed by patient safety, medical education, health services research, and clinical experts from multiple institutions in the United States and Canada. When the I-PASS Handoff Program, which was developed by the I-PASS Study Group, was implemented in nine hospitals, it was associated with a 30% reduction in injuries due to medical errors and significant improvements in handoff processes, without any adverse effects on provider work flow. To effectively disseminate and adapt I-PASS for use across specialties and disciplines, a series of federally and privately funded dissemination and implementation projects were carried out following the publication of the initial study. The results of these efforts have informed ongoing initiatives intended to continue adapting and scaling the program. As of this writing, I-PASS Study Group members have directly worked with more than 50 hospitals to facilitate implementation of I-PASS. To further disseminate I-PASS, Study Group members delivered hundreds of academic presentations, including plenaries at scientific meetings, workshops, and institutional Grand Rounds. Some 3,563 individuals, representing more than 500 institutions in the 50 states in the United States, the District of Columbia, Puerto Rico, and 57 other countries, have requested access to I-PASS materials. Most recently, the I-PASS SM Patient Safety Institute has developed a virtual immersion training platform, mobile handoff observational tools, and processes to facilitate further spread of I-PASS. Implementation of I-PASS has been associated with substantial improvements in patient safety and can be applied to a variety of disciplines and types of patient handoffs. Widespread implementation of I-PASS has the potential to substantially improve patient safety in the United States and beyond. Copyright © 2017 The Joint Commission. Published by Elsevier Inc. All rights reserved.

  8. Overview of Sparse Graph for Multiple Access in Future Mobile Networks

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Li, Baoguo; Li, Erbao; Gong, Zhenghui

    2017-10-01

    Multiple access via sparse graph, such as low density signature (LDS) and sparse code multiple access (SCMA), is a promising technique for future wireless communications. This survey presents an overview of the developments in this burgeoning field, including transmitter structures, extrinsic information transfer (EXIT) chart analysis, and comparisons with existing multiple access techniques. This technique enables multiple access under overloaded conditions with satisfactory performance. A message-passing algorithm is utilized for multi-user detection in the receiver, and structures of the sparse graph are illustrated in detail. Outlooks and challenges of this technique are also presented.

  9. Automated Astrometric Analysis of Satellite Observations using Wide-field Imaging

    NASA Astrophysics Data System (ADS)

    Skuljan, J.; Kay, J.

    2016-09-01

    An observational trial was conducted in the South Island of New Zealand from 24 to 28 February 2015, as a collaborative effort between the United Kingdom and New Zealand in the area of space situational awareness. The aim of the trial was to observe a number of satellites in low Earth orbit using wide-field imaging from two separate locations, in order to determine the space trajectory and compare the measurements with the predictions based on the standard two-line elements. This activity was an initial step in building a space situational awareness capability at the Defence Technology Agency of the New Zealand Defence Force. New Zealand has an important strategic position as the last land mass that many satellites selected for deorbiting pass before entering the Earth's atmosphere over the dedicated disposal area in the South Pacific. A preliminary analysis of the trial data has demonstrated that relatively inexpensive equipment can be used to successfully detect satellites at moderate altitudes. A total of 60 satellite passes were observed over the five nights of observation and about 2600 images were collected. A combination of cooled CCD and standard DSLR cameras were used, with a selection of lenses between 17 mm and 50 mm in focal length, covering a relatively wide field of view of 25 to 60 degrees. The CCD cameras were equipped with custom-made GPS modules to record the time of exposure with a high accuracy of one millisecond, or better. Specialised software has been developed for automated astrometric analysis of the trial data. The astrometric solution is obtained as a two-dimensional least-squares polynomial fit to the measured pixel positions of a large number of stars (typically 1000) detected across the image. The star identification is fully automated and works well for all camera-lens combinations used in the trial. A moderate polynomial degree of 3 to 5 is selected to take into account any image distortions introduced by the lens. A typical RMS error of the least-squares fit is about 0.1 pixels, which corresponds to about 4 to 10 seconds of arc in the sky, depending on the pixel scale (field of view). This gives a typical uncertainty between 10 and 25 metres in measuring the position of a satellite at a characteristic range of 500 kilometres. The results of this trial have confirmed that wide-field measurements based on standard photographic equipment and using automated astrometric analysis techniques can be used to improve the current orbital models of satellites in low Earth orbit.
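
    The plate solution described above amounts to an ordinary least-squares fit of a 2-D polynomial mapping pixel coordinates to sky coordinates. A hedged sketch follows; the degree, inputs, and function names are illustrative, not the trial's actual software.

        # Sketch: degree-3 two-dimensional polynomial astrometric fit.
        import numpy as np

        def design_matrix(x, y, degree=3):
            # all monomials x^a * y^b with a + b <= degree
            cols = [np.ones_like(x)]
            for d in range(1, degree + 1):
                for i in range(d + 1):
                    cols.append(x ** (d - i) * y ** i)
            return np.column_stack(cols)

        def fit_plate(x_pix, y_pix, ra, dec, degree=3):
            x = np.asarray(x_pix, float); y = np.asarray(y_pix, float)
            ra = np.asarray(ra, float);   dec = np.asarray(dec, float)
            A = design_matrix(x, y, degree)
            coef_ra, *_ = np.linalg.lstsq(A, ra, rcond=None)
            coef_dec, *_ = np.linalg.lstsq(A, dec, rcond=None)
            rms = np.sqrt(np.mean((A @ coef_ra - ra) ** 2))  # ~0.1 px in the trial
            return coef_ra, coef_dec, rms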

  10. Installation of multiple automated external defibrillators to prevent sudden death in school-aged children.

    PubMed

    Higaki, Takashi; Chisaka, Toshiyuki; Moritani, Tomozo; Ohta, Masaaki; Takata, Hidemi; Yamauchi, Toshifumi; Yamaguchi, Youhei; Konishi, Kyoko; Yamamoto, Eiichi; Ochi, Fumihiro; Eguchi, Mariko; Eguchi-Ishimae, Minenori; Mitani, Yoshihide; Ishii, Eiichi

    2016-12-01

    Recently, a student died of idiopathic ventricular fibrillation in a school where an automated external defibrillator (AED) had been installed. The tragedy could not be prevented because the only AED in the school was installed in the teachers' office, far from the school ground where the accident took place. This prompted the establishment of a multiple-AED system in schools. The aim of this study was to analyze the efficacy of the multiple-AED system in preventing sudden death in school-aged children. Assumed accident sites consisted of the school ground, gymnasium, Judo and Kendo hall, swimming pool, and classrooms on the first and fourth floors. Multiple AEDs were installed in the teachers' office, gymnasium, and some classrooms, and one was also provided as a portable AED in a rucksack. The time from the accident site to the teachers' office (single AED), and from the accident site to the nearest AED (multiple AEDs), was calculated. The AED retrieval time was significantly shorter in 55 elementary schools and in 29 junior high schools when multiple AEDs were installed compared with a single AED. Except for the classroom on the fourth floor, the number of people who took >120 s to bring the AED to the accident site was lower when multiple AEDs were installed compared with a single AED. Multiple AEDs provided at appropriate sites can reduce the time to reach the casualty and hence prevent sudden death in school-aged children. © 2016 Japan Pediatric Society.

  11. Estimating Procurement Cost Growth Using Logistic and Multiple Regression

    DTIC Science & Technology

    2003-03-01

    Figure 4). The plots fail to pass the visual inspection for constant variance as well as the Breusch-Pagan test (Neter, 1996: 112) at an alpha level... plots fail to pass the visual inspection for constant variance as well as the Breusch-Pagan test at an alpha level of 0.05. Based on these findings... amount of cost growth a program will have once model A deems that the program will incur cost growth. Sipple conducts validation testing on

  12. Apparatus for measuring particle properties

    DOEpatents

    Rader, Daniel J.; Castaneda, Jaime N.; Grasser, Thomas W.; Brockmann, John E.

    1998-01-01

    An apparatus for determining particle properties from detected light scattered by the particles. The apparatus uses a light beam with novel intensity characteristics to discriminate between particles that pass through the beam and those that pass through an edge of the beam. The apparatus can also discriminate between light scattered by one particle and light scattered by multiple particles. The particle's size can be determined from the intensity of the light scattered. The particle's velocity can be determined from the elapsed time between various intensities of the light scattered.

  13. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

    We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3 microns, with sizes up to 80 MB per channel). The system was developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining the genotypes of multiple genetic markers in individuals. It plays an important role in the emerging trend of replacing traditional medical treatments with personalized genetic medicine, i.e., individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around timeline compatible with clinical decision-making. In this paper we describe a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.

  14. Integrated Multi-process Microfluidic Systems for Automating Analysis

    PubMed Central

    Yang, Weichun; Woolley, Adam T.

    2010-01-01

    Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343

  15. Results From the Imaging and Radiation Oncology Core Houston's Anthropomorphic Phantoms Used for Proton Therapy Clinical Trial Credentialing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Paige A., E-mail: pataylor@mdanderson.org; Kry, Stephen F.; Alvarez, Paola

    Purpose: The purpose of this study was to summarize the findings of anthropomorphic proton phantom irradiations analyzed by the Imaging and Radiation Oncology Core Houston QA Center (IROC Houston). Methods and Materials: A total of 103 phantoms were irradiated by proton therapy centers participating in clinical trials. The anthropomorphic phantoms simulated heterogeneous anatomy of a head, liver, lung, prostate, and spine. Treatment plans included those for scattered, uniform scanning, and pencil beam scanning beam delivery modalities using 5 different treatment planning systems. For every phantom irradiation, point doses and planar doses were measured using thermoluminescent dosimeters (TLD) and film, respectively. Differences between measured and planned doses were studied as a function of phantom, beam delivery modality, motion, repeat attempt, treatment planning system, and date of irradiation. Results: The phantom pass rate (overall, 79%) was high for simple phantoms and lower for phantoms that introduced higher levels of difficulty, such as motion, multiple targets, or increased heterogeneity. All treatment planning systems overestimated dose to the target, compared to TLD measurements. Errors in range calculation resulted in several failed phantoms. There was no correlation between treatment planning system and pass rate. The pass rates for each individual phantom are not improving over time, but when individual institutions received feedback about failed phantom irradiations, pass rates did improve. Conclusions: The proton phantom pass rates are not as high as desired and emphasize potential deficiencies in proton therapy planning and/or delivery. There are many areas for improvement with the proton phantom irradiations, such as treatment planning system dose agreement, range calculations, accounting for motion, and irradiation of multiple targets.

  16. Towards Detection of Learner Misconceptions in a Medical Learning Environment: A Subgroup Discovery Approach

    ERIC Educational Resources Information Center

    Poitras, Eric G.; Doleck, Tenzin; Lajoie, Susanne P.

    2018-01-01

    Ill-structured problems, by definition, have multiple paths to a solution and are multifaceted making automated assessment and feedback a difficult challenge. Diagnostic reasoning about medical cases meet the criteria of ill-structured problem solving since there are multiple solution paths. The goal of this study was to develop an adaptive…

  17. The Display of Multiple Choice Question Bank on Microfilm

    ERIC Educational Resources Information Center

    Stevens, J. M.; Harris, F. T. C.

    1977-01-01

    An automated question bank maintained by the Department of Research and Services in Education at the Middlesex Hospital Medical School provides a printed copy of each of 25,000 multiple choice questions (95 percent relating to the whole spectrum of the medical curriculum). Problems with this procedure led to experimental work storing the data on…

  18. Towards Automated Annotation of Benthic Survey Images: Variability of Human Experts and Operational Modes of Automation

    PubMed Central

    Beijbom, Oscar; Edmunds, Peter J.; Roelfsema, Chris; Smith, Jennifer; Kline, David I.; Neal, Benjamin P.; Dunlap, Matthew J.; Moriarty, Vincent; Fan, Tung-Yung; Tan, Chih-Jui; Chan, Stephen; Treibitz, Tali; Gamst, Anthony; Mitchell, B. Greg; Kriegman, David

    2015-01-01

    Global climate change and other anthropogenic stressors have heightened the need to rapidly characterize ecological changes in marine benthic communities across large scales. Digital photography enables rapid collection of survey images to meet this need, but the subsequent image annotation is typically a time consuming, manual task. We investigated the feasibility of using automated point-annotation to expedite cover estimation of the 17 dominant benthic categories from survey-images captured at four Pacific coral reefs. Inter- and intra- annotator variability among six human experts was quantified and compared to semi- and fully- automated annotation methods, which are made available at coralnet.ucsd.edu. Our results indicate high expert agreement for identification of coral genera, but lower agreement for algal functional groups, in particular between turf algae and crustose coralline algae. This indicates the need for unequivocal definitions of algal groups, careful training of multiple annotators, and enhanced imaging technology. Semi-automated annotation, where 50% of the annotation decisions were performed automatically, yielded cover estimate errors comparable to those of the human experts. Furthermore, fully-automated annotation yielded rapid, unbiased cover estimates but with increased variance. These results show that automated annotation can increase spatial coverage and decrease time and financial outlay for image-based reef surveys. PMID:26154157
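
    The semi-automated mode, in which roughly half of the point annotations are made automatically, amounts to confidence-thresholded deferral: the machine keeps its label when its confidence clears a threshold and hands the remaining points to a human. In this sketch the classifier, the human callback, and the 0.9 threshold are all hypothetical placeholders.

        # Sketch: auto-accept confident machine labels, defer the rest to an expert.
        def annotate_points(points, classifier, human, confidence=0.9):
            labels, deferred = [], 0
            for p in points:
                category, conf = classifier(p)   # e.g. ("turf_algae", 0.95)
                if conf >= confidence:
                    labels.append(category)      # automated decision
                else:
                    labels.append(human(p))      # hard case goes to the annotator
                    deferred += 1
            return labels, deferred / len(points)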

  19. Development and application of an automated precision solar radiometer

    NASA Astrophysics Data System (ADS)

    Qiu, Gang-gang; Li, Xin; Zhang, Quan; Zheng, Xiao-bing; Yan, Jing

    2016-10-01

    Automated field vicarious calibration is becoming a growing trend for satellite remote sensors, and it requires a solar radiometer to measure reliable data automatically over long periods, whatever the weather conditions, and to transfer the measurements to the user's office. An automated precision solar radiometer has been developed for measuring the solar spectral irradiance received at the Earth's surface. The instrument consists of 8 parallel, separate silicon-photodiode-based channels with narrow band-pass filters spanning the visible to near-IR regions. Each channel has a 2.0° full-angle field of view (FOV). The detectors and filters are temperature-stabilized at 30 ± 0.2 °C using a thermal energy converter. The instrument is pointed toward the sun by an auto-tracking system that actively tracks the sun to within ±0.1°. It collects data automatically and communicates with the user terminal through BDS (China's BeiDou Navigation Satellite System), while recording data redundantly in internal memory, including working state and errors. The radiometer is automated in the sense that it requires no supervision throughout the whole working process: it calculates start and stop times every day to match sunrise and sunset, and it stops measuring during precipitation. Calibrated via Langley curves and observed simultaneously with a CE318, the difference in aerosol optical depth (AOD) is within 5%. The radiometer ran under all kinds of harsh weather conditions in the Gobi Desert at Dunhuang and obtained AODs nearly continuously for eight months. This paper presents the instrument design analysis, atmospheric optical depth retrievals, and the experimental results.
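
    Langley calibration, mentioned above, fits ln V = ln V0 - tau*m against airmass m: the slope gives the total optical depth tau in a channel and the intercept gives the extrapolated top-of-atmosphere signal V0. A minimal sketch with made-up clear-morning data (generated with tau = 0.2, V0 = 1):

        # Sketch of a Langley-plot regression for one radiometer channel.
        import numpy as np

        def langley(airmass, signal):
            m = np.asarray(airmass, float)
            lnv = np.log(np.asarray(signal, float))
            slope, intercept = np.polyfit(m, lnv, 1)
            tau = -slope               # total optical depth in this channel
            v0 = np.exp(intercept)     # zero-airmass (top-of-atmosphere) signal
            return tau, v0

        tau, v0 = langley([2, 3, 4, 5, 6], [0.670, 0.548, 0.449, 0.368, 0.301])
        print(tau, v0)                 # ~0.2 and ~1.0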

  20. Development of a simple indocyanine green measurement method using an automated biochemical analyser.

    PubMed

    Sato, Yuka; Seimiya, Masanori; Yoshida, Toshihiko; Sawabe, Yuji; Hokazono, Eisaku; Osawa, Susumu; Matsushita, Kazuyuki

    2017-01-01

    Background: The indocyanine green retention rate is important for assessing the severity of liver disorders. In the conventional method, blood needs to be collected twice. In the present study, we developed an automated indocyanine green method that does not require blood sampling before the intravenous indocyanine green injection and is applicable to an automated biochemical analyser. Methods: The serum samples of 471 patients, collected before and after intravenous indocyanine green injections and submitted to the clinical laboratory of our hospital, were used as samples. The standard procedure established by the Japan Society of Hepatology was used as the standard method. In the automated indocyanine green method, serum collected after an intravenous indocyanine green injection was mixed with a saline reagent containing a surfactant, and the indocyanine green concentration was measured at a dominant wavelength of 805 nm and a complementary wavelength of 884 nm. Results: The coefficients of variation of the within- and between-run reproducibility of this method were 2% or lower, and dilution linearity passing through the origin was noted up to 10 mg/L indocyanine green. The reagent was stable for four weeks or longer. Haemoglobin, bilirubin, and chyle had no impact on the results obtained. The correlation coefficient between the standard method (x) and this method (y) was r = 0.995; however, slight divergence was noted in turbid samples. Conclusion: Divergence in turbid samples may have corresponded to false negativity with the standard procedure. Our method may be highly practical because blood sampling before indocyanine green loading is unnecessary and measurements are simple.

  1. Band-pass filtering algorithms for adaptive control of compressor pre-stall modes in aircraft gas-turbine engine

    NASA Astrophysics Data System (ADS)

    Kuznetsova, T. A.

    2018-05-01

    Methods for increasing the adaptability of gas-turbine aircraft engines (GTE) to disturbances by extending the capabilities of the automatic control system (ACS) are analyzed. The flow pulsations in the suction and discharge lines of the compressor, which may cause stall, are considered as the disturbance. An algorithmic solution to the problem of controlling GTE pre-stall modes, adapted to the stability boundary, is proposed. The aim of the study is to develop band-pass filtering algorithms that provide the pre-stall-mode detection functions for the GTE ACS. The characteristic feature of the pre-stall effect is an increase of the pressure pulsation amplitude over the impeller at multiples of the rotor frequencies. The method is based on a band-pass filter combining low-pass and high-pass digital filters. The impulse response of the high-pass filter is determined from a known low-pass filter impulse response by spectral inversion. The resulting transfer function of the second-order band-pass filter (BPF) corresponds to a stable system. Two circuit implementations of the BPF are synthesized. The designed band-pass filtering algorithms were tested in the MATLAB environment. Comparative analysis of the amplitude-frequency responses of the proposed implementations allows choosing the BPF scheme that provides the best quality of filtering. The BPF reaction to a periodic sinusoidal signal, simulating the experimentally obtained pressure pulsation function in the pre-stall mode, was considered. The results of the model experiment demonstrated the effectiveness of applying band-pass filtering algorithms as part of the ACS to identify the pre-stall mode of the compressor by detecting the pressure fluctuation peaks that characterize the compressor's approach to the stability boundary.
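
    The filter construction described in this abstract (a high-pass obtained from a known low-pass impulse response by spectral inversion, then combined with a second low-pass to form a band-pass) can be sketched as follows. The tap count, window, and cutoff values are illustrative, not the paper's design values.

        # Sketch: band-pass FIR from a windowed-sinc low-pass plus a high-pass
        # built by spectral inversion (negate the taps, add 1 at the centre).
        import numpy as np

        def lowpass(cutoff, ntaps=101):            # cutoff as a fraction of Nyquist
            n = np.arange(ntaps) - (ntaps - 1) / 2
            h = cutoff * np.sinc(cutoff * n) * np.hamming(ntaps)
            return h / h.sum()                     # unity gain at DC

        def bandpass(f_lo, f_hi, ntaps=101):
            lp_hi = lowpass(f_hi, ntaps)           # passes everything below f_hi
            hp_lo = -lowpass(f_lo, ntaps)          # spectral inversion ...
            hp_lo[(ntaps - 1) // 2] += 1.0         # ... now passes above f_lo
            return np.convolve(lp_hi, hp_lo)       # cascade = band-pass f_lo..f_hi

        h = bandpass(0.2, 0.4)                     # pass band 0.2-0.4 x Nyquist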

  2. Semi-automated 96-well liquid-liquid extraction for quantitation of drugs in biological fluids.

    PubMed

    Zhang, N; Hoffman, K L; Li, W; Rossi, D T

    2000-02-01

    A semi-automated liquid-liquid extraction (LLE) technique for biological fluid sample preparation was introduced for the quantitation of four drugs in rat plasma. All liquid transferring during sample preparation was automated using a Tomtec Quadra 96 Model 320 liquid handling robot, which processed up to 96 samples in parallel. The samples were either in 96-deep-well plate or tube-rack format. One plate of samples can be prepared in approximately 1.5 h, and the 96-well plate is directly compatible with the autosampler of an LC/MS system. Selection of organic solvents and recoveries are discussed. Also, the precision, relative error, linearity, and quantitation of the semi-automated LLE method are estimated for four example drugs using LC/MS/MS with a multiple reaction monitoring (MRM) approach. The applicability of this method and future directions are evaluated.

  3. BioBlocks: Programming Protocols in Biology Made Easier.

    PubMed

    Gupta, Vishal; Irimia, Jesús; Pau, Iván; Rodríguez-Patón, Alfonso

    2017-07-21

    The methods to execute biological experiments are evolving. Affordable fluid handling robots and on-demand biology enterprises are making automating entire experiments a reality. Automation offers the benefit of high-throughput experimentation, rapid prototyping, and improved reproducibility of results. However, learning to automate and codify experiments is a difficult task as it requires programming expertise. Here, we present a web-based visual development environment called BioBlocks for describing experimental protocols in biology. It is based on Google's Blockly and Scratch, and requires little or no experience in computer programming to automate the execution of experiments. The experiments can be specified, saved, modified, and shared between multiple users in an easy manner. BioBlocks is open-source and can be customized to execute protocols on local robotic platforms or remotely, that is, in the cloud. It aims to serve as a de facto open standard for programming protocols in Biology.

  4. Electrofishing and the effects of depletion sampling on fish health: A review and recommendations for additional study

    USGS Publications Warehouse

    Panek, F.M.; Densmore, Christine L.; Cipriano, R.C.; Bruckner, A.W.; Shchelkunov, I.S.

    2011-01-01

    Depletion sampling in combination with multiple-pass electrofishing is an important fisheries management tool for wadeable streams. This combination of techniques has been used routinely by federal and state fishery management agencies for several decades as a reliable means to obtain quantitative data on trout populations or to describe fish community structure. In this paper we review the effects of electrofishing on fish and discuss this within the context of depletion sampling and multiple exposures of fishes to electric fields. The multiple wave forms most commonly used in sampling (alternating current, direct current, and pulsed direct current) are discussed, as well as electrofishing-induced response, injury, and physiological stress. Fish that survive electrofishing injuries are more likely to suffer short- and long-term adverse effects on their behavior, health, growth, or reproduction. Of greatest concern are the native, non-target species that may be subjected to multiple electrical shocks during the course of a 3-pass depletion survey. These exposures and their effects on the non-target species warrant further study, as do the overall effects of electrofishing on populations and community structure.
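
    For context, the quantitative population estimates that motivate multiple-pass depletion surveys typically come from a removal estimator such as Zippin's maximum-likelihood method. The sketch below implements that standard formula under a constant-catchability assumption; it is background material, not an analysis from this paper.

        # Sketch: Zippin removal estimator for a k-pass depletion survey.
        # With per-pass non-capture probability q, X = sum (i-1)*C_i and
        # T = sum C_i, the MLE solves  q/(1-q) - k*q^k/(1-q^k) = X/T,
        # and the population estimate is  N = T / (1 - q^k).
        def zippin(catches):
            k, T = len(catches), sum(catches)
            X = sum(i * c for i, c in enumerate(catches))  # (i-1)*C_i, 1-based i
            lo, hi = 1e-9, 1.0 - 1e-9
            for _ in range(100):                           # bisection on q
                q = (lo + hi) / 2.0
                f = q / (1 - q) - k * q**k / (1 - q**k) - X / T
                lo, hi = (q, hi) if f < 0 else (lo, q)
            return T / (1 - q**k)

        print(round(zippin([100, 50])))  # two-pass example -> 200 (Seber formula)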

  5. Establishment of a PID Pass/Fail Test for Crystalline Silicon Modules by Examining Field Performance for Five Years: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hacke, Peter L

    In an experiment with five module designs and multiple replicas, it is found that crystalline silicon cell modules that can pass a criterion of less than 5 percent power degradation in stress test conditions of 60 degrees Celsius, 85 percent relative humidity (RH), 96 h, and nameplate-rated system voltage bias show no power degradation by potential induced degradation in the range of 4-6 years duration in the Florida, USA environment. This data suggests that this chamber stress level is useful as a pass/fail criterion for PID, and will help ensure against degradation by system voltage stress in Florida, or less stressful climates, for at least 5 years.

  6. Protein function prediction--the power of multiplicity.

    PubMed

    Rentzsch, Robert; Orengo, Christine A

    2009-04-01

    Advances in experimental and computational methods have quietly ushered in a new era in protein function annotation. This 'age of multiplicity' is marked by the notion that only the use of multiple tools, multiple evidence and considering the multiple aspects of function can give us the broad picture that 21st century biology will need to link and alter micro- and macroscopic phenotypes. It might also help us to undo past mistakes by removing errors from our databases and prevent us from producing more. On the downside, multiplicity is often confusing. We therefore systematically review methods and resources for automated protein function prediction, looking at individual (biochemical) and contextual (network) functions, respectively.

  7. A novel nano-immunoassay method for quantification of proteins from CD138-purified myeloma cells: biological and clinical utility

    PubMed Central

    Misiewicz-Krzeminska, Irena; Corchete, Luis Antonio; Rojas, Elizabeta A.; Martínez-López, Joaquín; García-Sanz, Ramón; Oriol, Albert; Bladé, Joan; Lahuerta, Juan-José; Miguel, Jesús San; Mateos, María-Victoria; Gutiérrez, Norma C.

    2018-01-01

    Protein analysis in bone marrow samples from patients with multiple myeloma has been limited by the low concentration of proteins obtained after CD138+ cell selection. A novel approach based on capillary nano-immunoassay could make it possible to quantify dozens of proteins from each myeloma sample in an automated manner. Here we present a method for the accurate and robust quantification of the expression of multiple proteins extracted from CD138-purified multiple myeloma samples frozen in RLT Plus buffer, which is commonly used for nucleic acid preservation and isolation. Additionally, the biological and clinical value of this analysis for a panel of 12 proteins essential to the pathogenesis of multiple myeloma was evaluated in 63 patients with newly diagnosed multiple myeloma. The analysis of the prognostic impact of CRBN/Cereblon and IKZF1/Ikaros mRNA/protein showed that only the protein levels were able to predict progression-free survival of patients; mRNA levels were not associated with prognosis. Interestingly, high levels of Cereblon and Ikaros proteins were associated with longer progression-free survival only in patients who received immunomodulatory drugs and not in those treated with other drugs. In conclusion, the capillary nano-immunoassay platform provides a novel opportunity for automated quantification of the expression of more than 20 proteins in CD138+ primary multiple myeloma samples. PMID:29545347

  8. A novel nano-immunoassay method for quantification of proteins from CD138-purified myeloma cells: biological and clinical utility.

    PubMed

    Misiewicz-Krzeminska, Irena; Corchete, Luis Antonio; Rojas, Elizabeta A; Martínez-López, Joaquín; García-Sanz, Ramón; Oriol, Albert; Bladé, Joan; Lahuerta, Juan-José; Miguel, Jesús San; Mateos, María-Victoria; Gutiérrez, Norma C

    2018-05-01

    Protein analysis in bone marrow samples from patients with multiple myeloma has been limited by the low concentration of proteins obtained after CD138+ cell selection. A novel approach based on capillary nano-immunoassay could make it possible to quantify dozens of proteins from each myeloma sample in an automated manner. Here we present a method for the accurate and robust quantification of the expression of multiple proteins extracted from CD138-purified multiple myeloma samples frozen in RLT Plus buffer, which is commonly used for nucleic acid preservation and isolation. Additionally, the biological and clinical value of this analysis for a panel of 12 proteins essential to the pathogenesis of multiple myeloma was evaluated in 63 patients with newly diagnosed multiple myeloma. The analysis of the prognostic impact of CRBN/Cereblon and IKZF1/Ikaros mRNA/protein showed that only the protein levels were able to predict progression-free survival of patients; mRNA levels were not associated with prognosis. Interestingly, high levels of Cereblon and Ikaros proteins were associated with longer progression-free survival only in patients who received immunomodulatory drugs and not in those treated with other drugs. In conclusion, the capillary nano-immunoassay platform provides a novel opportunity for automated quantification of the expression of more than 20 proteins in CD138+ primary multiple myeloma samples. Copyright © 2018 Ferrata Storti Foundation.

  9. Multiple scattering induced negative refraction of matter waves

    PubMed Central

    Pinsker, Florian

    2016-01-01

    Starting from fundamental multiple scattering theory it is shown that negative refraction indices are feasible for matter waves passing a well-defined ensemble of scatterers. A simple approach to this topic is presented and explicit examples for systems of scatterers in 1D and 3D are stated that imply negative refraction for a generic incoming quantum wave packet. Essential features of the effective scattering field, densities and frequency spectrum of scatterers are considered. Additionally it is shown that negative refraction indices allow perfect transmission of the wave passing the ensemble of scatterers. Finally the concept of the superlens is discussed, since it is based on negative refraction and can be extended to matter waves utilizing the observations presented in this paper which thus paves the way to ‘untouchable’ quantum systems in analogy to cloaking devices for electromagnetic waves. PMID:26857266

  10. Assisted reproductive technology use, embryo transfer practices, and birth outcomes after infertility insurance mandates: New Jersey and Connecticut.

    PubMed

    Crawford, Sara; Boulet, Sheree L; Jamieson, Denise J; Stone, Carol; Mullen, Jewel; Kissin, Dmitry M

    2016-02-01

    To explore whether recently enacted infertility mandates including coverage for assisted reproductive technology (ART) treatment in New Jersey (2001) and Connecticut (2005) increased ART use, improved embryo transfer practices, and decreased multiple birth rates. Retrospective cohort study using data from the National ART Surveillance System. We explored trends in ART use, embryo transfer practices and birth outcomes, and compared changes in practices and outcomes during a 2-year period before and after passing the mandate between mandate and non-mandate states. Not applicable. Cycles of ART performed in the United States between 1996 and 2013. Infertility insurance mandates including coverage for ART treatment passed in New Jersey (2001) and Connecticut (2005). Number of ART cycles performed, number of embryos transferred, multiple live birth rates. Both New Jersey and Connecticut experienced an increase in ART use greater than the non-mandate states. The mean number of embryos transferred decreased significantly in New Jersey and Connecticut; however, the magnitudes were not significantly different from non-mandate states. There was no significant change in ART birth outcomes in either mandate state except for an increase in live births in Connecticut; the magnitude was not different from non-mandate states. The infertility insurance mandates passed in New Jersey and Connecticut were associated with increased ART treatment use but not a decrease in the number of embryos transferred or the rate of multiples; however, applicability of the mandates was limited. Published by Elsevier Inc.

  11. Reducing Fuel Consumption through Semi-Automated Platooning with Class 8 Tractor Trailer Combinations (Poster)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lammert, M.; Gonder, J.

    This poster describes the National Renewable Energy Laboratory's evaluation of the fuel savings potential of semi-automated truck platooning. Platooning involves reducing aerodynamic drag by grouping vehicles together and decreasing the distance between them through the use of electronic coupling, which allows multiple vehicles to accelerate or brake simultaneously. The NREL study addressed the need for data on American style line-haul sleeper cabs with modern aerodynamics and over a range of trucking speeds common in the United States.

  12. Hierarchically Parallelized Constrained Nonlinear Solvers with Automated Substructuring

    NASA Technical Reports Server (NTRS)

    Padovan, Joe; Kwang, Abel

    1994-01-01

    This paper develops a parallelizable multilevel multiple-constrained nonlinear equation solver. The substructuring process is automated to yield appropriately balanced partitioning of each succeeding level. Due to the generality of the procedure, sequential, as well as partially and fully parallel, environments can be handled. This includes both single and multiprocessor assignment per individual partition. Several benchmark examples are presented. These illustrate the robustness of the procedure as well as its capability to yield significant reductions in memory utilization and calculational effort due both to updating and inversion.

  13. Testing primates with joystick-based automated apparatus - Lessons from the Language Research Center's Computerized Test System

    NASA Technical Reports Server (NTRS)

    Washburn, David A.; Rumbaugh, Duane M.

    1992-01-01

    Nonhuman primates provide useful models for studying a variety of medical, biological, and behavioral topics. Four years of joystick-based automated testing of monkeys using the Language Research Center's Computerized Test System (LRC-CTS) are examined to derive hints and principles for comparable testing with other species - including humans. The results of multiple parametric studies are reviewed, and reliability data are presented to reveal the surprises and pitfalls associated with video-task testing of performance.

  14. Multichannel microscale system for high throughput preparative separation with comprehensive collection and analysis

    DOEpatents

    Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel

    2003-12-09

    A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.

  15. Extratropical Cyclone

    Atmospheric Science Data Center

    2013-04-16

    ... using data from multiple MISR cameras within automated computer processing algorithms. The stereoscopic algorithms used to generate ... NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Science Mission Directorate, Washington, D.C. The Terra spacecraft is managed ...

  16. Fusion of multiple quadratic penalty function support vector machines (QPFSVM) for automated sea mine detection and classification

    NASA Astrophysics Data System (ADS)

    Dobeck, Gerald J.; Cobb, J. Tory

    2002-08-01

    The high-resolution sonar is one of the principal sensors used by the Navy to detect and classify sea mines in minehunting operations. For such sonar systems, substantial effort has been devoted to the development of automated detection and classification (D/C) algorithms. These have been spurred by several factors including (1) aids for operators to reduce work overload, (2) more optimal use of all available data, and (3) the introduction of unmanned minehunting systems. The environments where sea mines are typically laid (harbor areas, shipping lanes, and the littorals) give rise to many false alarms caused by natural, biologic, and man-made clutter. The objective of the automated D/C algorithms is to eliminate most of these false alarms while still maintaining a very high probability of mine detection and classification (PdPc). In recent years, the benefits of fusing the outputs of multiple D/C algorithms have been studied. We refer to this as Algorithm Fusion. The results have been remarkable, including reliable robustness to new environments. The Quadratic Penalty Function Support Vector Machine (QPFSVM) algorithm to aid in the automated detection and classification of sea mines is introduced in this paper. The QPFSVM algorithm is easy to train, simple to implement, and robust to feature space dimension. Outputs of successive SVM algorithms are cascaded in stages (fused) to improve the Probability of Classification (Pc) and reduce the number of false alarms. Even though our experience has been gained in the area of sea mine detection and classification, the principles described herein are general and can be applied to fusion of any D/C problem (e.g., automated medical diagnosis or automatic target recognition for ballistic missile defense).
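
    A minimal sketch of the cascading idea (not the QPFSVM algorithm itself): a first SVM stage is thresholded low to preserve detections, and its output score is fed as an extra feature to a second stage that prunes false alarms. Data, features, and thresholds below are synthetic placeholders, and scikit-learn's standard SVC stands in for the paper's quadratic-penalty formulation.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(400, 8))   # synthetic sonar-snippet features
        # noisy labels: 1 = mine-like, 0 = clutter
        y = (X[:, 0] + rng.normal(scale=1.0, size=400) > 0).astype(int)

        stage1 = SVC(kernel="rbf", probability=True).fit(X, y)
        p1 = stage1.predict_proba(X)[:, 1]   # resubstitution scores, for illustration only
        keep = p1 > 0.2                      # stage 1 favors detection: low cut, few misses

        # stage 2 sees the stage-1 score as an extra feature and re-classifies survivors
        X2 = np.column_stack([X[keep], p1[keep]])
        stage2 = SVC(kernel="rbf", probability=True).fit(X2, y[keep])
        p2 = stage2.predict_proba(X2)[:, 1]
        print(keep.sum(), "candidates after stage 1,", int((p2 > 0.5).sum()), "final calls")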

  17. Automated wholeslide analysis of multiplex-brightfield IHC images for cancer cells and carcinoma-associated fibroblasts

    NASA Astrophysics Data System (ADS)

    Lorsakul, Auranuch; Andersson, Emilia; Vega Harring, Suzana; Sade, Hadassah; Grimm, Oliver; Bredno, Joerg

    2017-03-01

    Multiplex-brightfield immunohistochemistry (IHC) staining and quantitative measurement of multiple biomarkers can support therapeutic targeting of carcinoma-associated fibroblasts (CAF). This paper presents an automated digital-pathology solution to simultaneously analyze multiple biomarker expressions within a single tissue section stained with an IHC duplex assay. Our method was verified against ground truth provided by expert pathologists. In the first stage, the automated method quantified epithelial-carcinoma cells expressing cytokeratin (CK) using robust nucleus detection and supervised cell-by-cell classification algorithms with a combination of nucleus and contextual features. Using fibroblast activation protein (FAP) as a biomarker for CAFs, the algorithm was trained, based on ground truth obtained from pathologists, to automatically identify tumor-associated stroma using a supervised rule-generation approach. The algorithm reported the distance to the nearest neighbor between the populations of tumor cells and activated stromal fibroblasts as a whole-slide measure of spatial relationships. A total of 45 slides from six indications (breast, pancreatic, colorectal, lung, ovarian, and head-and-neck cancers) were included for training and verification. CK-positive cells detected by the algorithm were verified by a pathologist with good agreement (R² = 0.98) with the ground-truth count. For the area occupied by FAP-positive cells, the inter-observer agreement between two sets of ground-truth measurements was R² = 0.93, whereas the algorithm reproduced the pathologists' areas with R² = 0.96. The proposed methodology enables automated image analysis to measure spatial relationships of cells stained in an IHC-multiplex assay. Our proof-of-concept results show that an automated algorithm can be trained to reproduce the expert assessment and provide quantitative readouts that could support cutoff determination in hypothesis testing related to CAF-targeting-therapy decisions.
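
    The distance-to-nearest-neighbor readout lends itself to a short sketch. Here is one plausible implementation over synthetic CK+/FAP+ centroid lists (coordinates and counts invented), using scipy's cKDTree:

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(1)
        ck_cells = rng.uniform(0, 1000, size=(500, 2))   # CK+ tumor-cell centroids (um)
        fap_cells = rng.uniform(0, 1000, size=(200, 2))  # FAP+ fibroblast centroids (um)

        tree = cKDTree(fap_cells)
        dist, _ = tree.query(ck_cells, k=1)  # nearest FAP+ cell for every CK+ cell
        print(f"median tumor-to-CAF distance: {np.median(dist):.1f} um")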

  18. A comparison between two different automated total 25-hydroxyvitamin D immunoassay methods using liquid chromatography-tandem mass spectrometry.

    PubMed

    Kocak, Fatma Emel; Ozturk, Bahadir; Isiklar, Ozben Ozden; Genc, Ozlem; Unlu, Ali; Altuntas, Irfan

    2015-01-01

    Total 25-hydroxyvitamin D [25(OH)D] is the most reliable indicator of vitamin D status. In this study, we compared two automated immunoassay methods, the Abbott Architect 25-OH Vitamin D assay and the Roche Cobas Vitamin D total assay, with liquid chromatography-tandem mass spectrometry (LC-MS/MS). One hundred venous blood samples were randomly selected from routine vitamin D tests. Two of the serum aliquots were analyzed on the Abbott Architect i2000 and the e601 module of the Roche Cobas 6000 in our laboratory within the same day. The remaining serum aliquots were analyzed by LC-MS/MS in a different laboratory. Passing-Bablok regression analysis and Bland-Altman plots were used to compare methods. Inter-rater agreement was analyzed using kappa (κ) analysis. The Roche assay showed acceptable agreement with LC-MS/MS based on Passing-Bablok analysis (intercept: -5.23 nmol/L, 95% CI: -8.73 to 0.19; slope: 0.97, 95% CI: 0.77 to 1.15). The Abbott assay showed proportional (slope: 0.77, 95% CI: 0.67 to 0.85) and constant differences (intercept: 17.08 nmol/L; 95% CI: 12.98 to 21.39). Based on the Bland-Altman plots, a mean bias of 15.1% was observed for the Abbott assay and a mean bias of -14.1% for the Roche assay. Kappa analysis showed strong to nearly perfect agreement in vitamin D status between the immunoassays and LC-MS/MS (κ = 0.83 for Abbott, κ = 0.93 for Roche). Both immunoassays demonstrated acceptable performance, but the Roche Cobas assay performed better than the Abbott Architect in the studied samples.
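
    Passing-Bablok regression is the comparison workhorse in this record and several others below. A compact, point-estimate-only sketch of the standard estimator (no confidence intervals; pairs with equal x and slopes of exactly -1 are excluded; toy data):

        import numpy as np

        def passing_bablok(x, y):
            """Slope/intercept per the standard Passing-Bablok estimator (no CIs)."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            slopes = []
            n = len(x)
            for i in range(n - 1):
                for j in range(i + 1, n):
                    dx, dy = x[j] - x[i], y[j] - y[i]
                    if dx != 0 and dy / dx != -1:
                        slopes.append(dy / dx)
            slopes = np.sort(np.array(slopes))
            k = int(np.sum(slopes < -1))     # shift keeps the estimate unbiased
            m = len(slopes)
            if m % 2:
                b = slopes[(m - 1) // 2 + k]
            else:
                b = 0.5 * (slopes[m // 2 - 1 + k] + slopes[m // 2 + k])
            a = np.median(y - b * x)         # intercept: median residual
            return b, a

        # toy paired measurements: immunoassay (y) vs LC-MS/MS (x), nmol/L
        x = [20, 35, 48, 60, 75, 90, 110, 130]
        y = [18, 34, 49, 58, 77, 88, 112, 128]
        print("slope %.3f, intercept %.2f" % passing_bablok(x, y))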

  19. X-Band Acquisition Aid Software

    NASA Technical Reports Server (NTRS)

    Britcliffe, Michael J.; Strain, Martha M.; Wert, Michael

    2011-01-01

    The X-band Acquisition Aid (AAP) software is a low-cost acquisition aid for the Deep Space Network (DSN) antennas, and is used while acquiring a spacecraft shortly after it has launched. When enabled, the acquisition aid provides corrections to the antenna-predicted trajectory of the spacecraft to compensate for the variations that occur during the actual launch. The AAP software also provides the corrections to the antenna-predicted trajectory to the navigation team that uses the corrections to refine their model of the spacecraft in order to produce improved antenna-predicted trajectories for each spacecraft that passes over each complex. The software provides an automated Acquisition Aid receiver calibration, and provides graphical displays to the operator and remote viewers via an Ethernet connection. It has a Web server, and the remote workstations use the Firefox browser to view the displays. At any given time, only one operator can control any particular display in order to avoid conflicting commands from more than one control point. Configuration and control are accomplished solely via the graphical displays. The operator does not have to remember any commands. Only a few configuration parameters need to be changed, and they can be saved to the appropriate spacecraft-dependent configuration file on the AAP's hard disk. AAP automates the calibration sequence by first commanding the antenna to the correct position, starting the receiver calibration sequence, and then providing the operator with the option of accepting or rejecting the new calibration parameters. If accepted, the new parameters are stored in the appropriate spacecraft-dependent configuration file. The calibration can be performed on the Sun, greatly expanding the window of opportunity for calibration. The spacecraft traditionally used for calibration is in view typically twice per day, and only for about ten minutes each pass.

  20. Adaptive pattern recognition by mini-max neural networks as a part of an intelligent processor

    NASA Technical Reports Server (NTRS)

    Szu, Harold H.

    1990-01-01

    In this decade and progressing into the 21st century, NASA will have missions including Space Station and the Earth-related planetary sciences. To support these missions, a high degree of sophistication in machine automation and an increasing data-processing throughput rate are necessary. Meeting these challenges requires intelligent machines, designed to support the necessary automation in remote, hazardous space environments. There are two approaches to designing these intelligent machines. One of these is the knowledge-based expert system approach, namely AI. The other is a non-rule-based approach built on parallel and distributed computing for adaptive fault tolerance, namely Neural or Natural Intelligence (NI). The union of AI and NI is the solution to the problem stated above. The NI segment of this unit extracts features automatically by applying Cauchy simulated annealing to a mini-max cost energy function. The features discovered by NI can then be passed to the AI system for further processing, and vice versa. This passing increases reliability, for AI can follow the NI-formulated algorithm exactly, and can provide the context knowledge base as the constraints of neurocomputing. The mini-max cost function that solves for the unknown features can furthermore give us a top-down architectural design of neural networks by means of a Taylor series expansion of the cost function. A typical mini-max cost function consists of the sample variance of each class in the numerator, and the separation of the centers of the classes in the denominator. Thus, when the total cost energy is minimized, the conflicting goals of intraclass clustering and interclass segregation are achieved simultaneously.
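
    One way to write out that verbal description as an explicit objective (the notation below is ours, not the paper's): with class centers \mu_k and per-class sample variances \sigma_k^2, a typical form is

        E \;=\; \frac{\sum_{k} \sigma_k^{2}}{\sum_{k<l} \lVert \mu_k - \mu_l \rVert^{2}}

    Minimizing E then shrinks intraclass scatter (the numerator) while driving the class centers apart (the denominator), which is exactly the stated pair of conflicting goals.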

  1. Sci-Thur AM: YIS – 08: Automated Imaging Quality Assurance for Image-Guided Small Animal Irradiators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnstone, Chris; Bazalova-Carter, Magdalena

    Purpose: To develop quality assurance (QA) standards and tolerance levels for image quality of small animal irradiators. Methods: A fully automated in-house QA software for image analysis of a commercial microCT phantom was created. Quantitative analyses of CT linearity, signal-to-noise ratio (SNR), uniformity and noise, geometric accuracy, modulation transfer function (MTF), and CT number were performed. Phantom microCT scans from seven institutions acquired with varying parameters (kVp, mA, time, voxel size, and frame rate) and five irradiator units (Xstrahl SARRP, PXI X-RAD 225Cx, PXI X-RAD SmART, GE eXplore CT/RT 140, and GE eXplore CT 120) were analyzed. Multi-institutional data sets were compared using our in-house software to establish pass/fail criteria for each QA test. Results: CT linearity (R² > 0.996) was excellent at all but Institution 2. Acceptable SNR (> 35) and noise levels (< 55 HU) were obtained at four of the seven institutions; the failing scans were acquired with less than 120 mAs. Acceptable MTF (> 1.5 lp/mm at MTF = 0.2) was obtained at all but Institution 6, owing to its largest scan voxel size (0.35 mm). The geometric accuracy passed (< 1.5%) at five of the seven institutions. Conclusion: Our QA software can be used to rapidly perform quantitative imaging QA for small animal irradiators, accumulate results over time, and display possible changes in imaging functionality from its original performance and/or from the recommended tolerance levels. This tool will aid researchers in maintaining high image quality, enabling precise conformal dose delivery to small animals.
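
    The quoted tolerances translate directly into pass/fail logic. A hedged sketch, with field names of our own invention (the actual software's interface is not described in the abstract):

        QA_TOLERANCES = {
            "snr":          lambda v: v > 35,    # signal-to-noise ratio
            "noise_hu":     lambda v: v < 55,    # noise in HU
            "mtf_lp_mm":    lambda v: v > 1.5,   # spatial frequency at MTF = 0.2
            "geom_err_pct": lambda v: v < 1.5,   # geometric accuracy
        }

        def run_qa(measured: dict) -> dict:
            """Return PASS/FAIL per test for one phantom scan's measurements."""
            return {test: ("PASS" if ok(measured[test]) else "FAIL")
                    for test, ok in QA_TOLERANCES.items()}

        print(run_qa({"snr": 41.2, "noise_hu": 48.9,
                      "mtf_lp_mm": 1.7, "geom_err_pct": 0.8}))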

  2. Multiple sort flow cytometer

    DOEpatents

    Van den Engh, Ger; Esposito, Richard J.

    1996-01-01

    A flow cytometer utilizes multiple lasers for excitation and respective fluorescence of identified dyes bonded to specific cells or events to identify and verify multiple events to be sorted from a sheath flow and droplet stream. Once identified, verified and timed in the sheath flow, each event is independently tagged upon separation from the flow by an electrical charge of +60, +120, or +180 volts and passed through oppositely charged deflection plates with ground planes to yield a focused six-way deflection of at least six events in a narrow plane.

  3. Association between fully automated MRI-based volumetry of different brain regions and neuropsychological test performance in patients with amnestic mild cognitive impairment and Alzheimer's disease.

    PubMed

    Arlt, Sönke; Buchert, Ralph; Spies, Lothar; Eichenlaub, Martin; Lehmbeck, Jan T; Jahn, Holger

    2013-06-01

    Fully automated magnetic resonance imaging (MRI)-based volumetry may serve as a biomarker for the diagnosis in patients with mild cognitive impairment (MCI) or dementia. We aimed to investigate the relation between fully automated MRI-based volumetric measures and neuropsychological test performance in amnestic MCI and patients with mild dementia due to Alzheimer's disease (AD) in a cross-sectional and longitudinal study. In order to assess a possible prognostic value of fully automated MRI-based volumetry for future cognitive performance, the rate of change of neuropsychological test performance over time was also tested for its correlation with fully automated MRI-based volumetry at baseline. In 50 subjects, 18 with amnestic MCI, 21 with mild AD, and 11 controls, neuropsychological testing and T1-weighted MRI were performed at baseline and at a mean follow-up interval of 2.1 ± 0.5 years (n = 19). Fully automated MRI volumetry of the grey matter volume (GMV) was performed using a combined stereotactic normalisation and segmentation approach as provided by SPM8 and a set of pre-defined binary lobe masks. Left and right hippocampus masks were derived from probabilistic cytoarchitectonic maps. Volumes of the inner and outer liquor space were also determined automatically from the MRI. Pearson's test was used for the correlation analyses. Left hippocampal GMV was significantly correlated with performance in memory tasks, and left temporal GMV was related to performance in language tasks. Bilateral frontal, parietal and occipital GMVs were correlated to performance in neuropsychological tests comprising multiple domains. The rate of GMV change in the left hippocampus was correlated with decline of performance in the Boston Naming Test (BNT), Mini-Mental Status Examination, and trail making test B (TMT-B). The decrease of BNT and TMT-A performance over time correlated with the loss of grey matter in multiple brain regions. We conclude that fully automated MRI-based volumetry allows detection of regional grey matter volume loss that correlates with neuropsychological performance in patients with amnestic MCI or mild AD. Because of the high level of automation, MRI-based volumetry may easily be integrated into clinical routine to complement the current diagnostic procedure.

  4. Multifunctional picoliter droplet manipulation platform and its application in single cell analysis.

    PubMed

    Gu, Shu-Qing; Zhang, Yun-Xia; Zhu, Ying; Du, Wen-Bin; Yao, Bo; Fang, Qun

    2011-10-01

    We developed an automated and multifunctional microfluidic platform based on DropLab to perform flexible generation and complex manipulations of picoliter-scale droplets. Multiple manipulations including precise droplet generation, sequential reagent merging, and multistep solid-phase extraction for picoliter-scale droplets could be achieved in the present platform. The system precision in generating picoliter-scale droplets was significantly improved by minimizing the thermo-induced fluctuation of flow rate. A novel droplet fusion technique based on the difference of droplet interfacial tensions was developed without the need of special microchannel networks or external devices. It enabled sequential addition of reagents to droplets on demand for multistep reactions. We also developed an effective picoliter-scale droplet splitting technique with magnetic actuation. The difficulty in phase separation of magnetic beads from picoliter-scale droplets due to the high interfacial tension was overcome using ferromagnetic particles to carry the magnetic beads to pass through the phase interface. With this technique, multistep solid-phase extraction was achieved among picoliter-scale droplets. The present platform had the ability to perform complex multistep manipulations to picoliter-scale droplets, which is particularly required for single cell analysis. Its utility and potentials in single cell analysis were preliminarily demonstrated in achieving high-efficiency single-cell encapsulation, enzyme activity assay at the single cell level, and especially, single cell DNA purification based on solid-phase extraction.

  5. Comparison of plasma ammonia results from seven different automated platforms in use throughout Central Australia.

    PubMed

    Markus, Corey; Metz, Michael

    2017-04-01

    The clinical catchment area for the Metabolic service at the Women's and Children's Hospital in Adelaide, South Australia, covers nearly 2.5 million km². Care of children with metabolic disorders in these remote areas is assisted from Adelaide, and at times, using plasma ammonia results from laboratories up to 3000 km away. There are seven different platforms measuring plasma ammonia within this vast clinical catchment area. Hence, a correlation study was conducted to examine the relationship between plasma ammonia results from the seven different platforms in use throughout central Australia. Multiple aliquots of plasma from remainder EDTA samples for haematological investigations were frozen. Samples were then dispatched on dry ice to the laboratories being correlated. At an agreed date and time, correlation samples were thawed and plasma ammonia measured. Passing-Bablok regression analysis showed slopes ranging from 1.00 to 1.10 and y-intercepts ranging from -10 μmol/L to 1 μmol/L. Despite the absence of a reference method or reference material and troublesome pre-analytical effects in ammonia measurement, plasma ammonia results from the different platforms in general compare well. The study also demonstrates that samples for ammonia measurement can be transported over great distances and still correlate well. Furthermore, a common reference interval for plasma ammonia may be a possibility. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.

  6. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation.

    PubMed

    Pujar, Shashikant; O'Leary, Nuala A; Farrell, Catherine M; Loveland, Jane E; Mudge, Jonathan M; Wallin, Craig; Girón, Carlos G; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; Martin, Fergal J; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Suner, Marie-Marthe; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bruford, Elspeth A; Bult, Carol J; Frankish, Adam; Murphy, Terence; Pruitt, Kim D

    2018-01-04

    The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. Published by Oxford University Press on behalf of Nucleic Acids Research 2017.

  7. Delegation control of multiple unmanned systems

    NASA Astrophysics Data System (ADS)

    Flaherty, Susan R.; Shively, Robert J.

    2010-04-01

    Maturing technologies and complex payloads coupled with a future objective to reduce the logistics burden of current unmanned aerial systems (UAS) operations require a change to the 2-crew employment paradigm. Increased automation and operator supervisory control of unmanned systems have been advocated to meet the objective of reducing the crew requirements, while managing future technologies. Specifically, a delegation control employment strategy has resulted in reduced workload and higher situation awareness for single operators controlling multiple unmanned systems in empirical studies [1,2]. Delegation control is characterized by the ability for an operator to call a single "play" that initiates prescribed default actions for each vehicle and associated sensor related to a common mission goal. Based upon the effectiveness of delegation control in simulation, the U.S. Army Aeroflightdynamics Directorate (AFDD) developed a Delegation Control (DelCon) operator interface with voice recognition implementation for play selection, real-time play modification, and play status with automation transparency to enable single operator control of multiple unmanned systems in flight. AFDD successfully demonstrated delegation control in a Troops-in-Contact mission scenario at Ft. Ord in 2009. This summary showcases the effort as a beneficial advance in single operator control of multiple UAS.

  8. Multiple HPV genotype infection impact on invasive cervical cancer presentation and survival

    PubMed Central

    Martins, Toni Ricardo; Mendoza Lopez, Rossana V.; Sadalla, José Carlos; de Carvalho, João Paulo Mancusi; Baracat, Edmund Chada

    2017-01-01

    Background Invasive cervical cancer (ICC) is the third most common malignant neoplasm affecting Brazilian women. Little is known about the impact of specific HPV genotypes in the prognosis of ICC. We hypothesized that HPV genotype would impact ICC clinical presentation and survival. Methods Women diagnosed with ICC at the Instituto do Câncer do Estado de São Paulo (ICESP) between May 2008 and June 2012 were included in the study and were followed until December 2015. HPV genotype was detected from formalin-fixed paraffin-embedded (FFPE) tumor tissue samples using the Onclarity™ system (BD Viper™ LT automated system). Results 292 patients aged 50 ± 14 years were analyzed. HPV DNA was detected in 84% of patients. The HPV genotypes studied were: HPV16 (64%), HPV18 (10%), HPV33-58 (7%), HPV45 (5%), HPV31 (4%) and other high-risk HPV genotypes (11%). HPV genotypes showed different distributions regarding histological type and clinical stage. Patients were followed for 35 ± 21 months. The overall survival at 5 years after diagnosis of cervical cancer was 54%. Age, clinical staging, histological type and multiple HPV genotypes detected in the same tumor specimen were associated with poorer overall survival on multivariate Cox proportional hazard analysis (p < 0.05). No specific HPV genotype affected survival. Conclusion Multiple HPV genotype infection was associated with poorer ICC survival in our study, compared with single genotype infection. HPV genotyping from FFPE tumor tissue using an automated assay such as the Onclarity BD™ assay provides a simpler alternative for routine clinical use. Impact This is the largest study employing an automated HPV genotyping assay using FFPE of ICC. Multiple HPV genotype infection adversely influenced survival. PMID:28829791
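
    A multivariate Cox proportional-hazards analysis like the one described can be sketched with the lifelines package. The data frame below is an illustrative toy, not study data, and the column names are our own:

        import pandas as pd
        from lifelines import CoxPHFitter

        df = pd.DataFrame({
            "months":       [12, 35, 60,  8, 44, 60, 23, 50],  # follow-up time
            "died":         [ 1,  0,  1,  1,  1,  0,  0,  0],  # event indicator
            "age":          [61, 45, 38, 70, 55, 42, 66, 39],
            "stage":        [ 3,  1,  2,  4,  2,  1,  3,  1],
            "multiple_hpv": [ 1,  0,  0,  1,  0,  0,  1,  0],  # >1 genotype in specimen
        })

        cph = CoxPHFitter()
        cph.fit(df, duration_col="months", event_col="died")
        cph.print_summary()   # hazard ratio, CI and p-value per covariate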

  9. The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.

    1994-01-01

    Currently available robotic systems provide limited support for CAD-based model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work which provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three dimensional graphical models, and automated tracking functions which depend on automated machine vision. A set of tools for single and multiple focal plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between human and operator vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.

  10. The use of interactive computer vision and robot hand controllers for enhancing manufacturing safety

    NASA Astrophysics Data System (ADS)

    Marzwell, Neville I.; Jacobus, Charles J.; Peurach, Thomas M.; Mitchell, Brian T.

    1994-02-01

    Currently available robotic systems provide limited support for CAD-based model-driven visualization, sensing algorithm development and integration, and automated graphical planning systems. This paper describes ongoing work which provides the functionality necessary to apply advanced robotics to automated manufacturing and assembly operations. An interface has been built which incorporates 6-DOF tactile manipulation, displays for three dimensional graphical models, and automated tracking functions which depend on automated machine vision. A set of tools for single and multiple focal plane sensor image processing and understanding has been demonstrated which utilizes object recognition models. The resulting tool will enable sensing and planning from computationally simple graphical objects. A synergistic interplay between human and operator vision is created from programmable feedback received from the controller. This approach can be used as the basis for implementing enhanced safety in automated robotics manufacturing, assembly, repair and inspection tasks in both ground and space applications. Thus, an interactive capability has been developed to match the modeled environment to the real task environment for safe and predictable task execution.

  11. Refurbishment and Automation of Thermal Vacuum Facilities at NASA/GSFC

    NASA Technical Reports Server (NTRS)

    Dunn, Jamie; Gomez, Carlos; Donohue, John; Johnson, Chris; Palmer, John; Sushon, Janet

    1999-01-01

    The thermal vacuum facilities located at the Goddard Space Flight Center (GSFC) have supported both manned and unmanned space flight since the 1960s. Of the eleven facilities, currently ten of the systems are scheduled for refurbishment or replacement as part of a five-year implementation. Expected return on investment includes the reduction in test schedules, improvements in safety of facility operations, and reduction in the personnel support required for a test. Additionally, GSFC will become a global resource renowned for expertise in thermal engineering, mechanical engineering, and for the automation of thermal vacuum facilities and tests. Automation of the thermal vacuum facilities includes the utilization of Programmable Logic Controllers (PLCs), the use of Supervisory Control and Data Acquisition (SCADA) systems, and the development of a centralized Test Data Management System. These components allow the computer control and automation of mechanical components such as valves and pumps. The project of refurbishment and automation began in 1996 and has resulted in complete computer control of one facility (Facility 281), and the integration of electronically controlled devices and PLCs in multiple others.

  12. Refurbishment and Automation of Thermal Vacuum Facilities at NASA/GSFC

    NASA Technical Reports Server (NTRS)

    Dunn, Jamie; Gomez, Carlos; Donohue, John; Johnson, Chris; Palmer, John; Sushon, Janet

    1998-01-01

    The thermal vacuum facilities located at the Goddard Space Flight Center (GSFC) have supported both manned and unmanned space flight since the 1960s. Of the eleven facilities, currently ten of the systems are scheduled for refurbishment or replacement as part of a five-year implementation. Expected return on investment includes the reduction in test schedules, improvements in safety of facility operations, and reduction in the personnel support required for a test. Additionally, GSFC will become a global resource renowned for expertise in thermal engineering, mechanical engineering, and for the automation of thermal vacuum facilities and tests. Automation of the thermal vacuum facilities includes the utilization of Programmable Logic Controllers (PLCs), the use of Supervisory Control and Data Acquisition (SCADA) systems, and the development of a centralized Test Data Management System. These components allow the computer control and automation of mechanical components such as valves and pumps. The project of refurbishment and automation began in 1996 and has resulted in complete computer control of one facility (Facility 281), and the integration of electronically controlled devices and PLCs in multiple others.

  13. MARS: bringing the automation of small-molecule bioanalytical sample preparations to a new frontier.

    PubMed

    Li, Ming; Chou, Judy; Jing, Jing; Xu, Hui; Costa, Aldo; Caputo, Robin; Mikkilineni, Rajesh; Flannelly-King, Shane; Rohde, Ellen; Gan, Lawrence; Klunk, Lewis; Yang, Liyu

    2012-06-01

    In recent years, there has been a growing interest in automating small-molecule bioanalytical sample preparations specifically using the Hamilton MicroLab® STAR liquid-handling platform. In the most extensive work reported thus far, multiple small-molecule sample preparation assay types (protein precipitation extraction, SPE and liquid-liquid extraction) have been integrated into a suite that is composed of graphical user interfaces and Hamilton scripts. Using that suite, bioanalytical scientists have been able to automate various sample preparation methods to a great extent. However, there are still areas that could benefit from further automation, specifically, the full integration of analytical standard and QC sample preparation with study sample extraction in one continuous run, real-time 2D barcode scanning on the Hamilton deck and direct Laboratory Information Management System database connectivity. We developed a new small-molecule sample-preparation automation system that improves in all of the aforementioned areas. The improved system presented herein further streamlines the bioanalytical workflow, simplifies batch run design, reduces analyst intervention and eliminates sample-handling error.

  14. Intelligent vehicle control: Opportunities for terrestrial-space system integration

    NASA Technical Reports Server (NTRS)

    Shoemaker, Charles

    1994-01-01

    For 11 years the Department of Defense has cooperated with a diverse array of other Federal agencies including the National Institute of Standards and Technology, the Jet Propulsion Laboratory, and the Department of Energy, to develop robotics technology for unmanned ground systems. These activities have addressed control system architectures supporting sharing of tasks between the system operator and various automated subsystems, man-machine interfaces to intelligent vehicles systems, video compression supporting vehicle driving in low data rate digital communication environments, multiple simultaneous vehicle control by a single operator, path planning and retrace, and automated obstacle detection and avoidance subsystem. Performance metrics and test facilities for robotic vehicles were developed permitting objective performance assessment of a variety of operator-automated vehicle control regimes. Progress in these areas will be described in the context of robotic vehicle testbeds specifically developed for automated vehicle research. These initiatives, particularly as regards the data compression, task sharing, and automated mobility topics, also have relevance in the space environment. The intersection of technology development interests between these two communities will be discussed in this paper.

  15. Automated characterization and assembly of individual nanowires for device fabrication.

    PubMed

    Yu, Kaiyan; Yi, Jingang; Shan, Jerry W

    2018-05-15

    The automated sorting and positioning of nanowires and nanotubes is essential to enabling the scalable manufacturing of nanodevices for a variety of applications. However, two fundamental challenges still remain: (i) automated placement of individual nanostructures in precise locations, and (ii) the characterization and sorting of highly variable nanomaterials to construct well-controlled nanodevices. Here, we propose and demonstrate an integrated, electric-field based method for the simultaneous automated characterization, manipulation, and assembly of nanowires (ACMAN) with selectable electrical conductivities into nanodevices. We combine contactless and solution-based electro-orientation spectroscopy and electrophoresis-based motion-control, planning and manipulation strategies to simultaneously characterize and manipulate multiple individual nanowires. These nanowires can be selected according to their electrical characteristics and precisely positioned at different locations in a low-conductivity liquid to form functional nanodevices with desired electrical properties. We validate the ACMAN design by assembling field-effect transistors (FETs) with silicon nanowires of selected electrical conductivities. The design scheme provides a key enabling technology for the scalable, automated sorting and assembly of nanowires and nanotubes to build functional nanodevices.

  16. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations is small and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a modular approach, from a hardware-driven system to process control, from a one-of-a-kind novelty toward a standardized product, and from an in vitro diagnostics novelty to a marketing tool. Multiple vendors are present in the marketplace, many of whom are in vitro diagnostics manufacturers providing an automation solution coupled with their instruments, whereas others are focused automation companies. Automation technology continues to advance, acceptance continues to climb, and payback and cost justification methods are developing.

  17. Raman accumulator as a fusion laser driver

    DOEpatents

    George, E. Victor; Swingle, James C.

    1985-01-01

    Apparatus for simultaneous laser pulse amplification and compression, using multiple pass Raman scattering in one Raman cell and pulse switchout from the optical cavity through use of a dichroic device associated with the Raman cell.

  18. Raman accumulator as a fusion laser driver

    DOEpatents

    George, E.V.; Swingle, J.C.

    1982-03-31

    Apparatus for simultaneous laser pulse amplification and compression, using multiple pass Raman scattering in one Raman cell and pulse switchout from the optical cavity through use of a dichroic device associated with the Raman cell.

  19. Optimal Cluster Mill Pass Scheduling With an Accurate and Rapid New Strip Crown Model

    NASA Astrophysics Data System (ADS)

    Malik, Arif S.; Grandhi, Ramana V.; Zipf, Mark E.

    2007-05-01

    Besides the requirement to roll coiled sheet at high levels of productivity, the optimal pass scheduling of cluster-type reversing cold mills presents the added challenge of assigning mill parameters that facilitate the best possible strip flatness. The pressures of intense global competition, and the requirements for increasingly thinner, higher quality specialty sheet products that are more difficult to roll, continue to force metal producers to commission innovative flatness-control technologies. This means that during the on-line computerized set-up of rolling mills, the mathematical model should not only determine the minimum total number of passes and maximum rolling speed, but should simultaneously optimize the pass-schedule so that desired flatness is assured, either by manual or automated means. In many cases today, however, on-line prediction of strip crown and corresponding flatness for the complex cluster-type rolling mills is typically addressed either by trial and error, by approximate deflection models for equivalent vertical roll-stacks, or by non-physical pattern recognition style models. The abundance of the aforementioned methods is largely due to the complexity of cluster-type mill configurations and the lack of deflection models with sufficient accuracy and speed for on-line use. Without adequate assignment of the pass-schedule set-up parameters, it may be difficult or impossible to achieve the required strip flatness. In this paper, we demonstrate optimization of cluster mill pass-schedules using a new accurate and rapid strip crown model. This pass-schedule optimization includes computations of the predicted strip thickness profile to validate mathematical constraints. In contrast to many of the existing methods for on-line prediction of strip crown and flatness on cluster mills, the demonstrated method requires minimal prior tuning and no extensive training with collected mill data. To rapidly and accurately solve the multi-contact problem and predict the strip crown, a new customized semi-analytical modeling technique that couples the Finite Element Method (FEM) with classical solid mechanics was developed to model the deflection of the rolls and strip while under load. The technique employed offers several important advantages over traditional methods to calculate strip crown, including continuity of elastic foundations, non-iterative solution when using predetermined foundation moduli, continuous third-order displacement fields, simple stress-field determination, and a comparatively faster solution time.

  20. Advanced manufacturing rules check (MRC) for fully automated assessment of complex reticle designs

    NASA Astrophysics Data System (ADS)

    Gladhill, R.; Aguilar, D.; Buck, P. D.; Dawkins, D.; Nolke, S.; Riddick, J.; Straub, J. A.

    2005-11-01

    Advanced electronic design automation (EDA) tools, with their simulation, modeling, design rule checking, and optical proximity correction capabilities, have facilitated the improvement of first pass wafer yields. While the data produced by these tools may have been processed for optimal wafer manufacturing, it is possible for the same data to be far from ideal for photomask manufacturing, particularly at lithography and inspection stages, resulting in production delays and increased costs. The same EDA tools used to produce the data can be used to detect potential problems for photomask manufacturing in the data. A production implementation of automated photomask manufacturing rule checking (MRC) is presented and discussed for various photomask lithography and inspection lines. This paper will focus on identifying data which may cause production delays at the mask inspection stage. It will be shown how photomask MRC can be used to discover data related problems prior to inspection, separating jobs which are likely to have problems at inspection from those which are not. Photomask MRC can also be used to identify geometries requiring adjustment of inspection parameters for optimal inspection, and to assist with any special handling or change of routing requirements. With this foreknowledge, steps can be taken to avoid production delays that increase manufacturing costs. Finally, the data flow implemented for MRC can be used as a platform for other photomask data preparation tasks.

  1. Automated detection of coronal mass ejections in three-dimensions using multi-viewpoint observations

    NASA Astrophysics Data System (ADS)

    Hutton, J.; Morgan, H.

    2017-03-01

    A new, automated method of detecting coronal mass ejections (CMEs) in three dimensions for the LASCO C2 and STEREO COR2 coronagraphs is presented. By triangulating isolated CME signal from the three coronagraphs over a sliding window of five hours, the most likely region through which CMEs pass at 5 R⊙ is identified. The centre and size of the region gives the most likely direction of propagation and approximate angular extent. The Automated CME Triangulation (ACT) method is tested extensively using a series of synthetic CME images created using a wireframe flux rope density model, and on a sample of real coronagraph data, including halo CMEs. The accuracy of the angular difference (σ) between the detection and true input of the synthetic CMEs is σ = 7.14°, and remains acceptable for a broad range of CME positions relative to the observer, the relative separation of the three observers and even through the loss of one coronagraph. For real data, the method gives results that compare well with the distribution of low coronal sources and results from another instrument and technique made further from the Sun. The true three-dimensional (3D)-corrected kinematics and mass/density are discussed. The results of the new method will be incorporated into the CORIMP database in the near future, enabling improved space weather diagnostics and forecasting.
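
    The reported accuracy metric, the angle σ between a detected and a true propagation direction, reduces to the angle between two vectors. A small sketch (the vectors below are synthetic, not study values):

        import numpy as np

        def angular_difference_deg(u, v):
            """Angle in degrees between two 3D direction vectors."""
            u = np.asarray(u, float); v = np.asarray(v, float)
            cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

        true_dir = [1.0, 0.0, 0.0]        # synthetic CME direction at 5 Rsun
        detected = [0.99, 0.10, 0.05]
        print(f"angular difference: {angular_difference_deg(true_dir, detected):.2f} deg")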

  2. Automation of nanoflow liquid chromatography-tandem mass spectrometry for proteome analysis by using a strong cation exchange trap column.

    PubMed

    Jiang, Xiaogang; Feng, Shun; Tian, Ruijun; Han, Guanghui; Jiang, Xinning; Ye, Mingliang; Zou, Hanfa

    2007-02-01

    An approach was developed to automate sample introduction for nanoflow LC-MS/MS (μLC-MS/MS) analysis using a strong cation exchange (SCX) trap column. The system consisted of a 100 μm i.d. × 2 cm SCX trap column and a 75 μm i.d. × 12 cm C18 RP analytical column. During the sample loading step, the flow passing through the SCX trap column was directed to waste for loading a large volume of sample at high flow rate. Then the peptides bound on the SCX trap column were eluted onto the RP analytical column by a high salt buffer, followed by RP chromatographic separation of the peptides at nanoliter flow rate. It was observed that higher separation performance could be achieved with the system using the SCX trap column than with the system using a C18 trap column. The high proteomic coverage of this approach was demonstrated in the analysis of tryptic digests of BSA and yeast cell lysate. In addition, this system was also applied to two-dimensional separation of a tryptic digest of the human hepatocellular carcinoma cell line SMMC-7721 for large-scale proteome analysis. This system was fully automated, required minimal changes to the current μLC-MS/MS setup, and represented a promising platform for routine proteome analysis.

  3. Sunglint Detection for Unmanned and Automated Platforms

    PubMed Central

    Garaba, Shungudzemwoyo Pascal; Schulz, Jan; Wernand, Marcel Robert; Zielinski, Oliver

    2012-01-01

    We present an empirical quality control protocol for above-water radiometric sampling focussing on identifying sunglint situations. Using hyperspectral radiometers, measurements were taken on an automated and unmanned seaborne platform in northwest European shelf seas. In parallel, a camera system was used to capture sea surface and sky images of the investigated points. The quality control consists of meteorological flags, to mask dusk, dawn, precipitation and low light conditions, utilizing incoming solar irradiance (ES) spectra. Using 629 from a total of 3,121 spectral measurements that passed the test conditions of the meteorological flagging, a new sunglint flag was developed. To detect sunglint visible in the simultaneously captured sea-surface images, a sunglint image-detection algorithm was developed and implemented. Applying this algorithm, two data sets were derived: one with sunglint (many detectable white pixels) and one without (few or no detectable white pixels). To identify the most effective sunglint flagging criteria we evaluated the spectral characteristics of these two data sets using water leaving radiance (LW) and remote sensing reflectance (RRS). Spectral conditions satisfying 'mean LW (700–950 nm) < 2 mW·m⁻²·nm⁻¹·sr⁻¹' or alternatively 'minimum RRS (700–950 nm) < 0.010 sr⁻¹' mask most measurements affected by sunglint, providing an efficient empirical flagging of sunglint in automated quality control.
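
    The two quoted flagging criteria transcribe almost directly into code. A sketch assuming one hyperspectral spectrum per call (the spectra below are synthetic, with a glint-like NIR lift):

        import numpy as np

        def sunglint_flags(wl_nm, lw, rrs):
            """Apply the two NIR criteria quoted above to one spectrum.
            Returns (lw_ok, rrs_ok); a False value flags likely sunglint."""
            nir = (wl_nm >= 700) & (wl_nm <= 950)
            lw_ok = lw[nir].mean() < 2.0      # mean LW(700-950 nm) < 2 mW m^-2 nm^-1 sr^-1
            rrs_ok = rrs[nir].min() < 0.010   # alternative: min RRS(700-950 nm) < 0.010 sr^-1
            return lw_ok, rrs_ok

        wl = np.arange(400, 951, 10)
        lw = np.full(wl.shape, 1.2);   lw[wl >= 700] = 3.5
        rrs = np.full(wl.shape, 0.004); rrs[wl >= 700] = 0.014
        print(sunglint_flags(wl, lw, rrs))   # (False, False) -> flag this spectrum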

  4. Generic and Automated Data Evaluation in Analytical Measurement.

    PubMed

    Adam, Martin; Fleischer, Heidi; Thurow, Kerstin

    2017-04-01

    In recent years, automation has become more and more important in the field of elemental and structural chemical analysis, to reduce the high degree of manual operation and processing time as well as human error. A high number of data points is thus generated, which requires fast and automated data evaluation. To handle the preprocessed export data from different analytical devices running software from various vendors, a standardized solution that requires no programming knowledge is preferable. In modern laboratories, multiple users will use this software on multiple personal computers with different operating systems (e.g., Windows, Macintosh, Linux). Mobile devices such as smartphones and tablets have also gained growing importance. The developed software, Project Analytical Data Evaluation (ADE), is implemented as a web application. To transmit the pre-evaluated data from the device software to the Project ADE, the exported XML report files are detected and the included data are imported into the entities database using the Data Upload software. Different calculation types of a sample within one measurement series (e.g., method validation) are identified using information tags inside the sample name. The results are presented in tables and diagrams at different information levels (general, or detailed for one analyte or sample).
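
    A hedged sketch of the import step: scanning for exported XML reports and keying results by the calculation-type tag embedded in the sample name. The XML tag and attribute names here are invented for illustration, not the actual vendor or ADE schema:

        import xml.etree.ElementTree as ET
        from pathlib import Path

        def import_reports(export_dir):
            """Collect analyte results from all exported XML reports in a folder."""
            records = []
            for report in Path(export_dir).glob("*.xml"):
                root = ET.parse(report).getroot()
                for sample in root.iter("Sample"):        # hypothetical tag
                    name = sample.get("name", "")
                    # calculation type encoded as a tag inside the sample name,
                    # e.g. "QC_blank_01" -> "QC"
                    calc_type = name.split("_")[0] if "_" in name else "unknown"
                    for result in sample.iter("Result"):  # hypothetical tag
                        records.append({
                            "sample": name,
                            "type": calc_type,
                            "analyte": result.get("analyte"),
                            "value": float(result.get("value", "nan")),
                        })
            return records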

  5. Shotgun protein sequencing: assembly of peptide tandem mass spectra from mixtures of modified proteins.

    PubMed

    Bandeira, Nuno; Clauser, Karl R; Pevzner, Pavel A

    2007-07-01

    Despite significant advances in the identification of known proteins, the analysis of unknown proteins by MS/MS still remains a challenging open problem. Although Klaus Biemann recognized the potential of MS/MS for sequencing of unknown proteins in the 1980s, low-throughput Edman degradation followed by cloning still remains the main method used to sequence unknown proteins. The automated interpretation of MS/MS spectra has been limited by a focus on individual spectra and has not capitalized on the information contained in spectra of overlapping peptides. Indeed, the powerful shotgun DNA sequencing strategies have not been extended to automated protein sequencing. We demonstrate, for the first time, the feasibility of automated shotgun protein sequencing of protein mixtures by utilizing MS/MS spectra of overlapping and possibly modified peptides generated via multiple proteases of different specificities. We validate this approach by generating highly accurate de novo reconstructions of multiple regions of various proteins in western diamondback rattlesnake venom. We further argue that shotgun protein sequencing has the potential to overcome the limitations of current protein sequencing approaches and thus catalyze the otherwise impractical applications of proteomics methodologies in studies of unknown proteins.
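
    The shotgun analogy can be made concrete with a greedy overlap merge of peptide strings. Real pipelines assemble spectra, not strings, so this is only an illustration; the "reads" below are synthetic:

        def overlap(a, b, min_len=3):
            """Longest suffix of a that equals a prefix of b (at least min_len)."""
            best = 0
            for k in range(min_len, min(len(a), len(b)) + 1):
                if a[-k:] == b[:k]:
                    best = k
            return best

        def assemble(reads):
            """Greedily merge the pair with the largest overlap until none remain."""
            reads = list(reads)
            while len(reads) > 1:
                i, j, k = max(((i, j, overlap(a, b))
                               for i, a in enumerate(reads)
                               for j, b in enumerate(reads) if i != j),
                              key=lambda t: t[2])
                if k == 0:
                    break                 # nothing left to merge
                merged = reads[i] + reads[j][k:]
                reads = [r for n, r in enumerate(reads) if n not in (i, j)] + [merged]
            return reads

        # overlapping "reads" of the same region from different proteases (synthetic)
        print(assemble(["MKWVTFISLL", "TFISLLFLFS", "LFLFSSAYSR"]))
        # -> ['MKWVTFISLLFLFSSAYSR']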

  6. Foundations of the Bandera Abstraction Tools

    NASA Technical Reports Server (NTRS)

    Hatcliff, John; Dwyer, Matthew B.; Pasareanu, Corina S.; Robby

    2003-01-01

    Current research is demonstrating that model-checking and other forms of automated finite-state verification can be effective for checking properties of software systems. Due to the exponential costs associated with model-checking, multiple forms of abstraction are often necessary to obtain system models that are tractable for automated checking. The Bandera Tool Set provides multiple forms of automated support for compiling concurrent Java software systems to models that can be supplied to several different model-checking tools. In this paper, we describe the foundations of Bandera's data abstraction mechanism which is used to reduce the cardinality (and the program's state-space) of data domains in software to be model-checked. From a technical standpoint, the form of data abstraction used in Bandera is simple, and it is based on classical presentations of abstract interpretation. We describe the mechanisms that Bandera provides for declaring abstractions, for attaching abstractions to programs, and for generating abstracted programs and properties. The contributions of this work are the design and implementation of various forms of tool support required for effective application of data abstraction to software components written in a programming language like Java which has a rich set of linguistic features.
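
    A concrete instance of the kind of data abstraction described is the classical sign abstraction, where integers collapse to {NEG, ZERO, POS} and operations are lifted to sets of possible outcomes. This is a sketch of the textbook construction, not Bandera's Java tooling:

        NEG, ZERO, POS = "NEG", "ZERO", "POS"

        def alpha(n):                      # abstraction function: int -> sign
            return ZERO if n == 0 else (POS if n > 0 else NEG)

        def abs_add(a, b):                 # lifted '+' returns every possible sign
            table = {
                (NEG, NEG): {NEG},   (POS, POS): {POS},
                (ZERO, ZERO): {ZERO},
                (NEG, ZERO): {NEG},  (ZERO, NEG): {NEG},
                (POS, ZERO): {POS},  (ZERO, POS): {POS},
                (NEG, POS): {NEG, ZERO, POS}, (POS, NEG): {NEG, ZERO, POS},
            }
            return table[(a, b)]

        assert alpha(-7) == NEG
        # adding values of opposite sign loses precision: any sign is possible
        assert abs_add(alpha(3), alpha(-3)) == {NEG, ZERO, POS}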

  7. Automated Track Recognition and Event Reconstruction in Nuclear Emulsion

    NASA Technical Reports Server (NTRS)

    Deines-Jones, P.; Cherry, M. L.; Dabrowska, A.; Holynski, R.; Jones, W. V.; Kolganova, E. D.; Kudzia, D.; Nilsen, B. S.; Olszewski, A.; Pozharova, E. A.

    1998-01-01

    The major advantages of nuclear emulsion for detecting charged particles are its submicron position resolution and sensitivity to minimum ionizing particles. These must be balanced, however, against the difficult manual microscope measurement by skilled observers required for the analysis. We have developed an automated system to acquire and analyze the microscope images from emulsion chambers. Each emulsion plate is analyzed independently, allowing coincidence techniques to be used in order to reject background and estimate error rates. The system has been used to analyze a sample of high-multiplicity Pb-Pb interactions (charged particle multiplicities approx. 1100) produced by the 158 GeV/c per nucleon Pb-208 beam at CERN. Automatically reconstructed track lists agree with our best manual measurements to 3%. We describe the image analysis and track reconstruction techniques, and discuss the measurement and reconstruction uncertainties.

  8. Iterative pass optimization of sequence data

    NASA Technical Reports Server (NTRS)

    Wheeler, Ward C.

    2003-01-01

    The problem of determining the minimum-cost hypothetical ancestral sequences for a given cladogram is known to be NP-complete. This "tree alignment" problem has motivated the considerable effort placed in multiple sequence alignment procedures. Wheeler in 1996 proposed a heuristic method, direct optimization, to calculate cladogram costs without the intervention of multiple sequence alignment. This method, though more efficient in time and more effective in cladogram length than many alignment-based procedures, greedily optimizes nodes based on descendent information only. In their proposal of an exact multiple alignment solution, Sankoff et al. in 1976 described a heuristic procedure--the iterative improvement method--to create alignments at internal nodes by solving a series of median problems. The combination of a three-sequence direct optimization with iterative improvement and a branch-length-based cladogram cost procedure provides an algorithm that frequently results in superior (i.e., lower) cladogram costs. This iterative pass optimization is both computation and memory intensive, but economies can be made to reduce this burden. An example in arthropod systematics is discussed. © 2003 The Willi Hennig Society. Published by Elsevier Science (USA). All rights reserved.
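
    The iterative-improvement idea can be illustrated on a toy tree: internal sequences are repeatedly replaced by the median of their three neighbors until the total cladogram cost stops falling. This sketch sidesteps indels entirely (equal-length sequences, Hamming distance), unlike real tree alignment:

        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))

        def median3(a, b, c):              # per-position majority, ties -> first arg
            return "".join(x if (x == y or x == z) else (y if y == z else x)
                           for x, y, z in zip(a, b, c))

        # toy topology: internal node n0 joins leaves L1, L2 and node n1;
        # n1 joins n0 with leaves L3 and L4
        leaves = {"L1": "ACGT", "L2": "ACGA", "L3": "TCGA", "L4": "TCGT"}
        internal = {"n0": "AAAA", "n1": "AAAA"}          # arbitrary starting guesses
        edges = [("L1", "n0"), ("L2", "n0"), ("n0", "n1"), ("L3", "n1"), ("L4", "n1")]

        def seq(n): return leaves.get(n) or internal[n]
        def cost(): return sum(hamming(seq(u), seq(v)) for u, v in edges)

        best, improved = cost(), True
        while improved:                    # one "iterative pass" per loop turn
            improved = False
            internal["n0"] = median3(seq("L1"), seq("L2"), internal["n1"])
            internal["n1"] = median3(seq("L3"), seq("L4"), internal["n0"])
            if cost() < best:
                best, improved = cost(), True
        print(internal, best)              # converges to cost 3 on this toy input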

  9. [Novel Device for Creating Multiple Artificial Chordae Loops in Mitral Valve Repair].

    PubMed

    Shimamura, Yoshiei; Maisawa, Kazuma

    2017-08-01

    A novel device to create multiple artificial chordae loops for mitral repair is developed. The device consists of a circular metal base with a removable central rod on one end, which can easily be attached or removed by screwing into a hole located on the base, and 51 fixed rods placed radially around the central rod at distances of 10-60 mm from the central rod. A needle with CV-4 expanded polytetrafluoroethylene (ePTFE) suture is passed through a pledget, and the suture is looped from the central rod around the fixed rod located at the desired loop length. The needle is then passed back through the pledget. The suture is tied over the pledget, bringing it in contact with the central rod. When multiple loops of various lengths are required, different fixed rods located at distances corresponding to the required loop lengths are used. Following creation of the necessary loops, the central rod is unscrewed, and the loops are released from the device. Construction of artificial chordae with this device is quick, reliable, reproducible, and increases the technical possibilities for mitral valve repair.

  10. Pulsed dye laser double-pass treatment of patients with resistant capillary malformations.

    PubMed

    Rajaratnam, Ratna; Laughlin, Sharyn A; Dudley, Denis

    2011-07-01

    The pulsed dye laser is an effective and established treatment for port-wine stains and has become the generally accepted standard of care. However, in many cases, complete clearance cannot be achieved as a significant proportion of lesions become resistant to treatment. Multiple passes or pulse-stacking techniques have been used to improve the extent and rate of fading, but concerns over increased adverse effects have limited this clinical approach. In this work, a double-pass technique with the pulsed dye laser has been described, which may allow for increased depth of vascular injury, greater efficacy, and an acceptable risk profile. Our aim was to determine the efficacy and the rate of side-effects for a double-pass protocol with a pulsed dye laser (PDL) to treat patients previously treated with PDL and/or other laser modalities. A retrospective chart review was conducted of 26 patients treated with a minimum of three double-pass treatments alone, or in combination, with single pass conventional PDL. Almost half of the patients (n = 12) showed either a moderate or significant improvement in fading compared to pre-treatment photographs with the double-pass technique. In a further 12 patients, there was a mild improvement. In two patients, there was no change. Sixteen patients developed mild side-effects: blisters (n = 5), dry scabs (n = 11) and transient hyperpigmentation (n = 4). This preliminary experience suggests that a double-pass technique at defined intervals between the first and second treatment with PDL can further lighten some port-wine stains, which are resistant to conventional single-pass treatments. This technique may be a useful addition to the laser treatment of PWS and deserves further scrutiny with randomized prospective studies and histological analysis to confirm the increased depth of vascular injury.

  11. PRAIS: Distributed, real-time knowledge-based systems made easy

    NASA Technical Reports Server (NTRS)

    Goldstein, David G.

    1990-01-01

    This paper discusses an architecture for real-time, distributed (parallel) knowledge-based systems called the Parallel Real-time Artificial Intelligence System (PRAIS). PRAIS strives for transparently parallelizing production (rule-based) systems, even when under real-time constraints. PRAIS accomplishes these goals by incorporating a dynamic task scheduler, operating system extensions for fact handling, and message-passing among multiple copies of CLIPS executing on a virtual blackboard. This distributed knowledge-based system tool uses the portability of CLIPS and common message-passing protocols to operate over a heterogeneous network of processors.
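
    The following Python sketch illustrates the virtual-blackboard idea in miniature, and is not PRAIS itself: several worker processes stand in for copies of a rule engine and exchange facts with a scheduler through message-passing queues. The worker names, the fact strings, and the trivial "rule firing" are assumptions for illustration only.

      from multiprocessing import Process, Queue

      def rule_engine(worker_id, inbox, blackboard):
          while True:
              fact = inbox.get()
              if fact is None:          # sentinel: shut down this engine copy
                  break
              # Trivial stand-in for rule firing: derive a new fact from the input.
              blackboard.put((worker_id, f"derived({fact})"))

      if __name__ == "__main__":
          blackboard = Queue()
          inboxes = [Queue() for _ in range(3)]
          workers = [Process(target=rule_engine, args=(i, q, blackboard))
                     for i, q in enumerate(inboxes)]
          for w in workers:
              w.start()
          # Scheduler role: distribute incoming facts across the engine copies.
          facts = ["temp=high", "valve=open", "pump=off"]
          for i, fact in enumerate(facts):
              inboxes[i % 3].put(fact)
          for q in inboxes:
              q.put(None)
          for _ in facts:               # collect one derived fact per input
              print(blackboard.get())
          for w in workers:
              w.join()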

  12. Apparatus for measuring particle properties

    DOEpatents

    Rader, D.J.; Castaneda, J.N.; Grasser, T.W.; Brockmann, J.E.

    1998-08-11

    An apparatus is described for determining particle properties from detected light scattered by the particles. The apparatus uses a light beam with novel intensity characteristics to discriminate between particles that pass through the beam and those that pass through an edge of the beam. The apparatus can also discriminate between light scattered by one particle and light scattered by multiple particles. The particle's size can be determined from the intensity of the light scattered. The particle's velocity can be determined from the elapsed time between various intensities of the light scattered. 11 figs.

  13. The effects and outcomes of electrolyte disturbances and asphyxia on newborns hearing

    PubMed Central

    Liang, Chun; Hong, Qi; Jiang, Tao-Tao; Gao, Yan; Yao, Xiao-Fang; Luo, Xiao-Xing; Zhuo, Xiu-Hui; Shinn, Jennifer B.; Jones, Raleigh O.; Zhao, Hong-Bo; Lu, Guang-Jin

    2013-01-01

    Objective To determine the effect of electrolyte disturbances (ED) and asphyxia on infant hearing and hearing outcomes. Study Design We conducted newborn hearing screening with the transient evoked otoacoustic emission (TEOAE) test on a large scale (>5,000 infants). The effects of ED and asphyxia on infant hearing and hearing outcomes were evaluated. Result The pass rate of the TEOAE test was significantly reduced in preterm infants with ED (83.1%, multiple logistic regression analysis: P<0.01) but not in full-term infants with ED (93.6%, P=0.41). However, there was no significant reduction in the pass rate in infants with asphyxia (P=0.85). We further found that hypocalcaemia significantly reduced the pass rate of the TEOAE test (86.8%, P<0.01). In the follow-up recheck at 3 months of age, the pass rate remained low (44.4%, P<0.01). Conclusion ED is a high-risk factor for preterm infant hearing. Hypocalcaemia can produce more significant impairment with a low recovery rate. PMID:23648318

  14. The contaminant analysis automation robot implementation for the automated laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-12-31

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM to ready them for transport operations. A protocol of commands and events has been established for the Supervisor and Subsystems (GENISAS) software, which governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  15. Automated PET Radiotracer Manufacture on the BG75 System and Imaging Validation Studies of [18F]fluoromisonidazole ([18F]FMISO).

    PubMed

    Yuan, Hong; Frank, Jonathan E; Merrill, Joseph R; Hillesheim, Daniel A; Khachaturian, Mark H; Anzellotti, Atilio I

    2016-01-01

    The hypoxia PET tracer 1-[18F]fluoro-3-(2-nitro-1H-imidazol-1-yl)-propan-2-ol ([18F]FMISO) is the first radiotracer developed for hypoxia PET imaging and has shown promise for cancer diagnosis and prognosis. However, access to the [18F]FMISO radiotracer is limited due to the required cyclotron and radiochemistry expertise. This study aimed to develop an automated production method for the [18F]FMISO radiotracer on the novel fully automated BG75 platform and to validate its use on animal tumor models. [18F]FMISO was produced automatically with the dose synthesis cartridge on the BG75 system. Validation of [18F]FMISO hypoxia imaging functionality was conducted on two tumor mouse models (FaDu/U87 tumors). The distribution of [18F]FMISO within the tumor was further validated against the standard hypoxia marker EF5. The average radiochemical purity was (99±1)% and the average pH was 5.5±0.2, with other quality attributes passing standard criteria (n=12). Overall biodistribution of [18F]FMISO in both tumor models was consistent with reported studies, where the bladder and large intestines presented the highest activity at 90 min post injection. High spatial correlation was found between [18F]FMISO autoradiography and EF5 hypoxia staining, indicating high hypoxia specificity of [18F]FMISO. This study shows that qualified [18F]FMISO can be efficiently produced on the BG75 system in an automated "dose-on-demand" mode using single-dose disposable cards. The possibility of having a low-cost, automated system for manufacturing different radiotracers ([18F]fluoride production + synthesis + QC) will greatly enhance the potential for PET technology to reach new geographical areas and underserved patient populations. Copyright © Bentham Science Publishers.

  16. An automated approach for extracting Barrier Island morphology from digital elevation models

    NASA Astrophysics Data System (ADS)

    Wernette, Phillipe; Houser, Chris; Bishop, Michael P.

    2016-06-01

    The response and recovery of a barrier island to extreme storms depend on the elevation of the dune base and crest, both of which can vary considerably alongshore and through time. Quantifying the response to and recovery from storms requires that we first identify and differentiate the dune(s) from the beach and back-barrier, which in turn depends on accurate identification and delineation of the dune toe, crest and heel. The purpose of this paper is to introduce a multi-scale automated approach for extracting beach, dune (dune toe, dune crest and dune heel), and barrier island morphology. The automated approach introduced here extracts the shoreline and back-barrier shoreline based on elevation thresholds, and extracts the dune toe, dune crest and dune heel based on the average relative relief (RR) across multiple spatial scales of analysis. The multi-scale automated RR approach is more objective than traditional approaches because every pixel is analyzed across multiple computational scales and the identification of features is based on the calculated RR values. The RR approach outperformed contemporary approaches and represents a fast, objective means to define important beach and dune features for predicting barrier island response to storms. The RR method also does not require that the dune toe, crest, or heel be spatially continuous, which is important because dune morphology is naturally variable alongshore.
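
    A minimal NumPy sketch of the core computation, multi-scale relative relief, is given below for a one-dimensional cross-shore profile; the window sizes, the exact RR definition, and the crest-picking heuristic are illustrative assumptions rather than the paper's parameters.

      import numpy as np

      def relative_relief(z, half_width):
          rr = np.zeros(len(z))
          for i in range(len(z)):
              w = z[max(0, i - half_width): i + half_width + 1]
              zmin, zmax = w.min(), w.max()
              rr[i] = (z[i] - zmin) / (zmax - zmin) if zmax > zmin else 0.0
          return rr

      def multiscale_rr(z, half_widths=(2, 5, 10)):
          # Average RR across several computational scales of analysis.
          return np.mean([relative_relief(z, hw) for hw in half_widths], axis=0)

      # Synthetic cross-shore profile: gently sloping beach with a single dune.
      x = np.linspace(0.0, 200.0, 400)
      z = 3.0 * np.exp(-((x - 80.0) ** 2) / 150.0) + 0.01 * x
      rr = multiscale_rr(z)
      # Illustrative crest pick (not the paper's rule): high RR and high elevation.
      print("candidate dune crest at x =", round(x[int(np.argmax(rr * z))], 1))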

  17. Microbleed detection using automated segmentation (MIDAS): a new method applicable to standard clinical MR images.

    PubMed

    Seghier, Mohamed L; Kolanko, Magdalena A; Leff, Alexander P; Jäger, Hans R; Gregoire, Simone M; Werring, David J

    2011-03-23

    Cerebral microbleeds, visible on gradient-recalled echo (GRE) T2* MRI, have generated increasing interest as an imaging marker of small vessel diseases, with relevance for intracerebral bleeding risk or brain dysfunction. Manual rating methods have limited reliability and are time-consuming. We developed a new method for microbleed detection using automated segmentation (MIDAS) and compared it with a validated visual rating system. In thirty consecutive stroke service patients, standard GRE T2* images were acquired and manually rated for microbleeds by a trained observer. After spatially normalizing each patient's GRE T2* images into a standard stereotaxic space, the automated microbleed detection algorithm (MIDAS) identified cerebral microbleeds by explicitly incorporating an "extra" tissue class for abnormal voxels within a unified segmentation-normalization model. The agreement between manual and automated methods was assessed using the intraclass correlation coefficient (ICC) and Kappa statistic. We found that MIDAS had generally moderate to good agreement with the manual reference method for the presence of lobar microbleeds (Kappa = 0.43, improved to 0.65 after manual exclusion of obvious artefacts). Agreement for the number of microbleeds was very good for lobar regions: (ICC = 0.71, improved to ICC = 0.87). MIDAS successfully detected all patients with multiple (≥2) lobar microbleeds. MIDAS can identify microbleeds on standard MR datasets, and with an additional rapid editing step shows good agreement with a validated visual rating system. MIDAS may be useful in screening for multiple lobar microbleeds.

  18. A message passing kernel for the hypercluster parallel processing test bed

    NASA Technical Reports Server (NTRS)

    Blech, Richard A.; Quealy, Angela; Cole, Gary L.

    1989-01-01

    A Message-Passing Kernel (MPK) for the Hypercluster parallel-processing test bed is described. The Hypercluster is being developed at the NASA Lewis Research Center to support investigations of parallel algorithms and architectures for computational fluid and structural mechanics applications. The Hypercluster resembles the hypercube architecture except that each node consists of multiple processors communicating through shared memory. The MPK efficiently routes information through the Hypercluster, using a message-passing protocol when necessary and faster shared-memory communication whenever possible. The MPK also interfaces all of the processors with the Hypercluster operating system (HYCLOPS), which runs on a Front-End Processor (FEP). This approach distributes many of the I/O tasks to the Hypercluster processors and eliminates the need for a separate I/O support program on the FEP.
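
    The routing decision at the heart of the MPK can be illustrated with a short Python sketch (assumed interfaces, not the actual kernel code): a send operation takes the fast shared-memory path when sender and receiver share a node, and falls back to message passing otherwise.

      from collections import defaultdict, deque

      NODE_OF = {0: "node0", 1: "node0", 2: "node1", 3: "node1"}  # 2 CPUs per node

      shared_queues = defaultdict(deque)   # same-node, shared-memory path
      network_log = []                     # stand-in for the message-passing path

      def send(src, dst, payload):
          if NODE_OF[src] == NODE_OF[dst]:
              shared_queues[dst].append(payload)       # faster shared-memory path
          else:
              network_log.append((src, dst, payload))  # routed network message

      send(0, 1, "flow block A")   # same node: shared memory
      send(0, 3, "flow block B")   # different node: message passing
      print(list(shared_queues[1]), network_log)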

  19. Validation of the Omron M6 (HEM-7001-E) upper arm blood pressure measuring device according to the International Protocol in elderly patients.

    PubMed

    Altunkan, Sekip; Iliman, Nevzat; Altunkan, Erkan

    2008-04-01

    Despite the widespread use of automated self-measurement monitors, there is limited published evidence on their accuracy and reliability in different patient groups. The objective of this study was to evaluate the accuracy and reliability of the Omron M6 (HEM-7001-E) upper-arm blood pressure (BP) device against a mercury sphygmomanometer in elderly patients according to the criteria of the International Protocol. Thirty-three patients above 65 years of age, who were classified based on the BP categories of the International Protocol, were recruited for the study. BP measurements at the upper arm with the Omron M6 were compared with the results obtained by two trained observers using a mercury sphygmomanometer. Nine sequential BP measurements were taken. During the validation study, 99 measurements were obtained from 33 patients for comparison. The first phase was carried out on 15 patients and, as the device passed this phase, 18 more patients were selected. Mean discrepancies and standard deviations between the device and the sphygmomanometer were 1.4+/-5.3 mmHg for systolic BP (SBP) and -1.4+/-4.5 mmHg for diastolic BP (DBP) in the study group. The device passed phase 1 in 15 patients. In phase 2.1, of the total 99 comparisons, 76, 92, and 97 for SBP and 77, 94, and 99 for DBP were within 5, 10, and 15 mmHg, respectively. The Omron M6 passed phases 2.1 and 2.2 in the elderly group of patients. The Omron M6 (HEM-7001-E) upper-arm BP monitor passed according to the International Protocol criteria and can be recommended for use in elderly patients.
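
    The phase 2.1 tally described above reduces to counting device-observer differences that fall within 5, 10 and 15 mmHg; a small Python sketch with illustrative readings (not the study's data) follows.

      def protocol_bands(device, observer):
          # Count absolute differences within 5, 10 and 15 mmHg.
          diffs = [abs(d - o) for d, o in zip(device, observer)]
          return tuple(sum(x <= band for x in diffs) for band in (5, 10, 15))

      device_sbp   = [142, 135, 128, 150, 139, 131]
      observer_sbp = [140, 141, 127, 146, 133, 130]
      print(protocol_bands(device_sbp, observer_sbp))   # -> (4, 6, 6)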

  20. Transforming Our SMEX Organization by Way of Innovation, Standardization, and Automation

    NASA Technical Reports Server (NTRS)

    Madden, Maureen; Crouse, Pat; Carry, Everett; Esposito, Timothy; Parker, Jeffrey; Bradley, David

    2006-01-01

    NASA's Small Explorer (SMEX) Flight Operations Team (FOT) is currently tackling the challenge of supporting ground operations for several satellites that have surpassed their designed lifetimes on a dwindling budget. At Goddard Space Flight Center (GSFC), these missions are presently being reengineered into a fleet-oriented ground system. When complete, this ground system will provide command and control of four SMEX missions, and will demonstrate fleet automation and control concepts as a pathfinder for additional mission integrations. A goal of this reengineering effort is to demonstrate new ground-system technologies that show promise of supporting longer mission lifecycles and simplifying component integration. In pursuit of this goal, the SMEX organization has had to examine standardization, innovation, and automation. A core technology being demonstrated in this effort is the GSFC Mission Services Evolution Center (GMSEC) architecture. The GMSEC architecture focuses on providing standard interfaces for ground system applications to promote application interoperability. Building around commercial Message Oriented Middleware and providing a common messaging standard allows GMSEC to provide the capabilities necessary to support integration of new software components into existing missions and to increase the level of interaction within the system. For SMEX, GMSEC has become the technology platform to transform flight operations with the innovation and automation necessary to reduce operational costs. The automation technologies supported in SMEX are built upon capabilities provided by the GMSEC architecture that allow the FOT to further reduce the involvement of the console operator. Initially, SMEX is automating only routine operations, such as safety and health monitoring, basic commanding, and system recovery. The operational concepts being developed here will reduce the need for staffed passes and are a necessity for future fleet management. As this project continues to evolve, additional innovations beyond GMSEC and automation have been, and will continue to be, developed. The team developed techniques for migrating ground systems of existing on-orbit assets. The tools necessary to monitor and control software failures were integrated and tailored for operational environments. All this was done with a focus on extending fleet operations to missions beyond SMEX. The result of this work is the foundation for a broader fleet-capable ground system that will include several missions supported by the Space Science Mission Operations Project.

  1. Standardized Automated CO2/H2O Flux Systems for Individual Research Groups and Flux Networks

    NASA Astrophysics Data System (ADS)

    Burba, George; Begashaw, Israel; Fratini, Gerardo; Griessbaum, Frank; Kathilankal, James; Xu, Liukang; Franz, Daniela; Joseph, Everette; Larmanou, Eric; Miller, Scott; Papale, Dario; Sabbatini, Simone; Sachs, Torsten; Sakai, Ricardo; McDermitt, Dayle

    2017-04-01

    In recent years, spatial and temporal flux data coverage has improved significantly, and on multiple scales, from a single station to continental networks, due to standardization, automation, and management of data collection, and better handling of the extensive amounts of generated data. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are required to effectively and efficiently handle the entire process. Such tools are needed to maximize time dedicated to authoring publications and answering research questions, and to minimize time and expenses spent on data acquisition, processing, and quality control. Thus, these tools should produce standardized verifiable datasets and provide a way to cross-share the standardized data with external collaborators to leverage available funding, and to promote data analyses and publications. LI-COR gas analyzers are widely used in past and present flux networks such as AmeriFlux, ICOS, AsiaFlux, OzFlux, NEON, CarboEurope, and FluxNet-Canada. These analyzers have gone through several major improvements over the past 30 years. In 2016, however, a three-pronged development was completed to create an automated flux system which can accept multiple sonic anemometer and datalogger models, compute final and complete fluxes on-site, merge final fluxes with supporting weather, soil, and radiation data, monitor station outputs and send automated alerts to researchers, and allow secure sharing and cross-sharing of station and data access. Two types of these research systems were developed: open-path (LI-7500RS) and enclosed-path (LI-7200RS). Key developments included: • Improvement of gas analyzer performance • Standardization and automation of final flux calculations on-site, and in real time • Seamless integration with the latest site management and data sharing tools. In terms of gas analyzer performance, the RS analyzers are based on the established LI-7500/A and LI-7200 models, and the improvements focused on increased stability in the presence of contamination, refined temperature control and compensation, and more accurate fast gas concentration measurements. In terms of flux calculations, improvements focused on automating the on-site flux calculations using EddyPro® software run by a weatherized, fully digital microcomputer, SmartFlux 2. In terms of site management and data sharing, the development focused on web-based software, FluxSuite, which allows real-time station monitoring and data access by multiple users. The presentation will describe details of the key developments and will include results from field tests of the RS gas analyzer models in comparison with older models and control reference instruments.

  2. SPOT4 Management Centre

    NASA Technical Reports Server (NTRS)

    Labrune, Yves; Labbe, X.; Roussel, A.; Vielcanet, P.

    1994-01-01

    In the context of the CNES SPOT4 program, CISI is responsible in particular for the development of the SPOT4 Management Centre, part of the SPOT4 ground control system located at CNES Toulouse (France) and designed to provide simultaneous control over two satellites. The main operational activities are timed to synchronize with satellite visibilities (ten usable passes per day). The automatic capability of this system is achieved through agenda services (sequences of operations defined and planned by the operator). The SPOT4 Management Centre therefore requires only limited, efficient and secure human intervention for supervision and decision making. This paper emphasizes the main system characteristics, such as degree of automation, level of dependability and system parameterization.

  3. A Comparison of Automatic Parallelization Tools/Compilers on the SGI Origin 2000 Using the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Frumkin, Michael; Hribar, Michelle; Jin, Hao-Qiang; Waheed, Abdul; Yan, Jerry

    1998-01-01

    Porting applications to new high-performance parallel and distributed computing platforms is a challenging task. Since writing parallel code by hand is extremely time consuming and costly, porting codes would ideally be automated by using parallelization tools and compilers. In this paper, we compare the performance of the hand-written NAS Parallel Benchmarks against three parallel versions generated with the help of tools and compilers: 1) CAPTools, an interactive computer-aided parallelization tool that generates message-passing code; 2) the Portland Group's HPF compiler; and 3) compiler directives with the native FORTRAN77 compiler on the SGI Origin2000.

  4. flexplan: Mission Planning System for the Lunar Reconnaissance Orbiter

    NASA Technical Reports Server (NTRS)

    Barnoy, Assaf; Beech, Theresa

    2013-01-01

    flexplan is a mission planning and scheduling (MPS) tool that uses soft algorithms to define mission scheduling rules and constraints. This allows the operator to configure the tool for any mission without the need to modify or recompile code. In addition, flexplan uses an ID system to track every output on the schedule to the input from which it was generated. This allows flexplan to receive feedback as the schedules are executed, and update the status of all activities in a Web-based client. flexplan outputs include various planning reports, stored command loads for the Lunar Reconnaissance Orbiter (LRO), ephemeris loads, and pass scripts for automation.

  5. Parametric analysis of plastic strain and force distribution in single pass metal spinning

    NASA Astrophysics Data System (ADS)

    Choudhary, Shashank; Tejesh, Chiruvolu Mohan; Regalla, Srinivasa Prakash; Suresh, Kurra

    2013-12-01

    Metal spinning, also known as spin forming, is a sheet metal working process by which an axis-symmetric part can be formed from a flat sheet metal blank. Parts are produced by pressing a blunt-edged tool or roller onto the blank, which in turn is mounted on a rotating mandrel. This paper discusses the setup of a 3-D finite element simulation of single-pass metal spinning in LS-DYNA. Four parameters were considered, namely blank thickness, roller nose radius, feed ratio and mandrel speed, and the variations in forces and plastic strain were analysed using the full-factorial design of experiments (DOE) method of simulation experiments. For some of these DOE runs, physical experiments on extra deep drawing (EDD) sheet metal were carried out using an En31 tool on a lathe machine. Simulation results are able to predict the zone of unsafe thinning in the sheet and the high forming forces that hint at the necessity for less expensive, semi-automated machine tools to help the household and small-scale spinning workers widely prevalent in India.
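
    The full-factorial DOE enumeration over the four parameters can be sketched in a few lines of Python; the level values below are illustrative assumptions, not the paper's.

      from itertools import product

      levels = {
          "blank_thickness_mm":     [0.8, 1.0, 1.2],
          "roller_nose_radius_mm":  [4, 6],
          "feed_ratio_mm_per_rev":  [0.5, 1.0],
          "mandrel_speed_rpm":      [200, 400],
      }
      # Every combination of parameter levels is one simulation run.
      runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
      print(len(runs), "runs")   # 3 * 2 * 2 * 2 = 24
      print(runs[0])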

  6. Acoustic results of supersonic tip speed fan blade modification

    NASA Technical Reports Server (NTRS)

    Jutras, R. R.; Kazin, S. B.

    1974-01-01

    A supersonic tip speed single-stage fan was modified with the intent of reducing multiple pure tone (MPT), or buzz-saw, noise. There were three modifications to the blades from the original design. The modifications to the blades resulted in an increase in cascade throat area, causing the shock to start at a lower corrected fan speed. The acoustic results without acoustically absorbing liners showed a substantial reduction in multiple pure tone levels. However, an increase in blade passing frequency noise at takeoff fan speed accompanied the MPT reduction. The net result, however, was a reduction in the maximum 1000-foot (304.8 m) altitude level flyover PNL. For the case with acoustic treatment in the inlet outer wall, the takeoff noise increased relative to an acoustically treated baseline. This was largely due to the increased blade passing frequency noise, which was not effectively reduced by the liner.

  7. Method and apparatus for a catalytic firebox reactor

    DOEpatents

    Smith, Lance L.; Etemad, Shahrokh; Ulkarim, Hasan; Castaldi, Marco J.; Pfefferle, William C.

    2001-01-01

    A catalytic firebox reactor employing an exothermic catalytic reaction channel and multiple cooling conduits for creating a partially reacted fuel/oxidant mixture. An oxidation catalyst is deposited on the walls forming the boundary between the multiple cooling conduits and the exothermic catalytic reaction channel, on the side of the walls facing the exothermic catalytic reaction channel. This configuration allows the oxidation catalyst to be backside cooled by any fluid passing through the cooling conduits. The heat of reaction is added to both the fluid in the exothermic catalytic reaction channel and the fluid passing through the cooling conduits. After discharge of the fluids from the exothermic catalytic reaction channel, the fluids mix to create a single combined flow. A further innovation in the reactor incorporates geometric changes in the exothermic catalytic reaction channel to provide streamwise variation of the velocity of the fluids in the reactor.

  8. A Bayesian model averaging method for improving SMT phrase table

    NASA Astrophysics Data System (ADS)

    Duan, Nan

    2013-03-01

    Previous methods for improving translation quality by employing multiple SMT models are usually carried out as a second-pass decision procedure on hypotheses from multiple systems, using extra features instead of exploiting the features of existing models in more depth. In this paper, we propose translation model generalization (TMG), an approach that updates probability feature values for the translation model being used based on the model itself and a set of auxiliary models, aiming to alleviate the over-estimation problem and enhance translation quality in the first-pass decoding phase. We validate our approach for translation models based on auxiliary models built in two different ways. We also introduce novel probability variance features into the log-linear models for further improvements. Our approach can be developed independently and integrated into a current SMT pipeline directly. We demonstrate BLEU improvements on the NIST Chinese-to-English MT tasks for single-system decoding.
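
    One simple way to realize the generalization idea is linear interpolation of a main phrase table with auxiliary tables, which damps over-estimated probabilities; the Python sketch below uses an assumed weighting scheme and toy tables, and is not the paper's exact TMG formula.

      def generalize(main, auxiliaries, lam=0.6):
          # Interpolate the main table with auxiliary tables; missing phrase
          # pairs default to probability 0.0 in any table that lacks them.
          keys = set(main) | {k for aux in auxiliaries for k in aux}
          aux_w = (1.0 - lam) / len(auxiliaries)
          return {k: lam * main.get(k, 0.0)
                     + aux_w * sum(aux.get(k, 0.0) for aux in auxiliaries)
                  for k in keys}

      main = {("ni hao", "hello"): 0.9, ("xie xie", "thanks"): 0.4}
      aux1 = {("ni hao", "hello"): 0.6, ("xie xie", "thank you"): 0.5}
      aux2 = {("ni hao", "hi"): 0.7, ("xie xie", "thanks"): 0.5}
      print(generalize(main, [aux1, aux2]))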

  9. A Simple Automated Method for the Determination of Nitrate and Nitrite in Infant Formula and Milk Powder Using Sequential Injection Analysis

    PubMed Central

    Pistón, Mariela; Mollo, Alicia; Knochen, Moisés

    2011-01-01

    A fast and efficient automated method using a sequential injection analysis (SIA) system, based on the Griess reaction, was developed for the determination of nitrate and nitrite in infant formulas and milk powder. The system mixes a measured amount of sample (previously reconstituted in liquid form and deproteinized) with the chromogenic reagent to produce a colored substance whose absorbance is recorded. For nitrate determination, an on-line prereduction step was added by passing the sample through a Cd minicolumn. The system was controlled from a PC by means of a user-friendly program. Figures of merit include linearity (r² > 0.999 for both analytes), limits of detection (0.32 mg kg−1 NO3-N and 0.05 mg kg−1 NO2-N), and precision (sr% 0.8–3.0). Results were statistically in good agreement with those obtained with the reference ISO-IDF method. The sampling frequency was 30 hour−1 (nitrate) and 80 hour−1 (nitrite) when performed separately. PMID:21960750

  10. Automated locomotor activity monitoring as a quality control assay for mass-reared tephritid flies.

    PubMed

    Dominiak, Bernard C; Fanson, Benjamin G; Collins, Samuel R; Taylor, Phillip W

    2014-02-01

    The Sterile Insect Technique (SIT) requires vast numbers of consistently high quality insects to be produced over long periods. Quality control (QC) procedures are critical to effective SIT, both providing quality assurance and warning of operational deficiencies. We here present a potential new QC assay for mass rearing of Queensland fruit flies (Bactrocera tryoni Froggatt) for SIT; locomotor activity monitoring. We investigated whether automated locomotor activity monitors (LAMs) that simply detect how often a fly passes an infrared sensor in a glass tube might provide similar insights but with much greater economy. Activity levels were generally lower for females than for males, and declined over five days in the monitor for both sexes. Female activity levels were not affected by irradiation, but males irradiated at 60 or 70 Gy had reduced activity levels compared with unirradiated controls. We also found some evidence that mild heat shock of pupae results in adults with reduced activity. LAM offers a convenient, effective and economical assay to probe such changes. © 2013 Society of Chemical Industry.

  11. Sample registration software for process automation in the Neutron Activation Analysis (NAA) Facility in Malaysia nuclear agency

    NASA Astrophysics Data System (ADS)

    Rahman, Nur Aira Abd; Yussup, Nolida; Salim, Nazaratul Ashifa Bt. Abdullah; Ibrahim, Maslina Bt. Mohd; Mokhtar, Mukhlis B.; Soh@Shaari, Syirrazie Bin Che; Azman, Azraf B.; Ismail, Nadiah Binti

    2015-04-01

    Neutron Activation Analysis (NAA) has been established in Nuclear Malaysia since the 1980s. Most of the established procedures were carried out manually, including sample registration. The samples were recorded manually in a logbook and given ID numbers. Then all samples, standards, SRMs and blanks were recorded on the irradiation vial and on several forms prior to irradiation. These manual procedures carried out by the NAA laboratory personnel were time consuming and inefficient. Sample registration software was developed as part of the IAEA/CRP project 'Development of Process Automation in the Neutron Activation Analysis (NAA) Facility in Malaysia Nuclear Agency (RC17399)'. The objective of the project is to create PC-based data-entry software for the sample preparation stage. This is an effective way to replace the redundant manual data entries that need to be completed by laboratory personnel. The software automatically generates a sample code for each sample in a batch, creates printable registration forms for administration purposes, and stores selected parameters that are passed to the sample analysis program. The software is developed using National Instruments LabVIEW 8.6.

  12. On grey levels in random CAPTCHA generation

    NASA Astrophysics Data System (ADS)

    Newton, Fraser; Kouritzin, Michael A.

    2011-06-01

    A CAPTCHA is an automatically generated test designed to distinguish between humans and computer programs; specifically, they are designed to be easy for humans but difficult for computer programs to pass in order to prevent the abuse of resources by automated bots. They are commonly seen guarding webmail registration forms, online auction sites, and preventing brute force attacks on passwords. In the following, we address the question: How does adding a grey level to random CAPTCHA generation affect the utility of the CAPTCHA? We treat the problem of generating the random CAPTCHA as one of random field simulation: An initial state of background noise is evolved over time using Gibbs sampling and an efficient algorithm for generating correlated random variables. This approach has already been found to yield highly-readable yet difficult-to-crack CAPTCHAs. We detail how the requisite parameters for introducing grey levels are estimated and how we generate the random CAPTCHA. The resulting CAPTCHA will be evaluated in terms of human readability as well as its resistance to automated attacks in the forms of character segmentation and optical character recognition.
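
    A minimal Python sketch of the approach follows: Gibbs sampling of a grey-level, Potts-style random field in which each pixel's conditional distribution favours agreement with its neighbours. The coupling constant, number of grey levels, field size, and sweep count are assumptions for illustration, not the paper's estimated parameters.

      import numpy as np

      rng = np.random.default_rng(0)
      H, W, LEVELS, BETA = 32, 96, 4, 1.2            # size, grey levels, coupling
      field = rng.integers(0, LEVELS, size=(H, W))   # start from white noise

      def neighbors(i, j):
          return [(i + di, j + dj)
                  for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                  if 0 <= i + di < H and 0 <= j + dj < W]

      for sweep in range(20):                  # Gibbs sweeps over every pixel
          for i in range(H):
              for j in range(W):
                  nb = [field[a, b] for a, b in neighbors(i, j)]
                  # Conditional distribution favours agreement with neighbours.
                  agree = np.array([sum(v == g for v in nb) for g in range(LEVELS)])
                  p = np.exp(BETA * agree)
                  field[i, j] = rng.choice(LEVELS, p=p / p.sum())

      print(field[:4, :8])   # top-left corner of the correlated grey-level field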

  13. Electronic drop sensing in microfluidic devices: automated operation of a nanoliter viscometer

    PubMed Central

    Srivastava, Nimisha; Burns, Mark A.

    2007-01-01

    We describe three droplet sensing techniques: a digital electrode, an analog electrode, and a thermal method. All three techniques use a single layer of metal lines that is easy to microfabricate and an electronic signal can be produced using low DC voltages. While the electrode methods utilize changes in electrical conductivity when the air/liquid interface of the droplet passes over a pair of electrodes, the thermal method is based on convective heat loss from a locally heated region. For the electrode method, the analog technique is able to detect 25 nL droplets while the digital technique is capable of detecting droplets as small as 100 pL. For thermal sensing, temperature profiles in the range of 36 °C and higher were used. Finally, we have used the digital electrode method and an array of electrodes located at preset distances to automate the operation of a previously described microfluidic viscometer. The viscometer is completely controlled by a laptop computer, and the total time for operation including setup, calibration, sample addition and viscosity calculation is approximately 4 minutes. PMID:16738725

  14. Hurricane Juliette

    Atmospheric Science Data Center

    2013-04-19

    ... right is the cloud-top height field derived using automated computer processing of the data from multiple MISR cameras. Relative height ... NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Science Mission Directorate, Washington, D.C. The Terra spacecraft is managed ...

  15. A Liquid-Handling Robot for Automated Attachment of Biomolecules to Microbeads.

    PubMed

    Enten, Aaron; Yang, Yujia; Ye, Zihan; Chu, Ryan; Van, Tam; Rothschild, Ben; Gonzalez, Francisco; Sulchek, Todd

    2016-08-01

    Diagnostics, drug delivery, and other biomedical industries rely on cross-linking ligands to microbead surfaces. Microbead functionalization requires multiple steps of liquid exchange, incubation, and mixing, which are laborious and time intensive. Although automated systems exist, they are expensive and cumbersome, limiting their routine use in biomedical laboratories. We present a small, bench-top robotic system that automates microparticle functionalization and streamlines sample preparation. The robot uses a programmable microcontroller to regulate liquid exchange, incubation, and mixing functions. Filters with a pore diameter smaller than the minimum bead diameter are used to prevent bead loss during liquid exchange. The robot uses three liquid reagents and processes up to 10^7 microbeads per batch. The effectiveness of microbead functionalization was compared with a manual covalent coupling process and evaluated via flow cytometry and fluorescent imaging. The mean percentages of successfully functionalized beads were 91% and 92% for the robot and manual methods, respectively, with less than 5% bead loss. Although the two methods share similar qualities, the automated approach required approximately 10 min of active labor, compared with 3 h for the manual approach. These results suggest that a low-cost, automated microbead functionalization system can streamline sample preparation with minimal operator intervention. © 2015 Society for Laboratory Automation and Screening.

  16. Automated single-trial assessment of laser-evoked potentials as an objective functional diagnostic tool for the nociceptive system.

    PubMed

    Hatem, S M; Hu, L; Ragé, M; Gierasimowicz, A; Plaghki, L; Bouhassira, D; Attal, N; Iannetti, G D; Mouraux, A

    2012-12-01

    To assess the clinical usefulness of an automated analysis of event-related potentials (ERPs). Nociceptive laser-evoked potentials (LEPs) and non-nociceptive somatosensory electrically-evoked potentials (SEPs) were recorded in 37 patients with syringomyelia and 21 controls. LEP and SEP peak amplitudes and latencies were estimated using a single-trial automated approach based on time-frequency wavelet filtering and multiple linear regression, as well as a conventional approach based on visual inspection. The amplitudes and latencies of normal and abnormal LEP and SEP peaks were identified reliably using both approaches, with similar sensitivity and specificity. Because the automated approach provided an unbiased solution to account for average waveforms where no ERP could be identified visually, it revealed significant differences between patients and controls that were not revealed using the visual approach. The automated analysis of ERPs characterized reliably and objectively LEP and SEP waveforms in patients. The automated single-trial analysis can be used to characterize normal and abnormal ERPs with a similar sensitivity and specificity as visual inspection. While this does not justify its use in a routine clinical setting, the technique could be useful to avoid observer-dependent biases in clinical research. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Total nucleated cell and leukocyte differential counts in canine pleural and peritoneal fluid and equine synovial fluid samples: comparison of automated and manual methods.

    PubMed

    Brudvig, Jean M; Swenson, Cheryl L

    2015-12-01

    Rapid and precise measurement of total and differential nucleated cell counts is a crucial diagnostic component of cavitary and synovial fluid analyses. The objectives of this study included (1) evaluation of reliability and precision of canine and equine fluid total nucleated cell count (TNCC) determined by the benchtop Abaxis VetScan HM5, in comparison with the automated reference instruments ADVIA 120 and the scil Vet abc, respectively, and (2) comparison of automated with manual canine differential nucleated cell counts. The TNCC and differential counts in canine pleural and peritoneal, and equine synovial fluids were determined on the Abaxis VetScan HM5 and compared with the ADVIA 120 and Vet abc analyzer, respectively. Statistical analyses included correlation, least squares fit linear regression, Passing-Bablok regression, and Bland-Altman difference plots. In addition, precision of the total cell count generated by the VetScan HM5 was determined. Agreement was excellent without significant constant or proportional bias for canine cavitary fluid TNCC. Automated and manual differential counts had R² < 0.5 for individual cell types (least squares fit linear regression). Equine synovial fluid TNCC agreed but with some bias due to the VetScan HM5 overestimating TNCC compared to the Vet abc. Intra-assay precision of the VetScan HM5 in 3 fluid samples was 2-31%. The Abaxis VetScan HM5 provided rapid, reliable TNCC for canine and equine fluid samples. The differential nucleated cell count should be verified microscopically as counts from the VetScan HM5 and also from the ADVIA 120 were often incorrect in canine fluid samples. © 2015 American Society for Veterinary Clinical Pathology.
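
    The Bland-Altman agreement statistics used above reduce to a bias (mean difference) and 95% limits of agreement; a short NumPy sketch with illustrative paired counts (not the study's data) follows.

      import numpy as np

      vetscan   = np.array([2.1, 5.4, 9.8, 1.2, 14.0, 7.7])  # TNCC, 10^9 cells/L
      reference = np.array([2.0, 5.9, 9.1, 1.4, 13.2, 7.5])

      diff = vetscan - reference
      bias = diff.mean()                      # mean difference between methods
      loa = 1.96 * diff.std(ddof=1)           # half-width of the 95% limits
      print(f"bias = {bias:.2f}, limits of agreement = "
            f"[{bias - loa:.2f}, {bias + loa:.2f}]")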

  18. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    PubMed

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

    We implemented an automated and efficient open-source software package for the analysis of multi-site neuronal spike signals. The package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio based on the .NET Framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT and text files containing recorded or generated time-series spike data. SPICODYN processes such electrophysiological signals focusing on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented, dealing with multiple time delays (temporal extension) and with multiple binary patterns (high-order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing automated processing in those specific cases. The optimized implementation of the Delayed Transfer Entropy and High-Order Transfer Entropy algorithms allows accurate and rapid analysis of multiple spike trains from thousands of electrodes.
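
    The delayed transfer entropy at the core of the connectivity analysis can be sketched compactly for binary spike trains, as below; the plug-in probability estimator and single-sample embedding are simplifications of SPICODYN's implementation, and the test signal is synthetic.

      import numpy as np
      from collections import Counter

      def transfer_entropy(x, y, delay=1):
          # Plug-in estimate over joint symbols (y_{t+1}, y_t, x_{t-delay+1}).
          triples = Counter()
          for t in range(delay, len(y) - 1):
              triples[(y[t + 1], y[t], x[t - delay + 1])] += 1
          n = sum(triples.values())
          p3 = {k: v / n for k, v in triples.items()}
          def marg(keep):
              out = Counter()
              for k, v in p3.items():
                  out[tuple(k[i] for i in keep)] += v
              return out
          p_yy, p_yx, p_y = marg((0, 1)), marg((1, 2)), marg((1,))
          return sum(p * np.log2(p * p_y[(y0,)] / (p_yy[(y1, y0)] * p_yx[(y0, x0)]))
                     for (y1, y0, x0), p in p3.items())

      rng = np.random.default_rng(1)
      x = rng.integers(0, 2, 5000)
      y = np.roll(x, 1)          # y repeats x with a one-step lag
      y[0] = 0
      print(round(transfer_entropy(x, y, delay=1), 3))   # close to 1 bit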

  19. Imaging mass spectrometry data reduction: automated feature identification and extraction.

    PubMed

    McDonnell, Liam A; van Remoortere, Alexandra; de Velde, Nico; van Zeijl, René J M; Deelder, André M

    2010-12-01

    Imaging MS now enables the parallel analysis of hundreds of biomolecules, spanning multiple molecular classes, which allows tissues to be described by their molecular content and distribution. When combined with advanced data analysis routines, tissues can be analyzed and classified based solely on their molecular content. Such molecular histology techniques have been used to distinguish regions with differential molecular signatures that could not be distinguished using established histologic tools. However, its potential to provide an independent, complementary analysis of clinical tissues has been limited by the very large file sizes and large number of discrete variables associated with imaging MS experiments. Here we demonstrate data reduction tools, based on automated feature identification and extraction, for peptide, protein, and lipid imaging MS, using multiple imaging MS technologies, that reduce data loads and the number of variables by >100×, and that highlight highly-localized features that can be missed using standard data analysis strategies. It is then demonstrated how these capabilities enable multivariate analysis on large imaging MS datasets spanning multiple tissues. Copyright © 2010 American Society for Mass Spectrometry. Published by Elsevier Inc. All rights reserved.

  20. Thermionic nuclear reactor with internal heat distribution and multiple duct cooling

    DOEpatents

    Fisher, C.R.; Perry, L.W. Jr.

    1975-11-01

    A Thermionic Nuclear Reactor is described having multiple ribbon-like coolant ducts passing through the core, intertwined among the thermionic fuel elements to provide independent cooling paths. Heat pipes are disposed in the core between and adjacent to the thermionic fuel elements and the ribbon ducting, for the purpose of more uniformly distributing the heat of fission among the thermionic fuel elements and the ducts.

  1. Automated Microfluidic Filtration and Immunocytochemistry Detection System for Capture and Enumeration of Circulating Tumor Cells and Other Rare Cell Populations in Blood.

    PubMed

    Pugia, Michael; Magbanua, Mark Jesus M; Park, John W

    2017-01-01

    Isolation by size using a filter membrane offers an antigen-independent method for capturing rare cells present in blood of cancer patients. Multiple cell types, including circulating tumor cells (CTCs), captured on the filter membrane can be simultaneously identified via immunocytochemistry (ICC) analysis of specific cellular biomarkers. Here, we describe an automated microfluidic filtration method combined with a liquid handling system for sequential ICC assays to detect and enumerate non-hematologic rare cells in blood.

  2. On automating domain connectivity for overset grids

    NASA Technical Reports Server (NTRS)

    Chiu, Ing-Tsau

    1994-01-01

    An alternative method for domain connectivity among systems of overset grids is presented. Reference uniform Cartesian systems of points are used to achieve highly efficient domain connectivity, and form the basis for a future fully automated system. The Cartesian systems are used to approximate body surfaces and to map the computational space of component grids. By exploiting the characteristics of Cartesian systems, Chimera-type hole-cutting and identification of donor elements for intergrid boundary points can be carried out very efficiently. The method is tested for a range of geometrically complex multiple-body overset grid systems.
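
    The efficiency argument can be made concrete: in a uniform Cartesian reference system, the candidate donor cell for any point follows from floor arithmetic rather than a global search. A minimal Python sketch with assumed grid parameters follows.

      import math

      def cartesian_cell(point, origin, spacing, dims):
          # Floor arithmetic maps a point straight to its candidate donor cell.
          idx = tuple(math.floor((p - o) / spacing) for p, o in zip(point, origin))
          if all(0 <= i < n for i, n in zip(idx, dims)):
              return idx
          return None   # point lies outside this component's Cartesian map

      origin, spacing, dims = (0.0, 0.0, 0.0), 0.25, (40, 40, 40)
      print(cartesian_cell((1.3, 2.7, 0.9), origin, spacing, dims))   # (5, 10, 3)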

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Mathew; Bowen, Brian; Coles, Dwight

    The Middleware Automated Deployment Utilities consist of these three components. MAD: a utility designed to automate the deployment of Java applications to multiple Java application servers; the product contains a front-end web utility and back-end deployment scripts. MAR: a web front end to maintain and update the components inside the database. MWR-Encrypt: a web utility to convert a text string to an encrypted string that is used by the Oracle WebLogic application server; the encryption is done using the built-in functions of the Oracle WebLogic product and is mainly used to create an encrypted version of a database password.

  4. The historic predictive value of Canadian orthopedic surgery residents' orthopedic in-training examination scores on their success on the RCPSC certification examination.

    PubMed

    Yen, David; Athwal, George S; Cole, Gary

    2014-08-01

    Positive correlation between the orthopedic in-training examination (OITE) and success in the American Board of Orthopaedic Surgery examination has been reported. Canadian training programs in internal medicine, anesthesiology and urology have found a positive correlation between in-training examination scores and performance on the Royal College of Physicians and Surgeons of Canada (RCPSC) certification examination. We sought to determine the potential predictive value of the OITE scores of Canadian orthopedic surgery residents on their success on their RCPSC examinations. A total of 118 Canadian orthopedic surgery residents had their annual OITE scores during their 5 years of training matched to the RCPSC examination oral and multiple-choice questions and to overall examination pass/fail scores. We calculated Pearson correlations between the in-training examination for each postgraduate year and the certification oral and multiple-choice questions and pass/fail marks. There was a predictive association between the OITE and success on the RCPSC examination. The association was strongest between the OITE and the written multiple-choice examination and weakest between the OITE and the overall examination pass/fail marks. Overall, the OITE was able to provide useful feedback to Canadian orthopedic surgery residents and their training programs in preparing them for their RCPSC examinations. However, when these data were collected, truly normative data based on a Canadian sample were not available. Further study is warranted based on a more refined analysis of the OITE, which is now being produced and includes normative percentile data based on Canadian residents.

  5. MODIS polarization performance and anomalous four-cycle polarization phenomenon

    NASA Astrophysics Data System (ADS)

    Young, James B.; Knight, Ed; Merrow, Cindy

    1998-10-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) will be one of the primary instruments observing the earth on the Earth Observing System (EOS) scheduled for launch in 1999. MODIS polarization performance characterization was required for the 0.4 to 0.6 micrometer (VIS), 0.6 to 1.0 micrometer (NIR), and 1.0 to 2.3 micrometer (SWIR) regions. A polarized source assembly (PSA), consisting of a collimator with a rotatable Ahrens polarizer, was used to illuminate MODIS with a linearly polarized beam. A MODIS signal function having two cycles per 360 degrees of prism rotation was expected. However, some spectral bands had a distinct four-cycle anomalous signal. The expected two-cycle function was present in all regions, with the four-cycle anomaly being limited to the NIR region. Fourier analysis was a very useful tool in determining the cause of the anomaly. A simplified polarization model of the PSA and MODIS was generated using the Mueller matrix-Stokes vector formalism. Parametric modeling illustrated that this anomaly could be produced by energy making multiple passes between the PSA Ahrens prism and the MODIS focal-plane filters. Furthermore, the model gave NIR four-cycle magnitudes that were consistent with observations. The VIS and SWIR optical trains had birefringent elements that served to scramble the multiple-pass anomaly. The model's validity was demonstrated with an experimental setup that had partial-aperture illumination, which eliminated the possibility of multiple passes. The four-cycle response was eliminated while producing the same two-cycle polarization response. Data will be shown to illustrate the four-cycle phenomenon.
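
    The Fourier separation of two-cycle and four-cycle components can be illustrated with a short NumPy sketch; the amplitudes below are synthetic, chosen only to mimic the reported anomaly, not measured MODIS values.

      import numpy as np

      theta = np.deg2rad(np.arange(0, 360, 5))           # prism rotation angle
      signal = 1.0 + 0.30 * np.cos(2 * theta) + 0.05 * np.cos(4 * theta)

      spectrum = np.fft.rfft(signal) / len(signal)
      for k in (2, 4):                                   # cycles per 360 degrees
          print(f"{k}-cycle amplitude: {2 * abs(spectrum[k]):.3f}")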

  6. Standard setting: comparison of two methods.

    PubMed

    George, Sanju; Haque, M Sayeed; Oyebode, Femi

    2006-09-14

    The outcome of assessments is determined by the standard-setting method used. There is a wide range of standard-setting methods and the two used most extensively in undergraduate medical education in the UK are the norm-reference and the criterion-reference methods. The aims of the study were to compare these two standard-setting methods for a multiple-choice question examination and to estimate the test-retest and inter-rater reliability of the modified Angoff method. The norm-reference method of standard-setting (mean minus 1 SD) was applied to the 'raw' scores of 78 4th-year medical students on a multiple-choice examination (MCQ). Two panels of raters also set the standard using the modified Angoff method for the same multiple-choice question paper on two occasions (6 months apart). We compared the pass/fail rates derived from the norm reference and the Angoff methods and also assessed the test-retest and inter-rater reliability of the modified Angoff method. The pass rate with the norm-reference method was 85% (66/78) and that by the Angoff method was 100% (78 out of 78). The percentage agreement between Angoff method and norm-reference was 78% (95% CI 69% - 87%). The modified Angoff method had an inter-rater reliability of 0.81-0.82 and a test-retest reliability of 0.59-0.74. There were significant differences in the outcomes of these two standard-setting methods, as shown by the difference in the proportion of candidates that passed and failed the assessment. The modified Angoff method was found to have good inter-rater reliability and moderate test-retest reliability.
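
    The two cut-score rules compared above can be sketched directly; in the Python fragment below, the cohort scores and the judges' per-item estimates are illustrative assumptions, not the study's data.

      import statistics as st

      scores = [62, 71, 55, 80, 67, 74, 59, 69, 73, 65]   # raw MCQ scores (%)
      norm_cutoff = st.mean(scores) - st.stdev(scores)    # mean minus 1 SD

      # Modified Angoff: judges estimate, per item, the probability that a
      # minimally competent candidate answers correctly; the cutoff is the
      # mean estimate scaled to the score range.
      judge_estimates = [
          [0.6, 0.5, 0.7, 0.4, 0.8],    # judge 1, five items
          [0.5, 0.6, 0.6, 0.5, 0.7],    # judge 2, same five items
      ]
      angoff_cutoff = 100 * st.mean(p for row in judge_estimates for p in row)

      passed = sum(s >= norm_cutoff for s in scores)
      print(f"norm-reference cutoff {norm_cutoff:.1f} -> {passed}/{len(scores)} pass; "
            f"Angoff cutoff {angoff_cutoff:.1f}")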

  7. Automated Guided-Wave Scanning Developed to Characterize Materials and Detect Defects

    NASA Technical Reports Server (NTRS)

    Martin, Richard E.; Gyekenyesi, Andrew L.; Roth, Don J.

    2004-01-01

    The Nondestructive Evaluation (NDE) Group of the Optical Instrumentation Technology Branch at the NASA Glenn Research Center has developed a scanning system that uses guided waves to characterize materials and detect defects. The technique uses two ultrasonic transducers to interrogate the condition of a material. The sending transducer introduces an ultrasonic pulse at a point on the surface of the specimen, and the receiving transducer detects the signal after it has passed through the material. The aim of the method is to correlate certain parameters of the detected waveform, in both the time and frequency domains, to characteristics of the material between the two transducers. The scanning system is shown. The waveform parameters of interest include the attenuation due to internal damping, waveform shape parameters, and frequency shifts due to material changes. For the most part, guided waves are used to gauge the damage state and defect growth of materials subjected to various mechanical or environmental loads. The technique has been applied to polymer matrix composites, ceramic matrix composites, and metal matrix composites as well as metallic alloys. Historically, guided-wave analysis has been a point-by-point, manual technique with waveforms collected at discrete locations and postprocessed. Data collection and analysis of this type limit the amount of detail that can be obtained. Also, the manual movement of the sensors is prone to user error and is time consuming. The development of an automated guided-wave scanning system has allowed the method to be applied to a wide variety of materials in a consistent, repeatable manner. Experimental studies have been conducted to determine the repeatability of the system and to compare the results with those obtained using more traditional NDE methods. The following screen capture shows guided-wave scan results for a ceramic matrix composite plate, including images for each of nine calculated parameters. The system can display up to 18 different wave parameters. Multiple scans of the test specimen demonstrated excellent repeatability in the measurement of all the guided-wave parameters, far exceeding the traditional point-by-point technique. In addition, the scan was able to detect a subsurface defect that was confirmed using flash thermography. This technology is being further refined to provide a more robust and efficient software environment. Future hardware upgrades will allow for multiple receiving transducers and the ability to scan more complex surfaces. This work supports composite materials development and testing under the Ultra-Efficient Engine Technology (UEET) Project, but it will also be applied to other material systems under development for a wide range of applications.

  8. Flexible software architecture for user-interface and machine control in laboratory automation.

    PubMed

    Arutunian, E B; Meldrum, D R; Friedman, N A; Moody, S E

    1998-10-01

    We describe a modular, layered software architecture for automated laboratory instruments. The design consists of a sophisticated user interface, a machine controller and multiple individual hardware subsystems, each interacting through a client-server architecture built entirely on top of open Internet standards. In our implementation, the user-interface components are built as Java applets that are downloaded from a server integrated into the machine controller. The user-interface client can thereby provide laboratory personnel with a familiar environment for experiment design through a standard World Wide Web browser. Data management and security are seamlessly integrated at the machine-controller layer using QNX, a real-time operating system. This layer also controls hardware subsystems through a second client-server interface. This architecture has proven flexible and relatively easy to implement and allows users to operate laboratory automation instruments remotely through an Internet connection. The software architecture was implemented and demonstrated on the Acapella, an automated fluid-sample-processing system that is under development at the University of Washington.

  9. Modelling of human-machine interaction in equipment design of manufacturing cells

    NASA Astrophysics Data System (ADS)

    Cochran, David S.; Arinez, Jorge F.; Collins, Micah T.; Bi, Zhuming

    2017-08-01

    This paper proposes a systematic approach to model human-machine interactions (HMIs) in supervisory control of machining operations; it characterises the coexistence of machines and humans for an enterprise to balance the goals of automation/productivity and flexibility/agility. In the proposed HMI model, an operator is associated with a set of behavioural roles as a supervisor for multiple, semi-automated manufacturing processes. The model is innovative in the sense that (1) it represents an HMI based on its functions for process control but provides the flexibility for ongoing improvements in the execution of manufacturing processes; (2) it provides a computational tool to define functional requirements for an operator in HMIs. The proposed model can be used to design production systems at different levels of an enterprise architecture, particularly at the machine level in a production system where operators interact with semi-automation to accomplish the goal of 'autonomation' - automation that augments the capabilities of human beings.

  10. An anatomy of industrial robots and their controls

    NASA Astrophysics Data System (ADS)

    Luh, J. Y. S.

    1983-02-01

    The modernization of manufacturing facilities by means of automation represents an approach for increasing productivity in industry. The three existing types of automation are related to continuous process controls, the use of transfer conveyor methods, and the employment of programmable automation for the low-volume batch production of discrete parts. Industrial robots, which are defined as computer-controlled mechanical manipulators, belong to the area of programmable automation. Typically, the robots perform tasks of arc welding, paint spraying, or foundry operations. One may assign a robot to perform a variety of job assignments simply by changing the appropriate computer program. The present investigation evaluates the potential of the robot on the basis of its basic structure and controls. It is found that robots function well in limited areas of industry. If the range of tasks which robots can perform is to be expanded, it is necessary to provide multiple-task sensors, special tooling, or even automatic tooling.

  11. Experimental research control software system

    NASA Astrophysics Data System (ADS)

    Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.

    2014-05-01

    A software system, intended for the automation of small-scale research, has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort needed to automate an experimental setup. In particular, minimal programming skills are required, and scripts are easy for supervisors to review. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language. This approach eases the implementation of complex control and data acquisition algorithms. A modular interface library performs interaction with external interfaces; while the most widely used interfaces are already implemented, a simple framework allows fast implementation of new software and hardware interfaces. The software is in continuous development, with new features being implemented, and is already used in our laboratory for the automation of helium-3 cryostat control and data acquisition. The software is open source and distributed under the GNU General Public License.
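    The script-driven approach might look like the following sketch, in which user scripts call instruments through a shared registry and several scripts run concurrently. All names and the toy driver are illustrative assumptions; the actual system uses its own scripting language and interface library.

```python
# Sketch: user scripts call named instrument interfaces from a registry,
# and several scripts run concurrently. Names are illustrative.
import threading
import time

class InstrumentRegistry:
    def __init__(self):
        self._drivers = {}
        self._lock = threading.Lock()
    def register(self, name, driver):
        self._drivers[name] = driver
    def call(self, name, command):
        with self._lock:                  # serialize hardware access
            return self._drivers[name](command)

registry = InstrumentRegistry()
registry.register("thermometer", lambda cmd: 4.2)   # stand-in driver

def user_script(label, reg):
    for _ in range(3):
        print(label, "T =", reg.call("thermometer", "READ?"))
        time.sleep(0.1)

threads = [threading.Thread(target=user_script, args=(f"script{i}", registry))
           for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```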

  12. Multiple sort flow cytometer

    DOEpatents

    Engh, G. van den; Esposito, R.J.

    1996-01-09

    A flow cytometer utilizes multiple lasers for excitation and respective fluorescence of identified dyes bonded to specific cells or events to identify and verify multiple events to be sorted from a sheath flow and droplet stream. Once identified, verified and timed in the sheath flow, each event is independently tagged upon separation from the flow by an electrical charge of +60, +120, or +180 volts and passed through oppositely charged deflection plates with ground planes to yield a focused six-way deflection of at least six events in a narrow plane. 8 figs.

  13. Using the arthroscopic surgery skill evaluation tool as a pass-fail examination.

    PubMed

    Koehler, Ryan J; Nicandri, Gregg T

    2013-12-04

    Examination of arthroscopic skill requires evaluation tools that are valid and reliable, with clear criteria for passing. The Arthroscopic Surgery Skill Evaluation Tool was developed as a video-based assessment of technical skill with criteria for passing established by a panel of experts. The purpose of this study was to test the validity and reliability of the Arthroscopic Surgery Skill Evaluation Tool as a pass-fail examination of arthroscopic skill. Twenty-eight residents and two sports medicine faculty members were recorded performing diagnostic knee arthroscopy on a left and a right cadaveric specimen in our arthroscopic skills laboratory. Procedure videos were evaluated with use of the Arthroscopic Surgery Skill Evaluation Tool by two raters blind to subject identity. Subjects were considered to pass when they attained scores of ≥ 3 on all eight assessment domains. The raters agreed on a pass-fail rating for fifty-five of sixty videos, with an intraclass correlation coefficient of 0.83. Ten of thirty participants were assigned passing scores by both raters for both diagnostic arthroscopies performed in the laboratory. Receiver operating characteristic analysis demonstrated that logging more than eighty arthroscopic cases overall, or more than thirty-five arthroscopic knee cases, was predictive of attaining a passing score on both procedures. The Arthroscopic Surgery Skill Evaluation Tool is therefore valid and reliable as a pass-fail examination of diagnostic arthroscopy of the knee in the simulation laboratory. Further study is necessary to determine whether it can be used for the assessment of multiple arthroscopic procedures and whether it can be used to evaluate arthroscopic procedures performed in the operating room.
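    The receiver operating characteristic analysis mentioned above, which turns case-log counts into a predictive pass threshold, can be sketched as follows. The data are synthetic and Youden's J statistic is used to pick the cutoff; this illustrates the general technique, not the authors' analysis.

```python
# Sketch: ROC analysis to pick a case-count threshold that predicts
# a passing score. Synthetic data; not the study's dataset.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
cases = rng.integers(5, 150, size=30)           # logged arthroscopy cases
passed = (cases + rng.normal(0, 20, 30)) > 80   # noisy ground truth

fpr, tpr, thresholds = roc_curve(passed, cases)
best = np.argmax(tpr - fpr)                     # Youden's J statistic
print(f"predict a pass above ~{thresholds[best]:.0f} logged cases")
```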

  14. Technology optimization techniques for multicomponent optical band-pass filter manufacturing

    NASA Astrophysics Data System (ADS)

    Baranov, Yuri P.; Gryaznov, Georgiy M.; Rodionov, Andrey Y.; Obrezkov, Andrey V.; Medvedev, Roman V.; Chivanov, Alexey N.

    2016-04-01

    Narrowband optical devices (such as IR-sensing devices, celestial navigation systems, solar-blind UV systems and many others) are one of the fastest-growing areas in optical manufacturing. However, signal strength in this type of application is quite low, and the performance of such devices depends on the attenuation level of wavelengths outside the operating range. Modern detectors (photodiodes, matrix detectors, photomultiplier tubes and others) usually do not have the required selectivity, or at worst have higher sensitivity to the background spectrum. Manufacturing a single-component band-pass filter with a high out-of-band attenuation level is a resource-intensive task, and sometimes no solution is possible with existing technologies. Different types of filters exhibit technology-driven variations in transmittance profile shape due to various production factors. At the same time, there are many tasks with strict requirements for background spectrum attenuation in narrowband optical devices; for example, in a solar-blind UV system, wavelengths above 290-300 nm must be attenuated by 180 dB. In this paper, techniques are proposed for assembling multi-component optical band-pass filters from multiple single elements, exploiting the technology variations in transmittance profile shape to optimize the signal-to-noise ratio (SNR). Relationships between the signal-to-noise ratio and different characteristics of transmittance profile shape are shown. The practical results obtained are in good agreement with our calculations.
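    The arithmetic behind multi-component assembly is worth making explicit: in-band transmittances of stacked elements multiply, while out-of-band attenuations add in decibels. The sketch below illustrates this bookkeeping with invented element values; it is not the optimization technique of the paper.

```python
# Sketch: out-of-band attenuation of stacked filters adds in dB while
# in-band transmittance multiplies, so assembling a 180 dB solar-blind
# blocker is a selection problem. Element values are invented.
elements = [  # (in-band transmittance, out-of-band attenuation in dB)
    (0.85, 60.0),
    (0.90, 55.0),
    (0.80, 70.0),
]

def stack(chosen):
    t_in, atten_db = 1.0, 0.0
    for t, a in chosen:
        t_in *= t          # in-band losses compound
        atten_db += a      # blocking levels add in dB
    return t_in, atten_db

t_in, atten = stack(elements)
print(f"in-band T = {t_in:.2f}, out-of-band attenuation = {atten:.0f} dB")
# -> in-band T = 0.61, out-of-band attenuation = 185 dB (meets a 180 dB spec)
```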

  15. Origin of sulfur for elemental sulfur concentration in salt dome cap rocks, Gulf Coast Basin, USA

    NASA Astrophysics Data System (ADS)

    Hill, J. M.; Kyle, R.; Loyd, S. J.

    2017-12-01

    Calcite cap rocks of the Boling and Main Pass salt domes contain large elemental sulfur accumulations. Isotopic and petrographic data indicate complex histories of cap rock paragenesis for both domes. Whereas this paragenetic complexity is in part due to the open nature of these hydrodynamic systems, a comprehensive understanding of elemental sulfur sources and concentration mechanisms is lacking. Large ranges in traditional sulfur isotope compositions (δ34S) among oxidized and reduced sulfur-bearing phases have led some to infer that microbial sulfate reduction and/or influx of sulfide-rich formation waters occurred during calcite cap rock formation. Ultimately, traditional sulfur isotope analyses alone cannot distinguish among local microbial or exogenous sulfur sources. Recently, multiple sulfur isotope (32S, 33S, 34S, 36S) studies have revealed small but measurable differences in the mass-dependent behavior of microbial and abiogenic processes. To distinguish between the proposed sulfur sources, multiple-sulfur-isotope analyses have been performed on native sulfur from the Boling and Main Pass cap rocks. Similarities to, or deviations from, equilibrium relationships indicate which pathways were responsible for native sulfur precipitation. Pathway determination provides insight into Gulf Coast cap rock development and potentially highlights the conditions that led to anomalous sulfur enrichment in the Boling and Main Pass domes.

  16. THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL

    EPA Science Inventory

    A toolkit for distributed hydrologic modeling at multiple scales using a geographic information system is presented. This open-source, freely available software was developed through a collaborative endeavor involving two Universities and two government agencies. Called the Auto...

  17. Nanomechanical testing system

    DOEpatents

    Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David

    2014-07-08

    An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.

  18. Nanomechanical testing system

    DOEpatents

    Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David

    2015-01-27

    An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.

  19. Nanomechanical testing system

    DOEpatents

    Vodnick, David James; Dwivedi, Arpit; Keranen, Lucas Paul; Okerlund, Michael David; Schmitz, Roger William; Warren, Oden Lee; Young, Christopher David

    2015-02-24

    An automated testing system includes systems and methods to facilitate inline production testing of samples at a micro (multiple microns) or less scale with a mechanical testing instrument. In an example, the system includes a probe changing assembly for coupling and decoupling a probe of the instrument. The probe changing assembly includes a probe change unit configured to grasp one of a plurality of probes in a probe magazine and couple one of the probes with an instrument probe receptacle. An actuator is coupled with the probe change unit, and the actuator is configured to move and align the probe change unit with the probe magazine and the instrument probe receptacle. In another example, the automated testing system includes a multiple degree of freedom stage for aligning a sample testing location with the instrument. The stage includes a sample stage and a stage actuator assembly including translational and rotational actuators.

  20. Expressing Intervals in Automated Service Negotiation

    NASA Astrophysics Data System (ADS)

    Clark, Kassidy P.; Warnier, Martijn; van Splunter, Sander; Brazier, Frances M. T.

    During automated negotiation of services between autonomous agents, utility functions are used to evaluate the terms of negotiation. These terms often include intervals of values which are prone to misinterpretation. It is often unclear if an interval embodies a continuum of real numbers or a subset of natural numbers. Furthermore, it is often unclear if an agent is expected to choose only one value, multiple values, a sub-interval or even multiple sub-intervals. Additional semantics are needed to clarify these issues. Normally, these semantics are stored in a domain ontology. However, ontologies are typically domain specific and static in nature. For dynamic environments, in which autonomous agents negotiate resources whose attributes and relationships change rapidly, semantics should be made explicit in the service negotiation. This paper identifies issues that are prone to misinterpretation and proposes a notation for expressing intervals. This notation is illustrated using an example in WS-Agreement.
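    The missing semantics the paper identifies can be made explicit in a small data structure. The sketch below is one hypothetical encoding of the domain (continuous versus discrete) and the expected selection mode; it illustrates the idea rather than the WS-Agreement notation the paper proposes.

```python
# Sketch: making interval semantics explicit, as the paper argues a
# negotiation notation must. Field names are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum

class Domain(Enum):
    CONTINUOUS = "continuum of real numbers"
    DISCRETE = "subset of natural numbers"

class Selection(Enum):
    ONE_VALUE = "choose a single value"
    MANY_VALUES = "choose multiple values"
    SUB_INTERVAL = "choose one sub-interval"
    MANY_SUB_INTERVALS = "choose multiple sub-intervals"

@dataclass(frozen=True)
class Interval:
    low: float
    high: float
    domain: Domain        # continuum vs. integer grid
    selection: Selection  # what the agent may pick from it

# "Between 2 and 8 CPUs, whole units only, pick exactly one value":
cpus = Interval(2, 8, Domain.DISCRETE, Selection.ONE_VALUE)
print(cpus)
```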

  1. SIFT optimization and automation for matching images from multiple temporal sources

    NASA Astrophysics Data System (ADS)

    Castillo-Carrión, Sebastián; Guerrero-Ginel, José-Emilio

    2017-05-01

    Scale Invariant Feature Transformation (SIFT) was applied to extract tie-points from multiple-source images. Although SIFT is reported to perform reliably under widely different radiometric and geometric conditions, using the default input parameters resulted in too few points being found. We found that the best solution was to focus on large features, as these are more robust and less prone to scene changes over time; this constitutes a first step toward automating mapping applications such as geometric correction, orthophoto creation and 3D model generation. The optimization of five key SIFT parameters is proposed as a way of increasing the number of correct matches; the performance of SIFT is explored over different images and parameter values, and the resulting optimal values are corroborated using independent validation imagery. The results show that the optimization model improves the performance of SIFT in correlating multitemporal images captured from different sources.
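    OpenCV's SIFT implementation exposes a comparable set of tunable parameters, so the tuning idea can be sketched as follows. The parameter values, file names, and ratio-test threshold are illustrative assumptions, not the optima derived in the paper (requires opencv-python 4.4 or later).

```python
# Sketch: tuning SIFT toward larger, more stable features and matching
# across two acquisition dates. Values and file names are illustrative.
import cv2

img_old = cv2.imread("scene_2005.png", cv2.IMREAD_GRAYSCALE)
img_new = cv2.imread("scene_2015.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create(
    nfeatures=0,             # keep all surviving keypoints
    nOctaveLayers=3,
    contrastThreshold=0.08,  # raised: discard weak, change-prone features
    edgeThreshold=8,         # lowered: reject edge-like responses
    sigma=2.0,               # smoother base image favors large features
)
kp1, des1 = sift.detectAndCompute(img_old, None)
kp2, des2 = sift.detectAndCompute(img_new, None)

# Lowe's ratio test keeps only unambiguous correspondences.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.7 * n.distance]
print(f"{len(good)} tie-point candidates")
```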

  2. Cooling system with automated seasonal freeze protection

    DOEpatents

    Campbell, Levi A.; Chu, Richard C.; David, Milnes P.; Ellsworth, Jr., Michael J.; Iyengar, Madhusudan K.; Simons, Robert E.; Singh, Prabjit; Zhang, Jing

    2016-05-24

    An automated multi-fluid cooling system and method are provided for cooling an electronic component(s). The cooling system includes a coolant loop, a coolant tank, multiple valves, and a controller. The coolant loop is at least partially exposed to outdoor ambient air temperature(s) during normal operation, and the coolant tank includes first and second reservoirs containing first and second fluids, respectively. The first fluid freezes at a lower temperature than the second, the second fluid has superior cooling properties compared with the first, and the two fluids are soluble. The multiple valves are controllable to selectively couple the first or second fluid into the coolant in the coolant loop, wherein the coolant includes at least the second fluid. The controller automatically controls the valves to vary first fluid concentration level in the coolant loop based on historical, current, or anticipated outdoor air ambient temperature(s) for a time of year.

  3. Automated high-throughput flow-through real-time diagnostic system

    DOEpatents

    Regan, John Frederick

    2012-10-30

    An automated real-time flow-through system capable of processing multiple samples in an asynchronous, simultaneous, and parallel fashion for nucleic acid extraction and purification, followed by assay assembly, genetic amplification, multiplex detection, analysis, and decontamination. The system is able to hold and access an unlimited number of fluorescent reagents that may be used to screen samples for the presence of specific sequences. The apparatus works by associating extracted and purified sample with a series of reagent plugs that have been formed in a flow channel and delivered to a flow-through real-time amplification detector that has a multiplicity of optical windows, to which the sample-reagent plugs are placed in an operative position. The diagnostic apparatus includes sample multi-position valves, a master sample multi-position valve, a master reagent multi-position valve, reagent multi-position valves, and an optical amplification/detection system.

  4. Cooling method with automated seasonal freeze protection

    DOEpatents

    Campbell, Levi; Chu, Richard; David, Milnes; Ellsworth, Jr, Michael; Iyengar, Madhusudan; Simons, Robert; Singh, Prabjit; Zhang, Jing

    2016-05-31

    An automated multi-fluid cooling method is provided for cooling an electronic component(s). The method includes obtaining a coolant loop, and providing a coolant tank, multiple valves, and a controller. The coolant loop is at least partially exposed to outdoor ambient air temperature(s) during normal operation, and the coolant tank includes first and second reservoirs containing first and second fluids, respectively. The first fluid freezes at a lower temperature than the second, the second fluid has superior cooling properties compared with the first, and the two fluids are soluble. The multiple valves are controllable to selectively couple the first or second fluid into the coolant in the coolant loop, wherein the coolant includes at least the second fluid. The controller automatically controls the valves to vary first fluid concentration level in the coolant loop based on historical, current, or anticipated outdoor air ambient temperature(s) for a time of year.
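    The control logic common to this patent and the system version above can be sketched as a simple set-point function: raise the concentration of the low-freeze-point fluid as the forecast minimum temperature drops. The thresholds and the linear freeze-point model below are invented for illustration; the patents do not specify them.

```python
# Sketch of the control idea in the two patents: blend more of the
# low-freeze-point fluid into the loop as outdoor temperatures drop,
# and favor the better coolant when freezing is not a risk.
# The margin, slope, and cap below are invented assumptions.
def target_antifreeze_fraction(forecast_min_c: float,
                               margin_c: float = 5.0) -> float:
    """Return the desired first-fluid (antifreeze) volume fraction."""
    if forecast_min_c - margin_c >= 0.0:
        return 0.0                      # pure second fluid: best cooling
    # Assume freeze point drops ~0.5 C per percent antifreeze (illustrative).
    needed_pct = abs(forecast_min_c - margin_c) / 0.5
    return min(needed_pct, 60.0) / 100.0

for t in (10.0, 0.0, -10.0, -30.0):
    frac = target_antifreeze_fraction(t)
    print(f"forecast {t:+.0f} C -> antifreeze fraction {frac:.0%}")
```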

  5. Noise Suppression Based on Multi-Model Compositions Using Multi-Pass Search with Multi-Label N-gram Models

    NASA Astrophysics Data System (ADS)

    Jitsuhiro, Takatoshi; Toriyama, Tomoji; Kogure, Kiyoshi

    We propose a noise suppression method based on multi-model compositions and multi-pass search. In real environments, input speech for speech recognition includes many kinds of noise signals. To obtain good recognition candidates, it is important to suppress many kinds of noise signals at once and to find the target speech. Before noise suppression, to find speech and noise label sequences, we introduce multi-pass search with acoustic models that include many kinds of noise models and their compositions, their n-gram models, and their lexicon. Noise suppression is then performed frame-synchronously, using the multiple models selected by the recognized label sequences with time alignments. We evaluated this method using the E-Nightingale task, which contains voice memoranda spoken by nurses during actual work at hospitals. The proposed method obtained higher performance than the conventional method.

  6. OASIS is Automated Statistical Inference for Segmentation, with applications to multiple sclerosis lesion segmentation in MRI.

    PubMed

    Sweeney, Elizabeth M; Shinohara, Russell T; Shiee, Navid; Mateen, Farrah J; Chudgar, Avni A; Cuzzocreo, Jennifer L; Calabresi, Peter A; Pham, Dzung L; Reich, Daniel S; Crainiceanu, Ciprian M

    2013-01-01

    Magnetic resonance imaging (MRI) can be used to detect lesions in the brains of multiple sclerosis (MS) patients and is essential for diagnosing the disease and monitoring its progression. In practice, lesion load is often quantified by either manual or semi-automated segmentation of MRI, which is time-consuming, costly, and associated with large inter- and intra-observer variability. We propose OASIS is Automated Statistical Inference for Segmentation (OASIS), an automated statistical method for segmenting MS lesions in MRI studies. We use logistic regression models incorporating multiple MRI modalities to estimate voxel-level probabilities of lesion presence. Intensity-normalized T1-weighted, T2-weighted, fluid-attenuated inversion recovery and proton density volumes from 131 MRI studies (98 MS subjects, 33 healthy subjects) with manual lesion segmentations were used to train and validate our model. Within this set, OASIS detected lesions at the voxel level with a partial area under the receiver operating characteristic curve of 0.59% (95% CI: [0.50%, 0.67%]) for clinically relevant false positive rates of 1% and below. An experienced MS neuroradiologist compared these segmentations to those produced by LesionTOADS, an image segmentation software that provides segmentation of both lesions and normal brain structures. For lesions, OASIS out-performed LesionTOADS in 74% (95% CI: [65%, 82%]) of cases for the 98 MS subjects. To further validate the method, we applied OASIS to 169 MRI studies acquired at a separate center. The neuroradiologist again compared the OASIS segmentations to those from LesionTOADS. For lesions, OASIS ranked higher than LesionTOADS in 77% (95% CI: [71%, 83%]) of cases. For a randomly selected subset of 50 of these studies, one additional radiologist and one neurologist also scored the images. Within this set, the neuroradiologist ranked OASIS higher than LesionTOADS in 76% (95% CI: [64%, 88%]) of cases, the neurologist in 66% (95% CI: [52%, 78%]) and the radiologist in 52% (95% CI: [38%, 66%]). OASIS obtains the estimated probability for each voxel to be part of a lesion by weighting each imaging modality with coefficient weights. These coefficients are explicit, obtained using standard model fitting techniques, and can be reused in other imaging studies. This fully automated method allows sensitive and specific detection of lesion presence and may be rapidly applied to large collections of images.
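    The core of the method, a voxel-level logistic regression over intensity-normalized modalities, can be sketched with scikit-learn. The data below are synthetic and the threshold arbitrary; this illustrates the modelling idea, not the authors' implementation.

```python
# Minimal sketch of the OASIS idea: a logistic regression over
# intensity-normalized modalities yields a per-voxel lesion
# probability. Synthetic data; not the authors' code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
# Columns: T1, T2, FLAIR, PD intensities (already normalized).
X = rng.normal(size=(n, 4))
# Synthetic truth: lesions brighten FLAIR and T2.
logit = -3.0 + 1.5 * X[:, 2] + 1.0 * X[:, 1]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
prob = model.predict_proba(X)[:, 1]     # voxel-level lesion probabilities
lesion_mask = prob > 0.5                # threshold chosen for illustration
print("estimated coefficients:", np.round(model.coef_, 2))
```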

  7. Hyperswitch Communication Network Computer

    NASA Technical Reports Server (NTRS)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  8. Multiple-Array Detection, Association and Location of Infrasound and Seismo-Acoustic Events - Utilization of Ground Truth Information

    DTIC Science & Technology

    2010-09-01

    … seismic and infrasound data from seismo-acoustic arrays and apply the methodology to regional networks for validation with ground truth information. In the initial year of the project, automated techniques for detecting, associating and locating infrasound signals were developed. Recently, the location …

  9. Video to Text (V2T) in Wide Area Motion Imagery

    DTIC Science & Technology

    2015-09-01

    … microtext) or a document (e.g., using Sphinx or Apache NLP) as an automated approach [102]. Previous work in natural language full-text searching … language processing (NLP) based module. The heart of the structured text processing module includes the following seven key word banks … (The remainder of this record fragment is an acronym glossary: MHT, Multiple Hypothesis Tracking; MIL, Multiple Instance Learning; NLP, Natural Language Processing; OAB, Online AdaBoost; OF, Optic Flow.)

  10. Mission operations technology

    NASA Astrophysics Data System (ADS)

    Varsi, Giulio

    In the last decade, the operation of a spacecraft after launch has emerged as a major component of the total cost of the mission. This trend is sustained by the increasing complexity, flexibility, and data gathering capability of the space assets and by their greater reliability and consequent longevity. The trend can, however, be moderated by the progressive transfer of selected functions from the ground to the spacecraft and by application, on the ground, of new technology. Advances in ground operations derive from the introduction in the mission operations environment of advanced microprocessor-based workstations in the class of a few million instructions per second and from the selective application of artificial intelligence technology. In the last few years a number of these applications have been developed, tested in operational settings and successfully demonstrated to users. Some are now being integrated in mission operations facilities. An analysis of mission operations indicates that the key areas are: concurrent control of multiple missions; automated/interactive production of command sequences of high integrity at low cost; automated monitoring of spacecraft health and automated aides for fault diagnosis; automated allocation of resources; automated processing of science data; and high-fidelity, high-speed spacecraft simulation. Examples of major advances in selected areas are described.

  11. RNA–protein binding kinetics in an automated microfluidic reactor

    PubMed Central

    Ridgeway, William K.; Seitaridou, Effrosyni; Phillips, Rob; Williamson, James R.

    2009-01-01

    Microfluidic chips can automate biochemical assays on the nanoliter scale, which is of considerable utility for RNA–protein binding reactions that would otherwise require large quantities of proteins. Unfortunately, complex reactions involving multiple reactants cannot be prepared in current microfluidic mixer designs, nor is investigation of long-time scale reactions possible. Here, a microfluidic ‘Riboreactor’ has been designed and constructed to facilitate the study of kinetics of RNA–protein complex formation over long time scales. With computer automation, the reactor can prepare binding reactions from any combination of eight reagents, and is optimized to monitor long reaction times. By integrating a two-photon microscope into the microfluidic platform, 5-nl reactions can be observed for longer than 1000 s with single-molecule sensitivity and negligible photobleaching. Using the Riboreactor, RNA–protein binding reactions with a fragment of the bacterial 30S ribosome were prepared in a fully automated fashion and binding rates were consistent with rates obtained from conventional assays. The microfluidic chip successfully combines automation, low sample consumption, ultra-sensitive fluorescence detection and a high degree of reproducibility. The chip should be able to probe complex reaction networks describing the assembly of large multicomponent RNPs such as the ribosome. PMID:19759214

  12. Apparatus and method for loading and unloading multiple digital tape cassettes utilizing a removable magazine

    DOEpatents

    Lindenmeyer, C.W.

    1993-01-26

    An apparatus and method to automate the handling of multiple digital tape cassettes for processing by commercially available cassette tape readers and recorders. A removable magazine rack stores a plurality of tape cassettes, and cooperates with a shuttle device that automatically inserts and removes cassettes from the magazine to the reader and vice-versa. Photocells are used to identify and index to the desired tape cassette. The apparatus allows digital information stored on multiple cassettes to be processed without significant operator intervention.

  13. Apparatus and method for loading and unloading multiple digital tape cassettes utilizing a removable magazine

    DOEpatents

    Lindenmeyer, Carl W.

    1993-01-01

    An apparatus and method to automate the handling of multiple digital tape cassettes for processing by commercially available cassette tape readers and recorders. A removable magazine rack stores a plurality of tape cassettes, and cooperates with a shuttle device that automatically inserts and removes cassettes from the magazine to the reader and vice-versa. Photocells are used to identify and index to the desired tape cassette. The apparatus allows digital information stored on multiple cassettes to be processed without significant operator intervention.

  14. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsai, Yingssu; Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080; McPhillips, Scott E.

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually; the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference Fourier maps.

  15. Classification methods to detect sleep apnea in adults based on respiratory and oximetry signals: a systematic review.

    PubMed

    Uddin, M B; Chow, C M; Su, S W

    2018-03-26

    Sleep apnea (SA), a common sleep disorder, can significantly decrease the quality of life, and is closely associated with major health risks such as cardiovascular disease, sudden death, depression, and hypertension. The normal diagnostic process of SA using polysomnography is costly and time consuming. In addition, the accuracy of different classification methods to detect SA varies with the use of different physiological signals. If an effective, reliable, and accurate classification method is developed, then the diagnosis of SA and its associated treatment will be time-efficient and economical. This study aims to systematically review the literature and present an overview of classification methods to detect SA using respiratory and oximetry signals and address the automated detection approach. Sixty-two included studies revealed the application of single and multiple signals (respiratory and oximetry) for the diagnosis of SA. Both airflow and oxygen saturation signals alone were effective in detecting SA in the case of binary decision-making, whereas multiple signals were good for multi-class detection. In addition, some machine learning methods were superior to the other classification methods for SA detection using respiratory and oximetry signals. To deal with the respiratory and oximetry signals, a good choice of classification method as well as the consideration of associated factors would result in high accuracy in the detection of SA. An accurate classification method should provide a high detection rate with an automated (independent of human action) analysis of respiratory and oximetry signals. Future high-quality automated studies using large samples of data from multiple patient groups or record batches are recommended.

  16. Application of acoustic imaging techniques on snowmobile pass-by noise.

    PubMed

    Padois, Thomas; Berry, Alain

    2017-02-01

    Snowmobile manufacturers invest important efforts to reduce the noise emission of their products. The noise sources of snowmobiles are multiple and closely spaced, leading to difficult source separation in practice. In this study, source imaging results for snowmobile pass-by noise are discussed. The experiments involve a 193-microphone Underbrink array, with synchronization of acoustic with video data provided by a high-speed camera. Both conventional beamforming and Clean-SC deconvolution are implemented to provide noise source maps of the snowmobile. The results clearly reveal noise emission from the engine, exhaust, and track depending on the frequency range considered.
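    Conventional (delay-and-sum) beamforming, one of the two methods applied in the study, can be sketched at a single frequency: steer the array over candidate positions and map output power. The array geometry and source below are synthetic, not the 193-microphone Underbrink array of the study.

```python
# Sketch of conventional (delay-and-sum) beamforming at one frequency:
# steer a small linear array over candidate source positions and map
# output power. Geometry and data are synthetic.
import numpy as np

c, f = 343.0, 2000.0                     # speed of sound (m/s), frequency (Hz)
k = 2 * np.pi * f / c
mics = np.stack([np.linspace(-0.5, 0.5, 16), np.zeros(16)], axis=1)

src = np.array([0.2, 2.0])               # true source position (m)
d_true = np.linalg.norm(mics - src, axis=1)
p = np.exp(-1j * k * d_true) / d_true     # simulated mic pressures

scan_x = np.linspace(-1.0, 1.0, 201)
power = []
for x in scan_x:
    d = np.linalg.norm(mics - np.array([x, 2.0]), axis=1)
    w = np.exp(1j * k * d)               # steering vector (phase realignment)
    power.append(np.abs(w @ p) ** 2)
print("peak at x =", scan_x[int(np.argmax(power))], "m")  # ~0.2
```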

  17. PyPanda: a Python package for gene regulatory network reconstruction

    PubMed Central

    van IJzendoorn, David G.P.; Glass, Kimberly; Quackenbush, John; Kuijjer, Marieke L.

    2016-01-01

    Summary: PANDA (Passing Attributes between Networks for Data Assimilation) is a gene regulatory network inference method that uses message-passing to integrate multiple sources of ‘omics data. PANDA was originally coded in C++. In this application note we describe PyPanda, the Python version of PANDA. PyPanda runs considerably faster than the C++ version and includes additional features for network analysis. Availability and implementation: The open source PyPanda Python package is freely available at http://github.com/davidvi/pypanda. Contact: mkuijjer@jimmy.harvard.edu or d.g.p.van_ijzendoorn@lumc.nl PMID:27402905

  18. PyPanda: a Python package for gene regulatory network reconstruction.

    PubMed

    van IJzendoorn, David G P; Glass, Kimberly; Quackenbush, John; Kuijjer, Marieke L

    2016-11-01

    PANDA (Passing Attributes between Networks for Data Assimilation) is a gene regulatory network inference method that uses message-passing to integrate multiple sources of 'omics data. PANDA was originally coded in C++. In this application note we describe PyPanda, the Python version of PANDA. PyPanda runs considerably faster than the C++ version and includes additional features for network analysis. The open source PyPanda Python package is freely available at http://github.com/davidvi/pypanda. Contact: mkuijjer@jimmy.harvard.edu or d.g.p.van_ijzendoorn@lumc.nl. © The Author 2016. Published by Oxford University Press.

  19. Multichannel, Active Low-Pass Filters

    NASA Technical Reports Server (NTRS)

    Lev, James J.

    1989-01-01

    Multichannel integrated circuits cascaded to obtain matched characteristics. Gain and phase characteristics of channels of multichannel, multistage, active, low-pass filter matched by making filter of cascaded multichannel integrated-circuit operational amplifiers. Concept takes advantage of inherent equality of electrical characteristics of nominally identical circuit elements made on same integrated-circuit chip. Characteristics of channels vary identically with changes in temperature. If additional matched channels needed, chips containing more than two operational amplifiers apiece (e.g., commercial quad operational amplifiers) used. Concept applicable to variety of equipment requiring matched gain and phase in multiple channels: radar, test instruments, communication circuits, and equipment for electronic countermeasures.
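    The matching argument rests on simple response algebra: cascading identical stages multiplies magnitude responses and adds phases, so channels built from matched stages track each other. A sketch for a first-order low-pass stage, with an illustrative cutoff:

```python
# Sketch: for a first-order low-pass stage H(f) = 1 / (1 + j f/fc), an
# n-stage cascade has response H(f)**n: magnitudes multiply, phases add,
# so channels built from matched stages stay matched.
import numpy as np

fc = 1e3                                   # stage cutoff (Hz), illustrative
f = np.logspace(1, 5, 5)                   # 10 Hz .. 100 kHz
H = 1.0 / (1.0 + 1j * f / fc)              # one stage
H2 = H ** 2                                # two cascaded stages

for fi, h1, h2 in zip(f, H, H2):
    print(f"{fi:9.0f} Hz  1-stage {20*np.log10(abs(h1)):7.1f} dB"
          f"  2-stage {20*np.log10(abs(h2)):7.1f} dB")
# At f = fc each stage contributes -3 dB, so the two-stage cascade is at -6 dB.
```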

  20. Microbleed Detection Using Automated Segmentation (MIDAS): A New Method Applicable to Standard Clinical MR Images

    PubMed Central

    Seghier, Mohamed L.; Kolanko, Magdalena A.; Leff, Alexander P.; Jäger, Hans R.; Gregoire, Simone M.; Werring, David J.

    2011-01-01

    Background Cerebral microbleeds, visible on gradient-recalled echo (GRE) T2* MRI, have generated increasing interest as an imaging marker of small vessel diseases, with relevance for intracerebral bleeding risk or brain dysfunction. Methodology/Principal Findings Manual rating methods have limited reliability and are time-consuming. We developed a new method for microbleed detection using automated segmentation (MIDAS) and compared it with a validated visual rating system. In thirty consecutive stroke service patients, standard GRE T2* images were acquired and manually rated for microbleeds by a trained observer. After spatially normalizing each patient's GRE T2* images into a standard stereotaxic space, the automated microbleed detection algorithm (MIDAS) identified cerebral microbleeds by explicitly incorporating an “extra” tissue class for abnormal voxels within a unified segmentation-normalization model. The agreement between manual and automated methods was assessed using the intraclass correlation coefficient (ICC) and Kappa statistic. We found that MIDAS had generally moderate to good agreement with the manual reference method for the presence of lobar microbleeds (Kappa = 0.43, improved to 0.65 after manual exclusion of obvious artefacts). Agreement for the number of microbleeds was very good for lobar regions: (ICC = 0.71, improved to ICC = 0.87). MIDAS successfully detected all patients with multiple (≥2) lobar microbleeds. Conclusions/Significance MIDAS can identify microbleeds on standard MR datasets, and with an additional rapid editing step shows good agreement with a validated visual rating system. MIDAS may be useful in screening for multiple lobar microbleeds. PMID:21448456
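    The two agreement statistics used in the validation, Kappa for microbleed presence and the intraclass correlation coefficient for counts, can be sketched on synthetic ratings as follows. The ICC variant and the data are illustrative assumptions, not the study's analysis.

```python
# Sketch: the two agreement statistics used to validate MIDAS.
# Cohen's kappa for presence/absence calls, and a simple one-way
# intraclass correlation for microbleed counts. Data are synthetic.
import numpy as np
from sklearn.metrics import cohen_kappa_score

manual_presence = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1])
midas_presence  = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 1])
print("kappa:", round(cohen_kappa_score(manual_presence, midas_presence), 2))

def icc_oneway(x: np.ndarray, y: np.ndarray) -> float:
    """One-way random-effects ICC(1,1) for two raters."""
    pairs = np.stack([x, y], axis=1).astype(float)
    n = pairs.shape[0]
    ms_between = 2 * np.var(pairs.mean(axis=1), ddof=1)
    ms_within = np.sum((pairs - pairs.mean(axis=1, keepdims=True)) ** 2) / n
    return (ms_between - ms_within) / (ms_between + ms_within)

manual_counts = np.array([3, 0, 5, 2, 0, 1, 7, 0, 4, 2])
midas_counts  = np.array([2, 0, 6, 2, 1, 1, 6, 0, 4, 3])
print("ICC:", round(icc_oneway(manual_counts, midas_counts), 2))
```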
