Turbidity-controlled sampling for suspended sediment load estimation
Jack Lewis
2003-01-01
Abstract - Automated data collection is essential to effectively measure suspended sediment loads in storm events, particularly in small basins. Continuous turbidity measurements can be used, along with discharge, in an automated system that makes real-time sampling decisions to facilitate sediment load estimation. The Turbidity Threshold Sampling method distributes...
Chapin, Thomas
2015-01-01
Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.
Misclassification of OSA severity with automated scoring of home sleep recordings.
Aurora, R Nisha; Swartz, Rachel; Punjabi, Naresh M
2015-03-01
The advent of home sleep testing has allowed for the development of an ambulatory care model for OSA that most health-care providers can easily deploy. Although automated algorithms that accompany home sleep monitors can identify and classify disordered breathing events, it is unclear whether manual scoring followed by expert review of home sleep recordings is of any value. Thus, this study examined the agreement between automated and manual scoring of home sleep recordings. Two type 3 monitors (ApneaLink Plus [ResMed] and Embletta [Embla Systems]) were examined in distinct study samples. Data from manual and automated scoring were available for 200 subjects. Two thresholds for oxygen desaturation (≥ 3% and ≥ 4%) were used to define disordered breathing events. Agreement between manual and automated scoring was examined using Pearson correlation coefficients and Bland-Altman analyses. Automated scoring consistently underscored disordered breathing events compared with manual scoring for both sleep monitors irrespective of whether a ≥ 3% or ≥ 4% oxygen desaturation threshold was used to define the apnea-hypopnea index (AHI). For the ApneaLink Plus monitor, Bland-Altman analyses revealed an average AHI difference between manual and automated scoring of 6.1 (95% CI, 4.9-7.3) and 4.6 (95% CI, 3.5-5.6) events/h for the ≥ 3% and ≥ 4% oxygen desaturation thresholds, respectively. Similarly for the Embletta monitor, the average difference between manual and automated scoring was 5.3 (95% CI, 3.2-7.3) and 8.4 (95% CI, 7.2-9.6) events/h, respectively. Although agreement between automated and manual scoring of home sleep recordings varies based on the device used, modest agreement was observed between the two approaches. However, manual review of home sleep test recordings can decrease the misclassification of OSA severity, particularly for those with mild disease. ClinicalTrials.gov; No.: NCT01503164; www.clinicaltrials.gov.
Misclassification of OSA Severity With Automated Scoring of Home Sleep Recordings
Aurora, R. Nisha; Swartz, Rachel
2015-01-01
BACKGROUND: The advent of home sleep testing has allowed for the development of an ambulatory care model for OSA that most health-care providers can easily deploy. Although automated algorithms that accompany home sleep monitors can identify and classify disordered breathing events, it is unclear whether manual scoring followed by expert review of home sleep recordings is of any value. Thus, this study examined the agreement between automated and manual scoring of home sleep recordings. METHODS: Two type 3 monitors (ApneaLink Plus [ResMed] and Embletta [Embla Systems]) were examined in distinct study samples. Data from manual and automated scoring were available for 200 subjects. Two thresholds for oxygen desaturation (≥ 3% and ≥ 4%) were used to define disordered breathing events. Agreement between manual and automated scoring was examined using Pearson correlation coefficients and Bland-Altman analyses. RESULTS: Automated scoring consistently underscored disordered breathing events compared with manual scoring for both sleep monitors irrespective of whether a ≥ 3% or ≥ 4% oxygen desaturation threshold was used to define the apnea-hypopnea index (AHI). For the ApneaLink Plus monitor, Bland-Altman analyses revealed an average AHI difference between manual and automated scoring of 6.1 (95% CI, 4.9-7.3) and 4.6 (95% CI, 3.5-5.6) events/h for the ≥ 3% and ≥ 4% oxygen desaturation thresholds, respectively. Similarly for the Embletta monitor, the average difference between manual and automated scoring was 5.3 (95% CI, 3.2-7.3) and 8.4 (95% CI, 7.2-9.6) events/h, respectively. CONCLUSIONS: Although agreement between automated and manual scoring of home sleep recordings varies based on the device used, modest agreement was observed between the two approaches. However, manual review of home sleep test recordings can decrease the misclassification of OSA severity, particularly for those with mild disease. TRIAL REGISTRY: ClinicalTrials.gov; No.: NCT01503164; www.clinicaltrials.gov PMID:25411804
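The agreement analysis described in the two records above (Pearson correlation plus a Bland-Altman bias estimate with a 95% CI) can be illustrated with a short sketch; the AHI values below are invented for illustration and are not the study's data.

```python
# Hedged sketch of a Bland-Altman comparison of manual vs. automated AHI
# scoring, using invented example data (events/h), not the study's data.
import numpy as np
from scipy import stats

manual_ahi = np.array([8.0, 14.2, 22.5, 5.1, 30.4, 18.7, 11.9, 26.3])
auto_ahi   = np.array([4.5, 10.1, 17.8, 3.0, 24.9, 13.2,  8.8, 20.6])

diff = manual_ahi - auto_ahi
mean_diff = diff.mean()                 # average bias (events/h)
sd_diff = diff.std(ddof=1)
n = len(diff)

# 95% CI of the mean difference (the quantity quoted in the abstract) and the
# conventional Bland-Altman limits of agreement.
ci = stats.t.ppf(0.975, n - 1) * sd_diff / np.sqrt(n)
loa = 1.96 * sd_diff

r, _ = stats.pearsonr(manual_ahi, auto_ahi)
print(f"Pearson r = {r:.2f}")
print(f"Bias = {mean_diff:.1f} events/h "
      f"(95% CI {mean_diff - ci:.1f} to {mean_diff + ci:.1f})")
print(f"Limits of agreement: {mean_diff - loa:.1f} to {mean_diff + loa:.1f} events/h")
```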
Besmer, Michael D.; Weissbrodt, David G.; Kratochvil, Bradley E.; Sigrist, Jürg A.; Weyland, Mathias S.; Hammes, Frederik
2014-01-01
Fluorescent staining coupled with flow cytometry (FCM) is often used for the monitoring, quantification and characterization of bacteria in engineered and environmental aquatic ecosystems including seawater, freshwater, drinking water, wastewater, and industrial bioreactors. However, infrequent grab sampling hampers accurate characterization and subsequent understanding of microbial dynamics in all of these ecosystems. A logical technological progression is high throughput and full automation of the sampling, staining, measurement, and data analysis steps. Here we assess the feasibility and applicability of automated FCM by means of actual data sets produced with prototype instrumentation. As proof-of-concept we demonstrate examples of microbial dynamics in (i) flowing tap water from a municipal drinking water supply network and (ii) river water from a small creek subject to two rainfall events. In both cases, automated measurements were done at 15-min intervals during 12–14 consecutive days, yielding more than 1000 individual data points for each ecosystem. The extensive data sets derived from the automated measurements allowed for the establishment of baseline data for each ecosystem, as well as for the recognition of daily variations and specific events that would most likely be missed (or mis-characterized) by infrequent sampling. In addition, the online FCM data from the river water was combined and correlated with online measurements of abiotic parameters, showing considerable potential for a better understanding of cause-and-effect relationships in aquatic ecosystems. Although several challenges remain, the successful operation of an automated online FCM system and the basic interpretation of the resulting data sets represent a breakthrough toward the eventual establishment of fully automated online microbiological monitoring technologies. PMID:24917858
In 1993, the University of Michigan Air Quality Laboratory (UMAQL) designed a new wet-only precipitation collection system that was utilized in the Lake Michigan Loading Study. The collection system was designed to collect discrete mercury and trace element samples on an event b...
NASA Astrophysics Data System (ADS)
Pisano, Luca; Vessia, Giovanna; Vennari, Carmela; Parise, Mario
2015-04-01
Empirical rainfall thresholds are a well-established method to draw information about the Duration (D) and Cumulated rainfall (E) values that are likely to initiate shallow landslides. To this end, rain-gauge records of rainfall heights are commonly used. Several procedures can be applied to calculate the duration, the cumulated height and, eventually, the intensity values of the rainfall events responsible for shallow landslide onset. A large number of procedures are drawn from particular geological settings and climate conditions based on an expert identification of the rainfall event. A few researchers have recently devised automated procedures to reconstruct the rainfall events responsible for landslide onset. In this study, 300 (D, E) pairs, related to shallow landslides that occurred on the Italian territory in the ten-year span 2002-2012, have been drawn by means of two procedures: the expert method (Brunetti et al., 2010) and the automated method (Vessia et al., 2014). The two procedures start from the same sources of information on shallow landslides that occurred during or soon after a rainfall. Although they share the method to select the date (up to the hour of the landslide occurrence), the site of the landslide and the choice of the rain gauge representative of the rainfall, they differ in how the Duration and Cumulated height of the rainfall event are calculated. Moreover, the expert procedure identifies only one (D, E) pair for each landslide, whereas the automated procedure draws six possible (D, E) pairs for the same landslide event. The 300 (D, E) pairs calculated by the automated procedure reproduce about 80% of the E values and about 60% of the D values calculated by the expert procedure. Unfortunately, no standard methods are available for checking the forecasting ability of either the expert or the automated reconstruction of the true (D, E) pairs that result in shallow landslides. Nonetheless, a statistical analysis of the marginal distributions of the seven samples of 300 D and E values is performed in this study. The main objective of this statistical analysis is to highlight similarities and differences between the two sets of Duration and Cumulated values collected by the two procedures. First, the sample distributions have been investigated: the seven E samples are lognormally distributed, whereas the D samples all follow a Weibull distribution. On the E samples, owing to their lognormal distribution, statistical tests can be applied to check two null hypotheses: equal mean values through the Student's t-test, and equal standard deviations through the Fisher test. These two hypotheses are accepted for the seven E samples, meaning that they come from the same population, at a confidence level of 95%. Conversely, the preceding tests cannot be applied to the seven D samples, which are Weibull distributed with shape parameters k ranging between 0.9 and 1.2. Nonetheless, the two procedures identify the rainfall event through the selection of the E values, after which D is drawn. Thus, the results of this statistical analysis preliminarily confirm the similarity of the two sets of (D, E) pairs drawn from the two procedures. References: Brunetti, M.T., Peruccacci, S., Rossi, M., Luciani, S., Valigi, D., and Guzzetti, F.: Rainfall thresholds for the possible occurrence of landslides in Italy, Nat. Hazards Earth Syst. Sci., 10, 447-458, doi:10.5194/nhess-10-447-2010, 2010.
Vessia G., Parise M., Brunetti M.T., Peruccacci S., Rossi M., Vennari C., and Guzzetti F.: Automated reconstruction of rainfall events responsible for shallow landslides, Nat. Hazards Earth Syst. Sci., 14, 2399-2408, doi: 10.5194/nhess-14-2399-2014, 2014.
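The hypothesis tests described above (a Student's t-test for equal means and a Fisher F-test for equal standard deviations, applied to the lognormally distributed E samples) can be sketched as follows on synthetic data; the sample sizes and distribution parameters are illustrative assumptions, not the paper's data.

```python
# Illustrative sketch (synthetic data): comparing two lognormally distributed
# samples of cumulated rainfall E with a Student's t-test on means and a
# Fisher F-test on variances, applied to log-transformed values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
e_expert    = rng.lognormal(mean=3.5, sigma=0.6, size=300)   # mm, synthetic
e_automated = rng.lognormal(mean=3.5, sigma=0.6, size=300)   # mm, synthetic

x, y = np.log(e_expert), np.log(e_automated)

# Student's t-test for equal means (assumes equal variances).
t_stat, p_mean = stats.ttest_ind(x, y, equal_var=True)

# Fisher's F-test for equal variances (two-sided p-value).
f_stat = x.var(ddof=1) / y.var(ddof=1)
dfx, dfy = len(x) - 1, len(y) - 1
p_var = 2 * min(stats.f.cdf(f_stat, dfx, dfy), stats.f.sf(f_stat, dfx, dfy))

print(f"t = {t_stat:.2f}, p(equal means) = {p_mean:.3f}")
print(f"F = {f_stat:.2f}, p(equal variances) = {p_var:.3f}")
# Both null hypotheses are retained at the 95% confidence level when p > 0.05.
```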
Azim, Syed; Juergens, Craig; Hines, John; McLaws, Mary-Louise
2016-07-01
Human auditing and collation of hand hygiene compliance data take hundreds of hours. We report on 24/7 overt observations to establish the adjusted average daily hand hygiene opportunities (HHOs) used as the denominator in an automated surveillance system that reports daily compliance rates. Overt 24/7 observation collected HHOs in medical and surgical wards. Accredited auditors observed health care workers' interactions between patient and patient zones to collect the total number of HHOs, indications, and compliance and noncompliance. Automated surveillance captured compliance (i.e., events) via low-power radio connected to alcohol-based handrub (ABHR) dispensers. Events were divided by HHOs, adjusted for the daily patient-to-nurse ratio, to establish daily rates. Human auditors collected 21,450 HHOs during the 24/7 observation period, an average of 1,532 unadjusted HHOs per day. This was 4.4 times larger than the minimum ward sample required for accreditation. The average adjusted HHOs for ABHR alone were 63 HHOs per patient day on the medical ward and 40 HHOs per patient day on the surgical ward. From July 1, 2014-July 31, 2015, the automated surveillance system collected 889,968 events. Automated surveillance collects 4 times as much data on each ward per day as a human auditor usually collects for a quarterly compliance report. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.
Turbidity threshold sampling for suspended sediment load estimation
Jack Lewis; Rand Eads
2001-01-01
Abstract - The paper discusses an automated procedure for measuring turbidity and sampling suspended sediment. The basic equipment consists of a programmable data logger, an in situ turbidimeter, a pumping sampler, and a stage-measuring device. The data logger program employs turbidity to govern sample collection during each transport event. Mounting configurations and...
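As a rough illustration of the kind of turbidity-governed sampling logic the record describes, the sketch below triggers a pumped sample whenever turbidity crosses preset rising- or falling-limb thresholds and bottles remain; the threshold values, bottle count, and function names are assumptions, not the published Turbidity Threshold Sampling program.

```python
# Illustrative sketch of turbidity-threshold sampling logic (not the published
# TTS program): trigger a pumped sample when turbidity crosses preset
# thresholds on the rising or falling limb of a transport event.

RISING_THRESHOLDS = [20, 50, 100, 200, 400]    # NTU, assumed values
FALLING_THRESHOLDS = [300, 150, 75, 35, 15]    # NTU, assumed values
MAX_BOTTLES = 24                               # assumed pumping-sampler capacity

def run_sampler(readings):
    """readings: iterable of (timestamp, turbidity_ntu) tuples."""
    bottles_used = 0
    rising_idx = falling_idx = 0
    prev_ntu = None
    samples = []
    for ts, ntu in readings:
        if prev_ntu is not None and bottles_used < MAX_BOTTLES:
            rising = ntu > prev_ntu
            if rising and rising_idx < len(RISING_THRESHOLDS) \
                    and ntu >= RISING_THRESHOLDS[rising_idx]:
                samples.append((ts, ntu, "rising"))
                bottles_used += 1
                rising_idx += 1
            elif not rising and falling_idx < len(FALLING_THRESHOLDS) \
                    and ntu <= FALLING_THRESHOLDS[falling_idx]:
                samples.append((ts, ntu, "falling"))
                bottles_used += 1
                falling_idx += 1
        prev_ntu = ntu
    return samples

if __name__ == "__main__":
    # Synthetic storm: turbidity rises then recedes.
    hydrograph = list(enumerate([5, 12, 30, 80, 180, 350, 420,
                                 380, 250, 120, 60, 25, 10]))
    for ts, ntu, limb in run_sampler(hydrograph):
        print(f"t={ts:2d}  turbidity={ntu:3d} NTU  limb={limb}")
```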
Intra-storm variability and soluble fractionation was explored for summer-time rain events in Steubenville, Ohio to evaluate the physical processes controlling mercury (Hg) in wet deposition in this industrialized region. Comprehensive precipitation sample collection was conducte...
Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest
NASA Technical Reports Server (NTRS)
Rohloff, Kurt
2010-01-01
The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of SMEs, automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology - the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns based on direct observations of sampled factor data, yielding a deeper understanding of societal behaviors while remaining tolerant of observation noise and missing data. The discovered patterns are simple to interpret and mimic human identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
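A minimal sketch of the underlying idea of sequential-pattern matching, checking whether an ordered pattern of factor observations occurs within a lookback window before an event of interest, is given below; the factor names, window length, and matching rule are invented for illustration and do not reproduce the reported methodology.

```python
# Hedged sketch of sequential-pattern matching: does an ordered pattern of
# observed factors occur, in order, within a lookback window before an event
# of interest? Factor names and window length are invented.

def pattern_precedes(events, pattern, event_time, lookback=90):
    """events: list of (day, factor) observations; pattern: ordered factor list."""
    window = [f for d, f in sorted(events) if event_time - lookback <= d < event_time]
    idx = 0
    for factor in window:
        if factor == pattern[idx]:
            idx += 1
            if idx == len(pattern):
                return True
    return False

observed = [(5, "food_price_spike"), (20, "protest"), (41, "security_crackdown"),
            (55, "opposition_arrests"), (70, "general_strike")]
pattern = ["protest", "security_crackdown", "general_strike"]
print(pattern_precedes(observed, pattern, event_time=80))   # True
```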
Automation: Decision Aid or Decision Maker?
NASA Technical Reports Server (NTRS)
Skitka, Linda J.
1998-01-01
This study clarified that automation bias is something unique to automated decision-making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events on the same tasks with and without an automated decision aid, we were able to determine that at least the omission error part of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid did lead to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate. People performed almost exactly at the same level of reliability as the automation (which across events was 88% reliable). However, it is also clear that the presence of less than 100% accurate automated decision aids creates a context in which new kinds of errors in decision making can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.
VizieR Online Data Catalog: OGLE-II DIA microlensing events (Wozniak+, 2001)
NASA Astrophysics Data System (ADS)
Wozniak, P. R.; Udalski, A.; Szymanski, M.; Kubiak, M.; Pietrzynski, G.; Soszynski, I.; Zebrun, K.
2002-11-01
We present a sample of microlensing events discovered in the Difference Image Analysis (DIA) of the OGLE-II images collected during three observing seasons, 1997-1999. 4424 light curves pass our criteria on the presence of a brightening episode on top of a constant baseline. Among those, 512 candidate microlensing events were selected visually. We designed an automated procedure, which unambiguously selects up to 237 best events. Including eight candidate events recovered by other means, a total of 520 light curves are presented in this work. (4 data files).
Impact of sampling techniques on measured stormwater quality data for small streams
Harmel, R.D.; Slade, R.M.; Haney, R.L.
2010-01-01
Science-based sampling methodologies are needed to enhance water quality characterization for setting appropriate water quality standards, developing Total Maximum Daily Loads, and managing nonpoint source pollution. Storm event sampling, which is vital for adequate assessment of water quality in small (wadeable) streams, is typically conducted by manual grab or integrated sampling or with an automated sampler. Although it is typically assumed that samples from a single point adequately represent mean cross-sectional concentrations, especially for dissolved constituents, this assumption of well-mixed conditions has received limited evaluation. Similarly, the impact of temporal (within-storm) concentration variability is rarely considered. Therefore, this study evaluated differences in stormwater quality measured in small streams with several common sampling techniques, which in essence evaluated within-channel and within-storm concentration variability. Constituent concentrations from manual grab samples and from integrated samples were compared for 31 events, then concentrations were also compared for seven events with automated sample collection. Comparison of sampling techniques indicated varying degrees of concentration variability within channel cross sections for both dissolved and particulate constituents, which is contrary to common assumptions of substantial variability in particulate concentrations and of minimal variability in dissolved concentrations. Results also indicated the potential for substantial within-storm (temporal) concentration variability for both dissolved and particulate constituents. Thus, failing to account for potential cross-sectional and temporal concentration variability in stormwater monitoring projects can introduce additional uncertainty in measured water quality data. Copyright © 2010 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
Oosterwijk, J C; Knepflé, C F; Mesker, W E; Vrolijk, H; Sloos, W C; Pattenier, H; Ravkin, I; van Ommen, G J; Kanhai, H H; Tanke, H J
1998-01-01
This article explores the feasibility of the use of automated microscopy and image analysis to detect the presence of rare fetal nucleated red blood cells (NRBCs) circulating in maternal blood. The rationales for enrichment and for automated image analysis for "rare-event" detection are reviewed. We also describe the application of automated image analysis to 42 maternal blood samples, using a protocol consisting of one-step enrichment followed by immunocytochemical staining for fetal hemoglobin (HbF) and FISH for X- and Y-chromosomal sequences. Automated image analysis consisted of multimode microscopy and subsequent visual evaluation of image memories containing the selected objects. The FISH results were compared with the results of conventional karyotyping of the chorionic villi. By use of manual screening, 43% of the slides were found to be positive (≥1 NRBC), with a mean number of 11 NRBCs (range 1-40). By automated microscopy, 52% were positive, with on average 17 NRBCs (range 1-111). There was a good correlation between manual and automated screening, but the NRBC yield from automated image analysis was found to be superior to that from manual screening (P=.0443), particularly when the NRBC count was >15. Seven (64%) of 11 XY fetuses were correctly diagnosed by FISH analysis of automatically detected cells, and all discrepancies were restricted to the lower cell-count range. We believe that automated microscopy and image analysis reduce the screening workload, are more sensitive than manual evaluation, and can be used to detect rare HbF-containing NRBCs in maternal blood. PMID:9837832
Code of Federal Regulations, 2013 CFR
2013-01-01
... Office for an inspection assignment (See § 301.2 (yyy)). (1) If the Automated Import Information System... place samples in the vehicle for easy removal and reinspection by an import inspector. (3) In the event...
Code of Federal Regulations, 2010 CFR
2010-01-01
... Office for an inspection assignment (See § 301.2 (yyy)). (1) If the Automated Import Information System... place samples in the vehicle for easy removal and reinspection by an import inspector. (3) In the event...
Code of Federal Regulations, 2014 CFR
2014-01-01
... Office for an inspection assignment (See § 301.2 (yyy)). (1) If the Automated Import Information System... place samples in the vehicle for easy removal and reinspection by an import inspector. (3) In the event...
Code of Federal Regulations, 2012 CFR
2012-01-01
... Office for an inspection assignment (See § 301.2 (yyy)). (1) If the Automated Import Information System... place samples in the vehicle for easy removal and reinspection by an import inspector. (3) In the event...
Code of Federal Regulations, 2011 CFR
2011-01-01
... Office for an inspection assignment (See § 301.2 (yyy)). (1) If the Automated Import Information System... place samples in the vehicle for easy removal and reinspection by an import inspector. (3) In the event...
Automated reconstruction of rainfall events responsible for shallow landslides
NASA Astrophysics Data System (ADS)
Vessia, G.; Parise, M.; Brunetti, M. T.; Peruccacci, S.; Rossi, M.; Vennari, C.; Guzzetti, F.
2014-04-01
Over the last 40 years, many contributions have been devoted to identifying the empirical rainfall thresholds (e.g. intensity vs. duration ID, cumulated rainfall vs. duration ED, cumulated rainfall vs. intensity EI) for the initiation of shallow landslides, based on local as well as worldwide inventories. Although different methods to trace the threshold curves have been proposed and discussed in the literature, a systematic effort to develop an automated procedure for selecting the rainfall event responsible for the landslide occurrence has rarely been addressed. Nonetheless, objective criteria for estimating the rainfall responsible for the landslide occurrence (effective rainfall) play a prominent role in determining the threshold values. In this paper, two criteria for the identification of the effective rainfall events are presented: (1) the first is based on the analysis of the time series of rainfall mean intensity values over the month preceding the landslide occurrence, and (2) the second on the analysis of the trend of the cumulated mean intensity series calculated from the rain-gauge records. The two criteria have been implemented in an automated procedure written in the R language. A sample of 100 shallow landslides collected in Italy by the CNR-IRPI research group from 2002 to 2012 has been used to calibrate the proposed procedure. The cumulated rainfall E and duration D of the rainfall events that triggered the documented landslides are calculated through the new procedure and fitted with a power law in the (D, E) diagram. The results are discussed by comparing the (D, E) pairs calculated by the automated procedure with those obtained by the expert method.
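A hedged sketch of the general task described above, extracting the duration D and cumulated rainfall E of the rainfall event preceding a landslide from an hourly rain-gauge series, follows; the 6-hour dry-gap rule used to delimit the event is an assumption for illustration, not one of the paper's two criteria.

```python
# Hedged sketch: extract duration D (h) and cumulated rainfall E (mm) of the
# rainfall event preceding a landslide from an hourly rain-gauge series.
# The 6-hour dry-gap rule delimiting the event is an assumption, not the
# published criterion.

def effective_event(hourly_rain_mm, landslide_hour, dry_gap=6):
    """hourly_rain_mm: list of hourly totals; landslide_hour: index of failure."""
    end = landslide_hour
    start = end
    dry = 0
    # Walk backwards from the landslide until a dry spell of `dry_gap` hours.
    for i in range(end, -1, -1):
        if hourly_rain_mm[i] > 0.0:
            start = i
            dry = 0
        else:
            dry += 1
            if dry >= dry_gap:
                break
    event = hourly_rain_mm[start:end + 1]
    D = len(event)                 # duration in hours
    E = sum(event)                 # cumulated rainfall in mm
    return D, E

if __name__ == "__main__":
    rain = [0, 0, 2, 5, 0, 0, 0, 0, 0, 0, 1, 4, 9, 12, 7, 3, 0, 6]
    D, E = effective_event(rain, landslide_hour=17)
    print(f"D = {D} h, E = {E} mm")
```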
HepSim: A repository with predictions for high-energy physics experiments
Chekanov, S. V.
2015-02-03
A file repository for calculations of cross sections and kinematic distributions using Monte Carlo generators for high-energy collisions is discussed. The repository is used to facilitate effective preservation and archiving of data from theoretical calculations and for comparisons with experimental data. The HepSim data library is publicly accessible and includes a number of Monte Carlo event samples with Standard Model predictions for current and future experiments. The HepSim project includes a software package to automate the process of downloading and viewing online Monte Carlo event samples. Data streaming over a network for end-user analysis is discussed.
Assessing drivers' response during automated driver support system failures with non-driving tasks.
Shen, Sijun; Neyens, David M
2017-06-01
With the increase in automated driver support systems, drivers are shifting from operating their vehicles to supervising their automation. As a result, it is important to understand how drivers interact with these automated systems and evaluate their effect on driver responses to safety critical events. This study aimed to identify how drivers responded when experiencing a safety critical event in automated vehicles while also engaged in non-driving tasks. In total 48 participants were included in this driving simulator study with two levels of automated driving: (a) driving with no automation and (b) driving with adaptive cruise control (ACC) and lane keeping (LK) systems engaged; and also two levels of a non-driving task (a) watching a movie or (b) no non-driving task. In addition to driving performance measures, non-driving task performance and the mean glance duration for the non-driving task were compared between the two levels of automated driving. Drivers using the automated systems responded worse than those manually driving in terms of reaction time, lane departure duration, and maximum steering wheel angle to an induced lane departure event. These results also found that non-driving tasks further impaired driver responses to a safety critical event in the automated system condition. In the automated driving condition, driver responses to the safety critical events were slower, especially when engaged in a non-driving task. Traditional driver performance variables may not necessarily effectively and accurately evaluate driver responses to events when supervising autonomous vehicle systems. Thus, it is important to develop and use appropriate variables to quantify drivers' performance under these conditions. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.
APDS: the autonomous pathogen detection system.
Hindson, Benjamin J; Makarewicz, Anthony J; Setlur, Ujwal S; Henderer, Bruce D; McBride, Mary T; Dzenitis, John M
2005-04-15
We have developed and tested a fully autonomous pathogen detection system (APDS) capable of continuously monitoring the environment for airborne biological threat agents. The system was developed to provide early warning to civilians in the event of a bioterrorism incident and can be used at high profile events for short-term, intensive monitoring or in major public buildings or transportation nodes for long-term monitoring. The APDS is completely automated, offering continuous aerosol sampling, in-line sample preparation fluidics, multiplexed detection and identification immunoassays, and nucleic acid-based polymerase chain reaction (PCR) amplification and detection. Highly multiplexed antibody-based and duplex nucleic acid-based assays are combined to reduce false positives to a very low level, lower reagent costs, and significantly expand the detection capabilities of this biosensor. This article provides an overview of the current design and operation of the APDS. Certain sub-components of the APDS are described in detail, including the aerosol collector, the automated sample preparation module that performs multiplexed immunoassays with confirmatory PCR, and the data monitoring and communications system. Data obtained from an APDS that operated continuously for 7 days in a major U.S. transportation hub are reported.
Automated parton-shower variations in PYTHIA 8
Mrenna, S.; Skands, P.
2016-10-03
In the era of precision physics measurements at the LHC, efficient and exhaustive estimations of theoretical uncertainties play an increasingly crucial role. In the context of Monte Carlo (MC) event generators, the estimation of such uncertainties traditionally requires independent MC runs for each variation, for a linear increase in total run time. In this work, we report on an automated evaluation of the dominant (renormalization-scale and nonsingular) perturbative uncertainties in the pythia 8 event generator, with only a modest computational overhead. Each generated event is accompanied by a vector of alternative weights (one for each uncertainty variation), with each set separately preserving the total cross section. Explicit scale-compensating terms can be included, reflecting known coefficients of higher-order splitting terms and reducing the effect of the variations. In conclusion, the formalism also allows for the enhancement of rare partonic splittings, such as g→bb¯ and q→qγ, to obtain weighted samples enriched in these splittings while preserving the correct physical Sudakov factors.
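The per-event weight vectors described above can be used as sketched below to build one histogram per variation from a single event sample; the code is generator-agnostic with invented observables and weights, and does not use the actual PYTHIA 8 interface.

```python
# Hedged, generator-agnostic sketch: filling one observable histogram per
# uncertainty variation from per-event weight vectors (invented data; not the
# PYTHIA 8 API).
import numpy as np

rng = np.random.default_rng(0)
n_events = 10_000
variations = ["nominal", "muR_up", "muR_down"]

# Invented per-event observable (e.g. a jet pT in GeV) and weight vectors.
observable = rng.exponential(scale=40.0, size=n_events)
weights = np.column_stack([
    np.ones(n_events),                            # nominal
    1.0 + 0.1 * rng.standard_normal(n_events),    # renormalization scale up
    1.0 - 0.1 * rng.standard_normal(n_events),    # renormalization scale down
])

bins = np.linspace(0.0, 200.0, 21)
histograms = {
    name: np.histogram(observable, bins=bins, weights=weights[:, i])[0]
    for i, name in enumerate(variations)
}

# The spread of the varied histograms around the nominal one gives a per-bin
# uncertainty estimate from a single event sample.
envelope = np.vstack([histograms["muR_up"], histograms["muR_down"]])
uncertainty = np.abs(envelope - histograms["nominal"]).max(axis=0)
print("nominal, first 5 bins:", histograms["nominal"][:5])
print("uncertainty, first 5 bins:", uncertainty[:5])
```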
Sequence-of-events-driven automation of the deep space network
NASA Technical Reports Server (NTRS)
Hill, R., Jr.; Fayyad, K.; Smyth, C.; Santos, T.; Chen, R.; Chien, S.; Bevan, R.
1996-01-01
In February 1995, sequence-of-events (SOE)-driven automation technology was demonstrated for a Voyager telemetry downlink track at DSS 13. This demonstration entailed automated generation of an operations procedure (in the form of a temporal dependency network) from project SOE information using artificial intelligence planning technology and automated execution of the temporal dependency network using the link monitor and control operator assistant system. This article describes the overall approach to SOE-driven automation that was demonstrated, identifies gaps in SOE definitions and project profiles that hamper automation, and provides detailed measurements of the knowledge engineering effort required for automation.
Automation of high-frequency sampling of environmental waters for reactive species
NASA Astrophysics Data System (ADS)
Kim, H.; Bishop, J. K.; Wood, T.; Fung, I.; Fong, M.
2011-12-01
Trace metals, particularly iron and manganese, play a critical role in some ecosystems as limiting factors for primary productivity, in geochemistry (especially redox chemistry) as important electron donors and acceptors, and in aquatic environments as carriers of contaminant transport. Dynamics of trace metals are closely related to various hydrologic events such as rainfall. Storm flow triggers dramatic changes in both dissolved and particulate trace metal concentrations and affects other important environmental parameters linked to trace metal behavior such as dissolved organic carbon (DOC). To improve our understanding of the behavior of trace metals and the underlying processes, water chemistry information must be collected for an adequately long period of time and at a higher frequency than conventional manual sampling (e.g. weekly or biweekly). In this study, we developed an automated sampling system to document the dynamics of trace metals, focusing on Fe and Mn, and DOC for a multiple-year high-frequency geochemistry time series in a small catchment called Rivendell, located at the Angelo Coast Range Reserve, California. We are sampling groundwater and streamwater with the automated sampling system at daily frequency, and site conditions vary substantially from season to season. The pH of groundwater and streamwater ranges from 5 to 7 and from 7.8 to 8.3, respectively. DOC is usually sub-ppm, but during rain events, it increases by an order of magnitude. The automated sampling system focuses on two aspects: (1) a modified sampler design to improve sample integrity for trace metals and DOC, and (2) a remote control system to update sampling volume and timing according to hydrological conditions. To maintain sample integrity, the developed method employed gravity filtering using large-volume syringes (140 mL) and syringe filters connected to a set of polypropylene bottles and a borosilicate bottle via Teflon tubing. Without filtration, the dissolved concentrations of Fe and Mn in groundwater and streamwater samples stored in low-density polyethylene (LDPE) sample bags decreased by 89% and 97%, respectively, within a few days. In some groundwater samples, the concentration of Ca decreased by 25% due to degassing of CO2. However, DOC in unfiltered samples held in LDPE bags increased by up to 50% in 2 weeks, suggesting contamination from the bag. Performance of the new design was evaluated using Fe- and Mn-spiked Rivendell samples and environmental water samples collected from (1) Rivendell, (2) Strawberry Creek on the University of California, Berkeley campus, and (3) San Francisco Bay. The samples were filtered using the developed method and stored at room temperature for 2-3 weeks without further treatment. The method improved the sample integrity significantly; the average recovery rates of Fe, Mn, DOC, and Ca were 92%, 98%, 90%, and 97%, respectively.
Comparison of water-quality samples collected by siphon samplers and automatic samplers in Wisconsin
Graczyk, David J.; Robertson, Dale M.; Rose, William J.; Steur, Jeffrey J.
2000-01-01
In small streams, flow and water-quality concentrations often change quickly in response to meteorological events. Hydrologists, field technicians, or locally hired stream observers involved in water-data collection are often unable to reach streams quickly enough to observe or measure these rapid changes. Therefore, in hydrologic studies designed to describe changes in water quality, a combination of manual and automated sampling methods has commonly been used: manual methods when flow is relatively stable and automated methods when flow is rapidly changing. Automated sampling, which makes use of equipment programmed to collect samples in response to changes in stage and flow of a stream, has been shown to be an effective method of sampling to describe the rapid changes in water quality (Graczyk and others, 1993). Because of the high cost of automated sampling, however, especially for studies examining a large number of sites, alternative methods have been considered for collecting samples during rapidly changing stream conditions. One such method employs the siphon sampler (fig. 1), also referred to as the "single-stage sampler." Siphon samplers are inexpensive to build (about $25-$50 per sampler), operate, and maintain, so they are cost-effective to use at a large number of sites. Their ability to collect samples representing the average quality of water passing through the entire cross section of a stream, however, has not been fully demonstrated for many types of stream sites.
Ialongo, Cristiano; Pieri, Massimo; Bernardini, Sergio
2017-02-01
Diluting a sample to obtain a measure within the analytical range is a common task in clinical laboratories. However, for urgent samples, it can cause delays in test reporting, which can put patients' safety at risk. The aim of this work is to show that a simple artificial neural network can be used to make prediluting a sample unnecessary, using the information available through the laboratory information system. In particular, the Multilayer Perceptron neural network built on a data set of 16,106 cardiac troponin I test records produced a correct inference rate of 100% for samples not requiring predilution and 86.2% for those requiring predilution. With respect to the inference reliability, the most relevant inputs were the presence of a cardiac event or surgery and the result of the previous assay. Therefore, such an artificial neural network can be easily implemented into a total automation framework to appreciably reduce the turnaround time of critical orders delayed by the operations required to retrieve, dilute, and retest the sample.
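A minimal, hedged sketch of this kind of classifier, an MLP trained on laboratory-information-system features (previous result and a cardiac event/surgery flag) to predict whether predilution will be needed, is shown below on synthetic records; the feature construction and analytical-range limit are assumptions, not the authors' model or data.

```python
# Hedged sketch: an MLP classifier predicting whether a troponin I sample will
# need predilution, trained on synthetic laboratory-information-system
# features. Not the authors' model; all values are invented.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 5000
previous_result = rng.lognormal(mean=-1.0, sigma=1.5, size=n)   # ng/mL, synthetic
cardiac_event = rng.integers(0, 2, size=n)                      # 0/1 flag

# Synthetic label: dilution needed when the expected result exceeds an
# assumed upper limit of the analytical range.
expected = previous_result * (1.0 + 2.0 * cardiac_event) * rng.lognormal(0, 0.3, n)
needs_dilution = (expected > 2.0).astype(int)                   # assumed limit

X = np.column_stack([previous_result, cardiac_event])
X_tr, X_te, y_tr, y_te = train_test_split(X, needs_dilution, test_size=0.3,
                                          random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                    random_state=0))
model.fit(X_tr, y_tr)
print(f"accuracy on held-out samples: {model.score(X_te, y_te):.3f}")
```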
Rochefort, Christian M; Buckeridge, David L; Tanguay, Andréanne; Biron, Alain; D'Aragon, Frédérick; Wang, Shengrui; Gallix, Benoit; Valiquette, Louis; Audet, Li-Anne; Lee, Todd C; Jayaraman, Dev; Petrucci, Bruno; Lefebvre, Patricia
2017-02-16
Adverse events (AEs) in acute care hospitals are frequent and associated with significant morbidity, mortality, and costs. Measuring AEs is necessary for quality improvement and benchmarking purposes, but current detection methods lack accuracy, efficiency, and generalizability. The growing availability of electronic health records (EHR) and the development of natural language processing techniques for encoding narrative data offer an opportunity to develop potentially better methods. The purpose of this study is to determine the accuracy and generalizability of using automated methods for detecting three high-incidence and high-impact AEs from EHR data: (a) hospital-acquired pneumonia, (b) ventilator-associated event, and (c) central line-associated bloodstream infection. This validation study will be conducted among medical, surgical and ICU patients admitted between 2013 and 2016 to the Centre hospitalier universitaire de Sherbrooke (CHUS) and the McGill University Health Centre (MUHC), which has both French and English sites. A random 60% sample of CHUS patients will be used for model development purposes (cohort 1, development set). Using a random sample of these patients, a reference standard assessment of their medical chart will be performed. Multivariate logistic regression and the area under the curve (AUC) will be employed to iteratively develop and optimize three automated AE detection models (i.e., one per AE of interest) using EHR data from the CHUS. These models will then be validated on a random sample of the remaining 40% of CHUS patients (cohort 1, internal validation set) using chart review to assess accuracy. The most accurate models developed and validated at the CHUS will then be applied to EHR data from a random sample of patients admitted to the MUHC French site (cohort 2) and English site (cohort 3), a critical requirement given the use of narrative data, and accuracy will be assessed using chart review. Generalizability will be determined by comparing AUCs from cohorts 2 and 3 to those from cohort 1. This study will likely produce more accurate and efficient measures of AEs. These measures could be used to assess the incidence rates of AEs, evaluate the success of preventive interventions, or benchmark performance across hospitals.
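The develop-then-validate pattern described in the protocol (logistic regression fitted on a development cohort, AUC assessed on a separate validation cohort) can be sketched as follows; the two synthetic predictors stand in for EHR-derived features and are not those of the study.

```python
# Hedged sketch of the develop/validate pattern described above: fit a
# logistic regression on a development cohort and report AUC on a separate
# validation cohort (synthetic features standing in for EHR-derived ones).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)

def synth_cohort(n):
    # Two invented predictors, e.g. an antibiotic-order flag and a narrative
    # keyword count from radiology reports.
    x1 = rng.integers(0, 2, size=n)
    x2 = rng.poisson(1.5, size=n)
    logit = -3.0 + 1.2 * x1 + 0.8 * x2
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))
    return np.column_stack([x1, x2]), y.astype(int)

X_dev, y_dev = synth_cohort(6000)      # cohort 1, development set
X_val, y_val = synth_cohort(4000)      # cohort 1, internal validation set

model = LogisticRegression().fit(X_dev, y_dev)
auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"validation AUC = {auc:.3f}")
```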
Automated seismic waveform location using Multichannel Coherency Migration (MCM)-I. Theory
NASA Astrophysics Data System (ADS)
Shi, Peidong; Angus, Doug; Rost, Sebastian; Nowacki, Andy; Yuan, Sanyi
2018-03-01
With the proliferation of dense seismic networks sampling the full seismic wavefield, recorded seismic data volumes are getting bigger and automated analysis tools to locate seismic events are essential. Here, we propose a novel Multichannel Coherency Migration (MCM) method to locate earthquakes in continuous seismic data and reveal the location and origin time of seismic events directly from recorded waveforms. By continuously calculating the coherency between waveforms from different receiver pairs, MCM greatly expands the available information which can be used for event location. MCM does not require phase picking or phase identification, which allows fully automated waveform analysis. By migrating the coherency between waveforms, MCM leads to improved source energy focusing. We have tested and compared MCM to other migration-based methods in noise-free and noisy synthetic data. The tests and analysis show that MCM is noise resistant and can achieve more accurate results compared with other migration-based methods. MCM is able to suppress strong interference from other seismic sources occurring at a similar time and location. It can be used with arbitrary 3D velocity models and is able to obtain reasonable location results with smooth but inaccurate velocity models. MCM exhibits excellent location performance and can be easily parallelized giving it large potential to be developed as a real-time location method for very large datasets.
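A much-simplified sketch of the coherency-migration idea, aligning waveform windows by candidate travel times and averaging pairwise correlation coefficients over a location grid, is shown below on a synthetic 2-D example with straight-ray travel times; it illustrates the stacking principle only and is not the published MCM implementation.

```python
# Simplified sketch of coherency migration in the spirit of MCM: for each
# candidate source location, align waveform windows by straight-ray travel
# times and average the pairwise correlation coefficients. Synthetic 2-D
# example with invented geometry and velocity; not the published code.
import numpy as np
from itertools import combinations

fs, v = 100.0, 2000.0                          # sampling rate (Hz), velocity (m/s)
receivers = np.array([[0, 0], [1500, 0], [0, 1500], [1500, 1500]], float)
true_src = np.array([600.0, 900.0])

# Synthetic traces: a short wavelet arriving at each receiver, plus weak noise.
t = np.arange(0.0, 3.0, 1.0 / fs)
wavelet = lambda tt: np.exp(-(tt * 30.0) ** 2) * np.cos(2 * np.pi * 10.0 * tt)
traces = np.array([wavelet(t - np.linalg.norm(r - true_src) / v) for r in receivers])
traces += 1e-3 * np.random.default_rng(1).standard_normal(traces.shape)

win = int(0.2 * fs)                            # 0.2 s coherency window

def stacked_coherency(src):
    """Average pairwise correlation of travel-time-aligned windows."""
    starts = [int(np.linalg.norm(r - src) / v * fs) for r in receivers]
    segs = [traces[i, s:s + win] for i, s in enumerate(starts)]
    return np.mean([np.corrcoef(a, b)[0, 1] for a, b in combinations(segs, 2)])

# Grid search over candidate locations; the maximum marks the estimated source.
xs = ys = np.arange(0.0, 1501.0, 100.0)
best = max((stacked_coherency(np.array([x, y])), x, y) for x in xs for y in ys)
print(f"best coherency {best[0]:.2f} at x = {best[1]:.0f} m, y = {best[2]:.0f} m")
```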
Scholl, Zackary N.; Marszalek, Piotr E.
2013-01-01
The benefits of single molecule force spectroscopy (SMFS) clearly outweigh the challenges which include small sample sizes, tedious data collection and introduction of human bias during the subjective data selection. These difficulties can be partially eliminated through automation of the experimental data collection process for atomic force microscopy (AFM). Automation can be accomplished using an algorithm that triages usable force-extension recordings quickly with positive and negative selection. We implemented an algorithm based on the windowed fast Fourier transform of force-extension traces that identifies peaks using force-extension regimes to correctly identify usable recordings from proteins composed of repeated domains. This algorithm excels as a real-time diagnostic because it involves <30 ms computational time, has high sensitivity and specificity, and efficiently detects weak unfolding events. We used the statistics provided by the automated procedure to clearly demonstrate the properties of molecular adhesion and how these properties change with differences in the cantilever tip and protein functional groups and protein age. PMID:24001740
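A toy sketch of the triage idea, a windowed FFT that flags traces whose quasi-periodic sawtooth unfolding peaks put excess spectral power in a characteristic band, is given below; the window length, frequency band, and acceptance threshold are illustrative assumptions rather than the published algorithm.

```python
# Toy sketch of FFT-based triage of force-extension traces: repeated-domain
# unfolding gives quasi-periodic sawtooth peaks, so a windowed FFT shows
# excess power at the peak-spacing harmonics. Thresholds are illustrative.
import numpy as np

def sawtooth_trace(n=4096, n_peaks=8, noise=5.0, rng=None):
    """Synthetic force-extension trace (pN) with evenly spaced unfolding peaks."""
    if rng is None:
        rng = np.random.default_rng(0)
    x = np.linspace(0, 1, n)
    force = 60.0 * (x * n_peaks % 1.0)          # rising ramps that snap back
    return force + noise * rng.standard_normal(n)

def is_usable(trace, window=1024, band=(4, 64), ratio_threshold=0.2):
    """Positive selection: mean fraction of windowed-FFT power in `band` of bins."""
    scores = []
    for start in range(0, len(trace) - window + 1, window // 2):
        seg = trace[start:start + window]
        spec = np.abs(np.fft.rfft(seg - seg.mean())) ** 2
        scores.append(spec[band[0]:band[1]].sum() / spec[1:].sum())
    return np.mean(scores) > ratio_threshold

rng = np.random.default_rng(0)
good = sawtooth_trace(rng=rng)                   # unfolding events present
bad = 5.0 * rng.standard_normal(4096)            # noise-only recording
print("sawtooth trace usable:", is_usable(good))
print("noise-only trace usable:", is_usable(bad))
```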
Using continuous in-situ measurements to adaptively trigger urban storm water samples
NASA Astrophysics Data System (ADS)
Wong, B. P.; Kerkez, B.
2015-12-01
Until cost-effective in-situ sensors are available for biological parameters, nutrients and metals, automated samplers will continue to be the primary source of reliable water quality measurements. Given a limited number of sample bottles, however, autosamplers often obscure insights on nutrient sources and biogeochemical processes which would otherwise be captured using a continuous sampling approach. To that end, we evaluate the efficacy of a novel method to measure first-flush nutrient dynamics in flashy, urban watersheds. Our approach reduces the number of samples required to capture water quality dynamics by leveraging an internet-connected sensor node, which is equipped with a suite of continuous in-situ sensors and an automated sampler. To capture both the initial baseflow as well as storm concentrations, a cloud-hosted adaptive algorithm analyzes the high-resolution sensor data along with local weather forecasts to optimize a sampling schedule. The method was tested in a highly developed urban catchment in Ann Arbor, Michigan, and collected samples of nitrate, phosphorus, and suspended solids throughout several storm events. Results indicate that the watershed does not exhibit first-flush dynamics, a behavior that would have been obscured when using a non-adaptive sampling approach.
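One way such an adaptive trigger could look, combining an in-situ stage rise rate with a rainfall-forecast horizon to spread a limited number of bottles over the expected storm, is sketched below; all thresholds and the planning rule are assumptions, not the deployed algorithm.

```python
# Hedged sketch of an adaptive trigger for a bottle-limited autosampler: when
# stage rises quickly and rain is forecast, spread the remaining bottles over
# the expected storm duration. All thresholds are assumed values.
from datetime import datetime, timedelta

class AdaptiveTrigger:
    def __init__(self, bottles=24, rise_threshold_m_per_hr=0.05):
        self.bottles_left = bottles
        self.rise_threshold = rise_threshold_m_per_hr
        self.next_sample_time = None
        self.interval = None

    def update(self, now, stage_m, stage_1h_ago_m, storm_forecast_hours):
        """Return True if a bottle should be collected at `now`."""
        rising = (stage_m - stage_1h_ago_m) >= self.rise_threshold
        if rising and self.next_sample_time is None and storm_forecast_hours > 0:
            # Storm detected: plan the remaining bottles across the forecast.
            self.interval = timedelta(
                hours=storm_forecast_hours / max(self.bottles_left, 1))
            self.next_sample_time = now
        if self.next_sample_time is not None and now >= self.next_sample_time \
                and self.bottles_left > 0:
            self.bottles_left -= 1
            self.next_sample_time = now + self.interval
            return True
        return False

if __name__ == "__main__":
    trig = AdaptiveTrigger(bottles=6)
    t0 = datetime(2015, 6, 1, 0, 0)
    stage = [0.30, 0.32, 0.40, 0.55, 0.70, 0.72, 0.69, 0.60, 0.50, 0.45]
    for i in range(1, len(stage)):
        now = t0 + timedelta(hours=i)
        if trig.update(now, stage[i], stage[i - 1], storm_forecast_hours=6):
            print(f"{now:%H:%M} sample collected, {trig.bottles_left} bottles left")
```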
NASA Astrophysics Data System (ADS)
Stadler, Hermann; Skritek, Paul; Zerobin, Wolfgang; Klock, Erich; Farnleitner, Andreas H.
2010-05-01
In recent years, global changes in ecosystems, population growth, and modifications of the legal framework within the EU have caused an increased need for qualitative groundwater and spring water monitoring, with the aim of continuing to supply consumers with high-quality drinking water in the future. In addition, the demand for sustainable protection of drinking water resources has driven the implementation of early warning systems and quality assurance networks in water supplies. In the field of hydrogeological investigations, event monitoring and event sampling amount to worst-case-scenario monitoring. Such tools are therefore becoming increasingly indispensable for obtaining detailed information about aquifer parameters and vulnerability. In the framework of water supplies, smart sampling designs combined with in-situ measurements of different parameters and on-line access can play an important role in early warning systems and quality surveillance networks. In this study, nested sampling tiers are presented which were designed to cover the total system dynamics. Basic monitoring sampling (BMS), high-frequency sampling (HFS), and automated event sampling (AES) were combined. BMS was organized with a monthly increment for at least two years, and HFS was performed during times of increased groundwater recharge (e.g. during snowmelt). At least one AES tier was embedded in this system. AES was enabled by cross-linking of hydrological stations, so the system could be run fully automated and could include real-time availability of data. By means of networking via Low Earth Orbiting Satellites (LEO satellites), data from the precipitation station (PS) in the catchment area are brought together with data from the spring sampling station (SSS) without the need for terrestrial infrastructure for communication and power supply. Furthermore, the whole course of input and output parameters, such as precipitation (system input) and discharge (system output), and the status of the sampling system are transmitted via LEO satellites to a Central Monitoring Station (CMS), which can be linked with a web server to provide unlimited real-time data access. The automatically generated event notification to a local service team of the sampling station is transmitted via internet, GSM, GPRS, or LEO satellites. If a GPRS network is available at the stations, the system could also be realized via this network. However, one great problem of these terrestrial communication systems is the risk of failure when their networks are overloaded, for example during flood events or thunderstorms. Therefore, it is also necessary to be able to transmit the measured values via communication satellites when terrestrial infrastructure is not available. LEO satellites are especially useful in alpine regions because they have no dead spots, only occasional latency periods. In this work we combined in-situ measurements (precipitation, electrical conductivity, discharge, water temperature, spectral absorption coefficient, turbidity) with time increments from 1 to 15 minutes with data from the different sampling tiers (environmental isotopes, chemical, mineralogical and bacteriological data).
Post-processing of seismic parameter data based on valid seismic event determination
McEvilly, Thomas V.
1985-01-01
An automated seismic processing system and method are disclosed, including an array of CMOS microprocessors for unattended battery-powered processing of a multi-station network. According to a characterizing feature of the invention, each channel of the network is independently operable to automatically detect, measure times and amplitudes, and compute and fit Fast Fourier transforms (FFT's) for both P- and S- waves on analog seismic data after it has been sampled at a given rate. The measured parameter data from each channel are then reviewed for event validity by a central controlling microprocessor and if determined by preset criteria to constitute a valid event, the parameter data are passed to an analysis computer for calculation of hypocenter location, running b-values, source parameters, event count, P- wave polarities, moment-tensor inversion, and Vp/Vs ratios. The in-field real-time analysis of data maximizes the efficiency of microearthquake surveys allowing flexibility in experimental procedures, with a minimum of traditional labor-intensive postprocessing. A unique consequence of the system is that none of the original data (i.e., the sensor analog output signals) are necessarily saved after computation, but rather, the numerical parameters generated by the automatic analysis are the sole output of the automated seismic processor.
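A small sketch of the kind of central event-validity check the record describes, accepting a detection only when enough channels report triggers within a short coincidence window, follows; the station count and window length are assumed values, not the patented criteria.

```python
# Hedged sketch of a central event-validity check: a detection is accepted as
# a valid network event only if enough channels report P-wave triggers within
# a short coincidence window. The 4-station / 2-second criteria are assumptions.

def valid_event(channel_picks, min_stations=4, window_s=2.0):
    """channel_picks: dict station -> P-arrival time (s), or None if no trigger."""
    times = sorted(t for t in channel_picks.values() if t is not None)
    for i, t0 in enumerate(times):
        in_window = [t for t in times[i:] if t - t0 <= window_s]
        if len(in_window) >= min_stations:
            return True
    return False

picks = {"ST01": 12.31, "ST02": 12.47, "ST03": None, "ST04": 12.95,
         "ST05": 13.10, "ST06": 27.80}
print("valid event:", valid_event(picks))   # True: four picks within 2 s
```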
Skyalert: a Platform for Event Understanding and Dissemination
NASA Astrophysics Data System (ADS)
Williams, Roy; Drake, A. J.; Djorgovski, S. G.; Donalek, C.; Graham, M. J.; Mahabal, A.
2010-01-01
Skyalert.org is an event repository, web interface, and event-oriented workflow architecture that can be used in many different ways for handling astronomical events that are encoded as VOEvent. It can be used as a remote application (events in the cloud) or installed locally. Some applications are: Dissemination of events with sophisticated discrimination (trigger), using email, instant message, RSS, twitter, etc.; Authoring interface for survey-generated events, follow-up observations, and other event types; event streams can be put into the skyalert.org repository, either public or private, or into a local installation of Skyalert; Event-driven software components to fetch archival data, for data-mining and classification of events; human interface to events through wiki, comments, and circulars; use of the "notices and circulars" model, where machines make the notices in real time and people write the interpretation later; Building trusted, automated decisions for automated follow-up observation, and the information infrastructure for automated follow-up with DC3 and HTN telescope schedulers; Citizen science projects such as artifact detection and classification; Query capability for past events, including correlations between different streams and correlations with existing source catalogs; Event metadata structures and connection to the global registry of the virtual observatory.
THE RABIT: A RAPID AUTOMATED BIODOSIMETRY TOOL FOR RADIOLOGICAL TRIAGE
Garty, Guy; Chen, Youhua; Salerno, Alessio; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Amundson, Sally A.; Brenner, David J.
2010-01-01
In response to the recognized need for high throughput biodosimetry methods for use after large scale radiological events, a logical approach is complete automation of standard biodosimetric assays that are currently performed manually. We describe progress to date on the RABIT (Rapid Automated BIodosimetry Tool), designed to score micronuclei or γ-H2AX fluorescence in lymphocytes derived from a single drop of blood from a fingerstick. The RABIT system is designed to be completely automated, from the input of the capillary blood sample into the machine, to the output of a dose estimate. Improvements in throughput are achieved through use of a single drop of blood, optimization of the biological protocols for in-situ analysis in multi-well plates, implementation of robotic plate and liquid handling, and new developments in high-speed imaging. Automating well-established bioassays represents a promising approach to high-throughput radiation biodosimetry, both because high throughputs can be achieved and because the time to deployment is potentially much shorter than for a new biological assay. Here we describe the development of each of the individual modules of the RABIT system, and show preliminary data from key modules. System integration is ongoing, to be followed by calibration and validation. PMID:20065685
DOE Program on Seismic Characterization for Regions of Interest to CTBT Monitoring,
1995-08-14
processing of the monitoring network data). While developing and testing the corrections and other parameters needed by the automated processing systems...the secondary network. Parameters tabulated in the knowledge base must be appropriate for routine automated processing of network data, and must also...operation of the PNDC, as well as to results of investigations of "special events" (i.e., those events that fail to locate or discriminate during automated
Properties of induced seismicity at the geothermal reservoir Insheim, Germany
NASA Astrophysics Data System (ADS)
Olbert, Kai; Küperkoch, Ludger; Thomas, Meier
2017-04-01
Within the framework of the German MAGS2 Project, the processing of induced events at the geothermal power plant Insheim, Germany, has been reassessed and evaluated. The power plant is located close to the western rim of the Upper Rhine Graben in a region with a strongly heterogeneous subsurface. Therefore, the location of seismic events, particularly the depth estimation, is challenging. The seismic network, consisting of up to 50 stations, has an aperture of approximately 15 km around the power plant. Consequently, manual processing is time-consuming. Using a waveform similarity detection algorithm, the existing dataset from 2012 to 2016 has been reprocessed to complete the catalog of induced seismic events. Based on the waveform similarity, clusters of similar events have been detected. Automated P- and S-arrival time determination using an improved multi-component autoregressive prediction algorithm yields approximately 14,000 P- and S-arrivals for 758 events. Applying a dataset of manual picks as a reference, the automated picking algorithm has been optimized, resulting in a standard deviation of the residuals between automated and manual picks of about 0.02 s. The automated locations show uncertainties comparable to those of the manual reference dataset. 90% of the automated relocations fall within the error ellipsoid of the manual locations. The remaining locations are either badly resolved due to low numbers of picks or so well resolved that the automatic location is outside the error ellipsoid although located close to the manual location. The developed automated processing scheme proved to be a useful tool to supplement real-time monitoring. The event clusters are located at small patches of faults known from reflection seismic studies. The clusters are observed close to both the injection and the production wells.
Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David
2018-04-01
Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.
Renn, Danny E.
2000-01-01
Suspended-sediment samples and streamflow data were collected from May 1996 through June 1998 at three sites in the Grand Calumet River Basin - Indiana Harbor Canal at East Chicago, the east branch of the Grand Calumet River at Gary, and the west branch of the Grand Calumet River at Hammond. Sample analysis allowed for retention of sediments of 0.0015 millimeters or larger. At Indiana Harbor Canal at East Chicago, an automated sampler collected 2,005 suspended-sediment samples from the canal and, of these, 1,856 had associated streamflow values. To evaluate any bias between instream concentrations of suspended sediment and samples collected by the automated sampler, 27 sets of suspended-sediment samples were collected manually in the canal at the same time samples were collected by the automated sampler. There was no consistent bias between the samples collected manually instream and the samples collected by the automated sampler; therefore, no correction factor was applied to the concentrations of suspended sediment for the samples collected by the automated sampler. For the 2,005 and 1,856 samples, the mean suspended-sediment concentrations were the same, 15 milligrams per liter (mg/L), and the range in suspended-sediment concentrations was the same, from less than 1 mg/L to 97 mg/L. No apparent relation between the concentration of suspended sediment measured in samples from the Indiana Harbor Canal and streamflow was indicated, probably because of complex hydraulic conditions in the study area; most of the streamflow is from industrial and municipal discharges, and streamflow is affected by changes in water levels in Lake Michigan. There did appear to be a seasonal trend in the concentrations of suspended sediment, however, in that the largest concentrations generally were measured during the spring. During the study, four substantial rainfall events were recorded. Only for a rainfall event of 4.20 inches was there a substantial increase in the concentrations of suspended sediment and streamflow in the Indiana Harbor Canal. Six sets of samples were collected from the canal for determination of the percentage of organic material in the suspended sediment. Organic material in these samples averaged 26 percent. Bedload-sediment samples were collected three times in the canal with a bedload-sediment sampler; the collection-bag mesh size was 0.25 millimeter. No bedload sediments were collected in the sampler for any of the sample collections. Seven suspended-sediment samples were collected from the Grand Calumet River at Gary and at Hammond. The mean suspended-sediment concentration measured in samples collected at Gary was 13 mg/L, and the mean suspended-sediment concentration measured in samples collected at Hammond was 6 mg/L. For both sites, there was no apparent relation between the concentration of suspended sediment and streamflow. Four suspended-sediment samples were collected from the Grand Calumet River at Gary and at Hammond for determination of the percentage of organic material. The amount of organic material at Gary averaged 35 percent, and the amount of organic material at Hammond averaged 34 percent. The concentrations of suspended sediment determined for samples collected from the Indiana Harbor Canal and from the Grand Calumet River are less than concentrations of suspended sediment in samples collected from other streams in northwestern Indiana and in other parts of the State.
Loads of suspended sediment were computed as the product of the weekly mean suspended-sediment concentration and the daily average streamflow for the Indiana Harbor Canal at East Chicago. The average suspended-sediment load computed for the canal was 29 tons per day for the first year of the study (June 1996 through May 1997) and 23 tons per day for the second year of the study (June 1997 through May 1998). Loads of suspended sediment for the Grand Calumet River at Gary and at Hammond were estimated by use of the ratin
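The load computation described above is essentially concentration times streamflow with a unit conversion; the sketch below shows that arithmetic using the standard coefficient for converting mg/L and cubic feet per second into tons per day. The example values are hypothetical.

```python
def suspended_sediment_load_tons_per_day(concentration_mg_per_l, streamflow_cfs):
    """Load (tons/day) = C (mg/L) * Q (ft^3/s) * 0.0027 (unit-conversion coefficient)."""
    return concentration_mg_per_l * streamflow_cfs * 0.0027

# Hypothetical example: 15 mg/L at a daily mean streamflow of 700 ft^3/s
print(suspended_sediment_load_tons_per_day(15, 700))  # about 28 tons/day
```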
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J., III; Parasuraman, Raja; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.
2003-01-01
Adaptive automation represents an advanced form of human-centered automation design. The approach to automation provides for real-time and model-based assessments of human-automation interaction, determines whether the human has entered into a hazardous state of awareness and then modulates the task environment to keep the operator in-the-loop , while maintaining an optimal state of task engagement and mental alertness. Because adaptive automation has not matured, numerous challenges remain, including what the criteria are, for determining when adaptive aiding and adaptive function allocation should take place. Human factors experts in the area have suggested a number of measures including the use of psychophysiology. This NASA Technical Paper reports on three experiments that examined the psychophysiological measures of event-related potentials, electroencephalogram, and heart-rate variability for real-time adaptive automation. The results of the experiments confirm the efficacy of these measures for use in both a developmental and operational role for adaptive automation design. The implications of these results and future directions for psychophysiology and human-centered automation design are discussed.
Test and Evaluation of Neural Network Applications for Seismic Signal Discrimination
1992-09-28
IMS) for automated processing and interpretation of regional seismic data. Also reported is the result of a preliminary study on the application of...of analyst-verified events that were missed by the automated processing decreased by more than a factor of 2 (about 10 events/week). The second
Improving patient safety via automated laboratory-based adverse event grading.
Niland, Joyce C; Stiller, Tracey; Neat, Jennifer; Londrc, Adina; Johnson, Dina; Pannoni, Susan
2012-01-01
The identification and grading of adverse events (AEs) during the conduct of clinical trials is a labor-intensive and error-prone process. This paper describes and evaluates a software tool developed by City of Hope to automate complex algorithms to assess laboratory results and identify and grade AEs. We compared AEs identified by the automated system with those previously assessed manually, to evaluate missed/misgraded AEs. We also conducted a prospective paired time assessment of automated versus manual AE assessment. We found a substantial improvement in accuracy/completeness with the automated grading tool, which identified an additional 17% of severe grade 3-4 AEs that had been missed/misgraded manually. The automated system also provided an average time saving of 5.5 min per treatment course. With 400 ongoing treatment trials at City of Hope and an average of 1800 laboratory results requiring assessment per study, the implications of these findings for patient safety are enormous.
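A minimal sketch of laboratory-based AE grading of the kind described above: lab values are compared against severity cutoffs to assign a grade automatically. The analyte, thresholds and function names below are placeholders, not the algorithms implemented at City of Hope.

```python
# Illustrative only: grade a lab value against hypothetical CTCAE-style thresholds.
GRADE_THRESHOLDS = {
    # analyte: list of (grade, lower_bound) for values below the normal range,
    # evaluated from most to least severe
    "neutrophils_10e9_per_l": [(4, 0.5), (3, 1.0), (2, 1.5), (1, 2.0)],
}

def grade_low_value(analyte, value):
    """Return the adverse-event grade (0 = within range) for a low lab value."""
    for grade, bound in GRADE_THRESHOLDS[analyte]:
        if value < bound:
            return grade
    return 0

print(grade_low_value("neutrophils_10e9_per_l", 0.8))  # grade 3 under these placeholder cutoffs
```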
Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System.
Punjabi, Naresh M; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N
2015-10-01
Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. The analysis sample comprised 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events was conducted using the 2007 American Academy of Sleep Medicine criteria. Clinical sleep laboratories. A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI across the four clinical sites was 0.92 (95% confidence interval: 0.90-0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91-0.96). Thus, interscorer correlation between the manually scored results was no different from that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between manually and automatically scored percentages of sleep stages N1, N2, and N3. Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine. Although differences exist between manual and automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different from that between any two human scorers. In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. © 2015 Associated Professional Sleep Societies, LLC.
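One common way to quantify agreement between manually and automatically scored AHI values is a Pearson correlation together with a Bland-Altman style mean difference and limits of agreement; the sketch below shows both, with the Bland-Altman part added for illustration and hypothetical example numbers.

```python
import numpy as np

def agreement_stats(manual, automated):
    """Pearson correlation plus Bland-Altman mean difference and 95% limits of agreement."""
    manual, automated = np.asarray(manual, float), np.asarray(automated, float)
    r = np.corrcoef(manual, automated)[0, 1]
    diff = manual - automated
    bias = diff.mean()
    loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
    return r, bias, loa

# Hypothetical manually and automatically scored AHI values (events/h)
r, bias, loa = agreement_stats([12, 30, 45, 8, 22], [11, 27, 41, 9, 20])
```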
Development of Automated Moment Tensor Software at the Prototype International Data Center
2000-09-01
Berkeley Digital Seismic Network stations in the 100 to 500 km distance range. With sufficient azimuthal coverage this method is found to perform...the solution reported by NIED (http://argent.geo.bosai.go.jp/freesia/event/hypo/joho.html). The normal mechanism obtained by the three-component...Digital Seismic Network stations. These stations provide more than 100 degrees of azimuthal coverage, which is an adequate sampling of the focal
Shu, Tongxin; Xia, Min; Chen, Jiahong; Silva, Clarence de
2017-11-05
Power management is crucial in the monitoring of a remote environment, especially when long-term monitoring is needed. Renewable energy sources such as solar and wind may be harvested to sustain a monitoring system. However, without proper power management, equipment within the monitoring system may become nonfunctional and, as a consequence, the data or events captured during the monitoring process will become inaccurate as well. This paper develops and applies a novel adaptive sampling algorithm for power management in the automated monitoring of the quality of water in an extensive and remote aquatic environment. Based on the data collected on line using sensor nodes, a data-driven adaptive sampling algorithm (DDASA) is developed for improving the power efficiency while ensuring the accuracy of sampled data. The developed algorithm is evaluated using two distinct key parameters, which are dissolved oxygen (DO) and turbidity. It is found that by dynamically changing the sampling frequency, the battery lifetime can be effectively prolonged while maintaining a required level of sampling accuracy. According to the simulation results, compared to a fixed sampling rate, approximately 30.66% of the battery energy can be saved for three months of continuous water quality monitoring. Using the same dataset to compare with a traditional adaptive sampling algorithm (ASA), while achieving around the same Normalized Mean Error (NME), DDASA is superior in saving 5.31% more battery energy.
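A minimal sketch in the spirit of the data-driven adaptive sampling idea described above: shorten the sampling interval when the monitored parameter is changing quickly and lengthen it when it is stable. The thresholds, bounds and step factors are illustrative assumptions, not the published DDASA parameters.

```python
def next_sampling_interval(recent_values, interval_s,
                           min_interval_s=60, max_interval_s=900,
                           change_threshold=0.05):
    """Shorten the interval when the signal changes quickly, lengthen it when stable.

    `recent_values` are the last few sensor readings (e.g., dissolved oxygen in mg/L);
    thresholds and bounds are placeholders, not the published algorithm's settings.
    """
    if len(recent_values) < 2:
        return interval_s
    rel_change = abs(recent_values[-1] - recent_values[-2]) / max(abs(recent_values[-2]), 1e-9)
    if rel_change > change_threshold:
        interval_s = max(min_interval_s, interval_s / 2)    # sample more often
    else:
        interval_s = min(max_interval_s, interval_s * 1.5)  # save power
    return interval_s
```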
Were they in the loop during automated driving? Links between visual attention and crash potential.
Louw, Tyron; Madigan, Ruth; Carsten, Oliver; Merat, Natasha
2017-08-01
A proposed advantage of vehicle automation is that it relieves drivers from the moment-to-moment demands of driving, to engage in other, non-driving related, tasks. However, it is important to gain an understanding of drivers' capacity to resume manual control, should such a need arise. As automation removes vehicle control-based measures as a performance indicator, other metrics must be explored. This driving simulator study, conducted under the European Commission (EC) funded AdaptIVe project, assessed drivers' gaze fixations during partially-automated (SAE Level 2) driving, on approach to critical and non-critical events. Using a between-participant design, 75 drivers experienced automation with one of five out-of-the-loop (OOTL) manipulations, which used different levels of screen visibility and secondary tasks to induce varying levels of engagement with the driving task: 1) no manipulation, 2) manipulation by light fog, 3) manipulation by heavy fog, 4) manipulation by heavy fog plus a visual task, 5) no manipulation plus an n-back task. The OOTL manipulations influenced drivers' first point of gaze fixation after they were asked to attend to an evolving event. Differences resolved within one second and visual attention allocation adapted with repeated events, yet crash outcome was not different between OOTL manipulation groups. Drivers who crashed in the first critical event showed an erratic pattern of eye fixations towards the road centre on approach to the event, while those who did not demonstrated a more stable pattern. Automated driving systems should be able to direct drivers' attention to hazards no less than 6 seconds in advance of an adverse outcome.
Schmidt, Jürgen; Laarousi, Rihab; Stolzmann, Wolfgang; Karrer-Gauß, Katja
2018-06-01
In this article, we examine the performance of different eye blink detection algorithms under various constraints. The goal of the present study was to evaluate the performance of an electrooculogram- and camera-based blink detection process in both manually and conditionally automated driving phases. A further comparison between alert and drowsy drivers was performed in order to evaluate the impact of drowsiness on the performance of blink detection algorithms in both driving modes. Data snippets from 14 monotonous manually driven sessions (mean 2 h 46 min) and 16 monotonous conditionally automated driven sessions (mean 2 h 45 min) were used. In addition to comparing two data-sampling frequencies for the electrooculogram measures (50 vs. 25 Hz) and four different signal-processing algorithms for the camera videos, we compared the blink detection performance of 24 reference groups. The analysis of the videos was based on very detailed definitions of eyelid closure events. The correct detection rates for the alert and manual driving phases (maximum 94%) decreased significantly in the drowsy (minus 2% or more) and conditionally automated (minus 9% or more) phases. Blinking behavior is therefore significantly impacted by drowsiness as well as by automated driving, resulting in less accurate blink detection.
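As a hedged illustration of one simple approach to blink detection from an EOG channel (velocity thresholding), the sketch below flags samples where the slew rate exceeds a threshold. The parameters are placeholders and this is not one of the algorithms evaluated in the study.

```python
import numpy as np

def detect_blinks(eog, fs_hz=50.0, vel_threshold=100.0, min_gap_s=0.1):
    """Flag candidate blinks where the EOG slew rate exceeds a threshold (units/s).

    A simple velocity-threshold heuristic for illustration; thresholds would need
    tuning per recording and per sampling rate.
    """
    velocity = np.gradient(np.asarray(eog, float)) * fs_hz
    candidates = np.where(np.abs(velocity) > vel_threshold)[0]
    blinks, last = [], -np.inf
    for idx in candidates:
        if idx - last > min_gap_s * fs_hz:   # merge samples belonging to one blink
            blinks.append(idx)
        last = idx
    return blinks
```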
NASA Astrophysics Data System (ADS)
Giovanna, Vessia; Luca, Pisano; Carmela, Vennari; Mauro, Rossi; Mario, Parise
2016-01-01
This paper proposes an automated method for the selection of the rainfall data (duration, D, and cumulated rainfall, E) responsible for shallow landslide initiation. The method mimics an expert identifying D and E from rainfall records through a manual procedure whose rules are applied according to his or her judgement. The comparison between the two methods is based on 300 D-E pairs drawn from temporal rainfall data series recorded in a 30-day time lag before the landslide occurrence. Statistical tests, applied to the D and E samples as both paired and independent values to verify whether they belong to the same population, show that the automated procedure is able to replicate the pairs drawn by expert judgment. Furthermore, a criterion based on cumulative distribution functions (CDFs) is proposed to select, among the six pairs drawn by the coded procedure, the D-E pair most closely related to the expert one for tracing the empirical rainfall threshold line.
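A hedged sketch of how duration D and cumulated rainfall E might be extracted for the event preceding a landslide from an hourly record, using a simple dry-interval rule; the six-hour separation criterion is an illustrative assumption, not the coded rule of the paper.

```python
def rainfall_event_d_e(hourly_rain_mm, dry_gap_hours=6):
    """Return (D in hours, E in mm) for the last rainfall event in an hourly record.

    The record is assumed to end at the landslide time; an event is terminated by
    `dry_gap_hours` consecutive rain-free hours (illustrative rule only).
    """
    end = len(hourly_rain_mm)
    # skip trailing dry hours between the last rain and the landslide
    while end > 0 and hourly_rain_mm[end - 1] == 0:
        end -= 1
    start, dry_run = end, 0
    for i in range(end - 1, -1, -1):
        if hourly_rain_mm[i] == 0:
            dry_run += 1
            if dry_run >= dry_gap_hours:
                break
        else:
            dry_run = 0
            start = i
    duration = end - start
    cumulated = sum(hourly_rain_mm[start:end])
    return duration, cumulated
```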
Integrating laboratory robots with analytical instruments--must it really be so difficult?
Kramer, G W
1990-09-01
Creating a reliable system from discrete laboratory instruments is often a task fraught with difficulties. While many modern analytical instruments are marvels of detection and data handling, attempts to create automated analytical systems incorporating such instruments are often frustrated by their human-oriented control structures and their egocentricity. The laboratory robot, while fully susceptible to these problems, extends such compatibility issues to the physical dimensions involving sample interchange, manipulation, and event timing. The workcell concept was conceived to describe the procedure and equipment necessary to carry out a single task during sample preparation. This notion can be extended to organize all operations in an automated system. Each workcell, no matter how complex its local repertoire of functions, must be minimally capable of accepting information (commands, data), returning information on demand (status, results), and being started, stopped, and reset by a higher level device. Even the system controller should have a mode where it can be directed by instructions from a higher level.
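The workcell contract sketched below is a hedged illustration of the minimal interface described above (accept commands and data, report status and results on demand, and be started, stopped and reset by a higher-level controller); the class and method names are assumptions, not an existing API.

```python
from abc import ABC, abstractmethod

class Workcell(ABC):
    """Minimal control contract for one sample-preparation task (illustrative names)."""

    @abstractmethod
    def send(self, command: str, **params) -> None: ...   # accept commands and data

    @abstractmethod
    def status(self) -> dict: ...                          # return state and results on demand

    @abstractmethod
    def start(self) -> None: ...

    @abstractmethod
    def stop(self) -> None: ...

    @abstractmethod
    def reset(self) -> None: ...

class SystemController:
    """Higher-level device that sequences workcells through a procedure."""

    def __init__(self, workcells):
        self.workcells = workcells

    def run(self, plan):
        # plan is a list of (workcell_name, command, params) steps
        for cell_name, command, params in plan:
            cell = self.workcells[cell_name]
            cell.send(command, **params)
            cell.start()
```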
Automated Protist Analysis of Complex Samples: Recent Investigations Using Motion and Thresholding
2012-01-01
Report No. CG-D-15-13, Automated Protist Analysis of Complex Samples: Recent Investigations Using Motion and Thresholding. Distribution Statement A: Approved for public release; distribution is unlimited. January 2012. CG-926 R&DC, Chelsea Street, New London, CT 06320. B. Nelson, et al.
Fatigue and voluntary utilization of automation in simulated driving.
Neubauer, Catherine; Matthews, Gerald; Langheim, Lisa; Saxby, Dyani
2012-10-01
A driving simulator was used to assess the impact on fatigue, stress, and workload of full vehicle automation that was initiated by the driver. Previous studies have shown that mandatory use of full automation induces a state of "passive fatigue" associated with loss of alertness. By contrast, voluntary use of automation may enhance the driver's perceptions of control and ability to manage fatigue. Participants were assigned to one of two experimental conditions, automation optional (AO) and nonautomation (NA), and then performed a 35 min, monotonous simulated drive. In the last 5 min, automation was unavailable and drivers were required to respond to an emergency event. Subjective state and workload were evaluated before and after the drive. Making automation available to the driver failed to alleviate fatigue and stress states induced by driving in monotonous conditions. Drivers who were fatigued prior to the drive were more likely to choose to use automation, but automation use increased distress, especially in fatigue-prone drivers. Drivers in the AO condition were slower to initiate steering responses to the emergency event, suggesting optional automation may be distracting. Optional, driver-controlled automation appears to pose the same dangers to task engagement and alertness as externally initiated automation. Drivers of automated vehicles may be vulnerable to fatigue that persists when normal vehicle control is restored. It is important to evaluate automated systems' impact on driver fatigue, to seek design solutions to the issue of maintaining driver engagement, and to address the vulnerabilities of fatigue-prone drivers.
DOT National Transportation Integrated Search
1982-07-01
In order to examine specific automated guideway transit (AGT) developments and concepts, UMTA undertook a program of studies and technology investigations called Automated Guideway Transit Technology (AGTT) Program. The objectives of one segment of t...
NASA Astrophysics Data System (ADS)
Acciarri, R.; Adams, C.; An, R.; Anthony, J.; Asaadi, J.; Auger, M.; Bagby, L.; Balasubramanian, S.; Baller, B.; Barnes, C.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Cohen, E.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fadeeva, A. A.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garcia-Gamez, D.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; Hourlier, A.; Huang, E.-C.; James, C.; Jan de Vries, J.; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Piasetzky, E.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; Rudolf von Rohr, C.; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Smith, A.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van De Pontseele, W.; Van de Water, R. G.; Viren, B.; Weber, M.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Yates, L.; Zeller, G. P.; Zennamo, J.; Zhang, C.
2018-01-01
The development and operation of liquid-argon time-projection chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.
iMSRC: converting a standard automated microscope into an intelligent screening platform.
Carro, Angel; Perez-Martinez, Manuel; Soriano, Joaquim; Pisano, David G; Megias, Diego
2015-05-27
Microscopy in the context of biomedical research is demanding new tools to automatically detect and capture objects of interest. The few extant packages addressing this need, however, have enjoyed limited uptake due to complexity of use and installation. To overcome these drawbacks, we developed iMSRC, which combines ease of use and installation with high flexibility and enables applications such as rare event detection and high-resolution tissue sample screening, saving time and resources.
Leveraging Long-term Seismic Catalogs for Automated Real-time Event Classification
NASA Astrophysics Data System (ADS)
Linville, L.; Draelos, T.; Pankow, K. L.; Young, C. J.; Alvarez, S.
2017-12-01
We investigate the use of labeled event types available through reviewed seismic catalogs to produce automated event labels on new incoming data from the crustal region spanned by the cataloged events. Using events cataloged by the University of Utah Seismograph Stations between October 2012 and June 2017, we calculate the spectrogram for a time window that spans the duration of each event as seen on individual stations, resulting in 110k event spectrograms (50% local earthquake examples, 50% quarry blast examples). Using 80% of the randomized example events (about 90k), a classifier is trained to distinguish between local earthquakes and quarry blasts. We explore variations of deep learning classifiers, incorporating elements of convolutional and recurrent neural networks. Using a single-layer Long Short Term Memory recurrent neural network, we achieve 92% accuracy on the classification task on the remaining 20k test examples. Leveraging the decisions from a group of stations that detected the same event by using the median of all classifications in the group increases the model accuracy to 96%. Additional data with equivalent processing from 500 more recently cataloged events (July 2017) achieves the same accuracy as our test data on both single-station examples and multi-station medians, suggesting that the model can maintain accurate and stable classification rates on real-time automated events local to the University of Utah Seismograph Stations, with potentially minimal levels of re-training through time.
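A minimal sketch, under assumed sizes and hyperparameters, of a single-layer LSTM classifier over spectrogram time steps of the kind described above, written here in PyTorch for illustration; it is not the study's model.

```python
import torch
import torch.nn as nn

class SpectrogramLSTM(nn.Module):
    """Single-layer LSTM over spectrogram time steps, ending in a 2-class output
    (earthquake vs. quarry blast). Sizes are illustrative assumptions."""

    def __init__(self, n_freq_bins=64, hidden=128, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_freq_bins, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):           # x: (batch, time_steps, n_freq_bins)
        _, (h_n, _) = self.lstm(x)  # h_n: (1, batch, hidden)
        return self.head(h_n[-1])   # logits: (batch, n_classes)

# Hypothetical batch of 8 spectrograms with 100 time steps and 64 frequency bins
logits = SpectrogramLSTM()(torch.randn(8, 100, 64))
```

Per-event decisions could then be combined across stations by taking the median of the per-station outputs, as the abstract describes.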
Kitchener, H C; Blanks, R; Cubie, H; Desai, M; Dunn, G; Legood, R; Gray, A; Sadique, Z; Moss, S
2011-01-01
The principal objective was to compare automation-assisted reading of cervical cytology with manual reading using the histological end point of cervical intraepithelial neoplasia grade II (CIN2) or worse (CIN2+). Secondary objectives included (i) an assessment of the slide ranking facility of the Becton Dickinson (BD) FocalPoint™ Slide Profiler (Becton Dickinson, Franklin Lakes, NJ, USA), especially 'No Further Review', (ii) a comparison of the two approved automated systems, the ThinPrep® Imaging System (Hologic, Bedford, MA, USA) and the BD FocalPoint Guided Screener Imaging System, and (iii) automated versus manual in terms of productivity and cost-effectiveness. A 1 : 2 randomised allocation of slides to either manual reading or automation-assisted paired with manual reading. Cytoscreeners were blinded to whether samples would be read only manually or manually paired with automated. Slide reading procedures followed real-life laboratory protocol to produce a final result and, for paired readings, the worse result determined the management. Costs per event were estimated and combined with productivity to produce a cost per slide, per woman and per CIN2+ and cervical intraepithelial neoplasia grade III (CIN3) or worse (CIN3+) lesion detected. Cost-effectiveness was estimated using cost per CIN2+ detected. Lifetime cost-effectiveness in terms of life-years and quality-adjusted life-years was estimated using a mathematical model. Liquid-based cytology samples were obtained in primary care, and a small number of abnormal samples were obtained from local colposcopy clinics, from different women, in order to enrich the proportion of abnormals. All of the samples were read in a single large service laboratory. Liquid residues used for human papillomavirus (HPV) triage were tested (with Hybrid Capture 2, Qiagen, Crawley, UK) in a specialist virology laboratory in Edinburgh, UK. Histopathology was read by a specialist gynaecological pathology team blinded to HPV results and type of reading. Samples were obtained from women aged 25-64 years undergoing primary cervical screening in Greater Manchester, UK, with small proportions from women outside this age range and from women undergoing colposcopy. The principal intervention was automation-assisted reading of cervical cytology slides which was paired with a manual reading of the same slide. Low-grade cytological abnormalities (borderline and mild dyskaryosis) were triaged with HPV testing to direct colposcopy referral. Women with high-grade cytology were referred for colposcopy and those with negative cytology were returned to recall. The principal outcome measure was the sensitivity of automation-assisted reading relative to manual for the detection of CIN2+. A secondary outcome measure was cost-effectiveness of each type of reading to detect CIN2+. The study was powered to detect a relative sensitivity difference equivalent to an absolute difference of 5%. The principal finding was that automated reading was 8% less sensitive relative to manual, 6.3% in absolute terms. 'No further review' was very reliable and, if restricted to routine screening samples, < 1% of CIN2+ would have been missed. Automated and manual were very similar in terms of cost-effectiveness despite a 60%-80% increase in productivity for automation-assisted reading. The significantly reduced sensitivity of automated reading, combined with uncertainty over cost-effectiveness, suggests no justification at present to recommend its introduction. 
The reliability of 'no further review' warrants further consideration as a means of saving staff time. Current Controlled Trials ISRCTN66377374. This project was funded by the NIHR Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 15, No. 3. See the HTA programme website for further project information.
Kalliokoski, Otto; Sørensen, Dorte B; Hau, Jann; Abelson, Klas S P
2014-01-01
Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters measured, and expressed less anxious behavior. We conclude that repeated blood sampling by automated blood sampling and from the tail vein is less stressful than cheek blood sampling. The choice between automated blood sampling and tail blood sampling should be based on the study requirements, the resources of the laboratory and skills of the staff. PMID:24958546
Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System
Punjabi, Naresh M.; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N.
2015-01-01
Study Objectives: Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. Design: The analysis sample comprised 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events was conducted using the 2007 American Academy of Sleep Medicine criteria. Setting: Clinical sleep laboratories. Measurements and Results: A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI across the four clinical sites was 0.92 (95% confidence interval: 0.90–0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91–0.96). Thus, interscorer correlation between the manually scored results was no different from that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between manually and automatically scored percentages of sleep stages N1, N2, and N3. Conclusion: Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine. Although differences exist between manual and automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different from that between any two human scorers. In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. Citation: Punjabi NM, Shifa N, Dorffner G, Patil S, Pien G, Aurora RN. Computer-assisted automated scoring of polysomnograms using the Somnolyzer system. SLEEP 2015;38(10):1555–1566. PMID:25902809
Psychophysiological Control of a Cognitive Task Using Adaptive Automation
NASA Technical Reports Server (NTRS)
Freeman, Frederick; Pope, Alan T. (Technical Monitor)
2001-01-01
The major focus of the present proposal was to examine psychophysiological variables related to hazardous states of awareness induced by monitoring automated systems. With the increased use of automation in today's work environment, people's roles in the workplace are being redefined from that of active participant to one of passive monitor. Although the introduction of automated systems has a number of benefits, there are also a number of disadvantages regarding worker performance. Byrne and Parasuraman have argued for the use of psychophysiological measures in the development and the implementation of adaptive automation. While both performance-based and model-based adaptive automation have been studied, the use of psychophysiological measures, especially EEG, offers the advantage of real-time evaluation of the state of the subject. The current study used the closed-loop system, developed at NASA-Langley Research Center, to control the state of awareness of subjects while they performed a cognitive vigilance task. Previous research in our laboratory, supported by NASA, has demonstrated that, in an adaptive-automation, closed-loop environment, subjects perform a tracking task better under a negative than a positive feedback condition. In addition, this condition produces less subjective workload and larger P300 event-related potentials to auditory stimuli presented in a concurrent oddball task. We have also recently shown that the closed-loop system used to control the level of automation in a tracking task can also be used to control the event rate of stimuli in a vigilance monitoring task. By changing the event rate based on the subject's index of arousal, we have been able to produce improved monitoring, relative to various control groups. We have demonstrated in our initial closed-loop experiments with the vigilance paradigm that using a negative feedback contingency (i.e., increasing event rates when the EEG index is low and decreasing event rates when the EEG index is high) results in a marked decrease of the vigilance decrement over a 40-minute session. This effect is in direct contrast to performance of a positive feedback group, as well as a number of other control groups, which demonstrated the typical vigilance decrement. Interestingly, however, the negative feedback group performed at virtually the same level as a yoked control group. The yoked control group received the same order of changes in event rate that were generated by the negative feedback subjects using the closed-loop system. Thus it would appear to be possible to optimize vigilance performance by controlling the stimuli which subjects are asked to process.
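A hedged sketch of the closed-loop logic described above: compute an EEG engagement index and, under a negative feedback contingency, raise the task event rate when the index is low and lower it when it is high. The beta/(alpha + theta) form is the commonly cited engagement index; the step sizes and bounds are illustrative assumptions.

```python
def engagement_index(beta_power, alpha_power, theta_power):
    """Commonly cited EEG engagement index: beta / (alpha + theta)."""
    return beta_power / (alpha_power + theta_power)

def update_event_rate(index, baseline_index, rate_per_min,
                      step=2.0, min_rate=6.0, max_rate=60.0):
    """Negative feedback contingency: raise the event rate when engagement is low,
    lower it when engagement is high (illustrative step size and bounds)."""
    if index < baseline_index:
        rate_per_min = min(max_rate, rate_per_min + step)
    else:
        rate_per_min = max(min_rate, rate_per_min - step)
    return rate_per_min
```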
Microjets in the penumbra of a sunspot
NASA Astrophysics Data System (ADS)
Drews, Ainar; Rouppe van der Voort, Luc
2017-06-01
Context. Penumbral microjets (PMJs) are short-lived jets found in the penumbra of sunspots, first observed in wide-band Ca II H line observations as localized brightenings, and are thought to be caused by magnetic reconnection. Earlier work on PMJs has focused on smaller samples of by-eye selected events and case studies. Aims: It is our goal to present an automated study of a large sample of PMJs to place the basic statistics of PMJs on a sure footing and to study the PMJ Ca II 8542 Å spectral profile in detail. Methods: High spatial resolution and spectrally well-sampled observations in the Ca II 8542 Å line obtained from the Swedish 1-m Solar Telescope (SST) were reduced by a principal component analysis and subsequently used in the automated detection of PMJs using the simple machine learning algorithm k-nearest neighbour. PMJ detections were verified with co-temporal Ca II H line observations. Results: We find a total of 453 tracked PMJ events, 4253 PMJ detections tallied over all timeframes, and a detection rate of 21 events per timestep. From these, an average length, width and lifetime of 640 km, 210 km and 90 s are obtained. The average PMJ Ca II 8542 Å line profile is characterized by enhanced inner wings, often in the form of one or two distinct peaks, and a brighter line core as compared to the quiet-Sun average. Average blue and red peak positions are determined at -10.4 km s-1 and +10.2 km s-1 offsets from the Ca II 8542 Å line core. We find several clusters of PMJ hot-spots within the sunspot penumbra, in which PMJ events occur in the same general area repeatedly over time. Conclusions: Our results indicate smaller average PMJ sizes and longer lifetimes compared to previously published values, but with statistics still in the same orders of magnitude. The investigation and analysis of the PMJ line profiles strengthens the proposed heating of PMJs to transition region temperatures. The presented statistics on PMJs form a solid basis for future investigations and numerical modelling of PMJs.
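A minimal sketch of the k-nearest-neighbour classification step on spectral profiles, with an optional dimensionality reduction, using scikit-learn for illustration; the random arrays stand in for labeled Ca II 8542 Å profiles and the parameters are assumptions, not those of the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

# Placeholder training data: rows are spectral profiles, labels 1 = PMJ, 0 = quiet Sun
profiles = np.random.rand(200, 25)
labels = np.random.randint(0, 2, 200)

clf = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
clf.fit(profiles, labels)

# Classify new profiles; 1 marks a profile flagged as PMJ-like
predicted_labels = clf.predict(np.random.rand(50, 25))
```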
McCleskey, R. Blaine; Nordstrom, D. Kirk; Steiger, Judy I.; Kimball, Briant A.; Verplanck, Philip L.
2003-01-01
Water analyses are reported for 259 samples collected from the Red River, New Mexico, and its tributaries during low-flow (2001) and spring snowmelt (2002) tracer studies. Water samples were collected along a 20-kilometer reach of the Red River beginning just east of the town of Red River and ending at the U.S. Geological Survey streamflow-gaging station located east of Questa, New Mexico. The study area was divided into three sections where separate injections and synoptic sampling events were performed during the low-flow tracer study. During the spring snowmelt tracer study, three tracer injections and synoptic sampling events were performed bracketing the areas with the greatest metal loading into the Red River as determined from the low-flow tracer study. The low-flow tracer synoptic sampling events were August 17, 20, and 24, 2001. The synoptic sampling events for the spring snowmelt tracer were March 30, 31, and April 1, 2002. Stream and large inflow water samples were sampled using equal-width and depth-integrated sampling methods and composited into half-gallon bottles. Grab water samples were collected from smaller inflows. Stream temperatures were measured at the time of sample collection. Samples were transported to a nearby central processing location where pH and specific conductance were measured and the samples processed for chemical analyses. Cations, trace metals, iron redox species, and fluoride were analyzed at the U.S. Geological Survey laboratory in Boulder, Colorado. Cations and trace metal concentrations were determined using inductively coupled plasma-optical emission spectrometry and graphite furnace atomic absorption spectrometry. Arsenic concentrations were determined using hydride generation atomic absorption spectrometry, iron redox species were measured using ultraviolet-visible spectrometry, and fluoride concentrations were determined using an ion-selective electrode. Alkalinity was measured by automated titration, and sulfate, chloride, and bromide were analyzed by ion chromatography at the U.S. Geological Survey laboratory in Salt Lake City, Utah.
Automated defect spatial signature analysis for semiconductor manufacturing process
Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed
1999-01-01
An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high level categories, classifying the categorized data contained in each high level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.
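As a hedged illustration of one step such an analysis could take, the sketch below groups defect coordinates into spatial clusters (using DBSCAN) so that each cluster can subsequently be labeled as a signature event; this is not the patented method and the parameters are placeholders.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Placeholder wafer-map defect coordinates in millimetres
defects_xy = np.random.rand(300, 2) * 200.0

# Group nearby defects; each cluster is a candidate signature event,
# and noise points (label -1) are treated as isolated random defects.
labels = DBSCAN(eps=5.0, min_samples=5).fit_predict(defects_xy)
signature_events = {k: defects_xy[labels == k] for k in set(labels) if k != -1}
```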
Anderson, William W.; Fitzjohn, Stephen M.; Collingridge, Graham L.
2012-01-01
WinLTP is a data acquisition program for studying long-term potentiation (LTP) and other aspects of synaptic function. Earlier versions of WinLTP (J. Neurosci. Methods, 162:346–356, 2007) provided automated electrical stimulation and data acquisition capable of running nearly an entire synaptic plasticity experiment, with the primary exception that perfusion solutions had to be changed manually. This automated stimulation and acquisition was done by using ‘Sweep’, ‘Loop’ and ‘Delay’ events to build scripts using the ‘Protocol Builder’. However, this did not allow automatic changing of many solutions while running multiple slice experiments, or solution changing when this had to be performed rapidly and with accurate timing during patch-clamp experiments. We report here the addition of automated perfusion control to WinLTP. First, perfusion change between sweeps is enabled by adding the ‘Perfuse’ event to Protocol Builder scripting and is used in slice experiments. Second, fast perfusion changes during as well as between sweeps is enabled by using the Perfuse event in the protocol scripts to control changes between sweeps, and also by changing digital or analog output during a sweep and is used for single cell single-line perfusion patch-clamp experiments. The addition of stepper control of tube placement allows dual- or triple-line perfusion patch-clamp experiments for up to 48 solutions. The ability to automate perfusion changes and fully integrate them with the already automated stimulation and data acquisition goes a long way toward complete automation of multi-slice extracellularly recorded and single cell patch-clamp experiments. PMID:22524994
Monitoring stream sediment loads in response to agriculture in Prince Edward Island, Canada.
Alberto, Ashley; St-Hilaire, Andre; Courtenay, Simon C; van den Heuvel, Michael R
2016-07-01
Increased agricultural land use leads to accelerated erosion and deposition of fine sediment in surface water. Monitoring of suspended sediment yields has proven challenging due to the spatial and temporal variability of sediment loading. Reliable sediment yield calculations depend on accurate monitoring of these highly episodic sediment loading events. This study aims to quantify precipitation-induced loading of suspended sediments on Prince Edward Island, Canada. Turbidity is considered to be a reasonably accurate proxy for suspended sediment data. In this study, turbidity was used to monitor suspended sediment concentration (SSC) and was measured for 2 years (December 2012-2014) in three subwatersheds with varying degrees of agricultural land use ranging from 10 to 69 %. Comparison of three turbidity meter calibration methods, two using suspended streambed sediment and one using automated sampling during rainfall events, revealed that the use of SSC samples constructed from streambed sediment was not an accurate replacement for water column sampling during rainfall events for calibration. Different particle size distributions in the three rivers produced significant impacts on the calibration methods demonstrating the need for river-specific calibration. Rainfall-induced sediment loading was significantly greater in the most agriculturally impacted site only when the load per rainfall event was corrected for runoff volume (total flow minus baseflow), flow increase intensity (the slope between the start of a runoff event and the peak of the hydrograph), and season. Monitoring turbidity, in combination with sediment modeling, may offer the best option for management purposes.
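A minimal sketch of a river-specific turbidity calibration of the kind discussed above: fit suspended sediment concentration against turbidity from paired samples and apply the fit to a continuous turbidity record. The calibration values are hypothetical.

```python
import numpy as np

# Paired calibration samples for one river (hypothetical values)
turbidity_ntu = np.array([5, 12, 30, 55, 80, 120], float)
ssc_mg_per_l = np.array([8, 18, 45, 90, 130, 200], float)

# River-specific linear calibration: SSC = a * turbidity + b
a, b = np.polyfit(turbidity_ntu, ssc_mg_per_l, 1)

def ssc_from_turbidity(ntu):
    """Estimate suspended sediment concentration (mg/L) from turbidity (NTU)."""
    return a * ntu + b

continuous_ssc = ssc_from_turbidity(np.array([10.0, 65.0, 140.0]))
```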
DOE Office of Scientific and Technical Information (OSTI.GOV)
Acciarri, R.; Adams, C.; An, R.
The development and operation of Liquid-Argon Time-Projection Chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.
Automated Track Recognition and Event Reconstruction in Nuclear Emulsion
NASA Technical Reports Server (NTRS)
Deines-Jones, P.; Cherry, M. L.; Dabrowska, A.; Holynski, R.; Jones, W. V.; Kolganova, E. D.; Kudzia, D.; Nilsen, B. S.; Olszewski, A.; Pozharova, E. A.;
1998-01-01
The major advantages of nuclear emulsion for detecting charged particles are its submicron position resolution and sensitivity to minimum ionizing particles. These must be balanced, however, against the difficult manual microscope measurement by skilled observers required for the analysis. We have developed an automated system to acquire and analyze the microscope images from emulsion chambers. Each emulsion plate is analyzed independently, allowing coincidence techniques to be used in order to reject background and estimate error rates. The system has been used to analyze a sample of high-multiplicity Pb-Pb interactions (charged particle multiplicities approx. 1100) produced by the 158 GeV/c per nucleon Pb-208 beam at CERN. Automatically reconstructed track lists agree with our best manual measurements to 3%. We describe the image analysis and track reconstruction techniques, and discuss the measurement and reconstruction uncertainties.
Possibilities for serial femtosecond crystallography sample delivery at future light sources
Chavas, L. M. G.; Gumprecht, L.; Chapman, H. N.
2015-01-01
Serial femtosecond crystallography (SFX) uses X-ray pulses from free-electron laser (FEL) sources that can outrun radiation damage and thereby overcome long-standing limits in the structure determination of macromolecular crystals. Intense X-ray FEL pulses of sufficiently short duration allow the collection of damage-free data at room temperature and give the opportunity to study irreversible time-resolved events. SFX may open the way to determine the structure of biological molecules that fail to crystallize readily into large well-diffracting crystals. Taking advantage of FELs with high pulse repetition rates could lead to short measurement times of just minutes. Automated delivery of sample suspensions for SFX experiments could potentially give rise to a much higher rate of obtaining complete measurements than at today's third generation synchrotron radiation facilities, as no crystal alignment or complex robotic motions are required. This capability will also open up extensive time-resolved structural studies. New challenges arise from the resulting high rate of data collection, and in providing reliable sample delivery. Various developments for fully automated high-throughput SFX experiments are being considered for evaluation, including new implementations for a reliable yet flexible sample environment setup. Here, we review the different methods developed so far that best achieve sample delivery for X-ray FEL experiments and present some considerations towards the goal of high-throughput structure determination with X-ray FELs. PMID:26798808
MESA: An Interactive Modeling and Simulation Environment for Intelligent Systems Automation
NASA Technical Reports Server (NTRS)
Charest, Leonard
1994-01-01
This report describes MESA, a software environment for creating applications that automate NASA mission operations. MESA enables intelligent automation by utilizing model-based reasoning techniques developed in the field of Artificial Intelligence. Model-based reasoning techniques are realized in MESA through native support of causal modeling and discrete event simulation.
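As a hedged illustration of the discrete event simulation idea mentioned above, the sketch below runs a clock and a priority queue of timestamped events, where each handler can schedule further events; the names are illustrative and this is not the MESA API.

```python
import heapq

class DiscreteEventSimulator:
    """Minimal discrete event simulation loop: pop the earliest event, run its
    handler, and let handlers schedule future events (illustrative only)."""

    def __init__(self):
        self.clock = 0.0
        self._queue = []   # heap of (time, sequence, handler)
        self._seq = 0

    def schedule(self, delay, handler):
        heapq.heappush(self._queue, (self.clock + delay, self._seq, handler))
        self._seq += 1

    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.clock, _, handler = heapq.heappop(self._queue)
            handler(self)

def telemetry_pass(sim):
    print(f"t={sim.clock:.1f}: telemetry pass handled")
    sim.schedule(90.0, telemetry_pass)   # schedule the next pass

sim = DiscreteEventSimulator()
sim.schedule(10.0, telemetry_pass)
sim.run(until=300.0)
```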
Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C
2006-12-20
Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the software methods.
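A hedged sketch of the arithmetic behind normalization for PCR setup: from a quantitation result, compute the volumes of DNA extract and diluent needed to deliver a target DNA input in a fixed final volume. The target and volumes are hypothetical, not the validated parameters of the Normalization Wizard method.

```python
def normalization_volumes(sample_conc_ng_per_ul, target_ng=1.0, final_volume_ul=10.0):
    """Volumes of DNA extract and diluent needed to deliver `target_ng` of DNA
    in `final_volume_ul` (illustrative targets, not the validated method's values)."""
    sample_ul = min(final_volume_ul, target_ng / sample_conc_ng_per_ul)
    diluent_ul = final_volume_ul - sample_ul
    return round(sample_ul, 2), round(diluent_ul, 2)

print(normalization_volumes(0.25))  # low-yield sample: (4.0, 6.0)
print(normalization_volumes(5.0))   # concentrated sample: (0.2, 9.8)
```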
Advantages and challenges in automated apatite fission track counting
NASA Astrophysics Data System (ADS)
Enkelmann, E.; Ehlers, T. A.
2012-04-01
Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track methods, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example used in this study is the commercial automated fission track counting procedure from Autoscan Systems Pty that has been highlighted through several venues. We conducted experiments that are designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm large apatite grains of Fish Canyon Tuff were analyzed. This second sample set is characterized by complexities often found in apatites in different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. We found for all samples that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities compared to conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences between track densities analyzed with the automated counting persisted between different grains analyzed in one sample as well as between different samples. As a result of these differences a manual correction of the fully automated fission track counts is necessary for each individual surface area and grain counted. This manual correction procedure significantly increases (up to four times) the time required to analyze a sample with the automated counting procedure compared to the conventional approach.
Small Libraries Online: Automating Circulation and Public Access Catalogs. Participant Workbook.
ERIC Educational Resources Information Center
Garcia, C. Rebecca; Bridge, Frank R.
This workbook, meant to be used in a workshop, presents information on and guidelines for automating small libraries: (1) planning for automation; (2) automated system procurement and evaluation; (3) data conversion issues; (4) sample configuration worksheets; (5) sample configuration costs; (6) site preparation; (7) training; and (8) acceptance…
14 CFR 121.805 - Crewmember training for in-flight medical events.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Instruction, to include performance drills, in the proper use of automated external defibrillators. (ii... include performance drills, in the proper use of an automated external defibrillators and in...
14 CFR 121.805 - Crewmember training for in-flight medical events.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Instruction, to include performance drills, in the proper use of automated external defibrillators. (ii... include performance drills, in the proper use of an automated external defibrillators and in...
14 CFR 121.805 - Crewmember training for in-flight medical events.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) Instruction, to include performance drills, in the proper use of automated external defibrillators. (ii... include performance drills, in the proper use of an automated external defibrillators and in...
14 CFR 121.805 - Crewmember training for in-flight medical events.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Instruction, to include performance drills, in the proper use of automated external defibrillators. (ii... include performance drills, in the proper use of an automated external defibrillators and in...
14 CFR 121.805 - Crewmember training for in-flight medical events.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Instruction, to include performance drills, in the proper use of automated external defibrillators. (ii... include performance drills, in the proper use of an automated external defibrillators and in...
Schorb, Martin; Gaechter, Leander; Avinoam, Ori; Sieckmann, Frank; Clarke, Mairi; Bebeacua, Cecilia; Bykov, Yury S; Sonnen, Andreas F-P; Lihl, Reinhard; Briggs, John A G
2017-02-01
Correlative light and electron microscopy allows features of interest defined by fluorescence signals to be located in an electron micrograph of the same sample. Rare dynamic events or specific objects can be identified, targeted and imaged by electron microscopy or tomography. To combine it with structural studies using cryo-electron microscopy or tomography, fluorescence microscopy must be performed while maintaining the specimen vitrified at liquid-nitrogen temperatures and in a dry environment during imaging and transfer. Here we present instrumentation, software and an experimental workflow that improves the ease of use, throughput and performance of correlated cryo-fluorescence and cryo-electron microscopy. The new cryo-stage incorporates a specially modified high-numerical aperture objective lens and provides a stable and clean imaging environment. It is combined with a transfer shuttle for contamination-free loading of the specimen. Optimized microscope control software allows automated acquisition of the entire specimen area by cryo-fluorescence microscopy. The software also facilitates direct transfer of the fluorescence image and associated coordinates to the cryo-electron microscope for subsequent fluorescence-guided automated imaging. Here we describe these technological developments and present a detailed workflow, which we applied for automated cryo-electron microscopy and tomography of various specimens. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brigmon, R.; Leskinen, S.; Kearns, E.
2011-10-10
Detection of Legionella pneumophila in cooling towers and domestic hot water systems involves concentration by centrifugation or membrane filtration prior to inoculation onto growth media or analysis using techniques such as PCR or immunoassays. The Portable Multi-use Automated Concentration System (PMACS) was designed for concentrating microorganisms from large volumes of water in the field and was assessed for enhancing surveillance of L. pneumophila at the Savannah River Site, SC. PMACS samples (100 L; n = 28) were collected from six towers between August 2010 and April 2011, with grab samples (500 ml; n = 56) being collected before and after each PMACS sample. All samples were analyzed for the presence of L. pneumophila by direct fluorescence immunoassay (DFA) using FITC-labeled monoclonal antibodies targeting serogroups 1, 2, 4 and 6. qPCR was utilized for detection of Legionella spp. in the same samples. Counts of L. pneumophila from DFA and of Legionella spp. from qPCR were normalized to cells/L tower water. Concentrations were similar between grab and PMACS samples collected throughout the study by DFA analysis (P = 0.4461; repeated measures ANOVA). The same trend was observed with qPCR. However, PMACS concentration proved advantageous over membrane filtration by providing larger-volume, more representative samples of the cooling tower environment, which led to reduced variability among sampling events and increased the probability of detecting low-level targets. These data highlight the utility of the PMACS for enhanced surveillance of L. pneumophila by providing improved sampling of the cooling tower environment.
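For illustration only, a paired comparison of normalized concentrations from matched grab and PMACS samples could look like the sketch below. The study used repeated-measures ANOVA; a paired t-test is shown here merely as a simpler stand-in, and every concentration value is invented.

```python
# Hypothetical paired comparison of L. pneumophila concentrations (cells/L) from grab
# and PMACS samples taken at the same tower visits; not the study's analysis.
from scipy import stats

grab_cells_per_l = [1.2e4, 3.5e3, 8.0e3, 2.2e4, 5.1e3, 9.7e3]
pmacs_cells_per_l = [1.4e4, 3.1e3, 7.5e3, 2.5e4, 4.8e3, 1.1e4]

t_stat, p_value = stats.ttest_rel(grab_cells_per_l, pmacs_cells_per_l)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value would be consistent with the reported finding that grab and PMACS
# concentrations did not differ significantly (the study's ANOVA gave P = 0.4461).
```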
Bilimoria, Karl Y; Kmiecik, Thomas E; DaRosa, Debra A; Halverson, Amy; Eskandari, Mark K; Bell, Richard H; Soper, Nathaniel J; Wayne, Jeffrey D
2009-04-01
To design a Web-based system to track adverse and near-miss events, to establish an automated method to identify patterns of events, and to assess the adverse event reporting behavior of physicians. A Web-based system was designed to collect physician-reported adverse events including weekly Morbidity and Mortality (M&M) entries and anonymous adverse/near-miss events. An automated system was set up to help identify event patterns. Adverse event frequency was compared with hospital databases to assess reporting completeness. A metropolitan tertiary care center. Identification of adverse event patterns and completeness of reporting. From September 2005 to August 2007, 15,524 surgical patients were reported including 957 (6.2%) adverse events and 34 (0.2%) anonymous reports. The automated pattern recognition system helped identify 4 event patterns from M&M reports and 3 patterns from anonymous/near-miss reporting. After multidisciplinary meetings and expert reviews, the patterns were addressed with educational initiatives, correction of systems issues, and/or intensive quality monitoring. Only 25% of complications and 42% of inpatient deaths were reported. A total of 75.2% of adverse events resulting in permanent disability or death were attributed to the nature of the disease. Interventions to improve reporting were largely unsuccessful. We have developed a user-friendly Web-based system to track complications and identify patterns of adverse events. Underreporting of adverse events and attributing the complication to the nature of the disease represent a problem in reporting culture among surgeons at our institution. Similar systems should be used by surgery departments, particularly those affiliated with teaching hospitals, to identify quality improvement opportunities.
Taenzer, Andreas H; Pyke, Joshua; Herrick, Michael D; Dodds, Thomas M; McGrath, Susan P
2014-02-01
The manual collection and charting of traditional vital signs data in inpatient populations have been shown to be inaccurate when compared with true physiologic values. This issue has not been examined with respect to oxygen saturation data despite the increased use of this measurement in systems designed to assess the risk of patient deterioration. Of particular note is the lack of available data examining the accuracy of oxygen saturation charting in a particularly vulnerable group of patients who have prolonged oxygen desaturations (mean SpO2 <90% over at least 15 minutes). In addition, no data are currently available that investigate the often suspected "wake up" effect, resulting from a nurse entering a patient's room to obtain vital signs. In this study, we compared oxygen saturation data recorded manually with data collected by an automated continuous monitoring system in 16 inpatients considered to be at high risk for deterioration (average SpO2 values <90% collected by the automated system in a 15-minute interval before a manual charting event). Data were sampled from the automatic collection system from 2 periods: over a 15-minute period that ended 5 minutes before the time of the manual data collection and charting, and over a 5-minute range before and after the time of the manual data collection and charting. Average saturations from prolonged baseline desaturations (15-minute period) were compared with both the manual and automated data sampled at the time of the nurse's visit to analyze for systematic change and to investigate the presence of an arousal effect. The manually charted data were higher than those recorded by the automated system. Manually recorded data were on average 6.5% (confidence interval, 4.0%-9.0%) higher in oxygen saturation. No significant arousal effect resulting from the nurse's visit to the patient's room was detected. In a cohort of patients with prolonged desaturations, manual recordings of SpO2 did not reflect physiologic patient state when compared with continuous automated sampling. Currently, early warning scores depend on manual vital sign recordings in many settings; the study data suggest that SpO2 ought to be added to the list of vital sign values that have been shown to be recorded inaccurately.
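The two averaging windows described above can be expressed compactly; the sketch below is only an illustration of that windowing (a 15-minute baseline ending 5 minutes before charting, and ±5 minutes around the charting time), with function names, data layout and values invented rather than taken from the study.

```python
# Hypothetical window averaging for one charting event; samples are (timestamp, SpO2).
from datetime import datetime, timedelta

def window_mean(samples, start, end):
    """Mean SpO2 of samples with start <= timestamp < end (nan if none)."""
    values = [v for t, v in samples if start <= t < end]
    return sum(values) / len(values) if values else float("nan")

def compare_to_chart(samples, chart_time, charted_spo2):
    baseline = window_mean(samples, chart_time - timedelta(minutes=20),
                           chart_time - timedelta(minutes=5))
    around = window_mean(samples, chart_time - timedelta(minutes=5),
                         chart_time + timedelta(minutes=5))
    return {"baseline_mean": baseline,
            "around_chart_mean": around,
            "charted_minus_baseline": charted_spo2 - baseline}

# Synthetic example: a continuously low trace charted optimistically as 95%.
t0 = datetime(2013, 1, 1, 8, 0)
trace = [(t0 + timedelta(minutes=m), 87 + (m % 3)) for m in range(-25, 6)]
print(compare_to_chart(trace, t0, charted_spo2=95))
```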
Zhang, Jie; Wei, Shimin; Ayres, David W; Smith, Harold T; Tse, Francis L S
2011-09-01
Although it is well known that automation can provide significant improvement in the efficiency of biological sample preparation in quantitative LC-MS/MS analysis, it has not been widely implemented in bioanalytical laboratories throughout the industry. This can be attributed to the lack of a sound strategy and practical procedures in working with robotic liquid-handling systems. Several comprehensive automation assisted procedures for biological sample preparation and method validation were developed and qualified using two types of Hamilton Microlab liquid-handling robots. The procedures developed were generic, user-friendly and covered the majority of steps involved in routine sample preparation and method validation. Generic automation procedures were established as a practical approach to widely implement automation into the routine bioanalysis of samples in support of drug-development programs.
Bhagwat, Swarupa Nikhil; Sharma, Jayashree H; Jose, Julie; Modi, Charusmita J
2015-01-01
The routine immunohematological tests can be performed by automated as well as manual techniques. These techniques have advantages and disadvantages inherent to them. The present study aims to compare the results of manual and automated techniques for blood grouping and crossmatching so as to validate the automated system effectively. A total of 1000 samples were subjected to blood grouping by the conventional tube technique (CTT) and the automated microplate LYRA system on Techno TwinStation. A total of 269 samples (multitransfused patients and multigravida females) were compared for 927 crossmatches by the CTT in indirect antiglobulin phase against the column agglutination technique (CAT) performed on Techno TwinStation. For blood grouping, the study showed a concordance in results for 942/1000 samples (94.2%), discordance for 4/1000 (0.4%) samples and uninterpretable results for 54/1000 samples (5.4%). On resolution, the uninterpretable results reduced to 49/1000 samples (4.9%) with 951/1000 samples (95.1%) showing concordant results. For crossmatching, the automated CAT showed concordant results in 887/927 (95.6%) and discordant results in 3/927 (0.32%) crossmatches as compared to the CTT. A total of 37/927 (3.9%) crossmatches were not interpretable by the automated technique. The automated system shows a high concordance of results with CTT and hence can be brought into routine use. However, the high proportion of uninterpretable results emphasizes that proper training and standardization are needed prior to its use.
Nikolic, Mark I; Sarter, Nadine B
2007-08-01
To examine operator strategies for diagnosing and recovering from errors and disturbances as well as the impact of automation design and time pressure on these processes. Considerable efforts have been directed at error prevention through training and design. However, because errors cannot be eliminated completely, their detection, diagnosis, and recovery must also be supported. Research has focused almost exclusively on error detection. Little is known about error diagnosis and recovery, especially in the context of event-driven tasks and domains. With a confederate pilot, 12 airline pilots flew a 1-hr simulator scenario that involved three challenging automation-related tasks and events that were likely to produce erroneous actions or assessments. Behavioral data were compared with a canonical path to examine pilots' error and disturbance management strategies. Debriefings were conducted to probe pilots' system knowledge. Pilots seldom followed the canonical path to cope with the scenario events. Detection of a disturbance was often delayed. Diagnostic episodes were rare because of pilots' knowledge gaps and time criticality. In many cases, generic inefficient recovery strategies were observed, and pilots relied on high levels of automation to manage the consequences of an error. Our findings describe and explain the nature and shortcomings of pilots' error management activities. They highlight the need for improved automation training and design to achieve more timely detection, accurate explanation, and effective recovery from errors and disturbances. Our findings can inform the design of tools and techniques that support disturbance management in various complex, event-driven environments.
Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J
2013-07-01
Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed on Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of automation scripts allowed the users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those generated by the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
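The modular idea can be pictured as below. This is a hypothetical Python sketch for illustration only; the real modules are Tecan Freedom EVO liquid-handling scripts, and the data structures, parameter names and volumes here are invented. Module names mirror the five modules listed above.

```python
# Hypothetical modular assembly of an LBA sample-preparation run.
def sample_mrd(samples, mrd):
    return [dict(s, dilution=mrd) for s in samples]

def standard_qc_mrd(standards, mrd):
    return [dict(s, dilution=mrd) for s in standards]

def sample_dilution(samples, extra_factor):
    return [dict(s, dilution=s["dilution"] * extra_factor) for s in samples]

def standard_qc_sample_addition(plate, items, volume_ul):
    for it in items:
        plate.append(dict(it, well=len(plate) + 1, volume_ul=volume_ul))

def reagent_addition(plate, reagent, volume_ul):
    for well in plate:
        well.setdefault("reagents", []).append((reagent, volume_ul))

# An assay is assembled by chaining modules instead of writing a one-off script:
plate = []
standards = standard_qc_mrd([{"id": "STD1"}, {"id": "QC1"}], mrd=5)
samples = sample_dilution(sample_mrd([{"id": "S1"}, {"id": "S2"}], mrd=10), extra_factor=2)
standard_qc_sample_addition(plate, standards + samples, volume_ul=50)
reagent_addition(plate, "detection reagent", volume_ul=25)
print(plate)
```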
Automation of Silica Bead-based Nucleic Acid Extraction on a Centrifugal Lab-on-a-Disc Platform
NASA Astrophysics Data System (ADS)
Kinahan, David J.; Mangwanya, Faith; Garvey, Robert; Chung, Danielle WY; Lipinski, Artur; Julius, Lourdes AN; King, Damien; Mohammadi, Mehdi; Mishra, Rohit; Al-Ofi, May; Miyazaki, Celina; Ducrée, Jens
2016-10-01
We describe a centrifugal microfluidic ‘Lab-on-a-Disc’ (LoaD) technology for DNA purification towards eventual integration into a Sample-to-Answer platform for detection of the pathogen Escherichia coli O157:H7 from food samples. For this application, we use a novel microfluidic architecture which combines ‘event-triggered’ dissolvable film (DF) valves with a reaction chamber gated by a centrifugo-pneumatic siphon valve (CPSV). This architecture permits comprehensive flow control by simple changes in the speed of the platform's innate spindle motor. Even before method optimisation, characterisation by DNA fluorescence reveals an extraction efficiency of 58%, which is close to that of commercial spin columns.
Automation--down to the nuts and bolts.
Fix, R J; Rowe, J M; McConnell, B C
2000-01-01
Laboratories that once viewed automation as an expensive luxury are now looking to automation as a solution to increase sample throughput, to help ensure data integrity and to improve laboratory safety. The question is no longer, 'Should we automate?', but 'How should we approach automation?' A laboratory may choose from three approaches when deciding to automate: (1) contract with a third party vendor to produce a turnkey system, (2) develop and fabricate the system in-house or (3) some combination of approaches (1) and (2). The best approach for a given laboratory depends upon its available resources. The first lesson to be learned in automation is that no matter how straightforward an idea appears in the beginning, the solution will not be realized until many complex problems have been resolved. Issues dealing with sample vessel manipulation, liquid handling and system control must be addressed before a final design can be developed. This requires expertise in engineering, electronics, programming and chemistry. Therefore, the team concept of automation should be employed to help ensure success. This presentation discusses the advantages and disadvantages of the three approaches to automation. The development of an automated sample handling and control system for the STAR System focused microwave will be used to illustrate the complexities encountered in a seemingly simple project, and to highlight the importance of the team concept to automation no matter which approach is taken. The STAR System focused microwave from CEM Corporation is an open vessel digestion system with six microwave cells. This system is used to prepare samples for trace metal determination. The automated sample handling was developed around an XYZ motorized gantry system. Grippers were specially designed to perform several different functions and to provide feedback to the control software. Software was written in Visual Basic 5.0 to control the movement of the samples and the operation and monitoring of the STAR microwave. This software also provides a continuous update of the system's status to the computer screen. The system provides unattended preparation of up to 59 samples per run.
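As a rough sketch of the scheduling idea only (the real control software was written in Visual Basic 5.0 against the instrument interfaces, not in Python), the loop below cycles samples through six microwave cells and reports status as it goes; the digestion duration, tick model and sample identifiers are invented.

```python
# Hypothetical scheduler for moving samples through six microwave cells; not the
# STAR System software. Digestion is reduced to a fixed number of "ticks".
from collections import deque

CELLS = 6
DIGESTION_TICKS = 3     # placeholder for the actual digestion time

def run_batch(sample_ids):
    waiting = deque(sample_ids)
    cells = [None] * CELLS            # (sample_id, ticks_remaining) or None
    done = []
    while waiting or any(cells):
        for i in range(CELLS):
            if cells[i] is not None:
                sample, ticks = cells[i]
                cells[i] = None if ticks == 1 else (sample, ticks - 1)
                if cells[i] is None:
                    done.append(sample)                         # gripper unloads finished sample
            if cells[i] is None and waiting:
                cells[i] = (waiting.popleft(), DIGESTION_TICKS)  # gripper loads the next sample
        print(f"busy={sum(c is not None for c in cells)} done={len(done)} waiting={len(waiting)}")
    return done

run_batch([f"sample_{n:02d}" for n in range(1, 60)])   # up to 59 samples per run
```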
Automated DBS microsampling, microscale automation and microflow LC-MS for therapeutic protein PK.
Zhang, Qian; Tomazela, Daniela; Vasicek, Lisa A; Spellman, Daniel S; Beaumont, Maribel; Shyong, BaoJen; Kenny, Jacqueline; Fauty, Scott; Fillgrove, Kerry; Harrelson, Jane; Bateman, Kevin P
2016-04-01
Reduce animal usage for discovery-stage PK studies for biologics programs using microsampling-based approaches and microscale LC-MS. We report the development of an automated DBS-based serial microsampling approach for studying the PK of therapeutic proteins in mice. Automated sample preparation and microflow LC-MS were used to enable assay miniaturization and improve overall assay throughput. Serial sampling of mice was possible over the full 21-day study period with the first six time points over 24 h being collected using automated DBS sample collection. Overall, this approach demonstrated comparable data to a previous study using single mice per time point liquid samples while reducing animal and compound requirements by 14-fold. Reduction in animals and drug material is enabled by the use of automated serial DBS microsampling for mice studies in discovery-stage studies of protein therapeutics.
An automated single ion hit at JAERI heavy ion microbeam to observe individual radiation damage
NASA Astrophysics Data System (ADS)
Kamiya, Tomihiro; Sakai, Takuro; Naitoh, Yutaka; Hamano, Tsuyoshi; Hirao, Toshio
1999-10-01
Microbeam scanning and a single ion hit technique have been combined to establish an automated beam positioning and single ion hit system at the JAERI Takasaki heavy ion microbeam system. Single ion irradiation on preset points of a sample in various patterns can be performed automatically in a short period. The reliability of the system was demonstrated using CR-39 nuclear track detectors. Single ion hit patterns were achieved with a positioning accuracy of 2 μm or less. In measurements of single-event transient current using this system, a reduction of the pulse height caused by the accumulation of radiation damage was observed when single ions were repeatedly injected into the same local areas. This technique showed the possibility of obtaining quantitative information about the lateral displacement of an individual radiation effect in silicon PIN photodiodes. This paper gives details of the irradiation system and presents results from several experiments.
Solís-Marcos, Ignacio; Galvao-Carmona, Alejandro; Kircher, Katja
2017-01-01
Research on partially automated driving has revealed relevant problems with driving performance, particularly when drivers’ intervention is required (e.g., take-over when automation fails). Mental fatigue has commonly been proposed to explain these effects after prolonged automated drives. However, performance problems have also been reported after just a few minutes of automated driving, indicating that other factors may also be involved. We hypothesize that, besides mental fatigue, an underload effect of partial automation may also affect driver attention. In this study, such potential effect was investigated during short periods of partially automated and manual driving and at different speeds. Subjective measures of mental demand and vigilance and performance to a secondary task (an auditory oddball task) were used to assess driver attention. Additionally, modulations of some specific attention-related event-related potentials (ERPs, N1 and P3 components) were investigated. The mental fatigue effects associated with the time on task were also evaluated by using the same measurements. Twenty participants drove in a fixed-base simulator while performing an auditory oddball task that elicited the ERPs. Six conditions were presented (5–6 min each) combining three speed levels (low, comfortable and high) and two automation levels (manual and partially automated). The results showed that, when driving partially automated, scores in subjective mental demand and P3 amplitudes were lower than in the manual conditions. Similarly, P3 amplitude and self-reported vigilance levels decreased with the time on task. Based on previous studies, these findings might reflect a reduction in drivers’ attention resource allocation, presumably due to the underload effects of partial automation and to the mental fatigue associated with the time on task. Particularly, such underload effects on attention could explain the performance decrements after short periods of automated driving reported in other studies. However, further studies are needed to investigate this relationship in partial automation and in other automation levels. PMID:29163112
Performance of Copan WASP for Routine Urine Microbiology
Quiblier, Chantal; Jetter, Marion; Rominski, Mark; Mouttet, Forouhar; Böttger, Erik C.; Keller, Peter M.
2015-01-01
This study compared a manual workup of urine clinical samples with fully automated WASPLab processing. As a first step, two different inocula (1 and 10 μl) and different streaking patterns were compared using WASP and InoqulA BT instrumentation. Significantly more single colonies were produced with the 10-μl inoculum than with the 1-μl inoculum, and automated streaking yielded significantly more single colonies than manual streaking on whole plates (P < 0.001). In a second step, 379 clinical urine samples were evaluated using WASP and the manual workup. Average numbers of detected morphologies, recovered species, and CFUs per milliliter of all 379 urine samples showed excellent agreement between WASPLab and the manual workup. The percentage of urine samples clinically categorized as positive or negative did not differ between the automated and manual workflow, but within the positive samples, automated processing by WASPLab resulted in the detection of more potential pathogens. In summary, the present study demonstrates that (i) the streaking pattern, i.e., primarily the number of zigzags/length of streaking lines, is critical for optimizing the number of single colonies yielded from primary cultures of urine samples; (ii) automated streaking by the WASP instrument is superior to manual streaking regarding the number of single colonies yielded (for 32.2% of the samples); and (iii) automated streaking leads to higher numbers of detected morphologies (for 47.5% of the samples), species (for 17.4% of the samples), and pathogens (for 3.4% of the samples). The results of this study point to an improved quality of microbiological analyses and laboratory reports when using automated sample processing by WASP and WASPLab. PMID:26677255
APDS: Autonomous Pathogen Detection System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langlois, R G; Brown, S; Burris, L
An early warning system to counter bioterrorism, the Autonomous Pathogen Detection System (APDS) continuously monitors the environment for the presence of biological pathogens (e.g., anthrax) and once detected, it sounds an alarm much like a smoke detector warns of a fire. Long before September 11, 2001, this system was being developed to protect domestic venues and events including performing arts centers, mass transit systems, major sporting and entertainment events, and other high profile situations in which the public is at risk of becoming a target of bioterrorist attacks. Customizing off-the-shelf components and developing new components, a multidisciplinary team developed APDS, a stand-alone system for rapid, continuous monitoring of multiple airborne biological threat agents in the environment. The completely automated APDS samples the air, prepares fluid samples in-line, and performs two orthogonal tests: immunoassay and nucleic acid detection. When compared to competing technologies, APDS is unprecedented in terms of flexibility and system performance.
Bhagwat, Swarupa Nikhil; Sharma, Jayashree H; Jose, Julie; Modi, Charusmita J
2015-01-01
Context: The routine immunohematological tests can be performed by automated as well as manual techniques. These techniques have advantages and disadvantages inherent to them. Aims: The present study aims to compare the results of manual and automated techniques for blood grouping and crossmatching so as to validate the automated system effectively. Materials and Methods: A total of 1000 samples were subjected to blood grouping by the conventional tube technique (CTT) and the automated microplate LYRA system on Techno TwinStation. A total of 269 samples (multitransfused patients and multigravida females) were compared for 927 crossmatches by the CTT in indirect antiglobulin phase against the column agglutination technique (CAT) performed on Techno TwinStation. Results: For blood grouping, the study showed a concordance in results for 942/1000 samples (94.2%), discordance for 4/1000 (0.4%) samples and uninterpretable results for 54/1000 samples (5.4%). On resolution, the uninterpretable results reduced to 49/1000 samples (4.9%) with 951/1000 samples (95.1%) showing concordant results. For crossmatching, the automated CAT showed concordant results in 887/927 (95.6%) and discordant results in 3/927 (0.32%) crossmatches as compared to the CTT. A total of 37/927 (3.9%) crossmatches were not interpretable by the automated technique. Conclusions: The automated system shows a high concordance of results with CTT and hence can be brought into routine use. However, the high proportion of uninterpretable results emphasizes that proper training and standardization are needed prior to its use. PMID:26417159
MARS: bringing the automation of small-molecule bioanalytical sample preparations to a new frontier.
Li, Ming; Chou, Judy; Jing, Jing; Xu, Hui; Costa, Aldo; Caputo, Robin; Mikkilineni, Rajesh; Flannelly-King, Shane; Rohde, Ellen; Gan, Lawrence; Klunk, Lewis; Yang, Liyu
2012-06-01
In recent years, there has been a growing interest in automating small-molecule bioanalytical sample preparations specifically using the Hamilton MicroLab(®) STAR liquid-handling platform. In the most extensive work reported thus far, multiple small-molecule sample preparation assay types (protein precipitation extraction, SPE and liquid-liquid extraction) have been integrated into a suite that is composed of graphical user interfaces and Hamilton scripts. Using that suite, bioanalytical scientists have been able to automate various sample preparation methods to a great extent. However, there are still areas that could benefit from further automation, specifically, the full integration of analytical standard and QC sample preparation with study sample extraction in one continuous run, real-time 2D barcode scanning on the Hamilton deck and direct Laboratory Information Management System database connectivity. We developed a new small-molecule sample-preparation automation system that improves in all of the aforementioned areas. The improved system presented herein further streamlines the bioanalytical workflow, simplifies batch run design, reduces analyst intervention and eliminates sample-handling error.
Flexible automated approach for quantitative liquid handling of complex biological samples.
Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H
2007-11-01
A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.
Létant, Sonia E; Murphy, Gloria A; Alfaro, Teneile M; Avila, Julie R; Kane, Staci R; Raber, Ellen; Bunt, Thomas M; Shah, Sanjiv R
2011-09-01
In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples.
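The viability call in RV-PCR rests on the shift in real-time PCR cycle threshold (Ct) between an aliquot analyzed before incubation and one analyzed after the 9-h incubation. The sketch below illustrates that decision rule only; the 6-cycle cutoff is an assumed placeholder, not a value taken from the study.

```python
# Hypothetical RV-PCR viability call from the change in Ct across incubation.
def rv_pcr_call(ct_before, ct_after, min_delta_ct=6.0):
    """Return True if live organisms are indicated (Ct drops after incubation)."""
    if ct_after is None:          # no amplification after incubation
        return False
    if ct_before is None:         # target only detectable after growth
        return True
    return (ct_before - ct_after) >= min_delta_ct

print(rv_pcr_call(ct_before=36.2, ct_after=27.8))   # large Ct drop -> live indicated
print(rv_pcr_call(ct_before=35.9, ct_after=35.5))   # little change -> not confirmed live
```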
Automated mitosis detection of stem cell populations in phase-contrast microscopy images.
Huh, Seungil; Ker, Dai Fei Elmer; Bise, Ryoma; Chen, Mei; Kanade, Takeo
2011-03-01
Due to the enormous potential and impact that stem cells may have on regenerative medicine, there has been a rapidly growing interest for tools to analyze and characterize the behaviors of these cells in vitro in an automated and high throughput fashion. Among these behaviors, mitosis, or cell division, is important since stem cells proliferate and renew themselves through mitosis. However, current automated systems for measuring cell proliferation often require destructive or sacrificial methods of cell manipulation such as cell lysis or in vitro staining. In this paper, we propose an effective approach for automated mitosis detection using phase-contrast time-lapse microscopy, which is a nondestructive imaging modality, thereby allowing continuous monitoring of cells in culture. In our approach, we present a probabilistic model for event detection, which can simultaneously 1) identify spatio-temporal patch sequences that contain a mitotic event and 2) localize a birth event, defined as the time and location at which cell division is completed and two daughter cells are born. Our approach significantly outperforms previous approaches in terms of both detection accuracy and computational efficiency, when applied to multipotent C3H10T1/2 mesenchymal and C2C12 myoblastic stem cell populations.
NASA Technical Reports Server (NTRS)
Prinzel, Lawrence J 3rd; Freeman, Frederick G.; Scerbo, Mark W.; Mikulka, Peter J.; Pope, Alan T.
2003-01-01
The present study examined the effects of an electroencephalographic- (EEG-) based system for adaptive automation on tracking performance and workload. In addition, event-related potentials (ERPs) to a secondary task were derived to determine whether they would provide an additional degree of workload specificity. Participants were run in an adaptive automation condition, in which the system switched between manual and automatic task modes based on the value of each individual's own EEG engagement index; a yoked control condition; or another control group, in which task mode switches followed a random pattern. Adaptive automation improved performance and resulted in lower levels of workload. Further, the P300 component of the ERP paralleled the sensitivity to task demands of the performance and subjective measures across conditions. These results indicate that it is possible to improve performance with a psychophysiological adaptive automation system and that ERPs may provide an alternative means for distinguishing among levels of cognitive task demand in such systems. Actual or potential applications of this research include improved methods for assessing operator workload and performance.
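A minimal sketch of the switching idea follows; it is not the study's implementation. The engagement-index formula (a beta/(alpha+theta)-style ratio), the deadband and the baseline value are assumptions for illustration, and the negative-feedback direction (low engagement returns the task to manual control, high engagement hands it to automation) follows the description above.

```python
# Hypothetical EEG-based adaptive automation switching rule.
def engagement_index(beta_power, alpha_power, theta_power):
    return beta_power / (alpha_power + theta_power)

def next_mode(current_mode, index, baseline, deadband=0.05):
    """Negative feedback: low engagement -> give the task back to the operator."""
    if index < baseline * (1 - deadband):
        return "manual"       # operator disengaging: increase task demand
    if index > baseline * (1 + deadband):
        return "automatic"    # operator highly engaged: offload the task
    return current_mode       # within the deadband: keep the current mode

mode, baseline = "manual", 0.40
for beta, alpha, theta in [(0.9, 1.1, 1.2), (1.4, 1.0, 1.1), (1.0, 1.3, 1.3)]:
    mode = next_mode(mode, engagement_index(beta, alpha, theta), baseline)
    print(mode)
```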
Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens
2014-07-07
The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling in the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to the completion of a preceding liquid transfer event, i.e., completely independently of external stimuli or changes in the speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from mammalian cell homogenate.
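The event-triggered control logic can be pictured as a dependency graph in which each valve fires only after the upstream transfers it depends on have completed, which also captures the AND-conditional mentioned above. The sketch below models that logic only; the real device is purely fluidic (a dissolvable film opens when wetted by the arriving liquid), and the valve names and dependencies are invented.

```python
# Hypothetical event-triggered valve network: each valve lists the transfer events
# that must complete before it can fire.
valve_triggers = {
    "V1": [],               # released by spinning up the disc
    "V2": ["V1"],           # opens once liquid released by V1 arrives
    "V3": ["V1"],
    "V4": ["V2", "V3"],     # AND-conditional: needs both upstream transfers
}

def firing_order(triggers):
    fired, order = set(), []
    while len(fired) < len(triggers):
        ready = [v for v, deps in triggers.items()
                 if v not in fired and all(d in fired for d in deps)]
        if not ready:
            raise ValueError("deadlocked valve network")
        for v in sorted(ready):
            fired.add(v)
            order.append(v)
    return order

print(firing_order(valve_triggers))   # ['V1', 'V2', 'V3', 'V4']
```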
ERIC Educational Resources Information Center
Zhang, Mo
2013-01-01
Many testing programs use automated scoring to grade essays. One issue in automated essay scoring that has not been examined adequately is population invariance and its causes. The primary purpose of this study was to investigate the impact of sampling in model calibration on population invariance of automated scores. This study analyzed scores…
An automated device for provoking and capturing wildlife calls
Ausband, David E.; Skrivseth, Jesse; Mitchell, Michael S.
2011-01-01
Some animals exhibit call-and-response behaviors that can be exploited to facilitate detection. Traditionally, acoustic surveys that use call-and-respond techniques have required an observer's presence to perform the broadcast, record the response, or both events. This can be labor-intensive and may influence animal behavior and, thus, survey results. We developed an automated acoustic survey device using commercially available hardware (e.g., laptop computer, speaker, microphone) and an author-created (JS) software program ("HOOT") that can be used to survey for any animal that calls. We tested this device to determine 1) deployment longevity, 2) effective sampling area, and 3) ability to detect known packs of gray wolves (Canis lupus) in Idaho, USA. Our device was able to broadcast and record twice daily for 6–7 days using the internal computer battery and surveyed an area of 3.3–17.5 km2 in relatively open habitat depending on the hardware components used. We surveyed for wolves at 2 active rendezvous sites used by closely monitored, radiocollared wolf packs and obtained 4 responses across both packs over 3 days of sampling. We confirmed reproduction in these 2 packs by detecting pup howls aurally from the resulting device recordings. Our device can broadcast and record animal calls and the computer software is freely downloadable. This automated survey device can be used to collect reliable data while reducing the labor costs traditionally associated with acoustic surveys.
An automated device for provoking and capturing Wildlife calls
Ausband, D.E.; Skrivseth, J.; Mitchell, M.S.
2011-01-01
Some animals exhibit call-and-response behaviors that can be exploited to facilitate detection. Traditionally, acoustic surveys that use call-and-respond techniques have required an observer's presence to perform the broadcast, record the response, or both events. This can be labor-intensive and may influence animal behavior and, thus, survey results. We developed an automated acoustic survey device using commercially available hardware (e.g., laptop computer, speaker, microphone) and an author-created (JS) software program ("HOOT") that can be used to survey for any animal that calls. We tested this device to determine 1) deployment longevity, 2) effective sampling area, and 3) ability to detect known packs of gray wolves (Canis lupus) in Idaho, USA. Our device was able to broadcast and record twice daily for 6-7 days using the internal computer battery and surveyed an area of 3.3-17.5 km2 in relatively open habitat depending on the hardware components used. We surveyed for wolves at 2 active rendezvous sites used by closely monitored, radiocollared wolf packs and obtained 4 responses across both packs over 3 days of sampling. We confirmed reproduction in these 2 packs by detecting pup howls aurally from the resulting device recordings. Our device can broadcast and record animal calls and the computer software is freely downloadable. This automated survey device can be used to collect reliable data while reducing the labor costs traditionally associated with acoustic surveys. © 2011 The Wildlife Society.
A real-time automated quality control of rain gauge data based on multiple sensors
NASA Astrophysics Data System (ADS)
qi, Y.; Zhang, J.
2013-12-01
Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes), and require careful quality controls (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness depends on gauge densities and precipitation regimes. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regimes, and the freezing level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product. The new product also agrees much better statistically with the independent gauges.
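A minimal sketch of the consistency idea is shown below; it is not the NMQ/Q2 implementation. An hourly gauge reading is flagged when it disagrees with the collocated radar hourly QPE by more than a tolerance; the fixed tolerances here are placeholders, whereas the real scheme varies them with radar sampling geometry, precipitation regime and freezing-level height.

```python
# Hypothetical radar-gauge consistency check for one hourly pair (mm of rain).
def qc_gauge(gauge_mm, radar_qpe_mm, abs_tol_mm=2.0, rel_tol=0.5):
    """Return 'pass' or 'suspect' for an hourly gauge reading given the radar QPE."""
    if gauge_mm < 0:
        return "suspect"                       # physically impossible reading
    diff = abs(gauge_mm - radar_qpe_mm)
    if diff <= abs_tol_mm:
        return "pass"
    if max(gauge_mm, radar_qpe_mm) > 0 and diff / max(gauge_mm, radar_qpe_mm) <= rel_tol:
        return "pass"
    return "suspect"

for gauge, radar in [(10.2, 9.1), (0.0, 12.4), (25.0, 16.0)]:
    print(gauge, radar, qc_gauge(gauge, radar))   # the (0.0, 12.4) case mimics a stuck gauge
```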
Oliver, David M; Porter, Kenneth D H; Heathwaite, A Louise; Zhang, Ting; Quilliam, Richard S
2015-07-01
Understanding the role of different rainfall scenarios on faecal indicator organism (FIO) dynamics under variable field conditions is important to strengthen the evidence base on which regulators and land managers can base informed decisions regarding diffuse microbial pollution risks. We sought to investigate the impact of low intensity summer rainfall on Escherichia coli-discharge (Q) patterns at the headwater catchment scale in order to provide new empirical data on FIO concentrations observed during baseflow conditions. In addition, we evaluated the potential impact of using automatic samplers to collect and store freshwater samples for subsequent microbial analysis during summer storm sampling campaigns. The temporal variation of E. coli concentrations with Q was captured during six events throughout a relatively dry summer in central Scotland. The relationship between E. coli concentration and Q was complex with no discernible patterns of cell emergence with Q that were repeated across all events. On several occasions, an order of magnitude increase in E. coli concentrations occurred even with slight increases in Q, but responses were not consistent and highlighted the challenges of attempting to characterise temporal responses of E. coli concentrations relative to Q during low intensity rainfall. Cross-comparison of E. coli concentrations determined in water samples using simultaneous manual grab and automated sample collection was undertaken with no difference in concentrations observed between methods. However, the duration of sample storage within the autosampler unit was found to be more problematic in terms of impacting on the representativeness of microbial water quality, with unrefrigerated autosamplers exhibiting significantly different concentrations of E. coli relative to initial samples after 12-h storage. The findings from this study provide important empirical contributions to the growing evidence base in the field of catchment microbial dynamics.
NASA Technical Reports Server (NTRS)
King, Ellis; Hart, Jeremy; Odegard, Ryan
2010-01-01
The Orion Crew Exploration Vehicle (CEV) is being designed to include significantly more automation capability than either the Space Shuttle or the International Space Station (ISS). In particular, the vehicle flight software has requirements to accommodate increasingly automated missions throughout all phases of flight. A data-driven flight software architecture will provide an evolvable automation capability to sequence through Guidance, Navigation & Control (GN&C) flight software modes and configurations while maintaining the required flexibility and human control over the automation. This flexibility is a key aspect needed to address the maturation of operational concepts, to permit ground and crew operators to gain trust in the system and mitigate unpredictability in human spaceflight. To allow for mission flexibility and reconfigurability, a data-driven approach is being taken to load the mission event plan as well as the flight software artifacts associated with the GN&C subsystem. A database of GN&C-level sequencing data is presented which manages and tracks the mission-specific and algorithm parameters to provide a capability to schedule GN&C events within mission segments. The flight software data schema for performing automated mission sequencing is presented with a concept of operations for interactions with ground and onboard crew members. A prototype architecture for fault identification, isolation and recovery interactions with the automation software is presented and discussed as a forward work item.
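To illustrate the data-driven idea only, the sketch below loads a small table of mission events and looks up the GN&C mode change scheduled for an observed event. The table contents, field names and events are invented and do not come from the Orion flight software; the point is simply that the sequencing logic is driven by loadable data rather than hard-coded branches.

```python
# Hypothetical data-driven GN&C sequencing table and lookup.
import json

mission_events = json.loads("""
[
  {"segment": "ascent", "event": "tower_jettison",  "gnc_mode": "ascent_guidance"},
  {"segment": "orbit",  "event": "orbit_insertion", "gnc_mode": "orbit_coast"},
  {"segment": "entry",  "event": "entry_interface", "gnc_mode": "entry_guidance"}
]
""")

def next_gnc_mode(current_segment, observed_event, table):
    for row in table:
        if row["segment"] == current_segment and row["event"] == observed_event:
            return row["gnc_mode"]
    return None   # no scheduled mode change for this event

print(next_gnc_mode("orbit", "orbit_insertion", mission_events))
```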
Automation Bias: Decision Making and Performance in High-Tech Cockpits
NASA Technical Reports Server (NTRS)
Mosier, Kathleen L.; Skitka, Linda J.; Heers, Susan; Burdick, Mark; Rosekind, Mark R. (Technical Monitor)
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits, and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate "automation bias," a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation "events," or opportunities for automation-related omission and commission errors. Pilots who perceived themselves as "accountable" for their performance and strategies of interaction with the automation were more likely to double-check automated functioning against other cues, and less likely to commit errors. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
Besmer, Michael D; Epting, Jannis; Page, Rebecca M; Sigrist, Jürg A; Huggenberger, Peter; Hammes, Frederik
2016-12-07
Detailed measurements of physical, chemical and biological dynamics in groundwater are key to understanding the important processes in place and their influence on water quality - particularly when used for drinking water. Measuring temporal bacterial dynamics at high frequency is challenging due to the limitations in automation of sampling and detection of the conventional, cultivation-based microbial methods. In this study, fully automated online flow cytometry was applied in a groundwater system for the first time in order to monitor microbial dynamics in a groundwater extraction well. Measurements of bacterial concentrations every 15 minutes during 14 days revealed both aperiodic and periodic dynamics that could not be detected previously, resulting in total cell concentration (TCC) fluctuations between 120 and 280 cells μL -1 . The aperiodic dynamic was linked to river water contamination following precipitation events, while the (diurnal) periodic dynamic was attributed to changes in hydrological conditions as a consequence of intermittent groundwater extraction. Based on the high number of measurements, the two patterns could be disentangled and quantified separately. This study i) increases the understanding of system performance, ii) helps to optimize monitoring strategies, and iii) opens the possibility for more sophisticated (quantitative) microbial risk assessment of drinking water treatment systems.
Besmer, Michael D.; Epting, Jannis; Page, Rebecca M.; Sigrist, Jürg A.; Huggenberger, Peter; Hammes, Frederik
2016-01-01
Detailed measurements of physical, chemical and biological dynamics in groundwater are key to understanding the important processes in place and their influence on water quality – particularly when used for drinking water. Measuring temporal bacterial dynamics at high frequency is challenging due to the limitations in automation of sampling and detection of the conventional, cultivation-based microbial methods. In this study, fully automated online flow cytometry was applied in a groundwater system for the first time in order to monitor microbial dynamics in a groundwater extraction well. Measurements of bacterial concentrations every 15 minutes during 14 days revealed both aperiodic and periodic dynamics that could not be detected previously, resulting in total cell concentration (TCC) fluctuations between 120 and 280 cells μL−1. The aperiodic dynamic was linked to river water contamination following precipitation events, while the (diurnal) periodic dynamic was attributed to changes in hydrological conditions as a consequence of intermittent groundwater extraction. Based on the high number of measurements, the two patterns could be disentangled and quantified separately. This study i) increases the understanding of system performance, ii) helps to optimize monitoring strategies, and iii) opens the possibility for more sophisticated (quantitative) microbial risk assessment of drinking water treatment systems. PMID:27924920
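With measurements every 15 minutes, the diurnal (periodic) component of the total cell concentration can be estimated as the mean daily cycle, and whatever remains highlights aperiodic excursions such as the rain-driven contamination events described above. The sketch below is only an illustration of that decomposition on synthetic data; it is not the authors' analysis, and all concentrations are invented.

```python
# Hypothetical separation of periodic and aperiodic TCC dynamics on synthetic data.
import numpy as np

samples_per_day = 96                      # 24 h / 15 min
t = np.arange(14 * samples_per_day)       # 14 days of measurements
tcc = 180 + 40 * np.sin(2 * np.pi * t / samples_per_day)      # synthetic diurnal pattern
tcc[500:530] += 80                        # one synthetic aperiodic contamination event

daily_cycle = tcc.reshape(14, samples_per_day).mean(axis=0)   # periodic component
residual = tcc - np.tile(daily_cycle, 14)                      # aperiodic component
print("largest aperiodic excursion: %.0f cells/uL" % residual.max())
```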
DOT National Transportation Integrated Search
1982-06-01
In order to examine specific Automated Guideway Transit (AGT) developments and concepts, and to build a better knowledge base for future decision-making, the Urban Mass Transportation Administration (UMTA) undertook a new program of studies and techn...
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System
Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-01-01
Developed under the Intelligence Advanced Research Projects Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years. PMID:25553271
Forecasting Significant Societal Events Using The Embers Streaming Predictive Analytics System.
Doyle, Andy; Katz, Graham; Summers, Kristen; Ackermann, Chris; Zavorin, Ilya; Lim, Zunsik; Muthiah, Sathappan; Butler, Patrick; Self, Nathan; Zhao, Liang; Lu, Chang-Tien; Khandpur, Rupinder Paul; Fayed, Youssef; Ramakrishnan, Naren
2014-12-01
Developed under the Intelligence Advanced Research Projects Activity Open Source Indicators program, Early Model Based Event Recognition using Surrogates (EMBERS) is a large-scale big data analytics system for forecasting significant societal events, such as civil unrest events, on the basis of continuous, automated analysis of large volumes of publicly available data. It has been operational since November 2012 and delivers approximately 50 predictions each day for countries of Latin America. EMBERS is built on a streaming, scalable, loosely coupled, shared-nothing architecture using ZeroMQ as its messaging backbone and JSON as its wire data format. It is deployed on Amazon Web Services using an entirely automated deployment process. We describe the architecture of the system, some of the design tradeoffs encountered during development, and specifics of the machine learning models underlying EMBERS. We also present a detailed prospective evaluation of EMBERS in forecasting significant societal events in the past 2 years.
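The messaging style described above (ZeroMQ backbone, JSON wire format) can be illustrated with the pyzmq bindings as below. The topic name and message schema are invented for the sketch and are not the EMBERS wire protocol.

```python
# Hypothetical publish/subscribe exchange of one JSON warning message over ZeroMQ.
import json
import time
import zmq

ctx = zmq.Context.instance()

pub = ctx.socket(zmq.PUB)                 # stand-in for a model process emitting warnings
pub.bind("tcp://127.0.0.1:5556")

sub = ctx.socket(zmq.SUB)                 # stand-in for a downstream consumer
sub.connect("tcp://127.0.0.1:5556")
sub.setsockopt_string(zmq.SUBSCRIBE, "predictions")
time.sleep(0.2)                           # give the subscription time to propagate

warning = {"event_type": "civil_unrest", "country": "BR", "probability": 0.72}
pub.send_string("predictions " + json.dumps(warning))

topic, _, payload = sub.recv_string().partition(" ")
print(topic, json.loads(payload))
```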
Migration monitoring with automated technology
Rhonda L. Millikin
2005-01-01
Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...
AUTOMATED SOLID PHASE EXTRACTION GC/MS FOR ANALYSIS OF SEMIVOLATILES IN WATER AND SEDIMENTS
Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line sampl...
Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A
2013-08-20
A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.
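The programmable character of the microfluidic automaton can be pictured as a protocol expressed as a list of primitive operations executed over the valve array; the sketch below illustrates only that composition idea. Operation names mirror the mixing, metering and routing operations mentioned above, but the chamber names, volumes and bookkeeping are invented and nothing here corresponds to the device's actual control code.

```python
# Hypothetical protocol runner composing mix/meter/route operations.
def run_protocol(protocol):
    state = {}                                   # chamber name -> volume (uL)
    for op, args in protocol:
        if op == "meter":
            state[args["dest"]] = args["volume_ul"]
        elif op == "mix":
            state[args["dest"]] = sum(state.pop(src, 0) for src in args["sources"])
        elif op == "route":
            state[args["dest"]] = state.pop(args["source"], 0)
        print(op, state)
    return state

labeling_protocol = [
    ("meter", {"dest": "chamber_A", "volume_ul": 2.0}),          # sample aliquot
    ("meter", {"dest": "chamber_B", "volume_ul": 2.0}),          # fluorescent label
    ("mix",   {"sources": ["chamber_A", "chamber_B"], "dest": "chamber_C"}),
    ("route", {"source": "chamber_C", "dest": "ce_injector"}),   # hand off to the CE chip
]
run_protocol(labeling_protocol)
```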
Automation for deep space vehicle monitoring
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M.
1991-01-01
Information on automation for deep space vehicle monitoring is given in viewgraph form. Information is given on automation goals and strategy; the Monitor Analyzer of Real-time Voyager Engineering Link (MARVEL); intelligent input data management; decision theory for making tradeoffs; dynamic tradeoff evaluation; evaluation of anomaly detection results; evaluation of data management methods; system level analysis with cooperating expert systems; the distributed architecture of multiple expert systems; and event driven response.
The relationship between cell phone use and management of driver fatigue: It's complicated.
Saxby, Dyani Juanita; Matthews, Gerald; Neubauer, Catherine
2017-06-01
Voice communication may enhance performance during monotonous, potentially fatiguing driving conditions (Atchley & Chan, 2011); however, it is unclear whether safety benefits of conversation are outweighed by costs. The present study tested whether personalized conversations intended to simulate hands-free cell phone conversation may counter objective and subjective fatigue effects elicited by vehicle automation. A passive fatigue state (Desmond & Hancock, 2001), characterized by disengagement from the task, was induced using full vehicle automation prior to drivers resuming full control over the driving simulator. A conversation was initiated shortly after reversion to manual control. During the conversation an emergency event occurred. The fatigue manipulation produced greater task disengagement and slower response to the emergency event, relative to a control condition. Conversation did not mitigate passive fatigue effects; rather, it added worry about matters unrelated to the driving task. Conversation moderately improved vehicle control, as measured by SDLP, but it failed to counter fatigue-induced slowing of braking in response to an emergency event. Finally, conversation appeared to have a hidden danger in that it reduced drivers' insights into performance impairments when in a state of passive fatigue. Automation induced passive fatigue, indicated by loss of task engagement; yet, simulated cell phone conversation did not counter the subjective automation-induced fatigue. Conversation also failed to counter objective loss of performance (slower braking speed) resulting from automation. Cell phone conversation in passive fatigue states may impair drivers' awareness of their performance deficits. Practical applications: Results suggest that conversation, even using a hands-free device, may not be a safe way to reduce fatigue and increase alertness during transitions from automated to manual vehicle control. Copyright © 2017 Elsevier Ltd and National Safety Council. All rights reserved.
Matheny, Michael E; Normand, Sharon-Lise T; Gross, Thomas P; Marinac-Dabic, Danica; Loyo-Berrios, Nilsa; Vidi, Venkatesan D; Donnelly, Sharon; Resnic, Frederic S
2011-12-14
Automated adverse outcome surveillance tools and methods have potential utility in quality improvement and medical product surveillance activities. Their use for assessing hospital performance on the basis of patient outcomes has received little attention. We compared risk-adjusted sequential probability ratio testing (RA-SPRT) implemented in an automated tool to Massachusetts public reports of 30-day mortality after isolated coronary artery bypass graft surgery. A total of 23,020 isolated adult coronary artery bypass surgery admissions performed in Massachusetts hospitals between January 1, 2002 and September 30, 2007 were retrospectively re-evaluated. The RA-SPRT method was implemented within an automated surveillance tool to identify hospital outliers in yearly increments. We used an overall type I error rate of 0.05, an overall type II error rate of 0.10, and a threshold that signaled if the odds of dying within 30 days after surgery were at least twice the expected odds. Annual hospital outlier status, based on the state-reported classification, was considered the gold standard. An event was defined as at least one occurrence of a higher-than-expected hospital mortality rate during a given year. We examined a total of 83 hospital-year observations. The RA-SPRT method signaled 6 events among three hospitals for 30-day mortality compared with 5 events among two hospitals using the state public reports, yielding a sensitivity of 100% (5/5) and a specificity of 98.8% (79/80). The automated RA-SPRT method performed well, detecting all of the true institutional outliers with a small false-positive alerting rate. Such a system could provide confidential automated notification to local institutions in advance of public reporting, providing opportunities for earlier quality improvement interventions.
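As a rough illustration of the sequential test described above, the sketch below accumulates a risk-adjusted SPRT log-likelihood ratio over a sequence of admissions, using the stated error rates (alpha = 0.05, beta = 0.10) and a doubling-of-odds alternative. The risk-model probabilities are assumed to come from an external model, and the stopping behavior is a simplified plain SPRT rather than the exact surveillance implementation.

```python
import math

def ra_sprt(observed, expected_probs, odds_ratio=2.0, alpha=0.05, beta=0.10):
    """Accumulate a risk-adjusted SPRT log-likelihood ratio (simplified sketch).

    observed       -- list of 0/1 outcomes (1 = 30-day death)
    expected_probs -- risk-model probability of death for each admission
    """
    upper = math.log((1 - beta) / alpha)   # cross: evidence odds at least doubled
    lower = math.log(beta / (1 - alpha))   # cross: evidence performance as expected
    llr = 0.0
    for y, p0 in zip(observed, expected_probs):
        # death probability under the alternative (odds multiplied by odds_ratio)
        p1 = odds_ratio * p0 / (1 - p0 + odds_ratio * p0)
        llr += y * math.log(p1 / p0) + (1 - y) * math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "alert"        # higher-than-expected mortality signalled
        if llr <= lower:
            return "in control"   # a surveillance variant would reset and continue
    return "undecided"
```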
Biomek 3000: the workhorse in an automated accredited forensic genetic laboratory.
Stangegaard, Michael; Meijer, Per-Johan; Børsting, Claus; Hansen, Anders J; Morling, Niels
2012-10-01
We have implemented and validated automated protocols for a wide range of processes such as sample preparation, PCR setup, and capillary electrophoresis setup using small, simple, and inexpensive automated liquid handlers. The flexibility and ease of programming enable the Biomek 3000 to be used in many parts of the laboratory process in a modern forensic genetics laboratory with low to medium sample throughput. In conclusion, we demonstrated that sample processing for accredited forensic genetic DNA typing can be implemented on small automated liquid handlers, leading to the reduction of manual work as well as increased quality and throughput.
Automated event generation for loop-induced processes
Hirschi, Valentin; Mattelaer, Olivier
2015-10-22
We present the first fully automated implementation of cross-section computation and event generation for loop-induced processes. This work is integrated in the MadGraph5_aMC@NLO framework. We describe the optimisations implemented at the level of the matrix element evaluation, phase space integration and event generation allowing for the simulation of large multiplicity loop-induced processes. Along with some selected differential observables, we illustrate our results with a table showing inclusive cross-sections for all loop-induced hadronic scattering processes with up to three final states in the SM as well as for some relevant 2 → 4 processes. Furthermore, many of these are computed here for the first time.
Van der Vorst, Sébastien; Dekairelle, Anne-France; Irenge, Léonid; Hamoir, Marc; Robert, Annie; Gala, Jean-Luc
2009-01-01
This study compared automated vs. manual tissue grinding in terms of RNA yield obtained from oral mucosa biopsies. A total of 20 patients undergoing uvulectomy for sleep-related disorders and 10 patients undergoing biopsy for head and neck squamous cell carcinoma were enrolled in the study. Samples were collected, snap-frozen in liquid nitrogen, and divided into two parts of similar weight. Sample grinding was performed on one sample from each pair, either manually or using an automated cell disruptor. The performance and efficacy of each homogenization approach was compared in terms of total RNA yield (spectrophotometry, fluorometry), mRNA quantity [densitometry of specific TP53 amplicons and TP53 quantitative reverse-transcribed real-time PCR (qRT-PCR)], and mRNA quality (functional analysis of separated alleles in yeast). Although spectrophotometry and fluorometry results were comparable for both homogenization methods, TP53 expression values obtained by amplicon densitometry and qRT-PCR were significantly and consistently better after automated homogenization (p<0.005) for both uvula and tumor samples. Functional analysis of separated alleles in yeast results was better with the automated technique for tumor samples. Automated tissue homogenization appears to be a versatile, quick, and reliable method of cell disruption and is especially useful in the case of small malignant samples, which show unreliable results when processed by manual homogenization.
Xu, Weiyi; Wan, Feng; Lou, Yufeng; Jin, Jiali; Mao, Weilin
2014-01-01
A number of automated devices for pretransfusion testing have recently become available. This study evaluated the Immucor Galileo System, a fully automated device based on the microplate hemagglutination technique, for ABO/Rh(D) determinations. Routine ABO/Rh typing tests were performed on 13,045 samples using the Immucor automated instruments. The manual tube method was used to resolve ABO forward and reverse grouping discrepancies. D-negative test results were investigated and confirmed manually by the indirect antiglobulin test (IAT). The system rejected 70 tests for sample inadequacy. Eighty-seven samples were read as "no type determined" due to forward and reverse grouping discrepancies; 25 of these results were caused by sample hemolysis. After further testing, we found that 34 were caused by weakened RBC antibodies, 5 were attributable to weak A and/or B antigens, 4 were due to mixed-field reactions, and 8 involved high-titer cold agglutinins reactive only at temperatures below 34 degrees C. In the remaining 11 cases, irregular RBC antibodies were identified in 9 samples (seven anti-M and two anti-P) and subgroups were identified in 2 samples (one A1 and one A2) by a reference laboratory. As for D typing, 2 weak D+ samples missed by the automated system gave negative results, but weak-positive reactions were observed in the IAT. The Immucor Galileo System is reliable and suited for ABO and D blood grouping, although several factors may cause discrepancies in ABO/D typing with a fully automated system. It is suggested that standardization of sample collection may improve the performance of the fully automated system.
Science Goal Monitor: Science Goal Driven Automation for NASA Missions
NASA Technical Reports Server (NTRS)
Koratkar, Anuradha; Grosvenor, Sandy; Jung, John; Pell, Melissa; Matusow, David; Bailyn, Charles
2004-01-01
Infusion of automation technologies into NASA's future missions will be essential because of the need to: (1) effectively handle an exponentially increasing volume of scientific data, (2) successfully meet dynamic, opportunistic scientific goals and objectives, and (3) substantially reduce mission operations staff and costs. While much effort has gone into automating routine spacecraft operations to reduce human workload and hence costs, applying intelligent automation to the science side, i.e., science data acquisition, data analysis, and reactions to that data analysis in a timely and still scientifically valid manner, has been relatively under-emphasized. In order to introduce science-driven automation in missions, we must be able to capture and interpret the science goals of observing programs, represent those goals in a machine-interpretable language, and allow the spacecraft's onboard systems to autonomously react to the scientist's goals. In short, we must teach our platforms to dynamically understand, recognize, and react to the scientists' goals. The Science Goal Monitor (SGM) project at NASA Goddard Space Flight Center is a prototype software tool being developed to determine the best strategies for implementing science goal driven automation in missions. The tools being developed in SGM improve the ability to monitor and react to the changing status of scientific events. The SGM system enables scientists to specify what to look for and how to react in descriptive rather than technical terms. The system monitors streams of science data to identify occurrences of key events previously specified by the scientist. When an event occurs, the system autonomously coordinates the execution of the scientist's desired reactions. Through SGM, we will improve our understanding of the capabilities needed onboard for success, develop metrics to understand the potential increase in science returns, and develop an operational prototype so that the perceived risks associated with increased use of automation can be reduced.
The All-Sky Automated Survey for Supernovae
NASA Astrophysics Data System (ADS)
Bersier, D.
2016-12-01
This is an overview of the All-Sky Automated Survey for SuperNovae - ASAS-SN. We briefly present the hardware and capabilities of the survey and describe the most recent science results, in particular tidal disruption events and supernovae, including the brightest SN ever found.
Enabling Automated Dynamic Demand Response: From Theory to Practice
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frincu, Marc; Chelmis, Charalampos; Aman, Saima
2015-07-14
Demand response (DR) is a technique used in smart grids to shape customer load during peak hours. Automated DR offers utilities fine-grained control and a high degree of confidence in the outcome. However, the impact on the customer's comfort means this technique is more suited for industrial and commercial settings than for residential homes. In this paper we propose a system for achieving automated, controlled DR in a heterogeneous environment. We present some of the main issues arising in building such a system, including privacy, customer satisfiability, reliability, and fast decision turnaround, with emphasis on the solutions we proposed. Based on the lessons we learned from empirical results, we describe an integrated automated system for controlled DR on the USC microgrid. Results show that while on a per-building, per-event basis the accuracy of our prediction and customer selection techniques varies, it performs well on average when considering several events and buildings.
A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*
Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing
2016-01-01
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the...
21 CFR 864.5240 - Automated blood cell diluting apparatus.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated blood cell diluting apparatus. 864.5240... § 864.5240 Automated blood cell diluting apparatus. (a) Identification. An automated blood cell diluting apparatus is a fully automated or semi-automated device used to make appropriate dilutions of a blood sample...
21 CFR 864.5240 - Automated blood cell diluting apparatus.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated blood cell diluting apparatus. 864.5240... § 864.5240 Automated blood cell diluting apparatus. (a) Identification. An automated blood cell diluting apparatus is a fully automated or semi-automated device used to make appropriate dilutions of a blood sample...
Towards an Automated Classification of Transient Events in Synoptic Sky Surveys
NASA Technical Reports Server (NTRS)
Djorgovski, S. G.; Donalek, C.; Mahabal, A. A.; Moghaddam, B.; Turmon, M.; Graham, M. J.; Drake, A. J.; Sharma, N.; Chen, Y.
2011-01-01
We describe the development of a system for an automated, iterative, real-time classification of transient events discovered in synoptic sky surveys. The system under development incorporates a number of machine learning techniques, mostly using Bayesian approaches, due to the sparse nature, heterogeneity, and variable incompleteness of the available data. The classifications are improved iteratively as new measurements are obtained. One novel feature is the development of an automated follow-up recommendation engine that suggests those measurements that would be the most advantageous in terms of resolving classification ambiguities and/or characterizing the astrophysically most interesting objects, given a set of available follow-up assets and their cost functions. This illustrates the symbiotic relationship of astronomy and applied computer science through the emerging discipline of AstroInformatics.
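A minimal sketch of the iterative Bayesian updating idea is shown below: each new measurement's class-conditional likelihoods are folded into the running class probabilities. The class names and numbers are hypothetical, not the survey's actual priors.

```python
def bayes_update(prior, likelihoods):
    """One iteration of Bayesian classification: combine the current class
    prior with the likelihood of the newest measurement under each class.

    prior       -- dict class -> probability
    likelihoods -- dict class -> P(new measurement | class)
    """
    posterior = {c: prior[c] * likelihoods.get(c, 0.0) for c in prior}
    total = sum(posterior.values())
    if total == 0.0:
        return prior                 # uninformative measurement; keep the prior
    return {c: p / total for c, p in posterior.items()}

# Hypothetical example: a new photometric point favours a supernova over an AGN.
prior = {"SN": 0.2, "AGN": 0.5, "CV": 0.3}
likes = {"SN": 0.60, "AGN": 0.10, "CV": 0.30}
print(bayes_update(prior, likes))
```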
A review of emergency medical services events in US national parks from 2007 to 2011.
Declerck, Matthieu P; Atterton, Laurie M; Seibert, Thomas; Cushing, Tracy A
2013-09-01
Outdoor recreation is growing in the United States, with more than 279 million annual visitors to areas controlled by the National Park Service (NPS). Emergency medical needs in these parks are overseen by National Park Service rangers within the NPS Emergency Medical Services (EMS) system. This study examines medical and traumatic emergencies throughout the NPS over a 5-year period to better understand the types of events and fatalities rangers encounter, both regionally and on a national scale. This is a retrospective review of the annual EMS reports published by the 7 NPS regions from 2007 to 2011. The following were compared and examined at a regional and national level: medical versus traumatic versus first aid events, cardiac events and outcomes, use of automated external defibrillators, and medical versus traumatic fatalities. The national incidence of EMS events was 45.9 events per 1 million visitors. Medical, traumatic, and first aid events composed 29%, 28%, and 43% of reports, respectively. Of medical episodes, 1.8% were cardiac arrests, of which 64.2% received automated external defibrillator treatment; 29.1% of cardiac arrests survived to hospital discharge. Of fatalities, 61.4% were traumatic in nature and the remaining 38.5% were nontraumatic (medical). Regional differences were found for all variables. On a national level, the NPS experiences an equal number of medical and traumatic EMS events. This differs from past observed trends that reported a higher incidence of traumatic events than medical events in wilderness settings. Cardiac events and automated external defibrillator usage are relatively infrequent. Traumatic fatalities are more common than medical fatalities in the NPS. Regional variations in events likely reflect differences in terrain, common activities, proximity to urban areas, and access to definitive care between regions. These data can assist the NPS in targeting the regions with the greatest number of incidents and fatalities for prevention, ranger training, and visitor education. Copyright © 2013 Wilderness Medical Society. Published by Elsevier Inc. All rights reserved.
Ben-Yoav, Hadar; Dykstra, Peter H; Bentley, William E; Ghodssi, Reza
2017-01-01
A microfluidic electrochemical lab-on-a-chip (LOC) device for DNA hybridization detection has been developed. The device comprises a 3 × 3 array of microelectrodes integrated with a dual layer microfluidic valved manipulation system that provides controlled and automated capabilities for high throughput analysis of microliter volume samples. The surface of the microelectrodes is functionalized with single-stranded DNA (ssDNA) probes which enable specific detection of complementary ssDNA targets. These targets are detected by a capacitive technique which measures dielectric variation at the microelectrode-electrolyte interface due to DNA hybridization events. A quantitative analysis of the hybridization events is carried out based on a sensing modeling that includes detailed analysis of energy storage and dissipation components. By calculating these components during hybridization events the device is able to demonstrate specific and dose response sensing characteristics. The developed microfluidic LOC for DNA hybridization detection offers a technology for real-time and label-free assessment of genetic markers outside of laboratory settings, such as at the point-of-care or in-field environmental monitoring.
[Establishment of Automation System for Detection of Alcohol in Blood].
Tian, L L; Shen, Lei; Xue, J F; Liu, M M; Liang, L J
2017-02-01
To establish an automation system for the detection of alcohol content in blood. The determination was performed on an automated extraction workstation coupled with headspace gas chromatography (HS-GC). Negative-pressure blood collection, the sealing time of the headspace vial, and the sample needle were checked and optimized when setting up the automated system. Automatic sampling was compared with manual sampling. The quantitative data obtained by the automated extraction-HS-GC workstation for alcohol were stable, and the relative differences between two parallel samples were less than 5%. The automated extraction was superior to the manual extraction. A good linear relationship was obtained over the alcohol concentration range of 0.1-3.0 mg/mL (r ≥ 0.999) with good repeatability. The method is simple and quick, with a more standardized experimental process and accurate experimental data. It eliminates operator error and has good repeatability, and it can be applied to the qualitative and quantitative detection of alcohol in blood. Copyright© by the Editorial Department of Journal of Forensic Medicine
Vorberg, Ellen; Fleischer, Heidi; Junginger, Steffen; Liu, Hui; Stoll, Norbert; Thurow, Kerstin
2016-10-01
Life science areas require specific sample pretreatment to increase the concentration of the analytes and/or to convert the analytes into an appropriate form for the detection and separation systems. Various workstations are commercially available, allowing for automated biological sample pretreatment. Nevertheless, due to the temperature, pressure, and volume conditions required in typical element- and structure-specific measurements, these automated platforms are not suitable for such analytical processes. Thus, the purpose of the presented investigation was the design, realization, and evaluation of an automated system ensuring high-precision sample preparation for a variety of analytical measurements. The developed system has to enable system adaption and high performance flexibility. Furthermore, the system has to be capable of dealing with the wide range of required vessels simultaneously, allowing for less costly and time-consuming process steps. The system's functionality was confirmed in various validation sequences. Using element-specific measurements, the automated system was up to 25% more precise than the manual procedure, and it was as precise as the manual procedure using structure-specific measurements. © 2015 Society for Laboratory Automation and Screening.
Automated Electroglottographic Inflection Events Detection. A Pilot Study.
Codino, Juliana; Torres, María Eugenia; Rubin, Adam; Jackson-Menaldi, Cristina
2016-11-01
Vocal-fold vibration can be analyzed in a noninvasive way by registering impedance changes within the glottis, through electroglottography. The morphology of the electroglottographic (EGG) signal is related to different vibratory patterns. In the literature, a characteristic knee in the descending portion of the signal has been reported. Some EGG signals do not exhibit this particular knee and have other types of events (inflection events) throughout the ascending and/or descending portion of the vibratory cycle. The goal of this work is to propose an automatic method to identify and classify these events. A computational algorithm was developed based on the mathematical properties of the EGG signal, which detects and reports events throughout the contact phase. Retrospective analysis of EGG signals obtained during routine voice evaluation of adult individuals with a variety of voice disorders was performed using the algorithm as well as human raters. Two judges, both experts in clinical voice analysis, and three general speech pathologists performed manual and visual evaluation of the sample set. The results obtained by the automatic method were compared with those of the human raters. Statistical analysis revealed a significant level of agreement. This automatic tool could allow professionals in the clinical setting to obtain an automatic quantitative and qualitative report of such events present in a voice sample, without having to manually analyze the whole EGG signal. In addition, it might provide the speech pathologist with more information that would complement the standard voice evaluation. It could also be a valuable tool in voice research. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
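The published algorithm is based on mathematical properties of the EGG signal that the abstract does not spell out; as one plausible sketch, the code below flags candidate inflection events from sign changes of the smoothed second derivative within a single vibratory cycle. The smoothing window and prominence threshold are placeholders, not the authors' criteria.

```python
import numpy as np

def inflection_events(egg_cycle, smooth=5, min_prominence=0.01):
    """Flag candidate inflection events in one EGG cycle (illustrative sketch).

    egg_cycle -- 1-D array of EGG samples for a single vibratory cycle
    Returns sample indices where the smoothed curvature changes sign markedly.
    """
    x = np.convolve(egg_cycle, np.ones(smooth) / smooth, mode="same")  # smooth
    d2 = np.gradient(np.gradient(x))                                   # curvature
    crossings = np.where(np.diff(np.sign(d2)) != 0)[0]
    # keep only crossings where the curvature swing is non-negligible
    keep = [i for i in crossings
            if abs(d2[max(i - smooth, 0)] - d2[min(i + smooth, len(d2) - 1)])
            > min_prominence * np.ptp(x)]
    return keep
```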
Létant, Sonia E.; Murphy, Gloria A.; Alfaro, Teneile M.; Avila, Julie R.; Kane, Staci R.; Raber, Ellen; Bunt, Thomas M.; Shah, Sanjiv R.
2011-01-01
In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples. PMID:21764960
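The core RV-PCR decision rule, detecting growth as a drop in cycle threshold between the pre- and post-incubation aliquots, can be sketched as follows. The delta-Ct cut-off and the handling of non-detects are illustrative placeholders rather than the validated criteria from the study.

```python
def rv_pcr_call(ct_before, ct_after, delta_ct_threshold=6.0, max_ct=45.0):
    """Rapid-viability PCR decision sketch: live organisms replicate during
    incubation, so the cycle threshold (Ct) should drop between the pre- and
    post-incubation aliquots.  Thresholds here are placeholders.
    """
    before = ct_before if ct_before is not None else max_ct  # no detection
    after = ct_after if ct_after is not None else max_ct
    delta_ct = before - after
    return "viable agent detected" if delta_ct >= delta_ct_threshold else "not detected"

# Hypothetical example: Ct falls from 38.2 to 29.5 after 9 h of incubation.
print(rv_pcr_call(38.2, 29.5))
```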
Multiplexed detection of mycotoxins in foods with a regenerable array.
Ngundi, Miriam M; Shriver-Lake, Lisa C; Moore, Martin H; Ligler, Frances S; Taitt, Chris R
2006-12-01
The occurrence of different mycotoxins in cereal products calls for the development of a rapid, sensitive, and reliable detection method that is capable of analyzing samples for multiple toxins simultaneously. In this study, we report the development and application of a multiplexed competitive assay for the simultaneous detection of ochratoxin A (OTA) and deoxynivalenol (DON) in spiked barley, cornmeal, and wheat, as well as in naturally contaminated maize samples. Fluoroimmunoassays were performed with the Naval Research Laboratory array biosensor, by both a manual and an automated version of the system. This system employs evanescent-wave fluorescence excitation to probe binding events as they occur on the surface of a waveguide. Methanolic extracts of the samples were diluted threefold with buffer containing a mixture of fluorescent antibodies and were then passed over the arrays of mycotoxins immobilized on a waveguide. Fluorescent signals of the surface-bound antibody-antigen complexes decreased with increasing concentrations of free mycotoxins in the extract. After sample analysis was completed, surfaces were regenerated with 6 M guanidine hydrochloride in 50 mM glycine, pH 2.0. The limits of detection determined by the manual biosensor system were as follows: 1, 180, and 65 ng/g for DON and 1, 60, and 85 ng/g for OTA in cornmeal, wheat, and barley, respectively. The limits of detection in cornmeal determined with the automated array biosensor were 15 and 150 ng/g for OTA and DON, respectively.
AAC Best Practice Using Automated Language Activity Monitoring.
ERIC Educational Resources Information Center
Hill, Katya; Romich, Barry
This brief paper describes automated language activity monitoring (LAM), an augmentative and alternative communication (AAC) methodology for the collection, editing, and analysis of language data in structured or natural situations with people who have severe communication disorders. The LAM function records each language event (letters, words,…
Automated Microflow NMR: Routine Analysis of Five-Microliter Samples
Jansma, Ariane; Chuan, Tiffany; Geierstanger, Bernhard H.; Albrecht, Robert W.; Olson, Dean L.; Peck, Timothy L.
2006-01-01
A microflow CapNMR probe double-tuned for 1H and 13C was installed on a 400-MHz NMR spectrometer and interfaced to an automated liquid handler. Individual samples dissolved in DMSO-d6 are submitted for NMR analysis in vials containing as little as 10 μL of sample. Sets of samples are submitted in a low-volume 384-well plate. Of the 10 μL of sample per well, as with vials, 5 μL is injected into the microflow NMR probe for analysis. For quality control of chemical libraries, 1D NMR spectra are acquired under full automation from 384-well plates on as many as 130 compounds within 24 h using 128 scans per spectrum and a sample-to-sample cycle time of ∼11 min. Because of the low volume requirements and high mass sensitivity of the microflow NMR system, 30 nmol of a typical small molecule is sufficient to obtain high-quality, well-resolved, 1D proton or 2D COSY NMR spectra in ∼6 or 20 min of data acquisition time per experiment, respectively. Implementation of pulse programs with automated solvent peak identification and suppression allow for reliable data collection, even for samples submitted in fully protonated DMSO. The automated microflow NMR system is controlled and monitored using web-based software. PMID:16194121
Russi, Silvia; Song, Jinhu; McPhillips, Scott E.; ...
2016-02-24
The Stanford Automated Mounter System, a system for mounting and dismounting cryo-cooled crystals, has been upgraded to increase the throughput of samples on the macromolecular crystallography beamlines at the Stanford Synchrotron Radiation Lightsource. This upgrade speeds up robot maneuvers, reduces the heating/drying cycles, pre-fetches samples and adds an air-knife to remove frost from the gripper arms. As a result, sample pin exchange during automated crystal quality screening now takes about 25 s, five times faster than before this upgrade.
Synoptic Sky Surveys: Lessons Learned and Challenges Ahead
NASA Astrophysics Data System (ADS)
Djorgovski, Stanislav G.; CRTS Team
2014-01-01
A new generation of synoptic sky surveys is now opening the time domain for systematic exploration, presenting both great new scientific opportunities and challenges. These surveys are touching essentially all subfields of astronomy, producing large statistical samples of the known types of objects and events (e.g., SNe, AGN, variable stars of many kinds), and have already uncovered previously unknown subtypes of these (e.g., rare or peculiar types of SNe). They are generating new science now, and paving the way for even larger surveys to come, e.g., the LSST. Our ability to fully exploit such forthcoming facilities depends critically on the science, methodology, and experience that are being accumulated now. Among the outstanding challenges, the foremost is our ability to conduct an effective follow-up of the interesting events discovered by the surveys in any wavelength regime. The follow-up resources, especially spectroscopy, are already severely limited, and this problem will grow by orders of magnitude. This requires an intelligent down-selection of the most astrophysically interesting events to follow. The first step in that process is an automated, real-time, iterative classification of transient events that incorporates heterogeneous data from the surveys themselves, archival information (spatial, temporal, and multiwavelength), and the incoming follow-up observations. The second step is an optimal automated event prioritization and allocation of the available follow-up resources, which also change in time. Both of these challenges are highly non-trivial, and require a strong cyber-infrastructure based on the Virtual Observatory data grid and the various astroinformatics efforts now under way. This is inherently an astronomy of telescope-computational systems that increasingly depends on novel machine learning and artificial intelligence tools. Another arena with a strong potential for discovery is an archival, non-time-critical exploration of the time domain, with the time dimension adding complexity to an already challenging problem of data mining of highly dimensional data parameter spaces.
Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane
2012-09-01
Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation however is not easy: The automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO(®)150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch(®)Forensic DNA Purification Kit (Invitrogen), the PrepFiler™Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Stadler, H; Klock, E; Skritek, P; Mach, R L; Zerobin, W; Farnleitner, A H
2010-01-01
Because spring water quality from alpine karst aquifers can change very rapidly during event situations, water abstraction management has to be performed in near real-time. Four summer events (2005-2008) at alpine karst springs were investigated in detail in order to evaluate the spectral absorption coefficient at 254 nm (SAC254) as a real-time early warning proxy for faecal pollution. For the investigation, Low-Earth-Orbit (LEO) satellite-based data communication between portable hydrometeorological measuring stations and an automated microbiological sampling device was used. The method for event-triggered microbial sampling and analysis was already established and described in a previous paper. Data analysis including on-line event characterisation (i.e. precipitation, discharge, turbidity, SAC254) and comprehensive E. coli determination (n>800) indicated that SAC254 is a useful early warning proxy. Irrespective of the studied event situations, SAC254 always increased 3 to 6 hours earlier than the onset of faecal pollution, featuring different correlation phases. Furthermore, it seems also possible to use SAC254 as a real-time proxy parameter for estimating the extent of faecal pollution after establishing specific spring- and event-type calibrations that take into consideration the variability of the occurrence and the transferability of faecal material. It should be highlighted that diffuse faecal pollution from wildlife and livestock sources was responsible for spring water contamination in the investigated catchments. In this respect, the SAC254 can also provide useful information to support microbial source tracking efforts where different infiltration situations have to be investigated.
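A simplified sketch of how SAC254 could drive event-triggered sampling is given below: the automated sampler is started when the latest SAC254 reading rises well above its recent baseline, exploiting the reported 3 to 6 hour lead time over the onset of faecal pollution. The window length and rise factor are illustrative, not the spring- and event-type calibrations discussed in the paper.

```python
def sac254_trigger(sac_series, baseline_window=24, rise_factor=1.5):
    """Return True when the newest SAC254 reading exceeds its recent baseline
    by a chosen factor, signalling that event-triggered microbial sampling
    should start.  Window length and factor are illustrative placeholders.

    sac_series -- list of SAC254 readings, oldest to newest
    """
    if len(sac_series) <= baseline_window:
        return False                      # not enough history for a baseline
    recent = sac_series[-baseline_window - 1:-1]
    baseline = sum(recent) / baseline_window
    return sac_series[-1] > rise_factor * baseline
```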
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2014 CFR
2014-04-01
....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2011 CFR
2011-04-01
....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2012 CFR
2012-04-01
....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...
21 CFR 864.5200 - Automated cell counter.
Code of Federal Regulations, 2013 CFR
2013-04-01
....5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of the patient's peripheral blood (blood circulating in one of the body's extremities, such as the arm). These...
Nonanalytic Laboratory Automation: A Quarter Century of Progress.
Hawker, Charles D
2017-06-01
Clinical laboratory automation has blossomed since the 1989 AACC meeting, at which Dr. Masahide Sasaki first showed a western audience what his laboratory had implemented. Many diagnostics and other vendors are now offering a variety of automated options for laboratories of all sizes. Replacing manual processing and handling procedures with automation was embraced by the laboratory community because of the obvious benefits of labor savings and improvement in turnaround time and quality. Automation was also embraced by the diagnostics vendors, who saw automation as a means of incorporating the analyzers purchased by their customers into larger systems in which the benefits of automation were integrated with the analyzers. This report reviews the options that are available to laboratory customers. These options include so-called task-targeted automation: modules that range from single-function devices that automate single tasks (e.g., decapping or aliquoting) to multifunction workstations that incorporate several of the functions of a laboratory sample processing department. The options also include total laboratory automation systems that use conveyors to link sample processing functions to analyzers and often include postanalytical features such as refrigerated storage and sample retrieval. Most importantly, this report reviews a recommended process for evaluating the need for new automation and for identifying the specific requirements of a laboratory and developing solutions that can meet those requirements. The report also discusses some of the practical considerations facing a laboratory in a new implementation and reviews the concept of machine vision to replace human inspections. © 2017 American Association for Clinical Chemistry.
Economic and workflow analysis of a blood bank automated system.
Shin, Kyung-Hwa; Kim, Hyung Hoi; Chang, Chulhun L; Lee, Eun Yup
2013-07-01
This study compared the estimated costs and times required for ABO/Rh(D) typing and unexpected antibody screening using an automated system and manual methods. The total cost included direct and labor costs. Labor costs were calculated on the basis of average operator salaries and unit values (minutes), defined as the hands-on time required to test one sample. To estimate unit values, workflows were recorded on video, and the time required for each process was analyzed separately. The unit values of ABO/Rh(D) typing using the manual method were 5.65 and 8.1 min during regular and unsocial working hours, respectively. The unit value was less than 3.5 min when several samples were tested simultaneously. The unit value for unexpected antibody screening was 2.6 min. The unit values using the automated method for ABO/Rh(D) typing, unexpected antibody screening, and both simultaneously were all 1.5 min. The total cost of ABO/Rh(D) typing of only one sample using the automated analyzer was lower than that of testing only one sample using the manual technique but higher than that of testing several samples simultaneously. The total cost of unexpected antibody screening using an automated analyzer was less than that using the manual method. ABO/Rh(D) typing using an automated analyzer incurs a lower unit value and cost than the manual technique when only one sample is tested at a time. Unexpected antibody screening using an automated analyzer always incurs a lower unit value and cost than the manual technique.
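The cost model described above reduces to a simple per-sample calculation, sketched below with the reported unit values (5.65 min manual and 1.5 min automated for single-sample ABO/Rh(D) typing during regular hours). The direct costs and per-minute salary used in the example are hypothetical placeholders, not figures from the study.

```python
def test_cost(direct_cost, salary_per_minute, unit_value_min):
    """Total cost per sample = direct cost + labor cost, where labor cost is
    the operator's per-minute salary times the hands-on time (unit value)."""
    return direct_cost + salary_per_minute * unit_value_min

# Hypothetical example: single-sample ABO/Rh(D) typing, manual vs. automated.
manual = test_cost(direct_cost=2.00, salary_per_minute=0.50, unit_value_min=5.65)
auto = test_cost(direct_cost=4.00, salary_per_minute=0.50, unit_value_min=1.5)
print(round(manual, 2), round(auto, 2))
```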
Semi-automated 96-well liquid-liquid extraction for quantitation of drugs in biological fluids.
Zhang, N; Hoffman, K L; Li, W; Rossi, D T
2000-02-01
A semi-automated liquid-liquid extraction (LLE) technique for biological fluid sample preparation was introduced for the quantitation of four drugs in rat plasma. All liquid transferring during sample preparation was automated using a Tomtec Quadra 96 Model 320 liquid handling robot, which processed up to 96 samples in parallel. The samples were either in 96-deep-well plate or tube-rack format. One plate of samples can be prepared in approximately 1.5 h, and the 96-well plate is directly compatible with the autosampler of an LC/MS system. Selection of organic solvents and recoveries are discussed. Also, precision, relative error, linearity, and quantitation of the semi-automated LLE method are estimated for four example drugs using LC/MS/MS with a multiple reaction monitoring (MRM) approach. The applicability of this method and future directions are evaluated.
Automated quantitative cytological analysis using portable microfluidic microscopy.
Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva
2016-06-01
In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bacterial and fungal DNA extraction from blood samples: automated protocols.
Lorenz, Michael G; Disqué, Claudia; Mühl, Helge
2015-01-01
Automation in DNA isolation is a necessity for routine practice employing molecular diagnosis of infectious agents. To this end, the development of automated systems for the molecular diagnosis of microorganisms directly in blood samples is at its beginning. Important characteristics of systems demanded for routine use include high recovery of microbial DNA, DNA-free containment for the reduction of DNA contamination from exogenous sources, DNA-free reagents and consumables, ideally a walkaway system, and economical pricing of the equipment and consumables. Such full automation of DNA extraction evaluated and in use for sepsis diagnostics is yet not available. Here, we present protocols for the semiautomated isolation of microbial DNA from blood culture and low- and high-volume blood samples. The protocols include a manual pretreatment step followed by automated extraction and purification of microbial DNA.
Optimizing transformations for automated, high throughput analysis of flow cytometry data
2010-01-01
Background: In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. Results: We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce variability in the location of discovered cell populations across samples, and decrease the misclassification (mis-gating) of individual events when compared to default-parameter counterparts. Conclusions: Our results indicate that the preferred transformation for fluorescence channels is a parameter-optimized biexponential or generalized Box-Cox, in accordance with current best practices. Interestingly, for populations in the scatter channels, we find that the optimized hyperbolic arcsine may be a better choice in a high-throughput setting than current standard practice of no transformation. However, generally speaking, the choice of transformation remains data-dependent. We have implemented our algorithm in the BioConductor package, flowTrans, which is publicly available. PMID:21050468
Optimizing transformations for automated, high throughput analysis of flow cytometry data.
Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael
2010-11-04
In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce variability in the location of discovered cell populations across samples, and decrease the misclassification (mis-gating) of individual events when compared to default-parameter counterparts. Our results indicate that the preferred transformation for fluorescence channels is a parameter- optimized biexponential or generalized Box-Cox, in accordance with current best practices. Interestingly, for populations in the scatter channels, we find that the optimized hyperbolic arcsine may be a better choice in a high-throughput setting than current standard practice of no transformation. However, generally speaking, the choice of transformation remains data-dependent. We have implemented our algorithm in the BioConductor package, flowTrans, which is publicly available.
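As an illustration of the maximum-likelihood parameter optimization described in these two records, the sketch below fits the cofactor of a one-parameter hyperbolic arcsine transform by maximizing a profile log-likelihood that assumes the transformed values are approximately normal. This is a simplified stand-in for the flowTrans models rather than the package's exact implementation, and the simulated channel data are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def arcsinh_loglik(b, x):
    """Profile log-likelihood of y = arcsinh(b*x) under a working normality
    assumption on y; includes the Jacobian term log|dy/dx|."""
    y = np.arcsinh(b * x)
    n = len(x)
    var = np.var(y)
    jacobian = np.sum(np.log(b) - 0.5 * np.log1p((b * x) ** 2))
    return -0.5 * n * (np.log(2 * np.pi * var) + 1) + jacobian

def optimize_cofactor(x, lo=1e-6, hi=1e-1):
    """Pick the arcsinh cofactor b that maximizes the profile log-likelihood."""
    res = minimize_scalar(lambda b: -arcsinh_loglik(b, x),
                          bounds=(lo, hi), method="bounded")
    return res.x

# Hypothetical fluorescence channel: background noise plus a bright population.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 50, 5000), rng.lognormal(7, 1, 5000)])
print("optimized cofactor:", optimize_cofactor(x))
```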
A system performance throughput model applicable to advanced manned telescience systems
NASA Technical Reports Server (NTRS)
Haines, Richard F.
1990-01-01
As automated space systems become more complex, autonomous, and opaque to the flight crew, it becomes increasingly difficult to determine whether the total system is performing as it should. Some of the complex and interrelated human performance measurement issues related to total system validation are addressed. An evaluative throughput model is presented which can be used to generate a human operator-related benchmark or figure of merit for a given system that involves humans at the input and output ends as well as other automated intelligent agents. The concept of sustained and accurate command/control data information transfer is introduced. The first two input parameters of the model involve nominal and off-nominal predicted events; the first of these calls for a detailed task analysis, while the second is for a contingency event assessment. The last two required input parameters involve actual (measured) events, namely human performance and continuous semi-automated system performance. An expression combining these four parameters was found using digital simulations and identical, representative, random data to yield the smallest variance.
Sklar, A E; Sarter, N B
1999-12-01
Observed breakdowns in human-machine communication can be explained, in part, by the nature of current automation feedback, which relies heavily on focal visual attention. Such feedback is not well suited for capturing attention in case of unexpected changes and events or for supporting the parallel processing of large amounts of data in complex domains. As suggested by multiple-resource theory, one possible solution to this problem is to distribute information across various sensory modalities. A simulator study was conducted to compare the effectiveness of visual, tactile, and redundant visual and tactile cues for indicating unexpected changes in the status of an automated cockpit system. Both tactile conditions resulted in higher detection rates for, and faster response times to, uncommanded mode transitions. Tactile feedback did not interfere with, nor was its effectiveness affected by, the performance of concurrent visual tasks. The observed improvement in task-sharing performance indicates that the introduction of tactile feedback is a promising avenue toward better supporting human-machine communication in event-driven, information-rich domains.
Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff
NASA Astrophysics Data System (ADS)
Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.
2016-03-01
Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
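One standard way to summarize such an uncertainty budget is to combine independent random components in quadrature, as sketched below. The root-sum-of-squares form and the component percentages here are illustrative assumptions, not the paper's reported values or exact procedure.

```python
import math

def combined_uncertainty(components_pct):
    """Combine independent random uncertainty components, each expressed as a
    +/- percentage of the measured E. coli concentration, in quadrature."""
    return math.sqrt(sum(c ** 2 for c in components_pct))

# Placeholder contributions: sample collection, preservation/storage, analysis.
print(round(combined_uncertainty([20.0, 10.0, 25.0]), 1), "% (+/-)")
```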
Specimen coordinate automated measuring machine/fiducial automated measuring machine
Hedglen, Robert E.; Jacket, Howard S.; Schwartz, Allan I.
1991-01-01
The Specimen Coordinate Automated Measuring Machine (SCAMM) and the Fiducial Automated Measuring Machine (FAMM) are computer-controlled metrology systems capable of measuring length, width, and thickness, and of locating fiducial marks. SCAMM and FAMM have many similarities in their designs, and they can be converted from one to the other without taking them out of the hot cell. Both have means for: supporting a plurality of samples and a standard; controlling the movement of the samples in the +/- X and Y directions; determining the coordinates of the sample; compensating for temperature effects; and verifying the accuracy of the measurements and repeating as necessary. SCAMM and FAMM are designed to be used in hot cells.
Automated SEM and TEM sample preparation applied to copper/low k materials
NASA Astrophysics Data System (ADS)
Reyes, R.; Shaapur, F.; Griffiths, D.; Diebold, A. C.; Foran, B.; Raz, E.
2001-01-01
We describe the use of automated microcleaving for preparation of both SEM and TEM samples as done by SELA's new MC500 and TEMstation tools. The MC500 is an automated microcleaving tool that is capable of producing cleaves with 0.25 μm accuracy resulting in SEM-ready samples. The TEMstation is capable of taking a sample output from the MC500 (or from SELA's earlier MC200 tool) and producing a FIB ready slice of 25±5 μm, mounted on a TEM-washer and ready for FIB thinning to electron transparency for TEM analysis. The materials selected for the tool set evaluation mainly included the Cu/TaN/HOSP low-k system. The paper is divided into three sections, experimental approach, SEM preparation and analysis of HOSP low-k, and TEM preparation and analysis of Cu/TaN/HOSP low-k samples. For the samples discussed, data is presented to show the quality of preparation provided by these new automated tools.
A Method for Automated Detection of Usability Problems from Client User Interface Events
Saadawi, Gilan M.; Legowski, Elizabeth; Medvedeva, Olga; Chavan, Girish; Crowley, Rebecca S.
2005-01-01
Think-aloud usability (TAU) analysis provides extremely useful data but is very time-consuming and expensive to perform because of the extensive manual video analysis that is required. We describe a simple method for automated detection of usability problems from client user interface events for a developing medical intelligent tutoring system. The method incorporates (1) an agent-based method for communication that funnels all interface events and system responses to a centralized database, (2) a simple schema for representing interface events and higher order subgoals, and (3) an algorithm that reproduces the criteria used for manual coding of usability problems. A correction factor was empirically determined to account for the slower task performance of users when thinking aloud. We tested the validity of the method by simultaneously identifying usability problems using TAU and manually computing them from stored interface event data using the proposed algorithm. All usability problems that did not rely on verbal utterances were detectable with the proposed method. PMID:16779121
Testing the event witnessing status of micro-bloggers from evidence in their micro-blogs
2017-01-01
This paper demonstrates a framework of processes for identifying potential witnesses of events from evidence they post to social media. The research defines original evidence models for micro-blog content sources, the relative uncertainty of different evidence types, and models for testing evidence by combination. Methods to filter and extract evidence using automated and semi-automated means are demonstrated using a Twitter case study event. Further, an implementation to test extracted evidence using the Dempster-Shafer Theory of Evidence is presented. The results indicate that the inclusion of evidence from micro-blog text and linked image content can increase the number of micro-bloggers identified at events, in comparison to the number of micro-bloggers identified from geotags alone. Additionally, the number of micro-bloggers that can be tested for evidence corroboration or conflict is increased by incorporating evidence identified in their posting history. PMID:29232395
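The framework tests extracted evidence "by combination", implemented with the Dempster-Shafer Theory of Evidence. The sketch below shows Dempster's rule of combination on a two-element frame of discernment ({witness, non-witness}); the mass assignments for the two hypothetical evidence sources (a geotag and a linked image) are assumptions for illustration, not values from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule.

    Masses are dicts keyed by frozenset subsets of the frame of discernment;
    mass assigned to empty intersections (the conflict K) is removed and the
    remainder renormalised by 1 - K.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

W, N = frozenset({"witness"}), frozenset({"non-witness"})
THETA = W | N  # ignorance: the whole frame

# Hypothetical masses: geotag evidence vs. evidence from linked image content.
geotag = {W: 0.6, THETA: 0.4}
image = {W: 0.5, N: 0.1, THETA: 0.4}
print(dempster_combine(geotag, image))
```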
Flow through electrode with automated calibration
Szecsody, James E [Richland, WA; Williams, Mark D [Richland, WA; Vermeul, Vince R [Richland, WA
2002-08-20
The present invention is an improved automated flow through electrode liquid monitoring system. The automated system has a sample inlet to a sample pump, a sample outlet from the sample pump to at least one flow through electrode with a waste port. At least one computer controls the sample pump and records data from the at least one flow through electrode for a liquid sample. The improvement relies upon (a) at least one source of a calibration sample connected to (b) an injection valve connected to said sample outlet and connected to said source, said injection valve further connected to said at least one flow through electrode, wherein said injection valve is controlled by said computer to select between said liquid sample or said calibration sample. Advantages include improved accuracy because of more frequent calibrations, no additional labor for calibration, no need to remove the flow through electrode(s), and minimal interruption of sampling.
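The claimed improvement is a computer-controlled injection valve that switches the flow-through electrode between the liquid sample and a calibration sample, so calibrations can run frequently without removing the electrode. The sketch below shows one plausible control loop with a two-point linear recalibration; the read_electrode, set_valve, and standards interfaces are hypothetical stand-ins, not the patented hardware's API.

```python
import itertools

class FlowThroughMonitor:
    """Sketch of the control logic only: a computer-driven injection valve
    selects either the liquid sample or a calibration standard, and a
    two-point linear fit is refreshed each time the standards are run."""

    def __init__(self, read_electrode, set_valve, standards):
        self.read_electrode = read_electrode   # returns the raw electrode signal
        self.set_valve = set_valve             # selects a valve position by name
        self.standards = standards             # [(valve_position, known_concentration), ...]
        self.slope, self.intercept = 1.0, 0.0

    def calibrate(self):
        points = []
        for position, known in self.standards:
            self.set_valve(position)
            points.append((self.read_electrode(), known))
        (x1, y1), (x2, y2) = points[0], points[-1]
        self.slope = (y2 - y1) / (x2 - x1)
        self.intercept = y1 - self.slope * x1
        self.set_valve("sample")               # resume monitoring the liquid sample

    def measure(self):
        return self.slope * self.read_electrode() + self.intercept

# Dummy stand-ins so the sketch runs without hardware.
signal = itertools.cycle([0.50, 0.80, 0.65])
monitor = FlowThroughMonitor(
    read_electrode=lambda: next(signal),
    set_valve=lambda position: print("valve ->", position),
    standards=[("cal_low", 1.0), ("cal_high", 10.0)],
)
monitor.calibrate()
print("concentration:", monitor.measure())    # 0.65 reading -> 5.5 units
```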
Automated Detection of Surgical Adverse Events from Retrospective Clinical Data
ERIC Educational Resources Information Center
Hu, Zhen
2017-01-01
The detection of surgical adverse events has become increasingly important with the growing demand for quality improvement and public health surveillance in surgery. Event reporting is one of the key steps in determining the impact of postoperative complications from a variety of perspectives and is an integral component of improving…
Use of Archived Information by the United States National Data Center
NASA Astrophysics Data System (ADS)
Junek, W. N.; Pope, B. M.; Roman-Nieves, J. I.; VanDeMark, T. F.; Ichinose, G. A.; Poffenberger, A.; Woods, M. T.
2012-12-01
The United States National Data Center (US NDC) is responsible for monitoring international compliance to nuclear test ban treaties, acquiring data and data products from the International Data Center (IDC), and distributing data according to established policy. The archive of automated and reviewed event solutions residing at the US NDC is a valuable resource for assessing and improving the performance of signal detection, event formation, location, and discrimination algorithms. Numerous research initiatives are currently underway that are focused on optimizing these processes using historic waveform data and alphanumeric information. Identification of optimum station processing parameters is routinely performed through the analysis of archived waveform data. Station specific detector tuning studies produce and compare receiver operating characteristics for multiple detector configurations (e.g., detector type, filter passband) to identify an optimum set of processing parameters with an acceptable false alarm rate. Large aftershock sequences can inundate automated phase association algorithms with numerous detections that are closely spaced in time, which increases the number of false and/or mixed associations in automated event solutions and increases analyst burden. Archived waveform data and alphanumeric information are being exploited to develop an aftershock processor that will construct association templates to assist the Global Association (GA) application, reduce the number of false and merged phase associations, and lessen analyst burden. Statistical models are being developed and evaluated for potential use by the GA application for identifying and rejecting unlikely preliminary event solutions. Other uses of archived data at the US NDC include: improved event locations using empirical travel time corrections and discrimination via a statistical framework known as the event classification matrix (ECM).
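The detector tuning studies described here produce and compare receiver operating characteristics for multiple detector configurations. As a hedged illustration of that step, the sketch below sweeps a detection threshold over scored, ground-truth-labelled candidate detections and returns (false-alarm rate, probability of detection) pairs; the scores, labels, and thresholds are made up.

```python
def roc_points(scores, labels, thresholds):
    """Return (false_alarm_rate, probability_of_detection) pairs for each
    threshold, given detector scores and ground-truth labels (1 = real phase,
    0 = noise). Data and thresholds here are hypothetical."""
    points = []
    positives = sum(labels)
    negatives = len(labels) - positives
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        points.append((fp / negatives, tp / positives))
    return points

scores = [0.95, 0.9, 0.7, 0.65, 0.4, 0.3, 0.2, 0.1]
labels = [1, 1, 1, 0, 1, 0, 0, 0]
print(roc_points(scores, labels, thresholds=[0.25, 0.5, 0.75]))
```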
Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line 10-m...
Habash, Marc; Johns, Robert
2009-10-01
This study compared an automated Escherichia coli and coliform detection system with the membrane filtration direct count technique for water testing. The automated instrument performed as well as or better than the membrane filtration test in analyzing E. coli-spiked samples and blind samples with interference from Proteus vulgaris or Aeromonas hydrophila.
2014-01-01
Background: Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. Methods: We used a set of complex detection rules to take account of the patient's clinical and biological context and the chronological relationship between the causes and the expected outcome. The dataset consisted of 3,444 inpatient stays in a French general hospital. An automated review was performed for all data and the results were compared with those of an expert chart review. The complex detection rules' analytical quality was evaluated for ADEs. Results: In terms of recall, 89.5% of ADEs with hyperkalaemia "with or without an abnormal symptom" were automatically identified (including all three serious ADEs). In terms of precision, 63.7% of the automatically identified ADEs with hyperkalaemia were true ADEs. Conclusions: The use of context-sensitive rules appears to improve the automated detection of ADEs with hyperkalaemia. This type of tool may have an important role in pharmacoepidemiology via the routine analysis of large inter-hospital databases. PMID:25212108
Ficheur, Grégoire; Chazard, Emmanuel; Beuscart, Jean-Baptiste; Merlin, Béatrice; Luyckx, Michel; Beuscart, Régis
2014-09-12
Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. We used a set of complex detection rules to take account of the patient's clinical and biological context and the chronological relationship between the causes and the expected outcome. The dataset consisted of 3,444 inpatient stays in a French general hospital. An automated review was performed for all data and the results were compared with those of an expert chart review. The complex detection rules' analytical quality was evaluated for ADEs. In terms of recall, 89.5% of ADEs with hyperkalaemia "with or without an abnormal symptom" were automatically identified (including all three serious ADEs). In terms of precision, 63.7% of the automatically identified ADEs with hyperkalaemia were true ADEs. The use of context-sensitive rules appears to improve the automated detection of ADEs with hyperkalaemia. This type of tool may have an important role in pharmacoepidemiology via the routine analysis of large inter-hospital databases.
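The study's complex detection rules combine the patient's biological context (a potassium result) with the chronology between a plausible cause and the outcome. The fragment below is a deliberately simplified, hypothetical rule in that spirit: flag a stay when a serum potassium above a cut-off follows administration of a potassium-increasing drug within a look-back window. The threshold, drug list, window, and record layout are all assumptions, not the rules evaluated in the paper.

```python
from datetime import datetime, timedelta

# Hypothetical list of potassium-increasing drugs/classes and cut-offs.
K_RAISING_DRUGS = {"spironolactone", "potassium chloride", "ace inhibitor"}
HYPERKALAEMIA_THRESHOLD = 5.5   # mmol/L, illustrative cut-off
LOOKBACK = timedelta(days=3)    # illustrative chronology window

def flag_possible_ade(lab_results, drug_administrations):
    """lab_results: [(datetime, potassium_mmol_per_L)];
    drug_administrations: [(datetime, drug_name)].
    Returns the first (lab_time, drug_name) pair satisfying the rule, else None."""
    for lab_time, k_value in lab_results:
        if k_value < HYPERKALAEMIA_THRESHOLD:
            continue
        for admin_time, drug in drug_administrations:
            if drug in K_RAISING_DRUGS and timedelta(0) <= lab_time - admin_time <= LOOKBACK:
                return lab_time, drug
    return None

labs = [(datetime(2014, 3, 2, 8), 6.1)]
drugs = [(datetime(2014, 3, 1, 9), "spironolactone")]
print(flag_possible_ade(labs, drugs))
```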
Simplifying operations with an uplink/downlink integration toolkit
NASA Technical Reports Server (NTRS)
Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine
1994-01-01
The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
Simplifying operations with an uplink/downlink integration toolkit
NASA Astrophysics Data System (ADS)
Murphy, Susan C.; Miller, Kevin J.; Guerrero, Ana Maria; Joe, Chester; Louie, John J.; Aguilera, Christine
1994-11-01
The Operations Engineering Lab (OEL) at JPL has developed a simple, generic toolkit to integrate the uplink/downlink processes, (often called closing the loop), in JPL's Multimission Ground Data System. This toolkit provides capabilities for integrating telemetry verification points with predicted spacecraft commands and ground events in the Mission Sequence Of Events (SOE) document. In the JPL ground data system, the uplink processing functions and the downlink processing functions are separate subsystems that are not well integrated because of the nature of planetary missions with large one-way light times for spacecraft-to-ground communication. Our new closed-loop monitoring tool allows an analyst or mission controller to view and save uplink commands and ground events with their corresponding downlinked telemetry values regardless of the delay in downlink telemetry and without requiring real-time intervention by the user. An SOE document is a time-ordered list of all the planned ground and spacecraft events, including all commands, sequence loads, ground events, significant mission activities, spacecraft status, and resource allocations. The SOE document is generated by expansion and integration of spacecraft sequence files, ground station allocations, navigation files, and other ground event files. This SOE generation process has been automated within the OEL and includes a graphical, object-oriented SOE editor and real-time viewing tool running under X/Motif. The SOE toolkit was used as the framework for the integrated implementation. The SOE is used by flight engineers to coordinate their operations tasks, serving as a predict data set in ground operations and mission control. The closed-loop SOE toolkit allows simple, automated integration of predicted uplink events with correlated telemetry points in a single SOE document for on-screen viewing and archiving. It automatically interfaces with existing real-time or non real-time sources of information, to display actual values from the telemetry data stream. This toolkit was designed to greatly simplify the user's ability to access and view telemetry data, and also provide a means to view this data in the context of the commands and ground events that are used to interpret it. A closed-loop system can prove especially useful in small missions with limited resources requiring automated monitoring tools. This paper will discuss the toolkit implementation, including design trade-offs and future plans for enhancing the automated capabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fulton, John; Gallagher, Linda K.; Whitener, Dustin
The Turbo FRMAC (TF) software automates the calculations described in volumes 1-3 of "The Federal Manual for Assessing Environmental Data During a Radiological Emergency" (2010 version). This software automates the process of assessing radiological data during a Federal Radiological Emergency. The manual upon which the software is based is unclassified and freely available on the Internet. TF takes values generated by field samples or computer dispersion models and assesses the data in a way which is meaningful to a decision maker at a radiological emergency: for example, do radiation values exceed city, state, or federal limits; should the crops be destroyed or can they be utilized; and do residents need to be evacuated, sheltered in place, or should another action be taken. The software also uses formulas generated by the EPA, FDA, and other federal agencies to generate field observable values specific to the radiological event that can be used to determine where regulatory limit values are exceeded. In addition to these calculations, TF calculates values which indicate how long an emergency worker can work in the contaminated area during a radiological emergency, the dose received from drinking contaminated water or milk, the dose from eating contaminated food, and the dose expected downwind or upwind of a given field sample, along with a significant number of other similar radiological health values.
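Among the quantities Turbo FRMAC reports are comparisons of field values against regulatory limits and how long an emergency worker can remain in the contaminated area. The sketch below shows only the elementary arithmetic behind such outputs (remaining allowed dose divided by dose rate, plus a limit comparison); the numbers and the limit are illustrative and are not FRMAC values.

```python
def worker_stay_time(dose_limit_mSv, dose_already_received_mSv, dose_rate_mSv_per_h):
    """Hours an emergency worker may remain in the area before reaching the
    dose limit. Inputs are illustrative; real assessments follow the FRMAC manual."""
    remaining = dose_limit_mSv - dose_already_received_mSv
    if remaining <= 0 or dose_rate_mSv_per_h <= 0:
        return 0.0
    return remaining / dose_rate_mSv_per_h

def exceeds_limit(measured_value, regulatory_limit):
    """True if a field measurement exceeds a city, state, or federal limit."""
    return measured_value > regulatory_limit

print(worker_stay_time(dose_limit_mSv=50.0, dose_already_received_mSv=5.0,
                       dose_rate_mSv_per_h=2.5))   # -> 18.0 hours
print(exceeds_limit(measured_value=1.2, regulatory_limit=1.0))
```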
NASA Astrophysics Data System (ADS)
Theveneau, P.; Baker, R.; Barrett, R.; Beteva, A.; Bowler, M. W.; Carpentier, P.; Caserotto, H.; de Sanctis, D.; Dobias, F.; Flot, D.; Guijarro, M.; Giraud, T.; Lentini, M.; Leonard, G. A.; Mattenet, M.; McCarthy, A. A.; McSweeney, S. M.; Morawe, C.; Nanao, M.; Nurizzo, D.; Ohlsson, S.; Pernot, P.; Popov, A. N.; Round, A.; Royant, A.; Schmid, W.; Snigirev, A.; Surr, J.; Mueller-Dieckmann, C.
2013-03-01
Automation and advances in technology are the key elements in addressing the steadily increasing complexity of Macromolecular Crystallography (MX) experiments. Much of this complexity is due to the inter- and intra-crystal heterogeneity in diffraction quality often observed for crystals of multi-component macromolecular assemblies or membrane proteins. Such heterogeneity makes high-throughput sample evaluation an important and necessary tool for increasing the chances of a successful structure determination. The introduction at the ESRF of automatic sample changers in 2005 dramatically increased the number of samples that were tested for diffraction quality. This "first generation" of automation, coupled with advances in software aimed at optimising data collection strategies in MX, resulted in a three-fold increase in the number of crystal structures elucidated per year using data collected at the ESRF. In addition, sample evaluation can be further complemented using small angle scattering experiments on the newly constructed bioSAXS facility on BM29 and the micro-spectroscopy facility (ID29S). The construction of a second generation of automated facilities on the MASSIF (Massively Automated Sample Screening Integrated Facility) beam lines will build on these advances and should provide a paradigm shift in how MX experiments are carried out, which will benefit the entire Structural Biology community.
A comparison of adaptive and adaptable automation under different levels of environmental stress.
Sauer, Juergen; Kao, Chung-Shan; Wastell, David
2012-01-01
The effectiveness of different forms of adaptive and adaptable automation was examined under low- and high-stress conditions, in the form of different levels of noise. Thirty-six participants were assigned to one of the three types of variable automation (adaptive event-based, adaptive performance-based and adaptable serving as a control condition). Participants received 3 h of training on a simulation of a highly automated process control task and were subsequently tested during a 4-h session under noise exposure and quiet conditions. The results for performance suggested no clear benefits of one automation control mode over the other two. However, it emerged that participants under adaptable automation adopted a more active system management strategy and reported higher levels of self-confidence than in the two adaptive control modes. Furthermore, the results showed higher levels of perceived workload, fatigue and anxiety for performance-based adaptive automation control than the other two modes. This study compared two forms of adaptive automation (where the automated system flexibly allocates tasks between human and machine) with adaptable automation (where the human allocates the tasks). The adaptable mode showed marginal advantages. This is of relevance, given that this automation mode may also be easier to design.
Predictive value of the present-on-admission indicator for hospital-acquired venous thromboembolism.
Khanna, Raman R; Kim, Sharon B; Jenkins, Ian; El-Kareh, Robert; Afsarmanesh, Nasim; Amin, Alpesh; Sand, Heather; Auerbach, Andrew; Chia, Catherine Y; Maynard, Gregory; Romano, Patrick S; White, Richard H
2015-04-01
Hospital-acquired venous thromboembolic (HA-VTE) events are an important, preventable cause of morbidity and death, but accurately identifying HA-VTE events requires labor-intensive chart review. Administrative diagnosis codes and their associated "present-on-admission" (POA) indicator might allow automated identification of HA-VTE events, but only if VTE codes are accurately flagged "not present-on-admission" (POA=N). New codes were introduced in 2009 to improve accuracy. We identified all medical patients with at least 1 VTE "other" discharge diagnosis code from 5 academic medical centers over a 24-month period. We then sampled, within each center, patients with VTE codes flagged POA=N or POA=U (insufficient documentation) and POA=Y or POA=W (timing clinically uncertain) and abstracted each chart to clarify VTE timing. All events that were not clearly POA were classified as HA-VTE. We then calculated predictive values of the POA=N/U flags for HA-VTE and the POA=Y/W flags for non-HA-VTE. Among 2070 cases with at least 1 "other" VTE code, we found 339 codes flagged POA=N/U and 1941 flagged POA=Y/W. Among 275 POA=N/U abstracted codes, 75.6% (95% CI, 70.1%-80.6%) were HA-VTE; among 291 POA=Y/W abstracted events, 73.5% (95% CI, 68.0%-78.5%) were non-HA-VTE. Extrapolating from this sample, we estimated that 59% of actual HA-VTE codes were incorrectly flagged POA=Y/W. POA indicator predictive values did not improve after new codes were introduced in 2009. The predictive value of VTE events flagged POA=N/U for HA-VTE was 75%. However, sole reliance on this flag may substantially underestimate the incidence of HA-VTE.
Manual B-mode versus automated radio-frequency carotid intima-media thickness measurements.
Dogan, Soner; Plantinga, Yvonne; Dijk, Joke M; van der Graaf, Yolanda; Grobbee, Diederick E; Bots, Michiel L
2009-10-01
Carotid intima-media thickness (CIMT) serves as an indicator of atherosclerosis and cardiovascular risk. Manual measurements of B-mode ultrasound images are the most applied method. Automated measurements with radiofrequency (RF) ultrasound have been suggested as an alternative. The aim of this study was to compare these methods in terms of risk-factor relations and associations with future events. Data from participants of the Second Manifestations of Arterial Disease (SMART) study were used. Far wall common CIMT was measured online with manual B-mode and automated RF ultrasound. Measurements were performed by a group of 6 sonographers. Risk-factor information was obtained. All participants were followed for the occurrence of vascular events (mean follow-up, 2.1 years). CIMT was related to risk factors with linear regression models and to future events with Cox proportional-hazards models. Data were available for 2,146 participants. Agreement between the methods was modest (intraclass correlation coefficient = 0.34). Risk-factor relations with age and systolic blood pressure were stronger for B-mode than for RF ultrasound. Association with future events was better for B-mode than for RF ultrasound (vascular death, 1.27 vs 1.00; ischemic stroke, 1.45 vs 1.03). In participants with CIMT < 0.9 mm (without plaque), the intraclass correlation between the measures was 0.50. In addition, in that subgroup, RF ultrasound showed a stronger association with future events than B-mode ultrasound (all events, 1.59 vs 1.09; vascular death, 1.72 vs 0.93; coronary ischemic events, 1.65 vs 1.05). The preference for either B-mode or RF measurements may be driven by the type of study population, the expected presence of local atherosclerotic abnormalities, and the main aim of the study (assessing risk factors or events). However, in this study, as in many others, the B-mode approach was shown to be robust in risk-factor relations and the prediction of events.
NASA Technical Reports Server (NTRS)
Fogal, G. L.; Mangialardi, J. K.; Young, R.
1974-01-01
The capability of the basic automated Biowaste Sampling System (ABSS) hardware was extended and improved through the design, fabrication and test of breadboard hardware. A preliminary system design effort established the feasibility of integrating the breadboard concepts into the ABSS.
[DNA Extraction from Old Bones by AutoMate Express™ System].
Li, B; Lü, Z
2017-08-01
To establish a method for extracting DNA from old bones with the AutoMate Express™ system. Bones were ground into powder by freeze-milling. After extraction with the AutoMate Express™, DNA was amplified and genotyped with the Identifiler®Plus and MinFiler™ kits. DNA was extracted within 3 hours by the AutoMate Express™ system from 10 old bone samples that had been kept in different environments, with postmortem intervals of 10 to 20 years. Complete STR typing results were obtained from 8 samples. The AutoMate Express™ system can quickly and efficiently extract DNA from old bones and can be applied in forensic practice. Copyright© by the Editorial Department of Journal of Forensic Medicine
Hawker, Charles D; McCarthy, William; Cleveland, David; Messinger, Bonnie L
2014-03-01
Mislabeled samples are a serious problem in most clinical laboratories. Published error rates range from 0.39/1000 to as high as 1.12%. Standardization of bar codes and label formats has not yet achieved the needed improvement. The mislabel rate in our laboratory, although low compared with published rates, prompted us to seek a solution to achieve zero errors. To reduce or eliminate our mislabeled samples, we invented an automated device using 4 cameras to photograph the outside of a sample tube. The system uses optical character recognition (OCR) to look for discrepancies between the patient name in our laboratory information system (LIS) vs the patient name on the customer label. All discrepancies detected by the system's software then require human inspection. The system was installed on our automated track and validated with production samples. We obtained 1 009 830 images during the validation period, and every image was reviewed. OCR passed approximately 75% of the samples, and no mislabeled samples were passed. The 25% failed by the system included 121 samples actually mislabeled by patient name and 148 samples with spelling discrepancies between the patient name on the customer label and the patient name in our LIS. Only 71 of the 121 mislabeled samples detected by OCR were found through our normal quality assurance process. We have invented an automated camera system that uses OCR technology to identify potential mislabeled samples. We have validated this system using samples transported on our automated track. Full implementation of this technology offers the possibility of zero mislabeled samples in the preanalytic stage.
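The device OCRs the label and looks for discrepancies between the patient name in the laboratory information system (LIS) and the patient name on the customer label, routing every discrepancy to human inspection. A minimal sketch of that conservative comparison is shown below; the normalization and the pass/hold decision are assumptions about one reasonable implementation, not the authors' actual software.

```python
import re

def normalize(name: str) -> str:
    """Upper-case, strip punctuation and extra whitespace for comparison."""
    name = re.sub(r"[^A-Za-z ]", " ", name)
    return " ".join(name.upper().split())

def ocr_check(lis_name: str, label_name_ocr: str) -> str:
    """Return 'PASS' only when the normalized names match exactly; every
    discrepancy (true mislabels and spelling differences alike) is held for
    human inspection, mirroring the conservative behaviour described."""
    if normalize(lis_name) == normalize(label_name_ocr):
        return "PASS"
    return "HOLD FOR HUMAN INSPECTION"

print(ocr_check("Smith, John Q.", "SMITH JOHN Q"))   # PASS
print(ocr_check("Smith, John Q.", "SMYTH JOHN Q"))   # HOLD FOR HUMAN INSPECTION
```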
Cyber indicators of compromise: a domain ontology for security information and event management
2017-03-01
Thesis by Marsha D. Rowell, March 2017; distribution is unlimited. One technology used to automate this work is Security Information and Event Management (SIEM). In short, SIEM technology works by aggregating log information, and then...
Input-output identification of controlled discrete manufacturing systems
NASA Astrophysics Data System (ADS)
Estrada-Vargas, Ana Paula; López-Mellado, Ernesto; Lesage, Jean-Jacques
2014-03-01
The automated construction of discrete event models from observations of external system's behaviour is addressed. This problem, often referred to as system identification, allows obtaining models of ill-known (or even unknown) systems. In this article, an identification method for discrete event systems (DESs) controlled by a programmable logic controller is presented. The method allows processing a large quantity of observed long sequences of input/output signals generated by the controller and yields an interpreted Petri net model describing the closed-loop behaviour of the automated DESs. The proposed technique allows the identification of actual complex systems because it is sufficiently efficient and well adapted to cope with both the technological characteristics of industrial controllers and data collection requirements. Based on polynomial-time algorithms, the method is implemented as an efficient software tool which constructs and draws the model automatically; an overview of this tool is given through a case study dealing with an automated manufacturing system.
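The identification method turns long observed sequences of controller input/output vectors into an interpreted Petri net of the closed-loop behaviour. The sketch below is a much-simplified stand-in: it extracts "events" as changes in the observed I/O vector and records which events have been seen to follow which, rather than performing the paper's polynomial-time net synthesis.

```python
from collections import defaultdict

def identify_transitions(io_sequence):
    """io_sequence: list of (inputs, outputs) tuples sampled from the PLC,
    each a tuple of 0/1 signal values. An 'event' is any change in the I/O
    vector; the model collects which events have been observed to follow
    which, a crude stand-in for interpreted Petri net synthesis."""
    events = []
    for prev, curr in zip(io_sequence, io_sequence[1:]):
        if curr != prev:
            # record which signals rose (+1) or fell (-1)
            delta = tuple((i, curr_bit - prev_bit)
                          for i, (prev_bit, curr_bit)
                          in enumerate(zip(prev[0] + prev[1], curr[0] + curr[1]))
                          if curr_bit != prev_bit)
            events.append(delta)
    successors = defaultdict(set)
    for a, b in zip(events, events[1:]):
        successors[a].add(b)
    return successors

# Hypothetical 2-input / 1-output observation sequence.
seq = [((0, 0), (0,)), ((1, 0), (0,)), ((1, 0), (1,)), ((0, 0), (1,)), ((0, 0), (0,))]
for event, nexts in identify_transitions(seq).items():
    print(event, "->", nexts)
```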
Automation bias: decision making and performance in high-tech cockpits.
Mosier, K L; Skitka, L J; Heers, S; Burdick, M
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that those pilots who reported an internalized perception of "accountability" for their performance and strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues and less likely to commit errors than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
Mann, David L; Abernethy, Bruce; Farrow, Damian; Davis, Mark; Spratford, Wayne
2010-05-01
This article describes a new automated method for the controlled occlusion of vision during natural tasks. The method permits the time course of the presence or absence of visual information to be linked to identifiable events within the task of interest. An example application is presented in which the method is used to examine the ability of cricket batsmen to pick up useful information from the prerelease movement patterns of the opposing bowler. Two key events, separated by a consistent within-action time lag, were identified in the cricket bowling action sequence: namely, the penultimate foot strike prior to ball release (Event 1) and the subsequent moment of ball release (Event 2). Force-plate registration of Event 1 was then used as a trigger to facilitate automated occlusion of vision using liquid crystal occlusion goggles at time points relative to Event 2. Validation demonstrated that, compared with existing approaches that are based on manual triggering, this method of occlusion permitted considerable gains in temporal precision and a reduction in the number of unusable trials. A more efficient and accurate protocol to examine anticipation is produced, while preserving the important natural coupling between perception and action.
Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying
2014-01-01
A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand in computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have tested it to solve four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16 S ribosomal RNAs and glycosyl-transferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php.
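AST's goal is a representative subset of sequences that maximizes the taxonomic diversity of the sample. The fragment below is a hedged, greedy caricature of that idea: repeatedly pick a sequence from the currently least-represented taxon until the budget is reached. The actual AST selection procedure may differ.

```python
from collections import Counter

def diverse_subsample(sequences, taxa, k):
    """sequences: list of sequence IDs; taxa: parallel list of taxon labels;
    k: number of sequences to keep. Greedily favours under-represented taxa
    so the subset spans as many taxa as possible."""
    chosen, counts = [], Counter()
    remaining = list(zip(sequences, taxa))
    while remaining and len(chosen) < k:
        # pick the candidate whose taxon currently has the fewest picks
        seq, taxon = min(remaining, key=lambda st: counts[st[1]])
        chosen.append(seq)
        counts[taxon] += 1
        remaining.remove((seq, taxon))
    return chosen

seqs = ["s1", "s2", "s3", "s4", "s5", "s6"]
taxa = ["E. coli", "E. coli", "E. coli", "B. subtilis", "Arabidopsis", "E. coli"]
print(diverse_subsample(seqs, taxa, k=3))   # one sequence per taxon before any repeats
```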
Zhou, Chan; Mao, Fenglou; Yin, Yanbin; Huang, Jinling; Gogarten, Johann Peter; Xu, Ying
2014-01-01
A challenge in phylogenetic inference of gene trees is how to properly sample a large pool of homologous sequences to derive a good representative subset of sequences. Such a need arises in various applications, e.g. when (1) accuracy-oriented phylogenetic reconstruction methods may not be able to deal with a large pool of sequences due to their high demand in computing resources; (2) applications analyzing a collection of gene trees may prefer to use trees with fewer operational taxonomic units (OTUs), for instance for the detection of horizontal gene transfer events by identifying phylogenetic conflicts; and (3) the pool of available sequences is biased towards extensively studied species. In the past, the creation of subsamples often relied on manual selection. Here we present an Automated sequence-Sampling method for improving the Taxonomic diversity of gene phylogenetic trees, AST, to obtain representative sequences that maximize the taxonomic diversity of the sampled sequences. To demonstrate the effectiveness of AST, we have tested it to solve four problems, namely, inference of the evolutionary histories of the small ribosomal subunit protein S5 of E. coli, 16 S ribosomal RNAs and glycosyl-transferase gene family 8, and a study of ancient horizontal gene transfers from bacteria to plants. Our results show that the resolution of our computational results is almost as good as that of manual inference by domain experts, hence making the tool generally useful to phylogenetic studies by non-phylogeny specialists. The program is available at http://csbl.bmb.uga.edu/~zhouchan/AST.php. PMID:24892935
Automated Sample Exchange Robots for the Structural Biology Beam Lines at the Photon Factory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hiraki, Masahiko; Watanabe, Shokei; Yamada, Yusuke
2007-01-19
We are now developing automated sample exchange robots for high-throughput protein crystallographic experiments for onsite use at synchrotron beam lines. It is part of the fully automated robotics systems being developed at the Photon Factory, for the purposes of protein crystallization, monitoring crystal growth, harvesting and freezing crystals, mounting the crystals inside a hutch and for data collection. We have already installed the sample exchange robots based on the SSRL automated mounting system at our insertion device beam lines BL-5A and AR-NW12A at the Photon Factory. In order to reduce the time required for sample exchange further, a prototype of a double-tonged system was developed. As a result of preliminary experiments with double-tonged robots, the sample exchange time was successfully reduced from 70 seconds to 10 seconds with the exception of the time required for pre-cooling and warming up the tongs.
Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.
Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras
2016-04-01
There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.
Evaluation of mouse red blood cell and platelet counting with an automated hematology analyzer.
Fukuda, Teruko; Asou, Eri; Nogi, Kimiko; Goto, Kazuo
2017-10-07
An evaluation of mouse red blood cell (RBC) and platelet (PLT) counting with an automated hematology analyzer was performed with three strains of mice, C57BL/6 (B6), BALB/c (BALB) and DBA/2 (D2). There were no significant differences in RBC and PLT counts between manual and automated optical methods in any of the samples, except for D2 mice. For D2, RBC counts obtained using the manual method were significantly lower than those obtained using the automated optical method (P<0.05), and PLT counts obtained using the manual method were higher than those obtained using the automated optical method (P<0.05). An automated hematology analyzer can be used for RBC and PLT counting; however, an appropriate method should be selected when D2 mice samples are used.
Life Sciences Research Facility automation requirements and concepts for the Space Station
NASA Technical Reports Server (NTRS)
Rasmussen, Daryl N.
1986-01-01
An evaluation is made of the methods and preliminary results of a study on prospects for the automation of the NASA Space Station's Life Sciences Research Facility. In order to remain within current Space Station resource allocations, approximately 85 percent of planned life science experiment tasks must be automated; these tasks encompass specimen care and feeding, cage and instrument cleaning, data acquisition and control, sample analysis, waste management, instrument calibration, materials inventory and management, and janitorial work. Task automation will free crews for specimen manipulation, tissue sampling, data interpretation and communication with ground controllers, and experiment management.
A computer aided treatment event recognition system in radiation therapy.
Xia, Junyi; Mart, Christopher; Bayouth, John
2014-01-01
To develop an automated system to safeguard radiation therapy treatments by analyzing electronic treatment records and reporting treatment events. CATERS (Computer Aided Treatment Event Recognition System) was developed to detect treatment events by retrieving and analyzing electronic treatment records. CATERS is designed to make the treatment monitoring process more efficient by automating the search of the electronic record for possible deviations from the physician's intention, such as logical inconsistencies as well as aberrant treatment parameters (e.g., beam energy, dose, table position, prescription change, treatment overrides, etc.). Over a 5 month period (July 2012-November 2012), physicists were assisted by the CATERS software in conducting normal weekly chart checks with the aims of (a) determining the relative frequency of particular events in the authors' clinic and (b) incorporating these checks into CATERS. During this study period, 491 patients were treated at the University of Iowa Hospitals and Clinics for a total of 7692 fractions. All treatment records from the 5 month analysis period were evaluated using all the checks incorporated into CATERS after the training period. In total, 553 events were detected as exceptions, although none of them had a significant dosimetric impact on patient treatments. These events included every known event type that was discovered during the trial period. A frequency analysis of the events showed that the top three types of detected events were couch position override (3.2%), extra cone beam imaging (1.85%), and significant couch position deviation (1.31%). Significant couch deviation is defined as a treatment in which the couch vertical position exceeded two times the standard deviation of all couch verticals, or the couch lateral/longitudinal position exceeded three times the standard deviation of all couch laterals and longitudinals. On average, the application takes about 1 s per patient when executed on either a desktop computer or a mobile device. CATERS offers an effective tool to detect and report treatment events. Automation and rapid processing enable electronic record interrogation daily, alerting the medical physicist to deviations potentially days before the weekly check. The output of CATERS could also be utilized as an important input to failure mode and effects analysis.
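The abstract states the rule for significant couch deviation explicitly: couch vertical beyond two standard deviations of all verticals, or lateral/longitudinal beyond three. The sketch below implements just that check for one patient's fractions; the record layout (keys 'vrt', 'lat', 'lng') is a hypothetical stand-in for the electronic treatment record fields.

```python
from statistics import mean, stdev

def significant_couch_deviations(fractions):
    """fractions: list of dicts with hypothetical keys 'vrt', 'lat', 'lng'
    (couch positions for each treated fraction of one patient). Flags the
    rule stated in the abstract: |vrt - mean| > 2*SD, or |lat/lng - mean| > 3*SD."""
    stats = {axis: (mean(f[axis] for f in fractions),
                    stdev(f[axis] for f in fractions))
             for axis in ("vrt", "lat", "lng")}
    flagged = []
    for i, f in enumerate(fractions):
        if abs(f["vrt"] - stats["vrt"][0]) > 2 * stats["vrt"][1] \
           or abs(f["lat"] - stats["lat"][0]) > 3 * stats["lat"][1] \
           or abs(f["lng"] - stats["lng"][0]) > 3 * stats["lng"][1]:
            flagged.append(i)
    return flagged

fx = [{"vrt": 10.0, "lat": 0.1, "lng": 5.0} for _ in range(9)]
fx.append({"vrt": 14.0, "lat": 0.1, "lng": 5.0})   # large vertical shift on the last fraction
print(significant_couch_deviations(fx))            # -> [9]
```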
Machine Learning-based Transient Brokers for Real-time Classification of the LSST Alert Stream
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika; ANTARES Collaboration
2018-01-01
The number of transient events discovered by wide-field time-domain surveys already far outstrips the combined followup resources of the astronomical community. This number will only increase as we progress towards the commissioning of the Large Synoptic Survey Telescope (LSST), breaking the community's current followup paradigm. Transient brokers - software to sift through, characterize, annotate and prioritize events for followup - will be a critical tool for managing alert streams in the LSST era. Developing the algorithms that underlie the brokers, and obtaining simulated LSST-like datasets prior to LSST commissioning, to train and test these algorithms are formidable, though not insurmountable challenges. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is a joint project of the National Optical Astronomy Observatory and the Department of Computer Science at the University of Arizona. We have been developing completely automated methods to characterize and classify variable and transient events from their multiband optical photometry. We describe the hierarchical ensemble machine learning algorithm we are developing, and test its performance on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, as well as our progress towards incorporating these into a real-time event broker working on live alert streams from time-domain surveys.
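The broker classifies variable and transient events from their multiband optical photometry using hierarchical ensemble machine learning. The toy sketch below stands in for that pipeline: it derives a few features that tolerate uneven sampling from simulated single-band light curves and trains a random-forest ensemble (scikit-learn). The feature set, simulated data, and single flat classifier are assumptions for illustration, not the ANTARES hierarchy.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def light_curve_features(times, mags):
    """A few simple features robust to uneven sampling: amplitude, scatter,
    median absolute deviation, and a crude slope. Purely illustrative."""
    times = np.asarray(times, dtype=float)
    mags = np.asarray(mags, dtype=float)
    slope = np.polyfit(times, mags, 1)[0] if len(times) > 1 else 0.0
    return [mags.max() - mags.min(), mags.std(),
            np.median(np.abs(mags - np.median(mags))), slope]

rng = np.random.default_rng(0)

def fake_curve(transient):
    """Irregularly sampled magnitudes; transients get a decaying brightening."""
    t = np.sort(rng.uniform(0, 50, size=rng.integers(8, 20)))
    base = 20.0 + rng.normal(0, 0.05, size=t.size)
    if transient:
        base -= 2.0 * np.exp(-t / 15.0)
    return t, base

X, y = [], []
for label in (0, 1):
    for _ in range(100):
        t, m = fake_curve(bool(label))
        X.append(light_curve_features(t, m))
        y.append(label)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```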
NASA Astrophysics Data System (ADS)
Herr, J.; Bhatnagar, T.; Goldfarb, S.; Irrer, J.; McKee, S.; Neal, H. A.
2008-07-01
Large scientific collaborations as well as universities have a growing need for multimedia archiving of meetings and courses. Collaborations need to disseminate training and news to their wide-ranging members, and universities seek to provide their students with more useful studying tools. The University of Michigan ATLAS Collaboratory Project has been involved in the recording and archiving of multimedia lectures since 1999. Our software and hardware architecture has been used to record events for CERN, ATLAS, many units inside the University of Michigan, Fermilab, the American Physical Society and the International Conference on Systems Biology at Harvard. Until 2006 our group functioned primarily as a tiny research/development team with special commitments to the archiving of certain ATLAS events. In 2006 we formed the MScribe project, using a larger-scale, highly automated recording system to record and archive eight University courses in a wide array of subjects. Several robotic carts are wheeled around campus by unskilled student helpers to automatically capture and post to the Web audio, video, slides and chalkboard images. The advances the MScribe project has made in automation of these processes, including a robotic camera operator and automated video processing, are now being used to record ATLAS Collaboration events, making them available more quickly than before and enabling the recording of more events.
Increasing the Automation and Autonomy for Spacecraft Operations with Criteria Action Table
NASA Technical Reports Server (NTRS)
Li, Zhen-Ping; Savki, Cetin
2005-01-01
The Criteria Action Table (CAT) is an automation tool developed for monitoring real-time system messages for specific events and processes in order to take user-defined actions based on a set of user-defined rules. CAT was developed by Lockheed Martin Space Operations as a part of a larger NASA effort at the Goddard Space Flight Center (GSFC) to create a component-based, middleware-based, and standard-based general purpose ground system architecture referred to as GMSEC, the GSFC Mission Services Evolution Center. CAT has been integrated into the upgraded ground systems for the Tropical Rainfall Measuring Mission (TRMM) and Small Explorer (SMEX) satellites, and it plays the central role in their automation effort to reduce the cost and increase the reliability of spacecraft operations. The GMSEC architecture provides a standard communication interface and protocol for components to publish/describe messages to an information bus. It also provides a standard message definition so components can send and receive messages to the bus interface rather than each other, thus reducing the component-to-component coupling, interface, protocols, and link (socket) management. With the GMSEC architecture, components can publish standard event messages to the bus for all nominal, significant, and surprising events in regard to satellite, celestial, ground system, or any other activity. In addition to sending standard event messages, each GMSEC-compliant component is required to accept and process GMSEC directive request messages.
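CAT's core behaviour is to match incoming event messages against user-defined criteria and fire the corresponding user-defined actions. The sketch below shows a tiny criteria/action table evaluated against GMSEC-style event messages represented as dictionaries; the message fields and the two actions are hypothetical illustrations, not GMSEC message definitions.

```python
# Each rule: a criteria dict the message must match, plus the action to run.
def notify_operator(msg):
    print("PAGE OPERATOR:", msg["text"])

def log_only(msg):
    print("LOG:", msg["text"])

CRITERIA_ACTION_TABLE = [
    ({"subsystem": "POWER", "severity": "CRITICAL"}, notify_operator),
    ({"severity": "INFO"}, log_only),
]

def dispatch(message):
    """Run the action of every rule whose criteria all match the message."""
    for criteria, action in CRITERIA_ACTION_TABLE:
        if all(message.get(k) == v for k, v in criteria.items()):
            action(message)

dispatch({"subsystem": "POWER", "severity": "CRITICAL",
          "text": "battery voltage below limit"})
dispatch({"subsystem": "ACS", "severity": "INFO", "text": "nominal pass complete"})
```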
Nieva, Jorge; Wendel, Marco; Luttgen, Madelyn S; Marrinucci, Dena; Bazhenova, Lyudmila; Kolatkar, Anand; Santala, Roger; Whittenberger, Brock; Burke, James; Torrey, Melissa; Bethel, Kelly; Kuhn, Peter
2012-02-01
Sampling circulating tumor cells (CTCs) from peripheral blood is ideally accomplished using assays that detect high numbers of cells and preserve them for downstream characterization. We sought to evaluate a method using enrichment free fluorescent labeling of CTCs followed by automated digital microscopy in patients with non-small cell lung cancer. Twenty-eight patients with non-small cell lung cancer and hematogenously seeded metastasis were analyzed with multiple blood draws. We detected CTCs in 68% of analyzed samples and found a propensity for increased CTC detection as the disease progressed in individual patients. CTCs were present at a median concentration of 1.6 CTCs ml⁻¹ of analyzed blood in the patient population. Higher numbers of detected CTCs were associated with an unfavorable prognosis.
Automation in clinical bacteriology: what system to choose?
Greub, G; Prod'hom, G
2011-05-01
With increased activity and reduced financial and human resources, there is a need for automation in clinical bacteriology. Initial processing of clinical samples includes repetitive and fastidious steps. These tasks are suitable for automation, and several instruments are now available on the market, including the WASP (Copan), Previ-Isola (BioMerieux), Innova (Becton-Dickinson) and Inoqula (KIESTRA) systems. These new instruments allow efficient and accurate inoculation of samples, including four main steps: (i) selecting the appropriate Petri dish; (ii) inoculating the sample; (iii) spreading the inoculum on agar plates to obtain, upon incubation, well-separated bacterial colonies; and (iv) accurate labelling and sorting of each inoculated media. The challenge for clinical bacteriologists is to determine what is the ideal automated system for their own laboratory. Indeed, different solutions will be preferred, according to the number and variety of samples, and to the types of sample that will be processed with the automated system. The final choice is troublesome, because audits proposed by industrials risk being biased towards the solution proposed by their company, and because these automated systems may not be easily tested on site prior to the final decision, owing to the complexity of computer connections between the laboratory information system and the instrument. This article thus summarizes the main parameters that need to be taken into account for choosing the optimal system, and provides some clues to help clinical bacteriologists to make their choice. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.
Garty, Guy; Chen, Youhua; Turner, Helen C; Zhang, Jian; Lyulko, Oleksandra V; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Lawrence Yao, Y; Brenner, David J
2011-08-01
Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. The RABiT analyses fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cut-off dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. We have developed a new robotic system for lymphocyte processing, making use of an upgraded laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of the RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Parallel handling of multiple samples through the use of dedicated, purpose-built, robotics and high speed imaging allows analysis of up to 30,000 samples per day.
Garty, Guy; Chen, Youhua; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Brenner, David J.
2011-01-01
Purpose: Over the past five years the Center for Minimally Invasive Radiation Biodosimetry at Columbia University has developed the Rapid Automated Biodosimetry Tool (RABiT), a completely automated, ultra-high throughput biodosimetry workstation. This paper describes recent upgrades and reliability testing of the RABiT. Materials and methods: The RABiT analyzes fingerstick-derived blood samples to estimate past radiation exposure or to identify individuals exposed above or below a cutoff dose. Through automated robotics, lymphocytes are extracted from fingerstick blood samples into filter-bottomed multi-well plates. Depending on the time since exposure, the RABiT scores either micronuclei or phosphorylation of the histone H2AX, in an automated robotic system, using filter-bottomed multi-well plates. Following lymphocyte culturing, fixation and staining, the filter bottoms are removed from the multi-well plates and sealed prior to automated high-speed imaging. Image analysis is performed online using dedicated image processing hardware. Both the sealed filters and the images are archived. Results: We have developed a new robotic system for lymphocyte processing, making use of an upgraded laser power and parallel processing of four capillaries at once. This system has allowed acceleration of lymphocyte isolation, the main bottleneck of the RABiT operation, from 12 to 2 sec/sample. Reliability tests have been performed on all robotic subsystems. Conclusions: Parallel handling of multiple samples through the use of dedicated, purpose-built, robotics and high speed imaging allows analysis of up to 30,000 samples per day. PMID:21557703
Raterink, Robert-Jan; Witkam, Yoeri; Vreeken, Rob J; Ramautar, Rawi; Hankemeier, Thomas
2014-10-21
In the field of bioanalysis, there is an increasing demand for miniaturized, automated, robust sample pretreatment procedures that can be easily connected to direct-infusion mass spectrometry (DI-MS) in order to allow the high-throughput screening of drugs and/or their metabolites in complex body fluids like plasma. Liquid-liquid extraction (LLE) is a common sample pretreatment technique often used for complex aqueous samples in bioanalysis. Despite significant developments that have been made in automated and miniaturized LLE procedures, fully automated LLE techniques allowing high-throughput bioanalytical studies on small-volume samples using direct-infusion mass spectrometry have not yet matured. Here, we introduce a new fully automated micro-LLE technique based on gas-pressure assisted mixing followed by passive phase separation, coupled online to nanoelectrospray-DI-MS. Our method was characterized by varying the gas flow and its duration through the solvent mixture. For evaluation of the analytical performance, four drugs were spiked into human plasma, resulting in highly acceptable precision (RSD down to 9%) and linearity (R(2) ranging from 0.990 to 0.998). We demonstrate that our new method not only allows the reliable extraction of analytes from small sample volumes of a few microliters in an automated and high-throughput manner, but also performs comparably to or better than conventional offline LLE, in which the handling of small volumes remains challenging. Finally, we demonstrate the applicability of our method for drug screening on dried blood spots, showing excellent linearity (R(2) of 0.998) and precision (RSD of 9%). In conclusion, we present the proof of principle of a new high-throughput screening platform for bioanalysis based on a new automated micro-LLE method, coupled online to a commercially available nano-ESI-DI-MS.
Gittinger, Matthew; Brolliar, Sarah M; Grand, James A; Nichol, Graham; Fernandez, Rosemarie
2017-06-01
This pilot study used a simulation-based platform to evaluate the effect of an automated mechanical chest compression device on team communication and patient management. Four-member emergency department interprofessional teams were randomly assigned to perform manual chest compressions (control, n = 6) or automated chest compressions (intervention, n = 6) during a simulated cardiac arrest with 2 phases: phase 1 baseline (ventricular tachycardia), followed by phase 2 (ventricular fibrillation). Patient management was coded using an Advanced Cardiovascular Life Support-based checklist. Team communication was categorized in the following 4 areas: (1) teamwork focus; (2) huddle events, defined as statements focused on re-establishing situation awareness, reinforcing existing plans, and assessing the need to adjust the plan; (3) clinical focus; and (4) profession of team member. Statements were aggregated for each team. At baseline, groups were similar with respect to total communication statements and patient management. During cardiac arrest, the total number of communication statements was greater in teams performing manual compressions (median, 152.3; interquartile range [IQR], 127.6-181.0) as compared with teams using an automated compression device (median, 105; IQR, 99.5-123.9). Huddle events were more frequent in teams performing automated chest compressions (median, 4.0; IQR, 3.1-4.3 vs. 2.0; IQR, 1.4-2.6). Teams randomized to the automated compression intervention had a delay to initial defibrillation (median, 208.3 seconds; IQR, 153.3-222.1 seconds) as compared with control teams (median, 63.2 seconds; IQR, 30.1-397.2 seconds). Use of an automated compression device may impact both team communication and patient management. Simulation-based assessments offer important insights into the effect of technology on healthcare teams.
Jasuja, Guneet K; Reisman, Joel I; Miller, Donald R; Berlowitz, Dan R; Hylek, Elaine M; Ash, Arlene S; Ozonoff, Al; Zhao, Shibei; Rose, Adam J
2013-01-01
Identifying major bleeding is fundamental to assessing the outcomes of anticoagulation therapy. This drives the need for a credible implementation, in automated data, of the International Society of Thrombosis and Haemostasis (ISTH) definition of major bleeding. We studied 102,395 patients who received 158,511 person-years of warfarin treatment from the Veterans Health Administration (VA) between 10/1/06 and 9/30/08. We constructed a list of ICD-9-CM codes of "candidate" bleeding events. Each candidate event was identified as a major hemorrhage if it fulfilled one of four criteria: 1) associated with death within 30 days; 2) bleeding in a critical anatomic site; 3) associated with a transfusion; or 4) was coded as the event that precipitated or was responsible for the majority of an inpatient hospitalization. This definition classified 11,240 (15.8%) of 71,338 candidate events as major hemorrhage. Typically, events more likely to be severe were retained at higher rates than those less likely to be severe. For example, Diverticula of Colon with Hemorrhage (562.12) and Hematuria (599.7) were retained 46% and 4% of the time, respectively. Major, intracranial, and fatal hemorrhage were identified at rates comparable to those found in randomized clinical trials, although higher than those reported in observational studies: 4.73, 1.29, and 0.41 per 100 patient-years, respectively. We describe here a workable definition for identifying major hemorrhagic events from large automated datasets. This method of identifying major bleeding may have applications for quality measurement, quality improvement, and comparative effectiveness research. Copyright © 2012 Elsevier Ltd. All rights reserved.
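For illustration, the four-criterion rule above maps naturally onto a simple filter over coded events. The sketch below is a minimal, hypothetical rendering: the field names (death_within_30d, critical_site, transfusion, primary_inpatient_dx) are placeholders, not actual VA data elements.

```python
# Minimal sketch of the four-criterion major-hemorrhage filter described above.
# Field names are hypothetical placeholders, not actual VA data elements.

def is_major_hemorrhage(event: dict) -> bool:
    """Return True if a candidate bleeding event meets any of the four criteria."""
    return (
        event.get("death_within_30d", False)         # 1) death within 30 days
        or event.get("critical_site", False)         # 2) bleeding in a critical anatomic site
        or event.get("transfusion", False)           # 3) associated with a transfusion
        or event.get("primary_inpatient_dx", False)  # 4) main reason for the hospitalization
    )

candidates = [
    {"icd9": "562.12", "critical_site": False, "transfusion": True},
    {"icd9": "599.7"},  # hematuria with none of the criteria met
]
major = [e for e in candidates if is_major_hemorrhage(e)]
print(f"{len(major)} of {len(candidates)} candidate events classified as major")
```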
Automated indirect immunofluorescence evaluation of antinuclear autoantibodies on HEp-2 cells.
Voigt, Jörn; Krause, Christopher; Rohwäder, Edda; Saschenbrecker, Sandra; Hahn, Melanie; Danckwardt, Maick; Feirer, Christian; Ens, Konstantin; Fechner, Kai; Barth, Erhardt; Martinetz, Thomas; Stöcker, Winfried
2012-01-01
Indirect immunofluorescence (IIF) on human epithelial (HEp-2) cells is considered the gold standard screening method for the detection of antinuclear autoantibodies (ANA). However, in terms of automation and standardization, it has not been able to keep pace with most other analytical techniques used in diagnostic laboratories. Although there are already some automation solutions on the market for IIF incubation, the automation of result evaluation is still in its infancy. Therefore, the EUROPattern Suite has been developed as a comprehensive automated processing and interpretation system for standardized and efficient ANA detection by HEp-2 cell-based IIF. In this study, the automated pattern recognition was compared to conventional visual interpretation in a total of 351 sera. In the discrimination of positive from negative samples, concordant results between visual and automated evaluation were obtained for 349 sera (99.4%, kappa = 0.984). The system missed none of the 272 antibody-positive samples and identified 77 out of 79 visually negative samples (analytical sensitivity/specificity: 100%/97.5%). Moreover, 94.0% of all main antibody patterns were recognized correctly by the software. Owing to its performance characteristics, EUROPattern enables fast, objective, and economical IIF ANA analysis and has the potential to reduce intra- and interlaboratory variability. PMID:23251220
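The agreement statistics quoted above can be reproduced from the implied 2x2 table (all 272 antibody-positive sera flagged by the software, 77 of 79 visually negative sera correctly identified); a minimal check:

```python
# Reproducing the reported agreement statistics from the 2x2 table implied above.
tp, fn, fp, tn = 272, 0, 2, 77
n = tp + fn + fp + tn                       # 351 sera in total

po = (tp + tn) / n                          # observed agreement
pe = ((tp + fn) * (tp + fp) + (fp + tn) * (fn + tn)) / n**2   # chance agreement
kappa = (po - pe) / (1 - pe)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"agreement {po:.3f}, kappa {kappa:.3f}, "
      f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
# -> agreement 0.994, kappa 0.984, sensitivity 100.0%, specificity 97.5%
```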
Vogel, Adam P; Block, Susan; Kefalianos, Elaina; Onslow, Mark; Eadie, Patricia; Barth, Ben; Conway, Laura; Mundt, James C; Reilly, Sheena
2015-04-01
To investigate the feasibility of adopting automated interactive voice response (IVR) technology for remotely capturing standardized speech samples from stuttering children. Participants were ten 6-year-old stuttering children. Their parents called a toll-free number from their homes and were prompted to elicit speech from their children using a standard protocol involving conversation, picture description and games. The automated IVR system was implemented using an off-the-shelf telephony software program and delivered by a standard desktop computer. The software infrastructure utilizes voice over internet protocol. Speech samples were automatically recorded during the calls. Video recordings were simultaneously acquired in the home at the time of the call to evaluate the fidelity of the telephone-collected samples. Key outcome measures included syllables spoken, percentage of syllables stuttered and an overall rating of stuttering severity using a 10-point scale. Data revealed a high level of relative reliability in terms of intra-class correlation between the video- and telephone-acquired samples on all outcome measures during the conversation task. Findings were less consistent for speech samples during picture description and games. Results suggest that IVR technology can be used successfully to automate remote capture of child speech samples.
Kottawatta, Kottawattage S A; Van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S
2017-11-29
Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and of meat contamination at retail shops, and to determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Both semi-automated and wet market processing contaminated broiler neck skins, at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance. PMID:29186018
Integrated Multi-process Microfluidic Systems for Automating Analysis
Yang, Weichun; Woolley, Adam T.
2010-01-01
Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343
Mendes, Luciano A; Mafra, Márcio; Rodrigues, Jhonatam C
2012-01-01
The glow-to-arc transition phenomenon (arcing) observed in plasma reactors used in materials processing was studied through the arcs' characteristic current and voltage waveforms. In order to capture these arc signals, a LABVIEW™-based automated instrumentation system (ARCVIEW) was developed, including the integration of an oscilloscope equipped with proper current and voltage probes. The system also captures the process parameters at the moments of arc occurrence, which were used to map the conditions of arc events. Experiments in H(2)-Ar DC pulsed plasma returned signal data from 215 arc events, which were analyzed through software routines. According to the results, an anti-arcing system should react within a few microseconds to prevent most of the damage caused by undesired arcing.
NASA Astrophysics Data System (ADS)
Hiraki, M.; Yamada, Y.; Chavas, L. M. G.; Matsugaki, N.; Igarashi, N.; Wakatsuki, S.
2013-03-01
To achieve fully-automated and/or remote data collection in high-throughput X-ray experiments, the Structural Biology Research Centre at the Photon Factory (PF) has installed the PF automated mounting system (PAM) sample exchange robots at the PF macromolecular crystallography beamlines BL-1A, BL-5A, BL-17A, AR-NW12A and AR-NE3A. We are upgrading the experimental systems, including the PAM, for stable and efficient operation. To prevent human error in automated data collection, we installed a two-dimensional barcode reader for identification of the cassettes and sample pins. Because no liquid nitrogen pipeline is installed in the PF experimental hutch, users commonly add liquid nitrogen using a small Dewar. To address this issue, an automated liquid nitrogen filling system that links a 100-liter tank to the robot Dewar has been installed on the PF macromolecular beamline. Here we describe this new implementation, as well as future prospects.
Automated Detection of Events of Scientific Interest
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
A report presents a slightly different perspective of the subject matter of Fusing Symbolic and Numerical Diagnostic Computations (NPO-42512), which appears elsewhere in this issue of NASA Tech Briefs. Briefly, the subject matter is the X-2000 Anomaly Detection Language, which is a developmental computing language for fusing two diagnostic computer programs (one implementing a numerical analysis method, the other implementing a symbolic analysis method) into a unified event-based decision analysis software system for real-time detection of events. In the case of the cited companion NASA Tech Briefs article, the contemplated events that one seeks to detect would be primarily failures or other changes that could adversely affect the safety or success of a spacecraft mission. In the case of the present report, the events to be detected could also include natural phenomena that could be of scientific interest. Hence, the use of the X-2000 Anomaly Detection Language could contribute to a capability for automated, coordinated use of multiple sensors and sensor-output-data-processing hardware and software to effect opportunistic collection and analysis of scientific data.
Fleischer, Heidi; Ramani, Kinjal; Blitti, Koffi; Roddelkopf, Thomas; Warkentin, Mareike; Behrend, Detlef; Thurow, Kerstin
2018-02-01
Automation systems are well established in industries and life science laboratories, especially in bioscreening and high-throughput applications. An increasing demand for automation solutions can be seen in the field of analytical measurement for chemical synthesis, quality control, and the medical and pharmaceutical fields, as well as research and development. In this study, an automation solution was developed and optimized for the investigation of new biliary endoprostheses (stents), which should reduce clogging after implantation in the human body. The material inside the stents (incrustations) has to be checked regularly and under identical conditions. The elemental composition is one criterion to be monitored in stent development. The manual procedure was transferred to an automated process including sample preparation, elemental analysis using inductively coupled plasma mass spectrometry (ICP-MS), and data evaluation. Due to safety issues, microwave-assisted acid digestion was executed outside of the automation system. The performance of the automated process was determined and validated. The measurement results and the processing times were compared for both the manual and the automated procedure. Finally, real samples of stent incrustations and pig bile were analyzed using the automation system.
USDA-ARS?s Scientific Manuscript database
Minimizing the effects of restraint and human interaction on the endocrine physiology of animals is essential for collection of accurate physiological measurements. Our objective was to compare stress-induced cortisol (CORT) and noradrenalin (NorA) responses in automated versus manual blood sampling...
HYDRA: A Middleware-Oriented Integrated Architecture for e-Procurement in Supply Chains
NASA Astrophysics Data System (ADS)
Alor-Hernandez, Giner; Aguilar-Lasserre, Alberto; Juarez-Martinez, Ulises; Posada-Gomez, Ruben; Cortes-Robles, Guillermo; Garcia-Martinez, Mario Alberto; Gomez-Berbis, Juan Miguel; Rodriguez-Gonzalez, Alejandro
The Service-Oriented Architecture (SOA) development paradigm has emerged to address the critical issues of creating, modifying and extending solutions for business process integration, incorporating process automation and automated exchange of information between organizations. Web services technology follows the SOA's principles for developing and deploying applications. Moreover, Web services are considered the platform for SOA, for both intra- and inter-enterprise communication. However, an SOA does not incorporate information about occurring events into business processes, and such events are central to supply chain management. These events and information delivery are addressed in an Event-Driven Architecture (EDA). Taking this into account, we propose a middleware-oriented integrated architecture that offers a brokering service for the procurement of products in a Supply Chain Management (SCM) scenario. As salient contributions, our system provides a hybrid architecture combining features of both SOA and EDA and a set of mechanisms for business process pattern management, monitoring based on UML sequence diagrams, Web services-based management, event publish/subscription and a reliable messaging service.
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
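As a hedged illustration of the probabilistic calculation underlying FTA (not of the AS-II algorithm itself), base-event probabilities can be propagated upward through AND/OR gates under an independence assumption:

```python
# Minimal fault-tree evaluation sketch: propagate base-event probabilities
# through AND/OR gates, assuming independent events. Illustrative only; this
# is not the AS-II algorithm described in the paper.
from functools import reduce

def and_gate(probs):
    """All inputs must fail: product of the probabilities."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """Any input failing triggers the gate: 1 minus the product of survivals."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical example: top event = (pump fails OR valve sticks) AND alarm fails
p_pump, p_valve, p_alarm = 0.02, 0.01, 0.05
p_top = and_gate([or_gate([p_pump, p_valve]), p_alarm])
print(f"Top-event probability ≈ {p_top:.5f}")   # ≈ 0.00149
```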
Closing the Loop in ICU Decision Support: Physiologic Event Detection, Alerts, and Documentation
Norris, Patrick R.; Dawant, Benoit M.
2002-01-01
Automated physiologic event detection and alerting is a challenging task in the ICU. Ideally care providers should be alerted only when events are clinically significant and there is opportunity for corrective action. However, the concepts of clinical significance and opportunity are difficult to define in automated systems, and effectiveness of alerting algorithms is difficult to measure. This paper describes recent efforts on the Simon project to capture information from ICU care providers about patient state and therapy in response to alerts, in order to assess the value of event definitions and progressively refine alerting algorithms. Event definitions for intracranial pressure and cerebral perfusion pressure were studied by implementing a reliable system to automatically deliver alerts to clinical users’ alphanumeric pagers, and to capture associated documentation about patient state and therapy when the alerts occurred. During a 6-month test period in the trauma ICU at Vanderbilt University Medical Center, 530 alerts were detected in 2280 hours of data spanning 14 patients. Clinical users electronically documented 81% of these alerts as they occurred. Retrospectively classifying documentation based on therapeutic actions taken, or reasons why actions were not taken, provided useful information about ways to potentially improve event definitions and enhance system utility.
Ouba, Anthony; Abboud-Abi Saab, Marie; Stemmann, Lars
2016-01-01
In this study, we investigated, for the first time, the potential impact of environmental changes on zooplankton abundance over a fourteen-year period (2000–2013) at an offshore station in the Eastern Mediterranean Sea (the Levantine basin, offshore Lebanon). Samples were collected monthly and analyzed using the semi-automated system ZooScan. Salinity, temperature and phytoplankton abundance (nano and microphytoplankton) were also measured. Results show no significant temporal trend in sea surface temperature over the years. Between 2005 and 2010, salinity in the upper layer (0–80 m) of the Levantine basin increased (by ~0.3). During this 5-year period, total zooplankton abundance significantly increased. These modifications were concomitant with the activation of the Aegean Sea as a source of dense water formation as part of the “Eastern Mediterranean Transient-like” event. The results of the present study suggested that zooplankton benefited from enhanced phytoplankton production during the mixing years of the event. Changes in the phenology of some taxa were observed accordingly, with a predominantly advanced peak of zooplankton abundance. In conclusion, long-term changes in zooplankton abundance were related to the Levantine thermohaline circulation rather than to sea surface warming. Sampling must be maintained to assess the impact of long-term climate change on zooplankton communities. PMID:27459093
Shamur, Eyal; Zilka, Miri; Hassner, Tal; China, Victor; Liberzon, Alex; Holzman, Roi
2016-06-01
Using videography to extract quantitative data on animal movement and kinematics constitutes a major tool in biomechanics and behavioral ecology. Advanced recording technologies now enable acquisition of long video sequences encompassing sparse and unpredictable events. Although such events may be ecologically important, analysis of sparse data can be extremely time-consuming and potentially biased; data quality is often strongly dependent on the training level of the observer and subject to contamination by observer-dependent biases. These constraints often limit our ability to study animal performance and fitness. Using long videos of foraging fish larvae, we provide a framework for the automated detection of prey acquisition strikes, a behavior that is infrequent yet critical for larval survival. We compared the performance of four video descriptors and their combinations against manually identified feeding events. For our data, the best single descriptor provided a classification accuracy of 77-95% and detection accuracy of 88-98%, depending on fish species and size. Using a combination of descriptors improved the accuracy of classification by ∼2%, but did not improve detection accuracy. Our results indicate that the effort required by an expert to manually label videos can be greatly reduced to examining only the potential feeding detections in order to filter false detections. Thus, using automated descriptors reduces the amount of manual work needed to identify events of interest from weeks to hours, enabling the assembly of an unbiased large dataset of ecologically relevant behaviors. © 2016. Published by The Company of Biologists Ltd.
Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C
2001-01-01
Total laboratory automation (TLA) can be substituted in mid-size laboratories by computerized sample workflow control (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed in cooperation with Roche Diagnostics (Barcelona, Spain) for this purpose. This software is connected to the online analyzers and to the laboratory information system and is able to control and direct the samples, working as an intermediate station. The only difference from TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: workload increase (64%) with reduction in the cost per test (43%), significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and with a significant headcount reduction (15% in our lab).
Mainali, Dipak; Seelenbinder, John
2016-05-01
Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that likely are mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples. © The Author(s) 2016.
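A minimal sketch of the center-of-gravity peak-position calculation that such a method can rest on; the band limits, baseline handling, and synthetic spectrum below are illustrative and are not taken from the MicroLab implementation:

```python
import numpy as np

def center_of_gravity_peak(wavenumbers, absorbance, lo, hi):
    """Intensity-weighted mean position of a band between lo and hi (cm^-1)."""
    mask = (wavenumbers >= lo) & (wavenumbers <= hi)
    x, y = wavenumbers[mask], absorbance[mask]
    y = y - y.min()                 # crude local baseline correction
    return float(np.sum(x * y) / np.sum(y))

# Synthetic band centered near 1705 cm^-1 (carbonyl region, illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(1650, 1760, 500)
y = np.exp(-((x - 1705.3) ** 2) / (2 * 4.0 ** 2)) + 0.01 * rng.random(x.size)

peak = center_of_gravity_peak(x, y, 1690, 1720)
print(f"center-of-gravity peak position: {peak:.1f} cm^-1")
```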
Microbiology of beef carcasses before and after slaughterline automation.
Whelehan, O. P.; Hudson, W. R.; Roberts, T. A.
1986-01-01
The bacterial status of beef carcasses at a commercial abattoir was monitored before and after slaughterline automation. Bacterial counts did not differ significantly overall (P greater than 0.05) between the original manual line and the automated line for either morning or afternoon slaughter. On the manual line, counts in the morning were lower than those from carcasses slaughtered in the afternoon, but on the automated line there was no difference between morning and afternoon counts. Due to a highly significant line × sample-site interaction for both morning and afternoon counts, overall differences among sample sites were not found by analysis of variance. However, principal components analysis revealed a significant shift in bacterial contamination among some sites due to slaughterline changes. The incidence of Enterobacteriaceae increased marginally following automation. PMID:3701039
Freedman, Kevin J; Bastian, Arangassery R; Chaiken, Irwin; Kim, Min Jun
2013-03-11
Protein conjugation provides a unique look into many biological phenomena and has been used for decades for molecular recognition purposes. In this study, the use of solid-state nanopores for the detection of gp120-associated complexes is investigated. These complexes exhibit monovalent and multivalent binding to anti-gp120 antibody monomers and dimers. In order to investigate the feasibility of many practical applications related to nanopores, detection of specific protein complexes is attempted within a heterogeneous protein sample, and the effect of voltage on complexed proteins is examined. It is found that the electric field within the pore can result in unbinding of a freely translocating protein complex within the transient event durations measured experimentally. The strong dependence of the unbinding time on voltage can be used to improve the detection capability of the nanopore system by adding an additional level of specificity that can be probed. These data provide a strong framework for future protein-specific detection schemes, which are shown to be feasible in the realm of a 'real-world' sample and an automated multidimensional method of detecting events. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J; Morling, Niels
2013-05-01
The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480 samples were processed. The highest DNA recovery was obtained with the PrepFiler Express kit on an AutoMate Express while the lowest DNA recovery was obtained using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen. Extraction using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen resulted in the lowest percentage of PCR inhibition (0%) while extraction using manual Chelex resulted in the highest percentage of PCR inhibition (51%). The largest number of reportable STR-profiles was obtained with DNA from samples extracted with the PrepFiler Express kit (75%) while the lowest number was obtained with DNA from samples extracted using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen (41%). Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Benefits of an automated GLP final report preparation software solution.
Elvebak, Larry E
2011-07-01
The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.
Designing and Securing an Event Processing System for Smart Spaces
ERIC Educational Resources Information Center
Li, Zang
2011-01-01
Smart spaces, or smart environments, represent the next evolutionary development in buildings, banking, homes, hospitals, transportation systems, industries, cities, and government automation. By riding the tide of sensor and event processing technologies, the smart environment captures and processes information about its surroundings as well as…
Automated Intelligent Training with a Tactical Decision Making Serious Game
2014-01-01
tactical skills, but only if experiential events are accompanied with guided feedback. Practice alone is not sufficient for learning; it must be...micro-adaptation occurs within events (Shute, 1993). Micro-adaptation is a major component of InGEAR’s pedagogical strategy, with feedback tailored
Automated biodosimetry using digital image analysis of fluorescence in situ hybridization specimens.
Castleman, K R; Schulze, M; Wu, Q
1997-11-01
Fluorescence in situ hybridization (FISH) of metaphase chromosome spreads is valuable for monitoring the radiation dose to circulating lymphocytes. At low dose levels, the number of cells that must be examined to estimate aberration frequencies is quite large. An automated microscope that can perform this analysis autonomously on suitably prepared specimens promises to make practical the large-scale studies that will be required for biodosimetry in the future. This paper describes such an instrument that is currently under development. We use metaphase specimens in which the five largest chromosomes have been hybridized with different-colored whole-chromosome painting probes. An automated multiband fluorescence microscope locates the spreads and counts the number of chromosome components of each color. Digital image analysis is used to locate and isolate the cells, count chromosome components, and estimate the proportions of abnormal cells. Cells exhibiting more than two chromosomal fragments in any color correspond to a clastogenic event. These automatically derived counts are corrected for statistical bias and used to estimate the overall rate of chromosome breakage. Overlap of fluorophore emission spectra prohibits isolation of the different chromosomes into separate color channels. Image processing effectively isolates each fluorophore to a single monochrome image, simplifying the task of counting chromosome fragments and reducing the error in the algorithm. Using proportion estimation, we remove the bias introduced by counting errors, leaving accuracy restricted by sample size considerations alone.
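The abstract does not spell out the proportion estimator used; one standard way to correct an observed abnormal-cell fraction for known counting error rates is the Rogan-Gladen adjustment, sketched here with hypothetical numbers:

```python
# One standard correction of an observed abnormal-cell fraction for known
# classifier error rates (Rogan-Gladen estimator). The paper's own proportion
# estimator is not specified in the abstract, so this is an illustrative stand-in.
def corrected_proportion(p_observed, sensitivity, specificity):
    p = (p_observed + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(p, 0.0), 1.0)   # clamp to the valid range

# Hypothetical numbers: 6% of cells flagged as abnormal by the image analysis,
# with 90% sensitivity and 97% specificity for detecting true aberrations.
print(f"bias-corrected aberrant fraction: {corrected_proportion(0.06, 0.90, 0.97):.3f}")
```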
Development of an automated pre-sampling plan for construction projects : final report.
DOT National Transportation Integrated Search
1983-03-01
The development of an automated pre-sampling plan was undertaken to free the district construction personnel from the cumbersome and time-consuming task of preparing such plans manually. A computer program was written and linked to a data file which ...
NASA Astrophysics Data System (ADS)
Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun
2016-03-01
Signal transduction events, including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI), play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce the processing time and sample volume because this system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we will present an automated mMAPS including an integrated microfluidic device, an automated stage, and electrical relays for high-throughput clinical screening. Based on this result, we estimated that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application to analyze tissue samples in a clinical setting.
Astrophysics in the Era of Massive Time-Domain Surveys
NASA Astrophysics Data System (ADS)
Djorgovski, G.
Synoptic sky surveys are now the largest data producers in astronomy, entering the Petascale regime, opening the time domain for a systematic exploration. A great variety of interesting phenomena, spanning essentially all subfields of astronomy, can only be studied in the time domain, and these new surveys are producing large statistical samples of the known types of objects and events for further studies (e.g., SNe, AGN, variable stars of many kinds), and have already uncovered previously unknown subtypes of these (e.g., rare or peculiar types of SNe). These surveys are generating a new science, and paving the way for even larger surveys to come, e.g., the LSST; our ability to fully exploit such forthcoming facilities depends critically on the science, methodology, and experience that are being accumulated now. Among the outstanding challenges, the foremost is our ability to conduct an effective follow-up of the interesting events discovered by the surveys in any wavelength regime. The follow-up resources, especially spectroscopy, are already severely limited and will remain so for the foreseeable future, thus requiring an intelligent down-selection of the most astrophysically interesting events to follow. The first step in that process is an automated, real-time, iterative classification of events that incorporates heterogeneous data from the surveys themselves, archival and contextual information (spatial, temporal, and multiwavelength), and the incoming follow-up observations. The second step is an optimal automated event prioritization and allocation of the available follow-up resources that also change in time. Both of these challenges are highly non-trivial, and require a strong cyber-infrastructure based on the Virtual Observatory data grid, and the various astroinformatics efforts. Time domain astronomy is inherently an astronomy of telescope-computational systems, and will increasingly depend on novel machine learning and artificial intelligence tools. Another arena with a strong potential for discovery is a purely archival, non-time-critical exploration of the time domain, with the time dimension adding the complexity to an already challenging problem of data mining of highly-dimensional parameter spaces produced by sky surveys.
[The actual possibilities of robotic microscopy in analysis automation and laboratory telemedicine].
Medovyĭ, V S; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Balugian, R Sh
2012-10-01
The article discusses the capabilities of automated microscopy complexes manufactured by Cellavision and MEKOS for performing medical analyses of blood films and other biomaterials. Joint operation of the complex and the physician, in a regimen of automated slide loading, screening, sampling, and sorting of cell types with simple morphology, followed by visual sorting of the sub-sample with complex morphology, provides a significant increase in method sensitivity, a reduced workload, and improved working conditions for the physician. The information technologies included, such as virtual slides and laboratory telemedicine, make it possible to assemble representative samples of rare cell types and pathologies, advancing both automation methods and medical research goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Egorov, Oleg; O'Hara, Matthew J.; Grate, Jay W.
An automated fluidic instrument is described that rapidly determines the total 99Tc content of aged nuclear waste samples, where the matrix is chemically and radiologically complex and the existing speciation of the 99Tc is variable. The monitor links microwave-assisted sample preparation with an automated anion exchange column separation and detection using a flow-through solid scintillator detector. The sample preparation steps acidify the sample, decompose organics, and convert all Tc species to the pertechnetate anion. The column-based anion exchange procedure separates the pertechnetate from the complex sample matrix, so that radiometric detection can provide accurate measurement of 99Tc. We developed a preprogrammed spike addition procedure to automatically determine matrix-matched calibration. The overall measurement efficiency that is determined simultaneously provides a self-diagnostic parameter for the radiochemical separation and overall instrument function. Continuous, automated operation was demonstrated over the course of 54 h, which resulted in the analysis of 215 samples plus 54 hourly spike-addition samples, with consistent overall measurement efficiency for the operation of the monitor. A sample can be processed and measured automatically in just 12.5 min with a detection limit of 23.5 Bq/mL of 99Tc in low activity waste (0.495 mL sample volume), with better than 10% RSD precision at concentrations above the quantification limit. This rapid automated analysis method was developed to support nuclear waste processing operations planned for the Hanford nuclear site.
Automation of Physiologic Data Presentation and Alarms in the Post Anesthesia Care Unit
Aukburg, S.J.; Ketikidis, P.H.; Kitz, D.S.; Mavrides, T.G.; Matschinsky, B.B.
1989-01-01
The routine use of pulse oximeters, non-invasive blood pressure monitors and electrocardiogram monitors has considerably improved patient care in the post-anesthesia period. Using an automated data collection system, we investigated the occurrence of several adverse events frequently revealed by these monitors. We found that the incidence of hypoxia was 35%, hypertension 12%, hypotension 8%, tachycardia 25% and bradycardia 1%. Discriminant analysis was able to correctly classify about 90% of patients into normal vs. hypertensive or hypotensive groups. The system software minimizes artifact, validates data for epidemiologic studies, and is able to identify variables that predict adverse events through application of appropriate statistical and artificial intelligence techniques.
Whitter, P D; Cary, P L; Leaton, J I; Johnson, J E
1999-01-01
An automated extraction scheme for the analysis of 11-nor-delta-9-tetrahydrocannabinol-9-carboxylic acid using the Hamilton Microlab 2200, which was modified for gravity-flow solid-phase extraction, has been evaluated. The Hamilton was fitted with a six-head probe, a modular valve positioner, and a peristaltic pump. The automated method significantly increased sample throughput, improved assay consistency, and reduced the time spent performing the extraction. Extraction recovery for the automated method was > 90%. The limit of detection, limit of quantitation, and upper limit of linearity were equivalent to those of the manual method: 1.5, 3.0, and 300 ng/mL, respectively. Precision at the 15-ng/mL cut-off was as follows: mean = 14.4, standard deviation = 0.5, coefficient of variation = 3.5%. Comparison of 38 patient samples, extracted by the manual and automated extraction methods, demonstrated the following correlation statistics: r = 0.991, slope 1.029, and y-intercept -2.895. Carryover was < 0.3% at 1000 ng/mL. Aliquoting/extraction time for the automated method (48 urine samples) was 50 min, and the manual procedure required approximately 2.5 h. The automated aliquoting/extraction method on the Hamilton Microlab 2200 and its use in forensic applications are reviewed.
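The method-comparison statistics reported above (r, slope, y-intercept) follow from an ordinary least-squares fit of automated against manual results; a generic sketch on synthetic paired data (the actual 38 patient values are not reproduced here):

```python
import numpy as np

# Illustrative method-comparison calculation on synthetic paired results (ng/mL);
# values are made up and do not reproduce the study data.
rng = np.random.default_rng(0)
manual = rng.uniform(5, 250, size=38)
automated = 1.03 * manual - 2.9 + rng.normal(0, 4, size=38)

slope, intercept = np.polyfit(manual, automated, 1)   # OLS regression line
r = np.corrcoef(manual, automated)[0, 1]              # Pearson correlation
print(f"r = {r:.3f}, slope = {slope:.3f}, y-intercept = {intercept:.3f}")
```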
Petrova, Darinka Todorova; Cocisiu, Gabriela Ariadna; Eberle, Christoph; Rhode, Karl-Heinz; Brandhorst, Gunnar; Walson, Philip D; Oellerich, Michael
2013-09-01
The aim of this study was to develop a novel method for automated quantification of cell-free hemoglobin (fHb) based on the hemolysis index (HI; Roche Diagnostics). The novel fHb method based on the HI was correlated with fHb measured using the triple wavelength methods of both Harboe [fHb, g/L = (0.915 * HI + 2.634)/100] and Fairbanks et al. [fHb, g/L = (0.917 * HI + 2.131)/100]. fHb concentrations were estimated from the HI using the Roche Modular automated platform in self-made and commercially available quality controls, as well as samples from a proficiency testing scheme (INSTAND). The fHb results derived from the Roche automated HI were then compared to results obtained using the traditional spectrophotometric assays for one hundred plasma samples with varying degrees of hemolysis, lipemia and/or bilirubinemia. The novel method using automated HI quantification on the Roche Modular clinical chemistry platform correlated well with results using the classical methods in the 100 patient samples (Harboe: r = 0.9284; Fairbanks et al.: r = 0.9689) and recovery was good for self-made controls. However, commercially available quality controls showed poor recovery due to an unidentified matrix problem. The novel method produced reliable determination of fHb in samples without interferences. However, poor recovery using commercially available fHb quality control samples currently greatly limits its usefulness. © 2013.
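Applying the two quoted conversions directly gives a quick sense of the fHb values implied by a given HI reading; the HI values below are illustrative:

```python
# Direct application of the two conversion formulas quoted above
# (cell-free hemoglobin, fHb, in g/L from the automated hemolysis index, HI).
def fhb_harboe(hi):
    return (0.915 * hi + 2.634) / 100.0

def fhb_fairbanks(hi):
    return (0.917 * hi + 2.131) / 100.0

for hi in (10, 50, 200):   # illustrative HI readings
    print(f"HI {hi:>3}: Harboe {fhb_harboe(hi):.3f} g/L, "
          f"Fairbanks {fhb_fairbanks(hi):.3f} g/L")
```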
Automation Activities that Support C2 Agility to Mitigate Type 7 Risks
2014-06-01
on business trip • Space ship runs into space junk • What are the probabilities for these events in a 45-year career time frame? Event that... representation that information system understands • State-Space Diagram • Common Agility Space (CAS): a simple C2 organization representation
The laboratory of the 1990s—Planning for total automation
Brunner, Linda A.
1992-01-01
The analytical laboratory of the 1990s must be able to meet and accommodate the rapid evolution of modern-day technology. One such area is laboratory automation. Total automation may be seen as the coupling of computerized sample tracking, electronic documentation and data reduction with automated sample handling, preparation and analysis, resulting in a complete analytical procedure with minimal human involvement. Requirements may vary from one laboratory or facility to another, so the automation has to be flexible enough to cover a wide range of applications, and yet fit into specific niches depending on individual needs. Total automation must be planned for, well in advance, if the endeavour is to be a success. Space, laboratory layout, proper equipment, and the availability and access to necessary utilities must be taken into account. Adequate training and experience of the personnel working with the technology must also be ensured. In addition, responsibilities of installation, programming maintenance and operation have to be addressed. Proper time management and the efficient implementation and use of total automation are also crucial to successful operations. This paper provides insights into laboratory organization and requirements, as well as discussing the management issues that must be faced when automating laboratory procedures. PMID:18924925
Louw, Tyron; Markkula, Gustav; Boer, Erwin; Madigan, Ruth; Carsten, Oliver; Merat, Natasha
2017-11-01
This driving simulator study, conducted as part of the EU AdaptIVe project, investigated drivers' performance in critical traffic events during the resumption of control from an automated driving system. Prior to the critical events, using a between-participant design, 75 drivers were exposed to various screen manipulations that varied the amount of available visual information from the road environment and automation state, which aimed to take them progressively further 'out-of-the-loop' (OoTL). The current paper presents an analysis of the timing, type, and rate of drivers' collision avoidance response, also investigating how these were influenced by the criticality of the unfolding situation. Results showed that the amount of visual information available to drivers during automation impacted how quickly they resumed manual control, with less information associated with slower take-over times; however, this did not influence the timing of when drivers began a collision avoidance manoeuvre. Instead, the observed behaviour is in line with recent accounts emphasising the role of scenario kinematics in the timing of driver avoidance response. When considering collision incidents in particular, avoidance manoeuvres were initiated when the situation criticality exceeded an Inverse Time To Collision value of ≈0.3 s⁻¹. Our results suggest that take-over time and the timing and quality of avoidance response appear to be largely independent, and while long take-over time did not predict collision outcome, kinematically late initiation of avoidance did. Hence, system design should focus on achieving kinematically early avoidance initiation, rather than short take-over times. Copyright © 2017 Elsevier Ltd. All rights reserved.
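For reference, inverse time-to-collision is assumed here to follow the usual kinematic definition, closing speed divided by the remaining gap; under that assumption, at a 10 m/s closing speed the reported ≈0.3 s⁻¹ initiation level is reached at a gap of roughly 33 m:

```python
# Inverse time-to-collision (invTTC) under the common kinematic definition
# assumed here: closing speed divided by the remaining gap to the obstacle.
def inverse_ttc(closing_speed_mps: float, gap_m: float) -> float:
    return closing_speed_mps / gap_m

# Illustrative values: at 10 m/s closing speed, invTTC crosses the ~0.3 1/s
# avoidance-initiation level reported above when the gap shrinks to ~33 m.
for gap in (60, 45, 33, 20):
    print(f"gap {gap:>2} m -> invTTC {inverse_ttc(10.0, gap):.2f} 1/s")
```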
Challenges and Demands on Automated Software Revision
NASA Technical Reports Server (NTRS)
Bonakdarpour, Borzoo; Kulkarni, Sandeep S.
2008-01-01
In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the existence of numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a common reality in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct by construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in the so-called cyber-physical systems. When such systems are safety/mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.
NASA Astrophysics Data System (ADS)
Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.
2017-01-01
The Neutron Activation Analysis (NAA) process has been established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time-consuming and inefficient. Hence, software to support system automation was developed to provide an effective method to replace redundant manual data entry and to speed up the sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs. The sub-programs are sample registration; hardware control and data acquisition; and sample analysis. The data flow and connection between the sub-programs will be explained. The software is developed using the National Instruments LabVIEW development package.
ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.
Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu
2015-02-01
IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample preparation automation systems cannot meet IC50/EC50 assay throughput demand. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculation of spiking solutions and the matrix-solution preparation scheme, the actual spiking and matrix-solution preparations, as well as the flexible sample extraction procedures after incubation. In addition, the platform also automates the data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process this whole class of assays under varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency. © 2014 Society for Laboratory Automation and Screening.
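The nonlinear regression step can be illustrated with a standard four-parameter logistic model; this is a generic sketch with synthetic data, not the ICECAP implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    """Four-parameter logistic: response as a function of concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

# Synthetic dose-response data (concentrations in nM, % activity remaining)
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300, 1000, 3000], dtype=float)
resp = np.array([98, 97, 95, 88, 72, 48, 25, 12, 6, 4], dtype=float)

popt, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 100.0, 30.0, 1.0])
bottom, top, ic50, hill = popt
print(f"fitted IC50 ≈ {ic50:.1f} nM (Hill slope {hill:.2f})")
```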
Automated Mapping of Flood Events in the Mississippi River Basin Utilizing NASA Earth Observations
NASA Technical Reports Server (NTRS)
Bartkovich, Mercedes; Baldwin-Zook, Helen Blue; Cruz, Dashiell; McVey, Nicholas; Ploetz, Chris; Callaway, Olivia
2017-01-01
The Mississippi River Basin is the fourth largest drainage basin in the world, and is susceptible to multi-level flood events caused by heavy precipitation, snow melt, and changes in water table levels. Conducting flood analysis during periods of disaster is a challenging endeavor for NASA's Short-term Prediction Research and Transition Center (SPoRT), Federal Emergency Management Agency (FEMA), and the U.S. Geological Survey's Hazards Data Distribution Systems (USGS HDDS) due to heavily-involved research and lack of manpower. During this project, an automated script was generated that performs high-level flood analysis to relieve the workload for end-users. The script incorporated Landsat 8 Operational Land Imager (OLI) tiles and utilized computer-learning techniques to generate accurate water extent maps. The script referenced the Moderate Resolution Imaging Spectroradiometer (MODIS) land-water mask to isolate areas of flood induced waters. These areas were overlaid onto the National Land Cover Database's (NLCD) land cover data, the Oak Ridge National Laboratory's LandScan data, and Homeland Infrastructure Foundation-Level Data (HIFLD) to determine the classification of areas impacted and the population density affected by flooding. The automated algorithm was initially tested on the September 2016 flood event that occurred in Upper Mississippi River Basin, and was then further tested on multiple flood events within the Mississippi River Basin. This script allows end users to create their own flood probability and impact maps for disaster mitigation and recovery efforts.
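The abstract does not detail the classification step, so the following is only a stand-in for the water-extent logic it describes: a spectral water index (NDWI) threshold on synthetic band data, compared against a permanent-water mask to isolate flood-induced water.

```python
import numpy as np

# Minimal stand-in for the water-extent step described above: an NDWI
# (green vs. near-infrared) threshold compared against a permanent-water mask
# to isolate flood-induced water. The actual project used machine-learning
# classification; the band arrays, threshold, and mask here are synthetic.
rng = np.random.default_rng(1)
green = rng.random((512, 512))                     # stand-in for Landsat 8 OLI band 3
nir = rng.random((512, 512))                       # stand-in for Landsat 8 OLI band 5
permanent_water = rng.random((512, 512)) > 0.95    # stand-in for the MODIS water mask

ndwi = (green - nir) / (green + nir + 1e-9)        # normalized difference water index
water = ndwi > 0.2                                 # illustrative threshold
flood = water & ~permanent_water                   # water outside permanent water bodies
print(f"flood-flagged pixels: {flood.sum()} of {flood.size}")
```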
Xu, Stanley; Newcomer, Sophia; Nelson, Jennifer; Qian, Lei; McClure, David; Pan, Yi; Zeng, Chan; Glanz, Jason
2014-05-01
The Vaccine Safety Datalink project captures electronic health record data, including vaccinations and medically attended adverse events, on 8.8 million enrollees annually from participating managed care organizations in the United States. While the automated vaccination data are generally of high quality, a presumptive adverse event based on diagnosis codes in automated health care data may not be true (misclassification). Consequently, analyses using automated health care data can generate false positive results, where an association between the vaccine and outcome is incorrectly identified, as well as false negative findings, where a true association or signal is missed. We developed novel conditional Poisson regression models and fixed effects models that accommodate misclassification of adverse event outcomes for the self-controlled case series design. We conducted simulation studies to evaluate their performance in signal detection in vaccine safety hypothesis-generating (screening) studies. We also reanalyzed four previously identified signals from a recent vaccine safety study using the newly proposed models. Our simulation studies demonstrated that (i) outcome misclassification resulted in both false positive and false negative signals in screening studies; and (ii) the newly proposed models reduced the rates of both false positive and false negative signals. In reanalyses of four previously identified signals using the novel statistical models, the incidence rate ratio estimates and statistical significances were similar to those using conventional models and including only medical record review-confirmed cases. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Prüller, Florian; Wagner, Jasmin; Raggam, Reinhard B; Hoenigl, Martin; Kessler, Harald H; Truschnig-Wilders, Martie; Krause, Robert
2014-07-01
Testing for (1→3)-beta-D-glucan (BDG) is used for detection of invasive fungal infection. However, current assays lack automation and the ability to conduct rapid single-sample testing. The Fungitell assay was adapted for automation and evaluated using clinical samples from patients with culture-proven candidemia and from culture-negative controls in duplicate. A comparison with the standard assay protocol was made in order to establish analytical specifications. With the automated protocol, the analytical measuring range was 8-2500 pg/ml of BDG, and precision testing resulted in coefficients of variation that ranged from 3.0% to 5.5%. Samples from 15 patients with culture-proven candidemia and 94 culture-negative samples were evaluated. All culture-proven samples showed BDG values >80 pg/ml (mean 1247 pg/ml; range, 116-2990 pg/ml), which were considered positive. Of the 94 culture-negative samples, 92 had BDG values <60 pg/ml (mean, 28 pg/ml), which were considered to be negative, and 2 samples were false-positive (≥80 pg/ml; up to 124 pg/ml). Results could be obtained within 45 min and showed excellent agreement with results obtained with the standard assay protocol. The automated Fungitell assay proved to be reliable and rapid for diagnosis of candidemia. It was demonstrated to be feasible and cost efficient for both single-sample and large-scale testing of serum BDG. Its 1-h time-to-result will allow better support for clinicians in the management of antifungal therapy. © The Author 2014. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
A compendium of controlled diffusion blades generated by an automated inverse design procedure
NASA Technical Reports Server (NTRS)
Sanz, Jose M.
1989-01-01
A set of sample cases was produced to test an automated design procedure developed at the NASA Lewis Research Center for the design of controlled diffusion blades. The range of application of the automated design procedure is documented. The results presented include characteristic compressor and turbine blade sections produced with the automated design code as well as various other airfoils produced with the base design method prior to the incorporation of the automated procedure.
NASA Astrophysics Data System (ADS)
Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A. G.; Sellergren, Börje; Reubsaet, Léon
2017-03-01
Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting.
Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara
2017-01-01
This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real time analysis along with 2- and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
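As an illustration of the interpolation-and-validation step reported above, the following is a minimal sketch that interpolates values along a sampling track and computes a 3-fold cross-validation NRMSE; a Gaussian process stands in for the paper's kriging, and the coordinates and values are made up.

```python
# Minimal sketch of the interpolation-and-validation idea described above:
# interpolate a water-quality variable along a sampling track and report the
# normalized RMSE from 3-fold cross validation. A Gaussian process stands in
# for the paper's kriging; coordinates and values here are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)
xyz = rng.uniform(0, 100, size=(120, 3))                  # x, y, depth along an undulating path
values = np.sin(xyz[:, 0] / 15) + 0.05 * xyz[:, 2] + rng.normal(0, 0.1, 120)

kernel = 1.0 * RBF(length_scale=20.0) + WhiteKernel(noise_level=0.01)
errors = []
for train, test in KFold(n_splits=3, shuffle=True, random_state=0).split(xyz):
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(xyz[train], values[train])
    pred = gp.predict(xyz[test])
    errors.append(np.sqrt(np.mean((pred - values[test]) ** 2)))

nrmse = np.mean(errors) / (values.max() - values.min())   # normalized by observed range
print(f"3-fold NRMSE ~ {100 * nrmse:.1f}%")
```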
Multi-Mission Automated Task Invocation Subsystem
NASA Technical Reports Server (NTRS)
Cheng, Cecilia S.; Patel, Rajesh R.; Sayfi, Elias M.; Lee, Hyun H.
2009-01-01
Multi-Mission Automated Task Invocation Subsystem (MATIS) is software that establishes a distributed data-processing framework for automated generation of instrument data products from a spacecraft mission. Each mission may set up a set of MATIS servers for processing its data products. MATIS embodies lessons learned in experience with prior instrument-data-product-generation software. MATIS is an event-driven workflow manager that interprets project-specific, user-defined rules for managing processes. It executes programs in response to specific events under specific conditions according to the rules. Because requirements of different missions are too diverse to be satisfied by one program, MATIS accommodates plug-in programs. MATIS is flexible in that users can control such processing parameters as how many pipelines to run and on which computing machines to run them. MATIS has a fail-safe capability. At each step, MATIS captures and retains pertinent information needed to complete the step and start the next step. In the event of a restart, this information is retrieved so that processing can be resumed appropriately. At this writing, it is planned to develop a graphical user interface (GUI) for monitoring and controlling a product generation engine in MATIS. The GUI would enable users to schedule multiple processes and manage the data products produced in the processes. Although MATIS was initially designed for instrument data product generation,
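As a loose illustration of the event-driven, rule-based pattern that the abstract describes, the following is a minimal sketch of a workflow manager that runs an action when an event satisfies a user-defined condition; the rule, event, and names are hypothetical, and this is not MATIS code.

```python
# Minimal sketch of the event-driven rule pattern described above: user-defined
# rules bind an event type and a condition to a program to run. Illustrative toy
# only; not the MATIS implementation.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    event_type: str                      # e.g. "raw_file_arrived"
    condition: Callable[[dict], bool]    # predicate over the event payload
    action: Callable[[dict], None]       # program (or pipeline step) to invoke

class WorkflowManager:
    def __init__(self) -> None:
        self.rules: Dict[str, List[Rule]] = {}

    def register(self, rule: Rule) -> None:
        self.rules.setdefault(rule.event_type, []).append(rule)

    def dispatch(self, event_type: str, payload: dict) -> None:
        for rule in self.rules.get(event_type, []):
            if rule.condition(payload):
                rule.action(payload)     # a real system would also checkpoint state here

manager = WorkflowManager()
manager.register(Rule(
    event_type="raw_file_arrived",
    condition=lambda e: e["instrument"] == "SPEC-A",
    action=lambda e: print(f"launching level-1 product generation for {e['path']}"),
))
manager.dispatch("raw_file_arrived", {"instrument": "SPEC-A", "path": "/data/orbit_0042.raw"})
```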
Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing
NASA Astrophysics Data System (ADS)
Rowe, C. A.; Stead, R. J.; Begnaud, M. L.
2013-12-01
Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis). Here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect to identify bad array elements through a jackknifing process that isolates the anomalous channels, so that an automated analysis system can discard them prior to FK analysis and beamforming on events of interest.
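As an illustration of the subspace-dimensionality idea, the following is a minimal sketch that estimates the effective dimension of an array-wide time series from its singular-value spectrum and jackknifes channels to flag the one that perturbs it; the synthetic data and the 95% energy cut-off are assumptions, not the authors' processing.

```python
# Minimal sketch of the subspace-dimensionality QC idea described above:
# estimate the effective dimension of the array-wide time series from the
# singular-value spectrum, then jackknife one channel at a time to see which
# channel most perturbs it. Synthetic data; not the authors' processing code.
import numpy as np

def effective_dimension(data, energy=0.95):
    """Number of singular values needed to capture `energy` of the variance."""
    s = np.linalg.svd(data - data.mean(axis=1, keepdims=True), compute_uv=False)
    frac = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(frac, energy) + 1)

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)
common = np.sin(2 * np.pi * 1.5 * t)                       # coherent signal across the array
data = np.vstack([common + 0.05 * rng.normal(size=t.size) for _ in range(9)])
data = np.vstack([data, rng.normal(size=t.size)])          # channel 9: malfunctioning, pure noise

full_dim = effective_dimension(data)
for ch in range(data.shape[0]):
    reduced = effective_dimension(np.delete(data, ch, axis=0))
    if reduced < full_dim:
        print(f"channel {ch} inflates the subspace dimension -> flag for QC")
```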
An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices
2018-01-01
The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyze the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take into account emotional behaviour. In addition, integrating new components in smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario. PMID:29748468
An Emotion Aware Task Automation Architecture Based on Semantic Technologies for Smart Offices.
Muñoz, Sergio; Araque, Oscar; Sánchez-Rada, J Fernando; Iglesias, Carlos A
2018-05-10
The evolution of the Internet of Things leads to new opportunities for the contemporary notion of smart offices, where employees can benefit from automation to maximize their productivity and performance. However, although extensive research has been dedicated to analyze the impact of workers’ emotions on their job performance, there is still a lack of pervasive environments that take into account emotional behaviour. In addition, integrating new components in smart environments is not straightforward. To face these challenges, this article proposes an architecture for emotion aware automation platforms based on semantic event-driven rules to automate the adaptation of the workplace to the employee’s needs. The main contributions of this paper are: (i) the design of an emotion aware automation platform architecture for smart offices; (ii) the semantic modelling of the system; and (iii) the implementation and evaluation of the proposed architecture in a real scenario.
Mapping the Recent US Hurricanes Triggered Flood Events in Near Real Time
NASA Astrophysics Data System (ADS)
Shen, X.; Lazin, R.; Anagnostou, E. N.; Wanik, D. W.; Brakenridge, G. R.
2017-12-01
Synthetic Aperture Radar (SAR) observations are the only reliable remote sensing data source for mapping flood inundation during severe weather events. Unfortunately, since state-of-the-art data processing algorithms cannot meet the automation and quality standards of a near-real-time (NRT) system, quality-controlled inundation mapping by SAR currently depends heavily on manual processing, which limits our capability to quickly issue flood inundation maps at global scale. Specifically, most SAR-based inundation mapping algorithms are not fully automated, while those that are automated exhibit severe over- and/or under-detection errors that limit their potential. These detection errors are primarily caused by the strong overlap among the SAR backscattering probability density functions (PDF) of different land cover types. In this study, we tested a newly developed NRT SAR-based inundation mapping system, named Radar Produced Inundation Diary (RAPID), using Sentinel-1 dual-polarized SAR data over recent flood events caused by Hurricanes Harvey, Irma, and Maria (2017). The system consists of 1) self-optimized multi-threshold classification, 2) over-detection removal using land-cover information and change detection, 3) under-detection compensation, and 4) machine-learning based correction. Algorithm details are introduced in another poster, H53J-1603. Good agreement was obtained by comparing the result from RAPID with visual interpretation of SAR images and manual processing from the Dartmouth Flood Observatory (DFO) (see Figure 1). Specifically, the over- and under-detection typically noted in automated methods is significantly reduced to negligible levels. This performance indicates that RAPID can address the automation and accuracy issues of current state-of-the-art algorithms and has the potential to be applied operationally to a number of satellite SAR missions, such as SWOT, ALOS, and Sentinel. RAPID data can support many applications such as rapid assessment of damage losses and disaster alleviation/rescue at global scale.
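As a stand-in illustration for the threshold-based water classification in the first RAPID stage, the following minimal sketch applies a single Otsu threshold to simulated SAR backscatter; the data and the single-threshold choice are assumptions, and RAPID itself uses self-optimized multi-thresholding plus the additional correction steps listed above.

```python
# Minimal stand-in for the first RAPID stage described above (threshold-based
# water classification of SAR backscatter). A single Otsu threshold on the
# backscatter histogram is used purely for illustration; the real system uses
# self-optimized multi-thresholding plus land-cover and change-detection steps.
import numpy as np

def otsu_threshold(values, nbins=256):
    """Histogram-based threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = hist / hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, nbins):
        w0, w1 = w[:i].sum(), w[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (w[:i] * centers[:i]).sum() / w0
        m1 = (w[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

rng = np.random.default_rng(2)
water = rng.normal(-18, 1.5, 5000)      # dB: smooth water backscatters weakly
land = rng.normal(-8, 2.0, 15000)       # dB: land backscatters more strongly
sigma0 = np.concatenate([water, land])
t = otsu_threshold(sigma0)
print(f"water/land threshold ~ {t:.1f} dB; water fraction = {(sigma0 < t).mean():.2f}")
```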
Seasonal trend of fog water chemical composition in the Po Valley.
Fuzzi, S; Facchini, M C; Orsi, G; Ferri, D
1992-01-01
Fog frequency in the Po Valley, Northern Italy, can be as high as 30% of the time in the fall-winter season. High pollutant concentrations have been measured in fog water samples collected in this area over the past few years. The combined effects of high fog occurrence and high pollutant loading of the fog droplets can determine, in this area, appreciable chemical deposition rates. An automated station for fog water collection was developed, and deployed at the field station of S. Pietro Capofiume, in the eastern part of the Po Valley for an extended period: from the beginning of November 1989 to the end of April 1990. Time-resolved sampling of fog droplets was carried out during all fog events occurring in this period, and chemical analyses were performed on the collected samples. Statistical information on fog occurrence and fog water chemical composition is reported in this paper, and a tentative seasonal deposition budget is calculated for H+, NH4+, NO3- and SO4(2-) ions. The problems connected with fog droplet sampling in sub-freezing conditions are also addressed in the paper.
Lu, Chao; Li, Xubin; Wu, Dongsheng; Zheng, Lianqing; Yang, Wei
2016-01-12
In aqueous solution, solute conformational transitions are governed by intimate interplays of the fluctuations of solute-solute, solute-water, and water-water interactions. To promote molecular fluctuations to enhance sampling of essential conformational changes, a common strategy is to construct an expanded Hamiltonian through a series of Hamiltonian perturbations and thereby broaden the distribution of certain interactions of focus. Due to a lack of active sampling of configuration response to Hamiltonian transitions, it is challenging for common expanded Hamiltonian methods to robustly explore solvent-mediated rare conformational events. The orthogonal space sampling (OSS) scheme, as exemplified by the orthogonal space random walk and orthogonal space tempering methods, provides a general framework for synchronous acceleration of slow configuration responses. To more effectively sample conformational transitions in aqueous solution, in this work we devised a generalized orthogonal space tempering (gOST) algorithm. Specifically, in the Hamiltonian perturbation part, a solvent-accessible-surface-area-dependent term is introduced to implicitly perturb near-solute water-water fluctuations; more importantly, in the orthogonal space response part, the generalized force order parameter is extended to a two-dimensional order parameter set, in which essential solute-solvent and solute-solute components are treated separately. The gOST algorithm is evaluated through a molecular dynamics simulation study on the explicitly solvated deca-alanine (Ala10) peptide. On the basis of a fully automated sampling protocol, the gOST simulation enabled repetitive folding and unfolding of the solvated peptide within a single continuous trajectory and allowed for detailed construction of Ala10 folding/unfolding free energy surfaces. The gOST result reveals that solvent cooperative fluctuations play a pivotal role in Ala10 folding/unfolding transitions. In addition, our assessment analysis suggests that because essential conformational events are mainly driven by the compensating fluctuations of essential solute-solvent and solute-solute interactions, commonly employed "predictive" sampling methods are unlikely to be effective on this seemingly "simple" system. The gOST development presented in this paper illustrates how to employ the OSS scheme for physics-based sampling method designs.
Microseismic event location by master-event waveform stacking
NASA Astrophysics Data System (ADS)
Grigoli, F.; Cesca, S.; Dahm, T.
2016-12-01
Waveform stacking location methods are nowadays extensively used to monitor induced seismicity associated with several underground industrial activities, such as mining, oil and gas production, and geothermal energy exploitation. In the last decade a significant effort has been spent to develop or improve methodologies able to perform automated seismological analysis of weak events at a local scale. This effort was accompanied by the improvement of monitoring systems, resulting in an increasing number of large microseismicity catalogs. The analysis of microseismicity is challenging because of the large number of recorded events, often characterized by a low signal-to-noise ratio. A significant limitation of the traditional location approaches is that automated picking is often done on each seismogram individually, making little or no use of the coherency information between stations. In order to improve the performance of the traditional location methods, alternative approaches have recently been proposed. These methods exploit the coherence of the waveforms recorded at different stations and do not require any automated picking procedure. The main advantage of these methods lies in their robustness even when the recorded waveforms are very noisy. On the other hand, like any other location method, the location performance strongly depends on the accuracy of the available velocity model; when dealing with inaccurate velocity models, location results can be affected by large errors. Here we introduce a new automated waveform stacking location method which is less dependent on knowledge of the velocity model and presents several benefits that improve the location accuracy: 1) it accounts for phase delays due to local site effects, e.g. surface topography or variable sediment thickness; 2) theoretical velocity models are only used to estimate travel times within the source volume, and not along the whole source-sensor path. We finally compare the location results for both synthetic and real data with those obtained using classical waveform stacking approaches.
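As an illustration of the waveform-stacking principle discussed above, the following minimal sketch grid-searches trial source positions and origin times, aligns station characteristic functions by predicted travel times, and takes the stack maximum; the 1-D constant-velocity setup is an assumption for illustration, not the authors' master-event method.

```python
# Minimal sketch of the waveform-stacking principle: for each trial source
# position and origin time, align station characteristic functions by the
# predicted travel times and sum them; the largest stack gives the location
# estimate. 1-D geometry with constant velocity, purely illustrative.
import numpy as np

dt, v = 0.01, 3.0                                   # sample interval (s), velocity (km/s)
stations = np.array([0.0, 5.0, 10.0, 15.0])         # station positions (km)
true_x, true_t0 = 7.0, 2.0                          # true source position and origin time

n = 1000
traces = np.zeros((len(stations), n))
for i, xs in enumerate(stations):                   # synthetic characteristic functions
    arrival = true_t0 + abs(xs - true_x) / v
    traces[i, int(round(arrival / dt))] = 1.0       # unit spike at the arrival sample

trial_x = np.arange(0.0, 15.0, 0.1)
trial_t0 = np.arange(0.0, 5.0, dt)
best = (None, None, -1.0)
for x in trial_x:
    shifts = np.round(np.abs(stations - x) / v / dt).astype(int)
    for t0 in trial_t0:
        idx = int(round(t0 / dt)) + shifts
        if idx.max() >= n:
            continue
        stack = traces[np.arange(len(stations)), idx].sum()
        if stack > best[2]:
            best = (x, t0, stack)
print(f"stack maximum at x ~ {best[0]:.1f} km, t0 ~ {best[1]:.2f} s")
```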
Lerch, Oliver; Temme, Oliver; Daldrup, Thomas
2014-07-01
The analysis of opioids, cocaine, and metabolites from blood serum is a routine task in forensic laboratories. Commonly, the employed methods include many manual or partly automated steps like protein precipitation, dilution, solid phase extraction, evaporation, and derivatization preceding a gas chromatography (GC)/mass spectrometry (MS) or liquid chromatography (LC)/MS analysis. In this study, a comprehensively automated method was developed from a validated, partly automated routine method. This was possible by replicating method parameters on the automated system; only marginal optimization of parameters was necessary. The automation, relying on an x-y-z robot after manual protein precipitation, includes the solid phase extraction, evaporation of the eluate, derivatization (silylation with N-methyl-N-trimethylsilyltrifluoroacetamide, MSTFA), and injection into a GC/MS. A quantitative analysis of almost 170 authentic serum samples and more than 50 authentic samples of other matrices such as urine, different tissues, and heart blood for cocaine, benzoylecgonine, methadone, morphine, codeine, 6-monoacetylmorphine, dihydrocodeine, and 7-aminoflunitrazepam was conducted with both methods, proving that the analytical results are equivalent even near the limits of quantification (low ng/ml range). To the best of our knowledge, this application is the first one reported in the literature employing this sample preparation system.
Nurizzo, Didier; Bowler, Matthew W.; Caserotto, Hugo; Dobias, Fabien; Giraud, Thierry; Surr, John; Guichard, Nicolas; Papp, Gergely; Guijarro, Matias; Mueller-Dieckmann, Christoph; Flot, David; McSweeney, Sean; Cipriani, Florent; Theveneau, Pascal; Leonard, Gordon A.
2016-01-01
Automation of the mounting of cryocooled samples is now a feature of the majority of beamlines dedicated to macromolecular crystallography (MX). Robotic sample changers have been developed over many years, with the latest designs increasing capacity, reliability and speed. Here, the development of a new sample changer deployed at the ESRF beamline MASSIF-1 (ID30A-1), based on an industrial six-axis robot, is described. The device, named RoboDiff, includes a high-capacity dewar, acts as both a sample changer and a high-accuracy goniometer, and has been designed for completely unattended sample mounting and diffraction data collection. This aim has been achieved using a high level of diagnostics at all steps of the process from mounting and characterization to data collection. The RoboDiff has been in service on the fully automated endstation MASSIF-1 at the ESRF since September 2014 and, at the time of writing, has processed more than 20 000 samples completely automatically. PMID:27487827
Automated Detection and Analysis of Interplanetary Shocks with Real-Time Application
NASA Astrophysics Data System (ADS)
Vorotnikov, V.; Smith, C. W.; Hu, Q.; Szabo, A.; Skoug, R. M.; Cohen, C. M.
2006-12-01
The ACE real-time data stream provides web-based nowcasting capabilities for solar wind conditions upstream of Earth. Our goal is to provide an automated code that finds and analyzes interplanetary shocks as they occur for possible real-time application to space weather nowcasting. Shock analysis algorithms based on the Rankine-Hugoniot jump conditions exist and are in widespread use today for the interactive analysis of interplanetary shocks, yielding parameters such as shock speed, propagation direction, and shock strength in the form of compression ratios. Although these codes can be automated in a reasonable manner to yield solutions not far from those obtained by user-directed interactive analysis, event detection presents an added obstacle and is the first step in a fully automated analysis. We present a fully automated Rankine-Hugoniot analysis code that can scan the ACE science data, find shock candidates, analyze the events, obtain solutions in good agreement with those derived from interactive applications, and dismiss false positive shock candidates on the basis of the conservation equations. The intent is to make this code available to NOAA for use in real-time space weather applications. The code has the added advantage of being able to scan spacecraft data sets to provide shock solutions for use outside real-time applications and can easily be applied to science-quality data sets from other missions. Use of the code for this purpose will also be explored.
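As an illustration of the candidate-detection step that precedes the Rankine-Hugoniot fit, the following minimal sketch flags times where density, speed, and field magnitude all jump between short upstream and downstream windows; the thresholds and synthetic data are assumptions, not the authors' operational code.

```python
# Minimal sketch of the candidate-detection step discussed above: compare short
# upstream/downstream averaging windows and flag samples where density, speed,
# and magnetic field strength all jump by more than a factor threshold, as
# expected for a fast forward shock. Thresholds and data are illustrative; the
# full analysis would then test the Rankine-Hugoniot conditions.
import numpy as np

def shock_candidates(n, v, b, window=30, ratio=1.2):
    """Return sample indices where all three quantities jump by >= `ratio`."""
    hits = []
    for i in range(window, len(n) - window):
        up = slice(i - window, i)
        dn = slice(i, i + window)
        if (n[dn].mean() / n[up].mean() >= ratio and
                v[dn].mean() / v[up].mean() >= ratio and
                b[dn].mean() / b[up].mean() >= ratio):
            hits.append(i)
    return hits

rng = np.random.default_rng(3)
t = 600                                                                # 10 minutes of 1 s samples
n = np.where(np.arange(t) < 300, 5.0, 12.0) + rng.normal(0, 0.3, t)    # cm^-3
v = np.where(np.arange(t) < 300, 400., 520.) + rng.normal(0, 5.0, t)   # km/s
b = np.where(np.arange(t) < 300, 5.0, 9.0) + rng.normal(0, 0.3, t)     # nT
print("candidate indices:", shock_candidates(n, v, b)[:5], "...")
```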
Automated aortic calcification detection in low-dose chest CT images
NASA Astrophysics Data System (ADS)
Xie, Yiting; Htwe, Yu Maw; Padgett, Jennifer; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.
2014-03-01
The extent of aortic calcification has been shown to be a risk indicator for vascular events, including cardiac events. We have developed a fully automated computer algorithm to segment and measure aortic calcification in low-dose, non-contrast, non-ECG-gated chest CT scans. The algorithm first segments the aorta using a pre-computed Anatomy Label Map (ALM). Then, based on the segmented aorta, aortic calcification is detected and measured in terms of the Agatston score, mass score, and volume score. The automated scores are compared with reference scores obtained from manual markings. For aorta segmentation, the aorta is modeled as a series of discrete overlapping cylinders and the aortic centerline is determined using a cylinder-tracking algorithm. Then the aortic surface location is detected using the centerline and a triangular mesh model. The segmented aorta is used as a mask for the detection of aortic calcification. For calcification detection, the image is first filtered, then an elevated threshold of 160 Hounsfield units (HU) is used within the aorta mask region to reduce the effect of noise in low-dose scans, and finally non-aortic calcification voxels (bony structures, calcification in other organs) are eliminated. The remaining candidates are considered true aortic calcification. The computer algorithm was evaluated on 45 low-dose non-contrast CT scans. Using linear regression, the automated Agatston score is 98.42% correlated with the reference Agatston score. The automated mass and volume scores are, respectively, 98.46% and 98.28% correlated with the reference mass and volume scores.
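As an illustration of the scoring described above, the following minimal sketch computes an Agatston-style score on a single masked slice using the 160 HU threshold mentioned for low-dose scans; the lesion-weighting bins follow the conventional Agatston scheme, and the image is a toy example rather than the authors' pipeline.

```python
# Minimal sketch of Agatston-style scoring as discussed above: threshold the
# aorta-masked slice at 160 HU (the elevated low-dose threshold from the
# abstract), label connected lesions, and sum lesion area times a density
# weight. Simplified single-slice illustration, not the authors' pipeline.
import numpy as np
from scipy import ndimage

def density_weight(max_hu):
    """Conventional Agatston weighting by the lesion's peak attenuation."""
    if max_hu >= 400:
        return 4
    if max_hu >= 300:
        return 3
    if max_hu >= 200:
        return 2
    return 1                                    # lesions between the threshold and 199 HU

def agatston_slice(hu, aorta_mask, pixel_area_mm2, threshold=160):
    calc = (hu >= threshold) & aorta_mask
    labels, n = ndimage.label(calc)             # connected calcified lesions
    score = 0.0
    for lesion in range(1, n + 1):
        region = labels == lesion
        score += region.sum() * pixel_area_mm2 * density_weight(hu[region].max())
    return score

hu = np.full((8, 8), 40.0)                      # soft-tissue background
hu[2:4, 2:4] = 320.0                            # one small calcified plaque
aorta = np.ones_like(hu, dtype=bool)            # pretend the whole patch lies inside the aorta
print("Agatston-style score:", agatston_slice(hu, aorta, pixel_area_mm2=0.7))
```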
2012-01-01
Background: High-throughput methods are widely used for strain screening, effectively resulting in binary information regarding high or low productivity. Nevertheless, achieving quantitative and scalable parameters for fast bioprocess development is much more challenging, especially for heterologous protein production. Here, the nature of the foreign protein makes it impossible to predict, for example, the best expression construct, secretion signal peptide, inducer concentration, induction time, temperature, and substrate feed rate in fed-batch operation, to name only a few. Therefore, a high number of systematic experiments are necessary to elucidate the best conditions for heterologous expression of each new protein of interest. Results: To increase the throughput in bioprocess development, we used a microtiter plate based cultivation system (Biolector) which was fully integrated into a liquid-handling platform enclosed in laminar airflow housing. This automated cultivation platform was used for optimization of the secretory production of a cutinase from Fusarium solani pisi with Corynebacterium glutamicum. The online monitoring of biomass, dissolved oxygen, and pH in each of the microtiter plate wells makes it possible to trigger sampling or dosing events with the pipetting robot, allowing reliable selection of the best-performing cutinase producers. In addition, further automated methods such as media optimization and induction profiling were developed and validated. All biological and bioprocess parameters were optimized exclusively at microtiter plate scale and showed perfectly scalable results to 1 L and 20 L stirred-tank bioreactor scale. Conclusions: The optimization of heterologous protein expression in microbial systems currently requires extensive testing of biological and bioprocess engineering parameters. This can be efficiently boosted by using a microtiter plate cultivation setup embedded into a liquid-handling system, providing more throughput by parallelization and automation. Due to improved statistics from replicate cultivations, automated downstream analysis, and scalable process information, this setup has superior performance compared to standard microtiter plate cultivation. PMID:23113930
NASA Astrophysics Data System (ADS)
Lai, S. C.; Baker, A. R.; Schuck, T. J.; van Velthoven, P.; Oram, D. E.; Zahn, A.; Hermann, M.; Weigelt, A.; Slemr, S.; Brenninkmeijer, C. A. M.
2010-05-01
The research project CARIBIC (Civil Aircraft for the Regular Investigation of the atmosphere Based on an Instrumented Container, phase II) is designed to conduct regular, long-term and detailed observations of the free troposphere and UT/LS regions where passenger aircraft happen to cruise. A fully automated measurement container (1.5 tons) was installed onboard an Airbus 340-600 operated by Lufthansa Airlines during regular passenger flights to conduct real-time trace gas and aerosol measurements and to collect aerosol and air samples on a near-monthly basis. During May 2005 - March 2008, CARIBIC observations were performed along the flight tracks of Frankfurt-Guangzhou-Manila. Data were collected in the upper troposphere during a total of 81 flights over the region between South China and the Philippines. Carbon monoxide was used as an indicator to identify pollution events and to assess the regional impacts of fossil fuel burning and biomass/biofuel burning on upper tropospheric air. Five regions, i.e. Northeast Asia, South China, the Indochina Peninsula, India and Indonesia/the Philippines, are identified as the major source regions related to the observed pollution events. The characteristics of the events from these regions are investigated. The contributions of different source categories are also estimated.
Zajíček, Antonín; Fučík, Petr; Kaplická, Markéta; Liška, Marek; Maxová, Jana; Dobiáš, Jakub
2018-04-01
Dynamics of pesticides and their metabolites in drainage waters during baseflow periods and rainfall-runoff events (RREs) were studied from 2014 to 2016 at three small, tile-drained agricultural catchments in the Bohemian-Moravian Highlands, Czech Republic. Drainage systems in this region are typically built on slopes, with a considerable proportion of drainage runoff originating outside the drained area itself. Continuous monitoring was performed by automated samplers, and the event hydrograph was separated using 18O and 2H isotopes and drainage water temperature. Results showed that drainage systems represent a significant pathway for pesticide leaching from agricultural land. Leaching of pesticide metabolites was mainly associated with baseflow and shallow interflow; water from causal precipitation diluted their concentrations. The prerequisites for the leaching of parent compounds were a rainfall-runoff event occurring shortly after spraying and the presence of event water in the runoff. When these conditions were met, pesticide concentrations in drainage water were high and the pesticide load reached several grams in a few hours. The presented results provide new insights into the processes of pesticide movement in small, tile-drained catchments and emphasize the need to incorporate drainage hydrology and flow-triggered sampling into monitoring programmes in larger catchments as well as into environment-conservation policy.
Ellefsen, Kyle L; Settle, Brett; Parker, Ian; Smith, Ian F
2014-09-01
Local Ca(2+) transients such as puffs and sparks form the building blocks of cellular Ca(2+) signaling in numerous cell types. They have traditionally been studied by linescan confocal microscopy, but advances in TIRF microscopy together with improved electron-multiplied CCD (EMCCD) cameras now enable rapid (>500 frames s(-1)) imaging of subcellular Ca(2+) signals with high spatial resolution in two dimensions. This approach yields vastly more information (ca. 1 Gb min(-1)) than linescan imaging, rendering visual identification and analysis of the imaged local events both laborious and subject to user bias. Here we describe a routine to rapidly automate the identification and analysis of local Ca(2+) events. It features an intuitive graphical user interface and runs under Matlab and open-source Python. The underlying algorithm features spatial and temporal noise filtering to reliably detect even small events in the presence of noisy and fluctuating baselines; localizes sites of Ca(2+) release with sub-pixel resolution; facilitates user review and editing of data; and outputs time sequences of fluorescence ratio signals for identified event sites along with Excel-compatible tables listing amplitudes and kinetics of events. Copyright © 2014 Elsevier Ltd. All rights reserved.
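As an illustration of the detect-by-filtering idea, the following minimal sketch smooths a noisy fluorescence trace, estimates baseline noise robustly, and flags threshold crossings as candidate events; the synthetic trace and 4-sigma criterion are assumptions, and the authors' routine additionally performs spatial filtering and sub-pixel localization.

```python
# Minimal sketch of the detection idea described above: temporally smooth a
# noisy fluorescence ratio trace, estimate the baseline noise, and flag samples
# that rise a fixed number of standard deviations above baseline as candidate
# local Ca2+ events. Synthetic single-pixel trace; not the authors' routine.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(4)
n = 2000
trace = rng.normal(0, 0.05, n)                     # baseline dF/F0 noise
for start in (500, 1400):                          # two brief "puff"-like events
    trace[start:start + 40] += 0.4 * np.exp(-np.arange(40) / 15.0)

smooth = gaussian_filter1d(trace, sigma=3)         # temporal noise filtering
noise = 1.4826 * np.median(np.abs(smooth - np.median(smooth)))   # robust sigma (MAD)
above = smooth > 4 * noise                         # 4-sigma detection criterion

onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
print("event onset frames:", onsets)
```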
Muething, S E; Conway, P H; Kloppenborg, E; Lesko, A; Schoettker, P J; Seid, M; Kotagal, U
2010-10-01
To describe how in-depth analysis of adverse events can reveal underlying causes. Triggers for adverse events were developed using the hospital's computerised medical record (naloxone for opiate-related oversedation and administration of a glucose bolus while on insulin for insulin-related hypoglycaemia). Triggers were identified daily. Based on information from the medical record and interviews, a subject expert determined if an adverse drug event had occurred and then conducted a real-time analysis to identify event characteristics. Expert groups, consisting of frontline staff and specialist physicians, examined event characteristics and determined the apparent cause. 30 insulin-related hypoglycaemia events and 34 opiate-related oversedation events were identified by the triggers over 16 and 21 months, respectively. In the opinion of the experts, patients receiving continuous-infusion insulin and those receiving dextrose only via parenteral nutrition were at increased risk for insulin-related hypoglycaemia. Lack of standardisation in insulin-dosing decisions and variation regarding when and how much to adjust insulin doses in response to changing glucose levels were identified as common causes of the adverse events. Opiate-related oversedation events often occurred within 48 h of surgery. Variation in pain management in the operating room and post-anaesthesia care unit was identified by the experts as potential causes. Variations in practice, multiple services writing orders, multidrug regimens and variations in interpretation of patient assessments were also noted as potential contributing causes. Identification of adverse drug events through an automated trigger system, supplemented by in-depth analysis, can help identify targets for intervention and improvement.
Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S
2015-11-01
High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately but throughput is still a major issue and automation is very essential. The throughput is limited, both in terms of sample preparation as well as analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome-aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and functional evolution of the classical DCA towards increasing critically needed throughput. Published by Elsevier B.V.
21 CFR 864.5600 - Automated hematocrit instrument.
Code of Federal Regulations, 2012 CFR
2012-04-01
... measures the packed red cell volume of a blood sample to distinguish normal from abnormal states, such as anemia and erythrocytosis (an increase in the number of red cells). (b) Classification. Class II... § 864.5600 Automated hematocrit instrument. (a) Identification. An automated hematocrit instrument is a...
21 CFR 864.5600 - Automated hematocrit instrument.
Code of Federal Regulations, 2011 CFR
2011-04-01
... measures the packed red cell volume of a blood sample to distinguish normal from abnormal states, such as anemia and erythrocytosis (an increase in the number of red cells). (b) Classification. Class II... § 864.5600 Automated hematocrit instrument. (a) Identification. An automated hematocrit instrument is a...
21 CFR 864.5600 - Automated hematocrit instrument.
Code of Federal Regulations, 2014 CFR
2014-04-01
... measures the packed red cell volume of a blood sample to distinguish normal from abnormal states, such as anemia and erythrocytosis (an increase in the number of red cells). (b) Classification. Class II... § 864.5600 Automated hematocrit instrument. (a) Identification. An automated hematocrit instrument is a...
21 CFR 864.5600 - Automated hematocrit instrument.
Code of Federal Regulations, 2013 CFR
2013-04-01
... measures the packed red cell volume of a blood sample to distinguish normal from abnormal states, such as anemia and erythrocytosis (an increase in the number of red cells). (b) Classification. Class II... § 864.5600 Automated hematocrit instrument. (a) Identification. An automated hematocrit instrument is a...
21 CFR 864.5600 - Automated hematocrit instrument.
Code of Federal Regulations, 2010 CFR
2010-04-01
... measures the packed red cell volume of a blood sample to distinguish normal from abnormal states, such as anemia and erythrocytosis (an increase in the number of red cells). (b) Classification. Class II... § 864.5600 Automated hematocrit instrument. (a) Identification. An automated hematocrit instrument is a...
Closing the loop in ICU decision support: physiologic event detection, alerts, and documentation.
Norris, P. R.; Dawant, B. M.
2001-01-01
Automated physiologic event detection and alerting is a challenging task in the ICU. Ideally, care providers should be alerted only when events are clinically significant and there is opportunity for corrective action. However, the concepts of clinical significance and opportunity are difficult to define in automated systems, and the effectiveness of alerting algorithms is difficult to measure. This paper describes recent efforts on the Simon project to capture information from ICU care providers about patient state and therapy in response to alerts, in order to assess the value of event definitions and progressively refine alerting algorithms. Event definitions for intracranial pressure and cerebral perfusion pressure were studied by implementing a reliable system to automatically deliver alerts to clinical users' alphanumeric pagers, and to capture associated documentation about patient state and therapy when the alerts occurred. During a 6-month test period in the trauma ICU at Vanderbilt University Medical Center, 530 alerts were detected in 2280 hours of data spanning 14 patients. Clinical users electronically documented 81% of these alerts as they occurred. Retrospectively classifying documentation based on therapeutic actions taken, or reasons why actions were not taken, provided useful information about ways to potentially improve event definitions and enhance system utility. PMID:11825238
NASA Technical Reports Server (NTRS)
Zeigler, Bernard P.
1989-01-01
It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotic and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
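As a toy illustration of the event-based control paradigm described above, the following minimal sketch has a controller that, after issuing a command, expects a confirming sensor event within a model-derived time window and treats a missing or late confirmation as a fault; the commands, windows, and times are hypothetical, and this is not a DEVS implementation.

```python
# Minimal sketch of the event-based control paradigm described above: after
# each command the controller expects a confirming sensor event inside a time
# window predicted by its model of the plant; a missing or late confirmation
# triggers fault handling. Illustrative toy only, not a DEVS tool.
from dataclasses import dataclass

@dataclass
class Expectation:
    event: str
    earliest: float
    latest: float

class EventBasedController:
    def __init__(self):
        self.pending = []

    def command(self, name, issued_at, window):
        # the model predicts the confirming event and its time window
        self.pending.append(Expectation(f"{name}_done", issued_at + window[0], issued_at + window[1]))
        print(f"{issued_at:5.1f}s  issue {name}, expect {name}_done in [{window[0]}, {window[1]}]s")

    def sensor_event(self, event, at):
        for exp in list(self.pending):
            if exp.event == event and exp.earliest <= at <= exp.latest:
                self.pending.remove(exp)
                print(f"{at:5.1f}s  {event} confirmed on time")
                return
        print(f"{at:5.1f}s  {event} unexpected or outside its window -> fault handling")

    def check_timeouts(self, now):
        for exp in [e for e in self.pending if now > e.latest]:
            self.pending.remove(exp)
            print(f"{now:5.1f}s  {exp.event} never arrived -> fault handling")

ctl = EventBasedController()
ctl.command("move_arm", issued_at=0.0, window=(1.0, 3.0))
ctl.sensor_event("move_arm_done", at=2.2)      # on time
ctl.command("close_gripper", issued_at=3.0, window=(0.5, 1.5))
ctl.check_timeouts(now=5.0)                    # confirmation missed -> fault
```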
Freundlich, Robert E; Grondin, Louise; Tremper, Kevin K; Saran, Kelly A; Kheterpal, Sachin
2012-10-01
In this case report, the authors present an adverse event possibly caused by miscommunication among three separate medical teams at their hospital. The authors then discuss the hospital's root cause analysis and its proposed solutions, focusing on the subsequent hospital-wide implementation of an automated electronic reminder for abnormal laboratory values that may have helped to prevent similar medical errors.
NASA Astrophysics Data System (ADS)
Pojmański, G.
2004-10-01
The All Sky Automated Survey is a low-cost project, the ultimate goal of which is the detection and investigation of any kind of photometric variability present all over the sky. The current system consists of 4 instruments covering 36x36, 9x9 (2 units) and 2x2 degrees, equipped with 2Kx2K CCDs, V,R,I standard filters and custom-made automated mounts. All are working at Las Campanas Observatory, Chile in fully automated mode. In the ASAS-3 phase of the project we have been taking data at a rate of 1 measurement per 1-3 days for all available objects brighter than V=14, located south of δ=+28 deg. So far over 2 TB of images have been collected and analyzed, leading to a photometric light curve catalog of over 10 million sources. A preliminary search for variability revealed over 40,000 bright, variable sources (over 75% not previously known). Direct access to the data is available over the Internet: http://www.astrouw.edu.pl/~gp/asas. At present the ASAS Alert System is being tested. Events like outbursts of CVs or novae, eclipses, etc. are reported within a few minutes after first detection. Due to the large number of artifacts in these data, raw events require verification, which can take up to 24 hours.
Meyer, Michael L; Huey, Greg M
2006-05-01
This study utilized telemetric systems to sample microbes and pathogens in forest, burned forest, rangeland, and urban watersheds to assess surface water quality in northern New Mexico. Four sites included remote mountainous watersheds, prairie rangelands, and a small urban area. The telemetric system was linked to dataloggers with automated event monitoring equipment to monitor discharge, turbidity, electrical conductivity, water temperature, and rainfall during base flow and storm events. Site data stored in dataloggers was uploaded to one of three types of telemetry: 1) radio in rangeland and urban settings; 2) a conventional phone/modem system with a modem positioned at the urban/forest interface; and 3) a satellite system used in a remote mountainous burned forest watershed. The major variables affecting selection of each system were site access, distance, technology, and cost. The systems were compared based on operation and cost. Utilization of telecommunications systems in this varied geographic area facilitated the gathering of hydrologic and water quality data on a timely basis.
Mahardika, G N K; Dibia, N; Budayanti, N S; Susilawathi, N M; Subrata, K; Darwinata, A E; Wignall, F S; Richt, J A; Valdivia-Granda, W A; Sudewi, A A R
2014-06-01
The emergence of human and animal rabies in Bali since November 2008 has attracted local, national and international interest. The potential origin and time of introduction of rabies virus to Bali are described. The nucleoprotein (N) gene of rabies virus from dog brain and human clinical specimens was sequenced using an automated DNA sequencer. Phylogenetic inference with Bayesian Markov Chain Monte Carlo (MCMC) analysis using the Bayesian Evolutionary Analysis by Sampling Trees (BEAST) v. 1.7.5 software confirmed that the outbreak of rabies in Bali was caused by an Indonesian lineage virus following a single introduction. The ancestor of the Bali viruses was the descendant of a virus from Kalimantan. Contact tracing showed that the event most likely occurred in early 2008. The introduction of rabies into a large unvaccinated dog population in Bali clearly demonstrates the risk of disease transmission for government agencies and should lead to increased preparedness and efforts for sustained risk reduction to prevent such events from occurring in future.
NASA Astrophysics Data System (ADS)
Blaber, Elizabeth; Dvorochkin, Natalya; Almeida, Eduardo; Fitzpatrick, Garret; Ellingson, Lance; Mitchell, Sarah; Yang, Anthony; Kosnik, Cristine; Rayl, Nicole; Cannon, Tom; Austin, Edward; Sato, Kevin
With the recent call by the 2011 Decadal Report and the 2010 Space Biosciences Roadmap for the International Space Station (ISS) to be used as a National Laboratory for scientific research, there is now a need for new laboratory instruments on ISS to enable such research to occur. The Bioculture System supports the extended culturing of multiple cell types and microbiological specimens. It consists of a docking station that carries ten independent incubation units or 'Cassettes'. Each Cassette contains a cooling chamber (5 °C) for temperature-sensitive solutions and samples, or long-duration fluid and sample storage, as well as an incubation chamber (ambient up to 42 °C). Each Cassette houses an independent fluidics system comprised of a biochamber, medical-grade fluid tubing, medium warming module, oxygenation module, fluid pump, and sixteen solenoid valves for automated biochamber injection or sampling. The Bioculture System provides the user with the ability to select the incubation temperature, fluid flow rate, and automated biochamber sampling or injection events for each separate Cassette. Furthermore, the ISS crew can access the biochamber, media bag, and accessory bags on-orbit using the Microgravity Science Glovebox. The Bioculture System also permits initiation of cultures, subculturing, injection of compounds, and removal of samples for on-orbit processing using ISS facilities. The Bioculture System therefore provides a unique opportunity for the study of stem cells and other cell types in space. The first validation flight of the Bioculture System will be conducted on SpaceX5, consisting of 8 Cassettes and lasting for 30-37 days. During this flight we plan to culture two different mammalian cell types in bioreactors: a mouse osteocytic-like cell line, and human induced pluripotent stem cell (iPS)-derived cardiomyocytes. Specifically, the osteocytic line will enable the study of a type of cell that has been flown on the Bioculture System's predecessor, the Cell Culture Module, whilst demonstrating the Bioculture System's bead-based sub-culturing capabilities, automated sampling and fixation, manual sample removal/storage by ISS crew members, and whole-bioreactor fixation. These activities will enable, for the first time, the long-duration culture of a proliferative cell line. Furthermore, these activities will facilitate genetic and proteomic analysis of these cells at several time points to determine cell health throughout the culture period. The long-duration culture of iPS-derived cardiomyocytes will afford us the capability to assess the maturation and formation of a cardiac-like tissue in microgravity conditions. Automated sampling of this culture immediately prior to un-berthing from the ISS will enable genetic analysis of the mature cardiomyocyte tissue, whilst still enabling the return of live cultures for analysis of cardiomyocyte morphology, contractility, and viability in response to spaceflight. This validation flight will demonstrate the new functional capabilities of the Bioculture System, and the System will enable, for the first time, the study of the response of stem cells and other cell lineages to long-duration spaceflight exposure, whilst enabling normal cell culturing techniques to be automatically conducted on ISS.
Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve
2018-04-03
In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, the sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, there can be variability introduced by different analysts, and in some cases, the integrity of the sample can be compromised during handling. While there are commercial instruments available for on-line monitoring with HPLC, they lack capabilities in many key areas. Some do not provide integration of the sampling and analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations available for sample processing and no option for workflow customizability. This work describes development of a microfluidic automated program (MAP) which fully automates the sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring that allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.
Non-Contact Conductivity Measurement for Automated Sample Processing Systems
NASA Technical Reports Server (NTRS)
Beegle, Luther W.; Kirby, James P.
2012-01-01
A new method has been developed for monitoring and control of automated sample processing and preparation, especially focusing on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement for monitoring sample preparation will not only facilitate automation of the sample preparation and processing, but will also act as a way to optimize the operational time and use of consumables.
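As an illustration of conductivity-triggered sequencing, the following minimal sketch advances a desalting sequence only when the inlet and outlet probes report the target conductivity state; the setpoints, readings, and step names are assumptions, not the apparatus described above.

```python
# Minimal sketch of the conductivity-triggered sequencing described above: the
# controller advances to the next sample-preparation step only when the inlet
# and outlet probes report that the target conductivity state has been reached.
# Setpoints, readings, and step names are illustrative assumptions.
def reached(state, inlet_uS, outlet_uS, low=10.0, high=500.0):
    """True when both probes agree the system is in the requested state."""
    if state == "low":                                # neutral / flushed with DI water
        return inlet_uS < low and outlet_uS < low
    return inlet_uS > high and outlet_uS > high       # acidified or basic -> high conductivity

sequence = [("flush_DI_water", "low"), ("load_acid", "high"),
            ("flush_DI_water", "low"), ("elute_base", "high"), ("flush_DI_water", "low")]

readings = iter([(350, 400), (8, 9), (900, 850), (7, 6), (1200, 1100), (9, 5), (4, 3)])

step = 0
for inlet, outlet in readings:
    if step >= len(sequence):
        break
    name, target = sequence[step]
    if reached(target, inlet, outlet):
        print(f"{name}: target '{target}' conductivity reached -> start next step")
        step += 1
    else:
        print(f"{name}: waiting (inlet={inlet} uS/cm, outlet={outlet} uS/cm)")
```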
Kumar, Vineet
2011-12-01
The grain size statistics, commonly derived from the grain map of a material sample, are important microstructure characteristics that greatly influence its properties. The grain map for nanomaterials is usually obtained manually by visual inspection of the transmission electron microscope (TEM) micrographs because automated methods do not perform satisfactorily. While the visual inspection method provides reliable results, it is a labor intensive process and is often prone to human errors. In this article, an automated grain mapping method is developed using TEM diffraction patterns. The presented method uses wide angle convergent beam diffraction in the TEM. The automated technique was applied on a platinum thin film sample to obtain the grain map and subsequently derive grain size statistics from it. The grain size statistics obtained with the automated method were found in good agreement with the visual inspection method.
Pitman, R.W.; Conley, W.R. Jr.
1962-12-01
An automated system for adding clarifying chemicals to water in a water treatment plant is described. To a sample of the floc suspension polyacrylamide or similar filter aid chemicals are added, and the sample is then put through a fast filter. The resulting filtrate has the requisite properties for monitoring in an optical turbidimeter to control the automated system. (AEC)
Applications of Text Analysis Tools for Spoken Response Grading
ERIC Educational Resources Information Center
Crossley, Scott; McNamara, Danielle
2013-01-01
This study explores the potential for automated indices related to speech delivery, language use, and topic development to model human judgments of TOEFL speaking proficiency in second language (L2) speech samples. For this study, 244 transcribed TOEFL speech samples taken from 244 L2 learners were analyzed using automated indices taken from…
A home automation based environmental sound alert for people experiencing hearing loss.
Mielke, Matthias; Bruck, Rainer
2016-08-01
Different assistive technologies are available for people experiencing hearing loss (i.e. deaf, deafened, and hard of hearing). Besides the well-known hearing aid, devices for detection of sound events that occur at home or at work (e.g. doorbell, telephone) are available. Despite the technological progress in the last years and the resulting new possibilities, the basic functions and concepts of such devices have not changed. The user still needs special assistive technology that is bound to the home or work environment. In this contribution a new concept for awareness of events in buildings is presented. In contrast to state-of-the-art assistive devices, it makes use of modern Information and Communication and home automation technology, and thus offers the prospect of cheap implementation and higher comfort for the user. In this concept, events are indicated by notifications that are sent over a Bluetooth Low Energy mesh network from a source to the user. The notifications are received by the user's smartwatch and the event is indicated by vibration and an icon representing its source.
Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.
Greenlee, Eric T; DeLucia, Patricia R; Newton, David C
2018-06-01
The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task. Drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill equipped to detect. Given the similarity with traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes. During that time, their task was to monitor the roadway for roadway hazards. As predicted, hazard detection rate declined precipitously, and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing and it is a challenge to maintain task engagement. Monitoring the roadway for potential hazards during automated driving results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.
Antón, Alfonso; Pazos, Marta; Martín, Belén; Navero, José Manuel; Ayala, Miriam Eleonora; Castany, Marta; Martínez, Patricia; Bardavío, Javier
2013-01-01
To assess sensitivity, specificity, and agreement among automated event analysis, automated trend analysis, and expert evaluation to detect glaucoma progression. This was a prospective study that included 37 eyes with a follow-up of 36 months. All had glaucomatous disks and fields and performed reliable visual fields every 6 months. Each series of fields was assessed with 3 different methods: subjective assessment by 2 independent teams of glaucoma experts, glaucoma/guided progression analysis (GPA) event analysis, and GPA (visual field index-based) trend analysis. Kappa agreement coefficient between methods and sensitivity and specificity for each method using expert opinion as gold standard were calculated. The incidence of glaucoma progression was 16% to 18% in 3 years but only 3 cases showed progression with all 3 methods. Kappa agreement coefficient was high (k=0.82) between subjective expert assessment and GPA event analysis, and only moderate between these two and GPA trend analysis (k=0.57). Sensitivity and specificity for GPA event and GPA trend analysis were 71% and 96%, and 57% and 93%, respectively. The 3 methods detected similar numbers of progressing cases. The GPA event analysis and expert subjective assessment showed high agreement between them and moderate agreement with GPA trend analysis. In a period of 3 years, both methods of GPA analysis offered high specificity, event analysis showed 83% sensitivity, and trend analysis had a 66% sensitivity.
Automated fluid analysis apparatus and techniques
Szecsody, James E.
2004-03-16
An automated device that couples a pair of differently sized sample loops with a syringe pump and a source of degassed water. A fluid sample is mounted at an inlet port and delivered to the sample loops. A selected sample from the sample loops is diluted in the syringe pump with the degassed water and fed to a flow through detector for analysis. The sample inlet is also directly connected to the syringe pump to selectively perform analysis without dilution. The device is airtight and used to detect oxygen-sensitive species, such as dithionite in groundwater following a remedial injection to treat soil contamination.
Automated 2D shoreline detection from coastal video imagery: an example from the island of Crete
NASA Astrophysics Data System (ADS)
Velegrakis, A. F.; Trygonis, V.; Vousdoukas, M. I.; Ghionis, G.; Chatzipavlis, A.; Andreadis, O.; Psarros, F.; Hasiotis, Th.
2015-06-01
Beaches are both sensitive and critical coastal system components as they: (i) are vulnerable to coastal erosion (due to e.g. wave regime changes and the short- and long-term sea level rise) and (ii) form valuable ecosystems and economic resources. In order to identify/understand the current and future beach morphodynamics, effective monitoring of the beach spatial characteristics (e.g. the shoreline position) at adequate spatio-temporal resolutions is required. In this contribution we present the results of a new, fully-automated detection method of the (2-D) shoreline positions using high resolution video imaging from a Greek island beach (Ammoudara, Crete). A fully-automated feature detection method was developed/used to monitor the shoreline position in geo-rectified coastal imagery obtained through a video system set to collect 10 min videos every daylight hour with a sampling rate of 5 Hz, from which snapshot, time-averaged (TIMEX) and variance images (SIGMA) were generated. The developed coastal feature detector is based on a very fast algorithm using a localised kernel that progressively grows along the SIGMA or TIMEX digital image, following the maximum backscatter intensity along the feature of interest; the detector results were found to compare very well with those obtained from a semi-automated 'manual' shoreline detection procedure. The automated procedure was tested on video imagery obtained from the eastern part of Ammoudara beach in two 5-day periods, a low wave energy period (6-10 April 2014) and a high wave energy period (1-5 November 2014). The results showed that, during the high wave energy event, there were much higher levels of shoreline variance, which, however, appeared to be unevenly distributed along the shoreline in a pattern similar to that of the low wave energy period. Shoreline variance 'hot spots' were found to be related to the presence/architecture of an offshore submerged shallow beachrock reef, found at a distance of 50-80 m from the shoreline. Hydrodynamic observations during the high wave energy period showed (a) that there is very significant wave energy attenuation by the offshore reef and (b) the generation of significant longshore and rip flows. The study results suggest that the developed methodology can provide a fast, powerful and efficient beach monitoring tool, particularly if combined with pertinent hydrodynamic observations.
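As a rough illustration of the image products named above (a minimal sketch, not the authors' code), a time-averaged TIMEX image and a variance SIGMA image can be formed from a stack of video frames; the frame count, resolution, and grayscale assumption below are placeholders.

```python
# Minimal sketch, assuming grayscale frames of shape (n_frames, height, width);
# the 5 Hz / 10 min acquisition is mimicked with a small synthetic stack.
import numpy as np

def timex_and_sigma(frames):
    stack = np.asarray(frames, dtype=float)
    timex = stack.mean(axis=0)   # time-averaged exposure image (TIMEX)
    sigma = stack.std(axis=0)    # per-pixel intensity variability (SIGMA)
    return timex, sigma

rng = np.random.default_rng(0)
frames = rng.uniform(0, 255, size=(300, 60, 80))   # placeholder video stack
timex, sigma = timex_and_sigma(frames)
print(timex.shape, sigma.shape)
```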
Studying Regional Wave Source Time Functions Using A Massive Automated EGF Deconvolution Procedure
NASA Astrophysics Data System (ADS)
Xie, J.; Schaff, D. P.
2010-12-01
Reliably estimated source time functions (STF) from high-frequency regional waveforms, such as Lg, Pn and Pg, provide important input for seismic source studies, explosion detection, and minimization of parameter trade-off in attenuation studies. The empirical Green’s function (EGF) method can be used for estimating STF, but it requires a strict recording condition. Waveforms from pairs of events that are similar in focal mechanism, but different in magnitude must be on-scale recorded on the same stations for the method to work. Searching for such waveforms can be very time consuming, particularly for regional waves that contain complex path effects and have reduced S/N ratios due to attenuation. We have developed a massive, automated procedure to conduct inter-event waveform deconvolution calculations from many candidate event pairs. The procedure automatically evaluates the “spikiness” of the deconvolutions by calculating their “sdc”, which is defined as the peak divided by the background value. The background value is calculated as the mean absolute value of the deconvolution, excluding 10 s around the source time function. When the sdc values are about 10 or higher, the deconvolutions are found to be sufficiently spiky (pulse-like), indicating similar path Green’s functions and good estimates of the STF. We have applied this automated procedure to Lg waves and full regional wavetrains from 989 M ≥ 5 events in and around China, calculating about a million deconvolutions. Of these we found about 2700 deconvolutions with sdc greater than 9, which, if having a sufficiently broad frequency band, can be used to estimate the STF of the larger events. We are currently refining our procedure, as well as the estimated STFs. We will infer the source scaling using the STFs. We will also explore the possibility that the deconvolution procedure could complement cross-correlation in a real time event-screening process.
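For illustration only (not the authors' code), the "sdc" spikiness measure described above can be computed as the deconvolution peak over the mean absolute background, excluding a window around the pulse; centring the 10 s exclusion window on the peak and the sample interval used below are assumptions.

```python
# Minimal sketch of the peak-to-background "sdc" measure described above.
import numpy as np

def sdc(deconv, dt, exclude_s=10.0):
    """deconv: deconvolved trace (1-D array); dt: sample interval in seconds."""
    deconv = np.asarray(deconv, dtype=float)
    peak_idx = int(np.argmax(np.abs(deconv)))
    half = int(round(exclude_s / dt / 2))              # half-width of exclusion window
    mask = np.ones(deconv.size, dtype=bool)
    mask[max(0, peak_idx - half):peak_idx + half + 1] = False
    background = np.mean(np.abs(deconv[mask]))         # mean absolute background level
    return np.abs(deconv[peak_idx]) / background

# Synthetic spike plus noise; values of roughly 10 or more would count as pulse-like.
trace = np.zeros(2000)
trace[1000] = 1.0
trace += 0.01 * np.random.default_rng(0).normal(size=2000)
print(round(sdc(trace, dt=0.05), 1))
```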
Home, automated office, and conventional office blood pressure as predictors of cardiovascular risk.
Andreadis, Emmanuel A; Papademetriou, Vasilios; Geladari, Charalampia V; Kolyvas, George N; Angelopoulos, Epameinondas T; Aronis, Konstantinos N
2017-03-01
Automated office blood pressure (AOBP) has recently been shown to closely predict cardiovascular (CV) events in the elderly. Home blood pressure (HBP) has also been accepted as a valuable method in the prediction of CV disease. This study aimed to compare conventional office BP (OBP), HBP, and AOBP in order to evaluate their value in predicting CV events and deaths in hypertensives. We assessed 236 initially treatment-naïve hypertensives, examined between 2009 and 2013. The end points were any CV and non-CV event including mortality, myocardial infarction, coronary heart disease, hospitalization for heart failure, severe arrhythmia, stroke, and intermittent claudication. We fitted proportional hazards models using the different modalities as predictors and evaluated their predictive performance using three metrics: time-dependent receiver operating characteristics curves, Akaike's Information Criterion, and Harrell's C-index. After a mean follow-up of 7 years, 23 participants (39% women) had experienced ≥1 CV event. Conventional office systolic (hazard ratio [HR] per 1 mm Hg increase in BP, 1.028; 95% confidence interval [CI], 1.009-1.048), automated office systolic (HR per 1 mm Hg increase in BP, 1.031; 95% CI, 1.008-1.054), and home systolic (HR, 1.025; 95% CI, 1.003-1.047) were predictive of CV events. All systolic BP measurements were predictive after adjustment for other CV risk factors (P < .05). The predictive performance of the different modalities was similar. Conventional OBP was significantly higher than AOBP and average HBP. AOBP predicted CV events as well as OBP and HBP did. It appears to be comparable to HBP in the assessment of CV risk, and therefore, its introduction into guidelines and clinical practice as the reference method for assessing BP in the office seems reasonable after verification of these findings by randomized trials. Copyright © 2017 American Society of Hypertension. All rights reserved.
Mischnik, Alexander; Mieth, Markus; Busch, Cornelius J; Hofer, Stefan; Zimmermann, Stefan
2012-08-01
Automation of plate streaking is ongoing in clinical microbiological laboratories, but evaluation for routine use is mostly open. In the present study, the recovery of microorganisms from the Previ Isola system plated polyurethane (PU) swab samples is compared to manually plated control viscose swab samples from wounds according to the CLSI procedure M40-A (quality control of microbiological transport systems). One hundred twelve paired samples (224 swabs) were analyzed. In 80/112 samples (71%), concordant culture results were obtained with the two methods. In 32/112 samples (29%), CFU recovery of microorganisms from the two methods was discordant. In 24 (75%) of the 32 paired samples with a discordant result, Previ Isola plated PU swabs were superior. In 8 (25%) of the 32 paired samples with a discordant result, control viscose swabs were superior. The quality of colony growth on culture media for further investigations was superior with Previ Isola inoculated plates compared to manual plating techniques. Gram stain results were concordant between the two methods in 62/112 samples (55%). In 50/112 samples (45%), the results of Gram staining were discordant between the two methods. In 34 (68%) of the 50 paired samples with discordant results, Gram staining of PU swabs was superior to that of control viscose swabs. In 16 (32%) of the 50 paired samples, Gram staining of control viscose swabs was superior to that of PU swabs. We report the first clinical evaluation of Previ Isola automated specimen inoculation for wound swab samples. This study suggests that use of an automated specimen inoculation system has good results with regard to CFU recovery, quality of Gram staining, and accuracy of diagnosis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertesz, Vilmos; Van Berkel, Gary J
A fully automated liquid extraction-based surface sampling system utilizing a commercially available autosampler coupled to high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) detection is reported. Discrete spots selected for droplet-based sampling and automated sample queue generation for both the autosampler and MS were enabled by using in-house developed software. In addition, co-registration of spatially resolved sampling position and HPLC-MS information to generate heatmaps of compounds monitored for subsequent data analysis was also available in the software. The system was evaluated with whole-body thin tissue sections from propranolol dosed rat. The hands-free operation of the system was demonstrated by creating heatmaps of the parent drug and its hydroxypropranolol glucuronide metabolites with 1 mm resolution in the areas of interest. The sample throughput was approximately 5 min/sample defined by the time needed for chromatographic separation. The spatial distributions of both the drug and its metabolites were consistent with previous studies employing other liquid extraction-based surface sampling methodologies.
Comparing Stream DOC Fluxes from Sensor- and Sample-Based Approaches
NASA Astrophysics Data System (ADS)
Shanley, J. B.; Saraceno, J.; Aulenbach, B. T.; Mast, A.; Clow, D. W.; Hood, K.; Walker, J. F.; Murphy, S. F.; Torres-Sanchez, A.; Aiken, G.; McDowell, W. H.
2015-12-01
DOC transport by streamwater is a significant flux that does not consistently show up in ecosystem carbon budgets. In an effort to quantify stream DOC flux, we analyzed three to four years of high-frequency in situ fluorescing dissolved organic matter (FDOM) concentrations and turbidity measured by optical sensors at the five diverse forested and/or alpine headwater sites of the U.S. Geological Survey (USGS) Water, Energy, and Biogeochemical Budgets (WEBB) program. FDOM serves as a proxy for DOC. We also took discrete samples over a range of hydrologic conditions, using both manual weekly and automated event-based sampling. After compensating FDOM for temperature effects and turbidity interference (which was successful even at the high-turbidity Luquillo, PR site), we evaluated the DOC-FDOM relation based on discrete sample DOC analyses matched to corrected FDOM at the time of sampling. FDOM was a moderately robust predictor of DOC, with r2 from 0.60 to more than 0.95 among sites. We then formed continuous DOC time series by two independent approaches: (1) DOC predicted from FDOM; and (2) the composite method, based on modeled DOC from regression on stream discharge, season, air temperature, and time, forcing the model to observations and adjusting modeled concentrations between observations by linearly-interpolated model residuals. DOC flux from each approach was then computed directly as concentration times discharge. DOC fluxes based on the sensor approach were consistently greater than the sample-based approach. At Loch Vale, CO (2.5 years) and Panola Mountain, GA (1 year), the difference was 5-17%. At Sleepers River, VT (3 years), preliminary differences were greater than 20%. The difference is driven by the highest events, but we are investigating these results further. We will also present comparisons from Luquillo, PR, and Allequash Creek, WI. The higher sensor-based DOC fluxes could result from their accuracy during hysteresis, which is difficult to model. In at least one case the higher sensor-based DOC flux was linked to an unsampled event outside the range of the concentration model. Sensors require upkeep and vigilance with the data, but have the potential to yield more accurate fluxes than sample-based approaches.
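As a hedged illustration of the flux calculation mentioned above (concentration times discharge integrated over time), the sketch below uses an assumed linear FDOM-to-DOC regression, a fixed 15-minute time step, and invented values; none of these come from the study.

```python
# Minimal sketch: sum C*Q over the record (mg/L * m3/s * s = grams), return kilograms.
import numpy as np

def doc_flux_kg(doc_mg_per_l, discharge_m3_per_s, dt_s):
    grams = np.sum(np.asarray(doc_mg_per_l) * np.asarray(discharge_m3_per_s) * dt_s)
    return grams / 1000.0

dt_s = 15 * 60                                 # 15-minute time steps for one day
fdom = np.linspace(20, 60, 96)                 # corrected FDOM (arbitrary units)
doc = 0.08 * fdom + 0.5                        # assumed site-specific FDOM-DOC regression
discharge = np.full(96, 0.35)                  # streamflow in m3/s (placeholder)
print(round(doc_flux_kg(doc, discharge, dt_s), 2), "kg DOC")
```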
FRAME (Force Review Automation Environment): MATLAB-based AFM data processor.
Partola, Kostyantyn R; Lykotrafitis, George
2016-05-03
Data processing of force-displacement curves generated by atomic force microscopes (AFMs) for elastic moduli and unbinding event measurements is very time consuming and susceptible to user error or bias. There is an evident need for consistent, dependable, and easy-to-use AFM data processing software. We have developed an open-source software application, the force review automation environment (or FRAME), that provides users with an intuitive graphical user interface, automating data processing, and tools for expediting manual processing. We did not observe a significant difference between manually processed and automatically processed results from the same data sets. Copyright © 2016 Elsevier Ltd. All rights reserved.
Automated spectral and timing analysis of AGNs
NASA Astrophysics Data System (ADS)
Munz, F.; Karas, V.; Guainazzi, M.
2006-12-01
We have developed an autonomous script that helps the user to automate the XMM-Newton data analysis for the purposes of extensive statistical investigations. We test this approach by examining X-ray spectra of bright AGNs pre-selected from the public database. The event lists extracted in this process were studied further by constructing their energy-resolved Fourier power-spectrum density. This analysis combines energy distributions, light-curves, and their power-spectra and it proves useful to assess the variability patterns present in the data. As another example, an automated search based on the XSPEC package was used to reveal emission features in the 2-8 keV range.
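As a hedged sketch of one step of such a pipeline (not the authors' script), a power spectrum can be built by binning an event list from a chosen energy band into a count-rate light curve and taking its Fourier transform; the bin size, arrival-time data, and band selection below are placeholders.

```python
# Minimal sketch: light curve from photon arrival times, then its power spectrum.
import numpy as np

def power_spectrum(event_times, tbin):
    """event_times: photon arrival times (s) in one energy band; tbin: bin size (s)."""
    t = np.asarray(event_times, dtype=float)
    nbins = int(np.ceil((t.max() - t.min()) / tbin))
    counts, _ = np.histogram(t, bins=nbins, range=(t.min(), t.min() + nbins * tbin))
    rate = counts / tbin
    freqs = np.fft.rfftfreq(nbins, d=tbin)
    power = np.abs(np.fft.rfft(rate - rate.mean())) ** 2
    return freqs[1:], power[1:]                       # drop the zero-frequency term

# Usage: select events with energies in, e.g., the 2-8 keV band, then call the function.
rng = np.random.default_rng(2)
times = np.sort(rng.uniform(0, 1000, size=5000))      # placeholder arrival times (s)
f, p = power_spectrum(times, tbin=10.0)
print(f[:3], p[:3])
```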
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riscassi, Ami L; Miller, Carrie L; Brooks, Scott C
Mercury (Hg) and methylmercury (MeHg) concentrations in streamwater can vary on short timescales (hourly or less) during storm flow and on a diel cycle; the frequency and timing of sampling required to accurately characterize these dynamics may be difficult to accomplish manually. Automated sampling can assist in sample collection; however, use has been limited for Hg and MeHg analysis due to stability concerns of trace concentrations during extended storage times. We examined the viability of using automated samplers with disposable low-density polyethylene (LDPE) sample bags to collect industrially contaminated streamwater for unfiltered and filtered Hg and MeHg analysis. Specifically, we investigated the effect of holding times ranging from hours to days on streamwater collected during baseflow and storm flow. Unfiltered and filtered Hg and MeHg concentrations decreased with increases in time prior to sample processing; holding times of 24 hours or less resulted in concentration changes (mean 11 ± 7% different) similar to variability in duplicates collected manually during analogous field conditions (mean 7 ± 10% different). Comparisons of samples collected with manual and automated techniques throughout a year for a wide range of stream conditions were also found to be similar to differences observed between duplicate grab samples. These results demonstrate that automated sampling into LDPE bags with holding times of 24 hours or less can be effectively used to collect streamwater for Hg and MeHg analysis, and encourage the testing of these materials and methods for implementation in other aqueous systems where high-frequency sampling is warranted.
Development and evaluation of a water level proportional water sampler
NASA Astrophysics Data System (ADS)
Schneider, P.; Lange, A.; Doppler, T.
2013-12-01
We developed and adapted a new type of sampler for time-integrated, water level proportional water quality sampling (e.g. nutrients, contaminants and stable isotopes). Our samplers are designed for sampling small to mid-size streams based on the law of Hagen-Poiseuille, where a capillary (or a valve) limits the sampling aliquot by reducing the air flux out of a submersed plastic (HDPE) sampling container. They are good alternatives to battery-operated automated water samplers when working in remote areas, or at streams that are characterized by pronounced daily discharge variations such as glacier streams. We evaluated our samplers against standard automated water samplers (ISCO 2900 and ISCO 6712) during the snowmelt in the Black Forest and the Alps and tested them in remote glacial catchments in Iceland, Switzerland and Kyrgyzstan. The results clearly showed that our samplers are an adequate tool for time-integrated, water level proportional water sampling at remote test sites, as they do not need batteries, are relatively inexpensive, lightweight, and compact. They are well suited for headwater streams - especially when sampling for stable isotopes - as the sampled water is perfectly protected against evaporation. Moreover, our samplers have a reduced risk of icing in cold environments, as they are installed submersed in water, whereas automated samplers (typically installed outside the stream) may get clogged due to icing of hoses. Based on this study, we find these samplers to be an adequate replacement for automated samplers when time-integrated sampling or solute load estimates are the main monitoring tasks.
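For illustration of the sampling principle named above, the Hagen-Poiseuille law gives the volumetric flow of air escaping through the capillary, which caps the rate at which water can enter the submersed container; the capillary dimensions, air viscosity, and pressure difference below are illustrative assumptions, not values from the study.

```python
# Minimal sketch: volumetric flow rate Q = pi * r^4 * dP / (8 * mu * L), in m^3/s.
import math

def hagen_poiseuille_flow(radius_m, length_m, delta_p_pa, viscosity_pa_s):
    return math.pi * radius_m**4 * delta_p_pa / (8.0 * viscosity_pa_s * length_m)

q = hagen_poiseuille_flow(radius_m=0.1e-3, length_m=0.05,
                          delta_p_pa=500.0, viscosity_pa_s=1.8e-5)
print(f"{q * 1e9:.1f} mm^3/s of air escapes, limiting the sampling aliquot")
```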
2009-09-01
Twelve civilians (7 men and 5 women) with no prior experience with the Robotic NCO simulation participated in this study of operators in a multitasking environment. Subject terms: design guidelines, robotics, simulation, unmanned systems, automation.
Bayesian ISOLA: new tool for automated centroid moment tensor inversion
NASA Astrophysics Data System (ADS)
Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John
2017-04-01
Focal mechanisms are important for understanding seismotectonics of a region, and they serve as a basic input for seismic hazard assessment. Usually, the point source approximation and the moment tensor (MT) are used. We have developed a new, fully automated tool for the centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection in which station components with instrumental disturbances or low signal-to-noise ratios are rejected, and full-waveform inversion in a space-time grid around a provided hypocenter. The method is innovative in the following aspects: (i) The CMT inversion is fully automated; no user interaction is required, although the details of the process can be visually inspected later in the many figures that are automatically plotted. (ii) The automated process includes detection of disturbances based on the MouseTrap code, so disturbed recordings do not affect the inversion. (iii) A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequencies. (iv) A Bayesian approach is used, so not only the best solution is obtained, but also the posterior probability density function. (v) A space-time grid search effectively combined with the least-squares inversion of moment tensor components speeds up the inversion and allows more accurate results to be obtained than with stochastic methods. The method has been tested on synthetic and observed data, and validated by comparison with manually processed moment tensors of all events with M ≥ 3 in the Swiss catalogue over 16 years using data available at the Swiss data center (http://arclink.ethz.ch). The quality of the results of the presented automated process is comparable with careful manual processing of data. The software package programmed in Python has been designed to be as versatile as possible in order to be applicable in various networks ranging from local to regional. The method can be applied either to the everyday network data flow, or to process large previously existing earthquake catalogues and data sets.
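As a rough, hedged sketch of point (v) above (not the ISOLA code), a space-time grid search can wrap a linear least-squares solve for the six moment-tensor components at each trial centroid and retain the best-fitting grid point; the Green's function matrices, noise level, and data below are synthetic placeholders, and the real method additionally applies covariance-based weighting.

```python
# Minimal sketch: least-squares MT inversion inside a grid search over trial centroids.
import numpy as np

def grid_search_mt(data, greens_by_gridpoint):
    """data: (n_samples,); greens_by_gridpoint: list of (n_samples, 6) matrices."""
    best = None
    for i, G in enumerate(greens_by_gridpoint):
        m, *_ = np.linalg.lstsq(G, data, rcond=None)   # six MT components
        misfit = np.linalg.norm(data - G @ m)
        if best is None or misfit < best[0]:
            best = (misfit, i, m)
    return best                                        # (misfit, grid index, moment tensor)

rng = np.random.default_rng(1)
greens = [rng.normal(size=(200, 6)) for _ in range(5)]          # placeholder Green's functions
true_m = np.array([1.0, -0.5, -0.5, 0.2, 0.0, 0.1])
data = greens[3] @ true_m + 0.01 * rng.normal(size=200)
print(grid_search_mt(data, greens)[1])                          # expected best grid index: 3
```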
Conway, Laurie J; Riley, Linda; Saiman, Lisa; Cohen, Bevin; Alper, Paul; Larson, Elaine L
2014-09-01
Despite substantial evidence to support the effectiveness of hand hygiene for preventing health care-associated infections, hand hygiene practice is often inadequate. Hand hygiene product dispensers that can electronically capture hand hygiene events have the potential to improve hand hygiene performance. A study on an automated group monitoring and feedback system was implemented from January 2012 through March 2013 at a 140-bed community hospital. An electronic system that monitors the use of sanitizer and soap but does not identify individual health care personnel was used to calculate hand hygiene events per patient-hour for each of eight inpatient units and hand hygiene events per patient-visit for the six outpatient units. Hand hygiene was monitored but feedback was not provided during a six-month baseline period and three-month rollout period. During the rollout, focus groups were conducted to determine preferences for feedback frequency and format. During the six-month intervention period, graphical reports were e-mailed monthly to all managers and administrators, and focus groups were repeated. After the feedback began, hand hygiene increased on average by 0.17 events/patient-hour in inpatient units (interquartile range = 0.14, p = .008). In outpatient units, hand hygiene performance did not change significantly. A variety of challenges were encountered, including obtaining accurate census and staffing data, engendering confidence in the system, disseminating information in the reports, and using the data to drive improvement. Feedback via an automated system was associated with improved hand hygiene performance in the short-term.
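As a simple, hedged illustration of the unit-level metric reported above (dispenser activations normalized by patient-hours for inpatient units, or by patient-visits for outpatient units), with invented numbers rather than the study's data:

```python
# Minimal sketch of the events-per-patient-hour calculation; field values are placeholders.
def events_per_patient_hour(dispenser_events, census, hours=24):
    """census: average number of occupied beds over the period."""
    return dispenser_events / (census * hours)

# e.g. 950 dispenser activations on a 20-bed unit over one day:
print(round(events_per_patient_hour(950, census=20), 2), "events/patient-hour")
```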
Digital drug safety surveillance: monitoring pharmaceutical products in twitter.
Freifeld, Clark C; Brownstein, John S; Menone, Christopher M; Bao, Wenjie; Filice, Ross; Kass-Hout, Taha; Dasgupta, Nabarun
2014-05-01
Traditional adverse event (AE) reporting systems have been slow in adapting to online AE reporting from patients, relying instead on gatekeepers, such as clinicians and drug safety groups, to verify each potential event. In the meantime, increasing numbers of patients have turned to social media to share their experiences with drugs, medical devices, and vaccines. The aim of the study was to evaluate the level of concordance between Twitter posts mentioning AE-like reactions and spontaneous reports received by a regulatory agency. We collected public English-language Twitter posts mentioning 23 medical products from 1 November 2012 through 31 May 2013. Data were filtered using a semi-automated process to identify posts with resemblance to AEs (Proto-AEs). A dictionary was developed to translate Internet vernacular to a standardized regulatory ontology for analysis (MedDRA(®)). Aggregated frequency of identified product-event pairs was then compared with data from the public FDA Adverse Event Reporting System (FAERS) by System Organ Class (SOC). Of the 6.9 million Twitter posts collected, 4,401 Proto-AEs were identified out of 60,000 examined. Automated, dictionary-based symptom classification had 86 % recall and 72 % precision [corrected]. Similar overall distribution profiles were observed, with Spearman rank correlation rho of 0.75 (p < 0.0001) between Proto-AEs reported in Twitter and FAERS by SOC. Patients reporting AEs on Twitter showed a range of sophistication when describing their experience. Despite the public availability of these data, their appropriate role in pharmacovigilance has not been established. Additional work is needed to improve data acquisition and automation.
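For illustration only, the recall and precision figures quoted for the dictionary-based classifier follow the usual definitions; the counts below are placeholders chosen to reproduce roughly those percentages, not the study's data.

```python
# Minimal sketch of precision/recall against a manually annotated reference.
def precision_recall(true_positives, false_positives, false_negatives):
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

p, r = precision_recall(true_positives=720, false_positives=280, false_negatives=117)
print(f"precision={p:.2f} recall={r:.2f}")   # roughly the 72% / 86% reported
```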
An automated system for global atmospheric sampling using B-747 airliners
NASA Technical Reports Server (NTRS)
Lew, K. Q.; Gustafsson, U. R. C.; Johnson, R. E.
1981-01-01
The global air sampling program utilizes commercial aircraft in scheduled service to measure atmospheric constituents. A fully automated system designed for the 747 aircraft is described. Airline operational constraints and data and control subsystems are treated. The overall program management, system monitoring, and data retrieval from four aircraft in global service are described.
Hashimoto, Yuichiro
2017-01-01
The development of a robust ionization source using the counter-flow APCI, miniature mass spectrometer, and an automated sampling system for detecting explosives are described. These development efforts using mass spectrometry were made in order to improve the efficiencies of on-site detection in areas such as security, environmental, and industrial applications. A development team, including the author, has struggled for nearly 20 years to enhance the robustness and reduce the size of mass spectrometers to meet the requirements needed for on-site applications. This article focuses on the recent results related to the detection of explosive materials where automated particle sampling using a cyclone concentrator permitted the inspection time to be successfully reduced to 3 s. PMID:28337396
Comparison of different methods to quantify fat classes in bakery products.
Shin, Jae-Min; Hwang, Young-Ok; Tu, Ock-Ju; Jo, Han-Bin; Kim, Jung-Hun; Chae, Young-Zoo; Rhu, Kyung-Hun; Park, Seung-Kook
2013-01-15
The definition of fat differs in different countries; thus whether fat is listed on food labels depends on the country. Some countries list crude fat content in the 'Fat' section on the food label, whereas other countries list total fat. In this study, three methods were used for determining fat classes and content in bakery products: the Folch method, the automated Soxhlet method, and the AOAC 996.06 method. The results using these methods were compared. Fat (crude) extracted by the Folch and Soxhlet methods was gravimetrically determined and assessed by fat class using capillary gas chromatography (GC). In most samples, fat (total) content determined by the AOAC 996.06 method was lower than the fat (crude) content determined by the Folch or automated Soxhlet methods. Furthermore, monounsaturated fat or saturated fat content determined by the AOAC 996.06 method was lowest. Almost no difference was observed between fat (crude) content determined by the Folch method and that determined by the automated Soxhlet method for nearly all samples. In three samples (wheat biscuits, butter cookies-1, and chocolate chip cookies), monounsaturated fat, saturated fat, and trans fat content obtained by the automated Soxhlet method was higher than that obtained by the Folch method. The polyunsaturated fat content obtained by the automated Soxhlet method was not higher than that obtained by the Folch method in any sample. Copyright © 2012 Elsevier Ltd. All rights reserved.
ODOT research news : winter quarter 2002.
DOT National Transportation Integrated Search
2002-01-01
The newsletter includes: : 1) Improving the Effectiveness of Partnering; : 2) Super Hard Steel; : 3) FreezeFree Field Demonstration; : 4) Automated Data Collection; : 5) Cracked bridges; : and other events.
ABO Mistyping of cis-AB Blood Group by the Automated Microplate Technique.
Chun, Sejong; Ryu, Mi Ra; Cha, Seung-Yeon; Seo, Ji-Young; Cho, Duck
2018-01-01
The cis-AB phenotype, although rare, is the relatively most frequent of the ABO subgroups in Koreans. To prevent ABO mistyping of cis-AB samples, our hospital has applied a combination of the manual tile method with automated devices. Herein, we report cases of ABO mistyping detected by the combination testing system. Cases that showed discrepant results by automated devices and the manual tile method were evaluated. These samples were also tested by the standard tube method. The automated devices used in this study were a QWALYS-3 and Galileo NEO. Exons 6 and 7 of the ABO gene were sequenced. Thirteen cases that had the cis-AB allele showed results suggestive of the cis-AB subgroup by manual methods, but were interpreted as AB by either automated device. This happened in 87.5% of these cases by QWALYS-3 and 70.0% by Galileo NEO. Genotyping results showed that 12 cases were ABO*cis-AB01/ABO*O01 or ABO*cis-AB01/ABO*O02, and one case was ABO*cis-AB01/ABO*A102. Cis-AB samples were mistyped as AB by the automated microplate technique in some cases. We suggest that the manual tile method can be a simple supplemental test for the detection of the cis-AB phenotype, especially in countries with relatively high cis-AB prevalence.
Automating Energy Bandgap Measurements in Semiconductors Using LabVIEW
ERIC Educational Resources Information Center
Garg, Amit; Sharma, Reena; Dhingra, Vishal
2010-01-01
In this paper, we report the development of an automated system for energy bandgap and resistivity measurement of a semiconductor sample using Four-Probe method for use in the undergraduate laboratory of Physics and Electronics students. The automated data acquisition and analysis system has been developed using National Instruments USB-6008 DAQ…
Automated data collection based on RoboDiff at the ESRF beamline MASSIF-1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nurizzo, Didier, E-mail: Didier.nurizzo@esrf.fr; Guichard, Nicolas; McSweeney, Sean
2016-07-27
The European Synchrotron Radiation Facility has a long standing history in the automation of experiments in Macromolecular Crystallography. MASSIF-1 (Massively Automated Sample Screening and evaluation Integrated Facility), a beamline constructed as part of the ESRF Upgrade Phase I program, has been open to the external user community since July 2014 and offers a unique completely automated data collection service to both academic and industrial structural biologists.
Kloefkorn, Heidi E.; Pettengill, Travis R.; Turner, Sara M. F.; Streeter, Kristi A.; Gonzalez-Rothi, Elisa J.; Fuller, David D.; Allen, Kyle D.
2016-01-01
While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns. PMID:27554674
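As a hedged illustration (not AGATHA's code) of why low frame rates introduce temporal error, gait event times digitized at 1000 fps can be snapped to the nearest frame of a lower-rate video, which quantizes derived variables such as stance time; the event times below are invented.

```python
# Minimal sketch of frame-rate quantization of gait event times.
def resample_time(event_time_s, fps):
    """Snap an event time to the nearest frame at the given frame rate."""
    return round(event_time_s * fps) / fps

foot_strike, toe_off = 0.1234, 0.3581            # seconds, as if digitized at 1000 fps
for fps in (1000, 125, 30):
    stance = resample_time(toe_off, fps) - resample_time(foot_strike, fps)
    print(fps, "fps -> stance time", round(stance, 4), "s")
```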
Kitsos, Christine M; Bhamidipati, Phani; Melnikova, Irena; Cash, Ethan P; McNulty, Chris; Furman, Julia; Cima, Michael J; Levinson, Douglas
2007-01-01
This study examined whether hierarchical clustering could be used to detect cell states induced by treatment combinations that were generated through automation and high-throughput (HT) technology. Data-mining techniques were used to analyze the large experimental data sets to determine whether nonlinear, non-obvious responses could be extracted from the data. Unary, binary, and ternary combinations of pharmacological factors (examples of stimuli) were used to induce differentiation of HL-60 cells using a HT automated approach. Cell profiles were analyzed by incorporating hierarchical clustering methods on data collected by flow cytometry. Data-mining techniques were used to explore the combinatorial space for nonlinear, unexpected events. Additional small-scale, follow-up experiments were performed on cellular profiles of interest. Multiple, distinct cellular profiles were detected using hierarchical clustering of expressed cell-surface antigens. Data-mining of this large, complex data set retrieved cases of both factor dominance and cooperativity, as well as atypical cellular profiles. Follow-up experiments found that treatment combinations producing "atypical cell types" made those cells more susceptible to apoptosis. CONCLUSIONS Hierarchical clustering and other data-mining techniques were applied to analyze large data sets from HT flow cytometry. From each sample, the data set was filtered and used to define discrete, usable states that were then related back to their original formulations. Analysis of resultant cell populations induced by a multitude of treatments identified unexpected phenotypes and nonlinear response profiles.
Flaberg, Emilie; Sabelström, Per; Strandh, Christer; Szekely, Laszlo
2008-01-01
Background Confocal laser scanning microscopy has revolutionized cell biology. However, the technique has major limitations in speed and sensitivity due to the fact that a single laser beam scans the sample, allowing only a few microseconds signal collection for each pixel. This limitation has been overcome by the introduction of parallel beam illumination techniques in combination with cold CCD camera based image capture. Methods Using the combination of microlens enhanced Nipkow spinning disc confocal illumination together with fully automated image capture and large scale in silico image processing we have developed a system allowing the acquisition, presentation and analysis of maximum resolution confocal panorama images of several Gigapixel size. We call the method Extended Field Laser Confocal Microscopy (EFLCM). Results We show using the EFLCM technique that it is possible to create a continuous confocal multi-colour mosaic from thousands of individually captured images. EFLCM can digitize and analyze histological slides, sections of entire rodent organ and full size embryos. It can also record hundreds of thousands cultured cells at multiple wavelength in single event or time-lapse fashion on fixed slides, in live cell imaging chambers or microtiter plates. Conclusion The observer independent image capture of EFLCM allows quantitative measurements of fluorescence intensities and morphological parameters on a large number of cells. EFLCM therefore bridges the gap between the mainly illustrative fluorescence microscopy and purely quantitative flow cytometry. EFLCM can also be used as high content analysis (HCA) instrument for automated screening processes. PMID:18627634
The LabTube - a novel microfluidic platform for assay automation in laboratory centrifuges.
Kloke, A; Fiebach, A R; Zhang, S; Drechsel, L; Niekrawietz, S; Hoehl, M M; Kneusel, R; Panthel, K; Steigert, J; von Stetten, F; Zengerle, R; Paust, N
2014-05-07
Assay automation is the key for successful transformation of modern biotechnology into routine workflows. Yet, it requires considerable investment in processing devices and auxiliary infrastructure, which is not cost-efficient for laboratories with low or medium sample throughput or point-of-care testing. To close this gap, we present the LabTube platform, which is based on assay specific disposable cartridges for processing in laboratory centrifuges. LabTube cartridges comprise interfaces for sample loading and downstream applications and fluidic unit operations for release of prestored reagents, mixing, and solid phase extraction. Process control is achieved by a centrifugally-actuated ballpen mechanism. To demonstrate the workflow and functionality of the LabTube platform, we show two LabTube automated sample preparation assays from laboratory routines: DNA extractions from whole blood and purification of His-tagged proteins. Equal DNA and protein yields were observed compared to manual reference runs, while LabTube automation could significantly reduce the hands-on-time to one minute per extraction.
Some Challenges in the Design of Human-Automation Interaction for Safety-Critical Systems
NASA Technical Reports Server (NTRS)
Feary, Michael S.; Roth, Emilie
2014-01-01
Increasing amounts of automation are being introduced to safety-critical domains. While the introduction of automation has led to an overall increase in reliability and improved safety, it has also introduced a class of failure modes, and new challenges in risk assessment for the new systems, particularly in the assessment of rare events resulting from complex inter-related factors. Designing successful human-automation systems is challenging, and the challenges go beyond good interface development (e.g., Roth, Malin, & Schreckenghost 1997; Christoffersen & Woods, 2002). Human-automation design is particularly challenging when the underlying automation technology generates behavior that is difficult for the user to anticipate or understand. These challenges have been recognized in several safety-critical domains, and have resulted in increased efforts to develop training, procedures, regulations and guidance material (CAST, 2008, IAEA, 2001, FAA, 2013, ICAO, 2012). This paper points to the continuing need for new methods to describe and characterize the operational environment within which new automation concepts are being presented. We will describe challenges to the successful development and evaluation of human-automation systems in safety-critical domains, and describe some approaches that could be used to address these challenges. We will draw from experience with the aviation, spaceflight and nuclear power domains.
Automated Design of Multiphase Space Missions Using Hybrid Optimal Control
ERIC Educational Resources Information Center
Chilan, Christian Miguel
2009-01-01
A modern space mission is assembled from multiple phases or events such as impulsive maneuvers, coast arcs, thrust arcs and planetary flybys. Traditionally, a mission planner would resort to intuition and experience to develop a sequence of events for the multiphase mission and to find the space trajectory that minimizes propellant use by solving…
AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsai, Yingssu; Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080; McPhillips, Scott E.
New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully demonstrated. This workflow was run once on the same 96 samples that the group had examined manually and the workflow cycled successfully through all of the samples, collected data from the same samples that were selected manually and located the same peaks of unmodeled density in the resulting difference Fourier maps.
Stellar Death in the Nearby Universe
NASA Astrophysics Data System (ADS)
Holoien, Thomas Warren-Son
The night sky is replete with transient and variable events that help shape our universe. The violent, explosive deaths of stars represent some of the most energetic of these events, as a single star is able to outshine billions during its final moments. Aside from imparting significant energy into their host environments, stellar deaths are also responsible for seeding heavy elements into the universe, regulating star formation in their host galaxies, and affecting the evolution of supermassive black holes at the centers of their host galaxies. The large amount of energy output during these events allows them to be seen from billions of lightyears away, making them useful observational probes of physical processes important to many fields of astronomy. In this dissertation I present a series of observational studies of two classes of transients associated with the deaths of stars in the nearby universe: tidal disruption events (TDEs) and supernovae (SNe). Discovered by the All-Sky Automated Survey for Supernovae (ASAS-SN), the objects I discuss were all bright and nearby, and were subject to extensive follow-up observational campaigns. In the first three studies, I present observational data and theoretical models of ASASSN-14ae, ASASSN-14li, and ASASSN-15oi, three TDEs discovered by ASAS-SN and three of the most well-studied TDEs ever discovered. Next I present the discovery of ASASSN-13co, an SN that does not conform to the traditional model of Type II SNe. Finally, I discuss the full sample of bright SNe discovered from 2014 May 1 through 2016 December 31, which is significantly less biased than previous nearby SN samples due to the ASAS-SN survey approach, and perform statistical analyses on this population that will be used for future studies of nearby SNe and their hosts.
Carulli, Giovanni; Marini, Alessandra; Sammuri, Paola; Domenichini, Cristiana; Ottaviano, Virginia; Pacini, Simone; Petrini, Mario
2015-01-01
The identification of eosinophils by flow cytometry is difficult because most of the surface antigens expressed by eosinophils are shared with neutrophils. Some methods have been proposed, generally based on differential light scatter properties, enhanced autofluorescence, lack of CD16 or selective positivity of CD52. Such methods, however, show several limitations. In the present study we report a novel method based on the analysis of glycosylphosphatidylinositol (GPI)-linked molecules. The combination of CD157 and FLAER was used, since FLAER recognizes all GPI-linked molecules, while CD157 is absent on the membrane of eosinophils and expressed by neutrophils. Peripheral blood samples from normal subjects and patients with variable percentages of eosinophils (n = 31), and without any evidence for circulating immature myeloid cells, were stained with the combination of FLAER-Alexa Fluor and CD157-PE. A FACSCanto II cytometer was used. Granulocytes were gated after CD33 staining and eosinophils were identified as CD157(-)/FLAER(+) events. Neutrophils were identified as CD157(+)/FLAER(+) events. The percentages of eosinophils detected by this method showed a very significant correlation both with automated counting and with manual counting (r = 0.981 and 0.989, respectively). Sorting assays were carried out by a S3 Cell Sorter: cytospins obtained from CD157(-)/FLAER(+) events consisted of 100% eosinophils, while samples from CD157(+)/FLAER(+) events were represented only by neutrophils. In conclusion, this method shows high sensitivity and specificity in distinguishing eosinophils from neutrophils by flow cytometry. However, since CD157 is gradually up-regulated throughout bone marrow myeloid maturation, our method cannot be applied to cases characterized by immature myeloid cells.
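As a hedged sketch of the gating logic described above (not the authors' analysis settings), eosinophils are the CD157-negative/FLAER-positive events and neutrophils the CD157-positive/FLAER-positive events within the CD33-gated granulocytes; the fluorescence cutoff and the toy event values below are placeholders.

```python
# Minimal sketch of boolean gating on two fluorescence channels.
import numpy as np

def classify_granulocytes(cd157, flaer, cutoff=1000.0):
    cd157 = np.asarray(cd157)
    flaer = np.asarray(flaer)
    eos = (cd157 < cutoff) & (flaer >= cutoff)    # CD157- / FLAER+
    neut = (cd157 >= cutoff) & (flaer >= cutoff)  # CD157+ / FLAER+
    return eos, neut

cd157 = np.array([150, 4200, 90, 5100])           # placeholder fluorescence intensities
flaer = np.array([3000, 2800, 2500, 3300])
eos, neut = classify_granulocytes(cd157, flaer)
print("eosinophil %:", 100 * eos.sum() / (eos.sum() + neut.sum()))
```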
Drechsel, Lisa; Schulz, Martin; von Stetten, Felix; Moldovan, Carmen; Zengerle, Roland; Paust, Nils
2015-02-07
Lab-on-a-chip devices hold promise for automation of complex workflows from sample to answer with minimal consumption of reagents in portable devices. However, complex, inhomogeneous samples as they occur in environmental or food analysis may block microchannels and thus often cause malfunction of the system. Here we present the novel AutoDip platform which is based on the movement of a solid phase through the reagents and sample instead of transporting a sequence of reagents through a fixed solid phase. A ball-pen mechanism operated by an external actuator automates unit operations such as incubation and washing by consecutively dipping the solid phase into the corresponding liquids. The platform is applied to electrochemical detection of organophosphorus pesticides in real food samples using an acetylcholinesterase (AChE) biosensor. Minimal sample preparation and an integrated reagent pre-storage module hold promise for easy handling of the assay. Detection of the pesticide chlorpyrifos-oxon (CPO) spiked into apple samples at concentrations of 10^-7 M has been demonstrated. This concentration is below the maximum residue level for chlorpyrifos in apples defined by the European Commission.
Hypoxic events and concomitant factors in preterm infants on non-invasive ventilation.
Fathabadi, Omid Sadeghi; Gale, Timothy; Wheeler, Kevin; Plottier, Gemma; Owen, Louise S; Olivier, J C; Dargaville, Peter A
2017-04-01
Automated control of inspired oxygen for newborn infants is an emerging technology, currently limited by reliance on a single input signal (oxygen saturation, SpO2). Meanwhile, other signals that may herald the onset of hypoxic events or identify spurious hypoxia are not usually utilised. We wished to assess the frequency of apnoea, loss of circuit pressure and/or motion artefact in proximity to hypoxic events in preterm infants on non-invasive ventilation. Hypoxic events (SpO2 < 80%) were identified using a previously acquired dataset obtained from preterm infants receiving non-invasive ventilation. Events with concomitant apnoea, loss of circuit pressure or oximetry motion artefact were annotated, and the frequency of each of these factors was determined. The effect of duration and timing of apnoea on the characteristics of the associated hypoxic events was studied. Among 1224 hypoxic events, 555 (45%) were accompanied by apnoea, 31 (2.5%) by loss of circuit pressure and 696 (57%) by motion artefact, while for 224 (18%) there were no concomitant factors identified. Respiratory pauses of longer duration (>15 s) preceding hypoxic events were associated with a relatively slow decline in SpO2 and more prolonged hypoxia compared to shorter pauses. Hypoxic events are frequently accompanied by respiratory pauses and/or motion artefact. Real-time monitoring and input of respiratory waveform may thus improve the function of automated oxygen controllers, allowing pre-emptive responses to respiratory pauses. Furthermore, use of motion-resistant oximeters and plethysmographic waveform assessment procedures will help to optimise feedback control of inspired oxygen delivery.
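As a hedged sketch of the event definition used above, a hypoxic event can be treated as a contiguous run of SpO2 samples below 80%, with each detected run then checked against the apnoea, circuit-pressure, and motion-artefact annotations; the sampling interval and the toy SpO2 trace below are placeholders.

```python
# Minimal sketch: find contiguous runs of SpO2 below a threshold.
import numpy as np

def hypoxic_events(spo2, threshold=80.0):
    """Return (start, end) index pairs of runs with SpO2 < threshold (end exclusive)."""
    below = np.asarray(spo2) < threshold
    padded = np.concatenate(([False], below, [False])).astype(int)
    d = np.diff(padded)
    starts = np.flatnonzero(d == 1)
    ends = np.flatnonzero(d == -1)
    return list(zip(starts, ends))

spo2 = np.array([95, 92, 88, 79, 76, 78, 85, 93, 81, 79, 78, 90])   # placeholder trace
print(hypoxic_events(spo2))   # [(3, 6), (9, 11)]
```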
ShatterProof: operational detection and quantification of chromothripsis.
Govind, Shaylan K; Zia, Amin; Hennings-Yeomans, Pablo H; Watson, John D; Fraser, Michael; Anghel, Catalina; Wyatt, Alexander W; van der Kwast, Theodorus; Collins, Colin C; McPherson, John D; Bristow, Robert G; Boutros, Paul C
2014-03-19
Chromothripsis, a newly discovered type of complex genomic rearrangement, has been implicated in the evolution of several types of cancers. To date, it has been described in bone cancer, SHH-medulloblastoma and acute myeloid leukemia, amongst others; however, there are still no formal or automated methods for detecting or annotating it in high-throughput sequencing data. As such, findings of chromothripsis are difficult to compare and many cases likely escape detection altogether. We introduce ShatterProof, a software tool for detecting and quantifying chromothriptic events. ShatterProof takes structural variation calls (translocations, copy-number variations, short insertions and loss of heterozygosity) produced by any algorithm and, using an operational definition of chromothripsis, performs robust statistical tests to accurately predict the presence and location of chromothriptic events. Validation of our tool was conducted using clinical data sets including matched normal, prostate cancer samples in addition to the colorectal cancer and SCLC data sets used in the original description of chromothripsis. ShatterProof is computationally efficient, having low memory requirements and near-linear computation time. This allows it to become a standard component of sequencing analysis pipelines, enabling researchers to routinely and accurately assess samples for chromothripsis. Source code and documentation can be found at http://search.cpan.org/~sgovind/Shatterproof.
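One hallmark that operational definitions of chromothripsis rely on is the clustering of structural-variant breakpoints in confined genomic regions. The toy sketch below flags windows whose breakpoint counts exceed a uniform Poisson background; it only illustrates that idea and is not ShatterProof's actual statistics.

```python
# Toy illustration of breakpoint clustering (not ShatterProof's implementation).
import numpy as np
from scipy.stats import poisson

def dense_breakpoint_windows(breakpoints, chrom_length, window=1_000_000, alpha=1e-3):
    """Return window start positions whose breakpoint count is improbably high."""
    bp = np.sort(np.asarray(breakpoints, dtype=float))
    edges = np.arange(0.0, chrom_length + window, window)
    counts, _ = np.histogram(bp, bins=edges)
    lam = len(bp) * window / chrom_length        # expected count per window (uniform model)
    pvals = poisson.sf(counts - 1, lam)          # P(X >= observed count)
    hits = pvals < alpha
    return edges[:-1][hits], counts[hits]

# starts, counts = dense_breakpoint_windows(bp_positions, chrom_length=2.4e8)
```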
Shayanfar, Noushin; Tobler, Ulrich; von Eckardstein, Arnold; Bestmann, Lukas
2007-01-01
Automated analysis of insoluble urine components can reduce the workload of conventional microscopic examination of urine sediment and is possibly helpful for standardization. We compared the diagnostic performance of two automated urine sediment analyzers and combined dipstick/automated urine analysis with that of the traditional dipstick/microscopy algorithm. A total of 332 specimens were collected and analyzed for insoluble urine components by microscopy and automated analyzers, namely the Iris iQ200 (Iris Diagnostics) and the UF-100 flow cytometer (Sysmex). The coefficients of variation for day-to-day quality control of the iQ200 and UF-100 analyzers were 6.5% and 5.5%, respectively, for red blood cells. We reached accuracy ranging from 68% (bacteria) to 97% (yeast) for the iQ200 and from 42% (bacteria) to 93% (yeast) for the UF-100. The combination of dipstick and automated urine sediment analysis increased the sensitivity of screening to approximately 98%. We conclude that automated urine sediment analysis is sufficiently precise and improves the workflow in a routine laboratory. In addition, it allows sediment analysis of all urine samples and thereby helps to detect pathological samples that would have been missed in the conventional two-step procedure according to the European guidelines. Although it is not a substitute for microscopic sediment examination, it can, when combined with dipstick testing, reduce the number of specimens submitted to microscopy. Visual microscopy is still required for some samples, namely, dysmorphic erythrocytes, yeasts, Trichomonas, oval fat bodies, differentiation of casts and certain crystals.
Temporal and Location Based RFID Event Data Management and Processing
NASA Astrophysics Data System (ADS)
Wang, Fusheng; Liu, Peiya
Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight; thus it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history-oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location-oriented data model for modeling and querying RFID data, and a declarative event- and rule-based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
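The core idea of such a model is that raw reads (tag, reader, timestamp) are lifted into higher-level, history-oriented events by declarative rules. The sketch below derives a simple "object moved" event from raw observations; the field names and the rule itself are illustrative, not the paper's actual model.

```python
# Minimal sketch: deriving a temporal "move" event from raw RFID observations.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Observation:        # a raw RFID read
    epc: str              # tag ID
    reader: str           # reader location
    ts: float             # unix timestamp

@dataclass
class MoveEvent:          # a derived, history-oriented event
    epc: str
    src: str
    dst: str
    t_start: float
    t_end: float

def derive_moves(obs: List[Observation]) -> List[MoveEvent]:
    moves: List[MoveEvent] = []
    last: Dict[str, Observation] = {}
    for o in sorted(obs, key=lambda o: o.ts):
        prev = last.get(o.epc)
        if prev is not None and prev.reader != o.reader:
            moves.append(MoveEvent(o.epc, prev.reader, o.reader, prev.ts, o.ts))
        last[o.epc] = o
    return moves
```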
Automated sleep scoring and sleep apnea detection in children
NASA Astrophysics Data System (ADS)
Baraglia, David P.; Berryman, Matthew J.; Coussens, Scott W.; Pamula, Yvonne; Kennedy, Declan; Martin, A. James; Abbott, Derek
2005-12-01
This paper investigates the automated detection of a patient's breathing rate and heart rate from their skin conductivity as well as sleep stage scoring and breathing event detection from their EEG. The software developed for these tasks is tested on data sets obtained from the sleep disorders unit at the Adelaide Women's and Children's Hospital. The sleep scoring and breathing event detection tasks used neural networks to achieve signal classification. The Fourier transform and the Higuchi fractal dimension were used to extract features for input to the neural network. The filtered skin conductivity appeared visually to bear a similarity to the breathing and heart rate signal, but a more detailed evaluation showed the relation was not consistent. Sleep stage classification was achieved with an accuracy of around 65%, with some stages scored accurately and others poorly. The two breathing events, hypopnea and apnea, were scored with varying degrees of accuracy, with the highest scores being around 75% and 30%.
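The Higuchi fractal dimension mentioned as an EEG feature has a standard formulation; a minimal NumPy sketch is given below. The choice of k_max = 10 is an assumed default, since the paper's parameter settings are not reproduced here.

```python
# Standard Higuchi fractal dimension of a 1-D signal (one possible EEG feature).
import numpy as np

def higuchi_fd(x, k_max=10):
    x = np.asarray(x, dtype=float)
    n = len(x)
    curve_lengths = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):                      # k sub-series with offset m
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (n - 1) / ((len(idx) - 1) * k)
            lengths.append(dist * norm / k)
        curve_lengths.append(np.mean(lengths))
    # L(k) ~ k^(-D), so the slope of log L(k) against log(1/k) estimates D
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)),
                          np.log(curve_lengths), 1)
    return slope
```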
Woynaroski, Tiffany; Oller, D. Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul
2017-01-01
Theory and research suggest that vocal development predicts “useful speech” in preschoolers with autism spectrum disorder (ASD), but conventional methods for measurement of vocal development are costly and time consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently “in development” and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time intensive and expensive conventional communication samples for measurement of vocal development of preverbal preschoolers with ASD in research and clinical practice. PMID:27459107
Automated position control of a surface array relative to a liquid microjunction surface sampler
Van Berkel, Gary J.; Kertesz, Vilmos; Ford, Michael James
2007-11-13
A system and method utilizes an image analysis approach for controlling the probe-to-surface distance of a liquid junction-based surface sampling system for use with mass spectrometric detection. Such an approach enables a hands-free formation of the liquid microjunction used to sample solution composition from the surface and for re-optimization, as necessary, of the microjunction thickness during a surface scan to achieve a fully automated surface sampling system.
Automation of TL brick dating by ADAM-1
NASA Astrophysics Data System (ADS)
Čechák, T.; Gerndt, J.; Hiršl, P.; Jiroušek, P.; Kanaval, J.; Kubelík, M.; Musílek, L.
2001-06-01
A specially adapted machine, ADAM-1, for thermoluminescence fine-grain dating of bricks was constructed in an interdisciplinary research project, undertaken by a team recruited from three faculties of the Czech Technical University in Prague. This TL-reader is able to measure and evaluate numerous samples automatically. The sample holder has 60 sample positions, which allow the irradiation and evaluation of samples taken from two locations. All procedures of alpha and beta irradiation by varying doses and the TL-signal measurement, as well as the age evaluation and error assessment, are programmable and fully automated.
Lo, Andy; Tang, Yanan; Chen, Lu; Li, Liang
2013-07-25
Isotope labeling liquid chromatography-mass spectrometry (LC-MS) is a major analytical platform for quantitative proteome analysis. Incorporation of isotopes used to distinguish samples plays a critical role in the success of this strategy. In this work, we optimized and automated a chemical derivatization protocol (dimethylation after guanidination, 2MEGA) to increase the labeling reproducibility and reduce human intervention. We also evaluated the reagent compatibility of this protocol to handle biological samples in different types of buffers and surfactants. A commercially available liquid handler was used for reagent dispensation to minimize analyst intervention, and at least twenty protein digest samples could be prepared in a single run. Different front-end sample preparation methods for protein solubilization (SDS, urea, Rapigest™, and ProteaseMAX™) and two commercially available cell lysis buffers were evaluated for compatibility with the automated protocol. It was found that better than 94% of the desired labeling could be obtained under all conditions studied except urea, where the rate was reduced to about 92% due to carbamylation on the peptide amines. This work illustrates that the automated 2MEGA labeling process can be used to handle a wide range of protein samples containing various reagents that are often encountered in protein sample preparation for quantitative proteome analysis.
Chattanooga Electric Power Board Case Study Distribution Automation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glass, Jim; Melin, Alexander M.; Starke, Michael R.
In 2009, the U.S. Department of Energy under the American Recovery and Reinvestment Act (ARRA) awarded a grant to the Chattanooga, Tennessee, Electric Power Board (EPB) as part of the Smart Grid Investment Grant Program. The grant had the objective “to accelerate the transformation of the nation’s electric grid by deploying smart grid technologies.” This funding award enabled EPB to expedite the original smart grid implementation schedule from an estimated 10-12 years to 2.5 years. With this funding, EPB invested heavily in distribution automation technologies, including installing over 1,200 automated circuit switches and sensors on 171 circuits. For utilities considering a commitment to distribution automation, there are underlying questions such as the following: “What is the value?” and “What are the costs?” This case study attempts to answer these questions. The primary benefit of distribution automation is increased reliability, or reduced power outage duration and frequency. Power outages directly impact customer economics by interfering with business functions. In the past, this economic driver has been difficult to effectively evaluate. However, as this case study demonstrates, tools and analysis techniques are now available. In this case study, the impact on customer costs associated with power outages before and after the implementation of distribution automation is compared. Two example evaluations are performed to demonstrate the benefits: 1) a savings baseline for customers under normal operations and 2) customer savings for a single severe weather event. Cost calculations for customer power outages are performed using the US Department of Energy (DOE) Interruption Cost Estimate (ICE) calculator. This tool uses standard metrics associated with outages and the customers to calculate cost impact. The analysis shows that EPB customers have seen significant reliability improvements from the implementation of distribution automation. Under normal operations, the investment in distribution automation has enabled a 43.5% reduction in annual outage minutes since 2012. This has led to an estimated total savings of $26.8 million per year. Examining a single severe weather event, the distribution automation was able to restore power to 40,579 (nearly 56%) customers within 1–2 seconds and reduce outage minutes by 29.0%. This saved customers an estimated $23.2 million over the course of the storm.
Sarter, Nadine B; Mumaw, Randall J; Wickens, Christopher D
2007-06-01
The objective of the study was to examine pilots' automation monitoring strategies and performance on highly automated commercial flight decks. A considerable body of research and operational experience has documented breakdowns in pilot-automation coordination on modern flight decks. These breakdowns are often considered symptoms of monitoring failures even though, to date, only limited and mostly anecdotal data exist concerning pilots' monitoring strategies and performance. Twenty experienced B-747-400 airline pilots flew a 1-hr scenario involving challenging automation-related events on a full-mission simulator. Behavioral, mental model, and eye-tracking data were collected. The findings from this study confirm that pilots monitor basic flight parameters to a much greater extent than visual indications of the automation configuration. More specifically, they frequently fail to verify manual mode selections or notice automatic mode changes. In other cases, they do not process mode annunciations in sufficient depth to understand their implications for aircraft behavior. Low system observability and gaps in pilots' understanding of complex automation modes were shown to contribute to these problems. Our findings describe and explain shortcomings in pilot's automation monitoring strategies and performance based on converging behavioral, eye-tracking, and mental model data. They confirm that monitoring failures are one major contributor to breakdowns in pilot-automation interaction. The findings from this research can inform the design of improved training programs and automation interfaces that support more effective system monitoring.
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-01-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning-to-end. A newer and very powerful, analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information systems (LIS) and/or hospital information systems (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology. It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists. PMID:25336760
Automated fault-management in a simulated spaceflight micro-world
NASA Technical Reports Server (NTRS)
Lorenz, Bernd; Di Nocera, Francesco; Rottger, Stefan; Parasuraman, Raja
2002-01-01
BACKGROUND: As human spaceflight missions extend in duration and distance from Earth, a self-sufficient crew will bear far greater onboard responsibility and authority for mission success. This will increase the need for automated fault management (FM). Human factors issues in the use of such systems include maintenance of cognitive skill, situational awareness (SA), trust in automation, and workload. This study examined the human performance consequences of operator use of intelligent FM support in interaction with an autonomous, space-related, atmospheric control system. METHODS: An expert system representing a model-based reasoning agent supported operators at a low level of automation (LOA) by a computerized fault-finding guide, at a medium LOA by an automated diagnosis and recovery advisory, and at a high LOA by automated diagnosis and recovery implementation, subject to operator approval or veto. Ten percent of the experimental trials involved complete failure of FM support. RESULTS: Benefits of automation were reflected in more accurate diagnoses, shorter fault identification time, and reduced subjective operator workload. Unexpectedly, fault identification times deteriorated more at the medium than at the high LOA during automation failure. Analyses of information sampling behavior showed that offloading operators from recovery implementation during reliable automation enabled operators at high LOA to engage in fault assessment activities. CONCLUSIONS: The potential threat to SA imposed by high-level automation, in which decision advisories are automatically generated, need not inevitably be counteracted by choosing a lower LOA. Instead, freeing operator cognitive resources by automatic implementation of recovery plans at a higher LOA can promote better fault comprehension, so long as the automation interface is designed to support efficient information sampling.
Chapin, Thomas P.; Todd, Andrew S.
2012-01-01
Abandoned hard-rock mines can be a significant source of acid mine drainage (AMD) and toxic metal pollution to watersheds. In Colorado, USA, abandoned mines are often located in remote, high elevation areas that are snowbound for 7–8 months of the year. The difficulty in accessing these remote sites, especially during winter, creates challenging water sampling problems and major hydrologic and toxic metal loading events are often under sampled. Currently available automated water samplers are not well suited for sampling remote snowbound areas so the U.S. Geological Survey (USGS) has developed a new water sampler, the MiniSipper, to provide long-duration, high-resolution water sampling in remote areas. The MiniSipper is a small, portable sampler that uses gas bubbles to separate up to 250 five milliliter acidified samples in a long tubing coil. The MiniSipper operates for over 8 months unattended in water under snow/ice, reduces field work costs, and greatly increases sampling resolution, especially during inaccessible times. MiniSippers were deployed in support of an U.S. Environmental Protection Agency (EPA) project evaluating acid mine drainage inputs from the Pennsylvania Mine to the Snake River watershed in Summit County, CO, USA. MiniSipper metal results agree within 10% of EPA-USGS hand collected grab sample results. Our high-resolution results reveal very strong correlations (R2 > 0.9) between potentially toxic metals (Cd, Cu, and Zn) and specific conductivity at the Pennsylvania Mine site. The large number of samples collected by the MiniSipper over the entire water year provides a detailed look at the effects of major hydrologic events such as snowmelt runoff and rainstorms on metal loading from the Pennsylvania Mine. MiniSipper results will help guide EPA sampling strategy and remediation efforts in the Snake River watershed.
Vogeser, Michael; Spöhrer, Ute
2006-01-01
Liquid chromatography tandem-mass spectrometry (LC-MS/MS) is an efficient technology for routine determination of immunosuppressants in whole blood; however, time-consuming manual sample preparation remains a significant limitation of this technique. Using a commercially available robotic pipetting system (Tecan Freedom EVO), we developed an automated sample-preparation protocol for quantification of tacrolimus in whole blood by LC-MS/MS. Barcode reading, sample resuspension, transfer of whole blood aliquots into a deep-well plate, addition of internal standard solution, mixing, and protein precipitation by addition of an organic solvent is performed by the robotic system. After centrifugation of the plate, the deproteinized supernatants are submitted to on-line solid phase extraction, using column switching prior to LC-MS/MS analysis. The only manual actions within the entire process are decapping of the tubes, and transfer of the deep-well plate from the robotic system to a centrifuge and finally to the HPLC autosampler. Whole blood pools were used to assess the reproducibility of the entire analytical system for measuring tacrolimus concentrations. A total coefficient of variation of 1.7% was found for the entire automated analytical process (n=40; mean tacrolimus concentration, 5.3 microg/L). Close agreement between tacrolimus results obtained after manual and automated sample preparation was observed. The analytical system described here, comprising automated protein precipitation, on-line solid phase extraction and LC-MS/MS analysis, is convenient and precise, and minimizes hands-on time and the risk of mistakes in the quantification of whole blood immunosuppressant concentrations compared to conventional methods.
Bayesian ISOLA: new tool for automated centroid moment tensor inversion
NASA Astrophysics Data System (ADS)
Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John
2017-08-01
We have developed a new, fully automated tool for the centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection where station components with various instrumental disturbances are rejected and full-waveform inversion in a space-time grid around a provided hypocentre. A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequency ranges. The method is tested on synthetic and observed data. It is applied on a data set from the Swiss seismic network and the results are compared with the existing high-quality MT catalogue. The software package programmed in Python is designed to be as versatile as possible in order to be applicable in various networks ranging from local to regional. The method can be applied either to the everyday network data flow, or to process large pre-existing earthquake catalogues and data sets.
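A key ingredient above is the data covariance matrix estimated from pre-event noise, which both down-weights noisy stations and acts as a frequency filter. The sketch below shows the underlying idea for a single component, assuming stationary noise so that the covariance is Toeplitz; it is a simplification of the approach, not the package's implementation.

```python
# Sketch: noise covariance from a pre-event trace and a covariance-weighted misfit.
import numpy as np
from scipy.linalg import toeplitz

def noise_covariance(noise, n):
    """Toeplitz covariance of length-n data windows, estimated from pre-event noise."""
    noise = np.asarray(noise, dtype=float)
    noise = noise - noise.mean()
    acf = np.correlate(noise, noise, mode="full")[len(noise) - 1:] / len(noise)
    return toeplitz(acf[:n])

def weighted_misfit(data, synthetic, cov):
    """r^T C^-1 r, so noisy stations and frequency bands contribute less to the inversion."""
    r = np.asarray(data) - np.asarray(synthetic)
    return float(r @ np.linalg.solve(cov, r))
```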
Clarity: An Open Source Manager for Laboratory Automation
Delaney, Nigel F.; Echenique, José Rojas; Marx, Christopher J.
2013-01-01
Software to manage automated laboratories interfaces with hardware instruments, gives users a way to specify experimental protocols, and schedules activities to avoid hardware conflicts. In addition to these basics, modern laboratories need software that can run multiple different protocols in parallel and that can be easily extended to interface with a constantly growing diversity of techniques and instruments. We present Clarity: a laboratory automation manager that is hardware agnostic, portable, extensible and open source. Clarity provides critical features including remote monitoring, robust error reporting by phone or email, and full state recovery in the event of a system crash. We discuss the basic organization of Clarity; demonstrate an example of its implementation for the automated analysis of bacterial growth; and describe how the program can be extended to manage new hardware. Clarity is mature; well documented; actively developed; written in C# for the Common Language Infrastructure; and is free and open source software. These advantages set Clarity apart from currently available laboratory automation programs. PMID:23032169
Morchel, Herman; Ogedegbe, Chinwe; Desai, Nilesh; Faley, Brian; Mahmood, Nasir; Moro, Gary Del; Feldman, Joseph
2015-01-01
This article describes the innovative use of an automated drug distribution cabinet system for medication supply in a disaster response mobile Emergency Department vehicle. Prior to the use of the automated drug distribution cabinet system described in this article, the mobile hospitals were stocked as needed with drugs in individual boxes and drawers. Experience with multiple deployments found this method to be very cumbersome and labor intensive in preparation, operational use, and demobilization. For a recent deployment to provide emergency medical care at the 2014 Super Bowl football event, the automated drug distribution cabinet system in the institution's main campus Emergency Department was duplicated and incorporated into the mobile Emergency Department. This method of drug stocking and dispensing was found to be far more efficient than gathering and placing drugs in onboard drawers and racks. Automated drug distribution cabinet systems can be used to significantly improve patient care and overall efficiency in mobile hospital deployments.
Automated cellular sample preparation using a Centrifuge-on-a-Chip.
Mach, Albert J; Kim, Jae Hyun; Arshi, Armin; Hur, Soojung Claire; Di Carlo, Dino
2011-09-07
The standard centrifuge is a laboratory instrument widely used by biologists and medical technicians for preparing cell samples. Efforts to automate the operations of concentration, cell separation, and solution exchange that a centrifuge performs in a simpler and smaller platform have had limited success. Here, we present a microfluidic chip that replicates the functions of a centrifuge without moving parts or external forces. The device operates using a purely fluid dynamic phenomenon in which cells selectively enter and are maintained in microscale vortices. Continuous and sequential operation allows enrichment of cancer cells from spiked blood samples at the mL min(-1) scale, followed by fluorescent labeling of intra- and extra-cellular antigens on the cells without the need for manual pipetting and washing steps. A versatile centrifuge-analogue may open opportunities in automated, low-cost and high-throughput sample preparation as an alternative to the standard benchtop centrifuge in standardized clinical diagnostics or resource poor settings.
Pavlov, Sergey S; Dmitriev, Andrey Yu; Frontasyeva, Marina V
The present status of development of software packages and equipment designed for automation of NAA at the reactor IBR-2 of FLNP, JINR, Dubna, RF, is described. The NAA database, construction of sample changers and software for automation of spectra measurement and calculation of concentrations are presented. Automation of QC procedures is integrated in the software developed. Details of the design are shown.
NASA Astrophysics Data System (ADS)
Field, M. Paul; Romaniello, Stephen; Gordon, Gwyneth W.; Anbar, Ariel D.; Herrmann, Achim; Martinez-Boti, Miguel A.; Anagnostou, Eleni; Foster, Gavin L.
2014-05-01
MC-ICP-MS has dramatically improved the analytical throughput for high-precision radiogenic and non-traditional isotope ratio measurements, compared to TIMS. The generation of large data sets, however, remains hampered by the tedious manual drip chromatography required for sample purification. A new, automated chromatography system reduces the laboratory bottleneck and expands the utility of high-precision isotope analyses in applications where large data sets are required: geochemistry, forensic anthropology, nuclear forensics, medical research and food authentication. We have developed protocols to automate ion exchange purification for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U) using the new prepFAST-MC™ (ESI, Omaha, Nebraska). The system is not only inert (all-fluoropolymer flow paths), but is also very flexible and can easily accommodate different resins, samples, and reagent types. When programmed, precise and accurate user-defined volumes and flow rates are implemented to automatically load samples, wash the column, condition the column and elute fractions. Unattended, the automated, low-pressure ion exchange chromatography system can process up to 60 samples overnight. Excellent reproducibility, reliability and recovery, with low blanks and carryover for samples in a variety of different matrices, have been demonstrated to give accurate and precise isotopic ratios within analytical error for several isotopic systems (B, Ca, Fe, Cu, Zn, Sr, Cd, Pb and U). This illustrates the potential of the new prepFAST-MC™ (ESI, Omaha, Nebraska) as a powerful tool in radiogenic and non-traditional isotope research.
RoboTAP: Target priorities for robotic microlensing observations
NASA Astrophysics Data System (ADS)
Hundertmark, M.; Street, R. A.; Tsapras, Y.; Bachelet, E.; Dominik, M.; Horne, K.; Bozza, V.; Bramich, D. M.; Cassan, A.; D'Ago, G.; Figuera Jaimes, R.; Kains, N.; Ranc, C.; Schmidt, R. W.; Snodgrass, C.; Wambsganss, J.; Steele, I. A.; Mao, S.; Ment, K.; Menzies, J.; Li, Z.; Cross, S.; Maoz, D.; Shvartzvald, Y.
2018-01-01
Context. The ability to automatically select scientifically-important transient events from an alert stream of many such events, and to conduct follow-up observations in response, will become increasingly important in astronomy. With wide-angle time domain surveys pushing to fainter limiting magnitudes, the capability to follow-up on transient alerts far exceeds our follow-up telescope resources, and effective target prioritization becomes essential. The RoboNet-II microlensing program is a pathfinder project, which has developed an automated target selection process (RoboTAP) for gravitational microlensing events, which are observed in real time using the Las Cumbres Observatory telescope network. Aims: Follow-up telescopes typically have a much smaller field of view compared to surveys, therefore the most promising microlensing events must be automatically selected at any given time from an annual sample exceeding 2000 events. The main challenge is to select between events with a high planet detection sensitivity, with the aim of detecting many planets and characterizing planetary anomalies. Methods: Our target selection algorithm is a hybrid system based on estimates of the planet detection zones around a microlens. It follows automatic anomaly alerts and respects the expected survey coverage of specific events. Results: We introduce the RoboTAP algorithm, whose purpose is to select and prioritize microlensing events with high sensitivity to planetary companions. In this work, we determine the planet sensitivity of the RoboNet follow-up program and provide a working example of how a broker can be designed for a real-life transient science program conducting follow-up observations in response to alerts; we explore the issues that will confront similar programs being developed for the Large Synoptic Survey Telescope (LSST) and other time domain surveys.
Automation of diagnostic genetic testing: mutation detection by cyclic minisequencing.
Alagrund, Katariina; Orpana, Arto K
2014-01-01
The rising role of nucleic acid testing in clinical decision making is creating a need for efficient and automated diagnostic nucleic acid test platforms. Clinical use of nucleic acid testing creates demands for shorter turnaround times (TATs), lower production costs and robust, reliable methods that can easily adopt new test panels and run rare tests on a random-access principle. Here we present a novel home-brew laboratory automation platform for diagnostic mutation testing. This platform is based on cyclic minisequencing (cMS) and two-color near-infrared (NIR) detection. Pipetting is automated using Tecan Freedom EVO pipetting robots and all assays are performed in a 384-well microplate format. The automation platform includes a data processing system, controlling all procedures, and automated patient result reporting to the hospital information system. We have found automated cMS a reliable, inexpensive and robust method for nucleic acid testing for a wide variety of diagnostic tests. The platform is currently in clinical use for over 80 mutations or polymorphisms. In addition to tests performed on blood samples, the system also performs an epigenetic test for methylation of the MGMT gene promoter, and companion diagnostic tests for analysis of KRAS and BRAF gene mutations from formalin-fixed and paraffin-embedded tumor samples. Automation of genetic test reporting has been found reliable and efficient, decreasing the workload of academic personnel.
An Automated Sample Processing System for Planetary Exploration
NASA Technical Reports Server (NTRS)
Soto, Juancarlos; Lasnik, James; Roark, Shane; Beegle, Luther
2012-01-01
An Automated Sample Processing System (ASPS) for wet chemistry processing of organic materials on the surface of Mars has been jointly developed by Ball Aerospace and the Jet Propulsion Laboratory. The mechanism has been built and tested to demonstrate TRL level 4. This paper describes the function of the system, mechanism design, lessons learned, and several challenges that were overcome.
ERIC Educational Resources Information Center
Broomfield, Laura; McHugh, Louise; Reed, Phil
2008-01-01
Stimulus over-selectivity occurs when one of potentially many aspects of the environment comes to control behaviour. In two experiments, adults with no developmental disabilities were trained and tested in an automated match-to-sample (MTS) paradigm. In Experiment 1, participants completed two conditions, in one of which the over-selected…
Bahk, Chi Y; Cumming, Melissa; Paushter, Louisa; Madoff, Lawrence C; Thomson, Angus; Brownstein, John S
2016-02-01
Real-time monitoring of mainstream and social media can inform public health practitioners and policy makers about vaccine sentiment and hesitancy. We describe a publicly available platform for monitoring vaccination-related content, called the Vaccine Sentimeter. With automated data collection from 100,000 mainstream media sources and Twitter, natural-language processing for automated filtering, and manual curation to ensure accuracy, the Vaccine Sentimeter offers a global real-time view of vaccination conversations online. To assess the system's utility, we followed two events: polio vaccination in Pakistan after a news story about a Central Intelligence Agency vaccination ruse and subsequent attacks on health care workers, and a controversial episode in a television program about adverse events following human papillomavirus vaccination. For both events, increased online activity was detected and characterized. For the first event, Twitter response to the attacks on health care workers decreased drastically after the first attack, in contrast to mainstream media coverage. For the second event, the mainstream and social media response was largely positive about the HPV vaccine, but antivaccine conversations persisted longer than the provaccine reaction. Using the Vaccine Sentimeter could enable public health professionals to detect increased online activity or sudden shifts in sentiment that could affect vaccination uptake. Project HOPE—The People-to-People Health Foundation, Inc.
A large-scale dataset of solar event reports from automated feature recognition modules
NASA Astrophysics Data System (ADS)
Schuh, Michael A.; Angryk, Rafal A.; Martens, Petrus C.
2016-05-01
The massive repository of images of the Sun captured by the Solar Dynamics Observatory (SDO) mission has ushered in the era of Big Data for Solar Physics. In this work, we investigate the entire public collection of events reported to the Heliophysics Event Knowledgebase (HEK) from automated solar feature recognition modules operated by the SDO Feature Finding Team (FFT). With the SDO mission recently surpassing five years of operations, and over 280,000 event reports for seven types of solar phenomena, we present the broadest and most comprehensive large-scale dataset of the SDO FFT modules to date. We also present numerous statistics on these modules, providing valuable contextual information for better understanding and validating of the individual event reports and the entire dataset as a whole. After extensive data cleaning through exploratory data analysis, we highlight several opportunities for knowledge discovery from data (KDD). Through these important prerequisite analyses presented here, the results of KDD from Solar Big Data will be overall more reliable and better understood. As the SDO mission remains operational over the coming years, these datasets will continue to grow in size and value. Future versions of this dataset will be analyzed in the general framework established in this work and maintained publicly online for easy access by the community.
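The same HEK event reports can be retrieved programmatically; a minimal sketch using sunpy's HEK client is shown below. The use of sunpy, the search method name and the example time range are assumptions based on recent sunpy releases, not tools named in the paper.

```python
# Minimal HEK query for automatically detected flare reports (sunpy assumed available).
from sunpy.net import attrs as a
from sunpy.net.hek import HEKClient

client = HEKClient()
events = client.search(a.Time("2012-01-01", "2012-01-02"),
                       a.hek.EventType("FL"))     # "FL" = flare reports
print(len(events), "flare reports returned")
```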
Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes
NASA Astrophysics Data System (ADS)
Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao
2010-06-01
To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines; BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments; sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.
Somnam, Sarawut; Jakmunee, Jaroon; Grudpan, Kate; Lenghor, Narong; Motomizu, Shoji
2008-12-01
An automated hydrodynamic sequential injection (HSI) system with spectrophotometric detection was developed. Thanks to the hydrodynamic injection principle, simple devices can be used for introducing reproducible microliter volumes of both sample and reagent into the flow channel to form stacked zones in a similar fashion to those in a sequential injection system. The zones were then pushed to the detector and a peak profile was recorded. The determination of nitrite and nitrate in water samples by employing the Griess reaction was chosen as a model. Calibration graphs with linearity in the range of 0.7-40 µM were obtained for both nitrite and nitrate. Detection limits were found to be 0.3 µM NO(2)(-) and 0.4 µM NO(3)(-), respectively, with a sample throughput of 20 h(-1) for consecutive determination of both species. The developed system was successfully applied to the analysis of water samples, employing simple and cost-effective instrumentation and offering higher degrees of automation and low chemical consumption.
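The figures of merit quoted above (linear range, detection limit) come from routine calibration arithmetic. Below is a small, self-contained sketch of that reduction with placeholder numbers; the standards, responses and blank noise are assumptions, not the paper's data.

```python
# Linear calibration and a 3-sigma detection limit estimate (placeholder data).
import numpy as np

conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])          # standard concentrations, uM (assumed)
resp = np.array([0.002, 0.051, 0.103, 0.198, 0.401])   # detector response, AU (assumed)

slope, intercept = np.polyfit(conc, resp, 1)
blank_sd = 0.001                                        # std dev of replicate blanks (assumed)
lod = 3 * blank_sd / slope                              # 3-sigma detection limit
print(f"slope = {slope:.4f} AU/uM, LOD = {lod:.2f} uM")
```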
Vakh, Christina; Evdokimova, Ekaterina; Pochivalov, Aleksei; Moskvin, Leonid; Bulatov, Andrey
2017-12-15
An easily performed, fully automated and miniaturized flow injection chemiluminescence (CL) method for the determination of phenols in smoked food samples has been proposed. This method includes ultrasound-assisted solid-liquid extraction coupled with gas-diffusion separation of phenols from the smoked food sample and absorption of the analytes into a NaOH solution in a specially designed gas-diffusion cell. The flow system was designed to focus on automation and miniaturization with minimal sample and reagent consumption by inexpensive instrumentation. The luminol-N-bromosuccinimide system in an alkaline medium was used for the CL determination of phenols. The limit of detection of the proposed procedure was 3·10(-8) mol L(-1) (0.01 mg kg(-1)) in terms of phenol. The presented method proved to be a good tool for easy, rapid and cost-effective point-of-need screening of phenols in smoked food samples.
Automated Sensitivity Analysis of Interplanetary Trajectories
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
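As a rough illustration of what such a trade-study driver does, the sketch below sweeps two input parameters, re-runs an external optimizer for each case and tabulates a figure of merit. The executable name, option-file format and summary parsing are hypothetical placeholders, not EMTG's or PEATSA's actual interfaces.

```python
# Generic parameter-sweep driver for an external trajectory optimizer (hypothetical tool).
import csv
import itertools
import subprocess
from pathlib import Path

launch_dates = ["2031-04-01", "2031-05-01", "2031-06-01"]   # assumed sweep values
thrust_margins = [0.85, 0.90, 0.95]

rows = []
for date, margin in itertools.product(launch_dates, thrust_margins):
    case = Path(f"case_{date}_{margin}")
    case.mkdir(exist_ok=True)
    (case / "options.txt").write_text(f"launch_date {date}\nthrust_margin {margin}\n")
    subprocess.run(["trajectory_optimizer", str(case / "options.txt")], check=False)
    summary = case / "summary.txt"                          # hypothetical output file
    if summary.exists():
        rows.append({"date": date, "margin": margin,
                     "delivered_mass_kg": float(summary.read_text().split()[-1])})

with open("sweep_results.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "margin", "delivered_mass_kg"])
    writer.writeheader()
    writer.writerows(rows)
```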
Jungkind, D
2001-01-01
While it is an extremely powerful and versatile assay method, polymerase chain reaction (PCR) can be a labor-intensive process. Since the advent of commercial test kits from Roche and the semi-automated microwell Amplicor system, PCR has become an increasingly useful and widespread clinical tool. However, more widespread acceptance of molecular testing will depend upon automation that allows molecular assays to enter the routine clinical laboratory. The forces driving the need for automated PCR are the requirements for diagnosis and treatment of chronic viral diseases, economic pressures to develop more automated and less expensive test procedures similar to those in the clinical chemistry laboratories, and a shortage in many areas of qualified laboratory personnel trained in the types of manual procedures used in past decades. The automated Roche COBAS AMPLICOR system has automated the amplification and detection process. Specimen preparation remains the most labor-intensive part of the PCR testing process, accounting for the majority of the hands-on-time in most of the assays. A new automated specimen preparation system, the COBAS AmpliPrep, was evaluated. The system automatically releases the target nucleic acid, captures the target with specific oligonucleotide probes, which become attached to magnetic beads via a biotin-streptavidin binding reaction. Once attached to the beads, the target is purified and concentrated automatically. Results of 298 qualitative and 57 quantitative samples representing a wide range of virus concentrations analyzed after the COBAS AmpliPrep and manual specimen preparation methods, showed that there was no significant difference in qualitative or quantitative hepatitis C virus (HCV) assay performance, respectively. The AmpliPrep instrument decreased the time required to prepare serum or plasma samples for HCV PCR to under 1 min per sample. This was a decrease of 76% compared to the manual specimen preparation method. Systems that can analyze more samples with higher throughput and that can answer more questions about the nature of the microbes that we can presently only detect and quantitate will be needed in the future.
Application of automation and information systems to forensic genetic specimen processing.
Leclair, Benoît; Scholl, Tom
2005-03-01
During the last 10 years, the introduction of PCR-based DNA typing technologies in forensic applications has been highly successful. This technology has become pervasive throughout forensic laboratories and it continues to grow in prevalence. For many criminal cases, it provides the most probative evidence. Criminal genotype data banking and victim identification initiatives that follow mass-fatality incidents have benefited the most from the introduction of automation for sample processing and data analysis. Attributes of offender specimens including large numbers, high quality and identical collection and processing are ideal for the application of laboratory automation. The magnitude of kinship analysis required by mass-fatality incidents necessitates the application of computing solutions to automate the task. More recently, the development activities of many forensic laboratories are focused on leveraging experience from these two applications to casework sample processing. The trend toward increased prevalence of forensic genetic analysis will continue to drive additional innovations in high-throughput laboratory automation and information systems.
NASA Technical Reports Server (NTRS)
Collow, Allie Marquardt; Bosilovich, Mike; Ullrich, Paul; Hoeck, Ian
2017-01-01
Extreme precipitation events can have a large impact on society through flooding that can result in property destruction, crop losses, economic losses, the spread of water-borne diseases, and fatalities. Observations indicate there has been a statistically significant increase in extreme precipitation events over the past 15 years in the Northeastern United States, and other localized regions of the country have been crippled by record flooding events, for example the flooding that occurred in the Southeast United States associated with Hurricane Matthew in October 2016. Extreme precipitation events in the United States can be caused by various meteorological influences such as extratropical cyclones, tropical cyclones, mesoscale convective complexes, general air mass thunderstorms, upslope flow, fronts, and the North American Monsoon. Reanalyses, such as the Modern-Era Retrospective Analysis for Research and Applications, version 2 (MERRA-2), have become a pivotal tool to study the meteorology surrounding extreme precipitation events. Using days classified as extreme precipitation events based on a combination of observational gauge and radar data, two techniques for the classification of these events are used to gather additional information that can be used to determine how events have changed over time using atmospheric data from MERRA-2. The first is self-organizing maps, an artificial neural network that uses unsupervised learning to cluster like patterns, and the second is an automated detection technique that searches for characteristics in the atmosphere that define a meteorological phenomenon. For example, the automated detection of tropical cyclones searches for a defined area of suppressed sea level pressure, alongside thickness anomalies aloft, indicating the presence of a warm core. These techniques are employed for extreme precipitation events in preselected regions chosen based on an analysis of the climatology of precipitation.
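The tropical-cyclone criterion described (a sea-level-pressure minimum co-located with a warm-core thickness anomaly) can be expressed compactly on gridded reanalysis fields. The sketch below is a simplified illustration of that test; the thresholds, neighborhood size and use of a single thickness-anomaly field are assumptions, not the study's detection settings.

```python
# Simplified warm-core / pressure-minimum candidate detection on 2-D gridded fields.
import numpy as np
from scipy.ndimage import minimum_filter

def candidate_tc_centers(slp, thickness_anom, slp_max=100500.0,
                         anom_min=20.0, neighborhood=5):
    """slp (Pa) and thickness_anom (m) are 2-D lat x lon arrays on the same grid."""
    local_min = (slp == minimum_filter(slp, size=neighborhood)) & (slp < slp_max)
    warm_core = thickness_anom > anom_min
    return np.argwhere(local_min & warm_core)     # (row, col) grid indices of candidates
```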
An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films
NASA Astrophysics Data System (ADS)
Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander
Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn and measures it according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.
NASA Astrophysics Data System (ADS)
Trubilowicz, J. W.; Moore, D.
2015-12-01
Snowpack dynamics and runoff generation in coastal mountain regions are complicated by rain-on-snow (ROS) events. During major ROS events associated with warm, moist air and strong winds, turbulent heat fluxes can produce substantial melt to supplement rainfall, but previous studies suggest this may not be true for smaller, more frequent events. The internal temperature and water content of the snowpack are also expected to influence runoff generation during ROS events: a cold snowpack with no liquid water content will have the ability to store significant amounts of rainfall, whereas a 'ripe' snowpack may begin to melt and generate outflow with little rain input. However, it is not well understood how antecedent snowpack conditions and energy fluxes differ between ROS events that cause large runoff events and those that do not, in large part because major flood-producing ROS events occur infrequently, and thus are often not sampled during short-term research projects. To generate greater understanding of runoff generation over the spectrum of ROS magnitudes and frequencies, we analyzed data from Automated Snow Pillow (ASP) sites, which record hourly air temperature, precipitation and snowpack water equivalent and offer up to several decades of data at each site. We supplemented the ASP data with output from the North American Regional Reanalysis (NARR) product to support point scale snow modeling for 335 ROS event records from six ASP sites in southwestern BC from 2003 to 2013. Our analysis reconstructed the weather conditions, surface energy exchanges, internal mass and energy states of the snowpack, and generation of snow melt and water available for runoff (WAR) for each ROS event. Results indicate that WAR generation during large events is largely independent of the snowpack conditions, but for smaller events, the antecedent snow conditions play a significant role in either damping or enhancing WAR generation.
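One simple way to tabulate water available for runoff (WAR) from snow-pillow records is to add event rainfall to any net loss of snow water equivalent over the event. The sketch below shows only that bookkeeping, under assumed column names; it is not the point-scale energy-balance model used in the study.

```python
# Water available for runoff for one rain-on-snow event (bookkeeping sketch only).
import pandas as pd

def event_war(df: pd.DataFrame) -> float:
    """df: hourly records for one event with 'precip_mm' and 'swe_mm' columns."""
    rain = df["precip_mm"].sum()
    swe_loss = max(0.0, df["swe_mm"].iloc[0] - df["swe_mm"].iloc[-1])   # melt contribution
    return rain + swe_loss          # mm of water available for runoff over the event
```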
Effects of Selected Task Performance Criteria at Initiating Adaptive Task Reallocations
NASA Technical Reports Server (NTRS)
Montgomery, Demaris A.
2001-01-01
In the current report various performance assessment methods used to initiate mode transfers between manual control and automation for adaptive task reallocation were tested. Participants monitored two secondary tasks for critical events while actively controlling a process in a fictional system. One of the secondary monitoring tasks could be automated whenever operators' performance was below acceptable levels. Automation of the secondary task and transfer of the secondary task back to manual control were either human- or machine-initiated. Human-initiated transfers were based on the operator's assessment of the current task demands while machine-initiated transfers were based on the operators' performance. Different performance assessment methods were tested in two separate experiments.
Choe, Leila H; Lee, Kelvin H
2003-10-01
We investigate one approach to assess the quantitative variability in two-dimensional gel electrophoresis (2-DE) separations based on gel-to-gel variability, sample preparation variability, sample load differences, and the effect of automation on image analysis. We observe that 95% of spots present in three out of four replicate gels exhibit less than a 0.52 coefficient of variation (CV) in fluorescent stain intensity (% volume) for a single sample run on multiple gels. When four parallel sample preparations are performed, this value increases to 0.57. We do not observe any significant change in quantitative value for an increase or decrease in sample load of 30% when using appropriate image analysis variables. Increasing use of automation, while necessary in modern 2-DE experiments, does change the observed level of quantitative and qualitative variability among replicate gels. The number of spots that change qualitatively for a single sample run in parallel varies from a CV = 0.03 for fully manual analysis to CV = 0.20 for a fully automated analysis. We present a systematic method by which a single laboratory can measure gel-to-gel variability using only three gel runs.
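The per-spot variability summary reported above is a straightforward calculation once spots are matched across gels; the sketch below assumes a spots-by-replicates matrix of % volume values and reports each spot's CV along with the 95th-percentile CV.

```python
# Coefficient of variation of spot intensity (% volume) across replicate gels.
import numpy as np

def spot_cv_summary(volumes):
    """volumes: 2-D array, rows = matched spots, columns = replicate gels."""
    v = np.asarray(volumes, dtype=float)
    cv = v.std(axis=1, ddof=1) / v.mean(axis=1)
    return cv, np.percentile(cv, 95)      # per-spot CVs and the 95th-percentile CV
```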
Automated Classification of Power Signals
2008-06-01
determine when a transient occurs. The identification of this signal can then be determined by an expert classifier and a series of these...the manual identification and classification of system events. Once events were located, the characteristics were examined to determine if system... identification code, which varies depending on the system classifier that is specified. Figure 3-7 provides an example of a Linux directory containing
Automated Classification and Analysis of Non-metallic Inclusion Data Sets
NASA Astrophysics Data System (ADS)
Abdulsalam, Mohammad; Zhang, Tongsheng; Tan, Jia; Webler, Bryan A.
2018-05-01
The aim of this study is to utilize principal component analysis (PCA), clustering methods, and correlation analysis to condense and examine large, multivariate data sets produced from automated analysis of non-metallic inclusions. Non-metallic inclusions play a major role in defining the properties of steel, and their examination has been greatly aided by automated analysis in scanning electron microscopes equipped with energy dispersive X-ray spectroscopy. The methods were applied to analyze inclusions on two sets of samples: two laboratory-scale samples and four industrial samples from near-finished 4140 alloy steel components with varying machinability. The laboratory samples had well-defined inclusion chemistries, composed of MgO-Al2O3-CaO, spinel (MgO-Al2O3), and calcium aluminate inclusions. The industrial samples contained MnS inclusions as well as (Ca,Mn)S + calcium aluminate oxide inclusions. PCA could be used to reduce inclusion chemistry variables to a 2D plot, which revealed inclusion chemistry groupings in the samples. Clustering methods were used to automatically classify inclusion chemistry measurements into groups, i.e., no user-defined rules were required.
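As a sketch of the workflow described above (PCA to a 2D plot followed by unsupervised grouping), the snippet below uses simulated inclusion chemistries; the composition values and cluster count are assumptions for illustration, and k-means stands in for whichever clustering method the authors applied:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical inclusion data: each row is one inclusion, columns are normalized
# chemistry variables (e.g., MgO, Al2O3, CaO, MnS mass fractions).
rng = np.random.default_rng(1)
chemistry = np.vstack([
    rng.normal([0.60, 0.30, 0.10, 0.00], 0.05, size=(200, 4)),  # spinel-like
    rng.normal([0.10, 0.50, 0.40, 0.00], 0.05, size=(200, 4)),  # calcium aluminate-like
    rng.normal([0.00, 0.10, 0.20, 0.70], 0.05, size=(200, 4)),  # (Ca,Mn)S-like
])

# Reduce the chemistry variables to two principal components for a 2D plot.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(chemistry))

# Group inclusions into chemistry clusters with no user-defined rules.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(labels))
```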
Stangegaard, Michael; Frøslev, Tobias G; Frank-Hansen, Rune; Hansen, Anders J; Morling, Niels
2011-04-01
We have implemented and validated automated protocols for DNA extraction and PCR setup using a Tecan Freedom EVO liquid handler mounted with the Te-MagS magnetic separation device (Tecan, Männedorf, Switzerland). The protocols were validated for accredited forensic genetic work according to ISO 17025 using the Qiagen MagAttract DNA Mini M48 kit (Qiagen GmbH, Hilden, Germany) from fresh whole blood and blood from deceased individuals. The workflow was simplified by returning the DNA extracts to the original tubes minimizing the risk of misplacing samples. The tubes that originally contained the samples were washed with MilliQ water before the return of the DNA extracts. The PCR was setup in 96-well microtiter plates. The methods were validated for the kits: AmpFℓSTR Identifiler, SGM Plus and Yfiler (Applied Biosystems, Foster City, CA), GenePrint FFFL and PowerPlex Y (Promega, Madison, WI). The automated protocols allowed for extraction and addition of PCR master mix of 96 samples within 3.5h. In conclusion, we demonstrated that (1) DNA extraction with magnetic beads and (2) PCR setup for accredited, forensic genetic short tandem repeat typing can be implemented on a simple automated liquid handler leading to the reduction of manual work, and increased quality and throughput. Copyright © 2011 Society for Laboratory Automation and Screening. Published by Elsevier Inc. All rights reserved.
[DNA extraction from bones and teeth using AutoMate Express forensic DNA extraction system].
Gao, Lin-Lin; Xu, Nian-Lai; Xie, Wei; Ding, Shao-Cheng; Wang, Dong-Jing; Ma, Li-Qin; Li, You-Ying
2013-04-01
To explore a new method for automated DNA extraction from bones and teeth. Samples of 33 bones and 15 teeth were prepared by the freeze-mill method and the manual method, respectively. DNA was extracted and quantified from the triturated samples with the AutoMate Express forensic DNA extraction system. DNA extraction from bones and teeth was completed in 3 hours using the AutoMate Express forensic DNA extraction system. There was no statistical difference between the two methods in the DNA concentration obtained from bones. Both bones and teeth gave good STR typing results with the freeze-mill method, and the DNA concentration from teeth was higher than that obtained with the manual method. The AutoMate Express forensic DNA extraction system is a new method for extracting DNA from bones and teeth that can be applied in forensic practice.
Elbeik, Tarek; Loftus, Richard A; Beringer, Scott
2007-11-01
Labor, supply and waste were evaluated for HIV-1 and HCV bDNA on the semi-automated System 340 bDNA Analyzer and the automated VERSANT 440 Molecular System (V440). HIV-1 sample processing was evaluated using a 24- and 48-position centrifuge rotor. Vigilance time (hands-on manipulations plus incubation time except initial target hybridization) and disposables were approximately 37 and 12% lower for HIV-1, and 64 and 31% lower for HCV bDNA, respectively, with V440. Biohazardous solid waste was approximately twofold lower for both assays and other waste types were the same for either assay on both platforms. HIV-1 sample processing vigilance time for the 48-position rotor was reduced by 2 h. V440 provides cost savings and improved workflow.
Wolff, Reuben H.; Wong, Michael F.
2008-01-01
Since November 1998, water-quality data have been collected from the H-3 Highway Storm Drain C, which collects runoff from a 4-mi-long viaduct, and from Halawa Stream on Oahu, Hawaii. From January 2001 to August 2004, data were collected from the storm drain and four stream sites in the Halawa Stream drainage basin as part of the State of Hawaii Department of Transportation Storm Water Monitoring Program. Data from the stormwater monitoring program have been published in annual reports. This report uses these water-quality data to explore how the highway storm-drain runoff affects Halawa Stream and the factors that might be controlling the water quality in the drainage basin. In general, concentrations of nutrients, total dissolved solids, and total suspended solids were lower in highway runoff from Storm Drain C than at stream sites upstream and downstream of Storm Drain C. The opposite trend was observed for most trace metals, which generally occurred in higher concentrations in the highway runoff from Storm Drain C than in the samples collected from Halawa Stream. The absolute contribution from Storm Drain C highway runoff, in terms of total storm loads, was much smaller than at stations upstream and downstream, whereas the constituent yields (the relative contribution per unit drainage basin area) at Storm Drain C were comparable to or higher than storm yields at stations upstream and downstream. Most constituent concentrations and loads in stormwater runoff increased in a downstream direction. The timing of the storm sampling is an important factor controlling constituent concentrations observed in stormwater runoff samples. Automated point samplers were used to collect grab samples during the period of increasing discharge of the storm throughout the stormflow peak and during the period of decreasing discharge of the storm, whereas manually collected grab samples were generally collected during the later stages near the end of the storm. Grab samples were analyzed to determine concentrations and loads at a particular point in time. Flow-weighted time composite samples from the automated point samplers were analyzed to determine mean constituent concentrations or loads during a storm. Chemical analysis of individual grab samples from the automated point sampler at Storm Drain C demonstrated the "first flush" phenomenon (higher constituent concentrations at the beginning of runoff events) for the trace metals cadmium, lead, zinc, and copper, whose concentrations were initially high during the period of increasing discharge and gradually decreased over the duration of the storm. Water-quality data from Storm Drain C and four stream sites were compared to the State of Hawaii Department of Health (HDOH) water-quality standards to determine the effects of highway storm runoff on the water quality of Halawa Stream. The geometric-mean standards and the 10- and 2-percent-of-the-time concentration standards for total nitrogen, nitrite plus nitrate, total phosphorus, total suspended solids, and turbidity were exceeded in many of the comparisons. However, these standards were not designed for stormwater sampling, in which constituent concentrations would be expected to increase for short periods of time. With the aim of enhancing the usefulness of the water-quality data, several modifications to the stormwater monitoring program are suggested.
These suggestions include (1) periodic analysis of discrete samples from the automated point samplers over the course of a storm to get a clearer profile of the storm, from first flush to the end of the receding discharge; (2) adding an analysis of the dissolved fractions of metals to the sampling plan; (3) installation of an automatic sampler at Bridge 8 to enable sampling earlier in the storms; (4) a one-time sampling and analysis of soils upstream of Bridge 8 for baseline contaminant concentrations; (5) collection of samples from Halawa Stream during low-flow conditions
Setting objective thresholds for rare event detection in flow cytometry
Richards, Adam J.; Staats, Janet; Enzor, Jennifer; McKinnon, Katherine; Frelinger, Jacob; Denny, Thomas N.; Weinhold, Kent J.; Chan, Cliburn
2014-01-01
The accurate identification of rare antigen-specific cytokine positive cells from peripheral blood mononuclear cells (PBMC) after antigenic stimulation in an intracellular staining (ICS) flow cytometry assay is challenging, as cytokine positive events may be fairly diffusely distributed and lack an obvious separation from the negative population. Traditionally, the approach by flow operators has been to manually set a positivity threshold to partition events into cytokine-positive and cytokine-negative. This approach suffers from subjectivity and inconsistency across different flow operators. The use of statistical clustering methods does not remove the need to find an objective threshold between positive and negative events since consistent identification of rare event subsets is highly challenging for automated algorithms, especially when there is distributional overlap between the positive and negative events (“smear”). We present a new approach, based on the Fβ measure, that is similar to manual thresholding in providing a hard cutoff, but has the advantage of being determined objectively. The performance of this algorithm is compared with results obtained by expert visual gating. Several ICS data sets from the External Quality Assurance Program Oversight Laboratory (EQAPOL) proficiency program were used to make the comparisons. We first show that visually determined thresholds are difficult to reproduce and pose a problem when comparing results across operators or laboratories, as well as problems that occur with the use of commonly employed clustering algorithms. In contrast, a single parameterization for the Fβ method performs consistently across different centers, samples, and instruments because it optimizes the precision/recall tradeoff by using both negative and positive controls. PMID:24727143
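The essential idea, picking a hard positivity cutoff that optimizes the precision/recall tradeoff via the Fβ measure, can be sketched as below. This is a deliberate simplification of the published method: the intensities are simulated, the β value is an arbitrary choice, and labelling cells by which control tube they came from is only a crude proxy for true cytokine positivity.

```python
import numpy as np
from sklearn.metrics import fbeta_score

# Hypothetical per-cell cytokine intensities from a negative (unstimulated) and a
# positive (stimulated) control; the positive control contains a rare bright subset.
rng = np.random.default_rng(2)
negative_ctrl = rng.normal(1.0, 0.3, 50_000)
positive_ctrl = np.concatenate([rng.normal(1.0, 0.3, 49_500),
                                rng.normal(3.0, 0.5, 500)])

intensities = np.concatenate([negative_ctrl, positive_ctrl])
labels = np.concatenate([np.zeros_like(negative_ctrl),
                         np.ones_like(positive_ctrl)]).astype(int)

beta = 2.0  # assumed weighting of recall vs. precision, not the paper's value
candidates = np.quantile(intensities, np.linspace(0.90, 0.9999, 200))
scores = [fbeta_score(labels, (intensities >= t).astype(int), beta=beta) for t in candidates]
threshold = candidates[int(np.argmax(scores))]
print(f"objective positivity threshold: {threshold:.2f}")
```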
Portable Automation of Static Chamber Sample Collection for Quantifying Soil Gas Flux
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Morgan P.; Groh, Tyler A.; Parkin, Timothy B.
Quantification of soil gas flux using the static chamber method is labor intensive. The number of chambers that can be sampled is limited by the spacing between chambers and the availability of trained research technicians. An automated system for collecting gas samples from chambers in the field would eliminate the need for personnel to return to the chamber during a flux measurement period and would allow a single technician to sample multiple chambers simultaneously. This study describes the Flux Chamber Automated Sampling Equipment (FluxCASE) to collect and store chamber headspace gas samples at assigned time points for the measurement of soil gas flux. The FluxCASE design and operation are described, and the accuracy and precision of the FluxCASE system are evaluated. In laboratory measurements of nitrous oxide (N2O), carbon dioxide (CO2), and methane (CH4) concentrations of a standardized gas mixture, coefficients of variation associated with automated and manual sample collection were comparable, indicating no loss of precision. In the field, soil gas fluxes measured from FluxCASEs were in agreement with manual sampling for both N2O and CO2. Slopes of regression equations were 1.01 for CO2 and 0.97 for N2O. The 95% confidence limits of the slopes of the regression lines included the value of one, indicating no bias. Additionally, an expense analysis found a cost recovery ranging from 0.6 to 2.2 yr. Implementing the FluxCASE system is an alternative to improve the efficiency of the static chamber method for measuring soil gas flux while maintaining the accuracy and precision of manual sampling.
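The bias check reported above (a regression slope whose 95% CI includes 1) can be reproduced on paired flux data roughly as follows; the flux values here are simulated stand-ins, not FluxCASE measurements:

```python
import numpy as np
from scipy import stats

# Hypothetical paired chamber fluxes (e.g., N2O) measured manually and by an
# automated sampler on the same chambers.
rng = np.random.default_rng(3)
manual = rng.lognormal(mean=2.0, sigma=0.6, size=40)
automated = 0.98 * manual + rng.normal(0.0, 0.3, size=40)

# Regress automated on manual fluxes; no bias if the slope's 95% CI covers 1.
fit = stats.linregress(manual, automated)
t_crit = stats.t.ppf(0.975, df=len(manual) - 2)
ci_low, ci_high = fit.slope - t_crit * fit.stderr, fit.slope + t_crit * fit.stderr
print(f"slope = {fit.slope:.2f}, 95% CI = ({ci_low:.2f}, {ci_high:.2f})")
```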
Zhou, Zhi; Pons, Marie Noëlle; Raskin, Lutgarde; Zilles, Julie L
2007-05-01
When fluorescence in situ hybridization (FISH) analyses are performed with complex environmental samples, difficulties related to the presence of microbial cell aggregates and nonuniform background fluorescence are often encountered. The objective of this study was to develop a robust and automated quantitative FISH method for complex environmental samples, such as manure and soil. The method and duration of sample dispersion were optimized to reduce the interference of cell aggregates. An automated image analysis program that detects cells from 4',6'-diamidino-2-phenylindole (DAPI) micrographs and extracts the maximum and mean fluorescence intensities for each cell from corresponding FISH images was developed with the software Visilog. Intensity thresholds were not consistent even for duplicate analyses, so alternative ways of classifying signals were investigated. In the resulting method, the intensity data were divided into clusters using fuzzy c-means clustering, and the resulting clusters were classified as target (positive) or nontarget (negative). A manual quality control confirmed this classification. With this method, 50.4, 72.1, and 64.9% of the cells in two swine manure samples and one soil sample, respectively, were positive as determined with a 16S rRNA-targeted bacterial probe (S-D-Bact-0338-a-A-18). Manual counting resulted in corresponding values of 52.3, 70.6, and 61.5%, respectively. In two swine manure samples and one soil sample 21.6, 12.3, and 2.5% of the cells were positive with an archaeal probe (S-D-Arch-0915-a-A-20), respectively. Manual counting resulted in corresponding values of 22.4, 14.0, and 2.9%, respectively. This automated method should facilitate quantitative analysis of FISH images for a variety of complex environmental samples.
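A minimal stand-in for the classification step described above (fuzzy c-means on per-cell intensities, with the brighter cluster treated as probe-positive) is sketched below; the actual study used Visilog-based image analysis, and the intensity values and cluster count here are simulated assumptions:

```python
import numpy as np

def fuzzy_cmeans_1d(x, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Minimal 1-D fuzzy c-means; returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1 per cell
    for _ in range(n_iter):
        w = u ** m
        centres = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
        d = np.abs(x[:, None] - centres) + 1e-12   # distance of each cell to each centre
        inv = d ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)   # standard FCM membership update
    return centres, u

# Hypothetical per-cell maximum FISH intensities extracted from the probe channel.
rng = np.random.default_rng(4)
intensity = np.concatenate([rng.normal(40, 8, 600),     # background (non-target) cells
                            rng.normal(120, 20, 400)])  # probe-positive cells

centres, u = fuzzy_cmeans_1d(intensity, n_clusters=2)
positive_cluster = int(np.argmax(centres))              # brighter cluster = target
fraction_positive = (np.argmax(u, axis=1) == positive_cluster).mean()
print(f"fraction of cells classified as probe-positive: {fraction_positive:.1%}")
```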
Automated Sensitivity Analysis of Interplanetary Trajectories for Optimal Mission Design
NASA Technical Reports Server (NTRS)
Knittel, Jeremy; Hughes, Kyle; Englander, Jacob; Sarli, Bruno
2017-01-01
This work describes a suite of Python tools known as the Python EMTG Automated Trade Study Application (PEATSA). PEATSA was written to automate the operation of trajectory optimization software, simplify the process of performing sensitivity analysis, and was ultimately found to out-perform a human trajectory designer in unexpected ways. These benefits will be discussed and demonstrated on sample mission designs.
Ruaño, Gualberto; Kocherla, Mohan; Graydon, James S; Holford, Theodore R; Makowski, Gregory S; Goethe, John W
2016-05-01
We describe a population genetic approach to compare samples interpreted with expert calling (EC) versus automated calling (AC) for CYP2D6 haplotyping. The analysis represents 4812 haplotype calls based on signal data generated by the Luminex xMap analyzers from 2406 patients referred to a high-complexity molecular diagnostics laboratory for CYP450 testing. DNA was extracted from buccal swabs. We compared the results of expert calls (EC) and automated calls (AC) with regard to haplotype number and frequency. The ratio of EC to AC was 1:3. Haplotype frequencies from EC and AC samples were convergent across haplotypes, and their distribution was not statistically different between the groups. Most duplications required EC, as only expansions with homozygous or hemizygous haplotypes could be called automatically. High-complexity laboratories can offer equivalent interpretation to automated calling for non-expanded CYP2D6 loci, and superior interpretation for duplications. We have validated scientific expert calling specified by scoring rules as standard operating procedure integrated with an automated calling algorithm. The integration of EC with AC is a practical strategy for CYP2D6 clinical haplotyping. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Chapman, K. B.; Cox, C. M.; Thomas, C. W.; Cuevas, O. O.; Beckman, R. M.
1994-01-01
The Flight Dynamics Facility (FDF) at the NASA Goddard Space Flight Center (GSFC) generates numerous products for NASA-supported spacecraft, including the Tracking and Data Relay Satellites (TDRS's), the Hubble Space Telescope (HST), the Extreme Ultraviolet Explorer (EUVE), and the space shuttle. These products include orbit determination data, acquisition data, event scheduling data, and attitude data. In most cases, product generation involves repetitive execution of many programs. The increasing number of missions supported by the FDF has necessitated the use of automated systems to schedule, execute, and quality assure these products. This automation allows the delivery of accurate products in a timely and cost-efficient manner. To be effective, these systems must automate as many repetitive operations as possible and must be flexible enough to meet changing support requirements. The FDF Orbit Determination Task (ODT) has implemented several systems that automate product generation and quality assurance (QA). These systems include the Orbit Production Automation System (OPAS), the New Enhanced Operations Log (NEOLOG), and the Quality Assurance Automation Software (QA Tool). Implementation of these systems has resulted in a significant reduction in required manpower, elimination of shift work and most weekend support, and improved support quality, while incurring minimal development cost. This paper will present an overview of the concepts used and experiences gained from the implementation of these automation systems.
Moreno-Duarte, Ingrid; Montenegro, Julio; Balonov, Konstantin; Schumann, Roman
2017-04-15
Most modern anesthesia workstations provide automated checkout, which indicates the readiness of the anesthesia machine. In this case report, an anesthesia machine passed the automated machine checkout. Minutes after the induction of general anesthesia, we observed a mismatch between the selected and delivered tidal volumes in the volume auto flow mode with increased inspiratory resistance during manual ventilation. Endotracheal tube kinking, circuit obstruction, leaks, and patient-related factors were ruled out. Further investigation revealed a broken internal insert within the CO2 absorbent canister that allowed absorbent granules to cause a partial obstruction to inspiratory and expiratory flow triggering contradictory alarms. We concluded that even when the automated machine checkout indicates machine readiness, unforeseen equipment failure due to unexpected events can occur and require providers to remain vigilant.
A Risk Assessment System with Automatic Extraction of Event Types
NASA Astrophysics Data System (ADS)
Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula
In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting as early as possible weak signals of emerging risks, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
Development of a generic GMCC simulator.
DOT National Transportation Integrated Search
2001-11-01
This document describes the development and current status of a high fidelity, human-in-the-loop simulator for Airway Facilities Maintenance Control Centers and Operations Control Centers. Applications include Event Manager, Maintenance Automation ...
Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat
2011-05-27
In this paper, automated sample preparation, retention time locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for the metabolomics study were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of 4 types (2 types×2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention time locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches were used for data analysis. XCMS followed by principal component analysis (approach 1) and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2) were compared. Several features that were up- or down-regulated in the different types were detected. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Javaux, Denis; Masson, Michel; Dekeyser, Veronique
1994-01-01
There is currently a growing interest in the aeronautical community in assessing the effects of increasing levels of automation on pilots' performance and overall safety. The first effect of automation is the change in the nature of the pilot's role on the flight deck. Pilots have become supervisors who monitor aircraft systems in usual situations and intervene only when unanticipated events occur. Instead of 'hand flying' the airplane, pilots contribute to the control of the aircraft by acting as mediators, giving instructions to the automation. By eliminating the need to manually control normal situations, such a role division has reduced the opportunities for the pilot to acquire the experience and skills necessary to safely cope with abnormal events. Difficulties in assessing the state and behavior of automation arise mainly from four factors: (1) the complexity of current systems and the consequent mode-related problems; (2) the intrinsic autonomy of automation, which is able to fire mode transitions without explicit commands from the pilots; (3) the poor quality of feedback from the control systems' displays and interfaces to the pilots; and (4) the fact that the automation currently has no explicit representation of the pilots' current intentions and strategy. Assuming that certification has among its major goals guaranteeing the passengers' and pilots' safety and the airplane's integrity under normal and abnormal operational conditions, the authors suggest it would be particularly fruitful to come up with a conceptual reference system providing the certification authorities with both a theoretical framework and a list of principles usable for assessing the quality of the equipment and designs under examination. This is precisely the scope of this paper. However, the authors recognize that the conceptual framework presented is still under development and would thus be best considered as a source of reflection for the design, evaluation and certification processes of advanced aviation technologies.
Automated analysis of cell migration and nuclear envelope rupture in confined environments.
Elacqua, Joshua J; McGregor, Alexandra L; Lammerding, Jan
2018-01-01
Recent in vitro and in vivo studies have highlighted the importance of the cell nucleus in governing migration through confined environments. Microfluidic devices that mimic the narrow interstitial spaces of tissues have emerged as important tools to study cellular dynamics during confined migration, including the consequences of nuclear deformation and nuclear envelope rupture. However, while image acquisition can be automated on motorized microscopes, the analysis of the corresponding time-lapse sequences for nuclear transit through the pores and events such as nuclear envelope rupture currently requires manual analysis. In addition to being highly time-consuming, such manual analysis is susceptible to person-to-person variability. Studies that compare large numbers of cell types and conditions therefore require automated image analysis to achieve sufficiently high throughput. Here, we present an automated image analysis program to register microfluidic constrictions and perform image segmentation to detect individual cell nuclei. The MATLAB program tracks nuclear migration over time and records constriction-transit events, transit times, transit success rates, and nuclear envelope rupture. Such automation reduces the time required to analyze migration experiments from weeks to hours, and removes the variability that arises from different human analysts. Comparison with manual analysis confirmed that both constriction transit and nuclear envelope rupture were detected correctly and reliably, and the automated analysis results closely matched a manual analysis gold standard. Applying the program to specific biological examples, we demonstrate its ability to detect differences in nuclear transit time between cells with different levels of the nuclear envelope proteins lamin A/C, which govern nuclear deformability, and to detect an increase in nuclear envelope rupture duration in cells in which CHMP7, a protein involved in nuclear envelope repair, had been depleted. The program thus presents a versatile tool for the study of confined migration and its effect on the cell nucleus.
Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike
2013-01-01
In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to the fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days to months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
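A simplified sketch of the two ingredients named above, the running STA/LTA energy ratio and a cross-correlation check of that ratio across traces, is given below. The trace data, window lengths, and the use of the array-stacked ratio as the correlation reference are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def sta_lta(trace, sta_len, lta_len):
    """Running short-term-average / long-term-average energy ratio for one trace."""
    energy = trace ** 2
    sta = np.convolve(energy, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(energy, np.ones(lta_len) / lta_len, mode="same")
    return sta / (lta + 1e-12)

# Hypothetical surface array: 10 noisy traces sharing one weak microseismic arrival.
rng = np.random.default_rng(5)
n_traces, n_samples, onset = 10, 5000, 3000
traces = rng.normal(0.0, 1.0, (n_traces, n_samples))
traces[:, onset:onset + 200] += 0.8 * np.sin(np.linspace(0, 40 * np.pi, 200))

ratios = np.array([sta_lta(t, sta_len=50, lta_len=500) for t in traces])

# Compare each trace's energy ratio with the array-stacked ratio: high similarity
# across stations suggests a coherent (array-wide) event rather than local noise.
reference = ratios.mean(axis=0)
similarity = np.mean([np.corrcoef(r, reference)[0, 1] for r in ratios])
print(f"peak stacked STA/LTA = {reference.max():.2f}, mean trace-to-stack correlation = {similarity:.2f}")
```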
Automated CD-SEM recipe creation technology for mass production using CAD data
NASA Astrophysics Data System (ADS)
Kawahara, Toshikazu; Yoshida, Masamichi; Tanaka, Masashi; Ido, Sanyu; Nakano, Hiroyuki; Adachi, Naokaka; Abe, Yuichi; Nagatomo, Wataru
2011-03-01
Critical Dimension Scanning Electron Microscope (CD-SEM) recipe creation requires sample preparation for pattern-matching registration and recipe creation on the CD-SEM using that sample, which hinders the reduction of test production cost and time in semiconductor manufacturing factories. From the perspective of cost reduction and improved test production efficiency, automated CD-SEM recipe creation without sample preparation and manual operation has been important on production lines. For automated CD-SEM recipe creation, we have introduced RecipeDirector (RD), which enables recipe creation using Computer-Aided Design (CAD) data and text data that include measurement information. We have developed a system that automatically creates the CAD data and text data necessary for recipe creation on RD, and, to eliminate manual operation, we have enhanced RD so that all measurement information can be specified in the text data. As a result, we have established an automated CD-SEM recipe creation system that requires neither sample preparation nor manual operation. For the introduction of the CD-SEM recipe creation system using RD to the production lines, the accuracy of the pattern matching was an issue. The design templates for matching, created from the CAD data, differed in appearance from the SEM images. Thus, the development of a robust pattern-matching algorithm that accounts for this shape difference was needed. The addition of image processing of the matching templates and shape processing of the lower-layer CAD patterns has enabled robust pattern matching. This paper describes the automated CD-SEM recipe creation technology for production lines, without sample preparation or manual operation, using RD as applied at the Sony Semiconductor Kyusyu Corporation Kumamoto Technology Center (SCK Corporation Kumamoto TEC).
Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John
2013-01-01
Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as achievable heating and cooling rates, restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput and the successful and efficient translation of materials processing knowledge to production-scale systems rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes. PMID:24289435
Ni, Yizhao; Lingren, Todd; Hall, Eric S; Leonard, Matthew; Melton, Kristin; Kirkendall, Eric S
2018-05-01
Timely identification of medication administration errors (MAEs) promises great benefits for mitigating medication errors and associated harm. Despite previous efforts utilizing computerized methods to monitor medication errors, sustaining effective and accurate detection of MAEs remains challenging. In this study, we developed a real-time MAE detection system and evaluated its performance prior to system integration into institutional workflows. Our prospective observational study included automated MAE detection of 10 high-risk medications and fluids for patients admitted to the neonatal intensive care unit at Cincinnati Children's Hospital Medical Center during a 4-month period. The automated system extracted real-time medication use information from the institutional electronic health records and identified MAEs using logic-based rules and natural language processing techniques. The MAE summary was delivered via a real-time messaging platform to promote reduction of patient exposure to potential harm. System performance was validated using a physician-generated gold standard of MAE events, and results were compared with those of current practice (incident reporting and trigger tools). Physicians identified 116 MAEs from 10,104 medication administrations during the study period. Compared to current practice, the sensitivity with automated MAE detection was improved significantly from 4.3% to 85.3% (P = .009), with a positive predictive value of 78.0%. Furthermore, the system showed potential to reduce patient exposure to harm, from 256 min to 35 min (P < .001). The automated system demonstrated improved capacity for identifying MAEs while guarding against alert fatigue. It also showed promise for reducing patient exposure to potential harm following MAE events.
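For orientation, the reported sensitivity and positive predictive value follow directly from confusion-matrix counts. The counts below are illustrative values chosen to reproduce the abstract's 85.3% and 78.0% against the 116 physician-identified MAEs; they are not taken from the study itself:

```python
# Illustrative counts consistent with (but not stated by) the abstract.
true_positives = 99    # system-flagged MAEs confirmed by the physician gold standard
false_negatives = 17   # gold-standard MAEs the system missed (99 + 17 = 116)
false_positives = 28   # system alerts not confirmed as MAEs

sensitivity = true_positives / (true_positives + false_negatives)
ppv = true_positives / (true_positives + false_positives)
print(f"sensitivity = {sensitivity:.1%}, positive predictive value = {ppv:.1%}")
```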
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hawkins, C.; Dietz, M.; Kaminski, M.
2016-03-01
A technical program to support the Centers for Disease Control and Prevention is being developed to provide an analytical method for rapid extraction of Sr-90 from urine, with the intent of assessing the general population's exposure during an emergency response to a radiological terrorist event. Results are presented on the progress in urine sample preparation and chemical separation steps that provide an accurate and quantitative detection of Sr-90 based upon an automated column separation sequence and a liquid scintillation assay. Batch extractions were used to evaluate the urine pretreatment and the column separation efficiency and loading capacity based upon commercial, extractant-loaded resins. An efficient pretreatment process for decolorizing and removing organics from urine without measurable loss of radiostrontium from the sample was demonstrated. In addition, the Diphonix® resin shows promise for the removal of high concentrations of common strontium interferents in urine as a first separation step for Sr-90 analysis.
Banks, Victoria A; Stanton, Neville A
2015-01-01
Automated assistance in driving emergencies aims to improve the safety of our roads by avoiding or mitigating the effects of accidents. However, the behavioural implications of such systems remain unknown. This paper introduces the driver decision-making in emergencies (DDMiEs) framework to investigate how the level and type of automation may affect driver decision-making and subsequent responses to critical braking events using network analysis to interrogate retrospective verbalisations. Four DDMiE models were constructed to represent different levels of automation within the driving task and its effects on driver decision-making. Findings suggest that whilst automation does not alter the decision-making pathway (e.g. the processes between hazard detection and response remain similar), it does appear to significantly weaken the links between information-processing nodes. This reflects an unintended yet emergent property within the task network that could mean that we may not be improving safety in the way we expect. This paper contrasts models of driver decision-making in emergencies at varying levels of automation using the Southampton University Driving Simulator. Network analysis of retrospective verbalisations indicates that increasing the level of automation in driving emergencies weakens the link between information-processing nodes essential for effective decision-making.
Non-Uniform Sampling and J-UNIO Automation for Efficient Protein NMR Structure Determination.
Didenko, Tatiana; Proudfoot, Andrew; Dutta, Samit Kumar; Serrano, Pedro; Wüthrich, Kurt
2015-08-24
High-resolution structure determination of small proteins in solution is one of the big assets of NMR spectroscopy in structural biology. Improvements in the efficiency of NMR structure determination by advances in NMR experiments and automation of data handling therefore attract continued interest. Here, non-uniform sampling (NUS) of 3D heteronuclear-resolved [(1)H,(1)H]-NOESY data yielded two- to three-fold savings of instrument time for structure determinations of soluble proteins. With the 152-residue protein NP_372339.1 from Staphylococcus aureus and the 71-residue protein NP_346341.1 from Streptococcus pneumoniae, we show that high-quality structures can be obtained with NUS NMR data, which are equally well amenable to robust automated analysis as the corresponding uniformly sampled data. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Automated diagnostic kiosk for diagnosing diseases
Regan, John Frederick; Birch, James Michael
2014-02-11
An automated and autonomous diagnostic apparatus that is capable of dispensing collection vials and collection kits to users interested in collecting a biological sample and submitting their collected sample, contained within a collection vial, to the apparatus for automated diagnostic services. The user communicates with the apparatus through a touch-screen monitor. A user is able to enter personal information into the apparatus, including medical history, insurance information, and co-payment, and to answer a series of questions regarding their illness, which are used to determine the assay most likely to yield a positive result. Remotely located physicians can communicate with users of the apparatus using video tele-medicine and request specific assays to be performed. The apparatus archives submitted samples for additional testing. Users may receive their assay results electronically. Users may allow the uploading of their diagnoses into a central databank for disease surveillance purposes.
Kilaru, Austin S; Leffer, Marc; Perkner, John; Sawyer, Kate Flanigan; Jolley, Chandra E; Nadkarni, Lindsay D; Shofer, Frances S; Merchant, Raina M
2014-01-01
Federal Occupational Health (FOH) administers a nationwide public access defibrillation program in US federal buildings. We describe the use of automated external defibrillators (AEDs) in federal buildings and evaluate survival after cardiac arrest. Using the FOH database, we examined reported events in which an AED was brought to a medical emergency in federal buildings over a 14-year period, from 1999 to 2012. There were 132 events involving an AED, 96 (73%) of which were due to cardiac arrest of cardiac etiology. Of 54 people who were witnessed to experience a cardiac arrest and presented with ventricular fibrillation or ventricular tachycardia, 21 (39%) survived to hospital discharge. Public access defibrillation, along with protocols to install, maintain, and deploy AEDs and train first responders, benefits survival after cardiac arrest in the workplace.
Use of Automated External Defibrillators in US Federal Buildings
Kilaru, Austin S.; Leffer, Marc; Perkner, John; Sawyer, Kate Flanigan; Jolley, Chandra E.; Nadkarni, Lindsay D.; Shofer, Frances S.; Merchant, Raina M.
2014-01-01
Objective: Federal Occupational Health (FOH) administers a nationwide public access defibrillation program in US federal buildings. We describe the use of automated external defibrillators (AEDs) in federal buildings and evaluate survival after cardiac arrest. Methods: Using the FOH database, we examined reported events in which an AED was brought to a medical emergency in federal buildings over a 14-year period, from 1999 to 2012. Results: There were 132 events involving an AED, 96 (73%) of which were due to cardiac arrest of cardiac etiology. Of 54 people who were witnessed to experience a cardiac arrest and presented with ventricular fibrillation or ventricular tachycardia, 21 (39%) survived to hospital discharge. Conclusions: Public access defibrillation, along with protocols to install, maintain, and deploy AEDs and train first responders, benefits survival after cardiac arrest in the workplace. PMID:24351893
Peripheral refractive correction and automated perimetric profiles.
Wild, J M; Wood, J M; Crews, S J
1988-06-01
The effect of peripheral refractive error correction on the automated perimetric sensitivity profile was investigated on a sample of 10 clinically normal, experienced observers. Peripheral refractive error was determined at eccentricities of 0 degree, 20 degrees and 40 degrees along the temporal meridian of the right eye using the Canon Autoref R-1, an infra-red automated refractor, under the parametric conditions of the Octopus automated perimeter. Perimetric sensitivity was then measured at these eccentricities (stimulus sizes 0 and III) with and without the appropriate peripheral refractive correction using the Octopus 201 automated perimeter. Within the measurement limits of the experimental procedures employed, perimetric sensitivity was not influenced by peripheral refractive correction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bostick, Debra A.; Hexel, Cole R.; Ticknor, Brian W.
2016-11-01
To shorten the lengthy and costly manual chemical purification procedures, sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment. This addresses a serious need in the nuclear safeguards community to debottleneck the separation of U and Pu in environmental samples—currently performed by overburdened chemists—with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on current COTS equipment that was modified for U/Pu separations utilizing Eichrom™ TEVA and UTEVA resins. Initial verification of individual columns yielded small elution volumes with consistent elution profiles and good recovery. Combined column calibration demonstrated ample separation without cross-contamination of the eluent. Automated packing and unpacking of the built-in columns initially showed >15% deviation in resin loading by weight, which can lead to inconsistent separations. Optimization of the packing and unpacking methods led to a reduction in the variability of the packed resin to less than 5% daily. The reproducibility of the automated system was tested with samples containing 30 ng U and 15 pg Pu, which were separated in a series with alternating reagent blanks. These experiments showed very good washout of both the resin and the sample from the columns as evidenced by low blank values. Analysis of the major and minor isotope ratios for U and Pu provided values well within data quality limits for the International Atomic Energy Agency. Additionally, system process blanks spiked with 233U and 244Pu tracers were separated using the automated system after it was moved outside of a clean room and yielded levels equivalent to clean room blanks, confirming that the system can produce high quality results without the need for expensive clean room infrastructure. Comparison of the amount of personnel time necessary for successful manual vs. automated chemical separations showed a significant decrease in hands-on time from 9.8 hours to 35 minutes for seven samples. This documented time savings and reduced labor translate to a significant cost savings per sample. Overall, the system will enable faster sample reporting times at reduced costs by limiting personnel hours dedicated to the chemical separation.
Autonomous Multi-Sensor Coordination: The Science Goal Monitor
NASA Technical Reports Server (NTRS)
Koratkar, Anuradha; Grosvenor, Sandy; Jung, John; Hess, Melissa; Jones, Jeremy
2004-01-01
Many dramatic earth phenomena are dynamic and coupled. In order to fully understand them, we need to obtain timely coordinated multi-sensor observations from widely dispersed instruments. Such a dynamic observing system must include the ability to: schedule flexibly and react autonomously to science/user-driven events; understand the higher-level goals of a science/user-defined campaign; and coordinate various space-based and ground-based resources/sensors effectively and efficiently to achieve those goals. In order to capture transient events, such a 'sensor web' system must have an automated reactive capability built into its scientific operations. To do this, we must overcome a number of challenges inherent in infusing autonomy. The Science Goal Monitor (SGM) is a prototype software tool being developed to explore the nature of automation necessary to enable dynamic observing. The tools being developed in SGM improve our ability to autonomously monitor multiple independent sensors and coordinate reactions to better observe dynamic phenomena. The SGM system enables users to specify what to look for and how to react in descriptive rather than technical terms. The system monitors streams of data to identify occurrences of the key events previously specified by the scientist/user. When an event occurs, the system autonomously coordinates the execution of the users' desired reactions between different sensors. The information can be used to rapidly respond to a variety of fast temporal events. Investigators will no longer have to rely on after-the-fact data analysis to determine what happened. Our paper describes a series of prototype demonstrations that we have developed using SGM and NASA's Earth Observing-1 (EO-1) satellite and the Earth Observing System's Aqua/Terra spacecrafts' MODIS instrument. Our demonstrations show the promise of coordinating data from different sources, analyzing the data for a relevant event, autonomously updating and rapidly obtaining a follow-on relevant image. SGM was used to investigate forest fires, floods and volcanic eruptions. We are now identifying new Earth science scenarios that will have more complex SGM reasoning. By developing and testing a prototype in an operational environment, we are also establishing and gathering metrics to gauge the success of automating science campaigns.
Automated transient identification in the Dark Energy Survey
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goldstein, D. A.
2015-08-20
We describe an algorithm for identifying point-source transients and moving objects on reference-subtracted optical images containing artifacts of processing and instrumentation. The algorithm makes use of the supervised machine learning technique known as Random Forest. We present results from its use in the Dark Energy Survey Supernova program (DES-SN), where it was trained using a sample of 898,963 signal and background events generated by the transient detection pipeline. After reprocessing the data collected during the first DES-SN observing season (2013 September through 2014 February) using the algorithm, the number of transient candidates eligible for human scanning decreased by a factor of 13.4, while only 1.0 percent of the artificial Type Ia supernovae (SNe) injected into search images to monitor survey efficiency were lost, most of which were very faint events. Here we characterize the algorithm's performance in detail, and we discuss how it can inform pipeline design decisions for future time-domain imaging surveys, such as the Large Synoptic Survey Telescope and the Zwicky Transient Facility.
Automated transient identification in the Dark Energy Survey
Goldstein, D. A.; D'Andrea, C. B.; Fischer, J. A.; ...
2015-09-01
We describe an algorithm for identifying point-source transients and moving objects on reference-subtracted optical images containing artifacts of processing and instrumentation. The algorithm makes use of the supervised machine learning technique known as Random Forest. We present results from its use in the Dark Energy Survey Supernova program (DES-SN), where it was trained using a sample of 898,963 signal and background events generated by the transient detection pipeline. After reprocessing the data collected during the first DES-SN observing season (2013 September through 2014 February) using the algorithm, the number of transient candidates eligible for human scanning decreased by a factor of 13.4, while only 1.0% of the artificial Type Ia supernovae (SNe) injected into search images to monitor survey efficiency were lost, most of which were very faint events. Furthermore, we characterize the algorithm's performance in detail, and we discuss how it can inform pipeline design decisions for future time-domain imaging surveys, such as the Large Synoptic Survey Telescope and the Zwicky Transient Facility.
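As a generic sketch of this kind of real/bogus classification (not the DES-SN pipeline itself; the feature set, labels, and score cutoff below are simulated assumptions), a Random Forest can be trained on image-derived detection features and its scores used to rank candidates for human scanning:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical training set: each row holds features of one detection on a
# difference image (e.g., PSF-fit quality, flux ratio, distance to nearest source);
# label 1 = real point-source transient, 0 = subtraction/processing artifact.
rng = np.random.default_rng(6)
n = 20_000
features = rng.normal(size=(n, 6))
labels = (features[:, 0] + 0.5 * features[:, 1] + rng.normal(0, 1, n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Rank candidates by classifier score so only the highest-scoring ones reach human scanners.
scores = clf.predict_proba(X_test)[:, 1]
print(classification_report(y_test, (scores > 0.5).astype(int), digits=3))
```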
Scalable Probabilistic Inference for Global Seismic Monitoring
NASA Astrophysics Data System (ADS)
Arora, N. S.; Dear, T.; Russell, S.
2011-12-01
We describe a probabilistic generative model for seismic events, their transmission through the earth, and their detection (or mis-detection) at seismic stations. We also describe an inference algorithm that constructs the most probable event bulletin explaining the observed set of detections. The model and inference are called NET-VISA (network processing vertically integrated seismic analysis) and are designed to replace the current automated network processing at the IDC, the SEL3 bulletin. Our results (attached table) demonstrate that NET-VISA significantly outperforms SEL3 by reducing the missed events from 30.3% down to 12.5%. The difference is even more dramatic for smaller magnitude events. NET-VISA also has no difficulty locating nuclear explosions. The attached figure demonstrates the location predicted by NET-VISA versus other bulletins for the second DPRK event. Further evaluation on dense regional networks demonstrates that NET-VISA finds many events missed in the LEB bulletin, which is produced by the human analysts. Large aftershock sequences, as produced by the 2004 December Sumatra earthquake and the 2011 March Tohoku earthquake, can pose a significant load for automated processing, often delaying the IDC bulletins by weeks or months. Indeed, these sequences can overload the serial NET-VISA inference as well. We describe an enhancement to NET-VISA to make it multi-threaded, and hence take full advantage of the processing power of multi-core and multi-CPU machines. Our experiments show that the new inference algorithm is able to achieve 80% efficiency in parallel speedup.
Contamination analyses of technology mirror assembly optical surfaces
NASA Technical Reports Server (NTRS)
Germani, Mark S.
1991-01-01
Automated electron microprobe analyses were performed on tape lift samples from the Technology Mirror Assembly (TMA) optical surfaces. Details of the analyses are given, and the contamination of the mirror surfaces is discussed. Based on the automated analyses of the tape lifts from the TMA surfaces and the control blank, we can conclude that the particles identified on the actual samples were not a result of contamination due to the handling or sampling process itself and that the particles reflect the actual contamination on the surface of the mirror.
Thurman, G. B.; Strong, D. M.; Ahmed, A.; Green, S. S.; Sell, K. W.; Hartzman, R. J.; Bach, F. H.
1973-01-01
Use of lymphocyte cultures for in vitro studies such as pretransplant histocompatibility testing has established the need for standardization of this technique. A microculture technique has been developed that has facilitated the culturing of lymphocytes and increased the quantity of cultures feasible, while lowering the variation between replicate samples. Cultures were prepared for determination of tritiated thymidine incorporation using a Multiple Automated Sample Harvester (MASH). Using this system, the parameters that influence the in vitro responsiveness of human lymphocytes to allogeneic lymphocytes have been investigated. PMID:4271568
A method for automated control of belt velocity changes with an instrumented treadmill.
Hinkel-Lipsker, Jacob W; Hahn, Michael E
2016-01-04
Increased practice difficulty during asymmetrical split-belt treadmill rehabilitation has been shown to improve gait outcomes during retention and transfer tests. However, research in this area has been limited by manual treadmill operation. In the case of variable practice, which requires stride-by-stride changes to treadmill belt velocities, the treadmill control must be automated. This paper presents a method for automation of asymmetrical split-belt treadmill walking, and evaluates how well this method performs with regard to timing of gait events. One participant walked asymmetrically for 100 strides, where the non-dominant limb was driven at their self-selected walking speed, while the other limb was driven randomly on a stride-by-stride basis. In the control loop, the key factors to ensure that the treadmill belt had accelerated to its new velocity safely during the swing phase were the sampling rate of the A/D converter, processing time within the controller software, and acceleration of the treadmill belt. The combination of these three factors resulted in a total control loop time during each swing phase that satisfied these requirements with a factor of safety that was greater than 4. Further, a polynomial fit indicated that belt acceleration was the largest contributor to changes in this total time. This approach appears to be safe and reliable for stride-by-stride adjustment of treadmill belt speed, making it suitable for future asymmetrical split-belt walking studies. Further, it can be incorporated into virtual reality rehabilitation paradigms that utilize split-belt treadmill walking. Copyright © 2015 Elsevier Ltd. All rights reserved.
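The timing budget described above can be made concrete with a back-of-the-envelope check; every number below is an illustrative assumption, not a value measured in the study:

```python
# Hypothetical timing budget for a stride-by-stride belt-speed change.
swing_phase_s = 0.40          # time available: one swing phase
adc_sample_period_s = 0.001   # A/D converter sampling interval
controller_latency_s = 0.010  # processing time within the controller software
speed_change_mps = 0.30       # commanded change in belt velocity
belt_accel_mps2 = 4.0         # treadmill belt acceleration limit

# Total control-loop time: detect the gait event, compute the command, accelerate the belt.
loop_time_s = adc_sample_period_s + controller_latency_s + speed_change_mps / belt_accel_mps2
factor_of_safety = swing_phase_s / loop_time_s
print(f"loop time = {1000 * loop_time_s:.0f} ms, factor of safety = {factor_of_safety:.1f}")
```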
The Automation-by-Expertise-by-Training Interaction.
Strauch, Barry
2017-03-01
I introduce the automation-by-expertise-by-training interaction in automated systems and discuss its influence on operator performance. Transportation accidents that, across a 30-year interval, demonstrated identical automation-related operator errors suggest a need to reexamine traditional views of automation. I review accident investigation reports, regulator studies, and literature on human-computer interaction, expertise, and training and discuss how failing to attend to the interaction of automation, expertise level, and training has enabled operators to commit identical automation-related errors. Automated systems continue to provide capabilities exceeding operators' need for effective system operation and provide interfaces that can hinder, rather than enhance, operator automation-related situation awareness. Because of limitations in time and resources, training programs do not provide operators the expertise needed to effectively operate these automated systems, requiring them to obtain the expertise ad hoc during system operations. As a result, many do not acquire necessary automation-related system expertise. Integrating automation with expected operator expertise levels, and within training programs that provide operators the necessary automation expertise, can reduce opportunities for automation-related operator errors. Research to address the automation-by-expertise-by-training interaction is needed. However, such research must meet challenges inherent to examining realistic sociotechnical system automation features with representative samples of operators, perhaps by using observational and ethnographic research. Research in this domain should improve the integration of design and training and, it is hoped, enhance operator performance.
Frégeau, Chantal J; Lett, C Marc; Elliott, Jim; Yensen, Craig; Fourney, Ron M
2008-05-01
An automated process has been developed for the analysis of forensic casework samples using TECAN Genesis RSP 150/8 or Freedom EVO liquid handling workstations equipped exclusively with nondisposable tips. Robot tip cleaning routines have been incorporated strategically within the DNA extraction process as well as at the end of each session. Alternative options were examined for cleaning the tips and different strategies were employed to verify cross-contamination. A 2% sodium hypochlorite wash (1/5th dilution of the 10.8% commercial bleach stock) proved to be the best overall approach for preventing cross-contamination of samples processed using our automated protocol. The bleach wash steps do not adversely impact the short tandem repeat (STR) profiles developed from DNA extracted robotically and allow for major cost savings through the implementation of fixed tips. We have demonstrated that robotic workstations equipped with fixed pipette tips can be used with confidence with properly designed tip washing routines to process casework samples using an adapted magnetic bead extraction protocol.
UV LED lighting for automated crystal centring
Chavas, Leonard M. G.; Yamada, Yusuke; Hiraki, Masahiko; Igarashi, Noriyuki; Matsugaki, Naohiro; Wakatsuki, Soichi
2011-01-01
A direct outcome of the exponential growth of macromolecular crystallography is the continuously increasing demand for synchrotron beam time, both from academic and industrial users. As more and more projects entail screening a profusion of sample crystals, fully automated procedures at every level of the experiments are being implemented at all synchrotron facilities. One of the major obstacles to achieving such automation lies in the sample recognition and centring in the X-ray beam. The capacity of UV light to specifically react with aromatic residues present in proteins or with DNA base pairs is at the basis of UV-assisted crystal centring. Although very efficient, a well known side effect of illuminating biological samples with strong UV sources is the damage induced on the irradiated samples. In the present study the effectiveness of a softer UV light for crystal centring by taking advantage of low-power light-emitting diode (LED) sources has been investigated. The use of UV LEDs represents a low-cost solution for crystal centring with high specificity. PMID:21169682
Machine-learning-based Brokers for Real-time Classification of the LSST Alert Stream
NASA Astrophysics Data System (ADS)
Narayan, Gautham; Zaidi, Tayeb; Soraisam, Monika D.; Wang, Zhe; Lochner, Michelle; Matheson, Thomas; Saha, Abhijit; Yang, Shuo; Zhao, Zhenge; Kececioglu, John; Scheidegger, Carlos; Snodgrass, Richard T.; Axelrod, Tim; Jenness, Tim; Maier, Robert S.; Ridgway, Stephen T.; Seaman, Robert L.; Evans, Eric Michael; Singh, Navdeep; Taylor, Clark; Toeniskoetter, Jackson; Welch, Eric; Zhu, Songzhe; The ANTARES Collaboration
2018-05-01
The unprecedented volume and rate of transient events that will be discovered by the Large Synoptic Survey Telescope (LSST) demand that the astronomical community update its follow-up paradigm. Alert-brokers—automated software systems to sift through, characterize, annotate, and prioritize events for follow-up—will be critical tools for managing alert streams in the LSST era. The Arizona-NOAO Temporal Analysis and Response to Events System (ANTARES) is one such broker. In this work, we develop a machine learning pipeline to characterize and classify variable and transient sources using only the available multiband optical photometry. We describe three illustrative stages of the pipeline, serving the three goals of early, intermediate, and retrospective classification of alerts. The first takes the form of variable versus transient categorization, the second a multiclass typing of the combined variable and transient data set, and the third a purity-driven subtyping of a transient class. Although several similar algorithms have proven themselves in simulations, we validate their performance on real observations for the first time. We quantitatively evaluate our pipeline on sparse, unevenly sampled, heteroskedastic data from various existing observational campaigns, and demonstrate very competitive classification performance. We describe our progress toward adapting the pipeline developed in this work into a real-time broker working on live alert streams from time-domain surveys.
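As an illustration of the first pipeline stage (variable-versus-transient categorization), the sketch below trains an off-the-shelf random forest on a few light-curve summary features. The feature names, synthetic data, and choice of scikit-learn classifier are assumptions for illustration only and do not reproduce the ANTARES feature set or models.

```python
# Sketch of a variable-vs-transient classifier on synthetic light-curve features.
# Features, labels, and classifier choice are placeholders, not the ANTARES pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n = 500
amplitude = rng.normal(0.5, 0.2, n)    # e.g. variability amplitude in one band (placeholder)
timescale = rng.normal(20.0, 5.0, n)   # e.g. characteristic timescale in days (placeholder)
color = rng.normal(0.0, 1.0, n)        # e.g. a color term (placeholder)
X = np.column_stack([amplitude, timescale, color])
# Synthetic labels weakly tied to amplitude: 0 = variable, 1 = transient.
y = (amplitude + rng.normal(0, 0.1, n) > 0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```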
Automated methods for multiplexed pathogen detection.
Straub, Timothy M; Dockendorff, Brian P; Quiñonez-Díaz, Maria D; Valdez, Catherine O; Shutthanandan, Janani I; Tarasevich, Barbara J; Grate, Jay W; Bruckner-Lea, Cynthia J
2005-09-01
Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably, nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated for the detection of as few as 100 cells for each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead" capabilities. However, sensitivity of the method will need to be improved for RNA analysis to replace PCR.
Takemura, Hiroyuki; Ai, Tomohiko; Kimura, Konobu; Nagasaka, Kaori; Takahashi, Toshihiro; Tsuchiya, Koji; Yang, Haeun; Konishi, Aya; Uchihashi, Kinya; Horii, Takashi; Tabe, Yoko; Ohsaka, Akimichi
2018-01-01
The XN series automated hematology analyzer has been equipped with a body fluid (BF) mode to count and differentiate leukocytes in BF samples including cerebrospinal fluid (CSF). However, its diagnostic accuracy is not reliable for CSF samples with low cell concentration at the border between normal and pathologic level. To overcome this limitation, a new flow cytometry-based technology, termed "high sensitive analysis (hsA) mode," has been developed. In addition, the XN series analyzer has been equipped with the automated digital cell imaging analyzer DI-60 to classify cell morphology including normal leukocytes differential and abnormal malignant cells detection. Using various BF samples, we evaluated the performance of the XN-hsA mode and DI-60 compared to manual microscopic examination. The reproducibility of the XN-hsA mode showed good results in samples with low cell densities (coefficient of variation; % CV: 7.8% for 6 cells/μL). The linearity of the XN-hsA mode was established up to 938 cells/μL. The cell number obtained using the XN-hsA mode correlated highly with the corresponding microscopic examination. Good correlation was also observed between the DI-60 analyses and manual microscopic classification for all leukocyte types, except monocytes. In conclusion, the combined use of cell counting with the XN-hsA mode and automated morphological analyses using the DI-60 mode is potentially useful for the automated analysis of BF cells.
NASA Astrophysics Data System (ADS)
Lagos, Soledad R.; Velis, Danilo R.
2018-02-01
We perform the location of microseismic events generated in hydraulic fracturing monitoring scenarios using two global optimization techniques: Very Fast Simulated Annealing (VFSA) and Particle Swarm Optimization (PSO), and compare them against the classical grid search (GS). To this end, we present an integrated and optimized workflow that concatenates into an automated bash script the different steps that lead from raw 3C data to the location of microseismic events. First, we carry out the automatic detection, denoising and identification of the P- and S-waves. Secondly, we estimate their corresponding backazimuths using polarization information, and propose a simple energy-based criterion to automatically decide which is the most reliable estimate. Finally, after taking proper care of the size of the search space using the backazimuth information, we perform the location using the aforementioned algorithms for usual 2D and 3D hydraulic fracturing scenarios. We assess the impact of restricting the search space and show the advantages of using either VFSA or PSO over GS to attain significant speed-ups.
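To make the baseline concrete, the toy sketch below locates a source by grid search: it scans candidate nodes and keeps the one whose predicted P travel times best match the observed arrivals after removing the best-fitting origin time. The homogeneous velocity, receiver geometry, grid spacing, and noise level are illustrative assumptions, and this is not the authors' workflow, which restricts the search space with backazimuths and also uses VFSA and PSO.

```python
# Toy grid-search location in a homogeneous medium: find the grid node whose
# predicted P travel times best match observed arrivals (least squares after
# removing the mean residual, i.e. the optimal origin-time shift).
import numpy as np

receivers = np.array([[0.0, 0.0, 0.0], [500.0, 0.0, 0.0],
                      [0.0, 500.0, 0.0], [500.0, 500.0, 0.0]])  # receiver coordinates, m
vp = 3000.0                                  # assumed P velocity, m/s
true_src = np.array([250.0, 300.0, 400.0])   # hidden "true" source for the synthetic test, m
t_obs = np.linalg.norm(receivers - true_src, axis=1) / vp
t_obs += np.random.default_rng(1).normal(0, 1e-4, t_obs.size)   # synthetic picking noise

xs = ys = zs = np.arange(0.0, 501.0, 25.0)   # coarse candidate grid
best = (np.inf, None)
for x in xs:
    for y in ys:
        for z in zs:
            t_pred = np.linalg.norm(receivers - np.array([x, y, z]), axis=1) / vp
            r = t_obs - t_pred
            misfit = np.sum((r - r.mean()) ** 2)   # origin time absorbed by the mean
            if misfit < best[0]:
                best = (misfit, (x, y, z))
print("best grid node:", best[1])
```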
Towards automated traceability maintenance
Mäder, Patrick; Gotel, Orlena
2012-01-01
Traceability relations support stakeholders in understanding the dependencies between artifacts created during the development of a software system and thus enable many development-related tasks. To ensure that the anticipated benefits of these tasks can be realized, it is necessary to have an up-to-date set of traceability relations between the established artifacts. This goal requires the creation of traceability relations during the initial development process. Furthermore, the goal also requires the maintenance of traceability relations over time as the software system evolves in order to prevent their decay. In this paper, an approach is discussed that supports the (semi-) automated update of traceability relations between requirements, analysis and design models of software systems expressed in the UML. This is made possible by analyzing change events that have been captured while working within a third-party UML modeling tool. Within the captured flow of events, development activities comprised of several events are recognized. These are matched with predefined rules that direct the update of impacted traceability relations. The overall approach is supported by a prototype tool and empirical results on the effectiveness of tool-supported traceability maintenance are provided. PMID:23471308
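A minimal sketch of the rule-matching idea follows: captured model-change events are matched against predefined rules, and each match directs an update of the traceability links attached to the affected element. The event structure, rule table, and link representation are hypothetical simplifications, not the paper's implementation.

```python
# Hypothetical sketch: recognize development activities from captured change
# events and direct updates of affected traceability links.
from dataclasses import dataclass

@dataclass
class ChangeEvent:
    kind: str        # e.g. "rename", "delete" (illustrative event kinds)
    element: str     # identifier of the changed model element

# Each rule maps a recognized event pattern to a link-update action (placeholder actions).
RULES = {
    ("rename",): "retarget links to the renamed element",
    ("delete",): "mark links to the deleted element as suspect",
}

def update_traces(events, links):
    """links: dict mapping element id -> list of related artifact ids."""
    for ev in events:
        action = RULES.get((ev.kind,))
        if action and ev.element in links:
            print(f"{ev.element}: {action} ({len(links[ev.element])} links affected)")

update_traces([ChangeEvent("rename", "ClassA"), ChangeEvent("delete", "ClassB")],
              {"ClassA": ["Req-1"], "ClassB": ["Req-2", "UC-3"]})
```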
An Architecture for Autonomous Rovers on Future Planetary Missions
NASA Astrophysics Data System (ADS)
Ocon, J.; Avilés, M.; Graziano, M.
2018-04-01
This paper proposes an architecture for autonomous planetary rovers. This architecture combines a set of characteristics required in this type of system: a high level of abstraction, reactive event-based activity execution, and autonomous navigation.
AN AUTOMATED MONITORING SYSTEM FOR FISH PHYSIOLOGY AND TOXICOLOGY
This report describes a data acquisition and control (DAC) system that was constructed to manage selected physiological measurements and sample control for aquatic physiology and toxicology. Automated DAC was accomplished with a microcomputer running menu-driven software develope...
Accurate seismic phase identification and arrival time picking of glacial icequakes
NASA Astrophysics Data System (ADS)
Jones, G. A.; Doyle, S. H.; Dow, C.; Kulessa, B.; Hubbard, A.
2010-12-01
A catastrophic lake drainage event was monitored continuously using an array of six 4.5 Hz three-component geophones in the Russell Glacier catchment, Western Greenland. Many thousands of events and arrival time phases (e.g., P- or S-wave) were recorded, often with events occurring simultaneously but at different locations. In addition, different styles of seismic events were identified, from 'classical' tectonic earthquakes to tremors usually observed in volcanic regions. The presence of such a diverse and large dataset provides insight into the complex system of lake drainage. One of the most fundamental steps in seismology is the accurate identification of a seismic event and its associated arrival times. However, the collection of such a large and complex dataset makes the manual identification of a seismic event and picking of the arrival time phases time consuming with variable results. To overcome the issues of consistency and manpower, a number of different methods have been developed including short-term and long-term averages, spectrograms, wavelets, polarisation analyses, higher order statistics and auto-regressive techniques. Here we propose an automated procedure which establishes the phase type and accurately determines the arrival times. The procedure combines a number of different automated methods to achieve this, and is applied to the recently acquired lake drainage data. Accurate identification of events and their arrival time phases is the first step in gaining a greater understanding of the extent of the deformation and the mechanism of such drainage events. A good knowledge of the propagation pathway of lake drainage meltwater through a glacier will have significant consequences for interpretation of glacial and ice sheet dynamics.
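One of the component methods mentioned above, the short-term/long-term average (STA/LTA) ratio, can be sketched in a few lines. The window lengths, threshold, sampling rate, and synthetic trace below are illustrative assumptions; this shows only one ingredient, not the combined multi-method procedure the authors propose.

```python
# Classic STA/LTA trigger on a synthetic trace; window lengths and threshold are illustrative.
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Ratio of short-term to long-term average of the squared signal."""
    e = x ** 2
    sta = np.convolve(e, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(e, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-12)

fs = 500                                         # assumed sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
trace = np.random.default_rng(2).normal(0, 1, t.size)
trace[2500:2600] += 8 * np.sin(2 * np.pi * 30 * t[2500:2600])   # synthetic "icequake" burst

ratio = sta_lta(trace, n_sta=int(0.1 * fs), n_lta=int(2.0 * fs))
triggers = np.where(ratio > 4.0)[0]
print("first trigger at t =", t[triggers[0]] if triggers.size else None, "s")
```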
A Comparison of Automated and Manual Crater Counting Techniques in Images of Elysium Planitia.
NASA Astrophysics Data System (ADS)
Plesko, C. S.; Brumby, S. P.; Asphaug, E.
2004-11-01
Surveys of impact craters yield a wealth of information about Martian geology, providing clues to the relative age, local composition and erosional history of the surface. Martian craters are also of intrinsic geophysical interest, given that the processes by which they form are not entirely clear, especially cratering in ice-saturated regoliths (Plesko et al. 2004, AGU), which appear common on Mars (Squyres and Carr 1986). However, the deluge of data over the last decade has made comprehensive manual counts prohibitive, except in select regions. Given that most small craters on Mars may be secondaries from a few very recent impact events (McEwen et al. in press, Icarus 2004), using select regions for age dating introduces considerable potential for sampling error. Automation is thus an enabling planetary science technology. In contrast to machine counts, human counts are prone to human decision making, thus not intrinsically reproducible. One can address human "noise" by averaging over many human counts (Kanefsky et al. 2001), but this multiplies the already laborious effort required. In this study, we test automated crater counting algorithms developed with the Los Alamos National Laboratory genetic programming suite GENIE (Harvey et al., 2002) against established manual counts of craters in Elysium Planitia, using MOC and THEMIS data. We intend to establish the validity of our method against well-regarded hand counts (Hartmann et al. 2000), and then apply it generally to larger regions of Mars. Previous work on automated crater counting used customized algorithms (Bierhaus et al. 2003, Burl et al. 2001). Algorithms generated by genetic programming have the advantage of requiring little time or user effort to generate, so it is relatively easy to generate a suite of algorithms for varied terrain types, or to compare results from multiple algorithms for improved accuracy (Plesko et al. 2003).
Gabard-Durnam, Laurel J; Mendez Leal, Adriana S; Wilkinson, Carol L; Levin, April R
2018-01-01
Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination and recording lengths. HAPPE software is freely available under the terms of the GNU General Public License at https://github.com/lcnhappe/happe.
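The general shape of such a pipeline (filtering, artifact rejection, re-referencing) can be sketched with MNE-Python on synthetic data, as below. This is not HAPPE itself and does not reproduce its algorithms or thresholds; the channel count, filter band, epoch length, and rejection criterion are illustrative assumptions.

```python
# Sketch of a filter -> artifact rejection -> re-reference sequence with MNE-Python
# on synthetic data; all parameters are illustrative, and this is not HAPPE.
import numpy as np
import mne

sfreq, n_ch, n_sec = 250.0, 8, 20
info = mne.create_info([f"EEG{i:02d}" for i in range(n_ch)], sfreq, ch_types="eeg")
data = np.random.default_rng(3).normal(0, 1e-5, (n_ch, int(sfreq * n_sec)))  # synthetic EEG, volts
raw = mne.io.RawArray(data, info)

raw.filter(l_freq=1.0, h_freq=40.0)        # band-pass filtering
raw.set_eeg_reference("average")           # re-referencing to the average

# Simple amplitude-based artifact rejection on fixed-length epochs.
epochs = mne.make_fixed_length_epochs(raw, duration=1.0, preload=True)
epochs.drop_bad(reject=dict(eeg=100e-6))   # drop epochs exceeding 100 µV peak-to-peak
print("epochs retained after rejection:", len(epochs))
```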
Defining event reconstruction of digital crime scenes.
Carrier, Brian D; Spafford, Eugene H
2004-11-01
Event reconstruction plays a critical role in solving physical crimes by explaining why a piece of physical evidence has certain characteristics. With digital crimes, the current focus has been on the recognition and identification of digital evidence using an object's characteristics, but not on the identification of the events that caused the characteristics. This paper examines digital event reconstruction and proposes a process model and procedure that can be used for a digital crime scene. The model has been designed so that it can apply to physical crime scenes, can support the unique aspects of a digital crime scene, and can be implemented in software to automate part of the process. We also examine the differences between physical event reconstruction and digital event reconstruction.
Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...
Appendix C: Automated Vitrification of Mammalian Embryos on a Digital Microfluidic Device.
Liu, Jun; Pyne, Derek G; Abdelgawad, Mohamed; Sun, Yu
2017-01-01
This chapter introduces a digital microfluidic device that automates sample preparation for mammalian embryo vitrification. Individual microdroplets manipulated on the microfluidic device were used as microvessels to transport a single mouse embryo through a complete vitrification procedure. Advantages of this approach, compared to manual operation and channel-based microfluidic vitrification, include automated operation, cryoprotectant concentration gradient generation, and feasibility of loading and retrieval of embryos.
A Liquid-Handling Robot for Automated Attachment of Biomolecules to Microbeads.
Enten, Aaron; Yang, Yujia; Ye, Zihan; Chu, Ryan; Van, Tam; Rothschild, Ben; Gonzalez, Francisco; Sulchek, Todd
2016-08-01
Diagnostics, drug delivery, and other biomedical industries rely on cross-linking ligands to microbead surfaces. Microbead functionalization requires multiple steps of liquid exchange, incubation, and mixing, which are laborious and time intensive. Although automated systems exist, they are expensive and cumbersome, limiting their routine use in biomedical laboratories. We present a small, bench-top robotic system that automates microparticle functionalization and streamlines sample preparation. The robot uses a programmable microcontroller to regulate liquid exchange, incubation, and mixing functions. Filters with a pore diameter smaller than the minimum bead diameter are used to prevent bead loss during liquid exchange. The robot uses three liquid reagents and processes up to 10^7 microbeads per batch. The effectiveness of microbead functionalization was compared with a manual covalent coupling process and evaluated via flow cytometry and fluorescent imaging. The mean percentages of successfully functionalized beads were 91% and 92% for the robot and manual methods, respectively, with less than 5% bead loss. Although the two methods share similar qualities, the automated approach required approximately 10 min of active labor, compared with 3 h for the manual approach. These results suggest that a low-cost, automated microbead functionalization system can streamline sample preparation with minimal operator intervention. © 2015 Society for Laboratory Automation and Screening.
NASA Astrophysics Data System (ADS)
von Freyberg, Jana; Kirchner, James W.
2017-04-01
In the pre-Alpine Alptal catchment in central Switzerland, snowmelt and rainfall events cause rapid changes not only in hydrological conditions, but also in water quality. A flood forecasting model for such a mountainous catchment thus requires process understanding that is informed by high-frequency monitoring of hydrological and hydrochemical parameters. Therefore, we installed a high-frequency sampling and analysis system near the outlet of the 0.7 km² Erlenbach catchment, a headwater tributary of the Alp river. We measured stable water isotopes (δ¹⁸O, δ²H) in precipitation and streamwater using Picarro, Inc.'s (Santa Clara, CA, USA) newly developed Continuous Water Sampler Module (CWS) coupled to their L2130-i Cavity Ring-Down Spectrometer, at 30 min temporal resolution. Water quality was monitored with a dual-channel ion chromatograph (Metrohm AG, Herisau, Switzerland) for analysis of major cations and anions, as well as with a UV-Vis spectroscopy system and electrochemical probes (s::can Messtechnik GmbH, Vienna, Austria) for characterization of nutrients and basic water quality parameters. For quantification of trace elements and metals, we collected additional water samples for subsequent ICP-MS analysis in the laboratory. To illustrate the applicability of our newly developed automated analysis and sampling system under field conditions, we will present initial results from the 2016 fall and winter seasons at the Erlenbach catchment. During this period, river discharge was mainly fed by groundwater, as well as intermittent snowmelt and rain-on-snow events. Our high-frequency data set, along with spatially distributed sampling of snowmelt, enables a detailed analysis of source areas, flow pathways and biogeochemical processes that control chemical dynamics in streamflow and the discharge regime.
Sebok, Angelia; Wickens, Christopher D
2017-03-01
The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.
Round, A. R.; Franke, D.; Moritz, S.; Huchler, R.; Fritsche, M.; Malthan, D.; Klaering, R.; Svergun, D. I.; Roessle, M.
2008-01-01
There is a rapidly increasing interest in the use of synchrotron small-angle X-ray scattering (SAXS) for large-scale studies of biological macromolecules in solution, and this requires an adequate means of automating the experiment. A prototype has been developed of an automated sample changer for solution SAXS, where the solutions are kept in thermostatically controlled well plates allowing for operation with up to 192 samples. The measuring protocol involves controlled loading of protein solutions and matching buffers, followed by cleaning and drying of the cell between measurements. The system was installed and tested at the X33 beamline of the EMBL, at the storage ring DORIS-III (DESY, Hamburg), where it was used by over 50 external groups during 2007. At X33, a throughput of approximately 12 samples per hour, with a failure rate of sample loading of less than 0.5%, was observed. The feedback from users indicates that the ease of use and reliability of the user operation at the beamline were greatly improved compared with the manual filling mode. The changer is controlled by a client–server-based network protocol, locally and remotely. During the testing phase, the changer was operated in an attended mode to assess its reliability and convenience. Full integration with the beamline control software, allowing for automated data collection of all samples loaded into the machine with remote control from the user, is presently being implemented. The approach reported is not limited to synchrotron-based SAXS but can also be used on laboratory and neutron sources. PMID:25484841
Sato, Yuka; Seimiya, Masanori; Yoshida, Toshihiko; Sawabe, Yuji; Hokazono, Eisaku; Osawa, Susumu; Matsushita, Kazuyuki
2017-01-01
Background: The indocyanine green retention rate is important for assessing the severity of liver disorders. In the conventional method, blood needs to be collected twice. In the present study, we developed an automated indocyanine green method that does not require blood sampling before intravenous indocyanine green injections and is applicable to an automated biochemical analyser. Methods: The serum samples of 471 patients collected before and after intravenous indocyanine green injections and submitted to the clinical laboratory of our hospital were used as samples. The standard procedure established by the Japan Society of Hepatology was used as the standard method. In the automated indocyanine green method, serum collected after an intravenous indocyanine green injection was mixed with the saline reagent containing a surfactant, and the indocyanine green concentration was measured at a dominant wavelength of 805 nm and a complementary wavelength of 884 nm. Results: The coefficients of variation of the within- and between-run reproducibilities of this method were 2% or lower, and dilution linearity passing through the origin was noted up to 10 mg/L indocyanine green. The reagent was stable for four weeks or longer. Haemoglobin, bilirubin and chyle had no impact on the results obtained. The correlation coefficient between the standard method (x) and this method (y) was r = 0.995; however, slight divergence was noted in turbid samples. Conclusion: Divergence in turbid samples may have corresponded to false negativity with the standard procedure. Our method may be highly practical because blood sampling before indocyanine green loading is unnecessary and measurements are simple.
Decision Making In A High-Tech World: Automation Bias and Countermeasures
NASA Technical Reports Server (NTRS)
Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)
1996-01-01
Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management systems computers not only fly the aircraft, but also calculate fuel efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air Traffic Controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self reports, in studies using pilots in flight simulations, and in non-flight decision making contexts with student samples. Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and resultant errors. To what extent these effects generalize to performance situations is not yet empirically established. The two studies to be presented represent concurrent efforts, with student and professional pilot samples, to determine the effects of accountability pressures on automation bias and on the verification of the accurate functioning of automated aids. Students (Experiment 1) and commercial pilots (Experiment 2) performed simulated flight tasks using automated aids. In both studies, participants who perceived themselves as accountable for their strategies of interaction with the automation were significantly more likely to verify its correctness, and committed significantly fewer automation-related errors than those who did not report this perception.
Ventham, Nicholas T; Gardner, Richard A; Kennedy, Nicholas A; Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R; Fernandes, Daryl L; Satsangi, Jack; Spencer, Daniel I R
2015-01-01
Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup was automated using a Hamilton Microlab STARlet Liquid Handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid(BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status. The three different serum sample tubes processed using the described methods cause minimal inter-individual variation in serum whole N-glycan profile when processed using an automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures.
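The quantitative comparison described above, pairwise correlation of peak-area profiles followed by hierarchical clustering to check that samples from the same individual group together, can be sketched as follows. The peak counts, noise level, and clustering settings are synthetic placeholders, not the study's data or exact analysis.

```python
# Sketch: pairwise correlation of peak-area profiles and hierarchical clustering
# on synthetic data standing in for N-glycan chromatogram peak areas.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(4)
n_individuals, n_tubes, n_peaks = 5, 3, 20
profiles, labels = [], []
for person in range(n_individuals):
    base = rng.gamma(2.0, 1.0, n_peaks)                        # person-specific peak-area profile
    for _ in range(n_tubes):
        profiles.append(base + rng.normal(0, 0.05, n_peaks))   # small tube-to-tube variation
        labels.append(person)
profiles = np.array(profiles)

corr = np.corrcoef(profiles)                                   # pairwise correlation matrix
within = [corr[i, j] for i in range(len(labels)) for j in range(len(labels))
          if i != j and labels[i] == labels[j]]
print("min within-individual correlation:", round(min(within), 3))

# Hierarchical clustering on correlation distance; ideally one cluster per individual.
Z = linkage(pdist(profiles, metric="correlation"), method="average")
print("cluster assignments:", fcluster(Z, t=n_individuals, criterion="maxclust"))
```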
Detection of Respiratory Viruses in Sputum from Adults by Use of Automated Multiplex PCR
Walsh, Edward E.; Formica, Maria A.; Falsey, Ann R.
2014-01-01
Respiratory tract infections (RTI) frequently cause hospital admissions among adults. Diagnostic viral reverse transcriptase PCR (RT-PCR) of nose and throat swabs (NTS) is useful for patient care by informing antiviral use and appropriate isolation. However, automated RT-PCR systems are not amenable to utilizing sputum due to its viscosity. We evaluated a simple method of processing sputum samples in a fully automated respiratory viral panel RT-PCR assay (FilmArray). Archived sputum and NTS samples collected in 2008-2012 from hospitalized adults with RTI were evaluated. A subset of sputum samples positive for 10 common viruses by a uniplex RT-PCR was selected. A sterile cotton-tip swab was dunked in sputum, swirled in 700 μL of sterile water (dunk and swirl method) and tested by the FilmArray assay. Quantitative RT-PCR was performed on “dunked” sputum and NTS samples for influenza A (Flu A), respiratory syncytial virus (RSV), coronavirus OC43 (OC43), and human metapneumovirus (HMPV). Viruses were identified in 31% of 965 illnesses using a uniplex RT-PCR. The sputum sample was the only sample positive for 105 subjects, including 35% (22/64) of influenza cases and significantly increased the diagnostic yield of NTS alone (302/965 [31%] versus 197/965 [20%]; P = 0.0001). Of 108 sputum samples evaluated by the FilmArray assay using the dunk and swirl method, 99 (92%) were positive. Quantitative RT-PCR revealed higher mean viral loads in dunked sputum samples compared to NTS samples for Flu A, RSV, and HMPV (P = 0.0001, P = 0.006, and P = 0.011, respectively). The dunk and swirl method is a simple and practical method for reliably processing sputum samples in a fully automated PCR system. The higher viral loads in sputa may increase detection over NTS testing alone. PMID:25056335
Kerr, Darcy A; Sweeney, Brenda; Arpin, Ronald N; Ring, Melissa; Pitman, Martha B; Wilbur, David C; Faquin, William C
2016-08-01
Testing for high-risk human papillomavirus (HR-HPV) in head and neck squamous cell carcinomas (HNSCCs) is important for both prognostication and clinical management. Several testing platforms are available for HR-HPV; however, effective alternative automated approaches are needed. To assess the performance of the automated Roche cobas 4800 HPV real-time polymerase chain reaction-based system on formalin-fixed, paraffin-embedded HNSCC specimens and compare results with standard methods of in situ hybridization (ISH) and p16 immunohistochemistry. Formalin-fixed, paraffin-embedded samples of HNSCC were collected from archival specimens in the Department of Pathology, Massachusetts General Hospital (Boston), and prepared using the automated system by deparaffinization and dehydration followed by tissue lysis. Samples were integrated into routine cervical cytology testing runs by cobas. Corresponding formalin-fixed, paraffin-embedded samples were evaluated for HR-HPV by ISH and p16 by immunohistochemistry. Discrepant cases were adjudicated by polymerase chain reaction. Sixty-two HNSCC samples were analyzed using the automated cobas system, ISH, and immunohistochemistry. Fifty-two percent (n = 32 of 62) of formalin-fixed, paraffin-embedded tumors were positive for HR-HPV by cobas. Eighty-eight percent (n = 28 of 32) of cases were the HPV 16 subtype and 12% (n = 4 of 32) were other HR-HPV subtypes. Corresponding testing with ISH was concordant in 92% (n = 57 of 62) of cases. Compared with the adjudication polymerase chain reaction standard, there were 3 false-positive cases by cobas. Concordance in HNSCC HR-HPV status between cobas and ISH was more than 90%. The cobas demonstrated a sensitivity of 100% and a specificity of 91% for detection of HR-HPV. Advantages favoring cobas include its automation, cost efficiency, objective results, and ease of performance.
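The reported accuracy figures can be checked with simple arithmetic. The breakdown below assumes that the 3 false positives are the only discordances against the adjudicated reference and that no positives were missed, which is consistent with the 100% sensitivity reported above; the abstract itself does not give the full 2x2 table.

```python
# Worked check of the reported sensitivity and specificity, under the stated assumptions.
cobas_positive = 32
false_positive = 3
total = 62

true_positive = cobas_positive - false_positive      # 29
reference_positive = true_positive                   # 100% sensitivity implies no missed cases
reference_negative = total - reference_positive      # 33
true_negative = reference_negative - false_positive  # 30

sensitivity = true_positive / reference_positive
specificity = true_negative / reference_negative
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")  # 100%, 91%
```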
Pleil, Joachim D; Angrish, Michelle M; Madden, Michael C
2015-12-11
Immunochemistry is an important clinical tool for indicating biological pathways leading towards disease. Standard enzyme-linked immunosorbent assays (ELISA) are labor intensive and lack sensitivity at low-level concentrations. Here we report on emerging technology implementing fully-automated ELISA capable of molecular level detection and describe application to exhaled breath condensate (EBC) samples. The Quanterix SIMOA HD-1 analyzer was evaluated for analytical performance for inflammatory cytokines (IL-6, TNF-α, IL-1β and IL-8). The system was challenged with human EBC representing the most dilute and analytically difficult of the biological media. Calibrations from synthetic samples and spiked EBC showed excellent linearity at trace levels (r² > 0.99). Sensitivities varied by analyte, but were robust from ~0.006 (IL-6) to ~0.01 (TNF-α) pg ml⁻¹. All analytes demonstrated response suppression when diluted with deionized water and so assay buffer diluent was found to be a better choice. Analytical runs required ~45 min setup time for loading samples, reagents, calibrants, etc., after which the instrument performs without further intervention for up to 288 separate samples. Currently, available kits are limited to single-plex analyses and so sample volumes require adjustments. Sample dilutions should be made with assay diluent to avoid response suppression. Automation performs seamlessly and data are automatically analyzed and reported in spreadsheet format. The internal five-parameter logistic (5PL) calibration model should be supplemented with a linear regression spline at the very lowest analyte levels (<1.3 pg ml⁻¹). The implementation of the automated Quanterix platform was successfully demonstrated using EBC, which poses the greatest challenge to ELISA due to limited sample volumes and low protein levels.
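The kind of calibration model mentioned above, a five-parameter logistic supplemented by a linear segment at the low end, can be sketched with SciPy on synthetic calibrant data. The parameter values, concentrations, noise level, and the 1.3 pg/ml cutoff for the low-end line are illustrative assumptions, not the instrument's internal fit.

```python
# Sketch of a five-parameter logistic (5PL) calibration fit on synthetic calibrant data,
# with a simple linear fit over the lowest concentrations; all values are illustrative.
import numpy as np
from scipy.optimize import curve_fit

def five_pl(x, a, d, c, b, g):
    """5PL response curve: a = low asymptote, d = high asymptote, c = midpoint, b = slope, g = asymmetry."""
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])   # pg/ml (illustrative)
true = dict(a=0.02, d=3.0, c=5.0, b=1.2, g=0.9)
signal = five_pl(conc, **true) * (1 + np.random.default_rng(5).normal(0, 0.02, conc.size))

popt, _ = curve_fit(five_pl, conc, signal,
                    p0=[0.05, 2.5, 3.0, 1.0, 1.0],
                    bounds=([1e-6, 0.1, 0.01, 0.1, 0.1], [1.0, 10.0, 100.0, 5.0, 5.0]))
print("fitted 5PL parameters:", np.round(popt, 3))

# Low-end linear segment, e.g. below ~1.3 pg/ml as suggested above.
low = conc < 1.3
slope, intercept = np.polyfit(conc[low], signal[low], 1)
print(f"low-end linear fit: signal = {slope:.3f}*conc + {intercept:.3f}")
```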
Woynaroski, Tiffany; Oller, D Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul
2017-03-01
Theory and research suggest that vocal development predicts "useful speech" in preschoolers with autism spectrum disorder (ASD), but conventional methods for measurement of vocal development are costly and time consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently "in development" and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time intensive and expensive conventional communication samples for measurement of vocal development of preverbal preschoolers with ASD in research and clinical practice. Autism Res 2017, 10: 508-519. © 2016 International Society for Autism Research, Wiley Periodicals, Inc. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.
MECH: Algorithms and Tools for Automated Assessment of Potential Attack Locations
2015-10-06
conscious and subconscious processing of the geometric structure of the local terrain, sight lines to prominent or useful terrain features, proximity... This intuition or instinct is the outcome of an unconscious or subconscious integration of available facts and impressions. Thus, in the search... adjacency. Even so, we inevitably introduce a bias between events and non-event road locations when calculating the route visibility features.
Event detection for car park entries by video-surveillance
NASA Astrophysics Data System (ADS)
Coquin, Didier; Tailland, Johan; Cintract, Michel
2007-10-01
Intelligent surveillance has become an important research issue due to the high cost and low efficiency of human supervisors, and machine intelligence is required to provide a solution for automated event detection. In this paper we describe a real-time system that has been used for detecting car park entries, using an adaptive background learning algorithm and two indicators representing activity and identity to overcome the difficulty of tracking objects.
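As an illustration of the adaptive background learning idea, the sketch below runs OpenCV's MOG2 background subtractor over synthetic grayscale frames and flags frames with a large foreground fraction. The choice of subtractor, the synthetic frames, the warm-up period, and the threshold are assumptions for illustration and do not reproduce the paper's algorithm or its activity and identity indicators.

```python
# Adaptive background subtraction sketch on synthetic frames; thresholds are illustrative.
import numpy as np
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2()
rng = np.random.default_rng(6)

for frame_idx in range(100):
    frame = rng.integers(100, 110, (240, 320), dtype=np.uint8)   # static, slightly noisy background
    if frame_idx >= 80:                                           # a bright "vehicle" enters late on
        frame[100:160, 120:220] = 200
    mask = subtractor.apply(frame)
    foreground_ratio = (mask > 0).mean()
    if frame_idx > 10 and foreground_ratio > 0.05:                # crude entry-event indicator after warm-up
        print(f"frame {frame_idx}: possible entry (foreground {foreground_ratio:.1%})")
```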
2010-09-01
Multiple-Array Detection, Association and Location of Infrasound and Seismo-Acoustic Events – Utilization of Ground Truth Information
Stephen J...
...and infrasound data from seismo-acoustic arrays and apply the methodology to regional networks for validation with ground truth information. In the initial year of the project, automated techniques for detecting, associating and locating infrasound signals were developed. Recently, the location...
Towards a geophysical decision-support system for monitoring and managing unstable slopes
NASA Astrophysics Data System (ADS)
Chambers, J. E.; Meldrum, P.; Wilkinson, P. B.; Uhlemann, S.; Swift, R. T.; Inauen, C.; Gunn, D.; Kuras, O.; Whiteley, J.; Kendall, J. M.
2017-12-01
Conventional approaches for condition monitoring, such as walk over surveys, remote sensing or intrusive sampling, are often inadequate for predicting instabilities in natural and engineered slopes. Surface observations cannot detect the subsurface precursors to failure events; instead they can only identify failure once it has begun. On the other hand, intrusive investigations using boreholes only sample a very small volume of ground and hence small scale deterioration process in heterogeneous ground conditions can easily be missed. It is increasingly being recognised that geophysical techniques can complement conventional approaches by providing spatial subsurface information. Here we describe the development and testing of a new geophysical slope monitoring system. It is built around low-cost electrical resistivity tomography instrumentation, combined with integrated geotechnical logging capability, and coupled with data telemetry. An automated data processing and analysis workflow is being developed to streamline information delivery. The development of this approach has provided the basis of a decision-support tool for monitoring and managing unstable slopes. The hardware component of the system has been operational at a number of field sites associated with a range of natural and engineered slopes for up to two years. We report on the monitoring results from these sites, discuss the practicalities of installing and maintaining long-term geophysical monitoring infrastructure, and consider the requirements of a fully automated data processing and analysis workflow. We propose that the result of this development work is a practical decision-support tool that can provide near-real-time information relating to the internal condition of problematic slopes.
Software framework for the upcoming MMT Observatory primary mirror re-aluminization
NASA Astrophysics Data System (ADS)
Gibson, J. Duane; Clark, Dusty; Porter, Dallan
2014-07-01
Details of the software framework for the upcoming in-situ re-aluminization of the 6.5m MMT Observatory (MMTO) primary mirror are presented. This framework includes: 1) a centralized key-value store and data structure server for data exchange between software modules, 2) a newly developed hardware-software interface for faster data sampling and better hardware control, 3) automated control algorithms that are based upon empirical testing, modeling, and simulation of the aluminization process, 4) re-engineered graphical user interfaces (GUIs) that use state-of-the-art web technologies, and 5) redundant relational databases for data logging. Redesign of the software framework has several objectives: 1) automated process control to provide more consistent and uniform mirror coatings, 2) optional manual control of the aluminization process, 3) modular design to allow flexibility in process control and software implementation, 4) faster data sampling and logging rates to better characterize the approximately 100-second aluminization event, and 5) synchronized "real-time" web application GUIs to provide all users with exactly the same data. The framework has been implemented as four modules interconnected by a data store/server. The four modules are integrated into two Linux system services that start automatically at boot-time and remain running at all times. Performance of the software framework is assessed through extensive testing within 2.0 meter and smaller coating chambers at the Sunnyside Test Facility. The redesigned software framework helps ensure that a better performing and longer lasting coating will be achieved during the re-aluminization of the MMTO primary mirror.
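The abstract does not name the key-value / data-structure server; purely for illustration, the sketch below assumes a Redis-compatible server on localhost and uses redis-py to show how one module could publish a sampled value and another could read it. The key and channel names are hypothetical, and a running server is required for the snippet to execute.

```python
# Illustration of module-to-module data exchange through a central key-value /
# data-structure server (assumed here to be Redis-compatible; not named in the abstract).
import redis

store = redis.Redis(host="localhost", port=6379, decode_responses=True)

# A data-acquisition module writes the latest sampled value under a hypothetical key...
store.set("aluminization:chamber_pressure", "2.1e-6")
store.publish("aluminization:events", "pressure_updated")   # notify listeners on a hypothetical channel

# ...and a GUI or control module reads it back on its own schedule.
print("chamber pressure:", store.get("aluminization:chamber_pressure"))
```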
LAMA Preconference and Program Highlights.
ERIC Educational Resources Information Center
Library Administration & Management, 1988
1988-01-01
Highlights events of the Library Administration and Management Association 1988 conference, including presentation of awards and programs on: (1) transfer of training; (2) hiring; (3) mentoring; (4) acquisitions automation; (5) library building consultation; and (6) managing shared systems. (MES)
Ly, Trang T; Nicholas, Jennifer A; Retterath, Adam; Lim, Ee Mun; Davis, Elizabeth A; Jones, Timothy W
2013-09-25
Hypoglycemia is a critical obstacle to the care of patients with type 1 diabetes. Sensor-augmented insulin pump with automated low-glucose insulin suspension has the potential to reduce the incidence of major hypoglycemic events. To determine the incidence of severe and moderate hypoglycemia with sensor-augmented pump with low-glucose suspension compared with standard insulin pump therapy. A randomized clinical trial involving 95 patients with type 1 diabetes, recruited from December 2009 to January 2012 in Australia. Patients were randomized to insulin pump only or automated insulin suspension for 6 months. The primary outcome was the combined incidence of severe (hypoglycemic seizure or coma) and moderate hypoglycemia (an event requiring assistance for treatment). In a subgroup, counterregulatory hormone responses to hypoglycemia were assessed using the hypoglycemic clamp technique. Of the 95 patients randomized, 49 were assigned to the standard-pump (pump-only) therapy and 46 to the low-glucose suspension group. The mean (SD) age was 18.6 (11.8) years; duration of diabetes, 11.0 (8.9) years; and duration of pump therapy, 4.1 (3.4) years. The baseline rate of severe and moderate hypoglycemic events in the pump-only group was 20.7 vs 129.6 events per 100 patient months in the low-glucose suspension group. After 6 months of treatment, the event rates decreased from 28 to 16 in the pump-only group vs 175 to 35 in the low-glucose suspension group. The adjusted incidence rate per 100 patient-months was 34.2 (95% CI, 22.0-53.3) for the pump-only group vs 9.5 (95% CI, 5.2-17.4) for the low-glucose suspension group. The incidence rate ratio was 3.6 (95% CI, 1.7-7.5; P <.001). There was no change in glycated hemoglobin in either group: mean, 7.4 (95% CI, 7.2-7.6) to 7.4 (95% CI, 7.2-7.7) in the pump-only group vs mean, 7.6 (95%, CI, 7.4-7.9) to 7.5 (95% CI, 7.3-7.7) in the low-glucose suspension group. Counterregulatory hormone responses to hypoglycemia were not changed. There were no episodes of diabetic ketoacidosis or hyperglycemia with ketosis. Sensor-augmented pump therapy with automated insulin suspension reduced the combined rate of severe and moderate hypoglycemia in patients with type 1 diabetes. anzctr.org.au Identifier: ACTRN12610000024044.
Gibb, Stuart W.; Wood, John W.; Mantoura, R. Fauzi C.
1995-01-01
The automation and improved design and performance of Flow Injection Gas Diffusion-Ion Chromatography (FIGD-IC), a novel technique for the simultaneous analysis of trace ammonia (NH3) and methylamines (MAs) in aqueous media, is presented. Automated Flow Injection Gas Diffusion (FIGD) promotes the selective transmembrane diffusion of MAs and NH3 from the aqueous sample under strongly alkaline (pH > 12, NaOH), chelated (EDTA) conditions into a recycled acidic acceptor stream. The acceptor is then injected onto an ion chromatograph where NH3 and the MAs are fully resolved as their cations and detected conductimetrically. A versatile PC-interfaced control unit and data capture unit (DCU) are employed in series to direct the solenoid valve switching sequence, IC operation and collection of data. Automation, together with other modifications, improved both the linearity (R2 > 0.99; MAs 0-100 nM, NH3 0-1000 nM) and the precision (<8%) of FIGD-IC at nanomolar concentrations, compared with the manual procedure. The system was successfully applied to the determination of MAs and NH3 in seawater and in trapped particulate and gaseous atmospheric samples during an oceanographic research cruise. PMID:18925047
Replacement Sequence of Events Generator
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladden, Roy; Wenkert, Daniel; Khanampompan, Teerapat
2008-01-01
The soeWINDOW program automates the generation of an ITAR (International Traffic in Arms Regulations)-compliant sub-RSOE (Replacement Sequence of Events) by extracting a specified temporal window from an RSOE while maintaining page header information. RSOEs contain a significant amount of information that is not ITAR-compliant, yet that foreign partners need to see for command details to their instrument, as well as the surrounding commands that provide context for validation. soeWINDOW can serve as an example of how command support products can be made ITAR-compliant for future missions. This software is a Perl script intended for use in the mission operations UNIX environment. It is designed for use to support the MRO (Mars Reconnaissance Orbiter) instrument team. The tool also provides automated DOM (Distributed Object Manager) storage into the special ITAR-okay DOM collection, and can be used for creating focused RSOEs for product review by any of the MRO teams.
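soeWINDOW itself is a Perl script whose RSOE file format is not described in this summary. Purely to illustrate the general idea of extracting a temporal window from a sequence-of-events listing while preserving header information, here is a hedged Python sketch; the record layout, timestamp format and function name are assumptions.

```python
from datetime import datetime


def extract_window(lines, start, end, header_prefix="#"):
    """Keep header lines plus event records whose timestamp falls in [start, end].
    Assumes each event record begins with an ISO-8601 timestamp."""
    window = []
    for line in lines:
        if line.startswith(header_prefix):
            window.append(line)          # always keep page/header information
            continue
        stamp = datetime.fromisoformat(line.split()[0])
        if start <= stamp <= end:
            window.append(line)
    return window


# Illustrative listing only; not an actual RSOE.
rsoe = [
    "# RSOE header: spacecraft MRO (illustrative)",
    "2008-01-01T10:00:00 CMD_A",
    "2008-01-01T12:30:00 CMD_B",
    "2008-01-02T09:00:00 CMD_C",
]
sub = extract_window(rsoe,
                     datetime(2008, 1, 1, 11, 0),
                     datetime(2008, 1, 1, 23, 59))
print(sub)  # header line plus CMD_B only
```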
Automated agar plate streaker: a linear plater on Society for Biomolecular Sciences standard plates.
King, Gregory W; Kath, Gary S; Siciliano, Sal; Simpson, Neal; Masurekar, Prakash; Sigmund, Jan; Polishook, Jon; Skwish, Stephen; Bills, Gerald; Genilloud, Olga; Peláez, Fernando; Martín, Jesus; Dufresne, Claude
2006-09-01
Several protocols for bacterial isolation and techniques for aerobic plate counting rely on the use of a spiral plater to deposit concentration gradients of microbial suspensions onto a circular agar plate to isolate colony growth. The advantage of applying a gradient of concentrations across the agar surface is that the original microbiological sample can be applied at a single concentration rather than as multiple serial dilutions. The spiral plater gradually dilutes the sample across a compact area and therefore saves time preparing dilutions and multiple agar plates. Commercial spiral platers are not automated and require manual sample loading. The dispensed sample volume and the gradient rate are often very limited in range. Furthermore, the spiral sample application cannot be used with rectangular microplates. Another limitation of commercial spiral platers is that they are useful only for dilute, filtered suspensions and cannot plate suspensions of coarse organic particles, therefore precluding the use of many kinds of microorganism-containing substrata. An automated agar plate spreader capable of processing 99 rectangular microplates in unattended mode is described. This novel instrument is capable of dispensing discrete volumes of sample in a linear pattern. It can be programmed to dispense a sample suspension at a uniform application rate or across a decreasing concentration gradient.
Automating the Processing of Earth Observation Data
NASA Technical Reports Server (NTRS)
Golden, Keith; Pang, Wan-Lin; Nemani, Ramakrishna; Votava, Petr
2003-01-01
NASA's vision for Earth science is to build a "sensor web": an adaptive array of heterogeneous satellites and other sensors that will track important events, such as storms, and provide real-time information about the state of the Earth to a wide variety of customers. Achieving this vision will require automation not only in the scheduling of the observations but also in the processing of the resulting data. To address this need, we are developing a planner-based agent to automatically generate and execute data-flow programs to produce the requested data products.
Automated Data Processing (ADP) Research and Development,
1995-08-14
[OCR-damaged report excerpt; only fragments are legible.] ...individual explosions were 16x16 ft for M1 and 18x18 ft for M2... National Laboratory under contract W-7405-ENG-48... OBJECTIVES: Our primary objective is to develop efficient and reliable automated event location and... real seismograms; Figure 1 shows example wavelet coefficients (in the transform domain) and bandpass-filtered versions of a seismogram as a function of...
Bacterial contamination of platelet components not detected by BacT/ALERT®.
Abela, M A; Fenning, S; Maguire, K A; Morris, K G
2018-02-01
To investigate the possible causes for false-negative results in the BacT/ALERT® 3D Signature System despite bacterial contamination of platelet units. The Northern Ireland Blood Transfusion Service (NIBTS) routinely extends platelet component shelf life to 7 days. Components are sampled and screened for bacterial contamination using an automated microbial detection system, the BacT/ALERT® 3D Signature System. We report on three platelet components with confirmed bacterial contamination, which represent false-negative BacT/ALERT® results and near-miss serious adverse events. NIBTS protocols for risk reduction of bacterial contamination of platelet components are described. The methodology for bacterial detection using BacT/ALERT® is outlined. Laboratory tests, relevant patient details and relevant follow-up information are analysed. In all three cases, Staphylococcus aureus was isolated from the platelet residue and confirmed on terminal sub-culture using BacT/ALERT®. In two cases, S. aureus with similar genetic makeup was isolated from the donors. Risk reduction measures for bacterial contamination of platelet components are not always effective. Automated bacterial culture detection does not eliminate the risk of bacterial contamination. Visual inspection of platelet components prior to release, issue and administration remains an important last line of defence. © 2017 British Blood Transfusion Society.
Dynamic analysis of apoptosis using cyanine SYTO probes: From classical to microfluidic cytometry
Wlodkowic, Donald; Skommer, Joanna; Faley, Shannon; Darzynkiewicz, Zbigniew; Cooper, Jonathan M.
2013-01-01
Cell death is a stochastic process, often initiated and/or executed in a multi-pathway/multi-organelle fashion. Therefore, high-throughput single-cell analysis platforms are required to provide detailed characterization of kinetics and mechanisms of cell death in heterogeneous cell populations. However, there is still a largely unmet need for inert fluorescent probes, suitable for prolonged kinetic studies. Here, we compare the use of an innovative adaptation of unsymmetrical SYTO dyes for dynamic real-time analysis of apoptosis in conventional as well as microfluidic chip-based systems. We show that cyanine SYTO probes allow non-invasive tracking of intracellular events over extended time. Easy handling and “stain–no wash” protocols open up new opportunities for high-throughput analysis and live-cell sorting. Furthermore, SYTO probes are easily adaptable for detection of cell death using automated microfluidic chip-based cytometry. Overall, the combined use of SYTO probes and a state-of-the-art Lab-on-a-Chip platform emerges as a cost-effective solution for automated drug screening compared to conventional Annexin V or TUNEL assays. In particular, it should allow for dynamic analysis of samples where low cell number has so far been an obstacle, e.g. primary cancer stem cells or circulating minimal residual tumors. PMID:19298813
Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia
2018-04-15
Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.
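One of the reported figures is the average precision of sampled events when ranked by the automatically computed confidence value. For readers unfamiliar with that metric, a minimal sketch follows; the ranked relevance labels are invented for illustration.

```python
def average_precision(ranked_labels):
    """Average precision for events ranked by descending confidence.
    ranked_labels[i] is 1 if the i-th ranked event was judged correct, else 0."""
    hits, precision_sum = 0, 0.0
    for i, label in enumerate(ranked_labels, start=1):
        if label:
            hits += 1
            precision_sum += hits / i   # precision at each correct hit
    return precision_sum / hits if hits else 0.0


# Hypothetical curator judgements for ten events sorted by confidence.
print(round(average_precision([1, 1, 0, 1, 0, 0, 1, 0, 0, 0]), 3))
```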
NASA Astrophysics Data System (ADS)
Brassard, D.; Clime, L.; Daoud, J.; Geissler, M.; Malic, L.; Charlebois, D.; Buckley, N.; Veres, T.
2018-02-01
An innovative centrifugal microfluidic universal platform is presented for the remote automation of bio-analytical assays required in life-sciences research and medical applications, including the purification and analysis of cellular and circulating markers from body fluids.
Greenspoon, Susan A; Ban, Jeffrey D; Sykes, Karen; Ballard, Elizabeth J; Edler, Shelley S; Baisden, Melissa; Covington, Brian L
2004-01-01
Robotic systems are commonly utilized for the extraction of database samples. However, the application of robotic extraction to forensic casework samples is a more daunting task. Such a system must be versatile enough to accommodate a wide range of samples that may contain greatly varying amounts of DNA, but it must also pose no more risk of contamination than the manual DNA extraction methods. This study demonstrates that the BioMek 2000 Laboratory Automation Workstation, used in combination with the DNA IQ System, is versatile enough to accommodate the wide range of samples typically encountered by a crime laboratory. The use of a silica coated paramagnetic resin, as with the DNA IQ System, facilitates the adaptation of an open well, hands off, robotic system to the extraction of casework samples since no filtration or centrifugation steps are needed. Moreover, the DNA remains tightly coupled to the silica coated paramagnetic resin for the entire process until the elution step. A short pre-extraction incubation step is necessary prior to loading samples onto the robot and it is at this step that most modifications are made to accommodate the different sample types and substrates commonly encountered with forensic evidentiary samples. Sexual assault (mixed stain) samples, cigarette butts, blood stains, buccal swabs, and various tissue samples were successfully extracted with the BioMek 2000 Laboratory Automation Workstation and the DNA IQ System, with no evidence of contamination throughout the extensive validation studies reported here.
Automated sample area definition for high-throughput microscopy.
Zeder, M; Ellrott, A; Amann, R
2011-04-01
High-throughput screening platforms based on epifluorescence microscopy are powerful tools in a variety of scientific fields. Although some applications are based on imaging geometrically defined samples such as microtiter plates, multiwell slides, or spotted gene arrays, others need to cope with inhomogeneously located samples on glass slides. The analysis of microbial communities in aquatic systems by sample filtration on membrane filters followed by multiple fluorescent staining, or the investigation of tissue sections are examples. Therefore, we developed a strategy for flexible and fast definition of sample locations by the acquisition of whole slide overview images and automated sample recognition by image analysis. Our approach was tested on different microscopes and the computer programs are freely available (http://www.technobiology.ch). Copyright © 2011 International Society for Advancement of Cytometry.
Brown, G.E.; McLain, B.J.
1994-01-01
The analysis of natural-water samples for antimony by automated-hydride atomic absorption spectrophotometry is described. Samples are prepared for analysis by addition of potassium and hydrochloric acid followed by an autoclave digestion. After the digestion, potassium iodide and sodium borohydride are added automatically. Antimony hydride (stibine) gas is generated, then swept into a heated quartz cell for determination of antimony by atomic absorption spectrophotometry. Precision and accuracy data are presented. Results obtained on standard reference water samples agree with means established by interlaboratory studies. Spike recoveries for actual samples range from 90 to 114 percent. Replicate analyses of water samples of varying matrices give relative standard deviations from 3 to 10 percent.
Device and method for automated separation of a sample of whole blood into aliquots
Burtis, Carl A.; Johnson, Wayne F.
1989-01-01
A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.
Hatem, S M; Hu, L; Ragé, M; Gierasimowicz, A; Plaghki, L; Bouhassira, D; Attal, N; Iannetti, G D; Mouraux, A
2012-12-01
To assess the clinical usefulness of an automated analysis of event-related potentials (ERPs). Nociceptive laser-evoked potentials (LEPs) and non-nociceptive somatosensory electrically-evoked potentials (SEPs) were recorded in 37 patients with syringomyelia and 21 controls. LEP and SEP peak amplitudes and latencies were estimated using a single-trial automated approach based on time-frequency wavelet filtering and multiple linear regression, as well as a conventional approach based on visual inspection. The amplitudes and latencies of normal and abnormal LEP and SEP peaks were identified reliably using both approaches, with similar sensitivity and specificity. Because the automated approach provided an unbiased solution to account for average waveforms where no ERP could be identified visually, it revealed significant differences between patients and controls that were not revealed using the visual approach. The automated analysis of ERPs reliably and objectively characterized LEP and SEP waveforms in patients. The automated single-trial analysis can be used to characterize normal and abnormal ERPs with a similar sensitivity and specificity as visual inspection. While this does not justify its use in a routine clinical setting, the technique could be useful to avoid observer-dependent biases in clinical research. Copyright © 2012 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
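The single-trial approach combines time-frequency wavelet filtering with multiple linear regression against template waveforms. The sketch below strips this down to the regression step alone, on simulated data with a single peak template, so it illustrates how per-trial amplitudes can be estimated by least squares rather than reproducing the published method.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)                                   # 1 s epoch, 500 samples
template = np.exp(-((t - 0.25) ** 2) / (2 * 0.03 ** 2))      # idealized LEP-like peak

# Simulate 40 single trials: scaled template plus noise.
true_amplitudes = rng.uniform(2.0, 8.0, size=40)
trials = true_amplitudes[:, None] * template + rng.normal(0, 1.0, size=(40, t.size))

# Multiple linear regression per trial: design matrix = [template, constant].
X = np.column_stack([template, np.ones_like(t)])
coef, *_ = np.linalg.lstsq(X, trials.T, rcond=None)
estimated_amplitudes = coef[0]

# Estimated amplitudes should track the simulated ones closely.
print(np.corrcoef(true_amplitudes, estimated_amplitudes)[0, 1])
```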
ERIC Educational Resources Information Center
Meckler, Elizabeth M.
This paper examines the belief that no more than half of the public school libraries in the state of Ohio are automated to any degree. The purpose of the research was to determine the degree and nature of automation at the public school libraries in Ohio. A written survey was mailed to 350 libraries that represented a randomized sample of the…
Determination of ²⁴¹Am in soil using an automated nuclear radiation measurement laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engstrom, D.E.; White, M.G.; Dunaway, P.B.
The recent completion of REECo's Automated Laboratory and associated software systems has provided a significant increase in capability while reducing manpower requirements. The system is designed to perform gamma spectrum analyses on the large numbers of samples required by the current Nevada Applied Ecology Group (NAEG) and Plutonium Distribution Inventory Program (PDIP) soil sampling programs while maintaining sufficient sensitivities as defined by earlier investigations of the same type. The hardware and systems are generally described in this paper, with emphasis being placed on spectrum reduction and the calibration procedures used for soil samples. (auth)
A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data
NASA Astrophysics Data System (ADS)
Kohl, B. C.; Given, J.
2017-12-01
The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that uses information from non-detecting stations; it uses source information, in particular the spectral characteristics of events of interest; it is entirely model-based, i.e. it does not rely on a priori ground truth, which is particularly important for nuclear monitoring; and it does not rely on individualized signal detection thresholds - it is the network solution that matters.
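The abstract outlines mapping waveform data into conditional probability traces and combining them across the network with a Bayesian formulation. The published ProbDet algorithm is not reproduced here; the sketch below shows only a generic fusion step, assuming station-wise independence and per-sample probabilities that an event of the modelled type is present.

```python
import numpy as np


def fuse_station_probabilities(station_probs, prior=1e-4, eps=1e-12):
    """Combine per-station, per-sample event probabilities into a network trace.

    Assumes conditional independence across stations: posterior odds are the
    prior odds multiplied by each station's likelihood ratio p/(1-p).
    station_probs: array of shape (n_stations, n_samples)."""
    p = np.clip(np.asarray(station_probs, dtype=float), eps, 1 - eps)
    log_odds = np.log(prior / (1 - prior)) + np.sum(np.log(p / (1 - p)), axis=0)
    return 1.0 / (1.0 + np.exp(-log_odds))


# Three hypothetical stations, five time samples; only sample 2 looks event-like.
traces = np.array([[0.01, 0.02, 0.90, 0.05, 0.01],
                   [0.02, 0.01, 0.85, 0.03, 0.02],
                   [0.01, 0.03, 0.80, 0.04, 0.01]])
print(fuse_station_probabilities(traces).round(4))
```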
Leb, Victoria; Stöcher, Markus; Valentine-Thon, Elizabeth; Hölzl, Gabriele; Kessler, Harald; Stekel, Herbert; Berg, Jörg
2004-02-01
We report on the development of a fully automated real-time PCR assay for the quantitative detection of hepatitis B virus (HBV) DNA in plasma with EDTA (EDTA plasma). The MagNA Pure LC instrument was used for automated DNA purification and automated preparation of PCR mixtures. Real-time PCR was performed on the LightCycler instrument. An internal amplification control was devised as a PCR competitor and was introduced into the assay at the stage of DNA purification to permit monitoring for sample adequacy. The detection limit of the assay was found to be 200 HBV DNA copies/ml, with a linear dynamic range of 8 orders of magnitude. When samples from the European Union Quality Control Concerted Action HBV Proficiency Panel 1999 were examined, the results were found to be in acceptable agreement with the HBV DNA concentrations of the panel members. In a clinical laboratory evaluation of 123 EDTA plasma samples, a significant correlation was found with the results obtained by the Roche HBV Monitor test on the Cobas Amplicor analyzer within the dynamic range of that system. In conclusion, the newly developed assay has a markedly reduced hands-on time, permits monitoring for sample adequacy, and is suitable for the quantitative detection of HBV DNA in plasma in a routine clinical laboratory.
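Quantitative results such as a 200 copies/ml detection limit over an 8-log dynamic range rest on a standard curve relating threshold cycle (Ct) to log concentration. The assay's actual calibration is not given above, so the sketch below uses invented Ct values solely to show how copies per millilitre are interpolated from such a curve.

```python
import numpy as np

# Hypothetical standards spanning part of the dynamic range: copies/ml vs Ct.
standards_copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])
standards_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

# Fit Ct = slope * log10(copies) + intercept.
slope, intercept = np.polyfit(np.log10(standards_copies), standards_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1          # amplification efficiency from the slope


def copies_per_ml(ct):
    # Invert the calibration line to interpolate an unknown sample.
    return 10 ** ((ct - intercept) / slope)


print(f"efficiency ~ {efficiency:.2f}")
print(f"sample with Ct 27.5 -> {copies_per_ml(27.5):.2e} copies/ml")
```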
Teilmann, Anne Charlotte; Rozell, Björn; Kalliokoski, Otto; Hau, Jann; Abelson, Klas S P
2016-01-01
Automated blood sampling through a vascular catheter is a frequently utilized technique in laboratory mice. The potential immunological and physiological implications associated with this technique have, however, not been investigated in detail. The present study compared plasma levels of the cytokines IL-1β, IL-2, IL-6, IL-10, IL-17A, GM-CSF, IFN-γ and TNF-α in male NMRI mice that had been subjected to carotid artery catheterization and subsequent automated blood sampling with age-matched control mice. Body weight and histopathological changes in the surgical area, including the salivary glands, the heart, brain, spleen, liver, kidneys and lungs were compared. Catheterized mice had higher levels of IL-6 than did control mice, but other cytokine levels did not differ between the groups. No significant difference in body weight was found. The histology revealed inflammatory and regenerative (healing) changes at surgical sites of all catheterized mice, with mild inflammatory changes extending into the salivary glands. Several catheterized mice had multifocal degenerative to necrotic changes with inflammation in the heart, kidneys and livers, suggesting that thrombi had detached from the catheter tip and embolized to distant sites. Thus, catheterization and subsequent automated blood sampling may have physiological impact. Possible confounding effects of visceral damage should be assessed and considered, when using catheterized mouse models.
Providing security for automated process control systems at hydropower engineering facilities
NASA Astrophysics Data System (ADS)
Vasiliev, Y. S.; Zegzhda, P. D.; Zegzhda, D. P.
2016-12-01
This article proposes the concept of a cyberphysical system to manage the computer security of automated process control systems at hydropower engineering facilities. According to the authors, this system consists of a set of information processing tools and computer-controlled physical devices. Examples of cyber attacks on power engineering facilities are provided, and a strategy for improving the cybersecurity of hydropower engineering systems is suggested. The architecture of the multilevel protection of the automated process control system (APCS) of power engineering facilities is given, including security systems, control systems, access control, encryption, and a secure virtual private network of subsystems for monitoring and analysis of security events. The distinctive aspect of the approach is its consideration of the interrelations and cyber threats that arise when SCADA is integrated with the unified enterprise information system.
Planas, Carles; Palacios, Oscar; Ventura, Francesc; Rivera, Josep; Caixach, Josep
2008-08-15
A method based on automated solid-phase extraction (SPE) and isotope dilution gas chromatography/high resolution mass spectrometry (GC/HRMS) has been developed for the analysis of nine nitrosamines in water samples. The combination of automated SPE and GC/HRMS for the analysis of nitrosamines has not been reported previously. Advantages of the method are the selectivity and sensitivity of GC/HRMS analysis and the high efficiency of automated SPE with coconut charcoal EPA 521 cartridges. Low method detection limits (MDLs) were achieved, along with a simpler procedure and less dependence on the operator compared with methods based on manual SPE. Quality requirements for isotope dilution-based methods were met for most analysed nitrosamines with regard to trueness (80-120%), method precision (<15%) and MDLs (0.08-1.7 ng/L). Nineteen water samples (16 samples from a drinking water treatment plant {DWTP}, 2 chlorinated samples from a sewage treatment plant {STP} effluent, and 1 chlorinated sample from a reservoir) were analysed. Concentrations of nitrosamines in the STP effluent were 309.4 and 730.2 ng/L, being higher when higher doses of chlorine were applied. N-Nitrosodimethylamine (NDMA) and N-nitrosodiethylamine (NDEA) were the main compounds identified in the STP effluent, and NDEA was detected above 200 ng/L, the regulatory level for NDMA in effluents in Ontario (Canada). Lower concentrations of nitrosamines were found in the reservoir (20.3 ng/L) and in the DWTP samples (n.d.-28.6 ng/L). NDMA and NDEA were found in the reservoir and in treated and highly chlorinated DWTP samples, respectively, at concentrations above 10 ng/L (the guide value established in different countries). The highest concentrations of nitrosamines were found after chlorination and ozonation processes (ozonated, treated and highly chlorinated water) in DWTP samples.
Enjeti, Anoop; Granter, Neil; Ashraf, Asma; Fletcher, Linda; Branford, Susan; Rowlings, Philip; Dooley, Susan
2015-10-01
An automated cartridge-based detection system (GeneXpert; Cepheid) is being widely adopted in low-throughput laboratories for monitoring BCR-ABL1 transcripts in chronic myelogenous leukaemia. This Australian study evaluated the longitudinal performance-specific characteristics of the automated system. The automated cartridge-based system was compared prospectively with the manual qRT-PCR-based reference method at SA Pathology, Adelaide, over a period of 2.5 years. A conversion factor determination was followed by four re-validations. Peripheral blood samples (n = 129) with international scale (IS) values within detectable range were selected for assessment. The mean bias, proportion of results within specified fold difference (2-, 3- and 5-fold), the concordance rate of major molecular remission (MMR) and concordance across a range of IS values on paired samples were evaluated. The initial conversion factor for the automated system was determined as 0.43. Except for the second re-validation, where a negative bias of 1.9-fold was detected, all other biases fell within desirable limits. A cartridge-specific conversion factor and efficiency value was introduced and the conversion factor was confirmed to be stable in subsequent re-validation cycles. Concordance with the reference method/laboratory at >0.1-≤10 IS was 78.2% and at ≤0.001 was 80%, compared to 86.8% in the >0.01-≤0.1 IS range. The overall and MMR concordance were 85.7% and 94%, respectively, for samples that fell within ± 5-fold of the reference laboratory value over the entire period of study. Conversion factor and performance-specific characteristics for the automated system were longitudinally stable in the clinically relevant range, following introduction by the manufacturer of lot-specific efficiency values.
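The comparison hinges on converting raw BCR-ABL1 ratios to the international scale (IS) with a laboratory-specific conversion factor and then checking fold-differences against the reference method. A minimal sketch of those two calculations follows; the paired values are illustrative and not taken from the study.

```python
def to_international_scale(raw_ratio_percent, conversion_factor):
    """Convert a laboratory BCR-ABL1/control ratio (%) to the international scale."""
    return raw_ratio_percent * conversion_factor


def fold_difference(test_is, reference_is):
    """Symmetric fold-difference between paired IS results."""
    hi, lo = max(test_is, reference_is), min(test_is, reference_is)
    return hi / lo


CF = 0.43                      # conversion factor reported for the automated system
# (automated raw ratio %, reference laboratory IS) pairs - invented for illustration.
paired = [(0.08, 0.05), (1.2, 0.9), (12.0, 4.0)]

for raw, ref in paired:
    is_value = to_international_scale(raw, CF)
    within_5_fold = fold_difference(is_value, ref) <= 5
    mmr = is_value <= 0.1      # major molecular remission threshold on the IS
    print(f"IS={is_value:.3f}  MMR={mmr}  within 5-fold of reference={within_5_fold}")
```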
Saikali, Melody; Tanios, Alain; Saab, Antoine
2017-11-21
The aim of the study was to evaluate the sensitivity and resource efficiency of a partially automated adverse event (AE) surveillance system for routine patient safety efforts in hospitals with limited resources. Twenty-eight automated triggers from the hospital information system's clinical and administrative databases identified cases that were then filtered by exclusion criteria per trigger and then reviewed by an interdisciplinary team. The system, developed and implemented using in-house resources, was applied for 45 days of surveillance, for all hospital inpatient admissions (N = 1107). Each trigger was evaluated for its positive predictive value (PPV). Furthermore, the sensitivity of the surveillance system (overall and by AE category) was estimated relative to incidence ranges in the literature. The surveillance system identified a total of 123 AEs among 283 reviewed medical records, yielding an overall PPV of 52%. The tool showed variable levels of sensitivity across and within AE categories when compared with the literature, with a relatively low overall sensitivity estimated between 21% and 44%. Adverse events were detected in 23 of the 36 AE categories defined by an established harm classification system. Furthermore, none of the detected AEs were voluntarily reported. The surveillance system showed variable sensitivity levels across a broad range of AE categories with an acceptable PPV, overcoming certain limitations associated with other harm detection methods. The number of cases captured was substantial, and none had been previously detected or voluntarily reported. For hospitals with limited resources, this methodology provides valuable safety information from which interventions for quality improvement can be formulated.
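The headline figure for each trigger is its positive predictive value (PPV): confirmed adverse events divided by flagged records reviewed. The short sketch below shows that bookkeeping; the trigger names and counts are invented, and the simple sum ignores records flagged by more than one trigger.

```python
# Hypothetical per-trigger review counts: (records flagged and reviewed, confirmed AEs).
trigger_counts = {
    "naloxone administered": (12, 9),
    "INR > 6": (20, 8),
    "return to operating room": (15, 6),
}

total_reviewed = sum(reviewed for reviewed, _ in trigger_counts.values())
total_confirmed = sum(confirmed for _, confirmed in trigger_counts.values())

for name, (reviewed, confirmed) in trigger_counts.items():
    print(f"{name}: PPV = {confirmed / reviewed:.0%}")

# Overall PPV, treating each reviewed record as flagged by a single trigger.
print(f"overall PPV = {total_confirmed / total_reviewed:.0%}")
```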
NASA Astrophysics Data System (ADS)
Rasmussen, John C.; Bautista, Merrick; Tan, I.-Chih; Adams, Kristen E.; Aldrich, Melissa; Marshall, Milton V.; Fife, Caroline E.; Maus, Erik A.; Smith, Latisha A.; Zhang, Jingdan; Xiang, Xiaoyan; Zhou, Shaohua Kevin; Sevick-Muraca, Eva M.
2011-02-01
Recently, we demonstrated near-infrared (NIR) fluorescence imaging for quantifying real-time lymphatic propulsion in humans following intradermal injections of microdose amounts of indocyanine green. However computational methods for image analysis are underdeveloped, hindering the translation and clinical adaptation of NIR fluorescent lymphatic imaging. In our initial work we used ImageJ and custom MatLab programs to manually identify lymphatic vessels and individual propulsion events using the temporal transit of the fluorescent dye. In addition, we extracted the apparent velocities of contractile propagation and time periods between propulsion events. Extensive time and effort were required to analyze the 6-8 gigabytes of NIR fluorescent images obtained for each subject. To alleviate this bottleneck, we commenced development of ALFIA, an integrated software platform which will permit automated, near real-time analysis of lymphatic function using NIR fluorescent imaging. However, prior to automation, the base algorithms calculating the apparent velocity and period must be validated to verify that they produce results consistent with the proof-of-concept programs. To do this, both methods were used to analyze NIR fluorescent images of two subjects and the number of propulsive events identified, the average apparent velocities, and the average periods for each subject were compared. Paired Student's t-tests indicate that the differences between their average results are not significant. With the base algorithms validated, further development and automation of ALFIA can be realized, significantly reducing the amount of user interaction required, and potentially enabling the near real-time, clinical evaluation of NIR fluorescent lymphatic imaging.
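The validation compares, per subject, the propulsion metrics (apparent propagation velocity and period between events) produced by the proof-of-concept programs and by ALFIA's base algorithms, using paired Student's t-tests. A small sketch of that comparison with SciPy follows; the velocity values are invented.

```python
from scipy import stats

# Hypothetical apparent propagation velocities (cm/s) for one subject, for the
# same propulsion events analyzed by the manual and automated methods.
manual_velocity = [0.83, 1.10, 0.95, 0.78, 1.02]
automated_velocity = [0.85, 1.05, 0.97, 0.80, 1.00]

t_stat, p_value = stats.ttest_rel(manual_velocity, automated_velocity)
print(f"paired t = {t_stat:.2f}, p = {p_value:.2f}")
if p_value > 0.05:
    print("no significant difference between the two analysis methods")
```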
NASA Astrophysics Data System (ADS)
Bhattacharya, D.; Painho, M.
2017-09-01
The paper endeavours to enhance the Sensor Web with crucial geospatial analysis capabilities through integration with Spatial Data Infrastructure. The objective is the development of an automated smart-cities intelligence system (SMACiSYS) with sensor-web access (SENSDI), utilizing geomatics for sustainable societies. There has been a need to develop an automated, integrated system that categorizes events and issues information that reaches users directly. At present, no web-enabled information system exists which can disseminate messages after event evaluation in real time. The research formalizes the notion of an integrated, independent, generalized, and automated geo-event analysing system that makes use of geospatial data on a widely used platform. Integrating Sensor Web With Spatial Data Infrastructures (SENSDI) aims to extend SDIs with sensor web enablement, converging geospatial and built infrastructure, and to implement test cases with sensor data and SDI. The converse benefit is the expansion of spatial data infrastructure to utilize the sensor web, dynamically and in real time, for the smart applications that smarter cities demand nowadays. Hence, SENSDI augments existing smart-cities platforms with sensor web and spatial information by coupling pairs of otherwise disjoint interfaces and APIs formulated by the Open Geospatial Consortium (OGC), keeping the entire platform open access and open source. SENSDI is based on Geonode, QGIS and Java, which bind most of the functionalities of the Internet, the sensor web and, increasingly, the Internet of Things, which is superseding the Internet of Sensors. In a nutshell, the project delivers a generalized, real-time, accessible and analysable platform for sensing the environment and mapping the captured information for optimal decision-making and societal benefit.
Development of an integrated semi-automated system for in vitro pharmacodynamic modelling.
Wang, Liangsu; Wismer, Michael K; Racine, Fred; Conway, Donald; Giacobbe, Robert A; Berejnaia, Olga; Kath, Gary S
2008-11-01
The aim of this study was to develop an integrated system for in vitro pharmacodynamic modelling of antimicrobials with greater flexibility, easier control and better accuracy than existing in vitro models. Custom-made bottle caps, fittings, valve controllers and a modified bench-top shaking incubator were used. A temperature-controlled automated sample collector was built. Computer software was developed to manage experiments and to control the entire system including solenoid pinch valves, peristaltic pumps and the sample collector. The system was validated by pharmacokinetic simulations of a linezolid 600 mg infusion. The antibacterial effect of linezolid against multiple Staphylococcus aureus strains was also studied in this system. An integrated semi-automated bench-top system was built and validated. The temperature-controlled automated sample collector allowed unattended collection and temporary storage of samples. The system software reduced the labour necessary for many tasks and also improved the timing accuracy for performing simultaneous actions in multiple parallel experiments. The system was able to simulate the human pharmacokinetics of a linezolid 600 mg intravenous infusion accurately. A pharmacodynamic study of linezolid against multiple S. aureus strains with a range of MICs showed that the required 24 h free-drug AUC/MIC ratio was approximately 30 in order to keep the organism counts at the same level as their initial inoculum and was approximately ≥68 in order to achieve a >2 log10 cfu/mL reduction in the in vitro model. The integrated semi-automated bench-top system provided the ability to overcome many of the drawbacks of existing in vitro models. It can be used for various simple or complicated pharmacokinetic/pharmacodynamic studies efficiently and conveniently.
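The pharmacokinetic validation simulates a 600 mg linezolid infusion, and the pharmacodynamic read-out is the 24 h free-drug AUC/MIC ratio. The sketch below implements a simple one-compartment intravenous-infusion model for that purpose; the half-life, volume of distribution, unbound fraction, infusion duration and dosing interval are textbook-style assumptions rather than the parameters used in the study.

```python
import numpy as np

dose_mg = 600.0
t_inf_h = 1.0            # assumed infusion duration
half_life_h = 5.0        # assumed linezolid half-life
v_d_l = 45.0             # assumed volume of distribution
free_fraction = 0.69     # assumed unbound fraction
mic_mg_l = 2.0

k = np.log(2) / half_life_h
rate = dose_mg / t_inf_h
t = np.linspace(0, 24, 2401)

# One-compartment IV infusion: accumulation during infusion, exponential decay after.
c_end = (rate / (k * v_d_l)) * (1 - np.exp(-k * t_inf_h))
conc = np.where(t <= t_inf_h,
                (rate / (k * v_d_l)) * (1 - np.exp(-k * t)),
                c_end * np.exp(-k * (t - t_inf_h)))

# Approximate a 12-hourly regimen by superposing a second dose starting at t = 12 h.
conc_2 = np.where(t >= 12, np.interp(t - 12, t, conc), 0.0)
total = conc + conc_2

free = free_fraction * total
f_auc_24 = np.sum((free[1:] + free[:-1]) / 2 * np.diff(t))   # trapezoidal AUC(0-24)
print(f"fAUC(0-24)/MIC ~ {f_auc_24 / mic_mg_l:.0f}")
```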
Comparison of Inoculation with the InoqulA and WASP Automated Systems with Manual Inoculation
Croxatto, Antony; Dijkstra, Klaas; Prod'hom, Guy
2015-01-01
The quality of sample inoculation is critical for achieving an optimal yield of discrete colonies in both monomicrobial and polymicrobial samples to perform identification and antibiotic susceptibility testing. Consequently, we compared the performance between the InoqulA (BD Kiestra), the WASP (Copan), and manual inoculation methods. Defined mono- and polymicrobial samples of 4 bacterial species and cloudy urine specimens were inoculated on chromogenic agar by the InoqulA, the WASP, and manual methods. Images taken with ImagA (BD Kiestra) were analyzed with the VisionLab version 3.43 image analysis software to assess the quality of growth and to prevent subjective interpretation of the data. A 3- to 10-fold higher yield of discrete colonies was observed following automated inoculation with both the InoqulA and WASP systems than that with manual inoculation. The difference in performance between automated and manual inoculation was mainly observed at concentrations of >10⁶ bacteria/ml. Inoculation with the InoqulA system allowed us to obtain significantly more discrete colonies than the WASP system at concentrations of >10⁷ bacteria/ml. However, the level of difference observed was bacterial species dependent. Discrete colonies of bacteria present in 100- to 1,000-fold lower concentrations than the most concentrated populations in defined polymicrobial samples were not reproducibly recovered, even with the automated systems. The analysis of cloudy urine specimens showed that InoqulA inoculation provided a statistically significantly higher number of discrete colonies than that with WASP and manual inoculation. Consequently, the automated InoqulA inoculation greatly decreased the requirement for bacterial subculture and thus resulted in a significant reduction in the time to results, laboratory workload, and laboratory costs. PMID:25972424
Automated High-Throughput Permethylation for Glycosylation Analysis of Biologics Using MALDI-TOF-MS.
Shubhakar, Archana; Kozak, Radoslaw P; Reiding, Karli R; Royle, Louise; Spencer, Daniel I R; Fernandes, Daryl L; Wuhrer, Manfred
2016-09-06
Monitoring glycoprotein therapeutics for changes in glycosylation throughout the drug's life cycle is vital, as glycans significantly modulate the stability, biological activity, serum half-life, safety, and immunogenicity. Biopharma companies are increasingly adopting Quality by Design (QbD) frameworks for measuring, optimizing, and controlling drug glycosylation. Permethylation of glycans prior to analysis by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) is a valuable tool for glycan characterization and for screening of large numbers of samples in QbD drug realization. However, the existing protocols for manual permethylation and liquid-liquid extraction (LLE) steps are labor intensive and are thus not practical for high-throughput (HT) studies. Here we present a glycan permethylation protocol, based on 96-well microplates, that has been developed into a kit suitable for HT work. The workflow is largely automated using a liquid handling robot and includes N-glycan release, enrichment of N-glycans, permethylation, and LLE. The kit has been validated according to industry analytical performance guidelines and applied to characterize biopharmaceutical samples, including IgG4 monoclonal antibodies (mAbs) and recombinant human erythropoietin (rhEPO). The HT permethylation enabled glycan characterization and relative quantitation with minimal side reactions: the MALDI-TOF-MS profiles obtained were in good agreement with hydrophilic liquid interaction chromatography (HILIC) and ultrahigh performance liquid chromatography (UHPLC) data. Automated permethylation and extraction of 96 glycan samples was achieved in less than 5 h and automated data acquisition on MALDI-TOF-MS took on average less than 1 min per sample. This automated and HT glycan preparation and permethylation showed to be convenient, fast, and reliable and can be applied for drug glycan profiling and clinical glycan biomarker studies.
Futamura, Megumi; Sugama, Junko; Okuwa, Mayumi; Sanada, Hiromi; Tabata, Keiko
2008-12-01
This study objectively evaluated the degree of comfort in bedridden older adults using an air-cell mattress with an automated turning mechanism. The sample included 10 bedridden women with verbal communication difficulties. The high frequency (HF) components of heart rate variability, which reflect parasympathetic nervous activity, were compared for the manual and automated turning periods. No significant differences in the HF component were observed in 5 of the participants. Significant increases in the HF component associated with automated turning were observed in 3 participants; however, the two participants with the lowest body mass index values exhibited a significant reduction in the HF component during the automated turning period. The results revealed that comfort might not be disturbed during the automated turning period.
Concept of Operations for Road Weather Connected Vehicle and Automated Vehicle Applications
DOT National Transportation Integrated Search
2017-05-21
Weather has a significant impact on the operations of the nation's roadway system year round. These weather events translate into changes in traffic conditions, roadway safety, travel reliability, operational effectiveness and productivity. It is, th...
Álvarez, Daniel; Alonso-Álvarez, María L.; Gutiérrez-Tobal, Gonzalo C.; Crespo, Andrea; Kheirandish-Gozal, Leila; Hornero, Roberto; Gozal, David; Terán-Santos, Joaquín; Del Campo, Félix
2017-01-01
Study Objectives: Nocturnal oximetry has become known as a simple, readily available, and potentially useful diagnostic tool of childhood obstructive sleep apnea (OSA). However, at-home respiratory polygraphy (HRP) remains the preferred alternative to polysomnography (PSG) in unattended settings. The aim of this study was twofold: (1) to design and assess a novel methodology for pediatric OSA screening based on automated analysis of at-home oxyhemoglobin saturation (SpO2), and (2) to compare its diagnostic performance with HRP. Methods: SpO2 recordings were parameterized by means of time, frequency, and conventional oximetric measures. Logistic regression models were optimized using genetic algorithms (GAs) for three cutoffs for OSA: 1, 3, and 5 events/h. The diagnostic performance of logistic regression models, manual obstructive apnea-hypopnea index (OAHI) from HRP, and the conventional oxygen desaturation index ≥ 3% (ODI3) were assessed. Results: For a cutoff of 1 event/h, the optimal logistic regression model significantly outperformed both conventional HRP-derived ODI3 and OAHI: 85.5% accuracy (HRP 74.6%; ODI3 65.9%) and 0.97 area under the receiver operating characteristics curve (AUC) (HRP 0.78; ODI3 0.75) were reached. For a cutoff of 3 events/h, the logistic regression model achieved 83.4% accuracy (HRP 85.0%; ODI3 74.5%) and 0.96 AUC (HRP 0.93; ODI3 0.85) whereas using a cutoff of 5 events/h, oximetry reached 82.8% accuracy (HRP 85.1%; ODI3 76.7) and 0.97 AUC (HRP 0.95; ODI3 0.84). Conclusions: Automated analysis of at-home SpO2 recordings provide accurate detection of children with high pretest probability of OSA. Thus, unsupervised nocturnal oximetry may enable a simple and effective alternative to HRP and PSG in unattended settings. Citation: Álvarez D, Alonso-Álvarez ML, Gutiérrez-Tobal GC, Crespo A, Kheirandish-Gozal L, Hornero R, Gozal D, Terán-Santos J, Del Campo F. Automated screening of children with obstructive sleep apnea using nocturnal oximetry: an alternative to respiratory polygraphy in unattended settings. J Clin Sleep Med. 2017;13(5):693–702. PMID:28356177
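The screening models are logistic regressions over oximetric features (optimized in the study with genetic algorithms, a step omitted here). As a hedged illustration of how such a model is fitted and summarized by accuracy and AUC, the sketch below uses synthetic stand-in features rather than the study's SpO2-derived variables.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 400
# Latent apnea-hypopnea index and the OSA label at a 3 events/h cutoff.
ahi = rng.gamma(shape=1.5, scale=3.0, size=n)
y = (ahi >= 3.0).astype(int)

# Synthetic stand-ins for oximetric features, each a noisy function of AHI.
odi3 = ahi * rng.uniform(0.6, 1.2, size=n)              # oxygen desaturation index proxy
min_spo2 = 96 - 0.8 * ahi + rng.normal(0, 2.0, size=n)  # minimum saturation proxy
power = 0.05 * ahi + rng.normal(0, 0.2, size=n)         # spectral power proxy
X = np.column_stack([odi3, min_spo2, power])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

prob = model.predict_proba(X_te)[:, 1]
print(f"accuracy = {accuracy_score(y_te, model.predict(X_te)):.2f}")
print(f"AUC      = {roc_auc_score(y_te, prob):.2f}")
```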
One of My Favorite Assignments: Automated Teller Machine Simulation.
ERIC Educational Resources Information Center
Oberman, Paul S.
2001-01-01
Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)
An automated field phenotyping pipeline for application in grapevine research.
Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard
2015-02-26
Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.
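BIVcolor's internals are not described in this summary, so the sketch below only illustrates the generic idea behind extracting berry count, size and color from an image: threshold on color, label connected components, and report per-object statistics. The synthetic image and thresholds are placeholders, not the pipeline's actual parameters.

```python
import numpy as np
from scipy import ndimage

# Synthetic 100x100 RGB "vineyard" image with two dark-blue berry-like blobs.
img = np.full((100, 100, 3), 180, dtype=np.uint8)          # bright background
yy, xx = np.mgrid[0:100, 0:100]
for cy, cx, r in [(30, 30, 8), (70, 60, 12)]:
    blob = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
    img[blob] = (40, 40, 120)                               # dark blue berries

# Color threshold: dark pixels whose blue channel dominates the red channel.
berry_mask = (img[..., 2] > img[..., 0] + 30) & (img.mean(axis=2) < 120)

labels, n_berries = ndimage.label(berry_mask)
sizes = ndimage.sum(berry_mask, labels, index=range(1, n_berries + 1))
print(f"{n_berries} berries, pixel areas: {sizes.astype(int)}")
```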
Automated tumor analysis for molecular profiling in lung cancer
Boyd, Clinton; James, Jacqueline A.; Loughrey, Maurice B.; Hougton, Joseph P.; Boyle, David P.; Kelly, Paul; Maxwell, Perry; McCleary, David; Diamond, James; McArt, Darragh G.; Tunstall, Jonathon; Bankhead, Peter; Salto-Tellez, Manuel
2015-01-01
The discovery and clinical application of molecular biomarkers in solid tumors increasingly relies on nucleic acid extraction from FFPE tissue sections and subsequent molecular profiling. This in turn requires the pathological review of haematoxylin & eosin (H&E) stained slides, to ensure sample quality, tumor DNA sufficiency by visually estimating the percentage tumor nuclei and tumor annotation for manual macrodissection. In this study on NSCLC, we demonstrate considerable variation in tumor nuclei percentage between pathologists, potentially undermining the precision of NSCLC molecular evaluation and emphasising the need for quantitative tumor evaluation. We subsequently describe the development and validation of a system called TissueMark for automated tumor annotation and percentage tumor nuclei measurement in NSCLC using computerized image analysis. Evaluation of 245 NSCLC slides showed precise automated tumor annotation of cases using TissueMark, strong concordance with manually drawn boundaries and identical EGFR mutational status, following manual macrodissection from the image-analysis-generated tumor boundaries. Automated analysis of cell counts for % tumor measurements by TissueMark showed reduced variability and significant correlation (p < 0.001) with benchmark tumor cell counts. This study demonstrates a robust image analysis technology that can facilitate the automated quantitative analysis of tissue samples for molecular profiling in discovery and diagnostics. PMID:26317646
NASA Astrophysics Data System (ADS)
Stark, K.
2017-12-01
One common source of uncertainty in sediment transport modeling of large semi-arid rivers is sediment influx delivered by ephemeral, flood-driven tributaries. Large variations in sediment delivery are associated with these regimes due to the highly variable nature of flows within them. While there are many sediment transport equations, they are typically developed for perennial streams and can be inaccurate for ephemeral channels. Discrete, manual sampling is labor intensive and requires personnel to be on site during flooding. In addition, flooding within these tributaries typically last on the order of hours, making it difficult to be present during an event. To better understand these regimes, automated systems are needed to continuously sample bedload and suspended load. In preparation for the pending installation of an automated site on the Arroyo de los Piños in New Mexico, manual sediment and flow samples have been collected over the summer monsoon season of 2017, in spite of the logistical challenges. These data include suspended and bedload sediment samples at the basin outlet, and stage and precipitation data from throughout the basin. Data indicate a complex system; flow is generated primarily in areas of exposed bedrock in the center and higher elevations of the watershed. Bedload samples show a large coarse-grained fraction, with 50% >2 mm and 25% >6 mm, which is compatible with acoustic measuring techniques. These data will be used to inform future site operations, which will combine direct sediment measurement from Reid-type slot samplers and non-invasive acoustic measuring methods. Bedload will be indirectly monitored using pipe-style microphones, plate-style geophones, channel hydrophones, and seismometers. These instruments record vibrations and acoustic signals from bedload impacts and movement. Indirect methods for measuring of bedload have never been extensively evaluated in ephemeral channels in the southwest United States. Once calibrated these indirect methods of measuring sediment load can be readily and economically deployed elsewhere within the arid Southwest. Ultimately, this experiment will provide more accurate ephemeral channel sediment loads for stream restoration studies, sediment management actions, and reservoir sedimentation studies.
Trace-Level Automated Mercury Speciation Analysis
Taylor, Vivien F.; Carter, Annie; Davies, Colin; Jackson, Brian P.
2011-01-01
An automated system for methyl Hg analysis by purge and trap gas chromatography (GC) was evaluated, with comparison of several different instrument configurations including chromatography columns (packed column or capillary), detector (atomic fluorescence, AFS, or inductively coupled plasma mass spectrometry, ICP-MS, using quadrupole and sector-field ICP-MS instruments). Method detection limits (MDL) of 0.042 pg and 0.030 pg for CH3Hg+ were achieved with the automated Hg analysis system configured with AFS and ICP-MS detection, respectively. Capillary GC with temperature programming was effective in improving resolution and decreasing retention times of heavier Hg species (in this case C3H7Hg+) although carryover between samples was increased. With capillary GC, the MDL for CH3Hg+ was 0.25 pg for AFS detection and 0.060 pg for ICP-MS detection. The automated system was demonstrated to have high throughput (72 samples analyzed in 8 hours) requiring considerably less analyst time than the manual method for methyl mercury analysis described in EPA 1630. PMID:21572543
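Method detection limits such as the 0.042 pg figure are conventionally estimated from replicate low-level spikes as the one-sided Student's t value (99% confidence, n-1 degrees of freedom) times the replicate standard deviation. The sketch below shows that calculation on invented replicate results; it is not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical replicate determinations (pg CH3Hg+) of a low-level spike.
replicates = np.array([0.101, 0.094, 0.110, 0.098, 0.105, 0.092, 0.107])

n = replicates.size
t_99 = stats.t.ppf(0.99, df=n - 1)          # one-sided 99% Student's t
mdl = t_99 * replicates.std(ddof=1)         # MDL = t * sample standard deviation

print(f"MDL ~ {mdl:.3f} pg (n = {n}, t = {t_99:.3f})")
```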
Efficient and automatic wireless geohazard monitoring
NASA Astrophysics Data System (ADS)
Rubin, Marc J.
In this dissertation, we present our research contributions geared towards creating an automated and efficient wireless sensor network (WSN) for geohazard monitoring. Specifically, this dissertation addresses three overall technical research problems inherent in implementing and deploying such a WSN, i.e., 1) automated event detection from geophysical data, 2) efficient wireless transmission, and 3) low-cost wireless hardware. In addition, after presenting algorithms, experimentation, and results from these three overall problems, we take a step back and discuss how, when, and why such scientific work matters in a geohazardous risk scenario. First, in Chapter 2, we discuss automated geohazard event detection within geophysical data. In particular, we present our pattern recognition workflow that can automatically detect snow avalanche events in seismic (geophone sensor) data. This workflow includes customized signal preprocessing for feature extraction, cluster-based stratified sub-sampling for majority class reduction, and experimentation with 12 different machine learning algorithms; results show that a decision stump classifier achieved 99.8% accuracy, 88.8% recall, and 13.2% precision in detecting avalanches within seismic data collected in the mountains above Davos, Switzerland, an improvement on previous work in the field. To address the second overall research problem (i.e., efficient wireless transmission), we present and evaluate our on-mote compressive sampling algorithm called Randomized Timing Vector (RTV) in Chapter 3 and compare our approach to four other on-mote, lossy compression algorithms in Chapter 4. Results from our work show that our RTV algorithm outperforms current on-mote compressive sampling algorithms and performs comparably to (and in many cases better than) the four state-of-the-art, on-mote lossy compression techniques. The main benefit of RTV is that it can guarantee a desired level of compression performance (and thus, radio usage and power consumption) without subjugating recovered signal quality. Another benefit of RTV is its simplicity and low computational overhead; by sampling directly in compressed form, RTV vastly decreases the amount of memory space and computation time required for on-mote compression. Third, in Chapter 5, we present and evaluate our custom, low-cost, Arduino-based wireless hardware (i.e., GeoMoteShield) developed for wireless seismic data acquisition. In particular, we first provide details regarding the motivation, design, and implementation of our custom GeoMoteShield and then compare our custom hardware against two much more expensive systems, i.e., a traditional wired seismograph and a "from-the-ground-up" wireless mote developed by SmartGeo colleagues. We validate our custom WSN of nine GeoMoteShields using controlled lab tests and then further evaluate the WSN's performance during two seismic field tests, i.e., a "walk-away" test and a seismic refraction survey. Results show that our low-cost, Arduino-based GeoMoteShield performs comparably to a much more expensive wired system and a "from the ground up" wireless mote in terms of signal precision, accuracy, and time synchronization. Finally, in Chapter 6, we provide a broad literature review and discussion of how, when, and why scientific work matters in geohazardous risk scenarios. This work is geared towards scientists conducting research within fields involving geohazard risk assessment and mitigation. 
In particular, this chapter reviews three topics from Science, Technology, Engineering, and Policy (STEP): 1) risk, scientific uncertainty, and policy, 2) society's perceptions of risk, and 3) the effectiveness of risk communication. Though this chapter is not intended to be a comprehensive STEP literature survey, it addresses many pertinent questions and provides guidance to scientists and engineers operating in such fields. In short, this chapter aims to answer three main questions, i.e., 1) "when does scientific work influence policy decisions?", 2) "how does scientific work impact people's perception of risk?", and 3) "how is technical scientific work communicated to the non-scientific community?".
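As a rough illustration of the idea behind the on-mote compressive sampling described above (sampling at randomized times so the compression ratio, and hence radio usage, is fixed in advance), the following Python sketch generates a sorted random timing vector and acquires only those readings. The sparse-recovery reconstruction that would run off-mote is omitted, and all names and parameters are illustrative rather than taken from the dissertation.

```python
import numpy as np

def randomized_timing_vector(n_samples: int, compression_ratio: float,
                             rng: np.random.Generator) -> np.ndarray:
    """Pick a sorted subset of sample indices so that exactly
    round(n_samples * compression_ratio) readings are acquired."""
    n_keep = max(1, round(n_samples * compression_ratio))
    return np.sort(rng.choice(n_samples, size=n_keep, replace=False))

def acquire_compressed(signal: np.ndarray, timing: np.ndarray) -> np.ndarray:
    """On a real mote the ADC would be read only at these times;
    here we index into a pre-recorded trace to simulate that."""
    return signal[timing]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 1, 1000)
    # Mock geophone trace: a 5 Hz tone plus noise stands in for field data.
    trace = np.sin(2 * np.pi * 5 * t) + 0.1 * rng.standard_normal(t.size)
    timing = randomized_timing_vector(trace.size, compression_ratio=0.25, rng=rng)
    measurements = acquire_compressed(trace, timing)
    print(f"kept {measurements.size} of {trace.size} samples "
          f"({measurements.size / trace.size:.0%} of the data)")
```

Because the number of retained samples is chosen up front, the radio payload per window is known before acquisition begins, which is the property the dissertation highlights.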
Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.
O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since the pipeline was developed, >200,000 items have been processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.
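To make the notion of bar-coded items tracked through a hierarchical LIMS concrete, here is a minimal, hypothetical sketch of how tube locations, status, and derived products might be recorded; the class and field names are invented for illustration and do not correspond to the commercial systems used in the pipeline.

```python
from dataclasses import dataclass, field

@dataclass
class LabwareItem:
    """One bar-coded tube or microwell rack tracked by a LIMS-like store."""
    barcode: str
    item_type: str              # e.g. "isolate", "DNA extract", "PCR plate"
    location: str               # freezer / shelf / deck position
    status: str = "registered"  # registered -> in-process -> sequenced -> evaluated
    children: list = field(default_factory=list)  # derived products

    def derive(self, barcode: str, item_type: str, location: str) -> "LabwareItem":
        """Register a product (e.g. a DNA extract) under its parent isolate."""
        child = LabwareItem(barcode, item_type, location)
        self.children.append(child)
        return child

if __name__ == "__main__":
    isolate = LabwareItem("TUBE-000123", "single-colony isolate", "Freezer A / Rack 3 / A1")
    extract = isolate.derive("PLATE-0045:A01", "DNA extract", "Robot deck 2")
    extract.status = "normalized"
    print(isolate)
```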
Transforming Microbial Genotyping: A Robotic Pipeline for Genotyping Bacterial Strains
Velayudhan, Vimalkumar; Murphy, Ronan A.; Achtman, Mark
2012-01-01
Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherry-picking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days at a total cost of < €25 per strain. Since the pipeline was developed, >200,000 items have been processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost. PMID:23144721
Bennett, Teresa A; Montesinos, Pau; Moscardo, Federico; Martinez-Cuadron, David; Martinez, Joaquin; Sierra, Jorge; García, Raimundo; de Oteyza, Jaime Perez; Fernandez, Pascual; Serrano, Josefina; Fernandez, Angeles; Herrera, Pilar; Gonzalez, Ataulfo; Bethancourt, Concepcion; Rodriguez-Macias, Gabriela; Alonso, Arancha; Vera, Juan A; Navas, Begoña; Lavilla, Esperanza; Lopez, Juan A; Jimenez, Santiago; Simiele, Adriana; Vidriales, Belen; Gonzalez, Bernardo J; Burgaleta, Carmen; Hernandez Rivas, Jose A; Mascuñano, Raul Cordoba; Bautista, Guiomar; Perez Simon, Jose A; Fuente, Adolfo de la; Rayón, Consolación; Troconiz, Iñaki F; Janda, Alvaro; Bosanquet, Andrew G; Hernandez-Campo, Pilar; Primo, Daniel; Lopez, Rocio; Liebana, Belen; Rojas, Jose L; Gorrochategui, Julian; Sanz, Miguel A; Ballesteros, Joan
2014-08-01
We have evaluated the ex vivo pharmacology of single drugs and drug combinations in malignant cells of bone marrow samples from 125 patients with acute myeloid leukemia using a novel automated flow cytometry-based platform (ExviTech). We have improved previous ex vivo drug testing with 4 innovations: identifying individual leukemic cells, using intact whole blood during the incubation, using an automated platform that reliably scales up data acquisition, and performing analyses with pharmacodynamic population models. Samples were sent from 24 hospitals to a central laboratory and incubated for 48 hours in whole blood, after which drug activity was measured in terms of depletion of leukemic cells. The sensitivity of single drugs is assessed for standard efficacy (EMAX) and potency (EC50) variables, ranked as percentiles within the population. The sensitivity of drug-combination treatments is assessed for the synergism achieved in each patient sample. We found a large variability among patient samples in the dose-response curves to a single drug or combination treatment. We hypothesize that the use of individual patient ex vivo pharmacological profiles may help guide personalized treatment selection. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
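For readers unfamiliar with the EMAX/EC50 terminology, the sketch below fits a standard sigmoidal (Hill) dose-response model to mock depletion data with SciPy. It is a generic illustration of how such efficacy and potency parameters are estimated, not the pharmacodynamic population model used by the platform.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, emax, ec50, hill_slope):
    """Fraction of leukemic cells depleted at a given drug concentration."""
    return emax * conc**hill_slope / (ec50**hill_slope + conc**hill_slope)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    conc = np.logspace(-2, 2, 9)                      # mock dilution series (µM)
    true = hill(conc, emax=0.85, ec50=1.5, hill_slope=1.2)
    observed = np.clip(true + 0.03 * rng.standard_normal(conc.size), 0, 1)
    (emax, ec50, slope), _ = curve_fit(hill, conc, observed, p0=[0.8, 1.0, 1.0])
    print(f"EMAX ≈ {emax:.2f}, EC50 ≈ {ec50:.2f} µM, Hill slope ≈ {slope:.2f}")
```

Ranking the fitted EMAX and EC50 values as percentiles across a population of samples is then a straightforward post-processing step.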
Microbiological profiles of the Viking spacecraft.
Puleo, J R; Fields, N D; Bergstrom, S L; Oxborrow, G S; Stabekis, P D; Koukol, R
1977-01-01
Planetary quarantine requirements associated with the launch of two Viking spacecraft necessitated microbiological assessment during assembly and testing at Cape Canaveral and the Kennedy Space Center. Samples were collected from selected surfaces of the Viking Lander Capsules (VLC), Orbiters (VO), and Shrouds at predetermined intervals during assembly and testing. Approximately 7,000 samples were assayed. Levels of bacterial spores per square meter on the VLC-1 and VLC-2 were 1.6 × 10² and 9.7 × 10¹, respectively, prior to dry-heat sterilization. The ranges of aerobic mesophilic microorganisms detected on the VO-1 and VO-2 at various sampling events were 4.2 × 10² to 4.3 × 10³ and 2.3 × 10² to 8.9 × 10³/m², respectively. Approximately 1,300 colonies were picked from culture plates, identified, lyophilized, and stored for future reference. About 75% of all isolates were microorganisms considered indigenous to humans; the remaining isolates were associated with soil and dust in the environment. The percentage of microorganisms of human origin was consistent with results obtained with previous automated spacecraft but slightly lower than those observed for manned (Apollo) spacecraft. PMID:848957
Microbiological profiles of the Viking spacecraft.
Puleo, J R; Fields, N D; Bergstrom, S L; Oxborrow, G S; Stabekis, P D; Koukol, R
1977-02-01
Planetary quarantine requirements associated with the launch of two Viking spacecraft necessitated microbiological assessment during assembly and testing at Cape Canaveral and the Kennedy Space Center. Samples were collected from selected surfaces of the Viking Lander Capsules (VLC), Orbiters (VO), and Shrouds at predetermined intervals during assembly and testing. Approximately 7,000 samples were assayed. Levels of bacterial spores per square meter on the VLC-1 and VLC-2 were 1.6 × 10² and 9.7 × 10¹, respectively, prior to dry-heat sterilization. The ranges of aerobic mesophilic microorganisms detected on the VO-1 and VO-2 at various sampling events were 4.2 × 10² to 4.3 × 10³ and 2.3 × 10² to 8.9 × 10³/m², respectively. Approximately 1,300 colonies were picked from culture plates, identified, lyophilized, and stored for future reference. About 75% of all isolates were microorganisms considered indigenous to humans; the remaining isolates were associated with soil and dust in the environment. The percentage of microorganisms of human origin was consistent with results obtained with previous automated spacecraft but slightly lower than those observed for manned (Apollo) spacecraft.
Automated cell counts on CSF samples: A multicenter performance evaluation of the GloCyte system.
Hod, E A; Brugnara, C; Pilichowska, M; Sandhaus, L M; Luu, H S; Forest, S K; Netterwald, J C; Reynafarje, G M; Kratz, A
2018-02-01
Automated cell counters have replaced manual enumeration of cells in blood and most body fluids. However, due to the unreliability of automated methods at very low cell counts, most laboratories continue to perform labor-intensive manual counts on many or all cerebrospinal fluid (CSF) samples. This multicenter clinical trial investigated if the GloCyte System (Advanced Instruments, Norwood, MA), a recently FDA-approved automated cell counter, which concentrates and enumerates red blood cells (RBCs) and total nucleated cells (TNCs), is sufficiently accurate and precise at very low cell counts to replace all manual CSF counts. The GloCyte System concentrates CSF and stains RBCs with fluorochrome-labeled antibodies and TNCs with nucleic acid dyes. RBCs and TNCs are then counted by digital image analysis. Residual adult and pediatric CSF samples obtained for clinical analysis at five different medical centers were used for the study. Cell counts were performed by the manual hemocytometer method and with the GloCyte System following the same protocol at all sites. The limits of the blank, detection, and quantitation, as well as precision and accuracy of the GloCyte, were determined. The GloCyte detected as few as 1 TNC/μL and 1 RBC/μL, and reliably counted as low as 3 TNCs/μL and 2 RBCs/μL. The total coefficient of variation was less than 20%. Comparison with cell counts obtained with a hemocytometer showed good correlation (>97%) between the GloCyte and the hemocytometer, including at very low cell counts. The GloCyte instrument is a precise, accurate, and stable system to obtain red cell and nucleated cell counts in CSF samples. It allows for the automated enumeration of even very low cell numbers, which is crucial for CSF analysis. These results suggest that GloCyte is an acceptable alternative to the manual method for all CSF samples, including those with normal cell counts. © 2017 John Wiley & Sons Ltd.
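A hedged sketch of the style of agreement analysis described, on mock paired counts: Pearson correlation plus Bland-Altman bias and 95% limits of agreement between manual and automated results. The numbers are invented and are not study data.

```python
import numpy as np

def bland_altman(manual: np.ndarray, automated: np.ndarray):
    """Return mean bias and 95% limits of agreement for paired counts."""
    diff = automated - manual
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    manual = rng.poisson(lam=4, size=50).astype(float)            # mock TNC/µL counts
    automated = manual + rng.normal(0.2, 0.8, size=manual.size)   # mock instrument counts
    r = np.corrcoef(manual, automated)[0, 1]
    bias, (lo, hi) = bland_altman(manual, automated)
    print(f"Pearson r = {r:.3f}; bias = {bias:.2f} cells/µL; LoA = [{lo:.2f}, {hi:.2f}]")
```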
NASA Astrophysics Data System (ADS)
Yussup, N.; Ibrahim, M. M.; Rahman, N. A. A.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.; Lombigit, L.; Azman, A.; Omar, S. A.
2018-01-01
Most of the procedures in the neutron activation analysis (NAA) process established at the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s have been performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time-consuming and inefficient, especially the sample counting and measurement process: the sample must be changed and the measurement software set up for every one-hour counting period, and both steps are performed manually for every sample. Hence, an automatic sample changer system (ASC), consisting of hardware and software, was developed to automate the sample counting process for up to 30 samples consecutively. This paper describes the ASC control software for the NAA process, which is designed and developed to control the ASC hardware and to call the GammaVision software for sample measurement. The software is developed using the National Instruments LabVIEW development package.
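Purely to illustrate the control flow being automated (advance the changer to the next slot, start a fixed-length count, save the spectrum, repeat for up to 30 samples), here is a hypothetical Python pseudo-driver. The real system is written in LabVIEW and calls GammaVision, so every function below is a stand-in.

```python
import time

COUNT_TIME_S = 3600  # one-hour counting period per sample, as in the manual procedure

def move_changer_to(position: int) -> None:
    """Stand-in for the motor command that rotates the changer to a slot."""
    print(f"[ASC] moving changer to position {position}")

def start_acquisition(sample_id: str, live_time_s: int) -> None:
    """Stand-in for launching the spectroscopy software with a count preset."""
    print(f"[MCA] counting {sample_id} for {live_time_s} s")
    time.sleep(0.01)  # simulated delay; a real run would block for the full count time

def save_spectrum(sample_id: str) -> None:
    print(f"[MCA] spectrum for {sample_id} saved")

def run_batch(sample_ids: list[str]) -> None:
    """Count up to 30 samples consecutively without operator intervention."""
    for slot, sample_id in enumerate(sample_ids[:30], start=1):
        move_changer_to(slot)
        start_acquisition(sample_id, COUNT_TIME_S)
        save_spectrum(sample_id)

if __name__ == "__main__":
    run_batch([f"NAA-{i:03d}" for i in range(1, 6)])
```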
SAMPL4 & DOCK3.7: lessons for automated docking procedures
NASA Astrophysics Data System (ADS)
Coleman, Ryan G.; Sterling, Teague; Weiss, Dahlia R.
2014-03-01
The SAMPL4 challenges were used to test current automated methods for solvation energy, virtual screening, pose and affinity prediction of the molecular docking pipeline DOCK 3.7. Additionally, first-order models of binding affinity were proposed as milestones for any method predicting binding affinity. Several important discoveries about the molecular docking software were made during the challenge: (1) solvation energies of ligands were five-fold worse than those of any other method used in SAMPL4, including methods that were similarly fast; (2) HIV integrase is a challenging target, but automated docking on the correct allosteric site performed well in terms of virtual screening and pose prediction (compared with other methods), although affinity prediction, as expected, was very poor; (3) molecular docking grid sizes can be very important; serious errors were discovered with default settings, which have been adjusted for all future work. Overall, lessons from SAMPL4 suggest many changes to molecular docking tools, not just DOCK 3.7, that could improve the state of the art. Future difficulties and projects will be discussed.
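As a loose illustration of what a first-order affinity baseline can look like, the sketch below scores ligands by heavy-atom count alone; the per-atom constant and cap are arbitrary placeholders, not the milestone model proposed in the paper. The point is simply that any serious predictor should beat a size-only null model.

```python
def baseline_affinity_kcal(heavy_atoms: int, per_atom_kcal: float = -0.3,
                           cap_atoms: int = 25) -> float:
    """Naive additive estimate of binding free energy from ligand size only.

    Every constant here is illustrative; the model exists only as a floor
    that a real affinity predictor should outperform.
    """
    return per_atom_kcal * min(heavy_atoms, cap_atoms)

if __name__ == "__main__":
    for n in (10, 20, 40):
        print(f"{n:2d} heavy atoms -> baseline dG ≈ {baseline_affinity_kcal(n):.1f} kcal/mol")
```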
Golden, J.P.; Verbarg, J.; Howell, P.B.; Shriver-Lake, L.C.; Ligler, F.S.
2012-01-01
A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, were demonstrated for detection of Escherichia coli O157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose–response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. PMID:22960010
Golden, J P; Verbarg, J; Howell, P B; Shriver-Lake, L C; Ligler, F S
2013-02-15
A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, were demonstrated for detection of Escherichia coli O157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose-response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. Published by Elsevier B.V.
A fully automated liquid–liquid extraction system utilizing interface detection
Maslana, Eugene; Schmitt, Robert; Pan, Jeffrey
2000-01-01
The development of the Abbott Liquid-Liquid Extraction Station was a result of the need for an automated system to perform aqueous extraction on large sets of newly synthesized organic compounds used for drug discovery. The system utilizes a cylindrical laboratory robot to shuttle sample vials between two loading racks, two identical extraction stations, and a centrifuge. Extraction is performed by detecting the phase interface (by difference in refractive index) of the moving column of fluid drawn from the bottom of each vial containing a biphasic mixture. The integration of interface detection with fluid extraction maximizes sample throughput. Abbott-developed electronics process the detector signals. Sample mixing is performed by high-speed solvent injection. Centrifuging of the samples reduces interface emulsions. Operating software permits the user to program wash protocols with any one of six solvents per wash cycle with as many cycle repeats as necessary. Station capacity is eighty 15 ml vials. This system has proven successful with a broad spectrum of both ethyl acetate- and methylene chloride-based chemistries. The development and characterization of this automated extraction system will be presented. PMID:18924693
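A minimal sketch of the interface-detection idea: as the fluid column is drawn past the detector, the phase boundary appears as the largest step in the refractive-index signal, and locating that step tells the system when to switch collection. The signal values below are mock data, not Abbott detector readings.

```python
import numpy as np

def find_interface(signal: np.ndarray) -> int:
    """Return the sample index of the largest jump in the detector signal,
    taken here as the phase boundary between the two liquids."""
    steps = np.abs(np.diff(signal))
    return int(np.argmax(steps)) + 1

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    aqueous = 1.33 + 0.001 * rng.standard_normal(200)   # mock refractive-index readings
    organic = 1.36 + 0.001 * rng.standard_normal(150)
    trace = np.concatenate([aqueous, organic])           # column drawn from the vial bottom
    print(f"phase interface detected at sample {find_interface(trace)} of {trace.size}")
```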
A high-throughput semi-automated preparation for filtered synaptoneurosomes.
Murphy, Kathryn M; Balsor, Justin; Beshara, Simon; Siu, Caitlin; Pinto, Joshua G A
2014-09-30
Synaptoneurosomes have become an important tool for studying synaptic proteins. The filtered synaptoneurosomes preparation originally developed by Hollingsworth et al. (1985) is widely used and is an easy method to prepare synaptoneurosomes. The hand processing steps in that preparation, however, are labor intensive and have become a bottleneck for current proteomic studies using synaptoneurosomes. For this reason, we developed new steps for tissue homogenization and filtration that transform the preparation of synaptoneurosomes to a high-throughput, semi-automated process. We implemented a standardized protocol with easy to follow steps for homogenizing multiple samples simultaneously using a FastPrep tissue homogenizer (MP Biomedicals, LLC) and then filtering all of the samples in centrifugal filter units (EMD Millipore, Corp). The new steps dramatically reduce the time to prepare synaptoneurosomes from hours to minutes, increase sample recovery, and nearly double enrichment for synaptic proteins. These steps are also compatible with biosafety requirements for working with pathogen infected brain tissue. The new high-throughput semi-automated steps to prepare synaptoneurosomes are timely technical advances for studies of low abundance synaptic proteins in valuable tissue samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Biurrun Manresa, José A.; Arguissain, Federico G.; Medina Redondo, David E.; Mørch, Carsten D.; Andersen, Ole K.
2015-01-01
The agreement between humans and algorithms on whether an event-related potential (ERP) is present or not and the level of variation in the estimated values of its relevant features are largely unknown. Thus, the aim of this study was to determine the categorical and quantitative agreement between manual and automated methods for single-trial detection and estimation of ERP features. To this end, ERPs were elicited in sixteen healthy volunteers using electrical stimulation at graded intensities below and above the nociceptive withdrawal reflex threshold. Presence/absence of an ERP peak (categorical outcome) and its amplitude and latency (quantitative outcome) in each single-trial were evaluated independently by two human observers and two automated algorithms taken from existing literature. Categorical agreement was assessed using percentage positive and negative agreement and Cohen’s κ, whereas quantitative agreement was evaluated using Bland-Altman analysis and the coefficient of variation. Typical values for the categorical agreement between manual and automated methods were derived, as well as reference values for the average and maximum differences that can be expected if one method is used instead of the others. Results showed that the human observers presented the highest categorical and quantitative agreement, and there were significantly large differences between detection and estimation of quantitative features among methods. In conclusion, substantial care should be taken in the selection of the detection/estimation approach, since factors like stimulation intensity and expected number of trials with/without response can play a significant role in the outcome of a study. PMID:26258532
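To illustrate the categorical agreement measures mentioned, the sketch below computes overall percent agreement and Cohen's κ on mock presence/absence labels (a simplified stand-in for the separate positive and negative agreement reported in the study). The formulas are standard; the data are invented.

```python
import numpy as np

def cohens_kappa(a: np.ndarray, b: np.ndarray) -> float:
    """Chance-corrected agreement between two binary raters."""
    observed = np.mean(a == b)
    p_a, p_b = a.mean(), b.mean()
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)   # agreement expected by chance
    return (observed - expected) / (1 - expected)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    human = rng.integers(0, 2, size=200)                              # 1 = ERP peak present
    algorithm = np.where(rng.random(200) < 0.85, human, 1 - human)    # ~85% raw agreement
    print(f"percent agreement = {np.mean(human == algorithm):.1%}")
    print(f"Cohen's kappa     = {cohens_kappa(human, algorithm):.2f}")
```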
NASA Astrophysics Data System (ADS)
Villanueva, S.; Eastman, J. D.; Gaudi, B. S.; Pogge, R. W.; Stassun, K. G.; Trueblood, M.; Trueblood, P.
2016-07-01
We present the design and development of the DEdicated MONitor of EXotransits and Transients (DEMONEXT), an automated and robotic 20 inch telescope jointly funded by The Ohio State University and Vanderbilt University. The telescope is a PlaneWave CDK20 f/6.8 Corrected Dall-Kirkham Astrograph telescope on a Mathis Instruments MI-750/1000 Fork Mount located at Winer Observatory in Sonoita, AZ. DEMONEXT has a Hedrick electronic focuser, Finger Lakes Instrumentation (FLI) CFW-3-10 filter wheel, and a 2048 × 2048 pixel FLI Proline CCD3041 camera with a pixel scale of 0.90 arc-seconds per pixel and a 30.7 × 30.7 arc-minute field-of-view. The telescope's automation, controls, and scheduling are implemented in Python, including a facility to add new targets in real time for rapid follow-up of time-critical targets. DEMONEXT will be used for the confirmation and detailed investigation of newly discovered planet candidates from the Kilodegree Extremely Little Telescope (KELT) survey, exploration of the atmospheres of Hot Jupiters via transmission spectroscopy and thermal emission measurements, and monitoring of select eclipsing binary star systems as benchmarks for models of stellar evolution. DEMONEXT will enable rapid confirmation imaging of supernovae, flare stars, tidal disruption events, and other transients discovered by the All Sky Automated Survey for SuperNovae (ASAS-SN). DEMONEXT will also provide follow-up observations of single-transit planets identified by the Transiting Exoplanet Survey Satellite (TESS) mission, and validate long-period eclipsing systems discovered by Gaia.
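The abstract notes that scheduling is implemented in Python with a facility to add time-critical targets in real time. Purely as an illustrative sketch (not DEMONEXT's actual scheduler), a priority queue expresses that behaviour compactly:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Target:
    priority: int               # lower value = observed sooner
    name: str = field(compare=False)

class ObservingQueue:
    """Toy scheduler: nightly targets are queued, and a transient can be
    pushed in at any time with a priority that puts it at the front."""
    def __init__(self) -> None:
        self._heap: list[Target] = []

    def add(self, name: str, priority: int) -> None:
        heapq.heappush(self._heap, Target(priority, name))

    def next_target(self) -> str | None:
        return heapq.heappop(self._heap).name if self._heap else None

if __name__ == "__main__":
    queue = ObservingQueue()
    queue.add("KELT candidate 12", priority=50)
    queue.add("eclipsing binary 7", priority=60)
    queue.add("ASAS-SN transient", priority=1)   # time-critical follow-up jumps the queue
    while (target := queue.next_target()) is not None:
        print("observing:", target)
```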
Harris, Don; Stanton, Neville A; Starr, Alison
2015-01-01
Function Allocation methods are important for the appropriate allocation of tasks between humans and automated systems. It is proposed that Operational Event Sequence Diagrams (OESDs) provide a simple yet rigorous basis upon which allocation of work can be assessed. This is illustrated with respect to a design concept for a passenger aircraft flown by just a single pilot where the objective is to replace or supplement functions normally undertaken by the second pilot with advanced automation. A scenario-based analysis (take off) was used in which there would normally be considerable demands and interactions with the second pilot. The OESD analyses indicate those tasks that would be suitable for allocation to automated assistance on the flight deck and those tasks that are now redundant in this new configuration (something that other formal Function Allocation approaches cannot identify). Furthermore, OESDs are demonstrated to be an easy to apply and flexible approach to the allocation of function in prospective systems. OESDs provide a simple yet rigorous basis upon which allocation of work can be assessed. The technique can deal with the flexible, dynamic allocation of work and the deletion of functions no longer required. This is illustrated using a novel design concept for a single-crew commercial aircraft.
Effects of alcohol on automated and controlled driving performances.
Berthelon, Catherine; Gineyt, Guy
2014-05-01
Alcohol is the most frequently detected substance in fatal automobile crashes, but its precise mode of action is not always clear. The present study was designed to establish the influence of blood alcohol concentration as a function of the complexity of the scenarios. Road scenarios implying automatic or controlled driving performances were manipulated in order to identify which behavioral parameters were deteriorated. A single blind counterbalanced experiment was conducted on a driving simulator. Sixteen experienced drivers (25.3 ± 2.9 years old, 8 men and 8 women) were tested with 0, 0.3, 0.5, and 0.8 g/l of alcohol. Driving scenarios varied: road tracking, car following, and an urban scenario including events inspired by real accidents. Statistical analyses were performed on driving parameters as a function of alcohol level. Automated driving parameters such as standard deviation of lateral position measured with the road tracking and car following scenarios were impaired by alcohol, notably with the highest dose. More controlled parameters such as response time to braking and number of crashes when confronted with specific events (urban scenario) were less affected by the alcohol level. Performance decrement was greater with driving scenarios involving automated processes than with scenarios involving controlled processes.
Unraveling the Tangles of Language Evolution
NASA Astrophysics Data System (ADS)
Petroni, F.; Serva, M.; Volchenkov, D.
2012-07-01
The relationships between languages, molded by extremely complex social, cultural and political factors, are assessed by an automated method, in which the distance between languages is estimated by the average normalized Levenshtein distance between words from the list of 200 meanings maximally resistant to change. A sequential process of language classification described by random walks on the matrix of lexical distances makes it possible to represent complex relationships between languages geometrically, in terms of distances and angles. We have tested the method on a sample of 50 Indo-European and 50 Austronesian languages. The geometric representation of language taxonomy allows accurate inferences to be made about the most significant events of human history by tracing changes in language families through time. The Anatolian and Kurgan hypotheses of the Indo-European origin and the "express train" model of the Polynesian origin are thoroughly discussed.
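To illustrate the distance measure, the following sketch computes the Levenshtein distance normalized by the length of the longer word and averages it over aligned word pairs from two languages; the three word pairs are toy examples, not the 200-meaning list used in the study.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def lexical_distance(word_pairs: list[tuple[str, str]]) -> float:
    """Average normalized Levenshtein distance over a list of aligned meanings."""
    return sum(levenshtein(a, b) / max(len(a), len(b)) for a, b in word_pairs) / len(word_pairs)

if __name__ == "__main__":
    pairs = [("water", "wasser"), ("hand", "hand"), ("fish", "fisch")]  # English vs German
    print(f"average normalized distance ≈ {lexical_distance(pairs):.2f}")
```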
Verplaetse, Ruth; Henion, Jack
2016-01-01
Opioids are well known, widely used painkillers. Increased stability of opioids in the dried blood spot (DBS) matrix compared to blood/plasma has been described. Other benefits provided by DBS techniques include point-of-care collection, less invasive microsampling, more economical shipment, and convenient storage. Current methodology for analysis of micro whole blood samples for opioids is limited to the classical DBS workflow, including tedious manual punching of the DBS cards followed by extraction and liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. The goal of this study was to develop and validate a fully automated on-line sample preparation procedure for the analysis of DBS micro samples relevant to the detection of opioids in finger prick blood. To this end, automated flow-through elution of DBS cards was followed by on-line solid-phase extraction (SPE) and analysis by LC-MS/MS. Selective, sensitive, accurate, and reproducible quantitation of five representative opioids in human blood at sub-therapeutic, therapeutic, and toxic levels was achieved. The range of reliable response (R² ≥ 0.997) was 1 to 500 ng/mL whole blood for morphine, codeine, oxycodone, and hydrocodone, and 0.1 to 50 ng/mL for fentanyl. Inter-day, intra-day, and matrix inter-lot accuracy and precision were within 15% (even at the lower limit of quantitation (LLOQ)). The method was successfully used to measure hydrocodone and its major metabolite norhydrocodone in incurred human samples. Our data support the enormous potential of DBS sampling and automated analysis for monitoring opioids as well as other pharmaceuticals in both anti-doping and pain management regimens. Copyright © 2015 John Wiley & Sons, Ltd.
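As a generic illustration of how such a calibration range is characterized (not the validated method itself), the sketch below fits a straight line to mock peak-area ratios over 1-500 ng/mL and reports R² and the back-calculated accuracy at the lowest calibrator.

```python
import numpy as np

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    conc = np.array([1, 2, 5, 10, 50, 100, 250, 500], dtype=float)          # ng/mL calibrators
    response = 0.012 * conc * (1 + 0.02 * rng.standard_normal(conc.size))   # mock area ratios
    slope, intercept = np.polyfit(conc, response, deg=1)
    predicted = slope * conc + intercept
    ss_res = np.sum((response - predicted) ** 2)
    ss_tot = np.sum((response - response.mean()) ** 2)
    r_squared = 1 - ss_res / ss_tot
    back_calc = (response - intercept) / slope                              # back-calculated conc
    lloq_accuracy = 100 * back_calc[0] / conc[0]
    print(f"R² = {r_squared:.4f}; back-calculated accuracy at 1 ng/mL = {lloq_accuracy:.0f}%")
```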
Hamdani, Mehdi; Chassot, Olivier; Fournier, Roxane
2014-01-01
Automated bolus delivery has recently been shown to reduce local anesthetic consumption and improve analgesia, compared with continuous infusion, in continuous sciatic and epidural block. However, there are few data on the influence of local anesthetic delivery method on local anesthetic consumption following interscalene blockade. This randomized, double-blind trial was designed to determine whether hourly automated perineural boluses (4 mL) of local anesthesia delivered with patient-controlled pro re nata (PRN, on demand) boluses would result in a reduction in total local anesthesia consumption during continuous interscalene blockade after shoulder surgery compared with continuous perineural infusion (4 mL/h) plus patient-controlled PRN boluses. One hundred one patients undergoing major shoulder surgery under general anesthesia with ultrasound-guided continuous interscalene block were randomly assigned to receive 0.2% ropivacaine via interscalene end-hole catheter either by continuous infusion 4 mL/h (n = 50) or as automated bolus 4 mL/h (n = 51). Both delivery methods were combined with 5 mL PRN boluses of 0.2% ropivacaine with a lockout time of 30 minutes. Postoperative number of PRN boluses, 24- and 48-hour local anesthetic consumption, pain scores, rescue analgesia (morphine), and adverse events were recorded. There were no significant differences in either the number of PRN ropivacaine boluses or total 48 hour local anesthetic consumption between the groups (18.5 [11-25.2] PRN boluses in the continuous infusion group vs 17 [8.5-29] PRN boluses in the automated bolus group). Postoperative pain was similar in both groups; on day 2, the median average pain score was 4 (2-6) in the continuous infusion group versus 3 (2-5) in the automated bolus group (P = 0.54). Nor were any statistically significant intergroup differences observed with respect to morphine rescue, incidence of adverse events, or patient satisfaction. In continuous interscalene blockade under ultrasound guidance after shoulder surgery, automated boluses of local anesthetic combined with PRN boluses did not provide any reduction in local anesthetic consumption or rescue analgesia, compared with continuous infusion combined with PRN boluses.
Expert system issues in automated, autonomous space vehicle rendezvous
NASA Technical Reports Server (NTRS)
Goodwin, Mary Ann; Bochsler, Daniel C.
1987-01-01
The problems involved in automated autonomous rendezvous are briefly reviewed, and the Rendezvous Expert (RENEX) expert system is discussed with reference to its goals, approach used, and knowledge structure and contents. RENEX has been developed to support streamlining operations for the Space Shuttle and Space Station program and to aid definition of mission requirements for the autonomous portions of rendezvous for the Mars Surface Sample Return and Comet Nucleus Sample Return unmanned missions. The experience with RENEX to date and recommendations for further development are presented.
Karger, Barry L.; Kotler, Lev; Foret, Frantisek; Minarik, Marek; Kleparnik, Karel
2003-12-09
A modular multiple lane or capillary electrophoresis (chromatography) system that permits automated parallel separation and comprehensive collection of all fractions from samples in all lanes or columns, with the option of further on-line automated sample fraction analysis, is disclosed. Preferably, fractions are collected in a multi-well fraction collection unit, or plate (40). The multi-well collection plate (40) is preferably made of a solvent permeable gel, most preferably a hydrophilic, polymeric gel such as agarose or cross-linked polyacrylamide.
High-density grids for efficient data collection from multiple crystals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto
Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.
High-density grids for efficient data collection from multiple crystals
Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; Barnes, Christopher O.; Bonagura, Christopher A.; Brehmer, Winnie; Brunger, Axel T.; Calero, Guillermo; Caradoc-Davies, Tom T.; Chatterjee, Ruchira; Degrado, William F.; Fraser, James S.; Ibrahim, Mohamed; Kern, Jan; Kobilka, Brian K.; Kruse, Andrew C.; Larsson, Karl M.; Lemke, Heinrik T.; Lyubimov, Artem Y.; Manglik, Aashish; McPhillips, Scott E.; Norgren, Erik; Pang, Siew S.; Soltis, S. M.; Song, Jinhu; Thomaston, Jessica; Tsai, Yingssu; Weis, William I.; Woldeyes, Rahel A.; Yachandra, Vittal; Yano, Junko; Zouni, Athina; Cohen, Aina E.
2016-01-01
Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. Crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures. PMID:26894529
High-density grids for efficient data collection from multiple crystals
Baxter, Elizabeth L.; Aguila, Laura; Alonso-Mori, Roberto; ...
2015-11-03
Higher throughput methods to mount and collect data from multiple small and radiation-sensitive crystals are important to support challenging structural investigations using microfocus synchrotron beamlines. Furthermore, efficient sample-delivery methods are essential to carry out productive femtosecond crystallography experiments at X-ray free-electron laser (XFEL) sources such as the Linac Coherent Light Source (LCLS). To address these needs, a high-density sample grid useful as a scaffold for both crystal growth and diffraction data collection has been developed and utilized for efficient goniometer-based sample delivery at synchrotron and XFEL sources. A single grid contains 75 mounting ports and fits inside an SSRL cassette or uni-puck storage container. The use of grids with an SSRL cassette expands the cassette capacity up to 7200 samples. Grids may also be covered with a polymer film or sleeve for efficient room-temperature data collection from multiple samples. New automated routines have been incorporated into the Blu-Ice/DCSS experimental control system to support grids, including semi-automated grid alignment, fully automated positioning of grid ports, rastering and automated data collection. Specialized tools have been developed to support crystallization experiments on grids, including a universal adaptor, which allows grids to be filled by commercial liquid-handling robots, as well as incubation chambers, which support vapor-diffusion and lipidic cubic phase crystallization experiments. Experiments in which crystals were loaded into grids or grown on grids using liquid-handling robots and incubation chambers are described. As a result, crystals were screened at LCLS-XPP and SSRL BL12-2 at room temperature and cryogenic temperatures.
Automated Analysis of Child Phonetic Production Using Naturalistic Recordings
ERIC Educational Resources Information Center
Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill
2014-01-01
Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…