Science.gov

Sample records for mindboggle automated brain

  1. 101 Labeled Brain Images and a Consistent Human Cortical Labeling Protocol

    PubMed Central

    Klein, Arno; Tourville, Jason

    2012-01-01

    We introduce the Mindboggle-101 dataset, the largest and most complete set of free, publicly accessible, manually labeled human brain images. To manually label the macroscopic anatomy in magnetic resonance images of 101 healthy participants, we created a new cortical labeling protocol that relies on robust anatomical landmarks and minimal manual edits after initialization with automated labels. The “Desikan–Killiany–Tourville” (DKT) protocol is intended to improve the ease, consistency, and accuracy of labeling human cortical areas. Given how difficult it is to label brains, the Mindboggle-101 dataset is intended to serve as a set of brain atlases for use in labeling other brains, as a normative dataset to establish morphometric variation in a healthy population for comparison against clinical populations, and to contribute to the development, training, testing, and evaluation of automated registration and labeling algorithms. To this end, we also introduce benchmarks for the evaluation of such algorithms by comparing our manual labels with labels automatically generated by probabilistic and multi-atlas registration-based approaches. All data, related software, and updated information are available at http://mindboggle.info/data. PMID:23227001
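
    The benchmark comparison described above (manual labels vs. automatically generated labels) reduces to an overlap score per labeled region. A minimal sketch follows, assuming both label volumes are integer 3D arrays in the same space (in practice they would be loaded from the Mindboggle-101 files, e.g. with nibabel); the arrays below are synthetic stand-ins.

      import numpy as np

      def per_label_dice(manual, auto):
          """Dice overlap for every anatomical label in a manually labeled volume.

          manual, auto: integer 3D arrays of the same shape; each value is a
          label ID (0 = background / unlabeled).
          """
          scores = {}
          for label_id in np.unique(manual):
              if label_id == 0:
                  continue
              m, a = manual == label_id, auto == label_id
              denom = m.sum() + a.sum()
              scores[int(label_id)] = 2.0 * (m & a).sum() / denom if denom else float("nan")
          return scores

      # Synthetic example: an "automated" labeling that mostly agrees with the manual one.
      rng = np.random.default_rng(0)
      manual = rng.integers(0, 4, size=(32, 32, 32))
      auto = manual.copy()
      auto[rng.random(auto.shape) < 0.05] = 0          # corrupt 5% of voxels
      print(per_label_dice(manual, auto))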

  2. Automated segmentation of MR images of brain tumors.

    PubMed

    Kaus, M R; Warfield, S K; Nabavi, A; Black, P M; Jolesz, F A; Kikinis, R

    2001-02-01

    An automated brain tumor segmentation method was developed and validated against manual segmentation with three-dimensional magnetic resonance images in 20 patients with meningiomas and low-grade gliomas. The automated method (operator time, 5-10 minutes) allowed rapid identification of brain and tumor tissue with an accuracy and reproducibility comparable to those of manual segmentation (operator time, 3-5 hours), making automated segmentation practical for low-grade gliomas and meningiomas. PMID:11161183

  3. Automated brain segmentation using neural networks

    NASA Astrophysics Data System (ADS)

    Powell, Stephanie; Magnotta, Vincent; Johnson, Hans; Andreasen, Nancy

    2006-03-01

    Automated methods to delineate brain structures of interest are required to analyze the large amounts of imaging data being collected in several ongoing multi-center studies. We have previously reported on using artificial neural networks (ANN) to define subcortical brain structures such as the thalamus, caudate, and putamen (relative overlaps of 0.825, 0.745, and 0.755, respectively). One of the inputs into the ANN is the a priori probability of a structure existing at a given location. In this previous work, the a priori probability information was generated in Talairach space using a piecewise linear registration. In this work we have increased the dimensionality of this registration using Thirion's demons registration algorithm. The input vector consisted of the a priori probability, spherical coordinates, and an iris of surrounding signal intensity values. The output of the neural network determined whether the voxel belonged to one of the N regions used for training. Training was performed using a standard back-propagation algorithm. The ANN was trained on a set of 15 images for 750,000,000 iterations. The resulting ANN weights were then applied to 6 test images not part of the training set. The relative overlap calculated for each structure was 0.875 for the thalamus, 0.845 for the caudate, and 0.814 for the putamen. With the modifications to the neural network algorithm and the use of multi-dimensional registration, we found substantial improvement in the automated segmentation method. The resulting segmented structures are as reliable as manual raters, and the output of the neural network can be used without additional rater intervention.
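
    The classifier described above (a per-voxel feature vector of a priori probability, spherical coordinates, and surrounding intensities, trained with back-propagation) can be sketched with scikit-learn's MLPClassifier. This is an illustrative reconstruction on synthetic features, not the authors' ANN, and the feature extraction is greatly simplified.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(1)

      # Synthetic per-voxel features: [a priori probability, 3 spherical
      # coordinates, 7 surrounding signal intensities]; labels are structure IDs.
      n_voxels = 5000
      features = np.hstack([
          rng.random((n_voxels, 1)),        # a priori probability of the structure
          rng.normal(size=(n_voxels, 3)),   # spherical coordinates
          rng.normal(size=(n_voxels, 7)),   # neighborhood intensities ("iris")
      ])
      labels = rng.integers(0, 3, size=n_voxels)   # 0 = background, 1 = thalamus, 2 = caudate

      # A small feed-forward network trained with back-propagation.
      clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=300, random_state=0)
      clf.fit(features[:4000], labels[:4000])
      print("held-out accuracy:", clf.score(features[4000:], labels[4000:]))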

  4. "BRAIN": Baruch Retrieval of Automated Information for Negotiations.

    ERIC Educational Resources Information Center

    Levenstein, Aaron, Ed.

    1981-01-01

    A data processing program that can be used as a research and collective bargaining aid for colleges is briefly described and the fields of the system are outlined. The system, known as BRAIN (Baruch Retrieval of Automated Information for Negotiations), is designed primarily as an instrument for quantitative and qualitative analysis. BRAIN consists…

  5. Automated analysis of fundamental features of brain structures.

    PubMed

    Lancaster, Jack L; McKay, D Reese; Cykowski, Matthew D; Martinez, Michael J; Tan, Xi; Valaparla, Sunil; Zhang, Yi; Fox, Peter T

    2011-12-01

    Automated image analysis of the brain should include measures of fundamental structural features such as size and shape. We used principal axes (P-A) measurements to measure overall size and shape of brain structures segmented from MR brain images. The rationale was that quantitative volumetric studies of brain structures would benefit from shape standardization as had been shown for whole brain studies. P-A analysis software was extended to include controls for variability in position and orientation to support individual structure spatial normalization (ISSN). The rationale was that ISSN would provide a bias-free means to remove elementary sources of a structure's spatial variability in preparation for more detailed analyses. We studied nine brain structures (whole brain, cerebral hemispheres, cerebellum, brainstem, caudate, putamen, hippocampus, inferior frontal gyrus, and precuneus) from the 40-brain LPBA40 atlas. This paper provides the first report of anatomical positions and principal axes orientations within a standard reference frame, in addition to "shape/size related" principal axes measures, for the nine brain structures from the LPBA40 atlas. Analysis showed that overall size (mean volume) for internal brain structures was preserved using shape standardization while variance was reduced by more than 50%. Shape standardization provides increased statistical power for between-group volumetric studies of brain structures compared to volumetric studies that control only for whole brain size. To test ISSN's ability to control for spatial variability of brain structures we evaluated the overlap of 40 regions of interest (ROIs) in a standard reference frame for the nine different brain structures before and after processing. Standardizations of orientation or shape were ineffective when not combined with position standardization. The greatest reduction in spatial variability was seen for combined standardizations of position, orientation and shape. These
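
    The principal-axes (P-A) description of a segmented structure is the eigendecomposition of the covariance of its voxel coordinates, which yields the position (centroid), orientation (eigenvectors), and size/shape-related measures discussed above. A short sketch on a synthetic ellipsoidal mask:

      import numpy as np

      def principal_axes(mask):
          """Centroid, principal-axis orientations, and axis scales of a binary
          3D structure mask (eigen-decomposition of the voxel-coordinate covariance)."""
          coords = np.argwhere(mask)                 # N x 3 voxel coordinates
          centroid = coords.mean(axis=0)
          eigvals, eigvecs = np.linalg.eigh(np.cov((coords - centroid).T))
          order = eigvals.argsort()[::-1]            # longest axis first
          return centroid, eigvecs[:, order], np.sqrt(eigvals[order]), len(coords)

      # Synthetic ellipsoidal "structure" standing in for a segmented brain region.
      z, y, x = np.ogrid[-20:20, -20:20, -20:20]
      mask = (x / 15.0) ** 2 + (y / 8.0) ** 2 + (z / 5.0) ** 2 <= 1.0
      centroid, axes, scales, n_voxels = principal_axes(mask)
      print("centroid:", centroid.round(1), "axis scales:", scales.round(1), "volume:", n_voxels)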

  6. Automated segmentation of three-dimensional MR brain images

    NASA Astrophysics Data System (ADS)

    Park, Jonggeun; Baek, Byungjun; Ahn, Choong-Il; Ku, Kyo Bum; Jeong, Dong Kyun; Lee, Chulhee

    2006-03-01

    Brain segmentation is a challenging problem due to the complexity of the brain. In this paper, we propose an automated brain segmentation method for 3D magnetic resonance (MR) brain images, which are represented as a sequence of 2D brain images. The proposed method consists of three steps: pre-processing, removal of non-brain regions (e.g., the skull, meninges, and other organs), and spinal cord restoration. In pre-processing, we perform adaptive thresholding, which takes into account the variable intensities of MR brain images acquired under various imaging conditions. In the segmentation step, we iteratively apply 2D morphological operations and masking to the sequences of 2D sagittal, coronal, and axial planes in order to remove non-brain tissues. Next, the final 3D brain region is obtained by applying an OR operation to the segmentation results of the three planes. Finally, we reconstruct the spinal cord truncated during the previous processes. Experiments were performed with fifteen 8-bit gray-scale 3D MR brain image sets. The results show that the proposed algorithm is fast and provides robust, satisfactory results.
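
    A rough sketch of the slice-wise strategy described above (threshold each 2D slice, clean it with morphological operations, then OR the results from the three orthogonal planes) is shown below; generic Otsu thresholding stands in for the paper's adaptive thresholding, and the volume is synthetic.

      import numpy as np
      from scipy import ndimage
      from skimage.filters import threshold_otsu

      def slicewise_mask(volume, axis):
          """Threshold and morphologically clean every 2D slice along one axis,
          keeping the largest connected component per slice."""
          mask = np.zeros(volume.shape, dtype=bool)
          for i in range(volume.shape[axis]):
              sl = np.take(volume, i, axis=axis)
              binary = sl > threshold_otsu(sl)
              binary = ndimage.binary_opening(binary, iterations=2)
              labeled, n = ndimage.label(binary)
              if n == 0:
                  continue
              sizes = ndimage.sum(binary, labeled, index=range(1, n + 1))
              keep = ndimage.binary_fill_holes(labeled == (1 + int(np.argmax(sizes))))
              idx = [slice(None)] * volume.ndim
              idx[axis] = i
              mask[tuple(idx)] = keep
          return mask

      # Stand-in MR volume; the OR of the three orthogonal passes approximates the brain mask.
      volume = np.random.default_rng(2).random((40, 40, 40))
      brain = slicewise_mask(volume, 0) | slicewise_mask(volume, 1) | slicewise_mask(volume, 2)
      print("mask voxels:", int(brain.sum()))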

  7. Automated in situ brain imaging for mapping the Drosophila connectome.

    PubMed

    Lin, Chi-Wen; Lin, Hsuan-Wen; Chiu, Mei-Tzu; Shih, Yung-Hsin; Wang, Ting-Yuan; Chang, Hsiu-Ming; Chiang, Ann-Shyn

    2015-01-01

    Mapping the connectome, a wiring diagram of the entire brain, requires large-scale imaging of numerous single neurons with diverse morphology. It is a formidable challenge to reassemble these neurons into a virtual brain and correlate their structural networks with neuronal activities, which are measured in different experiments to analyze the informational flow in the brain. Here, we report an in situ brain imaging technique called Fly Head Array Slice Tomography (FHAST), which permits the reconstruction of structural and functional data to generate an integrative connectome in Drosophila. Using FHAST, the head capsules of an array of flies can be opened with a single vibratome sectioning to expose the brains, replacing the painstaking and inconsistent brain dissection process. FHAST can reveal in situ brain neuroanatomy with minimal distortion to neuronal morphology and maintain intact neuronal connections to peripheral sensory organs. Most importantly, it enables the automated 3D imaging of 100 intact fly brains in each experiment. The established head model with in situ brain neuroanatomy allows functional data to be accurately registered and associated with 3D images of single neurons. These integrative data can then be shared, searched, visualized, and analyzed for understanding how brain-wide activities in different neurons within the same circuit function together to control complex behaviors.

  8. Automated Brain Extraction from T2-weighted Magnetic Resonance Images

    PubMed Central

    Datta, Sushmita; Narayana, Ponnada A.

    2011-01-01

    Purpose To develop and implement an automated and robust technique to extract the brain from T2-weighted images. Materials and Methods Magnetic resonance imaging (MRI) was performed on 75 adult volunteers to acquire dual fast spin echo (FSE) images with a fat-saturation technique on a 3T Philips scanner. Thresholds were derived directly from the histograms of the original images, followed by the application of regional labeling, regional connectivity, and mathematical morphological operations to extract the brain from axial late-echo FSE (T2-weighted) images. The proposed technique was evaluated subjectively by an expert and quantitatively using a Bland-Altman plot and the Jaccard and Dice similarity measures. Results Excellent agreement between the brain volumes extracted with the proposed technique and manual stripping by an expert was observed based on the Bland-Altman plot and as assessed by high similarity indices (Jaccard: 0.9825 ± 0.0045; Dice: 0.9912 ± 0.0023). Conclusion Brain extraction using the proposed automated methodology is robust and the results are reproducible. PMID:21448946
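
    The pipeline described above (histogram-derived threshold, connected-component labeling, morphological clean-up, evaluation with Jaccard and Dice indices) can be sketched as follows; Otsu's histogram threshold stands in for the paper's specific thresholds, and the volumes are synthetic.

      import numpy as np
      from scipy import ndimage
      from skimage.filters import threshold_otsu

      def extract_brain(t2_volume):
          """Rough skull-stripping: histogram-derived threshold, connected-component
          labeling (keep the largest region), then morphological smoothing."""
          mask = t2_volume > threshold_otsu(t2_volume)
          labeled, n = ndimage.label(mask)
          sizes = ndimage.sum(mask, labeled, index=range(1, n + 1))
          mask = labeled == (1 + int(np.argmax(sizes)))
          mask = ndimage.binary_closing(mask, iterations=3)
          return ndimage.binary_fill_holes(mask)

      def jaccard_and_dice(a, b):
          inter, union = np.logical_and(a, b).sum(), np.logical_or(a, b).sum()
          return inter / union, 2.0 * inter / (a.sum() + b.sum())

      vol = np.random.default_rng(3).random((48, 48, 48))     # stand-in T2-weighted volume
      auto_mask = extract_brain(vol)
      manual_mask = ndimage.binary_dilation(auto_mask)         # stand-in for expert stripping
      print("Jaccard %.3f, Dice %.3f" % jaccard_and_dice(auto_mask, manual_mask))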

  9. Automated Prescription of Oblique Brain 3D MRSI

    PubMed Central

    Ozhinsky, Eugene; Vigneron, Daniel B.; Chang, Susan M.; Nelson, Sarah J.

    2012-01-01

    Two major difficulties encountered in implementing Magnetic Resonance Spectroscopic Imaging (MRSI) in a clinical setting are limited coverage and difficulty of prescription. The goal of this project was to completely automate the process of 3D PRESS MRSI prescription, including placement of the selection box, saturation bands, and shim volume, while maximizing coverage of the brain. The automated prescription technique included acquisition of an anatomical MR image, optimization of the oblique selection box parameters, optimization of the placement of OVS saturation bands, and loading of the calculated parameters into a customized 3D MRSI pulse sequence. To validate the technique and compare its performance with existing protocols, 3D MRSI data were acquired from 6 exams of 3 healthy volunteers. To assess the performance of the automated 3D MRSI prescription for patients with brain tumors, data were collected from 16 exams of 8 subjects with gliomas. The technique demonstrated robust coverage of the tumor, high consistency of prescription, and very good data quality within the T2 lesion. PMID:22692829

  10. Automated population-based planning for whole brain radiation therapy.

    PubMed

    Schreibmann, Eduard; Fox, Tim; Curran, Walter; Shu, Hui-Kuo; Crocker, Ian

    2015-09-08

    Treatment planning for whole-brain radiation treatment is technically a simple process, but in practice it consumes valuable clinical time on repetitive and tedious tasks. This report presents a method that automatically segments the relevant target and normal tissues and creates a treatment plan in only a few minutes after patient simulation. Segmentation of target and critical structures is performed automatically through morphological operations on the soft tissue and was validated against manual clinical segmentation using the Dice coefficient and Hausdorff distance. The treatment plan is generated by searching a database of previous cases for patients with similar anatomy. In this search, each database case is ranked in terms of similarity using a customized metric designed for sensitivity by including only geometrical changes that affect the dose distribution. The database case with the best match is automatically modified to replace the relevant patient information and isocenter position while maintaining the original beam and MLC settings. Fifteen patients with marginally acceptable treatment plans were used to validate the method. In each of these cases the anatomy was accurately segmented, but the beams and MLC settings led to a suboptimal treatment plan by either underdosing the brain or excessively irradiating critical normal tissues. For each case, the anatomy was automatically segmented with the proposed method, and the automated and manual segmentations were then compared. The mean Dice coefficient was 0.97 with a standard deviation of 0.008 for the brain, 0.85 ± 0.009 for the eyes, and 0.67 ± 0.11 for the lens. The mean Euclidean distance was 0.13 ± 0.13 mm for the brain, 0.27 ± 0.31 for the eye, and 2.34 ± 7.23 for the lens. Each case was then matched against a database of 70 validated treatment plans, and the best matching plan (termed autoplanned) was compared retrospectively with the clinical plans in terms of brain coverage and
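
    The two agreement measures named above, the Dice coefficient and a Hausdorff-type distance, can be computed for a pair of binary masks as sketched below; the spherical masks are synthetic stand-ins, and the Hausdorff distance here is taken over mask voxel coordinates, which is one common variant rather than necessarily the exact measure used in the paper.

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      def dice(a, b):
          return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

      def hausdorff(a, b):
          """Symmetric Hausdorff distance between two binary masks, in voxel units."""
          pa, pb = np.argwhere(a), np.argwhere(b)
          return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

      # Stand-ins for an automated and a manual segmentation of the same structure.
      z, y, x = np.ogrid[-12:12, -12:12, -12:12]
      auto = x ** 2 + y ** 2 + z ** 2 <= 10 ** 2
      manual = (x - 1) ** 2 + y ** 2 + z ** 2 <= 10 ** 2
      print("Dice %.3f, Hausdorff %.1f voxels" % (dice(auto, manual), hausdorff(auto, manual)))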

  11. Automated core-penumbra quantification in neonatal ischemic brain injury.

    PubMed

    Ghosh, Nirmalya; Yuan, Xiangpeng; Turenius, Christine I; Tone, Beatriz; Ambadipudi, Kamalakar; Snyder, Evan Y; Obenaus, Andre; Ashwal, Stephen

    2012-12-01

    Neonatal hypoxic-ischemic brain injury (HII) and arterial ischemic stroke (AIS) result in irreversibly injured (core) and salvageable (penumbral) tissue regions. Identification and reliable quantification of salvageable tissue is pivotal to any effective and safe intervention. Magnetic resonance imaging (MRI) is the current standard to distinguish core from penumbra using diffusion-perfusion mismatch (DPM). However, subtle MR signal variations between core-penumbral regions make their visual delineation difficult. We hypothesized that computational analysis of MRI data provides a more accurate assessment of core and penumbral tissue evolution in HII/AIS. We used two neonatal rat-pup models of HII/AIS (unilateral and global hypoxic-ischemia) and clinical data sets from neonates with AIS to test our noninvasive, automated computational approach, Hierarchical Region Splitting (HRS), to detect and quantify ischemic core-penumbra using only a single MRI modality (T2- or diffusion-weighted imaging, T2WI/DWI). We also validated our approach by comparing core-penumbral images (from HRS) to DPM with immunohistochemical validation of HII tissues. Our translational and clinical data results showed that HRS could accurately and reliably distinguish the ischemic core from penumbra and their spatiotemporal evolution, which may aid in the vetting and execution of effective therapeutic interventions as well as patient selection.
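
    Hierarchical Region Splitting, as described, recursively partitions the image by intensity until regions become homogeneous. The sketch below is a much-simplified illustration of that idea using recursive Otsu thresholding on a synthetic 2D map; it is not the authors' HRS implementation, and the stopping criteria are arbitrary.

      import numpy as np
      from skimage.filters import threshold_otsu

      def split_regions(image, mask, min_voxels=100, min_spread=25.0):
          """Recursively split a region in two by Otsu-thresholding its intensities,
          stopping when a region is small or nearly homogeneous. Returns the leaf
          regions as boolean masks. A simplified sketch of the HRS idea only."""
          vals = image[mask]
          if mask.sum() < min_voxels or vals.max() - vals.min() < min_spread:
              return [mask]
          t = threshold_otsu(vals)
          low, high = mask & (image <= t), mask & (image > t)
          if not low.any() or not high.any():
              return [mask]
          return (split_regions(image, low, min_voxels, min_spread) +
                  split_regions(image, high, min_voxels, min_spread))

      # Synthetic T2-like map with three populations (normal tissue, "penumbra", "core").
      rng = np.random.default_rng(4)
      img = rng.normal(100.0, 3.0, size=(64, 64))
      img[20:44, 20:44] += 30.0            # penumbra-like region
      img[28:36, 28:36] += 30.0            # core-like region inside it
      leaves = split_regions(img, np.ones(img.shape, dtype=bool))
      print("regions found:", len(leaves), "sizes:", sorted(int(m.sum()) for m in leaves))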

  12. Predicting competency in automated machine use in an acquired brain injury population using neuropsychological measures.

    PubMed

    Crowe, Simon F; Mahony, Kate; Jackson, Martin

    2004-08-01

    The purpose of the current study was to explore whether performance on standardised neuropsychological measures could predict functional ability with automated machines and services among people with an acquired brain injury (ABI). Participants were 45 individuals who met the criteria for mild, moderate or severe ABI and 15 control participants matched on demographic variables including age and education. Each participant was required to complete a battery of neuropsychological tests, as well as to perform three automated service delivery tasks: a transport automated ticketing machine, an automated teller machine (ATM) and an automated telephone service. The results showed a consistently strong relationship between the neuropsychological measures, both as single predictors and in combination, and the level of competency with the automated machines. Automated machines are a relatively new phenomenon in service delivery and offer an ecologically valid functional measure of performance that represents a true indication of functional disability. PMID:15271411

  13. Fast whole-brain optical tomography capable of automated slice-collection (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Yuan, Jing; Jiang, Tao; Deng, Lei; Long, Beng; Peng, Jie; Luo, Qingming; Gong, Hui

    2016-03-01

    Acquiring brain-wide composite information on neuroanatomy and molecular phenotyping is crucial to understanding brain function. However, current whole-brain imaging methods based on mechanical sectioning have not achieved brain-wide acquisition of both neuroanatomical and molecular phenotyping, owing to the lack of appropriate whole-brain immunostaining of embedded samples. Here, we present a novel strategy for acquiring brain-wide structural and molecular maps in the same brain, combining whole-brain imaging and subsequent immunostaining of automatically collected slices. We developed a whole-brain imaging system capable of automatically imaging and then collecting the imaged tissue slices in order. The system contains three parts: structured illumination microscopy for high-throughput optical sectioning, a vibratome for high-precision sectioning, and a slice-collection device for automated collection of tissue slices. With this system, we could acquire a whole-brain dataset of an agarose-embedded mouse brain at a lateral resolution of 0.33 µm with z-interval sampling of 100 µm in 9 h, and automatically collect the imaged slices in sequence. Subsequently, we performed immunohistochemistry on the collected slices in the routine way. We acquired mouse whole-brain imaging datasets of multiple specific types of neurons, proteins and gene expression profiles. We believe our method could accelerate systematic analysis of brain anatomical structure together with specific protein or gene expression information, and the understanding of how the brain processes information and generates behavior.

  14. Functional MRI Preprocessing in Lesioned Brains: Manual Versus Automated Region of Interest Analysis.

    PubMed

    Garrison, Kathleen A; Rogalsky, Corianne; Sheng, Tong; Liu, Brent; Damasio, Hanna; Winstein, Carolee J; Aziz-Zadeh, Lisa S

    2015-01-01

    Functional magnetic resonance imaging (fMRI) has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant's structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions, such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant's non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise, but may provide a more accurate estimate of brain response. In this study, commonly used automated and manual approaches to ROI analysis were directly compared by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study, involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. Significant differences were identified in task-related effect size and percent-activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design. PMID:26441816
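
    The two ROI measures compared in this study, task-related effect size and the percentage of supra-threshold ("activated") voxels within the ROI, reduce to simple masked statistics once an ROI mask is defined (whether hand-drawn or atlas-derived). A minimal sketch on synthetic arrays; the threshold of 2.3 and the label value are illustrative choices, not values from the paper.

      import numpy as np

      def roi_stats(stat_map, atlas, roi_label, threshold=2.3):
          """Mean effect size and percent supra-threshold voxels inside an atlas ROI.

          stat_map : voxel-wise contrast/statistic image (e.g., t or beta values)
          atlas    : integer label image aligned to stat_map
          roi_label: label value of the ROI of interest
          """
          roi = atlas == roi_label
          vals = stat_map[roi]
          return vals.mean(), 100.0 * (vals > threshold).sum() / roi.sum()

      # Synthetic stand-ins for a normalized stat map and an atlas labeling.
      rng = np.random.default_rng(5)
      stat_map = rng.normal(size=(30, 30, 30))
      atlas = rng.integers(0, 5, size=(30, 30, 30))
      mean_effect, pct_active = roi_stats(stat_map, atlas, roi_label=3)
      print(f"mean effect {mean_effect:.3f}, {pct_active:.1f}% voxels above threshold")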

  15. Serial two-photon tomography: an automated method for ex-vivo mouse brain imaging

    PubMed Central

    Ragan, Timothy; Kadiri, Lolahon R.; Venkataraju, Kannan Umadevi; Bahlmann, Karsten; Sutin, Jason; Taranda, Julian; Arganda-Carreras, Ignacio; Kim, Yongsoo; Seung, H. Sebastian

    2011-01-01

    Here we describe an automated method, which we call serial two-photon (STP) tomography, that achieves high-throughput fluorescence imaging of mouse brains by integrating two-photon microscopy and tissue sectioning. STP tomography generates high-resolution datasets that are free of distortions and can be readily warped in 3D, for example, for comparing multiple anatomical tracings. This method opens the door to routine systematic studies of neuroanatomy in mouse models of human brain disorders. PMID:22245809

  16. Semi-Automated Atlas-based Analysis of Brain Histological Sections

    PubMed Central

    Kopec, Charles D.; Bowers, Amanda C.; Pai, Shraddha; Brody, Carlos D.

    2011-01-01

    Quantifying the location and/or number of features in a histological section of the brain currently requires one first to manually register a corresponding section from a tissue atlas onto the experimental section and second to count the features. No automated method exists for the first process (registration), and most automated methods for the second process (feature counting) operate reliably only in a high signal-to-noise regime. To reduce experimenter bias and inconsistencies and increase the speed of these analyses, we developed Atlas Fitter, a semi-automated, open-source MATLAB-based software package that assists in rapidly registering atlas panels onto histological sections. We also developed CellCounter, a novel fully automated cell counting algorithm designed to operate on images with non-uniform background intensities and low signal-to-noise ratios. PMID:21194546
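
    As a generic illustration of automated feature counting in a low signal-to-noise image (not the CellCounter algorithm itself), a Laplacian-of-Gaussian blob detector from scikit-image can be applied as follows; the "cells" here are synthetic Gaussian spots on a noisy background.

      import numpy as np
      from skimage.feature import blob_log

      def count_cells(image, min_sigma=2, max_sigma=6, threshold=0.1):
          """Count roughly circular bright features with a Laplacian-of-Gaussian
          blob detector; returns the count and the (row, col, sigma) detections."""
          blobs = blob_log(image, min_sigma=min_sigma, max_sigma=max_sigma,
                           num_sigma=5, threshold=threshold)
          return len(blobs), blobs

      # Synthetic section: low-SNR background with three bright Gaussian "cells".
      rng = np.random.default_rng(6)
      img = rng.normal(0.1, 0.05, size=(128, 128))
      yy, xx = np.mgrid[0:128, 0:128]
      for cy, cx in [(30, 40), (70, 90), (100, 20)]:
          img += 0.8 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 3.0 ** 2))
      n, blobs = count_cells(np.clip(img, 0, 1))
      print("detected cells:", n)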

  17. Automated detection of periventricular veins on 7 T brain MRI

    NASA Astrophysics Data System (ADS)

    Kuijf, Hugo J.; Bouvy, Willem H.; Zwanenburg, Jaco J. M.; Viergever, Max A.; Biessels, Geert Jan; Vincken, Koen L.

    2015-03-01

    Cerebral small vessel disease is common in elderly persons and a leading cause of cognitive decline, dementia, and acute stroke. With the introduction of ultra-high field strength 7.0T MRI, it is possible to visualize small vessels in the brain. In this work, a proof-of-principle study is conducted to assess the feasibility of automatically detecting periventricular veins. Periventricular veins are organized in a fan pattern and drain venous blood from the brain towards the caudate vein of Schlesinger, which is situated along the lateral ventricles. Just outside this vein, a region-of-interest (ROI) through which all periventricular veins must cross is defined. Within this ROI, a combination of the vesselness filter, tubular tracking, and hysteresis thresholding is applied to locate periventricular veins. All detected locations were evaluated by an expert human observer. The results showed a positive predictive value of 88% and a sensitivity of 95% for detecting periventricular veins. The proposed method shows good results in detecting periventricular veins in the brain on 7.0T MR images. Compared to previous works that use only a 1D or 2D ROI and limited image processing, our work presents a more comprehensive definition of the ROI, advanced image processing techniques to detect periventricular veins, and a quantitative analysis of the performance. The results of this proof-of-principle study are promising and will be used to assess periventricular veins on 7.0T brain MRI.
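
    Two of the image-processing steps named above, vesselness filtering and hysteresis thresholding, are available directly in scikit-image; the sketch below applies them to a synthetic 2D patch containing one thin bright "vein". The Frangi filter stands in for the generic vesselness filter mentioned in the abstract, and the ROI definition and tubular tracking steps are omitted.

      import numpy as np
      from skimage.filters import frangi, apply_hysteresis_threshold

      def detect_vessels(image, low=0.05, high=0.15):
          """Vesselness filtering followed by hysteresis thresholding."""
          vesselness = frangi(image, sigmas=range(1, 4), black_ridges=False)
          vesselness = vesselness / (vesselness.max() + 1e-12)
          return apply_hysteresis_threshold(vesselness, low, high)

      # Synthetic patch with a thin bright tubular structure on a noisy background.
      rng = np.random.default_rng(7)
      img = rng.normal(0.0, 0.05, size=(128, 128))
      img[:, 60:62] += 1.0
      mask = detect_vessels(img)
      print("vessel pixels detected:", int(mask.sum()))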

  18. Simple Fully Automated Group Classification on Brain fMRI

    SciTech Connect

    Honorio, J.; Goldstein, R.; Honorio, J.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-04-14

    We propose a simple, well-grounded classification technique suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise levels, high subject variability, and imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averages across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
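
    A toy version of the "simple classifiers plus majority vote" idea, one thresholded feature per condition with a vote across features, is sketched below on synthetic group data. It is meant only to illustrate the voting scheme; it does not reproduce the paper's threshold-split region feature selection.

      import numpy as np

      def fit_threshold_stumps(X, y):
          """For each feature, pick the midpoint between the two class means and
          remember which class lies above it (a bank of one-feature stumps)."""
          m0, m1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
          return (m0 + m1) / 2.0, (m1 > m0).astype(int)

      def predict_majority(X, thresholds, positive_side):
          votes = np.where(X > thresholds, positive_side, 1 - positive_side)
          return (votes.mean(axis=1) > 0.5).astype(int)   # majority vote across features

      # Synthetic group data: 20 subjects x 8 features with a small group effect.
      rng = np.random.default_rng(8)
      y = np.repeat([0, 1], 10)
      X = rng.normal(size=(20, 8)) + 0.8 * y[:, None]
      thr, side = fit_threshold_stumps(X[:14], y[:14])
      print("held-out predictions:", predict_majority(X[14:], thr, side), "truth:", y[14:])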

  1. Associations between Family Adversity and Brain Volume in Adolescence: Manual vs. Automated Brain Segmentation Yields Different Results

    PubMed Central

    Lyden, Hannah; Gimbel, Sarah I.; Del Piero, Larissa; Tsai, A. Bryna; Sachs, Matthew E.; Kaplan, Jonas T.; Margolin, Gayla; Saxbe, Darby

    2016-01-01

    Associations between brain structure and early adversity have been inconsistent in the literature. These inconsistencies may be partially due to methodological differences. Different methods of brain segmentation may produce different results, obscuring the relationship between early adversity and brain volume. Moreover, adolescence is a time of significant brain growth and certain brain areas have distinct rates of development, which may compromise the accuracy of automated segmentation approaches. In the current study, 23 adolescents participated in two waves of a longitudinal study. Family aggression was measured when the youths were 12 years old, and structural scans were acquired an average of 4 years later. Bilateral amygdalae and hippocampi were segmented using three different methods (manual tracing, FSL, and NeuroQuant). The segmentation estimates were compared, and linear regressions were run to assess the relationship between early family aggression exposure and all three volume segmentation estimates. Manual tracing results showed a positive relationship between family aggression and right amygdala volume, whereas FSL segmentation showed negative relationships between family aggression and both the left and right hippocampi. However, results indicate poor overlap between methods, and different associations were found between early family aggression exposure and brain volume depending on the segmentation method used. PMID:27656121

  2. Automated Robust Image Segmentation: Level Set Method Using Nonnegative Matrix Factorization with Application to Brain MRI.

    PubMed

    Dera, Dimah; Bouaynaya, Nidhal; Fathallah-Shaykh, Hassan M

    2016-07-01

    We address the problem of fully automated region discovery and robust image segmentation by devising a new deformable model based on the level set method (LSM) and the probabilistic nonnegative matrix factorization (NMF). We describe the use of NMF to calculate the number of distinct regions in the image and to derive the local distribution of the regions, which is incorporated into the energy functional of the LSM. The results demonstrate that our NMF-LSM method is superior to other approaches when applied to synthetic binary and gray-scale images and to clinical magnetic resonance images (MRI) of the human brain with and without a malignant brain tumor, glioblastoma multiforme. In particular, the NMF-LSM method is fully automated, highly accurate, less sensitive to the initial selection of the contour(s) or initial conditions, more robust to noise and model parameters, and able to detect distinct regions as small as desired. These advantages stem from the fact that the proposed method relies on histogram information instead of intensity values and does not introduce nuisance model parameters. These properties provide a general approach for automated robust region discovery and segmentation in heterogeneous images. Compared with the retrospective radiological diagnoses of two patients with non-enhancing grade 2 and 3 oligodendroglioma, the NMF-LSM method detects earlier progression times and appears suitable for monitoring tumor response. The NMF-LSM method fills an important need for automated segmentation of clinical MRI. PMID:27417984
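
    One way to picture the role NMF plays here, estimating the number of distinct regions and their intensity distributions from histogram information, is to factorize a matrix of local intensity histograms (one row per image patch), as sketched below on a synthetic image. This is only an illustration of that idea; the level-set evolution and the authors' exact formulation are not reproduced.

      import numpy as np
      from sklearn.decomposition import NMF

      def region_distributions(image, rank=3, patch=8, bins=32):
          """Factorize patch-wise intensity histograms with NMF; each component
          approximates the intensity distribution of one region."""
          edges = np.linspace(image.min(), image.max(), bins + 1)
          rows = []
          for i in range(0, image.shape[0] - patch + 1, patch):
              for j in range(0, image.shape[1] - patch + 1, patch):
                  h, _ = np.histogram(image[i:i + patch, j:j + patch], bins=edges)
                  rows.append(h)
          model = NMF(n_components=rank, init="nndsvda", max_iter=500, random_state=0)
          weights = model.fit_transform(np.array(rows, dtype=float))  # patch-by-region weights
          return model.components_, weights

      # Synthetic image with three intensity regions (background, tissue, "tumor").
      rng = np.random.default_rng(13)
      img = rng.normal(50, 3, size=(64, 64))
      img[16:48, 16:48] = rng.normal(100, 3, size=(32, 32))
      img[24:40, 24:40] = rng.normal(150, 3, size=(16, 16))
      components, weights = region_distributions(img, rank=3)
      print("estimated per-region histograms shape:", components.shape)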

  3. Automated long-term tracking of freely moving animal and functional brain imaging based on fiber optic microscopy

    NASA Astrophysics Data System (ADS)

    Cha, Jaepyeong; Cheon, Gyeong Woo; Kang, Jin U.

    2015-03-01

    In this study, we demonstrate an automated data acquisition and analysis platform for both long-term motion tracking and functional brain imaging in freely moving mice. Our system uses a fiber-bundle-based fluorescence microscope for 24-hour imaging of cellular activities within the brain while also monitoring the corresponding animal behaviors with a NIR camera. Synchronized software and automated analysis allow quantification of all animal behaviors and their brain activities over extended periods of time. Our platform can be used for interrogation of brain activity in different behavioral states and is also well suited for longitudinal studies of cellular activities in freely moving animals.

  4. Automated monitoring of early neurobehavioral changes in mice following traumatic brain injury

    PubMed Central

    Qu, Wenrui; Liu, Nai-kui; Xie, Xin-min (Simon); Li, Rui; Xu, Xiao-ming

    2016-01-01

    Traumatic brain injury often causes a variety of behavioral and emotional impairments that can develop into chronic disorders. Therefore, there is a need to shift towards identifying early symptoms that can aid in the prediction of traumatic brain injury outcomes and behavioral endpoints in patients with traumatic brain injury after early interventions. In this study, we used the SmartCage system, an automated quantitative approach, to assess behavioral alterations in mice during an early phase of traumatic brain injury in their home cages. Female C57BL/6 adult mice were subjected to moderate controlled cortical impact (CCI) injury. The mice then received a battery of behavioral assessments including neurological score, locomotor activity, sleep/wake states, and anxiety-like behaviors on days 1, 2, and 7 after CCI. Histological analysis was performed on day 7 after the last assessment. Spontaneous activities on days 1 and 2 after injury were significantly decreased in the CCI group. The average percentage of sleep time spent in both dark and light cycles was significantly higher in the CCI group than in the sham group. For anxiety-like behaviors, the time spent in the light compartment and the number of transitions between the dark and light compartments were both significantly reduced in the CCI group compared with the sham group. In addition, the mice suffering from CCI exhibited a preference for staying in the dark compartment of a dark/light cage. The CCI mice showed reduced neurological scores and histological abnormalities, which correlated well with the automated behavioral assessments. Our findings demonstrate that the automated SmartCage system provides sensitive and objective measures of early behavioral changes in mice following traumatic brain injury. PMID:27073377

  5. Semi-Automated Trajectory Analysis of Deep Ballistic Penetrating Brain Injury

    PubMed Central

    Folio, Les; Solomon, Jeffrey; Biassou, Nadia; Fischer, Tatjana; Dworzak, Jenny; Raymont, Vanessa; Sinaii, Ninet; Wassermann, Eric M.; Grafman, Jordan

    2016-01-01

    Background Penetrating head injuries (PHIs) are common in combat operations and most have visible wound paths on computed tomography (CT). Objective We assess agreement between an automated trajectory analysis-based assessment of brain injury and manual tracings of encephalomalacia on CT. Methods We analyzed 80 head CTs with ballistic PHI from the Institutional Review Board-approved Vietnam head injury registry. Anatomic reports were generated from the spatial coordinates of the projectile entrance and the terminal fragment location. These were compared to manual tracings of the regions of encephalomalacia. Dice’s similarity coefficients, kappa, sensitivities, and specificities were calculated to assess agreement. Times required for case analysis were also compared. Results The results show high specificity of the anatomic regions identified on CT by the semiautomated anatomical estimates compared with manual tracings of tissue damage. Radiologists’ and medical students’ anatomic region reports were similar (kappa 0.8, t-test p < 0.001). Region-of-probable-injury modeling of involved brain structures was sensitive (0.7) and specific (0.9) compared with manually traced structures. Semiautomated analysis was 9-fold faster than manual tracing. Conclusion Our region-of-probable-injury spatial model approximates anatomical regions of encephalomalacia from ballistic PHI with time savings over manual methods. The results show the potential of automated anatomical reporting as an adjunct to the current practice of radiologist/neurosurgical review of brain injury by penetrating projectiles. PMID:23707123
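
    The core of a trajectory-based anatomic report, looking up which labeled structures the straight path between the entrance site and the terminal fragment passes through, can be sketched as below. The atlas, label names, and coordinates are toy stand-ins; a real region-of-probable-injury model would also dilate around the path rather than sampling only the line itself.

      import numpy as np

      def labels_along_trajectory(atlas, names, entry, terminal, n=200):
          """Anatomical labels intersected by the straight path between a projectile
          entry point and a terminal fragment location (voxel coordinates)."""
          entry, terminal = np.asarray(entry, float), np.asarray(terminal, float)
          ts = np.linspace(0.0, 1.0, n)[:, None]
          points = np.round(entry + ts * (terminal - entry)).astype(int)
          points = np.clip(points, 0, np.array(atlas.shape) - 1)
          hit = sorted({int(atlas[tuple(p)]) for p in points} - {0})
          return [names.get(h, f"label {h}") for h in hit]

      # Toy atlas: two labeled blocks standing in for anatomical structures.
      atlas = np.zeros((64, 64, 64), dtype=int)
      atlas[10:30, 10:30, 10:30] = 1
      atlas[35:55, 35:55, 35:55] = 2
      names = {1: "frontal region (toy)", 2: "parietal region (toy)"}
      print(labels_along_trajectory(atlas, names, entry=(12, 12, 12), terminal=(50, 50, 50)))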

  6. Quantifying brain tissue volume in multiple sclerosis with automated lesion segmentation and filling

    PubMed Central

    Valverde, Sergi; Oliver, Arnau; Roura, Eloy; Pareto, Deborah; Vilanova, Joan C.; Ramió-Torrentà, Lluís; Sastre-Garriga, Jaume; Montalban, Xavier; Rovira, Àlex; Lladó, Xavier

    2015-01-01

    Lesion filling has been successfully applied to reduce the effect of hypo-intense T1-w multiple sclerosis (MS) lesions on automatic brain tissue segmentation. However, a study of the effect of fully automated pipelines incorporating lesion segmentation and lesion filling on tissue volume analysis has not yet been performed. Here, we analyzed the percentage of error introduced by automating the lesion segmentation and filling processes in the tissue segmentation of 70 clinically isolated syndrome patient images. First, images were processed using the LST and SLS toolkits with different pipeline combinations that differed in either automated or manual lesion segmentation, and in lesion filling or masking out lesions. Then, images processed following each of the pipelines were segmented into gray matter (GM) and white matter (WM) using SPM8, and compared with the same images in which expert lesion annotations were filled before segmentation. Our results showed that fully automated lesion segmentation and filling pipelines significantly reduced the percentage of error in GM and WM volume on images of MS patients, and performed similarly to the images where expert lesion annotations were masked before segmentation. In all the pipelines, the amount of misclassified lesion voxels was the main cause of the observed error in GM and WM volume. However, the percentage of error was significantly lower when automatically estimated lesions were filled rather than masked before segmentation. These results are relevant and suggest that the LST and SLS toolboxes allow accurate brain tissue volume measurements without any kind of manual intervention, which can be convenient not only in terms of time and economic costs, but also to avoid the inherent intra- and inter-rater variability between manual annotations. PMID:26740917
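
    Lesion filling itself is conceptually simple: lesion voxels are replaced with intensities drawn from the surrounding normal-appearing white matter so they no longer bias tissue segmentation. The sketch below shows that basic idea on synthetic data; it is not the LST or SLS implementation.

      import numpy as np

      def fill_lesions(t1, lesion_mask, wm_mask, rng=None):
          """Replace lesion voxel intensities with samples from the white-matter
          intensity distribution outside the lesions (a simplified sketch)."""
          rng = rng or np.random.default_rng(0)
          wm_vals = t1[wm_mask & ~lesion_mask]
          filled = t1.copy()
          filled[lesion_mask] = rng.normal(wm_vals.mean(), wm_vals.std(),
                                           size=int(lesion_mask.sum()))
          return filled

      # Synthetic T1-like volume: WM around 100, hypo-intense lesions around 60.
      rng = np.random.default_rng(9)
      t1 = rng.normal(100.0, 5.0, size=(40, 40, 40))
      lesions = np.zeros(t1.shape, dtype=bool)
      lesions[15:20, 15:20, 15:20] = True
      t1[lesions] = rng.normal(60.0, 5.0, size=int(lesions.sum()))
      wm = np.ones(t1.shape, dtype=bool)       # trivial WM mask for the demo
      filled = fill_lesions(t1, lesions, wm, rng)
      print("lesion mean before %.1f, after filling %.1f" % (t1[lesions].mean(), filled[lesions].mean()))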

  7. New tissue priors for improved automated classification of subcortical brain structures on MRI.

    PubMed

    Lorio, S; Fresard, S; Adaszewski, S; Kherif, F; Chowdhury, R; Frackowiak, R S; Ashburner, J; Helms, G; Weiskopf, N; Lutti, A; Draganski, B

    2016-04-15

    Despite the constant improvement of algorithms for automated brain tissue classification, the accurate delineation of subcortical structures using magnetic resonance imaging (MRI) data remains challenging. The main difficulties arise from the low gray-white matter contrast of iron-rich areas in T1-weighted (T1w) MRI data and from the lack of adequate priors for the basal ganglia and thalamus. The most recent attempts to obtain such priors were based on cohorts of limited size that included subjects in a narrow age range, failing to account for age-related gray-white matter contrast changes. Aiming to improve the anatomical plausibility of automated brain tissue classification from T1w data, we have created new tissue probability maps for subcortical gray matter regions. Supported by atlas-derived spatial information, raters manually labeled subcortical structures in a cohort of healthy subjects using magnetization transfer saturation and R2* MRI maps, which feature optimal gray-white matter contrast in these areas. After assessment of inter-rater variability, the new tissue priors were tested on T1w data within the framework of voxel-based morphometry. The automated detection of gray matter in subcortical areas with our new probability maps was more anatomically plausible than that obtained with currently available priors. We provide evidence that the improved delineation compensates for age-related bias in the segmentation of iron-rich subcortical regions. The new tissue priors, allowing robust detection of the basal ganglia and thalamus, have the potential to enhance the sensitivity of voxel-based morphometry in both healthy and diseased brains. PMID:26854557
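
    A tissue prior (probability map) of the kind described is, at its core, an average of spatially normalized binary labels across subjects, usually smoothed. The sketch below builds such a map from toy, already-aligned labels; the registration to a common space and the MT/R2*-based manual labeling are omitted, and the smoothing kernel is an arbitrary choice.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def build_tissue_prior(label_masks, fwhm_vox=4.0):
          """Average aligned binary labels of a structure across subjects and
          smooth the result into a voxel-wise probability map."""
          prior = np.mean(np.stack(label_masks).astype(float), axis=0)
          sigma = fwhm_vox / 2.355                 # FWHM -> Gaussian sigma
          return np.clip(gaussian_filter(prior, sigma), 0.0, 1.0)

      # Toy example: 10 subjects' (already aligned) binary labels of one structure.
      rng = np.random.default_rng(12)
      masks = []
      for _ in range(10):
          m = np.zeros((40, 40, 40), dtype=bool)
          c = 20 + rng.integers(-2, 3, size=3)     # small inter-subject jitter
          m[c[0]-5:c[0]+5, c[1]-5:c[1]+5, c[2]-5:c[2]+5] = True
          masks.append(m)
      prior = build_tissue_prior(masks)
      print("prior max %.2f at voxel %s" % (prior.max(), np.unravel_index(prior.argmax(), prior.shape)))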

  8. Automated metastatic brain lesion detection: a computer aided diagnostic and clinical research tool

    NASA Astrophysics Data System (ADS)

    Devine, Jeremy; Sahgal, Arjun; Karam, Irene; Martel, Anne L.

    2016-03-01

    The accurate localization of brain metastases in magnetic resonance (MR) images is crucial for patients undergoing stereotactic radiosurgery (SRS) to ensure that all neoplastic foci are targeted. Computer-automated tumor localization and analysis can improve both of these tasks by eliminating inter- and intra-observer variations during the MR image reading process. Lesion localization is accomplished using adaptive thresholding to extract enhancing objects. Each enhancing object is represented as a vector of features that includes information on object size, symmetry, position, shape, and context. These vectors are then used to train a random forest classifier. We trained and tested the image analysis pipeline on 3D axial contrast-enhanced MR images with the intention of localizing the brain metastases. In our cross-validation study, at the most effective algorithm operating point, we were able to identify 90% of the lesions at a precision rate of 60%.
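
    The classification stage described, each candidate enhancing object summarized as a feature vector and scored by a random forest, can be sketched with scikit-learn as follows; the candidate features and ground truth here are synthetic stand-ins, not values from the paper.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      # Each candidate enhancing object is summarized by a feature vector
      # (size, symmetry, position, shape, context in the abstract); synthetic here.
      rng = np.random.default_rng(10)
      n_candidates = 400
      features = rng.normal(size=(n_candidates, 5))
      is_metastasis = (features[:, 0] + 0.5 * features[:, 1]
                       + rng.normal(scale=0.5, size=n_candidates)) > 1.0

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(features[:300], is_metastasis[:300])
      probs = clf.predict_proba(features[300:])[:, 1]   # lesion probability per candidate
      print("flagged candidates:", int((probs > 0.5).sum()), "of", len(probs))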

  9. Comparison of manual vs. automated multimodality (CT-MRI) image registration for brain tumors

    SciTech Connect

    Sarkar, Abhirup; Santiago, Roberto J.; Smith, Ryan; Kassaee, Alireza. E-mail: Kassaee@xrt.upenn.edu

    2005-03-31

    Computed tomography-magnetic resonance imaging (CT-MRI) registrations are routinely used for target-volume delineation of brain tumors. We clinically use 2 software packages based on manual operation and 1 automated package with 2 different algorithms: chamfer matching using bony structures, and mutual information using intensity patterns. In all registration algorithms, a minimum of 3 pairs of identical anatomical and preferably noncoplanar landmarks is used on each of the 2 image sets. In manual registration, the program registers these points and links the image sets using a 3-dimensional (3D) transformation. In automated registration, the 3 landmarks are used as an initial starting point and further processing is done to complete the registration. Using our registration packages, registration of CT and MRI was performed on 10 patients. We scored the results of each registration set based on the amount of time spent, the accuracy reported by the software, and a final evaluation. We evaluated each software program by measuring the residual error between 'matched' points on the right and left globes and the posterior fossa for fused image slices. In general, manual registration showed higher misalignment between corresponding points than automated registration using intensity matching. This error had no directional dependence and was, most of the time, larger for larger structures in both registration techniques. The automated algorithm based on intensity matching also gave the best registration accuracy, irrespective of whether or not the initial landmarks were chosen carefully, compared with the bone-matching algorithm. The intensity-matching algorithm required the least amount of user time and provided better accuracy.
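
    The similarity measure behind the best-performing automated algorithm above is mutual information between the intensity patterns of the two modalities; a registration maximizes it over transform parameters. A minimal histogram-based estimate is sketched below on synthetic, intensity-remapped "CT" and "MR" slices; misaligning the pair lowers the score.

      import numpy as np

      def mutual_information(img_a, img_b, bins=32):
          """Mutual information between two co-registered images, estimated from
          their joint intensity histogram."""
          hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
          pxy = hist / hist.sum()
          px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

      # Stand-ins for an MR and a CT slice: related but differently mapped intensities.
      rng = np.random.default_rng(11)
      mr = rng.normal(size=(64, 64))
      ct = np.tanh(mr) + rng.normal(scale=0.3, size=mr.shape)   # nonlinear, noisy relation
      shifted = np.roll(ct, 5, axis=1)                          # misaligned version
      print("MI aligned %.3f vs shifted %.3f" % (mutual_information(mr, ct),
                                                 mutual_information(mr, shifted)))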

  10. A Comparison of a Brain-Based Adaptive System and a Manual Adaptable System for Invoking Automation

    NASA Technical Reports Server (NTRS)

    Bailey, Nathan R.; Scerbo, Mark W.; Freeman, Frederick G.; Mikulka, Peter J.; Scott, Lorissa A.

    2004-01-01

    Two experiments are presented that examine alternative methods for invoking automation. In each experiment, participants were asked to perform simultaneously a monitoring task and a resource management task as well as a tracking task that changed between automatic and manual modes. The monitoring task required participants to detect failures of an automated system to correct aberrant conditions under either high or low system reliability. Performance on each task was assessed as well as situation awareness and subjective workload. In the first experiment, half of the participants worked with a brain-based system that used their EEG signals to switch the tracking task between automatic and manual modes. The remaining participants were yoked to participants from the adaptive condition and received the same schedule of mode switches, but their EEG had no effect on the automation. Within each group, half of the participants were assigned to either the low or high reliability monitoring task. In addition, within each combination of automation invocation and system reliability, participants were separated into high and low complacency potential groups. The results revealed no significant effects of automation invocation on the performance measures; however, the high complacency individuals demonstrated better situation awareness when working with the adaptive automation system. The second experiment was the same as the first with one important exception. Automation was invoked manually. Thus, half of the participants pressed a button to invoke automation for 10 s. The remaining participants were yoked to participants from the adaptable condition and received the same schedule of mode switches, but they had no control over the automation. The results showed that participants who could invoke automation performed more poorly on the resource management task and reported higher levels of subjective workload. Further, those who invoked automation more frequently performed

  16. Automated brain tumor segmentation using spatial accuracy-weighted hidden Markov Random Field.

    PubMed

    Nie, Jingxin; Xue, Zhong; Liu, Tianming; Young, Geoffrey S; Setayesh, Kian; Guo, Lei; Wong, Stephen T C

    2009-09-01

    A variety of algorithms have been proposed for brain tumor segmentation from multi-channel sequences; however, most of them require isotropic or pseudo-isotropic resolution of the MR images. Although co-registration and interpolation of low-resolution sequences, such as T2-weighted images, onto the space of a high-resolution image, such as a T1-weighted image, can be performed prior to segmentation, the results are usually limited by partial volume effects due to interpolation of the low-resolution images. To improve the quality of tumor segmentation in clinical applications, where low-resolution sequences are commonly used together with high-resolution images, we propose a Spatial accuracy-weighted Hidden Markov random field and Expectation maximization (SHE) approach for both automated tumor and enhanced-tumor segmentation. SHE incorporates the spatial interpolation accuracy of the low-resolution images into the optimization procedure of the Hidden Markov Random Field (HMRF) to segment tumor using multi-channel MR images with different resolutions, e.g., high-resolution T1-weighted and low-resolution T2-weighted images. In experiments, we evaluated this algorithm using a set of simulated multi-channel brain MR images with known ground-truth tissue segmentation and also applied it to a dataset of MR images obtained during clinical trials of brain tumor chemotherapy. The results show that more accurate tumor segmentation can be obtained compared with conventional multi-channel segmentation algorithms. PMID:19446435
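
    As a deliberately simplified sketch of the weighting idea only (it omits the HMRF spatial prior and is not the authors' SHE implementation), the EM loop below down-weights each channel's contribution per voxel according to an assumed interpolation-confidence map.

```python
import numpy as np

def weighted_em(X, W, n_classes=3, n_iter=30, eps=1e-6):
    """EM for a diagonal Gaussian mixture with per-voxel, per-channel weights.

    X: (N, C) voxel intensities from C co-registered channels.
    W: (N, C) confidence weights in [0, 1] (assumed: high for the native
       high-resolution channel, lower where a low-resolution channel was
       heavily interpolated).
    """
    N, C = X.shape
    rng = np.random.default_rng(0)
    mu = X[rng.choice(N, n_classes, replace=False)]        # (K, C) class means
    var = np.tile(X.var(0) + eps, (n_classes, 1))
    pi = np.full(n_classes, 1.0 / n_classes)
    for _ in range(n_iter):
        # E-step: confidence-weighted log-likelihood of each voxel under each class.
        ll = np.zeros((N, n_classes))
        for k in range(n_classes):
            logp = -0.5 * ((X - mu[k]) ** 2 / var[k] + np.log(2 * np.pi * var[k]))
            ll[:, k] = np.log(pi[k]) + (W * logp).sum(1)
        r = np.exp(ll - ll.max(1, keepdims=True))
        r /= r.sum(1, keepdims=True)                       # responsibilities
        # M-step: weight both by responsibility and channel confidence.
        for k in range(n_classes):
            wk = r[:, [k]] * W                             # (N, C)
            mu[k] = (wk * X).sum(0) / (wk.sum(0) + eps)
            var[k] = (wk * (X - mu[k]) ** 2).sum(0) / (wk.sum(0) + eps) + eps
        pi = r.mean(0)
    return r.argmax(1), mu
```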

  18. Automated segmentation of brain ventricles in unenhanced CT of patients with ischemic stroke

    NASA Astrophysics Data System (ADS)

    Qian, Xiaohua; Wang, Jiahui; Li, Qiang

    2013-02-01

    We are developing an automated method for detection and quantification of ischemic stroke in computed tomography (CT). Ischemic stroke regions often abut the brain ventricles; ventricular segmentation is therefore an important and difficult task when stroke is present, and it is the topic of this study. We first corrected the inclination angle of the brain by aligning the midline of the brain with the vertical centerline of each slice. We then estimated the intensity range of the ventricles by use of the k-means method. Two segmentations of the ventricles were obtained by use of a thresholding technique: one containing the ventricles and nearby stroke regions, the other mainly the ventricles. The stroke regions could therefore be extracted and removed using an image difference technique. An adaptive template-matching algorithm was employed to identify objects in the aforementioned segmentation, and the largest connected component was identified and considered to be the ventricles. We applied our method to 25 unenhanced CT scans with stroke. Our method achieved an average Dice index, sensitivity, and specificity of 95.1%, 97.0%, and 99.8% for the entire ventricular regions. The experimental results demonstrated that the proposed method has great potential in the detection and quantification of stroke and other neurologic diseases.
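
    A rough sketch of the intensity-estimation, thresholding, and connected-component steps described above, assuming a pre-computed brain mask; the midline correction, stroke removal by image differencing, and adaptive template matching are not reproduced here.

```python
import numpy as np
from scipy import ndimage
from sklearn.cluster import KMeans

def ventricle_mask(ct_slice, brain_mask, n_clusters=3):
    """Binary ventricle estimate for one unenhanced CT slice (brain_mask: boolean)."""
    vals = ct_slice[brain_mask].reshape(-1, 1).astype(float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(vals)
    csf_label = np.argmin(km.cluster_centers_.ravel())     # darkest cluster ~ CSF
    upper = vals[km.labels_ == csf_label].max()             # upper bound of CSF range
    candidate = brain_mask & (ct_slice <= upper)
    labeled, n = ndimage.label(candidate)
    if n == 0:
        return np.zeros_like(brain_mask)
    sizes = ndimage.sum(candidate, labeled, index=np.arange(1, n + 1))
    return labeled == (np.argmax(sizes) + 1)                # keep largest component
```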

  19. A framework to support automated classification and labeling of brain electromagnetic patterns.

    PubMed

    Frishkoff, Gwen A; Frank, Robert M; Rong, Jiawei; Dou, Dejing; Dien, Joseph; Halderman, Laura K

    2007-01-01

    This paper describes a framework for automated classification and labeling of patterns in electroencephalographic (EEG) and magnetoencephalographic (MEG) data. We describe recent progress on four goals: 1) specification of rules and concepts that capture expert knowledge of event-related potentials (ERP) patterns in visual word recognition; 2) implementation of rules in an automated data processing and labeling stream; 3) data mining techniques that lead to refinement of rules; and 4) iterative steps towards system evaluation and optimization. This process combines top-down, or knowledge-driven, methods with bottom-up, or data-driven, methods. As illustrated here, these methods are complementary and can lead to development of tools for pattern classification and labeling that are robust and conceptually transparent to researchers. The present application focuses on patterns in averaged EEG (ERP) data. We also describe efforts to extend our methods to represent patterns in MEG data, as well as EM patterns in source (anatomical) space. The broader aim of this work is to design an ontology-based system to support cross-laboratory, cross-paradigm, and cross-modal integration of brain functional data. Tools developed for this project are implemented in MATLAB and are freely available on request. PMID:18301711
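
    To make the knowledge-driven side concrete, here is a toy rule of the kind such a system might encode (latency window, polarity, and scalp region); the thresholds and pattern names below are illustrative assumptions, not the authors' ontology or rule set.

```python
# Illustrative knowledge-driven rules: each rule constrains the latency window (ms),
# the polarity of the peak, and the scalp region where the pattern is expected.
RULES = [
    {"label": "P100", "latency": (70, 130),  "positive": True,  "region": "occipital"},
    {"label": "N170", "latency": (140, 200), "positive": False, "region": "occipito-temporal"},
    {"label": "P300", "latency": (250, 500), "positive": True,  "region": "parietal"},
]

def label_peak(latency_ms, amplitude_uv, region):
    """Return the first rule label matching a measured ERP peak, or None."""
    for rule in RULES:
        lo, hi = rule["latency"]
        if lo <= latency_ms <= hi and (amplitude_uv > 0) == rule["positive"] \
                and region == rule["region"]:
            return rule["label"]
    return None

print(label_peak(320, 4.2, "parietal"))   # -> P300
```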

  20. Automated Detection of Brain Abnormalities in Neonatal Hypoxia Ischemic Injury from MR Images

    PubMed Central

    Ghosh, Nirmalya; Sun, Yu; Bhanu, Bir; Ashwal, Stephen; Obenaus, Andre

    2014-01-01

    We compared the efficacy of three automated brain injury detection methods, namely symmetry-integrated region growing (SIRG), hierarchical region splitting (HRS), and modified watershed segmentation (MWS), in human and animal magnetic resonance imaging (MRI) datasets for the detection of hypoxic ischemic injuries (HII). Diffusion weighted imaging (DWI, 1.5T) data from neonatal arterial ischemic stroke (AIS) patients, as well as T2-weighted imaging (T2WI, 11.7T, 4.7T) at seven different time-points (1, 4, 7, 10, 17, 24 and 31 days post HII) in a rat-pup model of hypoxic ischemic injury, were used to check the temporal efficacy of our computational approaches. Sensitivity, specificity, and similarity were used as performance metrics, based on manual (‘gold standard’) injury detection, to quantify comparisons. When compared to the manual gold standard, automated injury location results from SIRG performed best in 62% of the data, compared with 29% for HRS and 9% for MWS. Injury severity detection revealed that SIRG performed best in 67% of cases and HRS in 33%. Prior information is required by HRS and MWS, but not by SIRG; however, SIRG is sensitive to parameter tuning, while HRS and MWS are not. Among these methods, SIRG performs best in detecting lesion volumes; HRS is the most robust, while MWS lags behind in both respects. PMID:25000294
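
    As a crude illustration of the symmetry idea underlying SIRG (not the published algorithm), one can mirror a slice about the midline and keep regions whose left-right intensity difference is large; the z-score threshold and minimum size below are arbitrary assumptions.

```python
import numpy as np
from scipy import ndimage

def asymmetry_candidates(slice_img, brain_mask, z_thresh=2.0, min_size=50):
    """Flag voxels much brighter than their mirrored counterparts (brain_mask: boolean)."""
    mirrored = np.fliplr(slice_img)                # assumes midline near the centre column
    diff = (slice_img - mirrored) * brain_mask
    z = (diff - diff[brain_mask].mean()) / (diff[brain_mask].std() + 1e-6)
    cand = (z > z_thresh) & brain_mask
    labeled, n = ndimage.label(cand)
    sizes = ndimage.sum(cand, labeled, index=np.arange(1, n + 1))
    return np.isin(labeled, np.flatnonzero(sizes >= min_size) + 1)
```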

  1. Automated segmentation of in vivo and ex vivo mouse brain magnetic resonance images.

    PubMed

    Scheenstra, Alize E H; van de Ven, Rob C G; van der Weerd, Louise; van den Maagdenberg, Arn M J M; Dijkstra, Jouke; Reiber, Johan H C

    2009-01-01

    Segmentation of magnetic resonance imaging (MRI) data is required for many applications, such as the comparison of different structures or time points, and for annotation purposes. Currently, the gold standard for automated image segmentation is nonlinear atlas-based segmentation. However, these methods are either not sufficient or highly time consuming for mouse brains, owing to the low signal to noise ratio and low contrast between structures compared with other applications. We present a novel generic approach to reduce processing time for segmentation of various structures of mouse brains, in vivo and ex vivo. The segmentation consists of a rough affine registration to a template followed by a clustering approach to refine the rough segmentation near the edges. Compared with manual segmentations, the presented segmentation method has an average kappa index of 0.7 for 7 of 12 structures in in vivo MRI and 11 of 12 structures in ex vivo MRI. Furthermore, we found that these results were equal to the performance of a nonlinear segmentation method, but with the advantage of being 8 times faster. The presented automatic segmentation method is quick and intuitive and can be used for image registration, volume quantification of structures, and annotation. PMID:19344574

  2. Three validation metrics for automated probabilistic image segmentation of brain tumours

    PubMed Central

    Zou, Kelly H.; Wells, William M.; Kikinis, Ron; Warfield, Simon K.

    2005-01-01

    The validity of brain tumour segmentation is an important issue in image processing because it has a direct impact on surgical planning. We examined segmentation accuracy based on three two-sample validation metrics against an estimated composite latent gold standard, which was derived from several experts’ manual segmentations by an EM algorithm. The distribution functions of the tumour and control pixel data were parametrically assumed to be a mixture of two beta distributions with different shape parameters. We estimated the corresponding receiver operating characteristic curve, Dice similarity coefficient, and mutual information over all possible decision thresholds. Based on each validation metric, an optimal threshold was then computed via maximization. We illustrated these methods on MR imaging data from nine brain tumour cases of three different tumour types, each consisting of a large number of pixels. The automated segmentation yielded satisfactory accuracy with varied optimal thresholds. The performances of these validation metrics were also investigated via Monte Carlo simulation. Extensions incorporating spatial correlation structures using a Markov random field model were considered. PMID:15083482
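
    A small sketch of how such threshold-dependent validation can be computed for a probabilistic segmentation against a binary gold standard; it uses empirical counts rather than the paper's beta-mixture model, and the toy data are hypothetical.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def dice(a, b):
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum() + 1e-9)

def best_threshold(prob_map, gold, thresholds=np.linspace(0.05, 0.95, 19)):
    """Pick the decision threshold maximising the Dice similarity coefficient."""
    scores = [dice(prob_map >= t, gold) for t in thresholds]
    i = int(np.argmax(scores))
    auc = roc_auc_score(gold.ravel().astype(int), prob_map.ravel())
    return thresholds[i], scores[i], auc

# Hypothetical data: a noisy probabilistic tumour map and its binary gold standard.
rng = np.random.default_rng(1)
gold = np.zeros((64, 64), bool); gold[20:40, 25:45] = True
prob = np.clip(gold * 0.8 + rng.normal(0, 0.15, gold.shape), 0, 1)
print(best_threshold(prob, gold))
```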

  3. SU-D-BRD-06: Automated Population-Based Planning for Whole Brain Radiation Therapy

    SciTech Connect

    Schreibmann, E; Fox, T; Crocker, I; Shu, H

    2014-06-01

    Purpose: Treatment planning for whole brain radiation treatment is technically a simple process, but in practice it consumes valuable clinical time on repetitive and tedious tasks. This report presents a method that automatically segments the relevant target and normal tissues and creates a treatment plan within a few minutes after patient simulation. Methods: Segmentation is performed automatically through morphological operations on the soft tissue. The treatment plan is generated by searching a database of previous cases for patients with similar anatomy. In this search, each database case is ranked in terms of similarity using a customized metric designed for sensitivity by including only geometrical changes that affect the dose distribution. The database case with the best match is automatically modified to replace the relevant patient information and isocenter position while maintaining the original beam and MLC settings. Results: Fifteen patients were used to validate the method. In each of these cases the anatomy was accurately segmented, with mean Dice coefficients of 0.970 ± 0.008 for the brain, 0.846 ± 0.009 for the eyes, and 0.672 ± 0.111 for the lens as compared to clinical segmentations. Each case was then matched against a database of 70 validated treatment plans, and the best-matching plan (termed the auto-plan) was compared retrospectively with the clinical plans in terms of brain coverage and maximum doses to critical structures. Maximum doses were reduced by up to 20.809 Gy for the left eye (mean 3.533 Gy), by 13.352 Gy (mean 1.311 Gy) for the right eye, and by 27.471 Gy (mean 4.856 Gy) and 25.218 Gy (mean 6.315 Gy) for the left and right lens, respectively. Time from simulation to auto-plan was 3-4 minutes. Conclusion: Automated database-based matching is an alternative to classical treatment planning that improves quality while providing a cost-effective solution to planning by modifying previously validated plans to match a current patient's anatomy.
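
    A hedged sketch of the database-matching idea only: rank stored cases by a simple geometric similarity between the new patient's segmented anatomy and each stored case (here, the Dice overlap of centred brain masks). The function names and this particular metric are assumptions, not the report's customised sensitivity-oriented metric or its plan-transfer step.

```python
import numpy as np

def centred(mask):
    """Shift a 3-D binary mask so its centroid sits at the array centre."""
    idx = np.argwhere(mask)
    shift = (np.array(mask.shape) // 2) - idx.mean(0).round().astype(int)
    return np.roll(mask, shift, axis=(0, 1, 2))

def dice(a, b):
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum() + 1e-9)

def best_matching_case(new_brain, database):
    """database: list of (case_id, brain_mask) pairs from previously validated plans."""
    target = centred(new_brain)
    score, case_id = max((dice(target, centred(m)), cid) for cid, m in database)
    return case_id, score   # the matched case's beams/MLC settings would then be reused
```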

  4. Automated segmentation of ventricles from serial brain MRI for the quantification of volumetric changes associated with communicating hydrocephalus in patients with brain tumor

    NASA Astrophysics Data System (ADS)

    Pura, John A.; Hamilton, Allison M.; Vargish, Geoffrey A.; Butman, John A.; Linguraru, Marius George

    2011-03-01

    Accurate ventricle volume estimates could improve the understanding and diagnosis of postoperative communicating hydrocephalus. For this category of patients, associated changes in ventricle volume can be difficult to identify, particularly over short time intervals. We present an automated segmentation algorithm that evaluates ventricle size from serial brain MRI examinations. The technique combines serial T1-weighted images to increase SNR and segments the mean image to generate a ventricle template. After pre-processing, the segmentation is initiated by a fuzzy c-means clustering algorithm to find the seeds used in a combination of fast marching methods and geodesic active contours. Finally, the ventricle template is propagated onto the serial data via non-linear registration. Serial volume estimates were obtained in an automated, robust, and accurate manner from difficult data.
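
    For reference, a compact fuzzy c-means implementation of the kind used here to seed the level-set stage; the cluster count and fuzzifier are assumptions.

```python
import numpy as np

def fuzzy_cmeans(x, n_clusters=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Fuzzy c-means on a 1-D intensity vector x; returns memberships and centres."""
    x = np.asarray(x, float).ravel()
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(1, keepdims=True)                     # fuzzy memberships, rows sum to 1
    for _ in range(n_iter):
        um = u ** m
        c = (um * x[:, None]).sum(0) / um.sum(0)     # cluster centres
        d = np.abs(x[:, None] - c) + 1e-9            # distances to centres
        inv = 1.0 / d ** (2.0 / (m - 1.0))
        u_new = inv / inv.sum(1, keepdims=True)      # standard FCM membership update
        if np.abs(u_new - u).max() < tol:
            u = u_new
            break
        u = u_new
    return u, c
```

    Voxels whose highest membership exceeds a strict cutoff (say 0.9) could then serve as seeds for the fast-marching and geodesic-active-contour stages.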

  5. Precise Anatomic Localization of Accumulated Lipids in Mfp2 Deficient Murine Brains Through Automated Registration of SIMS Images to the Allen Brain Atlas

    NASA Astrophysics Data System (ADS)

    Škrášková, Karolina; Khmelinskii, Artem; Abdelmoula, Walid M.; De Munter, Stephanie; Baes, Myriam; McDonnell, Liam; Dijkstra, Jouke; Heeren, Ron M. A.

    2015-06-01

    Mass spectrometry imaging (MSI) is a powerful tool for the molecular characterization of specific tissue regions. Histochemical staining provides anatomic information complementary to MSI data, and the combination of both modalities has been proven to be beneficial. However, direct comparison of histology-based and mass spectrometry-based molecular images can become problematic because of potential tissue damage or changes caused by different sample preparation. Curated atlases such as the Allen Brain Atlas (ABA) offer a collection of highly detailed and standardized anatomic information. Direct comparison of MSI brain data to the ABA allows conclusions to be drawn on the precise anatomic localization of the molecular signal. Here we applied secondary ion mass spectrometry (SIMS) imaging at high spatial resolution to study the brains of knock-out mouse models with impaired peroxisomal β-oxidation. The murine models lacked D-multifunctional protein (MFP2), which is involved in the degradation of very long chain fatty acids. SIMS imaging revealed deposits of fatty acids within distinct brain regions. Manual comparison of the MSI data with the histologic stains did not allow for an unequivocal anatomic identification of the fatty acid-rich regions. We further employed an automated pipeline for co-registration of the SIMS data to the ABA. The registration enabled precise anatomic annotation of the brain structures with the revealed lipid deposits. The precise anatomic localization allowed for a deeper insight into the pathology of Mfp2 deficient mouse models.

  6. Automated detection of cerebral microbleeds in patients with Traumatic Brain Injury.

    PubMed

    van den Heuvel, T L A; van der Eerden, A W; Manniesing, R; Ghafoorian, M; Tan, T; Andriessen, T M J C; Vande Vyvere, T; van den Hauwe, L; Ter Haar Romeny, B M; Goraj, B M; Platel, B

    2016-01-01

    In this paper a Computer Aided Detection (CAD) system is presented to automatically detect Cerebral Microbleeds (CMBs) in patients with Traumatic Brain Injury (TBI). It is believed that the presence of CMBs has clinical prognostic value in TBI patients. To study the contribution of CMBs in patient outcome, accurate detection of CMBs is required. Manual detection of CMBs in TBI patients is a time consuming task that is prone to errors, because CMBs are easily overlooked and are difficult to distinguish from blood vessels. This study included 33 TBI patients. Because of the laborious nature of manually annotating CMBs, only one trained expert manually annotated the CMBs in all 33 patients. A subset of ten TBI patients was annotated by six experts. Our CAD system makes use of both Susceptibility Weighted Imaging (SWI) and T1 weighted magnetic resonance images to detect CMBs. After pre-processing these images, a two-step approach was used for automated detection of CMBs. In the first step, each voxel was characterized by twelve features based on the dark and spherical nature of CMBs and a random forest classifier was used to identify CMB candidate locations. In the second step, segmentations were made from each identified candidate location. Subsequently an object-based classifier was used to remove false positive detections of the voxel classifier, by considering seven object-based features that discriminate between spherical objects (CMBs) and elongated objects (blood vessels). A guided user interface was designed for fast evaluation of the CAD system result. During this process, an expert checked each CMB detected by the CAD system. A Fleiss' kappa value of only 0.24 showed that the inter-observer variability for the TBI patients in this study was very large. An expert using the guided user interface reached an average sensitivity of 93%, which was significantly higher (p = 0.03) than the average sensitivity of 77% (sd 12.4%) that the six experts manually detected
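
    The following sketch shows only the flavour of the first, voxel-level step: a handful of generic intensity features stand in for the twelve features described, and the training labels are assumed to come from the expert annotations. It is not the published CAD system.

```python
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def voxel_features(swi, t1):
    """Stack simple per-voxel features from co-registered SWI and T1 volumes."""
    feats = [
        swi, t1,
        ndimage.minimum_filter(swi, size=3),          # CMBs are small dark blobs on SWI
        swi - ndimage.uniform_filter(swi, size=7),    # local contrast
        ndimage.gaussian_laplace(swi, sigma=1.0),     # blob-ness
    ]
    return np.stack([f.ravel() for f in feats], axis=1)

def train_candidate_detector(swi, t1, cmb_mask):
    """cmb_mask: expert-annotated CMB voxels used as positive training labels."""
    X, y = voxel_features(swi, t1), cmb_mask.ravel().astype(int)
    clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                                 n_jobs=-1, random_state=0)
    clf.fit(X, y)
    return clf   # candidate locations: voxels with high predicted probability
```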

  8. Comparison of automated brain segmentation using a brain phantom and patients with early Alzheimer's dementia or mild cognitive impairment.

    PubMed

    Fellhauer, Iven; Zöllner, Frank G; Schröder, Johannes; Degen, Christina; Kong, Li; Essig, Marco; Thomann, Philipp A; Schad, Lothar R

    2015-09-30

    Magnetic resonance imaging (MRI) and brain volumetry allow for the quantification of changes in brain volume using automatic algorithms which are widely used in both clinical and scientific studies. However, studies comparing the reliability of these programmes are scarce and have mainly involved MRI data from younger healthy controls. This study evaluates the reliability of frequently used segmentation programmes (SPM, FreeSurfer, FSL) using a realistic digital brain phantom and MRI brain acquisitions from patients with manifest Alzheimer's disease (AD, n=34), mild cognitive impairment (MCI, n=60), and healthy subjects (n=32) matched for age and sex. Analysis of the brain phantom dataset demonstrated that SPM, FSL and FreeSurfer underestimate grey matter and overestimate white matter volumes with increasing noise. FreeSurfer calculated overall smaller brain volumes with increasing noise. Image inhomogeneity had only minor, non-significant effects on the results obtained with SPM and FreeSurfer 5.1, but did affect the FSL results (increased white matter volumes with decreased grey matter volumes). The analysis of the patient data yielded decreasing volumes of grey and white matter with progression of brain atrophy, independent of the method used. FreeSurfer calculated the largest grey matter and the smallest white matter volumes. FSL calculated the smallest grey matter volumes; SPM the largest white matter volumes. The best results are obtained with good image quality. With poor image quality, especially noise, SPM provides the best segmentation results. An optimised template for segmentation had no significant effect on segmentation results. While our findings underline the applicability of the programmes investigated, SPM may be the programme of choice when MRIs with limited image quality or brain images of elderly subjects are to be analysed. PMID:26211622

  9. kNN-based multi-spectral MRI brain tissue classification: manual training versus automated atlas-based training

    NASA Astrophysics Data System (ADS)

    Vrooman, Henri A.; Cocosco, Chris A.; Stokking, Rik; Ikram, M. Arfan; Vernooij, Meike W.; Breteler, Monique M.; Niessen, Wiro J.

    2006-03-01

    Conventional k-Nearest-Neighbor (kNN) classification, which has been successfully applied to classify brain tissue, requires laborious training on manually labeled subjects. In this work, the performance of kNN-based segmentation of gray matter (GM), white matter (WM) and cerebrospinal fluid (CSF) using manual training is compared with a new method in which training is automated using an atlas. From 12 subjects, standard T2 and PD scans and a high-resolution, high-contrast scan (Siemens T1-weighted HASTE sequence with reverse contrast) were used as feature sets. For the conventional kNN method, manual segmentations were used for training, and classifications were evaluated in a leave-one-out study. Performance was studied as a function of the number of samples per tissue and of k. For fully automated training, scans were registered to a probabilistic brain atlas. Initial training samples were randomly selected per tissue based on a threshold on the tissue probability, and were then processed to keep the most reliable samples. Performance was studied while varying the threshold on the tissue probability. Classification results of both methods were validated by measuring the percentage overlap (SI). For conventional kNN classification, varying the number of training samples did not result in significant differences, while increasing k gave significantly better results. In the method using automated training, there is an overestimation of GM at the expense of CSF at higher thresholds on the tissue probability maps. The difference between the conventional method (k=45) and the observers was not significantly larger than inter-observer variability for all tissue types. The automated method performed slightly worse: it performed equal to the observers for WM, and less well for CSF and GM. From these results it can be concluded that conventional kNN classification may replace manual segmentation, and that atlas-based kNN segmentation has strong
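
    A condensed sketch of the atlas-trained variant: sample training voxels where a registered atlas tissue-probability map exceeds a threshold, then classify the remaining brain voxels with kNN. The threshold, sample count, and feature layout are assumptions; k=45 mirrors the value quoted for the conventional method above.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def atlas_trained_knn(features, prob_maps, brain_mask, thr=0.9, k=45,
                      n_per_tissue=2000, seed=0):
    """features: (N, F) per-voxel feature vectors (e.g. T2, PD, HASTE intensities).
    prob_maps: (N, T) registered atlas probabilities for T tissue classes.
    brain_mask: (N,) boolean mask of brain voxels."""
    rng = np.random.default_rng(seed)
    X_train, y_train = [], []
    for t in range(prob_maps.shape[1]):
        idx = np.flatnonzero((prob_maps[:, t] >= thr) & brain_mask)
        idx = rng.choice(idx, size=min(n_per_tissue, len(idx)), replace=False)
        X_train.append(features[idx])
        y_train.append(np.full(len(idx), t))
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(np.concatenate(X_train), np.concatenate(y_train))
    labels = np.full(len(features), -1)
    labels[brain_mask] = knn.predict(features[brain_mask])
    return labels
```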

  10. Skeleton-based region competition for automated gray matter and white matter segmentation of human brain MR images

    NASA Astrophysics Data System (ADS)

    Chu, Yong; Chen, Ya-Fang; Su, Min-Ying; Nalcioglu, Orhan

    2005-04-01

    Image segmentation is an essential process for quantitative analysis. Segmentation of brain tissues in magnetic resonance (MR) images is very important for understanding the structural-functional relationship in various pathological conditions, such as dementia vs. normal brain aging. Different brain regions are responsible for certain functions and may have specific implications for diagnosis; segmentation may therefore facilitate the analysis of different brain regions to aid in early diagnosis. Region competition has recently been proposed as an effective method for image segmentation by minimizing a generalized Bayes/MDL criterion. However, it is sensitive to initial conditions (the "seeds"), so an optimal choice of seeds is necessary for accurate segmentation. In this paper, we present a new skeleton-based region competition algorithm for automated gray and white matter segmentation. Skeletons can be considered good "seed regions" since they provide morphological a priori information and thus guarantee a correct initial condition. Intensity gradient information is also added to the global energy function to achieve precise boundary localization. This algorithm was applied to gray and white matter segmentation using simulated MRI images from a realistic digital brain phantom. Nine different brain regions were manually outlined to evaluate performance in these separate regions. The results were compared to the gold-standard measure to calculate the true positive and true negative percentages. In general, this method worked well, with 96% accuracy, although the performance varied across regions. We conclude that skeleton-based region competition is an effective method for gray and white matter segmentation.
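
    A minimal illustration of the seeding idea only: skeletonise rough tissue estimates and grow labels from those skeletons. A marker-based watershed on the intensity gradient stands in here for the Bayes/MDL region competition itself, which is not reproduced.

```python
import numpy as np
from skimage.morphology import skeletonize
from skimage.segmentation import watershed
from skimage.filters import sobel

def skeleton_seeded_labels(slice_img, rough_wm, rough_gm):
    """Grow GM/WM labels from skeleton seeds of crude boolean tissue estimates."""
    markers = np.zeros(slice_img.shape, dtype=int)
    markers[skeletonize(rough_wm)] = 1            # white-matter seed skeleton
    markers[skeletonize(rough_gm)] = 2            # gray-matter seed skeleton
    edges = sobel(slice_img)                      # intensity gradient guides boundaries
    return watershed(edges, markers, mask=(rough_wm | rough_gm))
```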

  11. Implementation of talairach atlas based automated brain segmentation for radiation therapy dosimetry.

    PubMed

    Popple, R A; Griffith, H R; Sawrie, S M; Fiveash, J B; Brezovich, I A

    2006-02-01

    Radiotherapy for brain cancer inevitably results in irradiation of uninvolved brain. While it has been demonstrated that irradiation of the brain can result in cognitive deficits, dose-volume relationships are not well established. There is little work correlating a particular cognitive deficit with dose received by the region of the brain responsible for the specific cognitive function. One obstacle to such studies is that identification of brain anatomy is both labor intensive and dependent on the individual performing the segmentation. Automatic segmentation has the potential to be both efficient and consistent. Brains2 is a software package developed by the University of Iowa for MRI volumetric studies. It utilizes MR images, the Talairach atlas, and an artificial neural network (ANN) to segment brain images into substructures in a standardized manner. We have developed a software package, Brains2DICOM, that converts the regions of interest identified by Brains2 into a DICOM radiotherapy structure set. The structure set can be imported into a treatment planning system for dosimetry. We demonstrated the utility of Brains2DICOM using a test case, a 34-year-old man with diffuse astrocytoma treated with three-dimensional conformal radiotherapy. Brains2 successfully applied the Talairach atlas to identify the right and left frontal, parietal, temporal, occipital, subcortical, and cerebellum regions. Brains2 was not successful in applying the ANN to identify small structures, such as the hippocampus and caudate. Further work is necessary to revise the ANN or to develop new methods for identification of small structures in the presence of disease and radiation induced changes. The segmented regions-of-interest were transferred to our commercial treatment planning system using DICOM and dose-volume histograms were constructed. This method will facilitate the acquisition of data necessary for the development of normal tissue complication probability (NTCP) models that

  12. Brain-Wide Mapping of Axonal Connections: Workflow for Automated Detection and Spatial Analysis of Labeling in Microscopic Sections.

    PubMed

    Papp, Eszter A; Leergaard, Trygve B; Csucs, Gergely; Bjaalie, Jan G

    2016-01-01

    Axonal tracing techniques are powerful tools for exploring the structural organization of neuronal connections. Tracers such as biotinylated dextran amine (BDA) and Phaseolus vulgaris leucoagglutinin (Pha-L) allow brain-wide mapping of connections through analysis of large series of histological section images. We present a workflow for efficient collection and analysis of tract-tracing datasets with a focus on newly developed modules for image processing and assignment of anatomical location to tracing data. New functionality includes automatic detection of neuronal labeling in large image series, alignment of images to a volumetric brain atlas, and analytical tools for measuring the position and extent of labeling. To evaluate the workflow, we used high-resolution microscopic images from axonal tracing experiments in which different parts of the rat primary somatosensory cortex had been injected with BDA or Pha-L. Parameters from a set of representative images were used to automate detection of labeling in image series covering the entire brain, resulting in binary maps of the distribution of labeling. For high to medium labeling densities, automatic detection was found to provide reliable results when compared to manual analysis, whereas weak labeling required manual curation for optimal detection. To identify brain regions corresponding to labeled areas, section images were aligned to the Waxholm Space (WHS) atlas of the Sprague Dawley rat brain (v2) by custom-angle slicing of the MRI template to match individual sections. Based on the alignment, WHS coordinates were obtained for labeled elements and transformed to stereotaxic coordinates. The new workflow modules increase the efficiency and reliability of labeling detection in large series of images from histological sections, and enable anchoring to anatomical atlases for further spatial analysis and comparison with other data. PMID:27148038
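
    A generic illustration of turning a stained section image into a binary labeling map with a global threshold tuned on representative images; the scaling factor is an assumption, and the workflow's actual detection parameters and atlas-anchoring steps are not shown.

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.filters import threshold_otsu
from skimage.morphology import remove_small_objects

def labeling_map(rgb_section, scale=0.9, min_size=20):
    """Binary map of dark (labeled) pixels in a bright-field section image."""
    gray = rgb2gray(rgb_section)
    thr = threshold_otsu(gray) * scale      # 'scale' would be tuned on representative images
    mask = gray < thr                       # tracer labeling appears darker than background
    return remove_small_objects(mask, min_size=min_size)
```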

  14. Control of a Wheelchair in an Indoor Environment Based on a Brain-Computer Interface and Automated Navigation.

    PubMed

    Zhang, Rui; Li, Yuanqing; Yan, Yongyong; Zhang, Hao; Wu, Shaoyu; Yu, Tianyou; Gu, Zhenghui

    2016-01-01

    The concept of controlling a wheelchair using brain signals is promising. However, the continuous control of a wheelchair based on unstable and noisy electroencephalogram signals is unreliable and generates a significant mental burden for the user. A feasible solution is to integrate a brain-computer interface (BCI) with automated navigation techniques. This paper presents a brain-controlled intelligent wheelchair with the capability of automatic navigation. Using an autonomous navigation system, candidate destinations and waypoints are automatically generated based on the existing environment. The user selects a destination using a motor imagery (MI)-based or P300-based BCI. According to the determined destination, the navigation system plans a short and safe path and navigates the wheelchair to the destination. During the movement of the wheelchair, the user can issue a stop command with the BCI. Using our system, the mental burden of the user can be substantially alleviated. Furthermore, our system can adapt to changes in the environment. Two experiments based on MI and P300 were conducted to demonstrate the effectiveness of our system. PMID:26054072

  16. Quantification of Human Brain Metabolites from in Vivo 1H NMR Magnitude Spectra Using Automated Artificial Neural Network Analysis

    NASA Astrophysics Data System (ADS)

    Hiltunen, Yrjö; Kaartinen, Jouni; Pulkkinen, Juhani; Häkkinen, Anna-Maija; Lundbom, Nina; Kauppinen, Risto A.

    2002-01-01

    Long echo time (TE=270 ms) in vivo proton NMR spectra resembling human brain metabolite patterns were simulated for lineshape fitting (LF) and quantitative artificial neural network (ANN) analyses. A set of experimental in vivo 1H NMR spectra were first analyzed by the LF method to match the signal-to-noise ratios and linewidths of the simulated spectra to those in the experimental data. The performance of the constructed ANNs was compared for the peak area determinations of choline-containing compounds (Cho), total creatine (Cr), and N-acetyl aspartate (NAA) signals, using both manually phase-corrected and magnitude spectra as inputs. The peak area data from ANN and LF analyses for simulated spectra yielded high correlation coefficients, demonstrating that the peak areas quantified with the ANN gave results similar to those of the LF analysis. Thus, a fully automated ANN method based on magnitude spectra has demonstrated potential for quantification of in vivo metabolites from long echo time spectroscopic imaging.
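
    A toy version of the magnitude-spectrum-to-peak-area mapping: simulate noisy three-peak magnitude spectra with known areas and train a small feed-forward regressor to recover them. The peak positions, widths, noise level, and network size are arbitrary assumptions, and scikit-learn's MLPRegressor stands in for whatever ANN architecture the study used.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

PPM = np.linspace(0.5, 4.5, 256)

def simulate_spectrum(areas, width=0.06, noise=0.02, centres=(3.2, 3.0, 2.0), rng=None):
    """Magnitude spectrum with Gaussian peaks at Cho/Cr/NAA-like positions (assumed)."""
    rng = rng or np.random.default_rng()
    spec = sum(a * np.exp(-0.5 * ((PPM - c) / width) ** 2) for a, c in zip(areas, centres))
    spec = spec + rng.normal(0, noise, PPM.size) + 1j * rng.normal(0, noise, PPM.size)
    return np.abs(spec)

rng = np.random.default_rng(0)
Y = rng.uniform(0.5, 2.0, size=(2000, 3))                   # "true" Cho, Cr, NAA areas
X = np.array([simulate_spectrum(y, rng=rng) for y in Y])
ann = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X, Y)
print(ann.predict(X[:2]))                                   # sanity check on training data
print(Y[:2])
```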

  17. Automated fetal brain segmentation from 2D MRI slices for motion correction.

    PubMed

    Keraudren, K; Kuklisova-Murgasova, M; Kyriakopoulou, V; Malamateniou, C; Rutherford, M A; Kainz, B; Hajnal, J V; Rueckert, D

    2014-11-01

    Motion correction is a key element for imaging the fetal brain in-utero using Magnetic Resonance Imaging (MRI). Maternal breathing can introduce motion, but a larger effect is frequently due to fetal movement within the womb. Consequently, imaging is frequently performed slice-by-slice using single shot techniques, which are then combined into volumetric images using slice-to-volume reconstruction methods (SVR). For successful SVR, a key preprocessing step is to isolate fetal brain tissues from maternal anatomy before correcting for the motion of the fetal head. This has hitherto been a manual or semi-automatic procedure. We propose an automatic method to localize and segment the brain of the fetus when the image data is acquired as stacks of 2D slices with anatomy misaligned due to fetal motion. We combine this segmentation process with a robust motion correction method, enabling the segmentation to be refined as the reconstruction proceeds. The fetal brain localization process uses Maximally Stable Extremal Regions (MSER), which are classified using a Bag-of-Words model with Scale-Invariant Feature Transform (SIFT) features. The segmentation process is a patch-based propagation of the MSER regions selected during detection, combined with a Conditional Random Field (CRF). The gestational age (GA) is used to incorporate prior knowledge about the size and volume of the fetal brain into the detection and segmentation process. The method was tested in a ten-fold cross-validation experiment on 66 datasets of healthy fetuses whose GA ranged from 22 to 39 weeks. In 85% of the tested cases, our proposed method produced a motion corrected volume of a relevant quality for clinical diagnosis, thus removing the need for manually delineating the contours of the brain before motion correction. Our method automatically generated as a side-product a segmentation of the reconstructed fetal brain with a mean Dice score of 93%, which can be used for further processing.

  18. An automated pipeline for constructing personalized virtual brains from multimodal neuroimaging data.

    PubMed

    Schirner, Michael; Rothmeier, Simon; Jirsa, Viktor K; McIntosh, Anthony Randal; Ritter, Petra

    2015-08-15

    Large amounts of multimodal neuroimaging data are acquired every year worldwide. In order to extract high-dimensional information for computational neuroscience applications, standardized data fusion and efficient reduction into integrative data structures are required. Such self-consistent multimodal data sets can be used for computational brain modeling to constrain models with individual measurable features of the brain, as is done with The Virtual Brain (TVB). TVB is a simulation platform that uses empirical structural and functional data to build full brain models of individual humans. For convenient model construction, we developed a processing pipeline for structural, functional and diffusion-weighted magnetic resonance imaging (MRI) and optionally electroencephalography (EEG) data. The pipeline combines several state-of-the-art neuroinformatics tools to generate subject-specific cortical and subcortical parcellations, surface-tessellations, structural and functional connectomes, lead field matrices, electrical source activity estimates and region-wise aggregated blood oxygen level dependent (BOLD) functional MRI (fMRI) time-series. The output files of the pipeline can be directly uploaded to TVB to create and simulate individualized large-scale network models that incorporate intra- and intercortical interaction on the basis of cortical surface triangulations and white matter tractography. We detail the pitfalls of the individual processing streams and discuss ways of validation. With the pipeline we also introduce novel ways of estimating the transmission strengths of fiber tracts in whole-brain structural connectivity (SC) networks and compare the outcomes of different tractography or parcellation approaches. We tested the functionality of the pipeline on 50 multimodal data sets. In order to quantify the robustness of the connectome extraction part of the pipeline we computed several metrics of its rescan reliability and compared them to other
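
    One small, self-contained piece of such a pipeline, shown here purely as an illustration: aggregating voxel-wise BOLD time series into region-wise time series given a subject-specific parcellation volume. The file names in the usage comment are hypothetical.

```python
import numpy as np
import nibabel as nib

def regionwise_bold(fmri_path, parcellation_path):
    """Average the BOLD signal over each parcellation label.

    Returns (labels, ts) where ts has shape (n_regions, n_timepoints).
    """
    bold = nib.load(fmri_path).get_fdata()                        # (x, y, z, t)
    parc = nib.load(parcellation_path).get_fdata().astype(int)    # (x, y, z)
    labels = np.unique(parc[parc > 0])
    ts = np.stack([bold[parc == lab].mean(axis=0) for lab in labels])
    return labels, ts

# Hypothetical inputs, assumed already co-registered:
# labels, ts = regionwise_bold("sub-01_bold_in_T1w.nii.gz", "sub-01_parcellation.nii.gz")
```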

  19. Automated identification of brain tumors from single MR images based on segmentation with refined patient-specific priors.

    PubMed

    Sanjuán, Ana; Price, Cathy J; Mancini, Laura; Josse, Goulven; Grogan, Alice; Yamamoto, Adam K; Geva, Sharon; Leff, Alex P; Yousry, Tarek A; Seghier, Mohamed L

    2013-01-01

    Brain tumors can have different shapes or locations, making their identification very challenging. In functional MRI, it is not unusual that patients have only one anatomical image due to time and financial constraints. Here, we provide a modified automatic lesion identification (ALI) procedure which enables brain tumor identification from single MR images. Our method rests on (A) a modified segmentation-normalization procedure with an explicit "extra prior" for the tumor and (B) an outlier detection procedure for abnormal voxel (i.e., tumor) classification. To minimize tissue misclassification, the segmentation-normalization procedure requires prior information on the tumor location and extent. We therefore propose that ALI is run iteratively, so that the output of Step B is used as a patient-specific prior in Step A. We tested this procedure on real T1-weighted images from 18 patients, and the results were validated against two independent observers' manual tracings. The automated procedure identified the tumors successfully, with excellent agreement with the manual segmentation (area under the ROC curve = 0.97 ± 0.03). The proposed procedure increases the flexibility and robustness of the ALI tool and will be particularly useful for lesion-behavior mapping studies, or when lesion identification and/or spatial normalization are problematic. PMID:24381535

  1. Colorization and Automated Segmentation of Human T2 MR Brain Images for Characterization of Soft Tissues

    PubMed Central

    Attique, Muhammad; Gilanie, Ghulam; Hafeez-Ullah; Mehmood, Malik S.; Naweed, Muhammad S.; Ikram, Masroor; Kamran, Javed A.; Vitkin, Alex

    2012-01-01

    Characterization of tissues such as brain using magnetic resonance (MR) images, and colorization of the gray-scale image, have been reported in the literature, along with their advantages and drawbacks. Here, we present two independent methods: (i) a novel colorization method to underscore the variability in brain MR images, indicative of the underlying physical density of biological tissue, and (ii) a segmentation method (both hard and soft segmentation) to characterize gray-scale brain MR images. The segmented images are then transformed into color using the above-mentioned colorization method, yielding promising results for manual tracing. Our color transformation incorporates the voxel classification by matching the luminance of voxels of the source MR image and the provided color image by measuring the distance between them. The segmentation method is based on single-phase clustering for 2D and 3D image segmentation with a new automatic centroid selection method, which divides the image into three distinct regions (gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF)) using prior anatomical knowledge. Results have been successfully validated on human T2-weighted (T2) brain MR images. The proposed method can potentially be applied to gray-scale images from other imaging modalities, bringing out additional diagnostic tissue information contained in the colorized image as described. PMID:22479421

  2. An Automated and Intelligent Medical Decision Support System for Brain MRI Scans Classification.

    PubMed

    Siddiqui, Muhammad Faisal; Reza, Ahmed Wasif; Kanesan, Jeevan

    2015-01-01

    A wide interest has been observed in medical health care applications that interpret neuroimaging scans by machine learning systems. This research proposes an intelligent, automatic, accurate, and robust classification technique to classify a human brain magnetic resonance image (MRI) as normal or abnormal, in order to reduce human error when identifying diseases in brain MRIs. In this study, the fast discrete wavelet transform (DWT), principal component analysis (PCA), and a least squares support vector machine (LS-SVM) are used as the basic components. Firstly, the fast DWT is employed to extract the salient features of the brain MRI, followed by PCA, which reduces the dimensionality of the features. These reduced feature vectors also shrink memory storage consumption by 99.5%. Finally, an advanced classification technique based on the LS-SVM is applied to brain MR image classification using the reduced features. To improve efficiency, the LS-SVM is used with a non-linear radial basis function (RBF) kernel. The proposed algorithm intelligently determines the optimized values of the hyper-parameters of the RBF kernel and also applies k-fold stratified cross validation to enhance the generalization of the system. The method was tested on benchmark datasets of T1-weighted and T2-weighted scans from 340 patients. From the analysis of experimental results and performance comparisons, it is observed that the proposed medical decision support system outperformed all other modern classifiers and achieves a 100% accuracy rate (specificity/sensitivity 100%/100%). Furthermore, in terms of computation time, the proposed technique is significantly faster than recent well-known methods, improving efficiency by 71%, 3%, and 4% in the feature extraction, feature reduction, and classification stages, respectively. These results indicate that the proposed well-trained machine learning system has the potential to make accurate predictions about brain abnormalities from the
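
    A compact stand-in for the DWT-PCA-kernel-SVM pipeline using standard libraries. Note the substitutions and assumptions: scikit-learn's RBF-kernel SVC replaces the paper's LS-SVM, the wavelet, decomposition level, and retained-variance fraction are guesses, and no hyper-parameter optimisation is shown.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def dwt_features(slice_img, wavelet="haar", level=3):
    """Use the level-3 approximation coefficients as a compact feature vector."""
    approx = pywt.wavedec2(slice_img, wavelet, level=level)[0]
    return approx.ravel()

def build_classifier(slices, labels):
    """slices: list of 2-D brain MR slices; labels: 0 = normal, 1 = abnormal."""
    X = np.array([dwt_features(s) for s in slices])
    clf = make_pipeline(StandardScaler(), PCA(n_components=0.95), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, labels, cv=5)   # stratified k-fold by default
    return clf.fit(X, labels), scores
```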

  3. Brain tumor target volume determination for radiation therapy treatment planning through the use of automated MRI segmentation

    NASA Astrophysics Data System (ADS)

    Mazzara, Gloria Patrika

    Radiation therapy seeks to effectively irradiate the tumor cells while minimizing the dose to adjacent normal cells. Prior research found that the low success rates for treating brain tumors would be improved with higher radiation doses to the tumor area. This is feasible only if the target volume can be precisely identified. However, the definition of tumor volume is still based on time-intensive, highly subjective manual outlining by radiation oncologists. In this study the effectiveness of two automated Magnetic Resonance Imaging (MRI) segmentation methods, k-Nearest Neighbors (kNN) and Knowledge-Guided (KG), in determining the Gross Tumor Volume (GTV) of brain tumors for use in radiation therapy was assessed. Three criteria were applied: accuracy of the contours; quality of the resulting treatment plan in terms of dose to the tumor; and a novel treatment plan evaluation technique based on post-treatment images. The kNN method was able to segment all cases while the KG method was limited to enhancing tumors and gliomas with clear enhancing edges. Various software applications were developed to create a closed smooth contour that encompassed the tumor pixels from the segmentations and to integrate these results into the treatment planning software. A novel, probabilistic measurement of accuracy was introduced to compare the agreement of the segmentation methods with the weighted average physician volume. Both computer methods under-segment the tumor volume when compared with the physicians but performed within the variability of manual contouring (28% +/- 12% for inter-operator variability). Computer segmentations were modified vertically to compensate for their under-segmentation. When comparing radiation treatment plans designed from physician-defined tumor volumes with treatment plans developed from the modified segmentation results, the reference target volume was irradiated within the same level of conformity. Analysis of the plans based on post

  4. Fully Automated Segmentation of the Pons and Midbrain Using Human T1 MR Brain Images

    PubMed Central

    Nigro, Salvatore; Cerasa, Antonio; Zito, Giancarlo; Perrotta, Paolo; Chiaravalloti, Francesco; Donzuso, Giulia; Fera, Franceso; Bilotta, Eleonora; Pantano, Pietro; Quattrone, Aldo

    2014-01-01

    Purpose This paper describes a novel method to automatically segment the human brainstem into midbrain and pons, called LABS: Landmark-based Automated Brainstem Segmentation. LABS processes high-resolution structural magnetic resonance images (MRIs) according to a revised landmark-based approach integrated with a thresholding method, without manual interaction. Methods This method was first tested on morphological T1-weighted MRIs of 30 healthy subjects. Its reliability was further confirmed by including neurological patients (with Alzheimer's Disease) from the ADNI repository, in whom the presence of volumetric loss within the brainstem had been previously described. Segmentation accuracies were evaluated against expert-drawn manual delineation. To evaluate the quality of LABS segmentation we used volumetric, spatial overlap and distance-based metrics. Results The comparison between the quantitative measurements provided by LABS and the manual segmentations revealed excellent results in healthy controls for both the midbrain (DICE measures higher than 0.9; volume ratio around 1 and Hausdorff distance around 3) and the pons (DICE measures around 0.93; volume ratio ranging from 1.024 to 1.05 and Hausdorff distance around 2). Similar performance was observed in AD patients for segmentation of the pons (DICE measures higher than 0.93; volume ratio ranging from 0.97 to 0.98 and Hausdorff distance ranging from 1.07 to 1.33), while LABS performed less well for the midbrain (DICE measures ranging from 0.86 to 0.88; volume ratio around 0.95 and Hausdorff distance ranging from 1.71 to 2.15). Conclusions Our study represents the first attempt to validate a new fully automated method for in vivo segmentation of two anatomically complex brainstem subregions. We believe that our method may represent a useful tool for future applications in clinical practice. PMID:24489664

  5. BIANCA (Brain Intensity AbNormality Classification Algorithm): A new tool for automated segmentation of white matter hyperintensities.

    PubMed

    Griffanti, Ludovica; Zamboni, Giovanna; Khan, Aamira; Li, Linxin; Bonifacio, Guendalina; Sundaresan, Vaanathi; Schulz, Ursula G; Kuker, Wilhelm; Battaglini, Marco; Rothwell, Peter M; Jenkinson, Mark

    2016-11-01

    Reliable quantification of white matter hyperintensities of presumed vascular origin (WMHs) is increasingly needed, given the presence of these MRI findings in patients with several neurological and vascular disorders, as well as in healthy elderly subjects. We present BIANCA (Brain Intensity AbNormality Classification Algorithm), a fully automated, supervised method for WMH detection, based on the k-nearest neighbour (k-NN) algorithm. Relative to previous k-NN based segmentation methods, BIANCA offers options for weighting the spatial information, for local spatial intensity averaging, and for the choice of the number and location of the training points. BIANCA is multimodal and highly flexible, so that users can adapt the tool to their protocol and specific needs. We optimised and validated BIANCA on two datasets with different MRI protocols and patient populations (a "predominantly neurodegenerative" and a "predominantly vascular" cohort). BIANCA was first optimised on a subset of images for each dataset in terms of overlap and volumetric agreement with a manually segmented WMH mask. The correlation between the volumes extracted with BIANCA (using the optimised set of options), the volumes extracted from the manual masks, and visual ratings showed that BIANCA is a valid alternative to manual segmentation. The optimised set of options was then applied to the whole cohorts, and the resulting WMH volume estimates showed good correlations with visual ratings and with age. Finally, we performed a reproducibility test to evaluate the robustness of BIANCA and compared its performance against existing methods. Our findings suggest that BIANCA, which will be freely available as part of the FSL package, is a reliable method for automated WMH segmentation in large cross-sectional cohort studies. PMID:27402600

  6. Automated multi-subject fiber clustering of mouse brain using dominant sets.

    PubMed

    Dodero, Luca; Vascon, Sebastiano; Murino, Vittorio; Bifone, Angelo; Gozzi, Alessandro; Sona, Diego

    2014-01-01

    Mapping of structural and functional connectivity may provide deeper understanding of brain function and dysfunction. Diffusion Magnetic Resonance Imaging (DMRI) is a powerful technique to non-invasively delineate white matter (WM) tracts and to obtain a three-dimensional description of the structural architecture of the brain. However, DMRI tractography methods produce highly multi-dimensional datasets whose interpretation requires advanced analytical tools. Indeed, manual identification of specific neuroanatomical tracts based on prior anatomical knowledge is time-consuming and prone to operator-induced bias. Here we propose an automatic multi-subject fiber clustering method that enables retrieval of group-wise WM fiber bundles. In order to account for variance across subjects, we developed a multi-subject approach based on the Dominant Sets algorithm, via an intra- and cross-subject clustering. The intra-subject step allows us to reduce the complexity of the raw tractography data, thus obtaining homogeneous neuroanatomically-plausible bundles in each diffusion space. The cross-subject step, characterized by a proper space-invariant metric in the original diffusion space, enables the identification of the same WM bundles across multiple subjects without any prior neuroanatomical knowledge. Quantitative analysis was conducted comparing our algorithm with spectral clustering and affinity propagation methods on a synthetic dataset. We also performed qualitative analysis on mouse brain tractography, retrieving significant WM structures. The approach serves the final goal of detecting WM bundles at a population level, thus paving the way for the study of WM organization across groups.
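
    The dominant-sets framework used here is typically implemented with discrete replicator dynamics on a pairwise affinity matrix. The sketch below is a generic single-level version under the assumption that a non-negative fiber-to-fiber affinity matrix A (for example derived from pairwise streamline distances) has already been computed; it does not reproduce the paper's intra-/cross-subject two-stage pipeline.

```python
import numpy as np

def dominant_set(A, tol=1e-6, max_iter=1000):
    """One dominant set from a non-negative affinity matrix A (zero diagonal),
    extracted with discrete replicator dynamics; returns the characteristic vector."""
    n = A.shape[0]
    x = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        y = x * (A @ x)
        s = y.sum()
        if s == 0:                      # isolated items: keep the uniform vector
            break
        y /= s
        if np.abs(y - x).sum() < tol:
            return y
        x = y
    return x

def peel_dominant_sets(A, membership_thresh=1e-4):
    """Cluster items (e.g. fibers) by repeatedly extracting dominant sets from A."""
    remaining = np.arange(A.shape[0])
    clusters = []
    while remaining.size > 0:
        x = dominant_set(A[np.ix_(remaining, remaining)])
        members = remaining[x > membership_thresh]
        if members.size == 0:           # guard: always remove at least one item
            members = remaining[[int(np.argmax(x))]]
        clusters.append(members)
        remaining = np.setdiff1d(remaining, members)
    return clusters
```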

  7. Automated Protein Localization of Blood Brain Barrier Vasculature in Brightfield IHC Images.

    PubMed

    Soans, Rajath E; Lim, Diane C; Keenan, Brendan T; Pack, Allan I; Shackleford, James A

    2016-01-01

    In this paper, we present an objective method for localization of proteins in blood brain barrier (BBB) vasculature using standard immunohistochemistry (IHC) techniques and bright-field microscopy. Images from the hippocampal region at the BBB are acquired using bright-field microscopy and subjected to our segmentation pipeline which is designed to automatically identify and segment microvessels containing the protein glucose transporter 1 (GLUT1). Gabor filtering and k-means clustering are employed to isolate potential vascular structures within cryosectioned slabs of the hippocampus, which are subsequently subjected to feature extraction followed by classification via decision forest. The false positive rate (FPR) of microvessel classification is characterized using synthetic and non-synthetic IHC image data for image entropies ranging between 3 and 8 bits. The average FPR for synthetic and non-synthetic IHC image data was found to be 5.48% and 5.04%, respectively. PMID:26828723
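
    The pipeline described (Gabor filtering, k-means clustering of filter responses, region feature extraction, and decision-forest classification) maps naturally onto scikit-image and scikit-learn primitives. The sketch below is a simplified, assumed arrangement of those steps rather than the authors' code; the filter frequencies, cluster count, and region features are illustrative choices, and the labelled training table for the forest is assumed to exist.

```python
import numpy as np
from skimage.filters import gabor
from skimage.measure import label, regionprops
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

def candidate_vessel_regions(gray_image, n_clusters=3,
                             frequencies=(0.1, 0.2),
                             thetas=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Gabor filtering + k-means to isolate candidate vascular structures,
    returning simple shape features for each connected candidate region."""
    responses = []
    for f in frequencies:
        for t in thetas:
            real, imag = gabor(gray_image, frequency=f, theta=t)
            responses.append(np.hypot(real, imag))        # Gabor magnitude response
    features = np.stack(responses, axis=-1).reshape(-1, len(responses))
    cluster_ids = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
    # Assume the cluster with the strongest mean response holds vessel-like texture
    vessel_cluster = int(np.argmax([features[cluster_ids == c].mean()
                                    for c in range(n_clusters)]))
    mask = (cluster_ids == vessel_cluster).reshape(gray_image.shape)
    return [(r.area, r.eccentricity, r.solidity, r.perimeter)
            for r in regionprops(label(mask))]

def train_vessel_classifier(region_features, region_labels):
    """Decision forest over region features; the labelled training table is assumed
    to come from manually annotated examples."""
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(region_features, region_labels)
    return clf
```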

  9. Automated segmentation of the corpus callosum in midsagittal brain magnetic resonance images

    NASA Astrophysics Data System (ADS)

    Lee, Chulhee; Huh, Shin; Ketter, Terence A.; Unser, Michael A.

    2000-04-01

    We propose a new algorithm to find the corpus callosum automatically from midsagittal brain MR (magnetic resonance) images using the statistical characteristics and shape information of the corpus callosum. We first extract regions satisfying the statistical characteristics (gray level distributions) of the corpus callosum that have relatively high intensity values. Then we try to find a region matching the shape information of the corpus callosum. In order to match the shape information, we propose a new directed window region growing algorithm instead of using conventional contour matching. An innovative feature of the algorithm is that we adaptively relax the statistical requirement until we find a region matching the shape information. After the initial segmentation, a directed border path pruning algorithm is proposed in order to remove some undesired artifacts, especially on the top of the corpus callosum. The proposed algorithm was applied to over 120 images and provided promising results.
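
    The adaptive relaxation idea (accept a candidate region only if it matches the expected shape, otherwise loosen the intensity window) can be sketched as follows. This is an illustration, not the paper's method: the directed window region growing and border path pruning steps are not reproduced, and the shape_test callable stands in for the shape-matching criterion.

```python
import numpy as np
from scipy import ndimage

def find_corpus_callosum_candidate(midsagittal, mean_cc, std_cc, shape_test,
                                   max_relax=10, step=0.5):
    """Adaptively relax an intensity window around the expected corpus callosum
    statistics until a connected component passes a caller-supplied shape test."""
    for k in range(1, max_relax + 1):
        lo = mean_cc - k * step * std_cc
        hi = mean_cc + k * step * std_cc
        binary = (midsagittal >= lo) & (midsagittal <= hi)
        labelled, n = ndimage.label(binary)
        if n == 0:
            continue
        # Examine connected components from largest to smallest
        sizes = np.atleast_1d(ndimage.sum(binary, labelled, index=np.arange(1, n + 1)))
        for comp in np.argsort(sizes)[::-1] + 1:
            region = labelled == comp
            if shape_test(region):        # e.g. elongation / position criteria
                return region
    return None
```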

  10. Automated Whole-Brain N-Acetylaspartate Proton MR Spectroscopic Quantification

    PubMed Central

    Soher, Brian J.; Wu, William E.; Tal, Assaf; Storey, Pippa; Zhang, Ke; Babb, James. S.; Lui, Yvonne W.; Gonen, Oded

    2014-01-01

    The neuronal marker NAA, a quantitative metric of neuronal health and density, is currently obtained by integration of the manually defined peak in whole-head 1H-MRS. Our goal was to develop a full spectral modeling approach for estimating automatically the whole-brain NAA concentration (WBNAA) and to compare the performance of this approach with a manual frequency-range peak integration approach previously employed. MRI and WBNAA 1H-MRS from 18 healthy young adults were examined. Their non-localized, whole-head 1H-MRS obtained at 3 T yielded the NAA peak area through both manually-defined frequency-range integration and the new, full spectral simulation. The NAA peak area was converted into an absolute amount with phantom replacement and normalized for brain volume (segmented from T1-weighted MRI) to yield the WBNAA. A paired sample t test was used to compare the WBNAA paradigms and a likelihood ratio test to compare their coefficients of variation. While the between-subject WBNAA means were nearly identical: 12.8±2.5 mM for integration, 12.8±1.4 mM for spectral modeling, the latter's standard deviation was significantly smaller (∼50%, p=0.026). The within-subject variability was 11.7% (1.3 mM) for integration, versus 7.0% (0.8 mM) for spectral modeling, i.e., a 40% improvement. The (quantifiable) quality of the modeling approach was high, as reflected by Cramer-Rao lower bounds <0.1% and vanishingly small (experiment - fit) residuals. Modeling the whole-head 1H-MRS increases WBNAA quantification reliability by reducing its variability, its susceptibility to operator bias and baseline roll, and by providing quality-control feedback. Together, these enhance the usefulness of the technique to monitor neurological disorders' diffuse progression and treatment response. PMID:25196714
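
    The contrast the authors draw is between integrating a fixed frequency window and fitting a full spectral model. The toy sketch below illustrates that distinction on a single 1-D spectrum using a Lorentzian line plus constant baseline; the ppm window, line shape, and starting values are illustrative assumptions, not the paper's fitting model.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import curve_fit

def lorentzian(ppm, area, center, fwhm, baseline):
    """Lorentzian line parameterized directly by its area, plus a constant baseline."""
    hwhm = fwhm / 2.0
    return area * (hwhm / np.pi) / ((ppm - center) ** 2 + hwhm ** 2) + baseline

def naa_peak_area(ppm, spectrum, integration_range=(1.9, 2.1)):
    """Compare fixed-window integration with a model-based estimate of the NAA area."""
    # (1) Frequency-range integration over a fixed ppm window (baseline not modelled)
    in_range = (ppm >= integration_range[0]) & (ppm <= integration_range[1])
    area_integration = abs(trapezoid(spectrum[in_range], ppm[in_range]))

    # (2) Full model fit: area, centre, linewidth and baseline estimated jointly
    p0 = (area_integration, 2.02, 0.05, float(np.min(spectrum)))
    popt, _ = curve_fit(lorentzian, ppm, spectrum, p0=p0)
    area_fit = popt[0]
    return area_integration, area_fit
```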

  11. A Fully Automated Trial Selection Method for Optimization of Motor Imagery Based Brain-Computer Interface.

    PubMed

    Zhou, Bangyan; Wu, Xiaopei; Lv, Zhao; Zhang, Lei; Guo, Xiaojin

    2016-01-01

    Independent component analysis (ICA), a promising spatial filtering method, can separate motor-related independent components (MRICs) from multichannel electroencephalogram (EEG) signals. However, unpredictable burst interference may significantly degrade the performance of an ICA-based brain-computer interface (BCI) system. In this study, we proposed a new algorithmic framework to address this issue by combining the single-trial-based ICA filter with a zero-training classifier. We developed a two-round data selection method to automatically identify badly corrupted EEG trials in the training set. The "high quality" training trials were utilized to optimize the ICA filter. In addition, we proposed an accuracy-matrix method to locate the artifact data segments within a single trial and investigated which types of artifacts can influence the performance of ICA-based MIBCIs. Twenty-six EEG datasets of three-class motor imagery were used to validate the proposed methods, and the classification accuracies were compared with those obtained using the frequently used common spatial pattern (CSP) spatial filtering algorithm. The experimental results demonstrated that the proposed optimizing strategy could effectively improve the stability, practicality and classification performance of ICA-based MIBCI. The study revealed that rational use of the ICA method may be crucial in building a practical ICA-based MIBCI system. PMID:27631789

  13. Large-scale automated image analysis for computational profiling of brain tissue surrounding implanted neuroprosthetic devices using Python

    PubMed Central

    Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri

    2014-01-01

    In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries. PMID:24808857
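
    As an illustration of the orchestration pattern described (server-side Python driving per-channel processing with logging and multi-threaded execution), the following is a minimal sketch; process_channel is a hypothetical placeholder for calls into the compiled FARSIGHT modules, and the log file name and worker count are assumptions.

```python
import logging
from concurrent.futures import ThreadPoolExecutor, as_completed

logging.basicConfig(filename="pipeline.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def process_channel(channel_path):
    """Placeholder for one channel's mosaicking / preprocessing / segmentation steps.
    In a real pipeline each step would call the compiled C++ modules."""
    logging.info("started %s", channel_path)
    # ... mosaic, correct imaging artifacts, segment, extract features ...
    logging.info("finished %s", channel_path)
    return channel_path

def run_pipeline(channel_paths, max_workers=8):
    """Run all channels concurrently, logging any failures without stopping the batch."""
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(process_channel, p): p for p in channel_paths}
        for fut in as_completed(futures):
            try:
                results.append(fut.result())
            except Exception:
                logging.exception("channel %s failed", futures[fut])
    return results
```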

  16. A Method for Automated Classification of Parkinson's Disease Diagnosis Using an Ensemble Average Propagator Template Brain Map Estimated from Diffusion MRI.

    PubMed

    Banerjee, Monami; Okun, Michael S; Vaillancourt, David E; Vemuri, Baba C

    2016-01-01

    Parkinson's disease (PD) is a common and debilitating neurodegenerative disorder that affects patients in all countries and of all nationalities. Magnetic resonance imaging (MRI) is currently one of the most widely used diagnostic imaging techniques utilized for detection of neurologic diseases. Changes in structural biomarkers will likely play an important future role in assessing progression of many neurological diseases, including PD. In this paper, we derived structural biomarkers from diffusion MRI (dMRI), a structural modality that allows for non-invasive inference of neuronal fiber connectivity patterns. The structural biomarker we use is the ensemble average propagator (EAP), a probability density function fully characterizing the diffusion locally at a voxel level. To assess changes with respect to normal anatomy, we construct an unbiased template brain map from the EAP fields of a control population. Use of an EAP captures both orientation and shape information of the diffusion process at each voxel in the dMRI data, and this feature can be a powerful representation to achieve enhanced PD brain mapping. This template brain map construction method is applicable to small animal models as well as to human brains. The differences between the control template brain map and novel patient data can then be assessed via a nonrigid warping algorithm that transforms the novel data into correspondence with the template brain map, thereby capturing the amount of elastic deformation needed to achieve this correspondence. We present the use of a manifold-valued feature called the Cauchy deformation tensor (CDT), which facilitates morphometric analysis and automated classification of a PD versus a control population. Finally, we present preliminary results of automated discrimination between a group of 22 controls and 46 PD patients using CDT. This method may possibly be applied to larger populations and other parkinsonian syndromes in the near future.
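
    The Cauchy deformation tensor is commonly defined as the matrix square root of J^T J, where J is the Jacobian of the nonrigid warp. Assuming the warp is available as a dense displacement field, a voxel-wise computation could look like the sketch below (an illustration, not the authors' code); the downstream classification of the resulting tensor features is omitted.

```python
import numpy as np

def cauchy_deformation_tensor(displacement, spacing=(1.0, 1.0, 1.0)):
    """Per-voxel Cauchy deformation tensor sqrt(J^T J) from a dense displacement field.

    displacement: array of shape (3, X, Y, Z), the displacement u(x) with phi(x) = x + u(x).
    Returns an array of shape (X, Y, Z, 3, 3).
    """
    shape = displacement.shape[1:]
    J = np.zeros(shape + (3, 3))
    for i in range(3):
        grads = np.gradient(displacement[i], *spacing)   # spatial derivatives of u_i
        for j in range(3):
            J[..., i, j] = grads[j] + (1.0 if i == j else 0.0)   # J = I + du/dx

    C = np.einsum('...ki,...kj->...ij', J, J)            # J^T J (symmetric, positive)
    w, V = np.linalg.eigh(C)                             # voxel-wise eigendecomposition
    w = np.clip(w, 0.0, None)
    # Matrix square root: V diag(sqrt(w)) V^T, applied voxel-wise
    return np.einsum('...ij,...j,...kj->...ik', V, np.sqrt(w), V)
```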

  18. Brain

    MedlinePlus

    ... Brain Cerebrum The cerebrum is the part of the ... the outside of the brain and spinal cord. Brain Stem The brain stem is the part of ...

  19. Automated Spatial Brain Normalization and Hindbrain White Matter Reference Tissue Give Improved [18F]-Florbetaben PET Quantitation in Alzheimer's Model Mice

    PubMed Central

    Overhoff, Felix; Brendel, Matthias; Jaworska, Anna; Korzhova, Viktoria; Delker, Andreas; Probst, Federico; Focke, Carola; Gildehaus, Franz-Josef; Carlsen, Janette; Baumann, Karlheinz; Haass, Christian; Bartenstein, Peter; Herms, Jochen; Rominger, Axel

    2016-01-01

    Preclinical PET studies of β-amyloid (Aβ) accumulation are of growing importance, but comparisons between research sites require standardized and optimized methods for quantitation. Therefore, we aimed to systematically evaluate (1) the impact of an automated algorithm for spatial brain normalization, and (2) intensity scaling methods based on different reference regions for Aβ-PET in a large dataset of transgenic mice. PS2APP mice in a 6 week longitudinal setting (N = 37) and another set of PS2APP mice at a histologically assessed narrow range of Aβ burden (N = 40) were investigated by [18F]-florbetaben PET. Manual spatial normalization by three readers at different training levels was performed prior to application of an automated brain spatial normalization, and inter-reader agreement was assessed by Fleiss Kappa (κ). For this method the impact of templates at different pathology stages was investigated. Four different reference regions for brain uptake normalization were used to calculate frontal cortical standardized uptake value ratios (SUVRCTX∕REF), relative to raw SUVCTX. Results were compared on the basis of longitudinal stability (Cohen's d), and in reference to gold standard histopathological quantitation (Pearson's R). Application of an automated brain spatial normalization resulted in nearly perfect agreement (all κ≥0.99) between different readers, with constant or improved correlation with histology. Templates based on an inappropriate pathology stage resulted in up to 2.9% systematic bias for SUVRCTX∕REF. All SUVRCTX∕REF methods performed better than SUVCTX both with regard to longitudinal stability (d≥1.21 vs. d = 0.23) and histological gold standard agreement (R≥0.66 vs. R≥0.31). Voxel-wise analysis suggested a physiologically implausible longitudinal decrease by global mean scaling. The hindbrain white matter reference (Rmean = 0.75) was slightly superior to the brainstem (Rmean = 0.74) and the cerebellum (Rmean = 0.73). Automated brain
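
    The SUVR endpoint reported above is the mean tracer uptake in a cortical target volume divided by the mean uptake in a reference region. Assuming a spatially normalized PET image and binary masks for the target and candidate reference regions, the calculation reduces to the sketch below (illustrative only).

```python
import numpy as np

def suvr(pet_image, cortex_mask, reference_mask):
    """SUVR: mean cortical uptake divided by mean uptake in a reference region."""
    cortex_mean = pet_image[cortex_mask > 0].mean()
    reference_mean = pet_image[reference_mask > 0].mean()
    return cortex_mean / reference_mean

def suvr_per_reference(pet_image, cortex_mask, reference_masks):
    """SUVR for several candidate reference regions, e.g. hindbrain white matter,
    brainstem, cerebellum, or a global mean mask (dict: name -> binary mask)."""
    return {name: suvr(pet_image, cortex_mask, mask)
            for name, mask in reference_masks.items()}
```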

  20. Sensitivity analysis and automation for intraoperative implementation of the atlas-based method for brain shift correction

    NASA Astrophysics Data System (ADS)

    Chen, Ishita; Simpson, Amber L.; Sun, Kay; Thompson, Reid C.; Miga, Michael I.

    2013-03-01

    The use of biomechanical models to correct the misregistration due to deformation in image guided neurosurgical systems has been a growing area of investigation. In previous work, an atlas-based inverse model was developed to account for soft-tissue deformations during image-guided surgery. Central to that methodology is a considerable amount of pre-computation and planning. The goal of this work is to evaluate techniques that could potentially reduce that burden. Distinct from previous manual techniques, an automated segmentation technique is described for the cerebrum and dural septa. The shift correction results using this automated segmentation method were compared to those using the manual methods. In addition, the extent and distribution of the surgical parameters associated with the deformation atlas were investigated by a sensitivity analysis using simulation experiments and clinical data. The shift correction results did not change significantly using the automated method (correction of 73±13%) as compared to the semi-automated method from previous work (correction of 76±13%). The results of the sensitivity analysis show that the atlas could be constructed by coarser sampling (six-fold reduction) without substantial degradation in the shift reconstruction, reducing preoperative computational time from 13.1±3.5 hours to 2.2±0.6 hours. The automated segmentation technique and the findings of the sensitivity study have significant impact on the reduction of pre-operative computational time, improving the utility of the atlas-based method. The work in this paper suggests that the atlas-based technique can become a 'time of surgery' setup procedure rather than a pre-operative computing strategy.
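
    Atlas-based shift correction of this kind is often posed as an inverse problem: find a combination of precomputed atlas deformation solutions that best reproduces sparsely measured cortical surface shifts, then use that combination to predict the full displacement field. The sketch below shows one plausible formulation using nonnegative least squares; the constraint form and the data layout are assumptions rather than the authors' exact model.

```python
import numpy as np
from scipy.optimize import nnls

def atlas_shift_correction(atlas_displacements, measured_shift):
    """Fit sparse intraoperative surface shifts with a nonnegative combination of
    precomputed atlas deformation solutions.

    atlas_displacements: (n_solutions, n_points, 3) displacements at the sparse
                         measurement points, one row per precomputed atlas solution
    measured_shift:      (n_points, 3) measured cortical surface shift
    """
    n_solutions = atlas_displacements.shape[0]
    A = atlas_displacements.reshape(n_solutions, -1).T    # (3*n_points, n_solutions)
    b = measured_shift.ravel()
    weights, residual = nnls(A, b)                        # nonnegative combination
    predicted = A @ weights                               # reconstructed surface shift
    return weights, residual, predicted.reshape(measured_shift.shape)
```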

  1. Human brain atlas for automated region of interest selection in quantitative susceptibility mapping: application to determine iron content in deep gray matter structures.

    PubMed

    Lim, Issel Anne L; Faria, Andreia V; Li, Xu; Hsu, Johnny T C; Airan, Raag D; Mori, Susumu; van Zijl, Peter C M

    2013-11-15

    The purpose of this paper is to extend the single-subject Eve atlas from Johns Hopkins University, which currently contains diffusion tensor and T1-weighted anatomical maps, by including contrast based on quantitative susceptibility mapping. The new atlas combines a "deep gray matter parcellation map" (DGMPM) derived from a single-subject quantitative susceptibility map with the previously established "white matter parcellation map" (WMPM) from the same subject's T1-weighted and diffusion tensor imaging data into an MNI coordinate map named the "Everything Parcellation Map in Eve Space," also known as the "EvePM." It allows automated segmentation of gray matter and white matter structures. Quantitative susceptibility maps from five healthy male volunteers (30 to 33 years of age) were coregistered to the Eve Atlas with AIR and Large Deformation Diffeomorphic Metric Mapping (LDDMM), and the transformation matrices were applied to the EvePM to produce automated parcellation in subject space. Parcellation accuracy was measured with a kappa analysis for the left and right structures of six deep gray matter regions. For multi-orientation QSM images, the Kappa statistic was 0.85 between automated and manual segmentation, with the inter-rater reproducibility Kappa being 0.89 for the human raters, suggesting "almost perfect" agreement between all segmentation methods. Segmentation seemed slightly more difficult for human raters on single-orientation QSM images, with the Kappa statistic being 0.88 between automated and manual segmentation, and 0.85 and 0.86 between human raters. Overall, this atlas provides a time-efficient tool for automated coregistration and segmentation of quantitative susceptibility data to analyze many regions of interest. These data were used to establish a baseline for normal magnetic susceptibility measurements for over 60 brain structures of 30- to 33-year-old males. Correlating the average susceptibility with age-based iron concentrations in gray
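
    Parcellation agreement in this study is reported as Cohen's kappa between automated and manual labelings. A minimal sketch of that comparison for a pair of label volumes (restricted, as an assumption, to voxels assigned to the structures of interest by either rater) is given below.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

def parcellation_kappa(auto_labels, manual_labels, structure_ids):
    """Cohen's kappa between automated and manual parcellations, restricted to the
    voxels that either rater assigned to one of the structures of interest."""
    auto = np.asarray(auto_labels).ravel()
    manual = np.asarray(manual_labels).ravel()
    keep = np.isin(auto, structure_ids) | np.isin(manual, structure_ids)
    return cohen_kappa_score(auto[keep], manual[keep])
```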

  3. Automated Quantification of Human Brain Metabolites by Artificial Neural Network Analysis from in Vivo Single-Voxel 1H NMR Spectra

    NASA Astrophysics Data System (ADS)

    Kaartinen, Jouni; Mierisová, Šarka; Oja, Joni M. E.; Usenius, Jukka-Pekka; Kauppinen, Risto A.; Hiltunen, Yrjö

    1998-09-01

    A real-time automated way of quantifying metabolites from in vivo NMR spectra using an artificial neural network (ANN) analysis is presented. The spectral training and test sets for the ANN, containing peaks at chemical shift ranges resembling long echo time proton NMR spectra from human brain, were simulated. The performance of the ANN constructed was compared with an established lineshape fitting (LF) analysis using both simulated and experimental spectral data as inputs. The correspondence between the ANN and LF analyses showed correlation coefficients on the order of 0.915-0.997 for spectra with large variations in both signal-to-noise and peak areas. Water-suppressed 1H NMR spectra from 24 healthy subjects were collected and choline-containing compounds (Cho), total creatine (Cr), and N-acetyl aspartate (NAA) were quantified with both methods. The ANN quantified these spectra with an accuracy similar to LF analysis (correlation coefficients of 0.915-0.951). These results show that LF and ANN are equally good quantifiers; however, the ANN analyses are more easily automated than LF analyses.
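
    The core idea (train a network on simulated spectra with known peak areas, then let it quantify new spectra) can be sketched with a modern library. The example below uses scikit-learn's MLPRegressor rather than the original network, and the simulated spectra (three Lorentzian peaks for Cho, Cr and NAA plus noise) are an illustrative assumption.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def lorentzian(ppm, area, center, fwhm):
    hwhm = fwhm / 2.0
    return area * (hwhm / np.pi) / ((ppm - center) ** 2 + hwhm ** 2)

def simulate_spectra(n, ppm, rng):
    """Simulate long-TE-like spectra with Cho, Cr and NAA peaks and random noise."""
    centers = (3.2, 3.0, 2.02)                       # approximate chemical shifts (ppm)
    areas = rng.uniform(0.5, 2.0, size=(n, 3))       # regression targets for the network
    spectra = np.zeros((n, ppm.size))
    for i in range(n):
        for a, c in zip(areas[i], centers):
            spectra[i] += lorentzian(ppm, a, c, rng.uniform(0.03, 0.08))
        spectra[i] += rng.normal(0, 0.02, ppm.size)  # variable signal-to-noise
    return spectra, areas

rng = np.random.default_rng(0)
ppm = np.linspace(1.5, 4.0, 512)
X_train, y_train = simulate_spectra(4000, ppm, rng)
net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
net.fit(X_train, y_train)                            # learn spectrum -> peak areas

X_test, y_test = simulate_spectra(200, ppm, rng)
print(np.corrcoef(net.predict(X_test)[:, 2], y_test[:, 2])[0, 1])   # NAA correlation
```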

  4. Shape-based multifeature brain parcellation

    NASA Astrophysics Data System (ADS)

    Nadeem, Saad; Kaufman, Arie

    2016-03-01

    We present a novel approach to parcellate the brain cortex, that is, to delineate the boundaries of its anatomical features (folds, gyri, sulci). Our approach is based on extracting the 3D brain cortical surface mesh from magnetic resonance (MR) images, computing shape measures (area, mean curvature, geodesic, and travel depths) for this mesh, and delineating the anatomical feature boundaries using these measures. We use angle-area preserving mapping of the cortical surface mesh to a simpler topology (disk or rectangle) to aid in the visualization and delineation of these boundaries. Contrary to commonly used generic 2D brain image atlas-based approaches, we use 3D surface mesh data extracted from a given brain MR imaging dataset and its specific shape measures for the parcellation. Our method does not require any non-linear registration of a given brain dataset to a generic atlas and hence does away with the structure similarity assumption critical to the atlas-based approaches. We evaluate our approach using Mindboggle manually labeled brain datasets and achieve the following accuracies: 72.4% for gyri, 78.5% for major sulci, and 98.4% for folds. These results warrant further investigation of this approach as an alternative or as an initialization to the atlas-based approaches.

  5. Fully automated rodent brain MR image processing pipeline on a Midas server: from acquired images to region-based statistics

    PubMed Central

    Budin, Francois; Hoogstoel, Marion; Reynolds, Patrick; Grauer, Michael; O'Leary-Moore, Shonagh K.; Oguz, Ipek

    2013-01-01

    Magnetic resonance imaging (MRI) of rodent brains enables study of the development and the integrity of the brain under certain conditions (alcohol, drugs etc.). However, these images are difficult to analyze for biomedical researchers with limited image processing experience. In this paper we present an image processing pipeline running on a Midas server, a web-based data storage system. It is composed of the following steps: rigid registration, skull-stripping, average computation, average parcellation, parcellation propagation to individual subjects, and computation of region-based statistics on each image. The pipeline is easy to configure and requires very little image processing knowledge. We present results obtained by processing a data set using this pipeline and demonstrate how this pipeline can be used to find differences between populations. PMID:23964234
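
    The final step of the pipeline, region-based statistics, amounts to summarizing a co-registered image over each label of the propagated parcellation. A minimal sketch (not the Midas/pipeline code itself) is shown below; the image, label map, and name mapping are assumed inputs.

```python
import numpy as np

def region_statistics(image, parcellation, region_names):
    """Per-region summary statistics from a co-registered image and a parcellation
    label map (region_names: dict mapping label value -> region name)."""
    stats = {}
    for label_value, name in region_names.items():
        values = image[parcellation == label_value]
        stats[name] = {
            "n_voxels": int(values.size),
            "mean": float(values.mean()) if values.size else float("nan"),
            "std": float(values.std()) if values.size else float("nan"),
        }
    return stats
```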

  6. Development of a Highly Automated and Multiplexed Targeted Proteome Pipeline and Assay for 112 Rat Brain Synaptic Proteins

    PubMed Central

    Colangelo, Christopher M.; Ivosev, Gordana; Chung, Lisa; Abbott, Thomas; Shifman, Mark; Sakaue, Fumika; Cox, David; Kitchen, Rob R.; Burton, Lyle; Tate, Stephen A; Gulcicek, Erol; Bonner, Ron; Rinehart, Jesse; Nairn, Angus C.; Williams, Kenneth R.

    2015-01-01

    We present a comprehensive workflow for large scale (>1000 transitions/run) label-free LC-MRM proteome assays. Innovations include automated MRM transition selection, intelligent retention time scheduling (xMRM) that improves Signal/Noise by >2-fold, and automatic peak modeling. Improvements to data analysis include a novel Q/C metric, Normalized Group Area Ratio (NGAR), MLR normalization, weighted regression analysis, and data dissemination through the Yale Protein Expression Database. As a proof of principle we developed a robust 90 minute LC-MRM assay for Mouse/Rat Post-Synaptic Density (PSD) fractions which resulted in the routine quantification of 337 peptides from 112 proteins based on 15 observations per protein. Parallel analyses with stable isotope dilution peptide standards (SIS), demonstrate very high correlation in retention time (1.0) and protein fold change (0.94) between the label-free and SIS analyses. Overall, our first method achieved a technical CV of 11.4% with >97.5% of the 1697 transitions being quantified without user intervention, resulting in a highly efficient, robust, and single injection LC-MRM assay. PMID:25476245

  7. Early Experience of Automated Intraventricular Type Intracranial Pressure Monitoring (LiquoGuard®) for Severe Traumatic Brain Injury Patients

    PubMed Central

    Kwon, Young Sub; Lee, Yun Ho

    2016-01-01

    Objective The LiquoGuard® system is a new ventricular-type monitoring device that facilitates intracranial pressure (ICP)-controlled or volume-controlled drainage of cerebrospinal fluid (CSF). The purpose of this study is to report the authors' experience with the LiquoGuard® ICP monitoring system, as well as the clinical safety, usefulness, and limitations of this device in the management of patients with traumatic brain injury (TBI). Methods Intraventricular ICP monitoring was performed on 10 patients with TBI using the LiquoGuard® monitoring system. ICP measurements, volume of drained CSF, and clinical outcomes were analyzed and discussed. Results ICP monitoring was performed on 10 patients for a mean duration of 6.9 days. With a mean 82,718 records per patient, the mean initial ICP was 16.4 mm Hg and the average ICP across the total duration of monitoring was 15.5 mm Hg. The mean volume of drained CSF was 29.2 cc/day, with no CSF drained in 4 patients. Seven of 10 patients showed 1 or 2 episodes of abnormal ICP measurements. No patient exhibited complications associated with ICP monitoring. Conclusion The LiquoGuard® system is a versatile tool in the management of TBI patients. Its use is both reliable and feasible for ICP monitoring and therapeutic drainage of CSF. However, episodes of abnormal ICP measurements were frequently observed in patients with slit ventricles, and further study may be needed to overcome this issue. PMID:27182499

  8. Automated pipeline for atlas-based annotation of gene expression patterns: application to postnatal day 7 mouse brain

    SciTech Connect

    Carson, James P.; Ju, Tao; Bello, Musodiq; Thaller, Christina; Warren, Joe; Kakadiaris, Ioannis; Chiu, Wah; Eichele, Gregor

    2010-02-01

    As bio-medical images and volumes are being collected at an increasing speed, there is a growing demand for efficient means to organize spatial information for comparative analysis. In many scenarios, such as determining gene expression patterns by in situ hybridization, the images are collected from multiple subjects over a common anatomical region, such as the brain. A fundamental challenge in comparing spatial data from different images is how to account for the shape variations among subjects, which makes direct image-to-image comparison meaningless. In this paper, we describe subdivision meshes as a geometric means to efficiently organize 2D images and 3D volumes collected from different subjects for comparison. The key advantages of a subdivision mesh for this purpose are its light-weight geometric structure and its explicit modeling of anatomical boundaries, which enable efficient and accurate registration. The multi-resolution structure of a subdivision mesh also allows development of fast comparison algorithms among registered images and volumes.

  9. Cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, Earl L.

    1988-01-01

    The aims and methods of aircraft cockpit automation are reviewed from a human-factors perspective. Consideration is given to the mixed pilot reception of increased automation, government concern with the safety and reliability of highly automated aircraft, the formal definition of automation, and the ground-proximity warning system and accidents involving controlled flight into terrain. The factors motivating automation include technology availability; safety; economy, reliability, and maintenance; workload reduction and two-pilot certification; more accurate maneuvering and navigation; display flexibility; economy of cockpit space; and military requirements.

  10. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  11. Process automation

    SciTech Connect

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs.

  12. Assessment of the Molecular Expression and Structure of Gangliosides in Brain Metastasis of Lung Adenocarcinoma by an Advanced Approach Based on Fully Automated Chip-Nanoelectrospray Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zamfir, Alina D.; Serb, Alina; Vukeli, Željka; Flangea, Corina; Schiopu, Catalin; Fabris, Dragana; Kalanj-Bognar, Svjetlana; Capitan, Florina; Sisu, Eugen

    2011-12-01

    Gangliosides (GGs), sialic acid-containing glycosphingolipids, are known to be involved in the invasive/metastatic behavior of brain tumor cells. Development of modern methods for determination of the variations in GG expression and structure during neoplastic cell transformation is a priority in the field of biomedical analysis. In this context, we report here on the first optimization and application of chip-based nanoelectrospray (NanoMate robot) mass spectrometry (MS) for the investigation of gangliosides in a secondary brain tumor. In our work a native GG mixture extracted and purified from brain metastasis of lung adenocarcinoma was screened by NanoMate robot coupled to a quadrupole time-of-flight MS. A native GG mixture from an age-matched healthy brain tissue, sampled and analyzed under identical conditions, served as a control. Comparative MS analysis demonstrated an evident dissimilarity in GG expression in the two tissue types. Brain metastasis is characterized by many species having a reduced N-acetylneuraminic acid (Neu5Ac) content, however, modified by fucosylation or O-acetylation such as Fuc-GM4, Fuc-GM3, di-O-Ac-GM1, O-Ac-GM3. In contrast, healthy brain tissue is dominated by longer structures exhibiting from mono- to hexasialylated sugar chains. Also, significant differences in ceramide composition were discovered. By tandem MS using collision-induced dissociation at low energies, brain metastasis-associated GD3 (d18:1/18:0) species as well as an uncommon Fuc-GM1 (d18:1/18:0) detected in the normal brain tissue could be structurally characterized. The novel protocol was able to provide a reliable compositional and structural characterization with high analysis pace and at a sensitivity situated in the fmol range.

  13. Habitat automation

    NASA Technical Reports Server (NTRS)

    Swab, Rodney E.

    1992-01-01

    A habitat, on either the surface of the Moon or Mars, will be designed and built with the proven technologies of that day. These technologies will be mature and readily available to the habitat designer. We believe an acceleration of the normal pace of automation would allow a habitat to be safer and more easily maintained than would be the case otherwise. This document examines the operation of a habitat and describes elements of that operation which may benefit from an increased use of automation. Research topics within the automation realm are then defined and discussed with respect to the role they can have in the design of the habitat. Problems associated with the integration of advanced technologies into real-world projects at NASA are also addressed.

  14. The Brain Is Faster than the Hand in Split-Second Intentions to Respond to an Impending Hazard: A Simulation of Neuroadaptive Automation to Speed Recovery to Perturbation in Flight Attitude.

    PubMed

    Callan, Daniel E; Terzibas, Cengiz; Cassel, Daniel B; Sato, Masa-Aki; Parasuraman, Raja

    2016-01-01

    The goal of this research is to test the potential for neuroadaptive automation to improve response speed to a hazardous event by using a brain-computer interface (BCI) to decode perceptual-motor intention. Seven participants underwent four experimental sessions while measuring brain activity with magnetoencephalography. The first three sessions were of a simple constrained task in which the participant was to pull back on the control stick to recover from a perturbation in attitude in one condition and to passively observe the perturbation in the other condition. The fourth session consisted of having to recover from a perturbation in attitude while piloting the plane through the Grand Canyon, constantly maneuvering to track over the river below. Independent component analysis was used on the first two sessions to extract artifacts and find an event-related component associated with the onset of the perturbation. These two sessions were used to train a decoder to classify trials in which the participant recovered from the perturbation (motor intention) vs. just passively viewing the perturbation. The BCI-decoder was tested on the third session of the same simple task and found to be able to significantly distinguish motor intention trials from passive viewing trials (mean = 69.8%). The same BCI-decoder was then used to test the fourth session on the complex task. The BCI-decoder significantly classified perturbation from no perturbation trials (73.3%) with a significant time savings of 72.3 ms (original response time of 425.0 ms vs. 352.7 ms with the BCI-decoder). The BCI-decoder model of the best subject was shown to generalize for both performance and time savings to the other subjects. The results of our off-line open loop simulation demonstrate that BCI based neuroadaptive automation has the potential to decode motor intention faster than manual control in response to a hazardous perturbation in flight attitude while ignoring ongoing motor and visual induced activity

  15. Chiral analysis of methadone and its main metabolite, EDDP, in postmortem brain and blood by automated SPE and liquid chromatography-mass spectrometry.

    PubMed

    Holm, Karen Marie Dollerup; Linnet, Kristian

    2012-09-01

    We developed a method based on liquid chromatography coupled with tandem mass spectrometry to quantify individual enantiomers of methadone and its primary metabolite, R/S-2-ethyl-1,5-dimethyl-3,3-diphenylpyrrolinium (EDDP), in postmortem blood and brain tissue. Samples were prepared with a Tecan Evo robotic system. Precipitation was followed by solid-phase extraction, evaporation and reconstitution in the mobile phase. Enantiomers were fully separated with liquid chromatography on a chiral α(1)-acid glycoprotein column. A Quattro micro mass spectrometer was used for detection in the positive ion mode with an electrospray source. The lower limit of quantification in brain tissue was 0.005 mg/kg for methadone and 0.001 mg/kg for EDDP enantiomers; the maximum precision was 17% for both compounds; accuracy ranged from 94 to 101%. In blood, the limit of quantification was 0.001 mg/kg for all compounds, the total relative standard deviation was <15%, and the accuracy varied from 95 to 109%. Brain (n = 11) and blood (n = 15) samples were analyzed with intermediate precision that varied from 7.5 to 15% at 0.005 mg/kg and from 6.8 to 11.3% at 0.25 mg/kg for all compounds. Method development focused on producing a clean extract, particularly from brain samples. The method was tested on authentic brain and femoral blood samples. PMID:22778199

  16. Automated dispenser

    SciTech Connect

    Hollen, R.M.; Stalnaker, N.D.

    1989-04-06

    An automated dispenser having a conventional pipette attached to an actuating cylinder through a flexible cable for delivering precise quantities of a liquid through commands from remotely located computer software. The travel of the flexible cable is controlled by adjustable stops and a locking shaft. The pipette can be positioned manually or by the hands of a robot. 1 fig.

  17. Automating Finance

    ERIC Educational Resources Information Center

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  18. Evaluation of brain perfusion in specific Brodmann areas in Frontotemporal dementia and Alzheimer disease using automated 3-D voxel based analysis

    NASA Astrophysics Data System (ADS)

    Valotassiou, V.; Papatriantafyllou, J.; Sifakis, N.; Karageorgiou, C.; Tsougos, I.; Tzavara, C.; Zerva, C.; Georgoulias, P.

    2009-05-01

    Introduction. Brain perfusion studies with single-photon emission computed tomography (SPECT) have been applied in demented patients to provide better discrimination between frontotemporal dementia (FTD) and Alzheimer's disease (AD). Aim. To assess the perfusion of specific Brodmann (Br) areas of the brain cortex in FTD and AD patients, using the NeuroGam processing program to provide 3D voxel-by-voxel cerebral SPECT analysis. Material and methods. We studied 34 consecutive patients. We used the established criteria for the diagnosis of dementia and the specific established criteria for the diagnosis of FTD and AD. All the patients had a neuropsychological evaluation with a battery of tests including the mini-mental state examination (MMSE). Twenty-six patients (16 males, 10 females, mean age 68.76±6.51 years, education 11.81±4.25 years, MMSE 16.69±9.89) received the diagnosis of FTD and 8 patients (all females, mean age 71.25±10.48 years, education 10±4.6 years, MMSE 12.5±3.89) the diagnosis of AD. All the patients underwent a brain SPECT. We applied the NeuroGam software for the evaluation of brain perfusion in specific Br areas in the left (L) and right (R) hemispheres. Results. Statistically significant hypoperfusion in FTD compared to AD patients was found in the following Br areas: 11L (p<0.0001), 11R, 20L, 20R, 32L, 38L, 38R, 44L (p<0.001), 32R, 36L, 36R, 45L, 45R, 47R (p<0.01), 9L, 21L, 39R, 44R, 46R, 47L (p<0.05). On the contrary, AD patients presented significant (p<0.05) hypoperfusion in the 7R and 39R Br areas. Conclusion. The NeuroGam processing program for brain perfusion SPECT could result in enhanced accuracy for the differential diagnosis between AD and FTD patients.

  1. Automated lithocell

    NASA Astrophysics Data System (ADS)

    Englisch, Andreas; Deuter, Armin

    1990-06-01

    Integration and automation have gained more and more ground in modern IC manufacturing. It is difficult to calculate directly the profit these investments yield; on the other hand, the demands on people, machines and technology have increased enormously of late, and it is not difficult to see that only by means of integration and automation can these demands be met. Some salient points: the complexity and cost of the equipment and processes have risen significantly; owing to the reduction of all dimensions, the tolerances within which the various process steps must be carried out have become ever smaller and adherence to these tolerances ever more difficult; and the cycle time has become more and more important, both for the development and control of new processes and, to a great extent, for a rapid and reliable supply to the customer. For products to remain competitive under these conditions, all sorts of costs have to be reduced and the yield has to be maximized. Computer-aided control of the equipment and the process, combined with automatic data collection and real-time SPC (statistical process control), has therefore become absolutely necessary for successful IC manufacturing. Automation removes human error from the execution of the individual process steps, and the working time freed in this way allows human creativity to be employed on a larger scale in stabilizing the processes. In addition, computer-aided equipment control can ensure optimal utilization of the equipment around the clock.
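
    A minimal sketch of the real-time SPC idea mentioned above: flagging a measurement that falls outside three-sigma control limits derived from an in-control baseline. The numbers are hypothetical and unrelated to any lithography process.

        # Hedged sketch of a three-sigma control check; baseline and measurements
        # are hypothetical numbers, not lithography data.
        import statistics

        baseline = [1.02, 0.98, 1.01, 0.99, 1.00, 1.03, 0.97, 1.01, 1.00, 0.99]
        mean = statistics.mean(baseline)
        sigma = statistics.stdev(baseline)
        ucl, lcl = mean + 3 * sigma, mean - 3 * sigma      # upper/lower control limits

        def check(measurement):
            status = "in control" if lcl <= measurement <= ucl else "OUT OF CONTROL"
            print(f"{measurement:.3f}: {status} (limits {lcl:.3f} .. {ucl:.3f})")

        for m in (1.01, 1.12, 0.95):
            check(m)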

  2. Evaluation of 14 nonlinear deformation algorithms applied to human brain MRI registration

    PubMed Central

    Klein, Arno; Andersson, Jesper; Ardekani, Babak A.; Ashburner, John; Avants, Brian; Chiang, Ming-Chang; Christensen, Gary E.; Collins, D. Louis; Gee, James; Hellier, Pierre; Song, Joo Hyun; Jenkinson, Mark; Lepage, Claude; Rueckert, Daniel; Thompson, Paul; Vercauteren, Tom; Woods, Roger P.; Mann, J. John; Parsey, Ramin V.

    2009-01-01

    All fields of neuroscience that employ brain imaging need to communicate their results with reference to anatomical regions. In particular, comparative morphometry and group analysis of functional and physiological data require coregistration of brains to establish correspondences across brain structures. It is well established that linear registration of one brain to another is inadequate for aligning brain structures, so numerous algorithms have emerged to nonlinearly register brains to one another. This study is the largest evaluation of nonlinear deformation algorithms applied to brain image registration ever conducted. Fourteen algorithms from laboratories around the world are evaluated using 8 different error measures. More than 45,000 registrations between 80 manually labeled brains were performed by algorithms including: AIR, ANIMAL, ART, Diffeomorphic Demons, FNIRT, IRTK, JRD-fluid, ROMEO, SICLE, SyN, and four different SPM5 algorithms (“SPM2-type” and regular Normalization, Unified Segmentation, and the DARTEL Toolbox). All of these registrations were preceded by linear registration between the same image pairs using FLIRT. One of the most significant findings of this study is that the relative performances of the registration methods under comparison appear to be little affected by the choice of subject population, labeling protocol, and type of overlap measure. This is important because it suggests that the findings are generalizable to new subject populations that are labeled or evaluated using different labeling protocols. Furthermore, we ranked the 14 methods according to three completely independent analyses (permutation tests, one-way ANOVA tests, and indifference-zone ranking) and derived three almost identical top rankings of the methods. ART, SyN, IRTK, and SPM's DARTEL Toolbox gave the best results according to overlap and distance measures, with ART and SyN delivering the most consistently high accuracy across subjects and label sets
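
    A minimal sketch of one of the overlap measures such evaluations rely on: Jaccard overlap per label between a manual and an automated labeling. The study above used eight different error measures; the toy volumes below are hypothetical placeholders.

        # Hedged sketch: Jaccard overlap per label between a "manual" and an
        # "automated" label volume. The toy volumes are random placeholders.
        import numpy as np

        def jaccard_per_label(labels_a, labels_b):
            """Jaccard overlap for each non-background label in either volume."""
            scores = {}
            for label in np.union1d(np.unique(labels_a), np.unique(labels_b)):
                if label == 0:                              # treat 0 as background
                    continue
                a, b = labels_a == label, labels_b == label
                union = np.logical_or(a, b).sum()
                scores[int(label)] = np.logical_and(a, b).sum() / union if union else np.nan
            return scores

        manual = np.random.randint(0, 4, size=(32, 32, 32))   # placeholder labeled volumes
        auto = np.random.randint(0, 4, size=(32, 32, 32))
        print(jaccard_per_label(manual, auto))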

  3. Path Planning for Semi-automated Simulated Robotic Neurosurgery

    PubMed Central

    Hu, Danying; Gong, Yuanzheng; Hannaford, Blake; Seibel, Eric J.

    2015-01-01

    This paper considers the semi-automated robotic surgical procedure for removing the brain tumor margins, where the manual operation is a tedious and time-consuming task for surgeons. We present robust path planning methods for robotic ablation of tumor residues in various shapes, which are represented in point-clouds instead of analytical geometry. Along with the path plans, corresponding metrics are also delivered to the surgeon for selecting the optimal candidate in the automated robotic ablation. The selected path plan is then executed and tested on RAVEN™ II surgical robot platform as part of the semi-automated robotic brain tumor ablation surgery in a simulated tissue phantom. PMID:26705501
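
    A minimal sketch of a generic coverage path over a point-cloud footprint (a simple raster sweep at constant depth). It stands in for, and is much simpler than, the point-cloud path planners described above; the point cloud is hypothetical.

        # Hedged sketch: a raster ("lawn-mower") coverage path over the xy-footprint
        # of a residue point cloud, at a constant depth. A generic stand-in for the
        # point-cloud path planners described above, not the paper's method.
        import numpy as np

        def raster_path(points, spacing=1.0):
            """Waypoints sweeping back and forth across the cloud's bounding box."""
            (xmin, ymin), (xmax, ymax) = points[:, :2].min(0), points[:, :2].max(0)
            z = points[:, 2].mean()                         # constant working depth
            waypoints = []
            for i, y in enumerate(np.arange(ymin, ymax + spacing, spacing)):
                xs = (xmin, xmax) if i % 2 == 0 else (xmax, xmin)   # alternate direction
                waypoints += [(xs[0], y, z), (xs[1], y, z)]
            return np.array(waypoints)

        cloud = np.random.rand(500, 3) * [20.0, 10.0, 2.0]  # hypothetical points (mm)
        print(raster_path(cloud, spacing=2.0)[:6])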

  4. Automated External Defibrillator

    MedlinePlus

    What Is an Automated External Defibrillator? An automated external defibrillator (AED) is a portable device that ...

  5. Automation: triumph or trap?

    PubMed

    Smythe, M H

    1997-01-01

    Automation, a hot topic in the laboratory world today, can be a very expensive option. Those who are considering implementing automation can save time and money by examining the issues from the standpoint of an industrial/manufacturing engineer. The engineer not only asks what problems will be solved by automation, but what problems will be created. This article discusses questions that must be asked and answered to ensure that automation efforts will yield real and substantial payoffs.

  6. Workflow automation architecture standard

    SciTech Connect

    Moshofsky, R.P.; Rohen, W.T.

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  7. Automated segmentation of the canine corpus callosum for the measurement of diffusion tensor imaging.

    PubMed

    Peterson, David E; Chen, Steven D; Calabrese, Evan; White, Leonard E; Provenzale, James M

    2016-02-01

    The goal of this study was to apply image registration-based automated segmentation methods to measure diffusion tensor imaging (DTI) metrics within the canine brain. Specifically, we hypothesized that this method could measure DTI metrics within the canine brain with greater reproducibility than with hand-drawn region of interest (ROI) methods. We performed high-resolution post-mortem DTI imaging on two canine brains on a 7 T MR scanner. We designated the two brains as brain 1 and brain 2. We measured DTI metrics within the corpus callosum of brain 1 using a hand-drawn ROI method and an automated segmentation method in which ROIs from brain 2 were transformed into the space of brain 1. We repeated both methods in order to measure their reliability. Mean differences between the two sets of hand-drawn ROIs ranged from 4% to 10%. Mean differences between the hand-drawn ROIs and the automated ROIs were less than 3%. The mean differences between the first and second automated ROIs were all less than 0.25%. Our findings indicate that the image registration-based automated segmentation method was clearly the more reproducible method. These results provide the groundwork for using image registration-based automated segmentation methods to measure DTI metrics within the canine brain. Such methods will facilitate the study of white matter pathology in canine models of neurologic disease. PMID:26577603
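
    A minimal sketch of the reproducibility comparison described above: the mean of a DTI metric inside an ROI mask, and the percent difference between two repeated measurements. The volume and masks are hypothetical placeholders.

        # Hedged sketch: mean of a DTI metric (here standing in for FA) inside an ROI
        # mask, and the percent difference between two repeated measurements.
        import numpy as np

        def roi_mean(metric_volume, roi_mask):
            return float(metric_volume[roi_mask > 0].mean())

        fa = np.random.rand(64, 64, 30)                     # placeholder metric volume
        roi_first = np.zeros_like(fa)
        roi_first[30:40, 30:40, 10:15] = 1                  # first ROI placement
        roi_second = np.zeros_like(fa)
        roi_second[31:41, 30:40, 10:15] = 1                 # repeated ROI placement

        m1, m2 = roi_mean(fa, roi_first), roi_mean(fa, roi_second)
        percent_diff = 100.0 * abs(m1 - m2) / ((m1 + m2) / 2.0)
        print(f"means: {m1:.3f} vs {m2:.3f}; difference {percent_diff:.2f}%")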

  8. Shoe-String Automation

    SciTech Connect

    Duncan, M.L.

    2001-07-30

    Faced with a downsizing organization, serious budget reductions and retirement of key metrology personnel, maintaining capabilities to provide necessary services to our customers was becoming increasingly difficult. It appeared that the only solution was to automate some of our more personnel-intensive processes; however, it was crucial that the most personnel-intensive candidate process be automated, at the lowest price possible and with the lowest risk of failure. This discussion relates factors in the selection of the Standard Leak Calibration System for automation, the methods of automation used to provide the lowest-cost solution and the benefits realized as a result of the automation.

  9. Automation of industrial bioprocesses.

    PubMed

    Beyeler, W; DaPra, E; Schneider, K

    2000-01-01

    The dramatic development of new electronic devices within the last 25 years has had a substantial influence on the control and automation of industrial bioprocesses. Within this short period of time the method of controlling industrial bioprocesses has changed completely. In this paper, the authors will use a practical approach focusing on the industrial applications of automation systems. From the early attempts to use computers for the automation of biotechnological processes up to the modern process automation systems some milestones are highlighted. Special attention is given to the influence of Standards and Guidelines on the development of automation systems.

  10. Automation in Clinical Microbiology

    PubMed Central

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  11. Automated DNA Sequencing System

    SciTech Connect

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate chemistries different from those of existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective automation as the laboratory grows.

  12. Comparison of automated and manual segmentation of hippocampus MR images

    NASA Astrophysics Data System (ADS)

    Haller, John W.; Christensen, Gary E.; Miller, Michael I.; Joshi, Sarang C.; Gado, Mokhtar; Csernansky, John G.; Vannier, Michael W.

    1995-05-01

    The precision and accuracy of area estimates from magnetic resonance (MR) brain images using manual and automated segmentation methods are determined. Areas of the human hippocampus were measured to compare a new automatic method of segmentation with regions of interest drawn by an expert. MR images of nine normal subjects and nine schizophrenic patients were acquired with a 1.5-T unit (Siemens Medical Systems, Inc., Iselin, New Jersey). From each individual MPRAGE 3D volume image, a single comparable 2D slice (matrix = 256 × 256) was chosen corresponding to the same coronal slice of the hippocampus. The hippocampus was first manually segmented, then segmented using high dimensional transformations of a digital brain atlas to individual brain MR images. The repeatability of a trained rater was assessed by comparing two measurements from each individual subject. Variability was also compared within and between subject groups of schizophrenics and normal subjects. Finally, the precision and accuracy of automated segmentation of hippocampal areas were determined by comparing automated measurements to manual segmentation measurements made by the trained rater on MR and brain slice images. The results demonstrate the high repeatability of area measurement from MR images of the human hippocampus. Automated segmentation using high dimensional transformations from a digital brain atlas provides repeatability superior to that of manual segmentation. Furthermore, the validity of automated measurements was demonstrated by a high correlation with manual segmentation measurements made by a trained rater. Quantitative morphometry of brain substructures (e.g. hippocampus) is feasible by use of a high dimensional transformation of a digital brain atlas to an individual MR image. This method automates the search for neuromorphological correlates of schizophrenia by a new mathematically robust method with unprecedented sensitivity to small local and regional differences.
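
    A minimal sketch of the validity check described above: correlating automated area estimates with a trained rater's manual measurements. The area values below are hypothetical, not the study's data.

        # Hedged sketch: correlate automated area estimates with a rater's manual
        # measurements. The values below are hypothetical, not the study's data.
        import numpy as np
        from scipy import stats

        manual_mm2 = np.array([412.0, 398.5, 455.2, 430.1, 388.7, 441.9])   # rater areas
        auto_mm2 = np.array([405.3, 402.1, 449.8, 436.4, 384.2, 445.0])     # automated areas

        r, p = stats.pearsonr(manual_mm2, auto_mm2)
        mean_abs_diff = np.mean(np.abs(manual_mm2 - auto_mm2))
        print(f"Pearson r = {r:.3f} (p = {p:.3f}); mean |difference| = {mean_abs_diff:.1f} mm^2")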

  13. Automated Cognome Construction and Semi-automated Hypothesis Generation

    PubMed Central

    Voytek, Jessica B.; Voytek, Bradley

    2012-01-01

    Modern neuroscientific research stands on the shoulders of countless giants. PubMed alone contains more than 21 million peer-reviewed articles with 40–50,000 more published every month. Understanding the human brain, cognition, and disease will require integrating facts from dozens of scientific fields spread amongst millions of studies locked away in static documents, making any such integration daunting, at best. The future of scientific progress will be aided by bridging the gap between the millions of published research articles and modern databases such as the Allen Brain Atlas (ABA). To that end, we have analyzed the text of over 3.5 million scientific abstracts to find associations between neuroscientific concepts. From the literature alone, we show that we can blindly and algorithmically extract a “cognome”: relationships between brain structure, function, and disease. We demonstrate the potential of data-mining and cross-platform data-integration with the ABA by introducing two methods for semiautomated hypothesis generation. By analyzing statistical “holes” and discrepancies in the literature we can find understudied or overlooked research paths. That is, we have added a layer of semi-automation to a part of the scientific process itself. This is an important step toward fundamentally incorporating data-mining algorithms into the scientific method in a manner that is generalizable to any scientific or medical field. PMID:22584238
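
    A minimal sketch of the co-occurrence idea behind such a "cognome": counting how often pairs of terms appear together in abstracts. The term list and abstracts below are hypothetical; the study analyzed over 3.5 million real abstracts.

        # Hedged sketch: count co-occurrences of neuroscience terms across abstracts.
        # The term list and abstracts are hypothetical; the study mined millions.
        from collections import Counter
        from itertools import combinations

        terms = ["hippocampus", "memory", "dopamine", "parkinson", "prefrontal"]
        abstracts = [
            "The hippocampus supports episodic memory consolidation.",
            "Dopamine loss in Parkinson disease alters prefrontal function.",
            "Prefrontal-hippocampus interactions during working memory.",
        ]

        pair_counts = Counter()
        for text in abstracts:
            present = {t for t in terms if t in text.lower()}
            for pair in combinations(sorted(present), 2):
                pair_counts[pair] += 1

        print(pair_counts.most_common())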

  14. Laboratory Automation and Middleware.

    PubMed

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory.

  15. Automating checks of plan check automation.

    PubMed

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  16. Work and Programmable Automation.

    ERIC Educational Resources Information Center

    DeVore, Paul W.

    A new industrial era based on electronics and the microprocessor has arrived, an era that is being called intelligent automation. Intelligent automation, in the form of robots, replaces workers, and the new products, using microelectronic devices, require significantly less labor to produce than the goods they replace. The microprocessor thus…

  17. Automation and Cataloging.

    ERIC Educational Resources Information Center

    Furuta, Kenneth; And Others

    1990-01-01

    These three articles address issues in library cataloging that are affected by automation: (1) the impact of automation and bibliographic utilities on professional catalogers; (2) the effect of the LASS microcomputer software on the cost of authority work in cataloging at the University of Arizona; and (3) online subject heading and classification…

  18. Library Automation Style Guide.

    ERIC Educational Resources Information Center

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  19. More Benefits of Automation.

    ERIC Educational Resources Information Center

    Getz, Malcolm

    1988-01-01

    Describes a study that measured the benefits of an automated catalog and automated circulation system from the library user's point of view in terms of the value of time saved. Topics discussed include patterns of use, access time, availability of information, search behaviors, and the effectiveness of the measures used. (seven references)…

  20. Educating Archivists for Automation.

    ERIC Educational Resources Information Center

    Weber, Lisa B.

    1988-01-01

    Archivists indicate they want to learn more about automation in archives, the MARC AMC (Archival and Manuscripts Control) format, and emerging computer technologies; they look for educational opportunities through professional associations, publications, and college coursework; future archival automation education needs include standards, shared…

  1. Automation and robotics

    NASA Technical Reports Server (NTRS)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  2. Advances in inspection automation

    NASA Astrophysics Data System (ADS)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  3. Automation in Immunohematology

    PubMed Central

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-01-01

    There have been rapid technological advances in blood banking in South Asian region over the past decade with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation and major manufacturers in this field have come up with semi and fully automated equipments for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with lesser turnaround time for their ever increasing workload. This article discusses the various issues involved in the process. PMID:22988378

  6. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.

  8. Automation synthesis modules review.

    PubMed

    Boschi, S; Lodi, F; Malizia, C; Cicoria, G; Marengo, M

    2013-06-01

    The introduction of (68)Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived (68)Ge/(68)Ga generator has been at the basis of the development of (68)Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the need for careful radioprotection of operators have pushed for extensive automation of the production process. The development of automated systems for (68)Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of automation on regulatory compliance.

  9. Brain herniation

    MedlinePlus

    ... herniation; Uncal herniation; Subfalcine herniation; Tonsillar herniation; Herniation - brain ... Brain herniation occurs when something inside the skull produces pressure that moves brain tissues. This is most ...

  10. Automated Lattice Perturbation Theory

    SciTech Connect

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  11. Automated Pilot Advisory System

    NASA Technical Reports Server (NTRS)

    Parks, J. L., Jr.; Haidt, J. G.

    1981-01-01

    An Automated Pilot Advisory System (APAS) was developed and operationally tested to demonstrate the concept that low cost automated systems can provide air traffic and aviation weather advisory information at high density uncontrolled airports. The system was designed to enhance the see and be seen rule of flight, and pilots who used the system preferred it over the self announcement system presently used at uncontrolled airports.

  12. Automated Status Notification System

    NASA Technical Reports Server (NTRS)

    2005-01-01

    NASA Lewis Research Center's Automated Status Notification System (ASNS) was born out of need. To prevent "hacker attacks," Lewis' telephone system needed to monitor communications activities 24 hr a day, 7 days a week. With decreasing staff resources, this continuous monitoring had to be automated. By utilizing existing communications hardware, a UNIX workstation, and NAWK (a pattern scanning and processing language), we implemented a continuous monitoring system.

  13. Automated Groundwater Screening

    SciTech Connect

    Taylor, Glenn A.; Collard, Leonard, B.

    2005-10-31

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.
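
    A minimal sketch of the screening comparison described above: each nuclide's value is compared against a screening factor and flagged for further analysis if it fails. The factors and inventory below are hypothetical and do not reproduce the NCRP methodology or the report's databases.

        # Hedged sketch: flag nuclides whose value exceeds a screening factor.
        # All numbers are hypothetical and do not reproduce the NCRP methodology.
        screening_factors = {"H-3": 1.0e3, "Ge-68": 2.0e1, "Cs-137": 5.0e0}   # hypothetical limits
        inventory = {"H-3": 4.0e2, "Ge-68": 3.5e1, "Cs-137": 1.2e0}           # hypothetical values

        failed = [nuclide for nuclide, value in inventory.items()
                  if value / screening_factors[nuclide] >= 1.0]               # ratio >= 1 fails
        print("nuclides requiring further analysis:", failed)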

  14. Automated imagery orthorectification pilot

    NASA Astrophysics Data System (ADS)

    Slonecker, E. Terrence; Johnson, Brad; McMahon, Joe

    2009-10-01

    Automated orthorectification of raw image products is now possible based on the comprehensive metadata collected by Global Positioning Systems and Inertial Measurement Unit technology aboard aircraft and satellite digital imaging systems, and based on emerging pattern-matching and automated image-to-image and control point selection capabilities in many advanced image processing systems. Automated orthorectification of standard aerial photography is also possible if a camera calibration report and sufficient metadata is available. Orthorectification of historical imagery, for which only limited metadata was available, was also attempted and found to require some user input, creating a semi-automated process that still has significant potential to reduce processing time and expense for the conversion of archival historical imagery into geospatially enabled, digital formats, facilitating preservation and utilization of a vast archive of historical imagery. Over 90 percent of the frames of historical aerial photos used in this experiment were successfully orthorectified to the accuracy of the USGS 100K base map series utilized for the geospatial reference of the archive. The accuracy standard for the 100K series maps is approximately 167 feet (51 meters). The main problems associated with orthorectification failure were cloud cover, shadow and historical landscape change which confused automated image-to-image matching processes. Further research is recommended to optimize automated orthorectification methods and enable broad operational use, especially as related to historical imagery archives.

  15. BrainPrint: A Discriminative Characterization of Brain Morphology

    PubMed Central

    Wachinger, Christian; Golland, Polina; Kremen, William; Fischl, Bruce; Reuter, Martin

    2015-01-01

    We introduce BrainPrint, a compact and discriminative representation of brain morphology. BrainPrint captures shape information of an ensemble of cortical and subcortical structures by solving the eigenvalue problem of the 2D and 3D Laplace-Beltrami operator on triangular (boundary) and tetrahedral (volumetric) meshes. This discriminative characterization enables new ways to study the similarity between brains; the focus can either be on a specific brain structure of interest or on the overall brain similarity. We highlight four applications for BrainPrint in this article: (i) subject identification, (ii) age and sex prediction, (iii) brain asymmetry analysis, and (iv) potential genetic influences on brain morphology. The properties of BrainPrint require the derivation of new algorithms to account for the heterogeneous mix of brain structures with varying discriminative power. We conduct experiments on three datasets, including over 3000 MRI scans from the ADNI database, 436 MRI scans from the OASIS dataset, and 236 MRI scans from the VETSA twin study. All processing steps for obtaining the compact representation are fully automated, making this processing framework particularly attractive for handling large datasets. PMID:25613439
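
    A minimal sketch of the "spectrum as shape signature" idea: computing the low end of a Laplacian spectrum. BrainPrint solves the Laplace-Beltrami eigenvalue problem on triangular and tetrahedral meshes; the plain graph Laplacian below is only a stand-in for that operator, and the toy "mesh" is hypothetical.

        # Hedged sketch: a shape signature from the low end of a Laplacian spectrum.
        # A plain graph Laplacian over mesh edges stands in for the Laplace-Beltrami
        # operator used by BrainPrint; the toy "mesh" is a 50-vertex cycle.
        import numpy as np
        from scipy.sparse import coo_matrix

        def graph_laplacian_spectrum(n_vertices, edges, k=10):
            i, j = np.array(edges).T
            w = coo_matrix((np.ones(len(edges)), (i, j)), shape=(n_vertices, n_vertices))
            w = (w + w.T).toarray()                         # symmetric adjacency matrix
            lap = np.diag(w.sum(axis=1)) - w                # graph Laplacian L = D - W
            return np.sort(np.linalg.eigvalsh(lap))[:k]     # k smallest eigenvalues

        edges = [(v, (v + 1) % 50) for v in range(50)]      # hypothetical toy "mesh"
        print(graph_laplacian_spectrum(50, edges, k=6))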

  16. Optimized Brain Extraction for Pathological Brains (optiBET)

    PubMed Central

    Lutkenhoff, Evan S.; Rosenberg, Matthew; Chiang, Jeffrey; Zhang, Kunyu; Pickard, John D.; Owen, Adrian M.; Monti, Martin M.

    2014-01-01

    The study of structural and functional magnetic resonance imaging data has greatly benefitted from the development of sophisticated and efficient algorithms aimed at automating and optimizing the analysis of brain data. We address, in the context of the segmentation of brain from non-brain tissue (i.e., brain extraction, also known as skull-stripping), the tension between the increased theoretical and clinical interest in patient data, and the difficulty of conventional algorithms to function optimally in the presence of gross brain pathology. Indeed, because of the reliance of many algorithms on priors derived from healthy volunteers, images with gross pathology can severely affect their ability to correctly trace the boundaries between brain and non-brain tissue, potentially biasing subsequent analysis. We describe and make available an optimized brain extraction script for the pathological brain (optiBET) robust to the presence of pathology. Rather than attempting to trace the boundary between tissues, optiBET performs brain extraction by (i) calculating an initial approximate brain extraction; (ii) employing linear and non-linear registration to project the approximate extraction into the MNI template space; (iii) back-projecting a standard brain-only mask from template space to the subject’s original space; and (iv) employing the back-projected brain-only mask to mask-out non-brain tissue. The script results in up to 94% improvement of the quality of extractions over those obtained with conventional software across a large set of severely pathological brains. Since optiBET makes use of freely available algorithms included in FSL, it should be readily employable by anyone having access to such tools. PMID:25514672
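
    A hedged paraphrase of the four optiBET steps as FSL command-line calls driven from Python. This is not the released optiBET script; the flags and template paths reflect a standard FSL installation, and the input file name is hypothetical.

        # Hedged paraphrase of the four optiBET steps as FSL command-line calls.
        # Not the released optiBET script; flags and template paths reflect a
        # standard FSL installation, and the input file name is hypothetical.
        import os
        import subprocess

        def run(cmd):
            print(" ".join(cmd))
            subprocess.run(cmd, check=True)

        t1 = "subject_T1.nii.gz"                            # hypothetical input image
        std = os.path.join(os.environ.get("FSLDIR", "/usr/local/fsl"), "data", "standard")
        mni_head = os.path.join(std, "MNI152_T1_2mm.nii.gz")
        mni_brain = os.path.join(std, "MNI152_T1_2mm_brain.nii.gz")
        mni_mask = os.path.join(std, "MNI152_T1_2mm_brain_mask.nii.gz")

        run(["bet", t1, "approx_brain", "-f", "0.1"])                       # (i) rough extraction
        run(["flirt", "-in", "approx_brain", "-ref", mni_brain, "-omat", "aff.mat"])  # (ii) linear
        run(["fnirt", f"--in={t1}", f"--ref={mni_head}", "--aff=aff.mat", "--cout=warp_coef"])
        run(["invwarp", "-w", "warp_coef", "-o", "inv_warp", "-r", t1])     # (iii) invert the warp
        run(["applywarp", "-i", mni_mask, "-r", t1, "-w", "inv_warp",
             "-o", "brain_mask", "--interp=nn"])                            #      back-project mask
        run(["fslmaths", t1, "-mas", "brain_mask", "subject_T1_brain"])     # (iv) mask non-brain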

  17. Automated telescope scheduling

    NASA Technical Reports Server (NTRS)

    Johnston, Mark D.

    1988-01-01

    With the ever increasing level of automation of astronomical telescopes the benefits and feasibility of automated planning and scheduling are becoming more apparent. Improved efficiency and increased overall telescope utilization are the most obvious goals. Automated scheduling at some level has been done for several satellite observatories, but the requirements on these systems were much less stringent than on modern ground or satellite observatories. The scheduling problem is particularly acute for Hubble Space Telescope: virtually all observations must be planned in excruciating detail weeks to months in advance. Space Telescope Science Institute has recently made significant progress on the scheduling problem by exploiting state-of-the-art artificial intelligence software technology. What is especially interesting is that this effort has already yielded software that is well suited to scheduling groundbased telescopes, including the problem of optimizing the coordinated scheduling of more than one telescope.

  18. Materials Testing and Automation

    NASA Astrophysics Data System (ADS)

    Cooper, Wayne D.; Zweigoron, Ronald B.

    1980-07-01

    The advent of automation in materials testing has been in large part responsible for recent radical changes in the materials testing field: Tests virtually impossible to perform without a computer have become more straightforward to conduct. In addition, standardized tests may be performed with enhanced efficiency and repeatability. A typical automated system is described in terms of its primary subsystems — an analog station, a digital computer, and a processor interface. The processor interface links the analog functions with the digital computer; it includes data acquisition, command function generation, and test control functions. Features of automated testing are described with emphasis on calculated variable control, control of a variable that is computed by the processor and cannot be read directly from a transducer. Three calculated variable tests are described: a yield surface probe test, a thermomechanical fatigue test, and a constant-stress-intensity range crack-growth test. Future developments are discussed.

  19. Automated Factor Slice Sampling

    PubMed Central

    Tibbits, Matthew M.; Groendyke, Chris; Haran, Murali; Liechty, John C.

    2013-01-01

    Markov chain Monte Carlo (MCMC) algorithms offer a very general approach for sampling from arbitrary distributions. However, designing and tuning MCMC algorithms for each new distribution can be challenging and time consuming. It is particularly difficult to create an efficient sampler when there is strong dependence among the variables in a multivariate distribution. We describe a two-pronged approach for constructing efficient, automated MCMC algorithms: (1) we propose the “factor slice sampler”, a generalization of the univariate slice sampler where we treat the selection of a coordinate basis (factors) as an additional tuning parameter, and (2) we develop an approach for automatically selecting tuning parameters in order to construct an efficient factor slice sampler. In addition to automating the factor slice sampler, our tuning approach also applies to the standard univariate slice samplers. We demonstrate the efficiency and general applicability of our automated MCMC algorithm with a number of illustrative examples. PMID:24955002
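
    A minimal sketch of the standard univariate slice sampler (stepping out and shrinkage), the building block that the factor slice sampler generalizes. This is the textbook procedure, not the authors' automated tuning scheme.

        # Hedged sketch of the standard univariate slice sampler (stepping out and
        # shrinkage), which the factor slice sampler generalizes. Textbook procedure,
        # not the authors' automated tuning scheme.
        import random

        def slice_sample(log_density, x0, n_samples, w=1.0, max_steps=50):
            samples, x = [], x0
            for _ in range(n_samples):
                log_y = log_density(x) - random.expovariate(1.0)   # slice level log f(x) + log U
                left = x - w * random.random()                     # randomly placed initial interval
                right = left + w
                for _ in range(max_steps):                         # step the interval out
                    if log_density(left) <= log_y:
                        break
                    left -= w
                for _ in range(max_steps):
                    if log_density(right) <= log_y:
                        break
                    right += w
                while True:                                        # shrink until a point is accepted
                    x_new = random.uniform(left, right)
                    if log_density(x_new) > log_y:
                        x = x_new
                        break
                    if x_new < x:
                        left = x_new
                    else:
                        right = x_new
                samples.append(x)
            return samples

        draws = slice_sample(lambda x: -0.5 * x * x, x0=0.0, n_samples=1000)   # standard normal
        print(sum(draws) / len(draws))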

  20. Automation in medicinal chemistry.

    PubMed

    Reader, John C

    2004-01-01

    The implementation of appropriate automation can make a significant improvement in productivity at each stage of the drug discovery process, if it is incorporated into an efficient overall process. Automated chemistry has evolved rapidly from the 'combinatorial' techniques implemented in many industrial laboratories in the early 1990's which focused primarily on the hit discovery phase, and were highly dependent on solid-phase techniques and instrumentation derived from peptide synthesis. Automated tools and strategies have been developed which can impact the hit discovery, hit expansion and lead optimization phases, not only in synthesis, but also in reaction optimization, work-up, and purification of compounds. This article discusses the implementation of some of these techniques, based especially on experiences at Millennium Pharmaceuticals Research and Development Ltd.

  1. Automated Camera Calibration

    NASA Technical Reports Server (NTRS)

    Chen, Siqi; Cheng, Yang; Willson, Reg

    2006-01-01

    Automated Camera Calibration (ACAL) is a computer program that automates the generation of calibration data for camera models used in machine vision systems. Machine vision camera models describe the mapping between points in three-dimensional (3D) space in front of the camera and the corresponding points in two-dimensional (2D) space in the camera's image. Calibrating a camera model requires a set of calibration data containing known 3D-to-2D point correspondences for the given camera system. Generating calibration data typically involves taking images of a calibration target where the 3D locations of the target's fiducial marks are known, and then measuring the 2D locations of the fiducial marks in the images. ACAL automates the analysis of calibration target images and greatly speeds the overall calibration process.
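
    A minimal sketch of generating camera-model calibration data from checkerboard target images with OpenCV, in the spirit of the 3D-to-2D correspondences described above. ACAL itself is not reproduced here; the image paths and board geometry are hypothetical assumptions.

        # Hedged sketch: calibration data from checkerboard images with OpenCV.
        # ACAL itself is not reproduced here; image paths and board geometry are
        # hypothetical assumptions.
        import glob

        import cv2
        import numpy as np

        board_cols, board_rows, square_mm = 9, 6, 25.0
        # Known 3D fiducial locations on the flat target (z = 0 plane).
        obj_template = np.zeros((board_rows * board_cols, 3), np.float32)
        obj_template[:, :2] = np.mgrid[0:board_cols, 0:board_rows].T.reshape(-1, 2) * square_mm

        object_points, image_points, image_size = [], [], None
        for path in glob.glob("calib_images/*.png"):               # hypothetical image set
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, (board_cols, board_rows))
            if found:                                              # measured 2D fiducial locations
                object_points.append(obj_template)
                image_points.append(corners)
                image_size = gray.shape[::-1]

        if image_points:
            rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
                object_points, image_points, image_size, None, None)
            print("reprojection RMS error:", rms)
            print("intrinsic matrix:", K, sep="\n")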

  2. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R. L.; Veatch, M.

    1983-01-01

    Generic power-system elements and their potential faults are identified. Automation functions and their resulting benefits are defined and automation functions between power subsystem, central spacecraft computer, and ground flight-support personnel are partitioned. All automation activities were categorized as data handling, monitoring, routine control, fault handling, planning and operations, or anomaly handling. Incorporation of all these classes of tasks, except for anomaly handling, in power subsystem hardware and software was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are long mission lifetime, modular growth, high-performance flexibility, a need to accommodate different electrical user-load equipment, onorbit assembly/maintenance/servicing, and potentially large number of power subsystem components. A significant effort in algorithm development and validation is essential in meeting the 1987 technology readiness date for the space station.

  3. Automated fiber pigtailing technology

    NASA Astrophysics Data System (ADS)

    Strand, O. T.; Lowry, M. E.; Lu, S. Y.; Nelson, D. C.; Nikkel, D. J.; Pocha, M. D.; Young, K. D.

    1994-02-01

    The high cost of optoelectronic (OE) devices is due mainly to the labor-intensive packaging process. Manually pigtailing such devices as single-mode laser diodes and modulators is very time consuming with poor quality control. The Photonics Program and the Engineering Research Division at LLNL are addressing several issues associated with automatically packaging OE devices. A fully automated system must include high-precision fiber alignment, fiber attachment techniques, in-situ quality control, and parts handling and feeding. This paper will present on-going work at LLNL in the areas of automated fiber alignment and fiber attachment. For the fiber alignment, we are building an automated fiber pigtailing machine (AFPM) which combines computer vision and object recognition algorithms with active feedback to perform sub-micron alignments of single-mode fibers to modulators and laser diodes. We expect to perform sub-micron alignments in less than five minutes with this technology. For fiber attachment, we are building various geometries of silicon microbenches which include on-board heaters to solder metal-coated fibers and other components in place; these designs are completely compatible with an automated process of OE packaging. We have manually attached a laser diode, a thermistor, and a thermo-electric heater to one of our microbenches in less than 15 minutes using the on-board heaters for solder reflow; an automated process could perform this same exercise in only a few minutes. Automated packaging techniques such as these will help lower the costs of OE devices.
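
    A minimal sketch of an active-feedback alignment loop of the kind described above: nudge the stage along each axis and keep only moves that increase coupled power. The stage and photodetector below are simulated stand-ins for real hardware, not LLNL's AFPM.

        # Hedged sketch of an active-feedback alignment loop: nudge each axis and
        # keep only moves that increase coupled power. The stage and detector are
        # simulated stand-ins for real hardware, not LLNL's AFPM.
        import math

        position = {"x": 3.0, "y": -2.0, "z": 1.5}          # simulated misalignment (um)

        def measure_power():
            r2 = sum(v * v for v in position.values())      # simulated Gaussian coupling peak
            return math.exp(-r2 / 4.0)

        def move_stage(axis, delta_um):
            position[axis] += delta_um                      # real code would command a positioner

        def align(step_um=0.25, passes=40):
            best = measure_power()
            for _ in range(passes):
                for axis in ("x", "y", "z"):
                    for direction in (+1.0, -1.0):
                        move_stage(axis, direction * step_um)
                        power = measure_power()
                        if power > best:
                            best = power                    # keep the improving move
                        else:
                            move_stage(axis, -direction * step_um)   # undo it
            return best

        print(f"coupled power after alignment: {align():.3f}")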

  4. Automated gas chromatography

    DOEpatents

    Mowry, Curtis D.; Blair, Dianna S.; Rodacy, Philip J.; Reber, Stephen D.

    1999-01-01

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute.

  5. Ground based automated telescope

    SciTech Connect

    Colgate, S.A.; Thompson, W.

    1980-01-01

    Recommendation that a ground-based automated telescope of the 2-meter class be built for remote multiuser use as a national facility. Experience dictates that a primary consideration is a time-shared multitasking operating system with virtual memory, overlaid with a real-time priority interrupt. The primary user facility is a remote terminal networked to the single computer. Many users must have simultaneous time-shared access to the computer for program development. The telescope should be rapid-slewing, and hence of lightweight construction. Automation allows for closed-loop correction of pointing errors, independent of extreme accuracy of the mount.

  6. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library or program data base with methods for browsing the designs stored; a system for graphical specification of designs including a capability for hierarchical refinement and definition in a graphical design system; and an automated code generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.

  7. Automating the CMS DAQ

    SciTech Connect

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  8. Automated knowledge generation

    NASA Technical Reports Server (NTRS)

    Myler, Harley R.; Gonzalez, Avelino J.

    1988-01-01

    The general objectives of the NASA/UCF Automated Knowledge Generation Project were the development of an intelligent software system that could access CAD design data bases, interpret them, and generate a diagnostic knowledge base in the form of a system model. The initial area of concentration is in the diagnosis of the process control system using the Knowledge-based Autonomous Test Engineer (KATE) diagnostic system. A secondary objective was the study of general problems of automated knowledge generation. A prototype was developed, based on object-oriented language (Flavors).

  9. Automation of analytical isotachophoresis

    NASA Technical Reports Server (NTRS)

    Thormann, Wolfgang

    1985-01-01

    The basic features of automation of analytical isotachophoresis (ITP) are reviewed. Experimental setups consisting of narrow bore tubes which are self-stabilized against thermal convection are considered. Sample detection in free solution is discussed, listing the detector systems presently used or expected to be of potential use in the near future. The combination of a universal detector measuring the evolution of ITP zone structures with detector systems specific to desired components is proposed as a concept of an automated chemical analyzer based on ITP. Possible miniaturization of such an instrument by means of microlithographic techniques is discussed.

  10. Human Factors In Aircraft Automation

    NASA Technical Reports Server (NTRS)

    Billings, Charles

    1995-01-01

    Report presents survey of state of art in human factors in automation of aircraft operation. Presents examination of aircraft automation and effects on flight crews in relation to human error and aircraft accidents.

  11. Brain Tumors

    MedlinePlus

    A brain tumor is a growth of abnormal cells in the tissues of the brain. Brain tumors can be benign, with no cancer cells, ... cancer cells that grow quickly. Some are primary brain tumors, which start in the brain. Others are ...

  12. Automated Student Model Improvement

    ERIC Educational Resources Information Center

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  13. Library Automation: An Overview.

    ERIC Educational Resources Information Center

    Saffady, William

    1989-01-01

    Surveys the current state of computer applications in six areas of library work: circulation control; descriptive cataloging; catalog maintenance and production; reference services; acquisitions; and serials control. Motives for automation are discussed, and examples of representative vendors, products, and services are given. (15 references) (LRW)

  14. Automation in haemostasis.

    PubMed

    Huber, A R; Méndez, A; Brunner-Agten, S

    2013-01-01

    Automatia, an ancient Greek goddess of luck who makes things happen by themselves and of her own will, without human engagement, is present in our daily life in the medical laboratory. Automation has been introduced and perfected by clinical chemistry and has since expanded into other fields such as haematology, immunology, molecular biology and also coagulation testing. The initial small and relatively simple standalone instruments have been replaced by more complex systems that allow for multitasking. Integration of automated coagulation testing into total laboratory automation has become possible in recent years. Automation has many strengths and opportunities if weaknesses and threats are respected. On the positive side, standardization, reduction of errors, reduction of cost and increase of throughput are clearly beneficial. Dependence on manufacturers, high initiation cost and somewhat expensive maintenance are less favourable factors. The modern laboratory, and especially today's laboratory technicians and academic personnel, do not add value for the doctor and his patients by spending lots of time behind the machines. In the future the lab needs to contribute at the bedside, suggesting laboratory testing and providing support and interpretation of the results obtained. The human factor will continue to play an important role in testing in haemostasis, yet under different circumstances.

  15. Building Automation Systems.

    ERIC Educational Resources Information Center

    Honeywell, Inc., Minneapolis, Minn.

    A number of different automation systems for use in monitoring and controlling building equipment are described in this brochure. The system functions include--(1) collection of information, (2) processing and display of data at a central panel, and (3) taking corrective action by sounding alarms, making adjustments, or automatically starting and…

  16. Automated CCTV Tester

    2000-09-13

    The purpose of an automated CCTV tester is to automatically and continuously monitor multiple perimeter security cameras for changes in a camera's measured resolution and alignment (camera looking at the proper area). It shall track and record the image quality and position of each camera and produce an alarm when a camera is out of specification.

  17. Blastocyst microinjection automation.

    PubMed

    Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly

    2009-09-01

    Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experiments conducted with this new system, with operator assistance during the cell delivery phase, demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.

  18. Library Automation in Australia.

    ERIC Educational Resources Information Center

    Blank, Karen L.

    1984-01-01

    Discussion of Australia's move toward library automation highlights development of a national bibliographic network, local and regional cooperation, integrated library systems, telecommunications, and online systems, as well as microcomputer usage, ergonomics, copyright issues, and national information policy. Information technology plans of the…

  19. Automated Management Of Documents

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1995-01-01

    Report presents main technical issues involved in computer-integrated documentation. Problems associated with automation of management and maintenance of documents analyzed from perspectives of artificial intelligence and human factors. Technologies that may prove useful in computer-integrated documentation reviewed: these include conventional approaches to indexing and retrieval of information, use of hypertext, and knowledge-based artificial-intelligence systems.

  20. Mining Your Automated System.

    ERIC Educational Resources Information Center

    Larsen, Patricia M., Ed.; And Others

    1996-01-01

    Four articles address issues of collecting, compiling, reporting, and interpreting statistics generated by automated library systems for administrative decision making. Topics include using a management information system to forecast growth and assess areas for downsizing; statistics for collection development and analysis; and online system…

  1. Automated conflict resolution issues

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  2. Automating Food Service.

    ERIC Educational Resources Information Center

    Kavulla, Timothy A.

    1986-01-01

    The Wichita, Kansas, Public Schools' Food Service Department Project Reduction in Paperwork (RIP) is designed to automate certain paperwork functions, thus reducing cost and flow of paper. This article addresses how RIP manages free/reduced meal applications and meets the objectives of reducing paper and increasing accuracy, timeliness, and…

  3. Automated Estimating System (AES)

    SciTech Connect

    Holder, D.A.

    1989-09-01

    This document describes Version 3.1 of the Automated Estimating System, a personal computer-based software package designed to aid in the creation, updating, and reporting of project cost estimates for the Estimating and Scheduling Department of the Martin Marietta Energy Systems Engineering Division. Version 3.1 of the Automated Estimating System is capable of running in a multiuser environment across a token ring network. The token ring network makes possible services and applications that will more fully integrate all aspects of information processing, provides a central area for large data bases to reside, and allows access to the data base by multiple users. Version 3.1 of the Automated Estimating System also has been enhanced to include an Assembly pricing data base that may be used to retrieve cost data into an estimate. A WBS Title File program has also been included in Version 3.1. The WBS Title File program allows for the creation of a WBS title file that has been integrated with the Automated Estimating System to provide WBS titles in update mode and in reports. This provides for consistency in WBS titles and provides the capability to display WBS titles on reports generated at a higher WBS level.

  4. Automated Administrative Data Bases

    NASA Technical Reports Server (NTRS)

    Marrie, M. D.; Jarrett, J. R.; Reising, S. A.; Hodge, J. E.

    1984-01-01

    Improved productivity and more effective response to information requirements for internal management, NASA Centers, and Headquarters resulted from using automated techniques. Modules developed to provide information on manpower, RTOPS, full time equivalency, and physical space reduced duplication, increased communication, and saved time. There is potential for greater savings by sharing and integrating with those who have the same requirements.

  5. Automating Small Libraries.

    ERIC Educational Resources Information Center

    Swan, James

    1996-01-01

    Presents a four-phase plan for small libraries strategizing for automation: inventory and weeding, data conversion, implementation, and enhancements. Other topics include selecting a system, MARC records, compatibility, ease of use, industry standards, searching capabilities, support services, system security, screen displays, circulation modules,…

  6. CLAN Automation Plan.

    ERIC Educational Resources Information Center

    Nevada State Library and Archives, Carson City.

    The Central Libraries Automated Network (CLAN) of Nevada is a cooperative system which shares circulation, cataloging, and acquisitions systems and numerous online databases. Its mission is to provide public access to information and efficient library administration through shared computer systems, databases, and telecommunications. This document…

  7. Automated EEG acquisition

    NASA Technical Reports Server (NTRS)

    Frost, J. D., Jr.; Hillman, C. E., Jr.

    1977-01-01

    Automated self-contained portable device can be used by technicians with minimal training. Data acquired from patient at remote site are transmitted to centralized interpretation center using conventional telephone equipment. There, diagnostic information is analyzed, and results are relayed back to remote site.

  8. Automated Essay Scoring

    ERIC Educational Resources Information Center

    Dikli, Semire

    2006-01-01

    The impacts of computers on writing have been widely studied for three decades. Even basic computers functions, i.e. word processing, have been of great assistance to writers in modifying their essays. The research on Automated Essay Scoring (AES) has revealed that computers have the capacity to function as a more effective cognitive tool (Attali,…

  9. Brain surgery

    MedlinePlus

    Craniotomy; Surgery - brain; Neurosurgery; Craniectomy; Stereotactic craniotomy; Stereotactic brain biopsy; Endoscopic craniotomy ... cut depends on where the problem in the brain is located. The surgeon creates a hole in ...

  10. Brain Malformations

    MedlinePlus

    Most brain malformations begin long before a baby is born. Something damages the developing nervous system or causes it ... medicines, infections, or radiation during pregnancy interferes with brain development. Parts of the brain may be missing, ...

  11. Differentiation of sCJD and vCJD forms by automated analysis of basal ganglia intensity distribution in multisequence MRI of the brain--definition and evaluation of new MRI-based ratios.

    PubMed

    Linguraru, Marius George; Ayache, Nicholas; Bardinet, Eric; Ballester, Miguel Angel González; Galanaud, Damien; Haïk, Stéphane; Faucheux, Baptiste; Hauw, Jean-Jacques; Cozzone, Patrick; Dormont, Didier; Brandel, Jean-Philippe

    2006-08-01

    We present a method for the analysis of basal ganglia (including the thalamus) for accurate detection of human spongiform encephalopathy in multisequence magnetic resonance imaging (MRI) of the brain. One common feature of most forms of prion protein diseases is the appearance of hyperintensities in the deep grey matter area of the brain in T2-weighted magnetic resonance (MR) images. We employ T1, T2, and FLAIR-T2 MR sequences for the detection of intensity deviations in the internal nuclei. First, the MR data are registered to a probabilistic atlas and normalized in intensity. Then smoothing is applied with edge enhancement. The segmentation of hyperintensities is performed using a model of the human visual system. For more accurate results, a priori anatomical data from a segmented atlas are employed to refine the registration and remove false positives. The results are robust across the patient data and in accordance with the clinical ground truth. Our method further allows the quantification of intensity distributions in basal ganglia. The caudate nuclei are highlighted as the main areas for diagnosis of sporadic Creutzfeldt-Jakob Disease (sCJD), in agreement with the histological data. The algorithm classified the abnormal signal intensities in the FLAIR images of sCJD patients, showing stronger hypersignal in the caudate nuclei (10/10) and putamen (6/10) than in the thalami. By defining normalized MRI measures of the intensity relations between the internal grey nuclei, we robustly differentiate sCJD and variant CJD (vCJD) patients, a step toward an automatic classification tool for human spongiform encephalopathies.
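
    As a rough illustration of the final quantification step described above, the Python sketch below computes mean normalized intensities inside nucleus masks and their ratios to the thalamus. It assumes registration, intensity normalization, and mask generation have already been done; the function name and the toy random data are hypothetical.

      import numpy as np

      def nuclei_intensity_ratios(flair, masks):
          """Mean normalized intensity per nucleus and caudate/putamen-to-thalamus ratios."""
          means = {name: float(flair[mask].mean()) for name, mask in masks.items()}
          return {
              "means": means,
              "caudate_over_thalamus": means["caudate"] / means["thalamus"],
              "putamen_over_thalamus": means["putamen"] / means["thalamus"],
          }

      # Toy example: random data standing in for a registered, normalized FLAIR volume.
      rng = np.random.default_rng(0)
      flair = rng.normal(100, 10, size=(32, 32, 32))
      masks = {n: rng.random((32, 32, 32)) > 0.97 for n in ("caudate", "putamen", "thalamus")}
      print(nuclei_intensity_ratios(flair, masks))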

  12. Mapping brain circuitry with a light microscope

    PubMed Central

    Osten, Pavel; Margrie, Troy W.

    2014-01-01

    The beginning of the 21st century has seen a renaissance in light microscopy and anatomical tract tracing that together are rapidly advancing our understanding of the form and function of neuronal circuits. The introduction of instruments for automated imaging of whole mouse brains, new cell type-specific and transsynaptic tracers, and computational methods for handling the whole-brain datasets has opened the door to neuroanatomical studies at an unprecedented scale. We present an overview of the state of play and future opportunities in charting long-range and local connectivity in the entire mouse brain and in linking brain circuits to function. PMID:23722211

  13. Automated gas chromatography

    DOEpatents

    Mowry, C.D.; Blair, D.S.; Rodacy, P.J.; Reber, S.D.

    1999-07-13

    An apparatus and process for the continuous, near real-time monitoring of low-level concentrations of organic compounds in a liquid, and, more particularly, a water stream. A small liquid volume of flow from a liquid process stream containing organic compounds is diverted by an automated process to a heated vaporization capillary where the liquid volume is vaporized to a gas that flows to an automated gas chromatograph separation column to chromatographically separate the organic compounds. Organic compounds are detected and the information transmitted to a control system for use in process control. Concentrations of organic compounds less than one part per million are detected in less than one minute. 7 figs.
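
    The patent describes an apparatus; the control-side logic ("detect organic compounds and transmit the information to a control system") can be pictured with the minimal Python sketch below. The reader function, compound names, and concentration limits are illustrative assumptions, not part of the patent.

      import time

      LIMITS_PPM = {"benzene": 0.5, "toluene": 1.0}  # hypothetical alarm limits

      def evaluate_peaks(peaks_ppm, limits=LIMITS_PPM):
          """Return the compounds whose measured concentration exceeds its limit."""
          return {c: v for c, v in peaks_ppm.items() if v > limits.get(c, float("inf"))}

      def monitor(read_chromatogram_peaks, cycles=3, period_s=60):
          for _ in range(cycles):
              exceeded = evaluate_peaks(read_chromatogram_peaks())
              if exceeded:
                  print("ALERT to process control:", exceeded)
              time.sleep(period_s)

      # Example with a fake reader standing in for the instrument.
      fake = iter([{"benzene": 0.2}, {"benzene": 0.7, "toluene": 0.3}, {"toluene": 0.1}])
      monitor(lambda: next(fake), cycles=3, period_s=0)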

  14. Automated theorem proving.

    PubMed

    Plaisted, David A

    2014-03-01

    Automated theorem proving is the use of computers to prove or disprove mathematical or logical statements. Such statements can express properties of hardware or software systems, or facts about the world that are relevant for applications such as natural language processing and planning. A brief introduction to propositional and first-order logic is given, along with some of the main methods of automated theorem proving in these logics. These methods of theorem proving include resolution, Davis and Putnam-style approaches, and others. Methods for handling the equality axioms are also presented. Methods of theorem proving in propositional logic are presented first, and then methods for first-order logic. WIREs Cogn Sci 2014, 5:115-128. doi: 10.1002/wcs.1269 CONFLICT OF INTEREST: The author has declared no conflicts of interest for this article. For further resources related to this article, please visit the WIREs website. PMID:26304304
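
    To make the propositional-logic part of this survey concrete, here is a tiny Davis-Putnam-Logemann-Loveland (DPLL) style satisfiability check, one of the method families the article covers. The clause encoding (signed integers as literals) and the naive branching are simplifications for illustration, not the article's own presentation.

      def dpll(clauses, assignment=()):
          """Return a satisfying tuple of literals for CNF clauses, or None if unsatisfiable."""
          clauses = [c for c in clauses if not any(lit in assignment for lit in c)]  # satisfied
          clauses = [tuple(l for l in c if -l not in assignment) for c in clauses]   # prune
          if not clauses:
              return assignment        # every clause satisfied
          if any(len(c) == 0 for c in clauses):
              return None              # empty clause: contradiction
          lit = clauses[0][0]          # branch on the first remaining literal
          return dpll(clauses, assignment + (lit,)) or dpll(clauses, assignment + (-lit,))

      print(dpll([(1, 2), (-1, 2), (-2, 3)]))  # satisfiable, e.g. (1, 2, 3)
      print(dpll([(1,), (-1,)]))               # unsatisfiable -> None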

  15. Automated macromolecular crystallization screening

    DOEpatents

    Segelke, Brent W.; Rupp, Bernhard; Krupka, Heike I.

    2005-03-01

    An automated macromolecular crystallization screening system wherein a multiplicity of reagent mixes are produced. A multiplicity of analysis plates is produced utilizing the reagent mixes combined with a sample. The analysis plates are incubated to promote growth of crystals. Images of the crystals are made. The images are analyzed with regard to suitability of the crystals for analysis by x-ray crystallography. A design of reagent mixes is produced based upon the expected suitability of the crystals for analysis by x-ray crystallography. A second multiplicity of mixes of the reagent components is produced utilizing the design and a second multiplicity of reagent mixes is used for a second round of automated macromolecular crystallization screening. In one embodiment the multiplicity of reagent mixes are produced by a random selection of reagent components.
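
    A rough Python sketch of the loop described in the abstract: build a first screen by random selection of reagent components, then bias a second round toward components seen in conditions that produced crystals. The component lists, scores, and hit criterion are illustrative assumptions, not the patent's chemistry.

      import random

      PRECIPITANTS = ["PEG 3350", "PEG 8000", "ammonium sulfate", "MPD"]
      BUFFERS      = ["HEPES pH 7.5", "Tris pH 8.5", "citrate pH 5.5"]
      SALTS        = ["NaCl", "MgCl2", "Li2SO4"]

      def random_screen(n, rng):
          return [(rng.choice(PRECIPITANTS), rng.choice(BUFFERS), rng.choice(SALTS))
                  for _ in range(n)]

      def next_round(screen, crystal_scores, n, rng):
          """Bias the next screen toward components that appeared in scoring conditions."""
          hits = {comp for cond, score in zip(screen, crystal_scores) if score > 0
                  for comp in cond}
          def pick(pool):
              favoured = [c for c in pool if c in hits]
              return rng.choice(favoured or pool)
          return [(pick(PRECIPITANTS), pick(BUFFERS), pick(SALTS)) for _ in range(n)]

      rng = random.Random(1)
      screen = random_screen(6, rng)
      scores = [0, 1, 0, 0, 2, 0]  # hypothetical image-analysis crystal scores
      print(next_round(screen, scores, 6, rng))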

  16. Automated breeder fuel fabrication

    SciTech Connect

    Goldmann, L.H.; Frederickson, J.R.

    1983-09-01

    The objective of the Secure Automated Fabrication (SAF) Project is to develop remotely operated equipment for the processing and manufacturing of breeder reactor fuel pins. The SAF line will be installed in the Fuels and Materials Examination Facility (FMEF). The FMEF is presently under construction at the Department of Energy's (DOE) Hanford site near Richland, Washington, and is operated by the Westinghouse Hanford Company (WHC). The fabrication and support systems of the SAF line are designed for computer-controlled operation from a centralized control room. Remote and automated fuel fabrication operations will result in reduced radiation exposure to workers; enhanced safeguards; improved product quality; near real-time accountability; and increased productivity. The present schedule calls for installation of SAF line equipment in the FMEF beginning in 1984, with qualifying runs starting in 1986 and production commencing in 1987. 5 figures.

  17. The automation of science.

    PubMed

    King, Ross D; Rowland, Jem; Oliver, Stephen G; Young, Michael; Aubrey, Wayne; Byrne, Emma; Liakata, Maria; Markham, Magdalena; Pir, Pinar; Soldatova, Larisa N; Sparkes, Andrew; Whelan, Kenneth E; Clare, Amanda

    2009-04-01

    The basis of science is the hypothetico-deductive method and the recording of experiments in sufficient detail to enable reproducibility. We report the development of Robot Scientist "Adam," which advances the automation of both. Adam has autonomously generated functional genomics hypotheses about the yeast Saccharomyces cerevisiae and experimentally tested these hypotheses by using laboratory automation. We have confirmed Adam's conclusions through manual experiments. To describe Adam's research, we have developed an ontology and logical language. The resulting formalization involves over 10,000 different research units in a nested treelike structure, 10 levels deep, that relates the 6.6 million biomass measurements to their logical description. This formalization describes how a machine contributed to scientific knowledge. PMID:19342587

  18. Compact reactor design automation

    NASA Technical Reports Server (NTRS)

    Nassersharif, Bahram; Gaeta, Michael J.

    1991-01-01

    A conceptual compact reactor design automation experiment was performed using the real-time expert system G2. The purpose of this experiment was to investigate the utility of an expert system in design; in particular, reactor design. The experiment consisted of the automation and integration of two design phases: reactor neutronic design and fuel pin design. The utility of this approach is shown using simple examples of formulating rules to ensure design parameter consistency between the two design phases. The ability of G2 to communicate with external programs, even across networks, allows the system to supplement its knowledge-processing features with conventional canned programs, with possible applications to realistic iterative design tools.

  19. Automated campaign system

    NASA Astrophysics Data System (ADS)

    Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere

    2006-02-01

    Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, to acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, driving production and fulfillment, and evaluating results, is currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from their desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted will be how the complexity of running a targeted campaign is hidden from the user through technology, all while providing the benefits of a professionally managed campaign.

  20. Automated assembly in space

    NASA Technical Reports Server (NTRS)

    Srivastava, Sandanand; Dwivedi, Suren N.; Soon, Toh Teck; Bandi, Reddy; Banerjee, Soumen; Hughes, Cecilia

    1989-01-01

    The installation of robots and their use for assembly in space will create an exciting and promising future for the U.S. Space Program. Assembly in space is complicated and error-prone, and it is not possible unless the various parts and modules are suitably designed for automation. Guidelines are developed for part design and for easy precision assembly. Major design problems associated with automated assembly are considered, and solutions to these problems are evaluated in the guidelines format. Methods for gripping and for part feeding are developed with regard to the absence of gravity in space. Guidelines for part orientation, adjustments, compliances, and various assembly constructions are discussed. Design modifications of various fasteners and fastening methods are also investigated.

  1. Automated Testing System

    2006-05-09

    ATS is a Python-language program for automating test suites for software programs that do not interact with their users, such as scripted scientific simulations. ATS features a decentralized approach especially suited to larger projects. In its multinode mode it can utilize many nodes of a cluster in order to run many tests in parallel. It has features for submitting longer-running tests to a batch system and would have to be customized for use elsewhere.
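
    ATS's own interface is not shown in this record, so the sketch below is not ATS itself; it only illustrates the underlying idea of running non-interactive scripted tests in parallel and collecting pass/fail results, using the Python standard library. The example commands are hypothetical stand-ins for scripted simulations.

      import subprocess
      from concurrent.futures import ProcessPoolExecutor

      def run_test(cmd):
          """Run one test command; a zero exit status counts as a pass."""
          proc = subprocess.run(cmd, capture_output=True, text=True)
          return cmd, proc.returncode == 0

      def run_suite(commands, workers=4):
          with ProcessPoolExecutor(max_workers=workers) as pool:
              results = list(pool.map(run_test, commands))
          for cmd, ok in results:
              print("PASS" if ok else "FAIL", " ".join(cmd))
          return all(ok for _, ok in results)

      if __name__ == "__main__":
          suite = [["python", "-c", "assert 1 + 1 == 2"],
                   ["python", "-c", "import math; assert math.pi > 3"]]
          run_suite(suite, workers=2)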

  2. Power subsystem automation study

    NASA Technical Reports Server (NTRS)

    Tietz, J. C.; Sewy, D.; Pickering, C.; Sauers, R.

    1984-01-01

    The purpose of the phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert systems software was investigated in this study. This effort resulted in the demonstration of prototype expert system software for managing one aspect of a simulated space station power subsystem.

  3. Cavendish Balance Automation

    NASA Technical Reports Server (NTRS)

    Thompson, Bryan

    2000-01-01

    This is the final report for a project carried out to modify a manual commercial Cavendish Balance for automated use in a cryostat. The scope of this project was to modify an off-the-shelf manually operated Cavendish Balance to allow for automated operation for periods of hours or days in a cryostat. The purpose of this modification was to allow the balance to be used in the study of the effects of superconducting materials on the local gravitational field strength, to determine if the strength of gravitational fields can be reduced. A Cavendish Balance was chosen because it is a fairly simple piece of equipment for measuring gravity, one of the least accurately known and least understood physical constants. The principal activities that occurred under this purchase order were: (1) All the components necessary to hold and automate the Cavendish Balance in a cryostat were designed. Engineering drawings were made of custom parts to be fabricated, and other off-the-shelf parts were procured; (2) Software was written in LabView to control the automation process via a stepper motor controller and stepper motor, and to collect data from the balance during testing; (3) Software was written to take the data collected from the Cavendish Balance and reduce it to give a value for the gravitational constant; (4) The components of the system were assembled and fitted to a cryostat. Also, the LabView hardware, including the control computer, stepper motor driver, data collection boards, and necessary cabling, was assembled; and (5) The system was operated for a number of periods, and the data were collected and reduced to give an average value for the gravitational constant.

  4. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Development of the automated microbial metabolism laboratory (AMML) concept is reported. The focus of effort of AMML was on the advanced labeled release experiment. Labeled substrates, inhibitors, and temperatures were investigated to establish a comparative biochemical profile. Profiles at three time intervals on soil and pure cultures of bacteria isolated from soil were prepared to establish a complete library. The development of a strategy for the return of a soil sample from Mars is also reported.

  5. Automated Cooperative Trajectories

    NASA Technical Reports Server (NTRS)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  6. Automation in biological crystallization.

    PubMed

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  7. Automation in biological crystallization

    PubMed Central

    Shaw Stewart, Patrick; Mueller-Dieckmann, Jochen

    2014-01-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given. PMID:24915074

  8. Automation in organizations: Eternal conflict

    NASA Technical Reports Server (NTRS)

    Dieterly, D. L.

    1981-01-01

    Some ideas on and insights into the problems associated with automation in organizations are presented with emphasis on the concept of automation, its relationship to the individual, and its impact on system performance. An analogy is drawn, based on an American folk hero, to emphasize the extent of the problems encountered when dealing with automation within an organization. A model is proposed to focus attention on a set of appropriate dimensions. The function allocation process becomes a prominent aspect of the model. The current state of automation research is mentioned in relation to the ideas introduced. Proposed directions for an improved understanding of automation's effect on the individual's efficiency are discussed. The importance of understanding the individual's perception of the system in terms of the degree of automation is highlighted.

  9. AUTOMATION FOR THE SYNTHESIS AND APPLICATION OF PET RADIOPHARMACEUTICALS.

    SciTech Connect

    Alexoff, D.L.

    2001-09-21

    The development of automated systems supporting the production and application of PET radiopharmaceuticals has been an important focus of researchers since the first successes of using carbon-11 (Comar et al., 1979) and fluorine-18 (Reivich et al., 1979) labeled compounds to visualize functional activity of the human brain. These initial successes of imaging the human brain soon led to applications in the human heart (Schelbert et al., 1980), and quickly radiochemists began to see the importance of automation to support PET studies in humans (Lambrecht, 1982; Langstrom et al., 1983). Driven by the necessity of controlling processes that emit high fluxes of 511 keV photons, and by the tedium of the repetitive syntheses required for these human PET investigations, academic and government scientists have designed, developed and tested many useful and novel automated systems in the past twenty years. These systems, originally designed primarily by radiochemists, not only effectively carry out the tasks they were designed for, but also demonstrate significant engineering innovation in the field of laboratory automation.

  10. [Automated anesthesia record system].

    PubMed

    Zhu, Tao; Liu, Jin

    2005-12-01

    Based on a client/server architecture, an automated anesthesia record system running under the Windows operating system on a network has been developed and programmed with Microsoft Visual C++ 6.0, Visual Basic 6.0, and SQL Server. The system manages the patient's information throughout anesthesia. It automatically collects and integrates, in real time, data from several kinds of medical equipment such as monitors, infusion pumps, and anesthesia machines. The system then generates the anesthesia record sheets automatically. The record system makes the anesthesia record more accurate and complete and can improve the anesthesiologist's working efficiency.

  11. Automated fiber pigtailing machine

    DOEpatents

    Strand, O.T.; Lowry, M.E.

    1999-01-05

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems. 26 figs.

  12. Automated fiber pigtailing machine

    DOEpatents

    Strand, Oliver T.; Lowry, Mark E.

    1999-01-01

    The Automated Fiber Pigtailing Machine (AFPM) aligns and attaches optical fibers to optoelectronic (OE) devices such as laser diodes, photodiodes, and waveguide devices without operator intervention. The so-called pigtailing process is completed with sub-micron accuracies in less than 3 minutes. The AFPM operates unattended for one hour, is modular in design and is compatible with a mass production manufacturing environment. This machine can be used to build components which are used in military aircraft navigation systems, computer systems, communications systems and in the construction of diagnostics and experimental systems.

  13. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    2000-01-01

    An automated propellant blending apparatus and method that uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation is discussed. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.

  14. Automated Propellant Blending

    NASA Technical Reports Server (NTRS)

    Hohmann, Carl W. (Inventor); Harrington, Douglas W. (Inventor); Dutton, Maureen L. (Inventor); Tipton, Billy Charles, Jr. (Inventor); Bacak, James W. (Inventor); Salazar, Frank (Inventor)

    1999-01-01

    An automated propellant blending apparatus and method uses closely metered addition of countersolvent to a binder solution with propellant particles dispersed therein to precisely control binder precipitation and particle aggregation. A profile of binder precipitation versus countersolvent-solvent ratio is established empirically and used in a computer algorithm to establish countersolvent addition parameters near the cloud point for controlling the transition of properties of the binder during agglomeration and finishing of the propellant composition particles. The system is remotely operated by computer for safety, reliability and improved product properties, and also increases product output.

  15. The Automated Medical Office

    PubMed Central

    Petreman, Mel

    1990-01-01

    With shock and surprise many physicians learned in the 1980s that they must change the way they do business. Competition for patients, increasing government regulation, and the rapidly escalating risk of litigation force physicians to seek modern remedies in office management. The author describes a medical clinic that strives to be paperless, using electronic innovation to solve the problems of medical practice management. A computer software program to automate information management in a clinic shows that practical thinking linked to advanced technology can greatly improve office efficiency. PMID:21233899

  16. Automated Hazard Analysis

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  17. The automated medical office.

    PubMed

    Petreman, M

    1990-08-01

    With shock and surprise many physicians learned in the 1980s that they must change the way they do business. Competition for patients, increasing government regulation, and the rapidly escalating risk of litigation force physicians to seek modern remedies in office management. The author describes a medical clinic that strives to be paperless, using electronic innovation to solve the problems of medical practice management. A computer software program to automate information management in a clinic shows that practical thinking linked to advanced technology can greatly improve office efficiency.

  18. Janice VanCleave's Electricity: Mind-Boggling Experiments You Can Turn into Science Fair Projects.

    ERIC Educational Resources Information Center

    VanCleave, Janice

    This book is designed to provide guidance and ideas for science projects to help students learn more about science as they search for answers to specific problems. The 20 topics on electricity in this book suggest many possible problems to solve. Each topic has one detailed experiment followed by a section that provides additional questions about…

  19. World-wide distribution automation systems

    SciTech Connect

    Devaney, T.M.

    1994-12-31

    A worldwide power distribution automation system is outlined. Distribution automation is defined and the status of utility automation is discussed. Other topics discussed include a distribution management system; substation, feeder, and customer functions; potential benefits; automation costs; planning and engineering considerations; automation trends; databases; system operation; computer modeling of the system; and distribution management systems.

  20. Closed-loop, ultraprecise, automated craniotomies

    PubMed Central

    Pak, Nikita; Siegle, Joshua H.; Kinney, Justin P.; Denman, Daniel J.; Blanche, Timothy J.

    2015-01-01

    A large array of neuroscientific techniques, including in vivo electrophysiology, two-photon imaging, optogenetics, lesions, and microdialysis, require access to the brain through the skull. Ideally, the necessary craniotomies could be performed in a repeatable and automated fashion, without damaging the underlying brain tissue. Here we report that when drilling through the skull a stereotypical increase in conductance can be observed when the drill bit passes through the skull base. We present an architecture for a robotic device that can perform this algorithm, along with two implementations—one based on homebuilt hardware and one based on commercially available hardware—that can automatically detect such changes and create large numbers of precise craniotomies, even in a single skull. We also show that this technique can be adapted to automatically drill cranial windows several millimeters in diameter. Such robots will not only be useful for helping neuroscientists perform both small and large craniotomies more reliably but can also be used to create precisely aligned arrays of craniotomies with stereotaxic registration to standard brain atlases that would be difficult to drill by hand. PMID:25855700
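
    A minimal sketch of the detection idea reported above: watch for the stereotypical jump in conductance relative to a baseline and stop the drill when it occurs. The hardware calls, baseline length, and threshold factor are assumptions for illustration, not the values used in the paper.

      def detect_breakthrough(read_conductance, stop_drill, baseline_samples=20,
                              threshold_factor=1.5, max_samples=10000):
          """Stop drilling when conductance jumps well above its pre-breakthrough baseline."""
          readings = [read_conductance() for _ in range(baseline_samples)]
          baseline = sum(readings) / len(readings)
          for _ in range(max_samples):
              g = read_conductance()
              if g > threshold_factor * baseline:  # stereotypical conductance increase
                  stop_drill()
                  return g
          return None

      # Example with simulated readings: flat baseline, then a jump at "breakthrough".
      samples = iter([1.0] * 30 + [1.1, 2.4, 2.6])
      print(detect_breakthrough(lambda: next(samples), lambda: print("drill stopped")))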

  1. Closed-loop, ultraprecise, automated craniotomies.

    PubMed

    Pak, Nikita; Siegle, Joshua H; Kinney, Justin P; Denman, Daniel J; Blanche, Timothy J; Boyden, Edward S

    2015-06-01

    A large array of neuroscientific techniques, including in vivo electrophysiology, two-photon imaging, optogenetics, lesions, and microdialysis, require access to the brain through the skull. Ideally, the necessary craniotomies could be performed in a repeatable and automated fashion, without damaging the underlying brain tissue. Here we report that when drilling through the skull a stereotypical increase in conductance can be observed when the drill bit passes through the skull base. We present an architecture for a robotic device that can perform this algorithm, along with two implementations--one based on homebuilt hardware and one based on commercially available hardware--that can automatically detect such changes and create large numbers of precise craniotomies, even in a single skull. We also show that this technique can be adapted to automatically drill cranial windows several millimeters in diameter. Such robots will not only be useful for helping neuroscientists perform both small and large craniotomies more reliably but can also be used to create precisely aligned arrays of craniotomies with stereotaxic registration to standard brain atlases that would be difficult to drill by hand. PMID:25855700

  2. Automated System Marketplace 1995: The Changing Face of Automation.

    ERIC Educational Resources Information Center

    Barry, Jeff; And Others

    1995-01-01

    Discusses trends in the automated system marketplace with specific attention to online vendors and their customers: academic, public, school, and special libraries. Presents vendor profiles; tables and charts on computer systems and sales; and sidebars that include a vendor source list and the differing views on procuring an automated library…

  3. Maneuver Automation Software

    NASA Technical Reports Server (NTRS)

    Uffelman, Hal; Goodson, Troy; Pellegrin, Michael; Stavert, Lynn; Burk, Thomas; Beach, David; Signorelli, Joel; Jones, Jeremy; Hahn, Yungsun; Attiyah, Ahlam; Illsley, Jeannette

    2009-01-01

    The Maneuver Automation Software (MAS) automates the process of generating commands for maneuvers to keep the spacecraft of the Cassini-Huygens mission on a predetermined prime mission trajectory. Before MAS became available, a team of approximately 10 members had to work about two weeks to design, test, and implement each maneuver in a process that involved running many maneuver-related application programs and then serially handing off data products to other parts of the team. MAS enables a three-member team to design, test, and implement a maneuver in about one-half hour after Navigation has processed tracking data. MAS accepts more than 60 parameters and 22 files as input directly from users. MAS consists of Practical Extraction and Reporting Language (PERL) scripts that link, sequence, and execute the maneuver-related application programs: "Pushing a single button" on a graphical user interface causes MAS to run navigation programs that design a maneuver; programs that create sequences of commands to execute the maneuver on the spacecraft; and a program that generates predictions about maneuver performance and generates reports and other files that enable users to quickly review and verify the maneuver design. MAS can also generate presentation materials, initiate electronic command request forms, and archive all data products for future reference.

  4. Space station advanced automation

    NASA Technical Reports Server (NTRS)

    Woods, Donald

    1990-01-01

    In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex, and interdependent. The usage of Advanced Automation (AA) will help restructure, and integrate system status so that station and ground personnel can operate more efficiently. To use AA technology for the augmentation of system management functions requires a development model which consists of well defined phases of: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well designed and documented KBS software.

  5. Automated office blood pressure.

    PubMed

    Myers, Martin G; Godwin, Marshall

    2012-05-01

    Manual blood pressure (BP) measurement is gradually disappearing from clinical practice, with the mercury sphygmomanometer now considered an environmental hazard. Manual BP is also subject to measurement error on the part of the physician or nurse and to patient-related anxiety, which can result in poor-quality BP measurements and office-induced (white coat) hypertension. Automated office (AO) BP with devices such as the BpTRU (BpTRU Medical Devices, Coquitlam, BC) has already replaced conventional manual BP in many primary care practices in Canada and has also attracted interest in other countries where research studies using AOBP have been undertaken. The basic principles of AOBP include multiple readings taken with a fully automated recorder with the patient resting alone in a quiet room. When these principles are followed, office-induced hypertension is eliminated and AOBP exhibits a much stronger correlation with the awake ambulatory BP as compared with routine manual BP measurements. Unlike routine manual BP, AOBP correlates as well with left ventricular mass as does the awake ambulatory BP. AOBP also simplifies the definition of hypertension in that the cut point for a normal AOBP (< 135/85 mm Hg) is the same as for the awake ambulatory BP and home BP. This article summarizes the currently available evidence supporting the use of AOBP in routine clinical practice and proposes an algorithm in which AOBP replaces manual BP for the diagnosis and management of hypertension. PMID:22265230
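
    As a small illustration of the measurement logic described above, the Python sketch below averages a series of automated readings and applies the stated 135/85 mm Hg cut point. Treating all readings equally (rather than, say, discarding the first) is an assumption made here for simplicity; recorders differ on this point.

      def aobp_summary(readings, threshold=(135, 85)):
          """readings: list of (systolic, diastolic) pairs from the automated recorder."""
          mean_sys = sum(s for s, _ in readings) / len(readings)
          mean_dia = sum(d for _, d in readings) / len(readings)
          elevated = mean_sys >= threshold[0] or mean_dia >= threshold[1]
          return {"mean": (round(mean_sys, 1), round(mean_dia, 1)), "elevated": elevated}

      # Five readings taken with the patient resting alone in a quiet room.
      print(aobp_summary([(138, 84), (131, 82), (129, 80), (127, 81), (130, 79)]))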

  6. Automated localization of periventricular and subcortical white matter lesions

    NASA Astrophysics Data System (ADS)

    van der Lijn, Fedde; Vernooij, Meike W.; Ikram, M. Arfan; Vrooman, Henri A.; Rueckert, Daniel; Hammers, Alexander; Breteler, Monique M. B.; Niessen, Wiro J.

    2007-03-01

    It is still unclear whether periventricular and subcortical white matter lesions (WMLs) differ in etiology or clinical consequences. Studies addressing this issue would benefit from automated segmentation and localization of WMLs. Several papers have been published on WML segmentation in MR images. Automated localization, however, has not been investigated as much. This work presents and evaluates a novel method to label segmented WMLs as periventricular and subcortical. The proposed technique combines tissue classification and registration-based segmentation to outline the ventricles in MRI brain data. The segmented lesions can then be labeled as periventricular or subcortical WMLs by applying region growing and morphological operations. The technique was tested on scans of 20 elderly subjects in which neuro-anatomy experts manually segmented WMLs. Localization accuracy was evaluated by comparing the results of the automated method with a manual localization. Similarity indices and volumetric intraclass correlations between the automated and the manual localization were 0.89 and 0.95 for periventricular WMLs and 0.64 and 0.89 for subcortical WMLs, respectively. We conclude that this automated method for WML localization performs well to excellent in comparison to the gold standard.
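
    The similarity indices quoted above are Dice overlaps between the automated and manual label masks; the sketch below shows that evaluation metric on toy binary arrays (synthetic data, not the study's scans).

      import numpy as np

      def similarity_index(auto_mask, manual_mask):
          """Dice overlap: 2*|A and M| / (|A| + |M|)."""
          auto_mask = auto_mask.astype(bool)
          manual_mask = manual_mask.astype(bool)
          intersection = np.logical_and(auto_mask, manual_mask).sum()
          denom = auto_mask.sum() + manual_mask.sum()
          return 2.0 * intersection / denom if denom else 1.0

      rng = np.random.default_rng(42)
      manual = rng.random((64, 64)) > 0.9          # toy "manual" lesion mask
      auto = manual.copy()
      auto[rng.random((64, 64)) > 0.98] ^= True    # perturb a few pixels
      print(round(similarity_index(auto, manual), 3))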

  7. Brain Power.

    ERIC Educational Resources Information Center

    Albrecht, Karl

    2002-01-01

    Reviews significant findings of recent brain research, including the concept of five minds: automatic, subconscious, practical, creative, and spiritual. Suggests approaches to training the brain that are related to this hierarchy of thinking. (JOW)

  8. Brain Basics

    MedlinePlus

    ... have been linked to many mental disorders, including autism , obsessive compulsive disorder (OCD) , schizophrenia , and depression . Brain ... studies show that brain growth in children with autism appears to peak early. And as they grow ...

  9. Brain components

    MedlinePlus

    ... 3 major components of the brain are the cerebrum, cerebellum, and brain stem. The cerebrum is divided into left and right hemispheres, each ... gray matter) is the outside portion of the cerebrum and provides us with functions associated with conscious ...

  10. Brain Diseases

    MedlinePlus

    The brain is the control center of the body. It controls thoughts, memory, speech, and movement. It regulates the function of many organs. When the brain is healthy, it works quickly and automatically. However, ...

  11. Brain abscess

    MedlinePlus

    Tunkel AR. Brain abscess. In: Bennett JE, Dolin R, Blaser MJ, eds. Mandell, Douglas, and Bennett's Principles and Practice ... Philadelphia, PA: Elsevier Saunders; 2015:chap 92. Tunkel AR, Scheld WM. Brain abscess. In: Winn HR, ed. ...

  12. Computer automated design and computer automated manufacture.

    PubMed

    Brncick, M

    2000-08-01

    The introduction of computer aided design and computer aided manufacturing into the field of prosthetics and orthotics did not arrive without concern. Many prosthetists feared that the computer would provide other allied health practitioners who had little or no experience in prosthetics the ability to fit and manage amputees. Technicians in the field felt their jobs may be jeopardized by automated fabrication techniques. This has not turned out to be the case. Prosthetists who use CAD-CAM techniques are finding they have more time for patient care and clinical assessment. CAD-CAM is another tool for them to provide better care for the patients/clients they serve. One of the factors that deterred the acceptance of CAD-CAM techniques in its early stages was that of cost. It took a significant investment in software and hardware for the prosthetists to begin to use the new systems. This new technique was not reimbursed by insurance coverage. Practitioners did not have enough information about this new technique to make a sound decision on their investment of time and money. Ironically, it is the need to hold health care costs down that may prove to be the catalyst for the increased use of CAD-CAM in the field. Providing orthoses and prostheses to patients who require them is a very labor intensive process. Practitioners are looking for better, faster, and more economical ways in which to provide their services under the pressure of managed care. CAD-CAM may be the answer. The author foresees shape sensing departments in hospitals where patients would be sent to be digitized, similar to someone going for radiograph or ultrasound. Afterwards, an orthosis or prosthesis could be provided from a central fabrication facility at a remote site, most likely on the same day. Not long ago, highly skilled practitioners with extensive technical ability would custom make almost every orthosis. One now practices in an atmosphere where off-the-shelf orthoses are the standard. This

  13. A Demonstration of Automated DNA Sequencing.

    ERIC Educational Resources Information Center

    Latourelle, Sandra; Seidel-Rogol, Bonnie

    1998-01-01

    Details a simulation that employs a paper-and-pencil model to demonstrate the principles behind automated DNA sequencing. Discusses the advantages of automated sequencing as well as the chemistry of automated DNA sequencing. (DDR)

  14. Robotics/Automated Systems Technicians.

    ERIC Educational Resources Information Center

    Doty, Charles R.

    Major resources exist that can be used to develop or upgrade programs in community colleges and technical institutes that educate robotics/automated systems technicians. The first category of resources is Economic, Social, and Education Issues. The Office of Technology Assessment (OTA) report, "Automation and the Workplace," presents analyses of…

  15. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
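
    Real ATA formulates this as a mixed-integer program solved by an optimizer; the toy Python sketch below replaces the solver with brute-force enumeration over a six-item bank to show the shape of the problem: maximize summed item information subject to a test length and a content-balance constraint. All item data and constraints are made up for illustration.

      from itertools import combinations

      # (item_id, content_area, information at the target ability level)
      bank = [(1, "algebra", 0.62), (2, "algebra", 0.48), (3, "geometry", 0.55),
              (4, "geometry", 0.40), (5, "number", 0.58), (6, "number", 0.33)]

      def assemble(bank, length=3, areas=("algebra", "geometry", "number"), min_per_area=1):
          best, best_info = None, -1.0
          for form in combinations(bank, length):
              form_areas = [a for _, a, _ in form]
              if any(form_areas.count(area) < min_per_area for area in areas):
                  continue                       # content-balance constraint not met
              info = sum(i for _, _, i in form)
              if info > best_info:
                  best, best_info = form, info
          return best, best_info

      print(assemble(bank))  # picks the most informative item from each content area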

  16. Opening up Library Automation Software

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  17. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry; Riedesel, Joel; Myers, Chris; Miller, William; Jones, Ellen F.; Freeman, Kenneth; Walsh, Richard; Walls, Bryan K.; Weeks, David J.; Bechtel, Robert T.

    1992-01-01

    Autonomous power-distribution system includes power-control equipment and automation equipment. System automatically schedules connection of power to loads and reconfigures itself when it detects fault. Potential terrestrial applications include optimization of consumption of power in homes, power supplies for autonomous land vehicles and vessels, and power supplies for automated industrial processes.

  18. Automating a clinical management system.

    PubMed

    Gordon, B; Braun, D

    1990-06-01

    Automating the clinical documentation of a home health care agency will prove crucial as the industry continues to grow and becomes increasingly complex. Kimberly Quality Care, a large, multi-office home care company, made a major commitment to the automation of its clinical management documents.

  19. Translation: Aids, Robots, and Automation.

    ERIC Educational Resources Information Center

    Andreyewsky, Alexander

    1981-01-01

    Examines electronic aids to translation both as ways to automate it and as an approach to solve problems resulting from shortage of qualified translators. Describes the limitations of robotic MT (Machine Translation) systems, viewing MAT (Machine-Aided Translation) as the only practical solution and the best vehicle for further automation. (MES)

  20. Progress Toward Automated Cost Estimation

    NASA Technical Reports Server (NTRS)

    Brown, Joseph A.

    1992-01-01

    Report discusses efforts to develop standard system of automated cost estimation (ACE) and computer-aided design (CAD). Advantage of system is time saved and accuracy enhanced by automating extraction of quantities from design drawings, consultation of price lists, and application of cost and markup formulas.

  1. Automated Circulation. SPEC Kit 43.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    Of the 64 libraries responding to a 1978 Association of Research Libraries (ARL) survey, 37 indicated that they used automated circulation systems; half of these were commercial systems, and most were batch-process or combination batch process and online. Nearly all libraries without automated systems cited lack of funding as the reason for not…

  2. Brain Aneurysm

    MedlinePlus

    A brain aneurysm is an abnormal bulge or "ballooning" in the wall of an artery in the brain. They are sometimes called berry aneurysms because they ... often the size of a small berry. Most brain aneurysms produce no symptoms until they become large, ...

  3. Automated design of aerospace structures

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Mccomb, H. G.

    1974-01-01

    The current state-of-the-art in structural analysis of aerospace vehicles is characterized, automated design technology is discussed, and an indication is given of the future direction of research in analysis and automated design. Representative computer programs for analysis typical of those in routine use in vehicle design activities are described, and results are shown for some selected analysis problems. Recent and planned advances in analysis capability are indicated. Techniques used to automate the more routine aspects of structural design are discussed, and some recently developed automated design computer programs are described. Finally, discussion is presented of early accomplishments in interdisciplinary automated design systems, and some indication of the future thrust of research in this field is given.

  4. Automated Desalting Apparatus

    NASA Technical Reports Server (NTRS)

    Spencer, Maegan K.; Liu, De-Ling; Kanik, Isik; Beegle, Luther

    2010-01-01

    Because salt and metals can mask the signature of a variety of organic molecules (like amino acids) in any given sample, an automated system to purify complex field samples has been created for the analytical techniques of electrospray ionization/mass spectrometry (ESI/MS), capillary electrophoresis (CE), and biological assays where unique identification requires at least some processing of complex samples. This development allows for automated sample preparation in the laboratory and analysis of complex samples in the field with multiple types of analytical instruments. Rather than using tedious, exacting protocols for desalting samples by hand, this innovation, called the Automated Sample Processing System (ASPS), takes analytes that have been extracted through high-temperature solvent extraction and introduces them into the desalting column. After 20 minutes, the eluent is produced. This clear liquid can then be directly analyzed by the techniques listed above. The current apparatus, including the computer and power supplies, is sturdy, has an approximate mass of 10 kg and a volume of about 20 × 20 × 20 cm, and is undergoing further miniaturization. This system currently targets amino acids. For these molecules, a slurry of 1 g cation exchange resin in deionized water is packed into a column of the apparatus. Initial generation of the resin is done by flowing sequentially 2.3 bed volumes of 2N NaOH and 2N HCl (1 mL each) to rinse the resin, followed by 0.5 mL of deionized water. This makes the pH of the resin near neutral and eliminates cross-sample contamination. Afterward, 2.3 mL of extracted sample is loaded into the column onto the top of the resin bed. Because the column is packed tightly, the sample can be applied without disturbing the resin bed. This is a vital step needed to ensure that the analytes adhere to the resin. After the sample is drained, oxalic acid (1 mL, pH 1.6-1.8, adjusted with NH4OH) is pumped into the column. Oxalic acid works as a
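
    The conditioning and loading sequence described above can be captured as data and replayed by a simple sequencer, as in the Python sketch below. The pump interface is a hypothetical placeholder; the reagents and volumes follow the description, but reducing the protocol to a linear pump sequence is a simplification.

      DESALT_SEQUENCE = [
          ("2N NaOH", 1.0),                    # resin generation / rinse
          ("2N HCl", 1.0),
          ("deionized water", 0.5),            # bring the resin near neutral pH
          ("extracted sample", 2.3),           # load analytes onto the resin bed
          ("oxalic acid (pH 1.6-1.8)", 1.0),   # elute
      ]

      def run_desalting(pump_into_column, sequence=DESALT_SEQUENCE):
          for reagent, volume_ml in sequence:
              pump_into_column(reagent, volume_ml)

      # Example with a logging stand-in for the pump controller.
      run_desalting(lambda reagent, volume: print(f"pumping {volume} mL of {reagent}"))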

  5. Automated design of ligands to polypharmacological profiles

    PubMed Central

    Besnard, Jérémy; Ruda, Gian Filippo; Setola, Vincent; Abecassis, Keren; Rodriguiz, Ramona M.; Huang, Xi-Ping; Norval, Suzanne; Sassano, Maria F.; Shin, Antony I.; Webster, Lauren A.; Simeons, Frederick R.C.; Stojanovski, Laste; Prat, Annik; Seidah, Nabil G.; Constam, Daniel B.; Bickerton, G. Richard; Read, Kevin D.; Wetsel, William C.; Gilbert, Ian H.; Roth, Bryan L.; Hopkins, Andrew L.

    2012-01-01

    The clinical efficacy and safety of a drug are determined by its activity profile across multiple proteins in the proteome. However, designing drugs with a specific multi-target profile is both complex and difficult. Therefore, methods to rationally design drugs a priori against profiles of multiple proteins would have immense value in drug discovery. We describe a new approach for the automated design of ligands against profiles of multiple drug targets. The method is demonstrated by the evolution of an approved acetylcholinesterase inhibitor drug into brain-penetrable ligands with either specific polypharmacology or exquisite selectivity profiles for G-protein coupled receptors. Overall, 800 ligand-target predictions of prospectively designed ligands were tested experimentally, of which 75% were confirmed correct. We also demonstrate target engagement in vivo. The approach can be a useful source of drug leads where multi-target profiles are required to achieve either selectivity over other drug targets or a desired polypharmacology. PMID:23235874

  6. Brain Basics: Know Your Brain

    MedlinePlus

    ... fact sheet is a basic introduction to the human brain. It may help you understand how the healthy ... largest and most highly developed part of the human brain: it consists primarily of the cerebrum ( 2 ) and ...

  7. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    PubMed Central

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. Unsupervised approaches, on the other hand, avoid these limitations but often do not reach results comparable to those of supervised methods. We therefore propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
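
    The GMM variant summarized above can be illustrated with a minimal sketch of unsupervised voxel clustering on multiparametric MR intensities. This is not the authors' pipeline (it omits the structured GHMRF variant and the tissue-probability-map postprocess); the file names, foreground mask, and number of classes are illustrative assumptions.

    ```python
    # Minimal sketch of unsupervised voxel clustering with a Gaussian Mixture Model,
    # in the spirit of the GMM variant described above (not the authors' full pipeline).
    # File names and the number of classes are assumptions for illustration.
    import numpy as np
    import nibabel as nib
    from sklearn.mixture import GaussianMixture

    # Load co-registered multiparametric MR volumes (hypothetical file names).
    modalities = ["t1.nii.gz", "t1c.nii.gz", "t2.nii.gz", "flair.nii.gz"]
    volumes = [nib.load(f).get_fdata() for f in modalities]

    # Stack voxel intensities into an (n_voxels, n_modalities) feature matrix,
    # keeping only voxels inside a simple foreground mask.
    data = np.stack(volumes, axis=-1)
    mask = data[..., 0] > 0
    features = data[mask]

    # Fit a GMM with a handful of tissue/tumour classes and assign each voxel a label.
    gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
    labels = gmm.fit_predict(features)

    # Write the label map back into image space for inspection.
    segmentation = np.zeros(mask.shape, dtype=np.int16)
    segmentation[mask] = labels + 1
    nib.save(nib.Nifti1Image(segmentation, nib.load(modalities[0]).affine), "gmm_labels.nii.gz")
    ```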

  8. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    PubMed

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V; Robles, Montserrat; Aparici, F; Martí-Bonmatí, L; García-Gómez, Juan M

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. Unsupervised approaches, on the other hand, avoid these limitations but often do not reach results comparable to those of supervised methods. We therefore propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453

  9. Automated Analysis Workstation

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Information from NASA Tech Briefs of work done at Langley Research Center and the Jet Propulsion Laboratory assisted DiaSys Corporation in manufacturing their first product, the R/S 2000. Since then, the R/S 2000 and R/S 2003 have followed. Recently, DiaSys released their fourth workstation, the FE-2, which automates the process of making and manipulating wet-mount preparation of fecal concentrates. The time needed to read the sample is decreased, permitting technologists to rapidly spot parasites, ova and cysts, sometimes carried in the lower intestinal tract of humans and animals. Employing the FE-2 is non-invasive, can be performed on an out-patient basis, and quickly provides confirmatory results.

  10. Robust automated knowledge capture.

    SciTech Connect

    Stevens-Adams, Susan Marie; Abbott, Robert G.; Forsythe, James Chris; Trumbo, Michael Christopher Stefan; Haass, Michael Joseph; Hendrickson, Stacey M. Langfitt

    2011-10-01

    This report summarizes research conducted through the Sandia National Laboratories Robust Automated Knowledge Capture Laboratory Directed Research and Development project. The objective of this project was to advance scientific understanding of the influence of individual cognitive attributes on decision making. The project has developed a quantitative model known as RumRunner that has proven effective in predicting the propensity of an individual to shift strategies on the basis of task- and experience-related parameters. Three separate studies are described which have validated the basic RumRunner model. This work provides a basis for better understanding human decision making in high-consequence national security applications, and in particular, the individual characteristics that underlie adaptive thinking.

  11. Protein fabrication automation

    PubMed Central

    Cox, J. Colin; Lape, Janel; Sayed, Mahmood A.; Hellinga, Homme W.

    2007-01-01

    Facile “writing” of DNA fragments that encode entire gene sequences potentially has widespread applications in biological analysis and engineering. Rapid writing of open reading frames (ORFs) for expressed proteins could transform protein engineering and production for protein design, synthetic biology, and structural analysis. Here we present a process, protein fabrication automation (PFA), which facilitates the rapid de novo construction of any desired ORF from oligonucleotides with low effort, high speed, and little human interaction. PFA comprises software for sequence design, data management, and the generation of instruction sets for liquid-handling robotics, a liquid-handling robot, a robust PCR scheme for gene assembly from synthetic oligonucleotides, and a genetic selection system to enrich correctly assembled full-length synthetic ORFs. The process is robust and scalable. PMID:17242375

  12. Automated Defect Classification (ADC)

    1998-01-01

    The ADC Software System is designed to provide semiconductor defect feature analysis and defect classification capabilities. Defect classification is an important software method used by semiconductor wafer manufacturers to automate the analysis of defect data collected by a wide range of microscopy techniques in semiconductor wafer manufacturing today. These microscopies (e.g., optical bright and dark field, scanning electron microscopy, atomic force microscopy, etc.) generate images of anomalies that are induced or otherwise appear on wafer surfaces as a result of errant manufacturing processes or simple atmospheric contamination (e.g., airborne particles). This software provides methods for analyzing these images, extracting statistical features from the anomalous regions, and applying supervised classifiers to label the anomalies into user-defined categories.
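
    The workflow this record describes -- extract statistical features from anomalous regions and apply a supervised classifier -- can be sketched generically as below. This is not the ADC Software System itself; the image files, feature set, and classifier choice are assumptions for illustration.

    ```python
    # Generic sketch of defect classification: segment anomalous regions in a wafer
    # image, extract per-region statistical features, and label them with a trained
    # supervised classifier. Illustrative only, not the ADC Software System.
    import numpy as np
    from skimage import io, measure, filters
    from sklearn.ensemble import RandomForestClassifier

    def region_features(image_path):
        """Threshold a grayscale defect image and return one feature vector per region."""
        img = io.imread(image_path, as_gray=True)
        binary = img > filters.threshold_otsu(img)
        labeled = measure.label(binary)
        feats = [[r.area, r.eccentricity, r.perimeter, r.solidity]
                 for r in measure.regionprops(labeled)]
        return np.array(feats)

    # Hypothetical labeled example images for two defect classes.
    feats_particle = region_features("particle_examples.png")
    feats_scratch = region_features("scratch_examples.png")
    X_train = np.vstack([feats_particle, feats_scratch])
    y_train = np.array([0] * len(feats_particle) + [1] * len(feats_scratch))

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    print(clf.predict(region_features("unclassified_defects.png")))
    ```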

  13. Health care automation companies.

    PubMed

    1995-12-01

    Health care automation companies: card transaction processing/EFT/EDI-capable banks; claims auditing/analysis; claims processors/clearinghouses; coding products/services; computer hardware; computer networking/LAN/WAN; consultants; data processing/outsourcing; digital dictation/transcription; document imaging/optical disk storage; executive information systems; health information networks; hospital/health care information systems; interface engines; laboratory information systems; managed care information systems; patient identification/credit cards; pharmacy information systems; POS terminals; radiology information systems; software--claims related/computer-based patient records/home health care/materials management/supply ordering/physician practice management/translation/utilization review/outcomes; telecommunications products/services; telemedicine/teleradiology; value-added networks. PMID:10153839

  14. Automated Standard Hazard Tool

    NASA Technical Reports Server (NTRS)

    Stebler, Shane

    2014-01-01

    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and possibly, different applications. The project's success is reflected in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  15. Expedition automated flow fluorometer

    NASA Astrophysics Data System (ADS)

    Krikun, V. A.; Salyuk, P. A.

    2015-11-01

    This paper describes the design and operation of an automated flow-through dual-channel fluorometer for studying the fluorescence of dissolved organic matter and the fluorescence of phytoplankton cells with open and closed reaction centers, in sea areas with oligotrophic and eutrophic water types. The current device implements step-by-step excitation by two semiconductor lasers or two light-emitting diodes. The excitation wavelengths are 405 nm and 532 nm in the default configuration. The excitation radiation of each light source can be varied in duration, intensity, and repetition rate. The fluorescence signal is registered by two photomultipliers fitted with different optical band-pass filters covering 580-600 nm and 680-700 nm. The configuration of excitation sources and the registered spectral ranges can be changed to suit the tasks at hand.

  16. Automated external defibrillators (AEDs).

    PubMed

    2003-06-01

    Automated external defibrillators, or AEDs, will automatically analyze a patient's ECG and, if needed, deliver a defibrillating shock to the heart. We sometimes refer to these devices as AED-only devices or stand-alone AEDs. The basic function of AEDs is similar to that of defibrillator/monitors, but AEDs lack their advanced capabilities and generally don't allow manual defibrillation. A device that functions strictly as an AED is intended to be used by basic users only. Such devices are often referred to as public access defibrillators. In this Evaluation, we present our findings for a newly evaluated model, the Zoll AED Plus. We also summarize our findings for the previously evaluated model that is still on the market and describe other AEDs that are also available but that we haven't evaluated. We rate the models collectively for first-responder use and public access defibrillation (PAD) applications.

  17. Health care automation companies.

    PubMed

    1995-12-01

    Health care automation companies: card transaction processing/EFT/EDI-capable banks; claims auditing/analysis; claims processors/clearinghouses; coding products/services; computer hardware; computer networking/LAN/WAN; consultants; data processing/outsourcing; digital dictation/transcription; document imaging/optical disk storage; executive information systems; health information networks; hospital/health care information systems; interface engines; laboratory information systems; managed care information systems; patient identification/credit cards; pharmacy information systems; POS terminals; radiology information systems; software--claims related/computer-based patient records/home health care/materials management/supply ordering/physician practice management/translation/utilization review/outcomes; telecommunications products/services; telemedicine/teleradiology; value-added networks.

  18. [From automation to robotics].

    PubMed

    1985-01-01

    The introduction of automation into the biology laboratory seems unavoidable. But at what cost, if a new machine must be purchased for every new application? Fortunately, the same image processing techniques, belonging to a theoretical framework called Mathematical Morphology, may be used in visual inspection tasks both in the car industry and in the biology lab. Since the market for industrial robotics applications is much larger than the market for biomedical applications, the price of image processing devices is dropping, sometimes below the price of a complete microscope setup. The power of the image processing methods of Mathematical Morphology is illustrated by various examples, such as automatic silver grain counting in autoradiography, determination of HLA genotype, electrophoretic gel analysis, and automatic screening of cervical smears. Thus several heterogeneous applications may share the same image processing device, provided there is a separate, dedicated workstation for each of them.

  19. Berkeley automated supernova search

    SciTech Connect

    Kare, J.T.; Pennypacker, C.R.; Muller, R.A.; Mast, T.S.; Crawford, F.S.; Burns, M.S.

    1981-01-01

    The Berkeley automated supernova search employs a computer controlled 36-inch telescope and charge coupled device (CCD) detector to image 2500 galaxies per night. A dedicated minicomputer compares each galaxy image with stored reference data to identify supernovae in real time. The threshold for detection is m_v = 18.8. We plan to monitor roughly 500 galaxies in Virgo and closer every night, and an additional 6000 galaxies out to 70 Mpc on a three night cycle. This should yield very early detection of several supernovae per year for detailed study, and reliable premaximum detection of roughly 100 supernovae per year for statistical studies. The search should be operational in mid-1982.
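
    The core comparison step can be sketched as differencing a new galaxy image against a stored reference and flagging bright residuals. This is only an illustration of the idea; the FITS file names, scaling, and 5-sigma threshold are assumptions, not the actual Berkeley search parameters.

    ```python
    # Sketch of the comparison step described above: difference a new galaxy image
    # against a stored reference and flag bright residuals as candidate transients.
    # Thresholds and file names are illustrative assumptions.
    import numpy as np
    from astropy.io import fits

    def candidate_pixels(new_path, ref_path, nsigma=5.0):
        new = fits.getdata(new_path).astype(float)
        ref = fits.getdata(ref_path).astype(float)
        # Crude background matching: scale the reference to the new frame's median.
        diff = new - ref * (np.median(new) / np.median(ref))
        threshold = nsigma * np.std(diff)
        return np.argwhere(diff > threshold)

    hits = candidate_pixels("ngc4321_tonight.fits", "ngc4321_reference.fits")
    print(f"{len(hits)} pixels exceed the detection threshold")
    ```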

  20. Automating Frame Analysis

    SciTech Connect

    Sanfilippo, Antonio P.; Franklin, Lyndsey; Tratz, Stephen C.; Danielson, Gary R.; Mileson, Nicholas D.; Riensche, Roderick M.; McGrath, Liam

    2008-04-01

    Frame Analysis has come to play an increasingly important role in the study of social movements in Sociology and Political Science. While significant steps have been made in providing a theory of frames and framing, a systematic characterization of the frame concept is still largely lacking and there are no recognized criteria and methods that can be used to identify and marshal frame evidence reliably and in a time- and cost-effective manner. Consequently, current Frame Analysis work is still too reliant on manual annotation and subjective interpretation. The goal of this paper is to present an approach to the representation, acquisition and analysis of frame evidence which leverages Content Analysis, Information Extraction and Semantic Search methods to provide a systematic treatment of Frame Analysis and automate frame annotation.
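
    A toy sketch of automated frame annotation is shown below, tagging sentences with candidate frames from keyword cues. The frames and cue words are invented, and the approach is far simpler than the Content Analysis, Information Extraction and Semantic Search methods the paper describes.

    ```python
    # Toy illustration of automated frame annotation: tag sentences with candidate
    # frames using keyword cues. Frames and cues are invented for illustration.
    import re

    FRAME_CUES = {
        "injustice": {"unfair", "rights", "oppression"},
        "economic": {"jobs", "wages", "cost"},
    }

    def annotate(text):
        annotations = []
        for sentence in re.split(r"(?<=[.!?])\s+", text):
            tokens = {w.lower().strip(".,!?") for w in sentence.split()}
            frames = [f for f, cues in FRAME_CUES.items() if tokens & cues]
            annotations.append((sentence, frames))
        return annotations

    for sent, frames in annotate("Workers demand fair wages. Their rights are ignored."):
        print(frames, "->", sent)
    ```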

  1. Protein fabrication automation.

    PubMed

    Cox, J Colin; Lape, Janel; Sayed, Mahmood A; Hellinga, Homme W

    2007-03-01

    Facile "writing" of DNA fragments that encode entire gene sequences potentially has widespread applications in biological analysis and engineering. Rapid writing of open reading frames (ORFs) for expressed proteins could transform protein engineering and production for protein design, synthetic biology, and structural analysis. Here we present a process, protein fabrication automation (PFA), which facilitates the rapid de novo construction of any desired ORF from oligonucleotides with low effort, high speed, and little human interaction. PFA comprises software for sequence design, data management, and the generation of instruction sets for liquid-handling robotics, a liquid-handling robot, a robust PCR scheme for gene assembly from synthetic oligonucleotides, and a genetic selection system to enrich correctly assembled full-length synthetic ORFs. The process is robust and scalable.

  2. Automated calorimeter testing system

    SciTech Connect

    Rodenburg, W.W.; James, S.J.

    1990-01-01

    The Automated Calorimeter Testing System (ACTS) is a portable measurement device that provides an independent measurement of all critical parameters of a calorimeter system. The ACTS was developed to improve productivity and performance of Mound-produced calorimeters. With ACTS, an individual with minimal understanding of calorimetry operation can perform a consistent set of diagnostic measurements on the system. The operator can identify components whose performance has deteriorated by a simple visual comparison of the current data plots with previous measurements made when the system was performing properly. Thus, downtime and "out of control" situations can be reduced. Should a system malfunction occur, a flowchart of troubleshooting procedures has been developed to facilitate quick identification of the malfunctioning component. If diagnosis is beyond the capability of the operator, the ACTS provides a consistent set of test data for review by a knowledgeable expert. The first field test was conducted at the Westinghouse Savannah River Site in early 1990. 6 figs.

  3. Automated attendance accounting system

    NASA Technical Reports Server (NTRS)

    Chapman, C. P. (Inventor)

    1973-01-01

    An automated accounting system useful for applying data to a computer from any or all of a multiplicity of data terminals is disclosed. The system essentially includes a preselected number of data terminals which are each adapted to convert data words of decimal form to another form, i.e., binary, usable with the computer. Each data terminal may take the form of a keyboard unit having a number of depressable buttons or switches corresponding to selected data digits and/or function digits. A bank of data buffers, one of which is associated with each data terminal, is provided as a temporary storage. Data from the terminals is applied to the data buffers on a digit by digit basis for transfer via a multiplexer to the computer.
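
    The conversion step described above, a terminal turning keyed decimal digits into a binary-coded word buffered digit by digit, can be sketched in a few lines. This is an illustration of the idea only, not the patented circuit.

    ```python
    # Sketch of the conversion step: a data terminal turning keyed decimal digits
    # into a binary-coded word, buffered digit by digit before being multiplexed
    # to the computer. Illustration of the idea only, not the patented circuit.
    def encode_digit(d):
        """Return a 4-bit binary-coded-decimal string for one keypad digit."""
        if not 0 <= d <= 9:
            raise ValueError("keypad digits are 0-9")
        return format(d, "04b")

    def buffer_entry(digits):
        """Accumulate an entry digit by digit, as a data buffer would."""
        word = []
        for d in digits:
            word.append(encode_digit(d))   # transferred digit by digit
        return "".join(word)

    # Example: data digits 4 7 2 followed by a function digit 9.
    print(buffer_entry([4, 7, 2, 9]))      # -> 0100011100101001
    ```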

  4. Automated Defect Classification (ADC)

    SciTech Connect

    1998-01-01

    The ADC Software System is designed to provide semiconductor defect feature analysis and defect classification capabilities. Defect classification is an important software method used by semiconductor wafer manufacturers to automate the analysis of defect data collected by a wide range of microscopy techniques in semiconductor wafer manufacturing today. These microscopies (e.g., optical bright and dark field, scanning electron microscopy, atomic force microscopy, etc.) generate images of anomalies that are induced or otherwise appear on wafer surfaces as a result of errant manufacturing processes or simple atmospheric contamination (e.g., airborne particles). This software provides methods for analyzing these images, extracting statistical features from the anomalous regions, and applying supervised classifiers to label the anomalies into user-defined categories.

  5. Automating the multiprocessing environment

    NASA Technical Reports Server (NTRS)

    Arpasi, Dale J.

    1989-01-01

    An approach to automate the programming and operation of tree-structured networks of multiprocessor systems is discussed. A conceptual, knowledge-based operating environment is presented, and requirements for two major technology elements are identified as follows: (1) An intelligent information translator is proposed for implementing information transfer between dissimilar hardware and software, thereby enabling independent and modular development of future systems and promoting language-independence of codes and information; (2) A resident system activity manager, which recognizes the systems' capabilities and monitors the status of all systems within the environment, is proposed for integrating dissimilar systems into effective parallel processing resources to optimally meet user needs. Finally, key computational capabilities which must be provided before the environment can be realized are identified.

  6. Automated Computerized Analysis of Speech in Psychiatric Disorders

    PubMed Central

    Cohen, Alex S.; Elvevåg, Brita

    2014-01-01

    Purpose of Review Disturbances in communication are a hallmark of severe mental illnesses. Recent technological advances have paved the way for objectifying communication using automated computerized linguistic and acoustic analysis. We review recent studies applying various computer-based assessments to the natural language produced by adult patients with severe mental illness. Recent Findings Automated computerized methods afford tools with which it is possible to objectively evaluate patients in a reliable, valid and efficient manner that complements human ratings. Crucially, these measures correlate with important clinical measures. The clinical relevance of these novel metrics has been demonstrated by showing their relationship to functional outcome measures, their in vivo link to classic ‘language’ regions in the brain, and, in the case of linguistic analysis, their relationship to candidate genes for severe mental illness. Summary Computer-based assessments of natural language afford a framework with which to measure communication disturbances in adults with severe mental illness (SMI). Emerging evidence suggests that they can be reliable and valid, and overcome many practical limitations of more traditional assessment methods. The advancement of these technologies offers unprecedented potential for measuring and understanding some of the most crippling symptoms of some of the most debilitating illnesses known to humankind. PMID:24613984
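
    A toy sketch of computerized linguistic analysis of a transcript is shown below, computing a couple of simple lexical measures. The studies reviewed use far richer acoustic and semantic features; this only illustrates the kind of objective, automated measurement involved.

    ```python
    # Toy sketch of computerized linguistic analysis: two simple lexical measures
    # (type-token ratio, mean utterance length) from a transcript. The reviewed
    # studies use far richer acoustic and semantic features.
    import re

    def lexical_features(transcript):
        utterances = [u for u in re.split(r"[.!?]+", transcript) if u.strip()]
        tokens = re.findall(r"[a-zA-Z']+", transcript.lower())
        return {
            "n_utterances": len(utterances),
            "mean_utterance_len": len(tokens) / max(len(utterances), 1),
            "type_token_ratio": len(set(tokens)) / max(len(tokens), 1),
        }

    print(lexical_features("I went to the store. The store was closed. I went home."))
    ```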

  7. Automating the analytical laboratory via the Chemical Analysis Automation paradigm

    SciTech Connect

    Hollen, R.; Rzeszutko, C.

    1997-10-01

    To address the need for standardization within the analytical chemistry laboratories of the nation, the Chemical Analysis Automation (CAA) program within the US Department of Energy, Office of Science and Technology's Robotic Technology Development Program is developing laboratory sample analysis systems that will automate the environmental chemical laboratories. The current laboratory automation paradigm consists of islands-of-automation that do not integrate into a system architecture. Thus, today the chemist must perform most aspects of environmental analysis manually using instrumentation that generally cannot communicate with other devices in the laboratory. CAA is working towards a standardized and modular approach to laboratory automation based upon the Standard Analysis Method (SAM) architecture. Each SAM system automates a complete chemical method. The building block of a SAM is known as the Standard Laboratory Module (SLM). The SLM, either hardware or software, automates a subprotocol of an analysis method and can operate as a standalone or as a unit within a SAM. The CAA concept allows the chemist to easily assemble an automated analysis system, from sample extraction through data interpretation, using standardized SLMs without the worry of hardware or software incompatibility or the necessity of generating complicated control programs. A Task Sequence Controller (TSC) software program schedules and monitors the individual tasks to be performed by each SLM configured within a SAM. The chemist interfaces with the operation of the TSC through the Human Computer Interface (HCI), a logical, icon-driven graphical user interface. The CAA paradigm has successfully been applied in automating EPA SW-846 Methods 3541/3620/8081 for the analysis of PCBs in a soil matrix utilizing commercially available equipment in tandem with SLMs constructed by CAA.
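
    The Task Sequence Controller idea can be sketched as a scheduler that steps through the SLMs configured for one SAM and halts on failure. The module names and the success/failure interface are invented for illustration; the real CAA software is considerably more elaborate.

    ```python
    # Hedged sketch of a TSC-like scheduler stepping through the Standard Laboratory
    # Modules (SLMs) configured for one Standard Analysis Method and monitoring each
    # task. Module names and statuses are invented; the real CAA software differs.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class SLMTask:
        name: str
        run: Callable[[], bool]   # returns True on success

    def task_sequence_controller(tasks: List[SLMTask]) -> bool:
        for task in tasks:
            print(f"starting {task.name}")
            if not task.run():
                print(f"{task.name} failed; halting sequence for operator review")
                return False
            print(f"{task.name} complete")
        return True

    sam = [
        SLMTask("sample extraction", lambda: True),
        SLMTask("extract cleanup", lambda: True),
        SLMTask("GC analysis", lambda: True),
    ]
    task_sequence_controller(sam)
    ```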

  8. Automated imatinib immunoassay

    PubMed Central

    Beumer, Jan H.; Kozo, Daniel; Harney, Rebecca L.; Baldasano, Caitlin N.; Jarrah, Justin; Christner, Susan M.; Parise, Robert; Baburina, Irina; Courtney, Jodi B.; Salamone, Salvatore J.

    2014-01-01

    Background Imatinib pharmacokinetic variability and the relationship of trough concentrations with clinical outcomes have been extensively reported. Though physical methods to quantitate imatinib exist, they are not widely available for routine use. An automated homogenous immunoassay for imatinib has been developed, facilitating routine imatinib testing. Methods Imatinib-selective monoclonal antibodies, without substantial cross-reactivity to the N-desmethyl metabolite or N-desmethyl conjugates, were produced. The antibodies were conjugated to 200 nm particles to develop immunoassay reagents on the Beckman Coulter AU480™ analyzer. These reagents were analytically validated using Clinical Laboratory Standards Institute protocols. Method comparison to LC-MS/MS was conducted using 77 plasma samples collected from subjects receiving imatinib. Results The assay requires 4 µL of sample without pre-treatment. The non-linear calibration curve ranges from 0 to 3,000 ng/mL. With automated sample dilution, concentrations of up to 9,000 ng/mL can be quantitated. The AU480 produces the first result in 10 minutes, and up to 400 tests per hour. Repeatability ranged from 2.0 to 6.0% coefficient of variation (CV), and within-laboratory reproducibility ranged from 2.9 to 7.4% CV. Standard curve stability was 2 weeks and on-board reagent stability was 6 weeks. For clinical samples with imatinib concentrations from 438 to 2,691 ng/mL, method comparison with LC-MS/MS gave a slope of 0.995 with a y-intercept of 24.3 and a correlation coefficient of 0.978. Conclusion The immunoassay is suitable for quantitating imatinib in human plasma, demonstrating good correlation with a physical method. Testing for optimal imatinib exposure can now be performed on routine clinical analyzers. PMID:25551407
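
    The method-comparison statistics quoted above (slope, intercept, and correlation between immunoassay and LC-MS/MS results) can be computed as in the sketch below. The paired concentrations are made up; only the calculation is illustrated.

    ```python
    # Sketch of the method-comparison statistics reported above (slope, intercept,
    # correlation between immunoassay and LC-MS/MS). The paired concentrations are
    # invented; only the calculation is illustrated.
    import numpy as np

    lcms = np.array([450.0, 820.0, 1100.0, 1540.0, 2100.0, 2650.0])    # ng/mL (invented)
    immuno = np.array([470.0, 845.0, 1090.0, 1560.0, 2135.0, 2660.0])  # ng/mL (invented)

    slope, intercept = np.polyfit(lcms, immuno, 1)
    r = np.corrcoef(lcms, immuno)[0, 1]
    print(f"slope={slope:.3f}, intercept={intercept:.1f} ng/mL, r={r:.3f}")
    ```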

  9. The Brains Behind the Brain.

    ERIC Educational Resources Information Center

    D'Arcangelo, Marcia

    1998-01-01

    Interviews with five neuroscientists--Martin Diamond, Pat Wolfe, Robert Sylwester, Geoffrey Caine, and Eric Jensen--disclose brain-research findings of practical interest to educators. Topics include brain physiology, environmental enrichment, memorization, windows of learning opportunity, brain learning capacity, attention span, student interest,…

  10. Automated MRI Segmentation for Individualized Modeling of Current Flow in the Human Head

    PubMed Central

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-01-01

    Objective High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography (HD-EEG) require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images (MRI) requires labor-intensive manual segmentation, even when leveraging available automated segmentation tools. Also, accurate placement of many high-density electrodes on individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach A fully automated segmentation technique based on Statistical Parametric Mapping 8 (SPM8), including an improved tissue probability map (TPM) and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on 4 healthy subjects and 7 stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view (FOV) extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly

  11. Automated MRI segmentation for individualized modeling of current flow in the human head

    NASA Astrophysics Data System (ADS)

    Huang, Yu; Dmochowski, Jacek P.; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C.

    2013-12-01

    Objective. High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. Approach. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. Main results. The segmentation tool can segment out not just the brain but also provide accurate results for CSF, skull and other soft tissues with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Significance. Fully
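
    A simple way to score an automated segmentation against a manual one, in the spirit of the deviation figures quoted above, is sketched below. The file names are placeholders, and the papers' exact deviation metric may differ from the Dice overlap and volume difference computed here.

    ```python
    # Sketch of scoring an automated segmentation against a manual one. File names
    # are placeholders; the papers' exact deviation metric may differ from the
    # simple Dice overlap and volume difference computed here.
    import numpy as np
    import nibabel as nib

    def compare_masks(auto_path, manual_path):
        auto = nib.load(auto_path).get_fdata() > 0
        manual = nib.load(manual_path).get_fdata() > 0
        dice = 2 * np.logical_and(auto, manual).sum() / (auto.sum() + manual.sum())
        vol_dev = abs(auto.sum() - manual.sum()) / manual.sum()
        return dice, vol_dev

    dice, vol_dev = compare_masks("auto_csf.nii.gz", "manual_csf.nii.gz")
    print(f"Dice={dice:.3f}, volume deviation={100 * vol_dev:.1f}%")
    ```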

  12. Automated Fluid Interface System (AFIS)

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Automated remote fluid servicing will be necessary for future space missions, as future satellites will be designed for on-orbit consumable replenishment. In order to develop an on-orbit remote servicing capability, a standard interface between a tanker and the receiving satellite is needed. The objective of the Automated Fluid Interface System (AFIS) program is to design, fabricate, and functionally demonstrate compliance with all design requirements for an automated fluid interface system. A description and documentation of the Fairchild AFIS design is provided.

  13. Automated Engineering Design (AED); An approach to automated documentation

    NASA Technical Reports Server (NTRS)

    Mcclure, C. W.

    1970-01-01

    Automated Engineering Design (AED) is reviewed; it consists of a high-level systems programming language, a series of modular precoded subroutines, and a set of powerful software machine tools that effectively automate the production and design of new languages. AED is used primarily for development of problem- and user-oriented languages. Software production phases are diagrammed, and factors which inhibit effective documentation are evaluated.

  14. Brain tumors.

    PubMed Central

    Black, K. L.; Mazziotta, J. C.; Becker, D. P.

    1991-01-01

    Recent advances in experimental tumor biology are being applied to critical clinical problems of primary brain tumors. The expression of peripheral benzodiazepine receptors, which are sparse in normal brain, is increased as much as 20-fold in brain tumors. Experimental studies show promise in using labeled ligands to these receptors to identify the outer margins of malignant brain tumors. Whereas positron emission tomography has improved the dynamic understanding of tumors, the labeled selective tumor receptors with positron emitters will enhance the ability to specifically diagnose and greatly aid in the pretreatment planning for tumors. Modulation of these receptors will also affect tumor growth and metabolism. Novel methods to deliver antitumor agents to the brain and new approaches using biologic response modifiers also hold promise to further improve the management of brain tumors. PMID:1848735

  15. Fuzzy Control/Space Station automation

    NASA Technical Reports Server (NTRS)

    Gersh, Mark

    1990-01-01

    Viewgraphs on fuzzy control/space station automation are presented. Topics covered include: Space Station Freedom (SSF); SSF evolution; factors pointing to automation & robotics (A&R); astronaut office inputs concerning A&R; flight system automation and ground operations applications; transition definition program; and advanced automation software tools.

  16. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  17. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  18. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  19. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  20. 46 CFR 15.715 - Automated vessels.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Automated vessels. 15.715 Section 15.715 Shipping COAST... Limitations and Qualifying Factors § 15.715 Automated vessels. (a) Coast Guard acceptance of automated systems... automated system in establishing initial manning levels; however, until the system is proven reliable,...

  1. Human factors in cockpit automation

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.

    1984-01-01

    The rapid advance in microprocessor technology has made it possible to automate many functions that were previously performed manually. Several research areas have been identified which are basic to the question of the implementation of automation in the cockpit. One of the identified areas deserving further research is warning and alerting systems. Modern transport aircraft have had one warning and alerting system after another added, and computer-based cockpit systems make it possible to add even more. Three major areas of concern are: input methods (including voice, keyboard, touch panel, etc.), output methods and displays (from traditional instruments to CRTs, to exotic displays including the human voice), and training for automation. Training for operating highly automated systems requires considerably more attention than it has been given in the past. Training methods have not kept pace with the advent of flight-deck automation.

  2. Automating the Purple Crow Lidar

    NASA Astrophysics Data System (ADS)

    Hicks, Shannon; Sica, R. J.; Argall, P. S.

    2016-06-01

    The Purple Crow LiDAR (PCL) was built to measure short and long term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to describe the mirror's movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully-automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  3. Real Automation in the Field

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar; Mayero, Micaela; Bushnell, Dennis M. (Technical Monitor)

    2001-01-01

    We provide a package of strategies for automation of non-linear arithmetic in PVS. In particular, we describe a simplification procedure for the field of real numbers and a strategy for cancellation of common terms.

  4. Genetic circuit design automation.

    PubMed

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization.
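
    The evaluation reported above, scoring a designed circuit by the fraction of truth-table output states that behave as predicted, can be sketched as follows. The circuit and the measured outputs are invented; Cello itself works from Verilog source and characterized gate response functions.

    ```python
    # Sketch of the evaluation idea: score a designed logic circuit by the fraction
    # of truth-table output states that behave as predicted. The circuit and the
    # "measured" outputs are invented for illustration.
    from itertools import product

    def predicted_nor(a, b):
        return int(not (a or b))

    # Hypothetical measured outputs for each input state (1 = ON, 0 = OFF).
    measured = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 0}

    states = list(product([0, 1], repeat=2))
    correct = sum(measured[s] == predicted_nor(*s) for s in states)
    print(f"{correct}/{len(states)} output states functioned as predicted")
    ```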

  5. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New additional ROBOSIM features, like collision detection and new kinematics simulation methods are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested which is intended to be a part of a new knowledge-based multiple aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple aspect modeling methodology, a fault propagation model of this system is being built and is described.

  6. Automated Supernova Discovery (Abstract)

    NASA Astrophysics Data System (ADS)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovae as well as other transient events, in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSNs with a partial library. Since data are taken every cloud-free night, we must deal with varying atmospheric conditions and high background illumination from the moon. The software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs of magnitude 17.5 or brighter, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  7. Genetic circuit design automation.

    PubMed

    Nielsen, Alec A K; Der, Bryan S; Shin, Jonghyeon; Vaidyanathan, Prashant; Paralanov, Vanya; Strychalski, Elizabeth A; Ross, David; Densmore, Douglas; Voigt, Christopher A

    2016-04-01

    Computation can be performed in living cells by DNA-encoded circuits that process sensory information and control biological functions. Their construction is time-intensive, requiring manual part assembly and balancing of regulator expression. We describe a design environment, Cello, in which a user writes Verilog code that is automatically transformed into a DNA sequence. Algorithms build a circuit diagram, assign and connect gates, and simulate performance. Reliable circuit design requires the insulation of gates from genetic context, so that they function identically when used in different circuits. We used Cello to design 60 circuits for Escherichia coli (880,000 base pairs of DNA), for which each DNA sequence was built as predicted by the software with no additional tuning. Of these, 45 circuits performed correctly in every output state (up to 10 regulators and 55 parts), and across all circuits 92% of the output states functioned as predicted. Design automation simplifies the incorporation of genetic circuits into biotechnology projects that require decision-making, control, sensing, or spatial organization. PMID:27034378

  8. Automated Gas Distribution System

    NASA Astrophysics Data System (ADS)

    Starke, Allen; Clark, Henry

    2012-10-01

    The cyclotron of Texas A&M University is one of the few and prized cyclotrons in the country. Behind the scenes of the cyclotron is a confusing, and dangerous setup of the ion sources that supplies the cyclotron with particles for acceleration. To use this machine there is a time consuming, and even wasteful step by step process of switching gases, purging, and other important features that must be done manually to keep the system functioning properly, while also trying to maintain the safety of the working environment. Developing a new gas distribution system to the ion source prevents many of the problems generated by the older manually setup process. This developed system can be controlled manually in an easier fashion than before, but like most of the technology and machines in the cyclotron now, is mainly operated based on software programming developed through graphical coding environment Labview. The automated gas distribution system provides multi-ports for a selection of different gases to decrease the amount of gas wasted through switching gases, and a port for the vacuum to decrease the amount of time spent purging the manifold. The Labview software makes the operation of the cyclotron and ion sources easier, and safer for anyone to use.

  9. Automated call tracking systems

    SciTech Connect

    Hardesty, C.

    1993-03-01

    User Services groups are on the front line with user support. We are the first to hear about problems. The speed, accuracy, and intelligence with which we respond determines the user's perception of our effectiveness and our commitment to quality and service. To keep pace with the complex changes at our sites, we must have tools to help build a knowledge base of solutions, a history base of our users, and a record of every problem encountered. Recently, I completed a survey of twenty sites similar to the National Energy Research Supercomputer Center (NERSC). This informal survey reveals that 27% of the sites use a paper system to log calls, 60% employ homegrown automated call tracking systems, and 13% use a vendor-supplied system. Fifty-four percent of those using homegrown systems are exploring the merits of switching to a vendor-supplied system. The purpose of this paper is to provide guidelines for evaluating a call tracking system. In addition, insights are provided to assist User Services groups in selecting a system that fits their needs.

  10. Automated Microbial Metabolism Laboratory

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The Automated Microbial Metabolism Laboratory (AMML) 1971-1972 program involved the investigation of three separate life detection schemes. The first was the continued development of the labeled release experiment. The possibility of chamber reuse without intervening sterilization, to provide comparative biochemical information, was tested. Findings show that individual substrates or concentrations of antimetabolites may be sequentially added to a single test chamber. The second detection system investigated for possible inclusion in the AMML package of assays was nitrogen fixation, as detected by acetylene reduction. Thirdly, a series of preliminary steps were taken to investigate the feasibility of detecting biopolymers in soil. A strategy for the safe return to Earth of a Mars sample prior to manned landings on Mars is outlined. The program assumes that the probability of indigenous life on Mars is unity and then broadly presents the procedures for acquisition and analysis of the Mars sample in a manner to satisfy the scientific community and the public that adequate safeguards are being taken.

  11. Technology modernization assessment flexible automation

    SciTech Connect

    Bennett, D.W.; Boyd, D.R.; Hansen, N.H.; Hansen, M.A.; Yount, J.A.

    1990-12-01

    The objectives of this report are: to present technology assessment guidelines to be considered in conjunction with defense regulations before an automation project is developed; to give examples showing how assessment guidelines may be applied to a current project; and to present several potential areas where automation might be applied successfully in the depot system. Depots perform primarily repair and remanufacturing operations, with limited small batch manufacturing runs. While certain activities (such as Management Information Systems and warehousing) are directly applicable to either environment, the majority of applications will require combining existing and emerging technologies in different ways to meet the special needs of the depot remanufacturing environment. Industry generally enjoys the ability to make revisions to its product lines seasonally, followed by batch runs of thousands or more. Depot batch runs are in the tens, at best the hundreds, of parts with a potential for large variation in product mix; reconfiguration may be required on a week-to-week basis. This need for a higher degree of flexibility suggests a higher level of operator interaction and, in turn, control systems that go beyond the state of the art for less flexible automation and industry in general. This report investigates the benefits and barriers to automation and concludes that, while significant benefits do exist for automation, depots must be prepared to carefully investigate the technical feasibility of each opportunity and the life-cycle costs associated with implementation. Implementation is suggested in two ways: (1) develop an implementation plan for automation technologies based on results of small demonstration automation projects; (2) use phased implementation for both these and later stage automation projects to allow major technical and administrative risk issues to be addressed. 10 refs., 2 figs., 2 tabs. (JF)

  12. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  13. Evolution paths for advanced automation

    NASA Technical Reports Server (NTRS)

    Healey, Kathleen J.

    1990-01-01

    As Space Station Freedom (SSF) evolves, increased automation and autonomy will be required to meet Space Station Freedom Program (SSFP) objectives. As a precursor to the use of advanced automation within the SSFP, especially if it is to be used on SSF (e.g., to automate the operation of the flight systems), the underlying technologies will need to be elevated to a high level of readiness to ensure safe and effective operations. Ground facilities supporting the development of these flight systems -- from research and development laboratories through formal hardware and software development environments -- will be responsible for achieving these levels of technology readiness. These facilities will need to evolve to support the general evolution of the SSFP. This evolution will include support for increasing the use of advanced automation. The SSF Advanced Development Program has funded a study to define evolution paths for advanced automation within the SSFP's ground-based facilities which will enable, promote, and accelerate the appropriate use of advanced automation on-board SSF. The current capability of the test beds and facilities, such as the Software Support Environment, with regard to advanced automation, has been assessed and their desired evolutionary capabilities have been defined. Plans and guidelines for achieving this necessary capability have been constructed. The approach taken has combined in-depth interviews of test bed personnel at all SSF Work Package centers with awareness of relevant state-of-the-art technology and technology insertion methodologies. Key recommendations from the study include advocating a NASA-wide task force for advanced automation, and the creation of software prototype transition environments to facilitate the incorporation of advanced automation in the SSFP.

  14. An Automated Motion Detection and Reward System for Animal Training

    PubMed Central

    Miller, Brad; Lim, Audrey N; Heidbreder, Arnold F

    2015-01-01

    A variety of approaches has been used to minimize head movement during functional brain imaging studies in awake laboratory animals. Many laboratories expend substantial effort and time training animals to remain essentially motionless during such studies. We could not locate an “off-the-shelf” automated training system that suited our needs.  We developed a time- and labor-saving automated system to train animals to hold still for extended periods of time. The system uses a personal computer and modest external hardware to provide stimulus cues, monitor movement using commercial video surveillance components, and dispense rewards. A custom computer program automatically increases the motionless duration required for rewards based on performance during the training session but allows changes during sessions. This system was used to train cynomolgus monkeys (Macaca fascicularis) for awake neuroimaging studies using positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). The automated system saved the trainer substantial time, presented stimuli and rewards in a highly consistent manner, and automatically documented training sessions. We have limited data to prove the training system's success, drawn from the automated records during training sessions, but we believe others may find it useful. The system can be adapted to a range of behavioral training/recording activities for research or commercial applications, and the software is freely available for non-commercial use. PMID:26798573

  15. An Automated Motion Detection and Reward System for Animal Training.

    PubMed

    Miller, Brad; Lim, Audrey N; Heidbreder, Arnold F; Black, Kevin J

    2015-12-04

    A variety of approaches has been used to minimize head movement during functional brain imaging studies in awake laboratory animals. Many laboratories expend substantial effort and time training animals to remain essentially motionless during such studies. We could not locate an "off-the-shelf" automated training system that suited our needs.  We developed a time- and labor-saving automated system to train animals to hold still for extended periods of time. The system uses a personal computer and modest external hardware to provide stimulus cues, monitor movement using commercial video surveillance components, and dispense rewards. A custom computer program automatically increases the motionless duration required for rewards based on performance during the training session but allows changes during sessions. This system was used to train cynomolgus monkeys (Macaca fascicularis) for awake neuroimaging studies using positron emission tomography (PET) and functional magnetic resonance imaging (fMRI). The automated system saved the trainer substantial time, presented stimuli and rewards in a highly consistent manner, and automatically documented training sessions. We have limited data to prove the training system's success, drawn from the automated records during training sessions, but we believe others may find it useful. The system can be adapted to a range of behavioral training/recording activities for research or commercial applications, and the software is freely available for non-commercial use.
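
    The adaptive rule described above (lengthening the required motionless period when the animal is doing well and relaxing it when it is not) can be illustrated with a short Python sketch. The step sizes, success criterion, and duration limits below are hypothetical placeholders, not the parameters of the published system.

        def update_required_duration(current_s, recent_success_rate,
                                      step_up=0.25, step_down=0.5,
                                      target_rate=0.75, min_s=1.0, max_s=60.0):
            """Return the next required motionless duration, in seconds (illustrative values)."""
            if recent_success_rate >= target_rate:
                current_s += step_up      # performing well: ask for a longer hold
            else:
                current_s -= step_down    # struggling: make the task easier
            return min(max(current_s, min_s), max_s)

        duration = 2.0
        for rate in [0.9, 0.9, 0.8, 0.4, 0.9]:   # simulated per-block success rates
            duration = update_required_duration(duration, rate)
            print(f"required hold: {duration:.2f} s")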

  16. An automated method for high-definition transcranial direct current stimulation modeling.

    PubMed

    Huang, Yu; Su, Yuzhuo; Rorden, Christopher; Dmochowski, Jacek; Datta, Abhishek; Parra, Lucas C

    2012-01-01

    Targeted transcranial stimulation with electric currents requires accurate models of the current flow from scalp electrodes to the human brain. Idiosyncratic anatomy of individual brains and heads leads to significant variability in such current flows across subjects, thus necessitating accurate individualized head models. Here we report on an automated processing chain that computes current distributions in the head starting from a structural magnetic resonance image (MRI). The main purpose of automating this process is to reduce the substantial effort currently required for manual segmentation, electrode placement, and solving of finite element models. In doing so, several weeks of manual labor were reduced to no more than 4 hours of computation time and minimal user interaction, while current-flow results for the automated method deviated by less than 27.9% from the manual method. Key facilitating factors are the addition of three tissue types (skull, scalp and air) to a state-of-the-art automated segmentation process, morphological processing to correct small but important segmentation errors, and automated placement of small electrodes based on easily reproducible standard electrode configurations. We anticipate that such automated processing will become an indispensable tool to individualize transcranial direct current stimulation (tDCS) therapy.

  17. Automated image segmentation using support vector machines

    NASA Astrophysics Data System (ADS)

    Powell, Stephanie; Magnotta, Vincent A.; Andreasen, Nancy C.

    2007-03-01

    Neurodegenerative and neurodevelopmental diseases demonstrate problems associated with brain maturation and aging. Automated methods to delineate brain structures of interest are required to analyze large amounts of imaging data like that being collected in several ongoing multi-center studies. We have previously reported on using artificial neural networks (ANN) to define subcortical brain structures including the thalamus (0.88), caudate (0.85) and the putamen (0.81). In this work, apriori probability information was generated using Thirion's demons registration algorithm. The input vector consisted of apriori probability, spherical coordinates, and an iris of surrounding signal intensity values. We have applied the support vector machine (SVM) machine learning algorithm to automatically segment subcortical and cerebellar regions using the same input vector information. SVM architecture was derived from the ANN framework. Training was completed using a radial-basis function kernel with gamma equal to 5.5. Training was performed using 15,000 vectors collected from 15 training images in approximately 10 minutes. The resulting support vectors were applied to delineate 10 images not part of the training set. Relative overlap calculated for the subcortical structures was 0.87 for the thalamus, 0.84 for the caudate, 0.84 for the putamen, and 0.72 for the hippocampus. Relative overlap for the cerebellar lobes ranged from 0.76 to 0.86. The reliability of the SVM-based algorithm was similar to the inter-rater reliability between manual raters and can be achieved without rater intervention.
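
    A minimal sketch of this kind of voxel classifier, assuming the per-voxel feature vectors (a priori probability, spherical coordinates, and surrounding intensities) have already been extracted into arrays. The RBF kernel and the gamma value of 5.5 come from the abstract; scikit-learn, the array names, and the synthetic data are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Hypothetical training data: one row per voxel.
        # Columns: a priori probability, spherical coordinates, and an "iris" of
        # surrounding signal intensities (sizes chosen only for illustration).
        X_train = rng.random((2000, 4 + 25))
        y_train = rng.integers(0, 2, 2000)          # 1 = inside the structure, 0 = background

        clf = SVC(kernel="rbf", gamma=5.5)          # kernel and gamma quoted in the abstract
        clf.fit(X_train, y_train)

        # Apply the trained support vectors to voxels of an unseen image.
        X_test = rng.random((500, 4 + 25))
        labels = clf.predict(X_test)
        print("fraction labelled as structure:", labels.mean())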

  18. Automated ship image acquisition

    NASA Astrophysics Data System (ADS)

    Hammond, T. R.

    2008-04-01

    The experimental Automated Ship Image Acquisition System (ASIA) collects high-resolution ship photographs at a shore-based laboratory, with minimal human intervention. The system uses Automatic Identification System (AIS) data to direct a high-resolution SLR digital camera to ship targets and to identify the ships in the resulting photographs. The photo database is then searchable using the rich data fields from AIS, which include the name, type, call sign and various vessel identification numbers. The high-resolution images from ASIA are intended to provide information that can corroborate AIS reports (e.g., extract identification from the name on the hull) or provide information that has been omitted from the AIS reports (e.g., missing or incorrect hull dimensions, cargo, etc). Once assembled into a searchable image database, the images can be used for a wide variety of marine safety and security applications. This paper documents the author's experience with the practicality of composing photographs based on AIS reports alone, describing a number of ways in which this can go wrong, from errors in the AIS reports, to fixed and mobile obstructions and multiple ships in the shot. The frequency with which various errors occurred in automatically-composed photographs collected in Halifax harbour in wintertime was determined by manual examination of the images. 45% of the images examined were considered of a quality sufficient to read identification markings, numbers and text off the entire ship. One of the main technical challenges for ASIA lies in automatically differentiating good and bad photographs, so that few bad ones would be shown to human users. Initial attempts at automatic photo rating showed 75% agreement with manual assessments.
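
    One small, self-contained piece of the problem described above is turning an AIS-reported position into a camera pointing direction. The sketch below computes the great-circle initial bearing from a shore camera to a ship; the coordinates are made up, and this is only an illustration of the geometry, not the ASIA system's code.

        import math

        def bearing_deg(lat1, lon1, lat2, lon2):
            """Great-circle initial bearing from point 1 to point 2, in degrees clockwise from north."""
            phi1, phi2 = math.radians(lat1), math.radians(lat2)
            dlon = math.radians(lon2 - lon1)
            x = math.sin(dlon) * math.cos(phi2)
            y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
            return math.degrees(math.atan2(x, y)) % 360.0

        # Hypothetical positions: a shore camera and a ship position decoded from an AIS report.
        camera = (44.666, -63.568)
        ship = (44.640, -63.553)
        print(f"point camera to bearing {bearing_deg(*camera, *ship):.1f} deg")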

  19. Semi-automated and automated glioma grading using dynamic susceptibility-weighted contrast-enhanced perfusion MRI relative cerebral blood volume measurements

    PubMed Central

    Friedman, S N; Bambrough, P J; Kotsarini, C; Khandanpour, N; Hoggard, N

    2012-01-01

    Objective: Despite the established role of MRI in the diagnosis of brain tumours, histopathological assessment remains the clinically used technique, especially for the glioma group. Relative cerebral blood volume (rCBV) is a dynamic susceptibility-weighted contrast-enhanced perfusion MRI parameter that has been shown to correlate to tumour grade, but assessment requires a specialist and is time consuming. We developed analysis software to determine glioma gradings from perfusion rCBV scans in a manner that is quick, easy and does not require a specialist operator. Methods: MRI perfusion data from 47 patients with different histopathological grades of glioma were analysed with custom-designed software. Semi-automated analysis was performed with a specialist and non-specialist operator separately determining the maximum rCBV value corresponding to the tumour. Automated histogram analysis was performed by calculating the mean, standard deviation, median, mode, skewness and kurtosis of rCBV values. All values were compared with the histopathologically assessed tumour grade. Results: A strong correlation between specialist and non-specialist observer measurements was found. Significantly different values were obtained between tumour grades using both semi-automated and automated techniques, consistent with previous results. The raw (unnormalised) data single-pixel maximum rCBV semi-automated analysis value had the strongest correlation with glioma grade. Standard deviation of the raw data had the strongest correlation of the automated analysis. Conclusion: Semi-automated calculation of raw maximum rCBV value was the best indicator of tumour grade and does not require a specialist operator. Advances in knowledge: Both semi-automated and automated MRI perfusion techniques provide viable non-invasive alternatives to biopsy for glioma tumour grading. PMID:23175486
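
    The automated histogram analysis described above reduces to a handful of summary statistics over the rCBV values in a region. A minimal sketch, using a synthetic rCBV array in place of real perfusion data; the region, bin count, and values are illustrative only.

        import numpy as np
        from scipy import stats

        rcbv = np.abs(np.random.default_rng(1).normal(loc=3.0, scale=1.5, size=2500))  # synthetic region

        counts, edges = np.histogram(rcbv, bins=50)
        summary = {
            "max": rcbv.max(),                      # single-pixel maximum (the semi-automated marker)
            "mean": rcbv.mean(),
            "std": rcbv.std(ddof=1),                # strongest automated correlate per the abstract
            "median": np.median(rcbv),
            "mode": 0.5 * (edges[counts.argmax()] + edges[counts.argmax() + 1]),
            "skewness": stats.skew(rcbv),
            "kurtosis": stats.kurtosis(rcbv),
        }
        for name, value in summary.items():
            print(f"{name:>8s}: {value:.2f}")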

  20. Neurodegenerative changes in Alzheimer's disease: a comparative study of manual, semi-automated, and fully automated assessment using MRI

    NASA Astrophysics Data System (ADS)

    Fritzsche, Klaus H.; Giesel, Frederik L.; Heimann, Tobias; Thomann, Philipp A.; Hahn, Horst K.; Pantel, Johannes; Schröder, Johannes; Essig, Marco; Meinzer, Hans-Peter

    2008-03-01

    Objective quantification of disease specific neurodegenerative changes can facilitate diagnosis and therapeutic monitoring in several neuropsychiatric disorders. Reproducibility and easy-to-perform assessment are essential to ensure applicability in clinical environments. The aim of this comparative study is the evaluation of a fully automated approach that assesses atrophic changes in Alzheimer's disease (AD) and Mild Cognitive Impairment (MCI). 21 healthy volunteers (mean age 66.2), 21 patients with MCI (66.6), and 10 patients with AD (65.1) were enrolled. Subjects underwent extensive neuropsychological testing and MRI was conducted on a 1.5 Tesla clinical scanner. Atrophic changes were measured automatically by a series of image processing steps including state of the art brain mapping techniques. Results were compared with two reference approaches: a manual segmentation of the hippocampal formation and a semi-automated estimation of temporal horn volume, which is based upon interactive selection of two to six landmarks in the ventricular system. All approaches separated controls and AD patients significantly (10^-5 < p < 10^-4) and showed a slight but not significant increase of neurodegeneration for subjects with MCI compared to volunteers. The automated approach correlated significantly with the manual (r = -0.65, p < 10^-6) and semi-automated (r = -0.83, p < 10^-13) measurements. It proved highly accurate and at the same time maximized observer independence and time savings, and thus its usefulness for clinical routine.

  1. [Brain metastases].

    PubMed

    Brennum, Jannick; Kosteljanetz, Michael; Roed, Henrik Michael H

    2002-07-01

    The incidence of symptomatic brain metastases in Denmark is about 3500. In the present review, the aetiology, symptomatology, and diagnostic procedures are described. The main topic is a review of current treatments and the evidence for their efficacy. Treatment of brain metastases rarely cures the patient, the goal is rather to improve the quality of life and prolong survival. Without treatment, the median survival following diagnosis of brain metastases is about one month, with steroid treatment two months, with whole brain irradiation four to six months, and after surgery or stereotactic radiosurgery 10-12 months. A relatively simple treatment scheme based on the number of brain metastases and the overall condition of the patient is provided.

  2. Brain peroxisomes.

    PubMed

    Trompier, D; Vejux, A; Zarrouk, A; Gondcaille, C; Geillon, F; Nury, T; Savary, S; Lizard, G

    2014-03-01

    Peroxisomes are essential organelles in higher eukaryotes as they play a major role in numerous metabolic pathways and redox homeostasis. Some peroxisomal abnormalities, which are often not compatible with life or normal development, were identified in severe demyelinating and neurodegenerative brain diseases. The metabolic roles of peroxisomes, especially in the brain, are described and human brain peroxisomal disorders resulting from a defect in peroxisome biogenesis or in a single peroxisomal enzyme are listed. The brain abnormalities encountered in these disorders (demyelination, oxidative stress, inflammation, cell death, neuronal migration, differentiation) are described and their pathogenesis is discussed. Finally, the contribution of peroxisomal dysfunctions to the alterations of brain functions during aging and to the development of Alzheimer's disease is considered.

  3. Automated system for analyzing the activity of individual neurons

    NASA Technical Reports Server (NTRS)

    Bankman, Isaac N.; Johnson, Kenneth O.; Menkes, Alex M.; Diamond, Steve D.; Oshaughnessy, David M.

    1993-01-01

    This paper presents a signal processing system that: (1) provides an efficient and reliable instrument for investigating the activity of neuronal assemblies in the brain; and (2) demonstrates the feasibility of generating the command signals of prostheses using the activity of relevant neurons in disabled subjects. The system operates online, in a fully automated manner and can recognize the transient waveforms of several neurons in extracellular neurophysiological recordings. Optimal algorithms for detection, classification, and resolution of overlapping waveforms are developed and evaluated. Full automation is made possible by an algorithm that can set appropriate decision thresholds and an algorithm that can generate templates on-line. The system is implemented with a fast IBM PC compatible processor board that allows on-line operation.
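
    The paper's detection, classification, and overlap-resolution algorithms are not reproduced here, but the idea of setting a detection threshold automatically from the recording itself can be sketched with a common robust noise estimate. This is a generic illustration under that assumption, not the authors' optimal algorithms; the sampling rate, multiplier, and synthetic trace are hypothetical.

        import numpy as np
        from scipy.signal import find_peaks

        rng = np.random.default_rng(2)
        fs = 30000                                    # hypothetical sampling rate, Hz
        x = rng.normal(0.0, 1.0, 3 * fs)              # synthetic extracellular trace (noise only)
        x[[10000, 45000, 70000]] += 8.0               # inject three artificial "spikes"

        sigma = np.median(np.abs(x)) / 0.6745         # robust estimate of the noise standard deviation
        threshold = 5.0 * sigma                       # automatic threshold; the multiplier is a common choice

        peaks, _ = find_peaks(x, height=threshold)
        print(f"threshold = {threshold:.2f}, events detected = {len(peaks)}")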

  4. In vivo robotics: the automation of neuroscience and other intact-system biological fields

    PubMed Central

    Kodandaramaiah, Suhasa B.; Boyden, Edward S.; Forest, Craig R.

    2013-01-01

    Robotic and automation technologies have played a huge role in in vitro biological science, having proved critical for scientific endeavors such as genome sequencing and high-throughput screening. Robotic and automation strategies are beginning to play a greater role in in vivo and in situ sciences, especially when it comes to the difficult in vivo experiments required for understanding the neural mechanisms of behavior and disease. In this perspective, we discuss the prospects for robotics and automation to impact neuroscientific and intact-system biology fields. We discuss how robotic innovations might be created to open up new frontiers in basic and applied neuroscience, and present a concrete example with our recent automation of in vivo whole cell patch clamp electrophysiology of neurons in the living mouse brain. PMID:23841584

  5. In vivo robotics: the automation of neuroscience and other intact-system biological fields.

    PubMed

    Kodandaramaiah, Suhasa B; Boyden, Edward S; Forest, Craig R

    2013-12-01

    Robotic and automation technologies have played a huge role in in vitro biological science, having proved critical for scientific endeavors such as genome sequencing and high-throughput screening. Robotic and automation strategies are beginning to play a greater role in in vivo and in situ sciences, especially when it comes to the difficult in vivo experiments required for understanding the neural mechanisms of behavior and disease. In this perspective, we discuss the prospects for robotics and automation to influence neuroscientific and intact-system biology fields. We discuss how robotic innovations might be created to open up new frontiers in basic and applied neuroscience and present a concrete example with our recent automation of in vivo whole-cell patch clamp electrophysiology of neurons in the living mouse brain.

  6. Brain investigation and brain conceptualization

    PubMed Central

    Redolfi, Alberto; Bosco, Paolo; Manset, David; Frisoni, Giovanni B.

    Summary: The brain of a patient with Alzheimer’s disease (AD) undergoes changes starting many years before the development of the first clinical symptoms. The recent availability of large prospective datasets makes it possible to create sophisticated brain models of healthy subjects and patients with AD, showing pathophysiological changes occurring over time. However, these models are still inadequate; representations are mainly single-scale and they do not account for the complexity and interdependence of brain changes. Brain changes in AD patients occur at different levels and for different reasons: at the molecular level, changes are due to amyloid deposition; at the cellular level, to loss of neuronal synapses; and at the tissue level, to connectivity disruption. All cause extensive atrophy of the brain as a whole. Initiatives aiming to model the whole human brain have been launched in Europe and the US with the goal of reducing the burden of brain diseases. In this work, we describe a new approach to earlier diagnosis based on a multimodal and multiscale brain concept, built upon existing and well-characterized single modalities. PMID:24139654

  7. Automation: Decision Aid or Decision Maker?

    NASA Technical Reports Server (NTRS)

    Skitka, Linda J.

    1998-01-01

    This study clarified that automation bias is something unique to automated decision making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events on the same tasks with and without an automated decision aid, we were able to determine that at least the omission error part of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid did lead to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate. People performed almost exactly at the same level of reliability as the automation (which across events was 88% reliable). However, it is also clear that the presence of less than 100% accurate automated decision aids creates a context in which new kinds of errors in decision making can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.

  8. Automated protein NMR resonance assignments.

    PubMed

    Wan, Xiang; Xu, Dong; Slupsky, Carolyn M; Lin, Guohui

    2003-01-01

    NMR resonance peak assignment is one of the key steps in solving an NMR protein structure. The assignment process links resonance peaks to individual residues of the target protein sequence, providing the prerequisite for establishing intra- and inter-residue spatial relationships between atoms. The assignment process is tedious and time-consuming, and can take many weeks. Though there exist a number of computer programs to assist the assignment process, many NMR labs are still doing the assignments manually to ensure quality. This paper presents (1) a new scoring system for mapping spin systems to residues, (2) an automated adjacency information extraction procedure from NMR spectra, and (3) a very fast assignment algorithm based on our previously proposed greedy filtering method and a maximum matching algorithm to automate the assignment process. The computational tests on 70 instances of (pseudo) experimental NMR data of 14 proteins demonstrate that the new scoring scheme has much better discerning power with the aid of adjacency information between spin systems simulated across various NMR spectra. Typically, with automated extraction of adjacency information, our method achieves nearly complete assignments for most of the proteins. These experiments suggest that the fast automated assignment algorithm, together with the new scoring scheme and automated adjacency extraction, may be ready for practical use. PMID:16452794
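
    The core computational step, mapping spin systems to residues so that the total score is maximized, can be illustrated with a standard assignment solver over a toy score matrix. The authors' method combines greedy filtering with maximum matching; the Hungarian algorithm below is only a stand-in for that idea, and the score matrix is random, not real NMR data.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        rng = np.random.default_rng(3)
        scores = rng.random((6, 6))                   # scores[i, j]: spin system i matched to residue j

        # linear_sum_assignment minimizes total cost, so negate the scores to maximize them.
        rows, cols = linear_sum_assignment(-scores)
        for spin, res in zip(rows, cols):
            print(f"spin system {spin} -> residue {res}  (score {scores[spin, res]:.2f})")
        print("total score:", round(scores[rows, cols].sum(), 2))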

  9. Space power subsystem automation technology

    NASA Technical Reports Server (NTRS)

    Graves, J. R. (Compiler)

    1982-01-01

    The technology issues involved in power subsystem automation and the reasonable objectives to be sought in such a program were discussed. The complexities, uncertainties, and alternatives of power subsystem automation, along with the advantages from both an economic and a technological perspective were considered. Whereas most spacecraft power subsystems now use certain automated functions, the idea of complete autonomy for long periods of time is almost inconceivable. Thus, it seems prudent that the technology program for power subsystem automation be based upon a growth scenario which should provide a structured framework of deliberate steps to enable the evolution of space power subsystems from the current practice of limited autonomy to a greater use of automation with each step being justified on a cost/benefit basis. Each accomplishment should move toward the objectives of decreased requirement for ground control, increased system reliability through onboard management, and ultimately lower energy cost through longer life systems that require fewer resources to operate and maintain. This approach seems well-suited to the evolution of more sophisticated algorithms and eventually perhaps even the use of some sort of artificial intelligence. Multi-hundred kilowatt systems of the future will probably require an advanced level of autonomy if they are to be affordable and manageable.

  10. [Brain concussion].

    PubMed

    Pälvimäki, Esa-Pekka; Siironen, Jari; Pohjola, Juha; Hernesniemi, Juha

    2011-01-01

    Brain concussion is a common disturbance caused by external forces or acceleration affecting the head. It may be accompanied by transient loss of consciousness and amnesia. Typical symptoms include headache, nausea and dizziness; these may remain for a week or two. Some patients may experience a transient inability to create new memories or other brief impairment of mental functioning. Treatment is symptomatic. Some patients may suffer from prolonged symptoms, the connection of which with brain concussion is difficult to show. Almost invariably the prognosis of brain concussion is good.

  11. Classification of CT-brain slices based on local histograms

    NASA Astrophysics Data System (ADS)

    Avrunin, Oleg G.; Tymkovych, Maksym Y.; Pavlov, Sergii V.; Timchik, Sergii V.; Kisała, Piotr; Orakbaev, Yerbol

    2015-12-01

    Neurosurgical intervention is a very complicated process. Modern operating procedures are based on data such as CT, MRI, etc. Automated analysis of these data is an important task for researchers. Some modern methods of brain-slice segmentation use additional data to process these images. Classification can be used to obtain this information. To classify CT images of the brain, we suggest using local histograms and features extracted from them. The paper shows the process of feature extraction and classification of CT slices of the brain. The process of feature extraction is specialized for axial cross-sections of the brain. The work can be applied to medical neurosurgical systems.
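
    A minimal sketch of local-histogram feature extraction for an axial slice, assuming the slice is already loaded as a 2-D array of intensities; the block size, bin count, and intensity range are illustrative choices rather than the values used in the paper.

        import numpy as np

        def local_histogram_features(slice_img, block=64, bins=16, lo=-1000, hi=1000):
            """Split the slice into blocks and concatenate each block's normalized histogram."""
            feats = []
            for r in range(0, slice_img.shape[0] - block + 1, block):
                for c in range(0, slice_img.shape[1] - block + 1, block):
                    patch = slice_img[r:r + block, c:c + block]
                    hist, _ = np.histogram(patch, bins=bins, range=(lo, hi), density=True)
                    feats.append(hist)
            return np.concatenate(feats)

        fake_slice = np.random.default_rng(4).integers(-1000, 1000, (256, 256)).astype(float)
        print(local_histogram_features(fake_slice).shape)   # 16 blocks x 16 bins = (256,) feature vector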

  12. Review and Challenges of Brain Analysis through DTI Measurements.

    PubMed

    Garin-Muga, Alba; Borro, Diego

    2014-01-01

    Medical images are being studied to analyse the brain in neurological disorders. Measurements extracted from diffusion tensor imaging (DTI), such as fractional anisotropy (FA), describe the brain changes caused by diseases. However, there is no single best method for quantitative brain analysis. This paper presents a review of the existing methods and software tools for brain analysis through DTI measurements. It also states some challenges that current software tools still have to meet in order to improve automation and usability and become smarter software tools. PMID:25488208

  13. Automated segmentation of the human hippocampus along its longitudinal axis.

    PubMed

    Lerma-Usabiaga, Garikoitz; Iglesias, Juan Eugenio; Insausti, Ricardo; Greve, Douglas N; Paz-Alonso, Pedro M

    2016-09-01

    The human hippocampal formation is a crucial brain structure for memory and cognitive function that is closely related to other subcortical and cortical brain regions. Recent neuroimaging studies have revealed differences along the hippocampal longitudinal axis in terms of structure, connectivity, and function, stressing the importance of improving the reliability of the available segmentation methods that are typically used to divide the hippocampus into its anterior and posterior parts. However, current segmentation conventions present two main sources of variability related to manual operations intended to correct in-scanner head position across subjects and the selection of dividing planes along the longitudinal axis. Here, our aim was twofold: (1) to characterize inter- and intra-rater variability associated with these manual operations and compare manual (landmark based) and automatic (percentage based) hippocampal anterior-posterior segmentation procedures; and (2) to propose and test automated rotation methods based on approximating the hippocampal longitudinal axis to a straight line (estimated with principal component analysis, PCA) or a quadratic Bézier curve (fitted with numerical methods); as well as an automated anterior-posterior hippocampal segmentation procedure based on the percentage-based method. Our results reveal that automated rotation and segmentation procedures, used in combination or independently, minimize inconsistencies generated by the accumulation of manual operations while providing higher statistical power to detect well-known effects. A Matlab-based implementation of these procedures is made publicly available to the research community. Hum Brain Mapp 37:3353-3367, 2016. © 2016 Wiley Periodicals, Inc. PMID:27159325
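
    A minimal sketch of the two automated ideas named above: estimating the longitudinal axis as a straight line with PCA and splitting the structure at a fixed percentage along that axis. The 35% cut-off, the synthetic point cloud, and the sign convention of the axis are illustrative assumptions, not the published protocol or its Matlab implementation.

        import numpy as np

        def split_along_axis(coords, anterior_fraction=0.35):
            """coords: (N, 3) voxel coordinates of one hippocampus; returns (anterior, posterior)."""
            centred = coords - coords.mean(axis=0)
            _, _, vt = np.linalg.svd(centred, full_matrices=False)
            axis = vt[0]                              # first principal direction = longitudinal axis
            projection = centred @ axis               # position of each voxel along that axis
            cut = np.quantile(projection, anterior_fraction)
            return coords[projection <= cut], coords[projection > cut]

        pts = np.random.default_rng(5).normal(size=(500, 3)) * [10.0, 2.0, 2.0]   # elongated point cloud
        anterior, posterior = split_along_axis(pts)
        print(len(anterior), len(posterior))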

  14. Design automation for integrated circuits

    NASA Astrophysics Data System (ADS)

    Newell, S. B.; de Geus, A. J.; Rohrer, R. A.

    1983-04-01

    Consideration is given to the development status of the use of computers in automated integrated circuit design methods, which promise the minimization of both design time and design error incidence. Integrated circuit design encompasses two major tasks: logic specification, in which the goal is a logic diagram that accurately represents the desired electronic function, and physical specification, in which the goal is an exact description of the physical locations of all circuit elements and their interconnections on the chip. Design automation not only saves money by reducing design and fabrication time, but also helps the community of systems and logic designers to work more innovatively. Attention is given to established design automation methodologies, programmable logic arrays, and design shortcuts.

  15. Automated power management and control

    NASA Technical Reports Server (NTRS)

    Dolce, James L.

    1991-01-01

    A comprehensive automation design is being developed for Space Station Freedom's electric power system. A joint effort between NASA's Office of Aeronautics and Exploration Technology and NASA's Office of Space Station Freedom, it strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. The initial station operation will use ground-based dispatchers to perform the necessary command and control tasks. These tasks constitute planning and decision-making activities that strive to eliminate unplanned outages. We perceive an opportunity to help these dispatchers make fast and consistent on-line decisions by automating three key tasks: failure detection and diagnosis, resource scheduling, and security analysis. Expert systems will be used for the diagnostics and for the security analysis; conventional algorithms will be used for the resource scheduling.

  16. Automated mapping of hammond's landforms

    USGS Publications Warehouse

    Gallant, A.L.; Brown, D.D.; Hoffer, R.M.

    2005-01-01

    We automated a method for mapping Hammond's landforms over large landscapes using digital elevation data. We compared our results against Hammond's published landform maps, derived using manual interpretation procedures. We found general agreement in landform patterns mapped by the manual and the automated approaches, and very close agreement in characterization of local topographic relief. The two approaches produced different interpretations of intermediate landforms, which relied upon quantification of the proportion of landscape having gently sloping terrain. This type of computation is more efficiently and consistently applied by computer than by a human analyst. Today's ready access to digital data and computerized geospatial technology provides a good foundation for mapping terrain features, but the mapping criteria guiding manual techniques in the past may not be appropriate for automated approaches. We suggest that future efforts center on the advantages offered by digital advancements in refining an approach to better characterize complex landforms. © 2005 IEEE.
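
    The computation singled out above, the proportion of a landscape with gently sloping terrain, is straightforward to automate from a digital elevation model. A sketch with a synthetic DEM follows; the slope cut-off and cell size used here are illustrative assumptions, not a statement of Hammond's criteria.

        import numpy as np

        def gentle_slope_fraction(dem, cell_size=30.0, max_percent_slope=8.0):
            """Fraction of DEM cells whose slope is at or below the given percent slope."""
            dz_dy, dz_dx = np.gradient(dem, cell_size)          # rise per unit horizontal distance
            percent_slope = 100.0 * np.hypot(dz_dx, dz_dy)
            return float(np.mean(percent_slope <= max_percent_slope))

        y, x = np.mgrid[0:200, 0:200]
        dem = 40.0 * np.sin(x / 15.0) + 0.05 * y                # synthetic terrain, in metres
        print(f"gently sloping fraction: {gentle_slope_fraction(dem):.2f}")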

  17. Automated gaseous criteria pollutant audits

    SciTech Connect

    Watson, J.P.

    1998-12-31

    The Quality Assurance Section (QAS) of the California Air Resources Board (CARB) began performing automated gaseous audits of its ambient air monitoring sites in July 1996. The concept of automated audits evolved from the constant streamlining of the through-the-probe audit process. Continual audit van development and the desire to utilize advanced technology to save time and improve the accuracy of the overall audit process also contributed to the concept. The automated audit process is a computer program which controls an audit van's ambient gas calibration system, isolated relay and analog to digital cards, and a monitoring station's data logging system. The program instructs the audit van's gas calibration system to deliver specified audit concentrations to a monitoring station's instruments through their collection probe inlet. The monitoring station's responses to the audit concentrations are obtained by the program polling the station's datalogger through its RS-232 port. The program calculates relevant audit statistics and stores all data collected during an audit in a relational database. Planning for the development of an automated gaseous audit system began in earnest in 1993, when the CARB purchased computerized ambient air calibration systems which could be remotely controlled by computer through their serial ports. After receiving all the required components of the automated audit system, they were individually tested to confirm their correct operation. Subsequently, a prototype program was developed to perform through-the-probe automated ozone audits. Numerous simulated ozone audits documented the program's ability to control audit equipment and extract data from a monitoring station's data logging system. The program was later modified to incorporate the capability to perform audits for carbon monoxide, total hydrocarbons, methane, nitrogen dioxide, sulfur dioxide, and hydrogen sulfide.

  18. Brain radiation - discharge

    MedlinePlus

    Radiation - brain - discharge; Cancer-brain radiation; Lymphoma - brain radiation; Leukemia - brain radiation ... Decadron) while you are getting radiation to the brain. It may make you hungrier, cause leg swelling ...

  19. Right Hemisphere Brain Damage

    MedlinePlus

    What is right hemisphere brain damage? Right hemisphere brain damage (RHD) is damage ...

  20. Brain Development

    MedlinePlus

    ... new neural connections every second. This growing brain development is influenced by many factors, including a child’s relationships, experiences and environment.

  1. BOA: Framework for automated builds

    SciTech Connect

    N. Ratnikova et al.

    2003-09-30

    Managing large-scale software products is a complex software engineering task. The automation of the software development, release and distribution process is most beneficial in large collaborations, where a large number of developers, multiple platforms and a distributed environment are typical factors. This paper describes the Build and Output Analyzer framework and its components that have been developed in CMS to facilitate software maintenance and improve software quality. The system makes it possible to generate, control and analyze various types of automated software builds and tests, such as regular rebuilds of the development code, software integration for releases and installation of the existing versions.

  2. Advanced automation for space missions

    SciTech Connect

    Freitas, R.A., Jr.; Healy, T.J.; Long, J.E.

    1982-01-01

    A NASA/ASEE summer study conducted at the University of Santa Clara in 1980 examined the feasibility of using advanced artificial intelligence and automation technologies in future NASA space missions. Four candidate applications missions were considered: an intelligent earth-sensing information system; an autonomous space exploration system; an automated space manufacturing facility; and a self-replicating, growing lunar factory. The study assessed the various artificial intelligence and machine technologies which must be developed if such sophisticated missions are to become feasible by the century's end. 18 references.

  3. Automating Shallow Seismic Imaging

    SciTech Connect

    Steeples, Don W.

    2004-12-09

    This seven-year, shallow-seismic reflection research project had the aim of improving geophysical imaging of possible contaminant flow paths. Thousands of chemically contaminated sites exist in the United States, including at least 3,700 at Department of Energy (DOE) facilities. Imaging technologies such as shallow seismic reflection (SSR) and ground-penetrating radar (GPR) sometimes are capable of identifying geologic conditions that might indicate preferential contaminant-flow paths. Historically, SSR has been used very little at depths shallower than 30 m, and even more rarely at depths of 10 m or less. Conversely, GPR is rarely useful at depths greater than 10 m, especially in areas where clay or other electrically conductive materials are present near the surface. Efforts to image the cone of depression around a pumping well using seismic methods were only partially successful (for complete references of all research results, see the full Final Technical Report, DOE/ER/14826-F), but peripheral results included development of SSR methods for depths shallower than one meter, a depth range that had not been achieved before. Imaging at such shallow depths, however, requires geophone intervals of the order of 10 cm or less, which makes such surveys very expensive in terms of human time and effort. We also showed that SSR and GPR could be used in a complementary fashion to image the same volume of earth at very shallow depths. The primary research focus of the second three-year period of funding was to develop and demonstrate an automated method of conducting two-dimensional (2D) shallow-seismic surveys with the goal of saving time, effort, and money. Tests involving the second generation of the hydraulic geophone-planting device dubbed the ''Autojuggie'' showed that large numbers of geophones can be placed quickly and automatically and can acquire high-quality data, although not under rough topographic conditions. In some easy-access environments, this device could

  4. Automated Tools for Subject Matter Expert Evaluation of Automated Scoring

    ERIC Educational Resources Information Center

    Williamson, David M.; Bejar, Isaac I.; Sax, Anne

    2004-01-01

    As automated scoring of complex constructed-response examinations reaches operational status, the process of evaluating the quality of resultant scores, particularly in contrast to scores of expert human graders, becomes as complex as the data itself. Using a vignette from the Architectural Registration Examination (ARE), this article explores the…

  5. Automation U.S.A.: Overcoming Barriers to Automation.

    ERIC Educational Resources Information Center

    Brody, Herb

    1985-01-01

    Although labor unions and inadequate technology play minor roles, the principal barrier to factory automation is "fear of change." Related problems include long-term benefits, nontechnical executives, and uncertainty of factory cost accounting. Industry support for university programs is helping to educate engineers to design, implement, and…

  6. Brain imaging and brain function

    SciTech Connect

    Sokoloff, L.

    1985-01-01

    This book is a survey of the applications of imaging studies of regional cerebral blood flow and metabolism to the investigation of neurological and psychiatric disorders. Contributors review imaging techniques and strategies for measuring regional cerebral blood flow and metabolism, for mapping functional neural systems, and for imaging normal brain functions. They then examine the applications of brain imaging techniques to the study of such neurological and psychiatric disorders as: cerebral ischemia; convulsive disorders; cerebral tumors; Huntington's disease; Alzheimer's disease; depression and other mood disorders. A state-of-the-art report on magnetic resonance imaging of the brain and central nervous system rounds out the book's coverage.

  7. Modular workcells: modern methods for laboratory automation.

    PubMed

    Felder, R A

    1998-12-01

    Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.

  8. What Is an Automated External Defibrillator?

    MedlinePlus

    An automated external defibrillator (AED) is a lightweight, portable device ... AED? Non-medical personnel such as police, fire service personnel, flight attendants, security guards and other lay ...

  9. Understanding human management of automation errors

    PubMed Central

    McBride, Sara E.; Rogers, Wendy A.; Fisk, Arthur D.

    2013-01-01

    Automation has the potential to aid humans with a diverse set of tasks and support overall system performance. Automated systems are not always reliable, and when automation errs, humans must engage in error management, which is the process of detecting, understanding, and correcting errors. However, this process of error management in the context of human-automation interaction is not well understood. Therefore, we conducted a systematic review of the variables that contribute to error management. We examined relevant research in human-automation interaction and human error to identify critical automation, person, task, and emergent variables. We propose a framework for management of automation errors to incorporate and build upon previous models. Further, our analysis highlights variables that may be addressed through design and training to positively influence error management. Additional efforts to understand the error management process will contribute to automation designed and implemented to support safe and effective system performance. PMID:25383042

  10. Ask the experts: automation: part I.

    PubMed

    Allinson, John L; Blick, Kenneth E; Cohen, Lucinda; Higton, David; Li, Ming

    2013-08-01

    Bioanalysis invited a selection of leading researchers to express their views on automation in the bioanalytical laboratory. The topics discussed include the challenges that the modern bioanalyst faces when integrating automation into existing drug-development processes, the impact of automation and how they envision the modern bioanalytical laboratory changing in the near future. Their enlightening responses provide a valuable insight into the impact of automation and the future of the constantly evolving bioanalytical laboratory.

  11. Progress in Fully Automated Abdominal CT Interpretation

    PubMed Central

    Summers, Ronald M.

    2016-01-01

    OBJECTIVE: Automated analysis of abdominal CT has advanced markedly over just the last few years. Fully automated assessments of organs, lymph nodes, adipose tissue, muscle, bowel, spine, and tumors are some examples of areas where tremendous progress has been made. Computer-aided detection of lesions has also improved dramatically. CONCLUSION: This article reviews the progress and provides insights into what is in store in the near future for automated analysis for abdominal CT, ultimately leading to fully automated interpretation. PMID:27101207

  12. Automated Cataloging. SPEC Kit 47.

    ERIC Educational Resources Information Center

    Association of Research Libraries, Washington, DC. Office of Management Studies.

    Results of a 1978 Association of Research Libraries (ARL) survey indicated that 68 (89%) of responding libraries utilized an automated cataloging system. Of those 68, 53 participated in the Ohio College Library Center (OCLC), five in BALLOTS, and the rest in other networks or local systems. At the beginning of this collection, a concise summary…

  13. Automated species identification: why not?

    PubMed Central

    Gaston, Kevin J; O'Neill, Mark A

    2004-01-01

    Where possible, automation has been a common response of humankind to many activities that have to be repeated numerous times. The routine identification of specimens of previously described species has many of the characteristics of other activities that have been automated, and poses a major constraint on studies in many areas of both pure and applied biology. In this paper, we consider some of the reasons why automated species identification has not become widely employed, and whether it is a realistic option, addressing the notions that it is too difficult, too threatening, too different or too costly. Although recognizing that there are some very real technical obstacles yet to be overcome, we argue that progress in the development of automated species identification is extremely encouraging and that such an approach has the potential to make a valuable contribution to reducing the burden of routine identifications. Vision and enterprise are perhaps more limiting at present than practical constraints on what might possibly be achieved. PMID:15253351

  14. Fully automated solid weighing workstation.

    PubMed

    Wong, Stephen K-F; Lu, YiFeng; Heineman, William; Palmer, Janice; Courtney, Carter

    2005-08-01

    A fully automated, solid-to-solid weighing workstation (patent pending) is described in this article. The core of this automated process is the use of an electrostatically charged pipette tip to attract solid particles on its outside surface. The particles were then dislodged into a 1.2-mL destination vial in a microbalance by spinning the pipette tip. Textures of solid that could be weighed included powder, crystalline, liquid, and semi-solid substances. The workstation can pick up submilligram quantities of sample (≥0.3 mg) from source vials containing as little as 1 mg. The destination vials containing the samples were stored in a 96-well rack to enable subsequent automated liquid handling. Using bovine serum albumin as test solid, the coefficient of variation of the protein concentration for 48 samples is less than 6%. The workstation was used successfully to weigh out 48 different synthetic compounds. Time required for automated weighing was similar to manual weighing. The use of this workstation reduced hands-on time by 90% and thus reduced exposure to potentially toxic compounds. In addition, it minimized sample waste and reduced artifacts due to the poor solubility of compound in solvents. Moreover, it enabled compounds synthesized in milligram quantities to be weighed out and tested in biological assays.

  15. Automated Filtering of Internet Postings.

    ERIC Educational Resources Information Center

    Rosenfeld, Louis B.; Holland, Maurita P.

    1994-01-01

    Discussion of the use of dynamic data resources, such as Internet LISTSERVs or Usenet newsgroups, focuses on an experiment using an automated filtering system with Usenet newsgroups. Highlights include user satisfaction, based on retrieval size, data sources, and user interface and the need for some human mediation. (Contains two references.) (LRW)

  16. Automated analysis of oxidative metabolites

    NASA Technical Reports Server (NTRS)

    Furner, R. L. (Inventor)

    1974-01-01

    An automated system for the study of drug metabolism is described. The system monitors the oxidative metabolites of aromatic amines and of compounds which produce formaldehyde on oxidative dealkylation. It includes color developing compositions suitable for detecting hydroxylated aromatic amines and formaldehyde.

  17. Teacherbot: Interventions in Automated Teaching

    ERIC Educational Resources Information Center

    Bayne, Sian

    2015-01-01

    Promises of "teacher-light" tuition and of enhanced "efficiency" via the automation of teaching have been with us since the early days of digital education, sometimes embraced by academics and institutions, and sometimes resisted as a set of moves which are damaging to teacher professionalism and to the humanistic values of…

  18. Automating the conflict resolution process

    NASA Technical Reports Server (NTRS)

    Wike, Jeffrey S.

    1991-01-01

    The purpose is to initiate a discussion of how the conflict resolution process at the Network Control Center can be made more efficient. Described here is how resource conflicts are currently resolved, as well as the impacts of automating conflict resolution in the ATDRSS era. A variety of conflict resolution strategies are presented.

  19. Automating a High School Restroom.

    ERIC Educational Resources Information Center

    Ritner-Heir, Robbin

    1999-01-01

    Discusses how one high school transformed its restrooms into cleaner and more vandal-resistant environments by automating them. Solutions discussed include installing perforated stainless steel panel ceilings, using epoxy-based paint for walls, selecting china commode fixtures instead of stainless steel, installing electronic faucets and sensors,…

  20. Automation; The New Industrial Revolution.

    ERIC Educational Resources Information Center

    Arnstein, George E.

    Automation is a word that describes the workings of computers and the innovations of automatic transfer machines in the factory. As the hallmark of the new industrial revolution, computers displace workers and create a need for new skills and retraining programs. With improved communication between industry and the educational community to…

  1. Safety in the Automated Office.

    ERIC Educational Resources Information Center

    Graves, Pat R.; Greathouse, Lillian R.

    1990-01-01

    Office automation has introduced new hazards to the workplace: electrical hazards related to computer wiring, musculoskeletal problems resulting from use of computer terminals and design of work stations, and environmental concerns related to ventilation, noise levels, and office machine chemicals. (SK)

  2. Library Automation: Guidelines to Costing.

    ERIC Educational Resources Information Center

    Ford, Geoffrey

    As with all new programs, the costs associated with library automation must be carefully considered before implementation. This document suggests guidelines to be followed and areas to be considered in the costing of library procedures. An existing system model has been suggested as a standard (Appendix A) and a classification of library tasks…

  3. Delaware: Library Automation and Networking.

    ERIC Educational Resources Information Center

    Sloan, Tom

    1996-01-01

    Describes automation and networking activities among Delaware libraries, including integrated library systems for public libraries, the Delaware Technical and Community College telecommunications network, Delaware Public Library Internet access planning, digital resources, a computer/technology training center, and the Delaware Center for…

  4. Automation of Space Inventory Management

    NASA Technical Reports Server (NTRS)

    Fink, Patrick W.; Ngo, Phong; Wagner, Raymond; Barton, Richard; Gifford, Kevin

    2009-01-01

    This viewgraph presentation describes the utilization of automated space-based inventory management through handheld RFID readers and BioNet Middleware. The contents include: 1) Space-Based Inventory Management; 2) Real-Time RFID Location and Tracking; 3) Surface Acoustic Wave (SAW) RFID; and 4) BioNet Middleware.

  5. Automation on the Laboratory Bench.

    ERIC Educational Resources Information Center

    Legrand, M.; Foucard, A.

    1978-01-01

    A kit is described for use in automation of routine chemical research procedures. The kit uses sensors to evaluate the state of the system, actuators which modify the adjustable parameters, and a decision element which uses the information from the sensors. (BB)

  6. Illinois: Library Automation and Connectivity Initiatives.

    ERIC Educational Resources Information Center

    Lamont, Bridget L.; Bloomberg, Kathleen L.

    1996-01-01

    Discussion of library automation in Illinois focuses on ILLINET, the Illinois Library and Information Network. Topics include automated resource sharing; ILLINET's online catalog; regional library system automation; community networking and public library technology development; telecommunications initiatives; electronic access to state government…

  7. You're a What? Automation Technician

    ERIC Educational Resources Information Center

    Mullins, John

    2010-01-01

    Many people think of automation as laborsaving technology, but it sure keeps Jim Duffell busy. Defined simply, automation is a technique for making a device run or a process occur with minimal direct human intervention. But the functions and technologies involved in automated manufacturing are complex. Nearly all functions, from orders coming in…

  8. Does Automated Feedback Improve Writing Quality?

    ERIC Educational Resources Information Center

    Wilson, Joshua; Olinghouse, Natalie G.; Andrada, Gilbert N.

    2014-01-01

    The current study examines data from students in grades 4-8 who participated in a statewide computer-based benchmark writing assessment that featured automated essay scoring and automated feedback. We examined whether the use of automated feedback was associated with gains in writing quality across revisions to an essay, and with transfer effects…

  9. Flight-deck automation: Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The state of the art in human factors in flight-deck automation is presented. A number of critical problem areas are identified and broad design guidelines are offered. Automation-related aircraft accidents and incidents are discussed as examples of human factors problems in automated flight.

  10. Automated System Marketplace 1987: Maturity and Competition.

    ERIC Educational Resources Information Center

    Walton, Robert A.; Bridge, Frank R.

    1988-01-01

    This annual review of the library automation marketplace presents profiles of 15 major library automation firms and looks at emerging trends. Seventeen charts and tables provide data on market shares, number and size of installations, hardware availability, operating systems, and interfaces. A directory of 49 automation sources is included. (MES)

  11. Archives and Automation: Issues and Trends.

    ERIC Educational Resources Information Center

    Weiner, Rob

    This paper focuses on archives and automation, and reviews recent literature on various topics concerning archives and automation. Topics include: resistance to technology and the need to educate about automation; the change in archival theory due to the information age; problems with technology use; the history of organizing archival records…

  12. Specimen coordinate automated measuring machine/fiducial automated measuring machine

    DOEpatents

    Hedglen, Robert E.; Jacket, Howard S.; Schwartz, Allan I.

    1991-01-01

    The Specimen Coordinate Automated Measuring Machine (SCAMM) and the Fiducial Automated Measuring Machine (FAMM) are computer-controlled metrology systems capable of measuring length, width, and thickness, and of locating fiducial marks. SCAMM and FAMM have many similarities in their designs, and they can be converted from one to the other without taking them out of the hot cell. Both have means for: supporting a plurality of samples and a standard; controlling the movement of the samples in the +/- X and Y directions; determining the coordinates of the sample; compensating for temperature effects; and verifying the accuracy of the measurements and repeating as necessary. SCAMM and FAMM are designed to be used in hot cells.

  13. Automation: how much is too much?

    PubMed

    Hancock, P A

    2014-01-01

    The headlong rush to automate continues apace. The dominant question still remains whether we can automate, not whether we should automate. However, it is this latter question that is featured and considered explicitly here. The suggestion offered is that unlimited automation of all technical functions will eventually prove anathema to the fundamental quality of human life. Examples of tasks, pursuits and pastimes that should potentially be excused from the automation imperative are discussed. This deliberation leads us back to the question of balance in the cooperation, coordination and potential conflict between humans and the machines they create.

  14. Unsupervised Decoding of Long-Term, Naturalistic Human Neural Recordings with Automated Video and Audio Annotations.

    PubMed

    Wang, Nancy X R; Olson, Jared D; Ojemann, Jeffrey G; Rao, Rajesh P N; Brunton, Bingni W

    2016-01-01

    Fully automated decoding of human activities and intentions from direct neural recordings is a tantalizing challenge in brain-computer interfacing. Implementing Brain Computer Interfaces (BCIs) outside carefully controlled experiments in laboratory settings requires adaptive and scalable strategies with minimal supervision. Here we describe an unsupervised approach to decoding neural states from naturalistic human brain recordings. We analyzed continuous, long-term electrocorticography (ECoG) data recorded over many days from the brain of subjects in a hospital room, with simultaneous audio and video recordings. We discovered coherent clusters in high-dimensional ECoG recordings using hierarchical clustering and automatically annotated them using speech and movement labels extracted from audio and video. To our knowledge, this represents the first time techniques from computer vision and speech processing have been used for natural ECoG decoding. Interpretable behaviors were decoded from ECoG data, including moving, speaking and resting; the results were assessed by comparison with manual annotation. Discovered clusters were projected back onto the brain revealing features consistent with known functional areas, opening the door to automated functional brain mapping in natural settings. PMID:27148018

  15. Unsupervised Decoding of Long-Term, Naturalistic Human Neural Recordings with Automated Video and Audio Annotations

    PubMed Central

    Wang, Nancy X. R.; Olson, Jared D.; Ojemann, Jeffrey G.; Rao, Rajesh P. N.; Brunton, Bingni W.

    2016-01-01

    Fully automated decoding of human activities and intentions from direct neural recordings is a tantalizing challenge in brain-computer interfacing. Implementing Brain Computer Interfaces (BCIs) outside carefully controlled experiments in laboratory settings requires adaptive and scalable strategies with minimal supervision. Here we describe an unsupervised approach to decoding neural states from naturalistic human brain recordings. We analyzed continuous, long-term electrocorticography (ECoG) data recorded over many days from the brain of subjects in a hospital room, with simultaneous audio and video recordings. We discovered coherent clusters in high-dimensional ECoG recordings using hierarchical clustering and automatically annotated them using speech and movement labels extracted from audio and video. To our knowledge, this represents the first time techniques from computer vision and speech processing have been used for natural ECoG decoding. Interpretable behaviors were decoded from ECoG data, including moving, speaking and resting; the results were assessed by comparison with manual annotation. Discovered clusters were projected back onto the brain revealing features consistent with known functional areas, opening the door to automated functional brain mapping in natural settings. PMID:27148018
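    As a rough illustration of the clustering-plus-annotation idea described in the two records above, the following sketch applies hierarchical clustering to a synthetic feature matrix and labels each cluster by majority vote over stand-in behavioral annotations. The feature dimensions, cluster count, and labels are invented for the example and do not reproduce the authors' pipeline.

```python
# Hypothetical sketch: hierarchical clustering of high-dimensional neural
# feature vectors, then labeling clusters by majority vote over external
# (audio/video-derived) annotations. Feature extraction is out of scope.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
features = rng.normal(size=(500, 64))        # 500 time windows x 64 spectral features (synthetic)
annotations = rng.choice(["rest", "speak", "move"], size=500)  # stand-in behavioral labels

Z = linkage(features, method="ward")          # agglomerative clustering
cluster_ids = fcluster(Z, t=5, criterion="maxclust")

# Annotate each discovered cluster with the most common behavioral label it overlaps.
for c in np.unique(cluster_ids):
    members = annotations[cluster_ids == c]
    labels, counts = np.unique(members, return_counts=True)
    print(f"cluster {c}: n={members.size}, majority label = {labels[counts.argmax()]}")
```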

  16. Vision's Brain.

    ERIC Educational Resources Information Center

    Miller, Julie Ann

    1978-01-01

    The functional architecture of the primary visual cortex has been explored by monitoring the responses of individual brain cells to visual stimuli. A combination of anatomical and physiological techniques reveals groups of functionally related cells, juxtaposed and superimposed, in a sometimes complex, but presumably efficient, structure. (BB)

  17. Smart Brains.

    ERIC Educational Resources Information Center

    Jones, Rebecca

    1995-01-01

    New techniques have opened windows to the brain. Although the biochemistry of learning remains largely a mystery, the following findings seem to have clear implications for education: (1) the importance of early-learning opportunities for the very young; (2) the connection between music and abstract reasoning; and (3) the importance of good…

  18. Brain surgery breathes new life into aging plants

    SciTech Connect

    Makansi, J.

    2006-04-15

    Unlike managing the human aging process, extending the life of a power plant often includes brain surgery, modernizing its control and automation system. Lately, such retrofits range from wholesale replacing of existing controls to the addition of specific control elements that help optimize performance. Pending revisions to safety codes and cybersecurity issues also need to be considered. 4 figs.

  19. Automated nutrient analyses in seawater

    SciTech Connect

    Whitledge, T.E.; Malloy, S.C.; Patton, C.J.; Wirick, C.D.

    1981-02-01

    This manual was assembled for use as a guide for analyzing the nutrient content of seawater samples collected in the marine coastal zone of the Northeast United States and the Bering Sea. Some modifications (changes in dilution or sample pump tube sizes) may be necessary to achieve optimum measurements in very pronounced oligotrophic, eutrophic or brackish areas. Information is presented under the following section headings: theory and mechanics of automated analysis; continuous flow system description; operation of autoanalyzer system; cookbook of current nutrient methods; automated analyzer and data analysis software; computer interfacing and hardware modifications; and trouble shooting. The three appendixes are entitled: references and additional reading; manifold components and chemicals; and software listings. (JGB)

  20. Automated Demand Response and Commissioning

    SciTech Connect

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

    This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability and manage electricity costs. Fully-Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. We refer to this as Auto-DR. For evaluation, the control and communications systems must be properly configured and pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve the connection of demand-responsive building systems to the electric grid's demand response systems.
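    The commissioning stages listed above suggest a simple sequential checklist. The following sketch is a hypothetical stage runner, not the project's software; the stage names come from the abstract, while the callable-based interface is an assumption.

```python
# Hypothetical commissioning checklist runner: each stage is a callable that
# returns pass/fail, and the first failure stops the sequence.
STAGES = [
    "Readiness",
    "Approval",
    "Price Client/Price Server Communication",
    "Internet Gateway/Internet Relay Communication",
    "Control of Equipment",
    "DR Shed Effectiveness",
]

def run_commissioning(checks):
    """checks: dict mapping stage name -> zero-argument callable returning bool."""
    for stage in STAGES:
        ok = checks[stage]()
        print(f"{stage}: {'PASS' if ok else 'FAIL'}")
        if not ok:
            return False
    return True

# Example with stubbed checks that all pass:
print(run_commissioning({s: (lambda: True) for s in STAGES}))
```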

  1. Automating occupational protection records systems

    SciTech Connect

    Lyon, M.; Martin, J.B.

    1991-10-01

    Occupational protection records have traditionally been generated by field and laboratory personnel, assembled into files in the safety office, and eventually stored in a warehouse or other facility. Until recently, these records have been primarily paper copies, often handwritten. Sometimes, the paper is microfilmed for storage. However, electronic records are beginning to replace these traditional methods. The purpose of this paper is to provide guidance for making the transition to automated record keeping and retrieval using modern computer equipment. This paper describes the types of records most readily converted to electronic record keeping and a methodology for implementing an automated record system. The process of conversion is based on a requirements analysis to assess program needs and a high level of user involvement during the development. The importance of indexing the hard copy records for easy retrieval is also discussed. The concept of linkage between related records and its importance relative to reporting, research, and litigation will be addressed. 2 figs.

  2. Automated illustration of patients' instructions.

    PubMed

    Bui, Duy; Nakamura, Carlos; Bray, Bruce E; Zeng-Treitler, Qing

    2012-01-01

    A picture can be a powerful communication tool. However, creating pictures to illustrate patient instructions can be a costly and time-consuming task. Building on our prior research in this area, we developed a computer application that automatically converts text to pictures using natural language processing and computer graphics techniques. After iterative testing, the automated illustration system was evaluated using 49 previously unseen cardiology discharge instructions. The completeness of the system-generated illustrations was assessed by three raters using a three-level scale. The average inter-rater agreement for text correctly represented in the pictograph was about 66 percent. Since illustration in this context is intended to enhance rather than replace text, these results support the feasibility of conducting automated illustration.

  3. Automation design and crew coordination

    NASA Technical Reports Server (NTRS)

    Segal, Leon D.

    1993-01-01

    Advances in technology have greatly impacted the appearance of the modern aircraft cockpit, where once one would see rows upon rows of dedicated instruments. The introduction of automation has greatly altered the demands on the pilots and the dynamics of aircrew task performance. While engineers and designers continue to implement the latest technological innovations in the cockpit - claiming higher reliability and decreased workload - a large percentage of aircraft accidents are still attributed to human error. Rather than being the main instigators of accidents, operators tend to be the inheritors of system defects created by poor design, incorrect installation, faulty maintenance and bad management decisions. This paper looks at some of the variables that need to be considered if we are to eliminate at least one of these inheritances - poor design. Specifically, this paper describes the first part of a comprehensive study aimed at identifying the effects of automation on crew coordination.

  4. Automated labeling in document images

    NASA Astrophysics Data System (ADS)

    Kim, Jongwoo; Le, Daniel X.; Thoma, George R.

    2000-12-01

    The National Library of Medicine (NLM) is developing an automated system to produce bibliographic records for its MEDLINE database. This system, named the Medical Article Record System (MARS), employs document image analysis and understanding techniques and optical character recognition (OCR). This paper describes a key module in MARS called the Automated Labeling (AL) module, which labels all zones of interest (title, author, affiliation, and abstract) automatically. The AL algorithm is based on 120 rules that are derived from an analysis of journal page layouts and features extracted from OCR output. Experiments carried out on more than 11,000 articles in over 1,000 biomedical journals show the accuracy of this rule-based algorithm to exceed 96%.
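    The AL module's 120 rules are not published in this record, but the flavor of rule-based zone labeling can be sketched with a few invented geometry and typography heuristics, as below; the zone fields, thresholds, and keywords are illustrative assumptions, not the MARS rules.

```python
# Illustrative sketch only (not the actual 120 MARS rules): label OCR zones on a
# journal first page using simple geometry/typography heuristics. The zone
# fields ('y', 'font_size', 'text') and thresholds are assumed for the example.
def label_zone(zone, page_height):
    text = zone["text"]
    if zone["y"] < 0.25 * page_height and zone["font_size"] >= 14:
        return "title"
    if any(k in text.lower() for k in ("university", "department", "institute")):
        return "affiliation"
    if zone["y"] < 0.35 * page_height and "," in text and len(text.split()) < 20:
        return "author"
    if len(text.split()) > 80:
        return "abstract"
    return "other"

page = [
    {"y": 50,  "font_size": 18, "text": "Automated Labeling in Document Images"},
    {"y": 120, "font_size": 11, "text": "Kim J., Le D., Thoma G."},
    {"y": 160, "font_size": 10, "text": "Department of Computer Science, Example University"},
    {"y": 240, "font_size": 10, "text": " ".join(["word"] * 120)},
]
print([label_zone(z, page_height=1000) for z in page])  # ['title', 'author', 'affiliation', 'abstract']
```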

  5. Algorithms Could Automate Cancer Diagnosis

    NASA Technical Reports Server (NTRS)

    Baky, A. A.; Winkler, D. G.

    1982-01-01

    Five new algorithms are a complete statistical procedure for quantifying cell abnormalities from digitized images. Procedure could be basis for automated detection and diagnosis of cancer. Objective of procedure is to assign each cell an atypia status index (ASI), which quantifies level of abnormality. It is possible that ASI values will be accurate and economical enough to allow diagnoses to be made quickly and accurately by computer processing of laboratory specimens extracted from patients.

  6. Home automation in the workplace.

    PubMed

    McCormack, J E; Tello, S F

    1994-01-01

    Environmental control units and home automation devices contribute to the independence and potential of individuals with disabilities, both at work and at home. Devices currently exist that can assist people with physical, cognitive, and sensory disabilities to control lighting, appliances, temperature, security, and telephone communications. This article highlights several possible applications for these technologies and discusses emerging technologies that will increase the benefits these devices offer people with disabilities.

  7. Automated Scheduling Via Artificial Intelligence

    NASA Technical Reports Server (NTRS)

    Biefeld, Eric W.; Cooper, Lynne P.

    1991-01-01

    Artificial-intelligence software that automates scheduling developed in Operations Mission Planner (OMP) research project. Software used in both generation of new schedules and modification of existing schedules in view of changes in tasks and/or available resources. Approach based on iterative refinement. Although project focused upon scheduling of operations of scientific instruments and other equipment aboard spacecraft, also applicable to such terrestrial problems as scheduling production in factory.

  8. Convection automated logic oven control

    SciTech Connect

    Boyer, M.A.; Eke, K.I.

    1998-03-01

    For the past few years, there has been a greater push to bring more automation to the cooking process. There have been attempts at automated cooking using a wide range of sensors and procedures, but with limited success. The authors have the answer to the automated cooking process; this patented technology is called Convection AutoLogic (CAL). The beauty of the technology is that it requires no extra hardware for the existing oven system. They use the existing temperature probe, whether it is an RTD, thermocouple, or thermistor. This means that the manufacturer does not have to be burdened with extra costs associated with automated cooking in comparison to standard ovens. The only change to the oven is the program in the central processing unit (CPU) on the board. As for its operation, when the user places the food into the oven, he or she is required to select a category (e.g., beef, poultry, or casseroles) and then simply press the start button. The CAL program then begins its cooking program. It first looks at the ambient oven temperature to see if it is a cold, warm, or hot start. CAL stores this data and then begins to look at the food's thermal footprint. After CAL has properly detected this thermal footprint, it can calculate the time and temperature at which the food needs to be cooked. CAL then sets up these factors for the cooking stage of the program and, when the food has finished cooking, the oven is turned off automatically. The total time for this entire process is the same as the standard cooking time the user would normally set. The CAL program can also compensate for varying line voltages and detect when the oven door is opened. With all of these varying factors being monitored, CAL can produce a perfectly cooked item with minimal user input.
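    A speculative sketch of the start-classification step described above follows: classify a cold, warm, or hot start from the ambient cavity temperature and scale a category's nominal cook time. The thresholds, nominal times, and scaling factors are invented for illustration and are not taken from the patented CAL algorithm.

```python
# Speculative sketch (not the CAL algorithm): use ambient cavity temperature to
# classify the start condition and adjust a category's nominal cook time.
def classify_start(ambient_c):
    if ambient_c < 50:
        return "cold"
    if ambient_c < 120:
        return "warm"
    return "hot"

NOMINAL_MINUTES = {"beef": 90, "poultry": 75, "casseroles": 60}   # assumed values
START_FACTOR = {"cold": 1.00, "warm": 0.92, "hot": 0.85}          # assumed scaling

def cook_plan(category, ambient_c):
    start = classify_start(ambient_c)
    minutes = NOMINAL_MINUTES[category] * START_FACTOR[start]
    return start, round(minutes)

print(cook_plan("poultry", ambient_c=35))   # ('cold', 75)
```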

  9. Automated Platform Management System Scheduling

    NASA Technical Reports Server (NTRS)

    Hull, Larry G.

    1990-01-01

    The Platform Management System was established to coordinate the operation of platform systems and instruments. The management functions are split between ground and space components. Since platforms are to be out of contact with the ground more than the manned base, the on-board functions are required to be more autonomous than those of the manned base. Under this concept, automated replanning and rescheduling, including on-board real-time schedule maintenance and schedule repair, are required to effectively and efficiently meet Space Station Freedom mission goals. In a FY88 study, we developed several promising alternatives for automated platform planning and scheduling. We recommended both a specific alternative and a phased approach to automated platform resource scheduling. Our recommended alternative was based upon use of exactly the same scheduling engine in both ground and space components of the platform management system. Our phased approach recommendation was based upon evolutionary development of the platform. In the past year, we developed platform scheduler requirements and implemented a rapid prototype of a baseline platform scheduler. Presently we are rehosting this platform scheduler rapid prototype and integrating the scheduler prototype into two Goddard Space Flight Center testbeds, as the ground scheduler in the Scheduling Concepts, Architectures, and Networks Testbed and as the on-board scheduler in the Platform Management System Testbed. Using these testbeds, we will investigate rescheduling issues, evaluate operational performance and enhance the platform scheduler prototype to demonstrate our evolutionary approach to automated platform scheduling. The work described in this paper was performed prior to Space Station Freedom rephasing, transfer of platform responsibility to Code E, and other recently discussed changes. We neither speculate on these changes nor attempt to predict the impact of the final decisions. As a consequence some of our

  10. Small Business Innovations (Automated Information)

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Bruce G. Jackson & Associates Document Director is an automated tool that combines word processing and database management technologies to offer the flexibility and convenience of text processing with the linking capability of database management. Originally developed for NASA, it provides a means to collect and manage information associated with requirements development. The software system was used by NASA in the design of the Assured Crew Return Vehicle, as well as by other government and commercial organizations including the Southwest Research Institute.

  11. Fully automated urban traffic system

    NASA Technical Reports Server (NTRS)

    Dobrotin, B. M.; Hansen, G. R.; Peng, T. K. C.; Rennels, D. A.

    1977-01-01

    The replacement of the driver with an automatic system which could perform the functions of guiding and routing a vehicle with a human's capability of responding to changing traffic demands was discussed. The problem was divided into four technological areas: guidance, routing, computing, and communications. It was determined that the latter three areas were being developed independently of any need for fully automated urban traffic. A guidance system that would meet system requirements was not being developed but was technically feasible.

  12. Home automation in the workplace.

    PubMed

    McCormack, J E; Tello, S F

    1994-01-01

    Environmental control units and home automation devices contribute to the independence and potential of individuals with disabilities, both at work and at home. Devices currently exist that can assist people with physical, cognitive, and sensory disabilities to control lighting, appliances, temperature, security, and telephone communications. This article highlights several possible applications for these technologies and discusses emerging technologies that will increase the benefits these devices offer people with disabilities. PMID:24440955

  13. Understanding Brain Tumors

    MedlinePlus

    What is a brain tumor? A brain tumor is an abnormal growth… (excerpt from "Frankly Speaking About Cancer: Brain Tumors").

  14. Brain Tumors (For Parents)

    MedlinePlus

    … radiation therapy or chemotherapy, or both. Types of Brain Tumors: there are many different types of brain …

  15. Brain Tumor Diagnosis

    MedlinePlus

    … Types of Brain Scans, X-rays, Laboratory Tests, DNA Profiling, Biopsy Procedure, Malignant and Benign Brain Tumors …

  16. Trust in automation: designing for appropriate reliance.

    PubMed

    Lee, John D; See, Katrina A

    2004-01-01

    Automation is often problematic because people fail to rely upon it appropriately. Because people respond to technology socially, trust influences reliance on automation. In particular, trust guides reliance when complexity and unanticipated situations make a complete understanding of the automation impractical. This review considers trust from the organizational, sociological, interpersonal, psychological, and neurological perspectives. It considers how the context, automation characteristics, and cognitive processes affect the appropriateness of trust. The context in which the automation is used influences automation performance and provides a goal-oriented perspective to assess automation characteristics along a dimension of attributional abstraction. These characteristics can influence trust through analytic, analogical, and affective processes. The challenges of extrapolating the concept of trust in people to trust in automation are discussed. A conceptual model integrates research regarding trust in automation and describes the dynamics of trust, the role of context, and the influence of display characteristics. Actual or potential applications of this research include improved designs of systems that require people to manage imperfect automation.

  17. Process development for automated solar cell and module production. Task 4: Automated array assembly

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use was developed. The process sequence was then critically analyzed from a technical and economic standpoint to determine the technological readiness of certain process steps for implementation. The steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect, both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development.

  18. Whole-brain activity mapping onto a zebrafish brain atlas

    PubMed Central

    Randlett, Owen; Wee, Caroline L.; Naumann, Eva A.; Nnaemeka, Onyeka; Schoppik, David; Fitzgerald, James E.; Portugues, Ruben; Lacoste, Alix M.B.; Riegler, Clemens; Engert, Florian; Schier, Alexander F.

    2015-01-01

    In order to localize the neural circuits involved in generating behaviors, it is necessary to assign activity onto anatomical maps of the nervous system. Using brain registration across hundreds of larval zebrafish, we have built an expandable open source atlas containing molecular labels and anatomical region definitions, the Z-Brain. Using this platform and immunohistochemical detection of phosphorylated-Extracellular signal-regulated kinase (ERK/MAPK) as a readout of neural activity, we have developed a system to create and contextualize whole brain maps of stimulus- and behavior-dependent neural activity. This MAP-Mapping (Mitogen Activated Protein kinase – Mapping) assay is technically simple, fast, inexpensive, and data analysis is completely automated. Since MAP-Mapping is performed on fish that are freely swimming, it is applicable to nearly any stimulus or behavior. We demonstrate the utility of our high-throughput approach using hunting/feeding, pharmacological, visual and noxious stimuli. The resultant maps outline hundreds of areas associated with behaviors. PMID:26778924

  19. Whole-brain activity mapping onto a zebrafish brain atlas.

    PubMed

    Randlett, Owen; Wee, Caroline L; Naumann, Eva A; Nnaemeka, Onyeka; Schoppik, David; Fitzgerald, James E; Portugues, Ruben; Lacoste, Alix M B; Riegler, Clemens; Engert, Florian; Schier, Alexander F

    2015-11-01

    In order to localize the neural circuits involved in generating behaviors, it is necessary to assign activity onto anatomical maps of the nervous system. Using brain registration across hundreds of larval zebrafish, we have built an expandable open-source atlas containing molecular labels and definitions of anatomical regions, the Z-Brain. Using this platform and immunohistochemical detection of phosphorylated extracellular signal–regulated kinase (ERK) as a readout of neural activity, we have developed a system to create and contextualize whole-brain maps of stimulus- and behavior-dependent neural activity. This mitogen-activated protein kinase (MAP)-mapping assay is technically simple, and data analysis is completely automated. Because MAP-mapping is performed on freely swimming fish, it is applicable to studies of nearly any stimulus or behavior. Here we demonstrate our high-throughput approach using pharmacological, visual and noxious stimuli, as well as hunting and feeding. The resultant maps outline hundreds of areas associated with behaviors. PMID:26778924
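    A much-simplified sketch of the map-making step is shown below: with all fish assumed already registered to the atlas, the mean pERK signal of a stimulus group is compared voxel-by-voxel against controls. The synthetic volumes, group sizes, and the plain difference threshold are assumptions; the published assay uses proper statistics rather than this toy comparison.

```python
# Simplified, hypothetical version of the map-making step: compare mean pERK
# signal between a stimulus group and controls voxel-by-voxel on synthetic,
# already-registered volumes.
import numpy as np

rng = np.random.default_rng(5)
shape, n_per_group = (40, 40, 20), 10

control = rng.normal(1.0, 0.1, size=(n_per_group, *shape))
stimulus = rng.normal(1.0, 0.1, size=(n_per_group, *shape))
stimulus[:, 10:15, 10:15, 5:10] += 0.5        # synthetic "active" region

activity_map = stimulus.mean(axis=0) - control.mean(axis=0)
active_voxels = np.argwhere(activity_map > 0.3)
print("voxels flagged as active:", len(active_voxels))
```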

  20. The contaminant analysis automation robot implementation for the automated laboratory

    SciTech Connect

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-12-31

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its processing is complete, readying them for transport operations. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  1. Martian 'Brain'

    NASA Technical Reports Server (NTRS)

    2004-01-01

    5 May 2004 Most middle-latitude craters on Mars have strange landforms on their floors. Often, the floors have pitted and convoluted features that lack simple explanation. In this case, the central part of the crater floor shown in this 2004 Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image bears some resemblance to the folded nature of a brain. Or not. It depends upon the 'eye of the beholder,' perhaps. The light-toned 'ring' around the 'brain' feature is more easily explained--windblown ripples and dunes. The crater occurs near 33.1oS, 91.2oW, and is illuminated from the upper left. The picture covers an area about 3 km (1.9 mi) across.

  2. Correlation between automated writing movements and striatal dopaminergic innervation in patients with Wilson's disease.

    PubMed

    Hermann, Wieland; Eggers, Birk; Barthel, Henryk; Clark, Daniel; Villmann, Thomas; Hesse, Swen; Grahmann, Friedrich; Kühn, Hans-Jürgen; Sabri, Osama; Wagner, Armin

    2002-08-01

    Handwriting defects are an early sign of motor impairment in patients with Wilson's disease. Because the basal ganglia are the primary site of copper accumulation in the brain, a correlation with lesions in the nigrostriatal dopaminergic system is suggested. We have analysed and correlated striatal dopaminergic innervation using [(123)I]beta-CIT-SPECT and automated handwriting movements in 37 patients with Wilson's disease. There was a significant correlation of putaminal dopaminergic innervation with fine motor ability (p < 0.05 for NIV [number of inversions in velocity], NIA [number of inversions in acceleration], and frequency). These data suggest that loss of dorsolateral striatal dopaminergic innervation has a pathophysiological role in the decreased automated motor control seen in Wilson's disease. Furthermore, analysis of automated handwriting movements could be useful for therapy monitoring and evaluation of striatal dopaminergic innervation. PMID:12195459
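    NIV and NIA are kinematic counts derived from the handwriting trace. The sketch below shows one plausible reading of them, counting direction reversals (sign changes) in the velocity and acceleration signals of a synthetic trace; the sampling rate and signal are illustrative, not the study's tablet data.

```python
# One plausible reading of NIV/NIA (illustrative only): count sign changes in
# the velocity and acceleration derived from a sampled pen-position trace.
import numpy as np

def count_inversions(signal):
    """Count sign changes (direction reversals) in a 1-D signal."""
    signs = np.sign(signal)
    signs = signs[signs != 0]
    return int(np.sum(signs[1:] != signs[:-1]))

fs = 200.0                                   # Hz, assumed tablet sampling rate
t = np.arange(0, 2, 1 / fs)
y = np.sin(2 * np.pi * 5 * t) + 0.05 * np.random.default_rng(1).normal(size=t.size)

velocity = np.gradient(y, 1 / fs)
acceleration = np.gradient(velocity, 1 / fs)
print("NIV:", count_inversions(velocity), "NIA:", count_inversions(acceleration))
```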

  3. Silicon Brains

    NASA Astrophysics Data System (ADS)

    Hoefflinger, Bernd

    Beyond the digital neural networks of Chap. 16, the more radical mapping of brain-like structures and processes into VLSI substrates was pioneered by Carver Mead more than 30 years ago [1]. The basic idea was to exploit the massive parallelism of such circuits and to create low-power and fault-tolerant information-processing systems. Neuromorphic engineering has recently seen a revival with the availability of deep-submicron CMOS technology, which allows for the construction of very-large-scale mixed-signal systems combining local analog processing in neuronal cells with binary signalling via action potentials. Modern implementations are able to reach the complexity scale of large functional units of the human brain, and they feature the ability to learn by plasticity mechanisms found in neuroscience. Combined with high-performance programmable logic and elaborate software tools, such systems are currently evolving into user-configurable non-von-Neumann computing systems, which can be used to implement and test novel computational paradigms. The chapter introduces basic properties of biological brains with up to 200 billion neurons and their 10^14 synapses, where action on a synapse takes ~10 ms and involves an energy of ~10 fJ. We outline 10x programs on neuromorphic electronic systems in Europe and the USA, which are intended to integrate 10^8 neurons and 10^12 synapses, the level of a cat's brain, in a volume of 1 L and with a power dissipation <1 kW. For a balanced view on intelligence, we reference Hawkins' view to first perceive the task and then design an intelligent technical response.
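    As a back-of-envelope check on the figures quoted above, the snippet below assumes every one of the 10^12 synapses fires at the maximum rate the ~10 ms event duration allows and costs the biological ~10 fJ per event; both the uniform rate and the direct transfer of the biological energy figure to silicon are simplifying assumptions.

```python
# Back-of-envelope estimate under strong simplifying assumptions: 10^12 synapses,
# ~10 fJ per synaptic event, and a uniform event rate of one per ~10 ms.
synapses = 1e12
energy_per_event = 10e-15          # J (10 fJ)
max_rate = 1.0 / 10e-3             # Hz, one event per ~10 ms
power = synapses * energy_per_event * max_rate
print(f"synaptic power at biological efficiency: ~{power:.1f} W")  # ~1 W, well under the 1 kW budget
```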

  4. Brain imaging

    SciTech Connect

    Bradshaw, J.R.

    1989-01-01

    This book presents a survey of the various imaging tools with examples of the different diseases shown best with each modality. It includes 100 case presentations covering the gamut of brain diseases. These examples are grouped according to the clinical presentation of the patient: headache, acute headache, sudden unilateral weakness, unilateral weakness of gradual onset, speech disorders, seizures, pituitary and parasellar lesions, sensory disorders, posterior fossa and cranial nerve disorders, dementia, and congenital lesions.

  5. Animating Brains

    PubMed Central

    Borck, Cornelius

    2016-01-01

    A recent paper famously accused the rising field of social neuroscience of using faulty statistics under the catchy title ‘Voodoo Correlations in Social Neuroscience’. This Special Issue invites us to take this claim as the starting point for a cross-cultural analysis: in which meaningful ways can recent research in the burgeoning field of functional imaging be described as, contrasted with, or simply compared to animistic practices? And what light does such a reading shed on the dynamics and effectiveness of a century of brain research into higher mental functions? Reviewing the heated debate from 2009 around recent trends in neuroimaging as a possible candidate for current instances of ‘soul catching’, the paper will then compare these forms of primarily image-based brain research with older regimes, revolving around the deciphering of the brain’s electrical activity. How has the move from a decoding paradigm to a representational regime affected the conceptualisation of self, psyche, mind and soul (if there still is such an entity)? And in what ways does modern technoscience provide new tools for animating brains? PMID:27292322

  6. Expert Robots For Automated Packaging And Processing

    NASA Astrophysics Data System (ADS)

    Slutzky, G. D.; Hall, E. L.; Shell, R. L.

    1989-02-01

    A variety of problems in automated packaging and processing seem ready for expert robotic solutions. Such problems as automated palletizing, bin-picking, automated storage and retrieval, automated kitting of parts for assembly, and automated warehousing are currently being considered. The use of expert robots, which consist of specialized computer programs, manipulators, and integrated sensors, has been demonstrated with robot checkers, peg games, etc. Actual solutions for automated palletizing, pit-carb basket loading, etc. have also been developed for industrial applications at our Center. The generic concepts arising from this research will be described, unsolved problems discussed, and some important tools demonstrated. The significance of this work lies in its broad application to a host of generic industrial problems, where it can improve quality, reduce waste, and eliminate human injuries.

  7. Automated systems for identification of microorganisms.

    PubMed Central

    Stager, C E; Davis, J R

    1992-01-01

    Automated instruments for the identification of microorganisms were introduced into clinical microbiology laboratories in the 1970s. During the past two decades, the capabilities and performance characteristics of automated identification systems have steadily progressed and improved. This article explores the development of the various automated identification systems available in the United States and reviews their performance for identification of microorganisms. Observations regarding deficiencies and suggested improvements for these systems are provided. PMID:1498768

  8. Powder handling for automated fuel processing

    SciTech Connect

    Frederickson, J.R.; Eschenbaum, R.C.; Goldmann, L.H.

    1989-04-09

    Installation of the Secure Automated Fabrication (SAF) line has been completed. It is located in the Fuel Cycle Plant (FCP) at the Department of Energy's (DOE) Hanford site near Richland, Washington. The SAF line was designed to fabricate advanced reactor fuel pellets and assemble fuel pins by automated, remote operation. This paper describes powder handling equipment and techniques utilized for automated powder processing and powder conditioning systems in this line. 9 figs.

  9. Temperature automation for a propellant mixer

    NASA Technical Reports Server (NTRS)

    Vincent, T. L.; Wilson, R. G.

    1990-01-01

    The analysis and installation of an automatic temperature controller on a propellant mixer is presented. Ultimately, the entire mixing process will come under automation, but since precise adherence to the temperature profile is very difficult to sustain manually, this was the first component to be automated. Automation is not only important for producing a uniform product, but it is necessary for envisioned space-based propellant production.

  10. New luster for space robots and automation

    NASA Technical Reports Server (NTRS)

    Heer, E.

    1978-01-01

    Consideration is given to the potential role of robotics and automation in space transportation systems. Automation development requirements are defined for projects in space exploration, global services, space utilization, and space transport. In each category the potential automation of ground operations, on-board spacecraft operations, and in-space handling is noted. The major developments of space robot technology are noted for the 1967-1978 period. Economic aspects of ground-operation, ground command, and mission operations are noted.

  11. Human-centered aircraft automation: A concept and guidelines

    NASA Technical Reports Server (NTRS)

    Billings, Charles E.

    1991-01-01

    Aircraft automation and its effects on flight crews are examined. Generic guidelines are proposed for the design and use of automation in transport aircraft, in the hope of stimulating increased and more effective dialogue among designers of automated cockpits, purchasers of automated aircraft, and the pilots who must fly those aircraft in line operations. The goal is to explore the means whereby automation may be a maximally effective tool or resource for pilots without compromising human authority and with an increase in system safety. After definition of the domain of the aircraft pilot and brief discussion of the history of aircraft automation, a concept of human-centered automation is presented and discussed. Automated devices are categorized as control automation, information automation, and management automation. The environment and context of aircraft automation are then considered, followed by thoughts on the likely future of automation in each category.

  12. Automated decentralized pharmacy dispensing systems.

    PubMed

    1996-12-01

    Automated decentralized pharmacy dispensing systems (ADPDSs) are medication management systems that allow hospitals to store and dispense drugs near the point of use. These systems, which can be compared with the automated teller machines used by banks, provide nurses with ready access to medications while maintaining tight control of drug distribution. In this study, we evaluated three ADPDSs from two suppliers, focusing on whether these systems can store and dispense drugs in a safe, secure, and effective manner. When rating the systems, we considered their applicability to two different implementation schemes: The use of a system with a pharmacy profile interface. This feature broadens the capabilities of the system by allowing more information to be provided at the dispensing cabinet and by providing better integration of the information from this cabinet with the pharmacy's information system. Two of the evaluated systems have this feature and were rated Acceptable. The use of a system without a pharmacy profile interface. We rated all three of the evaluated systems Acceptable for such implementations. To decide which scheme is most appropriate for a particular hospital, the facility will need to determine both how it intends to use the ADPDS and what it hopes to achieve by implementing the system. By performing this type of analysis, the facility can then determine which ADPDS features and capabilities are needed to accomplish its goals. To help facilities make these decisions, we have provided an Equipment Management Guide, "Improving the Drug Distribution Process-Do You Need an Automated Decentralized Pharmacy Dispensing System?," which precedes this Evaluation. In addition, readers unfamiliar with the roles of both the pharmacy and the pharmacist within the hospital can refer to the Primer, "Functions of a Hospital Pharmacy," also published in this issue. PMID:8968721

  13. Automated dry powder dispenser for explosive components

    SciTech Connect

    Garcia, P.; Salmonson, J.C.

    1992-09-01

    Sandia and Mound are developing a workcell that will automate the assembly of explosive components. Sandia is responsible for the automated powder dispenser subsystem. Automated dispensing of explosive powders in the past resulted in separation or segregation of powder constituents. The Automated Dry Powder Dispenser designed by Sandia achieves weight tolerances of ±0.1 mg while keeping powder-oxidizer separation to a minimum. A software control algorithm compensates for changes in powder flow due to lot variations, temperature, humidity, and the amount of powder left in the system.

  14. Automated dry powder dispenser for explosive components

    SciTech Connect

    Garcia, P. ); Salmonson, J.C. )

    1992-01-01

    Sandia and Mound are developing a workcell that will automate the assembly of explosive components. Sandia is responsible for the automated powder dispenser subsystem. Automated dispensing of explosive powders in the past resulted in separation or segregation of powder constituents. The Automated Dry Powder Dispenser designed by Sandia achieves weight tolerances of ±0.1 mg while keeping powder-oxidizer separation to a minimum. A software control algorithm compensates for changes in powder flow due to lot variations, temperature, humidity, and the amount of powder left in the system.
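    The flow-compensation idea described in the two records above can be sketched as a feedback dosing loop in which each pulse's duration is set from the most recently observed flow rate. The flow model, margin factor, and tolerance handling below are illustrative assumptions, not Sandia's control algorithm.

```python
# Hedged sketch of a flow-compensating dosing loop: pulse durations are derived
# from the latest observed flow rate, with a safety margin so the final weight
# approaches the target from below and stops inside the tolerance.
import random

random.seed(4)
TARGET_MG, TOL_MG = 250.0, 0.1

def dispense(target_mg, tol_mg=TOL_MG):
    dispensed, flow_est = 0.0, 5.0                 # mg delivered so far, estimated flow (mg/s)
    while target_mg - dispensed > tol_mg:
        remaining = target_mg - dispensed
        pulse_s = remaining / (flow_est * 1.25)    # margin so a fast-flowing lot cannot overshoot
        true_flow = 5.0 * random.uniform(0.9, 1.1) # unseen variation (lot, humidity, fill level)
        dispensed += true_flow * pulse_s
        flow_est = true_flow                        # re-estimate flow from the last pulse
    return dispensed

print(f"dispensed {dispense(TARGET_MG):.3f} mg (target {TARGET_MG} mg, tolerance ±{TOL_MG} mg)")
```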

  15. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  16. Automation and quality in analytical laboratories

    SciTech Connect

    Valcarcel, M.; Rios, A.

    1994-05-01

    After a brief introduction to the generic aspects of automation in analytical laboratories, the different approaches to quality in analytical chemistry are presented and discussed to establish the following different facets emerging from the combination of quality and automation: automated analytical control of quality of products and systems; quality control of automated chemical analysis; and improvement of capital (accuracy and representativeness), basic (sensitivity, precision, and selectivity), and complementary (rapidity, cost, and personnel factors) analytical features. Several examples are presented to demonstrate the importance of this marriage of convenience in present and future analytical chemistry. 7 refs., 4 figs.

  17. Automated radiochemical synthesis and biodistribution of [11C]l-α-acetylmethadol ([11C]LAAM)

    PubMed Central

    Sai, Kiran Kumar Solingapuram; Fan, Jinda; Tu, Zhude; Zerkel, Patrick; Mach, Robert H.; Kharasch, Evan D.

    2015-01-01

    Long-acting opioid agonists methadone and l-α-acetylmethadol (LAAM) prevent withdrawal in opioid-dependent persons. Attempts to synthesize [11C]-methadone for PET evaluation of brain disposition were unsuccessful. Owing, however, to structural and pharmacologic similarities, we aimed to develop [11C]LAAM as a PET ligand to probe the brain exposure of long-lasting opioids in humans. This manuscript describes [11C]LAAM synthesis and its biodistribution in mice. The radiochemical synthetic strategy afforded high radiochemical yield, purity and specific activity, thereby making the synthesis adaptable to automated modules. PMID:24935116

  18. Application of automated MRI volumetric measurement techniques to the ventricular system in schizophrenics and normal controls.

    PubMed

    Shenton, M E; Kikinis, R; McCarley, R W; Metcalf, D; Tieman, J; Jolesz, F A

    1991-09-01

    As an initial approach to computer-automated segmentation of cerebrospinal fluid (CSF) vs. brain parenchyma in MR scans, and the transformation of these data sets into volumetric information and 3D display, we examined the ventricular system in a sample of ten chronic schizophrenics with primarily positive symptoms and 12 normal subjects. While no significant differences were noted between groups on volumetric measures of ventricular brain ratio or lateral ventricle size, normals showed a pattern of left greater than right lateral ventricular volume asymmetry not present in the schizophrenics. Within the schizophrenic group, departure from the normal left greater than right pattern was highly correlated with thought disorder.
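    Once CSF and parenchyma are segmented and the lateral ventricles labeled, the volumetric and asymmetry measures reported above reduce to voxel counting. The sketch below shows that bookkeeping on a synthetic label map; the label codes and voxel size are assumptions.

```python
# Hedged sketch: compute left/right lateral-ventricle volumes and an asymmetry
# index from a labeled volume. Label codes and voxel dimensions are assumed.
import numpy as np

LEFT_LV, RIGHT_LV = 4, 43                    # hypothetical label codes
voxel_mm3 = 1.0 * 1.0 * 1.5                  # assumed voxel volume in mm^3

labels = np.zeros((128, 128, 60), dtype=np.int16)   # stand-in segmentation
labels[40:60, 50:70, 20:40] = LEFT_LV
labels[70:88, 50:70, 20:40] = RIGHT_LV

left = np.count_nonzero(labels == LEFT_LV) * voxel_mm3
right = np.count_nonzero(labels == RIGHT_LV) * voxel_mm3
asymmetry = (left - right) / (left + right)  # >0 means left larger than right
print(f"left={left:.0f} mm^3, right={right:.0f} mm^3, asymmetry index={asymmetry:+.3f}")
```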

  19. Automated mass spectrometer grows up

    SciTech Connect

    McInteer, B.B.; Montoya, J.G.; Stark, E.E.

    1984-01-01

    In 1980 we reported the development of an automated mass spectrometer for large scale batches of samples enriched in nitrogen-15 as ammonium salts. Since that time significant technical progress has been made in the instrument. Perhaps more significantly, administrative and institutional changes have permitted the entire effort to be transferred to the private sector from its original base at the Los Alamos National Laboratory. This has ensured the continuance of a needed service to the international scientific community as revealed by a development project at a national laboratory, and is an excellent example of beneficial technology transfer to private industry.

  20. Automated fuel pin loading system

    DOEpatents

    Christiansen, David W.; Brown, William F.; Steffen, Jim M.

    1985-01-01

    An automated loading system for nuclear reactor fuel elements utilizes a gravity feed conveyor which permits individual fuel pins to roll along a constrained path perpendicular to their respective lengths. The individual lengths of fuel cladding are directed onto movable transports, where they are aligned coaxially with the axes of associated handling equipment at appropriate production stations. Each fuel pin can be reciprocated axially and/or rotated about its axis as required during handling steps. The fuel pins are inserted as a batch prior to welding of end caps by one of two disclosed welding systems.

  1. Automated solar panel assembly line

    NASA Technical Reports Server (NTRS)

    Somberg, H.

    1981-01-01

    The initial stage of the automated solar panel assembly line program was devoted to concept development and proof of approach through simple experimental verification. In this phase, laboratory bench models were built to demonstrate and verify concepts. Following this phase was machine design and integration of the various machine elements. The third phase was machine assembly and debugging. In this phase, the various elements were operated as a unit and modifications were made as required. The final stage of development was the demonstration of the equipment in a pilot production operation.

  2. Automated planar patch-clamp.

    PubMed

    Milligan, Carol J; Möller, Clemens

    2013-01-01

    Ion channels are integral membrane proteins that regulate the flow of ions across the plasma membrane and the membranes of intracellular organelles of both excitable and non-excitable cells. Ion channels are vital to a wide variety of biological processes and are prominent components of the nervous system and cardiovascular system, as well as controlling many metabolic functions. Furthermore, ion channels are known to be involved in many disease states and as such have become popular therapeutic targets. For many years now manual patch-clamping has been regarded as one of the best approaches for assaying ion channel function, through direct measurement of ion flow across these membrane proteins. Over the last decade there have been many remarkable breakthroughs in the development of technologies enabling the study of ion channels. One of these breakthroughs is the development of automated planar patch-clamp technology. Automated platforms have demonstrated the ability to generate high-quality data with high throughput capabilities, at great efficiency and reliability. Additional features such as simultaneous intracellular and extracellular perfusion of the cell membrane, current clamp operation, fast compound application, an increasing rate of parallelization, and more recently temperature control have been introduced. Furthermore, in addition to the well-established studies of over-expressed ion channel proteins in cell lines, new generations of planar patch-clamp systems have enabled successful studies of native and primary mammalian cells. This technology is becoming increasingly popular and extensively used both within areas of drug discovery as well as academic research. Many platforms have been developed including NPC-16 Patchliner(®) and SyncroPatch(®) 96 (Nanion Technologies GmbH, Munich), CytoPatch™ (Cytocentrics AG, Rostock), PatchXpress(®) 7000A, IonWorks(®) Quattro and IonWorks Barracuda™, (Molecular Devices, LLC); Dynaflow(®) HT (Cellectricon

  3. Automated flight test management system

    NASA Technical Reports Server (NTRS)

    Hewett, M. D.; Tartt, D. M.; Agarwal, A.

    1991-01-01

    The Phase 1 development of an automated flight test management system (ATMS) as a component of a rapid prototyping flight research facility for artificial intelligence (AI) based flight concepts is discussed. The ATMS provides a flight engineer with a set of tools that assist in flight test planning, monitoring, and simulation. The system is also capable of controlling an aircraft during flight test by performing closed loop guidance functions, range management, and maneuver-quality monitoring. The ATMS is being used as a prototypical system to develop a flight research facility for AI based flight systems concepts at NASA Ames Dryden.

  4. Automated Coal-Mining System

    NASA Technical Reports Server (NTRS)

    Gangal, M. D.; Isenberg, L.; Lewis, E. V.

    1985-01-01

    Proposed system offers safety and large return on investment. System, operating by year 2000, employs machines and processes based on proven principles. According to concept, line of parallel machines, connected in groups of four to service modules, attacks face of coal seam. High-pressure water jets and central auger on each machine break face. Jaws scoop up coal chunks, and auger grinds them and forces fragments into slurry-transport system. Slurry pumped through pipeline to point of use. Concept for highly automated coal-mining system increases productivity, makes mining safer, and protects health of mine workers.

  5. Programmable, automated transistor test system

    NASA Technical Reports Server (NTRS)

    Truong, L. V.; Sundburg, G. R.

    1986-01-01

    A programmable, automated transistor test system was built to supply experimental data on new and advanced power semiconductors. The data will be used for analytical models and by engineers in designing space and aircraft electric power systems. A pulsed power technique was used at low duty cycles in a nondestructive test to examine the dynamic switching characteristic curves of power transistors in the 500 to 1000 V, 10 to 100 A range. Data collection, manipulation, storage, and output are operator interactive but are guided and controlled by the system software.

  6. Automated fuel pin loading system

    DOEpatents

    Christiansen, D.W.; Brown, W.F.; Steffen, J.M.

    An automated loading system for nuclear reactor fuel elements utilizes a gravity feed conveyor which permits individual fuel pins to roll along a constrained path perpendicular to their respective lengths. The individual lengths of fuel cladding are directed onto movable transports, where they are aligned coaxially with the axes of associated handling equipment at appropriate production stations. Each fuel pin can be reciprocated axially and/or rotated about its axis as required during handling steps. The fuel pins are inerted as a batch prior to welding of end caps by one of two disclosed welding systems.

  7. Automating a residual gas analyzer

    NASA Technical Reports Server (NTRS)

    Petrie, W. F.; Westfall, A. H.

    1982-01-01

    A residual gas analyzer (RGA), a device for measuring the amounts and species of various gases present in a vacuum system, is discussed. In a recent update of the RGA, it was shown that the use of microprocessors could revolutionize data acquisition and data reduction. This revolution is exemplified by the Inficon IQ200 RGA, which was selected to meet the needs of this update. The Inficon RGA and the Zilog microcomputer were interfaced in order to receive and format the digital data from the RGA. This automated approach is discussed in detail.

  8. AUTOMATION.

    ERIC Educational Resources Information Center

    Manpower Research Council, Milwaukee, WI.

    THE MANPOWER RESEARCH COUNCIL, A NONPROFIT SERVICE ORGANIZATION, HAS AS ITS OBJECTIVE THE DEVELOPMENT OF AN INTERCHANGE AMONG THE MANUFACTURING AND SERVICE INDUSTRIES OF THE UNITED STATES OF INFORMATION ON EMPLOYMENT, INDUSTRIAL RELATIONS TRENDS AND ACTIVITIES, AND MANAGEMENT PROBLEMS. A SURVEY OF 200 MEMBER CORPORATIONS, EMPLOYING A TOTAL OF…

  9. Disease-Specific Probabilistic Brain Atlases

    PubMed Central

    Thompson, Paul; Mega, Michael S.; Toga, Arthur W.

    2009-01-01

    Atlases of the human brain, in health and disease, provide a comprehensive framework for understanding brain structure and function. The complexity and variability of brain structure, especially in the gyral patterns of the human cortex, present challenges in creating standardized brain atlases that reflect the anatomy of a population. This paper introduces the concept of a population-based, disease-specific brain atlas that can reflect the unique anatomy and physiology of a particular clinical subpopulation. Based on well-characterized patient groups, disease-specific atlases contain thousands of structure models, composite maps, average templates, and visualizations of structural variability, asymmetry and group-specific differences. They correlate the structural, metabolic, molecular and histologic hallmarks of the disease. Rather than simply fusing information from multiple subjects and sources, new mathematical strategies are introduced to resolve group-specific features not apparent in individual scans. High-dimensional elastic mappings, based on covariant partial differential equations, are developed to encode patterns of cortical variation. In the resulting brain atlas, disease-specific features and regional asymmetries emerge that are not apparent in individual anatomies. The resulting probabilistic atlas can identify patterns of altered structure and function, and can guide algorithms for knowledge-based image analysis, automated image labeling, tissue classification, data mining and functional image analysis. PMID:19424457
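    One small ingredient of such an atlas, voxel-wise label averaging across registered subjects, can be sketched as follows. The synthetic masks stand in for registered structure labels, and the high-dimensional elastic registration itself is far outside the scope of this snippet.

```python
# Simplified sketch of one ingredient of a probabilistic atlas: averaging binary
# structure masks from many already-registered subjects into a per-voxel
# probability map. Shapes, subject count, and masks are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_subjects, shape = 20, (64, 64, 64)

prob = np.zeros(shape)
for _ in range(n_subjects):
    mask = np.zeros(shape, dtype=bool)       # stand-in for one subject's structure label
    cx, cy, cz = 32 + rng.integers(-2, 3, size=3)
    mask[cx-8:cx+8, cy-8:cy+8, cz-8:cz+8] = True
    prob += mask
prob /= n_subjects                            # voxel-wise probability of the structure

print("max probability:", prob.max(), "volume at p>0.5:", np.count_nonzero(prob > 0.5))
```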

  10. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    21 Food and Drugs 8 2013-04-01 2013-04-01 false Automated cell counter. Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  11. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    21 Food and Drugs 8 2011-04-01 2011-04-01 false Automated cell counter. Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  12. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  13. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    21 Food and Drugs 8 2012-04-01 2012-04-01 false Automated cell counter. Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  14. 21 CFR 864.5200 - Automated cell counter.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    21 Food and Drugs 8 2014-04-01 2014-04-01 false Automated cell counter. Section 864.5200 Automated cell counter. (a) Identification. An automated cell counter is a fully-automated or semi-automated device used to count red blood cells, white blood cells, or blood platelets using a sample of...

  15. Automation of Oklahoma School Library Media Centers: Automation at the Local Level.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Education, Oklahoma City. Library and Learning Resources Section.

    This document outlines a workshop for media specialists--"School Library Automation: Solving the Puzzle"--that is designed to reduce automation anxiety and give a broad overview of the concerns confronting school library media centers planning for or involved in automation. Issues are addressed under the following headings: (1) Levels of School…

  16. Massachusetts Library Automation Survey: A Directory of Automated Operations in Massachusetts Libraries.

    ERIC Educational Resources Information Center

    Stephens, Eileen; Nijenberg, Caroline

    This directory is designed to provide information on automated systems and/or equipment used in libraries, to serve as a tool for planning future automation in the context of interlibrary cooperation, and to inform the library and information community of the state of the art of automation in Massachusetts libraries. The main body is…

  17. Brain Imaging

    PubMed Central

    Racine, Eric; Bar-Ilan, Ofek; Illes, Judy

    2007-01-01

    Advances in neuroscience are increasingly intersecting with issues of ethical, legal, and social interest. This study is an analysis of press coverage of an advanced technology for brain imaging, functional magnetic resonance imaging, that has gained significant public visibility over the past ten years. Discussion of issues of scientific validity and interpretation dominated over ethical content in both the popular and specialized press. Coverage of research on higher order cognitive phenomena specifically attributed broad personal and societal meaning to neuroimages. The authors conclude that neuroscience provides an ideal model for exploring science communication and ethics in a multicultural context. PMID:17330151

  18. Automated Car Park Management System

    NASA Astrophysics Data System (ADS)

    Fabros, J. P.; Tabañag, D.; Espra, A.; Gerasta, O. J.

    2015-06-01

    This study aims to develop a prototype for an Automated Car Park Management System that will increase the quality of service of parking lots through the integration of a smart system that assists motorists in finding vacant parking lots. The research was based on implementing an operating system and a monitoring system for a parking facility without the use of manpower. This includes the Parking Guidance and Information System concept, which efficiently assists motorists and ensures the safety of the vehicles and the valuables inside them. For monitoring, Optical Character Recognition was employed to record all cars entering the parking area. All parking events in this system are visible via a MATLAB GUI, which contains time-in, time-out, and time-consumed information as well as the lot number where the car parks. The system also includes a payment method, implemented as a coin-slot operation that controls the exit gate. The Automated Car Park Management System was successfully built using microcontrollers, specifically one PIC18F4550, two PIC16F84s, and one PIC16F628A.
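
    The abstract above describes per-vehicle event bookkeeping (time-in, time-out, time consumed, lot number) and a coin-operated exit gate. The Python sketch below models only that bookkeeping; the class names, flat fee, and gate rule are invented for illustration, and the OCR, MATLAB GUI, and microcontroller layers are not modeled.

        # Hypothetical parking-event log; names and fee are illustrative only.
        from dataclasses import dataclass
        from datetime import datetime
        from typing import Dict, List, Optional

        FEE_PER_HOUR = 1  # assumed flat coin fee per (rounded) hour

        @dataclass
        class ParkingEvent:
            plate: str                      # as read by the OCR stage
            lot_number: int
            time_in: datetime
            time_out: Optional[datetime] = None

            def hours_consumed(self) -> float:
                if self.time_out is None:
                    return 0.0
                return (self.time_out - self.time_in).total_seconds() / 3600.0

        class ParkingLog:
            """Tracks cars currently parked and completed parking events."""

            def __init__(self) -> None:
                self.active: Dict[str, ParkingEvent] = {}
                self.history: List[ParkingEvent] = []

            def car_enters(self, plate: str, lot_number: int) -> None:
                self.active[plate] = ParkingEvent(plate, lot_number, datetime.now())

            def car_exits(self, plate: str, coins_inserted: int) -> bool:
                """Return True (open the exit gate) only if the coin fee is covered."""
                event = self.active.get(plate)
                if event is None:
                    return False
                event.time_out = datetime.now()
                fee = max(1, round(event.hours_consumed())) * FEE_PER_HOUR
                if coins_inserted < fee:
                    event.time_out = None   # keep the event open until payment succeeds
                    return False
                self.history.append(self.active.pop(plate))
                return True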

  19. Automated pipelines for spectroscopic analysis

    NASA Astrophysics Data System (ADS)

    Allende Prieto, C.

    2016-09-01

    The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10% of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1%. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is becoming ever more relevant, and the concept is being applied to many more areas, from targeting to analysis. In this paper, I provide a quick overview of recent, ongoing, and upcoming spectroscopic surveys, and the strategies adopted in their automated analysis pipelines.

  20. Automated experimentation in ecological networks

    PubMed Central

    2011-01-01

    Background In ecological networks, natural communities are studied from a complex systems perspective by representing interactions among species within them in the form of a graph, which is in turn analysed using mathematical tools. Topological features encountered in complex networks have been shown to provide the systems they represent with interesting attributes such as robustness and stability, which in ecological systems translate into the ability of communities to resist perturbations of different kinds. A focus of research in community ecology is on understanding the mechanisms by which these complex networks of interactions among species in a community arise. We employ an agent-based approach to model ecological processes operating at the species' interaction level for the study of the emergence of organisation in ecological networks. Results We have designed protocols of interaction among agents in a multi-agent system based on ecological processes occurring at the interaction level between species in plant-animal mutualistic communities. Interaction models for agent coordination engineered in this way facilitate the emergence, in our artificial societies of agents, of network features such as those found in ecological networks of interacting species. Conclusions Agent-based models developed in this way facilitate the automation of the design and execution of simulation experiments that allow for the exploration of diverse behavioural mechanisms believed to be responsible for community organisation in ecological communities. This automated way of conducting experiments empowers the study of ecological networks by exploiting the expressive power of interaction-model specification in agent systems. PMID:21554669
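
    As a rough illustration of the agent-based idea above, and not the authors' interaction protocols, the sketch below lets plant and animal agents form links when their (single, invented) traits match, then inspects one topological feature of the emergent bipartite network. All parameter values are arbitrary.

        # Toy agent-based interaction model; every constant here is an assumption.
        import random
        from collections import defaultdict

        random.seed(1)
        N_PLANTS, N_ANIMALS, STEPS, TOLERANCE = 20, 30, 2000, 0.15

        plants = [random.random() for _ in range(N_PLANTS)]    # one trait per plant agent
        animals = [random.random() for _ in range(N_ANIMALS)]  # one trait per animal agent
        links = defaultdict(int)                               # (plant, animal) -> interaction count

        for _ in range(STEPS):
            p = random.randrange(N_PLANTS)
            a = random.randrange(N_ANIMALS)
            # interaction protocol: an encounter becomes a link only if traits are close
            if abs(plants[p] - animals[a]) < TOLERANCE:
                links[(p, a)] += 1

        # degree sequence of the emergent network, one simple topological feature
        plant_degree = defaultdict(int)
        for (p, a) in links:
            plant_degree[p] += 1
        print(sorted(plant_degree.values(), reverse=True))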

  1. Automation of surface observations program

    NASA Technical Reports Server (NTRS)

    Short, Steve E.

    1988-01-01

    At present, surface weather observing methods are still largely manual and labor intensive. Through the nationwide implementation of Automated Surface Observing Systems (ASOS), this situation can be improved. Two ASOS capability levels are planned. The first is a basic-level system which will automatically observe the weather parameters essential for aviation operations and will operate either with or without supplemental contributions by an observer. The second is a more fully automated, stand-alone system which will observe and report the full range of weather parameters and will operate primarily in the unattended mode. Approximately 250 systems are planned by the end of the decade. When deployed, these systems will generate the standard hourly and special long-line transmitted weather observations, as well as provide continuous weather information direct to airport users. Specific ASOS configurations will vary depending upon whether the operation is unattended, minimally attended, or fully attended. The major functions of ASOS are data collection, data processing, product distribution, and system control. The program phases of development, demonstration, production system acquisition, and operational implementation are described.

  2. Cassini Tour Atlas Automated Generation

    NASA Technical Reports Server (NTRS)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2011-01-01

    During the Cassini spacecraft's cruise phase and nominal mission, the Cassini Science Planning Team developed and maintained an online database of geometric and timing information called the Cassini Tour Atlas. The Tour Atlas consisted of several hundred megabytes of EVENTS mission planning software outputs, tables, plots, and images used by mission scientists for observation planning. Each time the nominal mission trajectory was altered or tweaked, a new Tour Atlas had to be regenerated manually. In the early phases of Cassini's Equinox Mission planning, an a priori estimate suggested that mission tour designers would develop approximately 30 candidate tours within a short period of time. So that Cassini scientists could analyze the science opportunities in each candidate tour quickly and thoroughly, and select the optimal series of orbits for science return, a separate Tour Atlas was required for each trajectory. Manually generating that number of trajectory analyses in the allotted time would have been impossible, so the entire task was automated using code written in five different programming languages. This software automates the generation of the Cassini Tour Atlas database. It performs with one UNIX command what previously took a day or two of human labor.

  3. Automated cleaning of electronic components

    SciTech Connect

    Drotning, W.

    1994-03-01

    Environmental and operator safety concerns are leading to the elimination of trichloroethylene (TCE) and chlorofluorocarbon (CFC) solvents in electronic component cleaning processes that remove rosin flux, organic and inorganic contamination, and particulates. Present processes depend heavily on these solvents for manual spray cleaning of small components and subassemblies. Use of alternative solvent systems can lead to longer processing times and reduced quality. Automated spray cleaning can improve the quality of the cleaning process, thus enabling the productive use of environmentally conscious materials, while minimizing personnel exposure to hazardous materials. In addition, the use of robotic and automated systems can reduce the manual handling of parts that necessitates additional cleaning. We describe the development of a prototype robotic system for cleaning electronic components in a spray cleaning workcell. An important feature of the prototype system is the capability to generate the robot paths and motions automatically from the CAD models of the part to be cleaned, and to embed cleaning process knowledge into the automatically programmed operations.

  4. Automation and robotics human performance

    NASA Technical Reports Server (NTRS)

    Mah, Robert W.

    1990-01-01

    The scope of this report is limited to the following: (1) assessing the feasibility of the assumptions for crew productivity during the intra-vehicular activities and extra-vehicular activities; (2) estimating the appropriate level of automation and robotics to accomplish balanced man-machine, cost-effective operations in space; (3) identifying areas where conceptually different approaches to the use of people and machines can leverage the benefits of the scenarios; and (4) recommending modifications to scenarios or developing new scenarios that will improve the expected benefits. The FY89 special assessments are grouped into the five categories shown in the report. The high level system analyses for Automation & Robotics (A&R) and Human Performance (HP) were performed under the Case Studies Technology Assessment category, whereas the detailed analyses for the critical systems and high leverage development areas were performed under the appropriate operations categories (In-Space Vehicle Operations or Planetary Surface Operations). The analysis activities planned for the Science Operations technology areas were deferred to FY90 studies. The remaining activities such as analytic tool development, graphics/video demonstrations and intelligent communicating systems software architecture were performed under the Simulation & Validations category.

  5. Automated glass-fragmentation analysis

    NASA Astrophysics Data System (ADS)

    Gordon, Gaile G.

    1996-02-01

    This paper describes a novel automated inspection process for tempered safety glass. The system is geared toward the European Community (EC) import regulations which are based on fragment count and dimensions in a fractured glass sample. The automation of this test presents two key challenges: image acquisition, and robust particle segmentation. The image acquisition must perform well both for clear and opaque glass. Opaque regions of glass are common in the American auto industry due to painted styling or adhesives (e.g. defroster cables). The system presented uses a multiple light source, reflected light imaging technique, rather than transmitted light imaging which is often used in manual versions of this inspection test. Segmentation of the glass fragments in the resulting images must produce clean and completely connected crack lines in order to compute the correct particle count. Processing must therefore be robust with respect to noise in the imaging process such as dust and glint on the glass. The system presented takes advantage of mathematical morphology algorithms, in particular the watershed algorithm, to perform robust preprocessing and segmentation. Example images and image segmentation results are shown for tempered safety glass which has been painted on the outside edges for styling purposes.
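
    As a hedged illustration of the counting and measurement step that follows segmentation (the reflected-light imaging and watershed preprocessing described above are omitted), the sketch below treats fragments as connected components of an already segmented binary image and reports their count and largest dimension. The toy image is invented and the EC criteria themselves are not encoded here.

        # Fragment count and size from a segmented mask; criteria values are not the EC's.
        import numpy as np
        from skimage.measure import label, regionprops

        def fragment_stats(binary_glass: np.ndarray):
            """binary_glass: True on glass material, False along segmented crack lines."""
            labels = label(binary_glass, connectivity=1)
            props = regionprops(labels)
            count = len(props)
            longest = max((p.major_axis_length for p in props), default=0.0)
            return count, longest

        # toy image: two fragments separated by a one-pixel-wide crack line
        img = np.ones((40, 80), dtype=bool)
        img[:, 40] = False
        count, longest = fragment_stats(img)
        print(count, round(longest, 1))   # 2 fragments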

  6. The Potential of Using Brain Images for Authentication

    PubMed Central

    Zhou, Zongtan; Shen, Hui; Hu, Dewen

    2014-01-01

    Biometric recognition (also known as biometrics) refers to the automated recognition of individuals based on their biological or behavioral traits. Examples of biometric traits include fingerprint, palmprint, iris, and face. The brain is the most important and complex organ in the human body. Can it be used as a biometric trait? In this study, we analyze the uniqueness of the brain and try to use the brain for identity authentication. The proposed brain-based verification system operates in two stages: gray matter extraction and gray matter matching. A modified brain segmentation algorithm is implemented for extracting gray matter from an input brain image. Then, an alignment-based matching algorithm is developed for brain matching. Experimental results on two data sets show that the proposed brain recognition system meets the high accuracy requirement of identity authentication. Though acquiring brain images is currently time-consuming and expensive, they are highly unique and, from a pattern recognition standpoint, have real potential for authentication. PMID:25126604
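
    The sketch below illustrates the two-stage structure just described (gray matter extraction, then matching) in the simplest possible terms. The intensity-window "segmentation", the Dice-based matcher, and the acceptance threshold are assumptions for illustration only, not the authors' modified segmentation or alignment-based matching algorithms, and the two volumes are assumed to be co-registered already.

        # Two-stage verification sketch; thresholds and the matcher are assumptions.
        import numpy as np

        def extract_gray_matter(t1_volume: np.ndarray, low=0.35, high=0.65) -> np.ndarray:
            """Crude intensity-window 'gray matter' mask on a normalized T1 volume."""
            v = (t1_volume - t1_volume.min()) / (np.ptp(t1_volume) + 1e-9)
            return (v > low) & (v < high)

        def match_score(gm_probe: np.ndarray, gm_enrolled: np.ndarray) -> float:
            """Dice overlap between two already co-registered gray matter masks."""
            inter = np.logical_and(gm_probe, gm_enrolled).sum()
            return 2.0 * inter / (gm_probe.sum() + gm_enrolled.sum() + 1e-9)

        def verify(probe_volume, enrolled_volume, threshold=0.80) -> bool:
            """Accept the identity claim only if the masks overlap strongly enough."""
            score = match_score(extract_gray_matter(probe_volume),
                                extract_gray_matter(enrolled_volume))
            return score >= threshold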

  7. The potential of using brain images for authentication.

    PubMed

    Chen, Fanglin; Zhou, Zongtan; Shen, Hui; Hu, Dewen

    2014-01-01

    Biometric recognition (also known as biometrics) refers to the automated recognition of individuals based on their biological or behavioral traits. Examples of biometric traits include fingerprint, palmprint, iris, and face. The brain is the most important and complex organ in the human body. Can it be used as a biometric trait? In this study, we analyze the uniqueness of the brain and try to use the brain for identity authentication. The proposed brain-based verification system operates in two stages: gray matter extraction and gray matter matching. A modified brain segmentation algorithm is implemented for extracting gray matter from an input brain image. Then, an alignment-based matching algorithm is developed for brain matching. Experimental results on two data sets show that the proposed brain recognition system meets the high accuracy requirement of identity authentication. Though acquiring brain images is currently time-consuming and expensive, they are highly unique and, from a pattern recognition standpoint, have real potential for authentication. PMID:25126604

  8. Automated 3D ultrasound image segmentation to aid breast cancer image interpretation.

    PubMed

    Gu, Peng; Lee, Won-Mean; Roubidoux, Marilyn A; Yuan, Jie; Wang, Xueding; Carson, Paul L

    2016-02-01

    Segmentation of an ultrasound image into functional tissues is of great importance to clinical diagnosis of breast cancer. However, many studies are found to segment only the mass of interest and not all major tissues. Differences and inconsistencies in ultrasound interpretation call for an automated segmentation method to make results operator-independent. Furthermore, manual segmentation of entire three-dimensional (3D) ultrasound volumes is time-consuming, resource-intensive, and clinically impractical. Here, we propose an automated algorithm to segment 3D ultrasound volumes into three major tissue types: cyst/mass, fatty tissue, and fibro-glandular tissue. To test its efficacy and consistency, the proposed automated method was employed on a database of 21 cases of whole breast ultrasound. Experimental results show that our proposed method not only distinguishes fat and non-fat tissues correctly, but performs well in classifying cyst/mass. Comparison of density assessment between the automated method and manual segmentation demonstrates good consistency with an accuracy of 85.7%. Quantitative comparison of corresponding tissue volumes, which uses overlap ratio, gives an average similarity of 74.54%, consistent with values seen in MRI brain segmentations. Thus, our proposed method exhibits great potential as an automated approach to segment 3D whole breast ultrasound volumes into functionally distinct tissues that may help to correct ultrasound speed of sound aberrations and assist in density based prognosis of breast cancer.
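
    A minimal sketch of the tissue-volume comparison mentioned above follows. The paper's exact "overlap ratio" definition is not reproduced here, so Jaccard overlap is assumed, and the tissue label codes are invented for illustration.

        # Per-tissue overlap between automated and manual label volumes (Jaccard assumed).
        import numpy as np

        def overlap_ratio(auto_mask: np.ndarray, manual_mask: np.ndarray) -> float:
            auto, manual = auto_mask.astype(bool), manual_mask.astype(bool)
            union = np.logical_or(auto, manual).sum()
            if union == 0:
                return 1.0  # both segmentations agree the tissue is absent
            return np.logical_and(auto, manual).sum() / union

        # hypothetical label codes: 1 = cyst/mass, 2 = fatty, 3 = fibro-glandular
        def per_tissue_overlap(auto_labels: np.ndarray, manual_labels: np.ndarray,
                               tissues=(1, 2, 3)) -> dict:
            return {t: overlap_ratio(auto_labels == t, manual_labels == t) for t in tissues}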

  9. Automated 3D ultrasound image segmentation for assistant diagnosis of breast cancer

    NASA Astrophysics Data System (ADS)

    Wang, Yuxin; Gu, Peng; Lee, Won-Mean; Roubidoux, Marilyn A.; Du, Sidan; Yuan, Jie; Wang, Xueding; Carson, Paul L.

    2016-04-01

    Segmentation of an ultrasound image into functional tissues is of great importance to clinical diagnosis of breast cancer. However, many studies are found to segment only the mass of interest and not all major tissues. Differences and inconsistencies in ultrasound interpretation call for an automated segmentation method to make results operator-independent. Furthermore, manual segmentation of entire three-dimensional (3D) ultrasound volumes is time-consuming, resource-intensive, and clinically impractical. Here, we propose an automated algorithm to segment 3D ultrasound volumes into three major tissue types: cyst/mass, fatty tissue, and fibro-glandular tissue. To test its efficacy and consistency, the proposed automated method was employed on a database of 21 cases of whole breast ultrasound. Experimental results show that our proposed method not only distinguishes fat and non-fat tissues correctly, but performs well in classifying cyst/mass. Comparison of density assessment between the automated method and manual segmentation demonstrates good consistency with an accuracy of 85.7%. Quantitative comparison of corresponding tissue volumes, which uses overlap ratio, gives an average similarity of 74.54%, consistent with values seen in MRI brain segmentations. Thus, our proposed method exhibits great potential as an automated approach to segment 3D whole breast ultrasound volumes into functionally distinct tissues that may help to correct ultrasound speed of sound aberrations and assist in density based prognosis of breast cancer.

  10. Fortress brain.

    PubMed

    Royall, Donald R

    2013-02-01

    Neurodegenerative diseases are associated with neuronal inclusions, comprised of protein aggregates. In Alzheimer's Disease (AD) and Lewy Body Disease (LBD) such lesions are distributed in a hierarchical retrograde transynaptic spatial pattern. This implies a retrograde transynaptic temporal propagation as well. There can be few explanations for this other than infectious agents (prions and viruses). This suggests that AD and LBD (at least) may have infectious origins. Transynaptic infiltration of the CNS along cranial nerve or other major projections, by one or more infectious agents has important implications. The clinical syndrome and natural history of each neurodegenerative disorder will reflect its portal of entry. There may be a different neurodegenerative syndrome for each cranial nerve or other portal of entry, and not all may manifest as "dementia". Each syndrome may be associated with more than one pathological lesion. Each pathology may be associated with several clinical syndromes. Host-parasite interactions are species specific. This may explain the rarity of AD-like pathology in most other older mammals. Over evolutionary timescales, the human brain should be adapted to predation by neurotropic agents. Viewed from this perspective, the prion-like pro-inflammatory and pro-apoptotic properties of β-amyloid and other proteins may be adaptive, and anti-microbial. Reductions in synaptic density may slow the progress of invading pathogens, while perineuronal nets and other structures may guard the gates. This suggests a defense in depth of a structure, the brain, that is inherently vulnerable to invasion along its neural networks.

  11. To automate or not to automate: this is the question

    PubMed Central

    Cymborowski, M.; Klimecka, M.; Chruszcz, M.; Zimmerman, M. D.; Shumilin, I. A.; Borek, D.; Lazarski, K.; Joachimiak, A.; Otwinowski, Z.; Anderson, W.

    2010-01-01

    New protocols and instrumentation significantly boost the outcome of structural biology, which has resulted in significant growth in the number of deposited Protein Data Bank structures. However, even an enormous increase of the productivity of a single step of the structure determination process may not significantly shorten the time between clone and deposition or publication. For example, in a medium size laboratory equipped with the LabDB and HKL-3000 systems, we show that automation of some (and integration of all) steps of the X-ray structure determination pathway is critical for laboratory productivity. Moreover, we show that the lag period after which the impact of a technology change is observed is longer than expected. PMID:20526815

  12. To automate or not to automate : this is the question.

    SciTech Connect

    Cymborowski, M.; Klimecka, M.; Chruszcz, M.; Zimmerman, M.; Shumilin, I.; Borek, D.; Lazarski, K.; Joachimiak, A.; Otwinowski, Z.; Anderson, W.; Minor, W.; Biosciences Division; Univ. of Virginia; Univ. of Texas; Northwestern Univ.; Univ. of Chicago

    2010-06-06

    New protocols and instrumentation significantly boost the outcome of structural biology, which has resulted in significant growth in the number of deposited Protein Data Bank structures. However, even an enormous increase of the productivity of a single step of the structure determination process may not significantly shorten the time between clone and deposition or publication. For example, in a medium size laboratory equipped with the LabDB and HKL-3000 systems, we show that automation of some (and integration of all) steps of the X-ray structure determination pathway is critical for laboratory productivity. Moreover, we show that the lag period after which the impact of a technology change is observed is longer than expected.

  13. The Automated Planet Finder's automation & first two years of science

    NASA Astrophysics Data System (ADS)

    Burt, Jennifer; Laughlin, Greg; Vogt, Steven S.; Holden, Bradford

    2016-01-01

    The Automated Planet Finder (APF) is the newest facility at Lick Observatory, comprising a 2.4 m telescope coupled with the high-resolution Levy echelle spectrograph. Purpose-built for exoplanet detection and characterization, the facility dedicates 80% of its observing time to these science goals. The APF has demonstrated 1 m/s radial velocity precision on bright, RV standard stars and performs with the same speed-on-sky as Keck/HIRES when observing M-dwarfs. The telescope is fully automated for RV operations, using a dynamic scheduler that makes informed decisions on which targets to observe based on scientific interest, desired cadence, required precision levels and current observing conditions, all on a minute-to-minute basis. This ensures that time is not wasted chasing non-optimal targets on nights with poor conditions and enables rapid changes to the overall science observing strategy. The APF has contributed to the detection of four planetary systems in its first two years of scientific operations. Our most recent detection is that of a 6-planet system around the bright (V=5.5), nearby (d=6.5 pc), K3V star HD 219134. The planets in this system have masses ranging from 3.5 to 108 Earth masses, with orbital periods from 3 to 2247 days. An independent detection of the inner 4 planets in this system by the HARPS-N team has shown that the third planet transits the star, making this system ideal for follow-up observations. I will discuss the design and implementation of the APF's dynamic scheduler, the telescope's planet detections to date, overall performance results of the telescope and our future observing strategy.
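
    The toy sketch below captures the flavour of a cadence- and conditions-aware target scorer like the dynamic scheduler described above. The field names, the weighting, and the seeing cutoff are invented and are not the APF's actual decision rules.

        # Hypothetical target scorer; all fields and thresholds are illustrative.
        from dataclasses import dataclass

        @dataclass
        class Target:
            name: str
            priority: float                 # scientific interest, higher is better
            desired_cadence_days: float
            days_since_last_obs: float
            required_precision_mps: float   # radial-velocity precision needed, m/s

        def score(target: Target, seeing_arcsec: float) -> float:
            # skip precision-critical targets when conditions are poor
            if target.required_precision_mps < 2.0 and seeing_arcsec > 2.5:
                return 0.0
            # overdue targets build up "cadence pressure" and score higher
            cadence_pressure = target.days_since_last_obs / target.desired_cadence_days
            return target.priority * cadence_pressure

        def pick_next(targets, seeing_arcsec: float) -> Target:
            return max(targets, key=lambda t: score(t, seeing_arcsec))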

  14. Working toward Transparency in Library Automation

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2007-01-01

    In this article, the author argues the need for transparency with regard to the automation systems used in libraries. As librarians make decisions regarding automation software and services, they should have convenient access to information about the organizations it will potentially acquire technology from and about the collective experiences of…

  15. Library Automation: A Measure of Attitude.

    ERIC Educational Resources Information Center

    Molholt, Pat A.

    Findings of the study described in this report indicate that the attitudes of library school students toward library automation were not changed significantly by a semester of relevant coursework. It has been hypothesized that these students would have a somewhat negative attitude toward automation, but that through relevant course instruction…

  16. At the intersection of automation and culture

    NASA Technical Reports Server (NTRS)

    Sherman, P. J.; Wiener, E. L.

    1995-01-01

    The crash of an automated passenger jet at Nagoya, Japan, in 1994 is used as an example of crew error in using automatic systems. Automation provides pilots with the ability to perform tasks in various ways. National culture is cited as a factor that affects how a pilot and crew interact with each other and with the equipment.

  17. Workflow Automation: A Collective Case Study

    ERIC Educational Resources Information Center

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  18. Investing in the Future: Automation Marketplace 2009

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2009-01-01

    In a year where the general economy presented enormous challenges, libraries continued to make investments in automation, especially in products that help improve what and how they deliver to their end users. Access to electronic content remains a key driver. In response to anticipated needs for new approaches to library automation, many companies…

  19. An Automated Library Circulation System: A Justification.

    ERIC Educational Resources Information Center

    Harrell, Charles B.

    This report for an automated circulation control system to replace the currently used automated off-line batch system discusses the general requirements for the requested system, the equipment needed, the planned uses and design of the proposed system, its utilization, its expected benefits, its estimated costs, the alternatives considered, and…

  20. Partial Automated Alignment and Integration System

    NASA Technical Reports Server (NTRS)

    Kelley, Gary Wayne (Inventor)

    2014-01-01

    The present invention is a Partial Automated Alignment and Integration System (PAAIS) used to automate the alignment and integration of space vehicle components. A PAAIS includes ground support apparatuses, a track assembly with a plurality of energy-emitting components and an energy-receiving component containing a plurality of energy-receiving surfaces. Communication components and processors allow communication and feedback through PAAIS.

  1. Validation of Automated Scoring of Science Assessments

    ERIC Educational Resources Information Center

    Liu, Ou Lydia; Rios, Joseph A.; Heilman, Michael; Gerard, Libby; Linn, Marcia C.

    2016-01-01

    Constructed response items can both measure the coherence of student ideas and serve as reflective experiences to strengthen instruction. We report on new automated scoring technologies that can reduce the cost and complexity of scoring constructed-response items. This study explored the accuracy of c-rater-ML, an automated scoring engine…

  2. What's New in the Library Automation Arena?

    ERIC Educational Resources Information Center

    Breeding, Marshall

    1998-01-01

    Reviews trends in library automation based on vendors at the 1998 American Library Association Annual Conference. Discusses the major industry trend, a move from host-based computer systems to the new generation of client/server, object-oriented, open systems-based automation. Includes a summary of developments for 26 vendors. (LRW)

  3. The Automated Logistics Element Planning System (ALEPS)

    NASA Technical Reports Server (NTRS)

    Schwaab, Douglas G.

    1991-01-01

    The design and functions of ALEPS (Automated Logistics Element Planning System) is a computer system that will automate planning and decision support for Space Station Freedom Logistical Elements (LEs) resupply and return operations. ALEPS provides data management, planning, analysis, monitoring, interfacing, and flight certification for support of LE flight load planning activities. The prototype ALEPS algorithm development is described.

  4. Perspective on Automation: Three Talks to Educators.

    ERIC Educational Resources Information Center

    Theobald, Robert; And Others

    These papers take the view that automation impinges upon our socio-psychological as well as economic existence and we must take drastic measures to survive. Robert Theobald, presenting evidence that automation brings job displacement, suggests that we face the choice of trying to insure enough jobs, or of taking advantage of the new free time to…

  5. SAF line pellet gaging. [Secure Automated Fabrication

    SciTech Connect

    Jedlovec, D.R.; Bowen, W.W.; Brown, R.L.

    1983-10-01

    Automated and remotely controlled pellet inspection operations will be utilized in the Secure Automated Fabrication (SAF) line. A prototypic pellet gage was designed and tested to verify conformance to the functions and requirements for measurement of diameter, surface flaws and weight-per-unit length.

  6. Do You Automate? Saving Time and Dollars

    ERIC Educational Resources Information Center

    Carmichael, Christine H.

    2010-01-01

    An automated workforce management strategy can help schools save jobs, improve the job satisfaction of teachers and staff, and free up precious budget dollars for investments in critical learning resources. Automated workforce management systems can help schools control labor costs, minimize compliance risk, and improve employee satisfaction.…

  7. Approaches to automated protein crystal harvesting

    SciTech Connect

    Deller, Marc C. Rupp, Bernhard

    2014-01-28

    Approaches to automated and robot-assisted harvesting of protein crystals are critically reviewed. While no true turn-key solutions for automation of protein crystal harvesting are currently available, systems incorporating advanced robotics and micro-electromechanical systems represent exciting developments with the potential to revolutionize the way in which protein crystals are harvested.

  8. 33 CFR 161.21 - Automated reporting.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 2 2014-07-01 2014-07-01 false Automated reporting. 161.21...) PORTS AND WATERWAYS SAFETY VESSEL TRAFFIC MANAGEMENT Vessel Movement Reporting System § 161.21 Automated... required to make continuous, all stations, AIS broadcasts, in lieu of voice Position Reports, to...

  9. 49 CFR 238.445 - Automated monitoring.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Automated monitoring. 238.445 Section 238.445 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor...

  10. 49 CFR 238.445 - Automated monitoring.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Automated monitoring. 238.445 Section 238.445 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor...

  11. 49 CFR 238.237 - Automated monitoring.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Automated monitoring. 238.237 Section 238.237... Equipment § 238.237 Automated monitoring. (a) Except as further specified in this paragraph, on or after... service for the first time on or after September 9, 2002, a working alerter shall be provided. (b)...

  12. 33 CFR 161.21 - Automated reporting.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 2 2013-07-01 2013-07-01 false Automated reporting. 161.21...) PORTS AND WATERWAYS SAFETY VESSEL TRAFFIC MANAGEMENT Vessel Movement Reporting System § 161.21 Automated... required to make continuous, all stations, AIS broadcasts, in lieu of voice Position Reports, to...

  13. 49 CFR 238.445 - Automated monitoring.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Automated monitoring. 238.445 Section 238.445 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor...

  14. 49 CFR 238.445 - Automated monitoring.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Automated monitoring. 238.445 Section 238.445 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD ADMINISTRATION... Equipment § 238.445 Automated monitoring. (a) Each passenger train shall be equipped to monitor...

  15. 49 CFR 238.237 - Automated monitoring.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Automated monitoring. 238.237 Section 238.237... Equipment § 238.237 Automated monitoring. (a) Except as further specified in this paragraph, on or after... service for the first time on or after September 9, 2002, a working alerter shall be provided. (b)...

  16. 49 CFR 238.237 - Automated monitoring.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Automated monitoring. 238.237 Section 238.237... Equipment § 238.237 Automated monitoring. (a) Except as further specified in this paragraph, on or after... service for the first time on or after September 9, 2002, a working alerter shall be provided. (b)...

  17. 49 CFR 238.237 - Automated monitoring.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Automated monitoring. 238.237 Section 238.237... Equipment § 238.237 Automated monitoring. (a) Except as further specified in this paragraph, on or after... service for the first time on or after September 9, 2002, a working alerter shall be provided. (b)...

  18. 33 CFR 161.21 - Automated reporting.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 2 2011-07-01 2011-07-01 false Automated reporting. 161.21...) PORTS AND WATERWAYS SAFETY VESSEL TRAFFIC MANAGEMENT Vessel Movement Reporting System § 161.21 Automated... required to make continuous, all stations, AIS broadcasts, in lieu of voice Position Reports, to...

  19. 33 CFR 161.21 - Automated reporting.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 2 2010-07-01 2010-07-01 false Automated reporting. 161.21...) PORTS AND WATERWAYS SAFETY VESSEL TRAFFIC MANAGEMENT Vessel Movement Reporting System § 161.21 Automated... required to make continuous, all stations, AIS broadcasts, in lieu of voice Position Reports, to...

  20. 33 CFR 161.21 - Automated reporting.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 2 2012-07-01 2012-07-01 false Automated reporting. 161.21...) PORTS AND WATERWAYS SAFETY VESSEL TRAFFIC MANAGEMENT Vessel Movement Reporting System § 161.21 Automated... required to make continuous, all stations, AIS broadcasts, in lieu of voice Position Reports, to...

  1. 49 CFR 238.237 - Automated monitoring.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Automated monitoring. 238.237 Section 238.237... Equipment § 238.237 Automated monitoring. (a) Except as further specified in this paragraph, on or after... service for the first time on or after September 9, 2002, a working alerter shall be provided. (b)...

  2. Physiological Self-Regulation and Adaptive Automation

    NASA Technical Reports Server (NTRS)

    Prinzell, Lawrence J.; Pope, Alan T.; Freeman, Frederick G.

    2007-01-01

    Adaptive automation has been proposed as a solution to current problems of human-automation interaction. Past research has shown the potential of this advanced form of automation to enhance pilot engagement and lower cognitive workload. However, there have been concerns voiced regarding issues, such as automation surprises, associated with the use of adaptive automation. This study examined the use of psychophysiological self-regulation training with adaptive automation that may help pilots deal with these problems through the enhancement of cognitive resource management skills. Eighteen participants were assigned to 3 groups (self-regulation training, false feedback, and control) and performed resource management, monitoring, and tracking tasks from the Multiple Attribute Task Battery. The tracking task was cycled between 3 levels of task difficulty (automatic, adaptive aiding, manual) on the basis of the electroencephalogram-derived engagement index. The other two tasks remained in automatic mode that had a single automation failure. Those participants who had received self-regulation training performed significantly better and reported lower National Aeronautics and Space Administration Task Load Index scores than participants in the false feedback and control groups. The theoretical and practical implications of these results for adaptive automation are discussed.
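
    As a small, hedged illustration of how an EEG-derived engagement index can drive task allocation in a tracking task like the one described above: the beta/(alpha+theta) form and the switching convention below follow the adaptive-automation literature generally and are assumptions here, not the study's exact implementation; band powers are taken as precomputed inputs.

        # Engagement-index-driven mode switching; the index form and cutoffs are assumptions.
        def engagement_index(theta_power: float, alpha_power: float, beta_power: float) -> float:
            return beta_power / (alpha_power + theta_power + 1e-9)

        def next_tracking_mode(index: float, low: float = 0.3, high: float = 0.7) -> str:
            """Cycle the tracking task between automation levels based on engagement."""
            if index < low:
                return "manual"           # disengaged operator: hand the task back
            if index > high:
                return "automatic"        # high workload/engagement: automate the task
            return "adaptive_aiding"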

  3. Injured brain regions associated with anxiety in Vietnam veterans.

    PubMed

    Knutson, Kristine M; Rakowsky, Shana T; Solomon, Jeffrey; Krueger, Frank; Raymont, Vanessa; Tierney, Michael C; Wassermann, Eric M; Grafman, Jordan

    2013-03-01

    Anxiety negatively affects quality of life and psychosocial functioning. Previous research has shown that anxiety symptoms in healthy individuals are associated with variations in the volume of brain regions, such as the amygdala, hippocampus, and the bed nucleus of the stria terminalis. Brain lesion data also suggests the hemisphere damaged may affect levels of anxiety. We studied a sample of 182 male Vietnam War veterans with penetrating brain injuries, using a semi-automated voxel-based lesion-symptom mapping (VLSM) approach. VLSM reveals significant associations between a symptom such as anxiety and the location of brain lesions, and does not require a broad, subjective assignment of patients into categories based on lesion location. We found that lesioned brain regions in cortical and limbic areas of the left hemisphere, including middle, inferior and superior temporal lobe, hippocampus, and fusiform regions, along with smaller areas in the inferior occipital lobe, parahippocampus, amygdala, and insula, were associated with increased anxiety symptoms as measured by the Neurobehavioral Rating Scale (NRS). These results were corroborated by similar findings using Neuropsychiatric Inventory (NPI) anxiety scores, which supports these regions' role in regulating anxiety. In summary, using a semi-automated analysis tool, we detected an effect of focal brain damage on the presentation of anxiety. We also separated the effects of brain injury and war experience by including a control group of combat veterans without brain injury. We compared this control group against veterans with brain lesions in areas associated with anxiety, and against veterans with lesions only in other brain areas. PMID:23328629
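
    A simplified sketch of the VLSM idea follows (not the authors' semi-automated pipeline): at each voxel, anxiety scores of patients with and without a lesion at that voxel are compared. Multiple-comparison correction, covariates, and other essential details are omitted, and the minimum-lesion-count rule is an assumption.

        # Voxel-wise lesion-symptom map via Welch t-tests; simplifications noted above.
        import numpy as np
        from scipy import stats

        def vlsm_tmap(lesion_masks: np.ndarray, scores: np.ndarray, min_lesions: int = 5):
            """lesion_masks: (n_patients, n_voxels) boolean; scores: (n_patients,) anxiety scores."""
            n_patients, n_voxels = lesion_masks.shape
            tmap = np.full(n_voxels, np.nan)
            for v in range(n_voxels):
                lesioned = lesion_masks[:, v]
                if lesioned.sum() < min_lesions or (~lesioned).sum() < min_lesions:
                    continue  # too few patients on one side to test this voxel
                t, _ = stats.ttest_ind(scores[lesioned], scores[~lesioned], equal_var=False)
                tmap[v] = t
            return tmap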

  4. Corpus Callosum Area and Brain Volume in Autism Spectrum Disorder: Quantitative Analysis of Structural MRI from the ABIDE Database

    ERIC Educational Resources Information Center

    Kucharsky Hiess, R.; Alter, R.; Sojoudi, S.; Ardekani, B. A.; Kuzniecky, R.; Pardoe, H. R.

    2015-01-01

    Reduced corpus callosum area and increased brain volume are two commonly reported findings in autism spectrum disorder (ASD). We investigated these two correlates in ASD and healthy controls using T1-weighted MRI scans from the Autism Brain Imaging Data Exchange (ABIDE). Automated methods were used to segment the corpus callosum and intracranial…

  5. Cockpit Adaptive Automation and Pilot Performance

    NASA Technical Reports Server (NTRS)

    Parasuraman, Raja

    2001-01-01

    The introduction of high-level automated systems in the aircraft cockpit has provided several benefits, e.g., new capabilities, enhanced operational efficiency, and reduced crew workload. At the same time, conventional 'static' automation has sometimes degraded human operator monitoring performance, increased workload, and reduced situation awareness. Adaptive automation represents an alternative to static automation. In this approach, task allocation between human operators and computer systems is flexible and context-dependent rather than static. Adaptive automation, or adaptive task allocation, is thought to provide for regulation of operator workload and performance, while preserving the benefits of static automation. In previous research we have reported beneficial effects of adaptive automation on the performance of both pilots and non-pilots of flight-related tasks. For adaptive systems to be viable, however, such benefits need to be examined jointly in the context of a single set of tasks. The studies carried out under this project evaluated a systematic method for combining different forms of adaptive automation. A model for effective combination of different forms of adaptive automation, based on matching adaptation to operator workload was proposed and tested. The model was evaluated in studies using IFR-rated pilots flying a general-aviation simulator. Performance, subjective, and physiological (heart rate variability, eye scan-paths) measures of workload were recorded. The studies compared workload-based adaptation to to non-adaptive control conditions and found evidence for systematic benefits of adaptive automation. The research provides an empirical basis for evaluating the effectiveness of adaptive automation in the cockpit. The results contribute to the development of design principles and guidelines for the implementation of adaptive automation in the cockpit, particularly in general aviation, and in other human-machine systems. Project goals

  6. Laboratory automation: trajectory, technology, and tactics.

    PubMed

    Markin, R S; Whalen, S A

    2000-05-01

    Laboratory automation is in its infancy, following a path parallel to the development of laboratory information systems in the late 1970s and early 1980s. Changes on the horizon in healthcare and clinical laboratory service that affect the delivery of laboratory results include the increasing age of the population in North America, the implementation of the Balanced Budget Act (1997), and the creation of disease management companies. Major technology drivers include outcomes optimization and phenotypically targeted drugs. Constant cost pressures in the clinical laboratory have forced diagnostic manufacturers into less than optimal profitability states. Laboratory automation can be a tool for the improvement of laboratory services and may decrease costs. The key to improvement of laboratory services is implementation of the correct automation technology. The design of this technology should be driven by required functionality. Automation design issues should be centered on the understanding of the laboratory and its relationship to healthcare delivery and the business and operational processes in the clinical laboratory. Automation design philosophy has evolved from a hardware-based approach to a software-based approach. Process control software to support repeat testing, reflex testing, and transportation management, and overall computer-integrated manufacturing approaches to laboratory automation implementation are rapidly expanding areas. It is clear that hardware and software are functionally interdependent and that the interface between the laboratory automation system and the laboratory information system is a key component. The cost-effectiveness of automation solutions suggested by vendors, however, has been difficult to evaluate because the number of automation installations are few and the precision with which operational data have been collected to determine payback is suboptimal. The trend in automation has moved from total laboratory automation to a
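
    As an illustration of one of the process-control functions named above (reflex testing), the sketch below applies a hypothetical rule table to a specimen's results; the test names, reference range, and reflex rule are invented examples, not any vendor's configuration.

        # Toy reflex-testing rule engine; tests, ranges, and rules are illustrative only.
        REFERENCE_RANGES = {"TSH": (0.4, 4.0)}     # mIU/L, hypothetical range
        REFLEX_RULES = {"TSH": "FT4"}              # abnormal TSH triggers a free-T4 order

        def reflex_orders(results: dict) -> list:
            """results: mapping of test name -> measured value for one specimen."""
            orders = []
            for test, value in results.items():
                low, high = REFERENCE_RANGES.get(test, (float("-inf"), float("inf")))
                if (value < low or value > high) and test in REFLEX_RULES:
                    orders.append(REFLEX_RULES[test])
            return orders

        print(reflex_orders({"TSH": 7.2}))   # ['FT4']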

  7. Janice VanCleave's Rocks and Minerals: Mind-Boggling Experiments You Can Turn into Science Fair Projects.

    ERIC Educational Resources Information Center

    VanCleave, Janice

    Science projects are a great way for students to learn more about science as they search for the answers to specific problems. This book offers guidance and provides ideas for students as they plan experiments, find and record information related to the problem, and organize data to find answers to the problem. The 20 topics in this book suggest…

  8. Rapid task acquisition of spatial-delayed alternation in an automated T-maze by mice.

    PubMed

    Schaefers, Andrea T U; Winter, York

    2011-11-20

    The spatial-delayed alternation task using a T-maze is the standard method for testing working memory in rodents and is widely used. Until now, however, there has been a gap in the understanding of the underlying brain mechanisms. The development of new manganese-enhanced brain imaging methods now permits a more specific examination of these mechanisms by allowing behavioural brain stimulation to take place outside the MRI scanner and the scan identifying the activation of specific brain regions to take place subsequently. The requirements for this method are a frequent repetition of the behaviour of interest, a control group that differs in only one task parameter, and the minimization of unspecific environmental factors to avoid irrelevant stimulation. To meet these requirements, a fully automated spatial-delayed alternation task in a T-maze was developed that used identity detectors and automated gates to route mice individually from their social home cage to the T-maze. An experimental and a control group of mice were trained in procedures that differed only in the parameter "working-memory based alternation". Our data demonstrate that both groups can be trained concurrently with a rapid procedure using the automated T-maze. With its high level of stimulation, the minimization of unspecific stimulation through environmental factors, and the simultaneous training of a control group that differs in only one task parameter, our set-up and procedure met the requirements of new imaging techniques for the study of the influence of a specific cognitive component of spatial-delayed alternation on activity in specific brain regions.
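
    The sketch below isolates the single task parameter that distinguishes the experimental procedure from the control ("working-memory based alternation"): whether a reward requires choosing the arm opposite to the previous choice. Hardware control of the gates and identity detectors is abstracted away, and the function is an illustration rather than the published protocol.

        # Alternation scoring for one session; control mice are rewarded on every visit.
        def run_session(choices, alternation_required=True):
            """choices: sequence of 'L'/'R' arm entries for one mouse; returns reward flags."""
            rewards, previous = [], None
            for arm in choices:
                if not alternation_required:
                    rewarded = True                       # control condition
                else:
                    rewarded = previous is not None and arm != previous
                rewards.append(rewarded)
                previous = arm
            return rewards

        print(run_session("LRLRRL"))   # [False, True, True, True, False, True]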

  9. Principles and methods for automated palynology.

    PubMed

    Holt, K A; Bennett, K D

    2014-08-01

    Pollen grains are microscopic so their identification and quantification has, for decades, depended upon human observers using light microscopes: a labour-intensive approach. Modern improvements in computing and imaging hardware and software now bring automation of pollen analyses within reach. In this paper, we provide the first review in over 15 yr of progress towards automation of the part of palynology concerned with counting and classifying pollen, bringing together literature published from a wide spectrum of sources. We consider which attempts offer the most potential for an automated palynology system for universal application across all fields of research concerned with pollen classification and counting. We discuss what is required to make the datasets of these automated systems as acceptable as those produced by human palynologists, and present suggestions for how automation will generate novel approaches to counting and classifying pollen that have hitherto been unthinkable.
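
    As a hedged sketch of the classify-and-count pipeline this review is concerned with, the code below computes a few simple shape features per segmented grain and feeds them to a standard classifier. The features, the classifier, and the labeled reference data (assumed to exist) are placeholders, not a specific published system.

        # Placeholder pollen classification pipeline; reference data are assumed to exist.
        import numpy as np
        from skimage.measure import label, regionprops
        from sklearn.ensemble import RandomForestClassifier

        def grain_features(binary_mask: np.ndarray) -> np.ndarray:
            """One feature row (area, eccentricity, solidity) per segmented grain."""
            regions = regionprops(label(binary_mask))
            return np.array([[r.area, r.eccentricity, r.solidity] for r in regions])

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        # training on features from reference slides with known taxa (data assumed):
        # clf.fit(reference_features, reference_taxa)
        # counting per taxon on a new slide:
        # taxa = clf.predict(grain_features(new_slide_mask))
        # print(dict(zip(*np.unique(taxa, return_counts=True))))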

  10. Automation of the longwall mining system

    NASA Technical Reports Server (NTRS)

    Zimmerman, W.; Aster, R. W.; Harris, J.; High, J.

    1982-01-01

    Cost effective, safe, and technologically sound applications of automation technology to underground coal mining were identified. The longwall analysis commenced with a general search for government and industry experience of mining automation technology. A brief industry survey was conducted to identify longwall operational, safety, and design problems. The prime automation candidates resulting from the industry experience and survey were: (1) the shearer operation, (2) shield and conveyor pan line advance, (3) a management information system to allow improved mine logistics support, and (4) component fault isolation and diagnostics to reduce untimely maintenance delays. A system network analysis indicated that a 40% improvement in productivity was feasible if system delays associated with all of the above four areas were removed. A technology assessment and conceptual system design of each of the four automation candidate areas showed that state of the art digital computer, servomechanism, and actuator technologies could be applied to automate the longwall system.

  11. Automation literature: A brief review and analysis

    NASA Technical Reports Server (NTRS)

    Smith, D.; Dieterly, D. L.

    1980-01-01

    Current thought and research positions which may allow for an improved capability to understand the impact of introducing automation to an existing system are established. The orientation was toward the type of studies which may provide some general insight into automation; specifically, the impact of automation in human performance and the resulting system performance. While an extensive number of articles were reviewed, only those that addressed the issue of automation and human performance were selected to be discussed. The literature is organized along two dimensions: time, Pre-1970, Post-1970; and type of approach, Engineering or Behavioral Science. The conclusions reached are not definitive, but do provide the initial stepping stones in an attempt to begin to bridge the concept of automation in a systematic progression.

  12. Automation concepts for large space power systems

    NASA Technical Reports Server (NTRS)

    Imamura, M. S.; Moser, R.; Aichele, D.; Lanier, R., Jr.

    1983-01-01

    A study was undertaken to develop a methodology for analyzing, selecting, and implementing automation functions for multi-hundred-kW photovoltaic power systems intended for a manned space station. The study involved identification of generic power system elements and their potential faults, definition of automation functions and their resulting benefits, and partitioning of automation functions between the power subsystem, the central spacecraft computer, and the ground. Automation to a varying degree was concluded to be mandatory to meet the design and operational requirements of the space station. The key drivers are indefinite lifetime, modular growth, high performance flexibility, a need to accommodate different electrical user load equipment, on-orbit assembly/maintenance/servicing, and a potentially large number of power subsystem components. Functions that are good candidates for automation via an expert-system approach include battery management and electrical consumables management.
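
    As a toy illustration of the expert-system battery management candidate identified above, the sketch below encodes a few if-then rules of the kind such a system might evaluate; the thresholds and actions are invented, not from the study.

        # Hypothetical rule set for battery/consumables management; values are illustrative.
        def battery_management_rules(state_of_charge: float, cell_temp_c: float,
                                     in_eclipse: bool) -> list:
            actions = []
            if cell_temp_c > 35.0:
                actions.append("reduce charge current")      # thermal protection rule
            if state_of_charge < 0.3 and not in_eclipse:
                actions.append("raise charge rate")          # recover capacity in sunlight
            if state_of_charge < 0.2 and in_eclipse:
                actions.append("shed non-critical loads")    # consumables management rule
            return actions

        print(battery_management_rules(0.15, 22.0, in_eclipse=True))   # ['shed non-critical loads']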

  13. AUTOMATING GROUNDWATER SAMPLING AT HANFORD

    SciTech Connect

    CONNELL CW; HILDEBRAND RD; CONLEY SF; CUNNINGHAM DE

    2009-01-16

    Until this past October, Fluor Hanford managed Hanford's integrated groundwater program for the U.S. Department of Energy (DOE). With the new contract awards at the Site, however, the CH2M HILL Plateau Remediation Company (CHPRC) has assumed responsibility for the groundwater-monitoring programs at the 586-square-mile reservation in southeastern Washington State. These programs are regulated by the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response Compensation and Liability Act (CERCLA). The purpose of monitoring is to track existing groundwater contamination from past practices, as well as other potential contamination that might originate from RCRA treatment, storage, and disposal (TSD) facilities. An integral part of the groundwater-monitoring program involves taking samples of the groundwater and measuring the water levels in wells scattered across the site. More than 1,200 wells are sampled each year. Historically, field personnel or 'samplers' have been issued pre-printed forms that have information about the well(s) for a particular sampling evolution. This information is taken from the Hanford Well Information System (HWIS) and the Hanford Environmental Information System (HEIS)--official electronic databases. The samplers used these hardcopy forms to document the groundwater samples and well water-levels. After recording the entries in the field, the samplers turned the forms in at the end of the day and the collected information was posted onto a spreadsheet that was then printed and included in a log book. The log book was then used to make manual entries of the new information into the software application(s) for the HEIS and HWIS databases. This is a pilot project for automating this tedious process by providing an electronic tool for automating water-level measurements and groundwater field-sampling activities. The automation will eliminate the manual forms and associated data entry, improve the accuracy of the

  14. Automation of the longwall mining system

    SciTech Connect

    Zimmerman, W.; Aster, R.; Harris, J.; High, J.

    1982-11-01

    The longwall automation study presented is the first phase of a study to evaluate mining automation opportunities. The objective was to identify cost-effective, safe, and technologically sound applications of automation technology to underground coal mining. The prime automation candidates resulting from the industry experience and survey were: (1) the shearer operation, (2) shield and conveyor pan-line advance, (3) a management information system to allow improved mine logistics support, and (4) component fault isolation and diagnostics to reduce untimely maintenance delays. A system network analysis indicated that a 40% improvement in productivity was feasible if system delays associated with all of the above four areas were removed. A technology assessment and conceptual system design of each of the four automation candidate areas showed that state-of-the-art digital computer, servomechanism, and actuator technologies could be applied to automate the longwall system. The final cost benefit analysis of all of the automation areas indicated a total net national benefit (profit) of roughly $200 million to the longwall mining industry if all automation candidates were installed. This cost benefit represented an approximate order of magnitude payback on the research and development (R and D) investment. In conclusion, it is recommended that the shearer operation be automated first because it provides a large number of other sensor inputs required for face alignment (i.e., shields and conveyor). Automation of the shield and conveyor pan-line advance is suggested as the next step since both the shearer and face alignment operations contributed the greatest time delays to the overall system downtime.

  15. Automated measurements of cerebral atrophy in multiple sclerosis.

    PubMed

    Hageleit, U; Will, C H; Seidel, D

    1987-01-01

    An automated method of measuring cerebral atrophy is introduced. Using this method we studied patients with multiple sclerosis and a control group, and found premature cerebral atrophy in multiple sclerosis (P = 1.32 x 10^-8 for males and P = 3.6 x 10^-14 for females). There was only a weak correlation between cerebral atrophy and psychological deficits. Multivariate analysis did not show any significant correlation between cerebral atrophy, duration of disease, clinical manifestations and progression of disease. We conclude that our method of measuring cerebral atrophy is more accurate and less time-consuming than the use of linear indices. It might be appropriate for further investigations evaluating atrophic processes in cerebrovascular, degenerative and exogenous-toxic diseases of the brain.
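
    A minimal sketch of one way such an atrophy measure can be automated from an already labeled volume follows: the CSF fraction of the intracranial volume. The label codes and the index definition are assumptions for illustration, not the authors' 1987 method.

        # Hypothetical atrophy index from a labeled volume; label codes are assumed.
        import numpy as np

        CSF, GM, WM = 1, 2, 3   # assumed label codes from a prior segmentation step

        def atrophy_index(label_volume: np.ndarray) -> float:
            """CSF volume as a fraction of total intracranial (CSF + GM + WM) volume."""
            intracranial = np.isin(label_volume, (CSF, GM, WM)).sum()
            if intracranial == 0:
                return float("nan")
            return np.count_nonzero(label_volume == CSF) / intracranial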

  16. Special Report: Brain Chemistry.

    ERIC Educational Resources Information Center

    Krassner, Michael B.

    1983-01-01

    Chemical actions in the brain result in cognitive, emotional, neuroendocrine, neuromuscular, and/or neurocirculatory effects. Developments in understanding brain chemistry are discussed, considering among others, neurotransmitter chemistry, neuropeptides, drugs and the brain, antidepressants, and actions of minor tranquilizers. (JN)

  17. Brain-based Learning.

    ERIC Educational Resources Information Center

    Weiss, Ruth Palombo

    2000-01-01

    Discusses brain research and how new imaging technologies allow scientists to explore how human brains process memory, emotion, attention, patterning, motivation, and context. Explains how brain research is being used to revise learning theories. (JOW)

  18. Traumatic Brain Injury

    MedlinePlus

    Traumatic brain injury (TBI) happens when a bump, blow, jolt, or other head injury causes damage to the brain. Every year, millions of people in the U.S. suffer brain injuries. More than half are bad enough that ...

  19. Brain tumor (image)

    MedlinePlus

    Brain tumors are classified depending on the exact site of the tumor, the type of tissue involved, benign ... tendencies of the tumor, and other factors. Primary brain tumors can arise from the brain cells, the meninges ( ...

  20. Traumatic Brain Injury

    MedlinePlus

    A legacy resource from NICHCY Disability Fact ... What is Traumatic Brain Injury? A traumatic brain injury (TBI) is an ...

  1. That's Using Your Brain!

    ERIC Educational Resources Information Center

    Visser, Dana R.

    1996-01-01

    Discusses new adult learning theories, including those of Roger Sperry (left brain/right brain), Paul MacLean (triune brain), and Howard Gardner (multiple intelligences). Relates adult learning theory to training. (JOW)

  2. Automated cleaning of electronic components

    SciTech Connect

    Drotning, W.; Meirans, L.; Wapman, W.; Hwang, Y.; Koenig, L.; Petterson, B.

    1994-07-01

    Environmental and operator safety concerns are leading to the elimination of trichloroethylene and chlorofluorocarbon solvents in cleaning processes that remove rosin flux, organic and inorganic contamination, and particulates from electronic components. Present processes depend heavily on these solvents for manual spray cleaning of small components and subassemblies. Use of alternative solvent systems can lead to longer processing times and reduced quality. Automated spray cleaning can improve the quality of the cleaning process, thus enabling the productive use of environmentally conscious materials, while minimizing personnel exposure to hazardous materials. We describe the development of a prototype robotic system for cleaning electronic components in a spray cleaning workcell. An important feature of the prototype system is the capability to generate the robot paths and motions automatically from the CAD models of the part to be cleaned, and to embed cleaning process knowledge into the automatically programmed operations.

  3. SAMI Automated Plug Plate Configuration

    NASA Astrophysics Data System (ADS)

    Lorente, N. P. F.; Farrell, T.; Goodwin, M.

    2013-10-01

    The Sydney-AAO Multi-object Integral field spectrograph (SAMI) is a prototype wide-field system at the Anglo-Australian Telescope (AAT) which uses a plug-plate to mount its 13×61-core imaging fibre bundles (hexabundles) in the optical path at the telescope's prime focus. In this paper we describe the process of determining the positions of the plug-plate holes, where plates contain three or more stacked observation configurations. The process, which until now has involved several separate steps and required significant manual configuration and checking, is being automated to increase efficiency and reduce error. This is carried out by means of a thin Java controller layer which drives the configuration cycle. This layer controls the user interface and the C++ algorithm layer where the plate configuration and optimisation is carried out. Additionally, through the Aladin display package, it provides visualisation and facilitates user verification of the resulting plates.

  4. Intelligent Robots for Factory Automation

    NASA Astrophysics Data System (ADS)

    Hall, E. L.; Oh, S. J.

    1985-04-01

    Industrial robots are now proven technology in a variety of applications including welding, materials handling, spray painting, machine loading and assembly. However, to fully realize the potential of these universal manipulators, "intelligence" needs to be added to the industrial robot. This involves adding sensory capability and machine intelligence to the controls. The "intelligence" may be added externally or as integral components of the robot. These new "intelligent robots" promise to greatly enhance the versatility of the robot for factory applications. The purpose of this paper is to present a brief review of the techniques and applications of intelligent robots for factory automation and to suggest possible designs for the intelligent robot of the future.

  5. Automated quantitative analysis for pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the size and shape of the opacities are classified by measuring the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired structures such as the images of blood vessels and ribs in the chest radiograph. Fuzzy contrast enhancement is also introduced for easy and accurate detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.
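
    The abstract names two processing steps: unsharp masking with a 1D uniform impulse response and fuzzy contrast enhancement. The sketch below shows only the generic form of the first step; the paper's bi-level variant, its kernel width, and its gain are not specified here, so those parameters are illustrative.

    ```python
    import numpy as np

    def unsharp_1d_uniform(img: np.ndarray, width: int = 31, gain: float = 1.0) -> np.ndarray:
        """Generic unsharp masking with a 1D uniform (box) impulse response.

        Only a sketch of the general technique named in the abstract; `width`
        and `gain` are illustrative values, not the published settings.
        """
        kernel = np.ones(width) / width
        # Blur each row with the 1D box kernel (suppresses broad, low-frequency
        # structures such as ribs and vessels relative to small opacities).
        blurred = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
        return img + gain * (img - blurred)

    # Toy usage on random data standing in for a chest radiograph.
    chest = np.random.rand(256, 256)
    enhanced = unsharp_1d_uniform(chest)
    print(enhanced.shape)
    ```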

  6. The automated ground network system

    NASA Technical Reports Server (NTRS)

    Smith, Miles T.; Militch, Peter N.

    1993-01-01

    The primary goal of the Automated Ground Network System (AGNS) project is to reduce Ground Network (GN) station life-cycle costs. To accomplish this goal, the AGNS project will employ an object-oriented approach to develop a new infrastructure that will permit continuous application of new technologies and methodologies to the Ground Network's class of problems. The AGNS project is a Total Quality (TQ) project. Through use of an open collaborative development environment, developers and users will have equal input into the end-to-end design and development process. This will permit direct user input and feedback and will enable rapid prototyping for requirements clarification. This paper describes the AGNS objectives, operations concept, and proposed design.

  7. Automated Design of Quantum Circuits

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.; Gray, Alexander G.

    2000-01-01

    In order to design a quantum circuit that performs a desired quantum computation, it is necessary to find a decomposition of the unitary matrix that represents that computation in terms of a sequence of quantum gate operations. To date, such designs have either been found by hand or by exhaustive enumeration of all possible circuit topologies. In this paper we propose an automated approach to quantum circuit design using search heuristics based on principles abstracted from evolutionary genetics, i.e. using a genetic programming algorithm adapted specially for this problem. We demonstrate the method on the task of discovering quantum circuit designs for quantum teleportation. We show that to find a given known circuit design (one which was hand-crafted by a human), the method considers roughly an order of magnitude fewer designs than naive enumeration. In addition, the method finds novel circuit designs superior to those previously known.
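
    To illustrate the search idea, here is a heavily simplified, one-qubit analogue of gate-sequence evolution: a small population of gate strings is scored against a target unitary and improved by mutation. The gate set, fitness measure, and population settings are toy choices and bear no relation to the paper's genetic programming system or its teleportation benchmark.

    ```python
    import numpy as np

    # One-qubit toy gate set; the paper searches multi-qubit circuits, so this
    # only illustrates the evolutionary search itself.
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    T = np.array([[1, 0], [0, np.exp(1j * np.pi / 4)]])
    GATES = {"H": H, "T": T}

    def unitary(seq):
        u = np.eye(2, dtype=complex)
        for name in seq:                      # apply gates left to right
            u = GATES[name] @ u
        return u

    def fitness(seq, target):
        # Overlap with the target unitary, insensitive to global phase (max 1.0).
        return abs(np.trace(target.conj().T @ unitary(seq))) / 2.0

    def mutate(seq, rng):
        child = list(seq)
        child[rng.integers(len(child))] = rng.choice(list(GATES))
        return child

    rng = np.random.default_rng(0)
    target = unitary(["H", "T", "H"])         # target built from a short known sequence
    pop = [[rng.choice(list(GATES)) for _ in range(7)] for _ in range(40)]
    for _ in range(200):                      # elitist selection + point mutation
        pop.sort(key=lambda s: fitness(s, target), reverse=True)
        pop = pop[:20] + [mutate(pop[rng.integers(20)], rng) for _ in range(20)]
    best = max(pop, key=lambda s: fitness(s, target))
    print([str(g) for g in best], round(fitness(best, target), 3))
    ```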

  8. Productivity goals drive office automation

    NASA Technical Reports Server (NTRS)

    Bradley, A. P.; Kurzhals, P. R.

    1983-01-01

    Office automation (OA) steps being taken by NASA to improve efficiency in communications between centers and personnel are outlined. NASA centers are currently linked by satellite for electronic mail and scheduling through dumb and intelligent terminals. The implementation of teleconferencing with interactive graphics transmitted between dial-up terminals is being examined in a pilot program, and interactive data bases are already in operation, with an on-line summary data base being planned for NASA headquarters. The NASA Recon on-line service is operating with citations of over 2,200,000 aeronautics and astronautics research documents and 300,000 scientific books accessed by over 250 terminals around the U.S. The emphasis for all the OA systems is on user-friendly design and minimizing the required input for entry and access.

  9. Automated Rocket Propulsion Test Management

    NASA Technical Reports Server (NTRS)

    Walters, Ian; Nelson, Cheryl; Jones, Helene

    2007-01-01

    The Rocket Propulsion Test-Automated Management System provides a central location for managing business activities associated with the Rocket Propulsion Test Management Board, the National Rocket Propulsion Test Alliance, and the Senior Steering Group. A set of authorized users, both on-site and off-site with regard to Stennis Space Center (SSC), can access the system through a Web interface. Web-based forms are used for user input, and generation and electronic distribution of reports are easily accessible. Major functions managed by this software include meeting agenda management, meeting minutes, action requests, action items, directives, and recommendations. Additional functions include electronic review, approval, and signatures. A repository/library of documents is available for users, and all items are tracked in the system by unique identification numbers and status (open, closed, percent complete, etc.). The system also provides queries and version control for input of all items.
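
    As a minimal sketch of the kind of tracked record the system manages (a unique identification number plus a status such as open/closed and percent complete), the snippet below defines a hypothetical action-item type; the real system's schema, approval workflow, and web forms are not described in the abstract.

    ```python
    from dataclasses import dataclass, field
    from itertools import count

    _ids = count(1)

    @dataclass
    class ActionItem:
        """Minimal stand-in for a tracked item (action request, directive, etc.).
        Field names and statuses are illustrative only."""
        title: str
        status: str = "open"              # e.g. open / closed
        percent_complete: int = 0
        item_id: int = field(default_factory=lambda: next(_ids))

        def close(self):
            self.status, self.percent_complete = "closed", 100

    items = [ActionItem("Publish meeting minutes"), ActionItem("Review test directive")]
    items[0].close()
    print([(i.item_id, i.status, i.percent_complete) for i in items])
    ```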

  10. Automated design of flexible linkers.

    PubMed

    Manion, Charles; Arlitt, Ryan; Campbell, Matthew I; Tumer, Irem; Stone, Rob; Greaney, P Alex

    2016-03-14

    This paper presents a method for the systematic and automated design of flexible organic linkers for construction of metal-organic frameworks (MOFs) in which flexibility, compliance, or other mechanically exotic properties originate at the linker level rather than from the framework kinematics. Our method couples a graph grammar method for systematically generating linker-like molecules with molecular dynamics modeling of the linkers' mechanical response. Using this approach we have generated a candidate pool of >59,000 hypothetical linkers. We screen linker candidates according to their mechanical behaviors under large deformation, and extract fragments common to the most performant candidate materials. To demonstrate the general approach to MOF design we apply our system to designing linkers for pressure-switching MOFs, i.e., MOFs that undergo reversible structural collapse after a stress threshold is exceeded. PMID:26687337
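
    The workflow, generate candidates from a grammar, score their mechanical response, then rank and screen, can be sketched as below. The fragment list and the scoring stub are placeholders; the actual work uses graph grammars over chemical building blocks and molecular dynamics simulations, neither of which is reproduced here.

    ```python
    import itertools, random

    # Toy fragment "grammar": linkers are built by concatenating small fragments.
    FRAGMENTS = ["phenyl", "alkyne", "amide", "ether"]

    def generate_candidates(max_len=3):
        for n in range(1, max_len + 1):
            for combo in itertools.product(FRAGMENTS, repeat=n):
                yield combo

    def mechanical_score(linker):
        # Stand-in for an MD-derived measure of large-deformation compliance.
        return random.Random(hash(linker)).random()

    # Screen the candidate pool and keep the most compliant linkers.
    pool = list(generate_candidates())
    ranked = sorted(pool, key=mechanical_score, reverse=True)
    print(len(pool), "candidates; top 3:", ranked[:3])
    ```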

  11. Automated information retrieval using CLIPS

    NASA Technical Reports Server (NTRS)

    Raines, Rodney Doyle, III; Beug, James Lewis

    1991-01-01

    Expert systems have considerable potential to assist computer users in managing the large volume of information available to them. One possible use of an expert system is to model the information retrieval interests of a human user and then make recommendations to the user as to articles of interest. At Cal Poly, a prototype expert system written in the C Language Integrated Production System (CLIPS) serves as an Automated Information Retrieval System (AIRS). AIRS monitors a user's reading preferences, develops a profile of the user, and then evaluates items returned from the information base. When prompted by the user, AIRS returns a list of items of interest to the user. In order to minimize the impact on system resources, AIRS is designed to run in the background during periods of light system use.
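
    A crude way to picture the profiling-and-scoring idea (though not the CLIPS rule representation AIRS actually uses) is a keyword-frequency profile built from previously read items, used to rank new ones:

    ```python
    from collections import Counter

    def build_profile(read_titles):
        """Crude user profile: keyword frequencies from previously read items."""
        words = Counter()
        for title in read_titles:
            words.update(w.lower() for w in title.split() if len(w) > 3)
        return words

    def score(item_title, profile):
        # Sum the profile weights of the words appearing in the candidate item.
        return sum(profile[w.lower()] for w in item_title.split())

    profile = build_profile(["Neural networks for brain segmentation",
                             "Automated brain image labeling"])
    new_items = ["Brain atlas registration methods", "Office automation trends"]
    print(sorted(new_items, key=lambda t: score(t, profile), reverse=True))
    ```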

  12. Automated Fresnel lens tester system

    SciTech Connect

    Phipps, G.S.

    1981-07-01

    An automated data collection system controlled by a desktop computer has been developed for testing Fresnel concentrators (lenses) intended for solar energy applications. The system maps the two-dimensional irradiance pattern (image) formed in a plane parallel to the lens while the lens and detector assembly track the sun. A point-detector silicon diode (0.5-mm-dia active area) measures the irradiance at each point of an operator-defined rectilinear grid of data positions. Comparison with a second detector measuring solar insolation levels yields solar concentration ratios over the image plane. Summation of image-plane energies allows calculation of lens efficiencies for various solar cell sizes. Various graphical plots of the concentration-ratio data help to visualize energy distribution patterns.
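
    The two derived quantities described, point-wise concentration ratio and lens efficiency for a given cell size, reduce to simple ratios and sums over the irradiance grid. The sketch below assumes a uniform grid and uses made-up numbers for the grid spacing, lens area, and insolation.

    ```python
    import numpy as np

    def concentration_map(image_irradiance, insolation):
        """Concentration ratio at each grid point: image-plane irradiance divided
        by the simultaneously measured solar insolation (both in W/m^2)."""
        return image_irradiance / insolation

    def lens_efficiency(image_irradiance, cell_side_m, grid_step_m, lens_area_m2, insolation):
        """Fraction of the power incident on the lens that lands inside a centred
        square cell of side `cell_side_m`. Grid geometry here is illustrative."""
        n = image_irradiance.shape[0]
        half = int(round(cell_side_m / grid_step_m / 2))
        c = n // 2
        cell = image_irradiance[c - half:c + half, c - half:c + half]
        collected = cell.sum() * grid_step_m**2          # W collected by the cell
        incident = insolation * lens_area_m2             # W on the lens aperture
        return collected / incident

    # Toy data standing in for a measured irradiance map: flat background plus
    # a bright focal spot.
    grid = np.full((101, 101), 50.0)
    grid[45:56, 45:56] = 6.0e5
    print(concentration_map(grid, 1000.0).max(),
          lens_efficiency(grid, cell_side_m=0.011, grid_step_m=0.001,
                          lens_area_m2=0.09, insolation=1000.0))
    ```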

  13. Mobile Collection and Automated Interpretation of EEG Data

    NASA Technical Reports Server (NTRS)

    Mintz, Frederick; Moynihan, Philip

    2007-01-01

    A system that would comprise mobile and stationary electronic hardware and software subsystems has been proposed for collection and automated interpretation of electroencephalographic (EEG) data from subjects in everyday activities in a variety of environments. By enabling collection of EEG data from mobile subjects engaged in ordinary activities (in contradistinction to collection from immobilized subjects in clinical settings), the system would expand the range of options and capabilities for performing diagnoses. Each subject would be equipped with one of the mobile subsystems, which would include a helmet holding floating electrodes in those positions on the patient's head that are required in classical EEG data-collection techniques. A bundle of wires would couple the EEG signals from the electrodes to a multi-channel transmitter also located in the helmet. Electronic circuitry in the helmet transmitter would digitize the EEG signals and transmit the resulting data via a multidirectional RF patch antenna to a remote location. At the remote location, the subject's EEG data would be processed and stored in a database auto-administered by a newly designed relational database management system (RDBMS). In this RDBMS, in nearly real time, the newly stored data would be subjected to automated interpretation involving comparison with other EEG data and concomitant peer-reviewed diagnoses stored in international brain databases administered by other similar RDBMSs.
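
    To make the digitize-and-transmit step concrete, the sketch below packs one frame of multi-channel samples plus a timestamp into a byte payload. The channel count, sample width, and wire format are invented for illustration; the proposal does not specify them.

    ```python
    import struct, time

    N_CHANNELS = 19                        # hypothetical electrode count
    FRAME_FMT = "<d" + "h" * N_CHANNELS    # float64 timestamp + int16 samples

    def pack_frame(timestamp, samples):
        """Pack one digitised EEG frame for transmission to the remote RDBMS.
        The wire format here is invented purely for illustration."""
        return struct.pack(FRAME_FMT, timestamp, *samples)

    def unpack_frame(payload):
        fields = struct.unpack(FRAME_FMT, payload)
        return fields[0], list(fields[1:])

    frame = pack_frame(time.time(), [0] * N_CHANNELS)
    ts, samples = unpack_frame(frame)
    print(len(frame), "bytes per frame;", len(samples), "channels")
    ```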

  14. Automated subject-specific, hexahedral mesh generation via image registration

    PubMed Central

    Ji, Songbai; Ford, James C.; Greenwald, Richard M.; Beckwith, Jonathan G.; Paulsen, Keith D.; Flashman, Laura A.; McAllister, Thomas W.

    2011-01-01

    Generating subject-specific, all-hexahedral meshes for finite element analysis continues to be of significant interest in biomechanical research communities. To date, most automated methods “morph” an existing atlas mesh to match with a subject anatomy, which usually result in degradation in mesh quality because of mesh distortion. We present an automated meshing technique that produces satisfactory mesh quality and accuracy without mesh repair. An atlas mesh is first developed using a script. A subject-specific mesh is generated with the same script after transforming the geometry into the atlas space following rigid image registration, and is transformed back into the subject space. By meshing the brain in 11 subjects, we demonstrate that the technique’s performance is satisfactory in terms of both mesh quality (99.5% of elements had a scaled Jacobian >0.6 while <0.01% were between 0 and 0.2) and accuracy (average distance between mesh boundary and geometrical surface was 0.07 mm while <1% greater than 0.5mm). The combined computational cost for image registration and meshing was <4 min. Our results suggest that the technique is effective for generating subject-specific, all-hexahedral meshes and that it may be useful for meshing a variety of anatomical structures across different biomechanical research fields. PMID:21731153
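
    The mesh-quality figure quoted above is the scaled Jacobian. A common way to compute it for one hexahedral element, the determinant of the three edge vectors leaving each corner normalised by their lengths, is sketched below; the node ordering follows the usual VTK/Exodus convention and is not necessarily the one used in the paper.

    ```python
    import numpy as np

    # Node numbering of the reference hexahedron: 0-3 on the bottom face
    # (counter-clockwise), 4-7 directly above them.
    CORNER_EDGES = [(1, 3, 4), (2, 0, 5), (3, 1, 6), (0, 2, 7),
                    (7, 5, 0), (4, 6, 1), (5, 7, 2), (6, 4, 3)]

    def scaled_jacobians(hex_nodes):
        """Scaled Jacobian at each of the 8 corners of one hexahedral element:
        determinant of the three edge vectors leaving the corner, normalised by
        their lengths. 1.0 for a perfect cube, <=0 for an inverted corner."""
        x = np.asarray(hex_nodes, dtype=float)
        vals = []
        for corner, (a, b, c) in enumerate(CORNER_EDGES):
            e = np.column_stack([x[a] - x[corner], x[b] - x[corner], x[c] - x[corner]])
            vals.append(np.linalg.det(e) / np.prod(np.linalg.norm(e, axis=0)))
        return np.array(vals)

    # A unit cube scores 1.0 at every corner; shearing the top face lowers the minimum.
    cube = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),
            (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1)]
    sheared = [(px + 0.4 * pz, py, pz) for px, py, pz in cube]
    print(scaled_jacobians(cube).min(), scaled_jacobians(sheared).min())
    ```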

  15. Automated Extraction of Flow Features

    NASA Technical Reports Server (NTRS)

    Dorney, Suzanne (Technical Monitor); Haimes, Robert

    2005-01-01

    Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation, including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required for understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be presented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required to point out areas of potential interest contained within the flow. This must not impose a heavy compute burden (the visualization should not significantly slow down the solution procedure in co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
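
    As one concrete example of an automated feature detector of the kind argued for here (not necessarily the one used in this work), the Q-criterion flags rotation-dominated regions of a velocity field:

    ```python
    import numpy as np

    def q_criterion(u, v, dx, dy):
        """Q-criterion on a 2D velocity field sampled on a uniform grid.
        Q > 0 marks rotation-dominated (vortical) regions. This is one common
        vortex indicator, shown only to illustrate automated feature extraction."""
        du_dx, du_dy = np.gradient(u, dx, dy)
        dv_dx, dv_dy = np.gradient(v, dx, dy)
        # Squared Frobenius norms of the strain-rate (S) and rotation (Omega) tensors.
        s_sq = du_dx**2 + dv_dy**2 + 0.5 * (du_dy + dv_dx)**2
        o_sq = 0.5 * (du_dy - dv_dx)**2
        return 0.5 * (o_sq - s_sq)

    # A single Gaussian vortex: Q should be positive near the core.
    x = np.linspace(-2, 2, 201)
    X, Y = np.meshgrid(x, x, indexing="ij")
    u, v = -Y * np.exp(-(X**2 + Y**2)), X * np.exp(-(X**2 + Y**2))
    q = q_criterion(u, v, x[1] - x[0], x[1] - x[0])
    print("vortex cells flagged:", int((q > 0).sum()))
    ```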

  16. Automated seizure detection using EKG.

    PubMed

    Osorio, Ivan

    2014-03-01

    Changes in heart rate, most often increases, are associated with the onset of epileptic seizures and may be used in lieu of cortical activity for automated seizure detection. The feasibility of this aim was tested on 241 clinical seizures from 81 subjects admitted to several Epilepsy Centers for invasive monitoring for evaluation for epilepsy surgery. The performance of the EKG-based seizure detection algorithm was compared to that of a validated algorithm applied to the electrocorticogram (ECoG). With the most sensitive detection settings [threshold T: 1.15; duration D: 0 s], 5/241 seizures (2%) were undetected (false negatives), and with the highest [T: 1.3; D: 5 s] settings, the number of false negative detections rose to 34 (14%). The rate of potential false positive (PFP) detections was 9.5/h with the lowest and 1.1/h with the highest T, D settings. Visual review of 336 ECoG segments associated with PFPs revealed that 120 (36%) were associated with seizures, 127 (38%) with bursts of epileptiform discharges, and only 87 (26%) were true false positives. Electrocardiographic (EKG)-based seizure onset detection preceded clinical onset by 0.8 s with the lowest and followed it by 13.8 s with the highest T, D settings. Automated EKG-based seizure detection is feasible and has potential clinical utility given its ease of acquisition and processing, high signal-to-noise ratio, and ergonomic advantages vis-à-vis EEG (electroencephalogram) or ECoG. Its use as an "electronic" seizure diary will remedy, in part, the inaccuracies of those generated by patients/care-givers in a cost-effective manner.
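
    The detection rule described, a heart-rate ratio threshold T held for a duration D, can be sketched roughly as below. The baseline estimate, windowing, and the toy RR series are guesses made for illustration; only the roles of the T and D parameters come from the abstract.

    ```python
    import numpy as np

    def detect_hr_events(rr_intervals_s, threshold=1.15, duration_s=0.0, baseline_beats=30):
        """Flag epochs where heart rate exceeds `threshold` times a running baseline
        for at least `duration_s` seconds (T/D roles as in the abstract; the
        baseline and windowing are simplified, not the published algorithm)."""
        rr = np.asarray(rr_intervals_s, dtype=float)
        hr = 60.0 / rr                                  # instantaneous heart rate, bpm
        times = np.cumsum(rr)
        events, start = [], None
        for i, rate in enumerate(hr):
            lo = max(0, i - baseline_beats)
            baseline = np.median(hr[lo:i]) if i > 0 else rate
            if rate > threshold * baseline:
                start = times[i] if start is None else start
                if times[i] - start >= duration_s:
                    events.append(start)
                    start = None                        # avoid re-reporting every beat
            else:
                start = None
        return events

    # Toy RR series: 70 bpm baseline with a burst of ~100 bpm beats.
    rr = [60 / 70.0] * 60 + [60 / 100.0] * 20 + [60 / 70.0] * 40
    print(detect_hr_events(rr, threshold=1.15, duration_s=5.0))
    ```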

  17. Towards Brain-inspired Web Intelligence

    NASA Astrophysics Data System (ADS)

    Zhong, Ning

    Artificial Intelligence (AI) has been studied mainly within the realm of computer-based technologies. Various computational models and knowledge-based systems have been developed for automated reasoning, learning, and problem-solving. However, several grand challenges remain. AI research has not produced major breakthroughs recently, owing to a limited understanding of human brains and natural intelligence. In addition, most AI models and systems will not work well when dealing with large-scale, dynamically changing, open, and distributed information sources at a Web scale.

  18. Flight-deck automation - Promises and problems

    NASA Technical Reports Server (NTRS)

    Wiener, E. L.; Curry, R. E.

    1980-01-01

    The paper analyzes the role of human factors in flight-deck automation, identifies problem areas, and suggests design guidelines. Flight-deck automation using microprocessor technology and display systems improves performance and safety while leading to a decrease in size, cost, and power consumption. On the other hand negative factors such as failure of automatic equipment, automation-induced error compounded by crew error, crew error in equipment set-up, failure to heed automatic alarms, and loss of proficiency must also be taken into account. Among the problem areas discussed are automation of control tasks, monitoring of complex systems, psychosocial aspects of automation, and alerting and warning systems. Guidelines are suggested for designing, utilising, and improving control and monitoring systems. Investigation into flight-deck automation systems is important as the knowledge gained can be applied to other systems such as air traffic control and nuclear power generation, but the many problems encountered with automated systems need to be analyzed and overcome in future research.

  19. Automation tools for flexible aircraft maintenance.

    SciTech Connect

    Prentice, William J.; Drotning, William D.; Watterberg, Peter A.; Loucks, Clifford S.; Kozlowski, David M.

    2003-11-01

    This report summarizes the accomplishments of the Laboratory Directed Research and Development (LDRD) project 26546 at Sandia, during the period FY01 through FY03. The project team visited four DoD depots that support extensive aircraft maintenance in order to understand critical needs for automation, and to identify maintenance processes for potential automation or integration opportunities. From the visits, the team identified technology needs and application issues, as well as non-technical drivers that influence the application of automation in depot maintenance of aircraft. Software tools for automation facility design analysis were developed, improved, extended, and integrated to encompass greater breadth for eventual application as a generalized design tool. The design tools for automated path planning and path generation have been enhanced to incorporate those complex robot systems with redundant joint configurations, which are likely candidate designs for a complex aircraft maintenance facility. A prototype force-controlled actively compliant end-effector was designed and developed based on a parallel kinematic mechanism design. This device was developed for demonstration of surface finishing, one of many in-contact operations performed during aircraft maintenance. This end-effector tool was positioned along the workpiece by a robot manipulator, programmed for operation by the automated planning tools integrated for this project. Together, the hardware and software tools demonstrate many of the technologies required for flexible automation in a maintenance facility.

  20. Automated CPX support system preliminary design phase

    NASA Technical Reports Server (NTRS)

    Bordeaux, T. A.; Carson, E. T.; Hepburn, C. D.; Shinnick, F. M.

    1984-01-01

    The development of the Distributed Command and Control System (DCCS) is discussed. The development of an automated C2 system stimulated the development of an automated command post exercise (CPX) support system to provide a more realistic stimulus to DCCS than could be achieved with the existing manual system. An automated CPX system to support corps-level exercise was designed. The effort comprised four tasks: (1) collecting and documenting user requirements; (2) developing a preliminary system design; (3) defining a program plan; and (4) evaluating the suitability of the TRASANA FOURCE computer model.

  1. Application of advanced technology to space automation

    NASA Technical Reports Server (NTRS)

    Schappell, R. T.; Polhemus, J. T.; Lowrie, J. W.; Hughes, C. A.; Stephens, J. R.; Chang, C. Y.

    1979-01-01

    Automated operations in space provide the key to optimized mission design and data acquisition at minimum cost for the future. The results of this study strongly support this statement and should provide further incentive for immediate development of specific automation technology as defined herein. Essential automation technology requirements were identified for future programs. The study was undertaken to address the future role of automation in the space program, the potential benefits to be derived, and the technology efforts that should be directed toward obtaining these benefits.

  2. Administrative automation in a scientific environment

    NASA Technical Reports Server (NTRS)

    Jarrett, J. R.

    1984-01-01

    Although the scientific personnel at GSFC were advanced in the development and use of hardware and software for scientific applications, resistance to the use of automation, or to the purchase of terminals, software, and services specifically for administrative functions, was widespread. The approach used to address problems and constraints, and plans for administrative automation within the Space and Earth Sciences Directorate, are delineated. Accomplishments thus far include reduction of paperwork and manual effort; improved communications through telemail and committees; additional support staff; increased awareness at all levels of ergonomic concerns and the need for training; better equipment; improved ADP skills through experience; management commitment; and an overall strategy for automating.

  3. Automation of Space Processing Applications Shuttle payloads

    NASA Technical Reports Server (NTRS)

    Crosmer, W. E.; Neau, O. T.; Poe, J.

    1975-01-01

    The Space Processing Applications Program is examining the effect of weightlessness on key industrial materials processes, such as crystal growth, fine-grain casting of metals, and production of unique and ultra-pure glasses. For reasons of safety and optimum performance, some of these processes lend themselves to automation. Automation can increase the number of potential Space Shuttle flight opportunities and increase the overall productivity of the program. Five automated facility design concepts and overall payload combinations incorporating these facilities are presented.

  4. Automated maintenance of embryonic stem cell cultures.

    PubMed

    Terstegge, Stefanie; Laufenberg, Iris; Pochert, Jörg; Schenk, Sabine; Itskovitz-Eldor, Joseph; Endl, Elmar; Brüstle, Oliver

    2007-01-01

    Embryonic stem cell (ESC) technology provides attractive perspectives for generating unlimited numbers of somatic cells for disease modeling and compound screening. A key prerequisite for these industrial applications is the availability of standardized and automated systems suitable for stem cell processing. Here we demonstrate that mouse and human ESCs propagated by automated culture maintain their mean specific growth rates, their capacity for multi-germ-layer differentiation, and the expression of the pluripotency-associated markers SSEA-1/Oct-4 and Tra-1-60/Tra-1-81/Oct-4, respectively. The feasibility of ESC culture automation may greatly facilitate the use of this versatile cell source for a variety of biomedical applications.
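
    The "mean specific growth rate" referred to above is conventionally computed from cell counts at the start and end of a passage interval; a small helper (with made-up counts, not data from the study) is shown below.

    ```python
    import math

    def specific_growth_rate(n_start, n_end, hours):
        """Mean specific growth rate over one passage interval:
        mu = ln(N_end / N_start) / dt; doubling time follows as ln(2) / mu."""
        mu = math.log(n_end / n_start) / hours
        return mu, math.log(2) / mu

    # Hypothetical counts: 2e5 cells seeded, 1.6e6 harvested after 72 h.
    mu, t_double = specific_growth_rate(2.0e5, 1.6e6, 72.0)
    print(f"mu = {mu:.4f} 1/h, doubling time = {t_double:.1f} h")
    ```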

  5. CFD Process Automation Using Overset Grids

    NASA Technical Reports Server (NTRS)

    Buning, Pieter G.; George, Michael W. (Technical Monitor)

    1995-01-01

    This talk summarizes three applications of the overset grid method for CFD using some level of automated grid generation, flow solution and post-processing. These applications are 2D high-lift airfoil analysis (INS2D code), turbomachinery applications (ROTOR2/3 codes), and subsonic transport wing/body configurations (OVERFLOW code). These examples provide a forum for discussing the advantages and disadvantages of overset gridding for use in an automated CFD process. The goals and benefits of the automation incorporated in each application will be described, as well as the shortcomings of the approaches.

  6. Implementation of and experiences with new automation.

    PubMed

    Mahmud, I; Kim, D

    2000-01-01

    In an environment where cost, timeliness, and quality drive the business, it is essential to look to technology for answers to these challenges. In the Novartis Pharmaceutical Quality Assurance Department, automation and robotics have become just the tools to meet these challenges. Although automation is a relatively new concept in our department, we have fully embraced it within just a few years. As our company went through a merger, there was a significant reduction in the workforce within the Quality Assurance Department through voluntary and involuntary separations. However, the workload remained constant or in some cases actually increased. So even with the reduction in laboratory personnel, we were challenged internally and from headquarters in Basle to improve productivity while maintaining integrity in quality testing. Benchmark studies indicated the Suffern site to be the manufacturing site of choice over other facilities. This is attributed to the Suffern facility employees' commitment to reduce cycle time, improve efficiency, and maintain a high level of regulatory compliance. One of the stronger contributing factors was automation technology in the laboratories, and this technology will continue to help the site's status in the future. The Automation Group was originally formed about 2 years ago to meet the demands of high quality assurance testing throughput needs and to bring our testing group up to standard with the industry. Automation began with only two people in the group and now we have three people who are the next generation automation scientists. Even with such a small staff, we have made great strides in laboratory automation as we have worked extensively with each piece of equipment brought in. The implementation process of each project was often difficult because the second generation automation group came from the laboratory without much automation experience. However, with the involvement from the users at 'get-go', we were

  7. Aviation Safety/Automation Program Conference

    NASA Technical Reports Server (NTRS)

    Morello, Samuel A. (Compiler)

    1990-01-01

    The Aviation Safety/Automation Program Conference - 1989 was sponsored by the NASA Langley Research Center on 11 to 12 October 1989. The conference, held at the Sheraton Beach Inn and Conference Center, Virginia Beach, Virginia, was chaired by Samuel A. Morello. The primary objective of the conference was to ensure effective communication and technology transfer by providing a forum for technical interchange of current operational problems and program results to date. The Aviation Safety/Automation Program has as its primary goal to improve the safety of the national airspace system through the development and integration of human-centered automation technologies for aircraft crews and air traffic controllers.

  8. Process development for automated solar cell and module production. Task 4: automated array assembly

    SciTech Connect

    Hagerty, J.J.

    1980-06-30

    The scope of work under this contract involves specifying a process sequence which can be used in conjunction with automated equipment for the mass production of solar cell modules for terrestrial use. This process sequence is then critically analyzed from a technical and economic standpoint to determine the technological readiness of each process step for implementation. The process steps are ranked according to the degree of development effort required and according to their significance to the overall process. Under this contract the steps receiving analysis were: back contact metallization, automated cell array layup/interconnect, and module edge sealing. For automated layup/interconnect both hard automation and programmable automation (using an industrial robot) were studied. The programmable automation system was then selected for actual hardware development. Economic analysis using the SAMICS system has been performed during these studies to assure that development efforts have been directed towards the ultimate goal of price reduction. Details are given. (WHK)

  9. Brain Tumor Symptoms

    MedlinePlus


  10. 12 CFR 1005.16 - Disclosures at automated teller machines.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Disclosures at automated teller machines. 1005... TRANSFERS (REGULATION E) General § 1005.16 Disclosures at automated teller machines. (a) Definition. “Automated teller machine operator” means any person that operates an automated teller machine at which...

  11. 12 CFR 205.16 - Disclosures at automated teller machines.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 2 2010-01-01 2010-01-01 false Disclosures at automated teller machines. 205... SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.16 Disclosures at automated teller machines. (a) Definition. Automated teller machine operator means any person that operates an automated teller machine...

  12. 12 CFR 205.16 - Disclosures at automated teller machines.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 2 2012-01-01 2012-01-01 false Disclosures at automated teller machines. 205... SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.16 Disclosures at automated teller machines. (a) Definition. Automated teller machine operator means any person that operates an automated teller machine...

  13. 12 CFR 1005.16 - Disclosures at automated teller machines.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Disclosures at automated teller machines. 1005... TRANSFERS (REGULATION E) General § 1005.16 Disclosures at automated teller machines. (a) Definition. “Automated teller machine operator” means any person that operates an automated teller machine at which...

  14. 12 CFR 205.16 - Disclosures at automated teller machines.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 2 2013-01-01 2013-01-01 false Disclosures at automated teller machines. 205... SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.16 Disclosures at automated teller machines. (a) Definition. Automated teller machine operator means any person that operates an automated teller machine...

  15. 12 CFR 205.16 - Disclosures at automated teller machines.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 2 2014-01-01 2014-01-01 false Disclosures at automated teller machines. 205... SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.16 Disclosures at automated teller machines. (a) Definition. Automated teller machine operator means any person that operates an automated teller machine...

  16. 12 CFR 1005.16 - Disclosures at automated teller machines.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Disclosures at automated teller machines. 1005... TRANSFERS (REGULATION E) § 1005.16 Disclosures at automated teller machines. (a) Definition. “Automated teller machine operator” means any person that operates an automated teller machine at which a...

  17. 12 CFR 205.16 - Disclosures at automated teller machines.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 2 2011-01-01 2011-01-01 false Disclosures at automated teller machines. 205... SYSTEM ELECTRONIC FUND TRANSFERS (REGULATION E) § 205.16 Disclosures at automated teller machines. (a) Definition. Automated teller machine operator means any person that operates an automated teller machine...

  18. Automation of Space Station module power management and distribution system

    NASA Technical Reports Server (NTRS)

    Bechtel, Robert; Weeks, Dave; Walls, Bryan

    1990-01-01

    Viewgraphs on automation of space station module (SSM) power management and distribution (PMAD) system are presented. Topics covered include: reasons for power system automation; SSM/PMAD approach to automation; SSM/PMAD test bed; SSM/PMAD topology; functional partitioning; SSM/PMAD control; rack level autonomy; FRAMES AI system; and future technology needs for power system automation.

  19. 47 CFR 80.385 - Frequencies for automated systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Frequencies for automated systems. 80.385... SERVICES STATIONS IN THE MARITIME SERVICES Frequencies Automated Systems § 80.385 Frequencies for automated systems. This section describes the carrier frequencies for the Automated Maritime...

  20. 19 CFR 24.25 - Statement processing and Automated Clearinghouse.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 1 2013-04-01 2013-04-01 false Statement processing and Automated Clearinghouse... processing and Automated Clearinghouse. (a) Description. Statement processing is a voluntary automated program for participants in the Automated Broker Interface (ABI), allowing the grouping of...