Sample records for detect minimally processed

  1. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    NASA Astrophysics Data System (ADS)

    Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.

    2009-07-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus owing to their convenience, freshness, and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; combined with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique to detecting irradiation of MPV. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually with increasing dose; the difference between the two counts therefore grew with dose in all samples. A DEFT/APC difference of more than 2.0 log could serve as a criterion for judging whether an MPV has been treated by irradiation. The DEFT/APC method could thus be used satisfactorily as a screening method for indicating irradiation processing.
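    The screening rule described in this abstract can be sketched in a few lines of Python. This is a toy illustration: the 2.0-log threshold comes from the abstract, but the example counts are hypothetical.

```python
import math

def deft_apc_screen(deft_cfu_per_g, apc_cfu_per_g, threshold_log=2.0):
    """DEFT/APC screening: DEFT counts total cells (viable plus
    radiation-killed), APC counts only survivors, so a log10 gap
    above ~2.0 suggests the sample was irradiated."""
    diff = math.log10(deft_cfu_per_g) - math.log10(apc_cfu_per_g)
    return diff, diff > threshold_log

# Hypothetical counts: DEFT ~10^6 CFU/g, APC reduced to 5 x 10^3 CFU/g
diff, flagged = deft_apc_screen(1e6, 5e3)
```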

  2. Mitigating direct detection bounds in non-minimal Higgs portal scalar dark matter models

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Subhaditya; Ghosh, Purusottam; Maity, Tarak Nath; Ray, Tirtha Sankar

    2017-10-01

    The minimal Higgs portal dark matter model is increasingly in tension with recent results from direct detection experiments such as LUX and XENON. In this paper we make a systematic study of simple extensions of the Z_2-stabilized singlet scalar Higgs portal scenario in terms of their prospects at direct detection experiments. We consider both enlarging the stabilizing symmetry to Z_3 and incorporating multipartite features in the dark sector. We demonstrate that in these non-minimal models the interplay of annihilation, co-annihilation, and semi-annihilation processes considerably relaxes constraints from present and proposed direct detection experiments while simultaneously saturating the observed dark matter relic density. In particular, we explore the resonant semi-annihilation channel within the multipartite Z_3 framework, which opens up new, unexplored regions of parameter space that would be difficult to constrain by direct detection experiments in the near future. The role of dark matter exchange processes within the multi-component Z_3 × Z_3' framework is illustrated. We make quantitative estimates to elucidate the role of the various annihilation processes in the different allowed regions of parameter space within these models.

  3. Assessment of microbiological contamination of fresh, minimally processed, and ready-to-eat lettuces (Lactuca sativa), Rio de Janeiro State, Brazil.

    PubMed

    Brandão, Marcelo L L; Almeida, Davi O; Bispo, Fernanda C P; Bricio, Silvia M L; Marin, Victor A; Miagostovich, Marize P

    2014-05-01

    This study aimed to assess the microbiological contamination of lettuces commercialized in Rio de Janeiro, Brazil, by investigating the detection of norovirus genogroup II (NoV GII), Salmonella spp., total and fecal coliforms, and Escherichia coli. For NoV detection, samples were processed using the adsorption-elution concentration method combined with real-time quantitative polymerase chain reaction (qPCR). A total of 90 samples of lettuce, comprising 30 whole fresh lettuces, 30 minimally processed (MP) lettuces, and 30 raw ready-to-eat (RTE) lettuce salads, were randomly collected from different supermarkets (fresh and MP lettuce samples), food services, and self-service restaurants (RTE lettuce salads), all located in Rio de Janeiro, Brazil, from October 2010 to December 2011. NoV GII was not detected, and the PP7 bacteriophage used as internal control process (ICP) was recovered in 40.0%, 86.7%, and 76.7% of those samples, respectively. Salmonella spp. was not detected, although fecal contamination was indicated by fecal coliform concentrations higher than 10^2 most probable number/g. E. coli was detected in 70.0%, 6.7%, and 30.0% of fresh, MP, and RTE samples, respectively. This study highlights the need to improve hygiene procedures at all stages of vegetable production and demonstrates the PP7 bacteriophage as an ICP in methods for recovering RNA viruses from MP and RTE lettuce samples, encouraging the evaluation of new protocols that would make NoV detection methodologies accessible to a greater number of food microbiology laboratories. The PP7 bacteriophage can be used as an internal control process in methods for recovering RNA viruses from minimally processed and ready-to-eat lettuce samples. © 2014 Institute of Food Technologists®

  4. Minimally processed vegetable salads: microbial quality evaluation.

    PubMed

    Fröder, Hans; Martins, Cecília Geraldes; De Souza, Katia Leani Oliveira; Landgraf, Mariza; Franco, Bernadette D G M; Destro, Maria Teresa

    2007-05-01

    The increasing demand for fresh fruits and vegetables and for convenience foods is causing an expansion of the market share for minimally processed vegetables. Among the more common pathogenic microorganisms that can be transmitted to humans by these products are Listeria monocytogenes, Escherichia coli O157:H7, and Salmonella. The aim of this study was to evaluate the microbial quality of a selection of minimally processed vegetables. A total of 181 samples of minimally processed leafy salads were collected from retailers in the city of São Paulo, Brazil. Counts of total coliforms, fecal coliforms, Enterobacteriaceae, psychrotrophic microorganisms, and Salmonella were conducted for 133 samples. L. monocytogenes was assessed in 181 samples using the BAX System and by plating the enrichment broth onto Palcam and Oxford agars. Suspected Listeria colonies were submitted to classical biochemical tests. Populations of psychrotrophic microorganisms >10^6 CFU/g were found in 51% of the 133 samples, and Enterobacteriaceae populations between 10^5 and 10^6 CFU/g were found in 42% of the samples. Fecal coliform concentrations higher than 10^2 CFU/g (Brazilian standard) were found in 97 (73%) of the samples, and Salmonella was detected in 4 (3%) of the samples. Two of the Salmonella-positive samples had fecal coliform concentrations below 10^2 CFU/g. L. monocytogenes was detected in only 1 (0.6%) of the 181 samples examined; this positive sample was detected simultaneously by both methods. The other Listeria species identified by plating were L. welshimeri (one sample of curly lettuce) and L. innocua (two samples of watercress). The results indicate that minimally processed vegetables had poor microbiological quality, and these products could be a vehicle for pathogens such as Salmonella and L. monocytogenes.

  5. Effectiveness of radiation processing for elimination of Salmonella Typhimurium from minimally processed pineapple (Ananas comosus Merr.).

    PubMed

    Shashidhar, Ravindranath; Dhokane, Varsha S; Hajare, Sachin N; Sharma, Arun; Bandekar, Jayant R

    2007-04-01

    The microbiological quality of market samples of minimally processed (MP) pineapple was examined, along with the effectiveness of radiation treatment in eliminating Salmonella Typhimurium from laboratory-inoculated ready-to-eat pineapple slices. The microbiological quality of minimally processed pineapple samples from the Mumbai market was poor; 8.8% of the samples were positive for Salmonella. The D10 value (the radiation dose required to reduce the bacterial population by 90%) for S. Typhimurium inoculated in pineapple was 0.242 kGy. Inoculated-pack studies in minimally processed pineapple showed that treatment with a 2-kGy dose of gamma radiation could eliminate 5 log CFU/g of S. Typhimurium. The pathogen was not detected in radiation-processed samples for up to 12 days of storage at 4 and 10 degrees C. Processing of market samples with 1 and 2 kGy was effective in improving the microbiological quality of these products.
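    The reported D10 value relates dose to log reduction directly. A minimal sketch, using the D10 of 0.242 kGy from the abstract:

```python
D10_KGY = 0.242  # dose for one 90% (1-log10) reduction, from the study

def log_reduction(dose_kgy, d10_kgy=D10_KGY):
    """Number of decimal (log10) reductions achieved by a given dose."""
    return dose_kgy / d10_kgy

def dose_for(log_cycles, d10_kgy=D10_KGY):
    """Dose needed to reduce the population by `log_cycles` log10 units."""
    return log_cycles * d10_kgy
```

    By this estimate a 5-log kill needs about 1.21 kGy, consistent with the 2-kGy treatment eliminating 5 log CFU/g with margin to spare.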

  6. Production and characterization of curcumin microcrystals and evaluation of the antimicrobial and sensory aspects in minimally processed carrots.

    PubMed

    Silva, Anderson Clayton da; Santos, Priscila Dayane de Freitas; Palazzi, Nicole Campezato; Leimann, Fernanda Vitória; Fuchs, Renata Hernandez Barros; Bracht, Lívia; Gonçalves, Odinei Hess

    2017-05-24

    Nontoxic preservative agents are in demand by the food industry owing to consumers' concerns about synthetic preservatives, especially in minimally processed food. The antimicrobial activity of curcumin, a natural phenolic compound, has been extensively investigated, but hydrophobicity is an issue when applying curcumin to foodstuffs. The objective of this work was to evaluate curcumin microcrystals as an antimicrobial agent in minimally processed carrots. The antimicrobial activity of curcumin microcrystals was evaluated in vitro against Gram-positive (Bacillus cereus and Staphylococcus aureus) and Gram-negative (Escherichia coli and Pseudomonas aeruginosa) microorganisms, showing a statistically significant (p < 0.05) decrease in the minimum inhibitory concentration compared to in natura, pristine curcumin. Curcumin microcrystals were effective in inhibiting psychrotrophic and mesophilic microorganisms in minimally processed carrots. Sensory analyses showed no significant difference (p > 0.05) between curcumin microcrystal-treated and non-treated carrots in triangular and tetrahedral discriminative tests, indicating that curcumin microcrystals could be added as a natural preservative in minimally processed carrots without causing differences noticeable to the consumer. The analyses thus indicate that curcumin microcrystals are a suitable natural agent for inhibiting the native microbiota of minimally processed carrots.

  7. Use of new minimum intervention dentistry technologies in caries management.

    PubMed

    Tassery, H; Levallois, B; Terrer, E; Manton, D J; Otsuki, M; Koubi, S; Gugnani, N; Panayotov, I; Jacquot, B; Cuisinier, F; Rechmann, P

    2013-06-01

    Preservation of natural tooth structure requires early detection of the carious lesion and is associated with comprehensive patient dental care. Approaches that aim to detect carious lesions at the initial stage with optimum efficiency employ a variety of technologies, such as magnifying loupes, transillumination, light and laser fluorescence (QLF® and DIAGNOdent®), autofluorescence (Soprolife® and VistaCam®), electric current/impedance (CarieScan®), tomographic imaging, and image processing. Most fluorescent caries detection tools can discriminate between healthy and carious dental tissue, demonstrating different levels of sensitivity and specificity. Based on the fluorescence principle, an LED camera (Soprolife®) was developed (Sopro-Acteon, La Ciotat, France) that combines magnification, fluorescence, picture acquisition, and an innovative therapeutic concept called the light-induced fluorescence evaluator for diagnosis and treatment (LIFEDT). The article is rounded off by Soprolife® illustrations of minimally invasive and non-invasive dental techniques, distinguishing those that preserve or reinforce the enamel and enamel-dentine structures without any preparation (MIT1 - minimally invasive therapy 1) from those that require minimum preparation of the dental tissues (MIT2 - minimally invasive therapy 2), using several clinical cases as examples. MIT1 encompasses all the dental techniques aimed at disinfecting, remineralizing, reversing, and sealing the caries process; MIT2 involves a series of specific tools, including microburs, air abrasion devices, sonic and ultrasonic inserts, and photo-activated disinfection, to achieve minimal preparation of the tooth. With respect to minimally invasive treatment and prevention, the use of lasers is discussed. Furthermore, while most practices operate under a surgical model, Caries Management by Risk Assessment (CAMBRA) encourages a medical model of disease prevention and management to control the manifestation of the disease, or to keep the oral environment in a state of balance between pathological and preventive factors. Early detection, diagnosis, and prediction of lesion activity are of great interest and may change traditional operative procedures substantially. Fluorescence tools with high levels of magnification and observational capacity should guide clinicians towards a more preventive and minimally invasive treatment strategy. © 2013 Australian Dental Association.

  8. A highly stable minimally processed plant-derived recombinant acetylcholinesterase for nerve agent detection in adverse conditions

    PubMed Central

    Rosenberg, Yvonne J.; Walker, Jeremy; Jiang, Xiaoming; Donahue, Scott; Robosky, Jason; Sack, Markus; Lees, Jonathan; Urban, Lori

    2015-01-01

    Although recent innovations in transient plant systems have enabled gram quantities of proteins in 1–2 weeks, very few have been translated into applications due to technical challenges and high downstream processing costs. Here we report high-level production, using a Nicotiana benthamiana/p19 system, of an engineered recombinant human acetylcholinesterase (rAChE) that is highly stable in a minimally processed leaf extract. Lyophilized clarified extracts withstand prolonged storage at 70 °C and, upon reconstitution, can be used in several devices to detect organophosphate (OP) nerve agents and pesticides on surfaces ranging from 0 °C to 50 °C. The recent use of sarin in Syria highlights the urgent need for nerve agent detection and countermeasures necessary for preparedness and emergency responses. Bypassing cumbersome and expensive downstream processes has enabled us to fully exploit the speed, low cost and scalability of transient production systems resulting in the first successful implementation of plant-produced rAChE into a commercial biotechnology product. PMID:26268538

  9. A highly stable minimally processed plant-derived recombinant acetylcholinesterase for nerve agent detection in adverse conditions.

    PubMed

    Rosenberg, Yvonne J; Walker, Jeremy; Jiang, Xiaoming; Donahue, Scott; Robosky, Jason; Sack, Markus; Lees, Jonathan; Urban, Lori

    2015-08-13

    Although recent innovations in transient plant systems have enabled gram quantities of proteins in 1-2 weeks, very few have been translated into applications due to technical challenges and high downstream processing costs. Here we report high-level production, using a Nicotiana benthamiana/p19 system, of an engineered recombinant human acetylcholinesterase (rAChE) that is highly stable in a minimally processed leaf extract. Lyophilized clarified extracts withstand prolonged storage at 70 °C and, upon reconstitution, can be used in several devices to detect organophosphate (OP) nerve agents and pesticides on surfaces ranging from 0 °C to 50 °C. The recent use of sarin in Syria highlights the urgent need for nerve agent detection and countermeasures necessary for preparedness and emergency responses. Bypassing cumbersome and expensive downstream processes has enabled us to fully exploit the speed, low cost and scalability of transient production systems resulting in the first successful implementation of plant-produced rAChE into a commercial biotechnology product.

  10. Contour Detection and Completion for Inpainting and Segmentation Based on Topological Gradient and Fast Marching Algorithms

    PubMed Central

    Auroux, Didier; Cohen, Laurent D.; Masmoudi, Mohamed

    2011-01-01

    In this paper we combine the topological gradient, a powerful method for edge detection in image processing, with a variant of the minimal path method in order to find connected contours. The topological gradient provides a more global analysis of the image than the standard gradient and identifies the main edges of an image. Several image processing problems (e.g., inpainting and segmentation) require continuous contours. For this purpose, we apply the fast marching algorithm to find minimal paths in the topological gradient image. This coupled algorithm quickly provides accurate and connected contours. We then present two numerical applications of this hybrid algorithm: image inpainting and segmentation. PMID:22194734
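    As a rough illustration of the minimal-path idea, the sketch below runs Dijkstra's algorithm on a discrete cost grid as a stand-in for fast marching (which solves the continuous analogue of the same shortest-path problem). The grid and costs are invented for the example; in the paper's setting the cost would be low along edges of the topological-gradient image, so minimal paths trace connected contours.

```python
import heapq

def minimal_path(cost, start, goal):
    """Cheapest 4-connected path on a 2-D cost grid (Dijkstra)."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(heap, (nd, (nr, nc)))
    path = [goal]                 # walk predecessors back to the start
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]
```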

  11. Culture-free, highly sensitive, quantitative detection of bacteria from minimally processed samples using fluorescence imaging by smartphone.

    PubMed

    Shrivastava, Sajal; Lee, Won-Il; Lee, Nae-Eung

    2018-06-30

    A critical unmet need in the diagnosis of bacterial infections, which remain a major cause of human morbidity and mortality, is the detection of scarce bacterial pathogens in a variety of samples in a rapid and quantitative manner. Herein, we demonstrate smartphone-based detection of Staphylococcus aureus in a culture-free, rapid, quantitative manner from minimally processed liquid samples using aptamer-functionalized fluorescent magnetic nanoparticles. The tagged S. aureus cells were magnetically captured in a detection cassette, and fluorescence was then imaged using a smartphone camera with a light-emitting diode as the excitation source. Our results showed quantitative detection capability, with a minimum detectable concentration as low as 10 cfu/ml by counting individual bacterial cells, efficiently capturing S. aureus cells directly from a peanut milk sample within 10 min. When the selectivity of detection was investigated using samples spiked with other pathogenic bacteria, no significant non-specific detection occurred. Furthermore, strains of S. aureus from various origins showed comparable results, indicating that the approach can be widely adopted. The quantitative fluorescence imaging platform on a smartphone could therefore allow on-site detection of bacteria, which would be of great assistance during major infectious disease outbreaks in remote and resource-limited settings. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. RGB-to-RGBG conversion algorithm with adaptive weighting factors based on edge detection and minimal square error.

    PubMed

    Huang, Chengqiang; Yang, Youchang; Wu, Bo; Yu, Weize

    2018-06-01

    The sub-pixel arrangement of the RGBG panel differs from that of an image in RGB format, so an algorithm that converts RGB to RGBG is needed to display an RGB image on an RGBG panel. In published approaches to this conversion, however, information loss remains large even though color fringing artifacts are weakened. In this paper, an RGB-to-RGBG conversion algorithm with adaptive weighting factors based on edge detection and minimal square error (EDMSE) is proposed. The main points of innovation are as follows: (1) edge detection is used to distinguish image details with serious color fringing artifacts from image details that are prone to being lost in RGB-RGBG conversion; (2) for image details with serious color fringing artifacts, a weighting factor of 0.5 is applied to weaken the artifacts; and (3) for image details prone to loss in RGB-RGBG conversion, a special mechanism to minimize the square error is applied. Experiments show that color fringing artifacts are slightly improved by EDMSE, and the MSE values of the processed image are 19.6% and 7% smaller than those of images processed by the direct assignment and weighting-factor algorithms, respectively. The proposed algorithm is implemented on a field-programmable gate array to enable image display on the RGBG panel.
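    The adaptive choice of weighting factor can be caricatured as follows. This is a toy sketch, not the paper's actual algorithm: the edge threshold is invented, and real conversions operate on 2-D neighborhoods rather than a single neighbor.

```python
def adaptive_weight(center, neighbor, edge_threshold=32):
    """Choose the factor w used to fold a dropped subpixel into its
    neighbor as w*center + (1-w)*neighbor. On a detected edge (large
    local difference) use w = 0.5 to damp color fringing; otherwise
    minimizing the squared error (center - (w*center + (1-w)*neighbor))**2
    gives w = 1, i.e. keep the original subpixel value."""
    if abs(center - neighbor) > edge_threshold:  # crude edge detection
        return 0.5
    return 1.0
```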

  13. Information transmission using non-Poisson regular firing.

    PubMed

    Koyama, Shinsuke; Omi, Takahiro; Kass, Robert E; Shinomoto, Shigeru

    2013-04-01

    In many cortical areas, neural spike trains do not follow a Poisson process. In this study, we investigate a possible benefit of non-Poisson spiking for information transmission by studying the minimal rate fluctuation that can be detected by a Bayesian estimator. The idea is that an inhomogeneous Poisson process may make it difficult for downstream decoders to resolve subtle changes in rate fluctuation, but by using a more regular non-Poisson process, the nervous system can make rate fluctuations easier to detect. We evaluate the degree to which regular firing reduces the rate fluctuation detection threshold. We find that the threshold for detection is reduced in proportion to the coefficient of variation of interspike intervals.
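    The regularity measure at play here, the coefficient of variation (CV) of interspike intervals, is simple to compute: CV = 1 for Poisson spiking and CV < 1 for more regular trains. A minimal sketch with invented spike times:

```python
import statistics

def isi_cv(spike_times):
    """Coefficient of variation of interspike intervals:
    population std of the ISIs divided by their mean."""
    isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
    return statistics.pstdev(isis) / statistics.mean(isis)
```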

  14. Detecting number processing and mental calculation in patients with disorders of consciousness using a hybrid brain-computer interface system.

    PubMed

    Li, Yuanqing; Pan, Jiahui; He, Yanbin; Wang, Fei; Laureys, Steven; Xie, Qiuyou; Yu, Ronghao

    2015-12-15

    For patients with disorders of consciousness such as coma, a vegetative state or a minimally conscious state, one challenge is to detect and assess the residual cognitive functions in their brains. Number processing and mental calculation are important brain functions but are difficult to detect in patients with disorders of consciousness using motor response-based clinical assessment scales such as the Coma Recovery Scale-Revised due to the patients' motor impairments and inability to provide sufficient motor responses for number- and calculation-based communication. In this study, we presented a hybrid brain-computer interface that combines P300 and steady state visual evoked potentials to detect number processing and mental calculation in Han Chinese patients with disorders of consciousness. Eleven patients with disorders of consciousness who were in a vegetative state (n = 6) or in a minimally conscious state (n = 3) or who emerged from a minimally conscious state (n = 2) participated in the brain-computer interface-based experiment. During the experiment, the patients with disorders of consciousness were instructed to perform three tasks, i.e., number recognition, number comparison, and mental calculation, including addition and subtraction. In each experimental trial, an arithmetic problem was first presented. Next, two number buttons, only one of which was the correct answer to the problem, flickered at different frequencies to evoke steady state visual evoked potentials, while the frames of the two buttons flashed in a random order to evoke P300 potentials. The patients needed to focus on the target number button (the correct answer). Finally, the brain-computer interface system detected P300 and steady state visual evoked potentials to determine the button to which the patients attended, further presenting the results as feedback. 
Two of the six patients who were in a vegetative state, one of the three patients who were in a minimally conscious state, and the two patients that emerged from a minimally conscious state achieved accuracies significantly greater than the chance level. Furthermore, P300 potentials and steady state visual evoked potentials were observed in the electroencephalography signals from the five patients. Number processing and arithmetic abilities as well as command following were demonstrated in the five patients. Furthermore, our results suggested that through brain-computer interface systems, many cognitive experiments may be conducted in patients with disorders of consciousness, although they cannot provide sufficient behavioral responses.

  15. Interaction-free measurement as quantum channel discrimination

    NASA Astrophysics Data System (ADS)

    Zhou, You; Yung, Man-Hong

    2017-12-01

    Interaction-free measurement is a quantum process in which, in the ideal situation, an object can be detected as if no interaction took place with the probing photon. Here we show that the problem of interaction-free measurement can be regarded as a problem of quantum-channel discrimination. In particular, we look for the optimal photonic states that minimize the detection error and the photon loss in detecting the presence or absence of the object, which is taken to be semitransparent; the number of interrogation cycles is assumed to be finite. Furthermore, we investigate the possibility of reducing the detection error through the use of entangled photons, which is essentially a setting of quantum illumination. Our results indicate, however, that entanglement does not exhibit a clear advantage: the same performance can be achieved with unentangled photonic states.

  16. Edge detection - Image-plane versus digital processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.; Fales, Carl L.; Park, Stephen K.; Triplett, Judith A.

    1987-01-01

    To optimize edge detection with the familiar Laplacian-of-Gaussian operator, it has become common to implement this operator with a large digital convolution mask followed by some interpolation of the processed data to determine the zero crossings that locate edges. It is generally recognized that this large mask causes substantial blurring of fine detail. It is shown that the spatial detail can be improved by a factor of about four with either the Wiener-Laplacian-of-Gaussian filter or an image-plane processor. The Wiener-Laplacian-of-Gaussian filter minimizes the image-gathering degradations if the scene statistics are at least approximately known and also serves as an interpolator to determine the desired zero crossings directly. The image-plane processor forms the Laplacian-of-Gaussian response by properly combining the optical design of the image-gathering system with a minimal three-by-three lateral-inhibitory processing mask. This approach, which is suggested by Marr's model of early processing in human vision, also reduces data processing by about two orders of magnitude and data transmission by up to an order of magnitude.
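    The three-by-three lateral-inhibitory mask mentioned above amounts to a discrete Laplacian; edges are read off the zero crossings of the filtered response. A minimal pure-Python sketch on an invented step image:

```python
# 3x3 lateral-inhibitory mask: excitatory center, inhibitory neighbors
LAPLACIAN_3X3 = [[0, -1, 0],
                 [-1, 4, -1],
                 [0, -1, 0]]

def convolve3x3(img, kernel):
    """Apply a 3x3 kernel to the interior of a 2-D image (borders left 0)."""
    rows, cols = len(img), len(img[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            out[r][c] = sum(kernel[i][j] * img[r - 1 + i][c - 1 + j]
                            for i in range(3) for j in range(3))
    return out
```

    The response changes sign across an intensity step, so an edge is located between pixels where the filtered values cross zero.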

  17. Minimally invasive surgical method to detect sound processing in the cochlear apex by optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Ramamoorthy, Sripriya; Zhang, Yuan; Petrie, Tracy; Fridberger, Anders; Ren, Tianying; Wang, Ruikang; Jacques, Steven L.; Nuttall, Alfred L.

    2016-02-01

    Sound processing in the inner ear involves separation of the constituent frequencies along the length of the cochlea. Frequencies relevant to human speech (100 to 500 Hz) are processed in the apex region. Among mammals, the guinea pig cochlear apex processes similar frequencies and is thus relevant for the study of speech processing in the cochlea. However, the requirement for extensive surgery has challenged the optical accessibility of this area to investigate cochlear processing of signals without significant intrusion. A simple method is developed to provide optical access to the guinea pig cochlear apex in two directions with minimal surgery. Furthermore, all prior vibration measurements in the guinea pig apex involved opening an observation hole in the otic capsule, which has been questioned on the basis of the resulting changes to cochlear hydrodynamics. Here, this limitation is overcome by measuring the vibrations through the unopened otic capsule using phase-sensitive Fourier domain optical coherence tomography. The optically and surgically advanced method described here lays the foundation to perform minimally invasive investigation of speech-related signal processing in the cochlea.

  18. THREAT ENSEMBLE VULNERABILITY ASSESSMENT ...

    EPA Pesticide Factsheets

    TEVA-SPOT (software and manual) is used by water utilities to optimize the number and location of contamination detection sensors so that economic and/or public health consequences are minimized. TEVA-SPOT is interactive, allowing a user to specify the minimization objective (e.g., the number of people exposed, the time to detection, or the extent of pipe length contaminated). It also allows a user to specify constraints; for example, a TEVA-SPOT user can employ expert knowledge during the design process by identifying either existing or unfeasible sensor locations. Installation and maintenance costs for sensor placement can also be factored into the analysis. Python and Java are required to run TEVA-SPOT.
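    Sensor-placement problems of this kind are often attacked greedily. The sketch below is a generic greedy placement under fixed/excluded-location constraints, not TEVA-SPOT's actual algorithm, and the impact data in the test are invented.

```python
def greedy_sensor_placement(impact, candidates, n_sensors,
                            fixed=(), excluded=()):
    """impact: list of scenarios, each a dict node -> consequence if the
    first detection happens at that node. Greedily add the location that
    most reduces the mean (over scenarios) of the best detection among
    the chosen sensors, honoring fixed and excluded locations."""
    chosen = list(fixed)
    pool = [c for c in candidates if c not in excluded and c not in chosen]

    def mean_impact(selection):
        return sum(min(row[n] for n in selection)
                   for row in impact) / len(impact)

    while len(chosen) < n_sensors and pool:
        best = min(pool, key=lambda n: mean_impact(chosen + [n]))
        chosen.append(best)
        pool.remove(best)
    return chosen
```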

  19. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    ERIC Educational Resources Information Center

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  20. Algorithm-Based Motion Magnification for Video Processing in Urological Laparoscopy.

    PubMed

    Adams, Fabian; Schoelly, Reto; Schlager, Daniel; Schoenthaler, Martin; Schoeb, Dominik S; Wilhelm, Konrad; Hein, Simon; Wetterauer, Ulrich; Miernik, Arkadiusz

    2017-06-01

    Minimally invasive surgery is under constant development and has replaced many conventional operative procedures. If the movement of vascular structures could be detected during these procedures, the risk of vascular injury and of conversion to open surgery could be reduced. The recently proposed motion-amplifying algorithm Eulerian Video Magnification (EVM) has been shown to substantially enhance minimal object changes in digitally recorded video that are barely perceptible to the human eye. We adapted and examined this technology for use in urological laparoscopy. Video sequences of routine urological laparoscopic interventions were recorded and further processed using spatial decomposition and filtering algorithms. The freely available EVM algorithm was investigated for its usability in real-time processing. In addition, a new image processing technology, the CRS iimotion Motion Magnification (CRSMM) algorithm, was specifically adjusted for endoscopic requirements, applied, and validated by our working group. Using EVM, no significant motion enhancement could be detected without severe impairment of image resolution, motion, and color presentation. The CRSMM algorithm significantly improved image quality in terms of motion enhancement; in particular, the pulsation of vascular structures could be displayed more accurately than with EVM. Motion magnification image processing has potential clinical importance as a video-optimizing modality in endoscopic and laparoscopic surgery: barely detectable (micro)movements can be visualized using this noninvasive, marker-free method. Despite these optimistic results, the technology requires considerable further technical development and clinical testing.
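    The Eulerian idea can be caricatured on a single pixel's time series: band-pass the temporal signal and amplify it before adding it back. In this toy sketch the "band-pass" is just mean subtraction, and the amplification factor and data are invented; real EVM filters a spatial pyramid with a proper temporal band-pass.

```python
def magnify_motion(signal, alpha=10.0):
    """Amplify temporal deviations of a pixel value from its mean
    by a factor alpha (Eulerian-style magnification, crudest form)."""
    mean = sum(signal) / len(signal)
    return [mean + alpha * (v - mean) for v in signal]
```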

  1. The detectability half-life in predator-prey research: what it is, why we need it, how to measure it, and what it’s good for

    USDA-ARS?s Scientific Manuscript database

    Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. However, gut-content assays produce qualitative results, necessitating care in using them to infer the impact of predators on prey populations. In order for gut-content assays to ...

  2. Prevalence and counts of Salmonella spp. in minimally processed vegetables in São Paulo, Brazil.

    PubMed

    Sant'Ana, Anderson S; Landgraf, Mariza; Destro, Maria Teresa; Franco, Bernadette D G M

    2011-09-01

    Minimally processed vegetables (MPV) may be important vehicles of Salmonella spp. and cause disease. This study aimed at detecting and enumerating Salmonella spp. in MPV marketed in the city of São Paulo, Brazil. A total of 512 samples of MPV packages collected in retail stores were tested for Salmonella spp., with total coliforms and Escherichia coli as indicators of hygienic status. Salmonella spp. was detected in four samples, two using the detection method and two using the counting method, where the results were 8.8 × 10² CFU/g and 2.4 × 10² CFU/g. The serovars were Salmonella Typhimurium (three samples) and Salmonella enterica subsp. enterica O:47:z4,z23:- (one sample). Fourteen samples (2.7%) presented counts of E. coli above the maximum limit established by the Brazilian regulation for MPV (10² CFU/g). Therefore, tightened surveillance and effective intervention strategies are necessary in order to address consumers' and governments' concerns on the safety of MPV. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Prevalence and level of Listeria monocytogenes and other Listeria sp. in ready-to-eat minimally processed and refrigerated vegetables.

    PubMed

    Kovačević, Mira; Burazin, Jelena; Pavlović, Hrvoje; Kopjar, Mirela; Piližota, Vlasta

    2013-04-01

    Minimally processed and refrigerated vegetables can be contaminated with Listeria species, including Listeria monocytogenes, due to extensive handling during processing or by cross contamination from the processing environment. The objective of this study was to examine the microbiological quality of ready-to-eat minimally processed and refrigerated vegetables from supermarkets in Osijek, Croatia. 100 samples of ready-to-eat vegetables collected from different supermarkets in Osijek, Croatia, were analyzed for the presence of Listeria species and Listeria monocytogenes. The collected samples were cut iceberg lettuces (24 samples), other leafy vegetables (11 samples), delicatessen salads (23 samples), cabbage salads (19 samples), and salads from mixed (17 samples) and root vegetables (6 samples). Listeria species were found in 20 samples (20 %) and Listeria monocytogenes was detected in only 1 sample (1 %) of cut red cabbage (less than 100 CFU/g). According to Croatian and EU microbiological criteria these results are satisfactory. However, the presence of Listeria species and Listeria monocytogenes indicates poor hygienic quality. The study showed that these products are often improperly labeled, since 24 % of analyzed samples lacked information about shelf life, and 60 % lacked information about storage conditions. Given these facts, cold-chain interruption combined with use after the expiration date is a probable scenario. Therefore, the microbiological risk for consumers of ready-to-eat minimally processed and refrigerated vegetables is not completely eliminated.

  4. Information Assurance Technology Analysis Center Information Assurance Tools Report Intrusion Detection

    DTIC Science & Technology

    1998-01-01

    such as central processing unit (CPU) usage, disk input/output (I/O), memory usage, user activity, and number of logins attempted. The statistics… EMERALD: commercial anomaly detection and system monitoring, SRI (porras@csl.sri.com, www.csl.sri.com/emerald/index.html). Gabriel: commercial system… sensors, it starts to protect the network with minimal configuration and maximum intelligence. … EMERALD (Event Monitoring…

  5. Coherent-Phase Monitoring Of Cavitation In Turbomachines

    NASA Technical Reports Server (NTRS)

    Jong, Jen-Yi

    1996-01-01

    Digital electronic signal-processing system analyzes outputs of accelerometers mounted on turbomachine to detect vibrations characteristic of cavitation. Designed to overcome limitations imposed by interference from discrete components. System digitally implements technique called "coherent-phase wide-band demodulation" (CPWBD), using phase-only (PO) filtering along with envelope detection to search for unique coherent-phase relationship associated with cavitation and to minimize influence of large-amplitude discrete components.
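
The two ingredients named above, phase-only filtering and envelope detection, can be reduced to a short NumPy sketch. This is an illustration of the general technique, not the CPWBD flight code: flattening the spectral magnitude de-emphasizes large discrete tones before the envelope is examined for the coherent-phase cavitation signature:

```python
import numpy as np

def phase_only(x):
    """Phase-only (PO) filter: keep the spectral phase, flatten the
    magnitude, so large discrete tones no longer mask broadband bursts."""
    X = np.fft.fft(x)
    return np.fft.ifft(np.exp(1j * np.angle(X))).real

def envelope(x):
    """Envelope via the analytic signal (FFT-based Hilbert transform).
    Assumes even-length input."""
    N = len(x)
    h = np.zeros(N)
    h[0] = h[N // 2] = 1.0
    h[1:N // 2] = 2.0
    return np.abs(np.fft.ifft(np.fft.fft(x) * h))
```

By construction, `phase_only` output has a perfectly flat magnitude spectrum, which is exactly what suppresses the large-amplitude discrete components mentioned in the abstract.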

  6. Prevalence of Listeria monocytogenes in Retail Lightly Pickled Vegetables and Its Successful Control at Processing Plants.

    PubMed

    Taguchi, Masumi; Kanki, Masashi; Yamaguchi, Yuko; Inamura, Hideichi; Koganei, Yosuke; Sano, Tetsuya; Nakamura, Hiromi; Asakura, Hiroshi

    2017-03-01

    Incidences of food poisoning traced to nonanimal food products have been increasingly reported. One of these was a recent large outbreak of Shiga toxin-producing Escherichia coli (STEC) O157 infection from the consumption of lightly pickled vegetables, indicating the necessity of imposing hygienic controls during manufacturing. However, little is known about the bacterial contamination levels in these minimally processed vegetables. Here we examined the prevalence of STEC, Salmonella spp., and Listeria monocytogenes in 100 lightly pickled vegetable products manufactured at 55 processing factories. Simultaneously, we also performed quantitative measurements of representative indicator bacteria (total viable counts, coliform counts, and β-glucuronidase-producing E. coli counts). STEC and Salmonella spp. were not detected in any of the samples; L. monocytogenes was detected in 12 samples manufactured at five of the factories. Microbiological surveillance at two factories (two surveys at factory A and three surveys at factory B) between June 2014 and January 2015 determined that the areas predominantly contaminated with L. monocytogenes included the refrigerators and packaging rooms. Genotyping provided further evidence that the contaminants found in these areas were linked to those found in the final products. Taken together, we demonstrated the prevalence of L. monocytogenes in lightly pickled vegetables sold at the retail level. Microbiological surveillance at the manufacturing factories further clarified the sources of the contamination in the retail products. These data indicate the necessity of implementing adequate monitoring programs to minimize health risks attributable to the consumption of these minimally processed vegetables.

  7. Effects of nisin-incorporated films on the microbiological and physicochemical quality of minimally processed mangoes.

    PubMed

    Barbosa, Ana Andréa Teixeira; Silva de Araújo, Hyrla Grazielle; Matos, Patrícia Nogueira; Carnelossi, Marcelo Augusto Guitierrez; Almeida de Castro, Alessandra

    2013-06-17

    The aim of this study is to examine the effects of nisin-incorporated cellulose films on the physicochemical and microbiological qualities of minimally processed mangoes. The use of antimicrobial films did not affect the physicochemical characteristics of mangoes and showed antimicrobial activity against Staphylococcus aureus, Listeria monocytogenes, Alicyclobacillus acidoterrestris and Bacillus cereus. The mango slices were inoculated with S. aureus and L. monocytogenes (10(7)CFU/g), and the viable cell numbers remained at 10(5) and 10(6)CFU/g, respectively, after 12days. In samples packed with antimicrobial films, the viable number of L. monocytogenes cells was reduced below the detection level after 4days. After 6days, a reduction of six log units was observed for S. aureus. In conclusion, nisin showed antimicrobial activity in mangoes without interfering with the organoleptic characteristics of the fruit. This result suggests that nisin could potentially be used in active packing to improve the safety of minimally processed mangoes. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Conflict Detection Algorithm to Minimize Locking for MPI-IO Atomicity

    NASA Astrophysics Data System (ADS)

    Sehrish, Saba; Wang, Jun; Thakur, Rajeev

    Many scientific applications require high-performance concurrent I/O accesses to a file by multiple processes. Those applications rely indirectly on atomic I/O capabilities in order to perform updates to structured datasets, such as those stored in HDF5 format files. Current support for atomicity in MPI-IO is provided by locking around the operations, imposing lock overhead in all situations, even though in many cases these operations are non-overlapping in the file. We propose to isolate non-overlapping accesses from overlapping ones in independent I/O cases, allowing the non-overlapping ones to proceed without imposing lock overhead. To enable this, we have implemented an efficient conflict detection algorithm in MPI-IO using MPI file views and datatypes. We show that our conflict detection scheme incurs minimal overhead on I/O operations, making it an effective mechanism for avoiding locks when they are not needed.
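
A byte-range conflict check of the kind described can be illustrated with a sweep over sorted intervals. This is an abstracted sketch: real MPI-IO accesses are derived from MPI file views and datatypes, which are represented here simply as explicit (rank, offset, length) requests:

```python
def find_conflicts(accesses):
    """accesses: list of (rank, offset, length) byte-range requests.
    Returns the set of rank pairs whose ranges overlap; requests that
    appear in no returned pair can proceed without taking the lock."""
    intervals = sorted((off, off + length, rank)
                       for rank, off, length in accesses)
    conflicts, active = set(), []      # active: (end, rank) still open
    for start, end, rank in intervals:
        active = [(e, r) for e, r in active if e > start]  # drop closed ranges
        for _, r in active:
            conflicts.add(tuple(sorted((r, rank))))
        active.append((end, rank))
    return conflicts
```

With half-open ranges, two accesses that merely touch end-to-start are correctly treated as non-overlapping and skip the lock path.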

  9. Early print-tuned ERP response with minimal involvement of linguistic processing in Japanese Hiragana strings.

    PubMed

    Okumura, Yasuko; Kasai, Tetsuko; Murohashi, Harumitsu

    2014-04-16

    The act of reading leads to the development of specific neural responses for print, the most frequently reported of which is the left occipitotemporal N170 component of event-related potentials. However, it remains unclear whether this electrophysiological response solely involves print-tuned neural activities. The present study examined an early print-tuned event-related potential response with minimal involvement of linguistic processing in a nonalphabetic language. Japanese Hiragana words, nonwords, and alphanumeric symbol strings were presented rapidly and the task was to detect the change in color of a fixation cross to restrict linguistic processing. As a result, Hiragana words and nonwords elicited a larger posterior N1 than alphanumeric symbol strings bilaterally, irrespective of intercharacter spacing. The fact that this N1 was enhanced specifically for rapidly presented Hiragana strings suggests the existence of print-tuned neural processes that are relatively independent of the influence of linguistic processing.

  10. MicroSensors Systems: detection of a dismounted threat

    NASA Astrophysics Data System (ADS)

    Davis, Bill; Berglund, Victor; Falkofske, Dwight; Krantz, Brian

    2005-05-01

    The Micro Sensor System (MSS) is a layered sensor network with the goal of detecting dismounted threats approaching high value assets. A low power unattended ground sensor network is dependant on a network protocol for efficiency in order to minimize data transmissions after network establishment. The reduction of network 'chattiness' is a primary driver for minimizing power consumption and is a factor in establishing a low probability of detection and interception. The MSS has developed a unique protocol to meet these challenges. Unattended ground sensor systems are most likely dependant on batteries for power which due to size determines the ability of the sensor to be concealed after placement. To minimize power requirements, overcome size limitations, and maintain a low system cost the MSS utilizes advanced manufacturing processes know as Fluidic Self-Assembly and Chip Scale Packaging. The type of sensing element and the ability to sense various phenomenologies (particularly magnetic) at ranges greater than a few meters limits the effectiveness of a system. The MicroSensor System will overcome these limitations by deploying large numbers of low cost sensors, which is made possible by the advanced manufacturing process used in production of the sensors. The MSS program will provide unprecedented levels of real-time battlefield information which greatly enhances combat situational awareness when integrated with the existing Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) infrastructure. This system will provide an important boost to realizing the information dominant, network-centric objective of Joint Vision 2020.

  11. Value stream mapping of the Pap test processing procedure: a lean approach to improve quality and efficiency.

    PubMed

    Michael, Claire W; Naik, Kalyani; McVicker, Michael

    2013-05-01

    We developed a value stream map (VSM) of the Papanicolaou test procedure to identify opportunities to reduce waste and errors, created a new VSM, and implemented a new process emphasizing Lean tools. Preimplementation data revealed the following: (1) processing time (PT) for 1,140 samples averaged 54 hours; (2) 27 accessioning errors were detected on review of 357 random requisitions (7.6%); (3) 5 of the 20,060 tests had labeling errors that had gone undetected in the processing stage. Four were detected later during specimen processing but 1 reached the reporting stage. Postimplementation data were as follows: (1) PT for 1,355 samples averaged 31 hours; (2) 17 accessioning errors were detected on review of 385 random requisitions (4.4%); and (3) no labeling errors were undetected. Our results demonstrate that implementation of Lean methods, such as first-in first-out processes and minimizing batch size by staff actively participating in the improvement process, allows for higher quality, greater patient safety, and improved efficiency.

  12. USEPA PERSPECTIVE ON CONTROLLING PATHOGENS

    EPA Science Inventory

    EPA minimizes the risk of infectious diseases from the beneficial use of sludge by requiring its treatment to reduce pathogen levels below the detection limit. How new treatment processes can be shown equivalent to ones specified in 40CFR503 will be discussed together with ways t...

  13. Non-Intrusive Cable Tester

    NASA Technical Reports Server (NTRS)

    Medelius, Pedro J. (Inventor); Simpson, Howard J. (Inventor)

    1999-01-01

    A cable tester is described for low frequency testing of a cable for faults. The tester allows for testing a cable beyond a point where a signal conditioner is installed, minimizing the number of connections which have to be disconnected. A magnetic pickup coil is described for detecting a test signal injected into the cable. A narrow bandpass filter is described for increasing detection of the test signal. The bandpass filter reduces noise so that the high gain amplifier provided for detecting the test signal is not completely saturated by noise. To further increase the accuracy of the cable tester, processing gain is achieved by comparing the signal from the amplifier with at least one reference signal emulating the low frequency input signal injected into the cable. Different processing techniques are described for evaluating a detected signal.
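
The processing gain obtained by comparing the amplified pickup signal with a reference emulating the injected low-frequency test signal can be illustrated with a quadrature correlation sketch (an assumed implementation for illustration, not the patented circuit):

```python
import numpy as np

def reference_correlate(detected, f_test, fs):
    """Correlate the amplified pickup signal with quadrature reference
    sinusoids at the injected test frequency; averaging over many
    samples suppresses uncorrelated noise (processing gain)."""
    n = len(detected)
    t = np.arange(n) / fs
    i = np.dot(detected, np.cos(2 * np.pi * f_test * t)) * 2.0 / n
    q = np.dot(detected, np.sin(2 * np.pi * f_test * t)) * 2.0 / n
    return np.hypot(i, q)  # amplitude estimate, independent of phase
```

The in-phase/quadrature pair makes the estimate insensitive to the unknown phase delay along the cable, while the averaging length sets how far below the noise floor the test tone can be recovered.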

  14. Jitter model and signal processing techniques for pulse width modulation optical recording

    NASA Technical Reports Server (NTRS)

    Liu, Max M.-K.

    1991-01-01

    A jitter model and signal processing techniques are discussed for data recovery in Pulse Width Modulation (PWM) optical recording. In PWM, information is stored through modulating sizes of sequential marks alternating in magnetic polarization or in material structure. Jitter, defined as the deviation from the original mark size in the time domain, will result in error detection if it is excessively large. A new approach is taken in data recovery by first using a high speed counter clock to convert time marks to amplitude marks, and signal processing techniques are used to minimize jitter according to the jitter model. The signal processing techniques include motor speed and intersymbol interference equalization, differential and additive detection, and differential and additive modulation.
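
The first step described, using a high-speed counter clock to convert time marks into amplitude marks, can be sketched as follows (illustrative only; in a real drive this conversion happens in hardware):

```python
def marks_to_counts(edge_times_s, clock_hz):
    """Measure each mark/space duration with a high-speed counter
    clock, converting 'time marks' (seconds between transitions)
    into integer 'amplitude marks' that ordinary sampled-data
    filtering, e.g. for jitter equalization, can then operate on."""
    return [round((t1 - t0) * clock_hz)
            for t0, t1 in zip(edge_times_s, edge_times_s[1:])]
```
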

  15. Computed Tomography For Internal Inspection Of Castings

    NASA Technical Reports Server (NTRS)

    Hanna, Timothy L.

    1995-01-01

    Computed tomography used to detect internal flaws in metal castings before machining and otherwise processing them into finished parts. Saves time and money otherwise wasted on machining and other processing of castings eventually rejected because of internal defects. Knowledge of internal defects gained by use of computed tomography also provides guidance for changes in foundry techniques, procedures, and equipment to minimize defects and reduce costs.

  16. Removing external DNA contamination from arthropod predators destined for molecular gut-content analysis

    USDA-ARS?s Scientific Manuscript database

    Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. Field and laboratory experiments have demonstrated that mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, can lead to contamination of fed pred...


  18. Lidar-based door and stair detection from a mobile robot

    NASA Astrophysics Data System (ADS)

    Bansal, Mayank; Southall, Ben; Matei, Bogdan; Eledath, Jayan; Sawhney, Harpreet

    2010-04-01

    We present an on-the-move LIDAR-based object detection system for autonomous and semi-autonomous unmanned vehicle systems. In this paper we make several contributions: (i) we describe an algorithm for real-time detection of objects such as doors and stairs in indoor environments; (ii) we describe efficient data structures and algorithms for processing 3D point clouds acquired by laser scanners in a streaming manner, which minimize the memory copying and access. We show qualitative results demonstrating the effectiveness of our approach on runs in an indoor office environment.
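
One streaming-friendly point-cloud structure in the spirit of contribution (ii) is an incrementally updated voxel hash; the paper's actual data structures are not specified here, so this is a generic sketch of how per-scan updates can avoid memory copying:

```python
def stream_voxelize(grid, points, cell=0.25):
    """Fold one incoming LIDAR scan into a persistent voxel-hash
    occupancy count. Updates happen in place, so streaming scans
    require no reallocation or large memory copies."""
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        grid[key] = grid.get(key, 0) + 1
    return grid
```

Planar structures such as doors and stair risers then show up as characteristic patterns of occupied cells that can be tested in real time.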

  19. Quantification of Listeria monocytogenes in minimally processed leafy vegetables using a combined method based on enrichment and 16S rRNA real-time PCR.

    PubMed

    Aparecida de Oliveira, Maria; Abeid Ribeiro, Eliana Guimarães; Morato Bergamini, Alzira Maria; Pereira De Martinis, Elaine Cristina

    2010-02-01

    Modern lifestyle has markedly changed eating habits worldwide, with an increasing demand for ready-to-eat foods, such as minimally processed fruits and leafy greens. Packaging and storage conditions of those products may favor the growth of psychrotrophic bacteria, including the pathogen Listeria monocytogenes. In this work, minimally processed leafy vegetable samples (n = 162) from the retail market of Ribeirão Preto, São Paulo, Brazil, were tested for the presence or absence of Listeria spp. by the immunoassay Listeria Rapid Test (Oxoid). Two L. monocytogenes-positive and six artificially contaminated samples of minimally processed leafy vegetables were evaluated by the Most Probable Number (MPN) technique, with detection by the classical culture method and also by the culture method combined with real-time PCR (RTi-PCR) targeting 16S rRNA genes of L. monocytogenes. Positive MPN enrichment tubes were analyzed by RTi-PCR with primers specific for L. monocytogenes using the commercial preparation ABSOLUTE QPCR SYBR Green Mix (ABgene, UK). The real-time PCR assay presented good exclusivity and inclusivity results, and no statistically significant difference was found in comparison with the conventional culture method (p < 0.05). Moreover, RTi-PCR was fast and easy to perform, with MPN results obtained in ca. 48 h, in comparison to 7 days for the conventional method.
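
The MPN estimate used alongside the PCR detection can be computed from positive-tube counts by solving the standard maximum-likelihood MPN equation. A small bisection sketch (function name and interface are illustrative, not from the paper):

```python
import math

def mpn_per_g(tubes, lo=1e-9, hi=1e9, iters=100):
    """Maximum-likelihood Most Probable Number estimate (organisms/g).
    tubes: list of (n_tubes, n_positive, sample_mass_g) per dilution.
    Solves sum(x_i*v_i / (1 - exp(-m*v_i))) = sum(n_i*v_i) for m
    by bisection; the left side is strictly decreasing in m."""
    def f(m):
        return (sum(x * v / (1.0 - math.exp(-m * v)) for n, x, v in tubes)
                - sum(n * v for n, x, v in tubes))
    for _ in range(iters):
        mid = math.sqrt(lo * hi)          # bisect in log space
        lo, hi = (mid, hi) if f(mid) > 0 else (lo, mid)
    return math.sqrt(lo * hi)
```

For the degenerate all-positive or all-negative cases the equation has no root and published MPN tables report only a bound; the sketch assumes the usual mixed-outcome case.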

  20. Optical communication with two-photon coherent states. II - Photoemissive detection and structured receiver performance

    NASA Technical Reports Server (NTRS)

    Shapiro, J. H.; Yuen, H. P.; Machado Mata, J. A.

    1979-01-01

    In a previous paper (1978), the authors developed a method of analyzing the performance of two-photon coherent state (TCS) systems for free-space optical communications. General theorems permitting application of classical point process results to detection and estimation of signals in arbitrary quantum states were derived. The present paper examines the general problem of photoemissive detection statistics. On the basis of the photocounting theory of Kelley and Kleiner (1964) it is shown that for arbitrary pure state illumination, the resulting photocurrent is in general a self-exciting point process. The photocount statistics for first-order coherent fields reduce to those of a special class of Markov birth processes, which the authors term single-mode birth processes. These general results are applied to the structure of TCS radiation, and it is shown that the use of TCS radiation with direct or heterodyne detection results in minimal performance increments over comparable coherent-state systems. However, significant performance advantages are offered by use of TCS radiation with homodyne detection. The abstract quantum descriptions of homodyne and heterodyne detection are derived and a synthesis procedure for obtaining quantum measurements described by arbitrary TCS is given.

  1. Development of a qPCR direct detection method for Listeria monocytogenes in milk

    USDA-ARS?s Scientific Manuscript database

    There is a growing movement among consumers in the US and Europe towards minimally processed foods, including raw milk and dairy products. This trend significantly increases exposure to dairy-borne pathogens and indicates a need for rapid, sensitive screening tests for raw dairy products to reduce ...

  2. Variable threshold method for ECG R-peak detection.

    PubMed

    Kew, Hsein-Ping; Jeong, Do-Un

    2011-10-01

    In this paper, a wearable belt-type ECG electrode worn around the chest for real-time ECG measurement is presented, designed to minimize the inconvenience of wearing. The ECG signal is detected using a biopotential measurement system and transmitted to a personal computer via an ultra-low-power wireless data communication unit built on a Zigbee-compatible wireless sensor node. ECG signals carry a great deal of clinical information for the cardiologist, and R-peak detection is especially important. R-peak detection generally uses a fixed threshold value, which leads to detection errors when the baseline shifts due to motion artifacts or when the signal amplitude changes. A preprocessing stage comprising differentiation and the Hilbert transform is used as the signal preprocessing algorithm. Thereafter, a variable threshold method, which is more accurate and efficient than a fixed-threshold method, is used to detect the R-peak. R-peak detection on the MIT-BIH databases and on long-term real-time ECG recordings is performed in this research to evaluate performance.
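
The variable-threshold idea can be sketched as follows. This is a simplified illustration using a rectified derivative in place of the paper's full differentiation-plus-Hilbert-transform preprocessing, with assumed window, fraction, and refractory parameters:

```python
import numpy as np

def detect_r_peaks(ecg, fs, win_s=2.0, k=0.6):
    """Variable-threshold R-peak detector sketch: differentiate and
    rectify, then compare each local maximum against a threshold
    derived from the recent signal (k * local max) rather than one
    fixed value, so baseline wander and amplitude changes move the
    threshold with them."""
    feat = np.abs(np.diff(ecg, prepend=ecg[0]))
    win = int(win_s * fs)
    refractory = int(0.25 * fs)           # ignore re-triggers within 250 ms
    peaks = []
    for i in range(1, len(feat) - 1):
        if feat[i] > feat[i - 1] and feat[i] >= feat[i + 1]:
            thresh = k * feat[max(0, i - win):i + 1].max()
            if feat[i] >= thresh:
                if not peaks or i - peaks[-1] > refractory:
                    peaks.append(i)
    return peaks
```

Because the threshold tracks the local maximum of the feature signal, a small artifact that would cross a low fixed threshold is rejected whenever a genuine R peak dominates the surrounding window.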

  3. Development of suspect and non-target screening methods for detection of organic contaminants in highway runoff and fish tissue with high-resolution time-of-flight mass spectrometry.

    PubMed

    Du, Bowen; Lofton, Jonathan M; Peter, Katherine T; Gipe, Alexander D; James, C Andrew; McIntyre, Jenifer K; Scholz, Nathaniel L; Baker, Joel E; Kolodziej, Edward P

    2017-09-20

    Untreated urban stormwater runoff contributes to poor water quality in receiving waters. The ability to identify toxicants and other bioactive molecules responsible for observed adverse effects in a complex mixture of contaminants is critical to effective protection of ecosystem and human health, yet this is a challenging analytical task. The objective of this study was to develop analytical methods using liquid chromatography coupled to high-resolution quadrupole time-of-flight mass spectrometry (LC-QTOF-MS) to detect organic contaminants in highway runoff and in runoff-exposed fish (adult coho salmon, Oncorhynchus kisutch). Processing of paired water and tissue samples facilitated contaminant prioritization and aided investigation of chemical bioavailability and uptake processes. Simple solid phase extraction (SPE) and elution procedures requiring minimal processing effort were optimized for water samples, and selective pressurized liquid extraction (SPLE) procedures were optimized for fish tissues. Extraction methods were compared by detection of non-target features and target compounds (e.g., quantity and peak area), while minimizing matrix interferences. Suspect screening techniques utilized in-house and commercial databases to prioritize high-risk detections for subsequent MS/MS characterization and identification efforts. Presumptive annotations were also screened with an in-house linear regression (log Kow vs. retention time) to exclude isobaric compounds. Examples of confirmed identifications (via reference standard comparison) in highway runoff include ethoprophos, prometon, DEET, caffeine, cotinine, 4(or 5)-methyl-1H-methylbenzotriazole, and acetanilide. Acetanilide was also detected in runoff-exposed fish gill and liver samples. Further characterization of highway runoff and fish tissues (14 and 19 compounds, respectively, tentatively identified from MS/MS data) suggests that many novel or poorly characterized organic contaminants exist in urban stormwater runoff and exposed biota.
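
A log Kow vs. retention time regression screen of the kind described can be sketched as a linear fit on reference standards followed by a tolerance test on each presumptive annotation. The tolerance value and interface are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

def rt_screen(standards, candidates, tol_min=2.0):
    """Fit retention time as a linear function of log Kow on reference
    standards, then keep only candidate annotations whose observed RT
    falls within +/- tol_min of the predicted RT. Annotations that
    fail are likely isobaric compounds with the same exact mass but
    inconsistent chromatographic behavior."""
    logkow = np.array([s[0] for s in standards])
    rt = np.array([s[1] for s in standards])
    slope, intercept = np.polyfit(logkow, rt, 1)
    keep = []
    for name, kow, rt_obs in candidates:
        if abs((slope * kow + intercept) - rt_obs) <= tol_min:
            keep.append(name)
    return keep
```
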

  4. [The comparison of results of detection of minimal residual disease in peripheral blood and bone marrow in children of the first year of life with acute lymphoblastic leukemia].

    PubMed

    Tsaur, G A; Riger, T O; Popov, A M; Nasedkina, T V; Kustanovich, A M; Solodovnikov, A G; Streneva, O V; Shorikov, E V; Tsvirenko, S V; Saveliev, L I; Fechina, L G

    2015-04-01

    The presence of minimal residual disease is an important prognostic factor in acute lymphoblastic leukemia in children and adults. In the overwhelming majority of studies, bone marrow is used to detect minimal residual disease. We carried out a comparative evaluation of minimal residual disease detection in peripheral blood and bone marrow, and assessed the prognostic role of minimal residual disease in each compartment under therapy according to the MLL-Baby protocol. The analysis covered 142 paired samples from 53 patients younger than 365 days with acute lymphoblastic leukemia and various rearrangements of the MLL gene. Minimal residual disease was detected by identification of chimeric transcripts using real-time polymerase chain reaction at 7 sequential time points established by the therapy protocol. Concordance between qualitative detection of minimal residual disease in bone marrow and peripheral blood was 84.5%; in all 22 (15.5%) discordant samples, minimal residual disease was detected only in bone marrow. Despite this high concordance, the presence of minimal residual disease in peripheral blood at various stages of therapy showed no independent prognostic significance. The observed differences were not related to the sensitivity of the method, as determined by the absolute expression of the ABL gene; most likely, they reflect the real distribution of tumor cells. The results demonstrate that substituting peripheral blood for bone marrow in monitoring minimal residual disease in infants with acute lymphoblastic leukemia is inappropriate. At the same time, persistence of minimal residual disease at point TH4 in bone marrow was an independent unfavorable prognostic factor under therapy of infant acute lymphoblastic leukemia according to the MLL-Baby protocol (OR = 7.326, confidence interval 2.378-22.565).

  5. Digital pulse processing in Mössbauer spectroscopy

    NASA Astrophysics Data System (ADS)

    Veiga, A.; Grunfeld, C. M.

    2014-04-01

    In this work we present some advances towards full digitization of the detection subsystem of a Mössbauer transmission spectrometer. We show how, using adequate instrumentation, the preamplifier output of a proportional counter can be digitized with no deterioration in spectrum quality, avoiding the need for a shaping amplifier. A pipelined architecture is proposed for a digital processor, which constitutes a versatile platform for the development of pulse processing techniques. Requirements for minimization of the analog processing are considered and experimental results are presented.
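
A common digital building block that replaces the analog shaping amplifier, in the spirit of the digitization goal above (the paper's specific pipelined processor is not reproduced here), is a trapezoidal shaper:

```python
import numpy as np

def trapezoidal_shaper(v, rise=8, flat=4):
    """Digital replacement for the analog shaping amplifier: convolve
    the digitized preamp output (ideally a step per detected pulse)
    with a +1/0/-1 trapezoid kernel. A unit step yields a flat-top
    pulse whose height equals the step amplitude, giving the pulse
    height directly for spectrum accumulation."""
    kernel = np.concatenate([np.ones(rise), np.zeros(flat), -np.ones(rise)])
    return np.convolve(v, kernel)[:len(v)] / rise
```

The flat-top width absorbs the finite charge-collection time of the proportional counter, and in an FPGA the same filter reduces to a few adders per sample, matching the pipelined architecture the abstract proposes.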

  6. A Technical Survey on Optimization of Processing Geo Distributed Data

    NASA Astrophysics Data System (ADS)

    Naga Malleswari, T. Y. J.; Ushasukhanya, S.; Nithyakalyani, A.; Girija, S.

    2018-04-01

    With the growth of cloud services and technology, a growing number of geographically distributed data centers store large amounts of data. Analysis of geo-distributed data is required by various services for data processing, storage of essential information, and similar tasks; processing this geo-distributed data and performing analytics on it is a challenging task. Distributed data processing is accompanied by issues in storage, computation, and communication; the key issues to be dealt with are time efficiency, cost minimization, and utility maximization. This paper describes various optimization methods, such as end-to-end multiphase and G-MR, that use techniques like MapReduce, CDS (Community Detection based Scheduling), ROUT, Workload-Aware Scheduling, SAGE, and AMP (Ant Colony Optimization) to handle these issues. The various optimization methods and techniques used are analyzed. It has been observed that end-to-end multiphase achieves time efficiency; cost minimization concentrates on achieving Quality of Service and reducing computation and communication costs; and SAGE achieves performance improvement in processing geo-distributed data sets.

  7. Prospects and fundamental limitations of room temperature, non-avalanche, semiconductor photon-counting sensors (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ma, Jiaju; Zhang, Yang; Wang, Xiaoxin; Ying, Lei; Masoodian, Saleh; Wang, Zhiyuan; Starkey, Dakota A.; Deng, Wei; Kumar, Rahul; Wu, Yang; Ghetmiri, Seyed Amir; Yu, Zongfu; Yu, Shui-Qing; Salamo, Gregory J.; Fossum, Eric R.; Liu, Jifeng

    2017-05-01

    This research investigates the fundamental limits and trade-space of quantum semiconductor photodetectors using the Schrödinger equation and the laws of thermodynamics. We envision that, to optimize the metrics of single photon detection, it is critical to maximize the optical absorption in the minimal volume and minimize the carrier transit process simultaneously. Integration of photon management with quantum charge transport/redistribution upon optical excitation can be engineered to maximize the quantum efficiency (QE) and data rate and minimize timing jitter at the same time. Due to the ultra-low capacitance of these quantum devices, even a single photoelectron transfer can induce a notable change in the voltage, enabling non-avalanche single photon detection at room temperature as has been recently demonstrated in Si quanta image sensors (QIS). In this research, uniform III-V quantum dots (QDs) and Si QIS are used as model systems to test the theory experimentally. Based on the fundamental understanding, we also propose proof-of-concept, photon-managed quantum capacitance photodetectors. Built upon the concepts of QIS and single electron transistor (SET), this novel device structure provides a model system to synergistically test the fundamental limits and tradespace predicted by the theory for semiconductor detectors. This project is sponsored under DARPA/ARO's DETECT Program: Fundamental Limits of Quantum Semiconductor Photodetectors.

  8. Human pathogens in plant biofilms: Formation, physiology, and detection

    USDA-ARS?s Scientific Manuscript database

    Fresh produce, viewed as an essential part of a healthy lifestyle, is usually consumed in the form of raw or minimally processed fruits and vegetables, and is a potentially important source of food-borne human pathogenic bacteria and viruses. These are passed on to the consumer since the bacteria ca...

  9. Discrete-State and Continuous Models of Recognition Memory: Testing Core Properties under Minimal Assumptions

    ERIC Educational Resources Information Center

    Kellen, David; Klauer, Karl Christoph

    2014-01-01

    A classic discussion in the recognition-memory literature concerns the question of whether recognition judgments are better described by continuous or discrete processes. These two hypotheses are instantiated by the signal detection theory model (SDT) and the 2-high-threshold model, respectively. Their comparison has almost invariably relied on…

  10. Unnecessary roughness? Testing the hypothesis that predators destined for molecular gut-content analysis must be hand-collected to avoid cross-contamination

    USDA-ARS?s Scientific Manuscript database

    Molecular gut-content analysis enables direct detection of arthropod predation with minimal disruption of on-going ecosystem processes. Mass-collection methods, such as sweep-netting, vacuum sampling, and foliage beating, could lead to regurgitation or even rupturing of predators along with uneaten ...

  11. The Chandra Source Catalog: Processing and Infrastructure

    NASA Astrophysics Data System (ADS)

    Evans, Janet; Evans, Ian N.; Glotfelty, Kenny J.; Hain, Roger; Hall, Diane M.; Miller, Joseph B.; Plummer, David A.; Zografou, Panagoula; Primini, Francis A.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.

    2009-09-01

    Chandra Source Catalog processing recalibrates each observation using the latest available calibration data, and employs a wavelet-based source detection algorithm to identify all the X-ray sources in the field of view. Source properties are then extracted from each detected source that is a candidate for inclusion in the catalog. Catalog processing is completed by matching sources across multiple observations, merging common detections, and applying quality assurance checks. The Chandra Source Catalog processing system shares a common processing infrastructure and utilizes much of the functionality that is built into the Standard Data Processing (SDP) pipeline system that provides calibrated Chandra data to end-users. Other key components of the catalog processing system have been assembled from the portable CIAO data analysis package. Minimal new software tool development has been required to support the science algorithms needed for catalog production. Since processing pipelines must be instantiated for each detected source, the number of pipelines that are run during catalog construction is a factor of order 100 times larger than for SDP. The increased computational load, and inherent parallel nature of the processing, is handled by distributing the workload across a multi-node Beowulf cluster. Modifications to the SDP automated processing application to support catalog processing, and extensions to Chandra Data Archive software to ingest and retrieve catalog products, complete the upgrades to the infrastructure to support catalog processing.

  12. Automatic detection of snow avalanches in continuous seismic data using hidden Markov models

    NASA Astrophysics Data System (ADS)

    Heck, Matthias; Hammer, Conny; van Herwijnen, Alec; Schweizer, Jürg; Fäh, Donat

    2018-01-01

Snow avalanches, like many other mass movements, generate seismic signals. Detection of avalanches by seismic monitoring is highly relevant to assess avalanche danger. In contrast to other seismic events, signals generated by avalanches do not have a characteristic first arrival, nor is it possible to detect different wave phases. In addition, the moving-source character of avalanches increases the intricacy of the signals. Although it is possible to visually detect seismic signals produced by avalanches, reliable automatic detection methods for all types of avalanches do not exist yet. We therefore evaluate whether hidden Markov models (HMMs) are suitable for the automatic detection of avalanches in continuous seismic data. We analyzed data recorded during the winter season 2010 by a seismic array deployed in an avalanche starting zone above Davos, Switzerland. We re-evaluated a reference catalogue containing 385 events by grouping the events into seven probability classes. Since most of the data consist of noise, we first applied a simple amplitude threshold to reduce the amount of data. As the first classification results were unsatisfactory, we analyzed the temporal behavior of the seismic signals for the whole data set and found a high variability in the seismic signals. We therefore applied further post-processing steps to reduce the number of false alarms by defining a minimal duration for the detected event, implementing a voting-based approach and analyzing the coherence of the detected events. We obtained the best classification results for events detected by at least five sensors and with a minimal duration of 12 s. These processing steps allowed us to identify two periods of high avalanche activity, suggesting that HMMs are suitable for the automatic detection of avalanches in seismic data. However, our results also showed that more sensitive sensors and more appropriate sensor locations are needed to improve the signal-to-noise ratio and therefore the classification.
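The post-processing criteria reported in this abstract (at least five sensors, minimal duration of 12 s) amount to a simple filter over candidate detections. A minimal sketch, assuming a hypothetical detection record format with start/end times and the set of detecting sensors (the format is not specified in the abstract):

```python
# Hypothetical filter implementing the minimum-duration and multi-sensor
# voting criteria from the abstract. Each candidate event is assumed to be
# a dict with 'start'/'end' times (seconds) and a set of sensor ids.
MIN_SENSORS = 5       # event must be detected by at least five sensors
MIN_DURATION_S = 12.0 # and last at least 12 seconds

def keep_event(event: dict) -> bool:
    """True if the candidate passes both post-processing criteria."""
    duration = event["end"] - event["start"]
    return duration >= MIN_DURATION_S and len(event["sensors"]) >= MIN_SENSORS

candidates = [
    {"start": 0.0,  "end": 20.0, "sensors": {1, 2, 3, 4, 5, 6}},  # passes both
    {"start": 30.0, "end": 35.0, "sensors": {1, 2, 3, 4, 5}},     # too short
    {"start": 50.0, "end": 70.0, "sensors": {1, 2}},              # too few sensors
]
kept = [e for e in candidates if keep_event(e)]
```

Only the first candidate survives, mirroring how the two thresholds jointly suppress false alarms.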

  13. Adaptive near-field beamforming techniques for sound source imaging.

    PubMed

    Cho, Yong Thung; Roan, Michael J

    2009-02-01

Phased array signal processing techniques such as beamforming have a long history in applications such as sonar for detection and localization of far-field sound sources. Two sometimes competing challenges arise in any type of spatial processing; these are to minimize contributions from directions other than the look direction and to minimize the width of the main lobe. To tackle this problem a large body of work has been devoted to the development of adaptive procedures that attempt to minimize side lobe contributions to the spatial processor output. In this paper, two adaptive beamforming procedures, minimum variance distortionless response (MVDR) and weight optimization to minimize maximum side lobes, are modified for use in source visualization applications to estimate beamforming pressure and intensity using near-field pressure measurements. These adaptive techniques are compared to a fixed near-field focusing technique (both techniques use near-field beamforming weightings focusing at source locations estimated based on spherical wave array manifold vectors with spatial windows). Sound source resolution accuracies of near-field imaging procedures with different weighting strategies are compared using numerical simulations both in anechoic and reverberant environments with random measurement noise. Also, experimental results are given for near-field sound pressure measurements of an enclosed loudspeaker.
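The classical MVDR weights mentioned above are w = R⁻¹a / (aᴴR⁻¹a): they minimize the output power wᴴRw subject to a distortionless response wᴴa = 1 in the look direction. A minimal numpy sketch (the array geometry, snapshot model, and diagonal loading are illustrative assumptions, not the paper's near-field setup):

```python
import numpy as np

rng = np.random.default_rng(0)
n_mics = 8

# Steering (array manifold) vector for an assumed uniform linear array with
# half-wavelength spacing, look direction 20 degrees off broadside.
theta = np.deg2rad(20.0)
a = np.exp(1j * np.pi * np.arange(n_mics) * np.sin(theta))

# Sample covariance from assumed noisy snapshots; small diagonal loading
# keeps R well-conditioned for inversion.
snap = rng.standard_normal((n_mics, 200)) + 1j * rng.standard_normal((n_mics, 200))
R = snap @ snap.conj().T / 200 + 1e-3 * np.eye(n_mics)

# MVDR weights: minimize w^H R w subject to w^H a = 1.
Ria = np.linalg.solve(R, a)
w = Ria / (a.conj() @ Ria)
```

The distortionless constraint can be verified numerically: w.conj() @ a equals 1 to machine precision, while energy from other directions is adaptively suppressed.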

  14. Video-processing-based system for automated pedestrian data collection and analysis when crossing the street

    NASA Astrophysics Data System (ADS)

    Mansouri, Nabila; Watelain, Eric; Ben Jemaa, Yousra; Motamed, Cina

    2018-03-01

Computer-vision techniques for pedestrian detection and tracking have progressed considerably and become widely used in several applications. However, a quick glance at the literature shows a minimal use of these techniques in pedestrian behavior and safety analysis, which might be due to the technical complexities facing the processing of pedestrian videos. To extract pedestrian trajectories from a video automatically, all road users must be detected and tracked during sequences, which is a challenging task, especially in a congested open-outdoor urban space. A multipedestrian tracker based on an interframe-detection-association process was proposed and evaluated. The tracker results are used to implement an automatic tool, based on video processing, for pedestrian data collection during street crossing. Variations in the instantaneous speed allowed the detection of the street-crossing phases (approach, waiting, and crossing). These were addressed for the first time in pedestrian road safety analysis to illustrate the causal relationship between pedestrian behaviors in the different phases. A comparison with a manual data collection method, by computing the root mean square error and the Pearson correlation coefficient, confirmed that the proposed procedures have significant potential to automate the data collection process.
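The two validation metrics named in the abstract, root mean square error and the Pearson correlation coefficient, are standard and easy to compute directly; a self-contained sketch (the speed values are invented for illustration, not the paper's data):

```python
import math

def rmse(xs, ys):
    """Root mean square error between paired measurements."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(xs, ys)) / len(xs))

def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired measurements."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

manual = [1.20, 1.40, 0.90, 1.10, 1.30]  # hypothetical manual speeds (m/s)
auto   = [1.25, 1.35, 0.95, 1.05, 1.30]  # hypothetical tracker speeds (m/s)
error, corr = rmse(manual, auto), pearson_r(manual, auto)
```

A low RMSE together with a correlation near 1 is the pattern that would support the paper's conclusion that the automatic tool matches manual collection.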

  15. Proactive detection of bones in poultry processing

    NASA Astrophysics Data System (ADS)

    Daley, W. D. R.; Stewart, John

    2009-05-01

Bones continue to be a problem of concern for the poultry industry. Most further-processed products begin with the requirement for raw material with minimal bones. The current process for generating deboned product requires systems for monitoring and inspecting the output product. Current detection systems rely either on people palpating the product or on X-ray systems. The performance of these inspection techniques is below the desired accuracy levels, and they are costly. We propose a technique for monitoring bones that conducts the inspection during the deboning process, so that there is enough time to take action to reduce the probability that bones end up in the final product. This is accomplished by developing active cones with built-in illumination to backlight the cage (skeleton) on the deboning line. If the bones of interest are still on the cage, then the bones are not in the associated meat. This approach also allows process control to be practiced on the deboning operation to keep the process under control, as opposed to the current system where detection is done post-production and does not easily present the opportunity to adjust the process. The proposed approach shows overall accuracies of about 94% for the detection of clavicle bones.

  16. Practicing safe cell culture: applied process designs for minimizing virus contamination risk.

    PubMed

    Kiss, Robert D

    2011-01-01

    CONFERENCE PROCEEDING Proceedings of the PDA/FDA Adventitious Viruses in Biologics: Detection and Mitigation Strategies Workshop in Bethesda, MD, USA; December 1-3, 2010 Guest Editors: Arifa Khan (Bethesda, MD), Patricia Hughes (Bethesda, MD) and Michael Wiebe (San Francisco, CA) Genentech responded to a virus contamination in its biologics manufacturing facility by developing and implementing a series of barriers specifically designed to prevent recurrence of this significant and impactful event. The barriers included steps to inactivate or remove potential virus particles from the many raw materials used in cell culture processing. Additionally, analytical testing barriers provided protection of the downstream processing areas should a culture contamination occur, and robust virus clearance capability provided further assurance of virus safety should a low level contamination go undetected. This conference proceeding will review Genentech's approach, and lessons learned, in minimizing virus contamination risk in cell culture processes through multiple layers of targeted barriers designed to deliver biologics products with high success rates.

  17. A Fast, Reliable, and Sensitive Method for Detection and Quantification of Listeria monocytogenes and Escherichia coli O157:H7 in Ready-to-Eat Fresh-Cut Products by MPN-qPCR

    PubMed Central

    Russo, Pasquale; Botticella, Giuseppe; Capozzi, Vittorio; Massa, Salvatore; Spano, Giuseppe; Beneduce, Luciano

    2014-01-01

In the present work we developed an MPN quantitative real-time PCR (MPN-qPCR) method for fast and reliable detection and quantification of Listeria monocytogenes and Escherichia coli O157:H7 in minimally processed vegetables. In order to validate the proposed technique, the results were compared with conventional MPN followed by phenotypic and biochemical assays. When L. monocytogenes and E. coli O157:H7 were artificially inoculated in fresh-cut vegetables, a concentration as low as 1 CFU g⁻¹ could be detected in 48 hours for both pathogens. qPCR alone allowed a limit of detection of 10¹ CFU g⁻¹ after 2 hours of enrichment for L. monocytogenes and E. coli O157:H7. Since minimally processed ready-to-eat vegetables are characterized by a very short shelf life, our method can substantially reduce the time required for microbial analysis, allowing better management of quality control. Moreover, the occurrence of both pathogenic bacteria in mixed salad samples and fresh-cut melons was monitored in two production plants from receipt of the raw materials to the early stages of shelf life. No sample was found to be contaminated by L. monocytogenes. One sample of raw mixed salad was found positive for an H7 enterohemorrhagic serotype. PMID:24949460

  18. The perception of minimal structures: performance on open and closed versions of visually presented Euclidean travelling salesperson problems.

    PubMed

    Vickers, Douglas; Bovet, Pierre; Lee, Michael D; Hughes, Peter

    2003-01-01

    The planar Euclidean version of the travelling salesperson problem (TSP) requires finding a tour of minimal length through a two-dimensional set of nodes. Despite the computational intractability of the TSP, people can produce rapid, near-optimal solutions to visually presented versions of such problems. To explain this, MacGregor et al (1999, Perception 28 1417-1428) have suggested that people use a global-to-local process, based on a perceptual tendency to organise stimuli into convex figures. We review the evidence for this idea and propose an alternative, local-to-global hypothesis, based on the detection of least distances between the nodes in an array. We present the results of an experiment in which we examined the relationships between three objective measures and performance measures of optimality and response uncertainty in tasks requiring participants to construct a closed tour or an open path. The data are not well accounted for by a process based on the convex hull. In contrast, results are generally consistent with a locally focused process based initially on the detection of nearest-neighbour clusters. Individual differences are interpreted in terms of a hierarchical process of constructing solutions, and the findings are related to a more general analysis of the role of nearest neighbours in the perception of structure and motion.
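The local-to-global hypothesis above builds tours from nearest-neighbour links. The classic nearest-neighbour construction it evokes (this is the textbook greedy heuristic, not the authors' perceptual model) can be sketched as:

```python
import math

def nearest_neighbour_tour(points):
    """Greedy open-path heuristic: start at node 0, repeatedly hop to the
    closest unvisited node. Fast and near-optimal on many instances, but
    not guaranteed optimal."""
    unvisited = set(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

pts = [(0, 0), (5, 0), (1, 0), (1, 1)]
tour = nearest_neighbour_tour(pts)  # visits the two clustered nodes first
```

On this toy instance the heuristic first exhausts the tight cluster near the origin before jumping to the far node, the kind of cluster-first behaviour the local-to-global account attributes to human solvers.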

  19. Batch-processed semiconductor gas sensor array for the selective detection of NOx in automotive exhaust gas

    NASA Astrophysics Data System (ADS)

    Jang, Hani; Kim, Minki; Kim, Yongjun

    2016-12-01

This paper reports on a semiconductor gas sensor array to detect nitrogen oxides (NOx) in automotive exhaust gas. The proposed semiconductor gas sensor array consisted of one common electrode and three individual electrodes to minimize the size of the sensor array, and three sensing layers [TiO2 + SnO2 (15 wt%), SnO2, and Ga2O3] were deposited using screen printing. In addition, the sensing materials were sintered under the same conditions in order to take advantage of batch processing. The sensing properties of the proposed sensor array were verified by experimental measurements, and the selectivity was improved by using pattern recognition.

  20. Effect of different film packaging on microbial growth in minimally processed cactus pear (Opuntia ficus-indica).

    PubMed

    Palma, A; Mangia, N P; Fadda, A; Barberis, A; Schirra, M; D'Aquino, S

    2013-01-01

Microorganisms are natural contaminants of fresh produce and minimally processed products, and contamination arises from a number of sources, including the environment, postharvest handling and processing. Fresh-cut products are particularly susceptible to microbial contamination because of the changes occurring in the tissues during processing. The in-package gas composition of modified atmosphere packaging (MAP), in combination with low storage temperatures, reduces the physiological activity of packaged produce and can also delay pathogen growth. The present study investigated the effect of MAPs, achieved with different plastic films, on microbial growth of minimally processed cactus pear (Opuntia ficus-indica) fruit. Five different plastic materials were used for packaging the manually peeled fruit: a) polypropylene film (Termoplast MY, 40 micron thickness, O2 transmission rate 300 cc/m2/24h); b) polyethylene film (Bolphane BHE, 11 micron thickness, O2 transmission rate 19000 cc/m2/24h); c) polypropylene laser-perforated films (Mach Packaging) with 8, 16 or 32 100-micron holes. Total aerobic psychrophilic and mesophilic microorganisms, Enterobacteriaceae, yeast and mould populations and in-package CO2, O2 and C2H4 were determined at each storage time. Different final gas compositions, ranging from 7.8 kPa to 17.1 kPa O2 and from 12.7 kPa to 2.6 kPa CO2, were achieved with the MY and micro-perforated films, respectively. Differences were detected in the mesophilic, Enterobacteriaceae and yeast loads, while no difference was detected in psychrophilic microorganisms. At the end of storage, the microbial load in fruits sealed with MY film was significantly lower than in those sealed with BHE and micro-perforated films. Furthermore, fruits packed with micro-perforated films showed the highest microbial load. This may in part be related to in-package gas composition and in part to continuous contamination by microorganisms through the micro-holes.

  1. Fusion of Scores in a Detection Context Based on Alpha Integration.

    PubMed

    Soriano, Antonio; Vergara, Luis; Ahmed, Bouziane; Salazar, Addisson

    2015-09-01

We present a new method for fusing scores corresponding to different detectors (two-hypotheses case). It is based on alpha integration, which we have adapted to the detection context. Three optimization methods are presented: least mean square error, maximization of the area under the ROC curve, and minimization of the probability of error. Gradient algorithms are proposed for the three methods. Different experiments with simulated and real data are included. Simulated data consider the two-detector case to illustrate the factors influencing alpha integration and to demonstrate the improvements obtained by score fusion with respect to individual detector performance. Two real data cases have been considered. In the first, multimodal biometric data have been processed. This case is representative of scenarios in which the probability of detection is to be maximized for a given probability of false alarm. The second case is the automatic analysis of electroencephalogram and electrocardiogram records with the aim of reproducing the medical expert's detections of arousal during sleep. This case is representative of scenarios in which the probability of error is to be minimized. The general superior performance of alpha integration verifies the interest of optimizing the fusing parameters.
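Alpha integration, in Amari's standard formulation, combines positive scores through f_α(x) = x^((1−α)/2) for α ≠ 1 (and log x for α = 1), fusing as f_α⁻¹ of the weighted sum. A sketch under the assumption of that standard form (the paper's detection-adapted version additionally learns the weights and α by gradient descent, which is not reproduced here):

```python
import math

def alpha_integrate(scores, weights, alpha):
    """Standard alpha-integration of positive scores (Amari's f_alpha).
    alpha = -1 reduces to the weighted arithmetic mean; alpha = 1 is the
    limiting weighted geometric mean; alpha = 3 is the weighted harmonic mean."""
    if alpha == 1.0:
        return math.exp(sum(w * math.log(s) for s, w in zip(scores, weights)))
    p = (1.0 - alpha) / 2.0
    return sum(w * s ** p for s, w in zip(scores, weights)) ** (1.0 / p)

# Two detector scores for the same observation, equal weights.
fused_mean = alpha_integrate([0.2, 0.8], [0.5, 0.5], alpha=-1.0)  # arithmetic mean
fused_geo  = alpha_integrate([4.0, 1.0], [0.5, 0.5], alpha=1.0)   # geometric mean
```

Sweeping α interpolates between conservative (min-like) and permissive (max-like) fusion, which is exactly the degree of freedom the three optimization criteria in the abstract tune.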

  2. The distributed neural system for top-down letter processing: an fMRI study

    NASA Astrophysics Data System (ADS)

    Liu, Jiangang; Feng, Lu; Li, Ling; Tian, Jie

    2011-03-01

This fMRI study used psychophysiological interaction (PPI) to investigate top-down letter processing with an illusory letter detection task. After initial training that became increasingly difficult, participants were instructed to detect a letter in pure noise images where there was actually no letter. This experimental paradigm allowed for isolating the top-down components of letter processing and minimizing the influence of bottom-up perceptual input. A distributed cortical network for top-down letter processing was identified by analyzing the functional connectivity patterns of the letter-preferential area (LA) within the left fusiform gyrus. This network extends from the visual cortex to high-level cognitive cortices, including the left middle frontal gyrus, left medial frontal gyrus, left superior parietal gyrus, bilateral precuneus, and left inferior occipital gyrus. These findings suggest that top-down letter processing engages not only regions for processing letter phonology and appearance, but also those involved in internal information generation and maintenance, attention, and memory processing.

  3. Lidar Luminance Quantizer

    NASA Technical Reports Server (NTRS)

    Quilligan, Gerard; DeMonthier, Jeffrey; Suarez, George

    2011-01-01

This innovation addresses challenges in lidar imaging, particularly with the detection scheme and the shapes of the detected signals. Ideally, the echoed pulse widths should be extremely narrow to resolve fine detail at high event rates. However, narrow pulses require wideband detection circuitry with increased power dissipation to minimize thermal noise. Filtering is also required to shape each received signal into a form suitable for processing by a constant fraction discriminator (CFD) followed by a time-to-digital converter (TDC). As the intervals between the echoes decrease, the finite bandwidth of the shaping circuits blends the pulses into an analog signal (luminance) with multiple modes, reducing the ability of the CFD to discriminate individual events.

  4. Irradiation treatment of minimally processed carrots for ensuring microbiological safety

    NASA Astrophysics Data System (ADS)

    Ashraf Chaudry, Muhammad; Bibi, Nizakat; Khan, Misal; Khan, Maazullah; Badshah, Amal; Jamil Qureshi, Muhammad

    2004-09-01

Minimally processed fruits and vegetables are very common in developed countries and are gaining popularity in developing countries due to their convenience and freshness. However, minimal processing may result in undesirable changes in colour, taste and appearance due to the transfer of microbes from skin to flesh. Irradiation is a well-known technology for the elimination of microbial contamination. Food irradiation has been approved by 50 countries and is being applied commercially in the USA. The purpose of this study was to evaluate the effect of irradiation on the quality of minimally processed carrots. Fresh carrots were peeled, sliced and PE-packaged. The samples were irradiated (0, 0.5, 1.0, 2.0, 2.5, 3.0 kGy) and stored at 5°C for 2 weeks. The samples were analyzed for hardness, organoleptic acceptance and microbial load at 0, 7 and 15 days. The mean firmness of the control and all irradiated samples remained between 4.31 and 4.42 kg of force, showing no adverse effect of radiation dose. The effect of storage (2 weeks) was significant (P < 0.05), with values ranging between 4.28 and 4.39 kg of force. After 2 weeks of storage at 5°C, total bacterial counts were 6.3×10⁵ CFU/g for non-irradiated samples, 3.0×10² CFU/g for 0.5 kGy samples, and a few colonies (>10) in all other irradiated samples (1.0, 2.0, 2.5 and 3.0 kGy). No coliforms or E. coli were detected in any of the samples (irradiated or control) immediately after irradiation or during the entire storage period. A dose of 2.0 kGy completely controlled the fungal and bacterial counts. The irradiated samples (2.0 kGy) were also sensorially acceptable.

  5. Fast-acting sprinkler system design considerations for propellant manufacture

    NASA Astrophysics Data System (ADS)

    Matthews, A. L.; Crable, J. M.; Kristoff, P. T.

    1984-08-01

    Fast-acting sprinkler systems for detection and suppression of fires in propellant operations, which require activation in the millisecond range in order to be effective, can be easily defeated unless particular attention is paid to design and maintenance details. Of primary consideration are detector selection and placement in processes to minimize the effect of environmental influences. Also important are nozzle placement, water flow density, water supply pressure, and pattern and sloping of piping. When all of these design criteria are properly implemented, water application can occur within 100 ms of fire detection.

  6. Oligosaccharide formation during commercial pear juice processing.

    PubMed

    Willems, Jamie L; Low, Nicholas H

    2016-08-01

The effect of enzyme treatment and processing on the oligosaccharide profile of commercial pear juice samples was examined by high performance anion exchange chromatography with pulsed amperometric detection and capillary gas chromatography with flame ionization detection. Industrial samples representing the major stages of processing produced with various commercial enzyme preparations were studied. Through the use of commercially available standards and laboratory scale enzymatic hydrolysis of pectin, starch and xyloglucan; galacturonic acid oligomers, glucose oligomers (e.g., maltose and cellotriose) and isoprimeverose were identified as being formed during pear juice production. It was found that the majority of polysaccharide hydrolysis and oligosaccharide formation occurred during enzymatic treatment at the pear mashing stage and that the remaining processing steps had minimal impact on the carbohydrate-based chromatographic profile of pear juice. Also, all commercial enzyme preparations and conditions (time and temperature) studied produced similar carbohydrate-based chromatographic profiles.

  7. Consensus criteria for sensitive detection of minimal neuroblastoma cells in bone marrow, blood and stem cell preparations by immunocytology and QRT-PCR: recommendations by the International Neuroblastoma Risk Group Task Force

    PubMed Central

    Beiske, K; Burchill, S A; Cheung, I Y; Hiyama, E; Seeger, R C; Cohn, S L; Pearson, A D J; Matthay, K K

    2009-01-01

Disseminated disease is a predictive and prognostic indicator of poor outcome in children with neuroblastoma. Its accurate and sensitive assessment can facilitate optimal treatment decisions. The International Neuroblastoma Risk Group (INRG) Task Force has defined standardised methods for the determination of minimal disease (MD) by immunocytology (IC) and quantitative reverse transcriptase-polymerase chain reaction (QRT-PCR) using disialoganglioside GD2 and tyrosine hydroxylase mRNA, respectively. The INRG standard operating procedures (SOPs) define methods for collecting, processing and evaluating bone marrow (BM), peripheral blood (PB) and peripheral blood stem cell harvest by IC and QRT-PCR. Sampling PB and BM is recommended at diagnosis, before and after myeloablative therapy and at the end of treatment. Peripheral blood stem cell products should be analysed at the time of harvest. Performing MD detection according to INRG SOPs will enable laboratories throughout the world to compare their results and thus facilitate quality-controlled multi-centre prospective trials to assess the clinical significance of MD and minimal residual disease in heterogeneous patient groups. PMID:19401690

  8. Autonomous Object Characterization with Large Datasets

    DTIC Science & Technology

    2015-10-18

desk, where a substantial amount of effort is required to transform raw photometry into a data product, minimizing the amount of time the analyst has...were used to explore concepts in satellite characterization and satellite state change. The first algorithm provides real-time stability estimation... Timely and effective space object (SO) characterization is a challenge, and requires advanced data processing techniques. Detection and identification

  9. Hyperspectral Imaging Using Flexible Endoscopy for Laryngeal Cancer Detection

    PubMed Central

    Regeling, Bianca; Thies, Boris; Gerstner, Andreas O. H.; Westermann, Stephan; Müller, Nina A.; Bendix, Jörg; Laffers, Wiebke

    2016-01-01

Hyperspectral imaging (HSI) is increasingly gaining acceptance in the medical field. Up until now, HSI has been used in conjunction with rigid endoscopy to detect cancer in vivo. The logical next step is to pair HSI with flexible endoscopy, since it improves access to hard-to-reach areas. While the flexible endoscope’s fiber optic cables provide the advantage of flexibility, they also introduce an interfering honeycomb-like pattern onto images. Due to the substantial impact this pattern has on locating cancerous tissue, it must be removed before the HS data can be further processed. In doing so, the loss of information must be minimized to avoid suppressing small-area variations in pixel values. We have developed a system that uses flexible endoscopy to record HS cubes of the larynx and designed a special filtering technique to remove the honeycomb-like pattern with minimal loss of information. We have confirmed its feasibility by comparing it to conventional filtering techniques using an objective metric and by applying unsupervised and supervised classifications to raw and pre-processed HS cubes. Compared to conventional techniques, our method successfully removes the honeycomb-like pattern and considerably improves classification performance, while preserving image details. PMID:27529255

  10. Hyperspectral Imaging Using Flexible Endoscopy for Laryngeal Cancer Detection.

    PubMed

    Regeling, Bianca; Thies, Boris; Gerstner, Andreas O H; Westermann, Stephan; Müller, Nina A; Bendix, Jörg; Laffers, Wiebke

    2016-08-13

Hyperspectral imaging (HSI) is increasingly gaining acceptance in the medical field. Up until now, HSI has been used in conjunction with rigid endoscopy to detect cancer in vivo. The logical next step is to pair HSI with flexible endoscopy, since it improves access to hard-to-reach areas. While the flexible endoscope's fiber optic cables provide the advantage of flexibility, they also introduce an interfering honeycomb-like pattern onto images. Due to the substantial impact this pattern has on locating cancerous tissue, it must be removed before the HS data can be further processed. In doing so, the loss of information must be minimized to avoid suppressing small-area variations in pixel values. We have developed a system that uses flexible endoscopy to record HS cubes of the larynx and designed a special filtering technique to remove the honeycomb-like pattern with minimal loss of information. We have confirmed its feasibility by comparing it to conventional filtering techniques using an objective metric and by applying unsupervised and supervised classifications to raw and pre-processed HS cubes. Compared to conventional techniques, our method successfully removes the honeycomb-like pattern and considerably improves classification performance, while preserving image details.

  11. Operation and force analysis of the guide wire in a minimally invasive vascular interventional surgery robot system

    NASA Astrophysics Data System (ADS)

    Yang, Xue; Wang, Hongbo; Sun, Li; Yu, Hongnian

    2015-03-01

Developing a robot system for minimally invasive surgery is significant; however, existing minimally invasive surgery robots are not yet practical for clinical operations due to their limited functionality and weak perception. A novel wire feeder is proposed for minimally invasive vascular interventional surgery. It is used to assist surgeons in delivering a guide wire, balloon and stent to a specific lesion location. In contrast to existing wire feeders, the motion methods for delivering and rotating the guide wire in a blood vessel are described, and their mechanical realization is presented. A new resistance-force detection method is given in detail. The change in the resistance force can help the operator feel a blockage or embolism in front of the guide wire. The driving torque for rotating the guide wire is derived at different positions. Using the CT reconstruction image and extracted vessel paths, the path equation of the blood vessel is obtained. Combined with the shape of the guide wire outside the blood vessel, the whole bending equation of the guide wire is obtained; this serves as a risk criterion during the delivery process. This makes operations safer and man-machine interaction more reliable. A novel surgery robot for feeding the guide wire is designed, and a risk criterion for the system is given.

  12. Combining contour detection algorithms for the automatic extraction of the preparation line from a dental 3D measurement

    NASA Astrophysics Data System (ADS)

    Ahlers, Volker; Weigl, Paul; Schachtzabel, Hartmut

    2005-04-01

    Due to the increasing demand for high-quality ceramic crowns and bridges, the CAD/CAM-based production of dental restorations has been a subject of intensive research during the last fifteen years. A prerequisite for the efficient processing of the 3D measurement of prepared teeth with a minimal amount of user interaction is the automatic determination of the preparation line, which defines the sealing margin between the restoration and the prepared tooth. Current dental CAD/CAM systems mostly require the interactive definition of the preparation line by the user, at least by means of giving a number of start points. Previous approaches to the automatic extraction of the preparation line rely on single contour detection algorithms. In contrast, we use a combination of different contour detection algorithms to find several independent potential preparation lines from a height profile of the measured data. The different algorithms (gradient-based, contour-based, and region-based) show their strengths and weaknesses in different clinical situations. A classifier consisting of three stages (range check, decision tree, support vector machine), which is trained by human experts with real-world data, finally decides which is the correct preparation line. In a test with 101 clinical preparations, a success rate of 92.0% has been achieved. Thus the combination of different contour detection algorithms yields a reliable method for the automatic extraction of the preparation line, which enables the setup of a turn-key dental CAD/CAM process chain with a minimal amount of interactive screen work.

  13. Detection of minimal residual disease following induction immunochemotherapy predicts progression free survival in mantle cell lymphoma: final results of CALGB 59909

    PubMed Central

    Liu, Hongtao; Johnson, Jeffrey L.; Koval, Greg; Malnassy, Greg; Sher, Dorie; Damon, Lloyd E.; Hsi, Eric D.; Bucci, Donna Marie; Linker, Charles A.; Cheson, Bruce D.; Stock, Wendy

    2012-01-01

    Background In the present study, the prognostic impact of minimal residual disease during treatment on time to progression and overall survival was analyzed prospectively in patients with mantle cell lymphoma treated on the Cancer and Leukemia Group B 59909 clinical trial. Design and Methods Peripheral blood and bone marrow samples were collected during different phases of the Cancer and Leukemia Group B 59909 study for minimal residual disease analysis. Minimal residual disease status was determined by quantitative polymerase chain reaction of IgH and/or BCL-1/JH gene rearrangement. Correlation of minimal residual disease status with time to progression and overall survival was determined. In multivariable analysis, minimal residual disease and other risk factors were correlated with time to progression. Results Thirty-nine patients had evaluable, sequential peripheral blood and bone marrow samples for minimal residual disease analysis. Using peripheral blood monitoring, 18 of 39 (46%) achieved molecular remission following induction therapy. The molecular remission rate increased from 46% to 74% after one course of intensification therapy. Twelve of 21 minimal residual disease-positive patients (57%) progressed within three years of follow-up compared to 4 of 18 (22%) molecular remission patients (P=0.049). Detection of minimal residual disease following induction therapy predicted disease progression with a hazard ratio of 3.7 (P=0.016). The 3-year probability of time to progression among those who were in molecular remission after induction chemotherapy was 82% compared to 48% in patients with detectable minimal residual disease. The prediction of time to progression by post-induction minimal residual disease was independent of other prognostic factors in multivariable analysis.
Conclusions Detection of minimal residual disease following induction immunochemotherapy was an independent predictor of time to progression following immunochemotherapy and autologous stem cell transplantation for mantle cell lymphoma. The clinical trial was registered at ClinicalTrials.gov: NCT00020943. PMID:22102709

  14. Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.

    PubMed

    Mammone, Nadia; Morabito, Francesco Carlo

    2008-09-01

    Artifacts are disturbances that may occur during signal acquisition and may affect the subsequent processing of the signals. The aim of this paper is to propose a technique for automatically detecting artifacts in electroencephalographic (EEG) recordings. In particular, a technique is presented that is based on Independent Component Analysis (ICA) to extract artifactual signals and on Renyi's entropy to automatically detect them. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals against the average 68.7% of the previous technique on the studied available database. Moreover, Renyi's entropy is shown to be able to detect muscle and very-low-frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve efficient rejection of the artifacts while minimizing the information loss, future efforts will be devoted to the improvement of blind artifact separation from EEG in order to ensure a very efficient isolation of the artifactual activity from any signals deriving from other brain tasks.
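The entropy measure itself is easy to state. Below is a minimal sketch of an order-2 Renyi entropy computed over an amplitude histogram; the bin count, the synthetic signals, and the use of "lower entropy flags spiky artifact-like components" are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def renyi_entropy(x, alpha=2.0, bins=64):
    """Renyi entropy of order `alpha` of a signal's amplitude histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / len(x)          # empirical bin probabilities
    return float(np.log(np.sum(p ** alpha)) / (1.0 - alpha))

rng = np.random.default_rng(0)
brain_like = rng.standard_normal(10_000)     # broadband, spread-out amplitudes
blink_like = np.zeros(10_000)
blink_like[::500] = 50.0                     # sparse, spiky artifact-like signal
```

A peaked amplitude distribution (typical of blink or spike artifacts) concentrates the histogram mass in few bins and yields a markedly lower entropy than a broadband component, which is the kind of property exploited for automatic flagging.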

  11. Automation of serviceability of radio-relay station equipment

    NASA Astrophysics Data System (ADS)

    Uryev, A. G.; Mishkin, Y. I.; Itkis, G. Y.

    1985-03-01

    Automation of the serviceability checking of radio-relay station equipment must ensure central gathering and primary processing of reliable instrument readings with subsequent display on the control panel, timely detection and recording of failures, sufficiently advanced warning based on analysis of deterioration symptoms, and correct remote measurement of equipment performance parameters. Such inspection will minimize transmission losses while reducing nonproductive time and labor spent on documentation and measurement. A multichannel automated inspection system for this purpose should operate by a parallel rather than sequential procedure. Digital data processing is more expedient in this case than analog methods and, therefore, analog-to-digital converters are required. Special normal, above-limit, and below-limit test signals provide a means of self-inspection, to which must be added adequate interference immunization, stabilization, and a standby power supply. Use of a microcomputer permits overall refinement and expansion of the inspection system while it minimizes, though does not completely eliminate, dependence on subjective judgment.

  16. Direct Detection of Nucleic Acid with Minimizing Background and Improving Sensitivity Based on a Conformation-Discriminating Indicator.

    PubMed

    Zhu, Lixuan; Qing, Zhihe; Hou, Lina; Yang, Sheng; Zou, Zhen; Cao, Zhong; Yang, Ronghua

    2017-08-25

    As is well known, the nucleic acid indicator-based strategy is one of the major approaches to monitoring nucleic acid hybridization-mediated recognition events in biochemical analysis, displaying obvious advantages including simplicity, low cost, convenience, and generality. However, conventional indicators either exhibit strong self-fluorescence or can be lit up by both ssDNA and dsDNA; lacking absolute selectivity for a given conformation, they suffer from high background interference and low sensitivity in sensing, and additional processing (e.g., nanomaterial-mediated background suppression and enzyme-catalyzed signal amplification) is generally required to improve the detection performance. In this work, a carbazole derivative, EBCB, has been synthesized and screened as a dsDNA-specific fluorescent indicator. Compared with conventional indicators under the same conditions, EBCB displayed a much higher selectivity coefficient for dsDNA, with little self-fluorescence and negligible effect from ssDNA. Based on its superior capability in DNA conformation discrimination, high sensitivity with minimized background interference was demonstrated for direct detection of nucleic acid and for monitoring nucleic acid-based circuitry with good reversibility, resulting in a low detection limit and high capability for discriminating base mismatches. Thus, we expect that this highly specific DNA conformation-discriminating indicator will hold good potential for application in biochemical sensing and molecular logic switching.

  17. Using Knowledge Base for Event-Driven Scheduling of Web Monitoring Systems

    NASA Astrophysics Data System (ADS)

    Kim, Yang Sok; Kang, Sung Won; Kang, Byeong Ho; Compton, Paul

    Web monitoring systems report any changes to their target web pages by revisiting them frequently. As they operate under significant resource constraints, it is essential to minimize revisits while ensuring minimal delay and maximum coverage. Various statistical scheduling methods have been proposed to resolve this problem; however, they are static and cannot easily cope with events in the real world. This paper proposes a new scheduling method that manages unpredictable events. An MCRDR (Multiple Classification Ripple-Down Rules) document classification knowledge base was reused to detect events and to initiate a prompt web monitoring process independent of a static monitoring schedule. Our experiment demonstrates that the approach improves monitoring efficiency significantly.

  18. Active shape models unleashed

    NASA Astrophysics Data System (ADS)

    Kirschner, Matthias; Wesarg, Stefan

    2011-03-01

    Active Shape Models (ASMs) are a popular family of segmentation algorithms which combine local appearance models for boundary detection with a statistical shape model (SSM). They are especially popular in medical imaging due to their ability for fast and accurate segmentation of anatomical structures even in large and noisy 3D images. A well-known limitation of ASMs is that the shape constraints are over-restrictive, because the segmentations are bounded by the Principal Component Analysis (PCA) subspace learned from the training data. To overcome this limitation, we propose a new energy minimization approach which combines an external image energy with an internal shape model energy. Our shape energy uses the Distance From Feature Space (DFFS) concept to allow deviations from the PCA subspace in a theoretically sound and computationally fast way. In contrast to previous approaches, our model does not rely on post-processing with constrained free-form deformation or additional complex local energy models. In addition to the energy minimization approach, we propose a new method for liver detection, a new method for initializing an SSM and an improved k-Nearest Neighbour (kNN)-classifier for boundary detection. Our ASM is evaluated with leave-one-out tests on a data set with 34 tomographic CT scans of the liver and is compared to an ASM with standard shape constraints. The quantitative results of our experiments show that we achieve higher segmentation accuracy with our energy minimization approach than with standard shape constraints.
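The Distance From Feature Space (DFFS) concept used in the shape energy is simply the squared reconstruction residual of a vector outside the PCA subspace. The following is a generic sketch on synthetic 5-D data, not the paper's shape model.

```python
import numpy as np

def dffs(x, mean, basis):
    """Squared residual of `x` outside the PCA subspace spanned by the
    rows of `basis` (orthonormal principal directions)."""
    c = basis @ (x - mean)               # coordinates inside the subspace
    recon = mean + basis.T @ c           # closest point in the subspace
    return float(np.sum((x - recon) ** 2))

rng = np.random.default_rng(0)
e1 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0, 0.0, 0.0])
X = rng.standard_normal((100, 2)) @ np.vstack([e1, e2])  # data in a 2-D plane
mu = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mu)
basis = Vt[:2]                           # top-2 principal directions
in_plane = mu + 0.5 * e1 - 0.3 * e2      # lies in the learned subspace
off_plane = in_plane + 2.0 * np.array([0.0, 0.0, 1.0, 0.0, 0.0])
```

`dffs(in_plane, mu, basis)` is essentially zero, while `dffs(off_plane, mu, basis)` equals the squared out-of-subspace deviation (4.0 here); penalizing this quantity, rather than clamping results to the subspace, is what allows controlled deviations from the SSM.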

  19. Recent Advances for the Detection of Ochratoxin A.

    PubMed

    Ha, Tai Hwan

    2015-12-04

    Ochratoxin A (OTA) is one of the mycotoxins secreted by Aspergillus and Penicillium, which can easily colonize various commodities such as coffee, peanuts, rice, and maize. Since OTA is a chemically stable compound that can endure the physicochemical conditions of modern food processing, additional research efforts have been devoted to developing sensitive and cost-effective surveillance solutions. Although traditional chromatographic and immunoassay methods appear to be mature enough to attain sensitivity up to the regulation levels, alternative detection schemes are still being enthusiastically pursued in an attempt to meet the requirements of rapid and cost-effective detection. Herein, this review presents recent progress in OTA detection with minimal instrumental usage, which has been facilitated by the development of OTA aptamers and by innovations in functional nanomaterials. In addition to the introduction of aptamer-based OTA detection techniques, OTA-specific detection principles are also presented, which exclusively take advantage of the mycotoxin's unique chemical structure and related physicochemical characteristics.

  20. Hardware accelerator design for change detection in smart camera

    NASA Astrophysics Data System (ADS)

    Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Chaudhury, Santanu; Vohra, Anil

    2011-10-01

    Smart cameras are important components in human-computer interaction. In any remote surveillance scenario, smart cameras have to take intelligent decisions to select frames of significant change in order to minimize communication and processing overhead. Among the many algorithms for change detection, a clustering-based scheme was proposed for smart camera systems. However, such an algorithm achieves low frame rates, far from real-time requirements, on the general-purpose processors (such as the PowerPC) available on FPGAs. This paper proposes a hardware accelerator capable of detecting changes in a scene in real time using the clustering-based change detection scheme. The system is designed and simulated using VHDL and implemented on a Xilinx XUP Virtex-II Pro FPGA board. The resulting frame rate is 30 frames per second for QVGA resolution in gray scale.
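As a software stand-in for the accelerated kernel, frame significance can be pictured as the fraction of pixels that deviate from a background model. This uses plain thresholded differencing rather than the paper's clustering scheme, and the threshold is an assumption.

```python
import numpy as np

def changed_fraction(frame, background, thresh=25):
    """Fraction of pixels whose absolute difference from the background
    model exceeds `thresh`."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return np.count_nonzero(diff > thresh) / diff.size

# QVGA grayscale frames, matching the resolution reported for the accelerator
background = np.full((240, 320), 100, dtype=np.uint8)
frame = background.copy()
frame[50:100, 50:100] = 200      # a 50x50 object enters the scene
```

A frame whose changed fraction exceeds a chosen level would be selected as a frame of significant change and forwarded, saving communication bandwidth.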

  1. Electro-optic tracking R&D for defense surveillance

    NASA Astrophysics Data System (ADS)

    Sutherland, Stuart; Woodruff, Chris J.

    1995-09-01

    Two aspects of work on automatic target detection and tracking for electro-optic (EO) surveillance are described. Firstly, a detection and tracking algorithm test-bed developed by DSTO and running on a PC under Windows NT is being used to assess candidate algorithms for unresolved and minimally resolved target detection. The structure of this test-bed is described and examples are given of its user interfaces and outputs. Secondly, a development by Australian industry under a Defence-funded contract, of a reconfigurable generic track processor (GTP) is outlined. The GTP will include reconfigurable image processing stages and target tracking algorithms. It will be used to demonstrate to the Australian Defence Force automatic detection and tracking capabilities, and to serve as a hardware base for real time algorithm refinement.

  2. Dynamic Vehicle Detection via the Use of Magnetic Field Sensors

    PubMed Central

    Markevicius, Vytautas; Navikas, Dangirutis; Zilys, Mindaugas; Andriukaitis, Darius; Valinevicius, Algimantas; Cepenas, Mindaugas

    2016-01-01

    The vehicle detection process plays the key role in determining the success of intelligent transport management system solutions. The measurement of distortions of the Earth’s magnetic field using magnetic field sensors served as the basis for designing a solution aimed at vehicle detection. In accordance with the results obtained from research into process modeling and experimental testing of all the relevant hypotheses, an algorithm for vehicle detection using the state criteria was proposed. Aiming to evaluate all of the possibilities, as well as the pros and cons of the use of anisotropic magnetoresistance (AMR) sensors in the transport flow control process, we performed a series of experiments with various vehicles (of different series) from several car manufacturers. A comparison of 12 selected methods, based either on determining the peak signal values and their concurrence in time whilst calculating the delay, or on measuring the cross-correlation of these signals, was carried out. It was established that the relative error can be minimized via the Z-component cross-correlation and Kz-criterion cross-correlation methods. The average relative error of vehicle speed determination in the best case did not exceed 1.5% when the distance between sensors was set to 2 m. PMID:26797615
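The cross-correlation speed estimate works by finding the lag between the two sensor signals. The sketch below uses synthetic signatures; the sampling rate, pulse shape, and delay are assumptions, while the 2 m sensor spacing matches the abstract.

```python
import numpy as np

def speed_from_crosscorr(sig_a, sig_b, fs, spacing_m):
    """Speed from the lag (in samples) that maximizes the cross-correlation
    of two magnetometer signals recorded `spacing_m` apart."""
    corr = np.correlate(sig_b - sig_b.mean(), sig_a - sig_a.mean(), mode="full")
    lag = np.argmax(corr) - (len(sig_a) - 1)   # positive lag: b trails a
    return spacing_m / (lag / fs)

fs = 1000.0                                    # Hz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)
delay = 0.08                                   # s between the two sensors
sig_a = np.exp(-((t - 0.30) / 0.02) ** 2)      # passing-vehicle signature
sig_b = np.exp(-((t - 0.30 - delay) / 0.02) ** 2)
v = speed_from_crosscorr(sig_a, sig_b, fs, spacing_m=2.0)   # 2 m / 0.08 s
```

The same principle applies per magnetic axis (e.g. the Z component) or to a derived criterion signal, which is what the compared methods vary.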

  3. Detection of tiny amounts of fissile materials in large-sized containers with radioactive waste

    NASA Astrophysics Data System (ADS)

    Batyaev, V. F.; Skliarov, S. V.

    2018-01-01

    The paper is devoted to non-destructive control of tiny amounts of fissile materials in large-sized containers filled with radioactive waste (RAW). The aim of this work is to model an active neutron interrogation facility for detection of fissile materials inside NZK-type containers with RAW and to determine the minimal detectable mass of U-235 as a function of various parameters: matrix type, nonuniformity of container filling, neutron generator parameters (flux, pulse frequency, pulse duration), and measurement time. As a result, the dependence of the minimal detectable mass on the fissile material's location inside the container is shown. Nonuniformity of the thermal neutron flux inside a container is the main reason for the spatial heterogeneity of the minimal detectable mass inside a large-sized container. Our experiments with tiny amounts of uranium-235 (<1 g) confirm the detection of fissile materials in NZK containers by using the active neutron interrogation technique.

  4. An UGS radar with micro-Doppler capabilities for wide area persistent surveillance

    NASA Astrophysics Data System (ADS)

    Tahmoush, Dave; Silvious, Jerry; Clark, John

    2010-04-01

    Detecting humans and distinguishing them from natural fauna is an important issue in security applications to reduce false alarm rates. In particular, it is important to detect and classify people who are walking in remote locations and transmit back detections over extended periods at a low cost and with minimal maintenance. The ability to discriminate humans from animals and vehicles at long range would give a distinct sensor advantage. The reduction in false positive detections due to animals would increase the usefulness of detections, while dismount identification could reduce friendly fire. We have developed and demonstrated a compact radar technology that is scalable to a variety of ultra-lightweight and low-power platforms for wide area persistent surveillance as an unattended, unmanned, and man-portable ground sensor. The radar uses micro-Doppler processing to characterize the tracks of moving targets and to then eliminate unimportant detections due to animals or civilian activity. This paper presents the system and data on humans, vehicles, and animals at multiple angles and directions of motion, demonstrates the signal processing approach that makes the targets visually recognizable, and verifies that the UGS radar has enough micro-Doppler capability to distinguish between humans, vehicles, and animals.

  5. From Data to Knowledge — Faster: GOES Early Fire Detection System to Inform Operational Wildfire Response and Management

    NASA Astrophysics Data System (ADS)

    Koltunov, A.; Quayle, B.; Prins, E. M.; Ambrosia, V. G.; Ustin, S.

    2014-12-01

    Fire managers at various levels require near-real-time, low-cost, systematic, and reliable early detection capabilities with minimal latency to respond effectively to wildfire ignitions and minimize the risk of catastrophic development. The GOES satellite images collected for vast territories at high temporal frequencies provide a consistent and reliable source for operational active fire mapping, realized by the WF-ABBA algorithm. However, their potential to provide early warning or rapid confirmation of initial fire ignition reports from conventional sources remains underutilized, partly because operational wildfire detection has been optimized for users and applications for which timeliness of initial detection is a low priority, in contrast to the needs of first responders. We present our progress in developing the GOES Early Fire Detection (GOES-EFD) system, a collaborative effort led by the University of California-Davis and the USDA Forest Service. GOES-EFD specifically focuses on first-detection timeliness for wildfire incidents. It is automatically trained for a monitored scene and capitalizes on multiyear cross-disciplinary algorithm research. Initial retrospective tests in the Western US demonstrate significantly earlier detection of new ignitions than existing operational capabilities, with prospects for further improvement. The GOES-EFD-β prototype will initially be deployed for the Western US region to process imagery from GOES-NOP and the more frequent, four-times-higher spatial resolution imagery from GOES-R, the upcoming next generation of GOES satellites. These and other enhanced capabilities of GOES-R are expected to significantly improve the timeliness of fire ignition information from GOES-EFD.

  6. Clever eye algorithm for target detection of remote sensing imagery

    NASA Astrophysics Data System (ADS)

    Geng, Xiurui; Ji, Luyan; Sun, Kang

    2016-04-01

    Target detection algorithms for hyperspectral remote sensing imagery, such as the two most commonly used detection algorithms, constrained energy minimization (CEM) and the matched filter (MF), can usually be expressed as the inner product between a weight filter (or detector) and a pixel vector. CEM and MF have the same expression except that MF requires data centralization first. However, this difference leads to a difference in the target detection results; that is to say, the selection of the data origin directly affects the performance of the detector. Does there exist, then, another data origin besides the zero point and the mean vector that gives better target detection performance? This is a very meaningful issue in the field of target detection, but it has not yet received enough attention. In this study, we propose a novel objective function by introducing the data origin as another variable, whose solution corresponds to the data origin with the minimal output energy. The process of finding the optimal solution can be vividly regarded as a clever eye automatically searching for the best observing position and direction in the feature space, corresponding to the largest separation between the target and the background. Therefore, this new algorithm is referred to as the clever eye algorithm (CE). Based on the Sherman-Morrison formula and the gradient ascent method, CE can derive the optimal target detection result in terms of energy. Experiments with both synthetic and real hyperspectral data have verified the effectiveness of our method.
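The CEM detector referred to above has the standard closed form w = R^(-1) d / (d^T R^(-1) d), where R is the sample correlation matrix of the pixels and d the target spectrum; MF applies the same formula after subtracting the data mean. A small sketch on synthetic data (the spectra and scene are assumptions for the demo):

```python
import numpy as np

def cem_scores(X, d):
    """Constrained energy minimization: w = R^{-1} d / (d^T R^{-1} d),
    with R the sample correlation matrix of the pixels (rows of X).
    The constraint w^T d = 1 pins a pure target pixel's score at 1."""
    R = X.T @ X / X.shape[0]
    Rinv_d = np.linalg.solve(R, d)
    w = Rinv_d / (d @ Rinv_d)
    return X @ w

rng = np.random.default_rng(1)
bands = 10
background = rng.standard_normal((200, bands))
d = rng.uniform(2.0, 4.0, bands)   # target spectrum (assumed)
X = np.vstack([background, d])     # last pixel is a pure target
scores = cem_scores(X, d)          # target scores 1; background stays small
```

Shifting the rows of X by a candidate origin before forming R is exactly the extra degree of freedom the clever eye algorithm optimizes over.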

  7. Microbial Evaluation of Fresh, Minimally-processed Vegetables and Bagged Sprouts from Chain Supermarkets

    PubMed Central

    Jeddi, Maryam Zare; Yunesian, Masud; Gorji, Mohamad Es'haghi; Noori, Negin; Pourmand, Mohammad Reza

    2014-01-01

    ABSTRACT The aim of this study was to evaluate the bacterial and fungal quality of minimally-processed vegetables (MPV) and sprouts. A total of 116 samples of fresh-cut vegetables, ready-to-eat salads, and mung bean and wheat sprouts were randomly collected and analyzed. The load of aerobic mesophilic bacteria was minimum and maximum in the fresh-cut vegetables and fresh mung bean sprouts, respectively, corresponding to populations of 5.3 and 8.5 log CFU/g. E. coli O157:H7 was found to be absent in all samples; however, other E. coli strains were detected in 21 samples (18.1%), and Salmonella spp. were found in one mung bean (3.1%) and one ready-to-eat salad sample (5%). Yeasts were the predominant organisms and were found in 100% of the samples. Geotrichum, Fusarium, and Penicillium spp. were the most prevalent molds in mung sprouts while Cladosporium and Penicillium spp. were most frequently found in ready-to-eat salad samples. According to results from the present study, effective control measures should be implemented to minimize the microbiological contamination of fresh produce sold in Tehran, Iran. PMID:25395902

  8. Microbial evaluation of fresh, minimally-processed vegetables and bagged sprouts from chain supermarkets.

    PubMed

    Jeddi, Maryam Zare; Yunesian, Masud; Gorji, Mohamad Es'haghi; Noori, Negin; Pourmand, Mohammad Reza; Khaniki, Gholam Reza Jahed

    2014-09-01

    The aim of this study was to evaluate the bacterial and fungal quality of minimally-processed vegetables (MPV) and sprouts. A total of 116 samples of fresh-cut vegetables, ready-to-eat salads, and mung bean and wheat sprouts were randomly collected and analyzed. The load of aerobic mesophilic bacteria was minimum and maximum in the fresh-cut vegetables and fresh mung bean sprouts, respectively, corresponding to populations of 5.3 and 8.5 log CFU/g. E. coli O157:H7 was found to be absent in all samples; however, other E. coli strains were detected in 21 samples (18.1%), and Salmonella spp. were found in one mung bean (3.1%) and one ready-to-eat salad sample (5%). Yeasts were the predominant organisms and were found in 100% of the samples. Geotrichum, Fusarium, and Penicillium spp. were the most prevalent molds in mung sprouts while Cladosporium and Penicillium spp. were most frequently found in ready-to-eat salad samples. According to results from the present study, effective control measures should be implemented to minimize the microbiological contamination of fresh produce sold in Tehran, Iran.

  9. Automated Power-Distribution System

    NASA Technical Reports Server (NTRS)

    Thomason, Cindy; Anderson, Paul M.; Martin, James A.

    1990-01-01

    Automated power-distribution system monitors and controls electrical power to modules in network. Handles both 208-V, 20-kHz single-phase alternating current and 120- to 150-V direct current. Power distributed to load modules from power-distribution control units (PDCU's) via subsystem distributors. Ring busses carry power to PDCU's from power source. Needs minimal attention. Detects faults and also protects against them. Potential applications include autonomous land vehicles and automated industrial process systems.

  10. Continuous Tamper-proof Logging using TPM2.0

    DTIC Science & Technology

    2014-06-16

    process each log entry. Additional hardware support could mitigate this problem. Tradeoffs between performance and security guarantees Disk write...becomes weaker as the block size increases. This problem is mitigated in protocol B by allowing offline recovery from a power failure and detection of...M.K., Isozaki, H.: Flicker: An execution infrastructure for TCB minimization. ACM SIGOPS Operating Systems Review 42(4) (2008) 315–328 24. Parno, B

  11. A Review on Microfluidic Paper-Based Analytical Devices for Glucose Detection

    PubMed Central

    Liu, Shuopeng; Su, Wenqiong; Ding, Xianting

    2016-01-01

    Glucose, as an essential substance directly involved in metabolic processes, is closely related to the occurrence of various diseases such as glucose metabolism disorders and islet cell carcinoma. Therefore, it is crucial to develop sensitive, accurate, rapid, and cost-effective methods for frequent and convenient detection of glucose. Microfluidic Paper-based Analytical Devices (μPADs), which not only satisfy the above requirements but also offer the advantages of portability and minimal sample consumption, have exhibited great potential in the field of glucose detection. This article reviews and summarizes the most recent improvements in glucose detection in the two areas of colorimetric and electrochemical μPADs. The progressive techniques for fabricating channels on μPADs are also emphasized. With the growth of diabetes and other glucose-related diseases in underdeveloped and developing countries, low-cost and reliable commercial μPADs for glucose detection will be in unprecedented demand. PMID:27941634

  12. Improved Snow Mapping Accuracy with Revised MODIS Snow Algorithm

    NASA Technical Reports Server (NTRS)

    Riggs, George; Hall, Dorothy K.

    2012-01-01

    The MODIS snow cover products have been used in over 225 published studies. From those reports, and our ongoing analysis, we have learned about the accuracy and errors in the snow products. Revisions have been made in the algorithms to improve the accuracy of snow cover detection in Collection 6 (C6), the next processing/reprocessing of the MODIS data archive planned to start in September 2012. Our objective in the C6 revision of the MODIS snow-cover algorithms and products is to maximize the capability to detect snow cover while minimizing snow detection errors of commission and omission. While the basic snow detection algorithm will not change, new screens will be applied to alleviate snow detection commission and omission errors, and only the fractional snow cover (FSC) will be output (the binary snow cover area (SCA) map will no longer be included).

  13. Method for phosphorothioate antisense DNA sequencing by capillary electrophoresis with UV detection.

    PubMed

    Froim, D; Hopkins, C E; Belenky, A; Cohen, A S

    1997-11-01

    The progress of antisense DNA therapy demands development of reliable and convenient methods for sequencing short single-stranded oligonucleotides. A method of phosphorothioate antisense DNA sequencing analysis using UV detection coupled to capillary electrophoresis (CE) has been developed based on a modified chain termination sequencing method. The proposed method reduces the sequencing cost since it uses affordable CE-UV instrumentation, requires no labeling, and needs minimal sample processing before analysis. Cycle sequencing with ThermoSequenase generates quantities of sequencing products that are readily detectable by UV. Discrimination of undesired components from sequencing products in the reaction mixture, previously accomplished by fluorescent or radioactive labeling, is now achieved by bringing concentrations of undesired components below the UV detection range, which yields a 'clean', well-defined sequence. UV detection coupled with CE offers additional conveniences for sequencing since it can be accomplished with commercially available CE-UV equipment and is readily amenable to automation.

  14. Method for phosphorothioate antisense DNA sequencing by capillary electrophoresis with UV detection.

    PubMed Central

    Froim, D; Hopkins, C E; Belenky, A; Cohen, A S

    1997-01-01

    The progress of antisense DNA therapy demands development of reliable and convenient methods for sequencing short single-stranded oligonucleotides. A method of phosphorothioate antisense DNA sequencing analysis using UV detection coupled to capillary electrophoresis (CE) has been developed based on a modified chain termination sequencing method. The proposed method reduces the sequencing cost since it uses affordable CE-UV instrumentation, requires no labeling, and needs minimal sample processing before analysis. Cycle sequencing with ThermoSequenase generates quantities of sequencing products that are readily detectable by UV. Discrimination of undesired components from sequencing products in the reaction mixture, previously accomplished by fluorescent or radioactive labeling, is now achieved by bringing concentrations of undesired components below the UV detection range, which yields a 'clean', well-defined sequence. UV detection coupled with CE offers additional conveniences for sequencing since it can be accomplished with commercially available CE-UV equipment and is readily amenable to automation. PMID:9336449

  15. Algorithms Based on CWT and Classifiers to Control Cardiac Alterations and Stress Using an ECG and a SCR

    PubMed Central

    Villarejo, María Viqueira; Zapirain, Begoña García; Zorrilla, Amaia Méndez

    2013-01-01

    This paper presents the results of using a commercial pulsimeter as an electrocardiogram (ECG) for wireless detection of cardiac alterations and stress levels for home control. For these purposes, signal processing techniques (Continuous Wavelet Transform (CWT) and J48) have been used, respectively. The designed algorithm analyses the ECG signal and is able to detect the heart rate (99.42%), arrhythmia (93.48%) and extrasystoles (99.29%). The detection of stress level is complemented with the Skin Conductance Response (SCR), whose success rate is 94.02%. The heart rate variability does not add value to the stress detection in this case. With this pulsimeter, it is possible to prevent and detect anomalies in a non-intrusive way, associated with a telemedicine system. It can also be used during physical activity because the CWT minimizes motion artifacts. PMID:23666135

  16. Algorithms based on CWT and classifiers to control cardiac alterations and stress using an ECG and a SCR.

    PubMed

    Villarejo, María Viqueira; Zapirain, Begoña García; Zorrilla, Amaia Méndez

    2013-05-10

    This paper presents the results of using a commercial pulsimeter as an electrocardiogram (ECG) for wireless detection of cardiac alterations and stress levels for home control. For these purposes, signal processing techniques (Continuous Wavelet Transform (CWT) and J48) have been used, respectively. The designed algorithm analyses the ECG signal and is able to detect the heart rate (99.42%), arrhythmia (93.48%) and extrasystoles (99.29%). The detection of stress level is complemented with the Skin Conductance Response (SCR), whose success rate is 94.02%. The heart rate variability does not add value to the stress detection in this case. With this pulsimeter, it is possible to prevent and detect anomalies in a non-intrusive way, associated with a telemedicine system. It can also be used during physical activity because the CWT minimizes motion artifacts.
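    The wavelet-based heart-rate step described above can be illustrated with a minimal sketch. The paper's actual pipeline (wavelet choice, scales, classifier features) is not given here, so the Ricker wavelet, the single scale, and the synthetic impulse train below are all assumptions for illustration:

```python
import math

def ricker(points, a):
    """Ricker (Mexican hat) wavelet, a common kernel for CWT peak detection."""
    out = []
    for i in range(points):
        t = (i - (points - 1) / 2.0) / a
        out.append((1 - t * t) * math.exp(-t * t / 2.0))
    return out

def cwt_row(signal, kernel):
    """Single-scale CWT: correlate the signal with the wavelet kernel."""
    half = len(kernel) // 2
    row = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            k = i + j - half
            if 0 <= k < len(signal):
                acc += signal[k] * w
        row.append(acc)
    return row

def heart_rate_bpm(signal, fs, scale=8, threshold=0.5):
    """Estimate heart rate from beat-to-beat intervals of CWT maxima."""
    row = cwt_row(signal, ricker(10 * scale, scale))
    peak_val = max(row)
    peaks = [i for i in range(1, len(row) - 1)
             if row[i] > threshold * peak_val
             and row[i] >= row[i - 1] and row[i] > row[i + 1]]
    beats = []
    for p in peaks:  # collapse near-duplicates closer than 0.25 s
        if not beats or p - beats[-1] > 0.25 * fs:
            beats.append(p)
    intervals = [(b - a) / fs for a, b in zip(beats, beats[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# synthetic 1 Hz pulse train sampled at 100 Hz, i.e. 60 beats per minute
fs = 100
sig = [1.0 if i % fs == 0 else 0.0 for i in range(10 * fs)]
print(round(heart_rate_bpm(sig, fs)))  # -> 60
```

    A real ECG would of course need the motion-artifact suppression and classifier stages the paper describes; this sketch covers only the CWT peak-picking idea.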

  17. Intensity dependent spread theory

    NASA Technical Reports Server (NTRS)

    Holben, Richard

    1990-01-01

    The Intensity Dependent Spread (IDS) procedure is an image-processing technique based on a model of the processing which occurs in the human visual system. IDS processing is relevant to many aspects of machine vision and image processing. For quantum-limited images, it produces an ideal trade-off between spatial resolution and noise averaging, performs edge enhancement, thus requiring only mean-crossing detection for the subsequent extraction of scene edges, and yields edge responses whose amplitudes are independent of scene illumination, depending only upon the ratio of the reflectances on the two sides of the edge. These properties suggest that the IDS process may provide significant bandwidth reduction while losing only minimal scene information when used as a preprocessor at or near the image plane.

  18. Overall quality and shelf life of minimally processed and modified atmosphere packaged 'ready-to-eat' pomegranate arils.

    PubMed

    Ayhan, Zehra; Eştürk, Okan

    2009-06-01

    Minimally processed ready-to-eat pomegranate arils have become popular due to their convenience, high value, unique sensory characteristics, and health benefits. The objective of this study was to monitor quality parameters and to extend the shelf life of ready-to-eat pomegranate arils packaged under modified atmospheres. Minimally processed pomegranate arils were packed in PP trays sealed with BOPP film under four atmospheres, including low and super-atmospheric oxygen. Packaged arils were stored at 5 degrees C for 18 d and monitored for internal atmosphere and quality attributes. Atmosphere equilibrium was reached for all MAP applications except high oxygen. As a general trend, slight or no significant change was detected in the chemical and physical attributes of the pomegranate arils during cold storage. Aerobic mesophilic bacteria were in the range of 2.30 to 4.51 log CFU/g at the end of storage, which did not affect sensory quality. Overall, the pomegranate arils packed with air, nitrogen, and enriched oxygen retained their quality attributes and were acceptable to sensory panelists on day 18; however, the marketability period was limited to 15 d for the low-oxygen atmosphere. PP trays sealed with BOPP film, combined with either passive or active modified atmospheres and storage at 5 degrees C, provided commercially acceptable arils for 18 d with high quality and convenience.

  19. Fibre optic portable rail vehicle detector

    NASA Astrophysics Data System (ADS)

    Kepak, Stanislav; Cubik, Jakub; Zavodny, Petr; Hejduk, Stanislav; Nedoma, Jan; Davidson, Alan; Vasinek, Vladimir

    2016-12-01

    During track maintenance operations, the early detection of oncoming rail vehicles is critical for the safety of maintenance personnel. In addition, the detection system should be simple to install at the trackside by minimally qualified personnel. Fibre optic based sensor systems have the inherent advantages of being passive, unaffected by radio frequency interference (RFI) and suffering very low signal attenuation. Such a system therefore represents a good alternative to conventional approaches such as ultrasonic based sensor systems. The proposed system consists of one or more passive fibre trackside sensors and an x86 processing unit located at the work site. The solid fibre connection between sensors and processing unit eliminates the risk of RFI. In addition, the detection system sensors are easy to install with no requirement for electrical power at the sensor site. The system was tested on a tram line in Ostrava with the results obtained indicating the successful detection of all the trams in the monitoring windows using a single sensor. However, the platform allows flexibility in configuring multiple sensors where required by system users.

  20. Light Weight MP3 Watermarking Method for Mobile Terminals

    NASA Astrophysics Data System (ADS)

    Takagi, Koichi; Sakazawa, Shigeyuki; Takishima, Yasuhiro

    This paper proposes a novel MP3 watermarking method applicable to mobile terminals with limited computational resources. Considering that in most cases the embedded information is copyright information or metadata, which should be extracted before playing back audio content, the watermark detection process must execute at high speed. However, when conventional methods are used on a mobile terminal, detecting a digital watermark takes a considerable amount of time. This paper focuses on scalefactor manipulation to enable high-speed watermark embedding/detection for MP3 audio and also proposes a manipulation method that adaptively minimizes audio quality degradation. Evaluation tests showed that the proposed method is capable of embedding 3 bits/frame of information without degrading audio quality and of detecting it at very high speed. Finally, this paper describes application examples for authentication with a digital signature.
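    The scalefactor idea can be illustrated with a toy sketch. The paper's adaptive manipulation method is not reproduced here; the least-significant-bit embedding below and the frame values are hypothetical stand-ins that only show why scalefactor access makes detection cheap (no audio decode needed):

```python
def embed_bits(scalefactors, bits):
    """Embed watermark bits in the LSBs of the first few scalefactors
    (toy stand-in for the paper's adaptive scalefactor manipulation)."""
    out = list(scalefactors)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b
    return out

def extract_bits(scalefactors, n):
    """Recover n watermark bits by reading the scalefactor LSBs."""
    return [scalefactors[i] & 1 for i in range(n)]

frame = [112, 97, 88, 120, 105, 99]    # hypothetical scalefactor values
marked = embed_bits(frame, [1, 0, 1])  # 3 bits/frame, as in the paper
print(extract_bits(marked, 3))         # -> [1, 0, 1]
```

    Detection here touches only the parsed scalefactors, which is the property that makes the approach fast on a constrained terminal; the paper's method additionally chooses which scalefactors to perturb so that quality degradation stays inaudible.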

  1. Design and Practices for Use of Automated Drilling and Sample Handling in MARTE While Minimizing Terrestrial and Cross Contamination

    NASA Astrophysics Data System (ADS)

    Miller, David P.; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources -- whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination) -- to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  2. Design and practices for use of automated drilling and sample handling in MARTE while minimizing terrestrial and cross contamination.

    PubMed

    Miller, David P; Bonaccorsi, Rosalba; Davis, Kiel

    2008-10-01

    Mars Astrobiology Research and Technology Experiment (MARTE) investigators used an automated drill and sample processing hardware to detect and categorize life-forms found in subsurface rock at Río Tinto, Spain. For the science to be successful, it was necessary for the biomass from other sources--whether from previously processed samples (cross contamination) or the terrestrial environment (forward contamination)--to be insignificant. The hardware and practices used in MARTE were designed around this problem. Here, we describe some of the design issues that were faced and classify them into problems that are unique to terrestrial tests versus problems that would also exist for a system that was flown to Mars. Assessment of the biomass at various stages in the sample handling process revealed mixed results; the instrument design seemed to minimize cross contamination, but contamination from the surrounding environment sometimes made its way onto the surface of samples. Techniques used during the MARTE Río Tinto project, such as facing the sample, appear to remove this environmental contamination without introducing significant cross contamination from previous samples.

  3. Accuracy of a Radiological Evaluation Method for Thoracic and Lumbar Spinal Curvatures Using Spinous Processes.

    PubMed

    Marchetti, Bárbara V; Candotti, Cláudia T; Raupp, Eduardo G; Oliveira, Eduardo B C; Furlanetto, Tássia S; Loss, Jefferson F

    The purpose of this study was to assess a radiographic method for spinal curvature evaluation in children, based on spinous processes, and to identify its normality limits. The sample consisted of 90 radiographic examinations of the spines of children in the sagittal plane. Thoracic and lumbar curvatures were evaluated using angular (apex angle [AA]) and linear (sagittal arrow [SA]) measurements based on the spinous processes. The same curvatures were also evaluated using the Cobb angle (CA) method, which is considered the gold standard. For concurrent validity (AA vs CA), Pearson's product-moment correlation coefficient, root-mean-square error, the Pitman-Morgan test, and Bland-Altman analysis were used. For reproducibility (AA, SA, and CA), the intraclass correlation coefficient, standard error of measurement, and minimal detectable change measurements were used. A significant correlation was found between CA and AA measurements, as was a low root-mean-square error. The mean difference between the measurements was 0° for thoracic and lumbar curvatures, and the standard deviations of the differences were ±5.9° and ±6.9°, respectively. The intraclass correlation coefficients of AA and SA were similar to or higher than those of the gold standard (CA). The standard error of measurement and minimal detectable change of the AA were always lower than those of the CA. This study determined the concurrent validity, as well as the intra- and interrater reproducibility, of radiographic measurements of kyphosis and lordosis in children. Copyright © 2017. Published by Elsevier Inc.
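    The reliability statistics named in the abstract follow the standard formulas SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·sqrt(2)·SEM. A short sketch, with the SD and ICC values below chosen purely for illustration (they are not the study's numbers):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the sample SD and the ICC."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd, icc):
    """Minimal detectable change at 95% confidence (two measurements)."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

# hypothetical values: SD of 10 degrees, ICC of 0.91
print(round(sem(10.0, 0.91), 2))    # -> 3.0
print(round(mdc95(10.0, 0.91), 2))  # -> 8.32
```

    A change smaller than the MDC cannot be distinguished from measurement noise, which is why the study reports the AA's lower SEM and MDC as evidence in its favor.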

  4. Shelf life extension of minimally processed cabbage and cucumber through gamma irradiation.

    PubMed

    Khattak, Amal Badshah; Bibi, Nizakat; Chaudry, Muhammad Ashraf; Khan, Misal; Khan, Maazullah; Qureshi, Muhammad Jamil

    2005-01-01

    The influence of irradiation of minimally processed cabbage and cucumber on microbial safety, texture, and sensory quality was investigated. Minimally processed, polyethylene-packed, and irradiated cabbage and cucumber were stored at refrigeration temperature (5 degrees C) for 2 weeks. The firmness values ranged from 3.23 kg (control) to 2.82 kg (3.0-kGy irradiated samples) for cucumbers, with a gradual decrease in firmness with increasing radiation dose (0 to 3 kGy). Cucumbers softened just after irradiation with a dose of 3.0 kGy and after 14 days storage, whereas the texture remained within acceptable limits up to a radiation dose of 2.5 kGy. The radiation treatment had no effect on the appearance scores of cabbage; however, scores decreased from 7.0 to 6.7 during storage. The appearance and flavor scores of cucumbers decreased with increasing radiation dose, and overall acceptability was better after radiation doses of 2.5 and 3.0 kGy. The aerobic plate counts per gram for cabbage increased from 3 to 5 log CFU (control), from 1.85 to 2.93 log CFU (2.5 kGy), and from a few colonies to 2.6 log CFU (3.0 kGy) after 14 days of storage at 5 degrees C. A similar trend was noted for cucumber samples. No coliform bacteria were detected at radiation doses greater than 2.0 kGy in either cabbage or cucumber samples. Total fungal counts per gram of sample were within acceptable limits for cucumbers irradiated at 3.0 kGy, and for cabbage no fungi were detected after 2.0-kGy irradiation. The D-values for Escherichia coli in cucumber and cabbage were 0.19 and 0.17 kGy, and those for Salmonella Paratyphi A were 0.25 and 0.29 kGy for cucumber and cabbage, respectively.
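    The D-values reported above follow the standard decimal-reduction relation D = dose / (log10 N0 − log10 N). A sketch with hypothetical survivor counts, reusing the abstract's 0.17 kGy D-value for E. coli in cabbage:

```python
import math

def d_value(dose_kgy, n0, n):
    """Decimal reduction dose: the kGy needed for a 1-log drop in count."""
    return dose_kgy / (math.log10(n0) - math.log10(n))

def dose_for_reduction(d_kgy, logs):
    """Dose required for a target number of log-cycle reductions."""
    return d_kgy * logs

# hypothetical survivor data: 1.0 kGy reduces counts from 1e6 to 1e1 CFU/g
print(d_value(1.0, 1e6, 1e1))                  # -> 0.2 (kGy per log cycle)
print(round(dose_for_reduction(0.17, 5), 2))   # -> 0.85 kGy for a 5-log E. coli reduction
```

    The 1e6 and 1e1 counts are invented for the example; only the 0.17 kGy figure comes from the abstract.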

  5. Top quark decays with flavor violation in the B-LSSM

    NASA Astrophysics Data System (ADS)

    Yang, Jin-Lei; Feng, Tai-Fu; Zhang, Hai-Bin; Ning, Guo-Zhu; Yang, Xiu-Yi

    2018-06-01

    The decays t → cγ, t → cg, t → cZ, and t → ch of the top quark are extremely rare processes in the standard model (SM). The SM predictions for the corresponding branching ratios are too small to be detected in the future, hence any measurable signal for these processes at the LHC is a smoking gun for new physics. In the extension of the minimal supersymmetric standard model with an additional local U(1)_{B-L} gauge symmetry (B-LSSM), a new gauge interaction and new flavor-changing interactions affect the theoretical evaluations of the corresponding branching ratios. In this work, we analyze these processes in the B-LSSM under a minimal flavor-violating assumption for the soft breaking terms. Considering the constraints from updated experimental data, the numerical results imply Br(t → cγ) ~ 5 × 10^{-7}, Br(t → cg) ~ 2 × 10^{-6}, Br(t → cZ) ~ 4 × 10^{-7} and Br(t → ch) ~ 3 × 10^{-9} in our chosen parameter space. Simultaneously, the new gauge coupling constants g_B and g_{YB} in the B-LSSM can also affect the numerical results for Br(t → cγ, cg, cZ, ch).

  6. Detection of CMOS bridging faults using minimal stuck-at fault test sets

    NASA Technical Reports Server (NTRS)

    Ijaz, Nabeel; Frenzel, James F.

    1993-01-01

    The performance of minimal stuck-at fault test sets at detecting bridging faults is evaluated. New functional models of circuit primitives are presented which allow accurate representation of bridging faults under switch-level simulation. The effectiveness of the patterns is evaluated using both voltage and current testing.

  7. Image-based tracking of the suturing needle during laparoscopic interventions

    NASA Astrophysics Data System (ADS)

    Speidel, S.; Kroehnert, A.; Bodenstedt, S.; Kenngott, H.; Müller-Stich, B.; Dillmann, R.

    2015-03-01

    One of the most complex and difficult tasks for surgeons during minimally invasive interventions is suturing. A prerequisite for assisting the suturing process is tracking of the needle. The endoscopic images provide a rich source of information which can be used for needle tracking. In this paper, we present an image-based method for markerless needle tracking. The method uses color-based and geometry-based segmentation to detect the needle. Once an initial needle detection is obtained, a region of interest enclosing the extracted needle contour is passed on to a reduced segmentation. The method is evaluated with in vivo images from da Vinci interventions.

  8. A minimum distance estimation approach to the two-sample location-scale problem.

    PubMed

    Zhang, Zhiyi; Yu, Qiqing

    2002-09-01

    As reported by Kalbfleisch and Prentice (1980), the generalized Wilcoxon test fails to detect a difference between the lifetime distributions of male and female mice that died from thymic leukemia. This failure results from the test's inability to detect a distributional difference when a location shift and a scale change exist simultaneously. In this article, we propose an estimator based on the minimization of an average distance between two independent quantile processes under a location-scale model. Large-sample inference on the proposed estimator, with possible right-censorship, is discussed. The mouse leukemia data are used as an illustrative example.
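    The estimator's core idea, minimizing an average distance between quantile processes under a location-scale model, can be sketched as a toy grid search. The paper's actual estimator, distance, and asymptotics are not reproduced here; the squared distance, the grid, and the uncensored data below are assumptions for illustration:

```python
def quantile(sorted_x, p):
    """Empirical quantile by linear interpolation on a sorted sample."""
    idx = p * (len(sorted_x) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_x) - 1)
    return sorted_x[lo] + (idx - lo) * (sorted_x[hi] - sorted_x[lo])

def min_distance_fit(x, y, grid):
    """Pick (mu, sigma) minimizing the average squared distance between
    the y-quantiles and the location-scale-shifted x-quantiles."""
    xs, ys = sorted(x), sorted(y)
    ps = [i / 20.0 for i in range(1, 20)]
    best = None
    for mu in grid:
        for sigma in grid:
            if sigma <= 0:
                continue
            d = sum((quantile(ys, p) - (mu + sigma * quantile(xs, p))) ** 2
                    for p in ps) / len(ps)
            if best is None or d < best[0]:
                best = (d, mu, sigma)
    return best[1], best[2]

# y is x shifted by 2 with unchanged scale: the fit should recover (2.0, 1.0)
x = [i / 10.0 for i in range(100)]
y = [v + 2.0 for v in x]
grid = [0.5 * k for k in range(-2, 9)]  # -1.0 .. 4.0 in steps of 0.5
print(min_distance_fit(x, y, grid))     # -> (2.0, 1.0)
```

    A grid search stands in for the paper's proper minimization, and censoring is ignored; the point is only that matching quantile processes recovers the location shift and scale change jointly, which is exactly what the Wilcoxon test cannot do.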

  9. Application of edible coating with starch and carvacrol in minimally processed pumpkin.

    PubMed

    Santos, Adriele R; da Silva, Alex F; Amaral, Viviane C S; Ribeiro, Alessandra B; de Abreu Filho, Benicio A; Mikcha, Jane M G

    2016-04-01

    The present study evaluated the effect of an edible coating of cassava starch and carvacrol on minimally processed pumpkin (MPP). The minimal inhibitory concentration (MIC) of carvacrol against Escherichia coli, Salmonella enterica serotype Typhimurium, Aeromonas hydrophila, and Staphylococcus aureus was determined. The edible coating containing carvacrol at the MIC and 2 × MIC was applied to MPP, and its effects were evaluated with regard to the survival of experimentally inoculated bacteria and autochthonous microflora in MPP. Total titratable acidity, pH, weight loss, and soluble solids were also analyzed over 7 days of storage under refrigeration. The MIC of carvacrol was 312 μg/ml. Carvacrol at the MIC reduced the counts of E. coli and S. Typhimurium by approximately 5 log CFU/g. A. hydrophila was reduced by approximately 8 log CFU/g, and S. aureus was reduced by approximately 2 log CFU/g on the seventh day of storage. Carvacrol at 2 × MIC completely inhibited all isolates on the first day of storage. Coliforms at 35 °C and 45 °C were not detected (<3 MPN/g) with either treatment on all days of shelf life. The treatment groups exhibited a reduction of approximately 2 log CFU/g in psychrotrophic counts compared with controls on the last day of storage. Yeasts and molds were not detected with either treatment over the same period. The addition of carvacrol did not affect total titratable acidity, pH, or soluble solids and reduced weight loss. The edible coating of cassava starch with carvacrol may be an interesting approach to improve the safety and microbiological quality of MPP.

  10. A cloud masking algorithm for EARLINET lidar systems

    NASA Astrophysics Data System (ADS)

    Binietoglou, Ioannis; Baars, Holger; D'Amico, Giuseppe; Nicolae, Doina

    2015-04-01

    Cloud masking is an important first step in any aerosol lidar processing chain, as most data processing algorithms can only be applied to cloud-free observations. Up to now, the selection of a cloud-free time interval for data processing has typically been performed manually, and this is one of the outstanding problems for automatic processing of lidar data in networks such as EARLINET. In this contribution we present initial developments of a cloud masking algorithm that permits the selection of appropriate time intervals for lidar data processing based on uncalibrated lidar signals. The algorithm is based on a signal normalization procedure using the range of observed values of lidar returns, designed to work with different lidar systems with minimal user input. This normalization procedure can be applied to measurement periods of only a few hours, even if no suitable cloud-free interval exists, and thus can be used even when only a short period of lidar measurements is available. Clouds are detected based on a combination of criteria, including the magnitude of the normalized lidar signal and time-space edge detection performed using the Sobel operator. In this way the algorithm avoids misclassification of strong aerosol layers as clouds. Cloud detection is performed using the highest available time and vertical resolution of the lidar signals, allowing the effective detection of low-level clouds (e.g. cumulus humilis). Special attention is given to suppressing false cloud detection due to signal noise, which can affect the algorithm's performance, especially during day-time. In this contribution we present the details of the algorithm, the effect of lidar characteristics (space-time resolution, available wavelengths, signal-to-noise ratio) on detection performance, and highlight the current strengths and limitations of the algorithm using lidar scenes from different lidar systems in different locations across Europe.
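    The edge-detection step can be sketched with a plain Sobel operator on a normalized 2D signal (time x range). The thresholds, the toy scene, and the simple bright-and-sharp rule below are assumptions; the EARLINET algorithm combines more criteria than shown here:

```python
def sobel_magnitude(grid):
    """Gradient magnitude via the Sobel operator on a 2D list of floats."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            gx = sum(kx[i][j] * grid[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            gy = sum(ky[i][j] * grid[r - 1 + i][c - 1 + j]
                     for i in range(3) for j in range(3))
            out[r][c] = (gx * gx + gy * gy) ** 0.5
    return out

def cloud_mask(grid, signal_thresh, edge_thresh):
    """Flag pixels that are both bright and sharp-edged as cloud."""
    edges = sobel_magnitude(grid)
    return [[grid[r][c] > signal_thresh and edges[r][c] > edge_thresh
             for c in range(len(grid[0]))] for r in range(len(grid))]

# toy normalized profile: a bright compact "cloud" embedded in weak aerosol
scene = [[0.1] * 6 for _ in range(6)]
for r in (2, 3):
    for c in (2, 3):
        scene[r][c] = 1.0
mask = cloud_mask(scene, 0.5, 1.0)
print(sum(row.count(True) for row in mask))  # -> 4 cloud pixels flagged
```

    Requiring both brightness and a strong gradient is what lets an edge criterion reject diffuse aerosol layers, which are bright but smooth; a real mask would also dilate the flagged edges to cover cloud interiors.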

  11. Distributed ice accretion sensor for smart aircraft structures

    NASA Technical Reports Server (NTRS)

    Gerardi, J. J.; Hickman, G. A.

    1989-01-01

    A distributed ice accretion sensor is presented, based on the concept of smart structures. Ice accretion is determined using spectral techniques to process signals from piezoelectric sensors integral to the airfoil skin. Frequency shifts in the leading edge structural skin modes are correlated to ice thickness. It is suggested that this method may be used to detect ice over large areas with minimal hardware. Results are presented from preliminary tests to measure simulated ice growth.

  12. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g⁻¹); and the destruction of minimal quantities of sample (μg) compared to traditional solution and/or pyrolysis analyses (mg).

  13. Multicutter machining of compound parametric surfaces

    NASA Astrophysics Data System (ADS)

    Hatna, Abdelmadjid; Grieve, R. J.; Broomhead, P.

    2000-10-01

    Parametric free forms are used in industries as disparate as footwear, toys, sporting goods, ceramics, digital content creation, and conceptual design. Optimizing tool path patterns and minimizing total machining time is a paramount issue in numerically controlled (NC) machining of free-form surfaces. We demonstrate in the present work that multi-cutter machining can achieve as much as a 60% reduction in total machining time for compound sculptured surfaces. The given approach is based upon the pre-processing, as opposed to the usual post-processing, of surfaces for the detection and removal of interference, followed by precise tracking of unmachined areas.

  14. The simultaneous ex vivo detection of low-frequency antigen-specific CD4+ and CD8+ T-cell responses using overlapping peptide pools.

    PubMed

    Singh, Satwinder Kaur; Meyering, Maaike; Ramwadhdoebe, Tamara H; Stynenbosch, Linda F M; Redeker, Anke; Kuppen, Peter J K; Melief, Cornelis J M; Welters, Marij J P; van der Burg, Sjoerd H

    2012-11-01

    The ability to measure antigen-specific T cells at the single-cell level by intracellular cytokine staining (ICS) is a promising immunomonitoring tool and is extensively applied in the evaluation of immunotherapy of cancer. The protocols used to detect antigen-specific CD8+ T-cell responses generally work for the detection of antigen-specific T cells in samples that have undergone at least one round of in vitro pre-stimulation. Application of a common protocol, now using long peptides as antigens, was not suitable for simultaneously detecting antigen-specific CD8+ and CD4+ T cells directly ex vivo in cryopreserved samples. CD8+ T-cell reactivity to monocytes pulsed with long peptides as antigens ranged between 5 and 25% of that observed against monocytes pulsed with a directly HLA class I-fitting minimal CTL peptide epitope. We therefore adapted our ICS protocol and show that the use of a tenfold higher concentration of long peptides to load APCs, together with IFN-α and poly(I:C) to promote antigen processing and improve T-cell stimulation, does allow the ex vivo detection of low-frequency antigen-specific CD8+ and CD4+ T cells in an HLA-independent setting. While most of the improvements served to increase measurable CD8+ T-cell reactivity following stimulation with long peptides to at least 50% of the response detected when using a minimal peptide epitope, the final analysis of blood samples from vaccinated patients successfully showed that the adapted ICS protocol also increases the ability to detect low-frequency p53-specific CD4+ T-cell responses ex vivo in cryopreserved PBMC samples.

  15. An Automated Flying-Insect-Detection System

    NASA Technical Reports Server (NTRS)

    Vann, Timi; Andrews, Jane C.; Howell, Dane; Ryan, Robert

    2005-01-01

    An automated flying-insect-detection system (AFIDS) was developed as a proof-of-concept instrument for real-time detection and identification of flying insects. This type of system has use in public health and homeland security decision support, agriculture and military pest management, and/or entomological research. Insects are first lured into the AFIDS integrated sphere by insect attractants. Once inside the sphere, the insect's wing beats cause alterations in light intensity that are detected by a photoelectric sensor. Following detection, the insects are encouraged (with the use of a small fan) to move out of the sphere and into a designated insect trap where they are held for taxonomic identification or serological testing. The acquired electronic wing beat signatures are preprocessed (Fourier transformed) in real-time to display a periodic signal. These signals are sent to the end user where they are graphically displayed. All AFIDS data are pre-processed in the field with the use of a laptop computer equipped with LabVIEW. The AFIDS software can be programmed to run continuously or at specific time intervals when insects are prevalent. A special DC-restored transimpedance amplifier reduces the contributions of low-frequency background light signals, and affords approximately two orders of magnitude greater AC gain than conventional amplifiers. This greatly increases the signal-to-noise ratio and enables the detection of small changes in light intensity. The AFIDS light source consists of high-intensity AlGaInP light-emitting diodes (LEDs). The AFIDS circuitry minimizes brightness fluctuations in the LEDs and, when integrated with an integrating sphere, creates a diffuse uniform light field. The insect wing beats isotropically scatter the diffuse light in the sphere and create wing beat signatures that are detected by the sensor. This configuration minimizes variations in signal associated with insect flight orientation.
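    The wing-beat preprocessing step, a Fourier transform of the photo-sensor signal, can be sketched with a plain DFT. The 400 Hz synthetic signature and sampling rate below are illustrative assumptions, not AFIDS parameters:

```python
import math

def dominant_frequency(samples, fs):
    """Dominant frequency of a real signal via a plain DFT, mirroring the
    Fourier-transform preprocessing of the wing-beat signature."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]  # drop the DC background term
    best_k, best_p = 1, 0.0
    for k in range(1, n // 2):
        re = sum(centered[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(centered[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im
        if p > best_p:
            best_k, best_p = k, p
    return best_k * fs / n

# synthetic mosquito-like signature: 400 Hz wing beat sampled at 8 kHz
fs, n = 8000, 800
sig = [math.sin(2 * math.pi * 400 * t / fs) for t in range(n)]
print(dominant_frequency(sig, fs))  # -> 400.0
```

    The dominant frequency (plus its harmonics, not extracted here) is the feature that would distinguish one insect taxon's wing-beat signature from another's; a fielded system would use an FFT rather than this O(n²) DFT.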

  16. Asynchronous Processing of a Constellation of Geostationary and Polar-Orbiting Satellites for Fire Detection and Smoke Estimation

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Peterson, D. A.; Curtis, C. A.; Schmidt, C. C.; Hoffman, J.; Prins, E. M.

    2014-12-01

    The Fire Locating and Monitoring of Burning Emissions (FLAMBE) system converts satellite observations of thermally anomalous pixels into spatially and temporally continuous estimates of smoke release from open biomass burning. This system currently processes data from a constellation of 5 geostationary and 2 polar-orbiting sensors. Additional sensors, including NPP VIIRS and the imager on the Korea COMS-1 geostationary satellite, will soon be added. This constellation experiences schedule changes and outages of various durations, making the set of available scenes for fire detection highly variable on an hourly and daily basis. Adding to the complexity, the latency of the satellite data is variable between and within sensors. FLAMBE shares with many fire detection systems the goal of detecting as many fires as possible as early as possible, but the FLAMBE system must also produce a consistent estimate of smoke production with minimal artifacts from the changing constellation. To achieve this, NRL has developed a system of asynchronous processing and cross-calibration that permits satellite data to be used as it arrives, while preserving the consistency of the smoke emission estimates. This talk describes the asynchronous data ingest methodology, including latency statistics for the constellation. We also provide an overview and show results from the system we have developed to normalize multi-sensor fire detection for consistency.
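    The observation-time ordering needed for asynchronous ingest can be sketched as a sorted merge of per-sensor detection streams. The tuple layout and sensor names below are hypothetical, and the real system's cross-calibration and latency handling are not shown:

```python
import heapq

def merge_detections(streams):
    """Merge per-sensor detection lists (each already sorted by observation
    time) into one observation-time-ordered stream, regardless of how late
    each sensor's data arrived."""
    return list(heapq.merge(*streams, key=lambda d: d[0]))

# hypothetical (obs_time_hours, sensor, fire_pixel_count) records
goes = [(0.0, "GOES", 5), (1.0, "GOES", 7)]
viirs = [(0.5, "VIIRS", 2)]
print(merge_detections([goes, viirs]))
# -> [(0.0, 'GOES', 5), (0.5, 'VIIRS', 2), (1.0, 'GOES', 7)]
```

    Keying the merge on observation time rather than arrival time is what keeps the emission estimate consistent when sensors report with very different latencies, which is the core of the asynchronous design described above.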

  17. Applications of LC-MS in PET Radioligand Development and Metabolic Elucidation

    PubMed Central

    Ma, Ying; Kiesewetter, Dale O.; Lang, Lixin; Gu, Dongyu; Chen, Xiaoyuan

    2013-01-01

    Positron emission tomography (PET) is a very sensitive molecular imaging technique that, when employed with an appropriate radioligand, has the ability to quantitate physiological processes in a non-invasive manner. Since the imaging technique detects all radioactive emissions in the field of view, the presence and biological activity of radiolabeled metabolites must be determined for each radioligand in order to validate the utility of the radiotracer for measuring the desired physiological process. Thus, the identification of metabolic profiles of radiolabeled compounds is an important aspect of the design, development, and validation of new radiopharmaceuticals and of their applications in drug development and molecular imaging. Metabolite identification for different chemical classes of radiopharmaceuticals allows rational design to minimize the formation and accumulation of metabolites in the target tissue, either through enhanced excretion or minimized metabolism. This review will discuss methods for identifying and quantitating metabolites during the pre-clinical development of radiopharmaceuticals, with special emphasis on the application of LC/MS. PMID:20540692

  18. Reversal electron attachment ionizer for detection of trace species

    NASA Technical Reports Server (NTRS)

    Bernius, Mark T. (Inventor); Chutjian, Ara (Inventor)

    1990-01-01

    An in-line reversal electron, high-current ionizer capable of focusing a beam of electrons to a reversal region and executing a reversal of said electrons, such that the electrons possess zero kinetic energy at the point of reversal, may be used to produce both negative and positive ions. A sample gas is introduced at the point of electron reversal for low energy electron-(sample gas) molecule attachment with high efficiency. The attachment process produces negative ions from the sample gas, which includes species present in trace (minute) amounts. These ions are extracted efficiently and directed to a mass analyzer where they may be detected and identified. The generation and detection of positive ions is accomplished in a similar fashion with minimal adjustment to potentials applied to the apparatus.

  19. Reversal electron attachment ionizer for detection of trace species

    NASA Technical Reports Server (NTRS)

    Bernius, Mark T. (Inventor); Chutjian, Ara (Inventor)

    1989-01-01

    An in-line reversal electron, high-current ionizer capable of focusing a beam of electrons to a reversal region and executing a reversal of the electrons, such that the electrons possess zero kinetic energy at the point of reversal, may be used to produce both negative and positive ions. A sample gas is introduced at the point of electron reversal for low energy electron-(sample gas) molecule attachment with high efficiency. The attachment process produces negative ions from the sample gas, which includes species present in trace (minute) amounts. These ions are extracted efficiently and directed to a mass analyzer where they may be detected and identified. The generation and detection of positive ions is accomplished in a similar fashion with minimal adjustment to potentials applied to the apparatus.

  20. Estimation of anomaly location and size using electrical impedance tomography.

    PubMed

    Kwon, Ohin; Yoon, Jeong Rock; Seo, Jin Keun; Woo, Eung Je; Cho, Young Gu

    2003-01-01

    We developed a new algorithm that estimates the locations and sizes of anomalies in an electrically conducting medium based on the electrical impedance tomography (EIT) technique. When only boundary current and voltage measurements are available, it is not practically feasible to reconstruct accurate high-resolution cross-sectional conductivity or resistivity images of a subject. In this paper, we focus our attention on the estimation of locations and sizes of anomalies whose conductivity values differ from those of the background tissues. We showed the performance of the algorithm in experimental results using a 32-channel EIT system and a saline phantom. With about 1.73% measurement error in the boundary current-voltage data, we found that the minimal size (area) of a detectable anomaly is about 0.72% of the size (area) of the phantom. Potential applications include the monitoring of impedance-related physiological events and bubble detection in two-phase flow. Since this new algorithm requires neither a forward solver nor a time-consuming minimization process, it is fast enough for various real-time applications in medicine and nondestructive testing.

  1. Detection of microbial biofilms on food processing surfaces: hyperspectral fluorescence imaging study

    NASA Astrophysics Data System (ADS)

    Jun, Won; Kim, Moon S.; Chao, Kaunglin; Lefcourt, Alan M.; Roberts, Michael S.; McNaughton, James L.

    2009-05-01

    We used a portable hyperspectral fluorescence imaging system to evaluate biofilm formations on four types of food processing surface materials: stainless steel, polypropylene used for cutting boards, and household countertop materials such as Formica and granite. The objective of this investigation was to determine a minimal number of spectral bands suitable to differentiate microbial biofilm formation from the four background materials typically used during food processing. Ultimately, the resultant spectral information will be used in the development of handheld portable imaging devices that can serve as visual aid tools for sanitation and safety inspection (microbial contamination) of food processing surfaces. Pathogenic E. coli O157:H7 and Salmonella cells were grown in low strength M9 minimal medium on the various surfaces at 22 ± 2 °C for 2 days for biofilm formation. Biofilm autofluorescence under UV excitation (320 to 400 nm) obtained by the hyperspectral fluorescence imaging system showed broad emissions in the blue-green regions of the spectrum with emission maxima at approximately 480 nm for both E. coli O157:H7 and Salmonella biofilms. Fluorescence images at 480 nm revealed that for background materials with near-uniform fluorescence responses, such as stainless steel, polypropylene cutting board, and Formica, biofilm formation can be distinguished regardless of the background intensity. This suggested that a broad spectral band in the blue-green region can be used in handheld imaging devices for sanitation inspection of stainless steel, cutting-board, and Formica surfaces. The non-uniform fluorescence responses of granite make distinctions between biofilm and background difficult. To further investigate potential detection of biofilm formations on granite surfaces with multispectral approaches, principal component analysis (PCA) was performed using the hyperspectral fluorescence image data. The resultant PCA score images revealed distinct contrast between biofilms and granite surfaces. This investigation demonstrated that biofilm formations on food processing surfaces can be detected, even for background materials with heterogeneous fluorescence responses. Furthermore, a multispectral approach in developing handheld inspection devices may be needed to inspect surface materials that exhibit non-uniform fluorescence.

  2. Adapting existing natural language processing resources for cardiovascular risk factors identification in clinical notes.

    PubMed

    Khalifa, Abdulrahman; Meystre, Stéphane

    2015-12-01

    The 2014 i2b2 natural language processing shared task focused on identifying cardiovascular risk factors such as high blood pressure, high cholesterol levels, obesity and smoking status among other factors found in health records of diabetic patients. In addition, the task involved detecting medications and time information associated with the extracted data. This paper presents the development and evaluation of a natural language processing (NLP) application conceived for this i2b2 shared task. For increased efficiency, the application's main components were adapted from two existing NLP tools implemented in the Apache UIMA framework: Textractor (for dictionary-based lookup) and cTAKES (for preprocessing and smoking status detection). The application achieved a final (micro-averaged) F1-measure of 87.5% on the final evaluation test set. Our approach was mostly based on existing tools adapted with minimal changes and achieved satisfactory performance with limited development effort. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Mitochondrial DNA Targets Increase Sensitivity of Malaria Detection Using Loop-Mediated Isothermal Amplification ▿

    PubMed Central

    Polley, Spencer D.; Mori, Yasuyoshi; Watson, Julie; Perkins, Mark D.; González, Iveth J.; Notomi, Tsugunori; Chiodini, Peter L.; Sutherland, Colin J.

    2010-01-01

    Loop-mediated isothermal amplification (LAMP) of DNA offers the ability to detect very small quantities of pathogen DNA following minimal tissue sample processing and is thus an attractive methodology for point-of-care diagnostics. Previous attempts to diagnose malaria by the use of blood samples and LAMP have targeted the parasite small-subunit rRNA gene, with a resultant sensitivity for Plasmodium falciparum of around 100 parasites per μl. Here we describe the use of mitochondrial targets for LAMP-based detection of any Plasmodium genus parasite and of P. falciparum specifically. These new targets allow routine amplification from samples containing as few as five parasites per μl of blood. Amplification is complete within 30 to 40 min and is assessed by real-time turbidimetry, thereby offering rapid diagnosis with greater sensitivity than is achieved by the most skilled microscopist or antigen detection using lateral flow immunoassays. PMID:20554824

  4. Steady State Fluorescence Spectroscopy for Medical Diagnosis

    NASA Astrophysics Data System (ADS)

    Mahadevan-Jansen, Anita; Gebhart, Steven C.

    Light can react with tissue in different ways and provide information for identifying the physiological state of tissue or detecting the presence of disease. The light used to probe tissue does so in a non-intrusive manner and typically uses very low levels of light, far below the requirements for therapeutic applications. The use of fiber optics simplifies the delivery and collection of this light in a minimally invasive manner. Since the tissue response is virtually instantaneous, results are obtained in real time; the use of data processing techniques and multivariate statistical analysis allows for automated detection and therefore provides an objective estimation of the tissue state. These advantages form the fundamental basis for the application of optical techniques to the detection of tissue physiology as well as pathology, and have encouraged many researchers to pursue the development of different optical interactions for biological and medical detection.

  5. PCB Fault Detection Using Image Processing

    NASA Astrophysics Data System (ADS)

    Nayak, Jithendra P. R.; Anitha, K.; Parameshachari, B. D., Dr.; Banu, Reshma, Dr.; Rashmi, P.

    2017-08-01

    The importance of the printed circuit board (PCB) inspection process has been magnified by the requirements of the modern manufacturing environment, where delivery of 100% defect-free PCBs is the expectation. To meet such expectations, identifying the various defects and their types becomes the first step. In this PCB inspection system, the inspection algorithm mainly focuses on defect detection using natural images. Many practical issues, such as tilt of the images, bad lighting conditions, and the height at which images are taken, must be considered to ensure image quality good enough for defect detection. PCB fabrication is a multidisciplinary process, and etching is the most critical part of it. The main objective of the etching process is to remove the exposed unwanted copper other than the required circuit pattern. In order to minimize scrap caused by wrongly etched PCB panels, inspection has to be done at an early stage. However, inspections are typically done after the etching process, where any defective PCB found is no longer useful and is simply thrown away. Since the etching process adds cost to PCB fabrication, it is uneconomical to simply discard defective PCBs; the defects should therefore be identified before etching so that the PCB can be reprocessed. In this paper, a method to identify defects in natural PCB images, along with the associated practical issues, is addressed using software tools; some of the major types of single-layer PCB defects are pattern cut, pinhole, pattern short, and nick. The present approach is expected to improve the efficiency of the system in detecting defects even in low-quality images.

  6. Real-time detection system for tumor localization during minimally invasive surgery for gastric and colon cancer removal: In vivo feasibility study in a swine model.

    PubMed

    Choi, Won Jung; Moon, Jin-Hee; Min, Jae Seok; Song, Yong Keun; Lee, Seung A; Ahn, Jin Woo; Lee, Sang Hun; Jung, Ha Chul

    2018-03-01

    During minimally invasive surgery (MIS), it is impossible to directly detect marked clips around tumors via palpation. Therefore, we developed a novel method and device using Radio Frequency IDentification (RFID) technology to detect the position of clips during minimally invasive gastrectomy or colectomy. The feasibility of the RFID-based detection system was evaluated in an animal experiment consisting of seven swine. The primary outcome was to successfully detect the location of RFID clips in the stomach and colon. The secondary outcome measures were detection time (the time required for intracorporeal detection of the RFID clip) and accuracy (the distance between the RFID clip and the detected site). A total of 25 detection attempts (14 in the stomach and 11 in the colon) using the RFID antenna had a 100% success rate. The median detection time was 32.5 s (range, 15-119 s) for the stomach and 28.0 s (range, 8-87 s) for the colon. The median detection distance was 6.5 mm (range, 4-18 mm) for the stomach and 6.0 mm (range, 3-13 mm) for the colon. We demonstrated favorable results for an RFID system that detects the position of gastric and colon tumors in real time during MIS. © 2017 Wiley Periodicals, Inc.

  7. Normal tissue toxicity after small field hypofractionated stereotactic body radiation.

    PubMed

    Milano, Michael T; Constine, Louis S; Okunieff, Paul

    2008-10-31

    Stereotactic body radiation (SBRT) is an emerging tool in radiation oncology in which the targeting accuracy is improved via the detection and processing of a three-dimensional coordinate system that is aligned to the target. With improved targeting accuracy, SBRT allows for the minimization of normal tissue volume exposed to high radiation dose as well as the escalation of fractional dose delivery. The goal of SBRT is to minimize toxicity while maximizing tumor control. This review will discuss the basic principles of SBRT, the radiobiology of hypofractionated radiation and the outcome from published clinical trials of SBRT, with a focus on late toxicity after SBRT. While clinical data has shown SBRT to be safe in most circumstances, more data is needed to refine the ideal dose-volume metrics.

  8. Image re-sampling detection through a novel interpolation kernel.

    PubMed

    Hilal, Alaa

    2018-06-01

    Image re-sampling involved in re-size and rotation transformations is an essential building block of typical digital image alterations. Fortunately, traces left by such processes are detectable, proving that the image has undergone a re-sampling transformation. Within this context, we present in this paper two original contributions. First, we propose a new re-sampling interpolation kernel. It depends on five independent parameters that control its amplitude, angular frequency, standard deviation, and duration. We then demonstrate its capacity to imitate the behavior of the interpolation kernels most frequently used in digital image re-sampling applications. Second, the proposed model is used to characterize and detect the correlation coefficients involved in re-sampling transformations. The process involves minimizing an error function using the gradient method. The proposed method is assessed over a large database of 11,000 re-sampled images. Additionally, it is implemented within an algorithm in order to assess images that have undergone complex transformations. The obtained results demonstrate better performance and reduced processing time when compared to a reference method, validating the suitability of the proposed approaches. Copyright © 2018 Elsevier B.V. All rights reserved.
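The gradient-based error minimization described above can be illustrated with a toy sketch. The abstract does not give the kernel's exact form, so a hypothetical damped-cosine kernel with only three parameters (amplitude, angular frequency, spread) stands in for the paper's five-parameter kernel; it is fitted to a target kernel by gradient descent with central-difference gradients.

```python
import numpy as np

def kernel(t, p):
    """Hypothetical damped-cosine kernel, p = (amplitude, angular
    frequency, spread). A reduced stand-in for the paper's
    five-parameter kernel, used only to illustrate the fit."""
    A, w, s = p
    return A * np.cos(w * t) * np.exp(-t**2 / (2 * s**2))

def fit_kernel(t, target, p0, lr=0.02, steps=500, h=1e-5):
    """Minimize the mean squared error between kernel(t, p) and the
    target samples by plain gradient descent; gradients are estimated
    with central differences."""
    p = np.asarray(p0, dtype=float)
    err = lambda q: np.mean((kernel(t, q) - target) ** 2)
    for _ in range(steps):
        g = np.zeros_like(p)
        for i in range(p.size):
            d = np.zeros_like(p)
            d[i] = h
            g[i] = (err(p + d) - err(p - d)) / (2 * h)  # central difference
        p -= lr * g
    return p, err(p)
```

Starting from a perturbed guess, the descent drives the error well below its initial value, mimicking the paper's characterization step on a much smaller scale.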

  9. Raman scattering spectroscopy for explosives identification

    NASA Astrophysics Data System (ADS)

    Nagli, L.; Gaft, M.

    2007-04-01

    Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called Improvised Explosive Devices (IEDs). It is recognized that the only technique potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. The LDS technique belongs to trace detection, namely its micro-particle variety. We applied gated Raman and time-resolved luminescence spectroscopy to the detection of the main explosive materials, both factory-made and homemade. A Raman LDS system was developed and tested for remote field detection and identification of minimal amounts of explosives on relevant surfaces at distances of up to 30 meters.

  10. A Three-Step Resolution-Reconfigurable Hazardous Multi-Gas Sensor Interface for Wireless Air-Quality Monitoring Applications.

    PubMed

    Choi, Subin; Park, Kyeonghwan; Lee, Seungwook; Lim, Yeongjin; Oh, Byungjoo; Chae, Hee Young; Park, Chan Sam; Shin, Heugjoo; Kim, Jae Joon

    2018-03-02

    This paper presents a resolution-reconfigurable wide-range resistive sensor readout interface for wireless multi-gas monitoring applications that displays results on a smartphone. Three types of sensing resolutions were selected to minimize processing power consumption, and a dual-mode front-end structure was proposed to support the detection of a variety of hazardous gases with wide range of characteristic resistance. The readout integrated circuit (ROIC) was fabricated in a 0.18 μm CMOS process to provide three reconfigurable data conversions that correspond to a low-power resistance-to-digital converter (RDC), a 12-bit successive approximation register (SAR) analog-to-digital converter (ADC), and a 16-bit delta-sigma modulator. For functional feasibility, a wireless sensor system prototype that included in-house microelectromechanical (MEMS) sensing devices and commercial device products was manufactured and experimentally verified to detect a variety of hazardous gases.

  11. ASPIC: a novel method to predict the exon-intron structure of a gene that is optimally compatible to a set of transcript sequences.

    PubMed

    Bonizzoni, Paola; Rizzi, Raffaella; Pesole, Graziano

    2005-10-05

    Currently available methods to predict splice sites are mainly based on the independent and progressive alignment of transcript data (mostly ESTs) to the genomic sequence. Apart from often being computationally expensive, this approach is vulnerable to several problems--hence the need to develop novel strategies. We propose a method, based on a novel multiple genome-EST alignment algorithm, for the detection of splice sites. To avoid limitations of splice-site prediction (mainly, over-predictions) due to independent single EST alignments to the genomic sequence, our approach performs a multiple alignment of transcript data to the genomic sequence based on the combined analysis of all available data. We recast the problem of predicting constitutive and alternative splicing as an optimization problem, where the optimal multiple transcript alignment minimizes the number of exons and hence of splice site observations. We have implemented a splice site predictor based on this algorithm in the software tool ASPIC (Alternative Splicing PredICtion). It is distinguished from other methods based on BLAST-like tools by the incorporation of entirely new ad hoc procedures for accurate and computationally efficient transcript alignment, and adopts dynamic programming for the refinement of intron boundaries. ASPIC also provides the minimal set of non-mergeable transcript isoforms compatible with the detected splicing events. The ASPIC web resource is dynamically interconnected with the Ensembl and Unigene databases and also implements an upload facility. Extensive benchmarking shows that ASPIC outperforms other existing methods in the detection of novel splicing isoforms and in the minimization of over-predictions. ASPIC also requires a lower computation time for processing a single gene and an EST cluster. The ASPIC web resource is available at http://aspic.algo.disco.unimib.it/aspic-devel/.

  12. Apollo experience report: Detection and minimization of ignition hazards from water/glycol contamination of silver-clad electrical circuitry

    NASA Technical Reports Server (NTRS)

    Downs, W. R.

    1976-01-01

    The potential flammability hazard when a water/glycol solution contacts defectively insulated silver-clad copper circuitry or electrical components carrying a direct current is described. The chemical reactions and means for detecting them are explained. Methods for detecting and cleaning contaminated areas and the use of inhibitors to arrest chemical reactivity are also explained. Preventive measures to minimize hazards are given. Photomicrographs of the chemical reactions occurring on silver clad wires are also included.

  13. Object detection in cinematographic video sequences for automatic indexing

    NASA Astrophysics Data System (ADS)

    Stauder, Jurgen; Chupeau, Bertrand; Oisel, Lionel

    2003-06-01

    This paper presents an object detection framework applied to cinematographic post-processing of video sequences. Post-processing is done after production and before editing. At the beginning of each shot of a video, a slate (also called a clapperboard) is shown. The slate notably contains an electronic audio timecode that is necessary for audio-visual synchronization. This paper presents an object detection framework to detect slates in video sequences for automatic indexing and post-processing. It is based on five steps. The first two steps drastically reduce the video data to be analyzed; they ensure a high recall rate but have low precision. The first step detects images at the beginning of a shot that may show a slate, while the second step searches these images for candidate regions with a color distribution similar to slates. The objective is not to miss any slate while eliminating long parts of the video without slate appearances. The third and fourth steps use statistical classification and pattern matching to detect and precisely locate slates in the candidate regions. These steps ensure high recall and high precision. The objective is to detect slates with very few false alarms, to minimize interactive corrections. In a last step, electronic timecodes are read from the slates to automate audio-visual synchronization. The presented slate detector has a recall rate of 89% and a precision of 97.5%. By temporal integration, far more than 89% of shots in dailies are detected, and by timecode coherence analysis the precision can be raised as well. Issues for future work are to accelerate the system beyond real-time and to extend the framework to several slate types.
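The claim that temporal integration lifts shot-level detection well above the 89% per-image recall can be sketched with a simple probability model, assuming (as an idealization not stated in the abstract) that detection attempts on successive frames showing the same slate are independent:

```python
def shot_recall(per_image_recall, n_frames):
    """Probability of detecting a slate at least once across n_frames
    views of it, assuming independent per-frame detections (an
    idealizing assumption for illustration)."""
    return 1.0 - (1.0 - per_image_recall) ** n_frames
```

With three usable frames per slate, the 89% per-image recall already implies a shot-level recall above 99% under this model.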

  14. Assessment of Data Fusion Algorithms for Earth Observation Change Detection Processes.

    PubMed

    Molina, Iñigo; Martinez, Estibaliz; Morillo, Carmen; Velasco, Jesus; Jara, Alvaro

    2016-09-30

    In this work a parametric multi-sensor Bayesian data fusion approach and a Support Vector Machine (SVM) are used for a change detection problem. For this purpose two sets of SPOT5-PAN images have been used, from which Change Detection Indices (CDIs) are calculated. To minimize radiometric differences, a methodology based on zonal "invariant features" is suggested. The choice of one CDI or another for a change detection process is a subjective task, as each CDI is probably more or less sensitive to certain types of changes. Likewise, this idea might be employed to create and improve a "change map", which can be accomplished by means of the CDI's informational content. For this purpose, information metrics such as the Shannon entropy and "specific information" have been used to weight the change and no-change categories contained in a certain CDI and are thus introduced into the Bayesian information fusion algorithm. Furthermore, the parameters of the probability density functions (pdfs) that best fit the involved categories have also been estimated. Conversely, these considerations are not necessary for mapping procedures based on the discriminant functions of an SVM. This work has confirmed the capabilities of the probabilistic information fusion procedure under these circumstances.
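A minimal sketch of the two ingredients named above, assuming binary change/no-change categories and a naive conditional-independence fusion rule (the paper's exact weighting scheme is not given in the abstract):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log 0 is taken as 0
    return float(-np.sum(p * np.log2(p)))

def bayes_fuse(lik_change, lik_nochange, prior_change=0.5):
    """Posterior probability of 'change' after fusing per-CDI
    likelihoods, assuming the CDIs are conditionally independent
    given the class (an illustrative simplification)."""
    pc = prior_change * float(np.prod(lik_change))
    pn = (1.0 - prior_change) * float(np.prod(lik_nochange))
    return pc / (pc + pn)
```

A CDI whose class-conditional histogram has low entropy carries more information about the change class, which motivates weighting it more heavily before fusion.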

  15. Camouflaged target detection based on polarized spectral features

    NASA Astrophysics Data System (ADS)

    Tan, Jian; Zhang, Junping; Zou, Bin

    2016-05-01

    Polarized hyperspectral images (PHSI) contain polarization, spectral, spatial, and radiometric features, providing more information about objects and scenes than traditional intensity or spectral images. Polarization can suppress the background and highlight the object, offering high potential to improve camouflaged target detection, so polarized hyperspectral imaging has attracted extensive attention in recent years. Detection methods for PHSI are still immature; most are rooted in hyperspectral image detection, and before these algorithms are applied, the Stokes vector is first used to process the original four-dimensional polarized hyperspectral data. However, when the data are large and complex, the computational cost and error increase. In this paper, tensors are applied to reconstruct the original four-dimensional data into new three-dimensional data, and constrained energy minimization (CEM) is then used to process the new data, incorporating the polarization information into a polarized spectral filter operator that takes full advantage of both spectral and polarization information. This approach handles the original data without extracting the Stokes vector, greatly reducing computation and error. The experimental results also show that the proposed method is well suited to target detection in PHSI.
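The CEM filter itself has a closed form: for a target signature d and sample correlation matrix R of the pixel spectra, w = R^{-1} d / (d^T R^{-1} d) minimizes the average filter output energy subject to the unit-gain constraint w^T d = 1. A minimal NumPy sketch on synthetic data (the paper's tensor reconstruction and polarization-augmented operator are omitted):

```python
import numpy as np

def cem_filter(X, d, eps=1e-6):
    """Constrained Energy Minimization detector.

    X : (N, B) array of N pixel spectra with B bands.
    d : (B,) target signature.
    The filter w = R^-1 d / (d^T R^-1 d), with R the sample
    correlation matrix of X, minimizes average output energy
    subject to w^T d = 1, so target-like pixels score near 1.
    """
    R = X.T @ X / X.shape[0]           # sample correlation matrix
    R = R + eps * np.eye(R.shape[0])   # small ridge for invertibility
    Rinv_d = np.linalg.solve(R, d)
    w = Rinv_d / (d @ Rinv_d)          # unit gain on the target signature
    return X @ w                       # per-pixel detector output
```

By construction, a pixel whose spectrum equals d produces an output of exactly 1, while background pixels are suppressed toward 0.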

  16. Effect of the chlorinated washing of minimally processed vegetables on the generation of haloacetic acids.

    PubMed

    Cardador, Maria Jose; Gallego, Mercedes

    2012-07-25

    Chlorine solutions are usually used to sanitize fruit and vegetables in the fresh-cut industry due to their efficacy, low cost, and simple use. However, disinfection byproducts such as haloacetic acids (HAAs) can be formed during this process, which can remain on minimally processed vegetables (MPVs). These compounds are toxic and/or carcinogenic and have been associated with human health risks; therefore, the U.S. Environmental Protection Agency has set a maximum contaminant level for five HAAs at 60 μg/L in drinking water. This paper describes the first method to determine the nine HAAs that can be present in MPV samples, with static headspace coupled with gas chromatography-mass spectrometry where the leaching and derivatization of the HAAs are carried out in a single step. The proposed method is sensitive, with limits of detection between 0.1 and 2.4 μg/kg and an average relative standard deviation of ∼8%. From the samples analyzed, we can conclude that about 23% of them contain at least two HAAs (<0.4-24 μg/kg), which showed that these compounds are formed during washing and then remain on the final product.

  17. Biochemical study of leaf browning in minimally processed leaves of lettuce (Lactuca sativa L. var. acephala).

    PubMed

    Degl'Innocenti, E; Guidi, L; Pardossi, A; Tognoni, F

    2005-12-28

    A series of biochemical parameters, including the concentration of total ascorbic acid (ASA(tot)) and the activities of phenylalanine ammonia lyase (PAL), polyphenol oxidase (PPO), and peroxidases (PODs), was investigated during cold storage (72 h at 4 degrees C in the dark) in fresh-cut (minimally processed) leaves of two lettuce (Lactuca sativa L. var. acephala) cultivars differing in the susceptibility to tissue browning: Green Salade Bowl (GSB), susceptible, and Red Salade Bowl (RSB), resistant. The two cultivars showed differences also at the biochemical level. The content in ASA(tot) increased in RSB, as a consequence of increased DHA concentration; conversely, ASA(tot) diminished in GSB, in which ASA was not detectable after 72 h of storage, thus suggesting a disappearance of ascorbate (both ASA and DHA) into nonactive forms. The antioxidant capacity (as determined by using FRAP analysis) decreased significantly during storage in RSB, while a strong increase was observed in GSB. PAL activity increased soon after processing reaching a maximum by 3 h, then it declined to a relatively constant value in RSB, while in GSB it showed a tendency to decrease in the first few hours from harvest and processing. POD activity, at least for chlorogenic acid, increased significantly during storage only in GSB.

  18. UV gated Raman spectroscopy for standoff detection of explosives

    NASA Astrophysics Data System (ADS)

    Gaft, M.; Nagli, L.

    2008-07-01

    Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called improvised explosive devices (IEDs). It is recognized that the only method potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. The LDS technique belongs to trace detection, namely its micro-particle variety. It is based on the commonly held belief that surface contamination is very difficult to avoid and can be exploited for standoff detection. We applied gated Raman spectroscopy to the detection of the main explosive materials, both factory-made and homemade. We developed and tested a Raman system for the remote field detection and identification of minimal amounts of explosives on relevant surfaces at a distance of up to 30 m.

  19. SU-E-T-497: Semi-Automated in Vivo Radiochromic Film Dosimetry Using a Novel Image Processing Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyhan, M; Yue, N

    Purpose: To validate an automated image processing algorithm designed to detect the center of radiochromic film used for in vivo film dosimetry against the current gold standard of manual selection. Methods: An image processing algorithm was developed to automatically select the region of interest (ROI) in *.tiff images that contain multiple pieces of radiochromic film (0.5 × 1.3 cm²). After a user has linked a calibration file to the processing algorithm and selected a *.tiff file for processing, an ROI is automatically detected for all films by a combination of thresholding and erosion, which removes edges and any additional markings for orientation. Calibration is applied to the mean pixel values from the ROIs and a *.tiff image is output displaying the original image with an overlay of the ROIs and the measured doses. Validation of the algorithm was determined by comparing in vivo dose determined using the current gold standard (manually drawn ROIs) versus automated ROIs for n=420 scanned films. Bland-Altman analysis, paired t-test, and linear regression were performed to demonstrate agreement between the processes. Results: The measured doses ranged from 0.2-886.6 cGy. Bland-Altman analysis of the two techniques (automatic minus manual) revealed a bias of -0.28 cGy and a 95% confidence interval of (5.5 cGy, -6.1 cGy). These values demonstrate excellent agreement between the two techniques. Paired t-test results showed no statistical differences between the two techniques, p=0.98. Linear regression with a forced zero intercept demonstrated that Automatic=0.997*Manual, with a Pearson correlation coefficient of 0.999. The minimal differences between the two techniques may be explained by the fact that the hand-drawn ROIs were not identical to the automatically selected ones. The average processing time was 6.7 seconds in Matlab on an Intel Core 2 Duo processor. Conclusion: An automated image processing algorithm has been developed and validated, which will help minimize user interaction and processing time of radiochromic film used for in vivo dosimetry.
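The thresholding-and-erosion step can be sketched as follows, assuming film that scans darker than the scanner background (a simplified NumPy stand-in for the authors' Matlab pipeline; the calibration from mean pixel value to dose is omitted):

```python
import numpy as np

def erode(mask, iters=1):
    """Simple 4-neighbour binary erosion; image borders are eroded."""
    mask = mask.copy()
    for _ in range(iters):
        m = np.zeros_like(mask)
        m[1:-1, 1:-1] = (mask[1:-1, 1:-1]
                         & mask[:-2, 1:-1] & mask[2:, 1:-1]
                         & mask[1:-1, :-2] & mask[1:-1, 2:])
        mask = m
    return mask

def film_roi_mean(img, thresh, erosion_iters=2):
    """Mean pixel value inside the film region after thresholding and
    erosion. Erosion trims film edges and thin orientation markings,
    so the mean is taken over clean interior pixels only."""
    mask = img < thresh                # exposed film is darker than background
    mask = erode(mask, erosion_iters)
    return float(img[mask].mean())
```

On a synthetic scan with a dark marking along one film edge, the erosion removes the marking so the ROI mean reflects only the uniform film interior.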

  20. A high performance parallel computing architecture for robust image features

    NASA Astrophysics Data System (ADS)

    Zhou, Renyan; Liu, Leibo; Wei, Shaojun

    2014-03-01

    A design of a parallel architecture for image feature detection and description is proposed in this article. The major component of this architecture is a 2D cellular network composed of simple reprogrammable processors, implementing the Hessian blob detector and Haar response calculation, which are the most computing-intensive stages of the Speeded Up Robust Features (SURF) algorithm. Combining this 2D cellular network with dedicated hardware for SURF descriptors, the architecture achieves real-time image feature detection with minimal software in the host processor. A prototype FPGA implementation of the proposed architecture achieves 1318.9 GOPS of general pixel processing at a 100 MHz clock and up to 118 fps in VGA (640 × 480) image feature detection. The proposed architecture is stand-alone and scalable, so it can easily be migrated to a VLSI implementation.
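The core response computed by the Hessian blob detector is the determinant of the Hessian at each pixel. SURF hardware approximates the second derivatives with box filters evaluated over integral images for speed, but a direct finite-difference sketch conveys the idea:

```python
import numpy as np

def hessian_determinant(img):
    """Determinant of the (finite-difference) Hessian at each pixel.
    SURF proper approximates Lxx, Lyy, Lxy with box filters over
    integral images; np.gradient is used here for clarity only."""
    Ly, Lx = np.gradient(img)          # first derivatives (rows, cols)
    Lyy, Lyx = np.gradient(Ly)         # second derivatives of Ly
    Lxy, Lxx = np.gradient(Lx)         # second derivatives of Lx
    return Lxx * Lyy - Lxy * Lyx       # blob response map
```

Local maxima of this response mark blob-like structures; for a synthetic Gaussian blob, the response peaks exactly at the blob center.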

  1. Automated detection of optical counterparts to GRBs with RAPTOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wozniak, P. R.; Vestrand, W. T.; Evans, S.

    2006-05-19

    The RAPTOR system (RAPid Telescopes for Optical Response) is an array of several distributed robotic telescopes that automatically respond to GCN localization alerts. Raptor-S is a 0.4-m telescope with a 24 arcmin field of view employing a 1k x 1k Marconi CCD detector, and has already detected prompt optical emission from several GRBs within the first minute of the explosion. We present a real-time data analysis and alert system for automated identification of optical transients in Raptor-S GRB response data down to the sensitivity limit of ~19 mag. Our custom data processing pipeline is designed to minimize the time required to reliably identify transients and extract actionable information. The system utilizes a networked PostgreSQL database server for catalog access and distributes email alerts with successful detections.

  2. Signal processing for the detection of explosive residues on varying substrates using laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Morton, Kenneth D., Jr.; Torrione, Peter A.; Collins, Leslie

    2011-05-01

    Laser-induced breakdown spectroscopy (LIBS) can provide rapid, minimally destructive chemical analysis of substances with the benefit of little to no sample preparation. LIBS is therefore a viable technology for the detection of substances of interest in near real-time fielded remote sensing scenarios. Of particular interest to military and security operations is the detection of explosive residues on various surfaces. It has been demonstrated that LIBS is capable of detecting such residues; however, the surface or substrate on which the residue is present can alter the observed spectra. Standard chemometric techniques such as principal components analysis and partial least squares discriminant analysis have previously been applied to explosive residue detection, but the classifiers developed on such data perform best against residue/substrate pairs that were included in model training and do not perform well when the residue/substrate pairs are absent from the training set: residues in the training set may not be correctly detected if they are presented on a previously unseen substrate. In this work, we explicitly model the LIBS spectra resulting from the residue and the substrate to separate the response of the two components. This separation is performed jointly with classifier design to ensure that the resulting classifier can detect residues of interest without being confused by variations in the substrates. We demonstrate that the proposed classification algorithm provides improved robustness to substrate variation compared to standard chemometric techniques for residue detection.
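
    The standard chemometric baseline mentioned above can be sketched compactly. A minimal PCA-plus-nearest-centroid classifier in Python (numpy assumed); this is the baseline being improved upon, not the authors' joint residue/substrate model, and the spectra are synthetic stand-ins, not LIBS data:

```python
import numpy as np

def pca_fit(X, n_components):
    """PCA via SVD: returns the mean and the leading components."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def pca_transform(X, mean, components):
    return (X - mean) @ components.T

rng = np.random.default_rng(0)
# Two synthetic "residue" classes: same noise level, shifted spectral peak.
grid = np.arange(50)
peak_a = np.exp(-((grid - 15) ** 2) / 8.0)
peak_b = np.exp(-((grid - 35) ** 2) / 8.0)
X = np.vstack([peak_a + 0.05 * rng.standard_normal((20, 50)),
               peak_b + 0.05 * rng.standard_normal((20, 50))])
y = np.array([0] * 20 + [1] * 20)

mean, comps = pca_fit(X, 2)
Z = pca_transform(X, mean, comps)
centroids = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])
# Classify each spectrum by its nearest class centroid in PCA space.
pred = np.argmin(((Z[:, None, :] - centroids) ** 2).sum(-1), axis=1)
```

The paper's point is that such a classifier entangles residue and substrate signatures; modeling the two contributions separately is what restores robustness on unseen substrates.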

  3. DCL System Using Deep Learning Approaches for Land-based or Ship-based Real-Time Recognition and Localization of Marine Mammals

    DTIC Science & Technology

    2012-09-30

    platform (HPC) was developed, called the HPC-Acoustic Data Accelerator, or HPC-ADA for short. The HPC-ADA was designed based on fielded systems [1-4...software (Detection cLassification for MAchine learning - High Performance Computing). The software package was designed to utilize parallel and...Sedna [7] and is designed using a parallel architecture, allowing existing algorithms to distribute to the various processing nodes with minimal changes

  4. Minimal disease detection of B-cell lymphoproliferative disorders by flow cytometry: multidimensional cluster analysis.

    PubMed

    Duque, Ricardo E

    2012-04-01

    Flow cytometric analysis of cell suspensions involves the sequential 'registration' of intrinsic and extrinsic parameters of thousands of cells in list mode files. Thus, it is almost irresistible to describe phenomena in numerical terms or by 'ratios' that have the appearance of 'accuracy' due to the presence of numbers obtained from thousands of cells. The concepts involved in the detection and characterization of B cell lymphoproliferative processes are revisited in this paper by identifying parameters that, when analyzed appropriately, are both necessary and sufficient. The neoplastic process (cluster) can be visualized easily because the parameters that distinguish it form a cluster in multidimensional space that is unique and distinguishable from neighboring clusters that are not of diagnostic interest but serve to provide a background. For B cell neoplasia it is operationally necessary to identify the multidimensional space occupied by a cluster whose kappa:lambda ratio is 100:0 or 0:100. Thus, the concept of kappa:lambda ratio is without meaning and would not detect B cell neoplasia in an unacceptably high number of cases.

  5. Detection of bacteria in platelet concentrates prepared from spiked single donations using cultural and molecular genetic methods.

    PubMed

    Störmer, M; Cassens, U; Kleesiek, K; Dreier, J

    2007-02-01

    Bacteria show differences in their growth kinetics depending on the type of blood component. During storage at 22 °C, platelet concentrates (PCs) seem to be more prone to bacterial multiplication than red cell concentrates. Knowledge of the potential for bacterial proliferation in blood components, which are stored at a range of temperatures, is essential before considering implementation of a detection strategy. The efficacy of bacterial detection was determined, using real-time reverse transcriptase-polymerase chain reaction (RT-PCR), following bacterial growth in blood components obtained from a deliberately contaminated whole-blood (WB) unit. Cultivation was used as the reference method. WB was spiked with 2 colony-forming units per mL of Staphylococcus epidermidis or Klebsiella pneumoniae, kept for 15 h at room temperature, and then processed into components. Samples were drawn from each blood component at intervals throughout the separation process. Nucleic acids were extracted using an automated high-volume extraction method. The 15-h storage revealed an insignificant increase in bacterial titre. No bacterial growth was detected in red blood cell or plasma units. K. pneumoniae showed rapid growth in the pooled PC and could be detected immediately after preparation using RT-PCR. S. epidermidis grew slowly and was detected 24 h after separation. These experiments show that, to minimize sampling error, sampling is informative at the earliest 24 h after PC preparation.

  6. Determination of fat-soluble vitamins in vegetable oils through microwave-assisted high-performance liquid chromatography.

    PubMed

    Carballo, Silvia; Prats, Soledad; Maestre, Salvador; Todolí, José-Luis

    2015-04-01

    In this manuscript, a study of the effect of microwave radiation on the high-performance liquid chromatography separation of tocopherols and vitamin K1 was conducted. The novelty of the application was the use of a relatively low-polarity mobile phase, in which the dielectric heating effect was minimized, to evaluate the nonthermal effect of the microwave radiation on the separation process. The results show that microwave-assisted high-performance liquid chromatography reduced the analysis time from 31.5 to 13.3 min when the lowest microwave power was used. Moreover, narrower peaks were obtained; hence the separation was more efficient, maintaining or even increasing the resolution between the peaks. This result confirms that the increase in mobile phase temperature is not the only variable improving the separation process; other nonthermal processes must also intervene. Fluorescence detection demonstrated a better signal-to-noise ratio than photodiode array detection, mainly due to the independent effect of microwave pulses on the baseline noise, but photodiode array detection was finally chosen as it allowed simultaneous detection of nonfluorescent compounds. Finally, the content of the vitamin E homologs was determined in different vegetable oils. The results were consistent with those found in the literature. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Evaluation of low-dose irradiation on microbiological quality of white carrots and string beans

    NASA Astrophysics Data System (ADS)

    Koike, Amanda C. R.; Santillo, Amanda G.; Rodrigues, Flávio T.; Duarte, Renato C.; Villavicencio, Anna Lucia C. H.

    2012-08-01

    Minimally processed foods provide the consumer with quality, safety and practicality. However, minimal processing does not reduce the pathogenic microbial population to safe levels. Ionizing radiation used at low doses is effective in maintaining the quality of food, reducing the microbiological load without compromising the nutritional value and sensory properties. The association of minimal processing with irradiation could improve the quality and safety of the product. The purpose of this study was to evaluate the effectiveness of low doses of ionizing radiation in reducing microorganisms in minimally processed foods. The results show that ionizing radiation could decontaminate minimally processed vegetables without severe changes in their properties.

  8. Compact USB-powered mobile ELISA-based pathogen detection: design and implementation challenges

    NASA Astrophysics Data System (ADS)

    Starodubov, Dmitry; Asanbaeva, Anya; Berezhnyy, Ihor; Chao, Chung-Yen; Koziol, Richard; Miller, David; Patton, Edward; Trehan, Sushma; Ulmer, Chris

    2011-05-01

    Physical Optics Corporation (POC) presents a novel mobile ELISA-based pathogen detection system built around a disposable microfluidic chip for multiple-threat detection and a highly sensitive portable microfluidic fluorescence measurement unit that also controls the flow of samples and reagents through the microfluidic channels of the chip. The fluorescence detection subsystem is composed of a commercial 635-nm diode laser, an avalanche photodiode (APD) that measures fluorescence, and three filtering mirrors that provide more than 100 dB of excitation line suppression in the signal detection channel. Special techniques to suppress the fluorescence and scattering background allow the dynamic range to be optimized in a compact package. Concentrations below 100 ng/mL can be reliably identified. The entire instrument is powered from a USB port of a notebook PC and operates as a plug-and-play human-interface device, resulting in a truly peripheral biosensor. The operation of the system is fully automated, with minimal user intervention throughout the detection process. The design and implementation challenges that were resolved are presented in detail in this publication.

  9. Independent component analysis (ICA) and self-organizing map (SOM) approach to multidetection system for network intruders

    NASA Astrophysics Data System (ADS)

    Abdi, Abdi M.; Szu, Harold H.

    2003-04-01

    With the growing rate of interconnection among computer systems, network security is becoming a real challenge. An Intrusion Detection System (IDS) is designed to protect the availability, confidentiality and integrity of critical network information systems. Today's approach to network intrusion detection involves the use of rule-based expert systems to identify indications of known attacks or anomalies. However, these techniques are less successful in identifying today's attacks. Hackers are perpetually inventing new and previously unanticipated techniques to compromise information infrastructure. This paper proposes a dynamic way of detecting network intruders on time-series data. The proposed approach consists of a two-step process. First, we obtain an efficient multi-user detection method, employing the recently introduced complexity minimization approach as a generalization of standard ICA. Second, we identify an unsupervised learning neural network architecture based on Kohonen's Self-Organizing Map for potential functional clustering. These two steps working together adaptively will provide a pseudo-real-time novelty detection attribute to supplement the current intrusion detection statistical methodology.
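
    The second step above can be sketched in a few lines. A minimal Kohonen self-organizing map for novelty flagging in Python (numpy only; the feature vectors are synthetic stand-ins, not network traffic, and a high quantization error standing in for "novelty" is an assumption of this sketch):

```python
import numpy as np

def train_som(data, grid=(4, 4), epochs=200, lr0=0.5, sigma0=1.5, seed=0):
    """Train a tiny Kohonen SOM with exponentially decaying
    learning rate and neighbourhood radius."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    w = rng.standard_normal((rows, cols, data.shape[1]))
    ry, rx = np.mgrid[0:rows, 0:cols]
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)
        sigma = sigma0 * np.exp(-t / epochs)
        for x in data[rng.permutation(len(data))]:
            d = ((w - x) ** 2).sum(axis=2)
            by, bx = np.unravel_index(np.argmin(d), d.shape)  # best unit
            h = np.exp(-((ry - by) ** 2 + (rx - bx) ** 2) / (2 * sigma ** 2))
            w += lr * h[..., None] * (x - w)
    return w

def quantization_error(data, w):
    """Mean distance from each vector to its nearest map unit."""
    flat = w.reshape(-1, w.shape[-1])
    return np.mean([np.min(((flat - x) ** 2).sum(1)) ** 0.5 for x in data])

rng = np.random.default_rng(1)
normal = rng.standard_normal((100, 3)) * 0.3   # "normal" traffic cluster
som = train_som(normal)
# A far-away vector maps poorly onto the trained map: a novelty flag.
novel = np.array([5.0, 5.0, 5.0])
err_normal = quantization_error(normal, som)
err_novel = quantization_error(novel[None], som)
```

Traffic that the map has clustered well yields a small quantization error; an unfamiliar pattern yields a large one, which is the novelty-detection attribute the abstract refers to.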

  10. Optimizing Urine Processing Protocols for Protein and Metabolite Detection.

    PubMed

    Siddiqui, Nazema Y; DuBois, Laura G; St John-Williams, Lisa; Thompson, J Will; Grenier, Carole; Burke, Emily; Fraser, Matthew O; Amundsen, Cindy L; Murphy, Susan K

    In urine, factors such as the timing of voids and the duration at room temperature (RT) may affect the quality of recovered protein and metabolite data. Additives may aid with detection, but can add complexity to sample collection or analysis. We aimed to identify the optimal processing protocol for clinically obtained urine samples that allows the highest protein and metabolite yields with minimal degradation. Healthy women provided multiple urine samples during the same day. Women collected their first morning (1st AM) void and another "random void". Random voids were aliquoted with: 1) no additive; 2) boric acid (BA); 3) protease inhibitor (PI); or 4) both BA + PI. Of these aliquots, some were immediately stored at 4°C, and some were left at RT for 4 hours. Proteins and individual metabolites were quantified, normalized to creatinine concentrations, and compared across processing conditions. Sample pools corresponding to each processing condition were analyzed using mass spectrometry to assess protein degradation. Ten Caucasian women between 35 and 65 years of age provided paired 1st AM and random voided urine samples. Normalized protein concentrations were slightly higher in 1st AM voids compared to random "spot" voids. The addition of BA did not significantly change protein yields, while PI significantly improved normalized protein concentrations, regardless of whether samples were immediately cooled or left at RT for 4 hours. In pooled samples, there were minimal differences in protein degradation under the various conditions we tested. In metabolite analyses, there were significant differences in individual amino acids based on the timing of the void. For comparative translational research using urine, information about void timing should be collected and standardized. For urine samples processed in the same day, BA does not appear to be necessary, while the addition of PI enhances protein yields, regardless of 4°C or RT storage temperature.

  11. High microbial loads found in minimally-processed sliced mushrooms from Italian market.

    PubMed

    Jiang, Haiyang; Miraglia, Dino; Ranucci, David; Donnini, Domizia; Roila, Rossana; Branciari, Raffaella; Li, Cheng

    2018-03-31

    There is an increased consumer interest in minimally processed vegetables that has led to the development of products, such as pre-cut sliced mushrooms. Few data are available on the hygienic condition and the presence of foodborne pathogens in such products. Therefore, the current study aimed to evaluate the safety and hygienic characteristics of both ready-to-eat and ready-to-cook, pre-cut sliced mushrooms obtained from a local Italian market. For the evaluation of the hygienic condition, the aerobic mesophilic bacteria, aerobic psychrotrophic bacteria and Escherichia coli enumerations were performed. Salmonella spp., Listeria monocytogenes and Campylobacter spp. were considered in the assessment of the foodborne pathogens. High microbial loads were detected, including counts higher than 5 log CFU/g for E. coli and 6 log CFU/g for the other bacteria counts considered, but no pathogens were found. Ready-to-eat and ready-to-cook products differed only for aerobic mesophilic counts (7.87 and 8.26 log CFU/g, respectively, P=0.003). Strategies to enhance the hygienic level of the mushrooms, particularly the ready-to-eat products, are needed.

  12. High microbial loads found in minimally-processed sliced mushrooms from Italian market

    PubMed Central

    Jiang, Haiyang; Miraglia, Dino; Ranucci, David; Donnini, Domizia; Roila, Rossana; Branciari, Raffaella; Li, Cheng

    2018-01-01

    There is an increased consumer interest in minimally processed vegetables that has led to the development of products, such as pre-cut sliced mushrooms. Few data are available on the hygienic condition and the presence of foodborne pathogens in such products. Therefore, the current study aimed to evaluate the safety and hygienic characteristics of both ready-to-eat and ready-to-cook, pre-cut sliced mushrooms obtained from a local Italian market. For the evaluation of the hygienic condition, the aerobic mesophilic bacteria, aerobic psychrotrophic bacteria and Escherichia coli enumerations were performed. Salmonella spp., Listeria monocytogenes and Campylobacter spp. were considered in the assessment of the foodborne pathogens. High microbial loads were detected, including counts higher than 5 log CFU/g for E. coli and 6 log CFU/g for the other bacteria counts considered, but no pathogens were found. Ready-to-eat and ready-to-cook products differed only for aerobic mesophilic counts (7.87 and 8.26 log CFU/g, respectively, P=0.003). Strategies to enhance the hygienic level of the mushrooms, particularly the ready-to-eat products, are needed. PMID:29732334

  13. A minimally processed dietary pattern is associated with lower odds of metabolic syndrome among Lebanese adults.

    PubMed

    Nasreddine, Lara; Tamim, Hani; Itani, Leila; Nasrallah, Mona P; Isma'eel, Hussain; Nakhoul, Nancy F; Abou-Rizk, Joana; Naja, Farah

    2018-01-01

    To (i) estimate the consumption of minimally processed, processed and ultra-processed foods in a sample of Lebanese adults; (ii) explore patterns of intakes of these food groups; and (iii) investigate the association of the derived patterns with cardiometabolic risk. Cross-sectional survey. Data collection included dietary assessment using an FFQ and biochemical, anthropometric and blood pressure measurements. Food items were categorized into twenty-five groups based on the NOVA food classification. The contribution of each food group to total energy intake (TEI) was estimated. Patterns of intakes of these food groups were examined using exploratory factor analysis. Multivariate logistic regression analysis was used to evaluate the associations of derived patterns with cardiometabolic risk factors. Greater Beirut area, Lebanon. Adults ≥18 years (n 302) with no prior history of chronic diseases. Of TEI, 36·53 and 27·10 % were contributed by ultra-processed and minimally processed foods, respectively. Two dietary patterns were identified: the 'ultra-processed' and the 'minimally processed/processed'. The 'ultra-processed' consisted mainly of fast foods, snacks, meat, nuts, sweets and liquor, while the 'minimally processed/processed' consisted mostly of fruits, vegetables, legumes, breads, cheeses, sugar and fats. Participants in the highest quartile of the 'minimally processed/processed' pattern had significantly lower odds for metabolic syndrome (OR=0·18, 95 % CI 0·04, 0·77), hyperglycaemia (OR=0·25, 95 % CI 0·07, 0·98) and low HDL cholesterol (OR=0·17, 95 % CI 0·05, 0·60). The study findings may be used for the development of evidence-based interventions aimed at encouraging the consumption of minimally processed foods.
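
    The odds ratios above come from multivariate logistic regression. As a toy illustration of the metric itself, a minimal sketch in Python computing an odds ratio with a Woolf 95 % confidence interval from a 2x2 table (numpy assumed; the counts are invented for illustration, not the study's data):

```python
import numpy as np

def odds_ratio(a, b, c, d):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo, hi = np.exp(np.log(or_) + np.array([-1.96, 1.96]) * se)
    return or_, lo, hi

# Hypothetical counts: metabolic syndrome by high vs. low adherence
# to a 'minimally processed' dietary pattern.
or_, lo, hi = odds_ratio(a=4, b=71, c=18, d=57)
```

An odds ratio below 1 whose confidence interval excludes 1, as reported for the 'minimally processed/processed' pattern, indicates significantly lower odds of the outcome in the exposed group.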

  14. Ionization-Enhanced Decomposition of 2,4,6-Trinitrotoluene (TNT) Molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Bin; Wright, David; Cliffel, David

    2011-01-01

    The unimolecular decomposition reaction of TNT can in principle be used to design ways to either detect or remove TNT from the environment. Here, we report the results of a density functional theory study of possible ways to lower the reaction barrier for this decomposition process by ionization, so that decomposition and/or detection can occur at room temperature. We find that ionizing TNT lowers the reaction barrier for the initial step of this decomposition. We further show that a similar effect can occur if a positive moiety is bound to the TNT molecule. The positive charge produces a pronounced electron redistribution and dipole formation in TNT with minimal charge transfer from TNT to the positive moiety.

  15. Ultra-Low Power Optical Sensor for Xylophagous Insect Detection in Wood.

    PubMed

    Perles, Angel; Mercado, Ricardo; Capella, Juan V; Serrano, Juan José

    2016-11-23

    The early detection of pests is key for the maintenance of high-value masterpieces and historical buildings made of wood. In this work, we present the detailed design of an ultra-low-power sensor device that permits continuous monitoring of the presence of termites and other xylophagous insects. The operating principle of the sensor is based on the variations of reflected light induced by the presence of termites, and specific processing algorithms that deal with the behavior of the electronics and the natural ageing of components. With a typical CR2032 lithium battery, the device lasts more than nine years, and is ideal for incorporation in more complex monitoring systems where maintenance tasks should be minimized.

  16. Ultra-Low Power Optical Sensor for Xylophagous Insect Detection in Wood

    PubMed Central

    Perles, Angel; Mercado, Ricardo; Capella, Juan V.; Serrano, Juan José

    2016-01-01

    The early detection of pests is key for the maintenance of high-value masterpieces and historical buildings made of wood. In this work, we present the detailed design of an ultra-low-power sensor device that permits continuous monitoring of the presence of termites and other xylophagous insects. The operating principle of the sensor is based on the variations of reflected light induced by the presence of termites, and specific processing algorithms that deal with the behavior of the electronics and the natural ageing of components. With a typical CR2032 lithium battery, the device lasts more than nine years, and is ideal for incorporation in more complex monitoring systems where maintenance tasks should be minimized. PMID:27886082

  17. Social Groups Prioritize Selective Attention to Faces: How Social Identity Shapes Distractor Interference

    PubMed Central

    Hill, LaBarron K.; Williams, DeWayne P.; Thayer, Julian F.

    2016-01-01

    Human faces automatically attract visual attention and this process appears to be guided by social group memberships. In two experiments, we examined how social groups guide selective attention toward in-group and out-group faces. Black and White participants detected a target letter among letter strings superimposed on faces (Experiment 1). White participants were less accurate on trials with racial out-group (Black) compared to in-group (White) distractor faces. Likewise, Black participants were less accurate on trials with racial out-group (White) compared to in-group (Black) distractor faces. However, this pattern of out-group bias was only evident under high perceptual load—when the task was visually difficult. To examine the malleability of this pattern of racial bias, a separate sample of participants were assigned to mixed-race minimal groups (Experiment 2). Participants assigned to groups were less accurate on trials with their minimal in-group members compared to minimal out-group distractor faces, regardless of race. Again, this pattern of out-group bias was only evident under high perceptual load. Taken together, these results suggest that social identity guides selective attention toward motivationally relevant social groups—shifting from out-group bias in the domain of race to in-group bias in the domain of minimal groups—when perceptual resources are scarce. PMID:27556646

  18. Fault Detection, Diagnosis, and Mitigation for Long-Duration AUV Missions with Minimal Human Intervention

    DTIC Science & Technology

    2014-09-30

    Duration AUV Missions with Minimal Human Intervention James Bellingham Monterey Bay Aquarium Research Institute 7700 Sandholdt Road Moss Landing...subsystem failures and environmental challenges. For example, should an AUV suffer the failure of one of its internal actuators, can that failure be...reduce the need for operator intervention in the event of performance anomalies on long-duration AUV deployments, - To allow the vehicle to detect

  19. Pulmonary nodule detection with digital projection radiography: an ex-vivo study on increased latitude post-processing.

    PubMed

    Biederer, Juergen; Gottwald, Tobias; Bolte, Hendrik; Riedel, Christian; Freitag, Sandra; Van Metter, Richard; Heller, Martin

    2007-04-01

    To evaluate increased image latitude post-processing of digital projection radiograms for the detection of pulmonary nodules. 20 porcine lungs were inflated inside a chest phantom, prepared with 280 solid nodules of 4-8 mm in diameter and examined with direct radiography (3.0x2.5 k detector, 125 kVp, 4 mAs). Nodule position and size were documented by CT controls and dissection. Four intact lungs served as negative controls. Image post-processing included standard tone scales and increased latitude with detail contrast enhancement (log-factors 1.0, 1.5 and 2.0). 1280 sub-images (512x512 pixel) were centred on nodules or controls, behind the diaphragm and over free parenchyma, randomized and presented to six readers. Confidence in the decision was recorded with a scale of 0-100%. Sensitivity and specificity for nodules behind the diaphragm were 0.87/0.97 at standard tone scale and 0.92/0.92 with increased latitude (log factor 2.0). The fraction of "not diagnostic" readings was reduced (from 208/1920 to 52/1920). As an indicator of increased detection confidence, the median of the ratings behind the diaphragm approached 100 and 0, respectively, and the inter-quartile width decreased (controls: p<0.001, nodules: p=0.239) at higher image latitude. Above the diaphragm, accuracy and detection confidence remained unchanged. Here, the sensitivity for nodules was 0.94 with a specificity from 0.96 to 0.97 (all p>0.05). Increased latitude post-processing has minimal effects on the overall accuracy, but improves the detection confidence for sub-centimeter nodules in the posterior recesses of the lung.
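
    The accuracy figures above follow from simple counts over the reader decisions. A minimal sketch in Python computing sensitivity and specificity from thresholded 0-100% confidence ratings (the ratings and the 50% threshold are hypothetical, not the study's data):

```python
def sensitivity_specificity(ratings, truth, threshold=50):
    """Threshold 0-100 confidence ratings into nodule calls and
    count them against ground truth (True = nodule present)."""
    calls = [r >= threshold for r in ratings]
    tp = sum(c and t for c, t in zip(calls, truth))
    tn = sum(not c and not t for c, t in zip(calls, truth))
    fp = sum(c and not t for c, t in zip(calls, truth))
    fn = sum(not c and t for c, t in zip(calls, truth))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical reader ratings for 4 nodule and 4 control sub-images.
ratings = [95, 80, 40, 70, 10, 5, 60, 20]
truth = [True, True, True, True, False, False, False, False]
sens, spec = sensitivity_specificity(ratings, truth)
```

Sweeping the threshold over the confidence scale is what trades sensitivity against specificity, which is why the study reports both per post-processing condition.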

  20. Effect of Processing on Silk-Based Biomaterials: Reproducibility and Biocompatibility

    PubMed Central

    Wray, Lindsay S.; Hu, Xiao; Gallego, Jabier; Georgakoudi, Irene; Omenetto, Fiorenzo G.; Schmidt, Daniel; Kaplan, David L.

    2012-01-01

    Silk fibroin has been successfully used as a biomaterial for tissue regeneration. In order to prepare silk fibroin biomaterials for human implantation a series of processing steps are required to purify the protein. Degumming to remove inflammatory sericin is a crucial step related to biocompatibility and variability in the material. Detailed characterization of silk fibroin degumming is reported. The degumming conditions significantly affected cell viability on the silk fibroin material and the ability to form three-dimensional porous scaffolds from the silk fibroin, but did not affect macrophage activation or β-sheet content in the materials formed. Methods are also provided to determine the content of residual sericin in silk fibroin solutions and to assess changes in silk fibroin molecular weight. Amino acid composition analysis was used to detect sericin residuals in silk solutions with a detection limit between 1.0% and 10% wt/wt, while fluorescence spectroscopy was used to reproducibly distinguish between silk samples with different molecular weights. Both methods are simple and require minimal sample volume, providing useful quality control tools for silk fibroin preparation processes. PMID:21695778

  1. Current and Prospective Methods for Plant Disease Detection

    PubMed Central

    Fang, Yi; Ramasamy, Ramaraja P.

    2015-01-01

    Food losses due to crop infections from pathogens such as bacteria, viruses and fungi are persistent issues in agriculture for centuries across the globe. In order to minimize the disease induced damage in crops during growth, harvest and postharvest processing, as well as to maximize productivity and ensure agricultural sustainability, advanced disease detection and prevention in crops are imperative. This paper reviews the direct and indirect disease identification methods currently used in agriculture. Laboratory-based techniques such as polymerase chain reaction (PCR), immunofluorescence (IF), fluorescence in-situ hybridization (FISH), enzyme-linked immunosorbent assay (ELISA), flow cytometry (FCM) and gas chromatography-mass spectrometry (GC-MS) are some of the direct detection methods. Indirect methods include thermography, fluorescence imaging and hyperspectral techniques. Finally, the review also provides a comprehensive overview of biosensors based on highly selective bio-recognition elements such as enzyme, antibody, DNA/RNA and bacteriophage as a new tool for the early identification of crop diseases. PMID:26287253

  2. A Fully Redundant On-Line Mass Spectrometer System Used to Monitor Cryogenic Fuel Leaks on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Griffin, Timothy P.; Naylor, Guy R.; Haskell, William D.; Breznik, Greg S.; Mizell, Carolyn A.; Helms, William R.; Voska, N. (Technical Monitor)

    2002-01-01

    An on-line gas monitoring system was developed to replace the older systems used to monitor for cryogenic leaks on the Space Shuttles before launch. The system uses a mass spectrometer to monitor multiple locations in the process, which allows the system to monitor all gas constituents of interest in a nearly simultaneous manner. The system is fully redundant and meets all requirements for ground support equipment (GSE). This includes ruggedness to withstand launch on the Mobile Launcher Platform (MLP), ease of operation, and minimal operator intervention. The system can be fully automated so that an operator is notified when an unusual situation or fault is detected. User inputs are through a personal computer using mouse and keyboard commands via a graphical user interface. Although developed for detecting cryogenic leaks, many other gas constituents could be monitored using the Hazardous Gas Detection System (HGDS) 2000.

  3. Assessing residual reasoning ability in overtly non-communicative patients using fMRI☆

    PubMed Central

    Hampshire, Adam; Parkin, Beth L.; Cusack, Rhodri; Espejo, Davinia Fernández; Allanson, Judith; Kamau, Evelyn; Pickard, John D.; Owen, Adrian M.

    2012-01-01

    It is now well established that some patients who are diagnosed as being in a vegetative state or a minimally conscious state show reliable signs of volition that may only be detected by measuring neural responses. A pertinent question is whether these patients are also capable of logical thought. Here, we validate an fMRI paradigm that can detect the neural fingerprint of reasoning processes and moreover, can confirm whether a participant derives logical answers. We demonstrate the efficacy of this approach in a physically non-communicative patient who had been shown to engage in mental imagery in response to simple auditory instructions. Our results demonstrate that this individual retains a remarkable capacity for higher cognition, engaging in the reasoning task and deducing logical answers. We suggest that this approach is suitable for detecting residual reasoning ability using neural responses and could readily be adapted to assess other aspects of cognition. PMID:24179769

  4. Intelligent Extruder

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eker, Alper; Giammattia, Mark; Houpt, Paul

    ''Intelligent Extruder'', described in this report, is a software system and associated support services for monitoring and control of compounding extruders to improve material quality and reduce waste and energy use, with minimal addition of new sensors or changes to the factory-floor system components. Emphasis is on process improvements to the mixing, melting and de-volatilization of base resins, fillers, pigments, fire retardants and other additives in the ''finishing'' stage of high-value-added engineering polymer materials. While GE Plastics materials were used for experimental studies throughout the program, the concepts and principles are broadly applicable to other manufacturers' materials. The project involved a joint collaboration among GE Global Research, GE Industrial Systems and Coperion Werner & Pfleiderer, USA, a major manufacturer of compounding equipment. The scope of the program included development of algorithms for monitoring process material viscosity without rheological sensors or generating waste streams, a novel scheme for rapid detection of process upsets, and an adaptive feedback control system to compensate for process upsets where at-line adjustments are feasible. Software algorithms were implemented and tested on a laboratory-scale extruder (50 lb/hr) at GE Global Research, and data from a production-scale system (2000 lb/hr) at GE Plastics was used to validate the monitoring and detection software. Although not evaluated experimentally, a new concept for extruder process monitoring through estimation of high-frequency drive torque without strain gauges is developed and demonstrated in simulation. A plan to commercialize the software system is outlined, but commercialization has not been completed.

  5. Method for universal detection of two-photon polarization entanglement

    NASA Astrophysics Data System (ADS)

    Bartkiewicz, Karol; Horodecki, Paweł; Lemr, Karel; Miranowicz, Adam; Życzkowski, Karol

    2015-03-01

    Detecting and quantifying quantum entanglement of a given unknown state poses problems that are fundamentally important for quantum information processing. Surprisingly, no direct (i.e., without quantum tomography) universal experimental implementation of a necessary and sufficient test of entanglement has been designed even for a general two-qubit state. Here we propose an experimental method for detecting a collective universal witness, which is a necessary and sufficient test of two-photon polarization entanglement. It allows us to detect entanglement for any two-qubit mixed state and to establish tight upper and lower bounds on its amount. A different element of this method is the sequential character of its main components, which allows us to obtain relatively complicated information about quantum correlations with the help of simple linear-optical elements. As such, this proposal realizes a universal two-qubit entanglement test within the present state of the art of quantum optics. We show the optimality of our setup with respect to the minimal number of measured quantities.
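    The collective witness above is an optical-measurement scheme; as a purely numerical baseline, the necessary-and-sufficient two-qubit condition it certifies coincides with the Peres-Horodecki positive-partial-transpose criterion, which is easy to check in software. A minimal NumPy sketch (illustrative, not the paper's linear-optical setup):

    ```python
    import numpy as np

    def partial_transpose(rho):
        """Partial transpose of a 4x4 two-qubit density matrix over the second qubit."""
        r = rho.reshape(2, 2, 2, 2)                    # indices (i, j, k, l) of |i j><k l|
        return r.transpose(0, 3, 2, 1).reshape(4, 4)   # swap the second-qubit bra/ket indices

    def is_entangled(rho, tol=1e-12):
        """Peres-Horodecki test: a two-qubit state is entangled iff its
        partial transpose has a negative eigenvalue."""
        return np.linalg.eigvalsh(partial_transpose(rho)).min() < -tol

    # Bell state |Phi+> = (|00> + |11>)/sqrt(2): maximally entangled.
    phi = np.zeros(4); phi[0] = phi[3] = 1 / np.sqrt(2)
    bell = np.outer(phi, phi)
    # Maximally mixed state: separable.
    mixed = np.eye(4) / 4
    ```

    For the Bell state the partial transpose has eigenvalue -1/2, so the test fires; for the maximally mixed state it is positive and the test correctly reports separability.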

  6. Method of evaluating, expanding, and collapsing connectivity regions within dynamic systems

    DOEpatents

    Bailey, David A [Schenectady, NY

    2004-11-16

    An automated process defines and maintains connectivity regions within a dynamic network. The automated process requires an initial input of a network component around which a connectivity region will be defined. The process automatically and autonomously generates a region around the initial input, stores the region's definition, and monitors the network for a change. Upon detecting a change in the network, its effect is evaluated, and if necessary the regions are adjusted and redefined to accommodate the change. Only those regions of the network affected by the change will be updated. This process eliminates the need for an operator to manually evaluate connectivity regions within a network. Since the automated process maintains the network, reliance on an operator is minimized, thus reducing the potential for operator error. This combination of region maintenance and reduced operator reliance results in a reduction of overall error.
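    A minimal sketch of the idea, under the simplifying assumption that a connectivity region is the connected component around a component of interest and that only regions touching a changed edge are re-evaluated (the class and method names are illustrative, not from the patent):

    ```python
    from collections import defaultdict

    def bfs_region(adj, seed):
        """Collect the connectivity region (connected component) containing `seed`."""
        seen, stack = {seed}, [seed]
        while stack:
            node = stack.pop()
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    stack.append(nbr)
        return frozenset(seen)

    class RegionMonitor:
        """Toy version of the patented scheme: regions are defined once, then only
        the regions touched by a network change are re-evaluated."""
        def __init__(self, edges):
            self.adj = defaultdict(set)
            for a, b in edges:
                self.adj[a].add(b)
                self.adj[b].add(a)
            self.regions = {n: bfs_region(self.adj, n) for n in list(self.adj)}

        def change_edge(self, a, b, present):
            """Add (present=True) or remove an edge, re-deriving only affected regions."""
            if present:
                self.adj[a].add(b); self.adj[b].add(a)
            else:
                self.adj[a].discard(b); self.adj[b].discard(a)
            affected = (self.regions.get(a, frozenset({a}))
                        | self.regions.get(b, frozenset({b})))
            for node in affected:            # untouched regions are left as-is
                self.regions[node] = bfs_region(self.adj, node)
    ```

    Regions outside `affected` are never recomputed, which is the point of the method: the cost of a change scales with the regions it touches, not with the whole network.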

  7. USER'S GUIDE: Strategic Waste Minimization Initiative (SWAMI) Version 2.0 - A Software Tool to Aid in Process Analysis for Pollution Prevention

    EPA Science Inventory

    The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0 is a tool for using process analysis for identifying waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...

  8. Standoff laser-based spectroscopy for explosives detection

    NASA Astrophysics Data System (ADS)

    Gaft, M.; Nagli, L.

    2007-10-01

    Real-time detection and identification of explosives at a standoff distance is a major issue in efforts to develop defenses against so-called Improvised Explosive Devices (IED). It is recognized that the only technique potentially capable of standoff detection of minimal amounts of explosives is laser-based spectroscopy. LDS activity is based on a combination of laser-based spectroscopic methods with orthogonal capabilities. Our technique belongs to trace detection, namely to its micro-particle variety. It is based on the commonly held belief that surface contamination is very difficult to avoid and can be exploited for standoff detection. We have applied optical techniques including gated Raman and time-resolved luminescence spectroscopy for detection of the main explosive materials, both factory-made and homemade. We developed and tested a Raman system for the field remote detection and identification of minimal amounts of explosives on relevant surfaces at a distance of up to 30 meters.

  9. Revealing the Effects of Nanoscale Membrane Curvature on Lipid Mobility.

    PubMed

    Kabbani, Abir Maarouf; Woodward, Xinxin; Kelly, Christopher V

    2017-10-18

    Recent advances in nanoengineering and super-resolution microscopy have enabled new capabilities for creating and observing membrane curvature. However, the effects of curvature on single-lipid diffusion have yet to be revealed. The simulations presented here describe the capabilities of varying experimental methods for revealing the effects of nanoscale curvature on single-molecule mobility. Traditionally, lipid mobility is revealed through fluorescence recovery after photobleaching (FRAP), fluorescence correlation spectroscopy (FCS), and single particle tracking (SPT). However, these techniques vary greatly in their ability to detect the effects of nanoscale curvature on lipid behavior. Traditionally, FRAP and FCS depend on diffraction-limited illumination and detection. A simulation of FRAP shows minimal effects on lipid diffusion due to a 50 nm radius membrane bud. Throughout the stages of the budding process, FRAP detected minimal changes in lipid recovery time due to the curvature versus a flat membrane. Simulated FCS demonstrated small effects due to a 50 nm radius membrane bud that were more apparent with curvature-dependent lipid mobility changes. However, SPT achieves a sub-diffraction-limited resolution of membrane budding and lipid mobility through the identification of single-lipid positions with ≤15 nm spatial and ≤20 ms temporal resolution. By mapping the single-lipid step lengths to locations on the membrane, the effects of membrane topography and curvature can be correlated to the effective membrane viscosity. Single-fluorophore localization techniques, such as SPT, can detect membrane curvature and its effects on lipid behavior. These simulations and discussion provide a guideline for optimizing the experimental procedures for revealing the effects of curvature on lipid mobility and effective local membrane viscosity.
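    As an illustration of the SPT analysis described above, single-lipid step lengths can be converted into an effective diffusion coefficient via the 2D relation <r^2> = 4*D*dt. A self-contained sketch with synthetic Brownian steps (the parameter values are illustrative, not from the paper):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_track(D, dt, n_steps):
        """2D Brownian track: each step is Gaussian with variance 2*D*dt per axis."""
        steps = rng.normal(scale=np.sqrt(2 * D * dt), size=(n_steps, 2))
        return np.cumsum(steps, axis=0)

    def estimate_D(track, dt):
        """Estimate D from single-particle-tracking step lengths via <r^2> = 4*D*dt."""
        steps = np.diff(track, axis=0)
        msd = (steps ** 2).sum(axis=1).mean()   # mean squared single-step displacement
        return msd / (4 * dt)

    track = simulate_track(D=1.0, dt=0.02, n_steps=20000)  # D in um^2/s, dt in s
    D_hat = estimate_D(track, dt=0.02)
    ```

    Mapping such per-step estimates to positions on a curved membrane is what lets SPT resolve curvature-dependent mobility that FRAP and FCS average away.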

  10. The detectability half-life in arthropod predator-prey research: what it is, why we need it, how to measure it, and how to use it.

    PubMed

    Greenstone, Matthew H; Payton, Mark E; Weber, Donald C; Simmons, Alvin M

    2014-08-01

    Molecular gut-content analysis enables detection of arthropod predation with minimal disruption of ecosystem processes. Most assays produce only qualitative results, with each predator testing either positive or negative for target prey remains. Nevertheless, they have yielded important insights into community processes. For example, they have confirmed the long-hypothesized role of generalist predators in retarding early-season build-up of pest populations prior to the arrival of more specialized predators and parasitoids and documented the ubiquity of secondary and intraguild predation. However, raw qualitative gut-content data cannot be used to assess the relative impact of different predator taxa on prey population dynamics: they must first be weighted by the relative detectability periods for molecular prey remains for each predator-prey combination. If this is not carried out, interpretations of predator impact will be biased towards those with the longest detectabilities. We review the challenges in determining detectability half-lives, including unstated assumptions that have often been ignored in the performance of feeding trials. We also show how detectability half-lives can be used to properly weight assay data to rank predators by their importance in prey population suppression, and how sets of half-lives can be used to test hypotheses concerning predator ecology and physiology. We use data from 32 publications, comprising 97 half-lives, to generate and test hypotheses on taxonomic differences in detectability half-lives and discuss the possible role of the detectability half-life in interpreting qPCR and next-generation sequencing data. © 2013 John Wiley & Sons Ltd.
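    As a toy illustration of how a detectability half-life can be extracted from feeding-trial data, one common simplification is to fit an exponential decay to the proportion of predators testing positive over time since feeding and take t_1/2 = ln(2)/k. Real analyses often use probit or logistic fits; this sketch and its data are illustrative only:

    ```python
    import math

    def detectability_half_life(times, prop_positive):
        """Fit ln(p) = ln(p0) - k*t by least squares and return t_1/2 = ln(2)/k.
        `times` are hours post-feeding; `prop_positive` the fraction of predators
        still testing positive at each time point."""
        ys = [math.log(p) for p in prop_positive]
        n = len(times)
        xbar = sum(times) / n
        ybar = sum(ys) / n
        slope = (sum((x - xbar) * (y - ybar) for x, y in zip(times, ys))
                 / sum((x - xbar) ** 2 for x in times))
        return math.log(2) / -slope

    # Synthetic feeding-trial data with a true half-life of 8 h.
    times = [0, 4, 8, 12, 16, 24]
    props = [0.5 ** (t / 8) for t in times]
    ```

    Once half-lives are in hand, raw positives for each predator-prey pair can be divided by the pair's half-life to weight predators fairly, as the review recommends.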

  11. Model-based tomographic reconstruction

    DOEpatents

    Chambers, David H; Lehman, Sean K; Goodman, Dennis M

    2012-06-26

    A model-based approach to estimating wall positions for a building is developed and tested using simulated data. It borrows two techniques from geophysical inversion problems, layer stripping and stacking, and combines them with a model-based estimation algorithm that minimizes the mean-square error between the predicted signal and the data. The technique is designed to process multiple looks from an ultra-wideband radar array. The processed signal is time-gated, and each section is processed to detect the presence of a wall and estimate its position, thickness, and material parameters. The floor plan of a building is determined by moving the array around the outside of the building. In this paper we describe how the stacking and layer stripping algorithms are combined and show the results from a simple numerical example of three parallel walls.

  12. Intelligence algorithms for autonomous navigation in a ground vehicle

    NASA Astrophysics Data System (ADS)

    Petkovsek, Steve; Shakya, Rahul; Shin, Young Ho; Gautam, Prasanna; Norton, Adam; Ahlgren, David J.

    2012-01-01

    This paper will discuss the approach to autonomous navigation used by "Q," an unmanned ground vehicle designed by the Trinity College Robot Study Team to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2011 competition, Q's intelligence was upgraded in several different areas, resulting in a more robust decision-making process and a more reliable system. In 2010-2011, the software of Q was modified to operate in a modular parallel manner, with all subtasks (including motor control, data acquisition from sensors, image processing, and intelligence) running simultaneously in separate software processes using the National Instruments (NI) LabVIEW programming language. This eliminated processor bottlenecks and increased flexibility in the software architecture. Though overall throughput was increased, the long runtime of the image processing process (150 ms) reduced the precision of Q's real-time decisions. Q had slow reaction times to obstacles detected only by its cameras, such as white lines, and was limited to slow speeds on the course. To address this issue, the image processing software was simplified and also pipelined to increase the image processing throughput and minimize the robot's reaction times. The vision software was also modified to detect differences in the texture of the ground, so that specific surfaces (such as ramps and sand pits) could be identified. While previous iterations of Q failed to detect white lines that were not on a grassy surface, this new software allowed Q to dynamically alter its image processing state so that appropriate thresholds could be applied to detect white lines in changing conditions. In order to maintain an acceptable target heading, a path history algorithm was used to deal with local obstacle fields, and GPS waypoints were added to provide a global target heading. These modifications resulted in Q placing 5th in the autonomous challenge and 4th in the navigation challenge at IGVC.

  13. 24/7 security system: 60-FPS color EMCCD camera with integral human recognition

    NASA Astrophysics Data System (ADS)

    Vogelsong, T. L.; Boult, T. E.; Gardner, D. W.; Woodworth, R.; Johnson, R. C.; Heflin, B.

    2007-04-01

    An advanced surveillance/security system is being developed for unattended 24/7 image acquisition and automated detection, discrimination, and tracking of humans and vehicles. The low-light video camera incorporates an electron multiplying CCD sensor with a programmable on-chip gain of up to 1000:1, providing effective noise levels of less than 1 electron. The EMCCD camera operates in full color mode under sunlit and moonlit conditions, and monochrome under quarter-moonlight to overcast starlight illumination. Sixty frame per second operation and progressive scanning minimizes motion artifacts. The acquired image sequences are processed with FPGA-compatible real-time algorithms, to detect/localize/track targets and reject non-targets due to clutter under a broad range of illumination conditions and viewing angles. The object detectors that are used are trained from actual image data. Detectors have been developed and demonstrated for faces, upright humans, crawling humans, large animals, cars and trucks. Detection and tracking of targets too small for template-based detection is achieved. For face and vehicle targets the results of the detection are passed to secondary processing to extract recognition templates, which are then compared with a database for identification. When combined with pan-tilt-zoom (PTZ) optics, the resulting system provides a reliable wide-area 24/7 surveillance system that avoids the high life-cycle cost of infrared cameras and image intensifiers.

  14. Assessment of Data Fusion Algorithms for Earth Observation Change Detection Processes

    PubMed Central

    Molina, Iñigo; Martinez, Estibaliz; Morillo, Carmen; Velasco, Jesus; Jara, Alvaro

    2016-01-01

    In this work a parametric multi-sensor Bayesian data fusion approach and a Support Vector Machine (SVM) are used for a Change Detection problem. For this purpose two sets of SPOT5-PAN images have been used, which are in turn used for Change Detection Indices (CDIs) calculation. For minimizing radiometric differences, a methodology based on zonal “invariant features” is suggested. The choice of one or another CDI for a change detection process is a subjective task, as each CDI is probably more or less sensitive to certain types of changes. Likewise, this idea can be employed to create and improve a “change map”, which can be accomplished by means of the CDI's informational content. For this purpose, information metrics such as the Shannon Entropy and “Specific Information” have been used to weight the change and no-change categories contained in a certain CDI and thus introduced into the Bayesian information fusion algorithm. Furthermore, the parameters of the probability density functions (pdf's) that best fit the involved categories have also been estimated. Conversely, these considerations are not necessary for mapping procedures based on the discriminant functions of an SVM. This work has confirmed the capabilities of the probabilistic information fusion procedure under these circumstances. PMID:27706048
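    A small sketch of the entropy-based weighting idea: compute the Shannon entropy of each CDI's change/no-change distribution and derive fusion weights from it. The inverse-entropy weighting rule below is an illustrative assumption, not the paper's exact scheme:

    ```python
    import math

    def shannon_entropy(probs):
        """Shannon entropy H = -sum p*log2(p) of a discrete distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def fuse_weights(cdi_histograms):
        """Weight each change-detection index (CDI) by the informativeness of its
        change/no-change histogram: lower entropy suggests sharper separation,
        so (in this illustrative rule) it receives more weight."""
        inv = [1.0 / (shannon_entropy(h) + 1e-9) for h in cdi_histograms]
        total = sum(inv)
        return [w / total for w in inv]
    ```

    A CDI whose change/no-change split is 90/10 thus outweighs one that is uninformative at 50/50, and the normalized weights can feed a Bayesian combination step.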

  15. Artificial immune system via Euclidean Distance Minimization for anomaly detection in bearings

    NASA Astrophysics Data System (ADS)

    Montechiesi, L.; Cocconcelli, M.; Rubini, R.

    2016-08-01

    In recent years new diagnostic methodologies have emerged, with particular interest in machinery operating in non-stationary conditions. In fact, continuous speed changes and variable loads make spectrum analysis non-trivial: a variable speed means a variable characteristic fault frequency related to the damage, which is no longer recognizable in the spectrum. To overcome this problem the scientific community has proposed different approaches in two main categories: model-based approaches and expert systems. In this context the paper presents a simple expert system derived from the mechanisms of the immune system, called Euclidean Distance Minimization, and its application in a real case of bearing fault recognition. The proposed method is a simplification of the original process, adapted from the class of Artificial Immune Systems, which proved to be useful and promising in different application fields. Comparative results are provided, with a complete explanation of the algorithm and its functioning.
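    A minimal sketch of a Euclidean-distance-minimization classifier of this kind: a sample is assigned the label of its nearest "antibody" prototype, and flagged as an anomaly when even the nearest prototype is too far away. The prototypes, labels, and threshold below are illustrative, not from the paper:

    ```python
    import math

    def euclid(a, b):
        """Plain Euclidean distance between two feature vectors."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def classify(sample, prototypes, threshold):
        """Assign the label of the nearest prototype ('antibody'); if even the
        nearest one is farther than `threshold`, report an anomaly."""
        label, d = min(((lbl, euclid(sample, p)) for lbl, p in prototypes),
                       key=lambda t: t[1])
        return label if d <= threshold else "anomaly"

    # Hypothetical 2D feature vectors (e.g. normalized vibration features).
    protos = [("healthy", (0.1, 0.2)), ("outer-race fault", (0.9, 0.8))]
    ```

    Because the decision depends only on distances in feature space, the scheme needs no spectral fault frequencies, which is what makes it attractive under variable speed.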

  16. Fuzzy automata and pattern matching

    NASA Technical Reports Server (NTRS)

    Setzer, C. B.; Warsi, N. A.

    1986-01-01

    A wide-ranging search for articles and books concerned with fuzzy automata and syntactic pattern recognition is presented. A number of survey articles on image processing and feature detection are included. Hough's algorithm is presented to illustrate the way in which knowledge about an image can be used to interpret the details of the image. It was found that in hand-generated pictures, the algorithm worked well on following straight lines, but had great difficulty turning corners. An algorithm was developed which produces a minimal finite automaton recognizing a given finite set of strings. One difficulty of the construction is that, in some cases, this minimal automaton is not unique for a given set of strings and a given maximum length. This algorithm compares favorably with other inference algorithms. More importantly, the algorithm produces an automaton with a rigorously described relationship to the original set of strings that does not depend on the algorithm itself.
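    As a baseline for the automaton-inference idea, a prefix-tree acceptor for a finite set of strings can be built in a few lines; a genuinely minimal automaton would additionally merge equivalent suffix states (as in a DAWG), which this illustrative sketch omits:

    ```python
    def build_trie_acceptor(words):
        """Build a prefix-tree acceptor (trie) for a finite set of strings.
        Returns a transition table {state: {symbol: state}} and the accept set;
        state 0 is the start state."""
        trans = {0: {}}
        accept = set()
        next_state = 1
        for w in words:
            s = 0
            for ch in w:
                if ch not in trans[s]:
                    trans[s][ch] = next_state
                    trans[next_state] = {}
                    next_state += 1
                s = trans[s][ch]
            accept.add(s)
        return trans, accept

    def accepts(trans, accept, word):
        """Run the automaton on `word`; missing transitions mean rejection."""
        s = 0
        for ch in word:
            if ch not in trans[s]:
                return False
            s = trans[s][ch]
        return s in accept
    ```

    Merging states with identical right languages would turn this trie into the minimal acceptor the article describes; the non-uniqueness noted in the abstract arises when a maximum string length also constrains the construction.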

  17. The applications of deep neural networks to sdBV classification

    NASA Astrophysics Data System (ADS)

    Boudreaux, Thomas M.

    2017-12-01

    With several new large-scale surveys on the horizon, including LSST, TESS, ZTF, and Evryscope, faster and more accurate analysis methods will be required to adequately process the enormous amount of data produced. Deep learning, used in industry for years now, allows for advanced feature detection in minimally prepared datasets at very high speeds; however, despite the advantages of this method, its application to astrophysics has not yet been extensively explored. This dearth may be due to a lack of training data available to researchers. Here we generate synthetic data loosely mimicking the properties of acoustic-mode pulsating stars, and we show that two separate paradigms of deep learning - the Artificial Neural Network and the Convolutional Neural Network - can both be used to classify this synthetic data effectively. Additionally, this classification can be performed at relatively high levels of accuracy with minimal time spent adjusting network hyperparameters.

  18. Dialysis Extraction for Chromatography

    NASA Technical Reports Server (NTRS)

    Jahnsen, V. J.

    1985-01-01

    Chromatographic-sample pretreatment by dialysis detects traces of organic contaminants in water samples analyzed in field with minimal analysis equipment and minimal quantities of solvent. Technique also of value wherever aqueous sample and solvent must not make direct contact.

  19. A Sensitive DNA Capacitive Biosensor Using Interdigitated Electrodes

    PubMed Central

    Wang, Lei; Veselinovic, Milena; Yang, Lang; Geiss, Brian J.; Dandy, David S.; Chen, Tom

    2017-01-01

    This paper presents a label-free affinity-based capacitive biosensor using interdigitated electrodes. Using an optimized process of DNA probe preparation to minimize the effect of contaminants in commercial thiolated DNA probe, the electrode surface was functionalized with the 24-nucleotide DNA probes based on the West Nile virus sequence (Kunjin strain). The biosensor has the ability to detect complementary DNA fragments with a detection limit down to 20 DNA target molecules (1.5 aM range), making it suitable for a practical point-of-care (POC) platform for low target count clinical applications without the need for amplification. The reproducibility of the biosensor detection was improved with efficient covalent immobilization of purified single-stranded DNA probe oligomers on cleaned gold microelectrodes. In addition to the low detection limit, the biosensor showed a dynamic range of detection from 1 μL−1 to 105 μL−1 target molecules (20 to 2 million targets), making it suitable for sample analysis in a typical clinical application environment. The binding results presented in this paper were validated using fluorescent oligomers. PMID:27619528

  20. Development of electrochemical based sandwich enzyme linked immunosensor for Cryptosporidium parvum detection in drinking water.

    PubMed

    Thiruppathiraja, Chinnasamy; Saroja, Veerappan; Kamatchiammal, Senthilkumar; Adaikkappan, Periyakaruppan; Alagar, Muthukaruppan

    2011-10-01

    Cryptosporidium parvum is one of the most important biological contaminants in drinking water and poses significant risks to public health. Due to the low infectious dose of C. parvum, remarkably sensitive detection methods are required for water and food industry analysis. This study describes a simple, sensitive, enzyme-amplified sandwich electrochemical immunosensor using dual-labeled gold nanoparticles (alkaline phosphatase and anti-oocyst monoclonal antibody) on indium tin oxide (ITO) as an electrode to detect C. parvum. The biosensor was fabricated by immobilizing the anti-oocyst McAb on a gold nanoparticle functionalized ITO electrode, followed by the corresponding capture of analytes and the dual-labeled gold nanoparticle probe to detect the C. parvum target. The results show that the sensitivity of the electrochemical immunosensor is enhanced by the gold nanoparticles, with a limit of detection of 3 oocysts/mL in a minimal processing period. Our results demonstrate the sensitivity of the new approach compared to the customary method, and the immunosensor showed acceptable precision, reproducibility, and stability, and could be readily applied to multi-analyte determination for environmental monitoring.

  1. FPGA design for constrained energy minimization

    NASA Astrophysics Data System (ADS)

    Wang, Jianwei; Chang, Chein-I.; Cao, Mang

    2004-02-01

    The Constrained Energy Minimization (CEM) technique has been widely used for hyperspectral detection and classification. The feasibility of implementing the CEM as a real-time processing algorithm in systolic arrays has also been demonstrated. The main challenge of realizing the CEM in hardware architecture lies in the computation of the inverse of the data correlation matrix performed in the CEM, which requires a complete set of data samples. In order to cope with this problem, the data correlation matrix must be calculated in a causal manner, using only the data samples up to the sample being processed. This paper presents a Field Programmable Gate Array (FPGA) design of such a causal CEM. The main feature of the proposed FPGA design is the use of the COordinate Rotation DIgital Computer (CORDIC) algorithm, which can convert a Givens rotation of a vector to a set of shift-add operations. As a result, the CORDIC algorithm can be easily implemented in hardware architecture, and therefore in FPGA. Since the computation of the inverse of the data correlation matrix involves a series of Givens rotations, the use of the CORDIC algorithm allows the causal CEM to perform real-time processing in FPGA. In this paper, an FPGA implementation of the causal CEM is studied and its detailed architecture described.
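    The CEM filter itself is compact: given a target signature d and the sample correlation matrix R of the data, the filter is w = R^-1 d / (d^T R^-1 d), and the detector output for a pixel x is w^T x. A NumPy sketch of the batch (non-causal) form with synthetic data; the signature and data here are illustrative:

    ```python
    import numpy as np

    def cem_filter(X, d):
        """Constrained Energy Minimization filter.
        X is (N, B) pixel data (N pixels, B bands), d the (B,) target signature.
        Returns w = R^-1 d / (d^T R^-1 d), where R = X^T X / N is the sample
        correlation matrix; by construction w^T d = 1."""
        R = X.T @ X / X.shape[0]
        Rinv_d = np.linalg.solve(R, d)       # avoids forming R^-1 explicitly
        return Rinv_d / (d @ Rinv_d)

    rng = np.random.default_rng(1)
    background = rng.normal(size=(500, 5))           # 500 background pixels, 5 bands
    d = np.array([1.0, 0.5, 0.2, 0.1, 0.0])          # hypothetical target signature
    X = np.vstack([background, d[None, :]])          # scene = background + one target
    w = cem_filter(X, d)
    ```

    The causal FPGA version replaces the batch `R` with a running correlation matrix updated one sample at a time, with the inverse maintained via CORDIC-implemented Givens rotations.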

  2. Automated and unsupervised detection of malarial parasites in microscopic images.

    PubMed

    Purwar, Yashasvi; Shah, Sirish L; Clarke, Gwen; Almugairi, Areej; Muehlenbachs, Atis

    2011-12-13

    Malaria is a serious infectious disease. According to the World Health Organization, it is responsible for nearly one million deaths each year. There are various techniques to diagnose malaria, of which manual microscopy is considered to be the gold standard. However, due to the number of steps required in manual assessment, this diagnostic method is time consuming (leading to late diagnosis) and prone to human error (leading to erroneous diagnosis), even in experienced hands. The focus of this study is to develop a robust, unsupervised and sensitive malaria screening technique with low material cost and one that has an advantage over other techniques in that it minimizes human reliance and is, therefore, more consistent in applying diagnostic criteria. A method based on digital image processing of Giemsa-stained thin smear images is developed to facilitate the diagnostic process. The diagnosis procedure is divided into two parts: enumeration and identification. The image-based method presented here is designed to automate the process of enumeration and identification, with the main advantage being its ability to carry out the diagnosis in an unsupervised manner while retaining high sensitivity, thus reducing cases of false negatives. The image-based method is tested on more than 500 images from two independent laboratories. The aim is to distinguish between positive and negative cases of malaria using thin smear blood slide images. Due to the unsupervised nature of the method, it requires minimal human intervention, thus speeding up the whole process of diagnosis. Overall sensitivity to capture cases of malaria is 100%, and specificity ranges from 50-88% for all species of malaria parasites. An image-based screening method will speed up the whole process of diagnosis and is advantageous over laboratory procedures that are prone to errors, especially where pathological expertise is minimal. Furthermore, this method provides a consistent and robust way of generating parasite clearance curves.

  3. Planetary Crater Detection and Registration Using Marked Point Processes, Multiple Birth and Death Algorithms, and Region-Based Analysis

    NASA Technical Reports Server (NTRS)

    Solarna, David; Moser, Gabriele; Le Moigne-Stewart, Jacqueline; Serpico, Sebastiano B.

    2017-01-01

    Because of the large variety of sensors and spacecraft collecting data, planetary science needs to integrate various multi-sensor and multi-temporal images. These multiple data represent a precious asset, as they allow the study of targets' spectral responses and of changes in the surface structure; because of their variety, they also require accurate and robust registration. A new crater detection algorithm, used to extract features that will be integrated into an image registration framework, is presented. A marked point process-based method has been developed to model the spatial distribution of elliptical objects (i.e., the craters), and a birth-death Markov chain Monte Carlo method, coupled with a region-based scheme aiming at computational efficiency, is used to find the optimal configuration fitting the image. The extracted features are exploited, together with a newly defined fitness function based on a modified Hausdorff distance, by an image registration algorithm whose architecture has been designed to minimize the computational time.
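    The modified Hausdorff distance underlying the fitness function is, in its standard form (Dubuisson and Jain), the larger of the two directed mean nearest-neighbour distances between point sets. A small sketch assuming that standard definition:

    ```python
    import math

    def _directed_mhd(A, B):
        """Mean, over points of A, of the distance to the nearest point of B."""
        return sum(min(math.dist(a, b) for b in B) for a in A) / len(A)

    def modified_hausdorff(A, B):
        """Modified Hausdorff distance: max of the two directed mean
        nearest-neighbour distances. Less outlier-sensitive than the classical
        Hausdorff distance, which takes the max over points instead of the mean."""
        return max(_directed_mhd(A, B), _directed_mhd(B, A))
    ```

    In a registration loop, A and B would be the crater centers (or ellipse samples) extracted from the two images under a candidate transformation, and the fitness function would reward transformations that drive this distance down.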

  4. Detection of biological contaminants on foods and food surfaces using laser-induced breakdown spectroscopy (LIBS).

    PubMed

    Multari, Rosalie A; Cremers, David A; Dupre, Jo Anne M; Gustafson, John E

    2013-09-11

    The rapid detection of biological contaminants, such as Escherichia coli O157:H7 and Salmonella enterica, on foods and food-processing surfaces is important to ensure food safety and streamline the food-monitoring process. Laser-induced breakdown spectroscopy (LIBS) is an ideal candidate technology for this application because sample preparation is minimal and results are available rapidly (seconds to minutes). Here, multivariate regression analysis of LIBS data is used to differentiate the live bacterial pathogens E. coli O157:H7 and S. enterica on various foods (eggshell, milk, bologna, ground beef, chicken, and lettuce) and surfaces (metal drain strainer and cutting board). The type (E. coli or S. enterica) of bacteria could be differentiated in all cases studied, along with the metabolic state (viable or heat-killed). This study provides data showing the potential of LIBS for the rapid identification of biological contaminants using spectra collected directly from foods and surfaces.

  5. Plasma processing conditions substantially influence circulating microRNA biomarker levels.

    PubMed

    Cheng, Heather H; Yi, Hye Son; Kim, Yeonju; Kroh, Evan M; Chien, Jason W; Eaton, Keith D; Goodman, Marc T; Tait, Jonathan F; Tewari, Muneesh; Pritchard, Colin C

    2013-01-01

    Circulating, cell-free microRNAs (miRNAs) are promising candidate biomarkers, but optimal conditions for processing blood specimens for miRNA measurement remain to be established. Our previous work showed that the majority of plasma miRNAs are likely blood cell-derived. In the course of profiling lung cancer cases versus healthy controls, we observed a broad increase in circulating miRNA levels in cases compared to controls, and that higher miRNA expression correlated with higher platelet and particle counts. We therefore hypothesized that the quantity of residual platelets and microparticles remaining after plasma processing might impact miRNA measurements. To systematically investigate this, we subjected matched plasma from healthy individuals to stepwise processing with differential centrifugation and 0.22 µm filtration and performed miRNA profiling. We found a major effect on circulating miRNAs, with the majority (72%) of detectable miRNAs substantially affected by processing alone. Specifically, 10% of miRNAs showed 4-30x variation, 46% showed 30-1,000x variation, and 15% showed >1,000x variation in expression solely from processing. This was predominantly due to platelet contamination, which persisted despite the use of standard laboratory protocols. Importantly, we show that platelet contamination in archived samples could largely be eliminated by additional centrifugation, even in frozen samples stored for six years. To minimize confounding effects in circulating miRNA biomarker studies, additional steps to limit platelet contamination are necessary. We provide specific practical recommendations to help minimize confounding variation attributable to plasma processing and platelet contamination.

  6. Differential Sources for 2 Neural Signatures of Target Detection: An Electrocorticography Study.

    PubMed

    Kam, J W Y; Szczepanski, S M; Canolty, R T; Flinker, A; Auguste, K I; Crone, N E; Kirsch, H E; Kuperman, R A; Lin, J J; Parvizi, J; Knight, R T

    2018-01-01

    Electrophysiology and neuroimaging provide conflicting evidence for the neural contributions to target detection. Scalp electroencephalography (EEG) studies localize the P3b event-related potential component mainly to parietal cortex, whereas neuroimaging studies report activations in both frontal and parietal cortices. We addressed this discrepancy by examining the sources that generate the target-detection process using electrocorticography (ECoG). We recorded ECoG activity from cortex in 14 patients undergoing epilepsy monitoring, as they performed an auditory or visual target-detection task. We examined target-related responses in 2 domains: high frequency band (HFB) activity and the P3b. Across tasks, we observed a greater proportion of electrodes that showed target-specific HFB power relative to P3b over frontal cortex, but their proportions over parietal cortex were comparable. Notably, there was minimal overlap in the electrodes that showed target-specific HFB and P3b activity. These results revealed that the target-detection process is characterized by at least 2 different neural markers with distinct cortical distributions. Our findings suggest that separate neural mechanisms are driving the differential patterns of activity observed in scalp EEG and neuroimaging studies, with the P3b reflecting EEG findings and HFB activity reflecting neuroimaging findings, highlighting the notion that target detection is not a unitary phenomenon. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. Glial brain tumor detection by using symmetry analysis

    NASA Astrophysics Data System (ADS)

    Pedoia, Valentina; Binaghi, Elisabetta; Balbi, Sergio; De Benedictis, Alessandro; Monti, Emanuele; Minotto, Renzo

    2012-02-01

    In this work a fully automatic algorithm to detect brain tumors by using symmetry analysis is proposed. In recent years a great deal of research in the field of medical imaging has focused on brain tumor segmentation. The quantitative analysis of MRI brain tumors makes it possible to obtain useful key indicators of disease progression. The complex problem of segmenting tumors in MRI can be successfully addressed by considering modular and multi-step approaches mimicking the human visual inspection process. Tumor detection is often an essential preliminary phase for solving the segmentation problem successfully. In visual analysis of the MRI, the first step of the expert's cognitive process is the detection of an anomaly with respect to normal tissue, whatever its nature. A healthy brain has a strong sagittal symmetry that is weakened by the presence of a tumor. The comparison between the healthy and ill hemispheres, considering that tumors are generally not symmetrically placed in both hemispheres, is used to detect the anomaly. A clustering method based on energy minimization through Graph-Cut is applied to the volume computed as the difference between the left hemisphere and the right hemisphere mirrored across the symmetry plane. Differential analysis involves the loss of knowledge of which side the tumor is on; the ill hemisphere is then recognized through a histogram analysis. Many experiments were performed to assess the performance of the detection strategy on MRI volumes in the presence of tumors varied in shape, position, and intensity level. The experiments showed good results also in complex situations.
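    The core of the differential step can be sketched in a few lines: mirror one hemisphere across the sagittal symmetry plane (idealized here as the middle column of the slice) and take the absolute difference, so that asymmetric structures stand out. The clustering and histogram steps are omitted, and the array sizes are illustrative:

    ```python
    import numpy as np

    def asymmetry_map(slice2d):
        """Absolute difference between the left half of an axial slice and the
        mirrored right half, assuming the symmetry plane is the middle column."""
        h, w = slice2d.shape
        half = w // 2
        left = slice2d[:, :half]
        right_mirrored = slice2d[:, w - half:][:, ::-1]  # flip right half
        return np.abs(left - right_mirrored)

    img = np.zeros((4, 8))
    img[1:3, 1:3] = 1.0          # a toy 'lesion' present only in the left half
    amap = asymmetry_map(img)    # non-zero exactly where symmetry is broken
    ```

    In the full pipeline this difference volume is what the Graph-Cut energy-minimization clustering operates on, and the histogram analysis then decides which hemisphere contains the anomaly.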

  8. Face detection and eyeglasses detection for thermal face recognition

    NASA Astrophysics Data System (ADS)

    Zheng, Yufeng

    2012-01-01

    Thermal face recognition has become an active research direction in human identification because it does not rely on illumination conditions. Face detection and eyeglasses detection are necessary steps prior to face recognition using thermal images. Infrared light cannot pass through glasses, so glasses appear as dark areas in a thermal image. One possible solution is to detect the eyeglasses and exclude those areas before face matching. For thermal face detection, a projection profile analysis algorithm is proposed: region growing and morphology operations are used to segment the body of a subject; the derivatives of two projections (horizontal and vertical) are then calculated and analyzed to locate a minimal rectangle containing the face area. The search region for a pair of eyeglasses is restricted to the detected face area. The eyeglasses detection algorithm should produce either a binary mask if eyeglasses are present, or an empty set if there are none. The proposed eyeglasses detection algorithm employs block processing, region growing, and prior knowledge (i.e., low mean and variance within glasses areas, and the shapes and locations of eyeglasses). The results of face detection and eyeglasses detection are quantitatively measured and analyzed using manually defined ground truths (for both face and eyeglasses). Our experimental results show that the proposed face detection and eyeglasses detection algorithms performed very well when compared against the predefined ground truths.
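    The projection-profile idea above can be sketched with a binary segmentation mask. This simplified stand-in takes the first and last nonzero bins of each projection rather than the derivative-based analysis used in the paper; the function name and the toy mask are illustrative.

```python
import numpy as np

def bounding_rectangle(mask: np.ndarray):
    """Locate a minimal rectangle containing the segmented region
    from its horizontal and vertical projection profiles.

    Simplified stand-in: take the first/last nonzero bins of each
    projection instead of analyzing the profiles' derivatives.
    """
    rows = mask.sum(axis=1)  # horizontal projection
    cols = mask.sum(axis=0)  # vertical projection
    r = np.nonzero(rows)[0]
    c = np.nonzero(cols)[0]
    return int(r[0]), int(r[-1]), int(c[0]), int(c[-1])  # top, bottom, left, right

mask = np.zeros((10, 10), dtype=int)
mask[3:7, 2:5] = 1  # segmented "face" region
print(bounding_rectangle(mask))  # → (3, 6, 2, 4)
```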

  9. [A review on studies and applications of near infrared spectroscopy technique(NIRS) in detecting quality of hay].

    PubMed

    Ding, Wu-Rong; Gan, You-Min; Guo, Xu-Sheng; Yang, Fu-Yu

    2009-02-01

    The quality of hay directly affects both its price and livestock productivity. Many methods have been developed for detecting hay quality, and near infrared spectroscopy (NIRS) has been widely adopted because it is fast, effective, and nondestructive. In the present paper, the feasibility and effectiveness of applying NIRS to the detection of hay quality are expounded. Advances in using NIRS to detect chemical compositions, the extent of invasion by epiphytes, the amount of toxicants excreted by endogenetic epiphytes, and trace components that cannot be detected by chemical methods are also reviewed in detail. Based on this review of progress in using NIRS to assess hay quality, it can be concluded that NIRS avoids the time, complexity, and high cost of traditional chemical methods. For better practical utilization of NIRS, further studies are still needed to perfect and improve its application to forage quality, and more accurate models and systematic analysis software need to be established.

  10. Direct and sensitive detection of foodborne pathogens within fresh produce samples using a field-deployable handheld device.

    PubMed

    You, David J; Geshell, Kenneth J; Yoon, Jeong-Yeol

    2011-10-15

    Direct and sensitive detection of foodborne pathogens from fresh produce samples was accomplished using a handheld lab-on-a-chip device, requiring little to no sample processing or enrichment, enabling near-real-time detection with a truly field-deployable device. The detection of Escherichia coli K12 and O157:H7 in iceberg lettuce was achieved utilizing optimized Mie light scatter parameters with a latex particle immunoagglutination assay. The system exhibited good sensitivity, with a limit of detection of 10 CFU mL(-1) and an assay time of <6 min. Minimal pretreatment, with no detrimental effects on assay sensitivity and reproducibility, was accomplished with a simple and cost-effective KimWipes filter and disposable syringe. Mie simulations were used to determine the optimal parameters (particle size d, wavelength λ, and scatter angle θ) for the assay that maximize light scatter intensity of agglutinated latex microparticles and minimize light scatter intensity of the tissue fragments of iceberg lettuce; these parameters were experimentally validated. This introduces a powerful method for detecting foodborne pathogens in fresh produce and other potential sample matrices. The integration of a multi-channel microfluidic chip allowed for differential detection of the agglutinated particles in the presence of the antigen, yielding a truly field-deployable detection system with decreased assay time and improved robustness over comparable benchtop systems. Additionally, two sample preparation methods were evaluated through simulated field studies based on overall sensitivity, protocol complexity, and assay time. Preparation of the plant tissue sample by grinding resulted in a two-fold improvement in scatter intensity over washing, accompanied by a significant increase in assay time: ∼5 min (grinding) versus ∼1 min (washing). Specificity studies demonstrated binding of E. coli O157:H7 EDL933 only to O157:H7 antibody-conjugated particles, with no cross-reactivity to K12. This suggests the adaptability of the system for use with a wide variety of pathogens, and the potential to detect in a variety of biological matrices with little to no sample pretreatment. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Authentication of processed meat products by peptidomic analysis using rapid ambient mass spectrometry.

    PubMed

    Montowska, Magdalena; Alexander, Morgan R; Tucker, Gregory A; Barrett, David A

    2015-11-15

    We present the application of a novel ambient LESA-MS method for the authentication of processed meat products. A set of 25 species- and protein-specific heat-stable peptide markers was detected in processed samples manufactured from beef, pork, horse, chicken and turkey meat. We demonstrate that several peptides derived from myofibrillar and sarcoplasmic proteins are sufficiently resistant to processing to serve as specific markers of processed products. The LESA-MS technique required minimal sample preparation without fractionation and enabled the unambiguous and simultaneous identification of skeletal muscle proteins and peptides, as well as other components of animal origin, including milk proteins such as casein alpha-S1, in whole meat product digests. We have identified, for the first time, six fast type II and five slow/cardiac type I MHC peptide markers in various processed meat products. The study demonstrates that complex mixtures of processed proteins/peptides can be examined effectively using this approach. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Evaluation of Incident Detection Methodologies

    DOT National Transportation Integrated Search

    1999-10-01

    Original Report Date: October 1998. The detection of freeway incidents is an essential element of an area's traffic management system. Incidents need to be detected and handled as promptly as possible to minimize delay to the public. Various algorith...

  13. Multi-objective optimization model of CNC machining to minimize processing time and environmental impact

    NASA Astrophysics Data System (ADS)

    Hamada, Aulia; Rosyidi, Cucuk Nur; Jauhari, Wakhid Ahmad

    2017-11-01

    Minimizing processing time in a production system can increase the efficiency of a manufacturing company. Processing time is influenced by the application of modern technology and by machining parameters. Modern technology can be applied through CNC machining; one machining process that can be performed on a CNC machine is turning. However, the machining parameters affect not only the processing time but also the environmental impact. Hence, an optimization model is needed to select machining parameters that minimize both processing time and environmental impact. This research developed a multi-objective optimization model to minimize processing time and environmental impact in the CNC turning process, yielding optimal decision variables of cutting speed and feed rate. Environmental impact is converted from environmental burden through the use of eco-indicator 99. The model was solved using the OptQuest optimization software from Oracle Crystal Ball.
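    As an illustration of such a two-objective formulation, the following sketch combines two toy objective functions with a weighted sum and a coarse grid search. The objective functions, coefficients, bounds, and weight are all assumptions for illustration, not the paper's model (which uses eco-indicator 99 data and is solved with OptQuest).

```python
# Hypothetical weighted-sum formulation of a two-objective turning problem.
# Decision variables: cutting speed v (m/min) and feed rate f (mm/rev).
# Both objective functions and all constants below are illustrative only.

def processing_time(v: float, f: float) -> float:
    # Machining time shrinks as speed and feed grow (toy model).
    return 1000.0 / (v * f)

def environmental_impact(v: float, f: float) -> float:
    # Impact (e.g., eco-indicator points) grows with cutting speed (toy model).
    return 0.002 * v ** 2 + 5.0 * f

def weighted_cost(v: float, f: float, w: float = 0.5) -> float:
    # Scalarize the two objectives with a weight w in [0, 1].
    return w * processing_time(v, f) + (1 - w) * environmental_impact(v, f)

# Coarse grid search over assumed parameter bounds.
best = min(
    ((v, f) for v in range(50, 301, 10) for f in (x / 100 for x in range(10, 51, 5))),
    key=lambda p: weighted_cost(*p),
)
print(best)
```

    A weighted sum is only one way to scalarize multiple objectives; varying the weight traces out different trade-offs between time and impact.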

  14. A point of minimal important difference (MID): a critique of terminology and methods.

    PubMed

    King, Madeleine T

    2011-04-01

    The minimal important difference (MID) is a phrase with instant appeal in a field struggling to interpret health-related quality of life and other patient-reported outcomes. The terminology can be confusing, with several terms differing only slightly in definition (e.g., minimal clinically important difference, clinically important difference, minimally detectable difference, the subjectively significant difference), and others that seem similar despite having quite different meanings (minimally detectable difference versus minimum detectable change). Often, nuances of definition are of little consequence in the way that these quantities are estimated and used. Four methods are commonly employed to estimate MIDs: patient rating of change (global transition items); clinical anchors; standard error of measurement; and effect size. These are described and critiqued in this article. There is no universal MID, despite the appeal of the notion. Indeed, for a particular patient-reported outcome instrument or scale, the MID is not an immutable characteristic, but may vary by population and context. At both the group and individual level, the MID may depend on the clinical context and decision at hand, the baseline from which the patient starts, and whether they are improving or deteriorating. Specific estimates of MIDs should therefore not be overinterpreted. For a given health-related quality-of-life scale, all available MID estimates (and their confidence intervals) should be considered, amalgamated into general guidelines and applied judiciously to any particular clinical or research context.
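    Two of the distribution-based estimators discussed above (the standard error of measurement and the 0.5-SD effect-size anchor) can be computed directly. A minimal sketch with illustrative values, not figures from the article:

```python
import math

# Distribution-based MID estimates (all numbers illustrative).
baseline_sd = 10.0   # SD of the scale at baseline
reliability = 0.85   # test-retest reliability coefficient

# Standard error of measurement: SEM = SD * sqrt(1 - reliability)
sem = baseline_sd * math.sqrt(1 - reliability)

# Effect-size anchor: 0.5 SD is a commonly cited "moderate" threshold.
half_sd = 0.5 * baseline_sd

print(round(sem, 2), half_sd)
```

    Note that both estimates depend on the population that supplied the SD and reliability, which is one reason the article cautions against a universal MID.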

  15. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  16. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  17. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  18. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  19. 5 CFR 581.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... accompany legal process. 581.203 Section 581.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT... Process § 581.203 Information minimally required to accompany legal process. (a) Sufficient identifying information must accompany the legal process in order to enable processing by the governmental entity named...

  20. Standard operating procedures for serum and plasma collection: early detection research network consensus statement standard operating procedure integration working group.

    PubMed

    Tuck, Melissa K; Chan, Daniel W; Chia, David; Godwin, Andrew K; Grizzle, William E; Krueger, Karl E; Rom, William; Sanda, Martin; Sorbara, Lynn; Stass, Sanford; Wang, Wendy; Brenner, Dean E

    2009-01-01

    Specimen collection is an integral component of clinical research. Specimens from subjects with various stages of cancers or other conditions, as well as those without disease, are critical tools in the hunt for biomarkers, predictors, or tests that will detect serious diseases earlier or more readily than currently possible. Analytic methodologies evolve quickly. Access to high-quality specimens, collected and handled in standardized ways that minimize potential bias or confounding factors, is key to the "bench to bedside" aim of translational research. It is essential that standard operating procedures, "the how" of creating the repositories, be defined prospectively when designing clinical trials. Small differences in the processing or handling of a specimen can have dramatic effects on analytical reliability and reproducibility, especially when multiplex methods are used. A representative working group, the Standard Operating Procedures Internal Working Group (SOPIWG), comprising members from across the Early Detection Research Network (EDRN), was formed to develop standard operating procedures (SOPs) for various types of specimens collected and managed for our biomarker discovery and validation work. This report presents our consensus on SOPs for the collection, processing, handling, and storage of serum and plasma for biomarker discovery and validation.

  1. Smart Cup: A Minimally-Instrumented, Smartphone-Based Point-of-Care Molecular Diagnostic Device.

    PubMed

    Liao, Shih-Chuan; Peng, Jing; Mauk, Michael G; Awasthi, Sita; Song, Jinzhao; Friedman, Harvey; Bau, Haim H; Liu, Changchun

    2016-06-28

    Nucleic acid amplification-based diagnostics offer rapid, sensitive, and specific means for detecting and monitoring the progression of infectious diseases. However, this method typically requires extensive sample preparation, expensive instruments, and trained personnel, all of which hinder its use in resource-limited settings, where many infectious diseases are endemic. Here, we report on a simple, inexpensive, minimally-instrumented, smart cup platform for rapid, quantitative molecular diagnostics of pathogens at the point of care. Our smart cup takes advantage of a water-triggered, exothermic chemical reaction to supply heat for nucleic acid-based isothermal amplification. The amplification temperature is regulated with a phase-change material (PCM). The PCM maintains the amplification reactor at a constant temperature, typically 60-65°C, when ambient temperatures range from 12 to 35°C. To eliminate the need for an optical detector and minimize cost, we use the smartphone's flashlight to excite the fluorescent dye and the phone camera to record real-time fluorescence emission during the amplification process. The smartphone can concurrently monitor multiple amplification reactors and analyze the recorded data. Our smart cup's utility was demonstrated by amplifying and quantifying herpes simplex virus type 2 (HSV-2) with a LAMP assay in our custom-made microfluidic diagnostic chip. We have consistently detected as few as 100 copies of HSV-2 viral DNA per sample. Our system does not require any lab facilities and is suitable for use at home, in the field, and in the clinic, as well as in resource-poor settings, where access to sophisticated laboratories is impractical, unaffordable, or nonexistent.

  2. Clinical feasibility test on a minimally invasive laser therapy system in microsurgery of nerves.

    PubMed

    Mack, K F; Leinung, M; Stieve, M; Lenarz, T; Schwab, B

    2008-01-01

    The clinical feasibility test described here evaluates the basis for a laser therapy system that enables tumour tissue to be separated from nerves in a minimally invasive manner. It was first investigated whether, using an Er:YAG laser, laser-induced nerve (specifically, facial nerve) responses in the rabbit in vivo can be reliably detected with the hitherto standard monitoring techniques. Peripherally recordable neuromuscular signals (i.e. compound action potentials, CAPs) were used to monitor nerve function and to establish a feedback loop. The first occurrence of laser-evoked CAPs was taken as the criterion for deciding when to switch off the laser. When drawing up criteria governing the control and termination of the laser application, the priority was the maintenance of nerve function. Five needle-electrode arrays specially developed for this purpose, each with a miniature preamplifier, were then placed into the facial musculature instead of single-needle electrodes. The system was tested in vivo under realistic surgical conditions (i.e. facial-nerve surgery in the rabbit). This modified multi-channel electromyography (EMG) system enabled laser-evoked CAPs to be detected that have amplitudes 10 times smaller than those picked up by commercially available systems. This optimization, and the connection of the neuromuscular unit with the Er:YAG laser via the electrode array to create a feedback loop, were designed to make it possible to maintain online control of the laser ablation process in the vicinity of neuronal tissue, thus ensuring that tissue excision is both reliable and does not affect function. Our results open up new possibilities in minimally invasive surgery near neural structures.

  3. 16S rRNA beacons for bacterial monitoring during human space missions.

    PubMed

    Larios-Sanz, Maia; Kourentzi, Katerina D; Warmflash, David; Jones, Jeffrey; Pierson, Duane L; Willson, Richard C; Fox, George E

    2007-04-01

    Microorganisms are unavoidable in space environments and their presence has, at times, been a source of problems. Concerns about disease during human space missions are particularly important considering the significant changes the immune system incurs during spaceflight and the history of microbial contamination aboard the Mir space station. Additionally, these contaminants may have adverse effects on instrumentation and life-support systems. A sensitive, highly specific system to detect, characterize, and monitor these microbial populations is essential. Herein we describe a monitoring approach that uses 16S rRNA targeted molecular beacons to successfully detect several specific bacterial groupings. This methodology will greatly simplify in-flight monitoring by minimizing sample handling and processing. We also address and provide solutions to target accessibility problems encountered in hybridizations that target 16S rRNA.

  4. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

    Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.

  5. Investigation of Mercury Reduction in Gold Stripping Process at Elevated Temperature

    NASA Astrophysics Data System (ADS)

    Pramudya, Irawan

    Mercury is present in many gold ores, so processing these ores carries a potential for emitting mercury to the environment. Carbon regeneration kiln stacks have been identified as one of the primary sources of mercury emission into the atmosphere. Before it is recycled back into carbon in leach (CIL) or carbon in columns (CIC), carbon used in the gold extraction process needs to be reactivated thermally. Emission of mercury can be minimized by keeping the mercury remaining in the carbon low before it reaches the carbon regeneration kiln stacks. The objective of this study is to establish the optimum elution conditions for mercury cyanide from loaded carbon (including the eluent, concentration, temperature, and elution time) with respect to gold stripping. Several methods such as acid washing (UNR-100, HCl, or ethanol/UNR-100) were investigated prior to the stripping process. Furthermore, conventional pressurized Zadra and modified Zadra stripping were also studied with regard to mercury concentration in the solution and vapor state, as well as maximizing gold stripping from industrial loaded carbon. A 7% UNR-100 acid wash of loaded carbon at 80°C was able to wash out approximately 90% of the mercury while maintaining the gold adsorption on the carbon (selective washing). The addition of alcohol to the UNR-100 acid washing solution enhanced mercury removal from 90% to 97%. Furthermore, mercury stripping using the conventional pressurized (cyanide-alkaline) Zadra process was best performed at 80°C (minimal mercury reduced and volatilized), but at that temperature only 40% of the gold was stripped, which makes the process not viable. When alcohol was added to the stripping solution at 80°C, 95% of the gold was detected in the solution while keeping the reduction and volatilization of mercury low.
The outcome of this study provides a better understanding of mercury behavior during the acid washing and stripping processes so that the risk of mercury exposure and contamination can be minimized while maximizing the gold overall recovery.

  6. Visual-Vestibular Conflict Detection Depends on Fixation.

    PubMed

    Garzorz, Isabelle T; MacNeilage, Paul R

    2017-09-25

    Visual and vestibular signals are the primary sources of sensory information for self-motion. Conflict among these signals can be seriously debilitating, resulting in vertigo [1], inappropriate postural responses [2], and motion, simulator, or cyber sickness [3-8]. Despite this significance, the mechanisms mediating conflict detection are poorly understood. Here we model conflict detection simply as crossmodal discrimination with benchmark performance limited by variabilities of the signals being compared. In a series of psychophysical experiments conducted in a virtual reality motion simulator, we measure these variabilities and assess conflict detection relative to this benchmark. We also examine the impact of eye movements on visual-vestibular conflict detection. In one condition, observers fixate a point that is stationary in the simulated visual environment by rotating the eyes opposite head rotation, thereby nulling retinal image motion. In another condition, eye movement is artificially minimized via fixation of a head-fixed fixation point, thereby maximizing retinal image motion. Visual-vestibular integration performance is also measured, similar to previous studies [9-12]. We observe that there is a tradeoff between integration and conflict detection that is mediated by eye movements. Minimizing eye movements by fixating a head-fixed target leads to optimal integration but highly impaired conflict detection. Minimizing retinal motion by fixating a scene-fixed target improves conflict detection at the cost of impaired integration performance. The common tendency to fixate scene-fixed targets during self-motion [13] may indicate that conflict detection is typically a higher priority than the increase in precision of self-motion estimation that is obtained through integration. Copyright © 2017 Elsevier Ltd. All rights reserved.
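    The benchmark logic described above can be made concrete with the standard cue-combination formulas: discriminating two noisy estimates sums their variances, while optimal integration reduces variance below either cue alone. The noise values below are illustrative, not the study's measurements.

```python
import math

# Benchmark predictions from single-cue variabilities (illustrative values).
sigma_visual = 2.0      # deg/s, noise of the visual self-rotation estimate
sigma_vestibular = 3.0  # deg/s, noise of the vestibular estimate

# Crossmodal discrimination (conflict detection) benchmark:
# comparing two noisy estimates sums their variances.
sigma_conflict = math.sqrt(sigma_visual**2 + sigma_vestibular**2)

# Optimal (maximum-likelihood) integration: variance below either cue alone.
sigma_integrated = math.sqrt(
    (sigma_visual**2 * sigma_vestibular**2)
    / (sigma_visual**2 + sigma_vestibular**2)
)

print(round(sigma_conflict, 3), round(sigma_integrated, 3))
```

    The two formulas capture the tradeoff in the abstract: fusing the cues sharpens the self-motion estimate, while keeping them separate sharpens conflict detection.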

  7. A comparative study between an improved novel air-cushion sensor and a wheeled probe for minimally invasive surgery.

    PubMed

    Zbyszewski, Dinusha; Challacombe, Benjamin; Li, Jichun; Seneviratne, Lakmal; Althoefer, Kaspar; Dasgupta, Prokar; Murphy, Declan

    2010-07-01

    We describe a comparative study between an enhanced air-cushion tactile sensor and a wheeled indentation probe. These laparoscopic tools are designed to rapidly locate soft-tissue abnormalities during minimally invasive surgery (MIS). The air-cushion tactile sensor is an optically based sensor with a 7.8 mm sphere "floating" on a cushion of air at the tip of a shaft. The wheeled indentation probe is a wheel, 10 mm wide and 5 mm in diameter, mounted on a force/torque sensor. A continuous rolling indentation technique is used to pass the sensors over soft-tissue surfaces. The variations in stiffness of the viscoelastic materials detected during rolling indentation are illustrated by stiffness maps that can be used for tissue diagnosis. The probes were tested on their ability to detect four embedded nodules in a silicone phantom. Each probe was attached to a robotic manipulator and rolled over the silicone phantom in parallel paths. The readings collected from each probe during rolling indentation were used to produce the final results. The results show that both sensors reliably detected the areas of variable stiffness, accurately identifying the location of each nodule, as illustrated in two three-dimensional spatiomechanical maps. These probes have potential for use in MIS because they could provide surgeons with information on the mechanical properties of soft tissue, compensating for the reduction in haptic feedback.

  8. Low-rank matrix decomposition and spatio-temporal sparse recovery for STAP radar

    DOE PAGES

    Sen, Satyabrata

    2015-08-04

    We develop space-time adaptive processing (STAP) methods by leveraging the advantages of sparse signal processing techniques in order to detect a slowly-moving target. We observe that the inherent sparse characteristics of a STAP problem can be formulated as the low-rankness of the clutter covariance matrix when compared to the total adaptive degrees-of-freedom, and also as the sparse interference spectrum on the spatio-temporal domain. By exploiting these sparse properties, we propose two approaches for estimating the interference covariance matrix. In the first approach, we consider a constrained matrix rank minimization problem (RMP) to decompose the sample covariance matrix into a low-rank positive semidefinite matrix and a diagonal matrix. The solution of the RMP is obtained by applying the trace minimization technique and the singular value decomposition with a matrix shrinkage operator. Our second approach deals with the atomic norm minimization problem to recover the clutter response-vector that has a sparse support on the spatio-temporal plane. We use convex relaxation based standard sparse-recovery techniques to find the solutions. With extensive numerical examples, we demonstrate the performances of the proposed STAP approaches with respect to both ideal and practical scenarios, involving Doppler-ambiguous clutter ridges and spatial and temporal decorrelation effects. As a result, the low-rank matrix decomposition based solution requires secondary measurements as many as twice the clutter rank to attain near-ideal STAP performance, whereas the spatio-temporal sparsity based approach needs a considerably smaller number of secondary data.
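    A minimal sketch of the matrix-shrinkage step named above: soft-thresholding the singular values of a synthetic low-rank-plus-diagonal covariance. The data, threshold, and dimensions are illustrative, and the full trace-minimization formulation is omitted.

```python
import numpy as np

# Synthetic "sample covariance": rank-2 clutter part plus diagonal noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 2))
R = A @ A.T + 0.1 * np.eye(20)

def shrink_low_rank(M: np.ndarray, tau: float) -> np.ndarray:
    """Soft-threshold the singular values of M by tau (matrix shrinkage).

    Small singular values (here, the diagonal noise floor) are driven to
    zero, leaving a low-rank estimate of the clutter component.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

L = shrink_low_rank(R, tau=1.0)
print(np.linalg.matrix_rank(L))  # the small singular values are removed
```

    In the paper this shrinkage appears inside an iterative decomposition of the sample covariance; here it is shown as a single standalone step.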

  9. Preparative electrophoresis with on-column optical fiber monitoring and direct elution into a minimized volume.

    PubMed

    Jackson, George W; Willson, Richard

    2005-11-01

    A "column-format" preparative electrophoresis device which obviates the need for gel extraction or secondary electro-elution steps is described. Separated biomolecules are continuously detected and eluted directly into a minimal volume of free solution for subsequent use. An optical fiber allows the species of interest to be detected just prior to elution from the gel column, and a small collection volume is created by addition of an ion-exchange membrane near the end of the column.

  10. 46 CFR 108.404 - Selection of fire detection system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Selection of fire detection system. 108.404 Section 108... DESIGN AND EQUIPMENT Fire Extinguishing Systems § 108.404 Selection of fire detection system. (a) If a... space. (b) The fire detection system must be designed to minimize false alarms. ...

  11. GFP-complementation assay to detect functional CPP and protein delivery into living cells

    PubMed Central

    Milech, Nadia; Longville, Brooke AC; Cunningham, Paula T; Scobie, Marie N; Bogdawa, Heique M; Winslow, Scott; Anastasas, Mark; Connor, Theresa; Ong, Ferrer; Stone, Shane R; Kerfoot, Maria; Heinrich, Tatjana; Kroeger, Karen M; Tan, Yew-Foon; Hoffmann, Katrin; Thomas, Wayne R; Watt, Paul M; Hopkins, Richard M

    2015-01-01

    Efficient cargo uptake is essential for cell-penetrating peptide (CPP) therapeutics, which deliver widely diverse cargoes by exploiting natural cell processes to penetrate the cell’s membranes. Yet most current CPP activity assays are hampered by limitations in assessing uptake, including confounding effects of conjugated fluorophores or ligands, indirect read-outs requiring secondary processing, and difficulty in discriminating internalization from endosomally trapped cargo. Split-complementation Endosomal Escape (SEE) provides the first direct assay visualizing true cytoplasmic delivery of proteins at biologically relevant concentrations. The SEE assay has minimal background, is amenable to high-throughput processes, and is adaptable to different transient and stable cell lines. This split-GFP-based platform can be useful for studying transduction mechanisms, cellular imaging, and characterizing novel CPPs as pharmaceutical delivery agents in the treatment of disease. PMID:26671759

  12. Automation for Air Traffic Control: The Rise of a New Discipline

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; Tobias, Leonard (Technical Monitor)

    1997-01-01

    The current debate over the concept of Free Flight has renewed interest in automated conflict detection and resolution in the enroute airspace. An essential requirement for effective conflict detection is accurate prediction of trajectories. Trajectory prediction is, however, an inexact process which accumulates errors that grow in proportion to the length of the prediction time interval. Using a model of prediction errors for the trajectory predictor incorporated in the Center-TRACON Automation System (CTAS), a computationally fast algorithm for computing conflict probability has been derived. Furthermore, a method of conflict resolution has been formulated that minimizes the average cost of resolution, when cost is defined as the increment in airline operating costs incurred in flying the resolution maneuver. The method optimizes the trade off between early resolution at lower maneuver costs but higher prediction error on the one hand and late resolution with higher maneuver costs but lower prediction errors on the other. The method determines both the time to initiate the resolution maneuver as well as the characteristics of the resolution trajectory so as to minimize the cost of the resolution. Several computational examples relevant to the design of a conflict probe that can support user-preferred trajectories in the enroute airspace will be presented.
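    The tradeoff described above (prediction error growing with look-ahead time) can be illustrated with a simple one-dimensional Gaussian model of conflict probability. All constants and the error-growth model are assumptions for illustration, not the CTAS algorithm itself.

```python
import math

def conflict_probability(predicted_miss_nm: float, t_min: float,
                         sigma0_nm: float = 0.5, growth_nm_per_min: float = 0.25,
                         sep_nm: float = 5.0) -> float:
    """P(actual miss distance < separation minimum) under a 1-D Gaussian
    prediction-error model whose spread grows with look-ahead time.
    All parameter values are illustrative assumptions."""
    sigma = sigma0_nm + growth_nm_per_min * t_min  # error grows with look-ahead

    def phi(x: float) -> float:
        # Standard normal CDF via the error function.
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    return (phi((sep_nm - predicted_miss_nm) / sigma)
            - phi((-sep_nm - predicted_miss_nm) / sigma))

# Same predicted 6 nm miss distance: conflict probability rises with look-ahead.
p_5min = conflict_probability(6.0, t_min=5.0)
p_20min = conflict_probability(6.0, t_min=20.0)
print(round(p_5min, 3), round(p_20min, 3))
```

    Even though the predicted miss distance exceeds the separation minimum, the longer look-ahead carries enough uncertainty to make a conflict substantially more probable, which is the tension the resolution-timing optimization exploits.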

  13. Nonlinear vibrational microscopy

    DOEpatents

    Holtom, Gary R.; Xie, Xiaoliang Sunney; Zumbusch, Andreas

    2000-01-01

    The present invention is a method and apparatus for microscopic vibrational imaging using coherent anti-Stokes Raman scattering (CARS) or sum frequency generation (SFG). Microscopic imaging with vibrational spectroscopic contrast is achieved by generating signals in a nonlinear optical process and spatially resolved detection of the signals. The spatial resolution is attained by minimizing the spot size of the optical interrogation beams on the sample. Minimizing the spot size relies upon (a) directing at least two substantially co-axial laser beams (interrogation beams) through a microscope objective, providing a focal spot on the sample; (b) collecting a signal beam together with a residual beam from the at least two co-axial laser beams after passing through the sample; (c) removing the residual beam; and (d) detecting the signal beam, thereby creating said pixel. The method has significantly higher spatial resolution than IR microscopy and higher sensitivity than spontaneous Raman microscopy at much lower average excitation powers. CARS and SFG microscopy do not rely on the presence of fluorophores, but retain the resolution and three-dimensional sectioning capability of confocal and two-photon fluorescence microscopy. Complementary to these techniques, CARS and SFG microscopy provide a contrast mechanism based on vibrational spectroscopy. This vibrational contrast mechanism, combined with unprecedented sensitivity at a tolerable laser power level, provides a new approach for microscopic investigations of chemical and biological samples.

  14. Design of a portable fluoroquinolone analyzer based on terbium-sensitized luminescence

    NASA Astrophysics Data System (ADS)

    Chen, Guoying

    2007-09-01

    A portable fluoroquinolone (FQ) analyzer is designed and prototyped based on terbium-sensitized luminescence (TSL). The excitation source is a 327-nm light-emitting diode (LED) operated in pulsed mode, and the luminescence signal is detected by a photomultiplier tube (PMT). In comparison to a conventional xenon flashlamp, an LED is small, light, robust, and energy efficient. More importantly, its narrow emission bandwidth and low residual radiation reduce the background signal. In pulsed mode, an LED operates at a current 1-2 orders of magnitude lower than that of a xenon flashlamp, thus minimizing electromagnetic interference (EMI) to the detector circuitry. The PMT is gated to minimize its response to the light source. These measures reduce background noise in the time domain. To overcome pulse-to-pulse variation, signal normalization is implemented based on individual pulse energy. Instrument operation and data processing are controlled by a computer running a custom LabVIEW program. Enrofloxacin (ENRO) is used as a model analyte to evaluate instrument performance. The integrated TSL intensity reveals a linear dependence up to 2 ppm. A 1.1-ppb limit of detection (LOD) is achieved, with relative standard deviation (RSD) averaging 5.1%. The background noise corresponds to ~5 ppb. At 19 lbs, this portable analyzer is field-deployable for agricultural, environmental and clinical analyses.
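    The pulse-energy normalization step can be sketched as follows (all numbers are invented for illustration; the analyzer's firmware is not described at this level of detail in the abstract). Dividing each integrated signal by its own pulse energy removes the pulse-to-pulse jitter before averaging:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical values: ~8% pulse-energy jitter, 1% detector noise.
pulse_energy = rng.normal(1.0, 0.08, size=200)    # per-pulse LED energy
true_response = 5.0                               # analyte-dependent constant
raw_signal = true_response * pulse_energy * rng.normal(1.0, 0.01, size=200)

rsd_raw = raw_signal.std() / raw_signal.mean() * 100
normalized = raw_signal / pulse_energy            # remove pulse-to-pulse variation
rsd_norm = normalized.std() / normalized.mean() * 100
print(f"RSD raw: {rsd_raw:.1f}%  RSD normalized: {rsd_norm:.1f}%")
```

    The normalized RSD is set by the detector noise alone, which is the point of normalizing on individual pulse energy.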

  15. 5 CFR 582.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...

  16. 5 CFR 582.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...

  17. 5 CFR 582.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...

  18. 5 CFR 582.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...

  19. 5 CFR 582.203 - Information minimally required to accompany legal process.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... accompany legal process. 582.203 Section 582.203 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS COMMERCIAL GARNISHMENT OF FEDERAL EMPLOYEES' PAY Service of Legal Process § 582.203 Information minimally required to accompany legal process. (a) Sufficient identifying...

  20. Revealing the Effects of Nanoscale Membrane Curvature on Lipid Mobility

    PubMed Central

    Kabbani, Abir Maarouf; Woodward, Xinxin

    2017-01-01

    Recent advances in nanoengineering and super-resolution microscopy have enabled new capabilities for creating and observing membrane curvature. However, the effects of curvature on single-lipid diffusion have yet to be revealed. The simulations presented here describe the capabilities of varying experimental methods for revealing the effects of nanoscale curvature on single-molecule mobility. Traditionally, lipid mobility is revealed through fluorescence recovery after photobleaching (FRAP), fluorescence correlation spectroscopy (FCS), and single-particle tracking (SPT). However, these techniques vary greatly in their ability to detect the effects of nanoscale curvature on lipid behavior. Both FRAP and FCS depend on diffraction-limited illumination and detection. A simulation of FRAP showed minimal effects on lipid diffusion due to a 50-nm-radius membrane bud. Throughout the stages of the budding process, FRAP detected minimal changes in lipid recovery time for the curved versus the flat membrane. Simulated FCS demonstrated small effects due to a 50-nm-radius membrane bud, which became more apparent with curvature-dependent changes in lipid mobility. However, SPT achieves sub-diffraction-limited resolution of membrane budding and lipid mobility through the identification of single-lipid positions with ≤15 nm spatial and ≤20 ms temporal resolution. By mapping single-lipid step lengths to locations on the membrane, the effects of membrane topography and curvature could be correlated to the effective membrane viscosity. Single-fluorophore localization techniques, such as SPT, can detect membrane curvature and its effects on lipid behavior. These simulations and discussion provide a guideline for optimizing experimental procedures for revealing the effects of curvature on lipid mobility and effective local membrane viscosity. PMID:29057801
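    The SPT analysis principle on a flat membrane can be sketched with a short simulation (the diffusion coefficient, frame time and Gaussian step model below are our illustrative assumptions, not values from the paper): in 2-D, single-step displacements obey ⟨r²⟩ = 4DΔt, so D can be recovered from step lengths alone.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters: D in um^2/s, 20-ms frames (matching the
# quoted temporal resolution).
D_true = 1.0
dt = 0.02
n_steps = 100_000
# Each axis of a 2-D Brownian step has variance 2*D*dt.
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_steps, 2))
msd = np.mean(np.sum(steps**2, axis=1))   # mean squared single-step displacement
D_est = msd / (4 * dt)                    # invert <r^2> = 4*D*dt
print(f"estimated D = {D_est:.3f} um^2/s")
```

    On curved membrane regions the projected step lengths deviate from this flat-membrane expectation, which is what lets step-length maps report on local topography and effective viscosity.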

  1. Gamma radiation in the reduction of S almonella spp. inoculated on minimally processed watercress ( Nasturtium officinalis)

    NASA Astrophysics Data System (ADS)

    Martins, C. G.; Behrens, J. H.; Destro, M. T.; Franco, B. D. G. M.; Vizeu, D. M.; Hutzler, B.; Landgraf, M.

    2004-09-01

    Consumer attitudes towards foods have changed in the last two decades, increasing the demand for fresh-like products. Consequently, less extreme treatments and fewer additives are being required. Minimally processed foods have fresh-like characteristics and satisfy this new consumer demand. Besides freshness, minimal processing also provides the convenience required by the market. Salad vegetables can be a source of pathogens such as Salmonella, Escherichia coli O157:H7 and Shigella spp. Minimal processing does not reduce pathogenic microorganisms to safe levels. Therefore, this study was carried out in order to improve the microbiological safety and the shelf-life of minimally processed vegetables using gamma radiation. Minimally processed watercress inoculated with a cocktail of Salmonella spp. was exposed to 0.0, 0.2, 0.5, 0.7, 1.0, 1.2 and 1.5 kGy. Irradiated samples were diluted 1:10 in saline peptone water and plated onto tryptic soy agar, which was incubated at 37°C for 24 h. D10 values for Salmonella spp. inoculated on watercress varied from 0.29 to 0.43 kGy. Therefore, a dose of 1.7 kGy will reduce the Salmonella population in watercress by 4 log10. The shelf-life was increased by 1½ days when the product was exposed to 1 kGy.
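    The 1.7 kGy figure follows directly from the definition of the D10 value (the dose giving a tenfold reduction): an n-log reduction requires n × D10, evaluated at the worst-case (most radiation-resistant) D10 reported above.

```python
# n-log reduction dose = n * D10, using the upper end of the reported
# D10 range for Salmonella on watercress.
D10 = 0.43                      # kGy
log_reduction = 4
dose = log_reduction * D10      # 1.72 kGy, quoted as ~1.7 kGy in the text
print(f"dose for a {log_reduction}-log reduction: {dose:.2f} kGy")
```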

  2. From face processing to face recognition: Comparing three different processing levels.

    PubMed

    Besson, G; Barragan-Jason, G; Thorpe, S J; Fabre-Thorpe, M; Puma, S; Ceccaldi, M; Barbeau, E J

    2017-01-01

    Verifying that a face is from a target person (e.g. finding someone in a crowd) is a critical ability of the human face processing system. Yet how fast this can be performed is unknown. The 'entry-level shift due to expertise' hypothesis suggests that, since humans are face experts, processing faces should be as fast, or even faster, at the individual level than at the superordinate level. In contrast, the 'superordinate advantage' hypothesis suggests that faces are processed from coarse to fine, so that the opposite pattern should be observed. To clarify this debate, three different face processing levels were compared: (1) a superordinate face categorization level (i.e. detecting human faces among animal faces), (2) a face familiarity level (i.e. recognizing famous faces among unfamiliar ones) and (3) verifying that a face is from a target person, our condition of interest. The minimal speed at which faces can be categorized (∼260ms) or recognized as familiar (∼360ms) has been documented in previous studies, providing boundaries against which to compare our condition of interest. Twenty-seven participants were included. The recent Speed and Accuracy Boosting (SAB) procedure was used since it constrains participants to use their fastest strategy. Stimuli were presented either upright or inverted. Results revealed that verifying that a face is from a target person (minimal RT at ∼260ms) was remarkably fast but slower than the face categorization level (∼240ms) and was more sensitive to face inversion. In contrast, it was much faster than recognizing a face as familiar (∼380ms), a level severely affected by face inversion. Face recognition corresponding to finding a specific person in a crowd thus appears achievable in only a quarter of a second. 
    In favor of the 'superordinate advantage' hypothesis, or coarse-to-fine account of the face visual hierarchy, these results suggest a graded engagement of the face processing system across processing levels, as reflected by the face inversion effects. Furthermore, they underline how verifying that a face is from a target person and detecting a face as familiar, both often referred to as "face recognition", in fact differ. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Managed traffic evacuation using distributed sensor processing

    NASA Astrophysics Data System (ADS)

    Ramuhalli, Pradeep; Biswas, Subir

    2005-05-01

    This paper presents an integrated sensor network and distributed event processing architecture for managed in-building traffic evacuation during natural and human-caused disasters, including earthquakes, fires and biological/chemical terrorist attacks. The proposed wireless sensor network protocols and distributed event processing mechanisms offer a new distributed paradigm for improving reliability in building evacuation and disaster management. The networking component of the system is constructed using distributed wireless sensors for measuring environmental parameters such as temperature and humidity, and for detecting unusual events such as smoke, structural failures, vibration, and biological/chemical or nuclear agents. Distributed event processing algorithms are executed by these sensor nodes to detect the propagation pattern of the disaster and to measure the concentration and activity of human traffic in different parts of the building. Based on this information, dynamic evacuation decisions are taken to maximize evacuation speed and minimize unwanted incidents such as human exposure to harmful agents and stampedes near exits. A set of audio-visual indicators and actuators is used for aiding the automated evacuation process. In this paper we develop integrated protocols, algorithms and simulation models for the proposed sensor networking and distributed event processing framework. Efficient harnessing of the individually low, but collectively massive, processing abilities of the sensor nodes is a powerful concept behind the proposed distributed event processing algorithms. Results obtained through simulation are used for a detailed characterization of the proposed evacuation management system and its associated algorithmic components.

  4. Goal-directed ultrasound in the detection of long-bone fractures

    NASA Technical Reports Server (NTRS)

    Marshburn, Thomas H.; Legome, Eric; Sargsyan, Ashot; Li, Shannon Melton James; Noble, Vicki A.; Dulchavsky, Scott A.; Sims, Carrie; Robinson, David

    2004-01-01

    BACKGROUND: New portable ultrasound (US) systems are capable of detecting fractures in remote settings. However, the accuracy of ultrasound performed by physicians with minimal ultrasound training is unknown. METHODS: After one hour of standardized training, physicians with minimal US experience clinically evaluated patients presenting with pain and trauma to the upper arm or leg. The investigators then performed a long-bone US evaluation, recording their impression of fracture presence or absence. Results of the examination were compared with routine plain radiography or computed tomography (CT). RESULTS: 58 patients were examined. The sensitivity and specificity of US were 92.9% and 83.3%, and those of the physical examination were 78.6% and 90.0%, respectively. US provided improved sensitivity with lower specificity compared with physical examination in the detection of fractures in long bones. CONCLUSION: Ultrasound scans by minimally trained clinicians may be used to rule out a long-bone fracture in patients with a medium to low probability of fracture.
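    The quoted percentages are instances of the standard confusion-matrix definitions. The counts below are hypothetical (the abstract reports only the percentages) and serve only to illustrate the formulas:

```python
# Hypothetical counts chosen to illustrate the definitions, not the
# study's actual confusion matrix.
tp, fn = 13, 1      # fractures correctly flagged / missed
tn, fp = 25, 5      # non-fractures correctly cleared / falsely flagged

sensitivity = tp / (tp + fn)    # fraction of true fractures detected
specificity = tn / (tn + fp)    # fraction of non-fractures correctly ruled out
print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
```

    High sensitivity is what justifies the "rule out" conclusion: a negative result from a highly sensitive test makes a fracture unlikely.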

  5. Sub-surface defect detection by using active thermography and advanced image edge detection

    NASA Astrophysics Data System (ADS)

    Tse, Peter W.; Wang, Gaochao

    2017-05-01

    Active or pulsed thermography is a popular non-destructive testing (NDT) tool for inspecting the integrity and anomaly of industrial equipment. One of the recent research trends in using active thermography is to automate the process of detecting hidden defects. To date, human effort is still required to adjust the temperature intensity of the thermo-camera in order to visually observe the difference in cooling rates caused by a normal target as compared to that caused by a sub-surface crack inside the target. To avoid tedious human-visual inspection and minimize human-induced error, this paper reports the design of an automatic method that is capable of detecting sub-surface defects. The method combines active thermography, edge detection from machine vision, and a smart algorithm. An infrared thermo-camera was used to capture a series of temporal pictures after slightly heating the inspected target with flash lamps. The Canny edge detector was then employed to automatically extract defect-related images from the captured pictures. The captured temporal pictures were pre-processed by the Canny edge detector, and a smart algorithm was then used to reconstruct the whole sequence of image signals. During these processes, noise and irrelevant background in the pictures were removed. Consequently, the contrast of the edges of defective areas was enhanced. The designed automatic method was verified on real pipe specimens that contain sub-surface cracks. With this smart method, the edges of cracks can be revealed visually without the need for manual adjustment of the thermo-camera settings. With the help of this automatic method, the tedious process of manually adjusting the colour contrast and pixel intensity in order to reveal defects can be avoided.
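    The edge-extraction step can be illustrated with a simplified gradient detector: a Sobel magnitude with a global threshold, applied to a synthetic thermal frame containing a "warm" patch. (The full Canny detector used in the paper additionally applies Gaussian smoothing, non-maximum suppression and hysteresis thresholding; this is only a minimal stand-in.)

```python
import numpy as np

# Synthetic 40x40 thermal frame with a warm region standing in for a
# sub-surface defect signature (values are arbitrary).
frame = np.zeros((40, 40))
frame[10:30, 10:30] = 1.0

kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # Sobel x
ky = kx.T                                                          # Sobel y

def conv2(img, k):
    """Correlate a 3x3 kernel over the interior of the image."""
    out = np.zeros_like(img)
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = np.sum(img[i - 1:i + 2, j - 1:j + 2] * k)
    return out

grad = np.hypot(conv2(frame, kx), conv2(frame, ky))  # gradient magnitude
edges = grad > 0.5 * grad.max()                      # simple global threshold
print(f"edge pixels found: {int(edges.sum())}")
```

    The detector fires on the boundary of the warm patch and stays silent in its uniform interior, which is the behavior exploited to outline defective areas.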

  6. Sensitivity and accuracy of high-throughput metabarcoding methods for early detection of invasive fish species

    EPA Science Inventory

    For early detection biomonitoring of aquatic invasive species, sensitivity to rare individuals and accurate, high-resolution taxonomic classification are critical to minimize detection errors. Given the great expense and effort associated with morphological identification of many...

  7. Quantitative three-dimensional transrectal ultrasound (TRUS) for prostate imaging

    NASA Astrophysics Data System (ADS)

    Pathak, Sayan D.; Aarnink, Rene G.; de la Rosette, Jean J.; Chalana, Vikram; Wijkstra, Hessel; Haynor, David R.; Debruyne, Frans M. J.; Kim, Yongmin

    1998-06-01

    With the number of men seeking medical care for prostate diseases rising steadily, the need for a fast and accurate prostate boundary detection and volume estimation tool is increasingly felt by clinicians. Currently, these measurements are made manually, which results in long examination times. A possible solution is to improve efficiency by automating the boundary detection and volume estimation process with minimal involvement from human experts. In this paper, we present an algorithm based on SNAKES (active contours) to detect the boundaries. Our approach is to selectively enhance the contrast along the edges using an algorithm called sticks and to integrate it with a SNAKES model. This integrated algorithm requires an initial curve for each ultrasound image to initiate the boundary detection process. We have used different schemes to generate these curves with varying degrees of automation and evaluated their effects on algorithm performance. After the boundaries are identified, the prostate volume is calculated using planimetric volumetry. We have tested our algorithm on 6 different prostate volumes and compared its performance against the volumes manually measured by 3 experts. With increasing user input, the algorithm performance improved as expected. The results demonstrate that, given an initial contour reasonably close to the prostate boundaries, the algorithm successfully delineates the prostate boundaries in an image, and the resulting volume measurements are in close agreement with those made by the human experts.
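    The planimetric volumetry step mentioned above amounts to summing the cross-sectional areas outlined on successive parallel TRUS slices and multiplying by the slice spacing (the area values and spacing below are invented for illustration):

```python
import numpy as np

# Hypothetical delineated cross-sectional areas on parallel slices.
areas_mm2 = np.array([120.0, 410.0, 690.0, 805.0, 720.0, 430.0, 150.0])
spacing_mm = 5.0                                     # distance between slices

volume_ml = areas_mm2.sum() * spacing_mm / 1000.0    # mm^3 -> mL
print(f"estimated prostate volume: {volume_ml:.1f} mL")
```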

  8. Lateral-Line Detection of Underwater Objects: From Goldfish to Submarines

    NASA Astrophysics Data System (ADS)

    van Hemmen, J. Leo

    2010-03-01

    Fish and some aquatic amphibians use their mechanosensory lateral-line system to navigate by means of hydrodynamic cues. How a fish determines an object's position and shape only through the lateral-line system and the ensuing neuronal processing is still a challenging problem. Our studies have shown that both stimulus position and stimulus form can be determined within the range of about one fish length and are encoded through the response of the afferent nerves originating from the detectors. A minimal detection model of a vibrating sphere (a dipole) has now been extended to other stimuli such as translating spheres, ellipsoids, or even wakes (vortex rings). The theoretical model is fully verified by experimental data. We have also constructed an underwater robot with an artificial lateral-line system designed to detect e.g. the presence of walls by measuring the change of water flow around the body. We will show how a simple model fits experimental results obtained from trout and goldfish and how a submarine may well be able to detect underwater objects by using an artificial lateral-line system.

  9. Multi-temporal change image inference towards false alarms reduction for an operational photogrammetric rockfall detection system

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Kallimani, Christina; Tripolitsiotis, Achilleas

    2015-06-01

    Rockfall incidents affect civil security and hamper the sustainable growth of hard-to-access mountainous areas due to casualties, injuries and infrastructure loss. Rockfall occurrences cannot be easily prevented, and previous studies on multiple-sensor early detection systems for rockfalls have focused on large-scale incidents. However, even a single rock may cause the loss of a human life along transportation routes; thus, it is highly important to establish methods for the early detection of small-scale rockfall incidents. Terrestrial photogrammetric techniques are prone to a series of errors leading to false alarms, including vegetation, wind, and irrelevant change in the scene under consideration. In this study, photogrammetric monitoring of rockfall-prone slopes is established and the resulting multi-temporal change imagery is processed in order to minimize false alarms. Remote sensing imagery analysis techniques are hereby integrated to enhance early detection of a rockfall. Experimental data demonstrated that an operational system able to identify a 10-cm rock movement within a 10% false-alarm rate is technically feasible.

  10. Testing for the Presence of Correlation Changes in a Multivariate Time Series: A Permutation Based Approach.

    PubMed

    Cabrieto, Jedelyn; Tuerlinckx, Francis; Kuppens, Peter; Hunyadi, Borbála; Ceulemans, Eva

    2018-01-15

    Detecting abrupt correlation changes in multivariate time series is crucial in many application fields such as signal processing, functional neuroimaging, climate studies, and financial analysis. To detect such changes, several promising correlation change tests exist, but they may suffer from severe loss of power when there is actually more than one change point underlying the data. To deal with this drawback, we propose a permutation-based significance test for Kernel Change Point (KCP) detection on the running correlations. Given a requested number of change points K, KCP divides the time series into K + 1 phases by minimizing the within-phase variance. The new permutation test looks at how the average within-phase variance decreases when K increases and compares this to the results for permuted data. The results of an extensive simulation study and applications to several real data sets show that, depending on the setting, the new test performs at par with or better than the state-of-the-art significance tests for detecting the presence of correlation changes, implying that its use can be generally recommended.
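    The core idea, minimizing within-phase variance and comparing the resulting variance drop against permuted data, can be sketched for a univariate series with a single change point (K = 1). Note this is only an illustration of the principle; the actual method operates on running correlations of multivariate data and handles arbitrary K:

```python
import numpy as np

rng = np.random.default_rng(2)

# Series with one mean shift at t = 100.
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])

def best_split_cost(series):
    """Minimal summed within-phase variance over all K=1 splits."""
    n = len(series)
    return min(series[:k].var() * k + series[k:].var() * (n - k)
               for k in range(5, n - 5))

total_ss = x.var() * len(x)
drop = total_ss - best_split_cost(x)      # variance explained by the best split
perm_drops = [x.var() * len(x) - best_split_cost(rng.permutation(x))
              for _ in range(200)]
# Permutation p-value (with the usual +1 correction).
p_value = (1 + sum(d >= drop for d in perm_drops)) / (1 + len(perm_drops))
print(f"permutation p-value: {p_value:.3f}")
```

    On permuted data the best split explains little variance, so a large observed drop yields a small p-value, mirroring the logic of the proposed test.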

  11. Isothermal Amplification Methods for the Detection of Nucleic Acids in Microfluidic Devices

    PubMed Central

    Zanoli, Laura Maria; Spoto, Giuseppe

    2012-01-01

    Diagnostic tools for biomolecular detection need to fulfill specific requirements in terms of sensitivity, selectivity and throughput in order to widen their applicability and to minimize the cost of the assay. Nucleic acid amplification is a key step in DNA detection assays. It contributes to improving assay sensitivity by enabling the detection of a limited number of target molecules. The use of microfluidic devices to miniaturize amplification protocols reduces the required sample volume and analysis time, and offers new possibilities for process automation and integration in a single device. The vast majority of miniaturized systems for nucleic acid analysis exploit the polymerase chain reaction (PCR) amplification method, which requires repeated cycles of three or two temperature-dependent steps during the amplification of the nucleic acid target sequence. In contrast, low-temperature isothermal amplification methods have no need for thermal cycling and thus require simplified microfluidic device features. Here, the use of miniaturized analysis systems employing isothermal amplification reactions for nucleic acid amplification will be discussed. PMID:25587397

  12. Microfluidics-to-Mass Spectrometry: A review of coupling methods and applications

    PubMed Central

    Wang, Xue; Yi, Lian; Mukhitov, Nikita; Schrell, Adrian M.; Dhumpa, Raghuram; Roper, Michael G.

    2014-01-01

    Microfluidic devices offer great advantages in integrating sample processes, minimizing sample and reagent volumes, and increasing analysis speed, while mass spectrometry detection provides high information content, is sensitive, and can be used in quantitative analyses. The coupling of microfluidic devices to mass spectrometers is becoming more common with the strengths of both systems being combined to analyze precious and complex samples. This review summarizes select achievements published between 2010 – July 2014 in novel coupling between microfluidic devices and mass spectrometers. The review is subdivided by the types of ionization sources employed, and the different microfluidic systems used. PMID:25458901

  13. Micromotors to capture and destroy anthrax simulant spores.

    PubMed

    Orozco, Jahir; Pan, Guoqing; Sattayasamitsathit, Sirilak; Galarnyk, Michael; Wang, Joseph

    2015-03-07

    Towards addressing the need for detecting and eliminating biothreats, we describe a micromotor-based approach for screening, capturing, isolating and destroying anthrax simulant spores in a simple and rapid manner with minimal sample processing. The B. globigii antibody-functionalized micromotors can recognize, capture and transport B. globigii spores in environmental matrices, while showing no interaction with an excess of non-target bacteria. Efficient destruction of the anthrax simulant spores is demonstrated via the micromotor-induced mixing of a mild oxidizing solution. The new micromotor-based approach paves the way to dynamic multifunctional systems that rapidly recognize, isolate, capture and destroy biological threats.

  14. Minimum entropy density method for the time series analysis

    NASA Astrophysics Data System (ADS)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept for studying the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale at which the uncertainty is minimized and hence the pattern is most clearly revealed. The MEDM is applied to the financial time series of the Standard & Poor's 500 index from February 1983 to April 2006. The temporal behavior of the structure scale is then obtained and analyzed in relation to the information delivery time and the efficient market hypothesis.
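    One plausible reading of the method's core idea, scanning over scales and picking the one at which the series' increments are most concentrated (least uncertain), can be sketched as follows. This is our illustrative reconstruction on a toy series with a hidden period, not the authors' exact estimator:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy series: a hidden repeating pattern of scale 10 plus small noise.
pattern = rng.normal(0, 1, 10)
x = np.tile(pattern, 300) + rng.normal(0, 0.05, 3000)

def increment_entropy(series, lag):
    """Shannon entropy of lag-increments, binned on a fixed grid."""
    inc = series[lag:] - series[:-lag]
    counts, _ = np.histogram(inc, bins=np.arange(-6.0, 6.01, 0.25))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

scales = list(range(2, 31))
entropies = [increment_entropy(x, s) for s in scales]
structure_scale = scales[int(np.argmin(entropies))]
print(f"detected structure scale: {structure_scale}")
```

    At lags matching the hidden structure the increments collapse to near zero, so the entropy dips and the scan recovers the structure scale (here a multiple of 10).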

  15. Deep Learning for real-time gravitational wave detection and parameter estimation: Results with Advanced LIGO data

    NASA Astrophysics Data System (ADS)

    George, Daniel; Huerta, E. A.

    2018-03-01

    The recent Nobel-prize-winning detections of gravitational waves from merging black holes, and the subsequent detection of the collision of two neutron stars in coincidence with electromagnetic observations, have inaugurated a new era of multimessenger astrophysics. To enhance the scope of this emergent field of science, we pioneered the use of deep learning with convolutional neural networks that take time-series inputs for rapid detection and characterization of gravitational wave signals. This approach, Deep Filtering, was initially demonstrated using simulated LIGO noise. In this article, we present the extension of Deep Filtering using real data from LIGO, for both detection and parameter estimation of gravitational waves from binary black hole mergers using continuous data streams from multiple LIGO detectors. We demonstrate for the first time that machine learning can detect and estimate the true parameters of real events observed by LIGO. Our results show that Deep Filtering achieves similar sensitivities and lower errors compared to matched-filtering while being far more computationally efficient and more resilient to glitches, allowing real-time processing of weak time-series signals in non-stationary, non-Gaussian noise with minimal resources, and also enabling the detection of new classes of gravitational wave sources that may go unnoticed with existing detection algorithms. This unified framework for data analysis is ideally suited to enable coincident detection campaigns of gravitational waves and their multimessenger counterparts in real time.

  16. Probing GeV-scale MSSM neutralino dark matter in collider and direct detection experiments

    NASA Astrophysics Data System (ADS)

    Duan, Guang Hua; Wang, Wenyu; Wu, Lei; Yang, Jin Min; Zhao, Jun

    2018-03-01

    Given the recent constraints from dark matter (DM) direct detection experiments, we examine a light GeV-scale (2-30 GeV) neutralino DM in the alignment limit of the Minimal Supersymmetric Standard Model (MSSM). In this limit without decoupling, the heavy CP-even scalar H plays the role of the Standard Model (SM) Higgs boson while the other scalar h can be rather light, so that the DM can annihilate through the h resonance or into a pair of h to achieve the observed relic density. With the current collider and cosmological constraints, we find that such a light neutralino DM above 6 GeV can be excluded by the XENON-1T (2017) limits, while the surviving parameter space below 6 GeV can be fully tested by future germanium-based light dark matter detection experiments (such as CDEX), by Higgs coupling precision measurements, or by the production process e+e- → hA at an electron-positron collider (Higgs factory).

  17. Theory of mind in early psychosis.

    PubMed

    Langdon, Robyn; Still, Megan; Connors, Michael H; Ward, Philip B; Catts, Stanley V

    2014-08-01

    A deficit in theory of mind, the ability to infer and reason about the mental states of others, might underpin the poor social functioning of patients with psychosis. Unfortunately, however, there is considerable variation in how such a deficit is assessed. The current study compared three classic tests of theory of mind in terms of their ability to detect impairment in patients in the early stages of psychosis. Twenty-three patients within 2 years of their first psychotic episode and 19 healthy controls received picture-sequencing, joke-appreciation and story-comprehension tests of theory of mind. Whereas the picture-sequencing and joke-appreciation tests successfully detected a selective theory-of-mind deficit in patients, the story-comprehension test did not. The findings suggest that tests that place minimal demands on language processing and involve indirect, rather than explicit, instructions might be best suited to detecting theory-of-mind impairment in the early stages of psychosis. © 2013 Wiley Publishing Asia Pty Ltd.

  18. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    NASA Astrophysics Data System (ADS)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrivals times and correctly identifying phases, and relying on fusion algorithms to associate individual signal detections to form event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification is accomplished by combining the conditional probabilities from the entire network using a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. 
    The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; it has a formalized framework that utilizes source information, in particular the spectral characteristics of events of interest; it is entirely model-based, i.e. it does not rely on a priori information (particularly important for nuclear monitoring); and it does not rely on individualized signal detection thresholds: it is the network solution that matters.
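
    As a concrete (and deliberately simplified) illustration of the network fusion step, the sketch below combines per-station conditional probability traces under a naive independence assumption; the function name, the event prior, and the numerical noise floors are our own illustrative choices, not details of the ProbDet implementation.

```python
import numpy as np

def network_event_probability(station_probs, prior=1e-3):
    """Fuse per-station conditional probability traces into a network-level
    posterior P(event | all data), assuming conditional independence across
    stations (a naive-Bayes simplification of the Bayesian formulation).

    station_probs: one (p_given_event, p_given_noise) pair of arrays per
    station, all sampled on a common time axis."""
    log_lr = np.zeros_like(station_probs[0][0], dtype=float)
    for p_event, p_noise in station_probs:
        # Non-detecting stations still contribute: a low p_event at a quiet
        # station lowers the network posterior instead of being ignored.
        log_lr += np.log(p_event + 1e-12) - np.log(p_noise + 1e-12)
    prior_odds = prior / (1.0 - prior)
    posterior_odds = prior_odds * np.exp(log_lr)
    return posterior_odds / (1.0 + posterior_odds)
```

    With three stations that each strongly favor an event at one time sample, the network posterior approaches 1 there while samples consistent with noise stay near the prior.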

  19. Production of gravitational waves during preheating with nonminimal coupling

    NASA Astrophysics Data System (ADS)

    Fu, Chengjie; Wu, Puxun; Yu, Hongwei

    2018-04-01

    We study preheating after inflation, and the accompanying production of gravitational waves (GWs), in a model in which the inflaton is nonminimally coupled to the curvature and has a self-interacting quartic potential, using the method of lattice simulation. We find that the nonminimal coupling enhances the amplitude of the density spectrum of inflaton quanta and, as a result, the peak value of the GW spectrum generated during preheating is enhanced as well and might reach the detection limit of future GW experiments. The peaks of the GW spectrum not only exhibit distinctive characteristics as compared to those of minimally coupled inflaton potentials but also imprint information on the nonminimal coupling and the parametric resonance, and thus the detection of these peaks in the future will provide us with a new avenue to reveal the physics of the early universe.

  20. Intermittent search strategies

    NASA Astrophysics Data System (ADS)

    Bénichou, O.; Loverdo, C.; Moreau, M.; Voituriez, R.

    2011-01-01

    This review examines intermittent target search strategies, which combine phases of slow motion, allowing the searcher to detect the target, and phases of fast motion during which targets cannot be detected. It is first shown that intermittent search strategies are actually widely observed at various scales. At the macroscopic scale, this is, for example, the case of animals looking for food; at the microscopic scale, intermittent transport patterns are involved in a reaction pathway of DNA-binding proteins as well as in intracellular transport. Second, generic stochastic models are introduced, which show that intermittent strategies can minimize the search time and are therefore efficient. This suggests that the intrinsic efficiency of intermittent search strategies could justify their frequent observation in nature. Last, beyond these modeling aspects, it is proposed that intermittent strategies could also be used in a broader context to design and accelerate search processes.

  1. Assessment of DNA degradation induced by thermal and UV radiation processing: implications for quantification of genetically modified organisms.

    PubMed

    Ballari, Rajashekhar V; Martin, Asha

    2013-12-01

    DNA quality is an important parameter for the detection and quantification of genetically modified organisms (GMOs) using the polymerase chain reaction (PCR). Food processing leads to degradation of DNA, which may impair GMO detection and quantification. This study evaluated the effect of various processing treatments such as heating, baking, microwaving, autoclaving and ultraviolet (UV) irradiation on the relative transgenic content of MON 810 maize using pRSETMON-02, a dual-target plasmid, as a model system. Amongst all the processing treatments examined, autoclaving and UV irradiation resulted in the least recovery of the transgenic (CaMV 35S promoter) and taxon-specific (zein) target DNA sequences. Although a profound impact on DNA degradation was seen during the processing, DNA could still be reliably quantified by real-time PCR. The measured mean DNA copy number ratios of the processed samples were in agreement with the expected values. Our study confirms the premise that the final analytical value assigned to a particular sample is independent of the degree of DNA degradation, since transgenic and taxon-specific target sequences of approximately similar lengths degrade in parallel. The results of our study demonstrate that food processing does not alter the relative quantification of the transgenic content provided the quantitative assays target shorter amplicons and the difference in amplicon size between the transgenic and taxon-specific genes is minimal. Copyright © 2013 Elsevier Ltd. All rights reserved.
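
    The copy-number-ratio quantification described above can be illustrated with a short sketch; the standard-curve slope and intercept below are assumed plasmid-calibration values for illustration, not figures from the study.

```python
def copies_from_ct(ct, slope, intercept):
    """Convert a qPCR Ct value to a copy number via a standard curve of the
    form Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def gmo_ratio(ct_transgene, ct_taxon, slope=-3.32, intercept=38.0):
    """Relative transgenic content as the copy-number ratio of the transgene
    target to the taxon-specific target, in percent. Slope and intercept are
    assumed calibration values (slope -3.32 corresponds to 100% efficiency)."""
    return 100.0 * copies_from_ct(ct_transgene, slope, intercept) \
                 / copies_from_ct(ct_taxon, slope, intercept)
```

    With a shared standard curve, the ratio depends only on the Ct difference; a transgene Ct that is 3.32 cycles later than the taxon Ct corresponds to a 10% transgenic content.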

  2. Mechanisms of food processing and storage-related stress tolerance in Clostridium botulinum.

    PubMed

    Dahlsten, Elias; Lindström, Miia; Korkeala, Hannu

    2015-05-01

    Vegetative cultures of Clostridium botulinum produce the extremely potent botulinum neurotoxin, and may jeopardize the safety of foods unless sufficient measures to prevent growth are applied. Minimal food processing relies on combinations of mild treatments, primarily to avoid deterioration of the sensory qualities of the food. Tolerance of C. botulinum to minimal food processing is well characterized. However, data on the effects of successive treatments on robustness towards further processing are lacking. Developments in genetic manipulation tools and the availability of annotated genomes have allowed identification of genetic mechanisms involved in stress tolerance of C. botulinum. Most studies focused on low temperature, and the importance of various regulatory mechanisms in cold tolerance of C. botulinum has been demonstrated. Furthermore, novel roles in cold tolerance were shown for metabolic pathways under the control of these regulators. A role for secondary oxidative stress in tolerance to extreme temperatures has been proposed. Additionally, genetic mechanisms related to tolerance to heat, low pH, and high salinity have been characterized. Data on genetic stress-related mechanisms of psychrotrophic Group II C. botulinum strains are scarce; these mechanisms are of interest for food safety research and should thus be investigated. This minireview encompasses the importance of C. botulinum as a food safety hazard and its central physiological characteristics related to food-processing and storage-related stress. Special attention is given to recent findings concerning the genetic mechanisms C. botulinum utilizes in detecting and countering these adverse conditions. Copyright © 2014 Institut Pasteur. Published by Elsevier Masson SAS. All rights reserved.

  3. Parallel, confocal, and complete spectrum imager for fluorescent detection of high-density microarray

    NASA Astrophysics Data System (ADS)

    Bogdanov, Valery L.; Boyce-Jacino, Michael

    1999-05-01

    Confined arrays of biochemical probes deposited on a solid support surface (an analytical microarray or 'chip') provide an opportunity to analyze multiple reactions simultaneously. Microarrays are increasingly used in genetics, medicine and environmental scanning as research and analytical instruments. The power of microarray technology comes from its parallelism, which grows with array miniaturization, minimization of reagent volume per reaction site and reaction multiplexing. An optical detector of microarray signals should combine high sensitivity with spatial and spectral resolution. Additionally, low cost and a high processing rate are needed to transfer microarray technology into biomedical practice. We designed an imager that provides confocal and complete-spectrum detection of an entire fluorescently labeled microarray in parallel. The imager uses a microlens array, a non-slit spectral decomposer, and a highly sensitive detector (cooled CCD). Two imaging channels provide simultaneous detection of the localization, integrated intensity and spectral intensity of each reaction site in the microarray. Dimensional matching between the microarray and the imager's optics eliminates all moving parts in the instrument, enabling highly informative, fast and low-cost microarray detection. We report the theory of confocal hyperspectral imaging with a microlens array and experimental data for the implementation of the developed imager to detect a fluorescently labeled microarray with a density of approximately 10³ sites per cm².

  4. Near‐surface void detection using a seismic landstreamer and horizontal velocity and attenuation tomography

    USGS Publications Warehouse

    Buckley, Sean F.; Lane, John W.

    2012-01-01

    The detection and characterization of subsurface voids plays an important role in the study of karst formations and clandestine tunnels. Horizontal velocity and attenuation tomography (HVAT) using offset‐fan shooting and a towed seismic land streamer is a simple, rapid, minimally invasive method that shows promise for detecting near‐surface voids and providing information on the orientation of linear voids. HVAT surveys were conducted over a known subsurface steam tunnel on the University of Connecticut Depot Campus, Storrs, Connecticut. First‐arrival travel‐time and amplitude data were used to produce two‐dimensional (2D) horizontal (map view) velocity and attenuation tomograms. In addition, attenuation tomograms were produced based on normalized total trace energy (TTE). Both the velocity and TTE attenuation tomograms depict an anomaly consistent with the location and orientation of the known tunnel; the TTE method, however, requires significantly less processing time, and therefore may provide a path forward to semi‐automated, near real‐time detection of near‐surface voids. Further study is needed to assess the utility of the HVAT method to detect deeper voids and the effects of a more complex geology on HVAT results.
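
    A minimal sketch of the normalized total trace energy quantity underlying the TTE attenuation tomograms, as we read it from the abstract; the normalization by the network maximum is our assumption.

```python
import numpy as np

def normalized_total_trace_energy(traces):
    """Total trace energy per receiver (sum of squared samples along the
    time axis), normalized by the maximum energy across receivers. Low
    values behind a void would indicate strong attenuation."""
    energy = (np.asarray(traces, dtype=float) ** 2).sum(axis=-1)
    return energy / energy.max()
```

    Because no travel-time picking or tomographic inversion is needed to compute it, this quantity supports the abstract's point that TTE processing is far faster than velocity tomography.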

  5. Spatially Varying Spectral Thresholds for MODIS Cloud Detection

    NASA Technical Reports Server (NTRS)

    Haines, S. L.; Jedlovec, G. J.; Lafontaine, F.

    2004-01-01

    The EOS science team has developed an elaborate global MODIS cloud detection procedure, and the resulting MODIS product (MOD35) is used in the retrieval process of several geophysical parameters to mask out clouds. While the global application of the cloud detection approach appears quite robust, the product has some shortcomings on the regional scale, often over-determining clouds in a variety of settings, particularly at night. This over-determination of clouds can cause a reduction in the spatial coverage of MODIS derived clear-sky products. To minimize this problem, a new regional cloud detection method for use with MODIS data has been developed at NASA's Global Hydrology and Climate Center (GHCC). The approach is similar to that used by the GHCC for GOES data over the continental United States. Several spatially varying thresholds are applied to MODIS spectral data to produce a set of tests for detecting clouds. The thresholds are valid for each MODIS orbital pass, and are derived from 20-day composites of GOES channels with similar wavelengths to MODIS. This paper and accompanying poster will introduce the GHCC MODIS cloud mask, provide some examples, and present some preliminary validation.
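
    One spatially varying threshold test can be sketched as follows; the 11-micron band choice and the 4 K offset are illustrative assumptions on our part, not the GHCC values.

```python
import numpy as np

def cloud_test(bt11, clear_composite, offset=4.0):
    """Flag a pixel as cloudy when its 11-micron brightness temperature (K)
    falls more than `offset` K below the 20-day clear-sky composite value
    for that location. Because the composite varies pixel by pixel, the
    effective threshold is spatially varying rather than global."""
    return bt11 < (clear_composite - offset)
```

    In a full mask, several such tests over different bands would be combined, and a pixel flagged by any of them would be labeled cloudy.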

  6. A multi-step system for screening and localization of hard exudates in retinal images

    NASA Astrophysics Data System (ADS)

    Bopardikar, Ajit S.; Bhola, Vishal; Raghavendra, B. S.; Narayanan, Rangavittal

    2012-03-01

    The number of people being affected by diabetes mellitus worldwide is increasing at an alarming rate. Monitoring of the diabetic condition and its effects on the human body is therefore of great importance. Of particular interest is diabetic retinopathy (DR), which is a result of prolonged, unchecked diabetes and affects the visual system. DR is a leading cause of blindness throughout the world. At any point in time, 25-44% of people with diabetes are afflicted by DR. Automation of the screening and monitoring process for DR is therefore essential for efficient utilization of healthcare resources and optimizing treatment of the affected individuals. Such automation would use retinal images and detect the presence of specific artifacts such as hard exudates, hemorrhages and soft exudates (that may appear in the image) to gauge the severity of DR. In this paper, we focus on the detection of hard exudates. We propose a two-step system that consists of a screening step that classifies retinal images as normal or abnormal based on the presence of hard exudates and a detection stage that localizes these artifacts in an abnormal retinal image. The proposed screening step automatically detects the presence of hard exudates with a high sensitivity and positive predictive value (PPV). The detection/localization step uses a k-means based clustering approach to localize hard exudates in the retinal image. Suitable feature vectors are chosen based on their ability to isolate hard exudates while minimizing false detections. The algorithm was tested on a benchmark dataset (DIARETDB1) and was seen to provide superior performance compared to existing methods. The two-step process described in this paper can be embedded in a tele-ophthalmology system to aid with speedy detection and diagnosis of the severity of DR.
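
    The k-means localization step might look like the sketch below, using a small deterministic k-means on per-pixel intensity features; the single-channel feature and the "brightest cluster" rule are our simplifications of the paper's feature vectors.

```python
import numpy as np

def kmeans(X, k=3, iters=20):
    """Plain k-means on feature rows of X, with centers initialized evenly
    across the data range so the result is deterministic."""
    centers = np.linspace(X.min(axis=0), X.max(axis=0), k)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

def localize_bright_lesions(image, k=3):
    """Cluster pixels by intensity and return a mask of the brightest
    cluster, a stand-in for hard-exudate candidates, which appear as
    bright regions in fundus images."""
    feats = image.reshape(-1, 1).astype(float)
    labels, centers = kmeans(feats, k=k)
    brightest = int(np.argmax(centers[:, 0]))
    return (labels == brightest).reshape(image.shape)
```

    On a real image, the mask would then be filtered (e.g. by removing the optic disc region) to minimize false detections, as the abstract notes.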

  7. Automated cloud and shadow detection and filling using two-date Landsat imagery in the United States

    USGS Publications Warehouse

    Jin, Suming; Homer, Collin G.; Yang, Limin; Xian, George; Fry, Joyce; Danielson, Patrick; Townsend, Philip A.

    2013-01-01

    A simple, efficient, and practical approach for detecting cloud and shadow areas in satellite imagery and restoring them with clean pixel values has been developed. Cloud and shadow areas are detected using spectral information from the blue, shortwave infrared, and thermal infrared bands of Landsat Thematic Mapper or Enhanced Thematic Mapper Plus imagery from two dates (a target image and a reference image). These detected cloud and shadow areas are further refined using an integration process and a false shadow removal process according to the geometric relationship between cloud and shadow. Cloud and shadow filling is based on the concept of the Spectral Similarity Group (SSG), which uses the reference image to find similar alternative pixels in the target image to serve as replacement values for restored areas. Pixels are considered to belong to one SSG if the pixel values from Landsat bands 3, 4, and 5 in the reference image are within the same spectral ranges. This new approach was applied to five Landsat path/rows across different landscapes and seasons with various types of cloud patterns. Results show that almost all of the clouds were captured with minimal commission errors, and shadows were detected reasonably well. Among five test scenes, the lowest producer's accuracy of cloud detection was 93.9% and the lowest user's accuracy was 89%. The overall cloud and shadow detection accuracy ranged from 83.6% to 99.3%. The pixel-filling approach resulted in a new cloud-free image that appears seamless and spatially continuous despite differences in phenology between the target and reference images. Our methods offer a straightforward and robust approach for preparing images for the new 2011 National Land Cover Database production.
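
    The Spectral Similarity Group filling concept can be sketched as follows; the bin width and the mean-replacement rule are our assumptions (the paper groups pixels whose reference-image values for Landsat bands 3, 4, and 5 fall within the same spectral ranges).

```python
import numpy as np

def fill_clouds_ssg(target, reference, cloud_mask, bin_width=0.1):
    """Fill cloud-masked pixels in `target` using a Spectral Similarity
    Group (SSG) rule: pixels whose reference-image band values fall in the
    same spectral bins form one group, and a cloudy target pixel is
    replaced by the mean of clear target pixels from its group.

    target, reference: (rows, cols, bands); cloud_mask: bool (rows, cols)."""
    bins = np.floor(reference / bin_width).astype(int)   # per-band range index
    keys = bins.reshape(-1, bins.shape[-1])
    flat_t = target.reshape(-1, target.shape[-1]).astype(float)
    flat_mask = cloud_mask.ravel()
    filled = flat_t.copy()
    groups = {}
    for i in np.where(~flat_mask)[0]:                    # index clear pixels by SSG key
        groups.setdefault(tuple(keys[i]), []).append(i)
    for i in np.where(flat_mask)[0]:                     # fill cloudy pixels
        members = groups.get(tuple(keys[i]))
        if members:
            filled[i] = flat_t[members].mean(axis=0)
    return filled.reshape(target.shape)
```

    Because replacements come from the target image itself, the filled result stays radiometrically consistent with the target scene even when phenology differs between the two dates.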

  8. The Parameterization of Top-Hat Particle Sensors with Microchannel-Plate-Based Detection Systems and its Application to the Fast Plasma Investigation on NASA's Magnetospheric MultiScale Mission

    NASA Technical Reports Server (NTRS)

    Gershman, Daniel J.; Gliese, Ulrik; Dorelli, John C.; Avanov, Levon A.; Barrie, Alexander C.; Chornay, Dennis J.; MacDonald, Elizabeth A.; Holland, Matthew P.; Pollock, Craig J.

    2015-01-01

    The most common instrument for measuring low-energy plasmas consists of a top-hat electrostatic analyzer geometry coupled with a microchannel-plate (MCP)-based detection system. While the electrostatic optics for such sensors are readily simulated and parameterized during the laboratory calibration process, the detection system is often less well characterized. Furthermore, due to finite resources, for large sensor suites such as the Fast Plasma Investigation (FPI) on NASA's Magnetospheric Multiscale (MMS) mission, calibration data are increasingly sparse. Measurements must be interpolated and extrapolated to understand instrument behavior for untestable operating modes, and yet sensor inter-calibration is critical to mission success. To characterize instruments from a minimal set of parameters we have developed the first comprehensive mathematical description of both sensor electrostatic optics and particle detection systems. We include effects of MCP efficiency, gain, scattering, capacitive crosstalk, and charge cloud spreading at the detector output. Our parameterization enables the interpolation and extrapolation of instrument response to all relevant particle energies, detector high voltage settings, and polar angles from a small set of calibration data. We apply this model to the 32 sensor heads in the Dual Electron Sensor (DES) and 32 sensor heads in the Dual Ion Sensor (DIS) instruments on the 4 MMS observatories and use least squares fitting of calibration data to extract all key instrument parameters. Parameters that will evolve in flight, namely MCP gain, will be determined daily through application of this model to specifically tailored in-flight calibration activities, providing a robust characterization of sensor suite performance throughout mission lifetime.
Beyond FPI, our model provides a valuable framework for the simulation and evaluation of future detection system designs and can be used to maximize instrument understanding with minimal calibration resources.
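
    As a toy illustration of extracting detector parameters from calibration data by least squares, the sketch below fits a simple exponential gain-versus-voltage model; the model form and numbers are our assumptions and are far simpler than the FPI parameterization.

```python
import numpy as np

def fit_gain_curve(voltages, counts):
    """Least-squares fit of an assumed exponential gain model
    G(V) = g0 * exp(k * V) to calibration points, performed as a
    linear fit in log space."""
    k, log_g0 = np.polyfit(np.asarray(voltages, float),
                           np.log(np.asarray(counts, float)), 1)
    return np.exp(log_g0), k
```

    A fitted curve of this kind is what lets daily in-flight calibration points update a parameter such as MCP gain without retesting the full operating range.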

  9. Comparative responsiveness and minimal clinically important differences for idiopathic ulnar impaction syndrome.

    PubMed

    Kim, Jae Kwang; Park, Eun Soo

    2013-05-01

    Patient-reported questionnaires have been widely used to predict symptom severity and functional disability in musculoskeletal disease. Importantly, questionnaires can detect clinical changes in patients; however, this responsiveness has not been determined for ulnar impaction syndrome. We asked (1) which of the Patient-Rated Wrist Evaluation (PRWE), the DASH, and other physical measures was most responsive to clinical improvements, and (2) what was the minimal clinically important difference for the PRWE and DASH after ulnar shortening osteotomy for idiopathic ulnar impaction syndrome. All patients who underwent ulnar shortening osteotomy between March 2008 and February 2011 for idiopathic ulnar impaction syndrome were enrolled in this study. All patients completed the PRWE and DASH questionnaires, and all were evaluated for grip strength and wrist ROM, preoperatively and 12 months postoperatively. We compared the effect sizes observed by each of these instruments. Effect size is calculated by dividing the mean change in a score of each instrument during a specified interval by the standard deviation of the baseline score. In addition, patient-perceived overall improvement was used as the anchor to determine the minimal clinically important differences on the PRWE and DASH 12 months after surgery. The average score of each item except for wrist flexion and supination improved after surgery. The PRWE was more sensitive than the DASH or than physical measurements in detecting clinical changes. The effect sizes and standardized response means of the outcome measures were as follows: PRWE (1.51, 1.64), DASH (1.12, 1.24), grip strength (0.59, 0.68), wrist pronation (0.33, 0.41), and wrist extension (0.28, 0.36). Patient-perceived overall improvement and score changes of the PRWE and DASH correlated significantly.
Minimal clinically important differences were 17 points (of a possible 100) for the PRWE and 13.5 for the DASH (also of 100), and minimal detectable changes were 7.7 points for the PRWE and 9.3 points for the DASH. Although the PRWE and DASH were highly sensitive to clinical changes, the PRWE was more sensitive in terms of detecting clinical changes after ulnar shortening osteotomy for idiopathic ulnar impaction syndrome. A minimal change of 17 PRWE points or 13.5 DASH points was necessary to achieve a benefit that patients perceived as clinically important. The minimal clinically important differences using these instruments were higher than the values produced by measurement errors.
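
    The responsiveness statistics defined above translate directly into code. Treating change as baseline minus follow-up (improvement for instruments like the PRWE and DASH, where lower scores mean less disability) and using the sample standard deviation are our assumptions; the abstract does not state either convention.

```python
import numpy as np

def effect_size(baseline, followup):
    """Effect size as defined in the abstract: mean change between baseline
    and follow-up, divided by the standard deviation of the baseline scores."""
    baseline = np.asarray(baseline, float)
    change = baseline - np.asarray(followup, float)
    return change.mean() / baseline.std(ddof=1)

def standardized_response_mean(baseline, followup):
    """Companion responsiveness statistic also reported in the abstract:
    mean change divided by the standard deviation of the change scores."""
    change = np.asarray(baseline, float) - np.asarray(followup, float)
    return change.mean() / change.std(ddof=1)
```

    The two statistics differ only in the denominator, which is why instruments with uniform improvements (small change-score spread) show a larger standardized response mean than effect size.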

  10. Integrated immunoassay using tuneable surface acoustic waves and lensfree detection.

    PubMed

    Bourquin, Yannyk; Reboud, Julien; Wilson, Rab; Zhang, Yi; Cooper, Jonathan M

    2011-08-21

    The diagnosis of infectious diseases in the Developing World is technologically challenging, requiring complex biological assays with high analytical performance at minimal cost. By using an opto-acoustic immunoassay technology integrating components commonly used in mobile phone technologies, including surface acoustic wave (SAW) transducers to provide pressure-driven flow and a CMOS camera to enable a lensfree detection technique, we demonstrate the potential to produce such an assay. To achieve this, antibody-functionalised microparticles were manipulated on a low-cost disposable cartridge using the surface acoustic waves and were then detected optically. Our results show that the biomarker, interferon-γ, used for the diagnosis of diseases such as latent tuberculosis, can be detected at pM concentrations within a few minutes (giving high sensitivity at a minimal cost). This journal is © The Royal Society of Chemistry 2011

  11. Improved sensitivity to fluorescence for cancer detection in wide-field image-guided neurosurgery

    PubMed Central

    Jermyn, Michael; Gosselin, Yoann; Valdes, Pablo A.; Sibai, Mira; Kolste, Kolbein; Mercier, Jeanne; Angulo, Leticia; Roberts, David W.; Paulsen, Keith D.; Petrecca, Kevin; Daigle, Olivier; Wilson, Brian C.; Leblond, Frederic

    2015-01-01

    In glioma surgery, Protoporphyrin IX (PpIX) fluorescence may identify residual tumor that could be resected while minimizing damage to normal brain. We demonstrate that improved sensitivity for wide-field spectroscopic fluorescence imaging is achieved with minimal disruption to the neurosurgical workflow using an electron-multiplying charge-coupled device (EMCCD) relative to a state-of-the-art CMOS system. In phantom experiments the EMCCD system can detect at least two orders-of-magnitude lower PpIX. Ex vivo tissue imaging on a rat glioma model demonstrates improved fluorescence contrast compared with neurosurgical fluorescence microscope technology, and the fluorescence detection is confirmed with measurements from a clinically-validated spectroscopic probe. Greater PpIX sensitivity in wide-field fluorescence imaging may improve the residual tumor detection during surgery with consequent impact on survival. PMID:26713218

  12. On-chip electrical detection of parallel loop-mediated isothermal amplification with DG-BioFETs for the detection of foodborne bacterial pathogens

    USDA-ARS?s Scientific Manuscript database

    The use of field effect transistors (FETs) as the transduction element for the detection of DNA amplification reactions will enable portable and inexpensive nucleic acid analysis. Transistors used as biological sensors,or BioFETs, minimize the cost and size of detection platforms by leveraging fabri...

  13. Post-mortem MRI versus conventional autopsy in fetuses and children: a prospective validation study.

    PubMed

    Thayyil, Sudhin; Sebire, Neil J; Chitty, Lyn S; Wade, Angie; Chong, Wk; Olsen, Oystein; Gunny, Roxana S; Offiah, Amaka C; Owens, Catherine M; Saunders, Dawn E; Scott, Rosemary J; Jones, Rod; Norman, Wendy; Addison, Shea; Bainbridge, Alan; Cady, Ernest B; Vita, Enrico De; Robertson, Nicola J; Taylor, Andrew M

    2013-07-20

    Post-mortem MRI is a potential diagnostic alternative to conventional autopsy, but few large prospective studies have compared its accuracy with that of conventional autopsy. We assessed the accuracy of whole-body, post-mortem MRI for detection of major pathological lesions associated with death in a prospective cohort of fetuses and children. In this prospective validation study, we did pre-autopsy, post-mortem, whole-body MRI at 1·5 T in an unselected population of fetuses (≤24 weeks' or >24 weeks' gestation) and children (aged <16 years) at two UK centres in London between March 1, 2007 and Sept 30, 2011. With conventional autopsy as the diagnostic gold standard, we assessed MRI findings alone, or in conjunction with other minimally invasive post-mortem investigations (minimally invasive autopsy), for accuracy in detection of cause of death or major pathological abnormalities. A radiologist and pathologist who were masked to the autopsy findings indicated whether the minimally invasive autopsy would have been adequate. The primary outcome was concordance rate between minimally invasive and conventional autopsy. We analysed 400 cases, of which 277 (69%) were fetuses and 123 (31%) were children. Cause of death or major pathological lesion detected by minimally invasive autopsy was concordant with conventional autopsy in 357 (89·3%, 95% CI 85·8-91·9) cases: 175 (94·6%, 90·3-97·0) of 185 fetuses at 24 weeks' gestation or less, 88 (95·7%, 89·3-98·3) of 92 fetuses at more than 24 weeks' gestation, 34 (81·0%, 66·7-90·0) [corrected] of 42 newborns aged 1 month or younger, 45 (84·9%, 72·9-92·1) of 53 infants aged older than 1 month to 1 year or younger, and 15 (53·6%, 35·8-70·5) of 28 children aged older than 1 year to 16 years or younger. 
The dedicated radiologist or pathologist review of the minimally invasive autopsy showed that in 165 (41%) cases a full autopsy might not have been needed; in these cases, concordance between autopsy and minimally invasive autopsy was 99·4% (96·6-99·9). Minimally invasive autopsy has accuracy similar to that of conventional autopsy for detection of cause of death or major pathological abnormality after death in fetuses, newborns, and infants, but was less accurate in older children. If undertaken jointly by pathologists and radiologists, minimally invasive autopsy could be an acceptable alternative to conventional autopsy in selected cases. Policy research Programme, Department of Health, UK. Copyright © 2013 Elsevier Ltd. All rights reserved.
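
    The concordance figures above are binomial proportions with confidence intervals. The paper's exact interval method is not stated, but a Wilson score interval (a common choice for proportions) reproduces the quoted 85.8-91.9% interval for 357 of 400 concordant cases; a minimal sketch:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion,
    returned as (lower, upper) fractions."""
    p = successes / n
    denom = 1.0 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half
```

    For the overall cohort, wilson_ci(357, 400) gives approximately (0.858, 0.919), matching the abstract; the much wider interval for the 28 older children reflects the small subgroup size.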

  14. Investigation of Matlab® as platform in navigation and control of an Automatic Guided Vehicle utilising an omnivision sensor.

    PubMed

    Kotze, Ben; Jordaan, Gerrit

    2014-08-25

    Automatic Guided Vehicles (AGVs) are navigated utilising multiple types of sensors for detecting the environment. In this investigation such sensors are replaced and/or minimized by the use of a single omnidirectional camera picture stream. An area of interest is extracted, and by using image processing the vehicle is navigated on a set path. Reconfigurability is added to the route layout by signs incorporated in the navigation process. The result is the possible manipulation of a number of AGVs, each on its own designated colour-signed path. This route is reconfigurable by the operator with no programming alteration or intervention. A low resolution camera and a Matlab® software development platform are utilised. The use of Matlab® lends itself to speedy evaluation and implementation of image processing options on the AGV, but its functioning in such an environment needs to be assessed.

  15. Boundary Conditions for the Paleoenvironment: Chemical and Physical Processes in the Pre-Solar Nebula

    NASA Technical Reports Server (NTRS)

    Irvine, William M.; Schloerb, F. Peter

    1997-01-01

    The basic theme of this program is the study of molecular complexity and evolution in interstellar clouds and in primitive solar system objects. Research has included the detection and study of a number of new interstellar molecules and investigation of reaction pathways for astrochemistry from a comparison of theory and observed molecular abundances. The latter includes studies of cold, dark clouds in which ion-molecule chemistry should predominate, searches for the effects of interchange of material between the gas and solid phases in interstellar clouds, unbiased spectral surveys of particular sources, and systematic investigation of the interlinked chemistry and physics of dense interstellar clouds. In addition, the study of comets has allowed a comparison between the chemistry of such minimally thermally processed objects and that of interstellar clouds, shedding light on the evolution of the biogenic elements during the process of solar system formation.

  16. Investigation of Matlab® as Platform in Navigation and Control of an Automatic Guided Vehicle Utilising an Omnivision Sensor

    PubMed Central

    Kotze, Ben; Jordaan, Gerrit

    2014-01-01

    Automatic Guided Vehicles (AGVs) are navigated utilising multiple types of sensors for detecting the environment. In this investigation such sensors are replaced and/or minimized by the use of a single omnidirectional camera picture stream. An area of interest is extracted, and by using image processing the vehicle is navigated on a set path. Reconfigurability is added to the route layout by signs incorporated in the navigation process. The result is the possible manipulation of a number of AGVs, each on its own designated colour-signed path. This route is reconfigurable by the operator with no programming alteration or intervention. A low resolution camera and a Matlab® software development platform are utilised. The use of Matlab® lends itself to speedy evaluation and implementation of image processing options on the AGV, but its functioning in such an environment needs to be assessed. PMID:25157548

  17. Knockdown of Polyphenol Oxidase Gene Expression in Potato (Solanum tuberosum L.) with Artificial MicroRNAs.

    PubMed

    Chi, Ming; Bhagwat, Basdeo; Tang, Guiliang; Xiang, Yu

    2016-01-01

    It is of great importance and interest to develop crop varieties with low polyphenol oxidase (PPO) activity for the food industry, because PPO-mediated oxidative browning is a main cause of post-harvest deterioration and quality loss of fresh produce and processed foods. We recently demonstrated that potato tubers with reduced browning phenotypes can be produced by inhibition of the expression of several PPO gene isoforms using artificial microRNA (amiRNA) technology. The approach introduces a single type of 21-nucleotide RNA population to guide silencing of the PPO gene transcripts in potato tissues. Some advantages of the technology are that only small RNA-encoding sequences are introduced by genetic transformation, that off-target gene silencing can be avoided or minimized at the stage of amiRNA design, and that the accuracy and efficiency of the process can be verified at every step using molecular biology techniques. Here we describe the methods for transformation and regeneration of potatoes with amiRNA vectors, detection of the expression of amiRNAs, identification of the cleaved product of the target gene transcripts, and assay of the expression level of PPO gene isoforms in potatoes.

  18. [Oligonucleotide derivatives in the nucleic acid hybridization analysis. II. Isothermal signal amplification in process of DNA analysis by minisequencing].

    PubMed

    Dmitrienko, E V; Khomiakova, E A; Pyshnaia; Bragin, A G; Vedernikov, V E; Pyshnyĭ, D V

    2010-01-01

    The isothermal amplification of reporter signal via limited probe extension (minisequencing) upon hybridization of nucleic acids has been studied. The intensity of reporter signal has been shown to increase due to enzymatic labeling of multiple probes upon consecutive hybridization with one DNA template both in homophase and heterophase assays using various kinds of detection signal: radioisotope label, fluorescent label, and enzyme-linked assay. The kinetic scheme of the process has been proposed and kinetic parameters for each step have been determined. The signal intensity has been shown to correlate with physicochemical characteristics of both complexes: probe/DNA and product/DNA. The maximum intensity has been observed at minimal difference between the thermodynamic stability of these complexes, provided the reaction temperature has been adjusted near their melting temperature values; rising or lowering the reaction temperature reduces the amount of reporting product. The signal intensity has been shown to decrease significantly upon hybridization with the DNA template containing single-nucleotide mismatches. Limited probe extension assay is useful not only for detection of DNA template but also for its quantitative characterization.

  19. Oxychlorine Detections on Mars: Implications for Cl Cycling

    NASA Technical Reports Server (NTRS)

    Sutter, B.; Jackson, W. A.; Ming, D. W.; Archer, P. D.; Stern, J. C.; Mahaffy, P. R.; Gellert, R.

    2016-01-01

    The Sample Analysis at Mars (SAM) instrument has detected evolved O2 and HCl, indicating the presence of perchlorate and/or chlorate (oxychlorine) in all 11 sediments analyzed to date. The hyperarid martian climate is believed to have allowed accumulation of oxychlorine and assumed chloride contents similar to those in hyperarid terrestrial settings. The linear correlation of oxychlorine and chloride in Gale Crater sediments is low (r² = 0.64). Correlations present in hyperarid Antarctica and the Atacama Desert are attributed to an unaltered atmospheric source coupled with minimal redox cycling by biological activity. Terrestrial semi-arid to arid settings have low correlations similar to Gale Crater, attributed to additional inputs of Cl⁻ from sea salt, dust, and/or proximal playa settings, and possible reduction of oxychlorine phases during wetter periods. While microbiological processes could contribute to low oxychlorine/chloride correlations on Mars, several abiotic mechanisms are more likely, such as changing oxychlorine production rates with time and/or post-depositional geochemical redox processes that altered the Gale Crater oxychlorine and chloride contents.
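
    The reported oxychlorine/chloride correlation is a squared Pearson coefficient computed over paired sediment measurements. A minimal sketch of that computation follows; the paired values are hypothetical illustrations, not Gale Crater data:

```python
def r_squared(x, y):
    """Squared Pearson correlation between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical paired oxychlorine and chloride abundances (illustrative only):
oxychlorine = [0.3, 0.5, 0.9, 1.2, 1.6]
chloride = [0.4, 0.7, 0.8, 1.5, 1.4]
r2 = r_squared(oxychlorine, chloride)
```

    A low r² on such pairs, as in the Gale Crater sediments, indicates that chloride varies substantially independently of oxychlorine.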

  20. Tissue recovery practices and bioburden: a systematic review.

    PubMed

    Brubaker, S; Lotherington, K; Zhao, Jie; Hamilton, B; Rockl, G; Duong, A; Garibaldi, A; Simunovic, N; Alsop, D; Dao, D; Bessemer, R; Ayeni, O R

    2016-12-01

    For successful transplantation, allografts should be free of microorganisms that may cause harm to the allograft recipient. Before or during recovery and subsequent processing, tissues can become contaminated. Effective tissue recovery methods, such as minimizing recovery times (<24 h after death) and the number of experienced personnel performing recovery, are examples of factors that can affect the rate of tissue contamination at recovery. Additional factors, such as minimizing the time after asystole to recovery and the total time it takes to perform recovery, the type of recovery site, the efficacy of the skin prep performed immediately prior to recovery of tissue, and certain technical recovery procedures may also result in control of the rate of contamination. Due to the heterogeneity of reported recovery practices and experiences, it cannot be concluded whether the use of other barriers and/or hygienic precautions to avoid contamination has had an effect on bioburden detected after tissue recovery. Qualified studies are lacking, indicating that a need exists for evidence-based data to support methods that reduce or control bioburden.

  1. Intricacies of Using Kevlar and Thermal Knives in a Deployable Release System: Issues and Solutions

    NASA Technical Reports Server (NTRS)

    Stewart, Alphonso C.; Hair, Jason H.; Broduer, Steve (Technical Monitor)

    2002-01-01

    The utilization of Kevlar cord and thermal knives in a deployable release system produces a number of issues that must be addressed in the design of the system. This paper proposes design considerations that minimize the major issues: thermal knife failure, Kevlar cord relaxation, and the measurement of cord tension. Design practices can minimize the potential for thermal knife laminate and element damage that results in failure of the knife. A process for in-situ inspection of the knife, using resistance checks rather than continuity checks together with 10x zoom optical imaging, can detect damaged knives. Tests allow the characterization of the behavior of the particular Kevlar cord in use and the development of specific pre-stretching techniques and initial tension values needed to meet requirements. A new method can accurately measure the tension of the Kevlar cord using a guitar tuner, because more conventional methods do not apply to aramid cords such as Kevlar.

  2. Intricacies of Using Kevlar Cord and Thermal Knives in a Deployable Release System: Issues and Solutions

    NASA Technical Reports Server (NTRS)

    Stewart, Alphonso; Hair, Jason H.

    2002-01-01

    The utilization of Kevlar cord and thermal knives in a deployable release system produces a number of issues that must be addressed in the design of the system. This paper proposes design considerations that minimize the major issues: thermal knife failure, Kevlar cord relaxation, and the measurement of cord tension. Design practices can minimize the potential for thermal knife laminate and element damage that results in failure of the knife. A process for in-situ inspection of the knife, using resistance checks rather than continuity checks together with 10x zoom optical imaging, can detect damaged knives. Tests allow the characterization of the behavior of the particular Kevlar cord in use and the development of specific prestretching techniques and initial tension values needed to meet requirements. A new method can accurately measure the tension of the Kevlar cord using a guitar tuner, because more conventional methods do not apply to aramid cords such as Kevlar.

  3. Intricacies of Using Kevlar Cord and Thermal Knives in a Deployable Release System: Issues and Solutions

    NASA Astrophysics Data System (ADS)

    Stewart, Alphonso; Hair, Jason H.

    2002-04-01

    The utilization of Kevlar cord and thermal knives in a deployable release system produces a number of issues that must be addressed in the design of the system. This paper proposes design considerations that minimize the major issues: thermal knife failure, Kevlar cord relaxation, and the measurement of cord tension. Design practices can minimize the potential for thermal knife laminate and element damage that results in failure of the knife. A process for in-situ inspection of the knife, using resistance checks rather than continuity checks together with 10x zoom optical imaging, can detect damaged knives. Tests allow the characterization of the behavior of the particular Kevlar cord in use and the development of specific prestretching techniques and initial tension values needed to meet requirements. A new method can accurately measure the tension of the Kevlar cord using a guitar tuner, because more conventional methods do not apply to aramid cords such as Kevlar.

  4. Augmented reality and haptic interfaces for robot-assisted surgery.

    PubMed

    Yamamoto, Tomonori; Abolhassani, Niki; Jung, Sung; Okamura, Allison M; Judkins, Timothy N

    2012-03-01

    Current teleoperated robot-assisted minimally invasive surgical systems do not take full advantage of the potential performance enhancements offered by various forms of haptic feedback to the surgeon. Direct and graphical haptic feedback systems can be integrated with vision and robot control systems in order to provide haptic feedback to improve safety and tissue mechanical property identification. An interoperable interface for teleoperated robot-assisted minimally invasive surgery was developed to provide haptic feedback and augmented visual feedback using three-dimensional (3D) graphical overlays. The software framework consists of control and command software, robot plug-ins, image processing plug-ins and 3D surface reconstructions. The feasibility of the interface was demonstrated in two tasks performed with artificial tissue: palpation to detect hard lumps and surface tracing, using vision-based forbidden-region virtual fixtures to prevent the patient-side manipulator from entering unwanted regions of the workspace. The interoperable interface enables fast development and successful implementation of effective haptic feedback methods in teleoperation. Copyright © 2011 John Wiley & Sons, Ltd.

  5. Translating Big Data into Smart Data for Veterinary Epidemiology.

    PubMed

    VanderWaal, Kimberly; Morrison, Robert B; Neuhauser, Claudia; Vilalta, Carles; Perez, Andres M

    2017-01-01

    The increasing availability and complexity of data has led to new opportunities and challenges in veterinary epidemiology around how to translate abundant, diverse, and rapidly growing "big" data into meaningful insights for animal health. Big data analytics are used to understand health risks and minimize the impact of adverse animal health issues through identifying high-risk populations, combining data or processes acting at multiple scales through epidemiological modeling approaches, and harnessing high velocity data to monitor animal health trends and detect emerging health threats. The advent of big data requires the incorporation of new skills into veterinary epidemiology training, including, for example, machine learning and coding, to prepare a new generation of scientists and practitioners to engage with big data. Establishing pipelines to analyze big data in near real-time is the next step for progressing from simply having "big data" to creating "smart data," with the objective of improving understanding of health risks, effectiveness of management and policy decisions, and ultimately preventing or at least minimizing the impact of adverse animal health issues.

  6. Calorimetric determination of inhibition of ice crystal growth by antifreeze protein in hydroxyethyl starch solutions.

    PubMed Central

    Hansen, T N; Carpenter, J F

    1993-01-01

    Differential scanning calorimetry and cryomicroscopy were used to investigate the effects of type I antifreeze protein (AFP) from winter flounder on 58% solutions of hydroxyethyl starch. The glass, devitrification, and melt transitions noted during rewarming were unaffected by 100 micrograms/ml AFP. Isothermal annealing experiments were undertaken to detect the effects of AFP-induced inhibition of ice crystal growth using calorimetry. A premelt endothermic peak was detected during warming after the annealing procedure. Increasing the duration or the temperature of annealing within the range from -28 to -18 degrees C resulted in a gradual increase in the enthalpy of the premelt endotherm. This transition was unaffected by 100 micrograms/ml AFP. Annealing between -18 and -10 degrees C resulted in a gradual decrease in the premelt peak enthalpy. This process was inhibited by 100 micrograms/ml AFP. Cryomicroscopic examination of the samples revealed that AFP inhibited ice recrystallization during isothermal annealing at -10 degrees C. Annealing at lower temperatures resulted in minimal ice recrystallization and no visible effect of AFP. Thus, for 100 micrograms/ml AFP to have a detectable influence on thermal events in the calorimeter, conditions must be used that result in significant ice growth without AFP and visible inhibition of this process by AFP. PMID:7690257

  7. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation, followed by stochastic modification of the adapted model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
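
    The two-stage idea, an analytic model for the known physics plus a trained correction fit to the residual by equation-error minimization, can be illustrated with a toy deterministic-adaptation step. This is not the patented SQNA method: the linear correction term merely stands in for the neural network, and all names and values are illustrative assumptions:

```python
def train_correction(inputs, outputs, analytic, lr=0.05, epochs=500):
    """Fit a correction term a*x + b (a stand-in for the neural network)
    to the residual between measurements and the analytic model, using
    stochastic gradient descent on the squared equation error."""
    a, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(inputs, outputs):
            err = analytic(x) + a * x + b - y   # equation error at this sample
            a -= lr * err * x
            b -= lr * err
    return a, b

# Known physics says y ~ 2x; an unmodeled effect adds 0.5x + 1 (synthetic data).
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2 * x + 0.5 * x + 1.0 for x in xs]
a, b = train_correction(xs, ys, analytic=lambda x: 2 * x)
# The correction converges toward a ~ 0.5, b ~ 1.0 on this noiseless data.
```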

  8. Cassini finds molecular hydrogen in the Enceladus plume: Evidence for hydrothermal processes

    NASA Astrophysics Data System (ADS)

    Waite, J. Hunter; Glein, Christopher R.; Perryman, Rebecca S.; Teolis, Ben D.; Magee, Brian A.; Miller, Greg; Grimes, Jacob; Perry, Mark E.; Miller, Kelly E.; Bouquet, Alexis; Lunine, Jonathan I.; Brockwell, Tim; Bolton, Scott J.

    2017-04-01

    Saturn’s moon Enceladus has an ice-covered ocean; a plume of material erupts from cracks in the ice. The plume contains chemical signatures of water-rock interaction between the ocean and a rocky core. We used the Ion Neutral Mass Spectrometer onboard the Cassini spacecraft to detect molecular hydrogen in the plume. By using the instrument’s open-source mode, background processes of hydrogen production in the instrument were minimized and quantified, enabling the identification of a statistically significant signal of hydrogen native to Enceladus. We find that the most plausible source of this hydrogen is ongoing hydrothermal reactions of rock containing reduced minerals and organic materials. The relatively high hydrogen abundance in the plume signals thermodynamic disequilibrium that favors the formation of methane from CO2 in Enceladus’ ocean.

  9. Typical and atypical neurodevelopment for face specialization: An fMRI study

    PubMed Central

    Joseph, Jane E.; Zhu, Xun; Gundran, Andrew; Davies, Faraday; Clark, Jonathan D.; Ruble, Lisa; Glaser, Paul; Bhatt, Ramesh S.

    2014-01-01

    Individuals with Autism Spectrum Disorder (ASD) and their relatives process faces differently from typically developed (TD) individuals. In an fMRI face-viewing task, TD and undiagnosed sibling (SIB) children (5–18 years) showed face specialization in the right amygdala and ventromedial prefrontal cortex (vmPFC), with left fusiform and right amygdala face specialization increasing with age in TD subjects. SIBs showed extensive antero-medial temporal lobe activation for faces that was not present in any other group, suggesting a potential compensatory mechanism. In ASD, face specialization was minimal but increased with age in the right fusiform and decreased with age in the left amygdala, suggesting atypical development of a frontal-amygdala-fusiform system which is strongly linked to detecting salience and processing facial information. PMID:25479816

  10. Biohazards Assessment in Large-Scale Zonal Centrifugation

    PubMed Central

    Baldwin, C. L.; Lemp, J. F.; Barbeito, M. S.

    1975-01-01

    A study was conducted to determine the biohazards associated with use of the large-scale zonal centrifuge for purification of moderate risk oncogenic viruses. To safely and conveniently assess the hazard, coliphage T3 was substituted for the virus in a typical processing procedure performed in a National Cancer Institute contract laboratory. Risk of personnel exposure was found to be minimal during optimal operation but definite potential for virus release from a number of centrifuge components during mechanical malfunction was shown by assay of surface, liquid, and air samples collected during the processing. High concentration of phage was detected in the turbine air exhaust and the seal coolant system when faulty seals were employed. The simulant virus was also found on both centrifuge chamber interior and rotor surfaces. PMID:1124921

  11. Detection of ɛ-ergodicity breaking in experimental data—A study of the dynamical functional sensibility

    NASA Astrophysics Data System (ADS)

    Loch-Olszewska, Hanna; Szwabiński, Janusz

    2018-05-01

    The ergodicity breaking phenomenon has already been in the area of interest of many scientists, who tried to uncover its biological and chemical origins. Unfortunately, testing ergodicity in real-life data can be challenging, as sample paths are often too short for approximating their asymptotic behaviour. In this paper, the authors analyze the minimal lengths of empirical trajectories needed for claiming the ɛ-ergodicity based on two commonly used variants of an autoregressive fractionally integrated moving average model. The dependence of the dynamical functional on the parameters of the process is studied. The problem of choosing proper ɛ for ɛ-ergodicity testing is discussed with respect to especially the variation of the innovation process and the data sample length, with a presentation on two real-life examples.

  12. Detection of ε-ergodicity breaking in experimental data-A study of the dynamical functional sensibility.

    PubMed

    Loch-Olszewska, Hanna; Szwabiński, Janusz

    2018-05-28

    The ergodicity breaking phenomenon has already been in the area of interest of many scientists, who tried to uncover its biological and chemical origins. Unfortunately, testing ergodicity in real-life data can be challenging, as sample paths are often too short for approximating their asymptotic behaviour. In this paper, the authors analyze the minimal lengths of empirical trajectories needed for claiming the ε-ergodicity based on two commonly used variants of an autoregressive fractionally integrated moving average model. The dependence of the dynamical functional on the parameters of the process is studied. The problem of choosing proper ε for ε-ergodicity testing is discussed with respect to especially the variation of the innovation process and the data sample length, with a presentation on two real-life examples.
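
    The test builds on the dynamical functional, which compares a time average of exp(i·ΔX) along one trajectory with its ensemble counterpart; for an ergodic process the two converge. A minimal sketch of the time-average estimator follows, as an illustration only, not the authors' ARFIMA-based procedure:

```python
import cmath

def dynamical_functional(path, lag):
    """Time-average estimate of D(lag) = <exp(i * (X(t + lag) - X(t)))>
    along a single trajectory."""
    vals = [cmath.exp(1j * (path[t + lag] - path[t]))
            for t in range(len(path) - lag)]
    return sum(vals) / len(vals)
```

    For ε-ergodicity testing, one checks whether the modulus of the difference between this time average and the corresponding ensemble average stays below a chosen ε once trajectories are long enough, which is exactly why the minimal trajectory length matters.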

  13. A Novel Method for Automation of 3D Hydro Break Line Generation from LiDAR Data Using MATLAB

    NASA Astrophysics Data System (ADS)

    Toscano, G. J.; Gopalam, U.; Devarajan, V.

    2013-08-01

    Water body detection is necessary to generate hydro break lines, which are in turn useful in creating deliverables such as TINs, contours, DEMs from LiDAR data. Hydro flattening follows the detection and delineation of water bodies (lakes, rivers, ponds, reservoirs, streams etc.) with hydro break lines. Manual hydro break line generation is time consuming and expensive. Accuracy and processing time depend on the number of vertices marked for delineation of break lines. Automation with minimal human intervention is desired for this operation. This paper proposes using a novel histogram analysis of LiDAR elevation data and LiDAR intensity data to automatically detect water bodies. Detection of water bodies using elevation information was verified by checking against LiDAR intensity data since the spectral reflectance of water bodies is very small compared with that of land and vegetation in near infra-red wavelength range. Detection of water bodies using LiDAR intensity data was also verified by checking against LiDAR elevation data. False detections were removed using morphological operations and 3D break lines were generated. Finally, a comparison of automatically generated break lines with their semi-automated/manual counterparts was performed to assess the accuracy of the proposed method and the results were discussed.
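
    The elevation-histogram cue exploits the fact that a still water surface returns many points at nearly one elevation, producing a narrow spike in the elevation histogram. A toy sketch of that single step follows; the bin size and spike threshold are hypothetical, and the paper's actual MATLAB pipeline additionally cross-checks against intensity and applies morphological clean-up:

```python
from collections import Counter

def water_elevation_candidates(elevations, bin_size=0.1, spike_ratio=0.2):
    """Histogram the elevations and return bin centers holding an unusually
    large share of returns: candidate still-water surface elevations."""
    bins = Counter(round(e / bin_size) for e in elevations)
    total = len(elevations)
    return sorted(b * bin_size for b, c in bins.items() if c / total >= spike_ratio)

# Synthetic scene: 50 returns from a flat lake at 10.0 m, 50 scattered terrain returns.
elevations = [10.0] * 50 + [10.0 + 0.5 * i for i in range(1, 51)]
candidates = water_elevation_candidates(elevations)
```

    In a full pipeline, points near a candidate elevation would then be verified against low near-infrared intensity before break lines are traced.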

  14. Detecting and Locating Seismic Events Without Phase Picks or Velocity Models

    NASA Astrophysics Data System (ADS)

    Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.

    2015-12-01

    The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
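
    The correlation step at the heart of such pickless detection can be sketched as a sliding normalized cross-correlation of incoming data against an empirical template. This is a simplified single-channel stand-in for the full network-level wavefield stack, with hypothetical signals and threshold:

```python
import math

def normalized_xcorr(template, signal):
    """Slide the template over the signal; return the normalized correlation
    coefficient (in [-1, 1]) at each offset."""
    m = len(template)
    tm = sum(template) / m
    t0 = [t - tm for t in template]
    tn = math.sqrt(sum(v * v for v in t0))
    out = []
    for i in range(len(signal) - m + 1):
        w = signal[i:i + m]
        wm = sum(w) / m
        w0 = [v - wm for v in w]
        wn = math.sqrt(sum(v * v for v in w0))
        denom = tn * wn
        out.append(sum(a * b for a, b in zip(t0, w0)) / denom if denom else 0.0)
    return out

def detect(template, signal, threshold=0.8):
    """Offsets where the correlation exceeds the detection threshold."""
    return [i for i, c in enumerate(normalized_xcorr(template, signal)) if c >= threshold]

# Toy example: the template buried in an otherwise quiet trace.
template = [0.0, 1.0, 0.0, -1.0]
signal = [0.0] * 5 + template + [0.0] * 5
picks = detect(template, signal, threshold=0.99)
```

    Because the full waveform shape is matched rather than a single onset time, correlation detectors of this kind can pull weak events out of noise that would defeat pick-based processing.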

  15. An advanced dual labeled gold nanoparticles probe to detect Cryptosporidium parvum using rapid immuno-dot blot assay.

    PubMed

    Thiruppathiraja, Chinnasamy; Kamatchiammal, Senthilkumar; Adaikkappan, Periyakaruppan; Alagar, Muthukaruppan

    2011-07-15

    The zoonotic protozoan parasite Cryptosporidium parvum poses a significant risk to public health. Due to the low infectious dose of C. parvum, remarkably sensitive detection methods are required for analysis in the water and food industries. Although PCR-based detection of the causative nucleic acid has numerous advantages, its demands for complex techniques and expert understanding limit its routine use. In contrast, protein-based immunodetection techniques are simpler to perform by a non-specialist, but lack sensitivity due to inadequate signal amplification. In this paper, the development of a more sensitive immunodetection method for C. parvum, based on coupling an anti-cyst antibody and alkaline phosphatase to gold nanoparticles, is described. The sensitivity of the resulting immuno-dot blot assay is enhanced 500-fold over the conventional method, visually detecting as few as 10 oocysts/mL with a minimal processing period. The technique reported in this paper substantiates the convenience of the immuno-dot blot assay for routine screening of C. parvum in water and environmental samples and, most importantly, demonstrates the potential for developing a simple and inexpensive prototype diagnostic technique. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Safety in the Chemical Laboratory: Flood Control.

    ERIC Educational Resources Information Center

    Pollard, Bruce D.

    1983-01-01

    Describes events leading to a flood in the Wehr Chemistry Laboratory at Marquette University, discussing steps taken to minimize damage upon discovery. Analyzes the problem of flooding in the chemical laboratory and outlines seven steps of flood control: prevention; minimization; early detection; stopping the flood; evaluation; clean-up; and…

  17. New opportunities for quality enhancing of images captured by passive THz camera

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2014-10-01

    As is well known, a passive THz camera allows concealed objects to be seen without contact with a person, and the camera is not dangerous to people. Obviously, the efficiency of using a passive THz camera depends on its temperature resolution. This characteristic determines the camera's detection capabilities for concealed objects: the minimal size of the object, the maximal distance of detection, and the image quality. Computer processing of THz images can improve image quality many times over without any additional engineering effort. Therefore, developing modern computer codes for application to THz images is an urgent problem. Using appropriate new methods, one may expect a temperature resolution that would allow a banknote in a person's pocket to be seen without any real contact. Modern algorithms for computer processing of THz images also allow objects inside the human body to be seen using a temperature trace on the human skin. This circumstance essentially enhances the opportunities for applying passive THz cameras to counterterrorism problems. We demonstrate the capabilities achieved at present for the detection of both concealed objects and clothing components through computer processing of images captured by passive THz cameras manufactured by various companies. Another important result discussed in the paper is the observation of both THz radiation emitted by an incandescent lamp and an image reflected from a ceramic floor plate. We consider images produced by passive THz cameras manufactured by Microsemi Corp., ThruVision Corp., and Capital Normal University (Beijing, China). All algorithms for computer processing of the THz images considered in this paper were developed by the Russian part of the author list. Keywords: THz wave, passive imaging camera, computer processing, security screening, concealed and forbidden objects, reflected image, hand seeing, banknote seeing, ceramic floorplate, incandescent lamp.

  18. Sensitivity and accuracy of high-throughput metabarcoding methods used to describe aquatic communities for early detection of invasive fish species

    EPA Science Inventory

    For early detection biomonitoring of aquatic invasive species, sensitivity to rare individuals and accurate, high-resolution taxonomic classification are critical to minimize Type I and II detection errors. Given the great expense and effort associated with morphological identifi...

  19. Minimally processed beetroot waste as an alternative source to obtain functional ingredients.

    PubMed

    Costa, Anne Porto Dalla; Hermes, Vanessa Stahl; Rios, Alessandro de Oliveira; Flôres, Simone Hickmann

    2017-06-01

    Large amounts of waste are generated by the minimally processed vegetables industry, such as those from beetroot processing. The aim of this study was to determine the best method to obtain flour from minimally processed beetroot waste dried at different temperatures, and to produce a colorant from such waste and assess its stability over 45 days. Beetroot waste dried at 70 °C yields flour with significant antioxidant activity and higher betalain content than flour produced from waste dried at 60 and 80 °C, while chlorination had no impact on the process since microbiological results were consistent with its application. The colorant obtained from beetroot waste showed color stability for 20 days and potential antioxidant activity over the analysis period; thus it can be used as a functional additive to improve the nutritional characteristics and appearance of food products. These results are promising since minimally processed beetroot waste can be used as an alternative source of natural and functional ingredients with high antioxidant activity and betalain content.

  20. Beta-Delayed Neutron Spectroscopy with Trapped Fission Products

    NASA Astrophysics Data System (ADS)

    Czeszumska, A.; Scielzo, N. D.; Norman, E. B.; Savard, G.; Aprahamian, A.; Burkey, M.; Caldwell, S. A.; Chiara, C. J.; Clark, J. A.; Harker, J.; Marley, S. T.; Morgan, G.; Orford, R.; Padgett, S.; Perez Galvan, A.; Segel, R. E.; Sharma, K. S.; Siegl, K.; Strauss, S.; Yee, R. M.

    2014-09-01

    Characterizing β-delayed neutron emission (βn) is of importance in reactor safety modeling, understanding of r-process nucleosynthesis, and nuclear structure studies. A newly developed technique enables a reliable measurement of βn branching ratios and neutron energy spectra without directly detecting neutrons. Ions of interest are loaded into a Paul trap surrounded by an array of radiation detectors. Upon decay, recoiling daughter nuclei and emitted particles emerge from the center of the trap with minimal scattering. The neutron energy is then determined from the time-of-flight, and hence momentum, of the recoiling ions. I will explain the details of the technique, and present the results from the most recent experimental campaign at the CARIBU facility at Argonne National Laboratory. This work was supported under contracts DE-NA0000979 (NSSC), DE-AC52-07NA27344 (LLNL), DE-AC02-06CH11357 (ANL), DE-FG02-94ER40834 (U. Maryland), DE-FG02-98ER41086 (Northwestern U.), NSERC, Canada, under Application No. 216974, and DHS.

  1. Experiences with integral microelectronics on smart structures for space

    NASA Astrophysics Data System (ADS)

    Nye, Ted; Casteel, Scott; Navarro, Sergio A.; Kraml, Bob

    1995-05-01

    One feature of a smart structure implies that some computational and signal processing capability can be performed at a local level, perhaps integral to the controlled structure. This requires electronics with a minimal mechanical influence regarding structural stiffening, heat dissipation, weight, and electrical interface connectivity. The Advanced Controls Technology Experiment II (ACTEX II) space-flight experiments implemented such a local control electronics scheme by utilizing composite smart members with integral processing electronics. These microelectronics, tested to MIL-STD-883B levels, were fabricated with conventional thick film on ceramic multichip module techniques. Kovar housings and aluminum-kapton multilayer insulation were used to protect against harsh space radiation and thermal environments. Development and acceptance testing showed the electronics design was extremely robust, operating in vacuum and across its temperature range, with minimal gain variations occurring just above room temperature. Four electronics modules, used for the flight hardware configuration, were connected by a 2 Mbit per second RS-485 serial data bus. The data bus was controlled by Actel field programmable gate arrays arranged in a single master, four slave configuration. An Intel 80C196KD microprocessor was chosen as the digital compensator in each controller. It was used to apply a series of selectable biquad filters, implemented via Delta Transforms. Instability in any compensator was expected to appear as large amplitude oscillations in the deployed structure. Thus, over-vibration detection circuitry with automatic output isolation was incorporated into the design. This was not used, however, since during experiment integration and test, intentionally induced compensator instabilities resulted in benign mechanical oscillation symptoms. Not too surprisingly, it was determined that instabilities were most detectable by large temperature increases in the electronics, typically noticeable within minutes of unstable operation.

  2. Minimal time change detection algorithm for reconfigurable control system and application to aerospace

    NASA Technical Reports Server (NTRS)

    Kim, Sungwan

    1994-01-01

    System parameters should be tracked on-line to build a reconfigurable control system, even when an abrupt change occurs. For this purpose, a new performance index that we are studying is the speed of adaptation: how quickly does the system determine that a change has occurred? In this paper, a new, robust algorithm that is optimized to minimize the time delay in detecting a change for a fixed false alarm probability is proposed. Simulation results for the aircraft lateral motion with a known or unknown change in control gain matrices, in the presence of a doublet input, indicate that the algorithm works fairly well. One of its distinguishing properties is that the detection delay of this algorithm is superior to that of the Whiteness Test.
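
    A classical benchmark for this trade-off between detection delay and a fixed false-alarm rate is the one-sided CUSUM (Page) test. The sketch below is that textbook detector, not the paper's algorithm, and the tuning values in the example are hypothetical:

```python
def cusum(samples, target_mean, slack, threshold):
    """One-sided CUSUM (Page's test): return the first index at which the
    cumulative excess over (target_mean + slack) crosses the threshold,
    or None if no change is declared."""
    s = 0.0
    for i, x in enumerate(samples):
        s = max(0.0, s + (x - target_mean - slack))
        if s > threshold:
            return i
    return None

# Toy data: the mean jumps from 0 to 1 at index 50.
alarm = cusum([0.0] * 50 + [1.0] * 50, target_mean=0.0, slack=0.25, threshold=2.0)
```

    Raising the threshold lowers the false-alarm probability at the cost of a longer detection delay, which is exactly the trade-off the abstract's algorithm is optimized over.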

  3. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    PubMed

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
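
    The two sampling schemes being compared can be sketched directly, assuming k-mers over a plain string: fixed sampling keeps every w-th k-mer position, while minimizer sampling keeps, for each window of w consecutive k-mers, the position of the lexicographically smallest one (deduplicated across windows).

```python
def fixed_sample(seq, k, w):
    """Fixed sampling: one k-mer every w positions."""
    return [seq[i:i + k] for i in range(0, len(seq) - k + 1, w)]

def minimizer_sample(seq, k, w):
    """Minimizer sampling: smallest k-mer in each window of w consecutive k-mers."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    picked = set()
    for start in range(len(kmers) - w + 1):
        window = kmers[start:start + w]
        picked.add(start + min(range(w), key=lambda j: window[j]))
    return [kmers[i] for i in sorted(picked)]

seq = "ACGTACGTGA"
print(fixed_sample(seq, 4, 3))      # → ['ACGT', 'TACG', 'GTGA']
print(minimizer_sample(seq, 4, 3))  # → ['ACGT', 'CGTA', 'ACGT']
```

    Note the property the paper exploits: a minimizer index can also sample the query's k-mers (any shared k-mer is guaranteed to have a shared minimizer), whereas with fixed sampling the query must probe every position; the paper's finding is that this advantage is outweighed by the larger number of shared occurrences minimizers must process.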

  4. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches

    PubMed Central

    Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989

  5. Highly sensitive MYD88L265P mutation detection by droplet digital PCR in Waldenström Macroglobulinemia.

    PubMed

    Drandi, Daniela; Genuardi, Elisa; Dogliotti, Irene; Ferrante, Martina; Jiménez, Cristina; Guerrini, Francesca; Lo Schirico, Mariella; Mantoan, Barbara; Muccio, Vittorio; Lia, Giuseppe; Zaccaria, Gian Maria; Omedè, Paola; Passera, Roberto; Orsucci, Lorella; Benevolo, Giulia; Cavallo, Federica; Galimberti, Sara; García-Sanz, Ramón; Boccadoro, Mario; Ladetto, Marco; Ferrero, Simone

    2018-03-22

    Here we describe a novel method for MYD88 L265P mutation detection and minimal residual disease monitoring in Waldenström Macroglobulinemia by droplet digital PCR, in bone marrow and peripheral blood cells as well as in circulating cell-free DNA. Our method shows a sensitivity of 5.00E-05, by far superior to the widely used allele-specific polymerase chain reaction (1.00E-03). Overall, 291 unsorted samples from 148 patients (133 Waldenström Macroglobulinemia, 11 IgG lymphoplasmacytic lymphoma and 4 IgM monoclonal gammopathy of undetermined significance), 194 baseline and 97 follow-up, were analyzed. 122/128 (95.3%) bone marrow and 47/66 (71.2%) baseline peripheral blood samples scored positive for MYD88 L265P. Moreover, to investigate whether MYD88 L265P detection by droplet digital PCR could be used for minimal residual disease monitoring, mutation levels were compared with IGH-based minimal residual disease analysis in 10 patients, proving as informative as the classical IGH-based assay, which is standardized but not yet validated in Waldenström Macroglobulinemia (r2 = 0.64). Finally, MYD88 L265P detection performed by droplet digital PCR on plasmatic circulating tumor DNA from 60 patients showed a good correlation with bone marrow (median mutational value: bone marrow 1.92E-02, plasmatic circulating tumor DNA 1.4E-02, peripheral blood 1.03E-03). This study indicates that the droplet digital PCR MYD88 L265P assay is a feasible and sensitive tool for mutational screening and minimal residual disease monitoring in Waldenström Macroglobulinemia. Both unsorted bone marrow and peripheral blood samples can be reliably tested, as well as circulating tumor DNA, which represents an attractive, less invasive alternative to bone marrow for MYD88 L265P detection. Copyright © 2018, Ferrata Storti Foundation.
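
    The abstract does not spell out its quantification pipeline, but droplet digital PCR quantification in general rests on standard Poisson statistics: the fraction of negative droplets gives the mean number of target copies per droplet via lambda = -ln(n_neg / n_total), and the mutant fractional abundance follows from the mutant- and wild-type-channel estimates. The droplet counts below are invented for illustration only.

```python
import math

def copies_per_droplet(n_negative, n_total):
    """Poisson-corrected mean target copies per droplet."""
    return -math.log(n_negative / n_total)

def fractional_abundance(lam_mut, lam_wt):
    """Fraction of mutant template among all template molecules."""
    return lam_mut / (lam_mut + lam_wt)

lam_mut = copies_per_droplet(19800, 20000)  # few mutant-positive droplets
lam_wt = copies_per_droplet(12000, 20000)   # many wild-type-positive droplets
print(f"{fractional_abundance(lam_mut, lam_wt):.4f}")
```

    The Poisson correction is what lets many droplets at low occupancy resolve very small mutant fractions, consistent with the 5.00E-05 sensitivity the study reports.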

  6. Context-specific selection of algorithms for recursive feature tracking in endoscopic image using a new methodology.

    PubMed

    Selka, F; Nicolau, S; Agnus, V; Bessaid, A; Marescaux, J; Soler, L

    2015-03-01

    In minimally invasive surgery, the tracking of deformable tissue is a critical component of image-guided applications. Deformation of the tissue can be recovered by tracking features using tissue surface information (texture, color, ...). Recent work in this field has shown success in recovering tissue motion. However, the performance evaluation of detection and tracking algorithms on such images is still difficult and not standardized, mainly due to the lack of ground truth for real data. Moreover, no quantitative work has been undertaken to evaluate the benefit of a pre-processing step based on image filtering, which can improve feature tracking robustness and avoid the need for supplementary outlier-removal techniques. In this paper, we propose a methodology to validate detection and feature tracking algorithms, using forward-backward tracking to provide an artificial ground truth. We describe a clear and complete methodology to evaluate and compare different detection and tracking algorithms. In addition, we extend our framework to a strategy that identifies the best combinations from a set of detector, tracker and pre-processing algorithms according to the live intra-operative data. Experiments on in vivo datasets show that pre-processing can have a strong influence on tracking performance and that our strategy for finding the best combinations is relevant, at a reasonable computational cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
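
    The forward-backward idea behind the artificial ground truth can be sketched generically: track a feature forward through the frame sequence, track the end point backward through the reversed sequence, and use the distance between the starting point and the round-trip result as an error score. The tracker below is a toy stand-in, not an endoscopic feature tracker.

```python
def fb_error(point, frames, track):
    """Forward-backward error: track forward, then backward, and measure
    how far the round trip lands from the original point."""
    forward = track(frames, point)
    back = track(list(reversed(frames)), forward)
    return ((point[0] - back[0]) ** 2 + (point[1] - back[1]) ** 2) ** 0.5

# Toy tracker: each "frame" is just a (dx, dy) displacement it applies.
def toy_track(frames, point):
    x, y = point
    for dx, dy in frames:
        x, y = x + dx, y + dy
    return (x, y)

# A perfectly self-consistent tracker would give error 0; this toy one does
# not invert its motion on the reversed sequence, so drift appears.
print(fb_error((0.0, 0.0), [(1.0, 0.0), (0.0, 1.0)], toy_track))
```

    In the paper's setting, low forward-backward error marks a track as reliable enough to serve as pseudo-ground-truth against which detector/tracker/pre-processing combinations can be scored.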

  7. Warning systems evaluation for overhead clearance detection : final report.

    DOT National Transportation Integrated Search

    2017-02-01

    This study reports on off-the-shelf systems designed to detect the heights of vehicles to minimize or eliminate collisions with roadway bridges. Implemented systems were identified, reviewed, and compared and relatively inexpensive options recommende...

  8. Application of multi-objective optimization to pooled experiments of next generation sequencing for detection of rare mutations.

    PubMed

    Zilinskas, Julius; Lančinskas, Algirdas; Guarracino, Mario Rosario

    2014-01-01

    In this paper we propose mathematical models to plan a Next Generation Sequencing experiment for detecting rare mutations in pools of patients. A mathematical optimization problem is formulated for optimal pooling, with respect to minimization of the experiment cost. Two strategies for replicating patients across pools are then proposed, which have the advantage of decreasing overall costs. Finally, a multi-objective optimization formulation is proposed, in which the trade-off between the probability of detecting a mutation and the overall cost is taken into account. The proposed solutions are designed to provide two advantages: (i) mutations are guaranteed to be detectable in the experimental setting, and (ii) the cost of the NGS experiment and its biological validation using Sanger sequencing is minimized. Simulations show that replicating pools can decrease the overall experimental cost, making pooling an interesting option.
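
    The paper's own formulation is not reproduced in the abstract; as a back-of-the-envelope model of the detectability constraint it describes, consider a pool of n diploid patients containing one heterozygous carrier: the expected mutant read fraction is 1/(2n), and with sequencing depth `cov` the mutation is "detectable" if at least `min_reads` mutant reads are observed, under a simple binomial model. All parameter values are illustrative.

```python
from math import comb

def detection_probability(n_patients, cov, min_reads):
    """P(at least min_reads mutant reads) for one heterozygous carrier
    in a pool of n_patients diploid samples sequenced to depth cov."""
    p = 1.0 / (2 * n_patients)  # expected mutant allele fraction in the pool
    miss = sum(comb(cov, i) * p**i * (1 - p)**(cov - i) for i in range(min_reads))
    return 1.0 - miss

# Larger pools dilute the variant, so the same depth detects it less often.
print(round(detection_probability(5, 100, 5), 3))
print(round(detection_probability(20, 100, 5), 3))
```

    This is the tension the multi-objective formulation balances: bigger pools cut sequencing cost but push the detection probability down unless depth (and thus cost) is increased.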

  9. Simple Approaches to Minimally-Instrumented, Microfluidic-Based Point-of-Care Nucleic Acid Amplification Tests

    PubMed Central

    Mauk, Michael G.; Song, Jinzhao; Liu, Changchun; Bau, Haim H.

    2018-01-01

    Designs and applications of microfluidics-based devices for molecular diagnostics (Nucleic Acid Amplification Tests, NAATs) in infectious disease testing are reviewed, with emphasis on minimally instrumented, point-of-care (POC) tests for resource-limited settings. Microfluidic cartridges (‘chips’) that combine solid-phase nucleic acid extraction; isothermal enzymatic nucleic acid amplification; pre-stored, paraffin-encapsulated lyophilized reagents; and real-time or endpoint optical detection are described. These chips can be used with a companion module for separating plasma from blood through a combined sedimentation-filtration effect. Three reporter types: Fluorescence, colorimetric dyes, and bioluminescence; and a new paradigm for end-point detection based on a diffusion-reaction column are compared. Multiplexing (parallel amplification and detection of multiple targets) is demonstrated. Low-cost detection and added functionality (data analysis, control, communication) can be realized using a cellphone platform with the chip. Some related and similar-purposed approaches by others are surveyed. PMID:29495424

  10. Food processing by high hydrostatic pressure.

    PubMed

    Yamamoto, Kazutaka

    2017-04-01

    High hydrostatic pressure (HHP) processing, a nonthermal process, can be used to inactivate microbes while minimizing chemical reactions in food. For this purpose, HHP levels of 100 MPa (986.9 atm / 1019.7 kgf/cm2) and above are applied to food. Conventional thermal processing damages food components related to color, flavor, and nutrition by enhancing chemical reactions, whereas HHP processing minimizes this damage while inactivating microbes, yielding high-quality, safe foods. The first commercial HHP-processed foods were launched in 1990 as fruit products such as jams; other products have since been commercialized: retort rice products (enhanced water impregnation), cooked hams and sausages (shelf life extension), soy sauce with minimized salt (short-time fermentation owing to enhanced enzymatic reactions), and beverages (shelf life extension). The characteristics of HHP food processing are reviewed from the viewpoints of nonthermal processing, history, research and development, physical and biochemical changes, and processing equipment.

  11. Optimizing the TESS Planet Finding Pipeline

    NASA Astrophysics Data System (ADS)

    Chitamitara, Aerbwong; Smith, Jeffrey C.; Tenenbaum, Peter; TESS Science Processing Operations Center

    2017-10-01

    The Transiting Exoplanet Survey Satellite (TESS) is a new NASA all-sky planet-finding survey that will observe stars within 200 light years that are 10-100 times brighter than those observed by the highly successful Kepler mission. TESS is expected to detect ~1000 planets smaller than Neptune and dozens of Earth-size planets. As in the Kepler mission, the Science Processing Operations Center (SPOC) pipeline at NASA Ames Research Center is tasked with calibrating the raw pixel data, generating systematic-error-corrected light curves, and then detecting and validating transit signals. The Transiting Planet Search (TPS) component of the pipeline must be modified and tuned for the new data characteristics of TESS. For example, because each sector is viewed for as little as 28 days, the pipeline will identify transiting planets based on a minimum of two transit signals rather than three, as in the Kepler mission. This may result in a significantly higher false positive rate. The study presented here measures the detection efficiency of the TESS pipeline using simulated data. Transiting planets identified by TPS are compared to transiting planets from the simulated transit model using the measured epochs, periods, transit durations and the expected detection statistic of injected transit signals (expected MES). From these comparisons, the recovery and false positive rates of TPS are measured. Measurements of recovery in TPS are then used to adjust TPS configuration parameters to maximize the planet recovery rate and minimize false detections. The improvements in recovery rate between the initial TPS configuration and various adjustments will be presented and discussed.

  12. Attribute-driven transfer learning for detecting novel buried threats with ground-penetrating radar

    NASA Astrophysics Data System (ADS)

    Colwell, Kenneth A.; Collins, Leslie M.

    2016-05-01

    Ground-penetrating radar (GPR) technology is an effective method of detecting buried explosive threats. The system uses a binary classifier to distinguish "targets", or buried threats, from "nontargets" arising from system prescreener false alarms; this classifier is trained on a dataset of previously-observed buried threat types. However, the threat environment is not static, and new threat types that appear must be effectively detected even if they are not highly similar to every previously-observed type. Gathering a new dataset that includes a new threat type is expensive and time-consuming; minimizing the amount of new data required to effectively detect the new type is therefore valuable. This research aims to reduce the number of training examples needed to effectively detect new types using transfer learning, which leverages previous learning tasks to accelerate and improve new ones. Further, new types have attribute data, such as composition, components, construction, and size, which can be observed without GPR and typically are not explicitly included in the learning process. Since attribute tags for buried threats determine many aspects of their GPR representation, a new threat type's attributes can be highly relevant to the transfer-learning process. In this work, attribute data is used to drive transfer learning, both by using attributes to select relevant dataset examples for classifier fusion, and by extending a relevance vector machine (RVM) model to perform intelligent attribute clustering and selection. Classification performance results for both the attribute-only case and the low-data case are presented, using a dataset containing a variety of threat types.

  13. Low-dose CT in clinical diagnostics.

    PubMed

    Fuentes-Orrego, Jorge M; Sahani, Dushyant V

    2013-09-01

    Computed tomography (CT) has become key to patient management due to its outstanding capabilities for detecting disease processes and assessing treatment response, which has led to an expansion of CT imaging for diagnostic and image-guided therapeutic interventions. Despite these benefits, the growing use of CT has raised concerns about the risks associated with radiation exposure. The purpose of this article is to familiarize the reader with fundamental concepts of dose metrics for assessing radiation exposure and weighing radiation-associated risks. The article also discusses general approaches to reducing radiation dose while preserving diagnostic quality. The authors provide additional insight into protocol optimization, customizing scanning techniques based on the patient's clinical scenario and demographics. Supplemental strategies using more advanced post-processing techniques are proposed for achieving further dose improvements. The technologic offerings of CT are integral to modern medicine and its role will continue to evolve. Although the estimated risks from the low radiation levels of a single CT exam are uncertain, it is prudent to minimize CT dose by applying common-sense solutions and other simple strategies, as well as by exploiting technologic innovations. These efforts will enable us to take advantage of all the clinical benefits of CT while minimizing the likelihood of harm to patients.

  14. Minimal Model of Prey Localization through the Lateral-Line System

    NASA Astrophysics Data System (ADS)

    Franosch, Jan-Moritz P.; Sobotka, Marion C.; Elepfandt, Andreas; van Hemmen, J. Leo

    2003-10-01

    The clawed frog Xenopus is an aquatic predator catching prey at night by detecting water movements caused by its prey. We present a general method, a “minimal model” based on a minimum-variance estimator, to explain prey detection through the frog's many lateral-line organs, even in case several of them are defunct. We show how waveform reconstruction allows Xenopus' neuronal system to determine both the direction and the character of the prey and even to distinguish two simultaneous wave sources. The results can be applied to many aquatic amphibians, fish, or reptiles such as crocodilians.

  15. Sensory shelf life estimation of minimally processed lettuce considering two stages of consumers' decision-making process.

    PubMed

    Ares, Gastón; Giménez, Ana; Gámbaro, Adriana

    2008-01-01

    The aim of the present work was to study the influence of context, particularly the stage of the decision-making process (purchase vs consumption stage), on sensory shelf life of minimally processed lettuce. Leaves of butterhead lettuce were placed in common polypropylene bags and stored at 5, 10 and 15 degrees C. Periodically, a panel of six assessors evaluated the appearance of the samples, and a panel of 40 consumers evaluated their appearance and answered "yes" or "no" to the questions: "Imagine you are in a supermarket, you want to buy a minimally processed lettuce, and you find a package of lettuce with leaves like this, would you normally buy it?" and "Imagine you have this leaf of lettuce stored in your refrigerator, would you normally consume it?". Survival analysis was used to calculate the shelf lives of minimally processed lettuce, considering both decision-making stages. Shelf lives estimated considering rejection to purchase were significantly lower than those estimated considering rejection to consume. Therefore, in order to be conservative and assure the products' quality, shelf life should be estimated considering consumers' rejection to purchase instead of rejection to consume, as traditionally has been done. On the other hand, results from logistic regressions of consumers' rejection percentage as a function of the evaluated appearance attributes suggested that consumers considered them differently while deciding whether to purchase or to consume minimally processed lettuce.

  16. Performance of an improved thermal neutron activation detector for buried bulk explosives

    NASA Astrophysics Data System (ADS)

    McFee, J. E.; Faust, A. A.; Andrews, H. R.; Clifford, E. T. H.; Mosquera, C. M.

    2013-06-01

    First generation thermal neutron activation (TNA) sensors, employing an isotopic source and NaI(Tl) gamma ray detectors, were deployed by Canadian Forces in 2002 as confirmation sensors on multi-sensor landmine detection systems. The second generation TNA detector is being developed with a number of improvements aimed at increasing sensitivity and facilitating ease of operation. Among these are an electronic neutron generator to increase sensitivity for deeper and horizontally displaced explosives; LaBr3(Ce) scintillators, to improve time response and energy resolution; improved thermal and electronic stability; improved sensor head geometry to minimize spatial response nonuniformity; and more robust data processing. The sensor is described, with emphasis on the improvements. Experiments to characterize the performance of the second generation TNA in detecting buried landmines and improvised explosive devices (IEDs) hidden in culverts are described. Performance results, including comparisons between the performance of the first and second generation systems are presented.

  17. A semantic autonomous video surveillance system for dense camera networks in Smart Cities.

    PubMed

    Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio

    2012-01-01

    This paper presents a proposal of an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed on the system, and therefore making it suitable for its usage as an integrated safety and security solution in Smart Cities. Alarm detection is performed on the basis of parameters of the moving objects and their trajectories, and is performed using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language easy to understand for human operators, capable of raising enriched alarms with descriptions of what is happening on the image, and to automate reactions to them such as alerting the appropriate emergency services using the Smart City safety network.

  18. Swept-frequency feedback interferometry using terahertz frequency QCLs: a method for imaging and materials analysis.

    PubMed

    Rakić, Aleksandar D; Taimre, Thomas; Bertling, Karl; Lim, Yah Leng; Dean, Paul; Indjin, Dragan; Ikonić, Zoran; Harrison, Paul; Valavanis, Alexander; Khanna, Suraj P; Lachab, Mohammad; Wilson, Stephen J; Linfield, Edmund H; Davies, A Giles

    2013-09-23

    The terahertz (THz) frequency quantum cascade laser (QCL) is a compact source of high-power radiation with a narrow intrinsic linewidth. As such, THz QCLs are extremely promising sources for applications including high-resolution spectroscopy, heterodyne detection, and coherent imaging. We exploit the remarkable phase-stability of THz QCLs to create a coherent swept-frequency delayed self-homodyning method for both imaging and materials analysis, using laser feedback interferometry. Using our scheme we obtain amplitude-like and phase-like images with minimal signal processing. We determine the physical relationship between the operating parameters of the laser under feedback and the complex refractive index of the target and demonstrate that this coherent detection method enables extraction of complex refractive indices with high accuracy. This establishes an ultimately compact and easy-to-implement THz imaging and materials analysis system, in which the local oscillator, mixer, and detector are all combined into a single laser.

  19. Approximate Computing Techniques for Iterative Graph Algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Panyala, Ajay R.; Subasi, Omer; Halappanavar, Mahantesh

    Approximate computing enables processing of large-scale graphs by trading off quality for performance. Approximate computing techniques have become critical not only due to the emergence of parallel architectures but also due to the availability of large-scale datasets enabling data-driven discovery. Using two prototypical graph algorithms, PageRank and community detection, we present several approximate computing heuristics to scale performance with minimal loss of accuracy. We present heuristics including loop perforation, data caching, incomplete graph coloring and synchronization, and evaluate their efficiency. We demonstrate performance improvements of up to 83% for PageRank and up to 450x for community detection, with low impact on accuracy for both algorithms. We expect the proposed approximate techniques will enable scalable graph analytics on data of importance to several applications in science, and their subsequent adoption to scale similar graph algorithms.
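
    Loop perforation, one of the heuristics named above, can be illustrated on PageRank: skip part of the update work (here, every other sweep updates only half the nodes) and accept a slightly less converged rank vector. The tiny graph and the perforation pattern are made up for the example and are not the paper's experimental setup.

```python
def pagerank(adj, iters=50, d=0.85, perforate=False):
    """Jacobi-style PageRank; with perforate=True, odd sweeps update
    only every other node (loop perforation)."""
    n = len(adj)
    rank = [1.0 / n] * n
    for it in range(iters):
        nodes = range(0, n, 2) if perforate and it % 2 else range(n)
        new = rank[:]
        for v in nodes:
            inflow = sum(rank[u] / len(adj[u]) for u in range(n) if v in adj[u])
            new[v] = (1 - d) / n + d * inflow
        rank = new
    return rank

adj = {0: [1], 1: [2], 2: [0, 1]}  # tiny 3-node graph, every node has outlinks
exact = pagerank(adj)
approx = pagerank(adj, perforate=True)
print(max(abs(a - b) for a, b in zip(exact, approx)))  # small residual error
```

    The perforated run does roughly three-quarters of the update work yet lands near the same fixed point, which is the quality-for-performance trade the abstract quantifies at scale.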

  20. Characterizing Interference in Radio Astronomy Observations through Active and Unsupervised Learning

    NASA Technical Reports Server (NTRS)

    Doran, G.

    2013-01-01

    In the process of observing signals from astronomical sources, radio astronomers must mitigate the effects of manmade radio sources such as cell phones, satellites, aircraft, and observatory equipment. Radio frequency interference (RFI) often occurs as short bursts (< 1 ms) across a broad range of frequencies, and can be confused with signals from sources of interest such as pulsars. With ever-increasing volumes of data being produced by observatories, automated strategies are required to detect, classify, and characterize these short "transient" RFI events. We investigate an active learning approach in which an astronomer labels events that are most confusing to a classifier, minimizing the human effort required for classification. We also explore the use of unsupervised clustering techniques, which automatically group events into classes without user input. We apply these techniques to data from the Parkes Multibeam Pulsar Survey to characterize several million detected RFI events from over a thousand hours of observation.
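
    The core of the active-learning loop described above is the query-selection step: ask the astronomer to label only the events the current classifier is least sure about. A common realization of "most confusing" is uncertainty sampling, sketched below with a stand-in scoring function (the real system's classifier and features are not specified in the abstract).

```python
def most_confusing(events, predict_proba, budget):
    """Pick the `budget` events whose positive-class probability is nearest 0.5,
    i.e. those closest to the decision boundary."""
    return sorted(events, key=lambda e: abs(predict_proba(e) - 0.5))[:budget]

# Toy model: probability that an event is RFI grows with its bandwidth.
events = [{"id": i, "bandwidth": bw}
          for i, bw in enumerate([0.1, 0.45, 0.5, 0.9, 0.55])]
proba = lambda e: e["bandwidth"]
queried = most_confusing(events, proba, budget=2)
print([e["id"] for e in queried])  # → [2, 1], the events nearest the boundary
```

    Each labeling round retrains the classifier on the newly labeled events, so human effort concentrates where it changes the decision boundary most; unsupervised clustering plays the complementary role of proposing candidate classes with no labels at all.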

  1. ISTP SBIR phase 1 Full-Sky Scanner: A feasibility study

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The objective was to develop a Full-Sky Sensor (FSS) to detect the Earth, Sun and Moon from a spinning spacecraft. The concept adopted has infinitely variable resolution. A high-speed search mode is implemented on the spacecraft. The advantages are: (1) a single sensor determines attitude parameters from Earth, Sun and Moon, thus eliminating instrument mounting errors; (2) the bias between the actual spacecraft spin axis and the intended spin axis can be determined; (3) cost is minimized; and (4) ground processing is straightforward. The FSS is a modification of an existing flight-proven sensor. Modifications to the electronics are necessary to accommodate the amplitude range and signal width range of the celestial bodies to be detected. Potential applications include ISTP missions, Multi-Spacecraft Satellite Program (MSSP), dual-spin spacecraft at any altitude, spinning spacecraft at any altitude, and orbit parameter determination for low-Earth orbits.

  2. Expression of the B subunit of Escherichia coli heat-labile enterotoxin as a fusion protein in transgenic tomato.

    PubMed

    Walmsley, A M; Alvarez, M L; Jin, Y; Kirk, D D; Lee, S M; Pinkhasov, J; Rigano, M M; Arntzen, C J; Mason, H S

    2003-06-01

    Epitopes often require co-delivery with an adjuvant or targeting protein to enable recognition by the immune system. This paper reports the ability of transgenic tomato plants to express a fusion protein consisting of the B subunit of the Escherichia coli heat-labile enterotoxin (LTB) and an immunocontraceptive epitope. The fusion protein was found to assemble into pentamers, as evidenced by its ability to bind to gangliosides, and had an average expression level of 37.8 microg g(-1) in freeze-dried transgenic tissues. Processing of selected transgenic fruit resulted in a 16-fold increase in concentration of the antigen with minimal loss in detectable antigen. The species-specific nature of this epitope was shown by the inability of antibodies raised against non-target species to detect the LTB fusion protein. The immunocontraceptive ability of this vaccine will be tested in future pilot mice studies.

  3. A Novel Segment-Based Approach for Improving Classification Performance of Transport Mode Detection.

    PubMed

    Guvensan, M Amac; Dusun, Burak; Can, Baris; Turkmen, H Irem

    2017-12-30

    Transportation planning and solutions have an enormous impact on city life. To minimize transport duration, urban planners should understand and elaborate the mobility of a city. Thus, researchers aim to monitor people's daily activities, including transportation types and durations, by taking advantage of individuals' smartphones. This paper introduces a novel segment-based transport mode detection architecture in order to improve on the results of traditional classification algorithms in the literature. The proposed post-processing algorithm, namely the Healing algorithm, aims to correct the misclassification results of machine learning-based solutions. Our real-life test results show that the Healing algorithm can achieve up to a 40% improvement in classification results. As a result, the implemented mobile application can predict eight classes, including stationary, walking, car, bus, tram, train, metro and ferry, with a success rate of 95%, thanks to the proposed multi-tier architecture and the Healing algorithm.
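
    The Healing algorithm itself is not specified in the abstract; as a stand-in for the same idea of segment-based post-processing, this sliding-window majority vote overwrites isolated misclassifications inside an otherwise homogeneous trip segment with the label of their neighbors.

```python
from collections import Counter

def heal(labels, window=5):
    """Smooth a per-window label sequence with a sliding majority vote."""
    half = window // 2
    healed = []
    for i in range(len(labels)):
        seg = labels[max(0, i - half):i + half + 1]
        healed.append(Counter(seg).most_common(1)[0][0])
    return healed

# A bus trip with one spurious "car" prediction, followed by a walk segment.
raw = ["bus", "bus", "car", "bus", "bus", "bus", "walk", "walk", "walk"]
print(heal(raw))  # → ['bus', 'bus', 'bus', 'bus', 'bus', 'bus', 'walk', 'walk', 'walk']
```

    The design rationale matches the paper's: transport modes change on the scale of minutes, so a prediction that disagrees with both its neighbors is far more likely a classifier error than a real one-window mode switch.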

  4. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    NASA Technical Reports Server (NTRS)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
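
    The abstract's schema and wavelet family are not given; a one-level Haar transform is the simplest illustration of storing wavelet coefficients as compact trend descriptors: the pairwise averages form a half-length trend worth indexing, while the pairwise differences retain enough detail to reconstruct the original series exactly.

```python
def haar_step(signal):
    """One Haar level: pairwise averages (trend) and differences (detail).
    Assumes an even-length signal."""
    avgs = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    dets = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avgs, dets

def reconstruct(avgs, dets):
    """Invert one Haar level."""
    out = []
    for a, d in zip(avgs, dets):
        out += [a + d, a - d]
    return out

hr = [72, 74, 73, 71, 90, 92, 91, 89]  # e.g. a short heart-rate trend
avgs, dets = haar_step(hr)
print(avgs)  # half-length trend: [73.0, 72.0, 91.0, 90.0]
assert reconstruct(avgs, dets) == hr   # details restore the signal exactly
```

    Applying the step recursively to the averages yields coefficients at several time scales, which is what allows the queries described above to operate along multiple scales without re-joining the raw sample tables.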

  5. Induction motor broken rotor bar fault location detection through envelope analysis of start-up current using Hilbert transform

    NASA Astrophysics Data System (ADS)

    Abd-el-Malek, Mina; Abdelsalam, Ahmed K.; Hassan, Ola E.

    2017-09-01

    Robustness, low running cost and reduced maintenance have made induction motors (IMs) the pioneering choice in industrial drive systems. Broken rotor bars (BRBs) are an important fault that needs to be assessed early to minimize maintenance cost and labor time. The majority of recent BRB fault diagnostic techniques focus on differentiating between a healthy and a faulty rotor cage. In this paper, a new technique is proposed for detecting the location of the broken bar in the rotor. The proposed technique relies on monitoring certain statistical parameters estimated from the analysis of the start-up stator current envelope. The envelope of the signal is obtained using the Hilbert Transform (HT). The proposed technique offers a non-invasive, computationally fast and accurate location diagnostic process. Various simulation scenarios are presented that validate the effectiveness of the proposed technique.
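
    The envelope-extraction step named above can be sketched via the analytic signal: zero the negative frequencies of the spectrum, double the positive ones, transform back, and take the magnitude. For clarity this uses a brute-force DFT on a short synthetic start-up-like transient; production code would use an FFT routine, and the signal here is invented, not a simulated stator current.

```python
import cmath, math

def envelope(x):
    """Envelope via the analytic signal (Hilbert transform in the frequency
    domain), using a brute-force DFT for a short record."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if 0 < k < n / 2:
            X[k] *= 2        # double positive frequencies
        elif k > n / 2:
            X[k] = 0         # zero negative frequencies
    a = [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
         for t in range(n)]  # inverse DFT gives the analytic signal
    return [abs(v) for v in a]

# A decaying oscillation, loosely like a start-up transient.
sig = [math.exp(-0.05 * t) * math.cos(2 * math.pi * 4 * t / 64) for t in range(64)]
env = envelope(sig)
print(round(env[30], 3))  # close to exp(-1.5), the decay at that sample
```

    Statistical parameters of such an envelope (its slope, ripple, modulation depth) are the kind of features the paper monitors to localize the broken bar.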

  6. A Semantic Autonomous Video Surveillance System for Dense Camera Networks in Smart Cities

    PubMed Central

    Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M.; Carro, Belén; Sánchez-Esguevillas, Antonio

    2012-01-01

    This paper proposes an intelligent video surveillance system able to detect and identify abnormal and alarming situations by analyzing object movement. The system is designed to minimize video processing and transmission, thus allowing a large number of cameras to be deployed, and therefore making it suitable for use as an integrated safety and security solution in Smart Cities. Alarm detection is based on parameters of the moving objects and their trajectories, and is performed using semantic reasoning and ontologies. This means that the system employs a high-level conceptual language that is easy for human operators to understand, can raise enriched alarms with descriptions of what is happening in the image, and can automate reactions to them, such as alerting the appropriate emergency services through the Smart City safety network. PMID:23112607

  7. ISTP SBIR phase 1 Full-Sky Scanner: A feasibility study

    NASA Astrophysics Data System (ADS)

    1986-08-01

    The objective was to develop a Full-Sky Sensor (FSS) to detect the Earth, Sun and Moon from a spinning spacecraft. The concept adopted has infinitely variable resolution. A high-speed search mode is implemented on the spacecraft. The advantages are: (1) a single sensor determines attitude parameters from Earth, Sun and Moon, thus eliminating instrument mounting errors; (2) the bias between the actual spacecraft spin axis and the intended spin axis can be determined; (3) cost is minimized; and (4) ground processing is straightforward. The FSS is a modification of an existing flight-proven sensor. Modifications to the electronics are necessary to accommodate the amplitude range and signal width range of the celestial bodies to be detected. Potential applications include ISTP missions, Multi-Spacecraft Satellite Program (MSSP), dual-spin spacecraft at any altitude, spinning spacecraft at any altitude, and orbit parameter determination for low-Earth orbits.

  8. A Smart Detection System Based on Specific Magnetic and Rolling Cycle Amplification Signal-Amplified Dual-Aptamers to Accurately Monitor Minimal Residual Diseases in Patients with T-ALL.

    PubMed

    Li, Xa; Zhou, Bo; Zhao, Zilong; Hu, Zixi; Zhou, Sufang; Yang, Nuo; Huang, Yong; Zhang, Zhenghua; Su, Jing; Lan, Dan; Qin, Xue; Meng, Jinyu; Zheng, Duo; He, Jian; Huang, Xianing; Zhao, Jing; Zhang, Zhiyong; Tan, Weihong; Lu, Xiaoling; Zhao, Yongxiang

    2016-12-01

    Early detection of minimal residual disease (MRD) in leukemia is a major clinical challenge. Here, we developed a smart detection system for MRD involving a magnetic aptamer sgc8 probe (M-sgc8 probe) to capture CEM cells and a rolling circle amplification probe (RCA-sgc8 probe) to initiate RCA, producing a single-stranded tandem repeated copy of the circular template. The DNA products were hybridized with a molecular beacon to generate an amplified fluorescence signal. An in vitro model mimicking MRD was established to evaluate the sensitivity of the smart detection system. The system was then used to detect MRD in patients with T-ALL peri-chemotherapy; it not only specifically captured T-ALL cells but also significantly amplified the fluorescence signals on them. The sensitivity was 1/20,000. These results indicate that the smart detection system, with its high specificity and sensitivity, could more efficiently monitor the progress of T-ALL peri-chemotherapy.

  9. Multiwaveband simulation-based signature analysis of camouflaged human dismounts in cluttered environments with TAIThermIR and MuSES

    NASA Astrophysics Data System (ADS)

    Packard, Corey D.; Klein, Mark D.; Viola, Timothy S.; Hepokoski, Mark A.

    2016-10-01

    The ability to predict electro-optical (EO) signatures of diverse targets against cluttered backgrounds is paramount for signature evaluation and/or management. Knowledge of target and background signatures is essential for a variety of defense-related applications. While there is no substitute for measured target and background signatures to determine contrast and detection probability, the capability to simulate any mission scenario with desired environmental conditions is a tremendous asset for defense agencies. In this paper, a systematic process for the thermal and visible-through-infrared simulation of camouflaged human dismounts in cluttered outdoor environments is presented. This process, utilizing the thermal and EO/IR radiance simulation tool TAIThermIR (and MuSES), provides a repeatable and accurate approach for analyzing contrast, signature and detectability of humans in multiple wavebands. The engineering workflow required to combine natural weather boundary conditions and the human thermoregulatory module developed by ThermoAnalytics is summarized. The procedure includes human geometry creation, human segmental physiology description and transient physical temperature prediction using environmental boundary conditions and active thermoregulation. Radiance renderings, which use Sandford-Robertson BRDF optical surface property descriptions and are coupled with MODTRAN for the calculation of atmospheric effects, are demonstrated. Sensor effects such as optical blurring and photon noise can be optionally included, increasing the accuracy of detection probability outputs that accompany each rendering. This virtual evaluation procedure has been extensively validated and provides a flexible evaluation process that minimizes the difficulties inherent in human-subject field testing. Defense applications such as detection probability assessment, camouflage pattern evaluation, conspicuity tests and automatic target recognition are discussed.

  10. Physicochemical characterization of spray-dried PLGA/PEG microspheres, and preliminary assessment of biological response.

    PubMed

    Javiya, Curie; Jonnalagadda, Sriramakamal

    2016-09-01

    This study investigated the use of spray-drying to prepare blended PLGA:PEG microspheres with lower immune detection, examining the physical properties, polymer miscibility and alveolar macrophage response of microspheres prepared by a laboratory-scale spray-drying process. Microspheres were prepared by spray-drying 0-20% w/w ratios of PLGA 65:35 and PEG 3350 in dichloromethane. Particle size and morphology were studied using scanning electron microscopy. Polymer miscibility and residual solvent levels were evaluated by thermal analysis (differential scanning calorimetry, DSC, and thermogravimetric analysis, TGA). Immunogenicity was assessed in vitro by the response of rat alveolar macrophages (NR8383) using the MTT-based cell viability assay and reactive oxygen species (ROS) detection. The spray-dried particles were spherical, with a size range of about 2-3 µm and a yield of 16-60%; the highest yield was obtained at 1% PEG concentration. Thermal analysis showed a melting peak at 59 °C (enthalpy: 170.61 J/g) and a degradation onset of 180 °C for PEG 3350. PLGA 65:35 was amorphous, with a Tg of 43 °C. Blended PLGA:PEG microspheres showed a delayed degradation onset of 280 °C, and a PEG enthalpy loss corresponding to 15% miscibility of PEG in PLGA. NR8383 viability studies and ROS detection suggested that blended PLGA:PEG microspheres containing 1 and 5% PEG are optimal in controlling cell proliferation and activation. This research establishes the feasibility of using a spray-drying process to prepare spherical particles (2-3 µm) of molecularly blended PLGA 65:35 and PEG 3350. A PEG concentration of 1-5% was optimal to maximize process yield, with minimal potential for immune detection.

  11. A Fully Redundant On-Line Mass Spectrometer System Used to Monitor Cryogenic Fuel Leaks on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Griffin, Timothy P.; Naylor, Guy R.; Haskell, William D.; Breznik, Greg S.; Mizell, Carolyn A.; Helms, William R.; Steinrock, T. (Technical Monitor)

    2001-01-01

    An on-line gas monitoring system was developed to replace the older systems used to monitor for cryogenic leaks on the Space Shuttles before launch. The system uses a mass spectrometer to monitor multiple locations in the process, which allows it to monitor all gas constituents of interest in a nearly simultaneous manner. The system is fully redundant and meets all requirements for ground support equipment (GSE), including ruggedness to withstand launch on the Mobile Launcher Platform (MLP), ease of operation, and minimal operator intervention. The system can be fully automated so that an operator is notified when an unusual situation or fault is detected. User input is entered through a personal computer using mouse and keyboard commands, and the graphical user interface is intuitive and easy to operate. The system has successfully supported four launches to date. It is currently being permanently installed as the primary system monitoring the Space Shuttles during ground processing and launch operations. Time and cost savings over the current systems will be substantial when it is fully implemented in the field. Tests were performed to demonstrate the performance of the system: low limits of detection coupled with small drift make it a major enhancement over the current systems. Though this system is currently optimized for detecting cryogenic leaks, many other gas constituents could be monitored using the Hazardous Gas Detection System (HGDS) 2000.

  12. Determination of secondary and tertiary amines as N-nitrosamine precursors in drinking water system using ultra-fast liquid chromatography-tandem mass spectrometry.

    PubMed

    Wu, Qihua; Shi, Honglan; Ma, Yinfa; Adams, Craig; Eichholz, Todd; Timmons, Terry; Jiang, Hua

    2015-01-01

    N-Nitrosamines are potent mutagenic and carcinogenic emerging water disinfection by-products (DBPs). The most effective strategy to control the formation of these DBPs is minimizing their precursors in source water. Secondary and tertiary amines are the dominant precursors of N-nitrosamine formation during the drinking water disinfection process; therefore, screening for and removing these amines in source water is essential for preventing the formation of N-nitrosamines. A rapid, simple, and sensitive ultrafast liquid chromatography-tandem mass spectrometry (UFLC-MS/MS) method was developed in this study to determine seven amines (dimethylamine, ethylmethylamine, diethylamine, dipropylamine, trimethylamine, 3-(dimethylaminomethyl)indole, and 4-dimethylaminoantipyrine) as major precursors of N-nitrosamines in drinking water systems. No sample preparation is needed except a simple filtration, and separation and detection can be achieved in 11 min per sample. The method detection limits of the selected amines range from 0.02 μg/L to 1 μg/L, except for ethylmethylamine (5 μg/L), and good calibration linearity was achieved. The developed method was applied to determine the selected precursors in source water and drinking water samples collected from the Midwest area of the United States. In most water samples, the concentrations of the selected N-nitrosamine precursors were below their method detection limits; dimethylamine was detected in some samples at concentrations up to 25.4 μg/L. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. The DISC Quotient

    NASA Astrophysics Data System (ADS)

    Elliott, John R.; Baxter, Stephen

    2012-09-01

    D.I.S.C.: Decipherment Impact of a Signal's Content. The authors present a numerical method to characterise the significance of the receipt of a complex and potentially decipherable signal from extraterrestrial intelligence (ETI). The purpose of the scale is to facilitate the public communication of work on any such claimed signal, as that work proceeds, and to assist in its discussion and interpretation. Building on the rationale of an earlier position paper, this paper develops the algorithmic steps and component measures that form the proposed DISC quotient, a post-detection strategy for information dissemination grounded in prior work on message detection and decipherment. As argued, we require a robust and incremental strategy to disseminate timely, accurate and meaningful information to the scientific community and the general public in the event we receive an "alien" signal that displays decipherable information. This post-detection strategy serves as a stepwise algorithm for a logical approach to information extraction and a vehicle for sequential information dissemination, to manage societal impact. The DISC quotient, which is based on the signal-analysis processing stages, includes factors based on the signal's data quantity, structure, affinity to known human languages, and likely decipherment times. Comparisons with human and other phenomena are included as a guide to assessing likely societal impact. It is submitted that the development, refinement and implementation of DISC as an integral strategy, during the complex processes involved in post-detection and decipherment, is essential if we wish to minimize disruption and optimize dissemination.

  14. A novel multireceiver communications system configuration based on optimal estimation theory

    NASA Technical Reports Server (NTRS)

    Kumar, R.

    1990-01-01

    A multireceiver configuration for carrier arraying and/or signal arraying is presented. Such a problem arises, for example, in the NASA Deep Space Network, where the same data-modulated signal from a spacecraft is received by a number of geographically separated antennas and data detection must be performed efficiently on the basis of the various received signals. The proposed configuration is arrived at by formulating the carrier and/or signal arraying problem as an optimal estimation problem. Two specific solutions are proposed. The first is to simultaneously and optimally estimate the various phase processes received at the different receivers with coupled phase-locked loops (PLLs), wherein the individual PLLs acquire and track their respective receivers' phase processes but are aided by each other in an optimal manner. For the case of relatively weakly correlated phase processes and relatively high symbol energy-to-noise spectral density ratios, a novel configuration for combining the data-modulated, loop-output signals is proposed; the scheme can be extended to the low symbol energy-to-noise case by performing the combining/detection process over a multisymbol period. Such a configuration minimizes the effective radio loss at the combiner output, and thus maximizes the energy-per-bit to noise-power spectral density ratio.
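As a rough illustration of why optimally weighted combining minimizes the effective radio loss, the sketch below applies maximal-ratio-style weights (gain over noise variance) to a synthetic BPSK stream seen by three receivers; the gains and noise levels are invented for illustration, not DSN figures:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4000
symbols = rng.choice([-1.0, 1.0], size=n)          # BPSK data stream

# Three geographically separated receivers: same signal, different
# gains and noise levels (illustrative values only).
gains = np.array([1.0, 0.6, 0.3])
sigmas = np.array([0.8, 0.9, 1.2])
received = gains[:, None] * symbols + sigmas[:, None] * rng.standard_normal((3, n))

# Optimal (maximal-ratio) weights: gain / noise variance per receiver,
# so the combined SNR is the sum of the individual SNRs.
w = gains / sigmas**2
combined = w @ received

def bit_errors(x):
    """Count hard-decision symbol errors against the transmitted stream."""
    return int(np.sum(np.sign(x) != symbols))

# Combining should beat the best single receiver.
best_single = min(bit_errors(received[i]) for i in range(3))
print(bit_errors(combined), best_single)
```

The paper's coupled-PLL configuration additionally estimates the per-receiver phase processes; this sketch assumes phase has already been tracked out.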

  15. Reliable and Efficient Parallel Processing Algorithms and Architectures for Modern Signal Processing. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Liu, Kuojuey Ray

    1990-01-01

    Least-squares (LS) estimation and spectral decomposition algorithms constitute the heart of modern signal processing and communication problems. Implementations of recursive LS and spectral decomposition algorithms on parallel processing architectures such as systolic arrays, with efficient fault-tolerant schemes, are the major concerns of this dissertation. There are four major results. First, we propose the systolic block Householder transformation with application to recursive least-squares minimization; it is successfully implemented on a systolic array with a two-level pipelined implementation at the vector level as well as at the word level. Second, a real-time algorithm-based concurrent error detection scheme based on the residual method is proposed for the QRD RLS systolic array; fault diagnosis, order-degraded reconfiguration, and performance analysis are also considered. Third, the dynamic range, stability, error detection capability under finite-precision implementation, order-degraded performance, and residual estimation under faulty situations for the QRD RLS systolic array are studied in detail. Finally, we propose the use of multi-phase systolic algorithms for spectral decomposition based on the QR algorithm. Two systolic architectures, one based on a triangular array and another on a rectangular array, are presented for the multi-phase operations with fault-tolerance considerations. Eigenvectors and singular vectors can be easily obtained using the multi-phase operations. Performance issues are also considered.
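A minimal sketch of the Householder-transformation building block applied to a least-squares problem (a plain sequential version, without the systolic pipelining or fault-tolerance schemes the dissertation develops):

```python
import numpy as np

def householder_qr(A):
    """QR decomposition via Householder reflections, the transformation
    that QRD RLS arrays map onto systolic hardware."""
    R = A.astype(float).copy()
    m, n = R.shape
    Q = np.eye(m)
    for k in range(n):
        x = R[k:, k]
        v = x.copy()
        v[0] += np.sign(x[0]) * np.linalg.norm(x)  # reflect x onto +/- e1
        v /= np.linalg.norm(v)
        H = np.eye(m - k) - 2.0 * np.outer(v, v)   # Householder reflector
        R[k:, k:] = H @ R[k:, k:]
        Q[:, k:] = Q[:, k:] @ H                    # accumulate Q
    return Q, R  # R is upper triangular, A = Q R

# Least-squares fit via QR: solve R x = Q^T b on the leading block.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3))
b = A @ np.array([2.0, -1.0, 0.5]) + 0.01 * rng.standard_normal(20)
Q, R = householder_qr(A)
x = np.linalg.solve(R[:3, :3], (Q.T @ b)[:3])
print(x)  # close to [2, -1, 0.5]
```

The recursive (RLS) variant updates this factorization one data row at a time, which is what maps naturally onto a triangular systolic array.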

  16. The Use of Bioluminescence in Detecting Biohazardous Substances in Water.

    ERIC Educational Resources Information Center

    Thomulka, Kenneth William; And Others

    1993-01-01

    Describes an inexpensive, reproducible alternative assay that requires minimal preparation and equipment for water testing. It provides students with a direct method of detecting potentially biohazardous material in water by observing the reduction in bacterial luminescence. (PR)

  17. Demand effects on positive response distortion by police officer applicants on the Revised NEO Personality Inventory.

    PubMed

    Detrick, Paul; Chibnall, John T; Call, Cynthia

    2010-09-01

    Understanding and detecting response distortion is important in the high-demand circumstances of personnel selection. In this article, we describe positive response distortion on the Revised NEO Personality Inventory (NEO PI-R; Costa & McCrae, 1992) among police officer applicants under high and low demand conditions. Positive response distortion primarily reflected denial/minimization of Neuroticism and accentuation of traits associated with moralistic bias (Agreeableness and Conscientiousness). Validity of the NEO PI-R research validity scale, Positive Presentation Management, was weakly supported with respect to the Neuroticism domain only. Results will be useful in interpreting personality inventory results in the police personnel selection process.

  18. Photonics

    NASA Astrophysics Data System (ADS)

    Roh, Won B.

    Computational systems based on photonic technologies are projected to offer order-of-magnitude improvements in processing speed, due to their intrinsic architectural parallelism and ultrahigh switching speeds; these architectures also minimize connectors, thereby enhancing reliability, and preclude EMP vulnerability. The use of optoelectronic ICs would also extend weapons capabilities in such areas as automated target recognition, system-state monitoring, and detection avoidance. Fiber-optic technologies have an information-carrying capacity fully five orders of magnitude greater than copper-wire-based systems; energy loss in transmission is two orders of magnitude lower, and error rates are one order of magnitude lower. Attention is being given to ZrF glasses for optical fibers with unprecedentedly low scattering levels.

  19. Nanofabrication technique based on localized photocatalytic reactions using a TiO2-coated atomic force microscopy probe

    NASA Astrophysics Data System (ADS)

    Shibata, Takayuki; Iio, Naohiro; Furukawa, Hiromi; Nagai, Moeto

    2017-02-01

    We performed a fundamental study on the photocatalytic degradation of fluorescently labeled DNA molecules immobilized on titanium dioxide (TiO2) thin films under ultraviolet irradiation. The films were prepared by the electrochemical anodization of Ti thin films sputtered on silicon substrates. We also confirmed that the photocurrent arising from the photocatalytic oxidation of DNA molecules can be detected during this process. We then demonstrated an atomic force microscopy (AFM)-based nanofabrication technique by employing TiO2-coated AFM probes to penetrate living cell membranes under near-physiological conditions for minimally invasive intracellular delivery.

  20. Evaluation of radiometric and geometric characteristics of LANDSAT-D imaging system

    NASA Technical Reports Server (NTRS)

    Salisbury, J. W.; Podwysocki, M. H.; Bender, L. U.; Rowan, L. C. (Principal Investigator)

    1983-01-01

    With vegetation masked and noise sources eliminated or minimized, different carbonate facies could be discriminated in a south Florida scene. Laboratory spectra of grab samples indicate that a 20% change in depth of the carbonate absorption band was detected despite the effects of atmospheric absorption. Both bright and dark hydrothermally altered volcanic rocks can be discriminated from their unaltered equivalents. A previously unrecognized altered area was identified on the basis of the TM images. The ability to map desert varnish in semi-arid terrains has economic significance, as it defines areas that are less susceptible to desert erosional processes and thus suitable for construction development.

  1. Absence of both auditory evoked potentials and auditory percepts dependent on timing cues.

    PubMed

    Starr, A; McPherson, D; Patterson, J; Don, M; Luxford, W; Shannon, R; Sininger, Y; Tonakawa, L; Waring, M

    1991-06-01

    An 11-yr-old girl had an absence of sensory components of auditory evoked potentials (brainstem, middle and long-latency) to click and tone burst stimuli that she could clearly hear. Psychoacoustic tests revealed a marked impairment of those auditory perceptions dependent on temporal cues, that is, lateralization of binaural clicks, change of binaural masked threshold with changes in signal phase, binaural beats, detection of paired monaural clicks, monaural detection of a silent gap in a sound, and monaural threshold elevation for short duration tones. In contrast, auditory functions reflecting intensity or frequency discriminations (difference limens) were only minimally impaired. Pure tone audiometry showed a moderate (50 dB) bilateral hearing loss with a disproportionate severe loss of word intelligibility. Those auditory evoked potentials that were preserved included (1) cochlear microphonics reflecting hair cell activity; (2) cortical sustained potentials reflecting processing of slowly changing signals; and (3) long-latency cognitive components (P300, processing negativity) reflecting endogenous auditory cognitive processes. Both the evoked potential and perceptual deficits are attributed to changes in temporal encoding of acoustic signals perhaps occurring at the synapse between hair cell and eighth nerve dendrites. The results from this patient are discussed in relation to previously published cases with absent auditory evoked potentials and preserved hearing.

  2. A deterministic compressive sensing model for bat biosonar.

    PubMed

    Hague, David A; Buck, John R; Bilik, Igal

    2012-12-01

    The big brown bat (Eptesicus fuscus) uses frequency modulated (FM) echolocation calls to accurately estimate range and resolve closely spaced objects in clutter and noise. Bats resolve glints spaced down to 2 μs in time delay, which surpasses what traditional signal processing techniques can achieve using the same echolocation call: the Matched Filter (MF) attains 10-12 μs resolution, while the Inverse Filter (IF) achieves higher resolution at the cost of significantly degraded detection performance. Recent work by Fontaine and Peremans [J. Acoust. Soc. Am. 125, 3052-3059 (2009)] demonstrated that a sparse representation of bat echolocation calls coupled with a decimating sensing method facilitates distinguishing closely spaced objects over realistic SNRs. Their work raises the intriguing question of whether sensing approaches structured more like a mammalian auditory system contain the necessary information for the hyper-resolution observed in behavioral tests. This research estimates sparse echo signatures using a gammatone filterbank decimation sensing method which loosely models the processing of the bat's auditory system. The decimated filterbank outputs are processed with ℓ1 minimization. Simulations demonstrate that this model maintains higher resolution than the MF and significantly better detection performance than the IF for SNRs of 5-45 dB while undersampling the return signal by a factor of six.
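A minimal sketch of sparse recovery by ℓ1 minimization from decimated measurements, using an ISTA solver and a generic Gaussian sensing matrix rather than the paper's gammatone filterbank; all sizes and parameters are illustrative:

```python
import numpy as np

def ista(Phi, y, lam=0.05, steps=500):
    """Iterative shrinkage-thresholding: minimizes
    0.5*||Phi x - y||^2 + lam*||x||_1, a standard l1 solver."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(steps):
        g = x - (Phi.T @ (Phi @ x - y)) / L  # gradient step on the quadratic term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(2)
n, m = 200, 60                 # n delay bins, m decimated measurements
Phi = rng.standard_normal((m, n)) / np.sqrt(m)

# Sparse echo signature: two closely spaced glints.
x_true = np.zeros(n)
x_true[[100, 103]] = [1.0, 0.8]
y = Phi @ x_true + 0.01 * rng.standard_normal(m)

x_hat = ista(Phi, y)
print(np.argsort(np.abs(x_hat))[-2:])  # largest-magnitude delay bins
```

The point of the sparsity prior is visible here: both glints are recovered from far fewer measurements than delay bins, where a matched filter on the same undersampled data would smear them together.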

  3. On-chip integrated labelling, transport and detection of tumour cells.

    PubMed

    Woods, Jane; Docker, Peter T; Dyer, Charlotte E; Haswell, Stephen J; Greenman, John

    2011-11-01

    Microflow cytometry represents a promising tool for the investigation of diagnostic and prognostic cellular cancer markers, particularly if integrated within a device that allows primary cells to be freshly isolated from the solid tumour biopsies that more accurately reflect patient-specific in vivo tissue microenvironments at the time of staining. However, current tissue processing techniques involve several sequential stages with concomitant cell losses, and as such are inappropriate for use with small biopsies. Accordingly, we present a simple method for combined antibody-labelling and dissociation of heterogeneous cells from a tumour mass, which reduces the number of processing steps. Perfusion of ex vivo tissue at 4°C with antibodies and enzymes slows cellular activity while allowing sufficient time for the diffusion of minimally active enzymes. In situ antibody-labelled cells are then dissociated at 37°C from the tumour mass, whereupon hydrogel-filled channels allow the release of relatively low cell numbers (<1000) into a biomimetic microenvironment. This novel approach to sample processing is then further integrated with hydrogel-based electrokinetic transport of the freshly liberated fluorescent cells for downstream detection. It is anticipated that this integrated microfluidic methodology will have wide-ranging biomedical and clinical applications. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Uncoupling of sgRNAs from their associated barcodes during PCR amplification of combinatorial CRISPR screens

    PubMed Central

    2018-01-01

    Many implementations of pooled screens in mammalian cells rely on linking an element of interest to a barcode, with the latter subsequently quantitated by next generation sequencing. However, substantial uncoupling between these paired elements during lentiviral production has been reported, especially as the distance between elements increases. We detail that PCR amplification is another major source of uncoupling, and becomes more pronounced with increased amounts of DNA template molecules and PCR cycles. To lessen uncoupling in systems that use paired elements for detection, we recommend minimizing the distance between elements, using low and equal template DNA inputs for plasmid and genomic DNA during PCR, and minimizing the number of PCR cycles. We also present a vector design for conducting combinatorial CRISPR screens that enables accurate barcode-based detection with a single short sequencing read and minimal uncoupling. PMID:29799876
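A toy back-of-envelope model of why fewer PCR cycles reduce uncoupling, assuming a fixed per-cycle template-switching probability q between the paired elements (q is hypothetical, not a measured value from the paper):

```python
# Toy model: if each PCR cycle gives an amplicon a small, independent chance q
# of template switching between the sgRNA and its barcode, the fraction of
# uncoupled pairs grows with the number of cycles.
def uncoupled_fraction(q, cycles):
    """Probability that at least one switching event occurred in `cycles` cycles."""
    return 1.0 - (1.0 - q) ** cycles

for cycles in (15, 20, 30):
    print(cycles, round(uncoupled_fraction(0.01, cycles), 3))
```

Under this simple model the uncoupled fraction rises monotonically with cycle count, consistent with the recommendation to minimize PCR cycles (the real dependence on template amount and element distance is not captured here).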

  5. Using Loop Heat Pipes to Minimize Survival Heater Power for NASA's Evolutionary Xenon Thruster Power Processing Units

    NASA Technical Reports Server (NTRS)

    Choi, Michael K.

    2017-01-01

    A thermal design concept of using propylene loop heat pipes to minimize survival heater power for NASA's Evolutionary Xenon Thruster power processing units is presented. It reduces the survival heater power from 183 W to 35 W per power processing unit. The reduction is 81%.

  6. Alternative sanitization methods for minimally processed lettuce in comparison to sodium hypochlorite

    PubMed Central

    Bachelli, Mara Lígia Biazotto; Amaral, Rívia Darla Álvares; Benedetti, Benedito Carlos

    2013-01-01

    Lettuce is a leafy vegetable widely used in industry for minimally processed products, for which the sanitization step is crucial for ensuring a food that is safe for consumption. Chlorinated compounds, mainly sodium hypochlorite, are the most used in Brazil, but the formation of trihalomethanes from this sanitizer is a drawback. Hence, the search for alternatives to sodium hypochlorite has attracted great interest. The suitability of chlorine dioxide (60 mg L−1/10 min), peracetic acid (100 mg L−1/15 min) and ozonated water (1.2 mg L−1/1 min) as alternative sanitizers to sodium hypochlorite (150 mg L−1 free chlorine/15 min) was evaluated. Minimally processed lettuce washed with tap water for 1 min was used as a control. Microbiological analyses were performed in triplicate, before and after sanitization, and at 3, 6, 9 and 12 days of storage at 2 ± 1 °C, with the product packaged in LDPE bags of 60 μm. Total coliforms, Escherichia coli, Salmonella spp., psychrotrophic and mesophilic bacteria, and yeasts and molds were evaluated. All samples of minimally processed lettuce showed absence of E. coli and Salmonella spp. The chlorine dioxide, peracetic acid and ozonated water treatments reduced the microbial load of the minimally processed product by 2.5, 1.1 and 0.7 log cycles, respectively, and can be used as substitutes for sodium hypochlorite. These alternative compounds gave minimally processed lettuce a shelf-life of six days, while the shelf-life with sodium hypochlorite was 12 days. PMID:24516433

  7. Alternative sanitization methods for minimally processed lettuce in comparison to sodium hypochlorite.

    PubMed

    Bachelli, Mara Lígia Biazotto; Amaral, Rívia Darla Álvares; Benedetti, Benedito Carlos

    2013-01-01

    Lettuce is a leafy vegetable widely used in industry for minimally processed products, for which the sanitization step is crucial for ensuring a food that is safe for consumption. Chlorinated compounds, mainly sodium hypochlorite, are the most used in Brazil, but the formation of trihalomethanes from this sanitizer is a drawback. Hence, the search for alternatives to sodium hypochlorite has attracted great interest. The suitability of chlorine dioxide (60 mg L(-1)/10 min), peracetic acid (100 mg L(-1)/15 min) and ozonated water (1.2 mg L(-1)/1 min) as alternative sanitizers to sodium hypochlorite (150 mg L(-1) free chlorine/15 min) was evaluated. Minimally processed lettuce washed with tap water for 1 min was used as a control. Microbiological analyses were performed in triplicate, before and after sanitization, and at 3, 6, 9 and 12 days of storage at 2 ± 1 °C, with the product packaged in LDPE bags of 60 μm. Total coliforms, Escherichia coli, Salmonella spp., psychrotrophic and mesophilic bacteria, and yeasts and molds were evaluated. All samples of minimally processed lettuce showed absence of E. coli and Salmonella spp. The chlorine dioxide, peracetic acid and ozonated water treatments reduced the microbial load of the minimally processed product by 2.5, 1.1 and 0.7 log cycles, respectively, and can be used as substitutes for sodium hypochlorite. These alternative compounds gave minimally processed lettuce a shelf-life of six days, while the shelf-life with sodium hypochlorite was 12 days.

  8. In vivo Electrochemical Biosensor for Brain Glutamate Detection: A Mini Review

    PubMed Central

    HAMDAN, Siti Kartika; MOHD ZAIN, Ainiharyati

    2014-01-01

    Glutamate is one of the most prominent neurotransmitters in mammalian brains, playing an important role in neuronal excitation. High levels of this neurotransmitter cause numerous alterations, such as calcium overload, mitochondrial dysfunction and oxidative stress. These alterations may lead to excitotoxicity and may trigger multiple neuronal diseases, such as Alzheimer’s disease, stroke, and epilepsy. Excitotoxicity is a pathological process that damages and kills nerve cells via excessive stimulation by neurotransmitters. Monitoring the concentration of brain glutamate via an implantable microbiosensor is a promising approach to closely investigating the function of glutamate as a neurotransmitter. This review outlines glutamate microbiosensor designs that enhance the sensitivity of glutamate detection while reducing biofouling and minimizing the detection of interfering species. There are many challenges in the development of a reproducible and stable implantable microbiosensor, because many factors and limitations may affect detection performance; overcoming them will require combining insights at multiple scales and across the various disciplines involved. PMID:25941459

  9. Optochemical sensor based on screen-printed fluorescent sensor spots surrounded by organic photodiodes for multianalyte detection

    NASA Astrophysics Data System (ADS)

    Kraker, E.; Lamprecht, B.; Haase, A.; Jakopic, G.; Abel, T.; Konrad, C.; Köstler, S.; Tscherner, M.; Stadlober, B.; Mayr, T.

    2010-08-01

    A compact, integrated photoluminescence-based oxygen sensor, utilizing an organic light emitting device (OLED) as the light source and an organic photodiode (OPD) as the detection unit, is described. The detection system of the sensor array consists of an array of circular screen-printed fluorescent sensor spots surrounded by organic photodiodes as integrated fluorescence detectors. The OPD originates from the well-known Tang photodiode, consisting of a stacked layer of copper phthalocyanine (CuPc, p-type material) and perylene tetracarboxylic bisbenzimidazole (PTCBi, n-type material). An additional layer of tris-8-hydroxyquinolinatoaluminium (Alq3, n-type material) was inserted between the PTCBi layer and the cathode. An ORMOCER layer was used for encapsulation. For excitation, an organic light emitting diode is used. The sensor spot and the detector are processed on the same flexible substrate. This approach not only simplifies the detection system by minimizing the number of required optical components (no optical filters are needed to separate the excitation light from the luminescent emission), but also has large potential for low-cost sensor applications. The feasibility of the concept is demonstrated by an integrated oxygen sensor, indicating good performance. Sensor schemes for other chemical parameters are proposed.

  10. A Single-Step Enrichment Medium for Nonchromogenic Isolation of Healthy and Cold-Injured Salmonella spp. from Fresh Vegetables.

    PubMed

    Kim, Hong-Seok; Choi, Dasom; Kang, Il-Byeong; Kim, Dong-Hyeon; Yim, Jin-Hyeok; Kim, Young-Ji; Chon, Jung-Whan; Oh, Deog-Hwan; Seo, Kun-Ho

    2017-02-01

    Culture-based detection of nontyphoidal Salmonella spp. in foods requires at least four working days; therefore, new detection methods that shorten the test time are needed. In this study, we developed a novel single-step Salmonella enrichment broth, SSE-1, and compared its detection capability with that of commercial single-step ONE broth-Salmonella (OBS) medium and a conventional two-step enrichment method using buffered peptone water and Rappaport-Vassiliadis soy broth (BPW-RVS). Minimally processed lettuce samples were artificially inoculated with low levels of healthy and cold-injured Salmonella Enteritidis (10⁰ or 10¹ colony-forming units/25 g), incubated in OBS, BPW-RVS, and SSE-1 broths, and streaked on xylose lysine deoxycholate (XLD) agar. Salmonella recoverability was significantly higher in BPW-RVS (79.2%) and SSE-1 (83.3%) than in OBS (39.3%) (p < 0.05). Our data suggest that the SSE-1 single-step enrichment broth could completely replace two-step enrichment, reducing the enrichment time from 48 to 24 h while performing better than the commercial single-step enrichment medium in conventional nonchromogenic Salmonella detection, thus saving time, labor, and cost.

  11. A comparison of five approaches to measurement of anatomic knee alignment from radiographs.

    PubMed

    McDaniel, G; Mitchell, K L; Charles, C; Kraus, V B

    2010-02-01

    The recent recognition of the correlation of the hip-knee-ankle angle (HKA) with the femur-tibia angle (FTA) on a standard knee radiograph has led to the increasing inclusion of FTA assessments in OA studies due to its clinical relevance, cost effectiveness and minimal radiation exposure. Our goal was to investigate the performance metrics of currently used methods of FTA measurement to determine whether a specific protocol could be recommended based on these results. Inter- and intra-rater reliability of FTA measurements were determined by the intraclass correlation coefficient (ICC) of two independent analysts. Minimal detectable differences were determined, and the correlation of FTA and HKA was analyzed by linear regression. Differences among methods of measuring HKA were assessed by ANOVA. All five methods of FTA measurement demonstrated high precision by inter- and intra-rater reproducibility (ICCs ≥ 0.93). All five methods displayed good accuracy, but after correction for the offset of FTA from HKA, the femoral notch landmark method was the least accurate. However, the methods differed according to their minimal detectable differences; the FTA methods utilizing the center of the base of the tibial spines or the center of the tibial plateau as knee center landmarks yielded the smallest minimal detectable differences (1.25° and 1.72°, respectively). All methods of FTA were highly reproducible but varied in their accuracy and sensitivity to detect meaningful differences. Based on these parameters, we recommend standardizing measurement angles with vertices at the base of the tibial spines or the center of the tibia, and comparing single-point and two-point methods in larger studies. Copyright 2009 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
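
    For context, a minimal detectable difference at 90% confidence is commonly derived from the standard error of measurement, SEM = SD·√(1 − ICC), as MDD90 = 1.645·√2·SEM. A minimal sketch of that calculation (the SD and ICC values below are illustrative assumptions, not figures from this study):

```python
import math

def mdd90(sd: float, icc: float) -> float:
    """Minimal detectable difference at 90% confidence.

    sd  -- standard deviation of the measurement across subjects
    icc -- intraclass correlation coefficient (test-retest reliability)
    """
    sem = sd * math.sqrt(1.0 - icc)      # standard error of measurement
    return 1.645 * math.sqrt(2.0) * sem  # 90% confidence, difference of two measurements

# Illustrative values: a 2.0-degree SD with ICC = 0.93 gives an MDD90 of about 1.23 degrees.
print(round(mdd90(2.0, 0.93), 2))
```

    The formula makes explicit why high ICCs alone do not guarantee small detectable differences: MDD90 also scales with the measurement SD.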

  12. The reliability, accuracy and minimal detectable difference of a multi-segment kinematic model of the foot-shoe complex.

    PubMed

    Bishop, Chris; Paul, Gunther; Thewlis, Dominic

    2013-04-01

    Kinematic models are commonly used to quantify foot and ankle kinematics, yet no marker sets or models have been proven reliable or accurate when shoes are worn. Further, the minimal detectable difference of a developed model is often not reported. We present a kinematic model that is reliable, accurate and sensitive enough to describe the kinematics of the foot-shoe complex and lower leg during walking gait. To achieve this, a new marker set was established, consisting of 25 markers applied to the shoe and skin surface, which informed a four-segment kinematic model of the foot-shoe complex and lower leg. Three independent experiments were conducted to determine the reliability, accuracy and minimal detectable difference of the marker set and model. Inter-rater reliability of marker placement on the shoe was proven to be good to excellent (ICC = 0.75-0.98), indicating that markers could be applied reliably between raters. Intra-rater reliability was better for the experienced rater (ICC = 0.68-0.99) than the inexperienced rater (ICC = 0.38-0.97). The accuracy of marker placement along each axis was <6.7 mm for all markers studied. Minimal detectable difference (MDD90) thresholds were defined for each joint: tibiocalcaneal joint, MDD90 = 2.17-9.36°; tarsometatarsal joint, MDD90 = 1.03-9.29°; metatarsophalangeal joint, MDD90 = 1.75-9.12°. These thresholds are specific to the description of shod motion and can be used in future research designed to compare different footwear. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Early identification of microorganisms in blood culture prior to the detection of a positive signal in the BACTEC FX system using matrix-assisted laser desorption/ionization-time of flight mass spectrometry.

    PubMed

    Wang, Ming-Cheng; Lin, Wei-Hung; Yan, Jing-Jou; Fang, Hsin-Yi; Kuo, Te-Hui; Tseng, Chin-Chung; Wu, Jiunn-Jong

    2015-08-01

    Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) is a valuable method for rapid identification of bloodstream infection (BSI) pathogens. Integration of MALDI-TOF MS and a blood culture system can speed the identification of causative BSI microorganisms. We investigated the minimal microorganism concentrations of common BSI pathogens required for a positive blood culture using BACTEC FX and for positive identification using MALDI-TOF MS. The time to detection with a positive BACTEC FX signal and the minimal incubation time with positive MALDI-TOF MS identification were determined for earlier identification of common BSI pathogens. The minimal microorganism concentrations required for positive blood culture using BACTEC FX were >10⁷-10⁸ colony-forming units/mL for most of the BSI pathogens. The minimal microorganism concentrations required for identification using MALDI-TOF MS were >10⁷ colony-forming units/mL. Using simulated BSI models, one can obtain a sufficient bacterial concentration from blood culture bottles for successful identification of five common Gram-positive and Gram-negative bacteria using MALDI-TOF MS 1.7-2.3 hours earlier than the usual time to detection in blood culture systems. This study provides an approach to earlier identification of BSI pathogens, prior to the detection of a positive signal in the blood culture system, using MALDI-TOF MS compared to current methods. It can speed the identification of BSI pathogens and may benefit earlier therapy choice and patient outcomes. Copyright © 2013. Published by Elsevier B.V.

  14. Alternative method for VIIRS Moon in space view process

    NASA Astrophysics Data System (ADS)

    Anderson, Samuel; Chiang, Kwofu V.; Xiong, Xiaoxiong

    2013-09-01

    The Visible Infrared Imaging Radiometer Suite (VIIRS) is a radiometric sensing instrument currently operating onboard the Suomi National Polar-orbiting Partnership (S-NPP) spacecraft. It provides high spatial-resolution images of the emitted and reflected radiation from the Earth and its atmosphere in 22 spectral bands (16 moderate resolution bands M1-M16, 5 imaging bands I1-I5, and 1 day/night pan band DNB) spanning the visible and infrared wavelengths from 412 nm to 12 μm. Just prior to each scan it makes of the Earth, the VIIRS instrument makes a measurement of deep space to serve as a background reference. These space view (SV) measurements form a crucial input to the VIIRS calibration process and are a major determinant of its accuracy. On occasion, the orientation of the Suomi NPP spacecraft coincides with the position of the moon in such a fashion that the SV measurements include light from the moon, rendering the SV measurements unusable for calibration. This paper investigates improvements to the existing baseline SV data processing algorithm of the Sensor Data Record (SDR) processing software. The proposed method makes use of a Moon-in-SV detection algorithm that identifies moon-contaminated SV data on a scan-by-scan basis. Use of this algorithm minimizes the number of SV scans that are rejected initially, so that subsequent substitution processes are always able to find alternative substitute SV scans in the near vicinity of detected moon-contaminated scans.
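
    The scan-by-scan flag-and-substitute idea can be illustrated with a toy sketch (the robust-threshold rule and the data below are this sketch's assumptions, not the SDR algorithm itself): flag a space-view scan whose mean count sits far above a robust baseline, then substitute the nearest clean scan in its place.

```python
import statistics

def substitute_contaminated(scan_means, k=3.0):
    """Flag scans whose mean SV count exceeds a robust baseline,
    then replace each flagged scan with its nearest clean neighbour."""
    med = statistics.median(scan_means)
    mad = statistics.median(abs(m - med) for m in scan_means)
    threshold = med + k * 1.4826 * mad   # MAD scaled to approximate sigma
    clean = [i for i, m in enumerate(scan_means) if m <= threshold]
    out = list(scan_means)
    for i, m in enumerate(scan_means):
        if m > threshold:                # presumed moon-contaminated scan
            j = min(clean, key=lambda c: abs(c - i))
            out[i] = scan_means[j]       # substitute from the near vicinity
    return out

# One contaminated scan (50.0) among nominal dark counts:
print(substitute_contaminated([10.0, 10.1, 9.9, 50.0, 10.05, 10.0]))
```

    Because each contaminated scan is replaced individually rather than rejecting whole blocks, the fraction of usable calibration data stays high, which is the point the abstract makes about minimizing rejected SV scans.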

  15. High-Speed Data Acquisition and Digital Signal Processing System for PET Imaging Techniques Applied to Mammography

    NASA Astrophysics Data System (ADS)

    Martinez, J. D.; Benlloch, J. M.; Cerda, J.; Lerche, Ch. W.; Pavon, N.; Sebastia, A.

    2004-06-01

    This paper is framed within the Positron Emission Mammography (PEM) project, whose aim is to develop an innovative gamma-ray sensor for early breast cancer diagnosis. Currently, breast cancer is detected using low-energy X-ray screening. However, functional imaging techniques such as PET/FDG could be employed to detect breast cancer and track disease changes with greater sensitivity. Furthermore, a small and less expensive PET camera can be utilized, minimizing the main problems of whole-body PET. To accomplish these objectives, we are developing a new gamma-ray sensor based on a newly released photodetector. However, a dedicated PEM detector requires an adequate data acquisition (DAQ) and processing system. The characterization of gamma events needs a free-running analog-to-digital converter (ADC) with sampling rates of more than 50 MS/s and must achieve event count rates up to 10 MHz. Moreover, comprehensive data processing must be carried out to obtain the event parameters necessary for performing the image reconstruction. A new-generation digital signal processor (DSP) has been used to comply with these requirements. This device enables us to manage the DAQ system at up to 80 MS/s and to execute intensive calculations on the detector signals. This paper describes our DAQ and processing architecture, whose main features are: very high-speed data conversion, multichannel synchronized acquisition with zero dead time, a digital triggering scheme, and high data throughput with extensive optimization of the signal processing algorithms.

  16. Fluorescence detection of thrombin using autocatalytic strand displacement cycle reaction and a dual-aptamer DNA sandwich assay.

    PubMed

    Niu, Shuyan; Qu, Lijing; Zhang, Qing; Lin, Jiehua

    2012-02-15

    A sensitive and specific sandwich assay for the detection of thrombin is described. Two affinity aptamers were used to increase the assay specificity through sandwich recognition. Recognition DNA loaded on gold nanoparticles (AuNPs) partially hybridized with the initiator DNA, which was displaced by surviving DNA. After the initiator DNA was released into the solution, one hairpin structure was opened, which in turn opened another hairpin structure. The initiator DNA was displaced and released into the solution again by another hairpin structure because of the hybridization reaction. The released initiator DNA then initiated another autocatalytic strand displacement reaction. A sophisticated network of three such duplex-formation cycles was designed to amplify the fluorescence signal. Other proteins, such as bovine serum albumin and lysozyme, did not interfere with the detection of thrombin. This approach enables rapid and specific thrombin detection with reduced costs and minimized material consumption compared with traditional assay processes. The detection limit of thrombin was as low as 4.3 × 10⁻¹³ M based on the AuNP amplification and the autocatalytic strand displacement cycle reaction. This method could be used in biological samples with excellent selectivity. Copyright © 2011 Elsevier Inc. All rights reserved.

  17. Simultaneous Detection of α-Fetoprotein and Carcinoembryonic Antigen Based on Si Nanowire Field-Effect Transistors.

    PubMed

    Zhu, Kuiyu; Zhang, Ye; Li, Zengyao; Zhou, Fan; Feng, Kang; Dou, Huiqiang; Wang, Tong

    2015-08-05

    Primary hepatic carcinoma (PHC) is one of the most common malignancies worldwide, resulting in death within six to 20 months. The survival rate can be improved by effective treatment when the disease is diagnosed at an early stage. α-Fetoprotein (AFP) and carcinoembryonic antigen (CEA) have been identified as markers that are expressed at higher levels in PHC patients. In this study, we employed silicon nanowire field-effect transistors (SiNW-FETs) with polydimethylsiloxane (PDMS) microfluidic channels to simultaneously detect AFP and CEA in desalted human serum. Dual-channel PDMS was first utilized for the selective modification of AFP and CEA antibodies on SiNWs, while single-channel PDMS offers faster and more sensitive detection of AFP and CEA in serum. During the SiNW modification process, 0.1% BSA was utilized to minimize nonspecific protein binding from serum. The linear dynamic ranges for AFP and CEA detection were measured to be 500 fg/mL to 50 ng/mL and 50 fg/mL to 10 ng/mL, respectively. Our work demonstrates the promising potential of the fabricated SiNW-FETs as a direct detection kit for multiple tumor markers in serum; it therefore provides a chance for early-stage diagnosis and, hence, more effective treatment for PHC patients.

  18. Robust detection of multiple sclerosis lesions from intensity-normalized multi-channel MRI

    NASA Astrophysics Data System (ADS)

    Karpate, Yogesh; Commowick, Olivier; Barillot, Christian

    2015-03-01

    Multiple sclerosis (MS) is a disease with heterogeneous evolution among patients. Quantitative analysis of longitudinal Magnetic Resonance Images (MRI) provides a spatial analysis of the brain tissues which may lead to the discovery of biomarkers of disease evolution. Better understanding of the disease will lead to better discovery of pathogenic mechanisms, allowing for patient-adapted therapeutic strategies. To characterize MS lesions, we propose a novel paradigm to detect white matter lesions based on a statistical framework. It aims at studying the benefits of using multi-channel MRI to detect statistically significant differences between each individual MS patient and a database of control subjects. This framework consists of two components. First, intensity standardization is conducted to minimize inter-subject intensity differences arising from variability of the acquisition process and different scanners. The intensity normalization maps intensities using parameters obtained from a robust Gaussian Mixture Model (GMM) estimation that is not affected by the presence of MS lesions. The second component compares the multi-channel MRI of MS patients with an atlas built from the control subjects, thereby allowing us to look for differences in normal-appearing white matter, in and around the lesions of each patient. Experimental results demonstrate that our technique accurately detects significant differences in lesions, consequently improving the results of MS lesion detection.
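
    The core statistical comparison can be caricatured as a voxel-wise z-test of each patient intensity against the control-atlas mean and standard deviation (a minimal sketch; the real framework first normalizes intensities with a robust GMM fit, which is omitted here, and the data below are toy values):

```python
def lesion_candidates(patient, atlas_mean, atlas_sd, z_thresh=3.0):
    """Return indices of voxels whose intensity deviates significantly
    from the control atlas (|z| > z_thresh)."""
    flagged = []
    for i, (x, mu, sd) in enumerate(zip(patient, atlas_mean, atlas_sd)):
        z = (x - mu) / sd
        if abs(z) > z_thresh:
            flagged.append(i)
    return flagged

# Toy 1-D "image": voxel 2 is hyperintense relative to the controls.
patient    = [100.0, 101.0, 140.0, 99.0]
atlas_mean = [100.0, 100.0, 100.0, 100.0]
atlas_sd   = [5.0, 5.0, 5.0, 5.0]
print(lesion_candidates(patient, atlas_mean, atlas_sd))
```

    The same thresholding idea extends to multi-channel data by combining per-channel statistics, which is where the multi-channel MRI in the abstract earns its keep.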

  19. Network capability estimation. Vela network evaluation and automatic processing research. Technical report. [NETWORTH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snell, N.S.

    1976-09-24

    NETWORTH is a computer program which calculates the detection and location capability of seismic networks. A modified version of NETWORTH has been developed. This program has been used to evaluate the effect of station 'downtime', the signal amplitude variance, and the station detection threshold upon network detection capability. In this version all parameters may be changed separately for individual stations. The capability of using signal amplitude corrections has been added. The function of amplitude corrections is to remove possible bias in the magnitude estimate due to inhomogeneous signal attenuation. These corrections may be applied to individual stations, individual epicenters, or individual station/epicenter combinations. An option has been added to calculate the effect of station 'downtime' upon network capability. This study indicates that, if capability loss due to detection errors can be minimized, then station detection threshold and station reliability will be the fundamental limits to network performance. A baseline evaluation of a thirteen-station network has been performed. These stations are as follows: Alaskan Long Period Array, (ALPA); Ankara, (ANK); Chiang Mai, (CHG); Korean Seismic Research Station, (KSRS); Large Aperture Seismic Array, (LASA); Mashhad, (MSH); Mundaring, (MUN); Norwegian Seismic Array, (NORSAR); New Delhi, (NWDEL); Red Knife, Ontario, (RK-ON); Shillong, (SHL); Taipei, (TAP); and White Horse, Yukon, (WH-YK).
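
    The interplay the report highlights, per-station detection threshold versus station downtime, can be sketched as a small probability calculation (a hypothetical simplification, not NETWORTH itself): a station contributes only if it is up and detects, and the network declares a detection when at least m stations do.

```python
def network_detection_prob(p_detect, uptime, m):
    """Probability that at least m stations detect an event.

    p_detect -- per-station detection probability at the event magnitude
    uptime   -- per-station availability (1.0 = never down)
    m        -- minimum number of detecting stations required
    """
    # Effective per-station probability: the station must be up AND detect.
    eff = [p * u for p, u in zip(p_detect, uptime)]
    # Dynamic programming over the number of detecting stations.
    dist = [1.0]                      # dist[k] = P(exactly k detections so far)
    for p in eff:
        nxt = [0.0] * (len(dist) + 1)
        for k, prob in enumerate(dist):
            nxt[k] += prob * (1.0 - p)
            nxt[k + 1] += prob * p
        dist = nxt
    return sum(dist[m:])

# Four identical stations, 50% detection probability, perfect uptime,
# single-station detection rule: 1 - 0.5**4 = 0.9375.
print(network_detection_prob([0.5] * 4, [1.0] * 4, 1))
```

    Raising either the downtime or the per-station threshold lowers every effective probability, which is exactly why the study identifies those two quantities as the fundamental limits on network performance.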

  20. Lateral interactions and speed of information processing in highly functioning multiple sclerosis patients.

    PubMed

    Nagy, Helga; Bencsik, Krisztina; Rajda, Cecília; Benedek, Krisztina; Janáky, Márta; Beniczky, Sándor; Kéri, Szabolcs; Vécsei, László

    2007-06-01

    Visual impairment is a common feature of multiple sclerosis. The aim of this study was to investigate lateral interactions in the visual cortex of highly functioning patients with multiple sclerosis and to compare these with basic visual and neuropsychologic functions. Twenty-two young, visually unimpaired multiple sclerosis patients with minimal symptoms (Expanded Disability Status Scale <2) and 30 healthy control subjects participated in the study. Lateral interactions were investigated with the flanker task, during which participants were asked to detect the orientation of a low-contrast Gabor patch (vertical or horizontal), flanked by two collinear or orthogonal Gabor patches. Stimulus exposure time was 40, 60, 80, or 100 ms. Digit span forward/backward, digit symbol, verbal fluency, and California Verbal Learning Test procedures were used for background neuropsychologic assessment. Results revealed that patients with multiple sclerosis showed intact visual contrast sensitivity and neuropsychologic functions, whereas orientation detection in the orthogonal condition was significantly impaired. At the 40-ms exposure time, collinear flankers facilitated the orientation detection performance of the patients, resulting in normal performance. In conclusion, the detection of briefly presented, low-contrast visual stimuli was selectively impaired in multiple sclerosis. Lateral interactions between target and flankers robustly facilitated target detection in the patient group.

  1. Integration of biomimicry and nanotechnology for significantly improved detection of circulating tumor cells (CTCs).

    PubMed

    Myung, Ja Hye; Park, Sin-Jung; Wang, Andrew Z; Hong, Seungpyo

    2017-12-13

    Circulating tumor cells (CTCs) have received a great deal of scientific and clinical attention as biomarkers for the diagnosis and prognosis of many types of cancer. Given their potential clinical significance, a variety of detection methods, utilizing recent advances in nanotechnology and microfluidics, have been introduced in an effort to achieve clinically significant detection of CTCs. However, effective detection and isolation of CTCs remain a tremendous challenge due to their extreme rarity and phenotypic heterogeneity. Among the many approaches currently under development, this review focuses on a unique, promising approach that takes advantage of naturally occurring processes, realized through the application of nanotechnology, to significantly improve the sensitivity and specificity of CTC capture. We provide an overview of the successful outcomes of this biomimetic CTC capture system in the detection of tumor cells in in vitro, in vivo, and clinical pilot studies. We also emphasize the clinical impact of CTCs as biomarkers in cancer diagnosis and predictive prognosis, providing a cost-effective, minimally invasive method that potentially replaces or supplements existing methods such as imaging technologies and solid-tissue biopsy. In addition, their potential prognostic value in guiding treatment and ultimately realizing personalized therapy is discussed. Copyright © 2017. Published by Elsevier B.V.

  2. Convolution neural networks for real-time needle detection and localization in 2D ultrasound.

    PubMed

    Mwikirize, Cosmas; Nosher, John L; Hacihaliloglu, Ilker

    2018-05-01

    We propose a framework for automatic and accurate detection of steeply inserted needles in 2D ultrasound data using convolution neural networks. We demonstrate its application in needle trajectory estimation and tip localization. Our approach consists of a unified network, comprising a fully convolutional network (FCN) and a fast region-based convolutional neural network (R-CNN). The FCN proposes candidate regions, which are then fed to a fast R-CNN for finer needle detection. We leverage a transfer learning paradigm, where the network weights are initialized by training with non-medical images, and fine-tuned with ex vivo ultrasound scans collected during insertion of a 17G epidural needle into freshly excised porcine and bovine tissue at depth settings up to 9 cm and [Formula: see text]-[Formula: see text] insertion angles. Needle detection results are used to accurately estimate needle trajectory from intensity invariant needle features and perform needle tip localization from an intensity search along the needle trajectory. Our needle detection model was trained and validated on 2500 ex vivo ultrasound scans. The detection system has a frame rate of 25 fps on a GPU and achieves 99.6% precision, 99.78% recall rate and an [Formula: see text] score of 0.99. Validation for needle localization was performed on 400 scans collected using a different imaging platform, over a bovine/porcine lumbosacral spine phantom. Shaft localization error of [Formula: see text], tip localization error of [Formula: see text] mm, and a total processing time of 0.58 s were achieved. The proposed method is fully automatic and provides robust needle localization results in challenging scanning conditions. The accurate and robust results coupled with real-time detection and sub-second total processing make the proposed method promising in applications for needle detection and localization during challenging minimally invasive ultrasound-guided procedures.
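
    The tip-localization step, an intensity search along the estimated trajectory, can be sketched in a toy form (the grid, threshold, and step size below are this sketch's assumptions; the paper's search operates on intensity-invariant features of real B-mode images):

```python
import math

def tip_along_trajectory(img, origin, direction, thresh=100, step=0.5):
    """Walk along the estimated needle trajectory and return the farthest
    pixel whose intensity still exceeds the threshold (the presumed tip)."""
    norm = math.hypot(*direction)
    dr, dc = direction[0] / norm, direction[1] / norm
    rows, cols = len(img), len(img[0])
    tip, t = None, 0.0
    while True:
        r = round(origin[0] + t * dr)
        c = round(origin[1] + t * dc)
        if not (0 <= r < rows and 0 <= c < cols):
            break                     # walked off the image
        if img[r][c] > thresh:
            tip = (r, c)              # farthest bright sample so far
        t += step
    return tip

# Synthetic image: a bright diagonal "needle" from (0, 0) to (5, 5).
img = [[0] * 10 for _ in range(10)]
for i in range(6):
    img[i][i] = 200
print(tip_along_trajectory(img, (0, 0), (1, 1)))
```

    The trajectory itself would come from the CNN detections (e.g. a line fit to the detected needle pixels); only the final search along that line is shown here.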

  3. Development of loop-mediated isothermal amplification (LAMP) assays for the rapid detection of allergic peanut in processed food.

    PubMed

    Sheu, Shyang-Chwen; Tsou, Po-Chuan; Lien, Yi-Yang; Lee, Meng-Shiou

    2018-08-15

    Peanut is widely and commonly used in many cuisines around the world. However, peanut is also one of the most important food allergens causing anaphylactic reactions. The best way to prevent an allergic reaction is to avoid the allergen, or food containing allergenic ingredients such as peanut, before consumption. Efficient and precise detection of the allergenic ingredient, peanut, or related products is therefore essential for protecting consumers' health and interests. In this study, a loop-mediated isothermal amplification (LAMP) assay was developed for the detection of allergenic peanut using specifically designed primer sets. Two sets of specific LAMP primers, targeting the internal transcribed spacer 1 (ITS1) region of the nuclear ribosomal DNA and the ara h1 gene sequence of Arachis hypogaea (peanut), respectively, were used to assess the application of LAMP for detecting peanut in processed food. The results demonstrated that identification of peanut using the newly designed primers for the ITS1 sequence is more sensitive than using the primers for the Ara h1 gene in the LAMP assay. The sensitivity of LAMP for detecting peanut is also higher than that of the traditional PCR method. These LAMP primer sets showed high specificity for the identification of peanut and no cross-reaction with other nut species, including walnut, hazelnut, almond, cashew and macadamia nut. Moreover, when a minimum of 0.1% peanut was mixed with other nut ingredients at different ratios, no cross-reactivity was evident in the LAMP assay. Finally, genomic DNAs extracted from boiled and steamed peanut were used as templates; detection of peanut by LAMP was unaffected and reproducible. With the LAMP assay established here, not only can peanut ingredients be detected, but commercial foods containing peanut can also be identified.
This assay will be useful for the rapid detection of peanut in practical food markets. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Investigation of the applicability of a functional programming model to fault-tolerant parallel processing for knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Harper, Richard

    1989-01-01

    In a fault-tolerant parallel computer, a functional programming model can facilitate distributed checkpointing, error recovery, load balancing, and graceful degradation. Such a model has been implemented on the Draper Fault-Tolerant Parallel Processor (FTPP). When used in conjunction with the FTPP's fault detection and masking capabilities, this implementation results in a graceful degradation of system performance after faults. Three graceful degradation algorithms have been implemented and are presented. A user interface has been implemented which requires minimal cognitive overhead by the application programmer, masking such complexities as the system's redundancy, distributed nature, variable complement of processing resources, load balancing, fault occurrence and recovery. This user interface is described and its use demonstrated. The applicability of the functional programming style to the Activation Framework, a paradigm for intelligent systems, is then briefly described.

  5. Laser Shockwave Technique For Characterization Of Nuclear Fuel Plate Interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James A. Smith; Barry H. Rabin; Mathieu Perton

    2012-07-01

    The US National Nuclear Security Administration is tasked with minimizing the worldwide use of high-enriched uranium. One aspect of that effort is the conversion of research reactors to monolithic fuel plates of low-enriched uranium. The manufacturing process includes hot isostatic press bonding of an aluminum cladding to the fuel foil. The Laser Shockwave Technique (LST) is here evaluated for characterizing the interface strength of fuel plates using depleted uranium/Mo foils. LST is a non-contact method that uses lasers for the generation and detection of large-amplitude acoustic waves and is therefore well adapted to the quality assurance of this process. Preliminary results show a clear signature of well-bonded and debonded interfaces, and the method is able to classify/rank the bond strength of fuel plates prepared under different HIP conditions.

  6. Laser shockwave technique for characterization of nuclear fuel plate interfaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perton, M.; Levesque, D.; Monchalin, J.-P.

    2013-01-25

    The US National Nuclear Security Administration is tasked with minimizing the worldwide use of high-enriched uranium. One aspect of that effort is the conversion of research reactors to monolithic fuel plates of low-enriched uranium. The manufacturing process includes hot isostatic press bonding of an aluminum cladding to the fuel foil. The Laser Shockwave Technique (LST) is here evaluated for characterizing the interface strength of fuel plates using depleted uranium/Mo foils. LST is a non-contact method that uses lasers for the generation and detection of large-amplitude acoustic waves and is therefore well adapted to the quality assurance of this process. Preliminary results show a clear signature of well-bonded and debonded interfaces, and the method is able to classify/rank the bond strength of fuel plates prepared under different HIP conditions.

  7. Cassini finds molecular hydrogen in the Enceladus plume: Evidence for hydrothermal processes.

    PubMed

    Waite, J Hunter; Glein, Christopher R; Perryman, Rebecca S; Teolis, Ben D; Magee, Brian A; Miller, Greg; Grimes, Jacob; Perry, Mark E; Miller, Kelly E; Bouquet, Alexis; Lunine, Jonathan I; Brockwell, Tim; Bolton, Scott J

    2017-04-14

    Saturn's moon Enceladus has an ice-covered ocean; a plume of material erupts from cracks in the ice. The plume contains chemical signatures of water-rock interaction between the ocean and a rocky core. We used the Ion Neutral Mass Spectrometer onboard the Cassini spacecraft to detect molecular hydrogen in the plume. By using the instrument's open-source mode, background processes of hydrogen production in the instrument were minimized and quantified, enabling the identification of a statistically significant signal of hydrogen native to Enceladus. We find that the most plausible source of this hydrogen is ongoing hydrothermal reactions of rock containing reduced minerals and organic materials. The relatively high hydrogen abundance in the plume signals thermodynamic disequilibrium that favors the formation of methane from CO2 in Enceladus' ocean. Copyright © 2017, American Association for the Advancement of Science.

  8. Near infrared spectroscopy based monitoring of extraction processes of raw material with the help of dynamic predictive modeling

    NASA Astrophysics Data System (ADS)

    Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng

    2018-03-01

    The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products of consistent quality from raw material with large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is first estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs, such as raw materials. Second, the quality property of the end product is predicted at mid-course during the extraction process with a partial least squares (PLS) model. The batch end time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that, with the help of dynamic predictive modeling, NIRS can offer past and future information about the process, enabling more accurate monitoring and control of process performance and product quality.

  9. Process control systems: integrated for future process technologies

    NASA Astrophysics Data System (ADS)

    Botros, Youssry; Hajj, Hazem M.

    2003-06-01

    Process Control Systems (PCS) are becoming more crucial to the success of integrated circuit makers due to their direct impact on product quality, cost, and fab output. The primary objective of PCS is to minimize variability by detecting and correcting non-optimal performance. Current PCS implementations are disparate: each PCS application is designed, deployed and supported separately, and each implementation targets a specific area of control such as equipment performance, wafer manufacturing, and process health monitoring. With Intel entering the nanometer technology era, tighter process specifications are required for higher yields and lower cost. This requires the areas of control to be tightly coupled and integrated to achieve optimal performance, which can be accomplished via consistent design and deployment of an integrated PCS. PCS integration will yield several benefits, such as leveraging commonalities, avoiding redundancy, and facilitating sharing between implementations. This paper addresses PCS implementations and focuses on the benefits and requirements of the integrated PCS. Intel's integrated PCS architecture is then presented and its components briefly discussed. Finally, industry direction and efforts to standardize PCS interfaces that enable PCS integration are presented.

  10. a Weighted Closed-Form Solution for Rgb-D Data Registration

    NASA Astrophysics Data System (ADS)

    Vestena, K. M.; Dos Santos, D. R.; Oilveira, E. M., Jr.; Pavan, N. L.; Khoshelham, K.

    2016-06-01

    Existing 3D indoor mapping approaches for RGB-D data are predominantly point-based and feature-based, and in most cases the iterative closest point (ICP) algorithm and its variants are used for pairwise registration. Because ICP requires a relatively accurate initial transformation and high overlap, a weighted closed-form solution for RGB-D data registration is proposed. In this solution, the 3D points are weighted and normalized based on their theoretical random errors, and dual-number quaternions are used to represent the 3D rigid-body motion. Dual-number quaternions provide a closed-form solution by minimizing a cost function. The most important advantage of the closed-form solution is that it provides the optimal transformation in one step: it does not need a good initial estimate and greatly reduces the demand for computing resources compared with iterative methods. Our method first exploits the RGB information, employing the scale-invariant feature transform (SIFT) to extract, detect, and match local features that are invariant to scaling and rotation. To detect and filter outliers, we use the random sample consensus (RANSAC) algorithm together with a statistical dispersion measure, the interquartile range (IQR). A new RGB-D loop-closure solution is then implemented, based on the volumetric information between pairs of point clouds and the dispersion of the random errors; loop closure consists of recognizing when the sensor revisits a region. Finally, a globally consistent map is created by minimizing the registration errors via graph-based optimization. The effectiveness of the proposed method is demonstrated on a Kinect dataset. The experimental results show that the proposed method can properly map an indoor environment with an absolute accuracy of around 1.5% of the trajectory length.
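
    The closed-form weighted registration step can be illustrated with the SVD-based (Kabsch) solution on synthetic correspondences; the paper itself uses dual-number quaternions, so this is a sketch of the one-step closed-form idea rather than the authors' exact solver.

```python
import numpy as np

def weighted_rigid_fit(P, Q, w):
    """Weighted closed-form rigid alignment (Kabsch/SVD): find R, t that
    minimize sum_i w_i * ||R @ P[i] + t - Q[i]||^2 in a single step,
    with no initial estimate required."""
    w = w / w.sum()
    p_bar = w @ P                        # weighted centroids
    q_bar = w @ Q
    H = (P - p_bar).T @ np.diag(w) @ (Q - q_bar)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # reflection guard
    R = Vt.T @ D @ U.T
    t = q_bar - R @ p_bar
    return R, t

# Recover a known rotation/translation from noiseless correspondences
rng = np.random.default_rng(1)
P = rng.normal(size=(30, 3))
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -1.0, 2.0])
Q = P @ R_true.T + t_true
R_est, t_est = weighted_rigid_fit(P, Q, np.ones(30))
```

    With noiseless correspondences the true motion is recovered to machine precision; the weights matter once per-point random errors differ.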

  11. An algorithm for pavement crack detection based on multiscale space

    NASA Astrophysics Data System (ADS)

    Liu, Xiang-long; Li, Qing-quan

    2006-10-01

    Conventional human-visual and manual field pavement crack detection methods are costly, time-consuming, dangerous, labor-intensive and subjective. They suffer from high variability in measurement results, cannot provide meaningful quantitative information, almost always lead to inconsistencies in crack details over space and across evaluations, and have long measurement cycles. With the growth of public transportation and material flow systems, conventional methods can no longer meet these demands, so automatic pavement-condition data gathering and analysis systems have become the focus of attention. Developments in computer technology, digital image acquisition, image processing and multi-sensor technology have made such systems possible, but the complexity of the image processing has made data processing and analysis the bottleneck of the whole system. Accordingly, a robust and efficient parallel pavement crack detection algorithm based on multi-scale space is proposed in this paper. The proposed method is based on two facts: (1) crack pixels in pavement images are darker than their surroundings and continuous; (2) the threshold values for gray-level pavement images are strongly related to the mean and standard deviation of the pixel-grey intensities. The multi-scale space method is used to improve data processing speed and minimize the effect of image noise. Experimental results demonstrate that the advantages are remarkable: (1) the algorithm correctly discovers tiny cracks, even in very noisy pavement images; (2) its efficiency and accuracy are superior; (3) its application-dependent nature simplifies the design of the entire system.
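
    Fact (2) above — a threshold tied to the mean and standard deviation of the grey levels — together with a crude coarse-scale step can be sketched as follows; the factor k and the block-averaging pyramid are assumed simplifications, not the paper's multi-scale construction.

```python
import numpy as np

def crack_mask(img, k=1.5):
    """Statistical threshold per fact (2): crack pixels are darker than
    their surroundings, so flag pixels below mean - k * std.
    The factor k is an assumed tuning parameter."""
    return img < img.mean() - k * img.std()

def downsample(img, s=2):
    """One coarse level of a crude image pyramid (block averaging),
    used here to suppress pixel-level noise before thresholding."""
    h, w = (img.shape[0] // s) * s, (img.shape[1] // s) * s
    return img[:h, :w].reshape(h // s, s, w // s, s).mean(axis=(1, 3))

# Synthetic pavement: bright background with one dark crack row
img = np.full((64, 64), 200.0)
img[32, :] = 20.0
mask = crack_mask(downsample(img))
```

    A full multi-scale detector would also exploit fact (1), linking dark responses across scales into continuous crack curves.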

  12. Perceptual consequences of disrupted auditory nerve activity.

    PubMed

    Zeng, Fan-Gang; Kong, Ying-Yee; Michalewski, Henry J; Starr, Arnold

    2005-06-01

    Perceptual consequences of disrupted auditory nerve activity were systematically studied in 21 subjects who had been clinically diagnosed with auditory neuropathy (AN), a recently defined disorder characterized by normal outer hair cell function but disrupted auditory nerve function. Neurological and electrophysiological evidence suggests that disrupted auditory nerve activity is due to desynchronized or reduced neural activity or both. Psychophysical measures showed that the disrupted neural activity has minimal effects on intensity-related perception, such as loudness discrimination, pitch discrimination at high frequencies, and sound localization using interaural level differences. In contrast, the disrupted neural activity significantly impairs timing-related perception, such as pitch discrimination at low frequencies, temporal integration, gap detection, temporal modulation detection, backward and forward masking, signal detection in noise, binaural beats, and sound localization using interaural time differences. These perceptual consequences are the opposite of what is typically observed in cochlear-impaired subjects who have impaired intensity perception but relatively normal temporal processing after taking their impaired intensity perception into account. These differences in perceptual consequences between auditory neuropathy and cochlear damage suggest the use of different neural codes in auditory perception: a suboptimal spike count code for intensity processing, a synchronized spike code for temporal processing, and a duplex code for frequency processing. We also proposed two underlying physiological models based on desynchronized and reduced discharge in the auditory nerve to successfully account for the observed neurological and behavioral data. These methods and measures cannot differentiate between these two AN models, but future studies using electric stimulation of the auditory nerve via a cochlear implant might.
These results not only show the unique contribution of neural synchrony to sensory perception but also provide guidance for translational research in terms of better diagnosis and management of human communication disorders.

  13. The Pedestrian Detection Method Using an Extension Background Subtraction about the Driving Safety Support Systems

    NASA Astrophysics Data System (ADS)

    Muranaka, Noriaki; Date, Kei; Tokumaru, Masataka; Imanishi, Shigeru

    In recent years, traffic accidents have occurred frequently with the explosion of traffic density. We therefore believe that a safe and comfortable transportation system protecting pedestrians, the most vulnerable road users, is necessary. First, we detect and recognize pedestrians (crossing persons) by image processing. Next, we inform drivers turning right or left that a pedestrian is present, using sound, images and so on. By prompting drivers to drive safely in this way, accidents involving pedestrians can be reduced. In this paper, we use a background subtraction method to detect moving objects. In background subtraction, the method of updating the background is important; in the conventional approach, the threshold values for the subtraction processing and the background update were identical. That is, the mixing rate of the input image and the background image in the background update was fixed, which made fine tuning for changing weather conditions difficult. We therefore propose a background-image update method in which estimation errors are difficult to amplify. We experiment and compare five conditions: sunshine, cloud, evening, rain and sunlight change (night excluded). The proposed technique can set the threshold values for the subtraction processing and the background update separately to suit environmental conditions such as the weather, so the mixing rate of the input image and the background image can be tuned freely. Because parameter settings suited to the environmental conditions are important for minimizing the error rate, we also examine the choice of parameters.
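
    The separated thresholds for subtraction and background update can be sketched as a running-average background model; the mixing rate alpha and the difference threshold below are assumed values, not the tuned parameters of the paper.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Mix the input frame into the background model; alpha is the mixing
    rate the paper tunes per weather condition (value here assumed)."""
    return alpha * frame + (1.0 - alpha) * bg

def foreground_mask(bg, frame, thresh=30.0):
    """Subtraction step: pixels differing from the background by more than
    a separately chosen threshold are treated as moving objects."""
    return np.abs(frame.astype(float) - bg) > thresh

# Static scene, then a bright "pedestrian" block enters
bg = np.full((48, 64), 100.0)
frame = bg.copy()
frame[10:20, 10:20] = 220.0
mask = foreground_mask(bg, frame)
bg = update_background(bg, frame)
```

    Keeping alpha small prevents a detected pedestrian from being absorbed into the background, which is the estimation-error amplification the update method tries to avoid.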

  14. Performance of basic kinematic thresholds in the identification of crash and near-crash events within naturalistic driving data.

    PubMed

    Perez, Miguel A; Sudweeks, Jeremy D; Sears, Edie; Antin, Jonathan; Lee, Suzanne; Hankey, Jonathan M; Dingus, Thomas A

    2017-06-01

    Understanding causal factors for traffic safety-critical events (e.g., crashes and near-crashes) is an important step in reducing their frequency and severity. Naturalistic driving data offers unparalleled insight into these factors, but requires identification of situations where crashes are present within large volumes of data. Sensitivity and specificity of these identification approaches are key to minimizing the resources required to validate candidate crash events. This investigation used data from the Second Strategic Highway Research Program Naturalistic Driving Study (SHRP 2 NDS) and the Canada Naturalistic Driving Study (CNDS) to develop and validate different kinematic thresholds that can be used to detect crash events. Results indicate that the sensitivity of many of these approaches can be quite low, but can be improved by selecting particular threshold levels based on detection performance. Additional improvements in these approaches are possible, and may involve leveraging combinations of different detection approaches, including advanced statistical techniques and artificial intelligence approaches, additional parameter modifications, and automation of validation processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
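
    A basic kinematic trigger of the kind evaluated can be sketched as a longitudinal-deceleration threshold; the 4 m/s^2 level and the sampling step are hypothetical, and the study's point is precisely that such levels must be tuned against validated crash/near-crash events to trade sensitivity for specificity.

```python
import numpy as np

def flag_hard_braking(speed_mps, dt=0.1, decel_thresh=4.0):
    """Hypothetical kinematic trigger: return sample indices where the
    longitudinal deceleration exceeds decel_thresh (m/s^2)."""
    accel = np.diff(speed_mps) / dt
    return np.flatnonzero(accel < -decel_thresh)

# Cruise at 20 m/s, then brake at 6 m/s^2 for one second
speed = np.full(100, 20.0)
speed[50:] = 20.0 - np.minimum(np.arange(50), 10) * 0.6
events = flag_hard_braking(speed)
```

    A production pipeline would combine several such triggers (lateral acceleration, yaw rate) before sending candidates to video validation.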

  15. Cryo-balloon catheter localization in fluoroscopic images

    NASA Astrophysics Data System (ADS)

    Kurzendorfer, Tanja; Brost, Alexander; Jakob, Carolin; Mewes, Philip W.; Bourier, Felix; Koch, Martin; Kurzidim, Klaus; Hornegger, Joachim; Strobel, Norbert

    2013-03-01

    Minimally invasive catheter ablation has become the preferred treatment option for atrial fibrillation. Although the standard ablation procedure involves ablation points set by radio-frequency catheters, cryo-balloon catheters have even been reported to be more advantageous in certain cases. As electro-anatomical mapping systems do not support cryo-balloon ablation procedures, X-ray guidance is needed. However, current methods to provide support for cryo-balloon catheters in fluoroscopically guided ablation procedures rely heavily on manual user interaction. To improve this, we propose a first method for automatic cryo-balloon catheter localization in fluoroscopic images based on a blob detection algorithm. Our method is evaluated on 24 clinical images from 17 patients. The method successfully detected the cryo-balloon in 22 out of 24 images, yielding a success rate of 91.6%. The successful localization achieved an accuracy of 1.00 mm +/- 0.44 mm. Even though our method currently fails in 8.4% of the images available, it still offers a significant improvement over manual methods. Furthermore, detecting a landmark point along the cryo-balloon catheter can be a very important step for additional post-processing operations.

  16. Design of liver functional reserve monitor based on three-wavelength from IR to NIR.

    PubMed

    Ye, Fuli; Zhan, Huimiao; Shi, Guilian

    2018-05-04

    The preoperative evaluation of liver functional reserve is very important in deciding the extent of liver lobe resection for patients with liver cancer. Many effective evaluation methods already exist, but they have disadvantages such as significant trauma and complicated procedures. It is therefore essential to develop a fast, accurate and simple method of measuring liver functional reserve for practical application in clinical engineering. Based on the principle of spectrophotometry, this paper proposes a detection method for liver functional reserve using three wavelengths from infrared (IR) to near-infrared (NIR), in which the arterial pulse, the venous pulse and the movement of tissue are taken into account. Using near-infrared photoelectric sensor technology and an indocyanine green excretion experiment, a minimally invasive, fast and simple testing device is designed. The test results show that this device can greatly reduce interference from the human body and the environment, and realize continuous, real-time detection of arterial blood oxygen saturation and liver functional reserve.
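
    As a simplified two-wavelength step toward the three-wavelength design, the classic pulse-oximetry "ratio of ratios" can be sketched; the linear SpO2 calibration below is a commonly quoted empirical approximation, not this device's calibration.

```python
def ratio_of_ratios(red_ac, red_dc, ir_ac, ir_dc):
    """Two-wavelength photoplethysmography ratio:
    R = (AC_red / DC_red) / (AC_ir / DC_ir).
    The paper's third wavelength additionally separates the venous pulse
    and tissue movement, which this sketch ignores."""
    return (red_ac / red_dc) / (ir_ac / ir_dc)

def spo2_estimate(r):
    """Commonly quoted linear empirical calibration (SpO2 ~ 110 - 25 R);
    real devices use a device-specific calibration curve."""
    return 110.0 - 25.0 * r

r = ratio_of_ratios(red_ac=0.02, red_dc=1.0, ir_ac=0.04, ir_dc=1.0)
spo2 = spo2_estimate(r)
```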

  17. Conditional heteroscedasticity as a leading indicator of ecological regime shifts.

    PubMed

    Seekell, David A; Carpenter, Stephen R; Pace, Michael L

    2011-10-01

    Regime shifts are massive, often irreversible, rearrangements of nonlinear ecological processes that occur when systems pass critical transition points. Ecological regime shifts sometimes have severe consequences for human well-being, including eutrophication in lakes, desertification, and species extinctions. Theoretical and laboratory evidence suggests that statistical anomalies may be detectable leading indicators of regime shifts in ecological time series, making it possible to foresee and potentially avert incipient regime shifts. Conditional heteroscedasticity is persistent variance characteristic of time series with clustered volatility. Here, we analyze conditional heteroscedasticity as a potential leading indicator of regime shifts in ecological time series. We evaluate conditional heteroscedasticity by using ecological models with and without four types of critical transition. On approaching transition points, all time series contain significant conditional heteroscedasticity. This signal is detected hundreds of time steps in advance of the regime shift. Time series without regime shifts do not have significant conditional heteroscedasticity. Because probability values are easily associated with tests for conditional heteroscedasticity, detection of false positives in time series without regime shifts is minimized. This property reduces the need for a reference system to compare with the perturbed system.
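
    The indicator can be computed with Engle's ARCH-LM statistic (n·R² from regressing the squared series on its lags), whose probability values come from a chi-squared reference distribution; this numpy sketch contrasts a constant-variance series with one exhibiting clustered volatility.

```python
import numpy as np

def arch_lm_stat(x, lags=1):
    """Engle's ARCH-LM statistic: n * R^2 from regressing the squared,
    demeaned series on its own lags. Large values indicate conditional
    heteroscedasticity (clustered volatility); under the null it is
    approximately chi-squared with `lags` degrees of freedom."""
    e2 = (np.asarray(x) - np.mean(x)) ** 2
    y = e2[lags:]
    X = np.column_stack([np.ones(len(y))] +
                        [e2[lags - k:len(e2) - k] for k in range(1, lags + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    r2 = 1.0 - resid.var() / y.var()
    return len(y) * r2

rng = np.random.default_rng(2)
iid = rng.normal(size=2000)                        # constant variance
vol = np.where(np.arange(2000) < 1000, 1.0, 10.0)  # clustered volatility
clustered = rng.normal(size=2000) * vol
```

    In the paper's setting the test would be applied to rolling windows of the ecological time series as the system approaches a transition.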

  18. Work Processes, Role Conflict, and Role Overload: The Case of Nurses and Engineers in the Public Sector.

    ERIC Educational Resources Information Center

    Bacharach, Samuel B.; And Others

    1990-01-01

    Study of five sets of work process variables and their relationship to role conflict and overload among public sector nurses and engineers found managerial strategies appropriate for minimizing role conflict not necessarily appropriate for minimizing role overload. Some work process predictors may be similar across professions, and managerial…

  19. Implementation of a portable device for real-time ECG signal analysis.

    PubMed

    Jeon, Taegyun; Kim, Byoungho; Jeon, Moongu; Lee, Byung-Geun

    2014-12-10

    Cardiac disease is one of the main causes of catastrophic mortality. Therefore, detecting the symptoms of cardiac disease as early as possible is important for increasing the patient's survival. In this study, a compact and effective architecture for detecting atrial fibrillation (AFib) and myocardial ischemia is proposed. We developed a portable device using this architecture, which allows real-time electrocardiogram (ECG) signal acquisition and analysis for cardiac diseases. A noisy ECG signal was preprocessed by an analog front-end consisting of analog filters and amplifiers before it was converted into digital data. The analog front-end was minimized to reduce the size of the device and power consumption by implementing some of its functions with digital filters realized in software. With the ECG data, we detected QRS complexes based on wavelet analysis and feature extraction for morphological shape and regularity using an ARM processor. A classifier for cardiac disease was constructed based on features extracted from a training dataset using support vector machines. The classifier then categorized the ECG data into normal beats, AFib, and myocardial ischemia. A portable ECG device was implemented, and successfully acquired and processed ECG signals. The performance of this device was also verified by comparing the processed ECG data with high-quality ECG data from a public cardiac database. Because of reduced computational complexity, the ARM processor was able to process up to a thousand samples per second, and this allowed real-time acquisition and diagnosis of heart disease. Experimental results for detection of heart disease showed that the device classified AFib and ischemia with a sensitivity of 95.1% and a specificity of 95.9%. Current home care and telemedicine systems have a separate device and diagnostic service system, which results in additional time and cost. 
Our proposed portable ECG device provides the captured ECG data and suspect waveforms to identify both sporadic and chronic cardiac events. The device has been built and evaluated, demonstrating high signal quality, low computational complexity, and accurate detection.
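
    The QRS-detection stage can be illustrated with a deliberately naive R-peak detector (amplitude threshold, local maximum, refractory window) on a synthetic trace; the device itself uses wavelet analysis, and all parameter values here are assumed.

```python
import numpy as np

def detect_r_peaks(sig, fs, thresh_frac=0.6, refractory_s=0.2):
    """Naive R-peak detector: accept local maxima above a fraction of the
    global maximum, at least one refractory period apart. A stand-in for
    the wavelet-based QRS detection the device implements."""
    thresh = thresh_frac * sig.max()
    peaks, last = [], -10 ** 9
    for i in range(1, len(sig) - 1):
        if sig[i] > thresh and sig[i] >= sig[i - 1] and sig[i] > sig[i + 1]:
            if (i - last) / fs > refractory_s:
                peaks.append(i)
                last = i
    return peaks

# Synthetic ECG: flat baseline with triangular R-waves at known samples
fs = 250
sig = np.zeros(800)
for p in (100, 350, 600):
    sig[p - 1], sig[p], sig[p + 1] = 0.5, 1.0, 0.5
peaks = detect_r_peaks(sig, fs)
```

    RR intervals derived from the peaks (here 250 samples, i.e. 60 bpm at fs = 250 Hz) feed the regularity features used to separate normal rhythm from AFib.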

  20. [Arthroscopy-guided fracture management. Ankle joint and calcaneus].

    PubMed

    Schoepp, C; Rixen, D

    2013-04-01

    Arthroscopic fracture management of the ankle and calcaneus requires a differentiated approach. The aim is to minimize surgical soft tissue damage and to verify anatomical fracture reduction arthroscopically. Moreover, additional cartilage damage can be detected and treated. The arthroscopic approach is limited by deep impressions of the joint surface requiring cancellous bone grafting, by multiple fracture lines on the articular side, and by high-grade soft tissue damage. An alternative to the purely minimally invasive approach is arthroscopically assisted open reduction with conventional osteosynthesis. This facilitates correct assessment of the surgical reduction of complex calcaneal fractures; otherwise, a remaining non-anatomical reduction might not be detected fluoroscopically during surgery.

  1. Edge guided image reconstruction in linear scan CT by weighted alternating direction TV minimization.

    PubMed

    Cai, Ailong; Wang, Linyuan; Zhang, Hanming; Yan, Bin; Li, Lei; Xi, Xiaoqi; Li, Jianxin

    2014-01-01

    Linear scan computed tomography (CT) is a promising imaging configuration with high scanning efficiency, but its data sets are under-sampled and angularly limited, making high quality image reconstruction challenging. In this work, an edge-guided total variation minimization reconstruction (EGTVM) algorithm is developed to deal with this problem. The proposed method combines total variation (TV) regularization with an iterative edge detection strategy: the edge weights of intermediate reconstructions are incorporated into the TV objective function, and the optimization is efficiently solved by applying the alternating direction method of multipliers. A prudent, conservative edge detection strategy proposed in this paper recovers the true edges while keeping errors within an acceptable range. Based on comparisons on both simulation studies and real CT data set reconstructions, EGTVM provides comparable or even better quality than non-edge-guided reconstruction and the adaptive steepest descent-projection onto convex sets method. With the utilization of weighted alternating direction TV minimization and edge detection, EGTVM achieves fast and robust convergence and reconstructs high quality images when applied in linear scan CT with under-sampled data sets.
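
    The edge-weighted TV objective can be sketched in 1D with plain gradient descent on a smoothed TV term, lowering the weight where an edge was detected; the paper solves its 2D problem with the alternating direction method of multipliers, so this conveys only the spirit of the cost function.

```python
import numpy as np

def weighted_tv_denoise(y, w, lam=0.5, step=0.05, iters=400, eps=1e-2):
    """Gradient descent on 0.5*||x - y||^2 + lam * sum_i w_i *
    sqrt((x_{i+1} - x_i)^2 + eps). Small weights w_i at detected edges
    relax the smoothing there; eps smooths the |.| kink."""
    x = y.copy()
    for _ in range(iters):
        d = np.diff(x)
        g = d / np.sqrt(d * d + eps)   # derivative of the smoothed |.|
        tv_grad = np.zeros_like(x)
        tv_grad[:-1] -= w * g
        tv_grad[1:] += w * g
        x -= step * ((x - y) + lam * tv_grad)
    return x

# Noisy step edge; the weight is lowered where an edge was "detected"
rng = np.random.default_rng(3)
y = np.concatenate([np.zeros(50), np.ones(50)]) + 0.1 * rng.normal(size=100)
w = np.ones(99)
w[49] = 0.05                           # preserve the detected edge
x = weighted_tv_denoise(y, w)
```

    The flat regions are smoothed while the down-weighted jump survives, which is exactly the behavior the edge guidance buys in the reconstruction.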

  2. [Current situation and prospect of breast cancer liquid biopsy].

    PubMed

    Zhou, B; Xin, L; Xu, L; Ye, J M; Liu, Y H

    2018-02-01

    Liquid biopsy is a diagnostic approach based on analyzing body fluid samples. Peripheral blood is the most common sample; urine, saliva, pleural effusion and ascites are also used. Liquid biopsy is now mainly applied in neoplasm diagnosis and treatment. Compared with traditional tissue biopsy, liquid biopsy is minimally invasive, convenient to sample and easy to repeat. It mainly includes detection of circulating tumor cells and circulating tumor DNA (ctDNA). Detection of ctDNA requires sensitive and accurate methods, and progress in next-generation sequencing (NGS) and digital PCR has advanced ctDNA studies. In 2016, Nature published the results of a whole-genome sequencing study of breast cancer, which found 1 628 mutations in 93 protein-coding genes that may be driver mutations of breast cancer; this result provided a new platform for breast cancer ctDNA studies. In recent years, many studies have used ctDNA detection to monitor therapeutic effect and guide treatment. NGS is a promising technique for accessing genetic information and guiding targeted therapy. It must be emphasized that ctDNA detection using NGS is still at the research stage; it is important to standardize ctDNA detection techniques and to perform prospective clinical research. The time is not yet ripe for using ctDNA detection to guide large-scale breast cancer clinical practice.

  3. Detection of mercury(II) ions using colorimetric gold nanoparticles on paper-based analytical devices.

    PubMed

    Chen, Guan-Hua; Chen, Wei-Yu; Yen, Yu-Chun; Wang, Chia-Wei; Chang, Huan-Tsung; Chen, Chien-Fu

    2014-07-15

    An on-field colorimetric sensing strategy employing gold nanoparticles (AuNPs) and a paper-based analytical platform was investigated for mercury ion (Hg(2+)) detection at water sources. By utilizing thymine-Hg(2+)-thymine (T-Hg(2+)-T) coordination chemistry, label-free detection oligonucleotide sequences were attached to unmodified gold nanoparticles to provide rapid mercury ion sensing without complicated and time-consuming thiolated or other costly labeled probe preparation processes. The sensing mechanism is not only specific toward Hg(2+) rather than other metal ions; the conformational change of the detection oligonucleotide sequences also introduces different degrees of AuNP aggregation, producing a measurable color variation. To eliminate the use of sophisticated equipment and minimize the power requirement for data analysis and transmission, the color variations of multiple detection results were transferred and concentrated on cellulose-based paper analytical devices, and the data were subsequently transmitted for readout and storage using cloud computing via a smartphone. As a result, a detection limit of 50 nM for Hg(2+)-spiked pond and river water could be achieved. Furthermore, multiple tests could be performed simultaneously with a 40 min turnaround time. These results suggest that the proposed platform is capable of sensitive, high-throughput, on-site mercury pollution monitoring in resource-constrained settings.

  4. Detecting and Measuring Land Subsidence in Houston-Galveston, Texas using Interferometric Synthetic Aperture Radar (InSAR) and Global Positioning System Data, 2012-2016

    NASA Astrophysics Data System (ADS)

    Reed, A.; Baker, S.

    2016-12-01

    Several cities in the Houston-Galveston (HG) region in Texas have subsided up to 13 feet over several decades due to natural and anthropogenic processes [Yu et al. 2014]. Land subsidence, a gradual sinking of the Earth's surface, is an often human-induced hazard and a major environmental problem expedited by activities such as mining, oil and gas extraction, urbanization and excessive groundwater pumping. We are able to detect and measure subsidence in HG using interferometric synthetic aperture radar (InSAR) and global positioning systems (GPS). Qu et al. [2015] used ERS, Envisat, and ALOS-1 to characterize subsidence in HG from 1995 to 2011, but a five-year gap in InSAR measurements exists due to a lack of freely available SAR data. We build upon the previous study by comparing subsidence patterns detected by Sentinel-1 data starting in July 2015. We used GMT5SAR to generate a stack of interferograms with perpendicular baselines less than 100 meters and temporal baselines less than 100 days to minimize temporal and spatial decorrelation. We applied the short baseline subset (SBAS) time series processing using GIAnT and compared our results with GPS measurements. The implications of this work will strengthen land subsidence monitoring systems in HG and broadly aid in the development of effective water resource management policies and strategies.
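
    The small-baseline pair selection described can be sketched directly; the acquisition list below is synthetic, standing in for Sentinel-1 (day, perpendicular-baseline) metadata.

```python
from itertools import combinations

def select_sbas_pairs(acquisitions, max_perp_m=100.0, max_days=100):
    """Keep acquisition pairs whose perpendicular baseline is under 100 m
    and whose temporal baseline is under 100 days, limiting spatial and
    temporal decorrelation before interferogram formation.
    Acquisitions are (day, perpendicular_baseline_m) tuples."""
    pairs = []
    for (d1, b1), (d2, b2) in combinations(sorted(acquisitions), 2):
        if abs(b2 - b1) < max_perp_m and abs(d2 - d1) < max_days:
            pairs.append((d1, d2))
    return pairs

# Synthetic acquisition list: (day, perpendicular baseline in metres)
acqs = [(0, 0.0), (12, 40.0), (24, 160.0), (120, 30.0)]
pairs = select_sbas_pairs(acqs)
```

    The retained pairs define the interferogram network that the SBAS time-series inversion then operates on.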

  5. Prediction and generation of binary Markov processes: Can a finite-state fox catch a Markov mouse?

    NASA Astrophysics Data System (ADS)

    Ruebeck, Joshua B.; James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2018-01-01

    Understanding the generative mechanism of a natural system is a vital component of the scientific method. Here, we investigate one of the fundamental steps toward this goal by presenting the minimal generator of an arbitrary binary Markov process. This is a class of processes whose predictive model is well known. Surprisingly, the generative model requires three distinct topologies for different regions of parameter space. We show that a previously proposed generator for a particular set of binary Markov processes is, in fact, not minimal. Our results shed the first quantitative light on the relative (minimal) costs of prediction and generation. We find, for instance, that the difference between prediction and generation is maximized when the process is approximately independent and identically distributed.
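
    A binary Markov process is easy to generate from its two transition probabilities, which is the predictive side of the models discussed; this sketch samples a chain and checks its stationary statistics.

```python
import random

def sample_binary_markov(p01, p10, n, seed=0):
    """Generate a binary Markov process from its transition probabilities:
    p01 = P(next=1 | current=0), p10 = P(next=0 | current=1).
    The stationary probability of a 1 is p01 / (p01 + p10)."""
    rng = random.Random(seed)
    x, out = 0, []
    for _ in range(n):
        out.append(x)
        if x == 0:
            x = 1 if rng.random() < p01 else 0
        else:
            x = 0 if rng.random() < p10 else 1
    return out

chain = sample_binary_markov(p01=0.1, p10=0.3, n=20000)
frac_ones = sum(chain) / len(chain)   # should approach 0.1 / 0.4 = 0.25
```

    The paper's question is the converse and subtler one: the smallest hidden-state machine that reproduces such a process, which can differ from its minimal predictor.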

  6. Real-time image-processing algorithm for markerless tumour tracking using X-ray fluoroscopic imaging.

    PubMed

    Mori, S

    2014-05-01

    To ensure accuracy in respiratory-gating treatment, X-ray fluoroscopic imaging is used to detect tumour position in real time. Detection accuracy is strongly dependent on image quality, particularly positional differences between the patient and treatment couch. We developed a new algorithm to improve the quality of images obtained in X-ray fluoroscopic imaging and report the preliminary results. Two oblique X-ray fluoroscopic images were acquired using a dynamic flat panel detector (DFPD) for two patients with lung cancer. The weighting factor was applied to the DFPD image in respective columns, because most anatomical structures, as well as the treatment couch and port cover edge, were aligned in the superior-inferior direction when the patient lay on the treatment couch. The weighting factors for the respective columns were varied until the standard deviation of the pixel values within the image region was minimized. Once the weighting factors were calculated, the quality of the DFPD image was improved by applying the factors to multiframe images. Applying the image-processing algorithm produced substantial improvement in the quality of images, and the image contrast was increased. The treatment couch and irradiation port edge, which were not related to a patient's position, were removed. The average image-processing time was 1.1 ms, showing that this fast image processing can be applied to real-time tumour-tracking systems. These findings indicate that this image-processing algorithm improves the image quality in patients with lung cancer and successfully removes objects not related to the patient. Our image-processing algorithm might be useful in improving gated-treatment accuracy.
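
    One simple way to realize "vary the per-column weights until the standard deviation is minimized" is to equalize the column means, which flattens static vertical structures; this is an illustrative shortcut, not the authors' search procedure.

```python
import numpy as np

def column_weights_flatten(img):
    """Illustrative per-column weighting: scale each column so that static
    vertical structures aligned superior-inferior (treatment couch, port
    cover edge) flatten out, shrinking the standard deviation of pixel
    values in the region. The paper searches the weights directly."""
    col_mean = img.mean(axis=0)
    weights = col_mean.mean() / np.maximum(col_mean, 1e-9)
    return img * weights, weights

# Uniform "anatomy" with one dark couch-rail column
img = np.full((6, 8), 100.0)
img[:, 3] = 40.0
flattened, weights = column_weights_flatten(img)
```

    Once computed, the same weights can be applied to every subsequent frame, which is why the reported per-frame cost stays in the millisecond range.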

  7. Minimally invasive approaches for gastric cancer-Korean experience.

    PubMed

    Yang, Han-Kwang; Suh, Yun-Suhk; Lee, Hyuk-Joon

    2013-03-01

    Laparoscopic surgery in Korea increased rapidly because of the early detection of gastric cancer, driven by the development of diagnostic tools and nationwide screening. The Korean Laparoscopic Gastrointestinal Surgery Study Group (KLASS group) has played a leading role in various projects related to minimally invasive surgery. The justification of minimally invasive procedures, including robotic surgery, sentinel-node biopsy, and single-port surgery/Natural Orifice Transluminal Endoscopic Surgery (NOTES), must be established by clinical trials before wide application, and the medical industry as well as surgeons bear great responsibility. Copyright © 2012 Wiley Periodicals, Inc.

  8. Stop co-annihilation in the minimal supersymmetric standard model revisited

    NASA Astrophysics Data System (ADS)

    Pierce, Aaron; Shah, Nausheen R.; Vogl, Stefan

    2018-01-01

    We reexamine the stop co-annihilation scenario of the minimal supersymmetric standard model, wherein a bino-like lightest supersymmetric particle has a thermal relic density set by co-annihilations with a scalar partner of the top quark in the early universe. We concentrate on the case where only the top partner sector is relevant for the cosmology, and other particles are heavy. We discuss the cosmology with focus on low energy parameters and an emphasis on the implications of the measured Higgs boson mass and its properties. We find that the irreducible direct detection signal correlated with this cosmology is generically well below projected experimental sensitivity, and in most cases lies below the neutrino background. A larger, detectable, direct detection rate is possible, but is unrelated to the co-annihilation cosmology. LHC searches for compressed spectra are crucial for probing this scenario.

  9. A silicon-based peptide biosensor for label-free detection of cancer cells

    NASA Astrophysics Data System (ADS)

    Martucci, Nicola M.; Rea, Ilaria; Ruggiero, Immacolata; Terracciano, Monica; De Stefano, Luca; Migliaccio, Nunzia; Dardano, Principia; Arcari, Paolo; Rendina, Ivo; Lamberti, Annalisa

    2015-05-01

    Sensitive and accurate detection of cancer cells plays a crucial role in the diagnosis of cancer and of minimal residual disease, making it one of the most promising approaches to reducing cancer death rates. In this paper, a strategy for highly selective and sensitive detection of lymphoma cells on a planar silicon-based biosensor is evaluated. An idiotype peptide, able to specifically bind the B-cell receptor (BCR) of A20 cells in mice engrafted with A20 lymphoma, was covalently linked to the active surface of the sensor and used as the molecular probe. The biochip presented here showed a coverage efficiency of 85% with a detection efficiency of 8.5×10^-3 cells/μm^2. The results suggest an efficient route to specific label-free cell detection using a silicon-based peptide biosensor. In addition, the present recognition strategy, besides being useful for the development of sensing devices capable of monitoring minimal residual disease, could be used to find and characterize new specific receptor-ligand interactions through the screening of a recombinant phage library.

  10. Multi objective optimization model for minimizing production cost and environmental impact in CNC turning process

    NASA Astrophysics Data System (ADS)

    Widhiarso, Wahyu; Rosyidi, Cucuk Nur

    2018-02-01

    Minimizing production cost in a manufacturing company increases its profit. The cutting parameters affect the total processing time, which in turn affects the production cost of the machining process. Besides affecting production cost and processing time, the cutting parameters also affect the environment. An optimization model is therefore needed to determine the optimum cutting parameters. In this paper, we develop a multi-objective optimization model to minimize the production cost and the environmental impact of the CNC turning process. Cutting speed and feed rate serve as the decision variables. The constraints considered are cutting speed, feed rate, cutting force, output power, and surface roughness. The environmental impact is converted from the environmental burden using Eco-indicator 99. A numerical example is given to show the implementation of the model, which is solved using the OptQuest tool of the Oracle Crystal Ball software. The optimization results indicate that the model can be used to optimize the cutting parameters so as to minimize both the production cost and the environmental impact.
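The weighted-sum scalarization described above can be sketched in a few lines. This is a minimal illustration only: the turning-time formula is the standard t = πDL/(1000·V·f), but every coefficient, bound, and the eco-indicator conversion below is an assumed placeholder rather than the paper's calibrated model, and a coarse grid search stands in for OptQuest.

```python
# Hedged sketch: weighted-sum scalarization of the two objectives
# (production cost and eco-indicator environmental impact) over the two
# decision variables, cutting speed V (m/min) and feed rate f (mm/rev).
# All numeric coefficients are illustrative placeholders.

import math

D, L = 50.0, 100.0      # workpiece diameter and length (mm), assumed
C_MACHINE = 0.5         # machine + operator rate ($/min), assumed
E_RATE = 0.02           # eco-indicator points per minute of spindle time, assumed

def machining_time(V, f):
    """Standard turning-time formula: t = pi*D*L / (1000*V*f), in minutes."""
    return math.pi * D * L / (1000.0 * V * f)

def cost(V, f):
    return C_MACHINE * machining_time(V, f)

def eco_impact(V, f):
    # Environmental burden converted to eco-indicator points (placeholder model).
    return E_RATE * machining_time(V, f) * (1.0 + V / 400.0)

def weighted_objective(V, f, w=0.5):
    return w * cost(V, f) + (1.0 - w) * eco_impact(V, f)

# Grid search over simple bounds standing in for the paper's constraints
# (cutting speed, feed rate, cutting force, output power, surface roughness).
best = min(
    ((V, f) for V in range(80, 301, 10) for f in (0.1, 0.2, 0.3, 0.4)),
    key=lambda vf: weighted_objective(*vf),
)
print("optimal (V, f):", best)
```

With these placeholder coefficients both objectives fall as machining time falls, so the optimum sits at the upper bounds; with a calibrated tool-wear or surface-roughness term the trade-off becomes non-trivial.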

  11. Ground standoff mine detection system (GSTAMIDS) engineering, manufacturing, and development (EMD) Block 0

    NASA Astrophysics Data System (ADS)

    Pressley, Jackson R.; Pabst, Donald; Sower, Gary D.; Nee, Larry; Green, Brian; Howard, Peter

    2001-10-01

    The United States Army has contracted EG&G Technical Services to build the GSTAMIDS EMD Block 0. This system autonomously detects and marks buried anti-tank land mines from an unmanned vehicle. It consists of a remotely operated host vehicle, a standard teleoperation system (STS) control, a mine detection system (MDS), and a control vehicle. Two complete systems are being fabricated, along with a third MDS. The host vehicle for Block 0 is the South African Meerkat, which has overpass capability for anti-tank mines as well as armored anti-mine blast protection and ballistic protection. It is operated via the STS radio link from within the control vehicle. The Main Computer System (MCS), located in the control vehicle, receives sensor data from the MDS via a high-speed radio link, processes and fuses the data to decide whether a mine has been detected, and sends the information back to the host vehicle so that a mark can be placed on the mine location. The MCS can also interface with the FBCB2 system via SINCGARS radio. The GSTAMIDS operator station and the control vehicle communications system also connect to the MCS. The MDS sensors are mounted on the host vehicle and include Ground Penetrating Radar (GPR), a Pulsed Magnetic Induction (PMI) metal detector, and (as an option) a long-wave infrared (LWIR) imager. A distributed processing architecture is used so that data are pre-processed at the sensor level before transmission to the MCS, minimizing the required throughput. Nine channels each of GPR and PMI are mounted underneath the Meerkat to provide a three-meter detection swath. Two IR cameras are mounted on the upper sides of the Meerkat, providing a field of view covering the required swath with overlap underneath the vehicle. Also included on the host vehicle are an Inertial Navigation System (INS), a Global Positioning System (GPS) receiver, and radio communications for remote control and data transmission.
The GSTAMIDS Block 0 is designed as a modular, expandable system with sufficient bandwidth and processing capability for incorporation of additional sensor systems in future Blocks. It is also designed to operate in adverse weather conditions and to be transportable around the world.

  12. Deep learning classifier with optical coherence tomography images for early dental caries detection

    NASA Astrophysics Data System (ADS)

    Karimian, Nima; Salehi, Hassan S.; Mahdian, Mina; Alnajjar, Hisham; Tadinada, Aditya

    2018-02-01

    Dental caries is a microbial disease that results in localized dissolution of the mineral content of dental tissue. Despite a considerable decline in its incidence, dental caries remains a major health problem in many societies. Early detection of incipient lesions at the initial stages of demineralization allows the implementation of non-surgical preventive approaches that can reverse the demineralization process. In this paper, we present a novel approach combining deep convolutional neural networks (CNNs) and the optical coherence tomography (OCT) imaging modality for classification of human oral tissues to detect early dental caries. OCT images of oral tissues with various densities were input to a CNN classifier to determine variations in tissue density resembling the demineralization process. The CNN automatically learns a hierarchy of increasingly complex features, and a related classifier, directly from the training data sets. The initial CNN layer parameters were randomly selected. The training set is split into minibatches of 10 OCT images each. Given a batch of training patches, the CNN employs two convolutional and pooling layers to extract features and then classifies each patch based on the probabilities from the SoftMax classification layer (output layer). Afterward, the CNN calculates the error between the classification result and the reference label, then uses backpropagation to fine-tune all the layer parameters so as to minimize this error with the batch gradient descent algorithm. We validated the proposed technique on ex-vivo OCT images of human oral tissues (enamel, cortical bone, trabecular bone, muscular tissue, and fatty tissue), which attested to the effectiveness of the method.
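The training loop the abstract describes (minibatches of 10, SoftMax output, cross-entropy error, batch gradient descent) can be illustrated without the convolutional layers. The sketch below trains only a SoftMax classifier on random stand-in features; it is not the paper's network, and the data, sizes, and learning rate are assumptions.

```python
# Minimal sketch of minibatch gradient descent with a SoftMax output layer
# and cross-entropy error (the convolutional/pooling feature extractor is
# omitted; features here are random stand-ins for OCT image patches).

import numpy as np

rng = np.random.default_rng(0)
n, d, k = 100, 8, 2                 # samples, feature dim, classes (assumed)
X = rng.normal(size=(n, d))
y = (X[:, 0] > 0).astype(int)       # toy labels

W = np.zeros((d, k))                # output-layer weights, randomly/zero initialized

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)    # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr, batch = 0.1, 10
for epoch in range(50):
    perm = rng.permutation(n)
    for i in range(0, n, batch):            # split into minibatches of 10
        idx = perm[i:i + batch]
        p = softmax(X[idx] @ W)
        p[np.arange(len(idx)), y[idx]] -= 1.0   # dL/dz for cross-entropy + softmax
        W -= lr * X[idx].T @ p / len(idx)       # batch gradient descent step

acc = (softmax(X @ W).argmax(axis=1) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The combined softmax/cross-entropy gradient reduces to `p - onehot(y)`, which is why no explicit loss evaluation is needed in the update.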

  13. Do Circulating Tumor Cells, Exosomes, and Circulating Tumor Nucleic Acids Have Clinical Utility?

    PubMed Central

    Gold, Bert; Cankovic, Milena; Furtado, Larissa V.; Meier, Frederick; Gocke, Christopher D.

    2016-01-01

    Diagnosing and screening for tumors through noninvasive means represent an important paradigm shift in precision medicine. In contrast to tissue biopsy, detection of circulating tumor cells (CTCs) and circulating tumor nucleic acids provides a minimally invasive method for predictive and prognostic marker detection. This allows early and serial assessment of metastatic disease, including follow-up during remission, characterization of treatment effects, and clonal evolution. Isolation and characterization of CTCs and circulating tumor DNA (ctDNA) are likely to improve cancer diagnosis, treatment, and minimal residual disease monitoring. However, more trials are required to validate the clinical utility of precise molecular markers for a variety of tumor types. This review focuses on the clinical utility of CTCs and ctDNA testing in patients with solid tumors, including somatic and epigenetic alterations that can be detected. A comparison of methods used to isolate and detect CTCs and some of the intricacies of the characterization of the ctDNA are also provided. PMID:25908243

  14. Radar detection with the Neyman-Pearson criterion using supervised-learning-machines trained with the cross-entropy error

    NASA Astrophysics Data System (ADS)

    Jarabo-Amores, María-Pilar; la Mata-Moya, David de; Gil-Pita, Roberto; Rosa-Zurera, Manuel

    2013-12-01

    The application of supervised learning machines trained to minimize the cross-entropy error to radar detection is explored in this article. The detector is implemented with a learning machine that realizes a discriminant function whose output is compared to a threshold selected to fix a desired probability of false alarm. The study is based on calculating the function that the learning machine approximates during training, and on applying a sufficient condition for a discriminant function to be usable as an approximation to the optimum Neyman-Pearson (NP) detector. In this article, the function a supervised learning machine approximates after being trained to minimize the cross-entropy error is obtained. This discriminant function can be used to implement the NP detector, which maximizes the probability of detection while keeping the probability of false alarm at or below a predefined value. Experiments on signal detection using neural networks are also presented to test the validity of the study.
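The thresholding step, fixing the probability of false alarm by comparing the discriminant output to a threshold, can be sketched as follows. A trivial identity discriminant and an assumed Gaussian signal model stand in for the trained learning machine; nothing here is the article's actual detector.

```python
# Sketch: choose the detection threshold as the empirical (1 - Pfa) quantile
# of the discriminant output under H0 (noise only), then measure the
# resulting probabilities of false alarm and detection.

import numpy as np

rng = np.random.default_rng(1)
pfa_target = 0.01
noise = rng.normal(size=100000)            # H0: noise only
signal = 2.0 + rng.normal(size=100000)     # H1: signal + noise (assumed SNR)

score = lambda x: x                        # stand-in for the learned discriminant

# Threshold fixed from the H0 score distribution.
thr = np.quantile(score(noise), 1.0 - pfa_target)

pfa = (score(noise) > thr).mean()
pd = (score(signal) > thr).mean()
print(f"threshold={thr:.3f}  Pfa={pfa:.4f}  Pd={pd:.3f}")
```

For a discriminant that approximates the likelihood ratio, this construction yields the NP detector: Pfa is pinned at the design value and Pd is maximized.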

  15. Detection of content adaptive LSB matching: a game theory approach

    NASA Astrophysics Data System (ADS)

    Denemark, Tomáš; Fridrich, Jessica

    2014-02-01

    This paper is an attempt to analyze the interaction between Alice and the Warden in steganography using game theory. We focus on the modern steganographic embedding paradigm based on minimizing an additive distortion function. The strategies of both players comprise the probabilistic selection channel. The Warden is granted knowledge of the payload and the embedding costs, and detects embedding using the likelihood ratio; the Warden is, however, ignorant of the embedding probabilities chosen by Alice. When adopting a simple multivariate Gaussian model for the cover, the payoff function, in the form of the Warden's detection error, can be numerically evaluated for a mutually independent embedding operation. We demonstrate on the example of a two-pixel cover that the Nash equilibrium differs from the traditional strategy in which Alice minimizes the KL divergence between cover and stego objects against an omnipotent Warden. Practical implications of this case study include computing the loss, per pixel, in the Warden's ability to detect embedding due to her ignorance of the selection channel.
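The KL divergence between cover and stego objects referenced above can be illustrated for a Gaussian cover model. Assuming a zero-mean Gaussian cover of variance σ² and mutually independent ±1 embedding at change rate β (each change +1 or −1 with probability β/2, so the stego variance is approximately σ² + β), the closed-form Gaussian KL divergence applies; the parameter values below are illustrative, not taken from the paper.

```python
# Sketch: KL divergence between a zero-mean Gaussian cover N(0, s1) and a
# Gaussian approximation of the stego signal N(0, s2 = s1 + beta), using
# KL = 0.5 * (ln(s2/s1) + s1/s2 - 1) for zero-mean Gaussians.

import math

def kl_gauss(s1, s2):
    """KL divergence between N(0, s1) and N(0, s2); s1, s2 are variances."""
    return 0.5 * (math.log(s2 / s1) + s1 / s2 - 1.0)

sigma2 = 4.0                      # cover pixel variance (assumed)
for beta in (0.05, 0.1, 0.2):     # embedding change rates
    d = kl_gauss(sigma2, sigma2 + beta)
    print(f"beta={beta:.2f}  KL={d:.6f}")
```

The divergence grows monotonically with the change rate, which is what makes "minimize KL between cover and stego" a natural objective for Alice against an omniscient Warden.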

  16. Performance of chip seals using local and minimally processed aggregates for preservation of low traffic volume roadways.

    DOT National Transportation Integrated Search

    2013-07-01

    This report documents the performance of two low traffic volume experimental chip seals constructed using : locally available, minimally processed sand and gravel aggregates after four winters of service. The projects : were constructed by CDOT maint...

  17. INTELLIGENT DECISION SUPPORT FOR WASTE MINIMIZATION IN ELECTROPLATING PLANTS. (R824732)

    EPA Science Inventory

    Abstract

    Wastewater, spent solvent, spent process solutions, and sludge are the major waste streams generated in large volumes daily in electroplating plants. These waste streams can be significantly minimized through process modification and operational improvement. I...

  18. Development of a qualitative, multiplex real-time PCR kit for screening of genetically modified organisms (GMOs).

    PubMed

    Dörries, Hans-Henno; Remus, Ivonne; Grönewald, Astrid; Grönewald, Cordt; Berghof-Jäger, Kornelia

    2010-03-01

    The number of commercially available genetically modified organisms (GMOs), and therefore the diversity of possible target sequences for molecular detection techniques, is constantly increasing. As a result, GMO laboratories and the food production industry currently must apply many different methods to reliably test raw materials and complex processed food products. Screening methods have become more and more relevant to minimize the analytical effort and to make a preselection for further analysis (e.g., specific identification or quantification of the GMO). A multiplex real-time PCR kit was developed to detect the 35S promoter of the cauliflower mosaic virus, the terminator of the nopaline synthase gene of Agrobacterium tumefaciens, the 35S promoter from the figwort mosaic virus, and the bar gene of the soil bacterium Streptomyces hygroscopicus as the most widely used sequences in GMOs. The kit contains a second assay for the detection of plant-derived DNA to control the quality of the often processed and refined sample material. Additionally, the plant-specific assay comprises a homologous internal amplification control for inhibition control. The determined limits of detection for the five assays were 10 target copies/reaction. No amplification products were observed with DNAs of 26 bacterial species, 25 yeasts, 13 molds, and 41 non-genetically-modified plants. The specificity of the assays was further demonstrated to be 100% by the specific amplification of DNA derived from reference material of 22 genetically modified crops. The applicability of the kit in routine laboratory use was verified by testing 50 spiked and unspiked food products. The kit described herein represents a simple and sensitive GMO screening method for the reliable detection of multiple GMO-specific target sequences in a multiplex real-time PCR.

  19. Markers of pregnancy: how early can we detect pregnancies in cattle using pregnancy-associated glycoproteins (PAGs) and microRNAs?

    USDA-ARS?s Scientific Manuscript database

    Pregnancy detection has evolved over the last few decades, and early pregnancy detection is critical to minimize the amount of time a cow spends not pregnant, or open. Embryonic mortality (EM) is generally considered to be the primary factor limiting pregnancy rates in cattle and occ...

  20. FEM analysis of bonding process used for minimization of deformation of optical surface under Metis coronagraph mirrors manufacturing

    NASA Astrophysics Data System (ADS)

    Procháska, F.; Vít, T.; Matoušek, O.; Melich, R.

    2016-11-01

    High demands on final surface micro-roughness as well as great shape accuracy have to be met during the manufacturing of the precise mirrors for the Metis orbital coronagraph. This is a challenging engineering task given the lightweight design of the mirrors and the resulting limited stability of the optical surface shape. Manufacturing of such optical elements is affected by a number of effects, most of them caused by instability of the temperature field. It is necessary to explore, understand, and consequently minimize all thermo-mechanical processes that take place during mirror cementing, grinding, and polishing in order to minimize the optical surface deformation. FEM simulation proved to be a useful tool for this task. FEM simulations were used to develop and virtually compare different mirror holders, to minimize the residual stress generated by temperature changes, and to suppress the shape deformation of the optical surface below the critical limit of about 100 nm.
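A back-of-the-envelope estimate shows why holder design against the ~100 nm limit matters: the free thermo-mechanical mismatch displacement scales as Δα·ΔT·L. The material and temperature values below are typical assumed numbers, not the study's.

```python
# Order-of-magnitude check of thermal-mismatch deformation between a
# low-expansion glass mirror and a metal holder (all values assumed).

ALPHA_GLASS = 0.5e-6     # CTE of a low-expansion glass, 1/K (assumed)
ALPHA_STEEL = 12.0e-6    # CTE of a steel holder, 1/K (assumed)
L = 0.1                  # characteristic mirror dimension, m (assumed)
DT = 2.0                 # temperature excursion during polishing, K (assumed)

deflection = (ALPHA_STEEL - ALPHA_GLASS) * DT * L   # metres
print(f"mismatch displacement ~ {deflection * 1e9:.0f} nm (limit: ~100 nm)")
```

Even a 2 K excursion gives a free mismatch displacement orders of magnitude above 100 nm, which is why the holder must decouple the optic from the thermal strain rather than constrain it rigidly.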

  1. Mammography screening using independent double reading with consensus: is there a potential benefit for computer-aided detection?

    PubMed

    Skaane, Per; Kshirsagar, Ashwini; Hofvind, Solveig; Jahr, Gunnar; Castellino, Ronald A

    2012-04-01

    Double reading improves the cancer detection rate in mammography screening. Single reading with computer-aided detection (CAD) has been considered to be an alternative to double reading. Little is known about the potential benefit of CAD in breast cancer screening with double reading. To compare prospective independent double reading of screen-film (SFM) and full-field digital (FFDM) mammography in population-based screening with retrospective standalone CAD performance on the baseline mammograms of the screen-detected cancers and subsequent cancers diagnosed during the follow-up period. The study had ethics committee approval. A 5-point rating scale for probability of cancer was used for 23,923 (SFM = 16,983; FFDM = 6940) screening mammograms. Of 208 evaluable cancers, 104 were screen-detected and 104 were subsequent (44 interval and 60 next screening round) cancers. Baseline mammograms of subsequent cancers were retrospectively classified in consensus without information about cancer location, histology, or CAD prompting as normal, non-specific minimal signs, significant minimal signs, and false-negatives. The baseline mammograms of the screen-detected cancers and subsequent cancers were evaluated by CAD. Significant minimal signs and false-negatives were considered 'actionable' and potentially diagnosable if correctly prompted by CAD. CAD correctly marked 94% (98/104) of the baseline mammograms of the screen-detected cancers (SFM = 95% [61/64]; FFDM = 93% [37/40]), including 96% (23/24) of those with discordant interpretations. Considering only those baseline examinations of subsequent cancers prospectively interpreted as normal and retrospectively categorized as 'actionable', CAD input at baseline screening had the potential to increase the cancer detection rate from 0.43% to 0.51% (P = 0.13); and to increase cancer detection by 16% ([104 + 17]/104) and decrease interval cancers by 20% (from 44 to 35). 
CAD may have the potential to increase cancer detection by up to 16%, and to reduce the number of interval cancers by up to 20% in SFM and FFDM screening programs using independent double reading with consensus review. The influence of true- and false-positive CAD marks on decision-making can, however, only be evaluated in a prospective clinical study.

  2. Disappearance of Ph1 chromosome with intensive chemotherapy and detection of minimal residual disease by polymerase chain reaction in a patient with blast crisis of chronic myelogenous leukemia.

    PubMed

    Honda, H; Miyagawa, K; Endo, M; Takaku, F; Yazaki, Y; Hirai, H

    1993-06-01

    We diagnosed a patient with chronic myelogenous leukemia (CML) in chronic phase (CP) on the basis of clinical findings, the Ph1 chromosome detected by cytogenetic analysis, and bcr-abl fusion mRNA detected by reverse transcriptase-dependent polymerase chain reaction (RT-PCR). One month after diagnosis, the patient developed extramedullary blast crisis (BC) in the lymph nodes and then medullary blast crisis in the bone marrow, in which different surface markers were found. Combination chemotherapy with BH-AC, VP16, and mitoxantrone was administered; this resulted in rapid disappearance of the lymphadenopathy, restoration of normal hematopoiesis, and no Ph1 chromosome detectable by cytogenetic analysis. RT-PCR performed to detect the residual Ph1 clone revealed that, although the Ph1 clone was preferentially suppressed, it was still present. The intensive chemotherapy regimen preferentially suppressed the Ph1-positive clone and led to both clinical and cytogenetic remission in this patient with BC of CML; we suggest that RT-PCR is a sensitive and useful method for detecting minimal residual disease during the clinical course of this disease.

  3. Validity, Responsiveness, Minimal Detectable Change, and Minimal Clinically Important Change of "Pediatric Balance Scale" in Children with Cerebral Palsy

    ERIC Educational Resources Information Center

    Chen, Chia-ling; Shen, I-hsuan; Chen, Chung-yao; Wu, Ching-yi; Liu, Wen-Yu; Chung, Chia-ying

    2013-01-01

    This study examined criterion-related validity and clinimetric properties of the pediatric balance scale ("PBS") in children with cerebral palsy (CP). Forty-five children with CP (age range: 19-77 months) and their parents participated in this study. At baseline and at follow up, Pearson correlation coefficients were used to determine…

  4. Translating Big Data into Smart Data for Veterinary Epidemiology

    PubMed Central

    VanderWaal, Kimberly; Morrison, Robert B.; Neuhauser, Claudia; Vilalta, Carles; Perez, Andres M.

    2017-01-01

    The increasing availability and complexity of data has led to new opportunities and challenges in veterinary epidemiology around how to translate abundant, diverse, and rapidly growing “big” data into meaningful insights for animal health. Big data analytics are used to understand health risks and minimize the impact of adverse animal health issues by identifying high-risk populations, combining data or processes acting at multiple scales through epidemiological modeling approaches, and harnessing high-velocity data to monitor animal health trends and detect emerging health threats. The advent of big data requires the incorporation of new skills into veterinary epidemiology training, including, for example, machine learning and coding, to prepare a new generation of scientists and practitioners to engage with big data. Establishing pipelines to analyze big data in near real time is the next step in progressing from simply having “big data” to creating “smart data,” with the objective of improving understanding of health risks and the effectiveness of management and policy decisions, and ultimately preventing or at least minimizing the impact of adverse animal health issues. PMID:28770216

  5. Application of polymer-coated metal-insulator-semiconductor sensors for the detection of dissolved hydrogen

    NASA Astrophysics Data System (ADS)

    Li, Dongmei; Medlin, J. W.; Bastasz, R.

    2006-06-01

    The detection of dissolved hydrogen in liquids is crucial to many industrial applications, such as fault detection for oil-filled electrical equipment. To enhance the performance of metal-insulator-semiconductor (MIS) sensors for dissolved hydrogen detection, a palladium MIS sensor has been modified by depositing a polyimide (PI) layer above the palladium surface. Response measurements of the PI-coated sensors in mineral oil indicate that hydrogen is sensitively detected, while the effect of interfering gases on sensor response is minimized.

  6. Qualifying a Bonding Process for the Space Interferometry Mission

    NASA Technical Reports Server (NTRS)

    Joyce, Gretchen P.

    2005-01-01

    The Space Interferometry Mission consists of three parallel Michelson interferometers that will be capable of detecting extrasolar planets with a high degree of accuracy and precision. High levels of stability must be met in order to fulfill the scientific requirements of this mission. To attain successful measurements, the mismatch in coefficient of thermal expansion between optics and bonding material must be minimized without jeopardizing the integrity of the bonds. Optic-to-optic bonds have been analyzed to better understand variables such as the effects of coefficient of thermal expansion differences between optics and bonding materials, and materials have been chosen for the project based on these analyses. A study was conducted to determine whether a reliable, repeatable process for bonding by wicking adhesive could be obtained using a low-viscosity epoxy and ultra-low-expansion glass. A methodology for bonding fused silica optics with Z-6020 silane primer and Epo-Tek 301 epoxy will be discussed.

  7. Discourse changes in early Alzheimer disease, mild cognitive impairment, and normal aging.

    PubMed

    Chapman, Sandra Bond; Zientz, Jennifer; Weiner, Myron; Rosenberg, Roger; Frawley, William; Burns, Mary Hope

    2002-01-01

    The purpose of this study was to determine the sensitivity of discourse gist measures to the early cognitive-linguistic changes in Alzheimer disease (AD) and in the preclinical stages. Differences in discourse abilities were examined in 25 cognitively normal adults, 24 adults with mild probable AD, and 20 adults with mild cognitive impairment (MCI) at gist and detail levels of discourse processing. The authors found that gist and detail levels of discourse processing were significantly impaired in persons with AD and MCI as compared with normal control subjects. Gist-level discourse processing abilities showed minimal overlap between cognitively normal control subjects and those with mild AD. Moreover, the majority of the persons with MCI performed in the range of AD on gist measures. These findings indicate that discourse gist measures hold promise as a diagnostic complement to enhance early detection of AD. Further studies are needed to determine how early the discourse gist deficits arise in AD.

  8. Evolution of a modular software network

    PubMed Central

    Fortuna, Miguel A.; Bonachela, Juan A.; Levin, Simon A.

    2011-01-01

    “Evolution behaves like a tinkerer” (François Jacob, Science, 1977). Software systems provide a singular opportunity to understand biological processes using concepts from network theory. The Debian GNU/Linux operating system allows us to explore the evolution of a complex network in a unique way. The modular design detected during its growth is based on the reuse of existing code in order to minimize costs during programming. The increase of modularity experienced by the system over time has not counterbalanced the increase in incompatibilities between software packages within modules. This negative effect is far from being a failure of design. A random process of package installation shows that the higher the modularity, the larger the fraction of packages working properly in a local computer. The decrease in the relative number of conflicts between packages from different modules avoids a failure in the functionality of one package spreading throughout the entire system. Some potential analogies with the evolutionary and ecological processes determining the structure of ecological networks of interacting species are discussed. PMID:22106260

  9. Using fingerprint image quality to improve the identification performance of the U.S. Visitor and Immigrant Status Indicator Technology Program

    PubMed Central

    Wein, Lawrence M.; Baveja, Manas

    2005-01-01

    Motivated by the difficulty of biometric systems to correctly match fingerprints with poor image quality, we formulate and solve a game-theoretic formulation of the identification problem in two settings: U.S. visa applicants are checked against a list of visa holders to detect visa fraud, and visitors entering the U.S. are checked against a watchlist of criminals and suspected terrorists. For three types of biometric strategies, we solve the game in which the U.S. Government chooses the strategy's optimal parameter values to maximize the detection probability subject to a constraint on the mean biometric processing time per legal visitor, and then the terrorist chooses the image quality to minimize the detection probability. At current inspector staffing levels at ports of entry, our model predicts that a quality-dependent two-finger strategy achieves a detection probability of 0.733, compared to 0.526 under the quality-independent two-finger strategy that is currently implemented at the U.S. border. Increasing the staffing level of inspectors offers only minor increases in the detection probability for these two strategies. Using more than two fingers to match visitors with poor image quality allows a detection probability of 0.949 under current staffing levels, but may require major changes to the current U.S. biometric program. The detection probabilities during visa application are ≈11–22% smaller than at ports of entry for all three strategies, but the same qualitative conclusions hold. PMID:15894628

  10. Using fingerprint image quality to improve the identification performance of the U.S. Visitor and Immigrant Status Indicator Technology Program.

    PubMed

    Wein, Lawrence M; Baveja, Manas

    2005-05-24

    Motivated by the difficulty of biometric systems to correctly match fingerprints with poor image quality, we formulate and solve a game-theoretic formulation of the identification problem in two settings: U.S. visa applicants are checked against a list of visa holders to detect visa fraud, and visitors entering the U.S. are checked against a watchlist of criminals and suspected terrorists. For three types of biometric strategies, we solve the game in which the U.S. Government chooses the strategy's optimal parameter values to maximize the detection probability subject to a constraint on the mean biometric processing time per legal visitor, and then the terrorist chooses the image quality to minimize the detection probability. At current inspector staffing levels at ports of entry, our model predicts that a quality-dependent two-finger strategy achieves a detection probability of 0.733, compared to 0.526 under the quality-independent two-finger strategy that is currently implemented at the U.S. border. Increasing the staffing level of inspectors offers only minor increases in the detection probability for these two strategies. Using more than two fingers to match visitors with poor image quality allows a detection probability of 0.949 under current staffing levels, but may require major changes to the current U.S. biometric program. The detection probabilities during visa application are approximately 11-22% smaller than at ports of entry for all three strategies, but the same qualitative conclusions hold.

  11. Fast learning method for convolutional neural networks using extreme learning machine and its application to lane detection.

    PubMed

    Kim, Jihun; Kim, Jonghong; Jang, Gil-Jin; Lee, Minho

    2017-03-01

    Deep learning has received significant attention recently as a promising solution to many problems in the area of artificial intelligence. Among several deep learning architectures, convolutional neural networks (CNNs) demonstrate superior performance when compared to other machine learning methods in the applications of object detection and recognition. We use a CNN for image enhancement and the detection of driving lanes on motorways. In general, the process of lane detection consists of edge extraction and line detection. A CNN can be used to enhance the input images before lane detection by excluding noise and obstacles that are irrelevant to the edge detection result. However, training conventional CNNs requires considerable computation and a big dataset. Therefore, we suggest a new learning algorithm for CNNs using an extreme learning machine (ELM). The ELM is a fast learning method used to calculate network weights between output and hidden layers in a single iteration and thus, can dramatically reduce learning time while producing accurate results with minimal training data. A conventional ELM can be applied to networks with a single hidden layer; as such, we propose a stacked ELM architecture in the CNN framework. Further, we modify the backpropagation algorithm to find the targets of hidden layers and effectively learn network weights while maintaining performance. Experimental results confirm that the proposed method is effective in reducing learning time and improving performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
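The single-iteration weight calculation that makes the ELM fast can be sketched on toy data: input-to-hidden weights are random and fixed, and the hidden-to-output weights are obtained in one shot from the Moore-Penrose pseudoinverse. The data and layer sizes below are illustrative assumptions, not the paper's lane-detection setup.

```python
# Minimal extreme learning machine (ELM) sketch: random fixed hidden layer,
# output weights solved by least squares (pseudoinverse) in a single step.

import numpy as np

rng = np.random.default_rng(0)
n, d, h = 200, 2, 50
X = rng.normal(size=(n, d))
y = (X[:, 0] * X[:, 1] > 0).astype(float)   # XOR-like toy target

W = rng.normal(size=(d, h))                 # random input weights (never trained)
b = rng.normal(size=h)
H = np.tanh(X @ W + b)                      # hidden-layer activations

beta = np.linalg.pinv(H) @ y                # output weights in one shot

acc = ((H @ beta > 0.5) == (y > 0.5)).mean()
print(f"ELM training accuracy: {acc:.2f}")
```

Because only `beta` is computed, there is no iterative backpropagation over the hidden layer, which is the source of the dramatic reduction in learning time the abstract claims.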

  12. A model system for pathogen detection using a two-component bacteriophage/bioluminescent signal amplification assay

    NASA Astrophysics Data System (ADS)

    Bright, Nathan G.; Carroll, Richard J.; Applegate, Bruce M.

    2004-03-01

    Microbial contamination has become a mounting concern over the last decade due to an increased emphasis on minimally processed food products, specifically produce, and the recognition of foodborne pathogens such as Campylobacter jejuni, Escherichia coli O157:H7, and Listeria monocytogenes. This research investigates a detection approach that couples the pathogen specificity of bacteriophage with a bacterial bioluminescent bioreporter utilizing the quorum-sensing molecule of Vibrio fischeri, N-(3-oxohexanoyl)-homoserine lactone (3-oxo-C6-HSL). The 3-oxo-C6-HSL molecules diffuse out of the target cell after infection and induce bioluminescence in a population of 3-oxo-C6-HSL bioreporters (ROLux). E. coli phage M13, a well-characterized bacteriophage, offers a model system for testing the use of bacteriophage for pathogen detection through cell-to-cell communication via a LuxR/3-oxo-C6-HSL system. Simulated temperate-phage assays tested the functionality of the ROLux reporter and the production of 3-oxo-C6-HSL by various test strains. These assays showed detection limits of 10² CFU after 24 hours in a variety of detection formats. Assays incorporating the bacteriophage M13-luxI with the ROLux reporter and a known population of target cells were subsequently developed and have shown consistent detection limits of 10⁵ CFU of the target organism. The measurable light response from high concentrations of target cells was almost immediate, suggesting that an enrichment step could further improve detection limits and reduce assay time.

  13. 7 CFR 3430.36 - Procedures to minimize or eliminate duplication of effort.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) COOPERATIVE STATE RESEARCH, EDUCATION, AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE COMPETITIVE AND... may implement appropriate business processes to minimize or eliminate the awarding of CSREES Federal... awards made by other Federal agencies. Business processes may include the review of the Current and...

  14. MPLP and the Catalog Record as a Finding Aid

    ERIC Educational Resources Information Center

    Bowen Maier, Shannon

    2011-01-01

    The cataloging of otherwise unprocessed collections is an innovative minimal processing technique with important implications for reference service. This article mines the existing literature for how institutions engaged in minimal processing view reference, the strengths and weaknesses of catalog records as finding aids, and information about…

  15. Clinical study on minimally invasive liquefaction and drainage of intracerebral hematoma in the treatment of hypertensive putamen hemorrhage.

    PubMed

    Liang, Ke-Shan; Ding, Jian; Yin, Cheng-Bin; Peng, Li-Jing; Liu, Zhen-Chuan; Guo, Xiao; Liang, Shu-Yu; Zhang, Yong; Zhou, Sheng-Nian

    2017-12-04

    This study aims to compare the curative effects of different treatments for hypertensive putamen hemorrhage, in order to determine an ideal method of treatment, and to explore the curative effect of soft-channel minimally invasive liquefaction and drainage of intracerebral hematoma in the treatment of hypertensive putamen hemorrhage. Patients with hypertensive cerebral hemorrhage who were treated in our hospital from January 2015 to January 2016 were included in this study. Patients were divided into three groups: a minimally invasive drainage group, an internal medical treatment group and a craniotomy group. In the minimally invasive drainage group, puncture aspiration and drainage were performed according to the hematoma conditions detected on brain CT, the frontal approach was selected for putamen and intracerebral hemorrhage, and drainage was maintained until the hematoma disappeared on CT. Drug therapy predominated in the internal medical treatment group, while surgery under general anesthesia was performed to remove the hematoma in the craniotomy group. Post-treatment neurological function defect scores in the minimally invasive drainage group and internal medical group were 16.14 ± 11.27 and 31.43 ± 10.42, respectively; the difference was highly significant (P < 0.01). Post-treatment neurological function defect scores in the minimally invasive drainage group and craniotomy group were 16.14 ± 11.27 and 24.20 ± 12.23, respectively; the difference was statistically significant (P < 0.05). Among surviving patients, there was a highly significant difference in ADL1-2 level at follow-up between the minimally invasive drainage group and the internal medical treatment group (P < 0.01), and there was a significant difference in follow-up mortality between these two groups (P < 0.01). Clinical observation and follow-up results revealed that minimally invasive drainage treatment was superior to internal medical treatment and craniotomy.

  16. Knowledge and Attitude among General Dental Practitioners towards Minimally Invasive Dentistry in Riyadh and AlKharj

    PubMed Central

    Sheddi, Faisal Mohammed; Alharqan, Mesfer Saad; Khawja, Shabnam Gulzar; Vohra, Fahim; Akram, Zohaib; Faden, Asmaa Ahmed; Khalil, Hesham Saleh

    2016-01-01

    Introduction Minimally Invasive Dentistry (MID) emphasizes conservative caries management strategies resulting in less destruction of tooth structure, a deviation from GV Black’s traditional restorative principles. However, general dental practitioners appear either to lack knowledge of these principles or to have little intention of adopting them. Aim The aim of this study was to assess the knowledge and attitude among general dental practitioners towards minimally invasive dentistry in Riyadh and AlKharj cities of Saudi Arabia. Materials and Methods Self-administered structured questionnaires were handed to general dental practitioners (GDPs) in the cities of Riyadh and AlKharj in Saudi Arabia. Several questions, including Likert-type scale response categories (1–5), were used. The questions assessed the respondents’ levels of agreement regarding diagnostic, preventive and restorative techniques such as use of caries risk assessment, use of high-fluoride toothpaste, Atraumatic Restorative Treatment and tunnel preparations. Results Of the 200 questionnaires distributed, 161 GDPs completed them, an overall response rate of 80.5%. The GDPs showed significantly different approaches with regard to the use of a sharp explorer for caries detection (p = 0.014). Almost 60% of the participants had received no special education regarding minimally invasive procedures. Moreover, GDPs who had received MID training showed significantly better knowledge and attitude in adopting minimally invasive techniques for both diagnosis and treatment of dental caries. Conclusion Although GDPs possess knowledge about the benefits of MID, the study showed deficiencies in their attitudes towards caries detection methods and the application of minimally invasive dentistry procedures. PMID:27630962

  17. Current status of cryotherapy for prostate and kidney cancer.

    PubMed

    Cho, Seok; Kang, Seok Ho

    2014-12-01

    In terms of treating diseases, minimally invasive treatment has become a key element in reducing perioperative complications. Among the various minimally invasive treatments, cryotherapy is often used in urology to treat various types of cancers, especially prostate cancer and renal cancer. In prostate cancer, the increased incidence of low-risk, localized prostate cancer has made minimally invasive treatment modalities an attractive option. Focal cryotherapy for localized unilateral disease offers the added benefit of minimal morbidities. In renal cancer, owing to the increasing utilization of cross-sectional imaging, nearly 70% of newly detected renal masses are stage T1a, making them more susceptible to minimally invasive nephron-sparing therapies including laparoscopic and robotic partial nephrectomy and ablative therapies. This article reviews the various outcomes of cryotherapy compared with other treatments and the possible uses of cryotherapy in surgery.

  19. Quenching of cascade reaction between triplet and photochrome probes with nitroxide radicals. A novel labeling method in study of membranes and surface systems.

    PubMed

    Papper, V; Medvedeva, N; Fishov, I; Likhtenshtein, G I

    2000-01-01

    We propose a new method for studying the molecular dynamics and fluidity of living and model biomembranes and surface systems. The method is based on measuring the sensitized photoisomerization kinetics of a photochrome probe. The cascade triplet cis-trans photoisomerization of an excited stilbene derivative, sensitized by triplet-excited Erythrosin B, was studied in a model liposome membrane. The photoisomerization reaction is suppressed by nitroxide radicals, which quench the excited triplet state of the sensitizer. The enhanced fluorescence polarization of the stilbene probe incorporated into liposome membranes indicates that the stilbene molecules are confined in the relatively viscous phospholipid medium. Calibration of the "triple" cascade system is based on a previously proposed method that allows measurement of the product of the quenching rate constant and the sensitizer's triplet lifetime, as well as quantitative detection of the nitroxide radicals in the vicinity of the membrane surface. The experiment was conducted using the constant-illumination fluorescence technique. The sensitivity of the method with a standard commercial spectrofluorimeter is about 10^-12 mol of fluorescent molecules per sample and can be improved with advanced fluorescence techniques. The minimal detectable local concentration of nitroxide radicals, or of any other quencher, is about 10^-5 M. This method enables the investigation of chemical and biological surface processes on a microscopic scale, in minimal volumes of about 10^-3 microliters or less.
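    The calibration step above rests on the product of the quenching rate constant and the triplet lifetime. A hedged sketch of the underlying Stern-Volmer relation follows; the numerical values are illustrative assumptions, not data from the study.

```python
# Stern-Volmer quenching: I0/I = 1 + kq * tau0 * [Q], so a calibrated
# kq*tau0 product converts a measured intensity ratio into a quencher
# (nitroxide) concentration.
def quencher_concentration(I0, I, kq_tau0):
    """[Q] in mol/L from unquenched (I0) and quenched (I) intensities."""
    return (I0 / I - 1.0) / kq_tau0

# Illustrative numbers only (not from the study): kq*tau0 = 2.0e4 L/mol
# and a 20% depression of the sensitized signal.
Q = quencher_concentration(1.0, 0.8, 2.0e4)   # -> 1.25e-5 mol/L
```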

  20. Analysis of human tissue optical scattering spectra for the purpose of breast cancer diagnostics using multi-layer perceptron

    NASA Astrophysics Data System (ADS)

    Nuzhny, Anton S.; Shumsky, Sergey A.; Korzhov, Alexey G.; Lyubynskaya, Tatiana E.

    2008-02-01

    Optical scattering spectra obtained in clinical trials of a breast cancer diagnostic system were analyzed to detect, in the data flow, the segments corresponding to malignant tissues. A minimally invasive probe with optical fibers inside delivers white light from the source and collects the scattered light while being moved through the tissue. The sampling rate is 100 Hz, and each record contains measurements of scattered light intensity at 184 fixed wavelength points. The large amount of information acquired in each procedure, the fuzziness of the criteria for 'cancer' family membership and the noisiness of the data make neural networks an attractive tool for analyzing these data. To define the dividing rule between the 'cancer' and 'non-cancer' spectral families, a three-layer perceptron was applied. Backpropagation was used to minimize the learning error during perceptron training, and regularization was done using the Bayesian approach. The learning sample was formed by experts. End-to-end probability calculation throughout the procedure dataset showed reliable detection of the 'cancer' segments. Much attention was paid to the spectra of tissues with high blood content. Often the cause is vessel injury from the penetrating optical probe, but it can also be a dense vessel net surrounding a malignant tumor. To divide tissues with high blood content into 'cancer' and 'non-cancer' families, a separate perceptron was trained exclusively on such spectra.
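    A minimal sketch of a three-layer perceptron trained by backpropagation on 184-point spectra, in the spirit of the record. The data here are synthetic Gaussian-bump classes (the clinical spectra are not public), and a plain L2 penalty stands in for the Bayesian regularization the authors used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for 184-point scattering spectra: two noisy bump classes.
def make_spectra(n, center):
    x = np.linspace(0, 1, 184)
    return np.exp(-((x - center) ** 2) / 0.02) + 0.1 * rng.normal(size=(n, 184))

X = np.vstack([make_spectra(100, 0.3), make_spectra(100, 0.7)])
y = np.repeat([0.0, 1.0], 100)[:, None]

# Three-layer perceptron: 184 inputs, 16 tanh hidden units, sigmoid output.
W1 = rng.normal(scale=0.1, size=(184, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 1));   b2 = np.zeros(1)
lr, lam = 0.1, 1e-4                          # learning rate, L2 weight
for _ in range(500):
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2))) # sigmoid output
    d2 = (p - y) / len(X)                    # cross-entropy output gradient
    d1 = (d2 @ W2.T) * (1.0 - h ** 2)        # backprop through tanh
    W2 -= lr * (h.T @ d2 + lam * W2); b2 -= lr * d2.sum(axis=0)
    W1 -= lr * (X.T @ d1 + lam * W1); b1 -= lr * d1.sum(axis=0)

acc = ((p > 0.5) == (y > 0.5)).mean()        # training accuracy
```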

  1. Breast tissue classification in digital tomosynthesis images based on global gradient minimization and texture features

    NASA Astrophysics Data System (ADS)

    Qin, Xulei; Lu, Guolan; Sechopoulos, Ioannis; Fei, Baowei

    2014-03-01

    Digital breast tomosynthesis (DBT) is a pseudo-three-dimensional x-ray imaging modality proposed to decrease the effect of tissue superposition present in mammography, potentially resulting in an increase in clinical performance for the detection and diagnosis of breast cancer. Tissue classification in DBT images can be useful in risk assessment, computer-aided detection and radiation dosimetry, among other applications. However, classifying breast tissue in DBT is a challenging problem because DBT images include complicated structures, image noise, and out-of-plane artifacts due to limited angular tomographic sampling. In this project, we propose an automatic method to classify fatty and glandular tissue in DBT images. First, the DBT images are pre-processed to enhance the tissue structures and to decrease image noise and artifacts. Second, a global smoothing filter based on L0 gradient minimization is applied to eliminate detailed structures and enhance large-scale ones. Third, regions of similar structure are extracted and labeled by fuzzy C-means (FCM) classification, and texture features are calculated at the same time. Finally, each region is classified into a tissue type based on both intensity and texture features. The proposed method is validated on five patient DBT images, with manual segmentation as the gold standard. Dice scores and the confusion matrix are used to evaluate the classification results. The evaluation results demonstrated the feasibility of the proposed method for classifying breast glandular and fat tissue in DBT images.
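    The third step above labels similar-structure regions with fuzzy C-means. A minimal, generic FCM implementation on toy one-dimensional intensities (not the authors' pipeline; cluster values are invented):

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, n_iter=50, seed=0):
    """Minimal fuzzy C-means: returns membership matrix U (n x c) and
    cluster centers (c x d)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                 # rows sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]  # weighted means
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        U = 1.0 / d ** (2.0 / (m - 1.0))              # inverse-distance weights
        U /= U.sum(axis=1, keepdims=True)             # renormalize memberships
    return U, centers

# Toy one-dimensional "fatty vs glandular" intensity clusters.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(0.2, 0.05, 100),
                    rng.normal(0.8, 0.05, 100)])[:, None]
U, centers = fuzzy_cmeans(X)
lo, hi = sorted(centers.ravel())
```

    Unlike hard k-means, each sample keeps a graded membership in every cluster, which suits tissue regions with mixed composition.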

  2. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    PubMed Central

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85%–111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receipt to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in biomonitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569

  3. Lactic acid bacteria and natural antimicrobials to improve the safety and shelf-life of minimally processed sliced apples and lamb's lettuce.

    PubMed

    Siroli, Lorenzo; Patrignani, Francesca; Serrazanetti, Diana I; Tabanelli, Giulia; Montanari, Chiara; Gardini, Fausto; Lanciotti, Rosalba

    2015-05-01

    Outbreaks of food-borne disease associated with the consumption of fresh and minimally processed fruits and vegetables have increased dramatically over the last few years. Traditional chemical sanitizers are unable to completely eradicate or kill the microorganisms on fresh produce. These conditions have stimulated research into alternative methods for increasing food safety. The use of protective cultures, particularly lactic acid bacteria (LAB), has been proposed for minimally processed products. However, the application of bioprotective cultures has been limited at the industrial level. From this perspective, the main aims of this study were to select LAB from minimally processed fruits and vegetables to be used as biocontrol agents and then to evaluate the effects of the selected strains, alone or in combination with natural antimicrobials (2-(E)-hexenal/hexanal, 2-(E)-hexenal/citral for apples and thyme for lamb's lettuce), on the shelf-life and safety characteristics of minimally processed apples and lamb's lettuce. The results indicated that applying the Lactobacillus plantarum strains CIT3 and V7B3 to apples and lettuce, respectively, increased both the safety and the shelf-life. Moreover, combining the selected strains with natural antimicrobials produced a further increase in the shelf-life of these products without detrimental effects on their organoleptic qualities. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Osovizky, A.; Rotem Industries Ltd, Rotem Industrial Park; University of Maryland, College park, Maryland

    A Chromatic Analysis Neutron Diffractometer Or Reflectometer (CANDOR) is under development at the NIST Center for Neutron Research (NCNR). The CANDOR neutron sensor will rely on scintillator material for detecting the neutrons scattered by the sample under test. It consists of 6LiF:ZnS(Ag) scintillator material into which wavelength-shifting (WLS) fibers have been embedded. Solid-state photosensors (silicon photomultipliers) coupled to the WLS fibers are used to detect the light produced by the neutron capture event (the 6Li(n,α)3H reaction) and the ionization of the ZnS(Ag). This detector configuration has the potential to meet the CANDOR performance requirements: efficiency of 90% for 5 Å (3.35 meV) neutrons with high gamma rejection (10^7), along with a compact design, affordable cost and materials availability. However, this novel design poses challenges for precise neutron detection. Distinguishing the neutron signature from noise produced by gamma events cannot easily be accomplished by pulse-height discrimination, as it can with a 3He gas tube. Furthermore, the choice of silicon photomultipliers (SiPMs) as the light sensor introduces dark noise, which does not exist when a photomultiplier tube is coupled to the scintillator. A proper selection of SiPM should focus on increasing the output signal and reducing the dark noise, in order to optimize the detection sensitivity and provide a clean signal for pulse shape discrimination. The main parameters for evaluation are:
    - Quantum efficiency (QE): matching the SiPM peak QE to the peak emission wavelength of the WLS fibers.
    - Recovery time: a short recovery time is preferred to minimize the pulse width beyond the intrinsic decay time of the scintillator crystal, which improves gamma rejection based on output pulse shape.
    - Diode dimensions: the dark noise is proportional to the diode active area, while the signal is delivered by the WLS fibers; the diode area should therefore ideally be only minimally larger than the fiber bundle area.
    - Low dark noise: it is desirable to minimize the dark noise during the pulse integration period, so as to minimize the background for pulse shape discrimination.
    - Photon detection efficiency (PDE): increasing the SiPM PDE enhances light collection, which raises the likelihood of detecting neutron events with low light production and presents a cleaner raw signal for pulse shape discrimination.
    We will present the SiPM optimization process and studies of dark noise and of gamma and neutron sensitivity as a function of bias voltage and operating temperature, which have enabled us to optimize the detector sensitivity and gamma rejection. Meeting the gamma rejection goal requires discriminating the light signature of a neutron event from that of noise. In addition, there is a huge variation in the number of light photons reaching the WLS fibers for different neutron events, caused by the heavy ions' energy losses prior to ionizing the ZnS(Ag) and by the high light attenuation of the scintillation mixture. This variation in the light signal, together with the long decay time of the ZnS(Ag) (tens of microseconds), can cause double counting of the same neutron event in the case of a high-light-output signature, or prevent the detection of a neutron event with a low sequential light output. We will present the algorithm developed for the 6LiF:ZnS(Ag) sensor readout and the results achieved by off-line analysis in Matlab software, which successfully achieved high gamma rejection together with sensitive and accurate neutron event detection. (authors)
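    The pulse-shape discrimination described above can be sketched with a simple charge-comparison statistic on synthetic pulses; the decay constants, noise level and integration split below are illustrative assumptions, not the sensor's actual readout algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 40.0, 0.1)             # time axis, microseconds

def pulse(tau, amp=1.0):
    """Synthetic single-exponential scintillation pulse plus noise."""
    return amp * np.exp(-t / tau) + 0.005 * rng.normal(size=t.size)

def tail_fraction(p, split=2.0):
    """Charge comparison: fraction of total charge in the pulse tail."""
    return p[t >= split].sum() / p.sum()

# Illustrative decay constants: slow ZnS(Ag) neutron light (~10 us)
# versus a fast gamma/dark-noise pulse (~0.2 us).
neutron_psd = tail_fraction(pulse(tau=10.0))
gamma_psd = tail_fraction(pulse(tau=0.2))
# A threshold on the tail fraction separates the two populations.
```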

  5. Improving and Assessing Planet Sensitivity of the GPI Exoplanet Survey with a Forward Model Matched Filter

    NASA Astrophysics Data System (ADS)

    Ruffio, Jean-Baptiste; Macintosh, Bruce; Wang, Jason J.; Pueyo, Laurent; Nielsen, Eric L.; De Rosa, Robert J.; Czekala, Ian; Marley, Mark S.; Arriaga, Pauline; Bailey, Vanessa P.; Barman, Travis; Bulger, Joanna; Chilcote, Jeffrey; Cotten, Tara; Doyon, Rene; Duchêne, Gaspard; Fitzgerald, Michael P.; Follette, Katherine B.; Gerard, Benjamin L.; Goodsell, Stephen J.; Graham, James R.; Greenbaum, Alexandra Z.; Hibon, Pascale; Hung, Li-Wei; Ingraham, Patrick; Kalas, Paul; Konopacky, Quinn; Larkin, James E.; Maire, Jérôme; Marchis, Franck; Marois, Christian; Metchev, Stanimir; Millar-Blanchaer, Maxwell A.; Morzinski, Katie M.; Oppenheimer, Rebecca; Palmer, David; Patience, Jennifer; Perrin, Marshall; Poyneer, Lisa; Rajan, Abhijith; Rameau, Julien; Rantakyrö, Fredrik T.; Savransky, Dmitry; Schneider, Adam C.; Sivaramakrishnan, Anand; Song, Inseok; Soummer, Remi; Thomas, Sandrine; Wallace, J. Kent; Ward-Duong, Kimberly; Wiktorowicz, Sloane; Wolff, Schuyler

    2017-06-01

    We present a new matched-filter algorithm for direct detection of point sources in the immediate vicinity of bright stars. The stellar point-spread function (PSF) is first subtracted using a Karhunen-Loève image processing (KLIP) algorithm with angular and spectral differential imaging (ADI and SDI). The KLIP-induced distortion of the astrophysical signal is included in the matched-filter template by computing a forward model of the PSF at every position in the image. To optimize the performance of the algorithm, we conduct extensive planet injection and recovery tests and tune the exoplanet spectral templates and KLIP reduction aggressiveness to maximize the signal-to-noise ratio (S/N) of the recovered planets. We show that only two spectral templates are necessary to recover any young Jovian exoplanet with minimal S/N loss. We also developed a complete pipeline for the automated detection of point-source candidates, the calculation of receiver operating characteristics (ROC), contrast curves based on false positives, and completeness contours. We process in a uniform manner more than 330 data sets from the Gemini Planet Imager Exoplanet Survey and assess GPI's typical sensitivity as a function of the star and the hypothetical companion spectral type. This work allows, for the first time, a comparison of different detection algorithms at survey scale, accounting for both planet completeness and false-positive rate. We show that the new forward-model matched filter allows the detection of 50% fainter objects than a conventional cross-correlation technique with a Gaussian PSF template for the same false-positive rate.
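    A minimal one-dimensional sketch of the matched-filter idea (not the paper's forward-model pipeline; the PSF width, source position and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(200)

# Known PSF model (unit-norm Gaussian) and a faint source at pixel 120
# buried in white noise.
template = np.exp(-0.5 * ((x - 100) / 3.0) ** 2)
template /= np.linalg.norm(template)
data = 0.8 * np.roll(template, 20) + 0.05 * rng.normal(size=x.size)

# Matched filter: cross-correlate the data with the template; for white
# noise this maximizes S/N, and the peak estimates the source position.
mf = np.correlate(data, template, mode="same")
detected = int(np.argmax(mf))             # near pixel 120
```

    The paper's key refinement is that the template is not the raw PSF but a forward model of the PSF after KLIP distortion, computed at every image position.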

  6. An accuracy aware low power wireless EEG unit with information content based adaptive data compression.

    PubMed

    Tolbert, Jeremy R; Kabali, Pratik; Brar, Simeranjit; Mukhopadhyay, Saibal

    2009-01-01

    We present a digital system for adaptive data compression for low-power wireless transmission of electroencephalography (EEG) data. The proposed system acts as a base-band processor between the EEG analog-to-digital front-end and the RF transceiver. It performs a real-time accuracy-energy trade-off for multi-channel EEG signal transmission by controlling the volume of transmitted data. We propose a multi-core digital signal processor for on-chip processing of EEG signals, which detects the signal information content of each channel and performs real-time adaptive compression. Our analysis shows that the proposed approach can provide significant savings in transmitter power with minimal impact on the overall signal accuracy.
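    A hedged sketch of the kind of information-content-driven rate adaptation described above; the variance proxy, threshold and downsampling policy are invented for illustration, not the paper's controller.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 4-channel EEG frame: two active channels, two near-flat.
fs, n = 256, 256
t = np.arange(n) / fs
ch = np.stack([
    np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=n),  # alpha-band
    np.sin(2 * np.pi * 20 * t) + 0.1 * rng.normal(size=n),  # beta-band
    0.05 * rng.normal(size=n),                              # low information
    0.05 * rng.normal(size=n),                              # low information
])

# Information-content proxy: per-channel variance. Channels below the
# threshold are transmitted at a reduced sample rate.
var = ch.var(axis=1)
keep_full = var > 0.1
rates = np.where(keep_full, 1.0, 0.25)        # fraction of samples sent
payload = [c[::int(1 / r)] for c, r in zip(ch, rates)]
saved = 1 - sum(len(p) for p in payload) / ch.size   # transmit savings
```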

  7. New trends in logic synthesis for both digital designing and data processing

    NASA Astrophysics Data System (ADS)

    Borowik, Grzegorz; Łuba, Tadeusz; Poźniak, Krzysztof

    2016-09-01

    FPGA devices are equipped with memory-based structures. These memories act as very large logic cells in which the number of inputs equals the number of address lines. At the same time, there is huge demand in the Internet of Things market for devices implementing virtual routers, intrusion detection systems, and the like, where such memories are crucial for realizing pattern-matching circuits, IP address tables, and others. Unfortunately, existing CAD tools are not well suited to exploiting the capabilities that such large memory blocks offer, due to the lack of appropriate synthesis procedures. This paper presents methods useful for memory-based implementations: minimization of the number of input variables and functional decomposition.
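    Input-variable minimization can be illustrated by detecting which inputs a Boolean function actually depends on: a function with k essential inputs fits in a memory block with only k address lines. This is a generic truth-table sketch (the example function is hypothetical), not the paper's synthesis procedure.

```python
from itertools import product

def essential_inputs(f, n):
    """Indices of inputs that Boolean function f (n-tuple of 0/1 -> 0/1)
    actually depends on; the rest can be dropped from the address lines."""
    essential = set()
    for i in range(n):
        # Input i is essential iff flipping it changes f for some input.
        for bits in product((0, 1), repeat=n):
            flipped = list(bits)
            flipped[i] ^= 1
            if f(tuple(bits)) != f(tuple(flipped)):
                essential.add(i)
                break
    return essential

# Hypothetical example: f reduces to x0 OR x2 and ignores x1, so a
# memory-based implementation needs only two address lines.
f = lambda x: (x[0] & x[2]) | (x[0] ^ x[2])
ess = essential_inputs(f, 3)              # -> {0, 2}
```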

  8. Sun Safe Mode Controller Design for LADEE

    NASA Technical Reports Server (NTRS)

    Fusco, Jesse C.; Swei, Sean S. M.; Nakamura, Robert H.

    2015-01-01

    This paper presents the development of sun safe controllers designed to keep the spacecraft power-positive and thermally balanced in the event an anomaly is detected. Employed by NASA's Lunar Atmosphere and Dust Environment Explorer (LADEE), the controllers utilize the measured sun vector and the spacecraft body rates for feedback control. To improve the accuracy of sun vector estimation, a least-squares minimization approach is applied to process the sensor data, which proved to be effective and accurate. To validate the controllers, the LADEE spacecraft model engaging the sun safe mode was first simulated and then compared with actual LADEE orbital flight data. The results demonstrated the applicability of the proposed sun safe controllers.
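    A minimal sketch of least-squares sun-vector estimation from multiple sun-sensor readings; the sensor geometry, noise level, and cosine measurement model below are invented for illustration, and the flight algorithm is more involved.

```python
import numpy as np

rng = np.random.default_rng(0)

# True sun direction (unit vector) and eight sun-sensor boresight
# normals; each sensor is modeled as reading the cosine of its angle
# to the sun, plus noise.
s_true = np.array([0.6, 0.48, 0.64])                 # unit norm
normals = rng.normal(size=(8, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
b = normals @ s_true + 0.01 * rng.normal(size=8)     # noisy readings

# Least-squares estimate of the sun vector, then renormalize.
s_hat, *_ = np.linalg.lstsq(normals, b, rcond=None)
s_hat /= np.linalg.norm(s_hat)
err_deg = np.degrees(np.arccos(np.clip(s_hat @ s_true, -1.0, 1.0)))
```

    With more sensors than unknowns, the overdetermined solve averages out individual sensor noise, which is the accuracy benefit the record refers to.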

  9. Principle of minimal work fluctuations.

    PubMed

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality 〈e^(-βW)〉 = e^(-βΔF), a change in the fluctuations of e^(-βW) may impact how rapidly the statistical average of e^(-βW) converges towards the theoretical value e^(-βΔF), where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in the quantum and classical adiabatic theorems yield the minimal fluctuations in e^(-βW). In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy level crossing (or an assisted adiabatic process reaching the same final states as a conventional adiabatic process) yields the minimal fluctuations in e^(-βW), where W is the quantum work defined by two energy measurements, at the beginning and at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process likewise yields the minimal fluctuations in e^(-βW). Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
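    A quick numerical check of the Jarzynski equality, and of why smaller work fluctuations speed convergence: for a Gaussian work distribution W ~ N(μ, σ²) the equality gives ΔF = μ - βσ²/2 exactly. The parameters are illustrative, not tied to the Landau-Zener example in the record.

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 1.0

# For Gaussian work W ~ N(mu, sigma^2) the Jarzynski average is exact:
# <e^(-beta W)> = e^(-beta mu + beta^2 sigma^2 / 2), hence
# Delta F = mu - beta * sigma^2 / 2.
mu, sigma = 2.0, 0.5
W = rng.normal(mu, sigma, size=200_000)
dF_est = -np.log(np.mean(np.exp(-beta * W))) / beta
dF_exact = mu - beta * sigma ** 2 / 2     # 1.875 here

# Smaller fluctuations in e^(-beta W) (smaller sigma, as in an adiabatic
# protocol) make the sample average converge faster to e^(-beta dF).
```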

  10. Hyperspectral Fluorescence and Reflectance Imaging Instrument

    NASA Technical Reports Server (NTRS)

    Ryan, Robert E.; O'Neal, S. Duane; Lanoue, Mark; Russell, Jeffrey

    2008-01-01

    The system is a single hyperspectral imaging instrument that has the unique capability to acquire both fluorescence and reflectance high-spatial-resolution data that is inherently spatially and spectrally registered. Potential uses of this instrument include plant stress monitoring, counterfeit document detection, biomedical imaging, forensic imaging, and general materials identification. Until now, reflectance and fluorescence spectral imaging have been performed by separate instruments. Neither a reflectance spectral image nor a fluorescence spectral image alone yields as much information about a target surface as does a combination of the two modalities. Before this system was developed, to benefit from this combination, analysts needed to perform time-consuming post-processing efforts to co-register the reflective and fluorescence information. With this instrument, the inherent spatial and spectral registration of the reflectance and fluorescence images minimizes the need for this post-processing step. The main challenge for this technology is to detect the fluorescence signal in the presence of a much stronger reflectance signal. To meet this challenge, the instrument modulates artificial light sources from ultraviolet through the visible to the near-infrared part of the spectrum; in this way, both the reflective and fluorescence signals can be measured through differencing processes to optimize fluorescence and reflectance spectra as needed. The main functional components of the instrument are a hyperspectral imager, an illumination system, and an image-plane scanner. The hyperspectral imager is a one-dimensional (line) imaging spectrometer that includes a spectrally dispersive element and a two-dimensional focal plane detector array. The spectral range of the current imaging spectrometer is between 400 to 1,000 nm, and the wavelength resolution is approximately 3 nm. 
The illumination system consists of narrowband blue, ultraviolet, and other discrete wavelength light-emitting-diode (LED) sources and white-light LED sources designed to produce consistently spatially stable light. White LEDs provide illumination for the measurement of reflectance spectra, while narrowband blue and UV LEDs are used to excite fluorescence. Each spectral type of LED can be turned on or off depending on the specific remote-sensing process being performed. Uniformity of illumination is achieved by using an array of LEDs and/or an integrating sphere or other diffusing surface. The image plane scanner uses a fore optic with a field of view large enough to provide an entire scan line on the image plane. It builds up a two-dimensional image in pushbroom fashion as the target is scanned across the image plane either by moving the object or moving the fore optic. For fluorescence detection, spectral filtering of a narrowband light illumination source is sometimes necessary to minimize the interference of the source spectrum wings with the fluorescence signal. Spectral filtering is achieved with optical interference filters and absorption glasses. This dual spectral imaging capability will enable the optimization of reflective, fluorescence, and fused datasets as well as a cost-effective design for multispectral imaging solutions. This system has been used in plant stress detection studies and in currency analysis.

  11. Algorithms for Monitoring Heart Rate and Respiratory Rate From the Video of a User’s Face

    PubMed Central

    Sanyal, Shourjya

    2018-01-01

    Smartphone cameras can measure heart rate (HR) by detecting pulsatile photoplethysmographic (iPPG) signals from post-processing the video of a subject’s face. The iPPG signal is often derived from variations in the intensity of the green channel, as shown by Poh et al. and Verkruysse et al. In this pilot study, we introduce a novel iPPG method that measures variations in the color of reflected light, i.e., Hue, and can therefore measure both HR and respiratory rate (RR) from the video of a subject’s face. The study was performed on 25 healthy individuals (ages 20–30; 15 males and 10 females; skin color Fitzpatrick scale 1–6). For each subject we took two 20-second videos of the subject’s face with minimal movement, one with flash ON and one with flash OFF. While recording the videos we simultaneously measured HR using a Biosync B-50DL finger heart rate monitor, and RR by self-reporting. This paper shows that our proposed approach of measuring iPPG using Hue (range 0–0.1) gives more accurate readings than the green channel.
HR/Hue (range 0–0.1) (r = 0.9201, p-value = 4.1617, and RMSE = 0.8887) is more accurate compared with HR/Green (r = 0.4916, p-value = 11.60172, and RMSE = 0.9068). 
RR/Hue (range 0–0.1) (r = 0.6575, p-value = 0.2885, and RMSE = 3.8884) is more accurate compared with RR/Green (r = 0.3352, p-value = 0.5608, and RMSE = 5.6885). We hope that this hardware-agnostic approach to detecting vital signs will have a huge potential impact in telemedicine and can be used to tackle challenges such as continuous non-contact monitoring of neonatal and elderly patients. An implementation of the algorithm can be found at https://pulser.thinkbiosolution.com
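
    The Hue-based pipeline the abstract describes (a per-frame color statistic, then a dominant-frequency estimate in a physiological band) can be sketched as follows. This is a minimal illustration, not the authors' published implementation: the function name, the 0.7–3.0 Hz band, and the FFT-peak estimator are assumptions.

```python
import numpy as np

def estimate_hr_bpm(hue_trace, fps, band_hz=(0.7, 3.0)):
    """Estimate heart rate in BPM from a per-frame mean-Hue trace.

    hue_trace: 1-D array, one mean Hue value per video frame.
    fps:       video frame rate in Hz.
    band_hz:   plausible pulse band (0.7-3.0 Hz = 42-180 BPM, an assumption).
    """
    x = np.asarray(hue_trace, dtype=float)
    x = x - x.mean()                              # remove the DC component
    mags = np.abs(np.fft.rfft(x))                 # one-sided magnitude spectrum
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)  # frequency of each bin
    band = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
    peak = freqs[band][np.argmax(mags[band])]     # dominant pulsatile frequency
    return 60.0 * peak                            # Hz -> beats per minute
```

    The same spectral-peak idea applied in a lower band (roughly 0.1–0.5 Hz) would yield RR; the abstract does not specify the authors' exact estimator.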

  12. A graphene solution to conductivity mismatch: spin injection from ferromagnetic metal/graphene tunnel contacts into silicon

    NASA Astrophysics Data System (ADS)

    van't Erve, Olaf

    2014-03-01

New paradigms for spin-based devices, such as spin-FETs and reconfigurable logic, have been proposed and modeled. These devices rely on electron spin being injected, transported, manipulated, and detected in a semiconductor channel. This work is the first demonstration of how a single layer of graphene can be used as a low-resistance tunnel-barrier solution for electrical spin injection into silicon at room temperature. We show that a FM metal / monolayer graphene contact serves as a spin-polarized tunnel barrier that successfully circumvents the classic metal/semiconductor conductivity mismatch issue for electrical spin injection. We demonstrate electrical injection and detection of spin accumulation in Si above room temperature, and show that the corresponding spin lifetimes correlate with the Si carrier concentration, confirming that the measured spin accumulation occurs in the Si and not in interface trap states. An ideal tunnel barrier should exhibit several key material characteristics: a uniform and planar habit with well-controlled thickness, minimal defect / trapped-charge density, a low resistance-area product for minimal power consumption, and compatibility with both the FM metal and the semiconductor, ensuring minimal diffusion to/from the surrounding materials at temperatures required for device processing. Graphene offers all of the above while preserving spin-injection properties, making it a compelling solution to the conductivity mismatch for spin injection into Si. Although graphene is highly conductive in plane, it exhibits poor conductivity perpendicular to the plane. Its sp2 bonding results in a highly uniform, defect-free layer that is chemically inert, thermally robust, and essentially impervious to diffusion. The use of a single monolayer of graphene at the Si interface provides a much lower RA product than any oxide film thick enough to prevent pinholes (1 nm). 
Our results identify a new route to low resistance-area product spin-polarized contacts, a crucial requirement enabling future semiconductor spintronic devices, which rely upon two-terminal magnetoresistance, including spin-based transistors, logic and memory.

  13. Enzyme catalysis with small ionic liquid quantities.

    PubMed

    Fischer, Fabian; Mutschler, Julien; Zufferey, Daniel

    2011-04-01

Enzyme catalysis with minimal ionic liquid quantities improves reaction rates and stereoselectivity and enables solvent-free processing. In particular, the widely used lipases combine well with many ionic liquids. Demonstrated applications are racemate separation, esterification, and glycerolysis. Minimal-solvent processing is also an alternative to sluggish solvent-free catalysis. The method allows simplified downstream processing, as only traces of ionic liquids have to be removed.

  14. Ultra-processed food purchases in Norway: a quantitative study on a representative sample of food retailers.

    PubMed

    Solberg, Siri Løvsjø; Terragni, Laura; Granheim, Sabrina Ionata

    2016-08-01

To identify the use of ultra-processed foods - vectors of salt, sugar and fats - in the Norwegian diet through an assessment of food sales. Sales data from a representative sample of food retailers in Norway, collected in September 2005 (n 150) and September 2013 (n 170), were analysed. Data consisted of barcode scans of individual food item purchases, reporting type of food, price, geographical region and retail concept. Foods were categorized as minimally processed, culinary ingredients, processed products and ultra-processed. Indicators were share of purchases and share of expenditure on food categories. Six geographical regions in Norway. The barcode data included 296 121 observations in 2005 and 501 938 observations in 2013. Ultra-processed products represented 58·8 % of purchases and 48·8 % of expenditure in 2013. Minimally processed foods accounted for 17·2 % of purchases and 33·0 % of expenditure. Every third purchase was a sweet ultra-processed product. Food sales changed marginally in favour of minimally processed foods and in disfavour of processed products between 2005 and 2013 (χ²(3) = 203 195, P < 0·001; Cramer's V = 0·017, P < 0·001). Ultra-processed products accounted for the majority of food sales in Norway, indicating a high consumption of such products. This could be contributing to rising rates of overweight, obesity and non-communicable diseases in the country, as findings from other countries indicate. Policy measures should aim at decreasing consumption of ultra-processed products and facilitating access (including economic) to minimally processed foods.
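
    The study's two indicators (share of purchases and share of expenditure per food category) can be computed directly from barcode-scan records. This is a sketch for illustration; the (category, price) tuple layout and the category labels below are assumptions, not the study's data format.

```python
from collections import Counter

def purchase_shares(purchases):
    """Compute share of purchase counts and share of expenditure per category.

    purchases: iterable of (category, price) pairs, one per scanned item
               (layout assumed for illustration).
    """
    counts, spend = Counter(), Counter()
    for category, price in purchases:
        counts[category] += 1          # one scan = one purchase
        spend[category] += price       # expenditure in local currency
    n, total = sum(counts.values()), sum(spend.values())
    return ({c: counts[c] / n for c in counts},
            {c: spend[c] / total for c in spend})
```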

  15. Perceptual Repetition Blindness Effects

    NASA Technical Reports Server (NTRS)

    Hochhaus, Larry; Johnston, James C.; Null, Cynthia H. (Technical Monitor)

    1994-01-01

The phenomenon of repetition blindness (RB) may reveal a new limitation on human perceptual processing. Recently, however, researchers have attributed RB to post-perceptual processes such as memory retrieval and/or reporting biases. The standard rapid serial visual presentation (RSVP) paradigm used in most RB studies is, indeed, open to such objections. Here we investigate RB using a "single-frame" paradigm introduced by Johnston and Hale (1984) in which memory demands are minimal. Subjects made only a single judgement about whether one masked target word was the same as or different from a post-target probe. Confidence ratings permitted use of signal detection methods to assess sensitivity and bias effects. In the critical condition for RB, a precue of the post-target word was provided prior to the target stimulus (identity precue), so that the required judgement amounted to whether the target did or did not repeat the precue word. In control treatments, the precue was either an unrelated word or a dummy.

  16. Fluorescence correlation spectroscopy: novel variations of an established technique.

    PubMed

    Haustein, Elke; Schwille, Petra

    2007-01-01

    Fluorescence correlation spectroscopy (FCS) is one of the major biophysical techniques used for unraveling molecular interactions in vitro and in vivo. It allows minimally invasive study of dynamic processes in biological specimens with extremely high temporal and spatial resolution. By recording and correlating the fluorescence fluctuations of single labeled molecules through the exciting laser beam, FCS gives information on molecular mobility and photophysical and photochemical reactions. By using dual-color fluorescence cross-correlation, highly specific binding studies can be performed. These have been extended to four reaction partners accessible by multicolor applications. Alternative detection schemes shift accessible time frames to slower processes (e.g., scanning FCS) or higher concentrations (e.g., TIR-FCS). Despite its long tradition, FCS is by no means dated. Rather, it has proven to be a highly versatile technique that can easily be adapted to solve specific biological questions, and it continues to find exciting applications in biology and medicine.

  17. Wavelength-normalized spectroscopic analysis of Staphylococcus aureus and Pseudomonas aeruginosa growth rates.

    PubMed

    McBirney, Samantha E; Trinh, Kristy; Wong-Beringer, Annie; Armani, Andrea M

    2016-10-01

    Optical density (OD) measurements are the standard approach used in microbiology for characterizing bacteria concentrations in culture media. OD is based on measuring the optical absorbance of a sample at a single wavelength, and any error will propagate through all calculations, leading to reproducibility issues. Here, we use the conventional OD technique to measure the growth rates of two different species of bacteria, Pseudomonas aeruginosa and Staphylococcus aureus. The same samples are also analyzed over the entire UV-Vis wavelength spectrum, allowing a distinctly different strategy for data analysis to be performed. Specifically, instead of only analyzing a single wavelength, a multi-wavelength normalization process is implemented. When the OD method is used, the detected signal does not follow the log growth curve. In contrast, the multi-wavelength normalization process minimizes the impact of bacteria byproducts and environmental noise on the signal, thereby accurately quantifying growth rates with high fidelity at low concentrations.
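
    One plausible reading of the multi-wavelength normalization idea, assuming the dominant nuisance term (byproducts, stray light) is approximately wavelength-flat, is to subtract each spectrum's wavelength average before reading off the analysis wavelength. This is a hedged sketch of that reading; the paper's actual normalization procedure may differ.

```python
import numpy as np

def background_free_od(spectra, target_idx):
    """Background-suppressed growth signal from full UV-Vis spectra.

    spectra:    2-D array, shape (n_times, n_wavelengths), absorbance.
    target_idx: column index of the analysis wavelength (assumption).

    If A(lambda, t) = c(t) * sigma(lambda) + b(t) with a wavelength-flat
    background b(t), subtracting each spectrum's wavelength average cancels
    b(t) exactly, leaving a signal proportional to concentration c(t).
    """
    spectra = np.asarray(spectra, dtype=float)
    centered = spectra - spectra.mean(axis=1, keepdims=True)
    return centered[:, target_idx]
```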

  18. Design of Knowledge Management System for Diabetic Complication Diseases

    NASA Astrophysics Data System (ADS)

    Fiarni, Cut

    2017-01-01

This paper examines how to develop a model of a Knowledge Management System (KMS) for diabetes complication diseases. People with diabetes have a higher risk of developing a series of serious health problems. Each patient has a different condition that could lead to different diseases and health problems. But with the right information, a patient could have early detection, so the health risk could be minimized or avoided. Hence, the objective of this research is to propose a conceptual framework that integrates a social network model, Knowledge Management activities, and case-based reasoning (CBR) for designing such a diabetes health and complication-disease KMS. The framework indicates that the critical Knowledge Management activities lie in the process of finding similar cases and in the index table for the algorithm to fit the framework to the social media. With this framework, KMS developers can work with healthcare providers to easily identify the IT suitable for the CBR process when developing a diabetes KMS.

  19. Occurrence of Regulated Mycotoxins and Other Microbial Metabolites in Dried Cassava Products from Nigeria.

    PubMed

    Abass, Adebayo B; Awoyale, Wasiu; Sulyok, Michael; Alamu, Emmanuel O

    2017-06-29

    Dried cassava products are perceived as one of the potential sources of mycotoxin ingestion in human foods. Processing either contributes to the reduction of toxins or further exposes products to contamination by microorganisms that release metabolic toxins into the products. Thus, the prevalence of microbial metabolites in 373 processed cassava products was investigated in Nigeria. With the use of liquid chromatography tandem-mass spectrometry (LC-MS/MS) for the constituent analysis, a few major mycotoxins (aflatoxin B₁ and G₁, fumonisin B₁ and B₂, and zearalenone) regulated in food crops by the Commission of the European Union were found at concentrations which are toxicologically acceptable in many other crops. Some bioactive compounds were detected at low concentrations in the cassava products. Therefore, the exposure of cassava consumers in Nigeria to regulated mycotoxins was estimated to be minimal. The results provide useful information regarding the probable safety of cassava products in Nigeria.

  20. POLLUTION BALANCE: A NEW METHODOLOGY FOR MINIMIZING WASTE PRODUCTION IN MANUFACTURING PROCESSES.

    EPA Science Inventory

A new methodology, based on a generic pollution balance equation, has been developed for minimizing waste production in manufacturing processes. A "pollution index," defined as the mass of waste produced per unit mass of a product, has been introduced to provide a quantitative meas...
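
    The pollution index defined in the abstract (mass of waste produced per unit mass of product) is straightforward to express. The snippet below is a direct transcription of that definition for illustration, not the EPA methodology itself.

```python
def pollution_index(waste_mass_kg, product_mass_kg):
    """Pollution index: mass of waste produced per unit mass of product
    (dimensionless, kg waste / kg product). Lower is cleaner."""
    if product_mass_kg <= 0:
        raise ValueError("product mass must be positive")
    return waste_mass_kg / product_mass_kg
```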

  1. 40 CFR 63.543 - What are my standards for process vents?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... develop and follow standard operating procedures designed to minimize emissions of total hydrocarbon for... manufacturer's recommended procedures, if available, and the standard operating procedures designed to minimize... 40 Protection of Environment 10 2014-07-01 2014-07-01 false What are my standards for process...

  2. Technique minimizes the effects of dropouts on telemetry records

    NASA Technical Reports Server (NTRS)

    Anderson, T. O.; Hurd, W. J.

    1972-01-01

    Recorder deficiencies are minimized by using two-channel system to prepare two tapes, each having noise, wow and flutter, and dropout characteristics of channel on which it was made. Processing tapes by computer and combining signals from two channels produce single tape free of dropouts caused by recording process.

  3. Automatic Assessment of Acquisition and Transmission Losses in Indian Remote Sensing Satellite Data

    NASA Astrophysics Data System (ADS)

    Roy, D.; Purna Kumari, B.; Manju Sarma, M.; Aparna, N.; Gopal Krishna, B.

    2016-06-01

The quality of remote sensing data is an important parameter that defines the extent of its usability in various applications. The data from remote sensing satellites is received as raw data frames at the ground station. This data may be corrupted by losses due to interference during data transmission, data acquisition, and sensor anomalies. Thus it is important to assess the quality of the raw data before product generation, for early anomaly detection, faster corrective actions, and minimal product rejection. Manual screening of raw images is a time-consuming process and not very accurate. In this paper, an automated process for identifying and quantifying losses in raw data, such as pixel dropout, line loss, and data loss due to sensor anomalies, is discussed. Quality assessment of raw scenes based on these losses is also explained. This process is introduced in the data pre-processing stage and gives users crucial data-quality information at the time of browsing data for product ordering. It has also improved the product generation workflow by enabling faster and more accurate quality estimation.
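
    A minimal sketch of the kind of loss screening described, assuming drop-outs appear as a fixed fill value in the raw frame (the real system's encodings, thresholds, and quality scoring are not given in the abstract):

```python
import numpy as np

def assess_raw_frame(frame, dropout_value=0):
    """Quantify pixel drop-outs and line losses in one raw frame.

    frame:         2-D array of raw sensor counts.
    dropout_value: fill value assumed to mark lost samples (an assumption).

    A line whose pixels are all drop-outs counts as line loss; remaining
    marked pixels count as isolated pixel drop-outs.
    """
    frame = np.asarray(frame)
    dropped = frame == dropout_value
    lost_lines = int(np.all(dropped, axis=1).sum())
    pixel_dropouts = int(dropped.sum()) - lost_lines * frame.shape[1]
    return {"lost_lines": lost_lines,
            "pixel_dropouts": pixel_dropouts,
            "loss_fraction": float(dropped.mean())}
```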

  4. Enhanced visualization of the bile duct via parallel white light and indocyanine green fluorescence laparoscopic imaging

    NASA Astrophysics Data System (ADS)

    Demos, Stavros G.; Urayama, Shiro

    2014-03-01

Despite best efforts, bile duct injury during laparoscopic cholecystectomy remains a major potential complication. A precise method for detecting the extrahepatic bile duct during laparoscopic procedures would minimize the risk of injury. Towards this goal, we have developed compact imaging instrumentation designed to enable simultaneous acquisition of conventional white-light color and NIR fluorescence endoscopic/laparoscopic imaging using ICG as a contrast agent. The capabilities of this system, which offers optimized sensitivity and functionality, are demonstrated for detection of the bile duct in an animal model. This design could also provide a low-cost, real-time surgical navigation capability to enhance the efficacy of a variety of other image-guided minimally invasive procedures.

  5. Searching for Physics Beyond the Standard Model and Beyond

    NASA Astrophysics Data System (ADS)

    Abdullah, Mohammad

    The hierarchy problem, convolved with the various known puzzles in particle physics, grants us a great outlook of new physics soon to be discovered. We present multiple approaches to searching for physics beyond the standard model. First, two models with a minimal amount of theoretical guidance are analyzed using existing or simulated LHC data. Then, an extension of the Minimal Supersymmetric Standard Model (MSSM) is studied with an emphasis on the cosmological implications as well as the current and future sensitivity of colliders, direct detection and indirect detection experiments. Finally, a more complete model of the MSSM is presented through which we attempt to resolve tension with observations within the context of gauge mediated supersymmetry breaking.

  6. Diagnosis of Plasma Cell Dyscrasias and Monitoring of Minimal Residual Disease by Multiparametric Flow Cytometry

    PubMed Central

    Soh, Kah Teong; Tario, Joseph D.; Wallace, Paul K.

    2018-01-01

Plasma cell dyscrasia (PCD) is a heterogeneous disease that has seen a tremendous change in outcomes due to improved therapies. Over the last few decades, multiparametric flow cytometry has played an important role in the detection and monitoring of PCDs. Flow cytometry is a high-sensitivity assay for early detection of minimal residual disease (MRD) that correlates well with progression-free survival and overall survival. Before flow cytometry can be effectively implemented in the clinical setting, sample preparation, panel configuration, analysis, and gating strategies must be optimized to ensure accurate results. Current consensus methods and reporting guidelines for MRD testing are discussed. PMID:29128071

  7. Tire Force Estimation using a Proportional Integral Observer

    NASA Astrophysics Data System (ADS)

    Farhat, Ahmad; Koenig, Damien; Hernandez-Alcantara, Diana; Morales-Menendez, Ruben

    2017-01-01

This paper addresses a method for detecting critical stability situations in the lateral vehicle dynamics by estimating the nonlinear part of the tire forces. These forces indicate the road-holding performance of the vehicle. The estimation method is based on a robust fault detection and estimation approach that minimizes the sensitivity of the residual to disturbances and uncertainties. It consists of the design of a Proportional Integral Observer (PIO) that minimizes the well-known H∞ norm for worst-case uncertainty and disturbance attenuation while meeting a transient-response specification. This multi-objective problem is formulated as a Linear Matrix Inequality (LMI) feasibility problem in which a cost function subject to LMI constraints is minimized. The approach is employed to generate a set of switched robust observers for uncertain switched systems, where convergence of the observer is ensured using a Multiple Lyapunov Function (MLF). Since the forces to be estimated cannot be physically measured, a simulation scenario with CarSim™ is presented to illustrate the developed method.
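
    For readers unfamiliar with the structure named in the abstract, a Proportional Integral Observer has the standard generic form below; the symbols (state-space matrices A, B, C and gains K_P, K_I) are the usual ones and are shown here only as a textbook sketch, not the paper's specific design:

```latex
\begin{aligned}
\dot{\hat{x}} &= A\hat{x} + Bu + K_P\,(y - \hat{y}) + K_I\, z, \\
\dot{z}       &= y - \hat{y}, \qquad \hat{y} = C\hat{x},
\end{aligned}
```

    where the integral state $z$ accumulates the output estimation error. In the paper's setting, the gains $K_P$ and $K_I$ are chosen via the LMI feasibility problem so that the $H_\infty$ norm of the map from disturbances and uncertainties to the residual is bounded.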

  8. Recovery and characterization of proteins from pangas (Pangasius pangasius) processing waste obtained through pH shift processing.

    PubMed

    Surasani, Vijay Kumar Reddy; Kudre, Tanaji; Ballari, Rajashekhar V

    2018-04-01

A study was conducted to recover proteins from pangas (Pangasius pangasius) processing waste (fillet frames) using the pH shift method and to characterize the recovered isolates. pH 2.0 from the acidic range and pH 13.0 from the alkaline range gave maximum protein recovery (p < 0.05). During the recovery process, acidic pH (pH 2.0) was found to have minimal effect on the proteins, resulting in more stable isolates and strong protein gels. Alkaline pH (pH 13.0) caused protein denaturation, resulting in less stable proteins and a poor gel network. Both acidic and alkaline-aided processing caused significant (p < 0.05) reductions in total lipid, myoglobin, and pigment content, thereby resulting in whiter protein isolates and gels. The content of total essential amino acids increased during pH shift processing, indicating enrichment of essential amino acids. No microbial counts were detected in any of the isolates prepared using the acid and alkaline extraction methods. pH shift processing was found to be promising for utilizing fish processing waste to recover functional proteins, thereby reducing the supply-demand gap as well as pollution problems.

  9. Detection of entanglement with few local measurements

    NASA Astrophysics Data System (ADS)

    Gühne, O.; Hyllus, P.; Bruß, D.; Ekert, A.; Lewenstein, M.; Macchiavello, C.; Sanpera, A.

    2002-12-01

We introduce a general method for the experimental detection of entanglement by performing only a few local measurements, assuming some prior knowledge of the density matrix. The idea is based on the minimal decomposition of witness operators into a pseudomixture of local operators. We discuss an experimentally relevant case of two qubits, and show an example of how bound entanglement can be detected with few local measurements.
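
    The witness formalism behind the method is standard and can be stated in two lines: a witness $W$ has non-negative expectation on every separable state, so a measured negative expectation certifies entanglement, and measuring $W$ with local settings amounts to decomposing it into products of local operators (the paper's pseudomixture seeks the minimal such decomposition):

```latex
\operatorname{Tr}(W\rho) \ge 0 \ \ \text{for all separable } \rho,
\qquad
\operatorname{Tr}(W\rho) < 0 \ \Rightarrow\ \rho \ \text{entangled},
\qquad
W = \sum_{i} c_i\, A_i \otimes B_i .
```

    Each term $\langle A_i \otimes B_i \rangle$ is obtainable from local measurements on the two qubits, so the number of terms in the decomposition sets the experimental cost.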

  10. Differentiating retroperitoneal liposarcoma tumors with optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Lev, Dina; Baranov, Stepan A.; Carbajal, Esteban F.; Young, Eric D.; Pollock, Raphael E.; Larin, Kirill V.

    2011-03-01

Liposarcoma (LS) is a rare and heterogeneous group of malignant mesenchymal neoplasms exhibiting characteristics of adipocytic differentiation. Currently, radical surgical resection represents the most effective and most widely used therapy for patients with abdominal/retroperitoneal (A/RP) LS, but the presence of contiguous essential organs, such as the kidney, pancreas, spleen, adrenal glands, esophagus, or colon, as well as the frequent recurrence of LS in the A/RP, calls for enhanced surgical techniques to minimize resection and avoid LS recurrence. Difficulty in detecting the margins of these neoplasms, owing to their similarity to healthy fat tissue, accounts for the high recurrence of LS within the A/RP. At present, microscopic detection of margins is possible only by biopsy, and minimizing surgical resection of healthy tissue is challenging. In this presentation we demonstrate initial OCT results for the imaging and distinction of LS and normal human fat tissues and the clear detection of tumor boundaries.

  11. Validity, Responsiveness, Minimal Detectable Change, and Minimal Clinically Important Change of the Pediatric Motor Activity Log in Children with Cerebral Palsy

    ERIC Educational Resources Information Center

    Lin, Keh-chung; Chen, Hui-fang; Chen, Chia-ling; Wang, Tien-ni; Wu, Ching-yi; Hsieh, Yu-wei; Wu, Li-ling

    2012-01-01

    This study examined criterion-related validity and clinimetric properties of the Pediatric Motor Activity Log (PMAL) in children with cerebral palsy. Study participants were 41 children (age range: 28-113 months) and their parents. Criterion-related validity was evaluated by the associations between the PMAL and criterion measures at baseline and…

  12. CCD Detects Two Images In Quick Succession

    NASA Technical Reports Server (NTRS)

    Janesick, James R.; Collins, Andy

    1996-01-01

    Prototype special-purpose charge-coupled device (CCD) designed to detect two 1,024 x 1,024-pixel images in rapid succession. Readout performed slowly to minimize noise. CCD operated in synchronism with pulsed laser, stroboscope, or other pulsed source of light to form pairs of images of rapidly moving objects.

  13. DETECTION OF ARSENOSUGARS FROM KELP EXTRACTS VIA IC-ELECTROSPRAY IONIZATION-MS-MS AND IC MEMBRANE HYDRIDE GENERATION ICP-MS

    EPA Science Inventory

The selectivity and the ability to obtain structural information from detection schemes used in arsenic speciation research are growing analytical requirements, driven by the increasing number of arsenicals extracted from natural products and the need to minimize misidentification in...

  14. A scalable approach for tree segmentation within small-footprint airborne LiDAR data

    NASA Astrophysics Data System (ADS)

    Hamraz, Hamid; Contreras, Marco A.; Zhang, Jun

    2017-05-01

This paper presents a distributed approach that scales up to segment tree crowns within a LiDAR point cloud representing an arbitrarily large forested area. The approach uses a single-processor tree segmentation algorithm as a building block in order to process the data, delivered in the shape of tiles, in parallel. The distributed processing is performed in a master-slave manner, in which the master maintains the global map of the tiles and coordinates the slaves that segment tree crowns within and across the boundaries of the tiles. Trees lying across tile boundaries introduced a minimal bias in the number of detected trees, which was quantified and adjusted for. Theoretical and experimental analyses of the runtime of the approach revealed a near-linear speedup. The estimated number of trees categorized by crown class and the associated error margins, as well as the height distribution of the detected trees, aligned well with field estimates, verifying that the distributed approach works correctly. The approach enables providing information on individual tree locations and point-cloud segments for a forest-level area in a timely manner, which can be used to create detailed remotely sensed forest inventories. Although the approach was presented for tree segmentation within LiDAR point clouds, the idea can also be generalized to scale up the processing of other big spatial datasets.
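
    The master-slave tiling scheme can be caricatured in a few lines. Everything here is a stand-in: `tiles`, the `segment_tile` routine, and the thread pool are illustrative assumptions (a real deployment would use processes or MPI ranks across machines, plus the cross-boundary reconciliation step the paper describes):

```python
from concurrent.futures import ThreadPoolExecutor

def count_trees_over_tiles(tiles, segment_tile, max_workers=4):
    """Master-slave sketch: farm tiles out to workers, total per-tile counts.

    tiles:        list of point-cloud tiles (format assumed).
    segment_tile: single-processor crown-segmentation routine returning a
                  tree count for one tile (assumed interface).

    Threads keep the sketch self-contained; they play the role of the
    slaves, while this function plays the coordinating master.
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        counts = list(pool.map(segment_tile, tiles))
    return sum(counts)
```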

  15. Method for automatic localization of MR-visible markers using morphological image processing and conventional pulse sequences: feasibility for image-guided procedures.

    PubMed

    Busse, Harald; Trampel, Robert; Gründer, Wilfried; Moche, Michael; Kahn, Thomas

    2007-10-01

To evaluate the feasibility and accuracy of an automated method to determine the 3D position of MR-visible markers. Inductively coupled RF coils were imaged in a whole-body 1.5T scanner using the body coil, two conventional gradient echo sequences (FLASH and TrueFISP), and large imaging volumes of up to (300 mm)³. To minimize background signals, a flip angle of approximately 1 degree was used. Morphological 2D image processing in orthogonal scan planes was used to determine the 3D positions of a configuration of three fiducial markers (FMC). The accuracies of the marker positions and of the orientation of the plane defined by the FMC were evaluated at various distances r(M) from the isocenter. Fiducial marker detection with conventional equipment (pulse sequences, imaging coils) was very reliable and highly reproducible over a wide range of experimental conditions. For r(M)

  16. High throughput and quantitative approaches for measuring circadian rhythms in cyanobacteria using bioluminescence

    PubMed Central

    Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.

    2016-01-01

    The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451

  17. Immersion lithography defectivity analysis at DUV inspection wavelength

    NASA Astrophysics Data System (ADS)

    Golan, E.; Meshulach, D.; Raccah, N.; Yeo, J. Ho.; Dassa, O.; Brandl, S.; Schwarz, C.; Pierson, B.; Montgomery, W.

    2007-03-01

Significant effort has been directed in recent years towards the realization of immersion lithography at 193nm wavelength. Immersion lithography is likely a key enabling technology for the production of critical layers for 45nm and 32nm design rule (DR) devices. In spite of the significant progress in immersion lithography technology, several key technology issues remain, a critical one being immersion lithography process-induced defects. The benefits in optical resolution and depth of focus made possible by immersion lithography are well understood. Yet these benefits cannot come at the expense of increased defect counts and decreased production yield. Understanding the impact of immersion lithography process parameters on wafer defect formation and defect counts, together with the ability to monitor, control, and minimize defect counts down to acceptable levels, is imperative for successful introduction of immersion lithography for production of advanced DRs. In this report, we present experimental results of immersion lithography defectivity analysis focused on topcoat layer thickness parameters and resist bake temperatures. Wafers were exposed on the 1150i-α-immersion scanner and 1200B Scanner (ASML); defect inspection was performed using a DUV inspection tool (UVision™, Applied Materials). Higher sensitivity was demonstrated at DUV through detection of small defects not detected at visible wavelengths, indicating the potential high-sensitivity benefits of DUV inspection for this layer. The analysis indicates that certain types of defects are associated with different immersion process parameters. This type of analysis at DUV wavelengths would enable the optimization of immersion lithography processes, thus enabling the qualification of immersion processes for volume production.

  18. X-Ray Computed Tomography: The First Step in Mars Sample Return Processing

    NASA Technical Reports Server (NTRS)

    Welzenbach, L. C.; Fries, M. D.; Grady, M. M.; Greenwood, R. C.; McCubbin, F. M.; Zeigler, R. A.; Smith, C. L.; Steele, A.

    2017-01-01

The Mars 2020 rover mission will collect and cache samples from the martian surface for possible retrieval and subsequent return to Earth. If the samples are returned, that mission would likely present an opportunity to analyze returned Mars samples within a geologic context on Mars. In addition, it may provide definitive information about the existence of past or present life on Mars. Mars sample return presents unique challenges for the collection, containment, transport, curation, and processing of samples [1]. Foremost in the processing of returned samples are the closely paired considerations of life detection and Planetary Protection. In order to achieve Mars Sample Return (MSR) science goals, reliable analyses will depend on overcoming some challenging signal/noise issues, where sparse martian organic compounds must be reliably analyzed against the contamination background. While reliable analyses will depend on initial clean acquisition and robust documentation of all aspects of developing and managing the cache [2], there must also be a reliable sample handling and analysis procedure that accounts for a variety of materials which may or may not contain evidence of past or present martian life. A recent report [3] suggests that a defined set of measurements should be made to effectively inform both science and Planetary Protection, when applied in the context of two competing null hypotheses: 1) that there is no detectable life in the samples; or 2) that there is martian life in the samples. The defined measurements would follow a phased approach, accepted by the community, that preserves the bulk of the material while providing unambiguous science data that can be used and interpreted by various disciplines. Foremost is the concern that the initial steps ensure the pristine nature of the samples.
Preliminary, non-invasive techniques such as X-ray computed tomography (XCT) have been suggested as the first method to interrogate and characterize the cached samples without altering the materials [1,2]. A recent report [4] indicates that XCT may minimally alter samples for some techniques, and work is needed to quantify these effects, maximizing the science return from initial XCT analysis while minimizing its effects on the samples.

  19. Cost-effectiveness and budget impact analyses of a long-term hypertension detection and control program for stroke prevention.

    PubMed

    Yamagishi, Kazumasa; Sato, Shinichi; Kitamura, Akihiko; Kiyama, Masahiko; Okada, Takeo; Tanigawa, Takeshi; Ohira, Tetsuya; Imano, Hironori; Kondo, Masahide; Okubo, Ichiro; Ishikawa, Yoshinori; Shimamoto, Takashi; Iso, Hiroyasu

    2012-09-01

The nation-wide, community-based intensive hypertension detection and control program, as well as universal health insurance coverage, may well be contributing factors helping Japan rank near the top among countries with the longest life expectancy. We sought to examine the cost-effectiveness of such a community-based intervention program, as no evidence has been available on this issue. The hypertension detection and control program was initiated in 1963 in full intervention and minimal intervention communities in Akita, Japan. We performed comparative cost-effectiveness and budget-impact analyses for the period 1964-1987 of the costs of public health services and treatment of patients with hypertension and stroke on the one hand, and the incidence of stroke on the other, in the full intervention and minimal intervention communities. The program provided in the full intervention community became cost saving 13 years after its start, and it was also effective: the prevalence and incidence of stroke were consistently lower in the full intervention community than in the minimal intervention community throughout the same period. The incremental cost was minus 28,358 yen per capita over 24 years. The community-based intensive hypertension detection and control program was thus found to be both effective and cost saving. The national government's policy to support this program may have contributed in part to the substantial decline in stroke incidence and mortality, which was largely responsible for the increase in Japanese life expectancy.

  20. Immunohistochemical identification of Propionibacterium acnes in granuloma and inflammatory cells of myocardial tissues obtained from cardiac sarcoidosis patients.

    PubMed

    Asakawa, Naoya; Uchida, Keisuke; Sakakibara, Mamoru; Omote, Kazunori; Noguchi, Keiji; Tokuda, Yusuke; Kamiya, Kiwamu; Hatanaka, Kanako C; Matsuno, Yoshihiro; Yamada, Shiro; Asakawa, Kyoko; Fukasawa, Yuichiro; Nagai, Toshiyuki; Anzai, Toshihisa; Ikeda, Yoshihiko; Ishibashi-Ueda, Hatsue; Hirota, Masanori; Orii, Makoto; Akasaka, Takashi; Uto, Kenta; Shingu, Yasushige; Matsui, Yoshiro; Morimoto, Shin-Ichiro; Tsutsui, Hiroyuki; Eishi, Yoshinobu

    2017-01-01

Although rare, cardiac sarcoidosis (CS) is potentially fatal. Early diagnosis and intervention are essential, but histopathologic diagnosis is limited. We aimed to detect Propionibacterium acnes, a commonly implicated etiologic agent of sarcoidosis, in myocardial tissues obtained from CS patients. We examined formalin-fixed paraffin-embedded myocardial tissues obtained by surgery or autopsy and endomyocardial biopsy from patients with CS (n = 26; CS-group), myocarditis (n = 15; M-group), or other cardiomyopathies (n = 39; CM-group) using immunohistochemistry (IHC) with a P. acnes-specific monoclonal antibody. We found granulomas in 16 (62%) CS-group samples. Massive (≥14 inflammatory cells) and minimal (<14 inflammatory cells) inflammatory foci, respectively, were detected in 16 (62%) and 11 (42%) of the CS-group samples, 10 (67%) and 10 (67%) of the M-group samples, and 1 (3%) and 18 (46%) of the CM-group samples. P. acnes-positive reactivity in granulomas, massive inflammatory foci, and minimal inflammatory foci was detected in 10 (63%), 10 (63%), and 8 (73%) of the CS-group samples, respectively, and in none of the M-group and CM-group samples. The frequent identification of P. acnes in sarcoid granulomas of originally aseptic myocardial tissues suggests that this indigenous bacterium causes granulomas in many CS patients. IHC detection of P. acnes in massive or minimal inflammatory foci of myocardial biopsy samples without granulomas may be useful for differentiating sarcoidosis from myocarditis or other cardiomyopathies.

  1. The detection of impact regions of asteroids' orbits by means of constrained minimization of confidence coefficient. (Russian Title: Выявление областей столкновительных орбит астероидов с помощью условной минимизации доверительного коэффициента)

    NASA Astrophysics Data System (ADS)

    Baturin, A. P.

    2014-12-01

The problem of detecting the regions of impact orbits of near-Earth asteroids is considered. The regions are detected in the space of initial motion parameters by means of constrained minimization of a so-called "confidence coefficient". This coefficient determines the position of an orbit inside the confidence ellipsoid obtained from a least-squares orbit fit. The constraint used is a bound on the asteroid-Earth distance at the considered encounter. By randomly varying the initial approximations for the minimization and the parameter bounding the asteroid-Earth distance, it is shown that impact regions usually take the form of long tubes in the space of initial motion parameters. This is demonstrated for the asteroids 2009 FD, 2011 TO and 2012 PB20 at their expected closest encounters with the Earth.
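The constrained minimization described in this record can be sketched in miniature: minimize a confidence-coefficient quadratic form over initial motion parameters, subject to a bound on the propagated asteroid-Earth distance. The two-parameter model, the stand-in miss_distance function, and all numeric values below are hypothetical; the paper works in the full space of orbital motion parameters with a covariance from the least-squares fit.

```python
# Toy illustration (not the paper's code): find the orbit-parameter point
# closest to the nominal least-squares solution, in the confidence-ellipsoid
# metric, that still yields an Earth impact (miss distance <= r_impact).

def confidence_coefficient(p, p_nominal, inv_cov):
    """Quadratic form (p - p0)^T C^-1 (p - p0): position inside the ellipsoid."""
    d = [a - b for a, b in zip(p, p_nominal)]
    n = len(p)
    return sum(d[i] * inv_cov[i][j] * d[j] for i in range(n) for j in range(n))

def miss_distance(p):
    """Hypothetical stand-in for the propagated asteroid-Earth distance."""
    return abs(3.0 * p[0] - 2.0 * p[1] + 1.0)

def constrained_minimum(p_nominal, inv_cov, r_impact, step=0.01, n=200):
    """Grid search: minimize the confidence coefficient subject to
    miss_distance(p) <= r_impact."""
    best, best_val = None, float("inf")
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            p = (p_nominal[0] + i * step, p_nominal[1] + j * step)
            if miss_distance(p) <= r_impact:
                val = confidence_coefficient(p, p_nominal, inv_cov)
                if val < best_val:
                    best, best_val = p, val
    return best, best_val

p0 = (0.0, 0.0)
inv_cov = [[1.0, 0.0], [0.0, 1.0]]
p_star, cc = constrained_minimum(p0, inv_cov, r_impact=0.05)
```

Restarting a search like this from random initial approximations, as the record describes, traces out the tube-shaped impact region point by point.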

  2. High-throughput label-free detection of aggregate platelets with optofluidic time-stretch microscopy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Jiang, Yiyue; Lei, Cheng; Yasumoto, Atsushi; Ito, Takuro; Guo, Baoshan; Kobayashi, Hirofumi; Ozeki, Yasuyuki; Yatomi, Yutaka; Goda, Keisuke

    2017-02-01

    According to WHO, approximately 10 million new cases of thrombotic disorders are diagnosed worldwide every year. In the U.S. and Europe, their related diseases kill more people than those from AIDS, prostate cancer, breast cancer and motor vehicle accidents combined. Although thrombotic disorders, especially arterial ones, mainly result from enhanced platelet aggregability in the vascular system, visual detection of platelet aggregates in vivo is not employed in clinical settings. Here we present a high-throughput label-free platelet aggregate detection method, aiming at the diagnosis and monitoring of thrombotic disorders in clinical settings. With optofluidic time-stretch microscopy with a spatial resolution of 780 nm and an ultrahigh linear scanning rate of 75 MHz, it is capable of detecting aggregated platelets in lysed blood which flows through a hydrodynamic-focusing microfluidic device at a high throughput of 10,000 particles/s. With digital image processing and statistical analysis, we are able to distinguish them from single platelets and other blood cells via morphological features. The detection results are compared with results of fluorescence-based detection (which is slow and inaccurate, but established). Our results indicate that the method holds promise for real-time, low-cost, label-free, and minimally invasive detection of platelet aggregates, which is potentially applicable to detection of platelet aggregates in vivo and to the diagnosis and monitoring of thrombotic disorders in clinical settings. This technique, if introduced clinically, may provide important clinical information in addition to that obtained by conventional techniques for thrombotic disorder diagnosis, including ex vivo platelet aggregation tests.
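The morphological screening step in this record can be illustrated with a toy classifier. Projected area and circularity are standard image-analysis features, but the thresholds, labels, and object measurements below are invented for illustration and are not the paper's trained pipeline.

```python
# Hypothetical sketch: after segmentation of the time-stretch images, each
# detected object is classified by simple morphological features.
import math

def circularity(area, perimeter):
    """4*pi*A/P^2: 1.0 for a perfect disc, lower for irregular clumps."""
    return 4 * math.pi * area / perimeter ** 2

def classify(area_um2, perimeter_um):
    # Thresholds are invented: small objects are single platelets; large,
    # irregular objects are aggregates; large, round objects are other cells.
    if area_um2 < 12.0:
        return "single platelet"
    if circularity(area_um2, perimeter_um) < 0.8:
        return "platelet aggregate"
    return "other blood cell"

# (area in um^2, perimeter in um) for three segmented objects, made up
objects = [(6.0, 9.0), (40.0, 40.0), (50.0, 26.0)]
labels = [classify(a, p) for a, p in objects]
```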

  3. Referent control and motor equivalence of reaching from standing

    PubMed Central

    Tomita, Yosuke; Feldman, Anatol G.

    2016-01-01

Motor actions may result from central changes in the referent body configuration, defined as the body posture at which muscles begin to be activated or deactivated. The actual body configuration deviates from the referent configuration, particularly because of body inertia and environmental forces. Within these constraints, the system tends to minimize the difference between these configurations. For pointing movements, this strategy can be expressed as the tendency to minimize the difference between the referent trajectory (RT) and the actual trajectory (QT) of the effector (hand). This process may underlie motor equivalent behavior that maintains the pointing trajectory regardless of the number of body segments involved. We tested the hypothesis that this minimization process is used to produce pointing in standing subjects. With eyes closed, 10 subjects reached from a standing position to a remembered target located beyond arm length. In randomly chosen trials, hip flexion was unexpectedly prevented, forcing subjects to take a step during pointing to prevent falling. The task was repeated when subjects were instructed to intentionally take a step during pointing. In most cases, reaching accuracy and trajectory curvature were preserved due to adaptive condition-specific changes in interjoint coordination. The results suggest that referent control and the minimization process associated with it may underlie motor equivalence in pointing. NEW & NOTEWORTHY Motor actions may result from minimization of the deflection of the actual body configuration from the centrally specified referent body configuration, within the limits of neuromuscular and environmental constraints. The minimization process may maintain reaching trajectory and accuracy regardless of the number of body segments involved (motor equivalence), as confirmed in this study of reaching from standing in young healthy individuals. 
Results suggest that the referent control process may underlie motor equivalence in reaching. PMID:27784802

  4. Distributed query plan generation using multiobjective genetic algorithm.

    PubMed

    Panicker, Shina; Kumar, T V Vijay

    2014-01-01

    A distributed query processing strategy, which is a key performance determinant in accessing distributed databases, aims to minimize the total query processing cost. One way to achieve this is by generating efficient distributed query plans that involve fewer sites for processing a query. In the case of distributed relational databases, the number of possible query plans increases exponentially with respect to the number of relations accessed by the query and the number of sites where these relations reside. Consequently, computing optimal distributed query plans becomes a complex problem. This distributed query plan generation (DQPG) problem has already been addressed using single objective genetic algorithm, where the objective is to minimize the total query processing cost comprising the local processing cost (LPC) and the site-to-site communication cost (CC). In this paper, this DQPG problem is formulated and solved as a biobjective optimization problem with the two objectives being minimize total LPC and minimize total CC. These objectives are simultaneously optimized using a multiobjective genetic algorithm NSGA-II. Experimental comparison of the proposed NSGA-II based DQPG algorithm with the single objective genetic algorithm shows that the former performs comparatively better and converges quickly towards optimal solutions for an observed crossover and mutation probability.
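The core of the biobjective formulation is Pareto dominance over the two costs. The sketch below shows only the dominance test and front extraction that NSGA-II builds on; the full algorithm adds non-dominated sorting ranks, crowding distance, crossover, and mutation. The plan cost pairs are made up for illustration.

```python
# Illustrative sketch: extract the Pareto-optimal query plans under the two
# minimized objectives, total local processing cost (LPC) and total
# communication cost (CC).

def dominates(a, b):
    """Plan a dominates plan b if it is no worse in both objectives and
    strictly better in at least one (both objectives are minimized)."""
    return a[0] <= b[0] and a[1] <= b[1] and (a[0] < b[0] or a[1] < b[1])

def pareto_front(plans):
    """Return the plans not dominated by any other plan."""
    return [p for p in plans
            if not any(dominates(q, p) for q in plans if q is not p)]

# (LPC, CC) for a handful of candidate distributed query plans, hypothetical
plans = [(10, 90), (20, 40), (35, 35), (50, 20), (60, 60), (80, 10)]
front = pareto_front(plans)
```

NSGA-II evolves a population toward exactly this kind of front instead of collapsing the two costs into one weighted sum.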

  5. Distributed Query Plan Generation Using Multiobjective Genetic Algorithm

    PubMed Central

    Panicker, Shina; Vijay Kumar, T. V.

    2014-01-01

    A distributed query processing strategy, which is a key performance determinant in accessing distributed databases, aims to minimize the total query processing cost. One way to achieve this is by generating efficient distributed query plans that involve fewer sites for processing a query. In the case of distributed relational databases, the number of possible query plans increases exponentially with respect to the number of relations accessed by the query and the number of sites where these relations reside. Consequently, computing optimal distributed query plans becomes a complex problem. This distributed query plan generation (DQPG) problem has already been addressed using single objective genetic algorithm, where the objective is to minimize the total query processing cost comprising the local processing cost (LPC) and the site-to-site communication cost (CC). In this paper, this DQPG problem is formulated and solved as a biobjective optimization problem with the two objectives being minimize total LPC and minimize total CC. These objectives are simultaneously optimized using a multiobjective genetic algorithm NSGA-II. Experimental comparison of the proposed NSGA-II based DQPG algorithm with the single objective genetic algorithm shows that the former performs comparatively better and converges quickly towards optimal solutions for an observed crossover and mutation probability. PMID:24963513

  6. Fully Automated Centrifugal Microfluidic Device for Ultrasensitive Protein Detection from Whole Blood.

    PubMed

    Park, Yang-Seok; Sunkara, Vijaya; Kim, Yubin; Lee, Won Seok; Han, Ja-Ryoung; Cho, Yoon-Kyoung

    2016-04-16

Enzyme-linked immunosorbent assay (ELISA) is a promising method for detecting small amounts of proteins in biological samples. Devices providing a platform for reduced sample volume and assay time, as well as full automation, are required for potential use in point-of-care diagnostics. Recently, we demonstrated ultrasensitive detection of the serum proteins C-reactive protein (CRP) and cardiac troponin I (cTnI) utilizing a lab-on-a-disc composed of TiO2 nanofibrous (NF) mats. It showed a large dynamic range with femtomolar (fM) detection sensitivity from a small volume of whole blood in 30 min. The device consists of several components for blood separation, metering, mixing, and washing that are automated for improved sensitivity from low sample volumes. Here, in the video demonstration, we show the experimental protocols and know-how for the fabrication of the NFs as well as the disc, their integration, and the operation, in the following order: processes for preparing the TiO2 NF mat; transfer-printing of the TiO2 NF mat onto the disc; surface modification for immunoreactions; disc assembly and operation; and on-disc detection with representative results for the immunoassay. Use of this device enables multiplexed analysis with minimal consumption of samples and reagents. Given these advantages, the device should find use in a wide variety of applications and prove beneficial in facilitating the analysis of low-abundance proteins.

  7. Microfluidic paper-based analytical device for particulate metals.

    PubMed

    Mentele, Mallory M; Cunningham, Josephine; Koehler, Kirsten; Volckens, John; Henry, Charles S

    2012-05-15

    A microfluidic paper-based analytical device (μPAD) fabricated by wax printing was designed to assess occupational exposure to metal-containing aerosols. This method employs rapid digestion of particulate metals using microliters of acid added directly to a punch taken from an air sampling filter. Punches were then placed on a μPAD, and digested metals were transported to detection reservoirs upon addition of water. These reservoirs contained reagents for colorimetric detection of Fe, Cu, and Ni. Dried buffer components were used to set the optimal pH in each detection reservoir, while precomplexation agents were deposited in the channels between the sample and detection zones to minimize interferences from competing metals. Metal concentrations were quantified from color intensity images using a scanner in conjunction with image processing software. Reproducible, log-linear calibration curves were generated for each metal, with method detection limits ranging from 1.0 to 1.5 μg for each metal (i.e., total mass present on the μPAD). Finally, a standard incineration ash sample was aerosolized, collected on filters, and analyzed for the three metals of interest. Analysis of this collected aerosol sample using a μPAD showed good correlation with known amounts of the metals present in the sample. This technology can provide rapid assessment of particulate metal concentrations at or below current regulatory limits and at dramatically reduced cost.
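The quantification step in this record, color intensity regressed against the logarithm of metal mass and then inverted for unknowns, can be sketched as follows. The calibration intensities and the unknown reading are fabricated for illustration; only the log-linear form comes from the abstract.

```python
# Hypothetical sketch: fit a log-linear calibration curve (spot intensity vs.
# log10 of metal mass) from standards, then invert it for an unknown spot.
import math

def fit_log_linear(masses_ug, intensities):
    """Least-squares fit of intensity = a + b * log10(mass)."""
    xs = [math.log10(m) for m in masses_ug]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(intensities) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, intensities)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def mass_from_intensity(intensity, a, b):
    """Invert the calibration to recover metal mass in micrograms."""
    return 10 ** ((intensity - a) / b)

# Calibration standards (mass in ug, mean spot intensity), invented values
standards = [(1.0, 20.0), (3.0, 34.3), (10.0, 50.0), (30.0, 64.3)]
a, b = fit_log_linear([m for m, _ in standards], [i for _, i in standards])
unknown_ug = mass_from_intensity(42.0, a, b)
```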

  8. Microplate-reader method for the rapid analysis of copper in natural waters with chemiluminescence detection.

    PubMed

    Durand, Axel; Chase, Zanna; Remenyi, Tomas; Quéroué, Fabien

    2012-01-01

We have developed a method for the determination of copper in natural waters at nanomolar levels. The use of a microplate reader minimizes sample processing time (~25 s per sample), reagent consumption (~120 μL per sample), and sample volume (~700 μL). Copper is detected by chemiluminescence. This technique is based on the formation of a complex between copper and 1,10-phenanthroline and the subsequent emission of light during the oxidation of the complex by hydrogen peroxide. Samples are acidified to pH 1.7 and then introduced directly into a 24-well plate. Reagents are added during data acquisition via two reagent injectors. When trace-metal-clean protocols are employed, the reproducibility is generally better than 7% on blanks and the detection limit is 0.7 nM for seawater and 0.4 nM for freshwater. More than 100 samples per hour can be analyzed with this technique, which is simple, robust, and amenable to at-sea analysis. Seawater samples from Storm Bay in Tasmania illustrate the utility of the method for environmental science. Methods for other trace metals with optical detection chemistries (e.g., chemiluminescence, fluorescence, or absorbance) could likewise be adapted to the microplate reader.
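A detection limit like the 0.7 nM quoted here is conventionally computed as three times the standard deviation of replicate blank signals divided by the calibration slope. The blank signals and slope below are invented solely to show the arithmetic; they are not the paper's data.

```python
# Hedged illustration of a 3-sigma detection limit calculation.
import statistics

def detection_limit(blank_signals, slope):
    """3 * SD(blanks) / slope, in the concentration units of the slope."""
    return 3 * statistics.stdev(blank_signals) / slope

blanks = [10.1, 10.4, 9.8, 10.2, 10.0]  # chemiluminescence counts, invented
slope = 2.5                             # counts per nM, invented
lod_nM = detection_limit(blanks, slope)
```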

  9. Materials and Techniques for Implantable Nutrient Sensing Using Flexible Sensors Integrated with Metal-Organic Frameworks.

    PubMed

    Ling, Wei; Liew, Guoguang; Li, Ya; Hao, Yafeng; Pan, Huizhuo; Wang, Hanjie; Ning, Baoan; Xu, Hang; Huang, Xian

    2018-06-01

The combination of novel materials with flexible electronic technology may yield new concepts of flexible electronic devices that effectively detect various biological chemicals, facilitating understanding of biological processes and enabling health monitoring. This paper demonstrates single- or multichannel implantable flexible sensors that are surface-modified with conductive metal-organic frameworks (MOFs), such as copper-MOF and cobalt-MOF, with large surface area, high porosity, and tunable catalytic capability. The sensors can monitor important nutrients such as ascorbic acid, glycine, l-tryptophan (l-Trp), and glucose with detection resolutions of 14.97, 0.71, 4.14, and 54.60 × 10⁻⁶ M, respectively. In addition, they offer sensing capability even under extreme deformation and in complex surrounding environments, with continuous monitoring capability for 20 days due to minimized use of biologically active chemicals. Experiments using live cells and animals indicate that the MOF-modified sensors are biologically safe to cells and can detect l-Trp in blood and interstitial fluid. This work represents the first effort to integrate MOFs with flexible sensors to achieve highly specific and sensitive implantable electrochemical detection, and it may inspire the development of more flexible electronic devices with enhanced capability in sensing, energy storage, and catalysis using the various properties of MOFs. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Magnetic Nanoparticles and microNMR for Diagnostic Applications

    PubMed Central

    Shao, Huilin; Min, Changwook; Issadore, David; Liong, Monty; Yoon, Tae-Jong; Weissleder, Ralph; Lee, Hakho

    2012-01-01

    Sensitive and quantitative measurements of clinically relevant protein biomarkers, pathogens and cells in biological samples would be invaluable for disease diagnosis, monitoring of malignancy, and for evaluating therapy efficacy. Biosensing strategies using magnetic nanoparticles (MNPs) have recently received considerable attention, since they offer unique advantages over traditional detection methods. Specifically, because biological samples have negligible magnetic background, MNPs can be used to obtain highly sensitive measurements in minimally processed samples. This review focuses on the use of MNPs for in vitro detection of cellular biomarkers based on nuclear magnetic resonance (NMR) effects. This detection platform, termed diagnostic magnetic resonance (DMR), exploits MNPs as proximity sensors to modulate the spin-spin relaxation time of water molecules surrounding the molecularly-targeted nanoparticles. With new developments such as more effective MNP biosensors, advanced conjugational strategies, and highly sensitive miniaturized NMR systems, the DMR detection capabilities have been considerably improved. These developments have also enabled parallel and rapid measurements from small sample volumes and on a wide range of targets, including whole cells, proteins, DNA/mRNA, metabolites, drugs, viruses and bacteria. The DMR platform thus makes a robust and easy-to-use sensor system with broad applications in biomedicine, as well as clinical utility in point-of-care settings. PMID:22272219

  11. Quantification of disease marker in undiluted serum using an actuating layer-embedded microcantilever

    NASA Astrophysics Data System (ADS)

    Hwang, Kyo Seon; Jeon, Hye Kyung; Lee, Sang-Myung; Kim, Sang Kyung; Kim, Tae Song

    2009-05-01

    In this study, we describe the application feasibility of a dynamic microcantilever with regard to the detection of a specific protein in human serum or real blood using an end-point analysis. The mechanical response (i.e., resonant frequency) of a functionalized dynamic microcantilever was shown to be altered by molecular interactions, which allowed for the detection of biomolecules present in small quantities without any additional signal enhancements, such as labeling. For the application of the microcantilever sensors to bioassays of serum samples, the mechanical response from the nonspecific adsorption of abundant proteins must be reduced, because it significantly influences the output signal deviation of the microcantilever sensor. We implemented a label-free prostate specific antigen (PSA) detection protocol in standard serum via our established process, which was designed to minimize nonspecific protein adsorption. PSA is a tumor marker for prostate cancer, with a threshold concentration of 2-4 ng/ml (7.2-14.4 pM) for the distinction between patients and normal individuals. The dynamic range of our dynamic microcantilever-based PSA assay on the background of standard serum ranged between 0.1 and 100 ng/ml (3.6 and 3600 pM). It was suggested that the dynamic microcantilever might allow for the sensitive label-free detection of disease markers in an actual human sample.
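The sensing principle in this record rests on the resonant frequency of a cantilever, f = (1/2π)√(k/m_eff), decreasing as bound antigen adds mass; for small loads the shift is approximately Δf ≈ (f/2)(Δm/m_eff). The spring constant and masses below are hypothetical round numbers, not the device's actual parameters.

```python
# Back-of-the-envelope sketch of dynamic-mode (resonant) mass sensing.
import math

def resonant_frequency(k, m_eff):
    """Resonant frequency in Hz for spring constant k (N/m), mass m_eff (kg)."""
    return math.sqrt(k / m_eff) / (2 * math.pi)

k = 40.0           # N/m, hypothetical
m_eff = 1.0e-12    # kg (effective mass ~1 ng), hypothetical
f0 = resonant_frequency(k, m_eff)

delta_m = 1.0e-15  # kg of bound protein, hypothetical
f1 = resonant_frequency(k, m_eff + delta_m)

shift = f0 - f1                       # exact downward frequency shift
approx = (f0 / 2) * (delta_m / m_eff) # small-load approximation
```

The measured frequency shift is thus read back into an adsorbed mass, which is what lets the cantilever report antigen binding without labels.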

  12. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    PubMed Central

    Liu, Datong; Peng, Yu; Peng, Xiyuan

    2018-01-01

    Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide uncertainty presentation, the probability prediction methods (e.g., Gaussian process regression (GPR) and relevance vector machine (RVM)) are especially adaptable to perform anomaly detection for sensing series. Generally, one key parameter of prediction models is coverage probability (CP), which controls the judging threshold of the testing sample and is generally set to a default value (e.g., 90% or 95%). There are few criteria to determine the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator of the receiver operating characteristic curve of prediction interval (ROC-PI) based on the definition of the ROC curve which can depict the trade-off between the PI width and PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs, by the minimization of which the optimal CP is derived by the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. Especially, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application. PMID:29587372
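The idea of tuning CP against a Youden-type criterion can be shown with a simplified stand-in: a Gaussian prediction interval mean ± zσ is applied to labeled data, and the CP maximizing J = sensitivity + specificity - 1 is kept. The paper optimizes a modified Youden index with simulated annealing over ROC-PI; a plain grid search substitutes here, and the data are synthetic.

```python
# Simplified sketch of choosing the coverage probability (CP) of a
# prediction interval for anomaly detection.
from statistics import NormalDist

def youden_for_cp(cp, normal, anomalies, mean, sigma):
    """Points outside the CP-level interval are flagged as anomalies."""
    z = NormalDist().inv_cdf(0.5 + cp / 2)  # two-sided Gaussian quantile
    lo, hi = mean - z * sigma, mean + z * sigma
    flag = lambda x: x < lo or x > hi
    sens = sum(map(flag, anomalies)) / len(anomalies)
    spec = sum(not flag(x) for x in normal) / len(normal)
    return sens + spec - 1

normal = [0.1, -0.2, 0.3, -0.1, 0.0, 0.2, -0.3, 0.1]   # synthetic inliers
anomalies = [1.5, -1.8, 2.0]                            # synthetic outliers
best_cp = max((cp / 100 for cp in range(50, 100)),
              key=lambda cp: youden_for_cp(cp, normal, anomalies, 0.0, 0.4))
```

On this toy data, a CP well below the default 90-95% already separates the classes perfectly, which is the paper's point: the default CP is not automatically optimal.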

  13. Portable Wideband Microwave Imaging System for Intracranial Hemorrhage Detection Using Improved Back-projection Algorithm with Model of Effective Head Permittivity

    PubMed Central

    Mobashsher, Ahmed Toaha; Mahmoud, A.; Abbosh, A. M.

    2016-01-01

Intracranial hemorrhage is a medical emergency that requires rapid detection and medication to keep brain damage to a minimum. Here, an effective wideband microwave head imaging system for on-the-spot detection of intracranial hemorrhage is presented. The operation of the system relies on the dielectric contrast between healthy brain tissues and a hemorrhage that causes a strong microwave scattering. The system uses a compact sensing antenna, which has an ultra-wideband operation with directional radiation, and a portable, compact microwave transceiver for signal transmission and data acquisition. The collected data is processed to create a clear image of the brain using an improved back projection algorithm, which is based on a novel effective head permittivity model. The system is verified in realistic simulation and experimental environments using anatomically and electrically realistic human head phantoms. Quantitative and qualitative comparisons between the images from the proposed and existing algorithms demonstrate significant improvements in detection and localization accuracy. The radiation and thermal safety of the system are examined and verified. Initial human tests are conducted on healthy subjects with different head sizes. The reconstructed images are statistically analyzed, and the absence of false positive results indicates the efficacy of the proposed system for future preclinical trials. PMID:26842761
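The back-projection step can be sketched with a minimal delay-and-sum reconstruction: each antenna's time signal is sampled at the round-trip delay to a candidate pixel and the contributions are summed, so a scatterer lights up where the delays align. This is a generic stand-in, not the paper's improved algorithm with its effective head permittivity model; the geometry, wave speed, and signals below are synthetic.

```python
# Minimal delay-and-sum back-projection sketch on a toy 2D geometry.
import math

V = 1.0  # effective propagation speed (arbitrary units), hypothetical

def simulate_signal(antenna, scatterer, n_samples=200, dt=0.1):
    """Impulse echo: a single spike at the round-trip delay."""
    sig = [0.0] * n_samples
    idx = round(2 * math.dist(antenna, scatterer) / (V * dt))
    if idx < n_samples:
        sig[idx] = 1.0
    return sig

def back_project(signals, antennas, pixels, dt=0.1):
    """Sum each signal at the pixel's round-trip delay; return pixel energies."""
    image = []
    for p in pixels:
        total = 0.0
        for sig, a in zip(signals, antennas):
            idx = round(2 * math.dist(a, p) / (V * dt))
            if idx < len(sig):
                total += sig[idx]
        image.append(total ** 2)
    return image

antennas = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
target = (4.0, 6.0)  # hidden scatterer (e.g., a bleed), hypothetical
signals = [simulate_signal(a, target) for a in antennas]
pixels = [(x, y) for x in range(11) for y in range(11)]
image = back_project(signals, antennas, pixels)
brightest = pixels[image.index(max(image))]
```

The paper's contribution is precisely in replacing the constant speed V with a frequency-dependent effective head permittivity so the delays stay aligned in real tissue.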

  14. Portable Wideband Microwave Imaging System for Intracranial Hemorrhage Detection Using Improved Back-projection Algorithm with Model of Effective Head Permittivity

    NASA Astrophysics Data System (ADS)

    Mobashsher, Ahmed Toaha; Mahmoud, A.; Abbosh, A. M.

    2016-02-01

Intracranial hemorrhage is a medical emergency that requires rapid detection and medication to keep brain damage to a minimum. Here, an effective wideband microwave head imaging system for on-the-spot detection of intracranial hemorrhage is presented. The operation of the system relies on the dielectric contrast between healthy brain tissues and a hemorrhage that causes a strong microwave scattering. The system uses a compact sensing antenna, which has an ultra-wideband operation with directional radiation, and a portable, compact microwave transceiver for signal transmission and data acquisition. The collected data is processed to create a clear image of the brain using an improved back projection algorithm, which is based on a novel effective head permittivity model. The system is verified in realistic simulation and experimental environments using anatomically and electrically realistic human head phantoms. Quantitative and qualitative comparisons between the images from the proposed and existing algorithms demonstrate significant improvements in detection and localization accuracy. The radiation and thermal safety of the system are examined and verified. Initial human tests are conducted on healthy subjects with different head sizes. The reconstructed images are statistically analyzed, and the absence of false positive results indicates the efficacy of the proposed system for future preclinical trials.

  15. Control of Aeromonas on minimally processed vegetables by decontamination with lactic acid, chlorinated water, or thyme essential oil solution.

    PubMed

    Uyttendaele, M; Neyts, K; Vanderswalmen, H; Notebaert, E; Debevere, J

    2004-02-01

Aeromonas is an opportunistic pathogen which, although in low numbers, may be present on minimally processed vegetables. Although the intrinsic and extrinsic factors of minimally processed prepacked vegetable mixes are not inhibitory to the growth of Aeromonas species, multiplication to high numbers during processing and storage of naturally contaminated grated carrots, mixed lettuce, and chopped bell peppers was not observed. Aeromonas was shown to be resistant to chlorination of water but susceptible to 1% and 2% lactic acid and to 0.5% and 1.0% thyme essential oil treatment, although the latter provoked adverse sensory properties when applied for decontamination of chopped bell peppers. Integration of a decontamination step with 2% lactic acid in the processing line of grated carrots was shown to have the potential to control the overall microbial quality of the grated carrots and was particularly effective against Aeromonas.

  16. Understanding and Minimizing Staff Burnout. An Introductory Packet.

    ERIC Educational Resources Information Center

    California Univ., Los Angeles. Center for Mental Health Schools.

    Staff who bring a mental health perspective to the schools can deal with problems of staff burnout. This packet is designed to help in beginning the process of minimizing burnout, a process that requires reducing environmental stressors, increasing personal capabilities, and enhancing job supports. The packet opens with brief discussions of "What…

  17. Microbial safety and overall quality of cantaloupe fresh-cut pieces prepared from whole cantaloupe after wet steam treatment

    USDA-ARS?s Scientific Manuscript database

    Fresh-cut cantaloupes have been associated with outbreaks of salmonellosis, and minimally processed fresh-cut fruits have a limited shelf life because of deterioration caused by spoilage microflora and physiological processes. In this study, we evaluated the effect of minimal wet steam t...

  18. Quantum feedback cooling of a mechanical oscillator using variational measurements: tweaking Heisenberg’s microscope

    NASA Astrophysics Data System (ADS)

    Habibi, Hojat; Zeuthen, Emil; Ghanaatshoar, Majid; Hammerer, Klemens

    2016-08-01

    We revisit the problem of preparing a mechanical oscillator in the vicinity of its quantum-mechanical ground state by means of feedback cooling based on continuous optical detection of the oscillator position. In the parameter regime relevant to ground-state cooling, the optical back-action and imprecision noise set the bottleneck of achievable cooling and must be carefully balanced. This can be achieved by adapting the phase of the local oscillator in the homodyne detection realizing a so-called variational measurement. The trade-off between accurate position measurement and minimal disturbance can be understood in terms of Heisenberg’s microscope and becomes particularly relevant when the measurement and feedback processes happen to be fast within the quantum coherence time of the system to be cooled. This corresponds to the regime of large quantum cooperativity {C}{{q}}≳ 1, which was achieved in recent experiments on feedback cooling. Our method provides a simple path to further pushing the limits of current state-of-the-art experiments in quantum optomechanics.

  19. Coincidence and covariance data acquisition in photoelectron and -ion spectroscopy. II. Analysis and applications

    NASA Astrophysics Data System (ADS)

    Mikosch, Jochen; Patchkovskii, Serguei

    2013-10-01

    We use an analytical theory of noisy Poisson processes, developed in the preceding companion publication, to compare coincidence and covariance measurement approaches in photoelectron and -ion spectroscopy. For non-unit detection efficiencies, coincidence data acquisition (DAQ) suffers from false coincidences. The rate of false coincidences grows quadratically with the rate of elementary ionization events. To minimize false coincidences for rare event outcomes, very low event rates may hence be required. Coincidence measurements exhibit high tolerance to noise introduced by unstable experimental conditions. Covariance DAQ on the other hand is free of systematic errors as long as stable experimental conditions are maintained. In the presence of noise, all channels in a covariance measurement become correlated. Under favourable conditions, covariance DAQ may allow orders of magnitude reduction in measurement times. Finally, we use experimental data for strong-field ionization of 1,3-butadiene to illustrate how fluctuations in experimental conditions can contaminate a covariance measurement, and how such contamination can be detected.
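
    The quadratic growth of false coincidences with event rate can be reproduced with a small Monte Carlo sketch. This illustrates the generic counting argument, not the companion paper's analytical theory; the detection efficiencies and event rates below are assumed.

```python
import math
import random

def false_coincidence_rate(mu, eps_e=0.5, eps_i=0.5, cycles=200_000, seed=1):
    """Fraction of DAQ cycles that record a *false* electron-ion coincidence:
    exactly one detected electron and one detected ion that stem from two
    different elementary ionization events."""
    rng = random.Random(seed)
    L = math.exp(-mu)
    false = 0
    for _ in range(cycles):
        # Poisson-distributed number of elementary events (Knuth's method)
        n, p = 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                break
            n += 1
        # each event independently yields a detected electron and/or ion
        det = [(rng.random() < eps_e, rng.random() < eps_i) for _ in range(n)]
        electrons = [k for k, (e, _) in enumerate(det) if e]
        ions = [k for k, (_, i) in enumerate(det) if i]
        if len(electrons) == 1 and len(ions) == 1 and electrons[0] != ions[0]:
            false += 1
    return false / cycles

rate_low = false_coincidence_rate(mu=0.1)
rate_high = false_coincidence_rate(mu=0.2)   # doubled mean event rate
ratio = rate_high / rate_low                 # ~4 expected for quadratic growth
```

    Doubling the mean event rate roughly quadruples the false-coincidence rate, which is why rare-outcome coincidence measurements must run at very low event rates.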

  20. Application of the FICTION technique for the simultaneous detection of immunophenotype and chromosomal abnormalities in routinely fixed, paraffin wax embedded bone marrow trephines

    PubMed Central

    Korać, P; Jones, M; Dominis, M; Kušec, R; Mason, D Y; Banham, A H; Ventura, R A

    2005-01-01

    The use of interphase fluorescence in situ hybridisation (FISH) to study cytogenetic abnormalities in routinely fixed paraffin wax embedded tissue has become commonplace over the past decade. However, very few studies have applied FISH to routinely fixed bone marrow trephines (BMTs). This may be because of the acid based decalcification methods that are commonly used during the processing of BMTs, which may adversely affect the suitability of the sample for FISH analysis. For the first time, this report describes the simultaneous application of FISH and immunofluorescent staining (the FICTION technique) to formalin fixed, EDTA decalcified and paraffin wax embedded BMTs. This technique allows the direct correlation of genetic abnormalities to immunophenotype, and therefore will be particularly useful for the identification of genetic abnormalities in specific tumour cells present in BMTs. The application of this to routine clinical practice will assist diagnosis and the detection of minimal residual disease. PMID:16311361

  1. Organic contaminants in onsite wastewater treatment systems

    USGS Publications Warehouse

    Conn, K.E.; Siegrist, R.L.; Barber, L.B.; Brown, G.K.

    2007-01-01

    Wastewater from thirty onsite wastewater treatment systems was sampled during a reconnaissance field study to quantify bulk parameters and the occurrence of organic wastewater contaminants, including endocrine disrupting compounds, in treatment systems representing a variety of wastewater sources, treatment processes, and receiving environments. Bulk parameters ranged in concentration across the wide variety of wastewater sources (residential vs. non-residential). Organic contaminants such as sterols, surfactant metabolites, antimicrobial agents, stimulants, metal-chelating agents, and other consumer product chemicals, measured by gas chromatography/mass spectrometry, were detected frequently in onsite system wastewater. Wastewater composition was unique to each source type, likely due to differences in source water and chemical usage. Removal efficiencies varied by engineered treatment type and by the physicochemical properties of the contaminant, resulting in discharge to the soil treatment unit at ecotoxicologically relevant concentrations. Organic wastewater contaminants were detected less frequently and at lower concentrations in onsite system receiving environments. Understanding the occurrence and fate of organic wastewater contaminants in onsite wastewater treatment systems will aid in minimizing risk to ecological and human health.

  2. a Cloud-Based Architecture for Smart Video Surveillance

    NASA Astrophysics Data System (ADS)

    Valentín, L.; Serrano, S. A.; Oves García, R.; Andrade, A.; Palacios-Alonso, M. A.; Sucar, L. Enrique

    2017-09-01

    Turning a city into a smart city has attracted considerable attention. A smart city can be seen as a city that uses digital technology not only to improve the quality of people's lives, but also to have a positive impact on the environment while offering efficient, easy-to-use services. A fundamental aspect of a smart city is people's safety and welfare; a good security system therefore becomes a necessity, because it allows potential risk situations to be detected and identified so that appropriate decisions can be taken to help people or even prevent criminal acts. In this paper we present an architecture for automated video surveillance based on the cloud computing schema. It is capable of acquiring a video stream from a set of cameras connected to the network; processing that information; detecting, labeling, and highlighting security-relevant events automatically; storing the information; and providing situational awareness in order to minimize the response time to take the appropriate action.

  3. Searching for patterns in remote sensing image databases using neural networks

    NASA Technical Reports Server (NTRS)

    Paola, Justin D.; Schowengerdt, Robert A.

    1995-01-01

    We have investigated a method, based on a successful neural network multispectral image classification system, of searching for single patterns in remote sensing databases. While defining the pattern to search for and the feature to be used for that search (spectral, spatial, temporal, etc.) is challenging, a more difficult task is selecting competing patterns to train against the desired pattern. Schemes for competing pattern selection, including random selection and human interpreted selection, are discussed in the context of an example detection of dense urban areas in Landsat Thematic Mapper imagery. When applying the search to multiple images, a simple normalization method can alleviate the problem of inconsistent image calibration. Another potential problem, that of highly compressed data, was found to have a minimal effect on the ability to detect the desired pattern. The neural network algorithm has been implemented using the PVM (Parallel Virtual Machine) library and nearly-optimal speedups have been obtained that help alleviate the long process of searching through imagery.

  4. Infomechanical specializations for prey capture in knifefish

    NASA Astrophysics Data System (ADS)

    Maciver, Malcolm; Patankar, Neelesh; Curet, Oscar; Shirgaonkar, Anup

    2007-11-01

    How do an animal's mechanics and its information acquisition system work together to solve crucial behavioral tasks? We examine this question for the black ghost weakly electric knifefish (Apteronotus albifrons), which is a leading model system for the study of sensory processing in vertebrates. These animals hunt at night by detecting perturbations of a self-generated electric field caused by prey. While the fish searches for prey, it pitches its body at roughly 30°. Fully resolved Navier-Stokes simulations of their swimming, which occurs through undulations of a long ribbon-like fin along the bottom edge of the body, indicate that this configuration enables maximal thrust while minimizing pitch moment. However, pitching the body also increases drag. Our analysis of the sensory volume for detection of prey shows this volume to be similar to a cylinder around the body. Thus, pitching the body enables a greater swept volume of scanned fluid. Examining the mechanical and information acquisition demands on the animal in this task gives insight into how these sometimes conflicting demands are resolved.

  5. Enhanced Resolution of Chiral Amino Acids with Capillary Electrophoresis for Biosignature Detection in Extraterrestrial Samples.

    PubMed

    Creamer, Jessica S; Mora, Maria F; Willis, Peter A

    2017-01-17

    Amino acids are fundamental building blocks of terrestrial life as well as ubiquitous byproducts of abiotic reactions. In order to distinguish between amino acids formed by abiotic versus biotic processes it is possible to use chemical distributions to identify patterns unique to life. This article describes two capillary electrophoresis methods capable of resolving 17 amino acids found in high abundance in both biotic and abiotic samples (seven enantiomer pairs d/l-Ala, -Asp, -Glu, -His, -Leu, -Ser, -Val and the three achiral amino acids Gly, β-Ala, and GABA). To resolve the 13 neutral amino acids one method utilizes a background electrolyte containing γ-cyclodextrin and sodium taurocholate micelles. The acidic amino acid enantiomers were resolved with γ-cyclodextrin alone. These methods allow detection limits down to 5 nM for the neutral amino acids and 500 nM for acidic amino acids and were used to analyze samples collected from Mono Lake with minimal sample preparation.

  6. Effect of gamma irradiation on microbial quality of minimally processed carrot and lettuce: A case study in Greater Accra region of Ghana

    NASA Astrophysics Data System (ADS)

    Frimpong, G. K.; Kottoh, I. D.; Ofosu, D. O.; Larbi, D.

    2015-05-01

    The effect of ionizing radiation on the microbiological quality of minimally processed carrot and lettuce was studied. The aim was to investigate the effect of irradiation as a sanitizing agent on the bacteriological quality of some salad vegetables eaten raw, obtained from retailers in Accra, Ghana. Minimally processed carrot and lettuce were analysed for total viable count, total coliform count, and pathogenic organisms. The samples collected were treated and analysed over a 15-day period. The total viable count for carrot ranged from 1.49 to 14.01 log10 cfu/10 g while that of lettuce was 0.70 to 8.57 log10 cfu/10 g. It was also observed that the total coliform count was 1.46-7.53 log10 cfu/10 g for carrot and 0.14-7.35 log10 cfu/10 g for lettuce. The predominant pathogenic organisms identified were Bacillus cereus, Cronobacter sakazakii, Staphylococcus aureus, and Klebsiella spp. It was concluded that 2 kGy was the most effective medium-dose treatment for minimally processed carrot and lettuce.

  7. A Method for the Minimization of Competition Bias in Signal Detection from Spontaneous Reporting Databases.

    PubMed

    Arnaud, Mickael; Salvo, Francesco; Ahmed, Ismaïl; Robinson, Philip; Moore, Nicholas; Bégaud, Bernard; Tubert-Bitter, Pascale; Pariente, Antoine

    2016-03-01

    The two methods for minimizing competition bias in signal of disproportionate reporting (SDR) detection, masking factor (MF) and masking ratio (MR), have focused on the strength of disproportionality for identifying competitors and have been tested using competitors at the drug level. The aim of this study was to develop a method that identifies competitors by considering the proportion of reports of adverse events (AEs) that mention the drug class, at a level of drug grouping chosen to increase sensitivity (Se) for SDR unmasking, and to compare it with MF and MR. Reports in the French spontaneous reporting database between 2000 and 2005 were selected. Five AEs were considered: myocardial infarction, pancreatitis, aplastic anemia, convulsions, and gastrointestinal bleeding; related reports were retrieved using standardized Medical Dictionary for Regulatory Activities (MedDRA®) queries. Potential competitors of AEs were identified using the developed method, the Competition Index (ComIn), as well as MF and MR. All three methods were tested according to Anatomical Therapeutic Chemical (ATC) classification levels 2-5. For each AE, SDR detection was performed, first in the complete database, and second after removing reports mentioning competitors; SDRs detected only after the removal were considered unmasked. All unmasked SDRs were validated using the Summary of Product Characteristics, and constituted the reference dataset used for computing the performance for SDR unmasking (area under the curve [AUC], Se). Performance of the ComIn was highest when considering competitors at ATC level 3 (AUC: 62 %; Se: 52 %); similar results were obtained with MF and MR. The ComIn could greatly minimize the competition bias in SDR detection. Further study using a larger dataset is needed.

  8. Standardization and performance evaluation of "modified" and "ultrasensitive" versions of the Abbott RealTime HIV-1 assay, adapted to quantify minimal residual viremia.

    PubMed

    Amendola, Alessandra; Bloisi, Maria; Marsella, Patrizia; Sabatini, Rosella; Bibbò, Angela; Angeletti, Claudio; Capobianchi, Maria Rosaria

    2011-09-01

    Numerous studies investigating the clinical significance of HIV-1 minimal residual viremia (MRV) suggest the potential utility of assays more sensitive than those routinely used to monitor viral suppression. However, currently available methods, based on different technologies, show great variation in detection limit and input plasma volume, and generally suffer from a lack of standardization. In order to establish new tools suitable for routine quantification of minimal residual viremia in patients under virological suppression, some modifications were introduced into the standard procedure of the Abbott RealTime HIV-1 assay, leading to "modified" and "ultrasensitive" protocols. The following modifications were introduced: a calibration curve extended towards low HIV-1 RNA concentrations; a 4-fold increase in sample volume achieved by concentrating the starting material; a reduced volume of internal control; and adoption of "open-mode" software for quantification. Analytical performance was evaluated using the HIV-1 RNA Working Reagent 1 for NAT assays (NIBSC). Both tests were applied to clinical samples from virologically suppressed patients. The "modified" and "ultrasensitive" configurations of the assay reached limits of detection of 18.8 cp/mL (95% CI: 11.1-51.0 cp/mL) and 4.8 cp/mL (95% CI: 2.6-9.1 cp/mL), respectively, with high precision and accuracy. In clinical samples from virologically suppressed patients, the "modified" and "ultrasensitive" protocols allowed HIV RNA to be detected and quantified in 12.7% and 46.6%, respectively, of samples reported as "not detectable", and in 70.0% and 69.5%, respectively, of samples reported as "detected <40 cp/mL" by the standard assay. The "modified" and "ultrasensitive" assays are precise and accurate, and easily adoptable in routine diagnostic laboratories for measuring MRV. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Carbon Nanotube Based Devices for Intracellular Analysis

    NASA Astrophysics Data System (ADS)

    Singhal, Riju Mohan

    Scientific investigations of individual cells have gained increasing attention in recent years as efforts are made to understand cellular functioning in complex processes, such as cell division during embryonic development, and owing to the realization that populations of a single cell type are heterogeneous (for instance, certain individual cancer cells being immune to chemotherapy). Devices enabling electrochemical detection, spectroscopy, optical observation, and separation techniques, along with cell piercing and fluid transfer capabilities at the intracellular level, are therefore required. Glass pipettes have conventionally been used for single cell interrogation; however, their poor mechanical properties and intrusive conical geometry have led to limited precision and frequent cell damage or death, justifying research efforts to develop novel, non-intrusive cell probes. Carbon nanotubes (CNTs) are known for their superior physical properties and tunable chemical structure. They possess a high aspect ratio and offer minimally invasive thin carbon walls and a tubular geometry. Moreover, the possibility of chemically functionalizing CNTs enables multi-functional probes. In this dissertation, novel nanofluidic instruments that have nanostructured carbon tips are presented, along with techniques that utilize the exceptional physical properties of carbon nanotubes to take miniature biomedical instrumentation to the next level. New methods for fabricating the probes were rigorously developed and their operation was extensively studied. The devices were mechanically robust and were used to inject liquids into a single cell, detect electrochemical signals, and enable surface enhanced Raman spectroscopy (SERS) while inducing minimal harm to the cell.
    Particular attention was focused on the CVD process used to deposit carbon, on fluid flow through the nanotubes, and on separation of chemical species (atto-liter chromatography) at the nanometer scale, which could potentially enable the highly sought after "selective component extraction" and analysis from a single cell. These multi-functional devices therefore provide a picture of the physiological state of a living cell and function as endoscopes for single cell analysis.

  10. Closed form unsupervised registration of multi-temporal structure from motion-multiview stereo data using non-linearly weighted image features

    NASA Astrophysics Data System (ADS)

    Seers, T. D.; Hodgetts, D.

    2013-12-01

    Seers, T. D. & Hodgetts, D., School of Earth, Atmospheric and Environmental Sciences, University of Manchester, UK, M13 9PL. The detection of topological change at the Earth's surface is of considerable scholarly interest, allowing the quantification of the rates of geomorphic processes whilst providing lucid insights into the underlying mechanisms driving landscape evolution. In this regard, the past decade has witnessed the ever increasing proliferation of studies employing multi-temporal topographic data within the geosciences, bolstered by continuing technical advancements in the acquisition and processing of prerequisite datasets. Provided by workers within the field of Computer Vision, multiview stereo (MVS) dense surface reconstruction, primed by structure-from-motion (SfM) based camera pose estimation, represents one such development. Providing a cost-effective, operationally efficient data capture medium, the modest requirement of a consumer-grade camera for data collection coupled with the minimal user intervention required during post-processing makes SfM-MVS an attractive alternative to terrestrial laser scanners for collecting multi-temporal topographic datasets. However, as with terrestrial scanner derived data, the co-registration of spatially coincident or partially overlapping scans produced by SfM-MVS presents a major technical challenge, particularly in the case of the semi non-rigid scenes produced during topographic change detection studies. Moreover, the arbitrary scaling resulting from SfM ambiguity requires that a scale matrix be estimated during the transformation, introducing further complexity into its formulation. Here, we present a novel, fully unsupervised algorithm which utilises non-linearly weighted image features for solving the similarity transform (scale, translation, rotation) between partially overlapping scans produced by SfM-MVS image processing.
    With the only initialization condition being partial intersection between input image sets, our method has major advantages over conventional iterative least-squares minimization methods (e.g. Iterative Closest Point variants): it acts only on rigid areas of target scenes, is capable of reliably estimating the scaling factor, and requires no initial estimate of the transformation (i.e. manual rough alignment). Moreover, because the solution is closed form, convergence is considerably more expedient than with most iterative methods. It is hoped that the availability of improved co-registration routines, such as the one presented here, will facilitate the routine collection of multi-temporal topographic datasets by a wider range of geoscience practitioners.
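
    A closed-form least-squares similarity transform between matched point sets can be sketched with the standard SVD-based (Umeyama-style) solution. This is a generic illustration of the class of closed-form estimators the abstract refers to, not the authors' feature-weighted algorithm; it assumes point correspondences are already established.

```python
import numpy as np

def similarity_transform(src, dst):
    """Closed-form least-squares similarity transform: returns scale s,
    rotation R, translation t with dst ≈ s * R @ src_i + t.
    src, dst: (N, 3) arrays of corresponding points."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)            # cross-covariance matrix
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U @ Vt))          # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(S) @ D) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t

# usage: recover a known transform from noiseless correspondences
rng = np.random.default_rng(0)
src = rng.normal(size=(100, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = 2.5 * src @ R_true.T + np.array([1.0, -2.0, 0.5])
s, R, t = similarity_transform(src, dst)
```

    On noiseless correspondences the estimator recovers the scale, rotation, and translation exactly; the reflection guard keeps R a proper rotation for degenerate or noisy data.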

  11. Minimally-aggressive gestational trophoblastic neoplasms.

    PubMed

    Cole, Laurence A

    2012-04-01

    We have previously defined a new syndrome, "minimally aggressive gestational trophoblastic neoplasms", in which choriocarcinoma or persistent hydatidiform mole has a minimal growth rate and becomes chemorefractory. Previously we described a new treatment protocol: waiting until hCG rises to >3000 mIU/ml and the disease becomes more advanced, then using combination chemotherapy. Initially we found this treatment successful in 8 of 8 cases; here we find this protocol appropriate in a further 16 cases. Initially we used hyperglycosylated hCG, a test of limited availability, to identify this syndrome. Here we propose also using the hCG doubling rate to detect this syndrome. Minimally aggressive gestational trophoblastic disease can be detected by chemotherapy resistance or low hyperglycosylated hCG (<40% of total hCG). It can also be identified by the hCG doubling rate, with a doubling time greater than 2 weeks. Nineteen new cases were identified as having minimally aggressive gestational trophoblastic disease by hyperglycosylated hCG and by the hCG doubling test. All were recommended to hold off further chemotherapy until hCG reached >3000 mIU/ml. One case died prior to the start of the study, one case withdrew because of a lung nodule, and one withdrew refusing the suggested combination chemotherapy. The remaining 16 women were all successfully treated. In total, 24 of 24 women (the 8 initial plus the 16 new cases) were successfully treated using the proposed protocol of holding back on chemotherapy until hCG >3000 mIU/ml. Copyright © 2011 Elsevier Inc. All rights reserved.
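
    The doubling-time criterion can be made concrete with a two-point exponential-growth calculation. This is an illustrative sketch with hypothetical serial hCG values, not a clinical tool.

```python
import math

def hcg_doubling_time_days(hcg1, hcg2, days_apart):
    """Doubling time (days) assuming exponential growth between two serial
    hCG measurements in mIU/ml; infinity if hCG is flat or falling."""
    if hcg2 <= hcg1:
        return math.inf
    return days_apart * math.log(2) / math.log(hcg2 / hcg1)

# hypothetical serial values: hCG rising from 400 to 500 mIU/ml over one week
dt = hcg_doubling_time_days(400, 500, 7)
slow_growing = dt > 14   # doubling time beyond 2 weeks flags minimal aggression
```

    A rise from 400 to 500 mIU/ml in a week corresponds to a doubling time of about three weeks, which under the proposed criterion would flag minimally aggressive disease.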

  12. Sideband Algorithm for Automatic Wind Turbine Gearbox Fault Detection and Diagnosis: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zappala, D.; Tavner, P.; Crabtree, C.

    2013-01-01

    Improving the availability of wind turbines (WT) is critical to minimize the cost of wind energy, especially for offshore installations. As gearbox downtime has a significant impact on WT availabilities, the development of reliable and cost-effective gearbox condition monitoring systems (CMS) is of great concern to the wind industry. Timely detection and diagnosis of developing gear defects within a gearbox is an essential part of minimizing unplanned downtime of wind turbines. Monitoring signals from WT gearboxes are highly non-stationary as turbine load and speed vary continuously with time. Time-consuming and costly manual handling of large amounts of monitoring data represents one of the main limitations of most current CMSs, so automated algorithms are required. This paper presents a fault detection algorithm for incorporation into a commercial CMS for automatic gear fault detection and diagnosis. The algorithm allowed the assessment of gear fault severity by tracking progressive tooth gear damage during variable speed and load operating conditions of the test rig. Results show that the proposed technique proves efficient and reliable for detecting gear damage. Once implemented into WT CMSs, this algorithm can automate data interpretation, reducing the quantity of information that WT operators must handle.

  13. A ROC-based feature selection method for computer-aided detection and diagnosis

    NASA Astrophysics Data System (ADS)

    Wang, Songyuan; Zhang, Guopeng; Liao, Qimei; Zhang, Junying; Jiao, Chun; Lu, Hongbing

    2014-03-01

    Image-based computer-aided detection and diagnosis (CAD) has been a very active research topic, aiming to assist physicians in detecting lesions and distinguishing benign from malignant ones. However, the datasets fed into a classifier usually suffer from a small number of samples, as well as significantly fewer samples in one class (with disease) than in the other, resulting in suboptimal classifier performance. Identifying the most characterizing features of the observed data for lesion detection is critical to improving the sensitivity and minimizing the false positives of a CAD system. In this study, we propose a novel feature selection method, mR-FAST, that combines the minimal-redundancy-maximal-relevance (mRMR) framework with the selection metric FAST (feature assessment by sliding thresholds), which is based on the area under a ROC curve (AUC) generated on optimal simple linear discriminants. With three feature datasets extracted from CAD systems for colon polyps and bladder cancer, we show that the space of candidate features selected by mR-FAST is more characterizing for lesion detection, with higher AUC, enabling a compact subset of superior features to be found at low cost.
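
    The sliding-threshold AUC idea behind FAST can be sketched via its equivalent Mann-Whitney formulation: the AUC of a single feature used as a classifier score equals the probability that a randomly chosen positive sample scores higher than a randomly chosen negative one. The toy two-feature dataset below is invented for illustration and is not the paper's CAD data.

```python
import numpy as np

def feature_auc(x, y):
    """AUC of one feature as a classifier score (sliding-threshold /
    Mann-Whitney formulation): P(x_pos > x_neg) + 0.5 * P(tie)."""
    pos, neg = x[y == 1], x[y == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# toy data: feature 0 is informative, feature 1 is pure noise; rank by AUC
rng = np.random.default_rng(7)
y = np.array([0] * 40 + [1] * 40)
X = np.column_stack([np.where(y == 1, 2.0, 0.0) + rng.normal(0, 1, 80),
                     rng.normal(0, 1, 80)])
aucs = [feature_auc(X[:, j], y) for j in range(2)]
```

    The informative feature scores well above 0.5 while the noise feature hovers near it, which is the per-feature ranking signal that FAST feeds into the mRMR-style selection.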

  14. Knee X-ray image analysis method for automated detection of Osteoarthritis

    PubMed Central

    Shamir, Lior; Ling, Shari M.; Scott, William W.; Bos, Angelo; Orlov, Nikita; Macura, Tomasz; Eckley, D. Mark; Ferrucci, Luigi; Goldberg, Ilya G.

    2008-01-01

    We describe a method for automated detection of radiographic Osteoarthritis (OA) in knee X-ray images. The detection is based on the Kellgren-Lawrence classification grades, which correspond to the different stages of OA severity. The classifier was built using manually classified X-rays, representing the first four KL grades (normal, doubtful, minimal and moderate). Image analysis is performed by first identifying a set of image content descriptors and image transforms that are informative for the detection of OA in the X-rays, and assigning weights to these image features using Fisher scores. Then, a simple weighted nearest neighbor rule is used in order to predict the KL grade to which a given test X-ray sample belongs. The dataset used in the experiment contained 350 X-ray images classified manually by their KL grades. Experimental results show that moderate OA (KL grade 3) and minimal OA (KL grade 2) can be differentiated from normal cases with accuracy of 91.5% and 80.4%, respectively. Doubtful OA (KL grade 1) was detected automatically with a much lower accuracy of 57%. The source code developed and used in this study is available for free download at www.openmicroscopy.org. PMID:19342330
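
    The pipeline described above (Fisher-score feature weights followed by a weighted nearest-neighbour rule) can be sketched as follows. The two-feature toy dataset is invented, and this is a schematic of the general technique rather than the paper's exact image-feature setup.

```python
import numpy as np

def fisher_scores(X, y):
    """Per-feature Fisher score: between-class variance of the class means
    over the summed within-class variance."""
    classes = np.unique(y)
    overall = X.mean(0)
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(0) - overall) ** 2
        den += len(Xc) * Xc.var(0)
    return num / (den + 1e-12)

def weighted_nn_predict(X_train, y_train, x, w):
    """1-nearest-neighbour prediction with feature weights w."""
    d = ((X_train - x) ** 2 * w).sum(1)
    return y_train[int(np.argmin(d))]

# toy data: feature 0 separates the classes, feature 1 is noise
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)),
               rng.normal([4.0, 0.0], 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
w = fisher_scores(X, y)                       # informative feature gets a high weight
pred = weighted_nn_predict(X, y, np.array([4.2, 0.1]), w)
```

    The Fisher weights down-weight uninformative features so that the nearest-neighbour distance is dominated by the discriminative ones.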

  15. Application of environmental DNA to detect an endangered marine skate species in the wild.

    PubMed

    Weltz, Kay; Lyle, Jeremy M; Ovenden, Jennifer; Morgan, Jessica A T; Moreno, David A; Semmens, Jayson M

    2017-01-01

    Environmental DNA (eDNA) techniques have only recently been applied in the marine environment to detect the presence of marine species. Species-specific primers and probes were designed to detect the eDNA of the endangered Maugean skate (Zearaja maugeana) from as little as 1 L of water collected at depth (10-15 m) in Macquarie Harbour (MH), Tasmania. The identity of the eDNA was confirmed as Z. maugeana by sequencing the qPCR products and aligning these with the target sequence for a 100% match. This result has validated the use of this eDNA technique for detecting a rare species, Z. maugeana, in the wild. Being able to investigate the presence, and possibly the abundance, of Z. maugeana in MH and Bathurst harbour (BH), would be addressing a conservation imperative for the endangered Z. maugeana. For future application of this technique in the field, the rate of decay was determined for Z. maugeana eDNA under ambient dissolved oxygen (DO) levels (55% saturation) and lower DO (20% saturation) levels, revealing that the eDNA can be detected for 4 and 16 hours respectively, after which eDNA concentration drops below the detection threshold of the assay. With the rate of decay being influenced by starting eDNA concentrations, it is recommended that samples be filtered as soon as possible after collection to minimize further loss of eDNA prior to and during sample processing.
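
    The reported detection windows are consistent with simple first-order decay down to a fixed assay threshold. The sketch below is illustrative only: the half-lives and concentrations are invented, chosen merely to reproduce windows of roughly 4 and 16 hours.

```python
import math

def detection_window_hours(c0, threshold, half_life_h):
    """Hours until eDNA concentration decays below the assay's detection
    threshold, assuming first-order (exponential) decay."""
    if c0 <= threshold:
        return 0.0
    k = math.log(2) / half_life_h          # first-order decay rate constant
    return math.log(c0 / threshold) / k

# hypothetical numbers: same starting concentration, slower decay at low DO
ambient = detection_window_hours(c0=100.0, threshold=1.0, half_life_h=0.6)
low_do = detection_window_hours(c0=100.0, threshold=1.0, half_life_h=2.4)
```

    Because the window scales with log(c0/threshold), a lower starting concentration shortens it, which matches the recommendation to filter samples as soon as possible after collection.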

  17. Minimization In Digital Design As A Meta-Planning Problem

    NASA Astrophysics Data System (ADS)

    Ho, William P. C.; Wu, Jung-Gen

    1987-05-01

    In our model-based expert system for automatic digital system design, we formalize the design process into three sub-processes - compiling high-level behavioral specifications into primitive behavioral operations, grouping primitive operations into behavioral functions, and grouping functions into modules. Consideration of design minimization explicitly controls decision-making in the last two sub-processes. Design minimization, a key task in the automatic design of digital systems, is complicated by the high degree of interaction among the time sequence and content of design decisions. In this paper, we present an AI approach which directly addresses these interactions and their consequences by modeling the minimization problem as a planning problem, and the management of design decision-making as a meta-planning problem.

  18. A Statistical Model for Multilingual Entity Detection and Tracking

    DTIC Science & Technology

    2004-01-01

    tomatic Content Extraction (ACE) evaluation achieved top-tier results in all three evaluation languages. 1 Introduction Detecting entities, whether named...of combining the detected mentions into groups of references to the same object. The work presented here is motivated by the ACE evaluation...Entropy (MaxEnt henceforth) (Berger et al., 1996) and Robust Risk Minimization (RRM henceforth) 1For a description of the ACE program see http

  19. Rapid detection of potyviruses from crude plant extracts.

    PubMed

    Silva, Gonçalo; Oyekanmi, Joshua; Nkere, Chukwuemeka K; Bömer, Moritz; Kumar, P Lava; Seal, Susan E

    2018-04-01

    Potyviruses (genus Potyvirus; family Potyviridae) are widely distributed and represent one of the most economically important genera of plant viruses. Therefore, their accurate detection is a key factor in developing efficient control strategies. However, this can sometimes be problematic particularly in plant species containing high amounts of polysaccharides and polyphenols such as yam (Dioscorea spp.). Here, we report the development of a reliable, rapid and cost-effective detection method for the two most important potyviruses infecting yam based on reverse transcription-recombinase polymerase amplification (RT-RPA). The developed method, named 'Direct RT-RPA', detects each target virus directly from plant leaf extracts prepared with a simple and inexpensive extraction method avoiding laborious extraction of high-quality RNA. Direct RT-RPA enables the detection of virus-positive samples in under 30 min at a single low operation temperature (37 °C) without the need for any expensive instrumentation. The Direct RT-RPA tests constitute robust, accurate, sensitive and quick methods for detection of potyviruses from recalcitrant plant species. The minimal sample preparation requirements and the possibility of storing RPA reagents without cold chain storage, allow Direct RT-RPA to be adopted in minimally equipped laboratories and with potential use in plant clinic laboratories and seed certification facilities worldwide. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Combined modified atmosphere packaging and low temperature storage delay lignification and improve the defense response of minimally processed water bamboo shoot.

    PubMed

    Song, Lili; Chen, Hangjun; Gao, Haiyan; Fang, Xiangjun; Mu, Honglei; Yuan, Ya; Yang, Qian; Jiang, Yueming

    2013-09-04

    Minimally processed water bamboo shoot (WBS) lignifies and deteriorates rapidly at room temperature, which greatly limits its marketability. This study investigated the effect of modified atmosphere packaging (MAP) on the sensory quality index, lignin formation, production of radical oxygen species (ROS) and activities of scavenging enzymes, membrane integrity and energy status of minimally processed WBS when packaged with or without the sealed low-density polyethylene (LDPE) bags, and then stored at 20°C for 9 days or 2°C for 60 days. The sensory quality of minimally processed WBS decreased quickly after 6 days of storage at 20°C. Low temperature storage maintained a higher sensory quality index within the first 30 days, but exhibited higher contents of lignin and hydrogen peroxide (H2O2) as compared with non-MAP shoots at 20°C. Combined MAP and low temperature storage not only maintained good sensory quality after 30 days, but also significantly reduced the increases in lignin content, superoxide anion (O2.-) production rate, H2O2 content and membrane permeability, maintained high activities of superoxide dismutase (SOD), catalase (CAT) and ascorbate peroxidase (APX), and reduced the increase in activities of lipase, phospholipase D (PLD) and lipoxygenase (LOX). Furthermore, the minimally processed WBS under MAP condition exhibited higher energy charge (EC) and lower adenosine monophosphate (AMP) content by the end of storage (60 days) at 2°C than those without MAP or stored for 9 days at 20°C. These results indicated that MAP in combination with low temperature storage reduced lignification of minimally processed WBS, which was closely associated with maintenance of energy status and enhanced activities of antioxidant enzymes, as well as alleviation of membrane damage caused by ROS.
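
The energy charge (EC) reported here is conventionally computed with Atkinson's adenylate energy charge formula. A small sketch using that standard definition with hypothetical adenylate pools (the concentrations are illustrative, not data from the study):

```python
def energy_charge(atp, adp, amp):
    """Atkinson adenylate energy charge: (ATP + 0.5*ADP) / (ATP + ADP + AMP)."""
    return (atp + 0.5 * adp) / (atp + adp + amp)

# Hypothetical adenylate pools (same total pool size in both cases):
print(energy_charge(0.60, 0.25, 0.15))  # 0.725 — high EC, good energy status
print(energy_charge(0.30, 0.30, 0.40))  # 0.45  — AMP accumulation lowers EC
```

The second case illustrates the abstract's observation: higher AMP content at the end of storage corresponds directly to a lower energy charge.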

  2. Social-aware Event Handling within the FallRisk Project.

    PubMed

    De Backere, Femke; Van den Bergh, Jan; Coppers, Sven; Elprama, Shirley; Nelis, Jelle; Verstichel, Stijn; Jacobs, An; Coninx, Karin; Ongenae, Femke; De Turck, Filip

    2017-01-09

    With the rise of the Internet of Things, wearables and smartphones are moving to the foreground. Ambient Assisted Living solutions are, for example, created to facilitate ageing in place; fall detection systems are one example of such solutions. Currently, there exists a wide variety of fall detection systems using different methodologies and technologies. However, these systems often do not take into account the fall handling process, which starts after a fall is identified, or this process only consists of sending a notification. The FallRisk system delivers an accurate analysis of incidents occurring in the home of the older adult using several sensors and smart devices. Moreover, the input from these devices can be used to create a social-aware event handling process, which leads to assisting the older adult as soon as possible and in the best possible way. The FallRisk system consists of several components, located in different places. When an incident is identified by the FallRisk system, the event handling process will be followed to assess the fall incident and select the most appropriate caregiver, based on input from the caregivers' smartphones. In this process, availability and location are automatically taken into account. The event handling process was evaluated during a decision tree workshop to verify whether current-day practices reflect the requirements of all the stakeholders. Other knowledge uncovered during this workshop can be taken into account to further improve the process. The FallRisk system offers a way to detect fall incidents in an accurate way and uses context information to assign the incident to the most appropriate caregiver. This way, the consequences of the fall are minimized and help is on location as fast as possible. It could be concluded that the current guidelines on fall handling reflect the needs of the stakeholders. However, current technology evolutions, such as the uptake of wearables and smartphones, enable the improvement of these guidelines, such as the automatic ordering of the caregivers based on their location and availability.

  3. Development of resistance of mutans streptococci and Porphyromonas gingivalis to chlorhexidine digluconate and amine fluoride/stannous fluoride-containing mouthrinses, in vitro.

    PubMed

    Kulik, Eva M; Waltimo, Tuomas; Weiger, Roland; Schweizer, Irene; Lenkeit, Krystyna; Filipuzzi-Jenny, Elisabeth; Walter, Clemens

    2015-07-01

    The aim of this study was to determine the minimal inhibitory concentrations of chlorhexidine digluconate and an amine fluoride/stannous fluoride-containing mouthrinse against Porphyromonas gingivalis and mutans streptococci during an experimental long-term subinhibitory exposure. Five P. gingivalis strains and four mutans streptococci were subcultivated for 20-30 passages in subinhibitory concentrations of chlorhexidine digluconate or an amine fluoride/stannous fluoride-containing mouthrinse. Pre-passaging minimal inhibitory concentrations for chlorhexidine ranged from 0.5 to 2 mg/l for mutans streptococci and from 2 to 4 mg/l for the P. gingivalis isolates. For the amine fluoride/stannous fluoride-containing mouthrinse minimal inhibitory values from 0.125 to 0.25% for the mutans streptococci and from 0.063 to 0.125% for the P. gingivalis isolates were determined. Two- to fourfold increased minimal inhibitory concentrations against chlorhexidine were detected for two of the five P. gingivalis isolates, whereas no increase in minimal inhibitory concentrations was found for the mutans streptococci after repeated passaging through subinhibitory concentrations. Repeated exposure to subinhibitory concentrations of the amine fluoride/stannous fluoride-containing mouthrinse did not alter the minimal inhibitory concentrations of the bacterial isolates tested. Chlorhexidine and the amine fluoride/stannous fluoride-containing mouthrinse are effective inhibitory agents against the oral bacterial isolates tested. No general development of resistance against chlorhexidine or the amine fluoride/stannous fluoride-containing mouthrinse was detected. However, some strains showed potential to develop resistance against chlorhexidine after prolonged exposure. The use of chlorhexidine should be limited to short periods of time. The amine fluoride/stannous fluoride-containing mouthrinse appears to have the potential to be used on a long-term basis.
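
Resistance development in passaging studies like this one is typically judged by the fold change in MIC between the first and last passage. A small sketch with hypothetical isolates and MIC values (the names and numbers are illustrative only):

```python
def mic_fold_change(mic_before, mic_after):
    """Fold change in minimal inhibitory concentration across passaging."""
    return mic_after / mic_before

# Hypothetical chlorhexidine MICs (mg/l) before and after 20-30 passages.
isolates = {
    "P. gingivalis A": (2, 8),   # 4x increase: possible adaptation
    "P. gingivalis B": (4, 4),   # unchanged
    "mutans strep. C": (1, 1),   # unchanged
}
for name, (pre, post) in isolates.items():
    fold = mic_fold_change(pre, post)
    flag = "possible adaptation" if fold >= 2 else "no change"
    print(f"{name}: {fold:g}x ({flag})")
```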

  4. Specialized minimal PDFs for optimized LHC calculations.

    PubMed

    Carrazza, Stefano; Forte, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-01-01

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically as regards their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top-quark pair, and electroweak gauge boson physics, and we determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4, and 11 Hessian eigenvectors, respectively, are enough to fully describe the corresponding processes.
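
The idea of describing many Hessian error sets with a few directions can be illustrated with a generic principal-component reduction. This is only a toy sketch of the underlying linear algebra, not the actual SM-PDF construction; the matrix and threshold are assumptions:

```python
import numpy as np

# Hypothetical matrix X: rows = Hessian error members, columns = predictions
# for a set of process observables (as deviations from the central value).
rng = np.random.default_rng(0)
true_rank = 4
X = rng.normal(size=(100, true_rank)) @ rng.normal(size=(true_rank, 30))

# Keep the fewest principal directions explaining 99.9% of the variance.
_, s, _ = np.linalg.svd(X, full_matrices=False)
frac = np.cumsum(s**2) / np.sum(s**2)
n_keep = int(np.searchsorted(frac, 0.999)) + 1
print(n_keep)  # small: X has rank 4, so at most 4 directions are needed
```

The spirit is the same as in the abstract: when a class of processes only probes a low-dimensional subspace of the prior, a handful of eigenvectors suffices.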

  5. Spin coating apparatus

    DOEpatents

    Torczynski, John R.

    2000-01-01

    A spin coating apparatus requires less cleanroom air flow than prior spin coating apparatus to minimize cleanroom contamination. A shaped exhaust duct from the spin coater maintains process quality while requiring reduced cleanroom air flow. The exhaust duct can decrease in cross section as it extends from the wafer, minimizing eddy formation. The exhaust duct can conform to entrainment streamlines to minimize eddy formation and reduce interprocess contamination at minimal cleanroom air flow rates.

  6. Improving the performance of minimizers and winnowing schemes

    PubMed Central

    Marçais, Guillaume; Pellow, David; Bork, Daniel; Orenstein, Yaron; Shamir, Ron; Kingsford, Carl

    2017-01-01

    Motivation: The minimizers scheme is a method for selecting k-mers from sequences. It is used in many bioinformatics software tools to bin comparable sequences or to sample a sequence in a deterministic fashion at approximately regular intervals, in order to reduce memory consumption and processing time. Although very useful, the minimizers selection procedure has undesirable behaviors (e.g. too many k-mers are selected when processing certain sequences). Some of these problems were already known to the authors of the minimizers technique, and the natural lexicographic ordering of k-mers used by minimizers was recognized as their origin. Many software tools using minimizers employ ad hoc variations of the lexicographic order to alleviate those issues. Results: We provide an in-depth analysis of the effect of k-mer ordering on the performance of the minimizers technique. By using small universal hitting sets (a recently defined concept), we show how to significantly improve the performance of minimizers and avoid some of its worst behaviors. Based on these results, we encourage bioinformatics software developers to use an ordering based on a universal hitting set or, if not possible, a randomized ordering, rather than the lexicographic order. This analysis also settles, in the negative, a conjecture (by Schleimer et al.) on the expected density of minimizers in a random sequence. Availability and Implementation: The software used for this analysis is available on GitHub: https://github.com/gmarcais/minimizers.git. Contact: gmarcais@cs.cmu.edu or carlk@cs.cmu.edu PMID:28881970
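
A minimal sketch of (w,k)-minimizer selection under a pluggable k-mer ordering. The hash-based ordering below is one illustrative way to get a fixed pseudo-random order; it stands in for, but is not, the paper's universal-hitting-set construction:

```python
import hashlib

def minimizers(seq, k, w, order=lambda kmer: kmer):
    """Positions of the minimal k-mer (under `order`, leftmost on ties)
    in each window of w consecutive k-mers."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    picked = set()
    for i in range(len(kmers) - w + 1):
        window = kmers[i:i + w]
        j = min(range(w), key=lambda t: (order(window[t]), t))
        picked.add(i + j)
    return picked  # density = len(picked) / number of k-mers

def random_order(kmer):
    """A fixed pseudo-random ordering: compare k-mers by their hash digest."""
    return hashlib.sha1(kmer.encode()).digest()

seq = "ACGTACGTTTTTTTTACGT"
lex = minimizers(seq, k=3, w=4)                      # lexicographic order
rnd = minimizers(seq, k=3, w=4, order=random_order)  # randomized order
print(sorted(lex), sorted(rnd))
```

Swapping the `order` function is exactly the degree of freedom the paper analyzes: the selected positions (and hence the density) change with the ordering, while the windowing guarantee is preserved.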

  7. Minimizing data transfer with sustained performance in wireless brain-machine interfaces

    NASA Astrophysics Data System (ADS)

    Thor Thorbergsson, Palmi; Garwicz, Martin; Schouenborg, Jens; Johansson, Anders J.

    2012-06-01

    Brain-machine interfaces (BMIs) may be used to investigate neural mechanisms or to treat the symptoms of neurological disease and are hence powerful tools in research and clinical practice. Wireless BMIs add flexibility to both types of applications by reducing movement restrictions and risks associated with transcutaneous leads. However, since wireless implementations are typically limited in terms of transmission capacity and energy resources, the major challenge faced by their designers is to combine high performance with adaptations to limited resources. Here, we have identified three key steps in dealing with this challenge: (1) the purpose of the BMI should be clearly specified with regard to the type of information to be processed; (2) the amount of raw input data needed to fulfill the purpose should be determined, in order to avoid over- or under-dimensioning of the design; and (3) processing tasks should be allocated among the system parts such that all of them are utilized optimally with respect to computational power, wireless link capacity and raw input data requirements. We have focused on step (2) under the assumption that the purpose of the BMI (step 1) is to assess single- or multi-unit neuronal activity in the central nervous system with single-channel extracellular recordings. The reliability of this assessment depends on performance in detection and sorting of spikes. We have therefore performed absolute threshold spike detection and spike sorting with the principal component analysis and fuzzy c-means on a set of synthetic extracellular recordings, while varying the sampling rate and resolution, noise level and number of target units, and used the known ground truth to quantitatively estimate the performance. From the calculated performance curves, we have identified the sampling rate and resolution breakpoints, beyond which performance is not expected to increase by more than 1-5%. 
We have then estimated the performance of alternative algorithms for spike detection and spike sorting in order to examine the generalizability of our results to other algorithms. Our findings indicate that the minimization of recording noise is the primary factor to consider in the design process. In most cases, there are breakpoints for sampling rates and resolution that provide guidelines for BMI designers in terms of the minimum amount of raw input data that guarantees sustained performance. Such guidelines are essential during system dimensioning. Based on these findings we conclude by presenting a quantitative task-allocation scheme that can be followed to achieve optimal utilization of available resources.
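
Absolute-threshold spike detection of the kind evaluated here can be sketched on a synthetic recording. The sampling rate, spike shape, and 5-sigma threshold below are illustrative choices, not the study's exact settings; the robust noise estimate via the median absolute deviation is a common convention:

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 24000                            # Hz, assumed sampling rate
x = rng.normal(0.0, 1.0, fs)          # 1 s of unit-variance background noise
for t in (2000, 9000, 17000):         # inject three crude ~1 ms spikes
    x[t:t + 24] += 8.0 * np.hanning(24)

# Absolute threshold at 5x the noise SD, estimated robustly from the MAD.
sigma = np.median(np.abs(x)) / 0.6745
above = np.flatnonzero(np.abs(x) > 5.0 * sigma)
# Collapse contiguous threshold crossings into events separated by > 1 ms.
events = [above[0]] + [b for a, b in zip(above, above[1:]) if b - a > 24]
print(len(events))  # count of detected spike events
```

Lowering the sampling rate or raising the noise floor degrades exactly this detection step, which is why the abstract identifies noise minimization as the primary design factor.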

  9. Electrophysiological evidence for parallel and serial processing during visual search.

    PubMed

    Luck, S J; Hillyard, S A

    1990-12-01

    Event-related potentials were recorded from young adults during a visual search task in order to evaluate parallel and serial models of visual processing in the context of Treisman's feature integration theory. Parallel and serial search strategies were produced by the use of feature-present and feature-absent targets, respectively. In the feature-absent condition, the slopes of the functions relating reaction time and latency of the P3 component to set size were essentially identical, indicating that the longer reaction times observed for larger set sizes can be accounted for solely by changes in stimulus identification and classification time, rather than changes in post-perceptual processing stages. In addition, the amplitude of the P3 wave on target-present trials in this condition increased with set size and was greater when the preceding trial contained a target, whereas P3 activity was minimal on target-absent trials. These effects are consistent with the serial self-terminating search model and appear to contradict parallel processing accounts of attention-demanding visual search performance, at least for a subset of search paradigms. Differences in ERP scalp distributions further suggested that different physiological processes are utilized for the detection of feature presence and absence.

  10. Reducing uncertainty in wind turbine blade health inspection with image processing techniques

    NASA Astrophysics Data System (ADS)

    Zhang, Huiyi

    Structural health inspection has been widely applied in the operation of wind farms to find early cracks in wind turbine blades (WTBs). Increased numbers of turbines and expanded rotor diameters are driving up the workloads and safety risks for site employees. Therefore, it is important to automate the inspection process as well as minimize the uncertainties involved in routine blade health inspection. In addition, crack documentation and trending is vital to assess rotor blade and turbine reliability in the 20 year designed life span. A new crack recognition and classification algorithm is described that can support automated structural health inspection of the surface of large composite WTBs. The first part of the study investigated the feasibility of digital image processing in WTB health inspection and defined the capability of numerically detecting cracks as small as hairline thickness. The second part of the study identified and analyzed the uncertainty of the digital image processing method. A self-learning algorithm was proposed to recognize and classify cracks without comparing a blade image to a library of crack images. The last part of the research quantified the uncertainty in the field conditions and the image processing methods.

  11. Evidence accumulation detected in BOLD signal using slow perceptual decision making.

    PubMed

    Krueger, Paul M; van Vugt, Marieke K; Simen, Patrick; Nystrom, Leigh; Holmes, Philip; Cohen, Jonathan D

    2017-04-01

    We assessed whether evidence accumulation could be observed in the BOLD signal during perceptual decision making. This presents a challenge since the hemodynamic response is slow, while perceptual decisions are typically fast. Guided by theoretical predictions of the drift diffusion model, we slowed down decisions by penalizing participants for incorrect responses. Second, we distinguished BOLD activity related to stimulus detection (modeled using a boxcar) from activity related to integration (modeled using a ramp) by minimizing the collinearity of GLM regressors. This was achieved by dissecting a boxcar into its two most orthogonal components: an "up-ramp" and a "down-ramp." Third, we used a control condition in which stimuli and responses were similar to the experimental condition, but that did not engage evidence accumulation of the stimuli. The results revealed an absence of areas in parietal cortex that have been proposed to drive perceptual decision making but have recently come into question; and newly identified regions that are candidates for involvement in evidence accumulation. Previous fMRI studies have either used fast perceptual decision making, which precludes the measurement of evidence accumulation, or slowed down responses by gradually revealing stimuli. The latter approach confounds perceptual detection with evidence accumulation because accumulation is constrained by perceptual input. We slowed down the decision making process itself while leaving perceptual information intact. This provided a more sensitive and selective observation of brain regions associated with the evidence accumulation processes underlying perceptual decision making than previous methods. Copyright © 2017 Elsevier B.V. All rights reserved.
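
The boxcar dissection described here can be checked numerically: an up-ramp and a down-ramp sum to a flat boxcar, and the two ramps are substantially less collinear (as regressors) than a boxcar paired with a ramp. The 20-sample vectors below are an illustrative stand-in for HRF-scale regressors:

```python
import numpy as np

n = 20
up = np.linspace(0.0, 1.0, n)      # "up-ramp" regressor
down = up[::-1]                    # "down-ramp" regressor (its mirror)
boxcar = up + down                 # the two ramps sum to a flat boxcar

def cosine(a, b):
    """Uncentered cosine similarity, a simple collinearity measure."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

print(np.allclose(boxcar, boxcar[0]))   # True: the sum is constant
print(round(cosine(up, down), 2))       # 0.46 (ramp vs ramp)
print(round(cosine(boxcar, up), 2))     # 0.85 (boxcar vs ramp)
```

The lower ramp-vs-ramp collinearity is what lets a GLM separate sustained stimulus-detection activity from ramping integration activity.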

  12. Optics based signal processing methods for intraoperative blood vessel detection and quantification in real time (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Chaturvedi, Amal; Shukair, Shetha A.; Le Rolland, Paul; Vijayvergia, Mayank; Subramanian, Hariharan; Gunn, Jonathan W.

    2016-03-01

    Minimally invasive operations require surgeons to make difficult cuts to blood vessels and other tissues with impaired tactile and visual feedback. This leads to inadvertent cuts to blood vessels hidden beneath tissue, causing serious health risks to patients and a non-reimbursable financial burden to hospitals. Intraoperative imaging technologies have been developed, but these expensive systems can be cumbersome and provide only a high-level view of blood vessel networks. In this research, we propose a lean reflectance-based system, comprised of a dual wavelength LED, photodiode, and novel signal processing algorithms for rapid vessel characterization. Since this system takes advantage of the inherent pulsatile light absorption characteristics of blood vessels, no contrast agent is required for its ability to detect the presence of a blood vessel buried deep inside any tissue type (up to a cm) in real time. Once a vessel is detected, the system is able to estimate the distance of the vessel from the probe and the diameter size of the vessel (with a resolution of ~2mm), as well as delineate the type of tissue surrounding the vessel. The system is low-cost, functions in real-time, and could be mounted on already existing surgical tools, such as Kittner dissectors or laparoscopic suction irrigation cannulae. Having been successfully validated ex vivo, this technology will next be tested in a live porcine study and eventually in clinical trials.

  13. Analysis of residual products in benzyl chloride used for the industrial synthesis of quaternary compounds by liquid chromatography with diode-array detection.

    PubMed

    Prieto-Blanco, M C; López-Mahía, P; Prada-Rodríguez, D

    2009-02-01

    In industrial and pharmaceutical processes, the study of residual products becomes essential to guarantee the quality of compounds and to eliminate or minimize toxic residual products. Knowledge about the origin of impurities (raw materials, processes, the contamination of industrial plants, etc.) is necessary in preventive treatment and in the control of a product's lifecycle. Benzyl chloride is used as raw material to synthesize several quaternary ammonium compounds, such as benzalkonium chloride, which may have pharmaceutical applications. Benzaldehyde, benzyl alcohol, toluene, chloro derivatives of toluene, and dibenzyl ether are compounds that may be found as impurities in technical benzyl chloride. We proposed a high-performance liquid chromatography method for the separation of these compounds, testing two stationary phases with different dimensions and particle sizes, with the application of photodiode-array detection. The linearity for four possible impurities (benzaldehyde, toluene, alpha,alpha-dichlorotoluene, and 2-chlorotoluene) ranged from 0.1 to 10 microg/mL, limits of detection from 11 to 34 ng/mL, and repeatability from 1% to 2.9% for a 0.3-1.2 microg/mL concentration range. The method was applied to samples of technical benzyl chloride, and alpha,alpha-dichlorotoluene and benzaldehyde were identified by spectral analysis and quantitated. The selection of benzyl chloride with lower levels of impurities is important to guarantee the reduction of residual products in further syntheses.
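
Detection limits like those reported are commonly estimated from the calibration line as 3.3 times the residual standard deviation divided by the slope (the ICH-style convention). A sketch with entirely hypothetical calibration data, not the paper's measurements:

```python
import numpy as np

# Hypothetical calibration: concentration (ug/mL) vs. peak area.
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])
area = np.array([1.1, 5.2, 10.3, 19.8, 50.6, 99.5])

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
sigma = resid.std(ddof=2)            # residual SD of the regression

lod = 3.3 * sigma / slope            # ICH-style limit of detection
loq = 10.0 * sigma / slope           # limit of quantitation
print(f"LOD ~ {lod:.3f} ug/mL, LOQ ~ {loq:.3f} ug/mL")
```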

  14. Rapid detection of technological disasters by using a RST-based processing chain

    NASA Astrophysics Data System (ADS)

    Filizzola, Carolina; Corrado, Rosita; Mazzeo, Giuseppe; Marchese, Francesco; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio

    2010-05-01

    Natural disasters may be responsible for technological disasters which may cause injuries to citizens and damage to relevant infrastructures. When it is not possible to prevent or foresee such disasters, the aim is at least to detect the accident rapidly in order to intervene as soon as possible and minimize damage. In this context, the combination of a Robust Satellite Technique (RST), able to identify actual accidents (i.e., with no false alarms), and satellite sensors with high temporal resolution seems to assure both a reliable and a timely detection of abrupt Thermal Infrared (TIR) transients related to dangerous explosions. A processing chain based on the RST approach has been developed by the DIFA-UNIBAS team in the framework of the G-MOSAIC project, suitable for automatically identifying harmful events on MSG-SEVIRI images. Maps of thermal anomalies are generated every 15 minutes (i.e. SEVIRI temporal repetition rate) over a selected area together with kml files (containing information on latitude and longitude of "thermally" anomalous SEVIRI pixel centre, time of image acquisition, relative intensity of anomalies, etc.) for a rapid visualization of the accident position even on google earth. Results achieved in the case of the event that occurred in Russia on 10th May 2009 will be presented: a gas pipeline exploded, causing injuries to citizens and huge damage to a Physicochemical Scientific Research Institute which is, according to official data, an organisation running especially dangerous production and facilities.
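
RST-style approaches flag a pixel as anomalous when its current TIR signal deviates strongly from its own multi-temporal history, measured in units of the historical standard deviation. The sketch below is a simplified index in that spirit, with made-up brightness temperatures; it is not the project's operational processing chain:

```python
import numpy as np

def anomaly_index(current, history):
    """Standardized deviation of the current TIR value from the pixel's
    multi-temporal history at the same time slot (simplified RST-style index)."""
    mu, sd = history.mean(), history.std(ddof=1)
    return (current - mu) / sd

rng = np.random.default_rng(42)
history = rng.normal(290.0, 2.0, 120)      # hypothetical brightness temps (K)
print(anomaly_index(291.0, history) > 3)   # ordinary scene: not flagged
print(anomaly_index(305.0, history) > 3)   # explosion-like TIR spike: flagged
```

Because the reference statistics come from the pixel's own history, ordinary scene-to-scene variability stays below the threshold, which is how the technique keeps the false-alarm rate low.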

  15. Fuel Combustion Laboratory | Transportation Research | NREL

    Science.gov Websites

    detection of compounds at sub-parts-per-billion-by-volume levels. A high-performance liquid chromatograph platform; a high-pressure (1,200-bar) direct-injection system to minimize spray physics effects; and a combustion chamber. A high-speed pressure transducer measures chamber pressure to detect fuel ignition. Air

  16. SERS detection of indirect viral DNA capture using colloidal gold and methylene blue as a Raman label

    USDA-ARS?s Scientific Manuscript database

    An indirect capture model assay using colloidal Au nanoparticles is demonstrated for surface enhanced Raman scattering (SERS) spectroscopy detection of DNA. The sequence targeted for capture is derived from the West Nile Virus (WNV) RNA genome and was selected on the basis of exhibiting minimal seco...

  17. Optimized small molecule antibody labeling efficiency through continuous flow centrifugal diafiltration.

    PubMed

    Cappione, Amedeo; Mabuchi, Masaharu; Briggs, David; Nadler, Timothy

    2015-04-01

    Protein immuno-detection encompasses a broad range of analytical methodologies, including western blotting, flow cytometry, and microscope-based applications. These assays, which detect, quantify, and/or localize expression of one or more proteins in complex biological samples, rely upon fluorescent or enzyme-tagged target-specific antibodies. While small-molecule labeling kits are available with a range of detection moieties, the workflow is hampered by a requirement for multiple dialysis-based buffer exchange steps that are both time-consuming and subject to sample loss. In a previous study, we briefly described an alternative method for small-scale protein labeling with small-molecule dyes whereby all phases of the conjugation workflow could be performed in a single centrifugal diafiltration device. Here, we expand on this foundational work, addressing functionality of the device at each step in the workflow (sample cleanup, labeling, unbound dye removal, and buffer exchange/concentration) and the implications for optimizing labeling efficiency. When compared to other common buffer exchange methodologies, centrifugal diafiltration offered superior performance as measured by four key parameters (process time, desalting capacity, protein recovery, and retention of functional integrity). Originally designed for resin-based affinity purification, the device also provides a platform for up-front antibody purification or albumin carrier removal. Most significantly, by exploiting the rapid kinetics of NHS-based labeling reactions, continuous diafiltration minimizes reaction time and prolonged exposure to excess dye, ensuring maximal target labeling while limiting the risks associated with over-labeling. Overall, the device offers a simplified workflow with reduced processing time and hands-on requirements, without sacrificing labeling efficiency, final yield, or conjugate performance.
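The unbound-dye removal step above follows the standard constant-volume diafiltration relationship: for a freely permeating solute, the residual fraction after N diavolumes of wash buffer is exp(-N·s), where s is the sieving coefficient (≈1 for a small dye). This textbook formula, not a device-specific spec, explains why only a handful of diavolumes suffice; the function names are illustrative.

```python
import math

# Constant-volume diafiltration washout of a freely permeating solute
# (e.g. unbound NHS-ester dye). Residual fraction after N diavolumes,
# with sieving coefficient s, is exp(-N * s). Illustrative sketch only.

def residual_fraction(diavolumes, sieving=1.0):
    """Fraction of the unbound solute remaining in the retentate."""
    return math.exp(-diavolumes * sieving)

def diavolumes_needed(target_residual, sieving=1.0):
    """Diavolumes of wash buffer needed to reach a target residual fraction."""
    return -math.log(target_residual) / sieving

# About 7 diavolumes remove 99.9% of unbound dye when sieving is ~1:
n = diavolumes_needed(0.001)  # ln(1000) ~ 6.9 diavolumes
```

This is why continuous diafiltration can clear excess dye quickly: each diavolume removes a constant fraction, so the residual decays exponentially with wash volume.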

  18. Performance bounds for matched field processing in subsurface object detection applications

    NASA Astrophysics Data System (ADS)

    Sahin, Adnan; Miller, Eric L.

    1998-09-01

    In recent years there has been considerable interest in the use of ground penetrating radar (GPR) for the non-invasive detection and localization of buried objects. In previous work, we considered the use of high-resolution array processing methods for solving these problems for measurement geometries in which an array of electromagnetic receivers observes the fields scattered by subsurface targets in response to plane-wave illumination. Our approach uses the MUSIC algorithm in a matched field processing (MFP) scheme to determine both the range and the bearing of the objects. In this paper we derive the Cramér-Rao bounds (CRB) for this MUSIC-based approach analytically. Analysis of the theoretical CRB shows that there exists an optimum inter-element spacing of array elements for which the CRB is minimum. Furthermore, this optimum spacing is smaller than the conventional half-wavelength criterion. The theoretical bounds are then verified for two estimators using Monte Carlo simulations. The first estimator is the MUSIC-based MFP and the second is the maximum-likelihood-based MFP; the two differ in the cost functions they optimize. We observe that the Monte Carlo simulated error variances always lie above the values established by the CRB. Finally, we evaluate the performance of our MUSIC-based algorithm in the presence of model mismatch. Since the detection algorithm depends strongly on the model used, we tested its performance when the object radius used in the model differs from the true radius. This analysis reveals that the algorithm is still capable of localizing the objects, with a bias depending on the degree of mismatch.
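The core MUSIC step used in this MFP scheme can be illustrated with a stripped-down, bearing-only example: estimate the signal subspace from array snapshots, then scan a steering-vector grid for directions nearly orthogonal to the noise subspace. This is a minimal sketch under simplifying assumptions (uniform linear array, one noiseless source, free-space steering vectors); the paper's actual method replaces the steering vector with a full subsurface propagation model to recover range as well as bearing.

```python
import cmath, math

# Minimal 1-D MUSIC sketch for a uniform linear array (illustrative
# assumptions: M sensors, half-wavelength spacing, one noiseless source).
M = 4               # number of sensors
d = 0.5             # inter-element spacing in wavelengths
true_theta = 20.0   # source bearing, degrees

def steering(theta_deg):
    """Free-space steering vector a(theta) for the ULA."""
    s = math.sin(math.radians(theta_deg))
    return [cmath.exp(2j * math.pi * d * m * s) for m in range(M)]

def hdot(u, v):
    """Hermitian inner product u^H v."""
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

# Noiseless snapshots from a single source (rank-1 covariance).
a0 = steering(true_theta)
snapshots = [[amp * ai for ai in a0] for amp in (1 + 1j, -2 + 0.5j, 0.3 - 1j)]

# Power iteration for the dominant (signal) eigenvector of R = E[x x^H].
v = [1.0 + 0j] * M
for _ in range(20):
    v = [sum(x[m] * hdot(x, v) for x in snapshots) for m in range(M)]
    norm = math.sqrt(abs(hdot(v, v)))
    v = [vi / norm for vi in v]

def music_power(theta_deg):
    # 1 / ||(I - v v^H) a(theta)||^2: peaks where a(theta) lies in the
    # signal subspace (i.e. at the source bearing).
    a = steering(theta_deg)
    denom = abs(hdot(a, a)) - abs(hdot(v, a)) ** 2
    return 1.0 / max(denom, 1e-12)

grid = [t / 10.0 for t in range(-900, 901)]
estimate = max(grid, key=music_power)  # close to true_theta
```

In the MFP version, `steering` would be replaced by the modeled field at the receivers for a candidate (range, bearing), which is exactly where the model-mismatch sensitivity studied in the paper enters.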

  19. Automated wavelet denoising of photoacoustic signals for circulating melanoma cell detection and burn image reconstruction.

    PubMed

    Holan, Scott H; Viator, John A

    2008-06-21

    Photoacoustic image reconstruction may involve hundreds of point measurements, each of which contributes unique information about the subsurface absorbing structures under study. For backprojection imaging, two or more point measurements of photoacoustic waves, induced by irradiating a biological sample with laser light, are used to produce an image of the acoustic source. Each of these measurements must undergo some signal processing, such as denoising or system deconvolution. In order to process the numerous signals, we have developed an automated wavelet algorithm for denoising them. We apply the discrete wavelet transform to denoise photoacoustic signals generated in a dilute melanoma cell suspension and in thermally coagulated blood. We used 5, 9, 45 and 270 melanoma cells in the laser beam path as test concentrations. For the burn phantom, we used coagulated blood in a 1.6 mm silicone tube submerged in Intralipid. Although these two targets were chosen as typical applications for photoacoustic detection and imaging, they are of independent interest. The denoising employs level-independent universal thresholding. In order to accommodate non-radix-2 signals, we used a maximal overlap discrete wavelet transform (MODWT). For the lower melanoma cell concentrations, as the signal-to-noise ratio approached 1, denoising allowed better peak finding. For coagulated blood, the signals were denoised to yield a clean photoacoustic signal, resulting in an improvement of 22% in the reconstructed image. The entire signal processing technique was automated so that minimal user intervention was needed to reconstruct the images. Such an algorithm may be used for image reconstruction and signal extraction in applications such as burn depth imaging, depth profiling of vascular lesions in skin, and the detection of single cancer cells in blood samples.
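The level-independent universal thresholding described above can be sketched compactly: estimate the noise scale from the detail coefficients, set the threshold lambda = sigma * sqrt(2 ln N), soft-threshold the details, and invert the transform. The sketch below uses a single-level Haar DWT purely to stay self-contained; the paper uses a MODWT (which, unlike the plain DWT, handles non-radix-2 lengths), so treat this as an assumed simplification, not the authors' implementation.

```python
import math, statistics

# Universal soft-thresholding on a single-level Haar DWT (illustrative;
# the paper's method uses a maximal-overlap DWT instead of plain Haar).

def haar_dwt(x):
    """One-level Haar transform of an even-length signal."""
    s = 1 / math.sqrt(2)
    approx = [(a + b) * s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) * s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse one-level Haar transform."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) * s, (a - d) * s]
    return out

def soft(c, lam):
    """Soft-threshold a coefficient: shrink toward zero by lam."""
    return math.copysign(max(abs(c) - lam, 0.0), c)

def denoise(x):
    approx, detail = haar_dwt(x)
    # Noise scale from the median absolute deviation of the details,
    # then the universal threshold lambda = sigma * sqrt(2 ln N).
    sigma = statistics.median(abs(d) for d in detail) / 0.6745
    lam = sigma * math.sqrt(2 * math.log(len(x)))
    return haar_idwt(approx, [soft(d, lam) for d in detail])
```

Because both the threshold and the noise estimate come from the data itself, the procedure runs with no per-signal tuning, which is what makes the fully automated batch processing of hundreds of point measurements practical.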

  20. Scanning electron microscopy coupled with energy-dispersive X-ray spectrometry for quick detection of sulfur-oxidizing bacteria in environmental water samples

    NASA Astrophysics Data System (ADS)

    Sun, Chengjun; Jiang, Fenghua; Gao, Wei; Li, Xiaoyun; Yu, Yanzhen; Yin, Xiaofei; Wang, Yong; Ding, Haibing

    2017-01-01

    Detection of sulfur-oxidizing bacteria has largely been dependent on targeted gene sequencing technology or traditional cell cultivation, which usually takes from days to months to carry out. This clearly does not meet the requirements of analysis for time-sensitive samples and/or complicated environmental samples. Since energy-dispersive X-ray spectrometry (EDS) can be used to simultaneously detect multiple elements in a sample, including sulfur, with minimal sample treatment, this technology was applied to detect sulfur-oxidizing bacteria using their high sulfur content within the cell. This article describes the application of scanning electron microscopy imaging coupled with EDS mapping for quick detection of sulfur oxidizers in contaminated environmental water samples, with minimal sample handling. Scanning electron microscopy imaging revealed the existence of dense granules within the bacterial cells, while EDS identified large amounts of sulfur within them. EDS mapping localized the sulfur to these granules. Subsequent 16S rRNA gene sequencing showed that the bacteria detected in our samples belonged to the genus Chromatium, which are sulfur oxidizers. Thus, EDS mapping made it possible to identify sulfur oxidizers in environmental samples based on localized sulfur within their cells, within a short time (within 24 h of sampling). This technique has wide ranging applications for detection of sulfur bacteria in environmental water samples.
