Sample records for tedious time consuming

  1. Templates in Action

    ERIC Educational Resources Information Center

    Serow, Penelope; Inglis, Michaela

    2010-01-01

    Circle Geometry, a senior mathematics topic, is often regarded as time-consuming, and its associated relational concepts as difficult for students to grasp. Units of work that introduce students to circle geometry theorems are frequently described as a string of tedious constructions. This article explores teacher-designed dynamic geometry software…

  2. Graphing Polar Curves

    ERIC Educational Resources Information Center

    Lawes, Jonathan F.

    2013-01-01

    Graphing polar curves typically involves a combination of three traditional techniques, all of which can be time-consuming and tedious. However, an alternative method--graphing the polar function on a rectangular plane--simplifies graphing, increases student understanding of the polar coordinate system, and reinforces graphing techniques learned…

  3. SenseCube--A Novel Inexpensive Wireless Multisensor for Physics Lab Experimentations

    ERIC Educational Resources Information Center

    Mehta, Vedant; Lane, Charles D.

    2018-01-01

    SenseCube is a multisensor capable of measuring many different real-time events and changes in environment. Most conventional sensors used in introductory-physics labs use their own software and have wires that must be attached to a computer or an alternate device to analyze the data. This makes the standard sensors time consuming, tedious, and…

  3. Detection of anti-Salmonella FlgK antibodies in chickens by automated capillary immunoassay

    USDA-ARS's Scientific Manuscript database

    Western blot is a very useful tool for identifying specific proteins, but it is tedious, labor-intensive and time-consuming. An automated "Simple Western" assay has recently been developed that enables protein separation, blotting and detection in an automated manner. However, this technology has not ...

  5. Student Costing: An Essential Tool in Site-based Budgeting and Teacher Empowerment.

    ERIC Educational Resources Information Center

    Sanders, K. Penney; Thiemann, Francis C.

    1990-01-01

    Although the process of participative, school-based budgeting might seem tedious and time-consuming, it can truly empower teachers and administrators. One cannot set instructional and budgetary priorities without knowing costs. A costing formula to help facilitate the budgeting process is presented. Includes 18 references. (MLH)

  6. Use of recombinant Salmonella flagellar hook protein (FlgK) for detection of anti-Salmonella antibodies in chickens by automated capillary immunoassay

    USDA-ARS's Scientific Manuscript database

    Background: Conventional immunoblot assays are a very useful tool for specific protein identification, but are tedious, labor-intensive and time-consuming. An automated capillary electrophoresis-based immunoblot assay called "Simple Western" has recently been developed that enables the protein sepa...

  7. Course Recommendation Based on Query Classification Approach

    ERIC Educational Resources Information Center

    Gulzar, Zameer; Leema, A. Anny

    2018-01-01

    This article describes how, in non-formal education, a scholar has to choose courses among various domains to meet their research aims. The availability of a large number of courses makes the process of selecting an appropriate course a tedious, time-consuming, and risky decision, and the course selection will directly affect…

  8. Regulation of cold-induced sweetening in potatoes and markers for fast-track new variety development

    USDA-ARS's Scientific Manuscript database

    Potato breeding is a tedious, time consuming process. With the growing requirements of the potato processing industry for new potato varieties, there is need for effective tools to speed-up new variety development. The purpose of this study was to understand the enzymatic regulation of cold-induce...

  9. Marking: A Pain in the Neck! The Computer to the Rescue.

    ERIC Educational Resources Information Center

    Felix, Uschi

    1993-01-01

    Discusses the annoyances and minor aches accompanying the tedious work of marking students' schoolwork and the application of the computer to providing students with good feedback. Notes that using the computer to mark students' papers is more enjoyable and less time consuming, provides the teacher with an easily accessible record of each…

  10. Vienna Fortran - A Language Specification. Version 1.1

    DTIC Science & Technology

    1992-03-01

    other computer architectures is the fact that the memory is physically distributed among the processors; the time required to access a non-local...datum may be an order of magnitude higher than the time taken to access locally stored data. This has important consequences for program efficiency. In...machine in many aspects. It is tedious, time-consuming and error prone. It has led to particularly slow software development cycles and, in consequence

  11. Production of recombinant Salmonella flagellar protein, FlgK, and its uses in detection of anti-Salmonella antibodies in chickens by automated capillary immunoassay

    USDA-ARS's Scientific Manuscript database

    Conventional immunoblot assays have been a very useful tool for specific protein identification in the past several decades, but are tedious, labor-intensive and time-consuming. An automated capillary electrophoresis-based immunoblot assay called "Simple Western" has recently been developed that en...

  12. A Circular-Impact Sampler for Forest Litter

    Treesearch

    Stephen S. Sackett

    1971-01-01

    Sampling the forest floor to determine litter weight is a tedious, time-consuming job. A new device has been designed and tested at the Southern Forest Fire Laboratory that eliminates many of the past sampling problems. The sampler has been fabricated in two sizes (6- and 12-inch diameters), and these are comparable in accuracy and sampling intensity. This Note...

  13. A Quick and Simple Conversion of Carboxylic Acids into Their Anilides by Heating with Phenyl Isothiocyanate.

    ERIC Educational Resources Information Center

    Ram, Ram N.; And Others

    1983-01-01

    Converting carboxylic acids into their anilides, which usually involves preparation of acid chloride or mixed anhydride followed by treatment with aniline, is tedious and/or time-consuming. A quick and easier procedure, using phenyl isothiocyanate, is provided. Reactions involved and a summary table of results are included. (JN)

  14. Comparison of an automated Most Probable Number (MPN) technique to traditional plating methods for estimating populations of total aerobes, coliforms and E. coli associated with freshly processed broiler chickens

    USDA-ARS's Scientific Manuscript database

    Traditional microbiological techniques for estimating populations of viable bacteria can be laborious and time consuming. The Most Probable Number (MPN) technique is especially tedious as multiple series of tubes must be inoculated at several different dilutions. Recently, an instrument (TEMPO™) ...
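
    The MPN estimate that this technique yields is obtained by maximum likelihood from the pattern of positive tubes across dilutions. A minimal sketch of that calculation is given below; the dilution volumes and tube counts are hypothetical, and this is the generic MPN likelihood equation, not the TEMPO instrument's algorithm.

        # Sketch: maximum-likelihood MPN (Most Probable Number) estimate.
        # Solves the standard MPN equation for the density lam (organisms/mL):
        #   sum_i g_i*v_i / (1 - exp(-lam*v_i)) = sum_i t_i*v_i
        # All tube counts and volumes below are hypothetical.
        import numpy as np
        from scipy.optimize import brentq

        volumes = np.array([0.1, 0.01, 0.001])   # sample volume per tube (mL), assumed
        tubes = np.array([5, 5, 5])              # tubes inoculated per dilution
        positive = np.array([5, 3, 1])           # tubes scored positive (hypothetical)

        def mpn_equation(lam):
            return (positive * volumes / (1.0 - np.exp(-lam * volumes))).sum() \
                   - (tubes * volumes).sum()

        mpn = brentq(mpn_equation, 1e-6, 1e6)    # root of the likelihood equation
        print(f"MPN estimate: {mpn:.1f} organisms/mL")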

  15. Analogy as a Tool for the Acquisition of English Verb Tenses among Low Proficiency L2 Learners

    ERIC Educational Resources Information Center

    Yoke, Soo Kum; Hasan, Nor Haniza

    2014-01-01

    The teaching of English grammar to second language learners is usually a tedious, stressful and time consuming activity and even after all the effort, students have generally found these lessons boring and confusing. As such, innovative language instructors have been trying different approaches to the teaching of grammar in their classrooms. Using…

  16. Breakthrough at the Missouri River Breaks: A quick tool for comparing burned and unburned sites

    Treesearch

    Rachael Clark; Theresa Jain

    2009-01-01

    A quantitative understanding of how forests work, both before and after (prescribed and wild) fire, is essential to management. Yet acquiring the kind of broad yet detailed information needed for many management decisions can be costly, tedious, and time-consuming. After two sweeping wildfires in the Missouri River Breaks area of eastern Montana - the Indian and...

  17. The Use of Online Corrective Feedback in Academic Writing by L1 Malay Learners

    ERIC Educational Resources Information Center

    Yoke, Soo Kum; Rajendran, Cecilia Bai; Sain, Noridah; Kamaludin, Puteri Nur Hidayah; Nawi, Sofwah Md; Yusof, Suhaili

    2013-01-01

    Conventional corrective feedback has been widely practiced but has been said to be tedious, stressful and time consuming. As such, the focus of this study is to investigate an alternative method of giving corrective feedback, namely online corrective feedback through e-mail. In order to examine if this innovative form of corrective…

  18. Indexing Mount For Rotation Of Optical Component

    NASA Technical Reports Server (NTRS)

    Reichle, Donald J., Jr.; Barnes, Norman P.

    1993-01-01

    Indexing mount for polarizer, wave plate, birefringent plate, or other optical component facilitates rotation of component to one or more preset angles. Includes hexagonal nut holding polarizer or other optical component. Ball bearing loaded by screw engages notch on cylindrical extension of nut engaging bracket. Time-consuming and tedious angular adjustment unnecessary: component turned quickly and easily, by hand or by use of wrench, to preset angular positions maintained by simple ball-detent mechanism.

  19. A new technique of laparoscopic cholangiography.

    PubMed

    Hagan, K D; Rosemurgy, A S; Albrink, M H; Carey, L C

    1992-04-01

    With the advent and rapid proliferation of laparoscopic cholecystectomy, numerous techniques and "tips" have been described. Intraoperative cholangiography during laparoscopic cholecystectomy can be tedious, frustrating, and time consuming. Described herein is a technique of intraoperative cholangiography during laparoscopic cholecystectomy which has proven to be easy, fast, and succinct. This method utilizes a rigid cholangiogram catheter which is placed into the peritoneal cavity through a small additional puncture site. This catheter is easily inserted into the cystic duct by extracorporeal manipulation. We suggest this method to surgeons who have shared our prior frustration with intraoperative cholangiography.

  20. A novel approach for fabricating NiO hollow spheres for gas sensors

    NASA Astrophysics Data System (ADS)

    Kuang, Chengwei; Zeng, Wen; Ye, Hong; Li, Yanqiong

    2018-03-01

    Hollow spheres are usually fabricated by hard template methods or soft template methods with soft surfactants, which is quite tedious and time-consuming. In this paper, NiO hollow spheres with a fluffy surface were successfully synthesized by a facile hydrothermal method and subsequent calcination, where bubbles acted as the template. The NiO hollow spheres exhibited excellent gas sensing performance, which results from their hollow structure and high specific surface area. In addition, a possible evolution mechanism of the NiO hollow spheres was proposed based on experimental results.

  1. Structural health monitoring of pipelines rehabilitated with lining technology

    NASA Astrophysics Data System (ADS)

    Farhidzadeh, Alireza; Dehghan-Niri, Ehsan; Salamone, Salvatore

    2014-03-01

    Damage detection in pipeline systems is a tedious and time consuming job due to digging requirements, accessibility constraints, interference with other facilities, and the extremely widespread nature of pipeline networks in metropolitan areas. Therefore, a real-time and automated monitoring system can greatly reduce labor, time, and expenditures. This paper presents the results of an experimental study aimed at monitoring the performance of full scale pipe lining systems, subjected to static and dynamic (seismic) loading, using the Acoustic Emission (AE) technique and Guided Ultrasonic Waves (GUWs). Particularly, two damage mechanisms are investigated: 1) delamination between pipeline and liner as the early indicator of damage, and 2) onset of nonlinearity and incipient failure of the liner as the critical damage state.

  2. Effects of Breakwater Construction of Tedious Creek Small Craft Harbor and Estuary, Maryland

    DTIC Science & Technology

    2006-09-01

    an area that provides excellent access to many productive fishing grounds in Chesapeake Bay. Tedious Creek Harbor provides anchorage to over 100...vessels involved in commercial and/or recreational fishing. The orientation of Tedious Creek allows the transmission of storm waves that, at times...entering the estuary. Due to the orientation of Tedious Creek to Fishing Bay, storm waves from the northeast, east, and southeast entered the

  3. Fast 3D Surface Extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sewell, Christopher Meyer; Patchett, John M.; Ahrens, James P.

    Ocean scientists searching for isosurfaces and/or thresholds of interest in high resolution 3D datasets required a tedious and time-consuming interactive exploration experience. PISTON research and development activities are enabling ocean scientists to rapidly and interactively explore isosurfaces and thresholds in their large data sets using a simple slider with real time calculation and visualization of these features. Ocean scientists can now visualize more features in less time, helping them gain a better understanding of the high resolution data sets they work with on a daily basis. Isosurface timings (512³ grid): VTK 7.7 s, Parallel VTK (48-core) 1.3 s, PISTON OpenMP (48-core) 0.2 s, PISTON CUDA (Quadro 6000) 0.1 s.

  4. Robotic hair harvesting system: a new proposal.

    PubMed

    Lin, Xiang; Nakazawa, Toji; Yasuda, Ryuya; Kobayashi, Etsuko; Sakuma, Ichiro; Liao, Hongen

    2011-01-01

    Follicular Unit Extraction (FUE) has become a popular hair transplanting method for solving the male-pattern baldness problem. Manually harvesting hairs one by one, however, is a tedious and time-consuming job for doctors. We design an accurate hair harvesting robot with a novel and efficient end-effector which consists of one digital microscope and a punch device. The microscope is first employed to automatically localize target hairs and then guides the punch device for harvesting after shifting. The end-effector shows an average bias and precision of 0.014 mm by virtue of a rotary guidance design for the motorized shifting mechanism.

  5. Machine for preparing phosphors for the fluorimetric determination of uranium

    USGS Publications Warehouse

    Stevens, R.E.; Wood, W.H.; Goetz, K.G.; Horr, C.A.

    1956-01-01

    The time saved by use of a machine for preparing many phosphors at one time increases the rate of productivity of the fluorimetric method for determining uranium. The machine prepares 18 phosphors at a time and eliminates the tedious and time-consuming step of preparing them by hand, while improving the precision of the method in some localities. The machine consists of a ring burner over which the platinum dishes, containing uranium and flux, are rotated. By placing the machine in an inclined position the molten flux comes into contact with all surfaces within the dish as the dishes rotate over the flame. Precision is improved because the heating and cooling conditions are the same for each of the 18 phosphors in one run as well as for successive runs.

  6. Assessment on the performance of electrode arrays using image processing technique

    NASA Astrophysics Data System (ADS)

    Usman, N.; Khiruddin, A.; Nawawi, Mohd

    2017-08-01

    Interpreting inverted resistivity sections is time consuming, tedious and requires other sources of information to be geologically relevant. An image processing technique was used to perform post-inversion processing, which makes geophysical data interpretation easier. The inverted data sets were imported into PCI Geomatica 9.0.1 for further processing. The data sets were clipped and merged together in order to match the coordinates of the three layers and permit pixel-to-pixel analysis. The dipole-dipole array is more sensitive to resistivity variation with depth in comparison with the Wenner-Schlumberger and pole-dipole arrays. Image processing serves as a good post-inversion tool in geophysical data processing.

  7. Improving 3D Character Posing with a Gestural Interface.

    PubMed

    Kyto, Mikko; Dhinakaran, Krupakar; Martikainen, Aki; Hamalainen, Perttu

    2017-01-01

    The most time-consuming part of character animation is 3D character posing. Posing using a mouse is a slow and tedious task that involves sequences of selecting on-screen control handles and manipulating the handles to adjust character parameters, such as joint rotations and end effector positions. Thus, various 3D user interfaces have been proposed to make animating easier, but they typically provide less accuracy. The proposed interface combines a mouse with the Leap Motion device to provide 3D input. A usability study showed that users preferred the Leap Motion over a mouse as a 3D gestural input device. The Leap Motion drastically decreased the number of required operations and the task completion time, especially for novice users.

  8. Low-cost digital dynamic visualization system

    NASA Astrophysics Data System (ADS)

    Asundi, Anand K.; Sajan, M. R.

    1995-05-01

    High speed photographic systems like the image rotation camera, the Cranz Schardin camera and the drum camera are typically used for recording and visualization of dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore they are all based on photographic film recording systems requiring time consuming and tedious wet processing of the films. Currently, digital cameras are replacing conventional cameras to a certain extent for static experiments. Recently, there has been a lot of interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the Time Delay and Integration (TDI) mode for digitally recording dynamic scenes. Applications in solid as well as fluid impact problems are presented.

  9. Model-based segmentation of hand radiographs

    NASA Astrophysics Data System (ADS)

    Weiler, Frank; Vogelsang, Frank

    1998-06-01

    An important procedure in pediatrics is to determine the skeletal maturity of a patient from radiographs of the hand. There is great interest in the automation of this tedious and time-consuming task. We present a new method for the segmentation of the bones of the hand, which allows the assessment of the skeletal maturity with an appropriate database of reference bones, similar to the atlas based methods. The proposed algorithm uses an extended active contour model for the segmentation of the hand bones, which incorporates a-priori knowledge of shape and topology of the bones in an additional energy term. This `scene knowledge' is integrated in a complex hierarchical image model, that is used for the image analysis task.

  10. Permanency analysis on human electroencephalogram signals for pervasive Brain-Computer Interface systems.

    PubMed

    Sadeghi, Koosha; Junghyo Lee; Banerjee, Ayan; Sohankar, Javad; Gupta, Sandeep K S

    2017-07-01

    Brain-Computer Interface (BCI) systems use some permanent features of brain signals to recognize their corresponding cognitive states with high accuracy. However, these features are not perfectly permanent, and the BCI system must be continuously retrained over time, which is tedious and time consuming. Thus, analyzing the permanency of signal features is essential in determining how often to repeat training. In this paper, we monitor electroencephalogram (EEG) signals and analyze their behavior over a continuous and relatively long period of time. In our experiment, we record EEG signals corresponding to the rest state (eyes open and closed) from one subject every day, for three and a half months. The results show that signal features such as auto-regression coefficients remain permanent through time, while others, such as power spectral density specifically in the 5-7 Hz frequency band, are not permanent. In addition, eyes-open EEG data shows more permanency than eyes-closed data.
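
    As a rough illustration of the two feature families compared in this abstract, the sketch below computes per-session autoregressive coefficients and 5-7 Hz band power so that their drift across sessions can be inspected. The synthetic signals, sampling rate, and model order are assumptions standing in for the recorded rest-state EEG.

        # Sketch: session-wise AR coefficients vs. 5-7 Hz band power.
        # Synthetic noise stands in for the daily rest-state EEG recordings.
        import numpy as np
        from scipy.signal import welch
        from statsmodels.tsa.ar_model import AutoReg

        fs = 256                                  # assumed sampling rate (Hz)
        rng = np.random.default_rng(0)

        def session_features(signal, order=6):
            ar = AutoReg(signal, lags=order).fit().params[1:]   # AR coefficients (intercept dropped)
            f, psd = welch(signal, fs=fs, nperseg=2 * fs)
            band = psd[(f >= 5) & (f <= 7)].mean()              # mean 5-7 Hz power
            return ar, band

        for day in range(3):                      # three hypothetical daily sessions
            eeg = rng.standard_normal(60 * fs)    # one minute of fake "rest-state" signal
            ar, band = session_features(eeg)
            print(f"day {day}: AR[0..2] = {np.round(ar[:3], 3)}, 5-7 Hz power = {band:.4f}")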

  11. SenseCube—a novel inexpensive wireless multisensor for physics lab experimentations

    NASA Astrophysics Data System (ADS)

    Mehta, Vedant; Lane, Charles D.

    2018-07-01

    SenseCube is a multisensor capable of measuring many different real-time events and changes in environment. Most conventional sensors used in introductory-physics labs use their own software and have wires that must be attached to a computer or an alternate device to analyze the data. This makes the standard sensors time consuming, tedious, and space-constricted. SenseCube was developed to overcome these limitations. This research was focused on developing a device that is all-encompassing, cost-effective, wireless, and compact, yet can perform the same tasks as the multiple standard sensors normally used in physics labs. It measures more than twenty distinct types of real-time events and transfers the data via Bluetooth. Both Windows and Mac software were developed so that the data from this device can be retrieved and/or saved on either platform. This paper describes the sensor itself, its development, its capabilities, and its cost comparison with standard sensors.

  12. An Automatic Phase-Change Detection Technique for Colloidal Hard Sphere Suspensions

    NASA Technical Reports Server (NTRS)

    McDowell, Mark; Gray, Elizabeth; Rogers, Richard B.

    2005-01-01

    Colloidal suspensions of monodisperse spheres are used as physical models of thermodynamic phase transitions and as precursors to photonic band gap materials. However, current image analysis techniques are not able to distinguish between densely packed phases within conventional microscope images, which are mainly characterized by degrees of randomness or order with similar grayscale value properties. Current techniques for identifying the phase boundaries involve manually identifying the phase transitions, which is very tedious and time consuming. We have developed an intelligent machine vision technique that automatically identifies colloidal phase boundaries. The algorithm utilizes intelligent image processing techniques that accurately identify and track phase changes vertically or horizontally for a sequence of colloidal hard sphere suspension images. This technique is readily adaptable to any imaging application where regions of interest are distinguished from the background by differing patterns of motion over time.

  13. Dynamic photoelasticity by TDI imaging

    NASA Astrophysics Data System (ADS)

    Asundi, Anand K.; Sajan, M. R.

    2001-06-01

    High speed photographic systems like the image rotation camera, the Cranz Schardin camera and the drum camera are typically used for the recording and visualization of dynamic events in stress analysis, fluid mechanics, etc. All these systems are fairly expensive and generally not simple to use. Furthermore they are all based on photographic film recording systems requiring time consuming and tedious wet processing of the films. Digital cameras are replacing conventional cameras to a certain extent in static experiments. Recently, there has been a lot of interest in developing and modifying CCD architectures and recording arrangements for dynamic scene analysis. Herein we report the use of a CCD camera operating in the Time Delay and Integration mode for digitally recording dynamic photoelastic stress patterns. Applications in strobe and streak photoelastic pattern recording and system limitations are explained in the paper.

  14. Computer-Based Mathematics Instructions for Engineering Students

    NASA Technical Reports Server (NTRS)

    Khan, Mustaq A.; Wall, Curtiss E.

    1996-01-01

    Almost every engineering course involves mathematics in one form or another. The analytical process of developing mathematical models is very important for engineering students. However, the computational process involved in the solution of some mathematical problems may be very tedious and time consuming. There is a significant amount of mathematical software such as Mathematica, Mathcad, and Maple designed to aid in the solution of these instructional problems. The use of these packages in classroom teaching can greatly enhance understanding, and save time. Integration of computer technology in mathematics classes, without de-emphasizing the traditional analytical aspects of teaching, has proven very successful and is becoming almost essential. Sample computer laboratory modules are developed for presentation in the classroom setting. This is accomplished through the use of overhead projectors linked to graphing calculators and computers. Model problems are carefully selected from different areas.

  15. An Automated Blur Detection Method for Histological Whole Slide Imaging

    PubMed Central

    Moles Lopez, Xavier; D'Andrea, Etienne; Barbot, Paul; Bridoux, Anne-Sophie; Rorive, Sandrine; Salmon, Isabelle; Debeir, Olivier; Decaestecker, Christine

    2013-01-01

    Whole slide scanners are novel devices that enable high-resolution imaging of an entire histological slide. Furthermore, the imaging is achieved in only a few minutes, which enables image rendering of large-scale studies involving multiple immunohistochemistry biomarkers. Although whole slide imaging has improved considerably, locally poor focusing causes blurred regions of the image. These artifacts may strongly affect the quality of subsequent analyses, making a slide review process mandatory. This tedious and time-consuming task requires the scanner operator to carefully assess the virtual slide and to manually select new focus points. We propose a statistical learning method that provides early image quality feedback and automatically identifies regions of the image that require additional focus points. PMID:24349343

  16. Fabricating a stable record base for completely edentulous patients treated with osseointegrated implants using healing abutments.

    PubMed

    Rungcharassaeng, K; Kan, J Y

    1999-02-01

    A stable record base is essential for accurate interocclusal centric relation records in a completely edentulous patient. In implant prosthodontics, several procedures have been suggested for the fabrication of a stable record base. However, these procedures necessitate removal of the healing abutments during the interocclusal record procedure and the trial denture placement, which makes the procedures tedious and time-consuming. When the implant-prosthesis interface is subgingival, the patient may also experience discomfort during these procedures. This article describes a procedure for fabricating a stable record base that uses the healing abutments, which eliminates the necessity of the healing abutment removal and its consequences. Advantages and disadvantages of this procedure are also discussed.

  17. Motion based parsing for video from observational psychology

    NASA Astrophysics Data System (ADS)

    Kokaram, Anil; Doyle, Erika; Lennon, Daire; Joyeux, Laurent; Fuller, Ray

    2006-01-01

    In Psychology it is common to conduct studies involving the observation of humans undertaking some task. The sessions are typically recorded on video and used for subjective visual analysis. The subjective analysis is tedious and time consuming, not only because much useless video material is recorded but also because subjective measures of human behaviour are not necessarily repeatable. This paper presents tools using content based video analysis that allow automated parsing of video from one such study involving Dyslexia. The tools rely on implicit measures of human motion that can be generalised to other applications in the domain of human observation. Results comparing quantitative assessment of human motion with subjective assessment are also presented, illustrating that the system is a useful scientific tool.

  18. Support for fast comprehension of ICU data: visualization using metaphor graphics.

    PubMed

    Horn, W; Popow, C; Unterasinger, L

    2001-01-01

    The time-oriented analysis of electronic patient records on (neonatal) intensive care units is a tedious and time-consuming task. Graphic data visualization should make it easier for physicians to assess the overall situation of a patient and to recognize essential changes over time. Metaphor graphics are used to sketch the most relevant parameters for characterizing a patient's situation. By repetition of the graphic object in 24 frames, the situation of the ICU patient is presented in one display, usually summarizing the last 24 h. VIE-VISU is a data visualization system which uses multiples to present the change in the patient's status over time in graphic form. Each multiple is a highly structured metaphor graphic object. Each object visualizes important ICU parameters from circulation, ventilation, and fluid balance. The design using multiples promotes a focus on stability and change. A stable patient is recognizable at first sight, continuous improvement or a worsening condition is easy to analyze, and drastic changes in the patient's situation get the viewer's attention immediately.

  19. Optimal nonlinear information processing capacity in delay-based reservoir computers

    NASA Astrophysics Data System (ADS)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performances in the processing of empirical data. We focus on a particular kind of time-delay based reservoir computers that have been physically implemented using optical and electronic systems and have shown unprecedented data processing rates. Reservoir computing is well-known for the ease of the associated training scheme but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge in the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time consuming parameter scans used so far in the literature.
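
    The abstract describes reservoir computing in general terms; as a hedged illustration of the basic paradigm only, and not of the authors' delay-based photonic reservoirs or their parameter-performance function, a minimal software echo state network with a ridge-regression readout might look like the sketch below. All sizes and hyperparameters are assumptions.

        # Sketch: minimal echo state network (ESN) with a ridge-regression readout.
        # A generic software reservoir, not the paper's delay-based optical hardware;
        # reservoir size, scaling, and the toy task are assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        n_res, n_in = 200, 1
        W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # keep spectral radius below 1

        def run_reservoir(u):
            x, states = np.zeros(n_res), []
            for u_t in u:
                x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
                states.append(x.copy())
            return np.array(states)

        u = rng.uniform(-1, 1, 500)               # toy task: reproduce a delayed input
        y = np.roll(u, 5)
        X, Y = run_reservoir(u)[50:], y[50:]      # discard a warm-up transient
        ridge = 1e-6
        W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
        print("training MSE:", np.mean((X @ W_out - Y) ** 2))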

  20. Optimal nonlinear information processing capacity in delay-based reservoir computers.

    PubMed

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-11

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performances in the processing of empirical data. We focus on a particular kind of time-delay based reservoir computers that have been physically implemented using optical and electronic systems and have shown unprecedented data processing rates. Reservoir computing is well-known for the ease of the associated training scheme but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge in the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time consuming parameter scans used so far in the literature.

  1. Optimal nonlinear information processing capacity in delay-based reservoir computers

    PubMed Central

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-01-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performances in the processing of empirical data. We focus on a particular kind of time-delay based reservoir computers that have been physically implemented using optical and electronic systems and have shown unprecedented data processing rates. Reservoir computing is well-known for the ease of the associated training scheme but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge in the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time consuming parameter scans used so far in the literature. PMID:26358528

  2. Active Learning of Classification Models with Likert-Scale Feedback.

    PubMed

    Xue, Yanbing; Hauskrecht, Milos

    2017-01-01

    Annotation of classification data by humans can be a time-consuming and tedious process. Finding ways of reducing the annotation effort is critical for building the classification models in practice and for applying them to a variety of classification tasks. In this paper, we develop a new active learning framework that combines two strategies to reduce the annotation effort. First, it relies on label uncertainty information obtained from the human in terms of the Likert-scale feedback. Second, it uses active learning to annotate examples with the greatest expected change. We propose a Bayesian approach to calculate the expectation and an incremental SVM solver to reduce the time complexity of the solvers. We show the combination of our active learning strategy and the Likert-scale feedback can learn classification models more rapidly and with a smaller number of labeled instances than methods that rely on either Likert-scale labels or active learning alone.
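
    The paper's Bayesian expected-change criterion and incremental SVM solver are not reproduced here; the sketch below only illustrates the general combination described, pool-based uncertainty sampling with an SVM where a Likert-scale confidence score weights each new label. The dataset, the 1-5 weight mapping, and the simulated annotator are all assumptions.

        # Sketch: uncertainty-sampling active learning with Likert-weighted labels.
        # A simplified stand-in for the paper's Bayesian expected-change strategy.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.datasets import make_classification

        X, y = make_classification(n_samples=300, n_features=10, random_state=0)
        labeled = list(range(0, 300, 30))             # small initial labeled set
        weights = [1.0] * len(labeled)
        pool = [i for i in range(len(X)) if i not in labeled]

        def likert_weight(score):                     # assumed mapping: 1 (unsure) .. 5 (certain)
            return score / 5.0

        rng = np.random.default_rng(0)
        for _ in range(20):                           # 20 annotation rounds
            clf = SVC(probability=True).fit(X[labeled], y[labeled], sample_weight=weights)
            proba = clf.predict_proba(X[pool])[:, 1]
            query = pool[int(np.argmin(np.abs(proba - 0.5)))]   # most uncertain example
            likert = int(rng.integers(1, 6))          # simulated annotator confidence (1-5)
            labeled.append(query); weights.append(likert_weight(likert)); pool.remove(query)

        print("labeled examples after active learning:", len(labeled))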

  3. Active Learning of Classification Models with Likert-Scale Feedback

    PubMed Central

    Xue, Yanbing; Hauskrecht, Milos

    2017-01-01

    Annotation of classification data by humans can be a time-consuming and tedious process. Finding ways of reducing the annotation effort is critical for building the classification models in practice and for applying them to a variety of classification tasks. In this paper, we develop a new active learning framework that combines two strategies to reduce the annotation effort. First, it relies on label uncertainty information obtained from the human in terms of the Likert-scale feedback. Second, it uses active learning to annotate examples with the greatest expected change. We propose a Bayesian approach to calculate the expectation and an incremental SVM solver to reduce the time complexity of the solvers. We show the combination of our active learning strategy and the Likert-scale feedback can learn classification models more rapidly and with a smaller number of labeled instances than methods that rely on either Likert-scale labels or active learning alone. PMID:28979827

  4. Microfluidics-Based Lab-on-Chip Systems in DNA-Based Biosensing: An Overview

    PubMed Central

    Dutse, Sabo Wada; Yusof, Nor Azah

    2011-01-01

    Microfluidics-based lab-on-chip (LOC) systems are an active research area that is revolutionising high-throughput sequencing for the fast, sensitive and accurate detection of a variety of pathogens. LOCs also serve as portable diagnostic tools. The devices provide optimum control of nanolitre volumes of fluids and integrate various bioassay operations that allow the devices to rapidly sense pathogenic threat agents for environmental monitoring. LOC systems, such as microfluidic biochips, offer advantages compared to conventional identification procedures that are tedious, expensive and time consuming. This paper aims to provide a broad overview of the need for devices that are easy to operate, sensitive, fast, portable and sufficiently reliable to be used as complementary tools for the control of pathogenic agents that damage the environment. PMID:22163925

  5. More on approximations of Poisson probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kao, C

    1980-05-01

    Calculation of Poisson probabilities frequently involves calculating high factorials, which becomes tedious and time-consuming with regular calculators. The usual way to overcome this difficulty has been to find approximations by making use of the table of the standard normal distribution. A new transformation proposed by Kao in 1978 appears to perform better for this purpose than traditional transformations. In the present paper several approximation methods are stated and compared numerically, including an approximation method that utilizes a modified version of Kao's transformation. An approximation based on a power transformation was found to outperform those based on the square-root type transformations proposed in the literature. The traditional Wilson-Hilferty approximation and Makabe-Morimura approximation are extremely poor compared with this approximation. 4 tables. (RWR)
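
    As a concrete illustration of the kind of normal-based approximation being compared (here the classical Wilson-Hilferty route through the chi-square tail, not Kao's transformation itself), the following sketch avoids factorials entirely:

        # Sketch: approximate a Poisson cumulative probability without factorials,
        # via the identity P(X <= k) = P(chi2 with 2(k+1) d.f. > 2*lambda) and the
        # Wilson-Hilferty normal approximation to the chi-square tail.
        from math import sqrt
        from scipy.stats import norm, poisson

        def poisson_cdf_wh(k, lam):
            nu = 2 * (k + 1)                          # chi-square degrees of freedom
            x = 2 * lam
            z = ((x / nu) ** (1 / 3) - (1 - 2 / (9 * nu))) / sqrt(2 / (9 * nu))
            return norm.sf(z)                         # approximates P(chi2_nu > x)

        k, lam = 12, 9.5
        print("exact Poisson CDF      :", poisson.cdf(k, lam))
        print("Wilson-Hilferty approx :", poisson_cdf_wh(k, lam))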

  6. Semi-automatic image analysis methodology for the segmentation of bubbles and drops in complex dispersions occurring in bioreactors

    NASA Astrophysics Data System (ADS)

    Taboada, B.; Vega-Alvarado, L.; Córdova-Aguilar, M. S.; Galindo, E.; Corkidi, G.

    2006-09-01

    Characterization of multiphase systems occurring in fermentation processes is a time-consuming and tedious process when manual methods are used. This work describes a new semi-automatic methodology for the on-line assessment of diameters of oil drops and air bubbles occurring in a complex simulated fermentation broth. High-quality digital images were obtained from the interior of a mechanically stirred tank. These images were pre-processed to find segments of edges belonging to the objects of interest. The contours of air bubbles and oil drops were then reconstructed using an improved Hough transform algorithm which was tested in two, three and four-phase simulated fermentation model systems. The results were compared against those obtained manually by a trained observer, showing no significant statistical differences. The method was able to reduce the total processing time for the measurements of bubbles and drops in different systems by 21-50% and the manual intervention time for the segmentation procedure by 80-100%.
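
    The improved Hough transform algorithm itself is not reproduced here, but a minimal sketch of the standard circular Hough transform step it builds on, applied to a synthetic image in place of the stirred-tank micrographs, could look like this (scikit-image; the radius search range is an assumption):

        # Sketch: detect circular objects (bubbles/drops) with a standard circular
        # Hough transform on edge pixels; a stand-in for the paper's improved algorithm.
        import numpy as np
        from skimage.draw import disk
        from skimage.feature import canny
        from skimage.transform import hough_circle, hough_circle_peaks

        img = np.zeros((200, 200))                    # synthetic frame with two bright "drops"
        img[disk((60, 60), 20)] = 1.0
        img[disk((140, 120), 30)] = 1.0

        edges = canny(img, sigma=2)
        radii = np.arange(10, 40, 2)                  # assumed radius search range (pixels)
        accumulator = hough_circle(edges, radii)
        _, cx, cy, found_r = hough_circle_peaks(accumulator, radii, total_num_peaks=2)

        for x, y, r in zip(cx, cy, found_r):
            print(f"circle at ({x}, {y}); estimated diameter {2 * r} px")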

  7. A method for automatic matching of multi-timepoint findings for enhanced clinical workflow

    NASA Astrophysics Data System (ADS)

    Raghupathi, Laks; Dinesh, MS; Devarakota, Pandu R.; Valadez, Gerardo Hermosillo; Wolf, Matthias

    2013-03-01

    Non-interventional diagnostics (CT or MR) enables early identification of diseases like cancer. Often, lesion growth assessment done during follow-up is used to distinguish between benign and malignant lesions. Thus correspondences need to be found for lesions localized at each time point. Manually matching the radiological findings can be time consuming as well as tedious due to possible differences in orientation and position between scans. Also, the complicated nature of the disease makes physicians rely on multiple modalities (PET-CT, PET-MR), where matching is even more challenging. Here, we propose an automatic feature-based matching that is robust to changes in organ volume and to subpar or missing registration, and that requires very little computation. Traditional matching methods rely mostly on accurate image registration and applying the resulting deformation map to the findings' coordinates. This has disadvantages when accurate registration is time-consuming or may not be possible due to vast organ volume differences between scans. Our novel matching uses supervised learning, taking advantage of the underlying CAD features that are already present and treating the matching as a classification problem. In addition, the matching can be done extremely fast and at reasonable accuracy even when the image registration fails for some reason. Experimental results on real-world multi-time point thoracic CT data showed an accuracy of above 90% with negligible false positives on a variety of registration scenarios.

  8. Helping E-Commerce Consumers Make Good Purchase Decisions: A User Reviews-Based Approach

    NASA Astrophysics Data System (ADS)

    Zhang, Richong; Tran, Thomas T.

    Online product reviews provided by the consumers, who have previously purchased and used some particular products, form a rich source of information for other consumers who would like to study about these products in order to make their purchase decisions. Realizing this great need of consumers, several e-commerce web sites such as Amazon.com offer facilities for consumers to review products and exchange their purchase opinions. Unfortunately, reading through the massive amounts of product reviews available online from many e-communities, forums and newsgroups is not only a tedious task but also an impossible one. Indeed, nowadays consumers need an effective and reliable method to search through those huge sources of information and sort out the most appropriate and helpful product reviews. This paper proposes a model to discover the helpfulness of online product reviews. Product reviews can be analyzed and ranked by our scoring system and those reviews that may help consumers better than others will be found. In addition, we compare our model with a number of machine learning techniques. Our experimental results confirm that our approach is effective in ranking and classifying online product reviews.

  9. Development of On-Line High Performance Liquid Chromatography (HPLC)-Biochemical Detection Methods as Tools in the Identification of Bioactives

    PubMed Central

    Malherbe, Christiaan J.; de Beer, Dalene; Joubert, Elizabeth

    2012-01-01

    Biochemical detection (BCD) methods are commonly used to screen plant extracts for specific biological activities in batch assays. Traditionally, bioactives in the most active extracts were identified through time-consuming bio-assay guided fractionation until single active compounds could be isolated. Not only are isolation procedures often tedious, but they could also lead to artifact formation. On-line coupling of BCD assays to high performance liquid chromatography (HPLC) is gaining ground as a high resolution screening technique to overcome problems associated with pre-isolation by measuring the effects of compounds post-column directly after separation. To date, several on-line HPLC-BCD assays, applied to whole plant extracts and mixtures, have been published. In this review the focus will fall on enzyme-based, receptor-based and antioxidant assays. PMID:22489144

  10. Development of a HIV-1 Virus Detection System Based on Nanotechnology.

    PubMed

    Lee, Jin-Ho; Oh, Byung-Keun; Choi, Jeong-Woo

    2015-04-27

    Development of a sensitive and selective detection system for pathogenic viral agents is essential for medical healthcare, from diagnostics to therapeutics. However, conventional detection systems are time consuming, resource-intensive and tedious to perform. Hence, the demand for sensitive and selective virus detection systems is increasing rapidly. To attain this aim, different aspects and techniques have been applied to develop virus sensors with improved sensitivity and selectivity. Here, among those aspects and techniques, this article reviews HIV virus particle detection systems that incorporate nanotechnology to enhance sensitivity. This review mainly focuses on four different detection systems: vertically configured electrical detection based on scanning tunneling microscopy (STM), electrochemical detection based on direct electron transfer in the virus, optical detection based on localized surface plasmon resonance (LSPR), and surface enhanced Raman spectroscopy (SERS) using plasmonic nanoparticles.

  11. Simulation of Propellant Loading System Senior Design Implement in Computer Algorithm

    NASA Technical Reports Server (NTRS)

    Bandyopadhyay, Alak

    2010-01-01

    Propellant loading from the storage tank to the external tank is one of the very important and time consuming pre-launch ground operations for the launch vehicle. The propellant loading system is a complex integrated system involving many physical components, such as the storage tank filled with cryogenic fluid at a very low temperature, the long pipeline connecting the storage tank with the external tank, the external tank along with the flare stack, and vent systems for releasing the excess fuel. Some of the parameters important for design purposes are the predicted pre-chill time, loading time, amount of fuel lost, and maximum pressure rise. The physics involved in the mathematical modeling is quite complex: the process is unsteady, there is a phase change as some of the fuel changes from the liquid to the gas state, and there is conjugate heat transfer in the pipe walls as well as between the solid and fluid regions. The simulation is tedious and time consuming as well. Overall, this is a complex system, and the objective of the work is student involvement in the parametric study and optimization of the numerical modeling toward the design of such a system. The students first have to become familiar with the physical process, the related mathematics and the numerical algorithm. The work involves exploring (i) improved algorithms to make the transient simulation computationally effective (reduced CPU time) and (ii) a parametric study to evaluate design parameters by changing the operational conditions.

  12. Multi-point estimation of total energy expenditure: a comparison between zinc-reduction and platinum-equilibration methodologies.

    PubMed

    Sonko, Bakary J; Miller, Leland V; Jones, Richard H; Donnelly, Joseph E; Jacobsen, Dennis J; Hill, James O; Fennessey, Paul V

    2003-12-15

    Reducing water to hydrogen gas by zinc or uranium metal for determining D/H ratio is both tedious and time consuming. This has forced most energy metabolism investigators to use the "two-point" technique instead of the "Multi-point" technique for estimating total energy expenditure (TEE). Recently, we purchased a new platinum (Pt)-equilibration system that significantly reduces both time and labor required for D/H ratio determination. In this study, we compared TEE obtained from nine overweight but healthy subjects, estimated using the traditional Zn-reduction method to that obtained from the new Pt-equilibration system. Rate constants, pool spaces, and CO2 production rates obtained from use of the two methodologies were not significantly different. Correlation analysis demonstrated that TEEs estimated using the two methods were significantly correlated (r=0.925, p=0.0001). Sample equilibration time was reduced by 66% compared to those of similar methods. The data demonstrated that the Zn-reduction method could be replaced by the Pt-equilibration method when TEE was estimated using the "Multi-Point" technique. Furthermore, D equilibration time was significantly reduced.

  13. The Bright, Artificial Intelligence-Augmented Future of Neuroimaging Reading.

    PubMed

    Hainc, Nicolin; Federau, Christian; Stieltjes, Bram; Blatow, Maria; Bink, Andrea; Stippich, Christoph

    2017-01-01

    Radiologists are among the first physicians to be directly affected by advances in computer technology. Computers are already capable of analyzing medical imaging data, and with decades' worth of digital information available for training, will an artificial intelligence (AI) one day signal the end of the human radiologist? With the ever-increasing workload combined with the looming doctor shortage, radiologists will be pushed far beyond their current estimated 3 s allotted time-of-analysis per image; an AI with super-human capabilities might seem like a logical replacement. We feel, however, that AI will lead to an augmentation rather than a replacement of the radiologist. The AI will be relied upon to handle the tedious, time-consuming tasks of detecting and segmenting outliers while possibly generating new, unanticipated results that can then be used as sources of medical discovery. This will affect not only radiologists but all physicians and also researchers dealing with medical imaging. Therefore, we must embrace future technology and collaborate across disciplines to spearhead the next revolution in medicine.

  14. Automated flight path planning for virtual endoscopy.

    PubMed

    Paik, D S; Beaulieu, C F; Jeffrey, R B; Rubin, G D; Napel, S

    1998-05-01

    In this paper, a novel technique for rapid and automatic computation of flight paths for guiding virtual endoscopic exploration of three-dimensional medical images is described. While manually planning flight paths is a tedious and time consuming task, our algorithm is automated and fast. Our method for positioning the virtual camera is based on the medial axis transform but is much more computationally efficient. By iteratively correcting a path toward the medial axis, the necessity of evaluating simple point criteria during morphological thinning is eliminated. The virtual camera is also oriented in a stable viewing direction, avoiding sudden twists and turns. We tested our algorithm on volumetric data sets of eight colons, one aorta and one bronchial tree. The algorithm computed the flight paths in several minutes per volume on an inexpensive workstation with minimal computation time added for multiple paths through branching structures (10%-13% per extra path). The results of our algorithm are smooth, centralized paths that aid in the task of navigation in virtual endoscopic exploration of three-dimensional medical images.
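
    A hedged two-dimensional sketch of the centering idea described above is given below: path points are nudged toward the medial axis by ascending the distance-to-wall map. The toy corridor, step size, and iteration count are assumptions; the published algorithm operates on 3D volumes and adds further criteria.

        # Sketch: correct a rough path toward the medial axis of a free-space mask
        # by ascending the distance transform (distance to the nearest wall).
        import numpy as np
        from scipy.ndimage import distance_transform_edt

        free = np.zeros((100, 100), dtype=bool)
        free[20:80, 10:90] = True                     # toy "lumen": a rectangular corridor

        dist = distance_transform_edt(free)           # distance to nearest wall pixel
        grad_r, grad_c = np.gradient(dist)

        path = np.array([[25.0, c] for c in np.linspace(15.0, 85.0, 30)])  # path hugging one wall
        for _ in range(50):                           # iterative correction toward the centerline
            r = np.clip(path[:, 0].astype(int), 0, 99)
            c = np.clip(path[:, 1].astype(int), 0, 99)
            path[:, 0] += 0.5 * grad_r[r, c]
            path[:, 1] += 0.5 * grad_c[r, c]

        print("mean row after centering:", round(path[:, 0].mean(), 1))  # drifts toward the corridor axis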

  15. Motion generation of robotic surgical tasks: learning from expert demonstrations.

    PubMed

    Reiley, Carol E; Plaku, Erion; Hager, Gregory D

    2010-01-01

    Robotic surgical assistants offer the possibility of automating portions of a task that are time consuming and tedious in order to reduce the cognitive workload of a surgeon. This paper proposes using programming by demonstration to build generative models and generate smooth trajectories that capture the underlying structure of the motion data recorded from expert demonstrations. Specifically, motion data from Intuitive Surgical's da Vinci Surgical System of a panel of expert surgeons performing three surgical tasks are recorded. The trials are decomposed into subtasks or surgemes, which are then temporally aligned through dynamic time warping. Next, a Gaussian Mixture Model (GMM) encodes the experts' underlying motion structure. Gaussian Mixture Regression (GMR) is then used to extract a smooth reference trajectory to reproduce a trajectory of the task. The approach is evaluated through an automated skill assessment measurement. Results suggest that this paper presents a means to (i) extract important features of the task, (ii) create a metric to evaluate robot imitative performance (iii) generate smoother trajectories for reproduction of three common medical tasks.
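
    A minimal sketch of the GMM/GMR step this abstract describes is given below: a Gaussian mixture is fitted to (time, position) pairs and the conditional mean of position given time is read out as the smooth reference trajectory. The synthetic one-dimensional "demonstrations" and the number of components are assumptions standing in for the aligned da Vinci motion data.

        # Sketch: GMM on (time, position) samples, then Gaussian Mixture Regression
        # (conditional mean of position given time) to extract a reference trajectory.
        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        t = np.tile(np.linspace(0, 1, 100), 5)                    # 5 noisy toy demonstrations
        pos = np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size)
        gmm = GaussianMixture(n_components=6, covariance_type="full",
                              random_state=0).fit(np.column_stack([t, pos]))

        def gmr(t_query):
            """Conditional mean of position given time under the fitted mixture."""
            out = []
            for tq in np.atleast_1d(t_query):
                # responsibility of each component for this time value
                h = np.array([w * np.exp(-0.5 * (tq - m[0]) ** 2 / c[0, 0]) / np.sqrt(c[0, 0])
                              for w, m, c in zip(gmm.weights_, gmm.means_, gmm.covariances_)])
                h /= h.sum()
                # per-component conditional mean: mu_pos + cov_pt / cov_tt * (t - mu_t)
                mu = [m[1] + c[1, 0] / c[0, 0] * (tq - m[0])
                      for m, c in zip(gmm.means_, gmm.covariances_)]
                out.append(float(np.dot(h, mu)))
            return np.array(out)

        reference = gmr(np.linspace(0, 1, 50))                    # smooth reference trajectory
        print("reference trajectory range:", reference.min().round(2), reference.max().round(2))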

  16. Conceptual Design Oriented Wing Structural Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Lau, May Yuen

    1996-01-01

    Airplane optimization has always been the goal of airplane designers. In the conceptual design phase, a designer's goal could be tradeoffs between maximum structural integrity, minimum aerodynamic drag, or maximum stability and control, many times achieved separately. Bringing all of these factors into an iterative preliminary design procedure was time consuming, tedious, and not always accurate. For example, the final weight estimate would often be based upon statistical data from past airplanes. The new design would be classified based on gross characteristics, such as number of engines, wingspan, etc., to see which airplanes of the past most closely resembled the new design. This procedure works well for conventional airplane designs, but not very well for new innovative designs. With the computing power of today, new methods are emerging for the conceptual design phase of airplanes. Using finite element methods, computational fluid dynamics, and other computer techniques, designers can make very accurate disciplinary-analyses of an airplane design. These tools are computationally intensive, and when used repeatedly, they consume a great deal of computing time. In order to reduce the time required to analyze a design and still bring together all of the disciplines (such as structures, aerodynamics, and controls) into the analysis, simplified design computer analyses are linked together into one computer program. These design codes are very efficient for conceptual design. The work in this thesis is focused on a finite element based conceptual design oriented structural synthesis capability (CDOSS) tailored to be linked into ACSYNT.

  17. Enhanced FIB-SEM systems for large-volume 3D imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, C. Shan; Hayworth, Kenneth J.; Lu, Zhiyuan

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10⁶ μm³. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology.

  18. A simple apparatus for quick qualitative analysis of CR39 nuclear track detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gautier, D. C.; Kline, J. L.; Flippo, K. A.

    2008-10-15

    Quantifying the ion pits in Columbia Resin 39 (CR39) nuclear track detector from Thomson parabolas is a time consuming and tedious process using conventional microscope based techniques. A simple inventive apparatus for fast screening and qualitative analysis of CR39 detectors has been developed, enabling efficient selection of data for a more detailed analysis. The system consists simply of a green He-Ne laser and a high-resolution digital single-lens reflex camera. The laser illuminates the edge of the CR39 at grazing incidence and couples into the plastic, acting as a light pipe. Subsequently, the laser illuminates all ion tracks on the surface. A high-resolution digital camera is used to photograph the scattered light from the ion tracks, enabling one to quickly determine charge states and energies measured by the Thomson parabola.

  19. An approach to knowledge engineering to support knowledge-based simulation of payload ground processing at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Mcmanus, Shawn; Mcdaniel, Michael

    1989-01-01

    Planning for processing payloads has always been difficult and time-consuming. With the advent of Space Station Freedom and its capability to support a myriad of complex payloads, the planning to support this ground processing maze involves thousands of man-hours of often tedious data manipulation. To provide the capability to analyze various processing schedules, an object oriented knowledge-based simulation environment called the Advanced Generic Accommodations Planning Environment (AGAPE) is being developed. Having nearly completed the baseline system, the emphasis in this paper is directed toward rule definition and its relation to model development and simulation. The focus is specifically on the methodologies implemented during knowledge acquisition, analysis, and representation within the AGAPE rule structure. A model is provided to illustrate the concepts presented. The approach demonstrates a framework for AGAPE rule development to assist expert system development.

  20. A high-performance liquid chromatography-electronic circular dichroism online method for assessing the absolute enantiomeric excess and conversion ratio of asymmetric reactions

    NASA Astrophysics Data System (ADS)

    Zhang, Xiang; Wang, Mingchao; Li, Li; Yin, Dali

    2017-03-01

    Asymmetric reactions often need to be evaluated during the synthesis of chiral compounds. However, traditional evaluation methods require the isolation of the individual enantiomers, which is tedious and time-consuming. Thus, it is desirable to develop simple, practical online detection methods. We developed a method based on high-performance liquid chromatography-electronic circular dichroism (HPLC-ECD) that simultaneously analyzes the material conversion ratio and the absolute optical purity of each enantiomer. In particular, only a reverse-phase C18 column instead of a chiral column is required in our method, because the ECD measurement provides a g-factor that describes the ratio of the enantiomers in the mixture. We used our method to analyze the asymmetric hydrosilylation of β-enamino esters, and we discussed the advantages, feasibility, and effectiveness of this new methodology.
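
    As background (standard circular dichroism relationships, not details taken from this record): the ECD detector reports a dissymmetry factor g for each eluting peak, and because g scales linearly with enantiomeric excess, the absolute ee follows from the ratio of the observed g-factor to that of the pure enantiomer, while the conversion ratio follows from the ordinary UV peak areas:

        g = \frac{\Delta A_{\mathrm{CD}}}{A_{\mathrm{UV}}}, \qquad
        ee\,(\%) \approx \frac{g_{\mathrm{obs}}}{g_{\mathrm{pure}}} \times 100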

  1. Spin wave Feynman diagram vertex computation package

    NASA Astrophysics Data System (ADS)

    Price, Alexander; Javernick, Philip; Datta, Trinanjan

    Spin wave theory is a well-established theoretical technique that can correctly predict the physical behavior of ordered magnetic states. However, computing the effects of an interacting spin wave theory incorporating magnons involves a laborious by-hand derivation of Feynman diagram vertices. The process is tedious and time-consuming. Hence, to improve productivity and to have another means of checking the analytical calculations, we have devised a Feynman Diagram Vertex Computation package. In this talk, we will describe our research group's effort to implement a Mathematica-based symbolic Feynman diagram vertex computation package that computes spin wave vertices. Utilizing the non-commutative algebra package NCAlgebra as an add-on to Mathematica, symbolic expressions for the Feynman diagram vertices of a Heisenberg quantum antiferromagnet are obtained. Our existing code reproduces the well-known expressions for a nearest-neighbor square lattice Heisenberg model. We also discuss the case of a triangular lattice Heisenberg model, where non-collinear terms contribute to the vertex interactions.
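
    The package described here is Mathematica-based; purely as a loose illustration (not the authors' code), the same kind of symbolic non-commutative bookkeeping that such vertex derivations rest on can be sketched in Python with SymPy's boson operators:

        # Sketch: symbolic bosonic (magnon) operators, the building blocks of
        # spin-wave vertex derivations. Operator names are illustrative.
        from sympy.physics.quantum import Dagger, Commutator
        from sympy.physics.quantum.boson import BosonOp

        a = BosonOp('a')   # magnon annihilation operator, sublattice A
        b = BosonOp('b')   # magnon annihilation operator, sublattice B

        # Canonical commutation relations used when normal-ordering vertex terms
        print(Commutator(a, Dagger(a)).doit())                     # -> 1
        print(Commutator(a, Dagger(b)).doit(independent=True))     # -> 0 for independent modes

        # A quartic (two-in, two-out) product of the kind that appears in 1/S expansions
        term = Dagger(a) * a * Dagger(b) * b
        print(term)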

  2. Quantitation of sugar content in pyrolysis liquids after acid hydrolysis using high-performance liquid chromatography without neutralization.

    PubMed

    Johnston, Patrick A; Brown, Robert C

    2014-08-13

    A rapid method for the quantitation of total sugars in pyrolysis liquids using high-performance liquid chromatography (HPLC) was developed. The method avoids the tedious and time-consuming sample preparation required by current analytical methods. It is possible to directly analyze hydrolyzed pyrolysis liquids, bypassing the neutralization step usually required in determination of total sugars. A comparison with traditional methods was used to determine the validity of the results. The calibration curve coefficient of determination on all standard compounds was >0.999 using a refractive index detector. The relative standard deviation for the new method was 1.13%. The spiked sugar recoveries on the pyrolysis liquid samples were between 104 and 105%. The research demonstrates that it is possible to obtain excellent accuracy and efficiency using HPLC to quantitate glucose after acid hydrolysis of polymeric and oligomeric sugars found in fast pyrolysis bio-oils without neutralization.

  3. Detection and classification of subject-generated artifacts in EEG signals using autoregressive models.

    PubMed

    Lawhern, Vernon; Hairston, W David; McDowell, Kaleb; Westerfield, Marissa; Robbins, Kay

    2012-07-15

    We examine the problem of accurate detection and classification of artifacts in continuous EEG recordings. Manual identification of artifacts, by means of an expert or panel of experts, can be tedious, time-consuming and infeasible for large datasets. We use autoregressive (AR) models for feature extraction and characterization of EEG signals containing several kinds of subject-generated artifacts. AR model parameters are scale-invariant features that can be used to develop models of artifacts across a population. We use a support vector machine (SVM) classifier to discriminate among artifact conditions using the AR model parameters as features. Results indicate reliable classification among several different artifact conditions across subjects (approximately 94%). These results suggest that AR modeling can be a useful tool for discriminating among artifact signals both within and across individuals. Copyright © 2012 Elsevier B.V. All rights reserved.
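
    As a rough sketch of the pipeline described (the data, window length, and AR order below are illustrative assumptions, not the authors' settings), AR coefficients fitted per EEG window can serve directly as SVM features:

        # Sketch: AR-coefficient features per EEG window, classified with an SVM.
        import numpy as np
        from statsmodels.tsa.ar_model import AutoReg
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        def ar_features(window, order=6):
            """Fit an AR(order) model and return its lag coefficients as features."""
            result = AutoReg(window, lags=order).fit()
            return result.params[1:]          # drop the intercept term

        rng = np.random.default_rng(0)
        # Placeholder windows: 100 segments of 256 samples per class
        clean = [rng.standard_normal(256) for _ in range(100)]
        blink = [rng.standard_normal(256).cumsum() for _ in range(100)]   # stand-in artifact

        X = np.array([ar_features(w) for w in clean + blink])
        y = np.array([0] * len(clean) + [1] * len(blink))

        clf = SVC(kernel='rbf', C=1.0)
        print(cross_val_score(clf, X, y, cv=5).mean())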

  4. A Review on Recent Developments for Detection of Diabetic Retinopathy.

    PubMed

    Amin, Javeria; Sharif, Muhammad; Yasmin, Mussarat

    2016-01-01

    Diabetic retinopathy is caused by damage to the retinal microvasculature resulting from diabetes mellitus. Blindness may follow from unchecked and severe cases of diabetic retinopathy. Manual inspection of fundus images to check for morphological changes in microaneurysms, exudates, blood vessels, hemorrhages, and the macula is very time-consuming and tedious work. It can be made easier with the help of a computer-aided system, which also reduces inter-observer variability. In this paper, several techniques for detecting microaneurysms, hemorrhages, and exudates are discussed for the ultimate detection of nonproliferative diabetic retinopathy. Blood vessel detection techniques are also discussed for the diagnosis of proliferative diabetic retinopathy. Furthermore, the paper discusses the experiments assessed by the authors for the detection of diabetic retinopathy. This work will be helpful for researchers and technical persons who want to utilize the ongoing research in this area.

  5. Automatic liquid handling for life science: a critical review of the current state of the art.

    PubMed

    Kong, Fanwei; Yuan, Liang; Zheng, Yuan F; Chen, Weidong

    2012-06-01

    Liquid handling plays a pivotal role in life science laboratories. In experiments such as gene sequencing, protein crystallization, antibody testing, and drug screening, liquid biosamples frequently must be transferred between containers of varying sizes and/or dispensed onto substrates of varying types. The sample volumes are usually small, at the micro- or nanoliter level, and the number of transferred samples can be huge when investigating large-scope combinatorial conditions. Under these conditions, liquid handling by hand is tedious, time-consuming, and impractical. Consequently, there is a strong demand for automated liquid-handling methods such as sensor-integrated robotic systems. In this article, we survey the current state of the art in automatic liquid handling, including technologies developed by both industry and research institutions. We focus on methods for dealing with small volumes at high throughput and point out challenges for future advancements.

  6. Automated virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Hunt, Gordon W.; Hemler, Paul F.; Vining, David J.

    1997-05-01

    Virtual colonoscopy (VC) is a minimally invasive alternative to conventional fiberoptic endoscopy for colorectal cancer screening. The VC technique involves bowel cleansing, gas distension of the colon, spiral computed tomography (CT) scanning of a patient's abdomen and pelvis, and visual analysis of multiplanar 2D and 3D images created from the spiral CT data. Despite the ability of interactive computer graphics to assist a physician in visualizing 3D models of the colon, a correct diagnosis hinges upon a physician's ability to properly identify small and sometimes subtle polyps or masses within hundreds of multiplanar and 3D images. Human visual analysis is time-consuming, tedious, and often prone to errors of interpretation. We have addressed the problem of visual analysis by creating a software system that automatically highlights potential lesions in the 2D and 3D images in order to expedite a physician's interpretation of the colon data.

  7. Class II composite resin restorations: faster, easier, predictable.

    PubMed

    Jackson, R D

    2016-11-18

    Composite resin continues to displace amalgam as the preferred direct restorative material in developed countries. Even though composite materials have evolved to include nanoparticles with high physical properties and low shrinkage stress, dentists have been challenged to efficiently create quality, long-lasting, predictable restorations. Unlike amalgam, composite resin cannot be condensed, making the establishment of a predictable, proper contact more difficult. In addition, composite requires an understanding of adhesives and an appreciation for their exacting application. These facts, combined with the precise adaptation and light-curing of multiple layers, make placement of quality Class II composite restorations tedious and time-consuming. For dentists in private practice, it can also affect economic productivity. Clinicians have always wanted an easier, more efficient placement technique for posterior composite restorations that rivals that for amalgam. It appears that advances in instrumentation, materials and technology have finally delivered it.

  8. Simplified Enrichment of Plasma Membrane Proteins from Arabidopsis thaliana Seedlings Using Differential Centrifugation and Brij-58 Treatment.

    PubMed

    Collins, Carina A; Leslie, Michelle E; Peck, Scott C; Heese, Antje

    2017-01-01

    The plasma membrane (PM) forms a barrier between a plant cell and its environment. Proteins at this subcellular location play diverse and complex roles, including perception of extracellular signals to coordinate cellular changes. Analyses of PM proteins, however, are often limited by the relatively low abundance of these proteins in the total cellular protein pool. Techniques traditionally used for enrichment of PM proteins are time consuming, tedious, and require extensive optimization. Here, we provide a simple and reproducible enrichment procedure for PM proteins from Arabidopsis thaliana seedlings starting from total microsomal membranes isolated by differential centrifugation. To enrich for PM proteins, total microsomes are treated with the nonionic detergent Brij-58 to decrease the abundance of contaminating organellar proteins. This protocol combined with the genetic resources available in Arabidopsis provides a powerful tool that will enhance our understanding of proteins at the PM.

  9. A Review on Recent Developments for Detection of Diabetic Retinopathy

    PubMed Central

    2016-01-01

    Diabetic retinopathy is caused by damage to the retinal microvasculature resulting from diabetes mellitus. Blindness may follow from unchecked and severe cases of diabetic retinopathy. Manual inspection of fundus images to check for morphological changes in microaneurysms, exudates, blood vessels, hemorrhages, and the macula is very time-consuming and tedious work. It can be made easier with the help of a computer-aided system, which also reduces inter-observer variability. In this paper, several techniques for detecting microaneurysms, hemorrhages, and exudates are discussed for the ultimate detection of nonproliferative diabetic retinopathy. Blood vessel detection techniques are also discussed for the diagnosis of proliferative diabetic retinopathy. Furthermore, the paper discusses the experiments assessed by the authors for the detection of diabetic retinopathy. This work will be helpful for researchers and technical persons who want to utilize the ongoing research in this area. PMID:27777811

  10. Simple Recovery of Intracellular Gold Nanoparticles from Peanut Seedling Roots.

    PubMed

    Raju, D; Mehta, Urmil J; Ahmad, Absar

    2015-02-01

    Fabrication of inorganic nanomaterials via a biological route results in formation that is either extracellular, intracellular, or both. Whereas extracellular formation of these nanomaterials is cherished owing to easy and economical extraction and purification, intracellular formation has always been dreaded for the lack of a proper recovery protocol, as the extraction processes used so far were tedious, costly, time-consuming, and often resulted in very low recovery. The aim of the present study was to overcome the problems related to the extraction and recovery of intracellularly synthesized inorganic nanoparticles, and to devise a method that increases the yield without altering the shape, size, composition, or dispersion of the nanoparticles. Water proved to be a much better system as it provided well-dispersed, stable gold nanoparticles and higher recovery. This is the first report in which intracellular nanoparticles have been recovered using a very cost-effective and eco-friendly approach.

  11. Automatic stage identification of Drosophila egg chamber based on DAPI images

    PubMed Central

    Jia, Dongyu; Xu, Qiuping; Xie, Qian; Mio, Washington; Deng, Wu-Min

    2016-01-01

    The Drosophila egg chamber, whose development is divided into 14 stages, is a well-established model for developmental biology. However, visual stage determination can be a tedious, subjective and time-consuming task prone to errors. Our study presents an objective, reliable and repeatable automated method for quantifying cell features and classifying egg chamber stages based on DAPI images. The proposed approach is composed of two steps: 1) a feature extraction step and 2) a statistical modeling step. The egg chamber features used are egg chamber size, oocyte size, egg chamber ratio and distribution of follicle cells. Methods for determining the onset of the polytene stage and centripetal migration are also discussed. The statistical model uses linear and ordinal regression to explore the stage-feature relationships and classify egg chamber stages. Combined with machine learning, our method has great potential to enable discovery of hidden developmental mechanisms. PMID:26732176
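
    A minimal sketch of the statistical-modeling step (feature values and the manually staged training examples below are made up for illustration; the paper also uses ordinal regression, which is omitted here for brevity): regress the stage on the extracted features and round the prediction to the nearest stage.

        # Sketch: regress egg-chamber stage (1-14) on morphological features.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        # Columns: egg chamber area, oocyte area, oocyte/chamber ratio (illustrative units)
        X_train = np.array([
            [1200,   60, 0.05],
            [2500,  300, 0.12],
            [5200, 1500, 0.29],
            [9800, 4900, 0.50],
        ])
        y_train = np.array([3, 6, 9, 12])        # manually staged examples (placeholder)

        model = LinearRegression().fit(X_train, y_train)

        new_chamber = np.array([[4300, 1100, 0.26]])
        stage = int(round(model.predict(new_chamber)[0]))
        print(f"predicted stage: {stage}")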

  12. Interactive autonomy and robotic skills

    NASA Technical Reports Server (NTRS)

    Kellner, A.; Maediger, B.

    1994-01-01

    Current concepts of robot-supported operations for space laboratories (payload servicing, inspection, repair, and ORU exchange) are mainly based on the concept of 'interactive autonomy' which implies autonomous behavior of the robot according to predefined timelines, predefined sequences of elementary robot operations and within predefined world models supplying geometrical and other information for parameter instantiation on the one hand, and the ability to override and change the predefined course of activities by human intervention on the other hand. Although in principle a very powerful and useful concept, in practice the confinement of the robot to the abstract world models and predefined activities appears to reduce the robot's stability within real world uncertainties and its applicability to non-predefined parts of the world, calling for frequent corrective interaction by the operator, which in itself may be tedious and time-consuming. Methods are presented to improve this situation by incorporating 'robotic skills' into the concept of interactive autonomy.

  13. How to quantify conduits in wood?

    PubMed

    Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven

    2013-01-01

    Vessels and tracheids represent the most important xylem cells with respect to long distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques, although there is no standard protocol to quantify conduits due to high anatomical variation and a wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, various characters remain time-consuming and tedious. Quantification of vessels and tracheids is not only important to better understand functional adaptations of tracheary elements to environment parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology.

  14. Design of a self-aligned, wide temperature range (300 mK-300 K) atomic force microscope/magnetic force microscope with 10 nm magnetic force microscope resolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karcı, Özgür; Department of Nanotechnology and Nanomedicine, Hacettepe University, Beytepe, 06800 Ankara; Dede, Münir

    We describe the design of a wide temperature range (300 mK-300 K) atomic force microscope/magnetic force microscope with a self-aligned fibre-cantilever mechanism. An alignment chip with alignment grooves and a special mechanical design are used to eliminate the tedious and time-consuming fibre-cantilever alignment procedure over the entire temperature range. A low-noise Michelson fibre interferometer was integrated into the system for measuring deflection of the cantilever. The spectral noise density of the system was measured to be ~12 fm/√Hz at 4.2 K at 3 mW incident optical power. Abrikosov vortices in a BSCCO(2212) single crystal sample and a high-density hard disk sample were imaged at 10 nm resolution to demonstrate the performance of the system.

  15. Enhanced FIB-SEM systems for large-volume 3D imaging.

    PubMed

    Xu, C Shan; Hayworth, Kenneth J; Lu, Zhiyuan; Grob, Patricia; Hassan, Ahmed M; García-Cerdán, José G; Niyogi, Krishna K; Nogales, Eva; Weinberg, Richard J; Hess, Harald F

    2017-05-13

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10⁶ µm³. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology.

  16. Bone suppression in CT angiography data by region-based multiresolution segmentation

    NASA Astrophysics Data System (ADS)

    Blaffert, Thomas; Wiemker, Rafael; Lin, Zhong Min

    2003-05-01

    Multi-slice CT (MSCT) scanners have the advantage of high and isotropic image resolution, which broadens the range of examinations for CT angiography (CTA). A very important method to present the large amount of high-resolution 3D data is visualization by maximum intensity projections (MIP). A problem with MIP projections in angiography is that bones often hide the vessels of interest, especially the skull and vertebral column. Software tools for manual selection of bone regions and their suppression in the MIP are available, but processing is time-consuming and tedious. A highly computer-assisted or even fully automated suppression of bones would considerably speed up the examination and probably increase the number of examined cases. In this paper we investigate the suppression (or removal) of bone regions in 3D CT data sets for vascular examinations of the head with a visualization of the carotids and the circle of Willis.

  17. A rapid method for counting nucleated erythrocytes on stained blood smears by digital image analysis

    USGS Publications Warehouse

    Gering, E.; Atkinson, C.T.

    2004-01-01

    Measures of parasitemia by intraerythrocytic hematozoan parasites are normally expressed as the number of infected erythrocytes per n erythrocytes and are notoriously tedious and time consuming to measure. We describe a protocol for generating rapid counts of nucleated erythrocytes from digital micrographs of thin blood smears that can be used to estimate intensity of hematozoan infections in nonmammalian vertebrate hosts. This method takes advantage of the bold contrast and relatively uniform size and morphology of erythrocyte nuclei on Giemsa-stained blood smears and uses ImageJ, a java-based image analysis program developed at the U.S. National Institutes of Health and available on the internet, to recognize and count these nuclei. This technique makes feasible rapid and accurate counts of total erythrocytes in large numbers of microscope fields, which can be used in the calculation of peripheral parasitemias in low-intensity infections.
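
    The record describes an ImageJ workflow; as a rough Python/scikit-image analogue of the same idea (threshold the darkly stained nuclei, remove debris, count connected components), with a placeholder file name and size cutoff to be tuned per microscope setup:

        # Sketch: count erythrocyte nuclei on a photographed Giemsa-stained smear field.
        from skimage import io, color, filters, measure, morphology

        img = io.imread("smear_field.png")        # placeholder RGB micrograph
        gray = color.rgb2gray(img)

        # Nuclei stain dark: threshold and keep the dark phase
        thresh = filters.threshold_otsu(gray)
        nuclei = gray < thresh

        # Remove debris smaller than a typical nucleus, then count connected components
        nuclei = morphology.remove_small_objects(nuclei, min_size=50)
        labels = measure.label(nuclei)
        print(f"nucleated erythrocytes in field: {labels.max()}")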

  18. TOOKUIL: A case study in user interface development for safety code application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, D.L.; Harkins, C.K.; Hoole, J.G.

    1997-07-01

    Traditionally, there has been a very high learning curve associated with using nuclear power plant (NPP) analysis codes. Even for seasoned plant analysts and engineers, the process of building or modifying an input model for present day NPP analysis codes is tedious, error prone, and time consuming. Current cost constraints and performance demands place an additional burden on today's safety analysis community. Advances in graphical user interface (GUI) technology have been applied to obtain significant productivity and quality assurance improvements for the Transient Reactor Analysis Code (TRAC) input model development. KAPL Inc. has developed an X Windows-based graphical user interface named TOOKUIL which supports the design and analysis process, acting as a preprocessor, runtime editor, help system, and post processor for TRAC. This paper summarizes the objectives of the project, the GUI development process and experiences, and the resulting end product, TOOKUIL.

  19. One Dimensional Analysis Model of a Condensing Spray Chamber Including Rocket Exhaust Using SINDA/FLUINT and CEA

    NASA Technical Reports Server (NTRS)

    Sakowski, Barbara; Edwards, Daryl; Dickens, Kevin

    2014-01-01

    Modeling droplet condensation via CFD codes can be very tedious, time consuming, and inaccurate. CFD codes may be tedious and time consuming in terms of using Lagrangian particle tracking approaches or particle sizing bins. Also, since many codes ignore conduction through the droplet and/or the degrading effect on heat and mass transfer when noncondensible species are present, the solutions may be inaccurate. In a condensing spray chamber, the significant size of the water droplets and the time and distance these droplets take to fall make droplet conduction a physical factor that needs to be considered in the model. Furthermore, the presence of even a relatively small amount of noncondensibles has been shown to reduce the amount of condensation [Ref 1]. It is desirable, then, to create a modeling tool that addresses these issues. The path taken to create such a tool is illustrated. The application of this tool and subsequent results are based on the spray chamber in the Spacecraft Propulsion Research Facility (B2) located at NASA's Plum Brook Station that tested an RL-10 engine. The platform upon which the condensation physics is modeled is SINDA/FLUINT. The use of SINDA/FLUINT enables the modeling of various aspects of the entire testing facility, including the rocket exhaust duct flow and heat transfer to the exhaust duct wall. The ejector pumping system of the spray chamber is also easily implemented via SINDA/FLUINT. The goal is to create a transient, one-dimensional flow and heat transfer model beginning at the rocket, continuing through the condensing spray chamber, and ending with the ejector pumping system. However, the model of the condensing spray chamber may be run independently of the rocket and ejector system details, with appropriate mass flow boundary conditions placed at the entrance and exit of the condensing spray chamber model. The model of the condensing spray chamber takes into account droplet conduction as well as the degrading effect on mass and heat transfer due to the presence of noncondensibles. The one-dimensional model of the condensing spray chamber makes no presupposition about the pressure profile within the chamber, allowing the implemented droplet physics of heat and mass transfer, coupled to the SINDA/FLUINT solver, to determine a transient pressure profile of the condensing spray chamber. Model results compare well to the RL-10 engine pressure test data.

  20. The Bright, Artificial Intelligence-Augmented Future of Neuroimaging Reading

    PubMed Central

    Hainc, Nicolin; Federau, Christian; Stieltjes, Bram; Blatow, Maria; Bink, Andrea; Stippich, Christoph

    2017-01-01

    Radiologists are among the first physicians to be directly affected by advances in computer technology. Computers are already capable of analyzing medical imaging data, and with decades' worth of digital information available for training, will an artificial intelligence (AI) one day signal the end of the human radiologist? With the ever-increasing workload combined with the looming doctor shortage, radiologists will be pushed far beyond their current estimated 3 s of allotted analysis time per image; an AI with super-human capabilities might seem like a logical replacement. We feel, however, that AI will lead to an augmentation rather than a replacement of the radiologist. The AI will be relied upon to handle the tedious, time-consuming tasks of detecting and segmenting outliers while possibly generating new, unanticipated results that can then be used as sources of medical discovery. This will affect not only radiologists but all physicians and also researchers dealing with medical imaging. Therefore, we must embrace future technology and collaborate across disciplines to spearhead the next revolution in medicine. PMID:28983278

  1. A semi-automatic annotation tool for cooking video

    NASA Astrophysics Data System (ADS)

    Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe

    2013-03-01

    In order to create a cooking assistant application to guide users in the preparation of dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.

  2. Rapid Measurement of Soil Carbon in Rice Paddy Field of Lombok Island Indonesia Using Near Infrared Technology

    NASA Astrophysics Data System (ADS)

    Kusumo, B. H.; Sukartono, S.; Bustan, B.

    2018-02-01

    Measuring soil organic carbon (C) using conventional analysis is a tedious, time-consuming and expensive procedure. A simple procedure that is cheap and saves time is needed. Near infrared technology offers a rapid procedure, as it works from the soil spectral reflectance and requires no chemicals. The aim of this research is to test whether this technology is able to rapidly measure soil organic C in rice paddy fields. Soil samples were collected from rice paddy fields of Lombok Island, Indonesia, and the coordinates of the samples were recorded. Parts of the samples were analysed using conventional analysis (Walkley and Black) and other parts were scanned using near infrared spectroscopy (NIRS) for soil spectral collection. Partial Least Squares Regression (PLSR) models were developed using the soil C data from the conventional analysis and the soil spectral reflectance data. The models were moderately successful in measuring soil C in rice paddy fields of Lombok Island. This shows that NIR technology can be further used to monitor the C change in rice paddy soil.
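
    A minimal sketch of the calibration step (the spectra and carbon values below are synthetic placeholders standing in for the actual NIR scans and Walkley-Black results; the number of latent components is an assumption):

        # Sketch: PLSR calibration relating NIR reflectance spectra to measured soil C.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import r2_score

        rng = np.random.default_rng(1)
        n_samples, n_wavelengths = 120, 350
        spectra = rng.random((n_samples, n_wavelengths))                   # stand-in spectra
        soil_c = spectra[:, 50] * 2.0 + rng.normal(0, 0.05, n_samples)     # stand-in lab C (%)

        X_tr, X_te, y_tr, y_te = train_test_split(spectra, soil_c, test_size=0.3, random_state=0)

        pls = PLSRegression(n_components=8)
        pls.fit(X_tr, y_tr)
        print("validation R^2:", r2_score(y_te, pls.predict(X_te).ravel()))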

  3. High-definition optical coherence tomography - an aid to clinical practice and research in dermatology.

    PubMed

    Cao, Taige; Tey, Hong Liang

    2015-09-01

    At present, beyond clinical assessment, the diagnosis of skin diseases is primarily made histologically. However, skin biopsies have many disadvantages, including pain, scarring, risk of infection, and sampling error. With recent advances in skin imaging technology, the clinical use of imaging methods for the practical management of skin diseases has become an option. In vivo high-definition optical coherence tomography (HD-OCT) has recently been developed and commercialized (Skintell; Agfa, Belgium). Compared with conventional OCT, it has a higher resolution; compared with reflectance confocal microscopy, it has a shorter time for image acquisition as well as a greater penetration depth and a larger field of view. HD-OCT is promising, but much work is still required to develop it from a research tool into a valuable adjunct for the noninvasive diagnosis of skin lesions. Substantial work has been done to identify HD-OCT features in various diseases, but interpretation can be time-consuming and tedious. Projects aimed at automating these processes and improving image quality are currently under way. © 2015 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  4. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, Hao-Qiang; Frumkin, Michael; Hribar, Michelle; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but the task can be simplified by high level languages and would be better still automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study we implemented the parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation and the pros and cons of the different approaches will be discussed, along with experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of the techniques to realistic aerospace applications will be presented.

  5. Fast and accurate resonance assignment of small-to-large proteins by combining automated and manual approaches.

    PubMed

    Niklasson, Markus; Ahlner, Alexandra; Andresen, Cecilia; Marsh, Joseph A; Lundström, Patrik

    2015-01-01

    The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available.

  6. MPI, HPF or OpenMP: A Study with the NAS Benchmarks

    NASA Technical Reports Server (NTRS)

    Jin, H.; Frumkin, M.; Hribar, M.; Waheed, A.; Yan, J.; Saini, Subhash (Technical Monitor)

    1999-01-01

    Porting applications to new high performance parallel and distributed platforms is a challenging task. Writing parallel code by hand is time consuming and costly, but this task can be simplified by high level languages and would be better still automated by parallelizing tools and compilers. The definition of the HPF (High Performance Fortran, based on the data parallel model) and OpenMP (based on the shared memory parallel model) standards has offered great opportunity in this respect. Both provide simple and clear interfaces to languages like FORTRAN and simplify many tedious tasks encountered in writing message passing programs. In our study, we implemented the parallel versions of the NAS Benchmarks with HPF and OpenMP directives. Comparison of their performance with the MPI implementation and the pros and cons of the different approaches will be discussed, along with experience of using computer-aided tools to help parallelize these benchmarks. Based on the study, the potential of applying some of the techniques to realistic aerospace applications will be presented.

  7. Enhanced FIB-SEM systems for large-volume 3D imaging

    PubMed Central

    Xu, C Shan; Hayworth, Kenneth J; Lu, Zhiyuan; Grob, Patricia; Hassan, Ahmed M; García-Cerdán, José G; Niyogi, Krishna K; Nogales, Eva; Weinberg, Richard J; Hess, Harald F

    2017-01-01

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10⁶ µm³. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology. DOI: http://dx.doi.org/10.7554/eLife.25916.001 PMID:28500755

  8. Enhanced FIB-SEM systems for large-volume 3D imaging

    DOE PAGES

    Xu, C. Shan; Hayworth, Kenneth J.; Lu, Zhiyuan; ...

    2017-05-13

    Focused Ion Beam Scanning Electron Microscopy (FIB-SEM) can automatically generate 3D images with superior z-axis resolution, yielding data that needs minimal image registration and related post-processing. Obstacles blocking wider adoption of FIB-SEM include slow imaging speed and lack of long-term system stability, which caps the maximum possible acquisition volume. Here, we present techniques that accelerate image acquisition while greatly improving FIB-SEM reliability, allowing the system to operate for months and generating continuously imaged volumes > 10⁶ µm³. These volumes are large enough for connectomics, where the excellent z resolution can help in tracing of small neuronal processes and accelerate the tedious and time-consuming human proofreading effort. Even higher resolution can be achieved on smaller volumes. We present example data sets from mammalian neural tissue, Drosophila brain, and Chlamydomonas reinhardtii to illustrate the power of this novel high-resolution technique to address questions in both connectomics and cell biology.

  9. Automated structure refinement of macromolecular assemblies from cryo-EM maps using Rosetta.

    PubMed

    Wang, Ray Yu-Ruei; Song, Yifan; Barad, Benjamin A; Cheng, Yifan; Fraser, James S; DiMaio, Frank

    2016-09-26

    Cryo-EM has revealed the structures of many challenging yet exciting macromolecular assemblies at near-atomic resolution (3-4.5Å), providing biological phenomena with molecular descriptions. However, at these resolutions, accurately positioning individual atoms remains challenging and error-prone. Manually refining thousands of amino acids - typical in a macromolecular assembly - is tedious and time-consuming. We present an automated method that can improve the atomic details in models that are manually built in near-atomic-resolution cryo-EM maps. Applying the method to three systems recently solved by cryo-EM, we are able to improve model geometry while maintaining the fit-to-density. Backbone placement errors are automatically detected and corrected, and the refinement shows a large radius of convergence. The results demonstrate that the method is amenable to structures with symmetry, of very large size, and containing RNA as well as covalently bound ligands. The method should streamline the cryo-EM structure determination process, providing accurate and unbiased atomic structure interpretation of such maps.

  10. Fluorometric determination of zirconium in minerals

    USGS Publications Warehouse

    Alford, W.C.; Shapiro, L.; White, C.E.

    1951-01-01

    The increasing use of zirconium in alloys and in the ceramics industry has created renewed interest in methods for its determination. It is a common constituent of many minerals, but is usually present in very small amounts. Published methods tend to be tedious, time-consuming, and uncertain as to accuracy. A new fluorometric procedure, which overcomes these objections to a large extent, is based on the blue fluorescence given by zirconium and flavonol in sulfuric acid solution. Hafnium is the only element that interferes. The sample is fused with borax glass and sodium carbonate and extracted with water. The residue is dissolved in sulfuric acid, made alkaline with sodium hydroxide to separate aluminum, and filtered. The precipitate is dissolved in sulfuric acid and electrolysed in a Melaven cell to remove iron. Flavonol is then added and the fluorescence intensity is measured with a photo-fluorometer. Analysis of seven standard mineral samples shows excellent results. The method is especially useful for minerals containing less than 0.25% zirconium oxide.

  11. Automated Epileptiform Spike Detection via Affinity Propagation-Based Template Matching

    PubMed Central

    Thomas, John; Jin, Jing; Dauwels, Justin; Cash, Sydney S.; Westover, M. Brandon

    2018-01-01

    Interictal epileptiform spikes are the key diagnostic biomarkers for epilepsy. The clinical gold standard of spike detection is visual inspection performed by neurologists. This is a tedious, time-consuming, and expert-centered process. The development of automated spike detection systems is necessary in order to provide a faster and more reliable diagnosis of epilepsy. In this paper, we propose an efficient template matching spike detector based on a combination of spike and background waveform templates. We generate a template library by clustering a collection of spikes and background waveforms extracted from a database of 50 patients with epilepsy. We benchmark the performance of five clustering techniques based on the receiver operating characteristic (ROC) curves. In addition, background templates are integrated with existing spike templates to improve the overall performance. The affinity propagation-based template matching system with a combination of spike and background templates is shown to outperform the other four conventional methods with the highest area-under-curve (AUC) of 0.953. PMID:29060543
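
    A minimal sketch of the template-library idea (the waveforms below are synthetic stand-ins for detected EEG events, and the correlation-based matching score is a simplification of the full detector):

        # Sketch: build a template library with affinity propagation, then score a
        # candidate waveform against the templates by normalized correlation.
        import numpy as np
        from sklearn.cluster import AffinityPropagation

        rng = np.random.default_rng(2)
        # Placeholder "detected events": three base shapes plus noise, 64 samples each
        base = np.stack([np.sin(np.linspace(0, k * np.pi, 64)) for k in (1, 2, 3)])
        waveforms = np.vstack([shape + 0.1 * rng.standard_normal((70, 64)) for shape in base])

        ap = AffinityPropagation(random_state=0).fit(waveforms)
        templates = ap.cluster_centers_          # exemplar waveforms = template library

        def best_template_score(candidate, templates):
            """Highest Pearson correlation between a candidate and any template."""
            c = (candidate - candidate.mean()) / candidate.std()
            scores = [np.corrcoef(c, (t - t.mean()) / t.std())[0, 1] for t in templates]
            return max(scores)

        print(best_template_score(base[0] + 0.1 * rng.standard_normal(64), templates))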

  12. Fast and Accurate Resonance Assignment of Small-to-Large Proteins by Combining Automated and Manual Approaches

    PubMed Central

    Niklasson, Markus; Ahlner, Alexandra; Andresen, Cecilia; Marsh, Joseph A.; Lundström, Patrik

    2015-01-01

    The process of resonance assignment is fundamental to most NMR studies of protein structure and dynamics. Unfortunately, the manual assignment of residues is tedious and time-consuming, and can represent a significant bottleneck for further characterization. Furthermore, while automated approaches have been developed, they are often limited in their accuracy, particularly for larger proteins. Here, we address this by introducing the software COMPASS, which, by combining automated resonance assignment with manual intervention, is able to achieve accuracy approaching that from manual assignments at greatly accelerated speeds. Moreover, by including the option to compensate for isotope shift effects in deuterated proteins, COMPASS is far more accurate for larger proteins than existing automated methods. COMPASS is an open-source project licensed under GNU General Public License and is available for download from http://www.liu.se/forskning/foass/tidigare-foass/patrik-lundstrom/software?l=en. Source code and binaries for Linux, Mac OS X and Microsoft Windows are available. PMID:25569628

  13. Magnetic hybrid magnetite/metal organic framework nanoparticles: facile preparation, post-synthetic biofunctionalization and tracking in vivo with magnetic methods

    NASA Astrophysics Data System (ADS)

    Tregubov, A. A.; Sokolov, I. L.; Babenyshev, A. V.; Nikitin, P. I.; Cherkasov, V. R.; Nikitin, M. P.

    2018-03-01

    Multifunctional hybrid nanocomposites remain of great interest in biomedicine as a universal tool in a number of applications. As a promising example, nanoparticles with a magnetic core and a porous shell have potential as theranostic agents combining the properties of a diagnostic probe and a drug delivery vehicle. However, reported methods of nanostructure preparation are complex and include tedious, time-consuming growth of the porous shell by means of a layer-by-layer assembly technique. In this study, we develop a new way of fabricating superparamagnetic magnetite core @ porous metal organic framework shell nanoparticles and demonstrate their application as multimodal (MRI contrasting, magnetometric and optical labeling) and multifunctional (in vivo bioimaging, biotargeting by coupled receptors, lateral flow assay) agents. The ease of fabrication, controllable bioconjugation properties and low level of non-specific binding indicate the high potential of the nanoparticles to be employed as multifunctional agents in theranostics, advanced biosensing and bioimaging.

  14. Aqueous two-phase systems enable multiplexing of homogeneous immunoassays

    PubMed Central

    Simon, Arlyne B.; Frampton, John P.; Huang, Nien-Tsu; Kurabayashi, Katsuo; Paczesny, Sophie; Takayama, Shuichi

    2014-01-01

    Quantitative measurement of protein biomarkers is critical for biomarker validation and early disease detection. Current multiplex immunoassays are time-consuming, costly and can suffer from low accuracy. For example, multiplex ELISAs require multiple, tedious, washing and blocking steps. Moreover, they suffer from nonspecific antibody cross-reactions, leading to high background and false-positive signals. Here, we show that co-localizing antibody-bead pairs in an aqueous two-phase system (ATPS) enables multiplexing of sensitive, no-wash, homogeneous assays, while preventing nonspecific antibody cross-reactions. Our cross-reaction-free, multiplex assay can simultaneously detect picomolar concentrations of four protein biomarkers ((C-X-C motif) ligand 10 (CXCL10), CXCL9, interleukin (IL)-8 and IL-6) in cell supernatants using a single assay well. The potential clinical utility of the assay is demonstrated by detecting diagnostic biomarkers (CXCL10 and CXCL9) in plasma from 88 patients at the onset of the clinical symptoms of chronic graft-versus-host disease (GVHD). PMID:25083509

  15. Computational Biology Methods for Characterization of Pluripotent Cells.

    PubMed

    Araúzo-Bravo, Marcos J

    2016-01-01

    Pluripotent cells are a powerful tool for regenerative medicine and drug discovery. Several techniques have been developed to induce pluripotency, or to extract pluripotent cells from different tissues and biological fluids. However, the characterization of pluripotency requires tedious, expensive, time-consuming, and not always reliable wet-lab experiments; thus, an easy, standard quality-control protocol for pluripotency assessment remains to be established. Here, high-throughput techniques can help, and in particular the use of gene expression microarrays has become a complementary technique for cellular characterization. Research has shown that transcriptomic comparison with an Embryonic Stem Cell (ESC) of reference is a good approach to assess pluripotency. Under the premise that the best protocol is computer software source code, here I propose and explain, line by line, a software protocol coded in R-Bioconductor for pluripotency assessment based on the comparison of transcriptomics data of pluripotent cells with an ESC of reference. I provide advice for experimental design, warn about possible pitfalls, and give guidance for the interpretation of results.
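
    The chapter's protocol itself is written in R-Bioconductor; purely to illustrate the core comparison idea in a different language (gene names and expression values below are placeholders), a sample's expression profile can be scored against an ESC reference by correlation over a shared gene set:

        # Sketch: score pluripotency as the correlation of a sample's expression
        # profile with a reference ESC profile over a shared set of genes.
        import numpy as np
        from scipy.stats import pearsonr

        genes = ["POU5F1", "NANOG", "SOX2", "LIN28A", "GATA4", "T"]
        esc_reference = np.array([12.1, 11.4, 10.9, 10.2, 3.1, 2.8])   # log2 expression, placeholder
        sample        = np.array([11.8, 11.0, 10.5,  9.9, 3.5, 3.0])   # placeholder

        r, _ = pearsonr(esc_reference, sample)
        print(f"similarity to ESC reference: r = {r:.3f}")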

  16. Estimation of Wheat Plant Density at Early Stages Using High Resolution Imagery

    PubMed Central

    Liu, Shouyang; Baret, Fred; Andrieu, Bruno; Burger, Philippe; Hemmerlé, Matthieu

    2017-01-01

    Crop density is a key agronomical trait used to manage wheat crops and estimate yield. Visual counting of plants in the field is currently the most common method used. However, it is tedious and time consuming. The main objective of this work is to develop a machine vision based method to automate the density survey of wheat at early stages. RGB images taken with a high resolution RGB camera are classified to identify the green pixels corresponding to the plants. Crop rows are extracted and the connected components (objects) are identified. A neural network is then trained to estimate the number of plants in the objects using the object features. The method was evaluated over three experiments showing contrasted conditions with sowing densities ranging from 100 to 600 seeds⋅m-2. Results demonstrate that the density is accurately estimated with an average relative error of 12%. The pipeline developed here provides an efficient and accurate estimate of wheat plant density at early stages. PMID:28559901
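
    A rough sketch of the pipeline described (the file name, green-index threshold, and the placeholder training data are assumptions; in practice the regressor is trained on object features from manually counted images): segment green pixels, label connected plant objects per image, and predict plants-per-object from object features.

        # Sketch: excess-green segmentation, object labeling, and a small neural
        # network that maps object features to an estimated plant count.
        import numpy as np
        from skimage import io, measure
        from sklearn.neural_network import MLPRegressor

        img = io.imread("plot_image.png").astype(float)           # placeholder RGB field image
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        exg = 2 * g - r - b                                       # excess-green index
        plants = exg > 20                                         # vegetation mask (tunable)

        labels = measure.label(plants)
        props = measure.regionprops(labels)
        features = np.array([[p.area, p.major_axis_length, p.minor_axis_length]
                             for p in props])

        # Placeholder training data (object features -> manually counted plants)
        X_train = np.abs(np.random.default_rng(3).standard_normal((50, 3))) * 100
        y_train = X_train[:, 0] / 40                              # stand-in counts

        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        net.fit(X_train, y_train)
        print("estimated plants in image:", float(net.predict(features).sum()))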

  17. Transition Time: Make It a Learning Time.

    ERIC Educational Resources Information Center

    Baker, Betty Ruth

    Teacher selection and planning of appropriate transition activities for preschool age children is discussed in this paper. Teachers are encouraged to use transition time to provide an opportunity for imaginative and creative thinking and to avoid tedious waiting and chaos. Transition activities can be used as a teaching technique to prepare…

  18. Simulation of Simple Controlled Processes with Dead-Time.

    ERIC Educational Resources Information Center

    Watson, Keith R.; And Others

    1985-01-01

    The determination of closed-loop response of processes containing dead-time is typically not covered in undergraduate process control, possibly because the solution by Laplace transforms requires the use of Pade approximation for dead-time, which makes the procedure lengthy and tedious. A computer-aided method is described which simplifies the…

  19. Thoughtflow: Standards and Tools for Provenance Capture and Workflow Definition to Support Model-Informed Drug Discovery and Development.

    PubMed

    Wilkins, J J; Chan, Pls; Chard, J; Smith, G; Smith, M K; Beer, M; Dunn, A; Flandorfer, C; Franklin, C; Gomeni, R; Harnisch, L; Kaye, R; Moodie, S; Sardu, M L; Wang, E; Watson, E; Wolstencroft, K; Cheung, Sya

    2017-05-01

    Pharmacometric analyses are complex and multifactorial. It is essential to check, track, and document the vast amounts of data and metadata that are generated during these analyses (and the relationships between them) in order to comply with regulations, support quality control, auditing, and reporting. It is, however, challenging, tedious, error-prone, and time-consuming, and diverts pharmacometricians from the more useful business of doing science. Automating this process would save time, reduce transcriptional errors, support the retention and transfer of knowledge, encourage good practice, and help ensure that pharmacometric analyses appropriately impact decisions. The ability to document, communicate, and reconstruct a complete pharmacometric analysis using an open standard would have considerable benefits. In this article, the Innovative Medicines Initiative (IMI) Drug Disease Model Resources (DDMoRe) consortium proposes a set of standards to facilitate the capture, storage, and reporting of knowledge (including assumptions and decisions) in the context of model-informed drug discovery and development (MID3), as well as to support reproducibility: "Thoughtflow." A prototype software implementation is provided. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  20. Autonomous characterization of plastic-bonded explosives

    NASA Astrophysics Data System (ADS)

    Linder, Kim Dalton; DeRego, Paul; Gomez, Antonio; Baumgart, Chris

    2006-08-01

    Plastic-Bonded Explosives (PBXs) are a newer generation of explosive compositions developed at Los Alamos National Laboratory (LANL). Understanding the micromechanical behavior of these materials is critical. The size of the crystal particles and porosity within the PBX influences their shock sensitivity. Current methods to characterize the prominent structural characteristics include manual examination by scientists and attempts to use commercially available image processing packages. Both methods are time consuming and tedious. LANL personnel, recognizing this as a manually intensive process, have worked with the Kansas City Plant / Kirtland Operations to develop a system which utilizes image processing and pattern recognition techniques to characterize PBX material. System hardware consists of a CCD camera, zoom lens, two-dimensional, motorized stage, and coaxial, cross-polarized light. System integration of this hardware with the custom software is at the core of the machine vision system. Fundamental processing steps involve capturing images from the PBX specimen, and extraction of void, crystal, and binder regions. For crystal extraction, a Quadtree decomposition segmentation technique is employed. Benefits of this system include: (1) reduction of the overall characterization time; (2) a process which is quantifiable and repeatable; (3) utilization of personnel for intelligent review rather than manual processing; and (4) significantly enhanced characterization accuracy.
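
    The record names quadtree decomposition for crystal extraction; as a minimal sketch of what such a decomposition looks like (the intensity-range homogeneity criterion, threshold, and minimum block size are assumptions, and random noise stands in for a real micrograph):

        # Sketch: quadtree decomposition of a grayscale PBX micrograph. A block is
        # split into four quadrants until it is homogeneous or reaches a minimum size.
        import numpy as np

        def quadtree(img, x, y, w, h, thresh=20, min_size=8, blocks=None):
            if blocks is None:
                blocks = []
            region = img[y:y + h, x:x + w]
            if (region.max() - region.min() <= thresh) or (w <= min_size or h <= min_size):
                blocks.append((x, y, w, h))                 # homogeneous leaf block
                return blocks
            hw, hh = w // 2, h // 2
            quadtree(img, x,      y,      hw,     hh,     thresh, min_size, blocks)
            quadtree(img, x + hw, y,      w - hw, hh,     thresh, min_size, blocks)
            quadtree(img, x,      y + hh, hw,     h - hh, thresh, min_size, blocks)
            quadtree(img, x + hw, y + hh, w - hw, h - hh, thresh, min_size, blocks)
            return blocks

        rng = np.random.default_rng(4)
        micrograph = rng.integers(0, 255, size=(256, 256))   # stand-in for a PBX image
        leaves = quadtree(micrograph, 0, 0, 256, 256)
        print(f"{len(leaves)} homogeneous blocks")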

  1. Paper-based microreactor array for rapid screening of cell signaling cascades.

    PubMed

    Huang, Chia-Hao; Lei, Kin Fong; Tsang, Ngan-Ming

    2016-08-07

    Investigation of cell signaling pathways is important for the study of pathogenesis of cancer. However, the related operations used in these studies are time consuming and labor intensive. Thus, the development of effective therapeutic strategies may be hampered. In this work, gel-free cell culture and subsequent immunoassay has been successfully integrated and conducted in a paper-based microreactor array. Study of the activation level of different kinases of cells stimulated by different conditions, i.e., IL-6 stimulation, starvation, and hypoxia, was demonstrated. Moreover, rapid screening of cell signaling cascades after the stimulations of HGF, doxorubicin, and UVB irradiation was respectively conducted to simultaneously screen 40 kinases and transcription factors. Activation of multi-signaling pathways could be identified and the correlation between signaling pathways was discussed to provide further information to investigate the entire signaling network. The present technique integrates most of the tedious operations using a single paper substrate, reduces sample and reagent consumption, and shortens the time required by the entire process. Therefore, it provides a first-tier rapid screening tool for the study of complicated signaling cascades. It is expected that the technique can be developed for routine protocol in conventional biological research laboratories.

  2. SOMM: A New Service Oriented Middleware for Generic Wireless Multimedia Sensor Networks Based on Code Mobility

    PubMed Central

    Faghih, Mohammad Mehdi; Moghaddam, Mohsen Ebrahimi

    2011-01-01

    Although much research in the area of Wireless Multimedia Sensor Networks (WMSNs) has been done in recent years, the programming of sensor nodes is still time-consuming and tedious. It requires expertise in low-level programming, mainly because of the use of resource constrained hardware and also the low level API provided by current operating systems. The code of the resulting systems has typically no clear separation between application and system logic. This minimizes the possibility of reusing code and often leads to the necessity of major changes when the underlying platform is changed. In this paper, we present a service oriented middleware named SOMM to support application development for WMSNs. The main goal of SOMM is to enable the development of modifiable and scalable WMSN applications. A network which uses the SOMM is capable of providing multiple services to multiple clients at the same time with the specified Quality of Service (QoS). SOMM uses a virtual machine with the ability to support mobile agents. Services in SOMM are provided by mobile agents, and SOMM also provides a tuple space on each node which agents can use to communicate with each other. PMID:22346646

  3. Nano-materials for use in sensing of salmonella infections: Recent advances.

    PubMed

    Pashazadeh, Paria; Mokhtarzadeh, Ahad; Hasanzadeh, Mohammad; Hejazi, Maryam; Hashemi, Maryam; de la Guardia, Miguel

    2017-01-15

    Salmonella infectious diseases spreading every day through food have become a life-threatening problem for millions of people and a growing menace to society. Health experts estimate that the yearly cost of all foodborne diseases is approximately $5-6 billion. Traditional methodologies for salmonella analysis provide high reliability and very low limits of detection. Among them, immunoassays and nucleic acid-based assays provide results within 24 h, but they are expensive, tedious and time-consuming. So, there is an urgent need for development of rapid, robust and cost-effective alternative technologies for real-time monitoring of salmonella. Several biosensors have been designed and commercialized for detection of this pathogen in food and water. In this overview, we have updated the literature concerning novel biosensing methods such as various optical and electrochemical biosensors and newly developed nano- and micro-scaled and aptamer-based biosensors for detection of the salmonella pathogen. Furthermore, attention has been focused on the principal concepts, applications, and examples that have been achieved so far in diagnosing salmonella. In addition, commercial biosensors and foreseeable future trends for onsite detection of salmonella have been summarized. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Splitting a colon geometry with multiplanar clipping

    NASA Astrophysics Data System (ADS)

    Ahn, David K.; Vining, David J.; Ge, Yaorong; Stelts, David R.

    1998-06-01

    Virtual colonoscopy, a recent three-dimensional (3D) visualization technique, has provided radiologists with a unique diagnostic tool. Using this technique, a radiologist can examine the internal morphology of a patient's colon by navigating through a surface-rendered model that is constructed from helical computed tomography image data. Virtual colonoscopy can be used to detect early forms of colon cancer in a way that is less invasive and less expensive than conventional endoscopy. However, the common approach of 'flying' through the colon lumen to visually search for polyps is tedious and time-consuming, especially when a radiologist loses his or her orientation within the colon. Furthermore, a radiologist's field of view is often limited by the 3D camera position located inside the colon lumen. We have developed a new technique, called multi-planar geometry clipping, that addresses these problems. Our algorithm divides a complex colon anatomy into several smaller segments, and then splits each of these segments in half for display on a static medium. Multi-planar geometry clipping eliminates virtual colonoscopy's dependence upon expensive, real-time graphics workstations by enabling radiologists to globally inspect the entire internal surface of the colon from a single viewpoint.

  5. SOMM: A new service oriented middleware for generic wireless multimedia sensor networks based on code mobility.

    PubMed

    Faghih, Mohammad Mehdi; Moghaddam, Mohsen Ebrahimi

    2011-01-01

    Although much research in the area of Wireless Multimedia Sensor Networks (WMSNs) has been done in recent years, the programming of sensor nodes is still time-consuming and tedious. It requires expertise in low-level programming, mainly because of the use of resource-constrained hardware and the low-level APIs provided by current operating systems. The code of the resulting systems typically has no clear separation between application and system logic. This minimizes the possibility of reusing code and often necessitates major changes when the underlying platform is changed. In this paper, we present a service-oriented middleware named SOMM to support application development for WMSNs. The main goal of SOMM is to enable the development of modifiable and scalable WMSN applications. A network which uses SOMM is capable of providing multiple services to multiple clients at the same time with the specified Quality of Service (QoS). SOMM uses a virtual machine with the ability to support mobile agents. Services in SOMM are provided by mobile agents, and SOMM also provides a tuple space on each node which agents can use to communicate with each other.

  6. [Application of skin adhesives in head and neck surgery: analysis of cosmetic results, applicability and cost-effectiveness of cyanoacrylate-based adhesives].

    PubMed

    Graefe, H; Wollenberg, B; Brocks, C

    2008-09-01

    In this work, cyanoacrylate-based skin adhesives used in Germany for skin closure in head and neck surgery are compared with respect to ease of application, cost-effectiveness and cosmetic results. We compared 25 wounds sealed with a skin adhesive with 25 suture-closed wounds. Bonding of surgical wounds with glue had a high level of acceptance among all patients. The tedious, time-consuming and sometimes painful postoperative removal of many sutures is avoided. Patients can shower soon afterwards without additional protection, as the adhesive provides a waterproof barrier. Wound healing problems can be detected immediately through the transparent skin adhesive. Cosmetic long-term results of skin closure with adhesives are comparable to those of suture-closed wounds. The adhesives available on the market differ mainly in the form of the applicator, the viscosity on application, and the strength after hardening. Application is easy and significantly faster than conventional suturing. In addition to the material cost savings compared with skin sutures and Steri-Strips, expensive anesthesia and surgical time can also be saved.

  7. Unified Software Solution for Efficient SPR Data Analysis in Drug Research

    PubMed Central

    Dahl, Göran; Steigele, Stephan; Hillertz, Per; Tigerström, Anna; Egnéus, Anders; Mehrle, Alexander; Ginkel, Martin; Edfeldt, Fredrik; Holdgate, Geoff; O’Connell, Nichole; Kappler, Bernd; Brodte, Annette; Rawlins, Philip B.; Davies, Gareth; Westberg, Eva-Lotta; Folmer, Rutger H. A.; Heyse, Stephan

    2016-01-01

    Surface plasmon resonance (SPR) is a powerful method for obtaining detailed molecular interaction parameters. Modern instrumentation with its increased throughput has enabled routine screening by SPR in hit-to-lead and lead optimization programs, and SPR has become a mainstream drug discovery technology. However, the processing and reporting of SPR data in drug discovery are typically performed manually, which is both time-consuming and tedious. Here, we present the workflow concept, design, and experiences with a software module relying on a single, browser-based software platform for the processing, analysis, and reporting of SPR data. The efficiency of this concept lies in the immediate availability of end results: data are processed and analyzed upon loading the raw data file, allowing the user to quality-control the results immediately. Once completed, the user can automatically report those results to data repositories for corporate access and quickly generate printed reports or documents. The software module has resulted in a very efficient and effective workflow through saved time and improved quality control. We discuss these benefits and show how this process defines a new benchmark in the drug discovery industry for the handling, interpretation, visualization, and sharing of SPR data. PMID:27789754

  8. Application of surface plasmon resonance for the detection of carbohydrates, glycoconjugates, and measurement of the carbohydrate-specific interactions: a comparison with conventional analytical techniques. A critical review.

    PubMed

    Safina, Gulnara

    2012-01-27

    Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes, which makes these compounds important targets for detection, monitoring and identification. The identification of the carbohydrate content of their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and the use of various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity binding to biosensors. SPR does not require any labels and is capable of direct measurement of biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern instrumental analytical techniques with SPR in terms of their capability to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific binding. A few selected examples of SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Computer-aided diagnosis software for vulvovaginal candidiasis detection from Pap smear images.

    PubMed

    Momenzadeh, Mohammadreza; Vard, Alireza; Talebi, Ardeshir; Mehri Dehnavi, Alireza; Rabbani, Hossein

    2018-01-01

    Vulvovaginal candidiasis (VVC) is a common gynecologic infection that occurs when there is overgrowth of the yeast Candida. VVC diagnosis is usually done by observing a Pap smear sample under a microscope and searching for the conidium and mycelium components of Candida. This manual method is time consuming, subjective and tedious. Any diagnostic tool that detects VVC semi- or fully automatically can be very helpful to pathologists. This article presents computer-aided diagnosis (CAD) software to improve human diagnosis of VVC from Pap smear samples. The proposed software is designed based on phenotypic and morphological features of Candida in Pap smear sample images. The software provides a user-friendly interface consisting of a set of image processing tools and analytical results that helps to detect Candida and determine the severity of illness. The software was evaluated on 200 Pap smear sample images and achieved a specificity of 91.04% and a sensitivity of 92.48% for detecting VVC. As a result, the use of the proposed software reduces diagnostic time and can be employed as a second objective opinion for pathologists. © 2017 Wiley Periodicals, Inc.
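
    The specificity and sensitivity figures quoted above follow from the standard confusion-matrix definitions. The short Python snippet below shows the computation with hypothetical counts chosen only for illustration; they are not the counts from the study.

      # Sensitivity and specificity from confusion-matrix counts
      # (the counts below are hypothetical, for illustration only).
      def sensitivity(tp, fn):
          return tp / (tp + fn)          # true positive rate

      def specificity(tn, fp):
          return tn / (tn + fp)          # true negative rate

      tp, fn, tn, fp = 90, 10, 85, 15    # hypothetical screening counts
      print(f"sensitivity = {sensitivity(tp, fn):.2%}")
      print(f"specificity = {specificity(tn, fp):.2%}")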

  10. Design and Mechanical Evaluation of a Capacitive Sensor-Based Indexed Platform for Verification of Portable Coordinate Measuring Instruments

    PubMed Central

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-01

    In recent years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced cost and operational advantages compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization of their kinematic parameters. These techniques are based on capturing data with the measuring instrument from a calibrated gauge object fixed successively in various positions so that most of the instrument's measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the procedures for calibration, identification and verification of the geometrical parameters of PCMMs. The IMP allows the calibrated gauge object to be fixed and the measuring instrument to be moved in such a way that most of the instrument's working volume is covered, reducing the time and operator fatigue required to carry out these procedures. PMID:24451458

  11. Design and mechanical evaluation of a capacitive sensor-based indexed platform for verification of portable coordinate measuring instruments.

    PubMed

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-02

    In recent years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility for accomplishing in-line measuring tasks as well as their reduced cost and operational advantages compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization of their kinematic parameters. These techniques are based on capturing data with the measuring instrument from a calibrated gauge object fixed successively in various positions so that most of the instrument's measuring volume is covered, which results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the procedures for calibration, identification and verification of the geometrical parameters of PCMMs. The IMP allows the calibrated gauge object to be fixed and the measuring instrument to be moved in such a way that most of the instrument's working volume is covered, reducing the time and operator fatigue required to carry out these procedures.

  12. Disambiguating ambiguous biomedical terms in biomedical narrative text: an unsupervised method.

    PubMed

    Liu, H; Lussier, Y A; Friedman, C

    2001-08-01

    With the growing use of Natural Language Processing (NLP) techniques for information extraction and concept indexing in the biomedical domain, a method that quickly and efficiently assigns the correct sense of an ambiguous biomedical term in a given context is needed. The current approach to word sense disambiguation (WSD) in the biomedical domain is to use handcrafted rules based on contextual material. The disadvantages of this approach are that (i) generating WSD rules manually is a time-consuming and tedious task, (ii) maintenance of rule sets becomes increasingly difficult over time, and (iii) handcrafted rules are often incomplete and perform poorly in new domains composed of specialized vocabularies and different genres of text. This paper presents a two-phase unsupervised method to build a WSD classifier for an ambiguous biomedical term W. The first phase automatically creates a sense-tagged corpus for W, and the second phase derives a classifier for W using the derived sense-tagged corpus as a training set. A formative experiment demonstrated that classifiers trained on the derived sense-tagged corpora achieved an overall accuracy of about 97%, with greater than 90% accuracy for each individual ambiguous term.

  13. Modification of the USLE K factor for soil erodibility assessment on calcareous soils in Iran

    NASA Astrophysics Data System (ADS)

    Ostovari, Yaser; Ghorbani-Dashtaki, Shoja; Bahrami, Hossein-Ali; Naderi, Mehdi; Dematte, Jose Alexandre M.; Kerry, Ruth

    2016-11-01

    The measurement of soil erodibility (K) in the field is tedious, time-consuming and expensive; therefore, its prediction through pedotransfer functions (PTFs) could be far less costly and time-consuming. The aim of this study was to develop new PTFs to estimate the K factor using multiple linear regression, Mamdani fuzzy inference systems, and artificial neural networks. For this purpose, K was measured in 40 erosion plots under natural rainfall. Various soil properties, including the soil particle size distribution, calcium carbonate equivalent, organic matter, permeability, and wet-aggregate stability, were measured. The results showed that the mean measured K was 0.014 t h MJ⁻¹ mm⁻¹, 2.08 times less than the mean K estimated with the USLE model (0.030 t h MJ⁻¹ mm⁻¹). Permeability, wet-aggregate stability, very fine sand, and calcium carbonate were selected as independent variables by forward stepwise regression in order to assess the ability of multiple linear regression, Mamdani fuzzy inference systems and artificial neural networks to predict K. The calcium carbonate equivalent, which is not accounted for in the USLE model, had a significant impact on K in the multiple linear regression due to its strong influence on aggregate stability and soil permeability. Statistical indices for the validation and calibration datasets showed that the artificial neural network, with the highest R2, lowest RMSE, and lowest ME, was the best model for estimating the K factor. A strong correlation (R2 = 0.81, n = 40, p < 0.05) between the K estimated by multiple linear regression and the measured K indicates that using the calcium carbonate equivalent as a predictor variable gives a better estimation of K in areas with calcareous soils.
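
    As an illustration of the pedotransfer-function idea in the abstract above, the Python sketch below fits a multiple linear regression of K on the same four predictors selected in the study. The data, ranges, and coefficients are synthetic placeholders for demonstration only; they are not the study's dataset or fitted model.

      # Sketch of a pedotransfer function for the USLE K factor via multiple
      # linear regression (scikit-learn). Predictors mirror the study's
      # selection; all numbers below are synthetic placeholders.
      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import r2_score, mean_squared_error

      rng = np.random.default_rng(0)
      n = 40                              # 40 erosion plots in the study
      X = np.column_stack([
          rng.uniform(1, 20, n),          # permeability (mm/h), synthetic
          rng.uniform(10, 80, n),         # wet-aggregate stability (%), synthetic
          rng.uniform(1, 15, n),          # very fine sand (%), synthetic
          rng.uniform(5, 60, n),          # CaCO3 equivalent (%), synthetic
      ])
      # synthetic K (t h MJ^-1 mm^-1) with noise, only for demonstration
      K = (0.002 * X[:, 2] - 0.0002 * X[:, 1] - 0.0001 * X[:, 3]
           + 0.02 + rng.normal(0, 0.002, n))

      X_tr, X_te, K_tr, K_te = train_test_split(X, K, test_size=0.25,
                                                random_state=0)
      ptf = LinearRegression().fit(X_tr, K_tr)
      K_hat = ptf.predict(X_te)
      print("R2 =", round(r2_score(K_te, K_hat), 2),
            "RMSE =", round(mean_squared_error(K_te, K_hat) ** 0.5, 4))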

  14. Fully convolutional networks (FCNs)-based segmentation method for colorectal tumors on T2-weighted magnetic resonance images.

    PubMed

    Jian, Junming; Xiong, Fei; Xia, Wei; Zhang, Rui; Gu, Jinhui; Wu, Xiaodong; Meng, Xiaochun; Gao, Xin

    2018-06-01

    Segmentation of colorectal tumors is the basis of preoperative prediction, staging, and therapeutic response evaluation. Due to the blurred boundary between lesions and normal colorectal tissue, accurate segmentation is hard to achieve. Routine manual or semi-manual segmentation methods are extremely tedious, time-consuming, and highly operator-dependent. Within the framework of FCNs, a segmentation method for colorectal tumors is presented. Normalization was applied to reduce differences among images. Borrowing from transfer learning, VGG-16 was employed to extract features from the normalized images. We constructed five side-output blocks from the last convolutional layer of each block of VGG-16 along the network; these side-output blocks capture multiscale features and produce corresponding predictions. Finally, all of the predictions from the side-output blocks were fused to determine the final boundaries of the tumors. A quantitative comparison of 2772 colorectal tumor manual segmentation results from T2-weighted magnetic resonance images shows that the average Dice similarity coefficient, positive predictive value, specificity, sensitivity, Hammoude distance, and Hausdorff distance were 83.56%, 82.67%, 96.75%, 87.85%, 0.2694, and 8.20, respectively. The proposed method is superior to U-net in colorectal tumor segmentation (P < 0.05). There is no difference between cross-entropy loss and Dice-based loss in colorectal tumor segmentation (P > 0.05). The results indicate that the introduction of FCNs contributed to accurate segmentation of colorectal tumors. This method has the potential to replace the present time-consuming and nonreproducible manual segmentation method.
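
    The side-output idea described above can be sketched as follows in PyTorch: a 1x1 convolution after each VGG-16 block produces a coarse prediction, each prediction is upsampled to the input size, and the upsampled maps are fused into one probability map. The block cut points, fusion layer, and the use of untrained weights (weights=None, torchvision >= 0.13 API) are assumptions for a self-contained illustration, not the authors' exact architecture or training setup.

      # Minimal PyTorch sketch of VGG-16 side outputs fused into a single
      # segmentation map (illustrative assumptions, not the paper's model).
      import torch
      import torch.nn as nn
      import torch.nn.functional as F
      import torchvision

      class SideOutputFCN(nn.Module):
          def __init__(self):
              super().__init__()
              feats = torchvision.models.vgg16(weights=None).features
              cuts = [4, 9, 16, 23, 30]          # assumed block boundaries
              chans = [64, 128, 256, 512, 512]   # output channels per block
              self.blocks = nn.ModuleList()
              prev = 0
              for c in cuts:
                  self.blocks.append(feats[prev:c])
                  prev = c
              # one 1x1 conv per block produces a coarse prediction
              self.side = nn.ModuleList([nn.Conv2d(ch, 1, kernel_size=1)
                                         for ch in chans])
              self.fuse = nn.Conv2d(len(cuts), 1, kernel_size=1)

          def forward(self, x):
              h, w = x.shape[2:]
              preds = []
              for block, side in zip(self.blocks, self.side):
                  x = block(x)
                  p = F.interpolate(side(x), size=(h, w), mode="bilinear",
                                    align_corners=False)
                  preds.append(p)
              fused = self.fuse(torch.cat(preds, dim=1))
              return torch.sigmoid(fused)        # tumor probability map

      model = SideOutputFCN()
      out = model(torch.randn(1, 3, 256, 256))   # dummy T2-weighted slice
      print(out.shape)                           # torch.Size([1, 1, 256, 256])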

  15. Classifying seismic waveforms from scratch: a case study in the alpine environment

    NASA Astrophysics Data System (ADS)

    Hammer, C.; Ohrnberger, M.; Fäh, D.

    2013-01-01

    Nowadays, an increasing amount of seismic data is collected by daily observatory routines. The basic step for successfully analyzing those data is the correct detection of various event types. However, visual scanning is a time-consuming task. Applying standard detection techniques such as the STA/LTA trigger still requires manual control for classification. Here, we present a useful alternative. The incoming data stream is scanned automatically for events of interest. A stochastic classifier, called a hidden Markov model, is learned for each class of interest, enabling the recognition of highly variable waveforms. In contrast to other automatic techniques such as neural networks or support vector machines, the algorithm allows classification to be started from scratch as soon as interesting events are identified. Neither the tedious process of collecting training samples nor a time-consuming configuration of the classifier is required. An approach originally introduced for the volcanic task force action allows classifier properties to be learned from a single waveform example and some hours of background recording. Besides reducing the required workload, this also enables the detection of very rare events. The latter feature in particular provides a milestone for the use of seismic devices in alpine warning systems. Furthermore, the system offers the opportunity to flag new signal classes that have not been defined before. We demonstrate the application of the classification system using a data set from the Swiss Seismological Survey, achieving very high recognition rates. In detail, we document all refinements of the classifier, providing a step-by-step guide for the fast set-up of a well-working classification system.
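
    The one-HMM-per-class scheme described above can be illustrated with hmmlearn: train a Gaussian HMM on feature sequences for each event class and classify a new sequence by the model with the highest log-likelihood. The synthetic feature sequences and model sizes below are placeholders; the paper's actual feature extraction and single-example training procedure are not reproduced.

      # One-HMM-per-class waveform classification sketch using hmmlearn
      # (synthetic feature sequences for illustration only).
      import numpy as np
      from hmmlearn import hmm

      rng = np.random.default_rng(0)

      def make_sequences(mean, n_seq=20, length=100, dim=3):
          """Synthetic stand-in for per-frame waveform features."""
          return [mean + rng.normal(0, 1, size=(length, dim))
                  for _ in range(n_seq)]

      train = {"earthquake": make_sequences(0.0),
               "rockfall": make_sequences(2.0)}

      models = {}
      for label, seqs in train.items():
          X = np.concatenate(seqs)
          lengths = [len(s) for s in seqs]
          m = hmm.GaussianHMM(n_components=4, covariance_type="diag",
                              n_iter=50, random_state=0)
          m.fit(X, lengths)
          models[label] = m

      def classify(seq):
          # pick the class whose HMM assigns the highest log-likelihood
          return max(models, key=lambda label: models[label].score(seq))

      test_seq = make_sequences(2.0, n_seq=1)[0]
      print(classify(test_seq))               # expected: "rockfall"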

  16. Economic Benefits of Improved Water Quality: Public Perceptions of Option and Preservation Values

    NASA Astrophysics Data System (ADS)

    Bouwes, Nicolaas W., Sr.

    The primary objective of this book is to report the authors' research approach to the estimation of the benefits of water quality improvements in the South Platte River of northeastern Colorado. Benefits included a “consumer surplus” from enhanced enjoyment of water-based recreation, an “option value” of assured choice of future recreation use, and a “preservation value” of the ecosystem and its bequest to future generations. Concepts such as preservation and option value benefits have often been mentioned but seldom estimated in natural resources research. The authors have met their objective by providing the reader with a detailed description of their research without being tedious.

  17. A real-time PCR approach to detect predation on anchovy and sardine early life stages

    NASA Astrophysics Data System (ADS)

    Cuende, Elsa; Mendibil, Iñaki; Bachiller, Eneko; Álvarez, Paula; Cotano, Unai; Rodriguez-Ezpeleta, Naiara

    2017-12-01

    Recruitment of sardine (Sardina pilchardus Walbaum, 1792) and anchovy (Engraulis encrasicolus Linnaeus, 1758) is thought to be regulated by predation on their eggs and larvae. Predators of sardine and anchovy can be identified by visual taxonomic identification of stomach contents, but this method is time consuming, tedious and may underestimate predation, especially in small predators such as fish larvae. Genetic tools may offer a more cost-effective and accurate alternative. Here, we have developed a multiplex real-time polymerase chain reaction (RT-PCR) assay based on TaqMan probes to simultaneously detect sardine and anchovy remains in the gut contents of potential predators. The assay combines previously described and newly generated species-specific primers and probes for anchovy and sardine detection, respectively, and allows the detection of 0.001 ng of target DNA (which corresponds to about one hundredth of the total DNA present in a single egg). We applied the method to a candidate predator of anchovy and sardine eggs in the Bay of Biscay, Atlantic mackerel (Scomber scombrus) larvae. Observed egg predation was limited primarily to stations where sardine and/or anchovy eggs were present. The developed assay offers a suitable tool for understanding the effects of predation on the survival of anchovy and sardine early life stages.

  18. 3D laser scanning for quality control and assurance in bridge deck construction.

    DOT National Transportation Integrated Search

    2014-08-01

    The inspection of installations of rebar and other embedded components in bridge deck construction is a tedious task for field inspectors, requiring considerable field time for measurement and verification against code requirements. The verification...

  19. An integrated method for the emotional conceptualization and sensory characterization of food products: The EmoSensory® Wheel.

    PubMed

    Schouteten, Joachim J; De Steur, Hans; De Pelsmaeker, Sara; Lagast, Sofie; De Bourdeaudhuij, Ilse; Gellynck, Xavier

    2015-12-01

    Although acceptability is commonly used to examine liking for food products, more studies now emphasize the importance of measuring consumers' conceptualizations, such as the emotions associated with food products. It is also important to identify how consumers perceive the sensory attributes of food products, as illustrated by the increasing involvement of consumers in product characterization. The objective of this paper is to examine the use of a wheel-format questionnaire to obtain both emotional and sensory profiles of food products using a hands-on consumer tool. The terms selected were product-specific, and the rate-all-that-apply (RATA) approach was used as the scaling technique. Three different experiments demonstrated that the EmoSensory® Wheel could discriminate within and between food product categories. The added value of the RATA approach was illustrated by the sample discrimination for some food products when the weighted attribute scores were used for analysis. The tool was used in both blind and informed conditions to illustrate its applicability across different experimental designs. In general, respondents did not find the task tedious when using the wheel-questionnaire format, demonstrating its potential for collecting information in an easier way. Although further studies with other food products are needed, this paper shows the potential of this wheel format for obtaining information about consumers' emotional and sensory profiling of food products. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. LO + EPSS = Just-in-Time Reuse of Content to Support Employee Performance

    ERIC Educational Resources Information Center

    Nguyen, Frank; Hanzel, Matthew

    2007-01-01

    Those involved in training know that creating instructional materials can become a tedious, repetitive process. They also know that business conditions often require training interventions to be delivered in ways that are not ideally structured or timed. This article examines the notion that learning objects can be reused and adapted for…

  1. A semi-automatic method for quantification and classification of erythrocytes infected with malaria parasites in microscopic images.

    PubMed

    Díaz, Gloria; González, Fabio A; Romero, Eduardo

    2009-04-01

    Visual quantification of parasitemia in thin blood films is a very tedious, subjective and time-consuming task. This study presents an original method for quantification and classification of erythrocytes in stained thin blood films infected with Plasmodium falciparum. The proposed approach is composed of three main phases: a preprocessing step, which corrects luminance differences; a segmentation step, which uses the normalized RGB color space to classify pixels as either erythrocyte or background, followed by an inclusion-tree representation that structures the pixel information into objects from which erythrocytes are found; and a two-step classification process that identifies infected erythrocytes and differentiates the infection stage using a trained bank of classifiers. Additionally, user intervention is allowed when the approach cannot make a proper decision. Four hundred fifty malaria images were used for training and evaluating the method. Automatic identification of infected erythrocytes showed a specificity of 99.7% and a sensitivity of 94%. The infection stage was determined with an average sensitivity of 78.8% and an average specificity of 91.2%.
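
    The normalized-RGB pixel step mentioned above can be sketched in a few lines of numpy: convert each pixel to chromaticity coordinates and flag red-dominant pixels as candidate erythrocytes. The threshold below is a hypothetical placeholder, not the trained pixel classifier used in the paper.

      # Sketch of the normalized-RGB step for erythrocyte/background pixels.
      import numpy as np

      def normalized_rgb(image):
          """image: HxWx3 uint8 array -> HxWx3 float chromaticities r, g, b."""
          rgb = image.astype(np.float64)
          total = rgb.sum(axis=2, keepdims=True) + 1e-9
          return rgb / total

      def erythrocyte_mask(image, r_min=0.40):
          """Keep pixels whose red chromaticity exceeds r_min (stained cells
          are red-dominant on a pale field); r_min is illustrative only."""
          r = normalized_rgb(image)[..., 0]
          return r > r_min

      # Tiny synthetic example: one "red" pixel and one "background" pixel.
      demo = np.array([[[180, 60, 60], [200, 200, 210]]], dtype=np.uint8)
      print(erythrocyte_mask(demo))   # [[ True False]]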

  2. In silico and in vitro inhibition of cytochrome P450 3A by synthetic stilbenoids.

    PubMed

    Basheer, Loai; Schultz, Keren; Guttman, Yelena; Kerem, Zohar

    2017-12-15

    Inhibition of cytochrome P450 3A4 (CYP3A4), the major drug-metabolizing enzyme, by dietary compounds has recently attracted increased attention. Evaluating the potency of the many known inhibitory compounds is a tedious and time-consuming task, yet it can be achieved using computational tools. Here, CDOCKER and Glide were used to design model inhibitors in order to characterize the molecular features of an inhibitor. In assessing nitro-stilbenoids, both approaches suggested nitrostilbene to be a weaker inhibitor of CYP3A4 than resveratrol, and a stronger one than dimethoxy-nitrostilbene. Nitrostilbene and resveratrol, but not dimethoxy-nitrostilbene, engage in electrostatic interactions within the enzyme cavity and with the haem. In vitro assessment of the inhibitory capacity supported the in silico predictions, suggesting that evaluating the electrostatic interactions of a compound with the prosthetic group allows the prediction of inhibitory potency. Since both programs yielded related results, it is suggested that, for CYP3A4, computational tools may allow rapid identification of potent dietary inhibitors. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. An automated method of quantifying ferrite microstructures using electron backscatter diffraction (EBSD) data.

    PubMed

    Shrestha, Sachin L; Breen, Andrew J; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P; Cairney, Julie M

    2014-02-01

    The identification and quantification of the different ferrite microconstituents in steels have long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientations, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparison with point counting results. The technique could easily be applied to a range of other steel microstructures. © 2013 Published by Elsevier B.V.
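
    The rule-based step described above amounts to applying per-grain criteria and computing area-weighted fractions. The pandas sketch below shows that bookkeeping on a hypothetical per-grain table; the class names, thresholds, and columns are illustrative placeholders, not the paper's calibrated criteria or EBSD export format.

      # Sketch of rule-based ferrite classification from per-grain EBSD
      # statistics (thresholds and data are illustrative placeholders).
      import pandas as pd

      grains = pd.DataFrame({
          "area_um2":            [120.0, 45.0, 300.0, 15.0, 80.0],
          "aspect_ratio":        [1.2,   3.5,  1.4,   4.2,  1.1],
          "mean_misorientation": [0.4,   1.8,  0.5,   2.3,  0.6],  # degrees
      })

      def classify(row):
          # elongated grains with high internal misorientation -> one class,
          # everything else -> another (placeholder rule)
          if row.aspect_ratio > 2.5 and row.mean_misorientation > 1.0:
              return "bainitic ferrite"
          return "polygonal ferrite"

      grains["class"] = grains.apply(classify, axis=1)

      # Area fraction of each class (area-weighted, as in point counting).
      fractions = (grains.groupby("class")["area_um2"].sum()
                   / grains["area_um2"].sum())
      print(fractions.round(3))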

  4. Thermodynamic heuristics with case-based reasoning: combined insights for RNA pseudoknot secondary structure.

    PubMed

    Al-Khatib, Ra'ed M; Rashid, Nur'Aini Abdul; Abdullah, Rosni

    2011-08-01

    The secondary structure of RNA pseudoknots has been extensively inferred and scrutinized by computational approaches. Experimental methods for determining RNA structure are time consuming and tedious; therefore, predictive computational approaches are required. Predicting the most accurate and energy-stable pseudoknot RNA secondary structure has been proven to be an NP-hard problem. In this paper, a new RNA folding approach, termed MSeeker, is presented; it includes KnotSeeker (a heuristic method) and Mfold (a thermodynamic algorithm). The global optimization of this thermodynamic heuristic approach was further enhanced by using a case-based reasoning technique as a local optimization method. MSeeker is a proposed algorithm for predicting RNA pseudoknot structure from individual sequences, especially long ones. This research demonstrates that MSeeker improves the sensitivity and specificity of existing RNA pseudoknot structure predictions. The performance and structural results from this proposed method were evaluated against seven other state-of-the-art pseudoknot prediction methods. The MSeeker method had better sensitivity than the DotKnot, FlexStem, HotKnots, pknotsRG, ILM, NUPACK and pknotsRE methods, with 79% of the predicted pseudoknot base-pairs being correct.

  5. Dosage delivery of sensitive reagents enables glove-box-free synthesis

    NASA Astrophysics Data System (ADS)

    Sather, Aaron C.; Lee, Hong Geun; Colombe, James R.; Zhang, Anni; Buchwald, Stephen L.

    2015-08-01

    Contemporary organic chemists employ a broad range of catalytic and stoichiometric methods to construct molecules for applications in the materials sciences and as pharmaceuticals, agrochemicals, and sensors. The utility of a synthetic method may be greatly reduced if it relies on a glove box to enable the use of air- and moisture-sensitive reagents or catalysts. Furthermore, many synthetic chemistry laboratories have numerous containers of partially used reagents that have been spoiled by exposure to the ambient atmosphere. This is exceptionally wasteful from both an environmental and a cost perspective. Here we report an encapsulation method for stabilizing and storing air- and moisture-sensitive compounds. We demonstrate this approach in three contexts, by describing single-use capsules that contain all of the reagents (catalysts, ligands, and bases) necessary for glove-box-free palladium-catalysed carbon-fluorine, carbon-nitrogen, and carbon-carbon bond-forming reactions. This strategy should reduce the number of error-prone, tedious and time-consuming weighing procedures required for such syntheses and should be applicable to a wide range of reagent, catalyst, and substrate combinations.

  6. Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon

    2014-01-01

    Approaches used in Earth science research, such as case study analysis and climatology studies, involve discovering and gathering diverse data sets and information to support the research goals. Gathering relevant data and information for case studies and climatology analysis is both tedious and time consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. In cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. This paper presents a specialized search, aggregation and curation tool for Earth science to address these challenges. The search tool automatically creates curated 'Data Albums', aggregated collections of information related to a specific event, containing links to relevant data files (granules) from different instruments, tools and services for visualization and analysis, and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven by an ontology-based relevancy ranking algorithm to filter out non-relevant information and data.

  7. Applications of emerging imaging techniques for meat quality and safety detection and evaluation: A review.

    PubMed

    Xiong, Zhenjie; Sun, Da-Wen; Pu, Hongbin; Gao, Wenhong; Dai, Qiong

    2017-03-04

    With the improvement in living standards, many people nowadays pay more attention to the quality and safety of meat. However, traditional methods for meat quality and safety detection and evaluation, such as manual inspection, mechanical methods, and chemical methods, are tedious, time-consuming, and destructive, and cannot meet the requirements of the modern meat industry. Therefore, seeking rapid, non-destructive, and accurate inspection techniques is important for the meat industry. In recent years, a number of novel and noninvasive imaging techniques, such as optical imaging, ultrasound imaging, tomographic imaging, thermal imaging, and odor imaging, have emerged and shown great potential in quality and safety assessment. In this paper, a detailed overview of advanced applications of these emerging imaging techniques for quality and safety assessment of different types of meat (pork, beef, lamb, chicken, and fish) is presented. In addition, the advantages and disadvantages of each imaging technique are summarized. Finally, future trends for these emerging imaging techniques are discussed, including the integration of multiple imaging techniques, cost reduction, and the development of powerful image-processing algorithms.

  8. relaxGUI: a new software for fast and simple NMR relaxation data analysis and calculation of ps-ns and μs motion of proteins.

    PubMed

    Bieri, Michael; d'Auvergne, Edward J; Gooley, Paul R

    2011-06-01

    Investigation of protein dynamics on the ps-ns and μs-ms timeframes provides detailed insight into the mechanisms of enzymes and the binding properties of proteins. Nuclear magnetic resonance (NMR) is an excellent tool for studying protein dynamics at atomic resolution. Analysis of relaxation data using model-free analysis can be a tedious and time consuming process, which requires good knowledge of scripting procedures. The software relaxGUI was developed for fast and simple model-free analysis and is fully integrated into the software package relax. It is written in Python and uses wxPython to build the graphical user interface (GUI) for maximum performance and multi-platform use. This software allows the analysis of NMR relaxation data with ease and the generation of publication quality graphs as well as color coded images of molecular structures. The interface is designed for simple data analysis and management. The software was tested and validated against the command line version of relax.

  9. Using simulated fluorescence cell micrographs for the evaluation of cell image segmentation algorithms.

    PubMed

    Wiesmann, Veit; Bergler, Matthias; Palmisano, Ralf; Prinzen, Martin; Franz, Daniela; Wittenberg, Thomas

    2017-03-18

    Manual assessment and evaluation of fluorescence cell micrograph experiments is time-consuming and tedious. Automated segmentation pipelines can ensure efficient and reproducible evaluation and analysis with constant high quality for all images of an experiment. Such cell segmentation approaches are usually validated and rated in comparison to manually annotated micrographs. Nevertheless, manual annotations are prone to errors and display inter- and intra-observer variability, which influences the validation results of automated cell segmentation pipelines. We present a new approach to simulate fluorescent cell micrographs that provides an objective ground truth for the validation of cell segmentation methods. The cell simulation was evaluated in two ways: (1) an expert observer study shows that the proposed approach generates realistic fluorescent cell micrograph simulations; (2) an automated segmentation pipeline on the simulated fluorescent cell micrographs reproduces the segmentation performance of that pipeline on real fluorescent cell micrographs. The proposed simulation approach produces realistic fluorescent cell micrographs with corresponding ground truth. The simulated data are suited to evaluating image segmentation pipelines more efficiently and reproducibly than is possible with manually annotated real micrographs.

  10. Thoughtflow: Standards and Tools for Provenance Capture and Workflow Definition to Support Model‐Informed Drug Discovery and Development

    PubMed Central

    Wilkins, JJ; Chan, PLS; Chard, J; Smith, G; Smith, MK; Beer, M; Dunn, A; Flandorfer, C; Franklin, C; Gomeni, R; Harnisch, L; Kaye, R; Moodie, S; Sardu, ML; Wang, E; Watson, E; Wolstencroft, K

    2017-01-01

    Pharmacometric analyses are complex and multifactorial. It is essential to check, track, and document the vast amounts of data and metadata that are generated during these analyses (and the relationships between them) in order to comply with regulations, support quality control, auditing, and reporting. It is, however, challenging, tedious, error‐prone, and time‐consuming, and diverts pharmacometricians from the more useful business of doing science. Automating this process would save time, reduce transcriptional errors, support the retention and transfer of knowledge, encourage good practice, and help ensure that pharmacometric analyses appropriately impact decisions. The ability to document, communicate, and reconstruct a complete pharmacometric analysis using an open standard would have considerable benefits. In this article, the Innovative Medicines Initiative (IMI) Drug Disease Model Resources (DDMoRe) consortium proposes a set of standards to facilitate the capture, storage, and reporting of knowledge (including assumptions and decisions) in the context of model‐informed drug discovery and development (MID3), as well as to support reproducibility: “Thoughtflow.” A prototype software implementation is provided. PMID:28504472

  11. Hybrid Clustering And Boundary Value Refinement for Tumor Segmentation using Brain MRI

    NASA Astrophysics Data System (ADS)

    Gupta, Anjali; Pahuja, Gunjan

    2017-08-01

    Brain tumor segmentation is the separation of the tumor area from brain magnetic resonance (MR) images. A number of methods already exist for efficient segmentation of brain tumors; however, identifying the tumor from MR images remains a tedious task. The segmentation process extracts different tumor tissues, such as active tumor, necrosis, and edema, from normal brain tissues such as gray matter (GM), white matter (WM), and cerebrospinal fluid (CSF). According to survey studies, brain tumors are most often detected from MR images using region-based approaches, but the required level of accuracy in classifying abnormalities is not reliably achieved. The segmentation of brain tumors consists of many stages, and manually segmenting the tumor from brain MR images is very time consuming, so manual segmentation faces many challenges. In this research paper, our main goal is to present a hybrid clustering approach that combines Fuzzy C-Means clustering (for accurate tumor detection) with a level set method (for handling complex shapes) to detect the exact shape of the tumor in minimal computational time. Using this approach we observe that, for a certain set of images, tumor detection takes 0.9412 s, which is considerably less than a recent existing algorithm, i.e., hybrid clustering with Fuzzy C-Means and K-Means.
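
    The hybrid idea above (fuzzy clustering for a rough mask, level set for shape refinement) can be sketched in Python with scikit-fuzzy and scikit-image. Here morphological Chan-Vese stands in for the level set step, and the synthetic "MR slice", cluster count, and iteration settings are assumptions for illustration, not the paper's pipeline.

      # Sketch: fuzzy C-means on voxel intensities gives a rough tumor mask,
      # which then initializes a level-set-style (morphological Chan-Vese)
      # refinement. All parameters and data are illustrative assumptions.
      import numpy as np
      import skfuzzy as fuzz
      from skimage.segmentation import morphological_chan_vese

      # Synthetic 2-D slice: dark background with a bright circular "tumor".
      rng = np.random.default_rng(0)
      img = rng.normal(0.2, 0.05, size=(128, 128))
      yy, xx = np.mgrid[:128, :128]
      img[(yy - 64) ** 2 + (xx - 70) ** 2 < 20 ** 2] += 0.6

      # 1) Fuzzy C-means on intensities (2 clusters: background vs tumor).
      data = img.reshape(1, -1)                  # features x samples
      cntr, u, *_ = fuzz.cmeans(data, c=2, m=2.0, error=1e-4, maxiter=200,
                                seed=0)
      tumor_cluster = int(np.argmax(cntr[:, 0]))  # brighter centroid
      rough_mask = (np.argmax(u, axis=0) == tumor_cluster).reshape(img.shape)

      # 2) Level-set-style refinement initialized from the FCM mask.
      refined = morphological_chan_vese(img, 50, init_level_set=rough_mask,
                                        smoothing=2)
      print("rough area:", int(rough_mask.sum()),
            "refined area:", int(refined.sum()))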

  12. Conducting Web-Based Surveys. ERIC Digest.

    ERIC Educational Resources Information Center

    Solomon, David J.

    Web-based surveying is very attractive for many reasons, including reducing the time and cost of conducting a survey and avoiding the often error prone and tedious task of data entry. At this time, Web-based surveys should still be used with caution. The biggest concern at present is coverage bias or bias resulting from sampled people either not…

  13. An open framework for automated chemical hazard assessment based on GreenScreen for Safer Chemicals: A proof of concept.

    PubMed

    Wehage, Kristopher; Chenhansa, Panan; Schoenung, Julie M

    2017-01-01

    GreenScreen® for Safer Chemicals is a framework for comparative chemical hazard assessment. It is the first transparent, open and publicly accessible framework of its kind, allowing manufacturers and governmental agencies to make informed decisions about the chemicals and substances used in consumer products and buildings. In the GreenScreen® benchmarking process, chemical hazards are assessed and classified based on 18 hazard endpoints from up to 30 different sources. The result is a simple numerical benchmark score and accompanying assessment report that allows users to flag chemicals of concern and identify safer alternatives. Although the screening process is straightforward, aggregating and sorting hazard data is tedious, time-consuming, and prone to human error. In light of these challenges, the present work demonstrates the usage of automation to cull chemical hazard data from publicly available internet resources, assign metadata, and perform a GreenScreen® hazard assessment using the GreenScreen® "List Translator." The automated technique, written as a module in the Python programming language, generates GreenScreen® List Translation data for over 3000 chemicals in approximately 30 s. Discussion of the potential benefits and limitations of automated techniques is provided. By embedding the library into a web-based graphical user interface, the extensibility of the library is demonstrated. The accompanying source code is made available to the hazard assessment community. Integr Environ Assess Manag 2017;13:167-176. © 2016 SETAC.
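
    The aggregation step the abstract automates can be illustrated in a few lines of Python: merge endpoint scores for a chemical from several source lists, keep the most conservative score per endpoint, and flag endpoints of concern. The severity scale, source names, and scores below are simplified assumptions for illustration only, not the actual GreenScreen® List Translator logic or data.

      # Simplified sketch of automated hazard-list aggregation.
      SEVERITY = {"L": 0, "M": 1, "H": 2, "vH": 3}   # low .. very high

      # Hypothetical endpoint scores for one chemical from three lists.
      sources = {
          "list_A": {"carcinogenicity": "H", "acute_toxicity": "M"},
          "list_B": {"carcinogenicity": "M", "endocrine_activity": "H"},
          "list_C": {"acute_toxicity": "L", "persistence": "vH"},
      }

      def aggregate(sources):
          """Most conservative (highest severity) score per hazard endpoint."""
          merged = {}
          for scores in sources.values():
              for endpoint, score in scores.items():
                  if (endpoint not in merged
                          or SEVERITY[score] > SEVERITY[merged[endpoint]]):
                      merged[endpoint] = score
          return merged

      merged = aggregate(sources)
      flagged = [ep for ep, s in merged.items() if SEVERITY[s] >= SEVERITY["H"]]
      print(merged)
      print("endpoints of concern:", flagged)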

  14. Ex vivo applications of multiphoton microscopy in urology

    NASA Astrophysics Data System (ADS)

    Jain, Manu; Mukherjee, Sushmita

    2016-03-01

    Background: Routine urological surgery frequently requires rapid on-site histopathological tissue evaluation, either during biopsy or intra-operative procedures. However, resected tissue needs to undergo processing, which is not only time consuming but may also create artifacts hindering real-time tissue assessment. Likewise, the pathologist often relies on several ancillary methods, in addition to H&E, to arrive at a definitive diagnosis. Although helpful, these techniques are tedious and time consuming and often show overlapping results. Therefore, there is a need for an imaging tool that can rapidly assess tissue in real time at the cellular level. Multiphoton microscopy (MPM) is one such technique; it can generate histology-quality images from fresh and fixed tissue solely based on their intrinsic autofluorescence emission, without the need for tissue processing or staining. Design: Fresh tissue sections (neoplastic and non-neoplastic) from biopsy and surgical specimens of bladder and kidney were obtained. Unstained deparaffinized slides from biopsies of medical kidney disease and oncocytic renal neoplasms were also obtained. MPM images were acquired with an Olympus FluoView FV1000MPE system. After imaging, fresh tissues were submitted for routine histopathology. Results: Based on the architectural and cellular details of the tissue, MPM could characterize normal components of the bladder and kidney. Neoplastic tissue could be differentiated from non-neoplastic tissue and could be further classified as per histopathological convention. Some of the tumors had unique MPM signatures not otherwise seen on H&E sections. Various subtypes of glomerular lesions were identified, and renal oncocytic neoplasms were differentiated, on unstained deparaffinized slides. Conclusions: We envision MPM becoming an integral part of the regular diagnostic workflow for rapid assessment of tissue. MPM can be used to evaluate the adequacy of biopsies and triage tissues for ancillary studies. It can also be used as an adjunct to frozen section analysis for intra-operative margin assessment. Further, it can play an important role in guiding specimen grossing, selecting tissue for tumor banking, and serving as a rapid ancillary diagnostic tool for the pathologist.

  15. How reinforcer type affects choice in economic games.

    PubMed

    Fantino, Edmund; Gaitan, Santino; Kennelly, Art; Stolarz-Fantino, Stephanie

    2007-06-01

    Behavioral economists stress that experiments on judgment and decision-making using economic games should be played with real money if the results are to have generality. Behavior analysts have sometimes disputed this contention and have reported results in which hypothetical rewards and real money have produced comparable outcomes. We review studies that have compared hypothetical and real money and discuss the results of two relevant experiments. In the first, using the Sharing Game developed in our laboratory, subjects' choices differed markedly depending on whether the rewards were real or hypothetical. In the second, using the Ultimatum and Dictator Games, we again found sharp differences between real and hypothetical rewards. However, this study also showed that time off from a tedious task could serve as a reinforcer every bit as potent as money. In addition to their empirical and theoretical contributions, these studies make the methodological point that meaningful studies may be conducted with economic games without spending money: time off from a tedious task can serve as a powerful reward.

  16. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

    TOUGH2 and iTOUGH2 are powerful models that simulate heat and fluid flows in porous and fractured media and perform parameter estimation, sensitivity analysis and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing the output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of a rectangular computational mesh, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.

  17. iMatTOUGH: An open-source Matlab-based graphical user interface for pre- and post-processing of TOUGH2 and iTOUGH2 models

    DOE PAGES

    Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan

    2016-04-01

    TOUGH2 and iTOUGH2 are powerful models that simulate heat and fluid flows in porous and fractured media and perform parameter estimation, sensitivity analysis and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing the output files is time consuming. Here, we present an open-source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of a rectangular computational mesh, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.
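
    To make concrete what "generating the elements and connections" of a rectangular mesh means, the small sketch below builds cells and shared-face connections for a 2-D grid. The actual tool is MATLAB-based and writes TOUGH2 MESH records with full geometric properties; this Python stand-in only illustrates the bookkeeping the tool automates.

      # Illustration of generating elements and connections for a simple
      # 2-D rectangular mesh (simplified stand-in, not the tool's code).
      def rectangular_mesh(nx, nz, dx, dz):
          # elements: one cell per (i, k) with its volume (unit thickness in y)
          elements = {(i, k): {"volume": dx * dz}
                      for i in range(nx) for k in range(nz)}
          # connections: shared faces between adjacent cells, with the
          # distances from each cell centre to the interface and the face area
          connections = []
          for i in range(nx):
              for k in range(nz):
                  if i + 1 < nx:   # horizontal neighbour
                      connections.append(((i, k), (i + 1, k),
                                          {"d1": dx / 2, "d2": dx / 2,
                                           "area": dz}))
                  if k + 1 < nz:   # vertical neighbour
                      connections.append(((i, k), (i, k + 1),
                                          {"d1": dz / 2, "d2": dz / 2,
                                           "area": dx}))
          return elements, connections

      elements, connections = rectangular_mesh(nx=3, nz=2, dx=0.5, dz=0.1)
      print(len(elements), "elements,", len(connections), "connections")
      # 6 elements, 7 connections for a 3 x 2 grid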

  18. New diagnostics for latent and active tuberculosis: state of the art and future prospects.

    PubMed

    Pai, Madhukar; O'Brien, Richard

    2008-10-01

    Tuberculosis (TB) continues to be the world's most important infectious cause of morbidity and mortality among adults. Nearly 9 million people develop TB disease each year, and an estimated 1.6 million die from the disease. Despite this enormous global burden, case detection rates are low, posing serious hurdles for TB control. Conventional TB diagnosis continues to rely on antiquated tests such as sputum smear microscopy, culture, tuberculin skin test, and chest radiography. These tests have several limitations and perform poorly in populations affected by the HIV epidemic. Conventional tests for detection of drug resistance are time consuming, tedious, and inaccessible in most settings. In this review, we describe recent advances in the diagnosis of latent and active TB, and detection of drug resistance. Although the perfect test will not be ready for large-scale roll-out and integration into routine TB care services for some time, substantial progress has been made in expanding the TB diagnostic product pipeline. With the resurgence of interest in the development of new tools for TB control, and the recent influx of funding and political support, it is likely that the next few years will see the introduction of new diagnostic tools into routine TB control programs.

  19. Information Management For Tactical Reconnaissance

    NASA Astrophysics Data System (ADS)

    White, James P.

    1984-12-01

    The expected battlefield tactics of the 1980's and 1990's will be fluid and dynamic. If tactical reconnaissance is to meet this challenge, it must explore all ways of accelerating the flow of information through the reconnaissance cycle, from the moment a tasking request is received to the time the mission results are delivered to the requestor. In addition to near real-time dissemination of reconnaissance information, the mission planning phase needs to be more responsive to the rapidly changing battlefield scenario. By introducing Artificial Intelligence (AI) via an expert system to the mission planning phase, repetitive and computational tasks can be more readily performed by the ground-based mission planning system, thereby permitting the aircrew to devote more of their time to target study. Transporting the flight plan, plus other mission data, to the aircraft is simple with the Fairchild Data Transfer Equipment (DTE). Aircrews are relieved of the tedious, error-prone, and time-consuming task of manually keying in avionics initialization data. Post-flight retrieval of mission data via the DTE will permit follow-on aircrews, just starting their mission planning phase, to capitalize on current threat data collected by the returning aircrew. Maintenance data retrieved from the recently flown mission will speed up the aircraft turn-around by providing near real-time fault detection/isolation. As future avionics systems demand more information, a need for a computer-controlled, smart data base or expert system on board the aircraft will emerge.

  20. Fabrication of enzyme-based coatings on intact multi-walled carbon nanotubes as highly effective electrodes in biofuel cells

    NASA Astrophysics Data System (ADS)

    Kim, Byoung Chan; Lee, Inseon; Kwon, Seok-Joon; Wee, Youngho; Kwon, Ki Young; Jeon, Chulmin; An, Hyo Jin; Jung, Hee-Tae; Ha, Su; Dordick, Jonathan S.; Kim, Jungbae

    2017-01-01

    CNTs need to be dispersed in aqueous solution for their successful use, and most methods to disperse CNTs rely on tedious and time-consuming acid-based oxidation. Here, we report the simple dispersion of intact multi-walled carbon nanotubes (CNTs) by adding them directly into an aqueous solution of glucose oxidase (GOx), resulting in simultaneous CNT dispersion and facile enzyme immobilization through sequential enzyme adsorption, precipitation, and crosslinking (EAPC). The EAPC achieved high enzyme loading and stability because of crosslinked enzyme coatings on intact CNTs, while obviating the chemical pretreatment that can seriously damage the electron conductivity of CNTs. EAPC-driven GOx activity was 4.5- and 11-times higher than those of covalently-attached GOx (CA) on acid-treated CNTs and simply-adsorbed GOx (ADS) on intact CNTs, respectively. EAPC showed no decrease of GOx activity for 270 days. EAPC was employed to prepare the enzyme anodes for biofuel cells, and the EAPC anode produced 7.5-times higher power output than the CA anode. Even with a higher amount of bound non-conductive enzymes, the EAPC anode showed 1.7-fold higher electron transfer rate than the CA anode. The EAPC on intact CNTs can improve enzyme loading and stability with key routes of improved electron transfer in various biosensing and bioelectronics devices.

  1. Operations research methods improve chemotherapy patient appointment scheduling.

    PubMed

    Santibáñez, Pablo; Aristizabal, Ruben; Puterman, Martin L; Chow, Vincent S; Huang, Wenhai; Kollmannsberger, Christian; Nordin, Travis; Runzer, Nancy; Tyldesley, Scott

    2012-12-01

    Clinical complexity, scheduling restrictions, and outdated manual booking processes resulted in frequent clerical rework, long waitlists for treatment, and late appointment notification for patients at a chemotherapy clinic in a large cancer center in British Columbia, Canada. A 17-month study was conducted to address booking, scheduling and workload issues and to develop, implement, and evaluate solutions. A review of scheduling practices included process observation and mapping, analysis of historical appointment data, creation of a new performance metric (final appointment notification lead time), and a baseline patient satisfaction survey. Process improvement involved discrete event simulation to evaluate alternative booking practice scenarios, development of an optimization-based scheduling tool to improve scheduling efficiency, and change management for implementation of process changes. Results were evaluated through analysis of appointment data, a follow-up patient survey, and staff surveys. Process review revealed a two-stage scheduling process. Long waitlists and late notification resulted from an inflexible first-stage process. The second-stage process was time consuming and tedious. After a revised, more flexible first-stage process and an automated second-stage process were implemented, the median percentage of appointments exceeding the final appointment notification lead time target of one week was reduced by 57% and median waitlist size decreased by 83%. Patient surveys confirmed increased satisfaction while staff feedback reported reduced stress levels. Significant operational improvements can be achieved through process redesign combined with operations research methods.
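
    The "optimization-based scheduling tool" mentioned above can be illustrated, in a much simplified form, as an assignment problem: give each patient a slot while minimizing a penalty cost. The Python sketch below uses the Hungarian algorithm from SciPy with a hypothetical cost matrix; the real tool's model (nurse workload, treatment durations, linked appointments) is far richer, and this only shows the optimization idea.

      # Toy illustration of optimization-based appointment scheduling.
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      patients = ["P1", "P2", "P3", "P4"]
      slots = ["Mon 09:00", "Mon 13:00", "Tue 09:00", "Tue 13:00"]

      # cost[i, j]: penalty of giving patient i slot j (e.g. days past the
      # target date, mismatch with requested time of day); lower is better.
      cost = np.array([
          [0, 1, 3, 4],
          [2, 0, 1, 3],
          [5, 4, 0, 1],
          [1, 2, 2, 0],
      ])

      rows, cols = linear_sum_assignment(cost)
      for i, j in zip(rows, cols):
          print(f"{patients[i]} -> {slots[j]} (cost {cost[i, j]})")
      print("total cost:", cost[rows, cols].sum())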

  2. Time dependent calibration of a sediment extraction scheme.

    PubMed

    Roychoudhury, Alakendra N

    2006-04-01

    Sediment extraction methods to quantify metal concentration in aquatic sediments usually present limitations in accuracy and reproducibility because metal concentration in the supernatant is controlled to a large extent by the physico-chemical properties of the sediment that result in a complex interplay between the solid and the solution phase. It is suggested here that standardization of sediment extraction methods using pure mineral phases or reference material is futile and that the extraction processes should instead be calibrated using site-specific sediments before their application. For calibration, the time dependent release of metals should be observed for each leachate to ascertain the appropriate time for a given extraction step. Although such an approach is tedious and time consuming, using iron extraction as an example, it is shown here that, apart from quantitative data, the approach provides additional information on factors that play an intricate role in metal dynamics in the environment. Single step ascorbate, HCl, oxalate and dithionite extractions were used for targeting specific iron phases from saltmarsh sediments and their response was observed over time in order to calibrate the extraction times for each extractant later to be used in a sequential extraction. For surficial sediments, extraction times of 24 h, 1 h, 2 h and 3 h were ascertained for ascorbate, HCl, oxalate and dithionite extractions, respectively. Fluctuations in iron concentration in the supernatant over time were ubiquitous. The adsorption-desorption behavior is possibly controlled by the sediment organic matter, the formation or consumption of active exchange sites during extraction and the crystallinity of the iron mineral phases present in the sediments.

  3. Saving Time with Automated Account Management

    ERIC Educational Resources Information Center

    School Business Affairs, 2013

    2013-01-01

    Thanks to intelligent solutions, schools, colleges, and universities no longer need to manage user account life cycles by using scripts or tedious manual procedures. The solutions house the scripts and manual procedures. Accounts can be automatically created, modified, or deleted in all applications within the school. This article describes how an…

  4. Electrode Processes in Porous Electrodes.

    DTIC Science & Technology

    1985-11-26

    F104470 2.0 MASS SPECTROMETRY One part of activity for this year is an investigation of the behavior of silver electrodes through the distribution of...al. (2)). These, in some cases, involve tedious and time consuming procedures, and discrepancies of as much as 15% have been observed in the results. As

  5. Integrating Technology into the Montessori Elementary Classroom.

    ERIC Educational Resources Information Center

    Hubbell, Elizabeth Ross

    2003-01-01

    Asserts that if used correctly, with forethought and respect to the Montessori philosophy, technology will advance and complement the experiences made available to children. Addresses the integration of technology into the Montessori elementary classroom focusing on the learning environment and the reduction of teacher time spent on tedious tasks.…

  6. Innovation 101: The story behind the invention of DNATrax

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farquar, George

    Lawrence Livermore researcher George Farquar shares the story behind the invention of DNATrax, a cost-effective and highly efficient method to accurately trace contaminated food back to its source. Foodborne illnesses kill roughly 3,000 Americans each year and about 1 in 6 are sickened, according to the Centers for Disease Control and Prevention. Yet most contaminated foods are never traced back to their source. That’s because existing methods to track tainted food following its supply chain from table to farm are highly inefficient, jeopardizing the health of millions and costing the food industry billions. A typical process to trace food includes interviewing consumers and suppliers and examining every detail of the supply chain, a tedious method that takes weeks at best to complete.

  7. Real-time bilinear rotation decoupling in absorptive mode J-spectroscopy: Detecting low-intensity metabolite peak close to high-intensity metabolite peak with convenience.

    PubMed

    Verma, Ajay; Baishya, Bikash

    2016-05-01

    "Pure shift" NMR spectra display singlet peak per chemical site. Thus, high resolution is offered at the cost of valuable J-coupling information. In the present work, real-time BIRD (BIlinear Rotation Decoupling) is applied to the absorptive-mode 2D J-spectroscopy to provide pure shift spectrum in the direct dimension and J-coupling information in the indirect dimension. Quite often in metabolomics, proton NMR spectra from complex bio-fluids display tremendous signal overlap. Although conventional J-spectroscopy in principle overcomes this problem by separating the multiplet information from chemical shift information, however, only magnitude mode of the experiment is practical, sacrificing much of the potential high resolution that could be achieved. Few J-spectroscopy methods have been reported so far that produce high-resolution pure shift spectrum along with J-coupling information for crowded spectral regions. In the present work, high-quality J-resolved spectrum from important metabolomic mixture such as tissue extract from rat cortex is demonstrated. Many low-intensity metabolite peaks which are obscured by the broad dispersive tails from high-intensity metabolite peaks in regular magnitude mode J-spectrum can be clearly identified in real-time BIRD J-resolved spectrum. The general practice of removing such spectral overlap is tedious and time-consuming as it involves repeated sample preparation to change the pH of the tissue extract sample and subsequent spectra recording. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Using Contact Forces and Robot Arm Accelerations to Automatically Rate Surgeon Skill at Peg Transfer.

    PubMed

    Brown, Jeremy D; O Brien, Conor E; Leung, Sarah C; Dumon, Kristoffel R; Lee, David I; Kuchenbecker, Katherine J

    2017-09-01

    Most trainees begin learning robotic minimally invasive surgery by performing inanimate practice tasks with clinical robots such as the Intuitive Surgical da Vinci. Expert surgeons are commonly asked to evaluate these performances using standardized five-point rating scales, but doing such ratings is time consuming, tedious, and somewhat subjective. This paper presents an automatic skill evaluation system that analyzes only the contact force with the task materials, the broad-bandwidth accelerations of the robotic instruments and camera, and the task completion time. We recruited N = 38 participants of varying skill in robotic surgery to perform three trials of peg transfer with a da Vinci Standard robot instrumented with our Smart Task Board. After calibration, three individuals rated these trials on five domains of the Global Evaluative Assessment of Robotic Skill (GEARS) structured assessment tool, providing ground-truth labels for regression and classification machine learning algorithms that predict GEARS scores based on the recorded force, acceleration, and time signals. Both machine learning approaches produced scores on the reserved testing sets that were in good to excellent agreement with the human raters, even when the force information was not considered. Furthermore, regression predicted GEARS scores more accurately and efficiently than classification. A surgeon's skill at robotic peg transfer can be reliably rated via regression using features gathered from force, acceleration, and time sensors external to the robot. We expect improved trainee learning as a result of providing these automatic skill ratings during inanimate task practice on a surgical robot.
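
    As a rough illustration of the regression idea described above, the sketch below fits a random-forest regressor to synthetic force/acceleration/time features and a synthetic skill score (scikit-learn assumed; the features, data and model choice are placeholders, not the authors' pipeline).

        # Sketch: predict a GEARS-style skill score from force, acceleration and
        # completion-time features with a regression model (scikit-learn assumed).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_absolute_error

        rng = np.random.default_rng(0)
        # Placeholder features: [mean force, peak force, accel RMS, completion time]
        X = rng.normal(size=(114, 4))                 # e.g. 38 participants x 3 trials
        y = 5 + 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.5, size=114)  # synthetic scores

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print("MAE on held-out trials:", mean_absolute_error(y_te, model.predict(X_te)))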

  9. Real-time bilinear rotation decoupling in absorptive mode J-spectroscopy: Detecting low-intensity metabolite peak close to high-intensity metabolite peak with convenience

    NASA Astrophysics Data System (ADS)

    Verma, Ajay; Baishya, Bikash

    2016-05-01

    "Pure shift" NMR spectra display a singlet peak per chemical site. Thus, high resolution is offered at the cost of valuable J-coupling information. In the present work, real-time BIRD (BIlinear Rotation Decoupling) is applied to absorptive-mode 2D J-spectroscopy to provide a pure shift spectrum in the direct dimension and J-coupling information in the indirect dimension. Quite often in metabolomics, proton NMR spectra from complex bio-fluids display tremendous signal overlap. Although conventional J-spectroscopy in principle overcomes this problem by separating the multiplet information from the chemical shift information, only the magnitude mode of the experiment is practical, sacrificing much of the potential high resolution that could be achieved. Few J-spectroscopy methods have been reported so far that produce a high-resolution pure shift spectrum along with J-coupling information for crowded spectral regions. In the present work, a high-quality J-resolved spectrum from an important metabolomic mixture, tissue extract from rat cortex, is demonstrated. Many low-intensity metabolite peaks that are obscured by the broad dispersive tails of high-intensity metabolite peaks in a regular magnitude mode J-spectrum can be clearly identified in the real-time BIRD J-resolved spectrum. The general practice of removing such spectral overlap is tedious and time-consuming as it involves repeated sample preparation to change the pH of the tissue extract sample and subsequent spectra recording.

  10. MODEST: a web-based design tool for oligonucleotide-mediated genome engineering and recombineering

    PubMed Central

    Bonde, Mads T.; Klausen, Michael S.; Anderson, Mads V.; Wallin, Annika I.N.; Wang, Harris H.; Sommer, Morten O.A.

    2014-01-01

    Recombineering and multiplex automated genome engineering (MAGE) offer the possibility to rapidly modify multiple genomic or plasmid sites at high efficiencies. This enables efficient creation of genetic variants including both single mutants with specifically targeted modifications as well as combinatorial cell libraries. Manual design of oligonucleotides for these approaches can be tedious, time-consuming, and may not be practical for larger projects targeting many genomic sites. At present, the change from a desired phenotype (e.g. altered expression of a specific protein) to a designed MAGE oligo, which confers the corresponding genetic change, is performed manually. To address these challenges, we have developed the MAGE Oligo Design Tool (MODEST). This web-based tool allows designing of MAGE oligos for (i) tuning translation rates by modifying the ribosomal binding site, (ii) generating translational gene knockouts and (iii) introducing other coding or non-coding mutations, including amino acid substitutions, insertions, deletions and point mutations. The tool automatically designs oligos based on desired genotypic or phenotypic changes defined by the user, which can be used for high efficiency recombineering and MAGE. MODEST is available for free and is open to all users at http://modest.biosustain.dtu.dk. PMID:24838561

  11. Global Data Spatially Interrelate System for Scientific Big Data Spatial-Seamless Sharing

    NASA Astrophysics Data System (ADS)

    Yu, J.; Wu, L.; Yang, Y.; Lei, X.; He, W.

    2014-04-01

    A good data sharing system with spatially seamless services spares scientists the tedious, boring, and time-consuming work of spatial transformation, and hence encourages use of the scientific data and increases scientific innovation. Having been adopted as the framework for Earth datasets by the Group on Earth Observations (GEO), the Earth System Spatial Grid (ESSG) is a potential common spatial reference for Earth datasets. Based on SDOG-ESSG, the implementation of ESSG, a data sharing system named the Global Data Spatially Interrelate System (GASE) was designed to make data sharing spatially seamless. The architecture of GASE is introduced, and the implementation of its two key components, the V-Pools and the interrelating engine, is presented together with a prototype. Any dataset is first resampled into SDOG-ESSG, divided into small blocks, and then mapped into the hierarchical system of the distributed file system in the V-Pools, which together serves the data at a uniform spatial reference with high efficiency. In addition, datasets from different data centres are interrelated by the interrelating engine at the uniform spatial reference of SDOG-ESSG, which enables the system to share open datasets over the internet in a spatially seamless manner.

  12. Automated Derivation of Complex System Constraints from User Requirements

    NASA Technical Reports Server (NTRS)

    Foshee, Mark; Murey, Kim; Marsh, Angela

    2010-01-01

    The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing them with an application, the User Requirements Collection (URC), that has the basic functionality PDs need as well as a list of simplified resources. The planners maintain a mapping of the URC resources to the CPS resources. Manually converting PDs' science requirements from the simplified representation to the more complex CPS representation is a time-consuming and tedious process. The goal is to provide a software solution that allows the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.
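
    The mapping-and-expansion step described above can be pictured with a toy lookup table, sketched below; the resource names and amounts are hypothetical and are not actual URC or CPS identifiers.

        # Toy illustration of expanding simplified user-facing resources into the
        # more detailed system constraints a planning system expects. All resource
        # names here are hypothetical, not actual URC/CPS identifiers.
        URC_TO_CPS = {
            "power_low":  [("28V_BUS_A", 150), ("THERMAL_LOOP", 1)],
            "power_high": [("28V_BUS_A", 600), ("THERMAL_LOOP", 2)],
            "crew_time":  [("CREW_USOS", 1)],
        }

        def expand_requirements(user_reqs):
            """user_reqs: list of (simplified_resource, start_min, duration_min)."""
            system_reqs = []
            for resource, start, duration in user_reqs:
                for cps_resource, amount in URC_TO_CPS[resource]:
                    system_reqs.append(
                        {"resource": cps_resource, "amount": amount,
                         "start": start, "duration": duration})
            return system_reqs

        print(expand_requirements([("power_low", 0, 90), ("crew_time", 30, 45)]))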

  13. A Pathological Brain Detection System based on Extreme Learning Machine Optimized by Bat Algorithm.

    PubMed

    Lu, Siyuan; Qiu, Xin; Shi, Jianping; Li, Na; Lu, Zhi-Hai; Chen, Peng; Yang, Meng-Meng; Liu, Fang-Yuan; Jia, Wen-Juan; Zhang, Yudong

    2017-01-01

    It is beneficial to classify brain images as healthy or pathological automatically, because 3D brain images generate so much information that manual analysis is time consuming and tedious. Among various 3D brain imaging techniques, magnetic resonance (MR) imaging is the most suitable for the brain, and it is now widely applied in hospitals, because it is helpful for diagnosis, prognosis, and pre- and post-surgical procedures. Automatic detection methods exist; however, they suffer from low accuracy. Therefore, we proposed a novel approach that employs the 2D discrete wavelet transform (DWT) and uses the entropies of the subbands as features. Then, a bat algorithm optimized extreme learning machine (BA-ELM) was trained to identify pathological brains from healthy controls. A 10x10-fold cross-validation was performed to evaluate the out-of-sample performance. The method achieved a sensitivity of 99.04%, a specificity of 93.89%, and an overall accuracy of 98.33% over 132 MR brain images. The experimental results suggest that the proposed approach is accurate and robust in pathological brain detection. Copyright © Bentham Science Publishers.
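
    A minimal sketch of the feature-extraction and classification idea follows: 2D wavelet decomposition, Shannon entropy of each subband as a feature, and a plain (unoptimized) extreme learning machine. PyWavelets is assumed; the bat-algorithm hyperparameter search and the real MR data are omitted.

        # Sketch: wavelet-entropy features plus a basic extreme learning machine.
        # Assumes PyWavelets (pywt); the bat-algorithm tuning used in the paper is omitted.
        import numpy as np
        import pywt

        def wavelet_entropy_features(image, wavelet="db4", level=2):
            coeffs = pywt.wavedec2(image, wavelet, level=level)
            feats = []
            for band in [coeffs[0]] + [b for lvl in coeffs[1:] for b in lvl]:
                p = np.abs(band).ravel()
                p = p / (p.sum() + 1e-12)
                feats.append(-np.sum(p * np.log2(p + 1e-12)))   # Shannon entropy
            return np.array(feats)

        class SimpleELM:
            def __init__(self, n_hidden=50, seed=0):
                self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)
            def fit(self, X, y):
                self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
                self.b = self.rng.normal(size=self.n_hidden)
                H = np.tanh(X @ self.W + self.b)                 # random hidden layer
                self.beta = np.linalg.pinv(H) @ y                # least-squares output
                return self
            def predict(self, X):
                return np.tanh(X @ self.W + self.b) @ self.beta

        # Synthetic demo: two classes of random "MR slices"
        rng = np.random.default_rng(1)
        imgs = [rng.normal(loc=c, size=(64, 64)) for c in (0.0, 0.5) for _ in range(20)]
        X = np.vstack([wavelet_entropy_features(im) for im in imgs])
        y = np.array([0] * 20 + [1] * 20)
        clf = SimpleELM().fit(X, y)
        print("training accuracy:", np.mean((clf.predict(X) > 0.5) == y))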

  14. Deep convolutional neural network processing of aerial stereo imagery to monitor vulnerable zones near power lines

    NASA Astrophysics Data System (ADS)

    Qayyum, Abdul; Saad, Naufal M.; Kamel, Nidal; Malik, Aamir Saeed

    2018-01-01

    The monitoring of vegetation near high-voltage transmission power lines and poles is tedious. Blackouts present a huge challenge to power distribution companies and often occur due to tree growth in hilly and rural areas. Existing methods of monitoring hazardous overgrowth are expensive and time-consuming. Accurate estimation of tree and vegetation heights near power poles can prevent the disruption of power transmission in vulnerable zones. This paper presents a cost-effective approach based on a convolutional neural network (CNN) algorithm to compute the height (depth maps) of objects proximal to power poles and transmission lines. The proposed CNN extracts and classifies features by feeding convolutional and pooling layer outputs into fully connected layers that capture prominent features from stereo image patches. Unmanned aerial vehicle or satellite stereo image datasets can thus provide a feasible and cost-effective approach that identifies threat levels based on height and distance estimations of hazardous vegetation and other objects. Results were compared with extant disparity map estimation techniques, such as graph cut, dynamic programming, belief propagation, and area-based methods. The proposed method achieved an accuracy rate of 90%.
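
    A compact sketch of a patch-matching network of this general kind is shown below (PyTorch assumed); the layer sizes and the two-channel patch stacking are illustrative choices, not the paper's exact architecture.

        # Sketch of a patch-matching CNN for stereo disparity (PyTorch assumed).
        # Two patches are stacked as channels and the network scores whether they
        # correspond; sizes and layers are illustrative, not the paper's exact model.
        import torch
        import torch.nn as nn

        class PatchMatchCNN(nn.Module):
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),
                    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),
                )
                self.classifier = nn.Sequential(
                    nn.Flatten(),
                    nn.Linear(64 * 8 * 8, 128), nn.ReLU(),
                    nn.Linear(128, 1),          # matching score (logit)
                )

            def forward(self, left_patch, right_patch):
                x = torch.cat([left_patch, right_patch], dim=1)   # stack as 2 channels
                return self.classifier(self.features(x))

        model = PatchMatchCNN()
        left = torch.randn(4, 1, 32, 32)     # batch of 32x32 grayscale patches
        right = torch.randn(4, 1, 32, 32)
        print(model(left, right).shape)      # torch.Size([4, 1])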

  15. A stochastic approach for automatic generation of urban drainage systems.

    PubMed

    Möderl, M; Butler, D; Rauch, W

    2009-01-01

    Typically, performance evaluation of newly developed methodologies is based on one or more case studies. The investigation of multiple real world case studies is tedious and time consuming. Moreover, extrapolating conclusions from individual investigations to a general basis is arguable and sometimes even wrong. In this article a stochastic approach is presented to evaluate newly developed methodologies on a broader basis. For this approach the Matlab tool "Case Study Generator" is developed, which automatically generates a variety of different virtual urban drainage systems using boundary conditions (e.g. length of urban drainage system, slope of catchment surface) as input. The layout of the sewer system is based on an adapted Galton-Watson branching process. The sub-catchments are allocated considering a digital terrain model. Sewer system components are designed according to standard values. In total, 10,000 different virtual case studies of urban drainage systems are generated and simulated. Consequently, simulation results are evaluated using a performance indicator for surface flooding. Comparison between results of the virtual and two real world case studies indicates the promise of the method. The novelty of the approach is that it is possible to draw more general conclusions in contrast to traditional evaluations with few case studies.
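
    The branching-process idea can be sketched in a few lines: starting from an outfall node, each node draws a random number of upstream children until a depth limit is reached. The offspring probabilities and depth limit below are illustrative, not the tool's calibrated parameters.

        # Minimal Galton-Watson branching sketch for generating a random tree-shaped
        # network layout (e.g. a virtual sewer system). Offspring distribution and
        # depth limit are illustrative choices, not the tool's actual parameters.
        import random

        def galton_watson_tree(p_offspring=(0.3, 0.5, 0.2), max_depth=6, seed=42):
            """p_offspring[k] = probability a node has k children (k = 0, 1, 2)."""
            random.seed(seed)
            edges, next_id = [], 1
            frontier = [(0, 0)]                       # (node_id, depth), node 0 = outfall
            while frontier:
                node, depth = frontier.pop()
                if depth >= max_depth:
                    continue
                n_children = random.choices(range(len(p_offspring)), p_offspring)[0]
                for _ in range(n_children):
                    edges.append((node, next_id))     # sewer reach from child to parent
                    frontier.append((next_id, depth + 1))
                    next_id += 1
            return edges

        edges = galton_watson_tree()
        print(f"{len(edges)} reaches generated; first few: {edges[:5]}")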

  16. Feature Selection Using Information Gain for Improved Structural-Based Alert Correlation

    PubMed Central

    Siraj, Maheyzah Md; Zainal, Anazida; Elshoush, Huwaida Tagelsir; Elhaj, Fatin

    2016-01-01

    Grouping and clustering alerts for intrusion detection based on the similarity of features is referred to as structural-based alert correlation and can discover a list of attack steps. Previous researchers selected different features and data sources manually based on their knowledge and experience, which leads to less accurate identification of attack steps and inconsistent clustering accuracy. Furthermore, the existing alert correlation systems deal with a huge amount of data that contains null values, incomplete information, and irrelevant features, causing the analysis of the alerts to be tedious, time-consuming and error-prone. Therefore, this paper focuses on selecting accurate and significant features of alerts that are appropriate to represent the attack steps, thus enhancing the structural-based alert correlation model. A two-tier feature selection method is proposed to obtain the significant features. The first tier aims at ranking the subset of features based on high information gain entropy in decreasing order. The second tier extends additional features with a better discriminative ability than the initially ranked features. Performance analysis results show the significance of the selected features in terms of the clustering accuracy using the 2000 DARPA intrusion detection scenario-specific dataset. PMID:27893821
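
    A minimal sketch of the first-tier ranking, with information gain computed as the entropy of the attack-step labels minus the conditional entropy given each alert feature, is shown below with toy data; the second-tier extension is not reproduced.

        # Sketch: rank features by information gain (entropy of the class labels minus
        # conditional entropy given the feature). Only the first-tier ranking is
        # illustrated; the second-tier extension step is omitted.
        import numpy as np
        from collections import Counter

        def entropy(labels):
            counts = np.array(list(Counter(labels).values()), dtype=float)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def information_gain(feature_values, labels):
            total = entropy(labels)
            cond = 0.0
            for v in set(feature_values):
                subset = [l for l, fv in zip(labels, feature_values) if fv == v]
                cond += len(subset) / len(labels) * entropy(subset)
            return total - cond

        # Toy alert data: each column is a categorical alert feature
        features = {
            "src_port":  ["80", "80", "443", "80", "22", "22"],
            "alert_sig": ["scan", "scan", "dos", "scan", "brute", "brute"],
        }
        labels = ["step1", "step1", "step2", "step1", "step3", "step3"]
        ranking = sorted(((information_gain(v, labels), name)
                          for name, v in features.items()), reverse=True)
        for gain, name in ranking:
            print(f"{name}: IG = {gain:.3f}")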

  17. Production of Engineered Fabrics Using Artificial Neural Network-Genetic Algorithm Hybrid Model

    NASA Astrophysics Data System (ADS)

    Mitra, Ashis; Majumdar, Prabal Kumar; Banerjee, Debamalya

    2015-10-01

    The process of fabric engineering generally practised in most textile mills is very complicated, repetitive, tedious and time consuming. To eliminate this trial and error approach, a new approach to fabric engineering has been attempted in this work. Data sets of construction parameters [comprising ends per inch (EPI), picks per inch, warp count and weft count] and three fabric properties (namely drape coefficient, air permeability and thermal resistance) of 25 handloom cotton fabrics have been used. The weights and biases of three artificial neural network (ANN) models developed for the prediction of drape coefficient, air permeability and thermal resistance were used to formulate the fitness or objective function and constraints of the optimization problem. The optimization problem was solved using a genetic algorithm (GA). For both of the fabrics that were engineered, the target and simulated fabric properties were very close. The GA was able to find the optimum set of fabric construction parameters with reasonably good accuracy except in the case of EPI. However, the overall result is encouraging and can be improved further by using larger data sets of handloom fabrics with the hybrid ANN-GA model.
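
    The coupling of the two models can be sketched as follows: a feed-forward network (here with synthetic, fixed weights standing in for the trained ANNs) maps construction parameters to predicted properties, and a very plain genetic algorithm searches for parameters whose predictions match a target. Everything numeric below is illustrative.

        # Sketch of the ANN-GA idea: a (here, fixed synthetic) feed-forward network
        # predicts fabric properties from construction parameters, and a simple GA
        # searches for parameters whose predicted properties match a target.
        import numpy as np

        rng = np.random.default_rng(0)
        W1, b1 = rng.normal(size=(4, 8)), rng.normal(size=8)     # stand-in for trained ANN
        W2, b2 = rng.normal(size=(8, 3)), rng.normal(size=3)

        def ann_predict(x):                      # x: [EPI, PPI, warp count, weft count]
            return np.tanh(x @ W1 + b1) @ W2 + b2   # -> [drape, air perm., thermal res.]

        target = np.array([0.5, -0.2, 1.0])

        def fitness(x):                          # smaller = closer to target properties
            return np.sum((ann_predict(x) - target) ** 2)

        pop = rng.uniform(-1, 1, size=(60, 4))
        for _ in range(100):                     # very plain GA: selection + mutation
            scores = np.apply_along_axis(fitness, 1, pop)
            parents = pop[np.argsort(scores)[:20]]
            children = parents[rng.integers(0, 20, size=40)] + rng.normal(scale=0.1, size=(40, 4))
            pop = np.vstack([parents, children])

        best = pop[np.argmin(np.apply_along_axis(fitness, 1, pop))]
        print("best parameters:", best, "fitness:", fitness(best))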

  18. Deciphering protein signatures using color, morphological, and topological analysis of immunohistochemically stained human tissues

    NASA Astrophysics Data System (ADS)

    Zerhouni, Erwan; Prisacari, Bogdan; Zhong, Qing; Wild, Peter; Gabrani, Maria

    2016-03-01

    Images of tissue specimens enable evidence-based study of disease susceptibility and stratification. Moreover, staining technologies empower the evidencing of molecular expression patterns by multicolor visualization, thus enabling personalized disease treatment and prevention. However, translating molecular expression imaging into direct health benefits has been slow. Two major factors contribute to that. On the one hand, disease susceptibility and progression is a complex, multifactorial molecular process. Diseases, such as cancer, exhibit cellular heterogeneity, impeding the differentiation between diverse grades or types of cell formations. On the other hand, the relative quantification of the stained tissue selected features is ambiguous, tedious and time consuming, prone to clerical error, leading to intra- and inter-observer variability and low throughput. Image analysis of digital histopathology images is a fast-developing and exciting area of disease research that aims to address the above limitations. We have developed a computational framework that extracts unique signatures using color, morphological and topological information and allows the combination thereof. The integration of the above information enables diagnosis of disease with AUC as high as 0.97. Multiple staining show significant improvement with respect to most proteins, and an AUC as high as 0.99.

  19. Automatic extraction of the mid-sagittal plane using an ICP variant

    NASA Astrophysics Data System (ADS)

    Fieten, Lorenz; Eschweiler, Jörg; de la Fuente, Matías; Gravius, Sascha; Radermacher, Klaus

    2008-03-01

    Precise knowledge of the mid-sagittal plane is important for the assessment and correction of several deformities. Furthermore, the mid-sagittal plane can be used for the definition of standardized coordinate systems such as pelvis or skull coordinate systems. A popular approach for mid-sagittal plane computation is based on the selection of anatomical landmarks located either directly on the plane or symmetrically to it. However, the manual selection of landmarks is a tedious, time-consuming and error-prone task, which requires great care. In order to overcome this drawback, previously it was suggested to use the iterative closest point (ICP) algorithm: After an initial mirroring of the data points on a default mirror plane, the mirrored data points should be registered iteratively to the model points using rigid transforms. Finally, a reflection transform approximating the cumulative transform could be extracted. In this work, we present an ICP variant for the iterative optimization of the reflection parameters. It is based on a closed-form solution to the least-squares problem of matching data points to model points using a reflection. In experiments on CT pelvis and skull datasets our method showed a better ability to match homologous areas.
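
    The closed-form step, fitting the best orthogonal transform with determinant -1 (a reflection) between matched point sets via an SVD, can be sketched as below; the surrounding ICP loop that establishes correspondences is omitted, and the demo data are synthetic.

        # Sketch of a closed-form least-squares fit of a reflection (orthogonal matrix
        # with determinant -1) between matched data and model points. The ICP
        # correspondence loop around it is omitted.
        import numpy as np

        def fit_reflection(data, model):
            """Return Q (3x3, det = -1) and t minimizing ||Q @ data_i + t - model_i||."""
            cd, cm = data.mean(axis=0), model.mean(axis=0)
            H = (data - cd).T @ (model - cm)
            U, _, Vt = np.linalg.svd(H)
            # Force det(Q) = -1 by flipping the axis of the smallest singular value.
            D = np.diag([1.0, 1.0, -np.linalg.det(Vt.T @ U.T)])
            Q = Vt.T @ D @ U.T
            t = cm - Q @ cd
            return Q, t

        # Demo: points mirrored across the y-z plane plus a small offset
        rng = np.random.default_rng(3)
        pts = rng.normal(size=(200, 3))
        mirror = np.diag([-1.0, 1.0, 1.0])
        mirrored = pts @ mirror.T + np.array([0.1, 0.0, 0.0])
        Q, t = fit_reflection(pts, mirrored)
        print("det(Q) =", round(np.linalg.det(Q), 3))
        print("recovered mirror:\n", np.round(Q, 3))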

  20. Controlling Fringe Sensitivity of Electro-Optic Holography Systems Using Laser Diode Current Modulation

    NASA Technical Reports Server (NTRS)

    Bybee, Shannon J.

    2001-01-01

    Electro-Optic Holography (EOH) is a non-intrusive, laser-based, displacement measurement technique capable of static and dynamic displacement measurements. EOH is an optical interference technique in which fringe patterns that represent displacement contour maps are generated. At excessively large displacements the fringe density may be so great that individual fringes are not resolvable using typical EOH techniques. This thesis focuses on the development and implementation of a method for controlling the sensitivity of the EOH system. This method is known as Frequency Translated Electro-Optic Holography (FTEOH). It was determined that by modulating the current source of the laser diode at integer multiples of the object vibration frequency, the fringe pattern is governed by higher order Bessel functions of the first kind and the number of fringes that represent a given displacement can be controlled. The reduction of fringes is theoretically unlimited but physically limited by the frequency bandwidth of the signal generator providing modulation to the laser diode. Although this research technique has been verified theoretically and experimentally in this thesis, due to current laser diode capabilities it is a tedious and time consuming process to acquire data using the FTEOH technique.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, T.; Jones, H.; Wong, K.

    The Marshall Islands Environmental Characterization and Dose Assessment Program has recently implemented waste minimization measures to reduce low-level radioactive (LLW) and low-level mixed (LLWMIXED) waste streams at the Lawrence Livermore National Laboratory (LLNL). Several thousand environmental samples are collected annually from former US nuclear test sites in the Marshall Islands, and returned to LLNL for processing and radiometric analysis. In the past, we analyzed coconut milk directly by gamma-spectrometry after adding formaldehyde (as preservative) and sealing the fluid in metal cans. This procedure was not only tedious and time consuming but generated storage and waste disposal problems. We have now reduced the number of coconut milk samples required for analysis from 1500 per year to approximately 250, and developed a new analytical procedure which essentially eliminates the associated mixed radioactive waste stream. Coconut milk samples are mixed with a few grams of ammonium molybdophosphate (AMP) which quantitatively scavenges the target radionuclide cesium-137 in an ion-exchange process. The AMP is then separated from the mixture and sealed in a plastic container. The bulk sample material can be disposed of as a non-radioactive non-hazardous waste, and the relatively small amount of AMP conveniently counted by gamma-spectrometry, packaged and stored for future use.

  2. A user friendly system for ultrasound carotid intima-media thickness image interpretation

    NASA Astrophysics Data System (ADS)

    Zhu, Xiangjun; Kendall, Christopher B.; Hurst, R. Todd; Liang, Jianming

    2011-03-01

    Assessment of Carotid Intima-Media Thickness (CIMT) by B-mode ultrasound is a technically mature and reproducible technology. Given the high morbidity, mortality and the large societal burden associated with CV diseases, as a safe yet inexpensive tool, CIMT is increasingly utilized for cardiovascular (CV) risk stratification. However, CIMT requires a precise measure of the thickness of the intima and media layers of the carotid artery that can be tedious, time consuming, and demand specialized expertise and experience. To this end, we have developed a highly user-friendly system for semiautomatic CIMT image interpretation. Our contribution is the application of active contour models (snake models) with hard constraints, leading to an accurate, adaptive and user-friendly border detection algorithm. A comparison study with the CIMT measurement software in Siemens Syngo® Arterial Health Package shows that our system gives a small bias in mean (0.049 +/-0.051mm) and maximum (0.010 +/- 0.083 mm) CIMT measures and offers a higher reproducibility (average correlation coefficients were 0.948 and 0.844 in mean and maximum CIMT respectively (P <0.001)). This superior performance is attributed to our novel interface design for hard constraints in the snake models.

  3. Automatic approach to deriving fuzzy slope positions

    NASA Astrophysics Data System (ADS)

    Zhu, Liang-Jun; Zhu, A.-Xing; Qin, Cheng-Zhi; Liu, Jun-Zhi

    2018-03-01

    Fuzzy characterization of slope positions is important for geographic modeling. Most of the existing fuzzy classification-based methods for fuzzy characterization require extensive user intervention in data preparation and parameter setting, which is tedious and time-consuming. This paper presents an automatic approach to overcoming these limitations in the prototype-based inference method for deriving fuzzy membership value (or similarity) to slope positions. The key contribution is a procedure for finding the typical locations and setting the fuzzy inference parameters for each slope position type. Instead of being determined totally by users in the prototype-based inference method, in the proposed approach the typical locations and fuzzy inference parameters for each slope position type are automatically determined by a rule set based on prior domain knowledge and the frequency distributions of topographic attributes. Furthermore, the preparation of topographic attributes (e.g., slope gradient, curvature, and relative position index) is automated, so the proposed automatic approach has only one necessary input, i.e., the gridded digital elevation model of the study area. All compute-intensive algorithms in the proposed approach were speeded up by parallel computing. Two study cases were provided to demonstrate that this approach can properly, conveniently and quickly derive the fuzzy slope positions.
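
    The prototype-based inference itself can be illustrated very simply: membership of a grid cell in a slope-position class is the minimum of Gaussian similarities between the cell's terrain attributes and the prototype's attribute values. The prototype values and bandwidths below are made up for illustration, not derived from the paper's rule set.

        # Sketch: fuzzy membership to a slope-position prototype computed as the
        # minimum of Gaussian similarity functions over terrain attributes. The
        # prototype values and bandwidths below are illustrative, not rule-derived.
        import numpy as np

        def gaussian_similarity(value, prototype, width):
            return np.exp(-0.5 * ((value - prototype) / width) ** 2)

        def fuzzy_membership(cell, prototype):
            """cell: dict of attribute -> value; prototype: attribute -> (value, width)."""
            sims = [gaussian_similarity(cell[a], v, w) for a, (v, w) in prototype.items()]
            return min(sims)           # limiting-factor (minimum) combination

        ridge_prototype = {"slope_deg": (2.0, 3.0), "relative_position": (0.95, 0.1)}
        cells = [
            {"slope_deg": 1.5, "relative_position": 0.9},    # near the ridge
            {"slope_deg": 20.0, "relative_position": 0.5},   # mid backslope
        ]
        for c in cells:
            print(c, "-> ridge membership =", round(fuzzy_membership(c, ridge_prototype), 3))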

  4. Enhancement of oral bioavailability of anti-HIV drug rilpivirine HCl through nanosponge formulation.

    PubMed

    Zainuddin, Rana; Zaheer, Zahid; Sangshetti, Jaiprakash N; Momin, Mufassir

    2017-12-01

    To synthesize β-cyclodextrin nanosponges using a novel and efficient microwave-mediated method for enhancing the bioavailability of Rilpivirine HCl (RLP). Belonging to BCS class II, RLP has pH-dependent solubility and poor oral bioavailability. However, a fatty meal enhances its absorption, so the therapy requires that the dosage form be consumed with a meal. This makes it tedious and inconvenient to continue the therapy for years while facing the associated gastric side effects such as nausea. A microwave synthesizer was used to mediate the poly-condensation reaction between β-cyclodextrin and the cross-linker diphenylcarbonate. Critical parameters selected were polymer to cross-linker ratio, Watt power, reaction time and solvent volume. Characterization studies were performed using FTIR, DSC, SEM, 1H-NMR and PXRD. Molecular modeling was applied to confirm the possibility of drug entrapment. In vitro drug dissolution followed by oral bioavailability studies were performed in Sprague-Dawley rats. Samples were analyzed using HPLC. Microwave synthesis yields para-crystalline, porous nanosponges (∼205 nm). Drug entrapment led to enhancement of solubility and a two-fold increase in drug dissolution (P < 0.001) following the Higuchi release model. Enhanced oral bioavailability was observed in fasted Sprague-Dawley rats, where C max and AUC 0-∞ increased significantly (C max of NS ∼586 ± 5.91 ng/mL; plain RLP ∼310 ± 5.74 ng/mL). The approach offers a comfortable dosing regimen for AIDS patients, negating the requirement of consuming the formulation in a fed state due to the enhancement in the drug's oral bioavailability.

  5. From Tedious to Timely: Screencasting to Troubleshoot Electronic Resource Issues

    ERIC Educational Resources Information Center

    Hartnett, Eric; Thompson, Carole

    2010-01-01

    The shift from traditional print materials to electronic resources, in conjunction with the rise in the number of distance education programs, has left many electronic resource librarians scrambling to keep up with the resulting inundation of electronic resource problems. When it comes to diagnosing these problems, words do not always convey all…

  6. Efficient Semi-Automatic 3D Segmentation for Neuron Tracing in Electron Microscopy Images

    PubMed Central

    Jones, Cory; Liu, Ting; Cohan, Nathaniel Wood; Ellisman, Mark; Tasdizen, Tolga

    2015-01-01

    Background: In the area of connectomics, there is a significant gap between the time required for data acquisition and dense reconstruction of the neural processes contained in the same dataset. Automatic methods are able to eliminate this timing gap, but the state-of-the-art accuracy so far is insufficient for use without user corrections. If completed naively, this process of correction can be tedious and time consuming. New Method: We present a new semi-automatic method that can be used to perform 3D segmentation of neurites in EM image stacks. It utilizes an automatic method that creates a hierarchical structure for recommended merges of superpixels. The user is then guided through each predicted region to quickly identify errors and establish correct links. Results: We tested our method on three datasets with both novice and expert users. Accuracy and timing were compared with published automatic, semi-automatic, and manual results. Comparison with Existing Methods: Post-automatic correction methods have also been used in [1] and [2]. These methods do not provide navigation or suggestions in the manner we present. Other semi-automatic methods require user input prior to the automatic segmentation such as [3] and [4] and are inherently different than our method. Conclusion: Using this method on the three datasets, novice users achieved accuracy exceeding state-of-the-art automatic results, and expert users achieved accuracy on par with full manual labeling but with a 70% time improvement when compared with other examples in publication. PMID:25769273

  7. A high-throughput assay for enzymatic polyester hydrolysis activity by fluorimetric detection.

    PubMed

    Wei, Ren; Oeser, Thorsten; Billig, Susan; Zimmermann, Wolfgang

    2012-12-01

    A fluorimetric assay for the fast determination of the activity of polyester-hydrolyzing enzymes in a large number of samples has been developed. Terephthalic acid (TPA) is a main product of the enzymatic hydrolysis of polyethylene terephthalate (PET), a synthetic polyester. Terephthalate has been quantified following its conversion to the fluorescent 2-hydroxyterephthalate by an iron autoxidation-mediated generation of free hydroxyl radicals. The assay proved to be robust at different buffer concentrations, reaction times, pH values, and in the presence of proteins. A validation of the assay was performed by analyzing TPA formation from PET films and nanoparticles catalyzed by a polyester hydrolase from Thermobifida fusca KW3 in a 96-well microplate format. The results showed a close correlation (R(2) = 0.99) with those obtained by a considerably more tedious and time-consuming HPLC method, suggesting the aptness of the fluorimetric assay for a high-throughput screening for polyester hydrolases. The method described in this paper will facilitate the detection and development of biocatalysts for the modification and degradation of synthetic polymers. The fluorimetric assay can be used to quantify the amount of TPA obtained as the final degradation product of the enzymatic hydrolysis of PET. In a microplate format, this assay can be applied for the high-throughput screening of polyester hydrolases. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Interactive tele-radiological segmentation systems for treatment and diagnosis.

    PubMed

    Zimeras, S; Gortzis, L G

    2012-01-01

    Telehealth is the exchange of health information and the provision of health care services through electronic information and communications technology, where participants are separated by geographic, time, social and cultural barriers. The shift of telemedicine from desktop platforms to wireless and mobile technologies is likely to have a significant impact on healthcare in the future. It is therefore crucial to develop a general information exchange e-medical system that enables its users to perform online and offline medical consultations and diagnosis. During medical diagnosis, image analysis techniques combined with doctors' opinions could be useful for final medical decisions. Quantitative analysis of digital images requires detection and segmentation of the borders of the object of interest. In medical images, segmentation has traditionally been done by human experts. Even with the aid of image processing software (computer-assisted segmentation tools), manual segmentation of 2D and 3D CT images is tedious, time-consuming, and thus impractical, especially in cases where a large number of objects must be specified. Substantial computational and storage requirements become especially acute when object orientation and scale have to be considered. Therefore, automated or semi-automated segmentation techniques are essential if these software applications are ever to gain widespread clinical use. The main purpose of this work is to analyze segmentation techniques for the definition of anatomical structures within telemedical systems.

  9. A Fast Method for the Segmentation of Synaptic Junctions and Mitochondria in Serial Electron Microscopic Images of the Brain.

    PubMed

    Márquez Neila, Pablo; Baumela, Luis; González-Soriano, Juncal; Rodríguez, Jose-Rodrigo; DeFelipe, Javier; Merchán-Pérez, Ángel

    2016-04-01

    Recent electron microscopy (EM) imaging techniques permit the automatic acquisition of a large number of serial sections from brain samples. Manual segmentation of these images is tedious, time-consuming and requires a high degree of user expertise. Therefore, there is considerable interest in developing automatic segmentation methods. However, currently available methods are computationally demanding in terms of computer time and memory usage, and to work properly many of them require image stacks to be isotropic, that is, voxels must have the same size in the X, Y and Z axes. We present a method that works with anisotropic voxels and that is computationally efficient allowing the segmentation of large image stacks. Our approach involves anisotropy-aware regularization via conditional random field inference and surface smoothing techniques to improve the segmentation and visualization. We have focused on the segmentation of mitochondria and synaptic junctions in EM stacks from the cerebral cortex, and have compared the results to those obtained by other methods. Our method is faster than other methods with similar segmentation results. Our image regularization procedure introduces high-level knowledge about the structure of labels. We have also reduced memory requirements with the introduction of energy optimization in overlapping partitions, which permits the regularization of very large image stacks. Finally, the surface smoothing step improves the appearance of three-dimensional renderings of the segmented volumes.

  10. Paper-based microreactor integrating cell culture and subsequent immunoassay for the investigation of cellular phosphorylation.

    PubMed

    Lei, Kin Fong; Huang, Chia-Hao

    2014-12-24

    Investigation of cellular phosphorylation and signaling pathways has recently gained much attention for the study of the pathogenesis of cancer. Related conventional bioanalytical operations for this study, including cell culture and Western blotting, are time-consuming and labor-intensive. In this work, a paper-based microreactor has been developed to integrate cell culture and subsequent immunoassay on a single paper. The paper-based microreactor was a filter paper with an array of circular zones for running multiple cell cultures and subsequent immunoassays. Cancer cells were directly seeded in the circular zones without hydrogel encapsulation and cultured for 1 day. Subsequently, the expression of structural, functional, and phosphorylated proteins of the cells could be detected by their specific antibodies. Study of the activation level of phosphorylated Stat3 of liver cancer cells stimulated by IL-6 cytokine was demonstrated by the paper-based microreactor. This technique greatly reduces tedious bioanalytical operations and sample and reagent consumption. Also, the time required by the entire process can be shortened. This work provides a simple and rapid screening tool for the investigation of cellular phosphorylation and signaling pathways for understanding the pathogenesis of cancer. In addition, the operation of the paper-based microreactor is compatible with standard molecular biology training, and therefore, it has the potential to be developed into a routine protocol for various research areas in conventional bioanalytical laboratories.

  11. Comparison of three methods for registration of abdominal/pelvic volume data sets from functional-anatomic scans

    NASA Astrophysics Data System (ADS)

    Mahmoud, Faaiza; Ton, Anthony; Crafoord, Joakim; Kramer, Elissa L.; Maguire, Gerald Q., Jr.; Noz, Marilyn E.; Zeleznik, Michael P.

    2000-06-01

    The purpose of this work was to evaluate three volumetric registration methods in terms of technique, user-friendliness and time requirements. CT and SPECT data from 11 patients were interactively registered using: a 3D method involving only affine transformation; a mixed 3D - 2D non-affine (warping) method; and a 3D non-affine (warping) method. In the first method representative isosurfaces are generated from the anatomical images. Registration proceeds through translation, rotation, and scaling in all three space variables. Resulting isosurfaces are fused and quantitative measurements are possible. In the second method, the 3D volumes are rendered co-planar by performing an oblique projection. Corresponding landmark pairs are chosen on matching axial slice sets. A polynomial warp is then applied. This method has undergone extensive validation and was used to evaluate the results. The third method employs visualization tools. The data model allows images to be localized within two separate volumes. Landmarks are chosen on separate slices. Polynomial warping coefficients are generated and data points from one volume are moved to the corresponding new positions. The two landmark methods were the least time consuming (10 to 30 minutes from start to finish), but did demand a good knowledge of anatomy. The affine method was tedious and required a fair understanding of 3D geometry.

  12. Automated morphological analysis of bone marrow cells in microscopic images for diagnosis of leukemia: nucleus-plasma separation and cell classification using a hierarchical tree model of hematopoesis

    NASA Astrophysics Data System (ADS)

    Krappe, Sebastian; Wittenberg, Thomas; Haferlach, Torsten; Münzenmayer, Christian

    2016-03-01

    The morphological differentiation of bone marrow is fundamental for the diagnosis of leukemia. Currently, the counting and classification of the different types of bone marrow cells is done manually under the use of bright field microscopy. This is a time-consuming, subjective, tedious and error-prone process. Furthermore, repeated examinations of a slide may yield intra- and inter-observer variances. For that reason a computer assisted diagnosis system for bone marrow differentiation is pursued. In this work we focus (a) on a new method for the separation of nucleus and plasma parts and (b) on a knowledge-based hierarchical tree classifier for the differentiation of bone marrow cells in 16 different classes. Classification trees are easily interpretable and understandable and provide a classification together with an explanation. Using classification trees, expert knowledge (i.e. knowledge about similar classes and cell lines in the tree model of hematopoiesis) is integrated in the structure of the tree. The proposed segmentation method is evaluated with more than 10,000 manually segmented cells. For the evaluation of the proposed hierarchical classifier more than 140,000 automatically segmented bone marrow cells are used. Future automated solutions for the morphological analysis of bone marrow smears could potentially apply such an approach for the pre-classification of bone marrow cells and thereby shortening the examination time.

  13. Semi-automated image analysis: detecting carbonylation in subcellular regions of skeletal muscle

    PubMed Central

    Kostal, Vratislav; Levar, Kiara; Swift, Mark; Skillrud, Erik; Chapman, Mark; Thompson, LaDora V.

    2011-01-01

    The level of carbonylation in skeletal muscle is a marker of oxidative damage associated with disease and aging. While immunofluorescence microscopy is an elegant method to identify carbonylation sites in muscle cross-sections, imaging analysis is manual, tedious, and time consuming, especially when the goal is to characterize carbonyl contents in subcellular regions. In this paper, we present a semi-automated method for the analysis of carbonylation in subcellular regions of skeletal muscle cross-sections visualized with dual fluorescent immunohistochemistry. Carbonyls were visualized by their reaction with 2,4-dinitrophenylhydrazine (DNPH) followed by immunolabeling with an Alexa488-tagged anti-DNP antibody. Mitochondria were probed with an anti-COXI primary antibody followed by labeling with an Alexa568-tagged secondary antibody. After imaging, muscle fibers were individually analyzed using a custom-designed, lab-written, computer-aided procedure to measure carbonylation levels in subsarcolemmal and interfibrillar mitochondrial regions, and in the cytoplasmic and extracellular regions. Using this procedure, we were able to decrease the time necessary for the analysis of a single muscle fiber from 45 min to about 1 min. The procedure was tested by four independent analysts and found to be independent of inter-person and intra-person variations. This procedure will help increase highly needed throughput in muscle studies related to ageing, disease, physical performance, and inactivity that use carbonyl levels as markers of oxidative damage. PMID:21327623

  14. Hydrogel-gauze dressing for moderate-to-severe atopic dermatitis: development and efficacy study on atopic dermatitis-like skin lesions in NC/Nga mice.

    PubMed

    Ng, Shiow-Fern; Lew, Pit-Chin; Sin, Yong-Boey

    2014-11-01

    Topical emollients are known to provide symptomatic relief for atopic dermatitis. In hospitals, wet-wrap therapy has been shown to benefit children with moderate-to-severe atopic dermatitis (AD), but the application of wet-wraps is tedious and time-consuming. Topical emollients have low residence time and often dry out easily. The aim of this work was to develop a hydrogel-gauze dressing that is not only easy to apply but also rehydrates and traps moisture to provide longer relief for AD patients. In this study, a prototype hydrogel-gauze dressing was developed with varying ratios of sodium carboxymethylcellulose (NaCMC) and propylene glycol. The hydrogel-gauze dressings were assessed based on the moisture vapor transmission rate, moisture absorption, mechanical properties and storage stability over three months. Then, the efficacy of the hydrogel-gauze dressing was compared to topical emollients using transgenic NC/Nga mice with AD-like lesions. The NaCMC hydrogel-gauze dressings significantly lowered transepidermal water loss, and the animals displayed a faster recovery, which indicates that hydrogel-gauze dressings can trap moisture more effectively and accelerate AD healing. Hence, we propose that hydrogel-gauze dressings can potentially become an alternative to wet-wrap therapy due to the ease of application and the higher efficacy compared to topical products.

  15. Assessing the difficulty and time cost of de-identification in clinical narratives.

    PubMed

    Dorr, D A; Phillips, W F; Phansalkar, S; Sims, S A; Hurdle, J F

    2006-01-01

    The objective was to characterize the difficulty confronting investigators in removing protected health information (PHI) from cross-discipline, free-text clinical notes, an important challenge to clinical informatics research as recalibrated by the introduction of the US Health Insurance Portability and Accountability Act (HIPAA) and similar regulations. Clinical narratives from complete admissions written by diverse providers were randomly selected and reviewed using a two-tiered rater system and simple automated regular expression tools. For manual review, two independent reviewers used simple search and replace algorithms and visual scanning to find PHI as defined by HIPAA, followed by an independent second review to detect any missed PHI. Simple automated review was also performed for the "easy" PHI that are number- or date-based. From 262 notes, 2074 PHI, or 7.9 +/- 6.1 per note, were found. The average recall (or sensitivity) was 95.9% while precision was 99.6% for single reviewers. Agreement between individual reviewers was strong (ICC = 0.99), although some asymmetry in errors was seen between reviewers (p = 0.001). The automated technique had better recall (98.5%) but worse precision (88.4%) for its subset of identifiers. Manually de-identifying a note took 87.3 +/- 61 seconds on average. Manual de-identification of free-text notes is tedious and time-consuming, but even simple PHI is difficult to automatically identify with the exactitude required under HIPAA.
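
    A minimal example of the kind of "simple automated regular expression tools" mentioned above is sketched below for number- and date-based identifiers; the patterns are illustrative only and nowhere near sufficient for HIPAA-grade de-identification.

        # Minimal illustration of regex-based detection of number- and date-based
        # identifiers in free text. Patterns are illustrative only and not complete
        # enough for HIPAA-grade de-identification.
        import re

        PATTERNS = {
            "date":  re.compile(r"\b\d{1,2}[/-]\d{1,2}[/-]\d{2,4}\b"),
            "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
            "mrn":   re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
        }

        def find_phi(note):
            hits = []
            for label, pattern in PATTERNS.items():
                hits += [(label, m.group()) for m in pattern.finditer(note)]
            return hits

        note = "Pt seen 03/14/2005, MRN: 0048213. Call 555-867-5309 with results."
        for label, text in find_phi(note):
            print(f"{label}: {text}")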

  16. Interactive approach to segment organs at risk in radiotherapy treatment planning

    NASA Astrophysics Data System (ADS)

    Dolz, Jose; Kirisli, Hortense A.; Viard, Romain; Massoptier, Laurent

    2014-03-01

    Accurate delineation of organs at risk (OAR) is required for radiation treatment planning (RTP). However, it is a very time consuming and tedious task. The use in clinic of image guided radiation therapy (IGRT) becomes more and more popular, thus increasing the need of (semi-)automatic methods for delineation of the OAR. In this work, an interactive segmentation approach to delineate OAR is proposed and validated. The method is based on the combination of watershed transformation, which groups small areas of similar intensities in homogeneous labels, and graph cuts approach, which uses these labels to create the graph. Segmentation information can be added in any view - axial, sagittal or coronal -, making the interaction with the algorithm easy and fast. Subsequently, this information is propagated within the whole volume, providing a spatially coherent result. Manual delineations made by experts of 6 OAR - lungs, kidneys, liver, spleen, heart and aorta - over a set of 9 computed tomography (CT) scans were used as reference standard to validate the proposed approach. With a maximum of 4 interactions, a Dice similarity coefficient (DSC) higher than 0.87 was obtained, which demonstrates that, with the proposed segmentation approach, only few interactions are required to achieve similar results as the ones obtained manually. The integration of this method in the RTP process may save a considerable amount of time, and reduce the annotation complexity.

  17. A Standard Handshake for the Use of Electronic Materials

    ERIC Educational Resources Information Center

    Glenn, David

    2007-01-01

    For most people on campuses, September means revising syllabi, enjoying football, or avoiding the person you broke up with last spring. For college librarians, it marks the beginning of "renewal season." It is time for the tedious work of placing orders and negotiating licenses for next year's journals. However, relief may be on the horizon.…

  18. Walkie-Talkie Measurements for the Speed of Radio Waves in Air

    ERIC Educational Resources Information Center

    Dombi, Andra; Tunyagi, Arthur; Neda, Zoltan

    2013-01-01

    A handheld emitter-receiver device suitable for the direct estimation of the velocity of radio waves in air is presented. The velocity of radio waves is measured using the direct time-of-flight method, without the need for any tedious and precise settings. The results for two measurement series are reported. Both sets of results give an estimate…

  19. Interesting, Cool and Tantalising? Or Inappropriate, Complicated and Tedious? Pupil and Teacher Views on ICT in Science Teaching

    ERIC Educational Resources Information Center

    Willshire, Michael

    2013-01-01

    In a relatively short space of time, classrooms have become full of computers, gadgets and electronic devices. Technology will only continue to become more sophisticated, more efficient and more abundant in schools. But how desirable is this technological revolution and to what extent should it develop? To measure the effectiveness and popularity…

  20. Improved ultrasound transducer positioning by fetal heart location estimation during Doppler based heart rate measurements.

    PubMed

    Hamelmann, Paul; Vullings, Rik; Schmitt, Lars; Kolen, Alexander F; Mischi, Massimo; van Laar, Judith O E H; Bergmans, Jan W M

    2017-09-21

    Doppler ultrasound (US) is the most commonly applied method to measure the fetal heart rate (fHR). When the fetal heart is not properly located within the ultrasonic beam, fHR measurements often fail. As a consequence, clinical staff need to reposition the US transducer on the maternal abdomen, which can be a time consuming and tedious task. In this article, a method is presented to aid clinicians with the positioning of the US transducer to produce robust fHR measurements. A maximum likelihood estimation (MLE) algorithm is developed, which provides information on fetal heart location using the power of the Doppler signals received in the individual elements of a standard US transducer for fHR recordings. The performance of the algorithm is evaluated with simulations and in vitro experiments performed on a beating-heart setup. Both the experiments and the simulations show that the heart location can be accurately determined with an error of less than 7 mm within the measurement volume of the employed US transducer. The results show that the developed algorithm can be used to provide accurate feedback on fetal heart location for improved positioning of the US transducer, which may lead to improved measurements of the fHR.

  1. A Brief Measure of Peer Affiliation and Social Acceptance (PASA): Validity in an Ethnically Diverse Sample of Early Adolescents

    PubMed Central

    Dishion, Thomas J.; Kim, Hanjoe; Stormshak, Elizabeth A.; O'Neill, Maya

    2014-01-01

    Objective: Conduct a multiagent–multimethod analysis of the validity of a brief measure of deviant peer affiliations and social acceptance (PASA) in young adolescents. Peer relationships are critical to child and adolescent social and emotional development, but currently available measures are tedious and time consuming. The PASA consists of a youth, parent, and teacher report that can be collected longitudinally to study development and intervention effectiveness. Method: This longitudinal study included 998 middle school students and their families. We collected the PASA and peer sociometrics data in Grade 7 and a multiagent–multimethod construct of deviant peer clustering in Grade 8. Results: Confirmatory factor analyses of the multiagent–multimethod data revealed that the constructs of deviant peer affiliations and social acceptance and rejection were distinguishable as unique but correlated constructs within the PASA. Convergent, discriminant, concurrent, and predictive validity of the PASA was satisfactory, although the acceptance and rejection constructs were highly correlated and showed similar patterns of concurrent validity. Factor invariance was established for mother and for father reports. Conclusions: Results suggest that the PASA is a valid and reliable measure of peer affiliation and of social acceptance among peers during the middle school years and provides a comprehensive yet brief assessment of peer affiliations and social acceptance. PMID:24611623

  2. Segmentation of breast ultrasound images based on active contours using neutrosophic theory.

    PubMed

    Lotfollahi, Mahsa; Gity, Masoumeh; Ye, Jing Yong; Mahlooji Far, A

    2018-04-01

    Ultrasound imaging is an effective approach for diagnosing breast cancer, but it is highly operator-dependent. Recent advances in computer-aided diagnosis have suggested that it can assist physicians in diagnosis. Definition of the region of interest before computer analysis is still needed. Since manual outlining of the tumor contour is tedious and time-consuming for a physician, developing an automatic segmentation method is important for clinical application. The present paper presents a novel method to segment breast ultrasound images. It utilizes a combination of region-based active contour and neutrosophic theory to overcome the natural properties of ultrasound images, including speckle noise and tissue-related textures. First, due to the inherent speckle noise and low contrast of these images, we have utilized a non-local means filter and a fuzzy logic method for denoising and image enhancement, respectively. This paper presents an improved weighted region-scalable active contour to segment breast ultrasound images using a new feature derived from neutrosophic theory. The method has been applied to 36 breast ultrasound images. It achieves a true-positive rate, false-positive rate, and similarity of 95%, 6%, and 90%, respectively. The proposed method shows clear advantages over other conventional active contour segmentation methods, i.e., region-scalable fitting energy and weighted region-scalable fitting energy.

  3. Computer-assisted 3D kinematic analysis of all leg joints in walking insects.

    PubMed

    Bender, John A; Simpson, Elaine M; Ritzmann, Roy E

    2010-10-26

    High-speed video can provide fine-scaled analysis of animal behavior. However, extracting behavioral data from video sequences is a time-consuming, tedious, subjective task. These issues are exacerbated where accurate behavioral descriptions require analysis of multiple points in three dimensions. We describe a new computer program written to assist a user in simultaneously extracting three-dimensional kinematics of multiple points on each of an insect's six legs. Digital video of a walking cockroach was collected in grayscale at 500 fps from two synchronized, calibrated cameras. We improved the legs' visibility by painting white dots on the joints, similar to techniques used for digitizing human motion. Compared to manual digitization of 26 points on the legs over a single, 8-second bout of walking (or 106,496 individual 3D points), our software achieved approximately 90% of the accuracy with 10% of the labor. Our experimental design reduced the complexity of the tracking problem by tethering the insect and allowing it to walk in place on a lightly oiled glass surface, but in principle, the algorithms implemented are extensible to free walking. Our software is free and open-source, written in the free language Python and including a graphical user interface for configuration and control. We encourage collaborative enhancements to make this tool both better and widely utilized.
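    The core geometric step in this kind of multi-camera digitization, recovering a 3D marker position from its pixel coordinates in two calibrated views, can be done with standard linear (DLT) triangulation. The sketch below is generic and assumes 3x4 projection matrices are available from calibration; it is not the published software.

```python
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one 3D point from two calibrated views.

    P1, P2   : 3x4 camera projection matrices.
    uv1, uv2 : (u, v) pixel coordinates of the same marker in each view.
    Returns the 3D point in the calibration frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize
```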

  4. MOWServ: a web client for integration of bioinformatic resources

    PubMed Central

    Ramírez, Sergio; Muñoz-Mérida, Antonio; Karlsson, Johan; García, Maximiliano; Pérez-Pulido, Antonio J.; Claros, M. Gonzalo; Trelles, Oswaldo

    2010-01-01

    The productivity of any scientist is affected by cumbersome, tedious and time-consuming tasks that try to make the heterogeneous web services compatible so that they can be useful in their research. MOWServ, the bioinformatic platform offered by the Spanish National Institute of Bioinformatics, was released to provide integrated access to databases and analytical tools. Since its release, the number of available services has grown dramatically, and it has become one of the main contributors of registered services in the EMBRACE Biocatalogue. The ontology that enables most of the web-service compatibility has been curated, improved and extended. The service discovery has been greatly enhanced by Magallanes software and biodataSF. User data are securely stored on the main server by an authentication protocol that enables the monitoring of current or already-finished user’s tasks, as well as the pipelining of successive data processing services. The BioMoby standard has been greatly extended with the new features included in the MOWServ, such as management of additional information (metadata such as extended descriptions, keywords and datafile examples), a qualified registry, error handling, asynchronous services and service replication. All of them have increased the MOWServ service quality, usability and robustness. MOWServ is available at http://www.inab.org/MOWServ/ and has a mirror at http://www.bitlab-es.com/MOWServ/. PMID:20525794

  5. Interactive three-dimensional visualization and creation of geometries for Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Theis, C.; Buchegger, K. H.; Brugger, M.; Forkel-Wirth, D.; Roesler, S.; Vincke, H.

    2006-06-01

    The implementation of three-dimensional geometries for the simulation of radiation transport problems is a very time-consuming task. Each particle transport code supplies its own scripting language and syntax for creating the geometries. All of them are based on the Constructive Solid Geometry scheme requiring textual description. This makes the creation a tedious and error-prone task, which is especially hard to master for novice users. The Monte Carlo code FLUKA comes with built-in support for creating two-dimensional cross-sections through the geometry and FLUKACAD, a custom-built converter to the commercial Computer Aided Design package AutoCAD, exists for 3D visualization. For other codes, like MCNPX, a couple of different tools are available, but they are often specifically tailored to the particle transport code and its approach used for implementing geometries. Complex constructive solid modeling usually requires very fast and expensive special purpose hardware, which is not widely available. In this paper SimpleGeo is presented, which is an implementation of a generic versatile interactive geometry modeler using off-the-shelf hardware. It is running on Windows, with a Linux version currently under preparation. This paper describes its functionality, which allows for rapid interactive visualization as well as generation of three-dimensional geometries, and also discusses critical issues regarding common CAD systems.

  6. A spatiotemporal-based scheme for efficient registration-based segmentation of thoracic 4-D MRI.

    PubMed

    Yang, Y; Van Reeth, E; Poh, C L; Tan, C H; Tham, I W K

    2014-05-01

    Dynamic three-dimensional (3-D) (four-dimensional, 4-D) magnetic resonance (MR) imaging is gaining importance in the study of pulmonary motion for respiratory diseases and pulmonary tumor motion for radiotherapy. To perform quantitative analysis using 4-D MR images, segmentation of anatomical structures such as the lung and pulmonary tumor is required. Manual segmentation of entire thoracic 4-D MRI data that typically contains many 3-D volumes acquired over several breathing cycles is extremely tedious, time consuming, and suffers from high user variability. This motivates the development of new automated segmentation schemes for 4-D MRI data. Registration-based segmentation, which uses automatic registration methods for segmentation, has been shown to segment structures accurately in 4-D data series. However, directly applying registration-based segmentation to segment 4-D MRI series lacks efficiency. Here we propose an automated 4-D registration-based segmentation scheme that is based on spatiotemporal information for the segmentation of thoracic 4-D MR lung images. The proposed scheme saved up to 95% of the computation while achieving segmentation accuracy comparable to directly applying registration-based segmentation to the 4-D dataset. The scheme facilitates rapid 3-D/4-D visualization of the lung and tumor motion and potentially the tracking of the tumor during radiation delivery.
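    Registration-based segmentation in this setting boils down to warping a label mask from a segmented reference frame onto the other frames using the deformation fields returned by registration. Below is a minimal sketch of that propagation step only, assuming a target-to-reference displacement field in voxel units; the registration itself and the paper's spatiotemporal scheme are out of scope.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def propagate_labels(ref_labels, deformation):
    """Warp a reference segmentation onto a target volume.

    ref_labels  : (Z, Y, X) integer label volume segmented on the reference frame.
    deformation : (3, Z, Y, X) displacement field (in voxels) mapping target -> reference,
                  e.g. produced by deformable registration of the two frames.
    """
    zz, yy, xx = np.meshgrid(
        np.arange(ref_labels.shape[0]),
        np.arange(ref_labels.shape[1]),
        np.arange(ref_labels.shape[2]),
        indexing="ij",
    )
    coords = np.stack([zz, yy, xx]).astype(float) + deformation
    # order=0 keeps labels integral (nearest-neighbour) instead of blending them
    return map_coordinates(ref_labels, coords, order=0, mode="nearest")
```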

  7. Tracking colliding cells in vivo microscopy.

    PubMed

    Nguyen, Nhat H; Keller, Steven; Norris, Eric; Huynh, Toan T; Clemens, Mark G; Shin, Min C

    2011-08-01

    Leukocyte motion represents an important component in the innate immune response to infection. Intravital microscopy is a powerful tool as it enables in vivo imaging of leukocyte motion. Under inflammatory conditions, leukocytes may exhibit various motion behaviors, such as flowing, rolling, and adhering. With many leukocytes moving at a wide range of speeds, collisions occur. These collisions result in abrupt changes in the motion and appearance of leukocytes. Manual analysis is tedious, error prone, time consuming, and could introduce technician-related bias. Automatic tracking is also challenging due to the noise inherent in in vivo images and abrupt changes in motion and appearance due to collision. This paper presents a method to automatically track multiple cells undergoing collisions by modeling the appearance and motion for each collision state and testing collision hypotheses of possible transitions between states. The tracking results are demonstrated using in vivo intravital microscopy image sequences. We demonstrate that (1) 71% of colliding cells are correctly tracked; (2) the improvement offered by the proposed method grows as the duration of collision increases; and (3) given good detection results, the proposed method can correctly track 88% of colliding cells. The method minimizes the tracking failures under collisions and, therefore, allows more robust analysis in the study of leukocyte behaviors responding to inflammatory conditions.

  8. Automated classification of bone marrow cells in microscopic images for diagnosis of leukemia: a comparison of two classification schemes with respect to the segmentation quality

    NASA Astrophysics Data System (ADS)

    Krappe, Sebastian; Benz, Michaela; Wittenberg, Thomas; Haferlach, Torsten; Münzenmayer, Christian

    2015-03-01

    The morphological analysis of bone marrow smears is fundamental for the diagnosis of leukemia. Currently, the counting and classification of the different types of bone marrow cells is done manually with the use of a bright field microscope. This is a time-consuming, partly subjective and tedious process. Furthermore, repeated examinations of a slide yield intra- and inter-observer variances. For this reason, automation of morphological bone marrow analysis is pursued. This analysis comprises several steps: image acquisition and smear detection, cell localization and segmentation, feature extraction and cell classification. The automated classification of bone marrow cells depends on the automated cell segmentation and the choice of adequate features extracted from different parts of the cell. In this work we focus on the evaluation of support vector machines (SVMs) and random forests (RFs) for the differentiation of bone marrow cells into 16 different classes, including immature and abnormal cell classes. Data sets of different segmentation quality are used to test the two approaches. Automated solutions for the morphological analysis of bone marrow smears could use such a classifier to pre-classify bone marrow cells and thereby shorten the examination duration.
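    A generic comparison of the two classifier families on pre-extracted cell features might look like the sketch below. The feature matrix and labels are random placeholders standing in for real morphological and texture features and the 16 cell classes, and the hyper-parameters are illustrative, not those of the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: (n_cells, n_features) features per segmented cell; y: one of 16 class labels.
# Both are synthetic placeholders for the real, segmentation-derived data.
rng = np.random.default_rng(0)
X = rng.normal(size=(800, 40))
y = rng.integers(0, 16, size=800)

classifiers = {
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0)),
    "Random forest": RandomForestClassifier(n_estimators=300, random_state=0),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: accuracy {scores.mean():.3f} +/- {scores.std():.3f}")
```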

  9. Automated identification of abnormal metaphase chromosome cells for the detection of chronic myeloid leukemia using microscopic images

    NASA Astrophysics Data System (ADS)

    Wang, Xingwei; Zheng, Bin; Li, Shibo; Mulvihill, John J.; Chen, Xiaodong; Liu, Hong

    2010-07-01

    Karyotyping is an important process to classify chromosomes into standard classes and the results are routinely used by the clinicians to diagnose cancers and genetic diseases. However, visual karyotyping using microscopic images is time-consuming and tedious, which reduces the diagnostic efficiency and accuracy. Although many efforts have been made to develop computerized schemes for automated karyotyping, no scheme can be performed without substantial human intervention. Instead of developing a method to classify all chromosome classes, we develop an automatic scheme to detect abnormal metaphase cells by identifying a specific class of chromosomes (class 22) and prescreen for suspicious chronic myeloid leukemia (CML). The scheme includes three steps: (1) iteratively segment randomly distributed individual chromosomes, (2) process segmented chromosomes and compute image features to identify the candidates, and (3) apply an adaptive matching template to identify chromosomes of class 22. An image data set of 451 metaphase cells extracted from bone marrow specimens of 30 positive and 30 negative cases for CML is selected to test the scheme's performance. The overall case-based classification accuracy is 93.3% (100% sensitivity and 86.7% specificity). The results demonstrate the feasibility of applying an automated scheme to detect or prescreen the suspicious cancer cases.

  10. Segmentation of White Blood Cells From Microscopic Images Using a Novel Combination of K-Means Clustering and Modified Watershed Algorithm.

    PubMed

    Ghane, Narjes; Vard, Alireza; Talebi, Ardeshir; Nematollahy, Pardis

    2017-01-01

    Recognition of white blood cells (WBCs) is the first step in diagnosing particular diseases such as acquired immune deficiency syndrome, leukemia, and other blood-related diseases; it is usually done by pathologists using an optical microscope. This process is time-consuming, extremely tedious, and expensive, and needs experienced experts in this field. Thus, a computer-aided diagnosis system that assists pathologists in the diagnostic process can be highly effective. Segmentation of WBCs is usually a first step in developing a computer-aided diagnosis system. The main purpose of this paper is to segment WBCs from microscopic images. For this purpose, we present a novel combination of thresholding, k-means clustering, and modified watershed algorithms in three stages including (1) segmentation of WBCs from a microscopic image, (2) extraction of nuclei from the cell image, and (3) separation of overlapping cells and nuclei. The evaluation results of the proposed method show that the similarity measure, precision, and sensitivity were, respectively, 92.07%, 96.07%, and 94.30% for nucleus segmentation and 92.93%, 97.41%, and 93.78% for cell segmentation. In addition, statistical analysis presents high similarity between manual segmentation and the results obtained by the proposed method.
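    A common way to combine these ingredients, sketched below with scikit-learn and scikit-image, is to cluster pixel intensities with k-means to get a rough nucleus mask and then split touching nuclei with a marker-controlled watershed on the distance transform. This is a hedged illustration of the general pattern, not the exact three-stage pipeline of the paper; the cluster count and peak spacing are assumptions.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed
from sklearn.cluster import KMeans

def segment_nuclei(gray):
    """Rough nucleus segmentation: k-means on intensity, then watershed splitting.

    gray : 2D float array, grayscale blood-smear image (nuclei darker than background).
    """
    # 1) cluster pixels into 3 intensity groups; the darkest cluster ~ nuclei
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(gray.reshape(-1, 1))
    darkest = np.argmin(km.cluster_centers_.ravel())
    mask = (km.labels_ == darkest).reshape(gray.shape)

    # 2) marker-controlled watershed on the distance transform splits touching nuclei
    distance = ndi.distance_transform_edt(mask)
    peak_idx = peak_local_max(distance, min_distance=10, labels=mask)
    markers = np.zeros(gray.shape, dtype=int)
    markers[tuple(peak_idx.T)] = np.arange(1, len(peak_idx) + 1)
    return watershed(-distance, markers, mask=mask)
```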

  11. Computer simulation of the mathematical modeling involved in constitutive equation development: Via symbolic computations

    NASA Technical Reports Server (NTRS)

    Arnold, S. M.; Tan, H. Q.; Dong, X.

    1989-01-01

    Development of new material models for describing the high temperature constitutive behavior of real materials represents an important area of research in engineering disciplines. Derivation of mathematical expressions (constitutive equations) which describe this high temperature material behavior can be quite time consuming, involved and error prone; thus intelligent application of symbolic systems to facilitate this tedious process can be of significant benefit. A computerized procedure (SDICE) capable of efficiently deriving potential-based constitutive models in analytical form is presented. This package, running under MACSYMA, has the following features: partial differentiation, tensor computations, automatic grouping and labeling of common factors, expression substitution and simplification, back substitution of invariant and tensorial relations and a relational data base. Also, limited aspects of invariant theory were incorporated into SDICE due to the utilization of potentials as a starting point and the desire for these potentials to be frame invariant (objective). Finally, not only were flow and/or evolutionary laws calculated, but the determination of history-independent nonphysical coefficients in terms of physically measurable parameters, e.g., Young's modulus, was also achieved. The uniqueness of SDICE resides in its ability to manipulate expressions in a general yet predefined order and simplify expressions so as to limit expression growth. Results are displayed, when applicable, utilizing index notation.

  12. A New Method for Automated Identification and Morphometry of Myelinated Fibers Through Light Microscopy Image Analysis.

    PubMed

    Novas, Romulo Bourget; Fazan, Valeria Paula Sassoli; Felipe, Joaquim Cezar

    2016-02-01

    Nerve morphometry is known to produce relevant information for the evaluation of several phenomena, such as nerve repair, regeneration, implant, transplant, aging, and different human neuropathies. Manual morphometry is laborious, tedious, time consuming, and subject to many sources of error. Therefore, in this paper, we propose a new method for the automated morphometry of myelinated fibers in cross-section light microscopy images. Images from the recurrent laryngeal nerve of adult rats and the vestibulocochlear nerve of adult guinea pigs were used herein. The proposed pipeline for fiber segmentation is based on the techniques of competitive clustering and concavity analysis. The evaluation of the proposed method for segmentation of images was done by comparing the automatic segmentation with the manual segmentation. To further evaluate the proposed method considering morphometric features extracted from the segmented images, the distributions of these features were tested for statistically significant differences. The method achieved a high overall sensitivity and very low false-positive rates per image. We detected no statistically significant difference between the distributions of the features extracted from the manual and the pipeline segmentations. The method presented a good overall performance, showing widespread potential in experimental and clinical settings allowing large-scale image analysis and, thus, leading to more reliable results.

  13. Counting glomeruli and podocytes: rationale and methodologies

    PubMed Central

    Puelles, Victor G.; Bertram, John F.

    2015-01-01

    Purpose of review There is currently much interest in the numbers of both glomeruli and podocytes. This interest stems from greater understanding of the effects of suboptimal fetal events on nephron endowment, the associations between low nephron number and chronic cardiovascular and kidney disease in adults, and the emergence of the podocyte depletion hypothesis. Recent findings Obtaining accurate and precise estimates of glomerular and podocyte number has proven surprisingly difficult. When whole kidneys or large tissue samples are available, design-based stereological methods are considered gold-standard because they are based on principles that negate systematic bias. However, these methods are often tedious and time-consuming, and oftentimes inapplicable when dealing with small samples such as biopsies. Therefore, novel methods suitable for small tissue samples, and innovative approaches to facilitate high-throughput measurements, such as magnetic resonance imaging (MRI) to estimate glomerular number and flow cytometry to estimate podocyte number, have recently been described. Summary This review describes current gold-standard methods for estimating glomerular and podocyte number, as well as methods developed in the past 3 years. We are now better placed than ever before to accurately and precisely estimate glomerular and podocyte number, and to examine relationships between these measurements and kidney health and disease. PMID:25887899

  14. Self-Supervised Chinese Ontology Learning from Online Encyclopedias

    PubMed Central

    Shao, Zhiqing; Ruan, Tong

    2014-01-01

    Constructing ontology manually is a time-consuming, error-prone, and tedious task. We present SSCO, a self-supervised learning based Chinese ontology, which contains about 255 thousand concepts, 5 million entities, and 40 million facts. We explore the three largest online Chinese encyclopedias for ontology learning and describe how to transfer the structured knowledge in encyclopedias, including article titles, category labels, redirection pages, taxonomy systems, and InfoBox modules, into ontological form. In order to avoid the errors in encyclopedias and enrich the learnt ontology, we also apply some machine learning based methods. First, we prove that the self-supervised machine learning method is practicable in Chinese relation extraction (at least for synonymy and hyponymy) statistically and experimentally and train some self-supervised models (SVMs and CRFs) for synonymy extraction, concept-subconcept relation extraction, and concept-instance relation extraction; the advantages of our methods are that all training examples are automatically generated from the structural information of encyclopedias and a few general heuristic rules. Finally, we evaluate SSCO in two aspects, scale and precision; manual evaluation results show that the ontology has excellent precision, and high coverage is concluded by comparing SSCO with other famous ontologies and knowledge bases; the experiment results also indicate that the self-supervised models obviously enrich SSCO. PMID:24715819

  15. iADRs: towards online adverse drug reaction analysis.

    PubMed

    Lin, Wen-Yang; Li, He-Yi; Du, Jhih-Wei; Feng, Wen-Yu; Lo, Chiao-Feng; Soo, Von-Wun

    2012-12-01

    Adverse Drug Reaction (ADR) is one of the most important issues in the assessment of drug safety. In fact, many adverse drug reactions are not discovered during limited pre-marketing clinical trials; instead, they are only observed after long term post-marketing surveillance of drug usage. In light of this, the detection of adverse drug reactions, as early as possible, is an important topic of research for the pharmaceutical industry. Recently, large numbers of adverse events and the development of data mining technology have motivated the development of statistical and data mining methods for the detection of ADRs. These stand-alone methods, with no integration into knowledge discovery systems, are tedious and inconvenient for users, and the exploration process is time-consuming. This paper proposes an interactive system platform for the detection of ADRs. By integrating an ADR data warehouse and innovative data mining techniques, the proposed system not only supports OLAP style multidimensional analysis of ADRs, but also allows the interactive discovery of associations between drugs and symptoms, called a drug-ADR association rule, which can be further developed using other factors of interest to the user, such as demographic information. The experiments indicate that interesting and valuable drug-ADR association rules can be efficiently mined.
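    The notion of a drug-ADR association rule can be made concrete with a minimal support/confidence computation over spontaneous reports. The record structure and thresholds below are illustrative assumptions, not the iADRs implementation.

```python
from collections import Counter

def drug_adr_rules(reports, min_support=0.001, min_confidence=0.2):
    """Mine simple drug -> reaction rules from adverse-event reports.

    reports : list of dicts like {"drugs": {"drugA", ...}, "reactions": {"nausea", ...}}
              (hypothetical record layout for the example).
    Returns (drug, reaction, support, confidence) tuples passing both thresholds.
    """
    n = len(reports)
    drug_count, pair_count = Counter(), Counter()
    for r in reports:
        for d in r["drugs"]:
            drug_count[d] += 1
            for adr in r["reactions"]:
                pair_count[(d, adr)] += 1

    rules = []
    for (d, adr), c in pair_count.items():
        support = c / n                 # fraction of all reports containing the pair
        confidence = c / drug_count[d]  # fraction of the drug's reports with this reaction
        if support >= min_support and confidence >= min_confidence:
            rules.append((d, adr, support, confidence))
    return sorted(rules, key=lambda r: -r[3])
```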

  16. Research on the Integration of Bionic Geometry Modeling and Simulation of Robot Foot Based on Characteristic Curve

    NASA Astrophysics Data System (ADS)

    He, G.; Zhu, H.; Xu, J.; Gao, K.; Zhu, D.

    2017-09-01

    The bionic research of shape is an important aspect of the research on bionic robots, and its implementation cannot be separated from the shape modeling and numerical simulation of the bionic object, which is tedious and time-consuming. In order to improve the efficiency of shape bionic design, the feet of animals living in soft soil and swamp environments are taken as bionic objects, and characteristic skeleton curves, section curves, joint rotation variables, positions and other parameters are used to describe the shape and position information of the bionic object's sole, toes and flipper. The geometry modeling of the bionic object is established by using the parameterization of characteristic curves and variables. Based on this, an integration framework of parametric modeling, finite element modeling, dynamic analysis and post-processing of the sinking process in soil is proposed in this paper. Examples of a bionic ostrich foot and a bionic duck foot are also given. The parametric modeling and integration technique can achieve rapid, improved design based on the bionic object, greatly improve the efficiency and quality of robot foot bionic design, and has important practical significance for improving the level of bionic design of the robot foot's shape and structure.

  17. A novel multiplex PCR for the simultaneous detection of Salmonella enterica and Shigella species.

    PubMed

    Radhika, M; Saugata, Majumder; Murali, H S; Batra, H V

    2014-01-01

    Salmonella enterica and Shigella species are commonly associated with food and water borne infections leading to gastrointestinal diseases. The present work was undertaken to develop a sensitive and reliable PCR based detection system for simultaneous detection of Salmonella enterica and Shigella at species level. For this, the conserved regions of specific genes, namely ipaH1, ipaH, wbgZ, wzy and invA, were targeted for detection of the Shigella genus, S. flexneri, S. sonnei, S. boydii and Salmonella enterica, respectively, along with an internal amplification control (IAC). The results showed that twenty Salmonella and eleven Shigella spp. were accurately identified by the assay without showing non-specificity against closely related Enterobacteriaceae organisms or other pathogens. Further evaluation of the multiplex PCR was undertaken on 50 natural samples of chicken, eggs and poultry litter, and the results were compared with the conventional culture isolation and identification procedure. The multiplex PCR identified the presence of Salmonella and Shigella strains with a short pre-enrichment step of 5 h in peptone water, and the same samples were processed by conventional procedures for comparison. Therefore, this reported multiplex PCR can serve as an alternative to the tedious, time-consuming procedure of culture and identification in food safety laboratories.

  18. Self-supervised Chinese ontology learning from online encyclopedias.

    PubMed

    Hu, Fanghuai; Shao, Zhiqing; Ruan, Tong

    2014-01-01

    Constructing ontology manually is a time-consuming, error-prone, and tedious task. We present SSCO, a self-supervised learning based Chinese ontology, which contains about 255 thousand concepts, 5 million entities, and 40 million facts. We explore the three largest online Chinese encyclopedias for ontology learning and describe how to transfer the structured knowledge in encyclopedias, including article titles, category labels, redirection pages, taxonomy systems, and InfoBox modules, into ontological form. In order to avoid the errors in encyclopedias and enrich the learnt ontology, we also apply some machine learning based methods. First, we prove that the self-supervised machine learning method is practicable in Chinese relation extraction (at least for synonymy and hyponymy) statistically and experimentally and train some self-supervised models (SVMs and CRFs) for synonymy extraction, concept-subconcept relation extraction, and concept-instance relation extraction; the advantages of our methods are that all training examples are automatically generated from the structural information of encyclopedias and a few general heuristic rules. Finally, we evaluate SSCO in two aspects, scale and precision; manual evaluation results show that the ontology has excellent precision, and high coverage is concluded by comparing SSCO with other famous ontologies and knowledge bases; the experiment results also indicate that the self-supervised models obviously enrich SSCO.

  19. Unified Theory for Decoding the Signals from X-Ray Fluorescence and X-Ray Diffraction of Mixtures.

    PubMed

    Chung, Frank H

    2017-05-01

    For research and development or for solving technical problems, we often need to know the chemical composition of an unknown mixture, which is coded and stored in the signals of its X-ray fluorescence (XRF) and X-ray diffraction (XRD). X-ray fluorescence gives chemical elements, whereas XRD gives chemical compounds. The major problem in XRF and XRD analyses is the complex matrix effect. The conventional technique to deal with the matrix effect is to construct empirical calibration lines with standards for each element or compound sought, which is tedious and time-consuming. A unified theory of quantitative XRF analysis is presented here. The idea is to cancel the matrix effect mathematically. It turns out that the decoding equation for quantitative XRF analysis is identical to that for quantitative XRD analysis although the physics of XRD and XRF are fundamentally different. The XRD work has been published and practiced worldwide. The unified theory derives a new intensity-concentration equation of XRF, which is free from the matrix effect and valid for a wide range of concentrations. The linear decoding equation establishes a constant slope for each element sought, hence eliminating the work on calibration lines. The simple linear decoding equation has been verified by 18 experiments.

  20. MOWServ: a web client for integration of bioinformatic resources.

    PubMed

    Ramírez, Sergio; Muñoz-Mérida, Antonio; Karlsson, Johan; García, Maximiliano; Pérez-Pulido, Antonio J; Claros, M Gonzalo; Trelles, Oswaldo

    2010-07-01

    The productivity of any scientist is affected by cumbersome, tedious and time-consuming tasks that try to make the heterogeneous web services compatible so that they can be useful in their research. MOWServ, the bioinformatic platform offered by the Spanish National Institute of Bioinformatics, was released to provide integrated access to databases and analytical tools. Since its release, the number of available services has grown dramatically, and it has become one of the main contributors of registered services in the EMBRACE Biocatalogue. The ontology that enables most of the web-service compatibility has been curated, improved and extended. The service discovery has been greatly enhanced by Magallanes software and biodataSF. User data are securely stored on the main server by an authentication protocol that enables the monitoring of current or already-finished user's tasks, as well as the pipelining of successive data processing services. The BioMoby standard has been greatly extended with the new features included in the MOWServ, such as management of additional information (metadata such as extended descriptions, keywords and datafile examples), a qualified registry, error handling, asynchronous services and service replication. All of them have increased the MOWServ service quality, usability and robustness. MOWServ is available at http://www.inab.org/MOWServ/ and has a mirror at http://www.bitlab-es.com/MOWServ/.

  1. Spectral unmixing of urban land cover using a generic library approach

    NASA Astrophysics Data System (ADS)

    Degerickx, Jeroen; Lordache, Marian-Daniel; Okujeni, Akpona; Hermy, Martin; van der Linden, Sebastian; Somers, Ben

    2016-10-01

    Remote sensing based land cover classification in urban areas generally requires the use of subpixel classification algorithms to take into account the high spatial heterogeneity. These spectral unmixing techniques often rely on spectral libraries, i.e. collections of pure material spectra (endmembers, EM), which ideally cover the large EM variability typically present in urban scenes. Despite the advent of several (semi-) automated EM detection algorithms, the collection of such image-specific libraries remains a tedious and time-consuming task. As an alternative, we suggest the use of a generic urban EM library, containing material spectra under varying conditions, acquired from different locations and sensors. This approach requires an efficient EM selection technique, capable of only selecting those spectra relevant for a specific image. In this paper, we evaluate and compare the potential of different existing library pruning algorithms (Iterative Endmember Selection and MUSIC) using simulated hyperspectral (APEX) data of the Brussels metropolitan area. In addition, we develop a new hybrid EM selection method which is shown to be highly efficient in dealing with both image-specific and generic libraries, subsequently yielding more robust land cover classification results compared to existing methods. Future research will include further optimization of the proposed algorithm and additional tests on both simulated and real hyperspectral data.
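    Once a subset of endmembers has been selected from the generic library, per-pixel abundances are typically estimated by constrained least squares. The sketch below uses plain non-negative least squares with a post-hoc sum-to-one normalization as a stand-in for fully constrained unmixing; it is a generic illustration, not the hybrid selection method proposed in the paper.

```python
import numpy as np
from scipy.optimize import nnls

def unmix_pixel(pixel, endmembers):
    """Approximate constrained unmixing of one pixel spectrum.

    pixel      : (n_bands,) reflectance spectrum.
    endmembers : (n_bands, n_em) pruned library of endmember spectra.
    Returns abundances (non-negative, renormalized to sum to 1).
    """
    abund, _ = nnls(endmembers, pixel)        # enforces non-negativity
    s = abund.sum()
    return abund / s if s > 0 else abund      # approximate sum-to-one constraint

def unmix_image(cube, endmembers):
    """cube: (rows, cols, n_bands) hyperspectral image; returns abundance maps."""
    rows, cols, _ = cube.shape
    out = np.zeros((rows, cols, endmembers.shape[1]))
    for i in range(rows):
        for j in range(cols):
            out[i, j] = unmix_pixel(cube[i, j], endmembers)
    return out
```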

  2. Automatic abdominal multi-organ segmentation using deep convolutional neural network and time-implicit level sets.

    PubMed

    Hu, Peijun; Wu, Fa; Peng, Jialin; Bao, Yuanyuan; Chen, Feng; Kong, Dexing

    2017-03-01

    Multi-organ segmentation from CT images is an essential step for computer-aided diagnosis and surgery planning. However, manual delineation of the organs by radiologists is tedious, time-consuming and poorly reproducible. Therefore, we propose a fully automatic method for the segmentation of multiple organs from three-dimensional abdominal CT images. The proposed method employs deep fully convolutional neural networks (CNNs) for organ detection and segmentation, which is further refined by a time-implicit multi-phase evolution method. Firstly, a 3D CNN is trained to automatically localize and delineate the organs of interest with a probability prediction map. The learned probability map provides both subject-specific spatial priors and initialization for subsequent fine segmentation. Then, for the refinement of the multi-organ segmentation, image intensity models, probability priors as well as a disjoint region constraint are incorporated into a unified energy functional. Finally, a novel time-implicit multi-phase level-set algorithm is utilized to efficiently optimize the proposed energy functional model. Our method has been evaluated on 140 abdominal CT scans for the segmentation of four organs (liver, spleen and both kidneys). With respect to the ground truth, average Dice overlap ratios for the liver, spleen and both kidneys are 96.0, 94.2 and 95.4%, respectively, and average symmetric surface distance is less than 1.3 mm for all the segmented organs. The computation time for a CT volume is 125 s on average. The achieved accuracy compares well to state-of-the-art methods with much higher efficiency. A fully automatic method for multi-organ segmentation from abdominal CT images was developed and evaluated. The results demonstrated its potential in clinical usage with high effectiveness, robustness and efficiency.
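    The reported accuracy metric, the Dice overlap ratio, can be computed directly from predicted and ground-truth label volumes; a minimal sketch follows, where the organ label values are assumptions made for the example.

```python
import numpy as np

def dice_per_organ(pred, truth, organ_labels=(1, 2, 3, 4)):
    """Dice overlap ratio per organ label between two integer label volumes."""
    scores = {}
    for lab in organ_labels:
        p, t = (pred == lab), (truth == lab)
        denom = p.sum() + t.sum()
        # 2 * |intersection| / (|pred| + |truth|); NaN if the organ is absent in both
        scores[lab] = 2.0 * np.logical_and(p, t).sum() / denom if denom else np.nan
    return scores
```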

  3. Development of a Dependency Theory Toolbox for Database Design.

    DTIC Science & Technology

    1987-12-01

    Much of the theory needed to design and study relational databases exists in the form of published algorithms and theorems. However, hand simulating these algorithms can be a tedious and error-prone chore. Therefore, a toolbox of algorithms and...

  4. First Time Rapid and Accurate Detection of Massive Number of Metal Absorption Lines in the Early Universe Using Deep Neural Network

    NASA Astrophysics Data System (ADS)

    Zhao, Yinan; Ge, Jian; Yuan, Xiaoyong; Li, Xiaolin; Zhao, Tiffany; Wang, Cindy

    2018-01-01

    Metal absorption line systems in distant quasar spectra have been used as one of the most powerful tools to probe gas content in the early Universe. The MgII λλ 2796, 2803 doublet is one of the most popular metal absorption lines and has been used to trace gas and global star formation at redshifts between ~0.5 and 2.5. In the past, machine learning algorithms such as Principal Component Analysis, Gaussian Processes and decision trees have been used to detect absorption line systems in large sky surveys, but the overall detection process is not only complicated but also time consuming. It usually takes a few months to go through the entire quasar spectral dataset from each Sloan Digital Sky Survey (SDSS) data release. In this work, we applied deep neural network, or "deep learning", algorithms to the most recent SDSS DR14 quasar spectra and were able to randomly search 20000 quasar spectra and detect 2887 strong Mg II absorption features in just 9 seconds. Our detection algorithms were verified with previously released DR12 and DR7 data and published Mg II catalogs, and the detection accuracy is 90%. This is the first time that a deep neural network has demonstrated its promise in both speed and accuracy in replacing tedious, repetitive human work in searching for narrow absorption patterns in a big dataset. We will present our detection algorithms and also statistical results of the newly detected Mg II absorption lines.

  5. Potential effectiveness of visible and near infrared spectroscopy coupled with wavelength selection for real time grapevine leaf water status measurement.

    PubMed

    Giovenzana, Valentina; Beghi, Roberto; Parisi, Simone; Brancadoro, Lucio; Guidetti, Riccardo

    2018-03-01

    Increasing attention is being paid to non-destructive methods for real-time monitoring of water status as a potential solution to replace the tedious conventional techniques, which are time consuming and not easy to perform directly in the field. The objective of this study was to test the potential effectiveness of two portable optical devices (visible/near infrared (vis/NIR) and near infrared (NIR) spectrophotometers) for the rapid and non-destructive evaluation of the water status of grapevine leaves. Moreover, a variable selection methodology was proposed to determine a set of candidate variables for the prediction of water potential (Ψ, MPa) related to leaf water status in view of a simplified optical device. The statistics of the partial least squares (PLS) models showed, in validation, R² between 0.67 and 0.77 for models arising from vis/NIR spectra, and R² ranging from 0.77 to 0.85 for the NIR region. The overall performance of the multiple linear regression (MLR) models from selected wavelengths was slightly worse than that of the PLS models. Regarding the NIR range, acceptable MLR models were obtained using only 14 effective variables (R² range 0.63-0.69). To address the market demand for portable optical devices and heading towards the trend of miniaturization and low cost of the devices, individual wavelengths could be useful for the design of a simplified and low-cost handheld system providing useful information for better irrigation scheduling. © 2017 Society of Chemical Industry.
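    A generic PLS calibration of water potential against spectra, in the spirit of the models described, can be set up as below. The arrays are random placeholders standing in for real spectra and pressure-chamber measurements, and the number of latent variables is an assumption, not the study's setting.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: (n_leaves, n_wavelengths) reflectance spectra; y: water potential in MPa.
# Synthetic placeholders only, for demonstration.
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 256))
y = rng.normal(loc=-1.0, scale=0.3, size=120)

pls = PLSRegression(n_components=8)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()   # cross-validated predictions
ss_res = np.sum((y - y_cv) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print(f"cross-validated R2: {1 - ss_res / ss_tot:.2f}")
```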

  6. Combining user logging with eye tracking for interactive and dynamic applications.

    PubMed

    Ooms, Kristien; Coltekin, Arzu; De Maeyer, Philippe; Dupont, Lien; Fabrikant, Sara; Incoul, Annelies; Kuhn, Matthias; Slabbinck, Hendrik; Vansteenkiste, Pieter; Van der Haegen, Lise

    2015-12-01

    User evaluations of interactive and dynamic applications face various challenges related to the active nature of these displays. For example, users can often zoom and pan on digital products, and these interactions cause changes in the extent and/or level of detail of the stimulus. Therefore, in eye tracking studies, when a user's gaze is at a particular screen position (gaze position) over a period of time, the information contained in this particular position may have changed. Such digital activities are commonplace in modern life, yet it has been difficult to automatically compare the changing information at the viewed position, especially across many participants. Existing solutions typically involve tedious and time-consuming manual work. In this article, we propose a methodology that can overcome this problem. By combining eye tracking with user logging (mouse and keyboard actions) on cartographic products, we are able to accurately reference screen coordinates to geographic coordinates. This referencing approach allows researchers to know which geographic object (location or attribute) corresponds to the gaze coordinates at all times. We tested the proposed approach through two case studies, and discuss the advantages and disadvantages of the applied methodology. Furthermore, the applicability of the proposed approach is discussed with respect to other fields of research that use eye tracking, namely marketing, sports and movement sciences, and experimental psychology. From these case studies and discussions, we conclude that combining eye tracking and user-logging data is an essential step forward in efficiently studying user behavior with interactive and static stimuli in multiple research fields.
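    The key step the logging enables is mapping a gaze position in screen pixels to map coordinates using the pan/zoom state recorded at the same timestamp. Below is a minimal sketch under the assumption that the logged state describes the map extent currently shown on screen; the field names are illustrative and not taken from the authors' software.

```python
def screen_to_map(gaze_x, gaze_y, viewport):
    """Convert a gaze point in screen pixels to map coordinates.

    viewport : dict logged at the gaze timestamp, e.g.
               {"min_x": ..., "max_x": ..., "min_y": ..., "max_y": ...,
                "width_px": 1920, "height_px": 1080}
               describing the map extent currently shown on screen (assumed layout).
    """
    fx = gaze_x / viewport["width_px"]
    fy = gaze_y / viewport["height_px"]
    map_x = viewport["min_x"] + fx * (viewport["max_x"] - viewport["min_x"])
    # screen y grows downward while map y grows upward, hence the flip
    map_y = viewport["max_y"] - fy * (viewport["max_y"] - viewport["min_y"])
    return map_x, map_y
```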

  7. A self-taught artificial agent for multi-physics computational model personalization.

    PubMed

    Neumann, Dominik; Mansi, Tommaso; Itu, Lucian; Georgescu, Bogdan; Kayvanpour, Elham; Sedaghat-Hamedani, Farbod; Amr, Ali; Haas, Jan; Katus, Hugo; Meder, Benjamin; Steidl, Stefan; Hornegger, Joachim; Comaniciu, Dorin

    2016-12-01

    Personalization is the process of fitting a model to patient data, a critical step towards application of multi-physics computational models in clinical practice. Designing robust personalization algorithms is often a tedious, time-consuming, model- and data-specific process. We propose to use artificial intelligence concepts to learn this task, inspired by how human experts manually perform it. The problem is reformulated in terms of reinforcement learning. In an off-line phase, Vito, our self-taught artificial agent, learns a representative decision process model through exploration of the computational model: it learns how the model behaves under change of parameters. The agent then automatically learns an optimal strategy for on-line personalization. The algorithm is model-independent; applying it to a new model requires only adjusting a few hyper-parameters of the agent and defining the observations to match. The full knowledge of the model itself is not required. Vito was tested in a synthetic scenario, showing that it could learn how to optimize cost functions generically. Then Vito was applied to the inverse problem of cardiac electrophysiology and the personalization of a whole-body circulation model. The obtained results suggested that Vito could achieve equivalent, if not better, goodness of fit than standard methods, while being more robust (up to 11% higher success rates) and with a faster (up to seven times) convergence rate. Our artificial intelligence approach could thus make personalization algorithms generalizable and self-adaptable to any patient and any model. Copyright © 2016. Published by Elsevier B.V.

  8. Micro-computed tomography in murine models of cerebral cavernous malformations as a paradigm for brain disease.

    PubMed

    Girard, Romuald; Zeineddine, Hussein A; Orsbon, Courtney; Tan, Huan; Moore, Thomas; Hobson, Nick; Shenkar, Robert; Lightle, Rhonda; Shi, Changbin; Fam, Maged D; Cao, Ying; Shen, Le; Neander, April I; Rorrer, Autumn; Gallione, Carol; Tang, Alan T; Kahn, Mark L; Marchuk, Douglas A; Luo, Zhe-Xi; Awad, Issam A

    2016-09-15

    Cerebral cavernous malformations (CCMs) are hemorrhagic brain lesions, where murine models allow major mechanistic discoveries, ushering in genetic manipulations and preclinical assessment of therapies. Histology for lesion counting and morphometry is essential yet tedious and time consuming. We herein describe the application and validations of X-ray micro-computed tomography (micro-CT), a non-destructive technique allowing three-dimensional CCM lesion count and volumetric measurements, in transgenic murine brains. We hereby describe a new contrast soaking technique not previously applied to murine models of CCM disease. A volumetric segmentation and image processing paradigm allowed for histologic correlations and quantitative validations not previously reported with the micro-CT technique in brain vascular disease. Twenty-two hyper-dense areas on micro-CT images, identified as CCM lesions, were matched by histology. The inter-rater reliability analysis showed strong consistency in the CCM lesion identification and staging (K=0.89, p<0.0001) between the two techniques. Micro-CT revealed a 29% greater CCM lesion detection efficiency, and 80% improved time efficiency. Serial integrated lesional area by histology showed a strong positive correlation with micro-CT estimated volume (r² = 0.84, p<0.0001). Micro-CT allows high throughput assessment of lesion count and volume in pre-clinical murine models of CCM. This approach complements histology with improved accuracy and efficiency, and can be applied for lesion burden assessment in other brain diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Human T cells monitored by impedance spectrometry using field-effect transistor arrays: a novel tool for single-cell adhesion and migration studies.

    PubMed

    Law, Jessica Ka Yan; Susloparova, Anna; Vu, Xuan Thang; Zhou, Xiao; Hempel, Felix; Qu, Bin; Hoth, Markus; Ingebrandt, Sven

    2015-05-15

    Cytotoxic T lymphocytes (CTLs) play an important role in the immune system by recognizing and eliminating pathogen-infected and tumorigenic cells. In order to achieve their function, T cells have to migrate throughout the whole body and identify the respective targets. In conventional immunology studies, interactions between CTLs and targets are usually investigated using tedious and time-consuming immunofluorescence imaging. However, there is currently no straightforward measurement tool available to examine the interaction strengths. In the present study, adhesion strengths and migration of single human CD8+ T cells on pre-coated field-effect transistor (FET) devices (i.e. fibronectin, anti-CD3 antibody, and anti-LFA-1 antibody) were measured using impedance spectroscopy. Adhesion strengths to different protein and antibody coatings were compared. By fitting the data to an electronically equivalent circuit model, cell-related parameters (cell membrane capacitance referring to cell morphology and seal resistance referring to adhesion strength) were obtained. This electronically-assessed adhesion strength provides a novel, fast, and important index describing the interaction efficiency. Furthermore, the size of our detection transistor gates as well as their sensitivity reaches down to single cell resolution. Real-time motions of individually migrating T cells can be traced using our FET devices. The in-house fabricated FETs used in the present study provide novel and very efficient insight into individual cell interactions. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Solid-phase reductive amination for glycomic analysis.

    PubMed

    Jiang, Kuan; Zhu, He; Xiao, Cong; Liu, Ding; Edmunds, Garrett; Wen, Liuqing; Ma, Cheng; Li, Jing; Wang, Peng George

    2017-04-15

    Reductive amination is an indispensable method for glycomic analysis, as it tremendously facilitates glycan characterization and quantification by coupling functional tags at the reducing ends of glycans. However, the traditional in-solution derivatization-based approach for the preparation of reductively aminated glycans is quite tedious and time-consuming. Here, a simpler and more efficient strategy termed solid-phase reductive amination was investigated. The general concept underlying this new approach is to streamline glycan extraction, derivatization, and purification on non-porous graphitized carbon sorbents. Neutral and sialylated standard glycans were utilized to test the feasibility of the solid-phase method. As a result, almost complete labeling of those glycans with four common labels, aniline, 2-aminobenzamide (2-AB), 2-aminobenzoic acid (2-AA) and 2-amino-N-(2-aminoethyl)-benzamide (AEAB), was obtained, and negligible desialylation occurred during sample preparation. The labeled glycans derived from glycoproteins showed excellent reproducibility in high performance liquid chromatography (HPLC) and matrix assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) analysis. Direct comparisons based on fluorescent absorbance and relative quantification using isotopic labeling demonstrated that the solid-phase strategy enabled a 20-30% increase in sample recovery. In short, the solid-phase strategy is simple, reproducible, efficient, and sensitive for glycan analysis. This method was also successfully applied for N-glycan profiling of HEK 293 cells with MALDI-TOF MS, showing its attractive application in the high-throughput analysis of the mammalian glycome. Published by Elsevier B.V.

  11. Recognising safety critical events: can automatic video processing improve naturalistic data analyses?

    PubMed

    Dozza, Marco; González, Nieves Pañeda

    2013-11-01

    New trends in research on traffic accidents include Naturalistic Driving Studies (NDS). NDS are based on large scale data collection of driver, vehicle, and environment information in the real world. NDS data sets have proven to be extremely valuable for the analysis of safety critical events such as crashes and near crashes. However, finding safety critical events in NDS data is often difficult and time consuming. Safety critical events are currently identified using kinematic triggers, for instance searching for deceleration below a certain threshold signifying harsh braking. Due to the low sensitivity and specificity of this filtering procedure, manual review of video data is currently necessary to decide whether the events identified by the triggers are actually safety critical. Such a reviewing procedure is based on subjective decisions, is expensive and time consuming, and is often tedious for the analysts. Furthermore, since NDS data is exponentially growing over time, this reviewing procedure may not be viable anymore in the very near future. This study tested the hypothesis that automatic processing of driver video information could increase the correct classification of safety critical events from kinematic triggers in naturalistic driving data. Review of about 400 video sequences recorded from the events, collected by 100 Volvo cars in the euroFOT project, suggested that drivers' individual reactions may be the key to recognizing safety critical events. In fact, whether an event is safety critical or not often depends on the individual driver. A few algorithms, able to automatically classify driver reaction from video data, have been compared. The results presented in this paper show that the state-of-the-art subjective review procedures to identify safety critical events from NDS can benefit from automated objective video processing. In addition, this paper discusses the major challenges in making such video analysis viable for future NDS and new potential applications for NDS video processing. As new NDS such as SHRP2 are now providing the equivalent of five years of one vehicle data each day, the development of new methods, such as the one proposed in this paper, seems necessary to guarantee that these data can actually be analysed. Copyright © 2013 Elsevier Ltd. All rights reserved.
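    A kinematic trigger of the kind described can be as simple as flagging windows where longitudinal deceleration crosses a threshold and passing the surrounding video to review. The threshold, sampling rate, and merging gap below are illustrative assumptions, not euroFOT settings.

```python
import numpy as np

def harsh_braking_events(speed_mps, fs=10.0, decel_threshold=-4.0, min_gap_s=5.0):
    """Flag candidate safety-critical events from a vehicle speed trace.

    speed_mps       : (N,) vehicle speed in m/s sampled at fs Hz.
    decel_threshold : acceleration (m/s^2) below which a sample is flagged.
    min_gap_s       : flags closer together than this are merged into one event.
    Returns a list of (start_time_s, end_time_s) candidate events.
    """
    accel = np.gradient(speed_mps) * fs          # finite-difference acceleration
    flagged = np.where(accel < decel_threshold)[0]
    events = []
    for idx in flagged:
        t = idx / fs
        if events and t - events[-1][1] < min_gap_s:
            events[-1][1] = t                    # extend the current event
        else:
            events.append([t, t])                # start a new event
    return [tuple(e) for e in events]
```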

  12. Morphing the feature-based multi-blocks of normative/healthy vertebral geometries to scoliosis vertebral geometries: development of personalized finite element models.

    PubMed

    Hadagali, Prasannaah; Peters, James R; Balasubramanian, Sriram

    2018-03-01

    Personalized Finite Element (FE) models and hexahedral elements are preferred for biomechanical investigations. Feature-based multi-block methods are used to develop anatomically accurate personalized FE models with hexahedral meshes. It is tedious to manually construct multi-blocks for a large number of geometries on an individual basis to develop personalized FE models. The mesh-morphing method mitigates the aforementioned tediousness in meshing personalized geometries every time, but leads to element warping and loss of geometrical data. Such issues increase in magnitude when a normative spine FE model is morphed to a scoliosis-affected spinal geometry. The only way to bypass the issue of hex-mesh distortion or loss of geometry as a result of morphing is to rely on manually constructing the multi-blocks for the scoliosis-affected spine geometry of each individual, which is time intensive. A method to semi-automate the construction of multi-blocks on the geometry of scoliosis vertebrae from the existing multi-blocks of normative vertebrae is demonstrated in this paper. High-quality hexahedral elements were generated on the scoliosis vertebrae from the morphed multi-blocks of normative vertebrae. The time taken was 3 months to construct the multi-blocks for the normative spine and less than a day for the scoliosis spine. The effort needed to construct multi-blocks on personalized scoliosis spinal geometries is significantly reduced by morphing existing multi-blocks.

  13. An Algorithm to Automatically Generate the Combinatorial Orbit Counting Equations

    PubMed Central

    Melckenbeeck, Ine; Audenaert, Pieter; Michoel, Tom; Colle, Didier; Pickavet, Mario

    2016-01-01

    Graphlets are small subgraphs, usually containing up to five vertices, that can be found in a larger graph. Identification of the graphlets that a vertex in an explored graph touches can provide useful information about the local structure of the graph around that vertex. Actually finding all graphlets in a large graph can be time-consuming, however. As the graphlets grow in size, more different graphlets emerge and the time needed to find each graphlet also scales up. If it is not needed to find each instance of each graphlet, but knowing the number of graphlets touching each node of the graph suffices, the problem is less hard. Previous research shows a way to simplify counting the graphlets: instead of looking for the graphlets needed, smaller graphlets are searched, as well as the number of common neighbors of vertices. Solving a system of equations then gives the number of times a vertex is part of each graphlet of the desired size. However, until now, equations only exist to count graphlets with 4 or 5 nodes. In this paper, two new techniques are presented. The first allows the required equations to be generated automatically. This eliminates the tedious work needed to do so manually each time an extra node is added to the graphlets. The technique is independent of the number of nodes in the graphlets and can thus be used to count larger graphlets than previously possible. The second technique gives all graphlets a unique ordering which is easily extended to name graphlets of any size. Both techniques were used to generate equations to count graphlets with 4, 5 and 6 vertices, which extends all previous results. Code can be found at https://github.com/IneMelckenbeeck/equation-generator and https://github.com/IneMelckenbeeck/graphlet-naming. PMID:26797021
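    The approach rests on cheap per-node counts (degree, triangles, common neighbours) that feed a linear system whose solution gives the larger graphlet counts. The sketch below computes only those cheap per-node quantities with networkx; generating and solving the orbit-counting equations themselves is what the paper automates and is not reproduced here.

```python
from itertools import combinations
import networkx as nx

def node_orbit_counts(G):
    """Per-node counts of a few small-graphlet orbits (degree, triangles, open wedges)."""
    counts = {}
    for v in G:
        deg = G.degree(v)
        # triangles containing v: neighbour pairs that are themselves connected
        tri = sum(1 for a, b in combinations(G[v], 2) if G.has_edge(a, b))
        wedge = deg * (deg - 1) // 2 - tri   # open paths of length 2 centred on v
        counts[v] = {"degree": deg, "triangles": tri, "wedges_centre": wedge}
    return counts

G = nx.karate_club_graph()
print(node_orbit_counts(G)[0])
```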

  14. Giving students a taste of research

    NASA Astrophysics Data System (ADS)

    Thoennessen, Michael

    2008-02-01

    When I was studying physics at the University of Cologne, Germany - admittedly a fairly long time ago - I once carried out an experiment that involved counting hundreds of pendulum oscillations. Using just a stopwatch to roughly measure the period of the oscillation we determined g, the gravitational acceleration on Earth. It was rote, tedious and only roughly accurate work. It was not an experience that inspired me to pursue a career in physics.

  15. Manufacturing Methods & Technology Project Execution Report. First CY 83.

    DTIC Science & Technology

    1983-11-01

    OCCURRENCE. H 83 5180 MMT FOR METAL DEWAR AND UNBONDED LEADS: THE GOLD WIRE BONDED CONNECTIONS ARE MADE BY HAND, WHICH IS A TEDIOUS AND EXPENSIVE PROCESS. THE... ATTACHMENTS: CURRENT FILAMENT WOUND COMPOSITE ROCKET MOTOR CASES REQUIRE FORGED METAL POLE PIECES, NOZZLE CLOSURE ATTACHMENT RINGS, AND OTHER ATTACHMENT RINGS... ELASTOMER INSULATOR PROCESS: LARGE TACTICAL ROCKET MOTOR INSULATORS ARE COSTLY, LACK DESIGN CHANGE FLEXIBILITY, AND SUFFER LONG LEAD TIMES. CURRENT

  16. The Academic Diligence Task (ADT): Assessing Individual Differences in Effort on Tedious but Important Schoolwork

    PubMed Central

    Galla, Brian M.; Plummer, Benjamin D.; White, Rachel E.; Meketon, David; D’Mello, Sidney K.; Duckworth, Angela L.

    2014-01-01

    The current study reports on the development and validation of the Academic Diligence Task (ADT), designed to assess the tendency to expend effort on academic tasks which are tedious in the moment but valued in the long-term. In this novel online task, students allocate their time between solving simple math problems (framed as beneficial for problem solving skills) and, alternatively, playing Tetris or watching entertaining videos. Using a large sample of high school seniors (N = 921), the ADT demonstrated convergent validity with self-report ratings of Big Five conscientiousness and its facets, self-control and grit, as well as discriminant validity from theoretically unrelated constructs, such as Big Five extraversion, openness, and emotional stability, test anxiety, life satisfaction, and positive and negative affect. The ADT also demonstrated incremental predictive validity for objectively measured GPA, standardized math and reading achievement test scores, high school graduation, and college enrollment, over and beyond demographics and intelligence. Collectively, findings suggest the feasibility of online behavioral measures to assess noncognitive individual differences that predict academic outcomes. PMID:25258470

  17. The Academic Diligence Task (ADT): Assessing Individual Differences in Effort on Tedious but Important Schoolwork.

    PubMed

    Galla, Brian M; Plummer, Benjamin D; White, Rachel E; Meketon, David; D'Mello, Sidney K; Duckworth, Angela L

    2014-10-01

    The current study reports on the development and validation of the Academic Diligence Task (ADT), designed to assess the tendency to expend effort on academic tasks which are tedious in the moment but valued in the long-term. In this novel online task, students allocate their time between solving simple math problems (framed as beneficial for problem solving skills) and, alternatively, playing Tetris or watching entertaining videos. Using a large sample of high school seniors ( N = 921), the ADT demonstrated convergent validity with self-report ratings of Big Five conscientiousness and its facets, self-control and grit, as well as discriminant validity from theoretically unrelated constructs, such as Big Five extraversion, openness, and emotional stability, test anxiety, life satisfaction, and positive and negative affect. The ADT also demonstrated incremental predictive validity for objectively measured GPA, standardized math and reading achievement test scores, high school graduation, and college enrollment, over and beyond demographics and intelligence. Collectively, findings suggest the feasibility of online behavioral measures to assess noncognitive individual differences that predict academic outcomes.

  18. Strategy for robot motion and path planning in robot taping

    NASA Astrophysics Data System (ADS)

    Yuan, Qilong; Chen, I.-Ming; Lembono, Teguh Santoso; Landén, Simon Nelson; Malmgren, Victor

    2016-06-01

    Covering objects with masking tape is a common process for surface protection in operations such as spray painting, plasma spraying and shot peening. Manual taping is tedious and requires considerable effort from workers. Taping calls for a correct surface-covering strategy and proper attachment of the masking tape to protect the surface efficiently. We have introduced an automatic robot taping system consisting of a robot manipulator, a rotating platform, a 3D scanner and specially designed taping end-effectors. This paper focuses on the surface-covering strategies for different classes of geometries. Methods and corresponding taping tools are introduced for the following classes of surfaces: cylindrical/extended surfaces, freeform surfaces with no grooves, surfaces with grooves, and rotationally symmetrical surfaces. A collision avoidance algorithm is introduced for the robot taping manipulation. With further improvements in segmenting the surfaces of taped parts and in the tape cutting mechanism, the taping tools and taping methodology can be combined into a useful and practical taping package that assists humans in this tedious and time-costly work.

  19. Development and Testing of a High-Speed Real-Time Kinematic Precise DGPS Positioning System Between Two Aircraft

    DTIC Science & Technology

    2006-09-01

    work-horse for this thesis. He spent hours writing some of the more tedious code, and as much time helping me learn C++ and Linux. He was always there...compared with C++, and the need to use Linux as the operating system, the filter was coded using C++ and KDevelop [28] in SUSE LINUX Professional 9.2 [42...The driving factor for using Linux was the operating system’s ability to access the serial ports in a reliable fashion. Under the original MATLAB® and

  20. Choosing and using citation and bibliographic database software (BDS).

    PubMed

    Hernandez, David A; El-Masri, Maher M; Hernandez, Cheri Ann

    2008-01-01

    The diabetes educator/researcher is faced with a proliferation of diabetes articles in various journals, both online and in print. Keeping track of cited references and remembering how to cite the references in text and the bibliography can be a daunting task for the new researcher and a tedious task for the experienced researcher. The challenge is to find and use a technology, such as bibliographic database software (BDS), which can help to manage this information overload. This article focuses on the use of BDS for the diabetes educator who is undertaking research. BDS can help researchers access and organize literature and make literature searches more efficient and less time consuming. Moreover, the use of such programs tends to reduce errors associated with the complexity of bibliographic citations and can increase the productivity of scholarly publications. The purpose of this article is to provide an overview of BDS currently available, describe how it can be used to aid researchers in their work, and highlight the features of different programs. It is important for diabetes educators and researchers to explore the many benefits of such BDS programs and consider their use to enhance the accuracy and efficiency of accessing and citing references of their research work and publications. Armed with this knowledge, researchers will be able to make informed decisions about selecting BDS which will meet their usage requirements.

  1. Technology That's Ready and Able to Inspect Those Cables

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Attempting to locate a malfunctioning wire in a complex bundle of wires or in a cable that is concealed behind a wall is as difficult as trying to find a needle in a haystack. The result of such an effort can also be costly, time-consuming, and frustrating, whether it is the tedious process of examining cable connections for the Space Shuttle or troubleshooting a cable television hookup. Furthermore, other maintenance restrictions can compound the effort required to locate and repair a particular wiring problem. For example, on the Space Shuttle, once a repair is completed, all systems that have a wire passing through any of the connectors that were disconnected during troubleshooting are affected and, therefore, must undergo retesting, an arduous task that is completely unrelated to the original problem. In an effort to streamline wire inspection and maintenance, two contractors supporting NASA's Kennedy Space Center invented the Standing Wave Reflectometer (SWR) in 1999. In doing so, they leveraged technology that was first developed to detect problems that could lead to aircraft accidents, such as the one that resulted in the catastrophic failure of TWA flight 800 in 1996. The SWR performs a non-intrusive inspection that verifies the condition of electrical power and signal-distribution systems inside the Space Shuttle orbiters. Such testing reduces processing delays and ensures safe operation of these systems.

  2. Fluorescence molecular imaging system with a novel mouse surface extraction method and a rotary scanning scheme

    NASA Astrophysics Data System (ADS)

    Zhao, Yue; Zhu, Dianwen; Baikejiang, Reheman; Li, Changqing

    2015-03-01

    We have developed a new fluorescence molecular tomography (FMT) imaging system that uses a phase-shifting method to extract the mouse surface geometry optically and a rotary laser-scanning approach to excite fluorescent molecules and acquire fluorescence measurements over the whole mouse body. Nine fringe patterns with a phase shift of 2π/9 are projected onto the mouse surface by a projector. The fringe patterns are captured with a webcam to calculate a phase map, which our algorithms convert into the geometry of the mouse surface. We used a DigiWarp approach to warp a finite element mesh of a standard digital mouse to the measured mouse surface, thus avoiding the tedious and time-consuming conversion from a point cloud to a mesh. Experimental results indicate that the proposed method is accurate, with errors of less than 0.5 mm. In the FMT imaging system, the mouse is placed inside a conical mirror and scanned with a line-pattern laser mounted on a rotation stage. After being reflected by the conical mirror, the emitted fluorescence photons travel through the central hole of the rotation stage and the band-pass filters in a motorized filter wheel, and are collected by a CCD camera. Phantom experiments show that the proposed FMT imaging system can reconstruct the target accurately.
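
    As a hedged illustration only, the standard N-step phase-shifting estimator recovers the wrapped phase from the fringe images described above; the fringe model, the synthetic test data and the absence of unwrapping are assumptions of this sketch, not the authors' exact algorithm.

```python
# Minimal sketch of N-step phase-shifting fringe analysis (N = 9 as in the abstract).
import numpy as np

def wrapped_phase(frames):
    """frames: array of shape (N, H, W); frame n carries a phase shift of 2*pi*n/N."""
    n_steps = frames.shape[0]
    deltas = 2 * np.pi * np.arange(n_steps) / n_steps
    num = -np.tensordot(np.sin(deltas), frames, axes=(0, 0))
    den = np.tensordot(np.cos(deltas), frames, axes=(0, 0))
    return np.arctan2(num, den)          # wrapped phase in (-pi, pi]

if __name__ == "__main__":
    # synthetic check: one pixel row with a known phase ramp
    true_phase = np.linspace(0, np.pi / 2, 8).reshape(1, 8)
    shifts = 2 * np.pi * np.arange(9) / 9
    frames = 0.5 + 0.4 * np.cos(true_phase + shifts[:, None, None])
    print(np.allclose(wrapped_phase(frames), true_phase, atol=1e-6))  # True
```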

  3. Retinal blood vessel segmentation in high resolution fundus photographs using automated feature parameter estimation

    NASA Astrophysics Data System (ADS)

    Orlando, José Ignacio; Fracchia, Marcos; del Río, Valeria; del Fresno, Mariana

    2017-11-01

    Several ophthalmological and systemic diseases manifest through pathological changes in the properties and distribution of the retinal blood vessels. Characterizing such alterations requires segmentation of the vasculature, a tedious and time-consuming task that is infeasible to perform manually. Numerous automated methods have been proposed for segmenting the retinal vasculature from fundus photographs, although their application in real clinical scenarios is usually limited by their ability to deal with images taken at different resolutions. This is likely due to the large number of parameters that have to be properly calibrated for each image scale. In this paper we propose a novel strategy for automated feature parameter estimation, combined with a vessel segmentation method based on fully connected conditional random fields. The estimation model is learned by linear regression from structural properties of the images and known optimal configurations that were previously obtained for low-resolution data sets. Our experiments on high-resolution images show that this approach estimates appropriate configurations suitable for performing the segmentation task without requiring parameters to be re-engineered. Furthermore, our combined approach achieved state-of-the-art performance on the benchmark data set HRF, as measured by the F1-score and the Matthews correlation coefficient.
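
    The general idea of learning a parameter from structural image properties can be sketched as below; scikit-learn is assumed, and the feature names, numeric values and the "kernel scale" target are illustrative, not the configurations used in the paper.

```python
# Hedged sketch: a linear map from simple image properties to a segmentation
# parameter, fitted on configurations known to work for smaller images.
import numpy as np
from sklearn.linear_model import LinearRegression

# rows: [image width in pixels, field of view in degrees] (illustrative values);
# targets: a kernel-scale parameter hand-tuned for each training data set
X_train = np.array([[565, 45.0], [700, 50.0], [999, 45.0]])
y_train = np.array([2.0, 2.6, 3.5])

model = LinearRegression().fit(X_train, y_train)

# estimate a suitable parameter for a new, higher-resolution image
x_new = np.array([[3504, 60.0]])
print(float(model.predict(x_new)[0]))
```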

  4. Comparative Approach of MRI-Based Brain Tumor Segmentation and Classification Using Genetic Algorithm.

    PubMed

    Bahadure, Nilesh Bhaskarrao; Ray, Arun Kumar; Thethi, Har Pal

    2018-01-17

    The detection of a brain tumor and its classification from modern imaging modalities is a primary concern, but it is time-consuming and tedious work when performed by radiologists or clinical supervisors. The accuracy with which radiologists detect and classify tumor stages depends only on their experience, so computer-aided technology is very important for improving diagnostic accuracy. In this study, to improve the performance of tumor detection, we investigated a comparative approach across different segmentation techniques and selected the best one by comparing their segmentation scores. Further, to improve classification accuracy, a genetic algorithm is employed for automatic classification of the tumor stage. The classification decision is supported by extracting relevant features and by area calculation. The experimental results of the proposed technique are evaluated and validated for performance and quality on magnetic resonance brain images, based on segmentation score, accuracy, sensitivity, specificity, and the Dice similarity index coefficient. The experiments achieved 92.03% accuracy, 91.42% specificity, 92.36% sensitivity, and an average segmentation score between 0.82 and 0.93, demonstrating the effectiveness of the proposed technique for identifying normal and abnormal tissues in brain MR images. The experiments also yielded an average Dice similarity index coefficient of 93.79%, which indicates good overlap between the automatically extracted tumor regions and the tumor regions extracted manually by radiologists.

  5. Asynchronous data change notification between database server and accelerator controls system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, W.; Morris, J.; Nemesure, S.

    2011-10-10

    Database data change notification (DCN) is a commonly used feature. Not all database management systems (DBMS) provide an explicit DCN mechanism. Even for those DBMSs which support DCN (such as Oracle and MS SQL Server), some server-side and/or client-side programming may be required to make the DCN system work. This makes the setup of DCN between a database server and interested clients tedious and time consuming. In accelerator control systems, there are many well-established software client/server architectures (such as CDEV, EPICS, and ADO) that can be used to implement data reflection servers that transfer data asynchronously to any client using the standard SET/GET API. This paper describes a method for using such a data reflection server to set up asynchronous DCN (ADCN) between a DBMS and clients. This method works well for all DBMS systems which provide database trigger functionality. Asynchronous data change notification (ADCN) between database server and clients can be realized by combining the use of a database trigger mechanism, which is supported by major DBMS systems, with server processes that use client/server software architectures that are familiar in the accelerator controls community (such as EPICS, CDEV or ADO). This approach makes the ADCN system easy to set up and integrate into an accelerator controls system. Several ADCN systems have been set up and used in the RHIC-AGS controls system.
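
    The abstract's ADCN couples DBMS triggers to an EPICS/CDEV/ADO reflection server; as a hedged stand-in, the sketch below shows the same trigger-driven asynchronous push pattern using PostgreSQL's LISTEN/NOTIFY with psycopg2. The table name, channel name and connection string are assumptions for illustration.

```python
# Hedged sketch only: trigger-based asynchronous data change notification,
# illustrated with PostgreSQL LISTEN/NOTIFY rather than the paper's servers.
#
# Server-side trigger (run once, e.g. via psql; assumes PostgreSQL 11+):
#   CREATE FUNCTION notify_change() RETURNS trigger AS $$
#   BEGIN
#     PERFORM pg_notify('data_change', NEW.id::text);
#     RETURN NEW;
#   END; $$ LANGUAGE plpgsql;
#   CREATE TRIGGER settings_dcn AFTER INSERT OR UPDATE ON settings
#     FOR EACH ROW EXECUTE FUNCTION notify_change();
import select
import psycopg2
import psycopg2.extensions

conn = psycopg2.connect("dbname=controls user=ops")   # assumed credentials
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
cur = conn.cursor()
cur.execute("LISTEN data_change;")

print("waiting for data change notifications...")
while True:
    # block until the connection socket is readable, then drain notifications
    if select.select([conn], [], [], 10) == ([], [], []):
        continue
    conn.poll()
    while conn.notifies:
        note = conn.notifies.pop(0)
        print(f"row changed: id={note.payload}")   # forward to clients here
```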

  6. Review of in situ derivatization techniques for enhanced bioanalysis using liquid chromatography with mass spectrometry.

    PubMed

    Baghdady, Yehia Z; Schug, Kevin A

    2016-01-01

    Accurate and specific analysis of target molecules in complex biological matrices remains a significant challenge, especially when ultra-trace detection limits are required. Liquid chromatography with mass spectrometry is often the method of choice for bioanalysis. Conventional sample preparation and clean-up methods prior to the analysis of biological fluids such as liquid-liquid extraction, solid-phase extraction, or protein precipitation are time-consuming, tedious, and can negatively affect target recovery and detection sensitivity. An alternative or complementary strategy is the use of an off-line or on-line in situ derivatization technique. In situ derivatization can be incorporated to directly derivatize target analytes in their native biological matrices, without any prior sample clean-up methods, to substitute or even enhance the extraction and preconcentration efficiency of these traditional sample preparation methods. Designed appropriately, it can reduce the number of sample preparation steps necessary prior to analysis. Moreover, in situ derivatization can be used to enhance the performance of the developed liquid chromatography with mass spectrometry-based bioanalysis methods regarding stability, chromatographic separation, selectivity, and ionization efficiency. This review presents an overview of the commonly used in situ derivatization techniques coupled to liquid chromatography with mass spectrometry-based bioanalysis to guide and to stimulate future research. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. PICKY: a novel SVD-based NMR spectra peak picking method.

    PubMed

    Alipanahi, Babak; Gao, Xin; Karakoc, Emre; Donaldson, Logan; Li, Ming

    2009-06-15

    Picking peaks from experimental NMR spectra is a key unsolved problem for automated NMR protein structure determination. Such a process is a prerequisite for resonance assignment, nuclear Overhauser enhancement (NOE) distance restraint assignment, and structure calculation tasks. Manual or semi-automatic peak picking, currently the predominant approach in NMR labs, is tedious, time consuming and costly. We introduce new ideas, including noise-level estimation, component forming and sub-division, singular value decomposition (SVD)-based peak picking, and peak pruning and refinement. PICKY is developed as an automated peak picking method. Unlike previous research on peak picking, we provide a systematic study of the proposed method. PICKY is tested on 32 real 2D and 3D spectra of eight target proteins, and achieves an average of 88% recall and 74% precision. PICKY is efficient: it takes on average 15.7 s to process an NMR spectrum. More important than these numbers, PICKY actually works in practice. We feed peak lists generated by PICKY to IPASS for resonance assignment, feed IPASS assignments to SPARTA for fragment generation, and feed SPARTA fragments to FALCON for structure calculation. This results in high-resolution structures of several proteins, for example TM1112 at 1.25 Å. PICKY is available upon request. The peak lists of PICKY can be easily loaded by SPARKY to enable a better interactive strategy for rapid peak picking.

  8. Automatically Identifying and Predicting Unplanned Wind Turbine Stoppages Using SCADA and Alarms System Data: Case Study and Results

    NASA Astrophysics Data System (ADS)

    Leahy, Kevin; Gallagher, Colm; Bruton, Ken; O'Donovan, Peter; O'Sullivan, Dominic T. J.

    2017-11-01

    Using 10-minute wind turbine SCADA data for fault prediction offers an attractive way of gaining additional prognostic capabilities without needing to invest in extra hardware. To use these data-driven methods effectively, the historical SCADA data must be labelled with the periods when the turbine was in faulty operation as well as the sub-system the fault was attributed to. Manually identifying faults using maintenance logs can be effective, but it is also highly time consuming and tedious due to the disparate nature of these logs across manufacturers, operators and even individual maintenance events. Turbine alarm systems can help to identify these periods, but the sheer volume of alarms and false positives generated makes analysing them on an individual basis ineffective. In this work, we present a new method for automatically identifying historical stoppages of the turbine using SCADA and alarms data. Each stoppage is associated with either a fault in one of the turbine's sub-systems, a routine maintenance activity, a grid-related event or a number of other categories. This is then checked against maintenance logs for accuracy, and the labelled data are fed into a classifier for predicting when these stoppages will occur. Results show that the automated labelling process correctly identifies each type of stoppage and can be effectively used for SCADA-based prediction of turbine faults.

  9. Recognition of acute lymphoblastic leukemia cells in microscopic images using k-means clustering and support vector machine classifier.

    PubMed

    Amin, Morteza Moradi; Kermani, Saeed; Talebi, Ardeshir; Oghli, Mostafa Ghelich

    2015-01-01

    Acute lymphoblastic leukemia is the most common form of pediatric cancer; it is categorized into three sub-types, L1, L2, and L3, and can be detected through screening of blood and bone marrow smears by pathologists. Because this procedure is time-consuming and tedious, a computer-based system is needed for convenient detection of acute lymphoblastic leukemia. Microscopic images are acquired from blood and bone marrow smears of patients with acute lymphoblastic leukemia and of normal cases. After image preprocessing, cell nuclei are segmented by the k-means algorithm. Geometric and statistical features are then extracted from the nuclei, and finally the cells are classified into cancerous and noncancerous cells by a support vector machine classifier with 10-fold cross validation. The cells are also classified into their sub-types by a multi-class support vector machine classifier. The classifier is evaluated in terms of sensitivity, specificity, and accuracy, which are 98%, 95%, and 97%, respectively, for cancerous versus noncancerous cells. The same parameters for the sub-type classification average 84.3%, 97.3%, and 95.6%, respectively. The results show that the proposed algorithm achieves an acceptable performance for the diagnosis of acute lymphoblastic leukemia and its sub-types and can be used as an assistant diagnostic tool for pathologists.
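
    A minimal sketch of the described pipeline (k-means nucleus segmentation followed by SVM classification with 10-fold cross-validation) is shown below, assuming scikit-learn; the toy images, the two descriptors and the resulting numbers are illustrative only, not the paper's features or data.

```python
# Hedged sketch: k-means intensity segmentation + SVM with 10-fold CV on toy data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def segment_nuclei(image, k=3):
    """Cluster pixel intensities; assume the darkest cluster is the nuclei."""
    pixels = image.reshape(-1, 1).astype(float)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(pixels)
    centers = [pixels[labels == c].mean() for c in range(k)]
    return (labels == int(np.argmin(centers))).reshape(image.shape)

def features(mask):
    """Toy geometric descriptors of a segmented nucleus."""
    area = mask.sum()
    return [area, area / mask.size]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = [], []
    for label in (0, 1):                 # 1 = blast-like cell with a larger nucleus
        for _ in range(30):
            img = rng.normal(200, 10, (32, 32))
            r = 6 + 6 * label + rng.integers(0, 3)
            yy, xx = np.ogrid[:32, :32]
            img[(yy - 16) ** 2 + (xx - 16) ** 2 < r ** 2] = rng.normal(60, 5)
            X.append(features(segment_nuclei(img)))
            y.append(label)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    scores = cross_val_score(clf, np.array(X), np.array(y), cv=10)
    print("10-fold accuracy on toy data: %.2f" % scores.mean())
```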

  10. Adaptive Localization of Focus Point Regions via Random Patch Probabilistic Density from Whole-Slide, Ki-67-Stained Brain Tumor Tissue

    PubMed Central

    Alomari, Yazan M.; MdZin, Reena Rahayu

    2015-01-01

    Analysis of whole-slide tissue for digital pathology images has been clinically approved to provide a second opinion to pathologists. Localization of focus points from Ki-67-stained histopathology whole-slide tissue microscopic images is considered the first step in the process of proliferation rate estimation. Pathologists use eye pooling or eagle-view techniques to localize the highly stained, cell-concentrated regions of the whole slide under the microscope; these are called focus-point regions. This procedure leads to high inter-observer variability, involves time-consuming and tedious work, and can cause inaccurate findings. The localization of focus-point regions can be addressed as a clustering problem. This paper aims to automate the localization of focus-point regions from whole-slide images using the random patch probabilistic density (RPPD) method. Unlike other clustering methods, the random patch probabilistic density method can adaptively localize focus-point regions without predetermining the number of clusters. The proposed method was compared with the k-means and fuzzy c-means clustering methods. Our proposed method achieves good performance when the results are evaluated by three expert pathologists, with an average false-positive rate of 0.84% for the focus-point region localization error. Moreover, when RPPD was used to localize tissue from whole-slide images, 228 whole-slide images were tested and 97.3% localization accuracy was achieved. PMID:25793010

  11. Activity of the Human Rhinovirus 3C Protease Studied in Various Buffers, Additives and Detergents Solutions for Recombinant Protein Production

    PubMed Central

    Tufail, Soban; Ismat, Fouzia; Imran, Muhammad; Iqbal, Mazhar; Mirza, Osman; Rhaman, Moazur

    2016-01-01

    Proteases are widely used to remove affinity and solubility tags from recombinant proteins to avoid potential interference of these tags with the structure and function of the fusion partner. In recent years, great interest has been seen in the use of the human rhinovirus 3C protease owing to its stringent sequence specificity and enhanced activity. Like other proteases, the activity of the human rhinovirus 3C protease can be affected in part by the buffer components and additives that are generally employed for purification and stabilization of proteins, hence necessitating their removal by tedious and time-consuming procedures before proteolysis can occur. To address this issue, we examined the effect of elution buffers used for common affinity-based purifications, salt ions, stability/solubility and reducing agents, and detergents on the activity of the human rhinovirus 3C protease using three different fusion proteins at 4°C, a temperature of choice for purification of many proteins. The results show that the human rhinovirus 3C protease performs better at 4°C than the frequently used tobacco etch virus protease, and its activity was insensitive to most of the experimental conditions tested. Though the number of fusion proteins tested is limited, we expect that these findings will facilitate the use of the human rhinovirus 3C protease in recombinant protein production for pharmaceutical and biotechnological applications. PMID:27093053

  12. Activity of the Human Rhinovirus 3C Protease Studied in Various Buffers, Additives and Detergents Solutions for Recombinant Protein Production.

    PubMed

    Ullah, Raheem; Shah, Majid Ali; Tufail, Soban; Ismat, Fouzia; Imran, Muhammad; Iqbal, Mazhar; Mirza, Osman; Rhaman, Moazur

    2016-01-01

    Proteases are widely used to remove affinity and solubility tags from recombinant proteins to avoid potential interference of these tags with the structure and function of the fusion partner. In recent years, great interest has been seen in the use of the human rhinovirus 3C protease owing to its stringent sequence specificity and enhanced activity. Like other proteases, the activity of the human rhinovirus 3C protease can be affected in part by the buffer components and additives that are generally employed for purification and stabilization of proteins, hence necessitating their removal by tedious and time-consuming procedures before proteolysis can occur. To address this issue, we examined the effect of elution buffers used for common affinity-based purifications, salt ions, stability/solubility and reducing agents, and detergents on the activity of the human rhinovirus 3C protease using three different fusion proteins at 4°C, a temperature of choice for purification of many proteins. The results show that the human rhinovirus 3C protease performs better at 4°C than the frequently used tobacco etch virus protease, and its activity was insensitive to most of the experimental conditions tested. Though the number of fusion proteins tested is limited, we expect that these findings will facilitate the use of the human rhinovirus 3C protease in recombinant protein production for pharmaceutical and biotechnological applications.

  13. Separation and purification of five alkaloids from Aconitum duclouxii by counter-current chromatography.

    PubMed

    Wang, Yarong; Cai, Shining; Chen, Yang; Deng, Liang; Zhou, Xumei; Liu, Jia; Xu, Xin; Xia, Qiang; Lin, Mao; Zhang, Jili; Huang, Weili; Wang, Wenjun; Xiang, Canhui; Cui, Guozhen; Du, Lianfeng; He, Huan; Qi, Baohui

    2015-07-01

    C19-diterpenoid alkaloids are the main components of Aconitum duclouxii Levl. The separation and purification of these compounds in previous studies was tedious and time consuming, requiring multiple chromatographic steps, thus resulting in low recovery and high cost. In the present work, five C19-diterpenoid alkaloids, namely benzoylaconine (1), N-deethylaconitine (2), aconitine (3), deoxyaconitine (4), and ducloudine A (5), were efficiently prepared from A. duclouxii Levl (Aconitum L.) by ethyl acetate extraction followed by counter-current chromatography. In the separation process, the critical counter-current chromatography conditions were optimized. A two-phase solvent system composed of n-hexane/ethyl acetate/methanol/water/NH3·H2O (25%) (1:1:1:1:0.1, v/v) was selected, and 148.2 mg of 1, 24.1 mg of 2, 250.6 mg of 3, 73.9 mg of 4, and 31.4 mg of 5 were obtained from 1 g of total Aconitum alkaloid extract in a single run within 4 h. Their purities were found to be 98.4, 97.2, 98.2, 96.8, and 96.6%, respectively, by ultra-high performance liquid chromatography analysis. The presented separation and purification method is simple, fast, and efficient, and the obtained highly pure alkaloids are suitable for biochemical and toxicological investigation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A new efficient method of generating photoaffinity beads for drug target identification.

    PubMed

    Nishiya, Yoichi; Hamada, Tomoko; Abe, Masayuki; Takashima, Michio; Tsutsumi, Kyoko; Okawa, Katsuya

    2017-02-15

    Affinity purification is one of the most prevalent methods for the target identification of small molecules. Preparation of an appropriate chemical for immobilization, however, is a tedious and time-consuming process. A decade ago, a photoreaction method for generating affinity beads was reported, where compounds are mixed with agarose beads carrying a photoreactive group (aryldiazirine) and then irradiated with ultraviolet light under dry conditions to form covalent attachment. Although the method has proven useful for identifying drug targets, the beads suffer from inefficient ligand incorporation and tend to shrink and aggregate, which can cause nonspecific binding and low reproducibility. We therefore decided to craft affinity beads free from these shortcomings without compromising the ease of preparation. We herein report a modified method; first, a compound of interest is mixed with a crosslinker having an activated ester and a photoreactive moiety on each end. This mixture is then dried in a glass tube and irradiated with ultraviolet light. Finally, the conjugates are dissolved and reacted with agarose beads with a primary amine. This protocol enabled us to immobilize compounds more efficiently (approximately 500-fold per bead compared to the original method) and generated beads without physical deterioration. We herein demonstrated that the new FK506-immobilized beads specifically isolated more FKBP12 than the original beads, thereby proving our method to be applicable to target identification experiments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Semantic extraction and processing of medical records for patient-oriented visual index

    NASA Astrophysics Data System (ADS)

    Zheng, Weilin; Dong, Wenjie; Chen, Xiangjiao; Zhang, Jianguo

    2012-02-01

    To gain a comprehensive and complete understanding of a patient's health status, doctors need to search the patient's medical records across different healthcare information systems, such as PACS, RIS, HIS, and USIS, as a reference for diagnosis and treatment decisions. However, these procedures are time-consuming and tedious. To solve this kind of problem, we developed a patient-oriented visual index system (VIS) that uses visualization technology to show health status and to retrieve the patient's examination information stored in each system through a 3D human model. In this presentation, we describe a new approach for extracting semantic and characteristic information from medical record systems such as RIS/USIS to create the 3D visual index. The approach includes the following steps: (1) building a medical characteristic semantic knowledge base; (2) developing a natural language processing (NLP) engine to perform semantic analysis and logical judgment on text-based medical records; (3) applying the knowledge base and NLP engine to medical records to extract medical characteristics (e.g., positive focus information), and then mapping the extracted information to the related organs/parts of the 3D human model to create the visual index. We tested the procedure on 559 radiological reports containing 853 focuses and successfully extracted 828 of them, a focus extraction success rate of about 97.1%.
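
    A heavily simplified sketch of steps (1) to (3) is given below: a tiny keyword "knowledge base", a naive negation check, and a mapping from extracted findings to body parts of a 3D model. The vocabulary, report text and function names are illustrative assumptions, not the authors' knowledge base or NLP engine.

```python
# Hedged sketch: keyword-based extraction of positive findings and mapping to organs.
import re

KNOWLEDGE_BASE = {          # characteristic term -> organ/part on the 3D model
    "pulmonary nodule": "lung",
    "liver lesion": "liver",
    "renal cyst": "kidney",
}
NEGATIONS = re.compile(r"\b(no|without|negative for)\b", re.IGNORECASE)

def extract_focus_points(report_text):
    """Return (term, organ) pairs for positive findings in a free-text report."""
    findings = []
    for sentence in re.split(r"[.;\n]", report_text):
        if NEGATIONS.search(sentence):
            continue                      # skip negated sentences (very crude)
        for term, organ in KNOWLEDGE_BASE.items():
            if term in sentence.lower():
                findings.append((term, organ))
    return findings

report = ("A 6 mm pulmonary nodule is seen in the right upper lobe. "
          "No liver lesion identified.")
print(extract_focus_points(report))      # [('pulmonary nodule', 'lung')]
```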

  16. Structural Analysis of N- and O-glycans Using ZIC-HILIC/Dialysis Coupled to NMR Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qu, Yi; Feng, Ju; Deng, Shuang

    2014-11-19

    Protein glycosylation, an important and complex post-translational modification (PTM), is involved in various biological processes including receptor-ligand and cell-cell interactions, and plays a crucial role in many biological functions. However, little is known about the glycan structures of important biological complex samples, and the conventional glycan enrichment strategy (i.e., size-exclusion column [SEC] separation) prior to nuclear magnetic resonance (NMR) detection is time-consuming and tedious. In this study, we employed SEC, zwitterionic hydrophilic interaction liquid chromatography (ZIC-HILIC), and ZIC-HILIC coupled with dialysis strategies to enrich the glycopeptides from the pronase E digests of RNase B, followed by NMR analysis of the glycoconjugate. Our results suggest that the ZIC-HILIC enrichment coupled with dialysis is the most efficient, and it was thus applied to the analysis of a biological complex sample, the pronase E digest of the secreted proteins from the fungus Aspergillus niger. The NMR spectra revealed that the secreted proteins from A. niger contain both N-linked glycans with a high-mannose core and O-linked glycans bearing mannose and glucose with 1->3 and 1->6 linkages. In all, our study provides compelling evidence that ZIC-HILIC separation coupled to dialysis is superior to the commonly used SEC separation for preparing glycopeptides for downstream NMR analysis, which could greatly facilitate future NMR-based glycoproteomics research.

  17. Application of an artificial neural network to pump card diagnosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashenayi, K.; Lea, J.F.; Kemp, F.

    1994-12-01

    Beam pumping is the most frequently used artificial-lift technique for oil production. Downhole pump cards are used to evaluate performance of the pumping unit. Pump cards can be generated from surface dynamometer cards using a 1D wave equation with viscous damping, as suggested by Gibbs and Neely. Pump cards contain significant information describing the behavior of the pump. However, interpretation of these cards is tedious and time-consuming; hence, an automated system capable of interpreting these cards could speed interpretation and warn of pump failures. This work presents the results of a DOS-based computer program capable of correctly classifying pump cards. The program uses a hybrid artificial neural network (ANN) to identify significant features of the pump card. The hybrid ANN uses classical and sinusoidal perceptrons. The network is trained using an error-back-propagation technique. The program correctly identified pump problems for more than 180 different training and test pump cards. The ANN takes a total of 80 data points as input. Sixty data points are collected from the pump card perimeter, and the remaining 20 data points represent the slope at selected points on the pump card perimeter. Pump problem conditions are grouped into 11 distinct classes. The network is capable of identifying one or more of these problem conditions for each pump card. Eight examples are presented and discussed.
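
    The paper's hybrid network mixes classical and sinusoidal perceptrons, which scikit-learn does not provide, so the sketch below is only a rough stand-in: an ordinary back-propagation MLP trained on the same input layout (60 perimeter points plus 20 slopes, giving 80 features, and 11 problem classes) with fabricated data.

```python
# Hedged stand-in for the pump-card classifier: plain MLP, fabricated features.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_cards, n_features, n_classes = 300, 80, 11
X = rng.normal(size=(n_cards, n_features))      # stand-in pump-card descriptors
y = rng.integers(0, n_classes, size=n_cards)    # stand-in problem-class labels

clf = MLPClassifier(hidden_layer_sizes=(40,), max_iter=2000, random_state=1)
clf.fit(X, y)                                   # error back-propagation training
print("training accuracy on fabricated data:", clf.score(X, y))
```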

  18. Fully automatic registration and segmentation of first-pass myocardial perfusion MR image sequences.

    PubMed

    Gupta, Vikas; Hendriks, Emile A; Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F

    2010-11-01

    Derivation of diagnostically relevant parameters from first-pass myocardial perfusion magnetic resonance images involves the tedious and time-consuming manual segmentation of the myocardium in a large number of images. To reduce the manual interaction and expedite the perfusion analysis, we propose an automatic registration and segmentation method for the derivation of perfusion linked parameters. A complete automation was accomplished by first registering misaligned images using a method based on independent component analysis, and then using the registered data to automatically segment the myocardium with active appearance models. We used 18 perfusion studies (100 images per study) for validation in which the automatically obtained (AO) contours were compared with expert drawn contours on the basis of point-to-curve error, Dice index, and relative perfusion upslope in the myocardium. Visual inspection revealed successful segmentation in 15 out of 18 studies. Comparison of the AO contours with expert drawn contours yielded 2.23 ± 0.53 mm and 0.91 ± 0.02 as point-to-curve error and Dice index, respectively. The average difference between manually and automatically obtained relative upslope parameters was found to be statistically insignificant (P = .37). Moreover, the analysis time per slice was reduced from 20 minutes (manual) to 1.5 minutes (automatic). We proposed an automatic method that significantly reduced the time required for analysis of first-pass cardiac magnetic resonance perfusion images. The robustness and accuracy of the proposed method were demonstrated by the high spatial correspondence and statistically insignificant difference in perfusion parameters, when AO contours were compared with expert drawn contours. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.

  19. Designing an autoverification system in Zagazig University Hospitals Laboratories: preliminary evaluation on thyroid function profile.

    PubMed

    Sediq, Amany Mohy-Eldin; Abdel-Azeez, Ahmad GabAllahm Hala

    2014-01-01

    The current practice in Zagazig University Hospitals Laboratories (ZUHL) is manual verification of all results for the later release of reports. These processes are time consuming and tedious, with large inter-individual variation that slows the turnaround time (TAT). Autoverification is the process of comparing patient results, generated from interfaced instruments, against laboratory-defined acceptance parameters. This study describes an autoverification engine designed and implemented in ZUHL, Egypt. A descriptive study conducted at ZUHL, from January 2012-December 2013. A rule-based system was used in designing an autoverification engine. The engine was preliminarily evaluated on a thyroid function panel. A total of 563 rules were written and tested on 563 simulated cases and 1673 archived cases. The engine decisions were compared to that of 4 independent expert reviewers. The impact of engine implementation on TAT was evaluated. Agreement was achieved among the 4 reviewers in 55.5% of cases, and with the engine in 51.5% of cases. The autoverification rate for archived cases was 63.8%. Reported lab TAT was reduced by 34.9%, and TAT segment from the completion of analysis to verification was reduced by 61.8%. The developed rule-based autoverification system has a verification rate comparable to that of the commercially available software. However, the in-house development of this system had saved the hospital the cost of commercially available ones. The implementation of the system shortened the TAT and minimized the number of samples that needed staff revision, which enabled laboratory staff to devote more time and effort to handle problematic test results and to improve patient care quality.
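
    As a hedged illustration of rule-based autoverification on a thyroid panel, the sketch below auto-releases results that fall inside laboratory-defined limits and pass a simple delta check against the previous result, and holds everything else for manual review; the limits, delta threshold and test names are assumptions for the example, not ZUHL's actual rules.

```python
# Minimal rule-based autoverification sketch (illustrative limits only).
LIMITS = {"TSH": (0.3, 10.0), "FT4": (7.0, 25.0)}      # auto-release ranges
MAX_DELTA_FRACTION = 0.5                                # vs. previous result

def autoverify(test, value, previous=None):
    low, high = LIMITS[test]
    if not (low <= value <= high):
        return "hold: outside autoverification range"
    if previous and abs(value - previous) / previous > MAX_DELTA_FRACTION:
        return "hold: delta check failed"
    return "auto-released"

print(autoverify("TSH", 2.1, previous=1.9))   # auto-released
print(autoverify("TSH", 14.2))                # hold: outside autoverification range
print(autoverify("FT4", 20.0, previous=9.0))  # hold: delta check failed
```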

  20. A System for Anesthesia Drug Administration Using Barcode Technology: The Codonics Safe Label System and Smart Anesthesia Manager.

    PubMed

    Jelacic, Srdjan; Bowdle, Andrew; Nair, Bala G; Kusulos, Dolly; Bower, Lynnette; Togashi, Kei

    2015-08-01

    Many anesthetic drug errors result from vial or syringe swaps. Scanning the barcodes on vials before drug preparation, creating syringe labels that include barcodes, and scanning the syringe label barcodes before drug administration may help to prevent errors. In contrast, making syringe labels by hand that comply with the recommendations of regulatory agencies and standards-setting bodies is tedious and time consuming. A computerized system that uses vial barcodes and generates barcoded syringe labels could address both safety issues and labeling recommendations. We measured compliance of syringe labels in multiple operating rooms (ORs) with the recommendations of regulatory agencies and standards-setting bodies before and after the introduction of the Codonics Safe Label System (SLS). The Codonics SLS was then combined with Smart Anesthesia Manager software to create an anesthesia barcode drug administration system, which allowed us to measure the rate of scanning syringe label barcodes at the time of drug administration in 2 cardiothoracic ORs before and after introducing a coffee card incentive. Twelve attending cardiothoracic anesthesiologists and the OR satellite pharmacy participated. The use of the Codonics SLS drug labeling system resulted in >75% compliant syringe labels (95% confidence interval, 75%-98%). All syringe labels made using the Codonics SLS system were compliant. The average rate of scanning barcodes on syringe labels using Smart Anesthesia Manager was 25% (730 of 2976) over 13 weeks but increased to 58% (956 of 1645) over 8 weeks after introduction of a simple (coffee card) incentive (P < 0.001). An anesthesia barcode drug administration system resulted in a moderate rate of scanning syringe label barcodes at the time of drug administration. Further, adaptation of the system will be required to achieve a higher utilization rate.

  1. Improved sample preparation of glyphosate and methylphosphonic acid by EPA method 6800A and time-of-flight mass spectrometry using novel solid-phase extraction.

    PubMed

    Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip

    2012-02-01

    The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.

  2. Word spotting for handwritten documents using Chamfer Distance and Dynamic Time Warping

    NASA Astrophysics Data System (ADS)

    Saabni, Raid M.; El-Sana, Jihad A.

    2011-01-01

    A large number of handwritten historical documents are held in libraries around the world. The desire to access, search, and explore these documents paves the way for a new age of knowledge sharing and promotes collaboration and understanding between human societies. Currently, the indexes for these documents are generated manually, which is very tedious and time consuming. Results produced by state-of-the-art techniques for converting complete images of handwritten documents into textual representations are not yet sufficient. Therefore, word-spotting methods have been developed to archive and index images of handwritten documents in order to enable efficient searching within documents. In this paper, we present a new matching algorithm to be used in word-spotting tasks for historical Arabic documents. We present a novel algorithm based on the Chamfer Distance to compute the similarity between shapes of word-parts. Matching results are used to cluster images of Arabic word-parts into different classes using the Nearest Neighbor rule. To compute the distance between two word-part images, the algorithm subdivides each image into equal-sized slices (windows). A modified version of the Chamfer Distance, incorporating geometric gradient features and distance transform data, is used as a similarity distance between the different slices. Finally, the Dynamic Time Warping (DTW) algorithm is used to measure the distance between two images of word-parts. By using DTW we enabled our system to cluster similar word-parts even though they are transformed non-linearly due to the nature of handwriting. We tested our implementation of the presented methods using various documents in different writing styles, taken from Juma'a Al Majid Center - Dubai, and obtained encouraging results.
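
    A minimal dynamic time warping sketch for comparing two word-part images that have been cut into per-slice feature vectors is shown below; for simplicity the per-slice distance is plain Euclidean rather than the modified Chamfer distance used in the paper, and the sample sequences are fabricated.

```python
# Hedged sketch: classic DTW over per-slice feature sequences.
import numpy as np

def dtw_distance(seq_a, seq_b):
    """seq_a, seq_b: arrays of shape (n_slices, n_features)."""
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

a = np.array([[0.1], [0.9], [0.8], [0.2]])
b = np.array([[0.1], [0.85], [0.85], [0.75], [0.2]])   # same word, stretched
print(round(float(dtw_distance(a, b)), 3))
```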

  3. Determination of Phosphorus in Cola Drinks

    NASA Astrophysics Data System (ADS)

    Lozano-Calero, Diego; Martìn-Palomeque, Pilar; Madueño-Lorguillo, Silvia

    1996-12-01

    Laboratory experiments can improve student interest in science. However, the contrary effect could occur if they are not well designed and seem tedious, too laborious, and disconnected from daily life. Cola beverages are one of the most widely consumed drinks and are most popular among students. Much attention is being paid to possible consequences of excessive consumption for human health. Intensive efforts are being made to assess the erosive potential for teeth because of the beverages' acidity (1, 2); adverse effects secondary to high caffeine intake (e.g., hypertension, allergic reactions, gastrointestinal disturbances) (3 - 5); and adverse effects on calcium metabolism due to their high phosphoric acid content, which combined with low dietary calcium intake could increase the risk of suffering from bone diseases (6 - 9). We propose here the quantification of the phosphorus content in this kind of drinks by a different procedure from that previously described by Murphy in this Journal (10). We think this laboratory experiment will seem very interesting to students.

  4. Automatic computational labeling of glomerular textural boundaries

    NASA Astrophysics Data System (ADS)

    Ginley, Brandon; Tomaszewski, John E.; Sarder, Pinaki

    2017-03-01

    The glomerulus, a specialized bundle of capillaries, is the blood filtering unit of the kidney. Each human kidney contains about 1 million glomeruli. Structural damage in the glomerular micro-compartments gives rise to several renal conditions; the most severe of these is proteinuria, where excessive blood proteins pass freely into the urine. The sole way to confirm glomerular structural damage in renal pathology is by examining histopathological or immunofluorescence stained needle biopsies under a light microscope. However, this method is extremely tedious and time consuming, and requires manual scoring of the number and volume of structures. Computational quantification of equivalent features promises to greatly ease this manual burden. The largest obstacle to computational quantification of renal tissue is the ability to recognize complex glomerular textural boundaries automatically. Here we present a computational pipeline to identify glomerular boundaries with high precision and accuracy. The pipeline employs an integrated approach composed of Gabor filtering, Gaussian blurring, statistical F-testing, and distance transform, and performs significantly better than a standard Gabor-based textural segmentation method. Our integrated approach provides mean accuracy/precision of 0.89/0.97 on n = 200 Hematoxylin and Eosin (HE) glomerulus images, and mean accuracy/precision of 0.88/0.94 on n = 200 Periodic Acid Schiff (PAS) glomerulus images. The respective accuracy/precision of the Gabor filter bank based method is 0.83/0.84 for HE and 0.78/0.8 for PAS. Our method will simplify computational partitioning of glomerular micro-compartments hidden within dense textural boundaries. Automatic quantification of glomeruli will streamline structural analysis in the clinic and can help realize real-time diagnoses and interventions.
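
    A hedged sketch of the Gabor-filter-bank front end alone (not the full Gabor + Gaussian blur + F-test + distance-transform pipeline) is given below, assuming scikit-image; per-pixel texture energy is computed over a few orientations and frequencies and could then feed a boundary or segmentation step. The stand-in image and filter settings are illustrative.

```python
# Hedged sketch: per-pixel Gabor texture energy features with scikit-image.
import numpy as np
from skimage.filters import gabor
from skimage.data import camera            # stand-in for a glomerulus image

def gabor_energy(image, frequencies=(0.1, 0.2), n_orientations=4):
    image = image.astype(float)
    responses = []
    for f in frequencies:
        for k in range(n_orientations):
            real, imag = gabor(image, frequency=f, theta=k * np.pi / n_orientations)
            responses.append(np.sqrt(real ** 2 + imag ** 2))   # magnitude response
    return np.stack(responses, axis=-1)     # (H, W, n_filters) texture features

features = gabor_energy(camera())
print(features.shape)
```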

  5. Computer vision and soft computing for automatic skull-face overlay in craniofacial superimposition.

    PubMed

    Campomanes-Álvarez, B Rosario; Ibáñez, O; Navarro, F; Alemán, I; Botella, M; Damas, S; Cordón, O

    2014-12-01

    Craniofacial superimposition can provide evidence to support whether or not some human skeletal remains belong to a missing person. It involves the process of overlaying a skull with a number of ante mortem images of an individual and the analysis of their morphological correspondence. Within the craniofacial superimposition process, the skull-face overlay stage just focuses on achieving the best possible overlay of the skull and a single ante mortem image of the suspect. Although craniofacial superimposition has been in use for over a century, skull-face overlay is still applied by means of a trial-and-error approach without an automatic method. Practitioners finish the process once they consider that a good enough overlay has been attained. Hence, skull-face overlay is a very challenging, subjective, error-prone, and time-consuming part of the whole process. Though the numerical assessment of the method quality has not been achieved yet, computer vision and soft computing arise as powerful tools to automate it, dramatically reducing the time taken by the expert and obtaining an unbiased overlay result. In this manuscript, we justify and analyze the use of these techniques to properly model the skull-face overlay problem. We also present the automatic technical procedure we have developed using these computational methods and show the four overlays obtained in two craniofacial superimposition cases. This automatic procedure can thus be considered as a tool to aid forensic anthropologists in developing the skull-face overlay, automating and avoiding the subjectivity of the most tedious task within craniofacial superimposition. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  6. Genetic linkage analysis using pooled DNA and infrared detection of tailed STRP primer patterns

    NASA Astrophysics Data System (ADS)

    Oetting, William S.; Wildenberg, Scott C.; King, Richard A.

    1996-04-01

    The mapping of a disease locus to a specific chromosomal region is an important step in the eventual isolation and analysis of a disease causing gene. Conventional mapping methods analyze large multiplex families and/or smaller nuclear families to find linkage between the disease and a chromosome marker that maps to a known chromosomal region. This analysis is time consuming and tedious, typically requiring the determination of 30,000 genotypes or more. For appropriate populations, we have instead utilized pooled DNA samples for gene mapping which greatly reduces the amount of time necessary for an initial chromosomal screen. This technique assumes a common founder for the disease locus of interest and searches for a region of a chromosome shared between affected individuals. Our analysis involves the PCR amplification of short tandem repeat polymorphisms (STRP) to detect these shared regions. In order to reduce the cost of genotyping, we have designed unlabeled tailed PCR primers which, when combined with a labeled universal primer, provides for an alternative to synthesizing custom labeled primers. The STRP pattern is visualized with an infrared fluorescence based automated DNA sequencer and the patterns quantitated by densitometric analysis of the allele pattern. Differences in the distribution of alleles between pools of affected and unaffected individuals, including a reduction in the number of alleles in the affected pool, indicate the sharing of a region of a chromosome. We have found this method effective for markers 10 - 15 cM away from the disease locus for a recessive genetic disease.

  7. Quantitative assessment of multiple sclerosis lesion load using CAD and expert input

    NASA Astrophysics Data System (ADS)

    Gertych, Arkadiusz; Wong, Alexis; Sangnil, Alan; Liu, Brent J.

    2008-03-01

    Multiple sclerosis (MS) is a frequently encountered neurological disease with a progressive but variable course affecting the central nervous system. Outline-based lesion quantification for the assessment of lesion load (LL) on magnetic resonance (MR) images is clinically useful and provides information about development and change, reflecting overall disease burden. Methods of LL assessment that rely on human input are tedious, have higher intra- and inter-observer variability, and are more time-consuming than computerized automatic (CAD) techniques. At present it seems that methods based on human lesion identification preceded by non-interactive outlining by CAD are the best LL quantification strategies. We have developed a CAD system that automatically quantifies MS lesions, displays a 3-D lesion map, and appends radiological findings to the original images according to the current DICOM standard. The CAD system is also capable of displaying and tracking changes and making comparisons between a patient's separate MRI studies to determine disease progression. The findings are exported to a separate imaging tool for review and final approval by an expert. Capturing and standardized archiving of manual contours is also implemented. Similarity coefficients calculated from LL quantities in the collected exams show a good correlation of CAD-derived results vs. those incorporated from the expert's reading. Combining the CAD approach with expert interaction may improve the diagnostic work-up of MS patients because of improved reproducibility in LL assessment and reduced reading time for single MR or comparative exams. Inclusion of CAD-generated outlines as DICOM-compliant overlays in the image data can serve as a better reference in MS progression tracking.

  8. MAAMD: a workflow to standardize meta-analyses and comparison of affymetrix microarray data

    PubMed Central

    2014-01-01

    Background: Mandatory deposit of raw microarray data files for public access, prior to study publication, provides significant opportunities to conduct new bioinformatics analyses within and across multiple datasets. Analysis of raw microarray data files (e.g. Affymetrix CEL files) can be time consuming, complex, and requires fundamental computational and bioinformatics skills. The development of analytical workflows to automate these tasks simplifies the processing of, improves the efficiency of, and serves to standardize multiple and sequential analyses. Once installed, workflows facilitate the tedious steps required to run rapid intra- and inter-dataset comparisons. Results: We developed a workflow to facilitate and standardize Meta-Analysis of Affymetrix Microarray Data analysis (MAAMD) in Kepler. Two freely available stand-alone software tools, R and AltAnalyze, were embedded in MAAMD. The inputs of MAAMD are user-editable csv files, which contain sample information and parameters describing the locations of input files and required tools. MAAMD was tested by analyzing 4 different GEO datasets from mice and drosophila. MAAMD automates data downloading, data organization, data quality control assessment, differential gene expression analysis, clustering analysis, pathway visualization, gene-set enrichment analysis, and cross-species orthologous-gene comparisons. MAAMD was utilized to identify gene orthologues responding to hypoxia or hyperoxia in both mice and drosophila. The entire set of analyses for 4 datasets (34 total microarrays) finished in about one hour. Conclusions: MAAMD saves time, minimizes the required computer skills, and offers a standardized procedure for users to analyze microarray datasets and make new intra- and inter-dataset comparisons. PMID:24621103

  9. Graphic and movie illustrations of human prenatal development and their application to embryological education based on the human embryo specimens in the Kyoto collection.

    PubMed

    Yamada, Shigehito; Uwabe, Chigako; Nakatsu-Komatsu, Tomoko; Minekura, Yutaka; Iwakura, Masaji; Motoki, Tamaki; Nishimiya, Kazuhiko; Iiyama, Masaaki; Kakusho, Koh; Minoh, Michihiko; Mizuta, Shinobu; Matsuda, Tetsuya; Matsuda, Yoshimasa; Haishi, Tomoyuki; Kose, Katsumi; Fujii, Shingo; Shiota, Kohei

    2006-02-01

    Morphogenesis in the developing embryo takes place in three dimensions, and in addition, the dimension of time is another important factor in development. Therefore, the presentation of sequential morphological changes occurring in the embryo (4D visualization) is essential for understanding the complex morphogenetic events and the underlying mechanisms. Until recently, 3D visualization of embryonic structures was possible only by reconstruction from serial histological sections, which was tedious and time-consuming. During the past two decades, 3D imaging techniques have made significant advances thanks to the progress in imaging and computer technologies, computer graphics, and other related techniques. Such novel tools have enabled precise visualization of the 3D topology of embryonic structures and to demonstrate spatiotemporal 4D sequences of organogenesis. Here, we describe a project in which staged human embryos are imaged by the magnetic resonance (MR) microscope, and 3D images of embryos and their organs at each developmental stage were reconstructed based on the MR data, with the aid of computer graphics techniques. On the basis of the 3D models of staged human embryos, we constructed a data set of 3D images of human embryos and made movies to illustrate the sequential process of human morphogenesis. Furthermore, a computer-based self-learning program of human embryology is being developed for educational purposes, using the photographs, histological sections, MR images, and 3D models of staged human embryos. Copyright 2005 Wiley-Liss, Inc.

  10. A study on the impact of hydroxypropyl methylcellulose on the viscosity of PEG melt suspensions using surface plots and principal component analysis.

    PubMed

    Oh, Ching Mien; Heng, Paul Wan Sia; Chan, Lai Wah

    2015-04-01

    An understanding of the rheological behaviour of polymer melt suspensions is crucial in pharmaceutical manufacturing, especially when processed by spray congealing or melt extruding. However, a detailed comparison of the viscosities at each and every temperature and concentration between the various grades of adjuvants in the formulation will be tedious and time-consuming. Therefore, the statistical method, principal component analysis (PCA), was explored in this study. The composite formulations comprising polyethylene glycol (PEG) 3350 and hydroxypropyl methylcellulose (HPMC) of ten different grades (K100 LV, K4M, K15M, K100M, E15 LV, E50 LV, E4M, F50 LV, F4M and Methocel VLV) at various concentrations were prepared and their viscosities at different temperatures determined. Surface plots showed that concentration of HPMC had a greater effect on the viscosity compared to temperature. Particle size and size distribution of HPMC played an important role in the viscosity of melt suspensions. Smaller particles led to a greater viscosity than larger particles. PCA was used to evaluate formulations of different viscosities. The complex viscosity profiles of the various formulations containing HPMC were successfully classified into three clusters of low, moderate and high viscosity. Formulations within each group showed similar viscosities despite differences in grade or concentration of HPMC. Formulations in the low viscosity cluster were found to be sprayable. PCA was able to differentiate the complex viscosity profiles of different formulations containing HPMC in an efficient and time-saving manner and provided an excellent visualisation of the data.
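
    The clustering step described above can be illustrated with a minimal scikit-learn sketch, assuming each formulation is represented by its viscosity measured at several temperatures; the synthetic values and the choice of three clusters mirror the low/moderate/high grouping reported, but are not the study's data.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical data: each row is one formulation, each column the (log)
        # viscosity measured at one temperature; values are illustrative only.
        rng = np.random.default_rng(0)
        viscosity_profiles = np.vstack([
            rng.normal(1.0, 0.1, (10, 6)),   # low-viscosity formulations
            rng.normal(2.0, 0.1, (10, 6)),   # moderate
            rng.normal(3.0, 0.1, (10, 6)),   # high
        ])

        # Project the standardized profiles onto their first two principal components.
        scores = PCA(n_components=2).fit_transform(
            StandardScaler().fit_transform(viscosity_profiles))

        # Group the profiles into three clusters (low / moderate / high viscosity).
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
        print(labels)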

  11. Automatic evidence retrieval for systematic reviews.

    PubMed

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G; Tsafnat, Guy

    2014-10-01

    Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing's effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Our goal was to evaluate an automatic method for citation snowballing's capacity to identify and retrieve the full text and/or abstracts of cited articles. Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. The automatic method was able to correctly identify 633 (as proportion of included citations: recall=66.7%, F1 score=79.3%; as proportion of citations in MAS: recall=85.5%, F1 score=91.2%) of citations with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews.
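
    The reported figures follow from the standard precision/recall/F1 definitions; the short Python sketch below recomputes them from the abstract's counts (the number of candidate matches is back-calculated from the stated precision, so it is an approximation).

        def precision_recall_f1(true_positives, retrieved, relevant):
            """Standard information-retrieval metrics."""
            precision = true_positives / retrieved
            recall = true_positives / relevant
            f1 = 2 * precision * recall / (precision + recall)
            return precision, recall, f1

        # Figures reported in the abstract: 633 correct identifications at 97.7%
        # precision, against 949 included citations and 740 citations present in MAS.
        retrieved = round(633 / 0.977)   # approximate number of candidate matches
        for relevant in (949, 740):
            p, r, f1 = precision_recall_f1(633, retrieved, relevant)
            print(f"relevant={relevant}: precision={p:.1%} recall={r:.1%} F1={f1:.1%}")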

  12. Charging YOYO-1 on Capillary Wall for Online DNA Intercalation and Integrating This Approach with Multiplex PCR and Bare Narrow Capillary–Hydrodynamic Chromatography for Online DNA Analysis

    PubMed Central

    2016-01-01

    Multiplex polymerase chain reaction (PCR) has been widely utilized for high-throughput pathogen identification. Often, a dye is used to intercalate the amplified DNA fragments, and identifications of the pathogens are carried out by DNA melting curve analysis or gel electrophoresis. Integrating DNA amplification and identification is a logic path toward maximizing the benefit of multiplex PCR. Although PCR and gel electrophoresis have been integrated, replenishing the gels after each run is tedious and time-consuming. In this technical note, we develop an approach to address this issue. We perform multiplex PCR inside a capillary, transfer the amplified fragments to a bare narrow capillary, and measure their lengths online using bare narrow capillary–hydrodynamic chromatography (BaNC-HDC), a new technique recently developed in our laboratory for free-solution DNA separation. To intercalate the DNA with YOYO-1 (a fluorescent dye) for BaNC-HDC, we flush the capillary column with a YOYO-1 solution; positively charged YOYO-1 is adsorbed (or charged) onto the negatively charged capillary wall. As DNA molecules are driven down the column for separation, they react with the YOYO-1 stored on the capillary wall and are online-intercalated with the dye. With a single YOYO-1 charging, the column can be used for more than 40 runs, although the fluorescence signal intensities of the DNA peaks decrease gradually. Although the dye-DNA intercalation occurs during the separation, it does not affect the retention times, separation efficiencies, or resolutions. PMID:25555111

  13. Bacteriophage Amplification-Coupled Detection and Identification of Bacterial Pathogens

    NASA Astrophysics Data System (ADS)

    Cox, Christopher R.; Voorhees, Kent J.

    Current methods of species-specific bacterial detection and identification are complex, time-consuming, and often require expensive specialized equipment and highly trained personnel. Numerous biochemical and genotypic identification methods have been applied to bacterial characterization, but all rely on tedious microbiological culturing practices and/or costly sequencing protocols which render them impractical for deployment as rapid, cost-effective point-of-care or field detection and identification methods. With a view towards addressing these shortcomings, we have exploited the evolutionarily conserved interactions between a bacteriophage (phage) and its bacterial host to develop species-specific detection methods. Phage amplification-coupled matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was utilized to rapidly detect phage propagation resulting from species-specific in vitro bacterial infection. This novel signal amplification method allowed for bacterial detection and identification in as little as 2 h, and when combined with disulfide bond reduction methods developed in our laboratory to enhance MALDI-TOF-MS resolution, was observed to lower the limit of detection by several orders of magnitude over conventional spectroscopy and phage typing methods. Phage amplification has been combined with lateral flow immunochromatography (LFI) to develop rapid, easy-to-operate, portable, species-specific point-of-care (POC) detection devices. Prototype LFI detectors have been developed and characterized for Yersinia pestis and Bacillus anthracis, the etiologic agents of plague and anthrax, respectively. Comparable sensitivity and rapidity were observed when phage amplification was adapted to a species-specific handheld LFI detector, thus allowing for rapid, simple, POC bacterial detection and identification while eliminating the need for bacterial culturing or DNA isolation and amplification techniques.

  14. Simplified dichromated gelatin hologram recording process

    NASA Technical Reports Server (NTRS)

    Georgekutty, Tharayil G.; Liu, Hua-Kuang

    1987-01-01

    A simplified method for making dichromated gelatin (DCG) holographic optical elements (HOE) has been discovered. The method is much less tedious and it requires a period of processing time comparable with that for processing a silver halide hologram. HOE characteristics including diffraction efficiency (DE), linearity, and spectral sensitivity have been quantitatively investigated. The quality of the holographic grating is very high. Ninety percent or higher diffraction efficiency has been achieved in simple plane gratings made by this process.

  15. Reactive low temperature plasma ionization mass spectrometry for the determination of organic UV filters in personal care products.

    PubMed

    Ding, Xuelu; Gerbig, Stefanie; Spengler, Bernhard; Schulz, Sabine

    2018-02-01

    Organic UV filters in personal care products (PCPs) have been persistently reported as a potential threat to human health. In order to guarantee consumers' safety, the dose of these compounds in PCPs needs to be monitored. Here, a methodology based on reactive low temperature plasma ionization (LTP) mass spectrometry (MS) has been developed for the determination of common organic UV filters in PCPs, including benzophenone-3, ethylhexyl dimethyl p-aminobenzoic acid, ethylhexyl methoxycinnamate, 4-methylbenzylidene camphor, octocrylene, and ethylhexyl salicylate. The experiments were carried out in transmission geometry, where the LTP ion source, samples loaded on a stainless steel mesh, and the MS inlet were aligned coaxially. Four chemicals, ammonia, ammonium formate, aniline, and methylamine, were considered as reactive additives allowing reactions with the UV filters through different mechanisms. Methylamine-induced reactive LTP-MS showed the most prominent improvement in the detection of UV filter compounds. Compared to direct LTP-MS, the developed method improved the detection limits of UV filters more than 10-fold. Moreover, the method enabled fast semi-quantitative screening of UV filters in authentic PCPs. Concentrations of active ingredients in eight authentic PCPs as determined with reactive LTP-MS were found comparable to values offered by the cosmetic companies and to corresponding HPLC data. The methodology provides high-throughput analysis (70 s per sample) and sensitive identification of organic UV filters. Lowest detectable concentrations ranged from 0.13 µg/g for 4-methylbenzylidene camphor to 7.67 µg/g for octocrylene in spiked cream. In addition, it shows the potential to be used as a screening tool for legal authentication of these chemicals in the future due to its semi-quantitative determination of UV filters in PCPs without tedious sample preparation and time-consuming chromatographic separation. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Deep Sea Moorings Fishbite Handbook

    DTIC Science & Technology

    1987-03-01

    additions to the body of knowledge relative to fishbite damage and its control. It is the purpose of this Handbook to bring information on the ... cut through so that the line parted while being hauled aboard ship. The parted ends therefore show both cutting and ... the depth at which damage took place. Such a procedure is at times tedious, but experience has shown that it usually leads to discovery of more biting

  17. The applications of computers in biological research

    NASA Technical Reports Server (NTRS)

    Wei, Jennifer

    1988-01-01

    Research in many fields could not be done without computers. There is often a great deal of technical data, even in the biological fields, that need to be analyzed. These data, unfortunately, previously absorbed much of every researcher's time. Now, due to the steady increase in computer technology, biological researchers are able to make incredible advances in their work without the added worries of tedious and difficult tasks such as the many mathematical calculations involved in today's research and health care.

  18. Computer measurement of particle sizes in electron microscope images

    NASA Technical Reports Server (NTRS)

    Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.

    1976-01-01

    Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting within a reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.
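
    A minimal modern sketch of the counting step, assuming the electron-microscope image has already been binarized so that particle pixels are nonzero; it uses connected-component labelling from SciPy rather than the original 1976 implementation.

        import numpy as np
        from scipy import ndimage

        # Hypothetical binarized electron-microscope image: nonzero pixels belong to
        # particles. A real pipeline would threshold the grey-level image first.
        image = np.zeros((64, 64), dtype=bool)
        image[5:12, 5:12] = True      # one square "particle"
        image[30:34, 40:50] = True    # an elongated one

        # Label connected components and measure the size (pixel count) of each.
        labels, n_particles = ndimage.label(image)
        sizes = ndimage.sum(image, labels, index=range(1, n_particles + 1))

        print(f"{n_particles} particles, sizes (pixels): {sizes.astype(int).tolist()}")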

  19. Artificial Intelligence for VHSIC Systems Design (AIVD) User Reference Manual

    DTIC Science & Technology

    1988-12-01

    The goal of this program was to develop prototype tools which would use artificial intelligence techniques to extend the Architecture Design and Assessment (ADAS) software capabilities. These techniques were applied in a number of ways to increase the productivity of ADAS users. AIM will reduce the amount of time spent on tedious, negative, and error-prone steps. It will also provide documentation that will assist users in verifying that the models they build are correct. Finally, AIVD will help make ADAS models more reusable.

  20. In-Situ-Activated N-Doped Mesoporous Carbon from a Protic Salt and Its Performance in Supercapacitors.

    PubMed

    Mendes, Tiago C; Xiao, Changlong; Zhou, Fengling; Li, Haitao; Knowles, Gregory P; Hilder, Matthias; Somers, Anthony; Howlett, Patrick C; MacFarlane, Douglas R

    2016-12-28

    Protic salts have been recently recognized to be an excellent carbon source to obtain highly ordered N-doped carbon without the need of tedious and time-consuming preparation steps that are usually involved in traditional polymer-based precursors. Herein, we report a direct co-pyrolysis of an easily synthesized protic salt (benzimidazolium triflate) with calcium and sodium citrate at 850 °C to obtain N-doped mesoporous carbons from a single calcination procedure. It was found that sodium citrate plays a role in the final carbon porosity and acts as an in situ activator. This results in a large surface area as high as 1738 m²/g with a homogeneous pore size distribution and a moderate nitrogen doping level of 3.1%. X-ray photoelectron spectroscopy (XPS) measurements revealed that graphitic and pyridinic groups are the main nitrogen species present in the material, and their content depends on the amount of sodium citrate used during pyrolysis. Transmission electron microscopy (TEM) investigation showed that sodium citrate assists the formation of graphitic domains and many carbon nanosheets were observed. When applied as supercapacitor electrodes, a specific capacitance of 111 F/g in organic electrolyte was obtained and an excellent capacitance retention of 85.9% was observed at a current density of 10 A/g. At an operating voltage of 3.0 V, the device provided a maximum energy density of 35 Wh/kg and a maximum power density of 12 kW/kg.

  1. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    NASA Astrophysics Data System (ADS)

    Brau-Avila, A.; Santolaria, J.; Acero, R.; Valenzuela-Galvan, M.; Herrera-Jimenez, V. M.; Aguilar, J. J.

    2017-03-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs.

  2. What are the assets and weaknesses of HFO detectors? A benchmark framework based on realistic simulations

    PubMed Central

    Pizzo, Francesca; Bartolomei, Fabrice; Wendling, Fabrice; Bénar, Christian-George

    2017-01-01

    High-frequency oscillations (HFO) have been suggested as biomarkers of epileptic tissues. While visual marking of these short and small oscillations is tedious and time-consuming, automatic HFO detectors have not yet met a large consensus. Even though detectors have been shown to perform well when validated against visual marking, the large number of false detections due to their lack of robustness hinders their clinical application. In this study, we developed a validation framework based on realistic and controlled simulations to quantify precisely the assets and weaknesses of current detectors. We constructed a dictionary of synthesized elements (HFOs and epileptic spikes) from different patients and brain areas by extracting these elements from the original data using discrete wavelet transform coefficients. These elements were then added to their corresponding simulated background activity (preserving patient- and region-specific spectra). We tested five existing detectors against this benchmark. In contrast to other studies comparing detectors, we not only ranked them according to their performance but also investigated the reasons leading to these results. Our simulations, thanks to their realism and their variability, enabled us to highlight unreported issues of current detectors: (1) the lack of robust estimation of the background activity, (2) the underestimated impact of the 1/f spectrum, and (3) the inadequate criteria defining an HFO. We believe that our benchmark framework could be a valuable tool to translate HFOs into a clinical environment. PMID:28406919

  3. Digital cover photography for estimating leaf area index (LAI) in apple trees using a variable light extinction coefficient.

    PubMed

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-28

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. The leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAI(D)), which was compared with the LAI estimated by the proposed digital photography method (LAI(M)). Results showed that LAI(M) was able to estimate LAI(D) with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (f(f)) derived from the images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results showed that using a proxy k value, estimated from Ic, helped to increase the accuracy of LAI estimates using digital cover images for apple trees with different canopy sizes and under field conditions.
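
    The underlying gap-fraction relation is LAI = -ln(I/Io)/k; the Python sketch below inverts it for a constant k and for a k derived from the fraction of foliage cover. The exponential coefficients and the radiation values are illustrative assumptions, not the fitted values from this study.

        import numpy as np

        def lai_from_transmittance(I, Io, k):
            """Invert the Beer-Lambert gap-fraction relation: I/Io = exp(-k * LAI)."""
            return -np.log(I / Io) / k

        def k_from_foliage_cover(ff, a=0.5, b=0.3):
            """Hypothetical exponential dependence of the extinction coefficient on
            the fraction of foliage cover; a and b are illustrative, not the
            values fitted in the study."""
            return a * np.exp(b * ff)

        Io, I, ff = 1800.0, 420.0, 0.65   # incident / transmitted radiation, cover fraction
        print(lai_from_transmittance(I, Io, k=0.68))                    # constant k
        print(lai_from_transmittance(I, Io, k_from_foliage_cover(ff)))  # variable k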

  4. A new method of automatic landmark tagging for shape model construction via local curvature scale

    NASA Astrophysics Data System (ADS)

    Rueda, Sylvia; Udupa, Jayaram K.; Bai, Li

    2008-03-01

    Segmentation of organs in medical images is a difficult task requiring very often the use of model-based approaches. To build the model, we need an annotated training set of shape examples with correspondences indicated among shapes. Manual positioning of landmarks is a tedious, time-consuming, and error prone task, and almost impossible in the 3D space. To overcome some of these drawbacks, we devised an automatic method based on the notion of c-scale, a new local scale concept. For each boundary element b, the arc length of the largest homogeneous curvature region connected to b is estimated as well as the orientation of the tangent at b. With this shape description method, we can automatically locate mathematical landmarks selected at different levels of detail. The method avoids the use of landmarks for the generation of the mean shape. The selection of landmarks on the mean shape is done automatically using the c-scale method. Then, these landmarks are propagated to each shape in the training set, defining this way the correspondences among the shapes. Altogether 12 strategies are described along these lines. The methods are evaluated on 40 MRI foot data sets, the object of interest being the talus bone. The results show that, for the same number of landmarks, the proposed methods are more compact than manual and equally spaced annotations. The approach is applicable to spaces of any dimensionality, although we have focused in this paper on 2D shapes.

  5. Digital Cover Photography for Estimating Leaf Area Index (LAI) in Apple Trees Using a Variable Light Extinction Coefficient

    PubMed Central

    Poblete-Echeverría, Carlos; Fuentes, Sigfredo; Ortega-Farias, Samuel; Gonzalez-Talice, Jaime; Yuri, Jose Antonio

    2015-01-01

    Leaf area index (LAI) is one of the key biophysical variables required for crop modeling. Direct LAI measurements are time consuming and difficult to obtain for experimental and commercial fruit orchards. Devices used to estimate LAI have shown considerable errors when compared to ground-truth or destructive measurements, requiring tedious site-specific calibrations. The objective of this study was to test the performance of a modified digital cover photography method to estimate LAI in apple trees using conventional digital photography and instantaneous measurements of incident radiation (Io) and transmitted radiation (I) through the canopy. The leaf area of 40 single apple trees was measured destructively to obtain the real leaf area index (LAID), which was compared with the LAI estimated by the proposed digital photography method (LAIM). Results showed that LAIM was able to estimate LAID with an error of 25% using a constant light extinction coefficient (k = 0.68). However, when k was estimated using an exponential function based on the fraction of foliage cover (ff) derived from the images, the error was reduced to 18%. Furthermore, when measurements of light intercepted by the canopy (Ic) were used as a proxy value for k, the method presented an error of only 9%. These results showed that using a proxy k value, estimated from Ic, helped to increase the accuracy of LAI estimates using digital cover images for apple trees with different canopy sizes and under field conditions. PMID:25635411

  6. Integrated G and C Implementation within IDOS: A Simulink Based Reusable Launch Vehicle Simulation

    NASA Technical Reports Server (NTRS)

    Fisher, Joseph E.; Bevacqua, Tim; Lawrence, Douglas A.; Zhu, J. Jim; Mahoney, Michael

    2003-01-01

    The implementation of multiple Integrated Guidance and Control (IG&C) algorithms per flight phase within a vehicle simulation poses a daunting task to coordinate algorithm interactions with the other G&C components and with vehicle subsystems. Currently being developed by Universal Space Lines LLC (USL) under contract from NASA, the Integrated Development and Operations System (IDOS) contains a high-fidelity Simulink vehicle simulation, which provides a means to test cutting-edge G&C technologies. Combining the modularity of this vehicle simulation and Simulink's built-in primitive blocks provides a quick way to implement algorithms. To add discrete-event functionality to the unfinished IDOS simulation, Vehicle Event Manager (VEM) and Integrated Vehicle Health Monitoring (IVHM) subsystems were created to provide discrete-event and pseudo-health monitoring processing capabilities. Matlab's Stateflow is used to create the IVHM and Event Manager subsystems and to implement a supervisory logic controller referred to as the Auto-commander as part of the IG&C to coordinate the control system adaptation and reconfiguration and to select the control and guidance algorithms for a given flight phase. Manual creation of the Stateflow charts for all of these subsystems is a tedious and time-consuming process. The Stateflow Auto-builder was developed as a Matlab-based software tool for the automatic generation of a Stateflow chart from information contained in a database. This paper describes the IG&C, VEM and IVHM implementations in IDOS. In addition, this paper describes the Stateflow Auto-builder.

  7. PICKY: a novel SVD-based NMR spectra peak picking method

    PubMed Central

    Alipanahi, Babak; Gao, Xin; Karakoc, Emre; Donaldson, Logan; Li, Ming

    2009-01-01

    Motivation: Picking peaks from experimental NMR spectra is a key unsolved problem for automated NMR protein structure determination. Such a process is a prerequisite for resonance assignment, nuclear overhauser enhancement (NOE) distance restraint assignment, and structure calculation tasks. Manual or semi-automatic peak picking, which is currently the prominent way used in NMR labs, is tedious, time consuming and costly. Results: We introduce new ideas, including noise-level estimation, component forming and sub-division, singular value decomposition (SVD)-based peak picking and peak pruning and refinement. PICKY is developed as an automated peak picking method. Different from the previous research on peak picking, we provide a systematic study of the proposed method. PICKY is tested on 32 real 2D and 3D spectra of eight target proteins, and achieves an average of 88% recall and 74% precision. PICKY is efficient. It takes PICKY on average 15.7 s to process an NMR spectrum. More important than these numbers, PICKY actually works in practice. We feed peak lists generated by PICKY to IPASS for resonance assignment, feed IPASS assignment to SPARTA for fragments generation, and feed SPARTA fragments to FALCON for structure calculation. This results in high-resolution structures of several proteins, for example, TM1112, at 1.25 Å. Availability: PICKY is available upon request. The peak lists of PICKY can be easily loaded by SPARKY to enable a better interactive strategy for rapid peak picking. Contact: mli@uwaterloo.ca PMID:19477998
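
    The core idea of SVD-based peak picking, namely suppressing noise with a low-rank reconstruction before searching for local maxima above an estimated noise level, can be sketched as follows; this is a simplified illustration with synthetic peaks, not PICKY's actual algorithm or parameters.

        import numpy as np
        from scipy.ndimage import maximum_filter

        rng = np.random.default_rng(1)

        # Hypothetical 2D spectrum: a few Gaussian peaks plus noise.
        x, y = np.meshgrid(np.arange(128), np.arange(128))
        spectrum = sum(np.exp(-(((x - cx) ** 2 + (y - cy) ** 2) / 8.0))
                       for cx, cy in [(30, 40), (80, 90), (60, 20)])
        spectrum += rng.normal(0, 0.05, spectrum.shape)

        # Low-rank reconstruction via SVD suppresses noise while keeping the peaks.
        U, s, Vt = np.linalg.svd(spectrum, full_matrices=False)
        rank = 5
        denoised = (U[:, :rank] * s[:rank]) @ Vt[:rank]

        # Peaks = local maxima above an estimated noise level.
        noise_level = 3 * np.std(spectrum - denoised)
        is_peak = (denoised == maximum_filter(denoised, size=5)) & (denoised > noise_level)
        print(np.argwhere(is_peak))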

  8. Implementing self sustained quality control procedures in a clinical laboratory.

    PubMed

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component of every clinical laboratory: it maintains the excellence of laboratory standards, contributes to proper disease diagnosis and patient care, and results in overall strengthening of the health care system. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time-consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and the use of simple statistical tools for quality assurance. The pooled serum was prepared as per guidelines for the preparation of stabilized liquid quality control serum from human sera. Internal quality assessment was performed on this sample on a daily basis and included measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules for a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure which can be performed by laboratories with minimal technology, expenditure, and expertise to improve the reliability and validity of test reports.
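
    A minimal Python sketch of the daily evaluation step, assuming the target mean and standard deviation were established from the pooled-serum baseline period; it flags two of the common Westgard rules on a hypothetical run of control results.

        import numpy as np

        def westgard_flags(values, mean, sd):
            """Flag daily control results against two common Westgard rules.

            1-2s: one value outside mean +/- 2 SD (warning)
            1-3s: one value outside mean +/- 3 SD (rejection)
            """
            z = (np.asarray(values) - mean) / sd
            return {"1-2s": np.where(np.abs(z) > 2)[0].tolist(),
                    "1-3s": np.where(np.abs(z) > 3)[0].tolist()}

        # Illustrative glucose control results (mmol/L) over several days; the mean
        # and SD would normally come from the pooled-serum baseline period.
        daily = [5.1, 5.0, 5.3, 4.9, 5.6, 5.0, 4.2, 5.1]
        print(westgard_flags(daily, mean=5.1, sd=0.15))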

  9. Structurally Simple and Easily Accessible Perylenes for Dye-Sensitized Solar Cells Applicable to Both 1 Sun and Dim-Light Environments.

    PubMed

    Chou, Hsien-Hsin; Liu, Yu-Chieh; Fang, Guanjie; Cao, Qiao-Kai; Wei, Tzu-Chien; Yeh, Chen-Yu

    2017-11-01

    The need for low-cost and highly efficient dyes for dye-sensitized solar cells under both the sunlight and dim light environments is growing. We have devised GJ-series push-pull organic dyes which require only four synthesis steps. These dyes feature a linear molecular structure of donor-perylene-ethynylene-arylcarboxylic acid, where donor represents N,N-diarylamino group and arylcarboxylic groups represent benzoic, thienocarboxylic, 2-cyano-3-phenylacrylic, 2-cyano-3-thienoacrylic, and 4-benzo[c][1,2,5]thiadiazol-4-yl-benzoic groups. In this study, we demonstrated that a dye without tedious and time-consuming synthesis efforts can perform efficiently. Under the illumination of AM1.5G simulated sunlight, the benzothiadiazole-benzoic-containing GJ-BP dye shows the best power conversion efficiency (PCE) of 6.16% with VOC of 0.70 V and JSC of 11.88 mA cm⁻² using liquid iodide-based electrolyte. It also shows high performance in converting light of 6000 lx intensity, that is, incident power of ca. 1.75 mW cm⁻², to power output of 0.28 mW cm⁻², which equals a PCE of 15.79%. Interestingly, the benzoic-containing dye GJ-P with a simple molecular structure has comparable performance in generating power output of 0.26 mW cm⁻² (PCE of 15.01%) under the same condition and is potentially viable toward future application.

  10. Assessment of brittleness and empirical correlations between physical and mechanical parameters of the Asmari limestone in Khersan 2 dam site, in southwest of Iran

    NASA Astrophysics Data System (ADS)

    Lashkaripour, Gholam Reza; Rastegarnia, Ahmad; Ghafoori, Mohammad

    2018-02-01

    The determination of brittleness and geomechanical parameters, especially the uniaxial compressive strength (UCS) and Young's modulus (ES) of rocks, is needed for the design of different rock engineering applications. Evaluation of these parameters is a time-consuming, tedious and expensive process and requires well-prepared rock cores. Therefore, compressional wave velocity (Vp) and index parameters such as point load index and porosity are often used to predict the properties of rocks. In this paper, brittleness and other properties, physical and mechanical in type, of 56 Asmari limestones in dry and saturated conditions were analyzed. The rock samples were collected from the Khersan 2 dam site. This dam, with a height of 240 m, is under construction in the Zagros Mountains in the southwest of Iran. The bedrock and abutments of the dam site consist of the Asmari and Gachsaran Formations. In this paper, a practical relation for predicting brittleness and some relations between mechanical and index parameters of the Asmari limestone were established. The presented equation for predicting brittleness based on UCS, Brazilian tensile strength and Vp had high accuracy. Moreover, results showed that the brittleness estimation based on the B3 concept (half the product of compressive and tensile strength) was more accurate than those based on the B2 (the ratio of compressive strength minus tensile strength to compressive strength plus tensile strength) and B1 (the ratio of compressive strength to tensile strength) concepts.
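
    The three brittleness concepts named above reduce to simple formulas; the sketch below computes them for illustrative strength values (in MPa), which are not data from the study.

        def brittleness_indices(ucs, bts):
            """Brittleness concepts referenced in the abstract.

            ucs: uniaxial compressive strength, bts: Brazilian tensile strength
            (same units, e.g. MPa).
            """
            b1 = ucs / bts                      # strength ratio
            b2 = (ucs - bts) / (ucs + bts)      # normalised strength difference
            b3 = (ucs * bts) / 2.0              # half the strength product
            return b1, b2, b3

        # Illustrative values for a limestone sample (MPa); not data from the study.
        print(brittleness_indices(ucs=80.0, bts=6.0))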

  11. Semantic segmentation of forest stands of pure species combining airborne lidar data and very high resolution multispectral imagery

    NASA Astrophysics Data System (ADS)

    Dechesne, Clément; Mallet, Clément; Le Bris, Arnaud; Gouet-Brunet, Valérie

    2017-04-01

    Forest stands are the basic units for forest inventory and mapping. Stands are defined as large forested areas (e.g., ⩾ 2 ha) of homogeneous tree species composition and age. Their accurate delineation is usually performed by human operators through visual analysis of very high resolution (VHR) infra-red images. This task is tedious, highly time consuming, and should be automated for scalability and efficient updating purposes. In this paper, a method based on the fusion of airborne lidar data and VHR multispectral images is proposed for the automatic delineation of forest stands containing one dominant species (purity superior to 75%). This is the key preliminary task for forest land-cover database update. The multispectral images give information about the tree species whereas 3D lidar point clouds provide geometric information on the trees and allow their individual extraction. Multi-modal features are computed, both at pixel and object levels: the objects are individual trees extracted from lidar data. A supervised classification is then performed at the object level in order to coarsely discriminate the existing tree species in each area of interest. The classification results are further processed to obtain homogeneous areas with smooth borders by employing an energy minimum framework, where additional constraints are joined to form the energy function. The experimental results show that the proposed method provides very satisfactory results both in terms of stand labeling and delineation (overall accuracy ranges between 84 % and 99 %).

  12. Retinal health information and notification system (RHINO)

    NASA Astrophysics Data System (ADS)

    Dashtbozorg, Behdad; Zhang, Jiong; Abbasi-Sureshjani, Samaneh; Huang, Fan; ter Haar Romeny, Bart M.

    2017-03-01

    The retinal vasculature is the only part of the blood circulation system that can be observed non-invasively using fundus cameras. Changes in the dynamic properties of retinal blood vessels are associated with many systemic and vascular diseases, such as hypertension, coronary heart disease and diabetes. The assessment of the characteristics of the retinal vascular network provides important information for an early diagnosis and prognosis of many systemic and vascular diseases. The manual analysis of retinal vessels and measurement of quantitative biomarkers in large-scale screening programs is a tedious, time-consuming and costly task. This paper describes a reliable, automated, and efficient retinal health information and notification system (acronym RHINO) which can extract a wealth of geometric biomarkers in large volumes of fundus images. The fully automated software presented in this paper includes vessel enhancement and segmentation, artery/vein classification, optic disc, fovea, and vessel junction detection, and bifurcation/crossing discrimination. Pipelining these tools allows the assessment of several quantitative vascular biomarkers: width, curvature, bifurcation geometry features and fractal dimension. The brain-inspired algorithms outperform most of the state-of-the-art techniques. Moreover, several annotation tools are implemented in RHINO for the manual labeling of arteries and veins, marking the optic disc and fovea, and delineating vessel centerlines. The validation phase is ongoing and the software is currently being used for the analysis of retinal images from the Maastricht study (the Netherlands), which includes over 10,000 subjects (healthy and diabetic) with a broad spectrum of clinical measurements.
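
    One of the listed biomarkers, the fractal dimension of the vessel network, is commonly estimated by box counting; the sketch below shows that standard technique on a toy binary vessel mask and is not necessarily RHINO's implementation.

        import numpy as np

        def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
            """Estimate the fractal dimension of a binary mask by box counting."""
            counts = []
            for size in box_sizes:
                h, w = mask.shape
                boxes = mask[:h - h % size, :w - w % size].reshape(
                    h // size, size, w // size, size)
                counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
            # Slope of log(count) vs log(1/size) gives the dimension estimate.
            slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
            return slope

        # Toy mask: a single straight "vessel" should give a dimension near 1.
        mask = np.zeros((128, 128), dtype=bool)
        mask[64, :] = True
        print(box_counting_dimension(mask))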

  13. PRIMED: PRIMEr Database for Deleting and Tagging All Fission and Budding Yeast Genes Developed Using the Open-Source Genome Retrieval Script (GRS)

    PubMed Central

    Cummings, Michael T.; Joh, Richard I.; Motamedi, Mo

    2015-01-01

    The fission (Schizosaccharomyces pombe) and budding (Saccharomyces cerevisiae) yeasts have served as excellent models for many seminal discoveries in eukaryotic biology. In these organisms, genes are deleted or tagged easily by transforming cells with PCR-generated DNA inserts, flanked by short (50-100 bp) regions of gene homology. These PCR reactions use especially designed long primers, which, in addition to the priming sites, carry homology for gene targeting. Primer design follows a fixed method but is tedious and time-consuming, especially when done for a large number of genes. To automate this process, we developed the Python-based Genome Retrieval Script (GRS), an easily customizable open-source script for genome analysis. Using GRS, we created PRIMED, the complete PRIMEr Database for deleting and C-terminal tagging genes in the main S. pombe and five of the most commonly used S. cerevisiae strains. Because of the importance of noncoding RNAs (ncRNAs) in many biological processes, we also included the deletion primer set for these features in each genome. PRIMED is accurate and comprehensive and is provided as downloadable Excel files, removing the need for future primer design, especially for large-scale functional analyses. Furthermore, the open-source GRS can be used broadly to retrieve genome information from custom or other annotated genomes, thus providing a suitable platform for building other genomic tools by the yeast or other research communities. PMID:25643023
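
    The primer construction itself is a simple string operation: roughly 80 bp of gene-flanking homology followed by a cassette priming site. The Python sketch below illustrates it with placeholder sequences; the flanks, priming sites, and lengths are assumptions, not PRIMED output.

        def revcomp(seq):
            """Reverse complement of a DNA sequence."""
            return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

        # Hypothetical flanking sequences and cassette priming sites; real primers
        # would use genome coordinates retrieved by GRS and the priming sites of
        # the chosen marker cassette (the sequences below are placeholders).
        upstream_flank = "ATGCGTTACG" * 8        # 80 bp upstream of the ORF start
        downstream_flank = "TTAGGCATCG" * 8      # 80 bp downstream of the stop codon
        cassette_fwd = "CGGATCCCCGGGTTAATTAA"    # forward priming site (placeholder)
        cassette_rev = "GAATTCGAGCTCGTTTAAAC"    # reverse priming site (placeholder)

        # Long deletion primers: gene homology followed by the cassette priming site.
        forward_primer = upstream_flank + cassette_fwd
        reverse_primer = revcomp(downstream_flank) + cassette_rev

        print(len(forward_primer), forward_primer)
        print(len(reverse_primer), reverse_primer)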

  14. Data You May Like: A Recommender System for Research Data Discovery

    NASA Astrophysics Data System (ADS)

    Devaraju, A.; Davy, R.; Hogan, D.

    2016-12-01

    Various data portals have been developed to facilitate access to research datasets from different sources, for example the Data Publisher for Earth & Environmental Science (PANGAEA), the Registry of Research Data Repositories (re3data.org), and the National Geoscience Data Centre (NGDC). Due to data quantity and heterogeneity, finding relevant datasets on these portals may be difficult and tedious. Keyword searches based on specific metadata elements or multi-key indexes may return irrelevant results. Faceted searches may be unsatisfactory and time consuming, especially when facet values are exhaustive. We need a much more intelligent way to complement existing search mechanisms in order to enhance the user experience of data portals. We developed a recommender system that helps users to find the most relevant research datasets on CSIRO's Data Access Portal (DAP). The system is based on content-based filtering. We computed the similarity of datasets based on data attributes (e.g., descriptions, fields of research, location, contributors, and provenance) and inference from transaction logs (e.g., the relations among datasets and between queries and datasets). We improved the recommendation quality by assigning weights to the data similarities. The weight values are drawn from a survey involving data users. The recommender results for a given dataset are accessible programmatically via a web service. Taking both data attributes and user actions into account, the recommender system will make it easier for researchers to find and reuse data offered through the data portal.
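
    A minimal sketch of weighted content-based similarity over two attribute fields, assuming TF-IDF text features and illustrative field weights; the real system uses more attributes, transaction-log inference, and survey-derived weights.

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        # Hypothetical dataset records with two attribute fields; the field weights
        # stand in for the survey-derived weights described in the abstract.
        datasets = [
            {"title": "soil moisture sensor network",   "keywords": "hydrology soil"},
            {"title": "coastal wave buoy records",      "keywords": "ocean waves"},
            {"title": "soil carbon flux measurements",  "keywords": "soil carbon flux"},
        ]
        weights = {"title": 0.6, "keywords": 0.4}

        # Weighted sum of per-field cosine similarities between all dataset pairs.
        similarity = np.zeros((len(datasets), len(datasets)))
        for field, w in weights.items():
            tfidf = TfidfVectorizer().fit_transform([d[field] for d in datasets])
            similarity += w * cosine_similarity(tfidf)

        query = 0  # recommend datasets similar to the first record
        ranked = np.argsort(-similarity[query])
        print([datasets[i]["title"] for i in ranked if i != query])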

  15. Application of symbolic computations to the constitutive modeling of structural materials

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Tan, H. Q.; Dong, X.

    1990-01-01

    In applications involving elevated temperatures, the derivation of mathematical expressions (constitutive equations) describing the material behavior can be quite time-consuming, involved and error-prone. Therefore, intelligent application of symbolic systems to facilitate this tedious process can be of significant benefit. Presented here is a problem-oriented, self-contained symbolic expert system, named SDICE, which is capable of efficiently deriving potential-based constitutive models in analytical form. This package, running under DOE MACSYMA, has the following features: (1) potential differentiation (chain rule); (2) tensor computations (utilizing index notation), including both algebraic and calculus operations; (3) efficient solution of sparse systems of equations; (4) automatic expression substitution and simplification; (5) back substitution of invariant and tensorial relations; (6) the ability to form the Jacobian and Hessian matrix; and (7) a relational data base. Limited aspects of invariant theory were also incorporated into SDICE due to the utilization of potentials as a starting point and the desire for these potentials to be frame invariant (objective). The uniqueness of SDICE resides in its ability to manipulate expressions in a general yet pre-defined order and simplify expressions so as to limit expression growth. Results are displayed, when applicable, utilizing index notation. SDICE was designed to aid and complement the human constitutive model developer. A number of examples are utilized to illustrate the various features contained within SDICE. It is expected that this symbolic package can and will provide a significant incentive to the development of new constitutive theories.
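
    SDICE itself runs under DOE MACSYMA, but the central step, deriving constitutive relations by differentiating a potential, can be sketched in Python with SymPy; the potential below is a hypothetical form chosen only to show the differentiation and Hessian machinery.

        import sympy as sp

        # Illustrative scalar potential in terms of two invariants of the strain
        # tensor; the form is hypothetical, not one of SDICE's constitutive models.
        I1, J2, mu, lam = sp.symbols("I1 J2 mu lam", positive=True)
        potential = lam / 2 * I1**2 + 2 * mu * J2

        # "Stress-like" conjugates follow by differentiating the potential with
        # respect to the invariants (the chain rule back to strain is omitted).
        dPhi_dI1 = sp.diff(potential, I1)
        dPhi_dJ2 = sp.diff(potential, J2)
        hessian = sp.hessian(potential, (I1, J2))

        print(dPhi_dI1, dPhi_dJ2, hessian)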

  16. Automated segmentation of blood-flow regions in large thoracic arteries using 3D-cine PC-MRI measurements.

    PubMed

    van Pelt, Roy; Nguyen, Huy; ter Haar Romeny, Bart; Vilanova, Anna

    2012-03-01

    Quantitative analysis of vascular blood flow, acquired by phase-contrast MRI, requires accurate segmentation of the vessel lumen. In clinical practice, 2D-cine velocity-encoded slices are inspected, and the lumen is segmented manually. However, segmentation of time-resolved volumetric blood-flow measurements is a tedious and time-consuming task requiring automation. Automated segmentation of large thoracic arteries, based solely on the 3D-cine phase-contrast MRI (PC-MRI) blood-flow data, was done. An active surface model, which is fast and topologically stable, was used. The active surface model requires an initial surface, approximating the desired segmentation. A method to generate this surface was developed based on a voxel-wise temporal maximum of blood-flow velocities. The active surface model balances forces, based on the surface structure and image features derived from the blood-flow data. The segmentation results were validated using volunteer studies, including time-resolved 3D and 2D blood-flow data. The segmented surface was intersected with a velocity-encoded PC-MRI slice, resulting in a cross-sectional contour of the lumen. These cross-sections were compared to reference contours that were manually delineated on high-resolution 2D-cine slices. The automated approach closely approximates the manual blood-flow segmentations, with error distances on the order of the voxel size. The initial surface provides a close approximation of the desired luminal geometry. This improves the convergence time of the active surface and facilitates parametrization. An active surface approach for vessel lumen segmentation was developed, suitable for quantitative analysis of 3D-cine PC-MRI blood-flow data. As opposed to prior thresholding and level-set approaches, the active surface model is topologically stable. A method to generate an initial approximate surface was developed, and various features that influence the segmentation model were evaluated. The active surface segmentation results were shown to closely approximate manual segmentations.
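
    The initial-surface step can be sketched as follows: take the voxel-wise temporal maximum of the velocity magnitude over the cardiac cycle and threshold it to obtain a rough lumen mask, from which an initial surface could be extracted. The array shapes, velocities, and threshold below are illustrative assumptions, not the paper's parameters.

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical 4D PC-MRI data: (time, z, y, x, velocity component), cm/s.
        velocity = rng.normal(0, 2.0, size=(20, 16, 32, 32, 3))   # background noise
        velocity[:, 8, 10:22, 10:22, 0] += 80.0                   # a "vessel" with flow

        speed = np.linalg.norm(velocity, axis=-1)   # velocity magnitude per voxel/time
        temporal_max = speed.max(axis=0)            # voxel-wise maximum over the cycle

        # Thresholding the temporal maximum yields a rough lumen mask that would
        # initialize the active surface model.
        lumen_mask = temporal_max > 40.0
        print(lumen_mask.sum(), "voxels in the initial lumen estimate")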

  17. Automated toxicological screening reports of modified Agilent MSD Chemstation combined with Microsoft Visual Basic application programs.

    PubMed

    Choe, Sanggil; Kim, Suncheun; Choi, Hyeyoung; Choi, Hwakyoung; Chung, Heesun; Hwang, Bangyeon

    2010-06-15

    Agilent GC-MS MSD Chemstation offers automated library search reports for toxicological screening using the total ion chromatogram (TIC) and mass spectra in normal mode. Numerous peaks appear in the chromatogram of a biological specimen such as blood or urine, and often large migrating peaks obscure small target peaks; in addition, target peaks of low abundance regularly give wrong library search results or low matching scores. As a result, the retention time and mass spectrum of all the peaks in the chromatogram have to be checked to see if they are relevant. These repeated actions are very tedious and time-consuming for toxicologists. The MSD Chemstation software operates using a number of macro files which give commands and instructions on how to work on and extract data from the chromatogram and spectra. These macro files are developed with the software's own compiler. All the original macro files can be modified and new macro files can be added to the original software by users. To get more accurate results with a more convenient method and to save time for data analysis, we developed new macro files for report generation and inserted new menus in the Enhanced Data Analysis program. Toxicological screening reports generated by these new macro files are in text mode or graphic mode, and these reports can be generated with three different automated subtraction options. Text reports have a Brief mode and a Full mode, and graphic reports have the option with or without mass spectrum mode. Matched mass spectra and matching scores for detected compounds are printed in reports by modified library searching modules. We have also developed an independent application program named DrugMan. This program manages the drug groups, lists and parameters that are in use in MSD Chemstation. The incorporation of DrugMan with the modified macro modules provides a powerful tool for toxicological screening and saves a lot of valuable time on toxicological work. (c) 2010 Elsevier Ireland Ltd. All rights reserved.

  18. Learning machines and sleeping brains: Automatic sleep stage classification using decision-tree multi-class support vector machines.

    PubMed

    Lajnef, Tarek; Chaibi, Sahbi; Ruby, Perrine; Aguera, Pierre-Emmanuel; Eichenlaub, Jean-Baptiste; Samet, Mounir; Kachouri, Abdennaceur; Jerbi, Karim

    2015-07-30

    Sleep staging is a critical step in a range of electrophysiological signal processing pipelines used in clinical routine as well as in sleep research. Although the results currently achievable with automatic sleep staging methods are promising, there is need for improvement, especially given the time-consuming and tedious nature of visual sleep scoring. Here we propose a sleep staging framework that consists of a multi-class support vector machine (SVM) classification based on a decision tree approach. The performance of the method was evaluated using polysomnographic data from 15 subjects (electroencephalogram (EEG), electrooculogram (EOG) and electromyogram (EMG) recordings). The decision tree, or dendrogram, was obtained using a hierarchical clustering technique and a wide range of time and frequency-domain features were extracted. Feature selection was carried out using forward sequential selection and classification was evaluated using k-fold cross-validation. The dendrogram-based SVM (DSVM) achieved mean specificity, sensitivity and overall accuracy of 0.92, 0.74 and 0.88 respectively, compared to expert visual scoring. Restricting DSVM classification to data where both experts' scoring was consistent (76.73% of the data) led to a mean specificity, sensitivity and overall accuracy of 0.94, 0.82 and 0.92 respectively. The DSVM framework outperforms classification with more standard multi-class "one-against-all" SVM and linear-discriminant analysis. The promising results of the proposed methodology suggest that it may be a valuable alternative to existing automatic methods and that it could accelerate visual scoring by providing a robust starting hypnogram that can be further fine-tuned by expert inspection. Copyright © 2015 Elsevier B.V. All rights reserved.
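
    The decision-tree arrangement of binary SVMs can be sketched with scikit-learn as below; the two-level split and the toy two-feature data are illustrative, whereas the paper derives its tree from hierarchical clustering of real polysomnographic features.

        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)

        # Hypothetical 2-feature epochs for four stages: 0=wake, 1=REM, 2=N2, 3=N3.
        X = np.vstack([rng.normal(loc, 0.4, (50, 2))
                       for loc in ([0, 0], [2, 0], [0, 2], [2, 2])])
        y = np.repeat([0, 1, 2, 3], 50)

        # Level 1: wake/REM vs N2/N3; level 2: one binary SVM inside each branch.
        top = SVC(kernel="rbf").fit(X, np.isin(y, [2, 3]))
        left = SVC(kernel="rbf").fit(X[np.isin(y, [0, 1])], y[np.isin(y, [0, 1])])
        right = SVC(kernel="rbf").fit(X[np.isin(y, [2, 3])], y[np.isin(y, [2, 3])])

        def predict(x):
            """Route an epoch down the decision tree of binary SVMs."""
            x = np.atleast_2d(x)
            branch = top.predict(x)[0]
            return (right if branch else left).predict(x)[0]

        print(predict([0.1, 1.9]))   # expected stage 2 under this toy layout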

  19. Methylsorb: a simple method for quantifying DNA methylation using DNA-gold affinity interactions.

    PubMed

    Sina, Abu Ali Ibn; Carrascosa, Laura G; Palanisamy, Ramkumar; Rauf, Sakandar; Shiddiky, Muhammad J A; Trau, Matt

    2014-10-21

    The analysis of DNA methylation is becoming increasingly important both in the clinic and also as a research tool to unravel key epigenetic molecular mechanisms in biology. Current methodologies for the quantification of regional DNA methylation (i.e., the average methylation over a region of DNA in the genome) rely largely on comprehensive DNA sequencing methodologies, which tend to be expensive, tedious, and time-consuming for many applications. Herein, we report an alternative DNA methylation detection method referred to as "Methylsorb", which is based on the inherent affinity of DNA bases to the gold surface (i.e., the trend of the affinity interactions is adenine > cytosine ≥ guanine > thymine).1 Since the degree of gold-DNA affinity interaction is highly sequence-dependent, it provides a new capability to detect DNA methylation by simply monitoring the relative adsorption of bisulfite-treated DNA sequences onto a gold chip. Because the selective physical adsorption of DNA fragments to gold enables a direct read-out of regional DNA methylation, the current requirement for DNA sequencing is obviated. To demonstrate the utility of this method, we present data on the regional methylation status of two CpG clusters located in the EN1 and MIR200B genes in MCF7 and MDA-MB-231 cells. The methylation status of these regions was obtained from the change in relative mass on the gold surface with respect to the relative adsorption of an unmethylated DNA source, and this was detected using surface plasmon resonance (SPR) in a label-free and real-time manner. We anticipate that the simplicity of this method, combined with the high level of accuracy for identifying the methylation status of cytosines in DNA, could find broad application in biology and diagnostics.

  20. Multi-analyte method development for analysis of brominated flame retardants (BFRs) and PBDE metabolites in human serum.

    PubMed

    Lu, Dasheng; Jin, Yu'e; Feng, Chao; Wang, Dongli; Lin, Yuanjie; Qiu, Xinlei; Xu, Qian; Wen, Yimin; She, Jianwen; Wang, Guoquan; Zhou, Zhijun

    2017-09-01

    Commonly, analytical methods measuring brominated flame retardants (BFRs) of different chemical polarities in human serum are labor-intensive and tedious. Our study used acidified diatomaceous earth as the solid-phase extraction (SPE) adsorbent and defatting material to simultaneously determine the most abundant BFRs and their metabolites of different polarities in human serum samples. The analytes include three types of commercial BFRs, tetrabromobisphenol A (TBBPA), hexabromocyclododecane (HBCD) isomers, and polybrominated diphenyl ethers (PBDEs), and the dominant hydroxylated BDE (OH-PBDE) and methoxylated BDE (MeO-PBDE) metabolites of PBDEs. The sample eluents were sequentially analyzed for PBDEs and MeO-BDEs by online gel permeation chromatography/gas chromatography-electron capture-negative ionization mass spectrometry (online GPC GC-ECNI-MS) and for TBBPA, HBCD, and OH-BDEs by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Method recoveries were 67-134% with a relative standard deviation (RSD) of less than 20%. Method detection limits (MDLs) were 0.30-4.20 pg/mL fresh weight (f.w.) for all analytes, except for BDE-209 at 16 pg/mL f.w. The methodology was also applied in a pilot study, which analyzed ten real samples from healthy donors in China; the majority of target analytes were detected with a detection rate of more than 80%. To our knowledge, this is the first time that most types of BFRs have been effectively determined in a single aliquot of human serum. This new analytical method is more specific, sensitive, accurate, and time-saving for routine biomonitoring of these BFRs and for integrated assessment of the health risk of BFR exposure.

  1. Reduced TiO2-Graphene Oxide Heterostructure As Broad Spectrum-Driven Efficient Water-Splitting Photocatalysts.

    PubMed

    Li, Lihua; Yu, Lili; Lin, Zhaoyong; Yang, Guowei

    2016-04-06

    The reduced TiO2-graphene oxide heterostructure, as an alternative broad spectrum-driven efficient water-splitting photocatalyst, has become a very interesting topic; however, its synthesis has many drawbacks, e.g., tedious experimental steps, long processing times, small-scale production, and the requirement for various additives. For example, hydrazine hydrate is widely used as a reductant for graphene oxide, but it is highly toxic and can easily cause secondary pollution. To address these issues, we report herein the synthesis of the reduced TiO2-graphene oxide heterostructure by a facile, chemical-reduction-agent-free, one-step laser ablation in liquid (LAL) method, which extends the optical response range from the ultraviolet to the visible and combines TiO(2-x) (reduced TiO2) nanoparticles with graphene oxide to promote charge conduction. A 30.64% Ti(3+) content in the reduced TiO2 nanoparticles induces the electronic reconstruction of TiO2, which results in a 0.87 eV decrease of the band gap for visible light absorption. The TiO(2-x)-graphene oxide heterostructure achieved a drastically increased photocatalytic H2 production rate, up to 23 times that of the blank experiment. Furthermore, a maximum H2 production rate of 16 mmol/h/g was measured using Pt as a cocatalyst under simulated sunlight irradiation (AM 1.5G, 135 mW/cm(2)); the quantum efficiencies were measured to be 5.15% for wavelength λ = 365 ± 10 nm and 1.84% for λ = 405 ± 10 nm, and the overall solar energy conversion efficiency was measured to be 14.3%. These findings provide new insights into the broad applicability of this methodology for accessing fascinating photocatalysts.

  2. Automated reticle inspection data analysis for wafer fabs

    NASA Astrophysics Data System (ADS)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2008-10-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Oftentimes, Process Engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root cause analysis effort. These delays waste valuable resources that could be spent working on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect™ data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.

  3. Automated reticle inspection data analysis for wafer fabs

    NASA Astrophysics Data System (ADS)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Often times, Process Engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur resulting in an incorrect root cause analysis effort. These delays waste valuable resources that could be spent working on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(R) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.

  4. Automated reticle inspection data analysis for wafer fabs

    NASA Astrophysics Data System (ADS)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-03-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Often times, Process Engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur resulting in an incorrect root cause analysis effort. These delays waste valuable resources that could be spent working on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(TM) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.

  5. Automated tissue segmentation of MR brain images in the presence of white matter lesions.

    PubMed

    Valverde, Sergi; Oliver, Arnau; Roura, Eloy; González-Villà, Sandra; Pareto, Deborah; Vilanova, Joan C; Ramió-Torrentà, Lluís; Rovira, Àlex; Lladó, Xavier

    2017-01-01

    Over the last few years, the increasing interest in brain tissue volume measurements on clinical settings has led to the development of a wide number of automated tissue segmentation methods. However, white matter lesions are known to reduce the performance of automated tissue segmentation methods, which requires manual annotation of the lesions and refilling them before segmentation, which is tedious and time-consuming. Here, we propose a new, fully automated T1-w/FLAIR tissue segmentation approach designed to deal with images in the presence of WM lesions. This approach integrates a robust partial volume tissue segmentation with WM outlier rejection and filling, combining intensity and probabilistic and morphological prior maps. We evaluate the performance of this method on the MRBrainS13 tissue segmentation challenge database, which contains images with vascular WM lesions, and also on a set of Multiple Sclerosis (MS) patient images. On both databases, we validate the performance of our method with other state-of-the-art techniques. On the MRBrainS13 data, the presented approach was at the time of submission the best ranked unsupervised intensity model method of the challenge (7th position) and clearly outperformed the other unsupervised pipelines such as FAST and SPM12. On MS data, the differences in tissue segmentation between the images segmented with our method and the same images where manual expert annotations were used to refill lesions on T1-w images before segmentation were lower or similar to the best state-of-the-art pipeline incorporating automated lesion segmentation and filling. Our results show that the proposed pipeline achieved very competitive results on both vascular and MS lesions. A public version of this approach is available to download for the neuro-imaging community. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Automatic Evidence Retrieval for Systematic Reviews

    PubMed Central

    Choong, Miew Keen; Galgani, Filippo; Dunn, Adam G

    2014-01-01

    Background Snowballing involves recursively pursuing relevant references cited in the retrieved literature and adding them to the search results. Snowballing is an alternative approach to discover additional evidence that was not retrieved through conventional search. Snowballing’s effectiveness makes it best practice in systematic reviews despite being time-consuming and tedious. Objective Our goal was to evaluate an automatic method for citation snowballing’s capacity to identify and retrieve the full text and/or abstracts of cited articles. Methods Using 20 review articles that contained 949 citations to journal or conference articles, we manually searched Microsoft Academic Search (MAS) and identified 78.0% (740/949) of the cited articles that were present in the database. We compared the performance of the automatic citation snowballing method against the results of this manual search, measuring precision, recall, and F1 score. Results The automatic method was able to correctly identify 633 (as proportion of included citations: recall=66.7%, F1 score=79.3%; as proportion of citations in MAS: recall=85.5%, F1 score=91.2%) of citations with high precision (97.7%), and retrieved the full text or abstract for 490 (recall=82.9%, precision=92.1%, F1 score=87.3%) of the 633 correctly retrieved citations. Conclusions The proposed method for automatic citation snowballing is accurate and is capable of obtaining the full texts or abstracts for a substantial proportion of the scholarly citations in review articles. By automating the process of citation snowballing, it may be possible to reduce the time and effort of common evidence surveillance tasks such as keeping trial registries up to date and conducting systematic reviews. PMID:25274020
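
    As a quick arithmetic check of the evaluation above, the following minimal sketch (Python) recomputes recall and F1 from the counts stated in the abstract (949 citations, 740 present in MAS, 633 correctly identified) together with the reported precision of 97.7%; the script is illustrative only and uses no data beyond those figures.

        def f1(precision, recall):
            """Harmonic mean of precision and recall."""
            return 2 * precision * recall / (precision + recall)

        cited_total = 949    # citations in the 20 review articles
        in_mas = 740         # cited articles actually present in MAS
        correct = 633        # citations correctly identified by the automatic method
        precision = 0.977    # reported precision

        recall_all = correct / cited_total   # proportion of all included citations (~66.7%)
        recall_mas = correct / in_mas        # proportion of citations present in MAS (~85.5%)

        print(f"recall (all citations): {recall_all:.1%}, F1: {f1(precision, recall_all):.1%}")
        print(f"recall (in MAS): {recall_mas:.1%}, F1: {f1(precision, recall_mas):.1%}")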

  7. High-Throughput Genetic Analysis and Combinatorial Chiral Separations Based on Capillary Electrophoresis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhong, Wenwan

    2003-01-01

    Capillary electrophoresis (CE) offers many advantages over conventional analytical methods, such as speed, simplicity, high resolution, low cost, and small sample consumption, especially for the separation of enantiomers. However, chiral method development can still be time-consuming and tedious. A comprehensive enantioseparation protocol employing neutral and sulfated cyclodextrins as chiral selectors for common basic, neutral, and acidic compounds was designed for a 96-capillary array system. By using only four judiciously chosen separation buffers, successful enantioseparations were achieved for 49 out of 54 test compounds spanning a large variety of pK values and structures. Therefore, unknown compounds can be screened in this manner to identify optimal enantioselective conditions in just one run. In addition to superior separation efficiency for small molecules, CE is also the most powerful technique for DNA separations. Using the same multiplexed capillary system with UV absorption detection, the sequence of a short DNA template can be acquired without any dye labels. Two internal standards were utilized to adjust for migration time variations among capillaries, so that the four electropherograms for the A, T, C, G Sanger reactions can be aligned and base calling can be completed with a high level of confidence. The CE separation of DNA can also be applied to study differential gene expression. Combined with pattern recognition techniques, small variations among electropherograms obtained by the separation of cDNA fragments produced from the total RNA samples of different human tissues can be revealed. These variations reflect the differences in total RNA expression among tissues. Thus, this CE-based approach can serve as an alternative to DNA array techniques in gene expression analysis.

  8. Dual-force aggregation of magnetic particles enhances label-free quantification of DNA at the sub-single cell level.

    PubMed

    Nelson, Daniel A; Strachan, Briony C; Sloane, Hillary S; Li, Jingyi; Landers, James P

    2014-03-28

    We recently reported the 'pinwheel effect' as the foundation for a DNA assay based on a DNA concentration-dependent aggregation of silica-coated magnetic beads in a rotating magnetic field (RMF). Using a rotating magnet that generated a 5 cm magnetic field that impinged on a circular array of 5mm microwells, aggregation was found to only be effective in a single well at the center of the field. As a result, when multiple samples needed to be analyzed, the single-plex (single well) analysis was tedious, time-consuming and labor-intensive, as each well needed to be exposed to the center of the RMF in a serial manner for consistent well-to-well aggregation. For more effective multiplexing (simultaneous aggregation in 12 wells), we used a circular array of microwells and incorporated 'agitation' as a second force that worked in concert with the RMF to provide effective multiplexed aggregation-based DNA quantitation. The dual-force aggregation (DFA) approach allows for effective simultaneous aggregation in multiple wells (12 demonstrated) of the multi-well microdevice, allowing for 12 samples to be interrogated for DNA content in 140 s, providing a ∼35-fold improvement in time compared to single-plex approach (80 min) and ∼4-fold improvement over conventional fluorospectrometric methods. Furthermore, the increased interaction between DNA and beads provided by DFA improved the limit of detection to 250 fg μL(-1). The correlation between the DFA results and those from a fluorospectrometer, demonstrate DFA as an inexpensive and rapid alternative to more conventional methods (fluorescent and spectrophotometric). Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Eucalyptus hairy roots, a fast, efficient and versatile tool to explore function and expression of genes involved in wood formation.

    PubMed

    Plasencia, Anna; Soler, Marçal; Dupas, Annabelle; Ladouce, Nathalie; Silva-Martins, Guilherme; Martinez, Yves; Lapierre, Catherine; Franche, Claudine; Truchet, Isabelle; Grima-Pettenati, Jacqueline

    2016-06-01

    Eucalyptus are of tremendous economic importance being the most planted hardwoods worldwide for pulp and paper, timber and bioenergy. The recent release of the Eucalyptus grandis genome sequence pointed out many new candidate genes potentially involved in secondary growth, wood formation or lineage-specific biosynthetic pathways. Their functional characterization is, however, hindered by the tedious, time-consuming and inefficient transformation systems available hitherto for eucalypts. To overcome this limitation, we developed a fast, reliable and efficient protocol to obtain and easily detect co-transformed E. grandis hairy roots using fluorescent markers, with an average efficiency of 62%. We set up conditions both to cultivate excised roots in vitro and to harden composite plants and verified that hairy root morphology and vascular system anatomy were similar to wild-type ones. We further demonstrated that co-transformed hairy roots are suitable for medium-throughput functional studies enabling, for instance, protein subcellular localization, gene expression patterns through RT-qPCR and promoter expression, as well as the modulation of endogenous gene expression. Down-regulation of the Eucalyptus cinnamoyl-CoA reductase1 (EgCCR1) gene, encoding a key enzyme in lignin biosynthesis, led to transgenic roots with reduced lignin levels and thinner cell walls. This gene was used as a proof of concept to demonstrate that the function of genes involved in secondary cell wall biosynthesis and wood formation can be elucidated in transgenic hairy roots using histochemical, transcriptomic and biochemical approaches. The method described here is timely because it will accelerate gene mining of the genome for both basic research and industry purposes. © 2015 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.

  10. Evaluating the Power Consumption of Wireless Sensor Network Applications Using Models

    PubMed Central

    Dâmaso, Antônio; Freitas, Davi; Rosa, Nelson; Silva, Bruno; Maciel, Paulo

    2013-01-01

    Power consumption is the main concern in developing Wireless Sensor Network (WSN) applications. Consequently, several strategies have been proposed for investigating the power consumption of this kind of application. These strategies can help to predict the WSN lifetime, provide recommendations to application developers and may optimize the energy consumed by the WSN applications. While measurement is a known and precise strategy for power consumption evaluation, it is very costly, tedious and may be unfeasible considering the (usual) large number of WSN nodes. Furthermore, due to the inherent dynamism of WSNs, the instrumentation required by measurement techniques makes difficult their use in several different scenarios. In this context, this paper presents an approach for evaluating the power consumption of WSN applications by using simulation models along with a set of tools to automate the proposed approach. Starting from a programming language code, we automatically generate consumption models used to predict the power consumption of WSN applications. In order to evaluate the proposed approach, we compare the results obtained by using the generated models against ones obtained by measurement. PMID:23486217

  11. Evaluating the power consumption of wireless sensor network applications using models.

    PubMed

    Dâmaso, Antônio; Freitas, Davi; Rosa, Nelson; Silva, Bruno; Maciel, Paulo

    2013-03-13

    Power consumption is the main concern in developing Wireless Sensor Network (WSN) applications. Consequently, several strategies have been proposed for investigating the power consumption of this kind of application. These strategies can help to predict the WSN lifetime, provide recommendations to application developers and may optimize the energy consumed by the WSN applications. While measurement is a known and precise strategy for power consumption evaluation, it is very costly, tedious and may be unfeasible considering the (usual) large number of WSN nodes. Furthermore, due to the inherent dynamism of WSNs, the instrumentation required by measurement techniques makes difficult their use in several different scenarios. In this context, this paper presents an approach for evaluating the power consumption of WSN applications by using simulation models along with a set of tools to automate the proposed approach. Starting from a programming language code, we automatically generate consumption models used to predict the power consumption of WSN applications. In order to evaluate the proposed approach, we compare the results obtained by using the generated models against ones obtained by measurement.

  12. HDDTOOLS: an R package serving Hydrological Data Discovery Tools

    NASA Astrophysics Data System (ADS)

    Vitolo, C.; Buytaert, W.

    2014-12-01

    Many governmental bodies and institutions are currently committed to publishing open data as part of a trend toward increasing transparency, through which a wide variety of information produced at public expense is becoming open and freely available to improve public involvement in decision and policy making. Discovery, access and retrieval of information is, however, not always a simple task. Especially when programmatic access to data resources is not provided, downloading the metadata catalogue, selecting the information needed, requesting datasets, decompressing, converting, manually filtering and parsing them can become rather tedious. The R package "hddtools" is an open source project designed to make all of the above operations more efficient by means of re-usable functions. The package facilitates non-programmatic access to various online data sources such as the Global Runoff Data Centre, NASA's TRMM mission and the Data60UK database, amongst others. This package complements R's growing functionality in environmental web technologies, bridging the gap between data providers and data consumers, and is designed to be the starting building block of scientific workflows for linking data and models in a seamless fashion.

  13. Boring but Important: A Self-Transcendent Purpose for Learning Fosters Academic Self-Regulation

    PubMed Central

    Yeager, David S.; Henderson, Marlone D.; D’Mello, Sidney; Paunesku, David; Walton, Gregory M.; Spitzer, Brian J.; Duckworth, Angela Lee

    2015-01-01

    Many important learning tasks feel uninteresting and tedious to learners. This research proposed that promoting a prosocial, self-transcendent purpose could improve academic self-regulation on such tasks. This proposal was supported in four studies with over 2,000 adolescents and young adults. Study 1 documented a correlation between a self-transcendent purpose for learning and self-reported trait measures of academic self-regulation. Those with more of a purpose for learning also persisted longer on a boring task rather than giving in to a tempting alternative, and, many months later, were less likely to drop out of college. Study 2 addressed causality. It showed that a brief, one-time psychological intervention promoting a self-transcendent purpose for learning could improve high school science and math GPA over several months. Studies 3 and 4 were short-term experiments that explored possible mechanisms. They showed that the self-transcendent purpose manipulation could increase deeper learning behavior on tedious test review materials (Study 3), and sustain self-regulation over the course of an increasingly-boring task (Study 4). More self-oriented motives for learning—such as the desire to have an interesting or enjoyable career—did not, on their own, consistently produce these benefits (Studies 1 and 4). PMID:25222648

  14. Boring but important: a self-transcendent purpose for learning fosters academic self-regulation.

    PubMed

    Yeager, David S; Henderson, Marlone D; Paunesku, David; Walton, Gregory M; D'Mello, Sidney; Spitzer, Brian J; Duckworth, Angela Lee

    2014-10-01

    Many important learning tasks feel uninteresting and tedious to learners. This research proposed that promoting a prosocial, self-transcendent purpose could improve academic self-regulation on such tasks. This proposal was supported in 4 studies with over 2,000 adolescents and young adults. Study 1 documented a correlation between a self-transcendent purpose for learning and self-reported trait measures of academic self-regulation. Those with more of a purpose for learning also persisted longer on a boring task rather than giving in to a tempting alternative and, many months later, were less likely to drop out of college. Study 2 addressed causality. It showed that a brief, one-time psychological intervention promoting a self-transcendent purpose for learning could improve high school science and math grade point average (GPA) over several months. Studies 3 and 4 were short-term experiments that explored possible mechanisms. They showed that the self-transcendent purpose manipulation could increase deeper learning behavior on tedious test review materials (Study 3), and sustain self-regulation over the course of an increasingly boring task (Study 4). More self-oriented motives for learning--such as the desire to have an interesting or enjoyable career--did not, on their own, consistently produce these benefits (Studies 1 and 4). 2014 APA, all rights reserved

  15. Automated plasmodia recognition in microscopic images for diagnosis of malaria using convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Krappe, Sebastian; Benz, Michaela; Gryanik, Alexander; Tannich, Egbert; Wegner, Christine; Stamminger, Marc; Wittenberg, Thomas; Münzenmayer, Chrisitan

    2017-03-01

    Malaria is one of the world's most common and serious tropical diseases, caused by parasites of the genus plasmodia that are transmitted by Anopheles mosquitoes. Various parts of Asia and Latin America are affected but highest malaria incidence is found in Sub-Saharan Africa. Standard diagnosis of malaria comprises microscopic detection of parasites in stained thick and thin blood films. As the process of slide reading under the microscope is an error-prone and tedious issue we are developing computer-assisted microscopy systems to support detection and diagnosis of malaria. In this paper we focus on a deep learning (DL) approach for the detection of plasmodia and the evaluation of the proposed approach in comparison with two reference approaches. The proposed classification schemes have been evaluated with more than 180,000 automatically detected and manually classified plasmodia candidate objects from so-called thick smears. Automated solutions for the morphological analysis of malaria blood films could apply such a classifier to detect plasmodia in the highly complex image data of thick smears and thereby shortening the examination time. With such a system diagnosis of malaria infections should become a less tedious, more reliable and reproducible and thus a more objective process. Better quality assurance, improved documentation and global data availability are additional benefits.

  16. Drones--ethical considerations and medical implications.

    PubMed

    Pepper, Tom

    2012-01-01

    Drones enhance military capability and form a potent element of force protection, allowing humans to be removed from hazardous environments and tedious jobs. However, there are moral, legal, and political dangers associated with their use. Although a time may come when it is possible to develop a drone that is able to autonomously and ethically engage a legitimate target with greater reliability than a human, until then military drones demand a crawl-walk-run development methodology, consent by military personnel for weapon use, and continued debate about the complex issues surrounding their deployment.

  17. Robotic Processing Of Rocket-Engine Nozzles

    NASA Technical Reports Server (NTRS)

    Gilbert, Jeffrey L.; Maslakowski, John E.; Gutow, David A.; Deily, David C.

    1994-01-01

    Automated manufacturing cell containing computer-controlled robotic processing system developed to implement some important related steps in fabrication of rocket-engine nozzles. Performs several tedious and repetitive fabrication, measurement, adjustment, and inspection processes and subprocesses now performed manually. Offers advantages of reduced processing time, greater consistency, excellent collection of data, objective inspections, greater productivity, and simplified fixturing. Also affords flexibility: by making suitable changes in hardware and software, possible to modify process and subprocesses. Flexibility makes work cell adaptable to fabrication of heat exchangers and other items structured similarly to rocket nozzles.

  18. Machining Chatter Analysis for High Speed Milling Operations

    NASA Astrophysics Data System (ADS)

    Sekar, M.; Kantharaj, I.; Amit Siddhappa, Savale

    2017-10-01

    Chatter in high speed milling is characterized by time delay differential equations (DDE). Since closed form solution exists only for simple cases, the governing non-linear DDEs of chatter problems are solved by various numerical methods. Custom codes to solve DDEs are tedious to build, implement and not error free and robust. On the other hand, software packages provide solution to DDEs, however they are not straight forward to implement. In this paper an easy way to solve DDE of chatter in milling is proposed and implemented with MATLAB. Time domain solution permits the study and model of non-linear effects of chatter vibration with ease. Time domain results are presented for various stable and unstable conditions of cut and compared with stability lobe diagrams.
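
    Since the point above is that a time-domain solution becomes straightforward once a DDE integrator is at hand, the following minimal sketch (Python, standing in for the MATLAB workflow described) integrates a generic single-degree-of-freedom delayed oscillator, x'' + 2*zeta*w*x' + w**2*x = k*(x(t - tau) - x(t)), with a fixed-step explicit Euler scheme and a history buffer; the equation form and all parameter values are illustrative placeholders, not taken from the paper.

        import numpy as np

        # Illustrative fixed-step integration of a delayed oscillator,
        #   x'' + 2*zeta*w*x' + w**2*x = k*(x(t - tau) - x(t)),
        # a generic stand-in for regenerative-chatter DDE models (placeholder parameters).
        w, zeta, k, tau = 2 * np.pi * 100.0, 0.03, 1.0e5, 1.0 / 300.0
        dt = tau / 200.0                    # step chosen so the delay spans an integer number of steps
        lag = int(round(tau / dt))
        n_steps = 20000

        x = np.zeros(n_steps)               # displacement history (zero initial history)
        v = 0.0
        x[lag] = 1e-6                        # small perturbation at the end of the history interval

        for i in range(lag, n_steps - 1):
            x_delay = x[i - lag]             # delayed displacement x(t - tau)
            a = k * (x_delay - x[i]) - 2 * zeta * w * v - w**2 * x[i]
            v += a * dt                      # explicit Euler update of velocity
            x[i + 1] = x[i] + v * dt         # explicit Euler update of displacement

        growing = abs(x[-1]) > abs(x[lag])   # crude stability indicator for this parameter set
        print("vibration grows (chatter-like)" if growing else "vibration decays (stable cut)")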

  19. Beyond space and time: advanced selection for seismological data

    NASA Astrophysics Data System (ADS)

    Trabant, C. M.; Van Fossen, M.; Ahern, T. K.; Casey, R. E.; Weertman, B.; Sharer, G.; Benson, R. B.

    2017-12-01

    Separating the available raw data from that useful for any given study is often a tedious step in a research project, particularly for first-order data quality problems such as broken sensors, incorrect response information, and non-continuous time series. With the ever increasing amounts of data available to researchers, this chore becomes more and more time consuming. To assist users in this pre-processing of data, the IRIS Data Management Center (DMC) has created a system called Research Ready Data Sets (RRDS). The RRDS system allows researchers to apply filters that constrain their data request using criteria related to signal quality, response correctness, and high resolution data availability. In addition to the traditional selection methods of stations at a geographic location for given time spans, RRDS will provide enhanced criteria for data selection based on many of the measurements available in the DMC's MUSTANG quality control system. This means that data may be selected based on background noise (tolerance relative to high and low noise Earth models), signal-to-noise ratio for earthquake arrivals, signal RMS, instrument response corrected signal correlation with Earth tides, time tear (gaps/overlaps) counts, timing quality (when reported in the raw data by the datalogger) and more. The new RRDS system is available as a web service designed to operate as a request filter. A request is submitted containing the traditional station and time constraints as well as data quality constraints. The request is then filtered and a report is returned that indicates 1) the request that would subsequently be submitted to a data access service, 2) a record of the quality criteria specified and 3) a record of the data rejected based on those criteria, including the relevant values. This service can be used to either filter a request prior to requesting the actual data or to explore which data match a set of enhanced criteria without downloading the data. We are optimistic this capability will reduce the initial data culling steps most researchers go through. Additionally, use of this service should reduce the amount of data transmitted from the DMC, easing the workload for our finite shared resources.
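
    A minimal sketch (Python) of the kind of screening such a request filter performs, checking candidate channels against quality-control thresholds before any waveform data are requested; the channel names, metric names and threshold values below are illustrative assumptions, not the actual RRDS or MUSTANG parameters.

        # Hypothetical pre-request filtering of channels by quality-control metrics,
        # in the spirit of the RRDS request-filter service described above.
        # Field names and thresholds are illustrative assumptions only.

        candidate_channels = [
            {"id": "IU.ANMO.00.BHZ", "percent_availability": 99.2, "num_gaps": 1,  "sample_rms": 310.0},
            {"id": "IU.COLA.00.BHZ", "percent_availability": 62.5, "num_gaps": 40, "sample_rms": 295.0},
            {"id": "II.PFO.10.BHZ",  "percent_availability": 98.7, "num_gaps": 0,  "sample_rms": 0.0},
        ]

        criteria = {
            "percent_availability": lambda v: v >= 95.0,   # nearly continuous time series
            "num_gaps":             lambda v: v <= 5,      # few time tears
            "sample_rms":           lambda v: v > 0.0,     # reject dead channels
        }

        def screen(channels, rules):
            accepted, rejected = [], []
            for ch in channels:
                failures = {m: ch[m] for m, ok in rules.items() if not ok(ch[m])}
                (rejected if failures else accepted).append((ch["id"], failures))
            return accepted, rejected

        accepted, rejected = screen(candidate_channels, criteria)
        print("request:", [cid for cid, _ in accepted])
        print("rejected:", rejected)   # report of dropped channels and the offending metric values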

  20. A fast and direct spectrophotometric method for the simultaneous determination of methyl paraben and hydroquinone in cosmetic products using successive projections algorithm.

    PubMed

    Esteki, M; Nouroozi, S; Shahsavari, Z

    2016-02-01

    To develop a simple and efficient spectrophotometric technique combined with chemometrics for the simultaneous determination of methyl paraben (MP) and hydroquinone (HQ) in cosmetic products, and specifically, to: (i) evaluate the potential use of successive projections algorithm (SPA) to derivative spectrophotometric data in order to provide sufficient accuracy and model robustness and (ii) determine MP and HQ concentration in cosmetics without tedious pre-treatments such as derivatization or extraction techniques which are time-consuming and require hazardous solvents. The absorption spectra were measured in the wavelength range of 200-350 nm. Prior to performing chemometric models, the original and first-derivative absorption spectra of binary mixtures were used as calibration matrices. Variable selected by successive projections algorithm was used to obtain multiple linear regression (MLR) models based on a small subset of wavelengths. The number of wavelengths and the starting vector were optimized, and the comparison of the root mean square error of calibration (RMSEC) and cross-validation (RMSECV) was applied to select effective wavelengths with the least collinearity and redundancy. Principal component regression (PCR) and partial least squares (PLS) were also developed for comparison. The concentrations of the calibration matrix ranged from 0.1 to 20 μg mL(-1) for MP, and from 0.1 to 25 μg mL(-1) for HQ. The constructed models were tested on an external validation data set and finally cosmetic samples. The results indicated that successive projections algorithm-multiple linear regression (SPA-MLR), applied on the first-derivative spectra, achieved the optimal performance for two compounds when compared with the full-spectrum PCR and PLS. The root mean square error of prediction (RMSEP) was 0.083, 0.314 for MP and HQ, respectively. To verify the accuracy of the proposed method, a recovery study on real cosmetic samples was carried out with satisfactory results (84-112%). The proposed method, which is an environmentally friendly approach, using minimum amount of solvent, is a simple, fast and low-cost analysis method that can provide high accuracy and robust models. The suggested method does not need any complex extraction procedure which is time-consuming and requires hazardous solvents. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
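
    A minimal sketch (Python) of the modelling and scoring step described above: a multiple linear regression fitted on a small subset of wavelengths and evaluated with RMSEC and RMSEP. The SPA wavelength-selection step is only mimicked here by a placeholder list of column indices, and the spectra and concentrations are synthetic stand-ins, not the paper's data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-in for first-derivative spectra: 60 calibration mixtures,
        # absorbances at 151 wavelengths (200-350 nm), two analyte concentrations (MP, HQ).
        n_samples, n_wavelengths = 60, 151
        conc = rng.uniform([0.1, 0.1], [20.0, 25.0], size=(n_samples, 2))
        pure = rng.normal(size=(2, n_wavelengths))            # pure-component "spectra"
        spectra = conc @ pure + rng.normal(scale=0.01, size=(n_samples, n_wavelengths))

        selected = [12, 47, 88, 120]    # placeholder for SPA-selected wavelength indices

        def fit_mlr(X, y):
            """Least-squares MLR with an intercept term on the selected columns."""
            A = np.column_stack([np.ones(len(X)), X])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef

        def rmse(X, y, coef):
            pred = np.column_stack([np.ones(len(X)), X]) @ coef
            return np.sqrt(np.mean((pred - y) ** 2, axis=0))

        train, test = slice(0, 45), slice(45, None)
        coef = fit_mlr(spectra[train][:, selected], conc[train])
        print("RMSEC (MP, HQ):", rmse(spectra[train][:, selected], conc[train], coef))
        print("RMSEP (MP, HQ):", rmse(spectra[test][:, selected], conc[test], coef))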

  1. Low-Cost, User-Friendly, Rapid Analysis of Dynamic Data System Established

    NASA Technical Reports Server (NTRS)

    Arend, David J.

    2004-01-01

    An issue of primary importance to the development of new jet and certain other airbreathing combined-cycle powered aircraft is the advancement of airframe-integrated propulsion technologies. Namely, engine inlets and their systems and subsystems are required to capture, convert, and deliver the atmospheric airflow demanded by such engines across their operating envelope in a form that can be used to provide efficient, stable thrust. This must be done while also minimizing aircraft drag and weight. Revolutionary inlet designs aided by new technologies are needed to enable new missions. An unwanted byproduct of pursuing these inlet technologies is increased time-variant airflow distortion. Such distortions reduce propulsion system stability, performance, operability, and life. To countermand these limitations and fully evaluate the resulting configurations, best practices dictate that this distortion be experimentally measured at large scale and analyzed. The required measurements consist of those made by an array of high-response pressure transducers located in the flow field at the aerodynamic interface plane (AIP) between the inlet and engine. Although the acquisition of the necessary pitot-pressure time histories is relatively straight-forward, until recent years, the analysis has proved to be very time-consuming, tedious, and expensive. To transform the analysis of these data into a tractable and timely proposition, researchers at the NASA Glenn Research Center created and established the Rapid Analysis of Dynamic Data (RADD) system. The system provides complete, near real-time analysis of time-varying inlet airflow distortion datasets with report quality output. This fully digital approach employs Institute of Electrical and Electronics Engineers (IEEE) binary data file format standardization to establish data-acquisition-system-independent processing on low cost personal computers. Features include invalid instrumentation code-out, logging, and multiple replacement schemes as needed for each channel of instrumentation. The AIP pressure distribution can be interpolated to simulate measurements by alternate AIP probe arrays, if desired. In addition, the RADD system provides for the application of filters that can be used to focus the analysis on the frequency range of interest.
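
    A minimal sketch (Python) of the sort of per-snapshot calculation a distortion analysis performs on AIP probe data: a simple circumferential distortion intensity, (ring-average minus ring-minimum pressure) divided by the face-average pressure, for one simulated 5-ring by 8-rake array. This is a generic, simplified descriptor with made-up numbers, not the specific algorithms or descriptors implemented in the RADD system.

        import numpy as np

        rng = np.random.default_rng(2)

        # Simulated AIP total-pressure snapshot: 5 rings x 8 rakes (40 probes), arbitrary units.
        n_rings, n_rakes = 5, 8
        pressures = 101.0 + rng.normal(0.0, 0.8, size=(n_rings, n_rakes))
        pressures[:, 2:4] -= 2.0    # impose a low-pressure sector to create distortion

        face_avg = pressures.mean()

        # Simple circumferential distortion intensity per ring:
        # (ring average - ring minimum) / face average.
        ring_avg = pressures.mean(axis=1)
        ring_min = pressures.min(axis=1)
        intensity = (ring_avg - ring_min) / face_avg

        print("per-ring circumferential distortion intensity:", np.round(intensity, 4))
        print("peak ring intensity:", float(intensity.max()))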

  2. Automated cell tracking and analysis in phase-contrast videos (iTrack4U): development of Java software based on combined mean-shift processes.

    PubMed

    Cordelières, Fabrice P; Petit, Valérie; Kumasaka, Mayuko; Debeir, Olivier; Letort, Véronique; Gallagher, Stuart J; Larue, Lionel

    2013-01-01

    Cell migration is a key biological process with a role in both physiological and pathological conditions. Locomotion of cells during embryonic development is essential for their correct positioning in the organism; immune cells have to migrate and circulate in response to injury. Failure of cells to migrate or an inappropriate acquisition of migratory capacities can result in severe defects such as altered pigmentation, skull and limb abnormalities during development, and defective wound repair, immunosuppression or tumor dissemination. The ability to accurately analyze and quantify cell migration is important for our understanding of development, homeostasis and disease. In vitro cell tracking experiments, using primary or established cell cultures, are often used to study migration as cells can quickly and easily be genetically or chemically manipulated. Images of the cells are acquired at regular time intervals over several hours using microscopes equipped with a CCD camera. The locations (x,y,t) of each cell on the recorded sequence of frames then need to be tracked. Manual computer-assisted tracking is the traditional method for analyzing the migratory behavior of cells. However, this processing is extremely tedious and time-consuming. Most existing tracking algorithms require experience in programming languages that are unfamiliar to most biologists. We therefore developed an automated cell tracking program, written in Java, which uses a mean-shift algorithm and ImageJ as a library. iTrack4U is user-friendly software. Compared to manual tracking, it saves a considerable amount of time in generating and analyzing the variables characterizing cell migration, since these are computed automatically by iTrack4U. Another major advantage of iTrack4U is standardization and the lack of inter-experimenter differences. Finally, iTrack4U is adapted for phase contrast and fluorescent cells.
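
    A minimal sketch (Python) of the basic mean-shift step on which such tracking rests: a circular window is moved iteratively to the intensity-weighted centroid of the pixels it covers. This is a generic illustration on a synthetic frame, not the iTrack4U implementation.

        import numpy as np

        def mean_shift_centroid(frame, center, radius, n_iter=20, tol=0.5):
            """Move a circular window to the local intensity-weighted centroid (one cell, one frame)."""
            ys, xs = np.mgrid[0:frame.shape[0], 0:frame.shape[1]]
            cy, cx = center
            for _ in range(n_iter):
                mask = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
                weights = frame * mask
                total = weights.sum()
                if total == 0:
                    break
                new_cy = (ys * weights).sum() / total
                new_cx = (xs * weights).sum() / total
                shift = np.hypot(new_cy - cy, new_cx - cx)
                cy, cx = new_cy, new_cx
                if shift < tol:              # converged onto the local centroid
                    break
            return cy, cx

        # Toy example: one bright blob that has drifted since the previous frame.
        frame = np.zeros((64, 64))
        frame[40:46, 30:36] = 1.0            # "cell" now centred near (42.5, 32.5)
        track = mean_shift_centroid(frame, center=(38, 28), radius=10)
        print("updated cell position (y, x):", track)   # window follows the blob from its old position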

  3. SU-E-T-97: An Analysis of Knowledge Based Planning for Stereotactic Body Radiation Therapy of the Spine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foy, J; Marsh, R; Owen, D

    2015-06-15

    Purpose: Creating high quality SBRT treatment plans for the spine is often tedious and time consuming. In addition, the quality of treatment plans can vary greatly between treatment facilities due to inconsistencies in planning methods. This study investigates the performance of knowledge-based planning (KBP) for spine SBRT. Methods: Treatment plans were created for 28 spine SBRT patients. Each case was planned to meet strict dose objectives and guidelines. After physician and physicist approval, the plans were added to a custom model in a KBP system (RapidPlan, Varian Eclipse v13.5). The model was then trained to be able to predict estimated DVHs and provide starting objective functions for future patients based on both generated and manual objectives. To validate the model, ten additional spine SBRT cases were planned manually as well as using the model objectives. Plans were compared based on planning time and quality (ability to meet the plan objectives, including dose metrics and conformity). Results: The average dose to the spinal cord and the cord PRV differed between the validation and control plans by <0.25%, demonstrating iso-toxicity. Six out of 10 validation plans met all dose objectives without the need for modifications, and overall, target dose coverage was increased by about 4.8%. If the validation plans did not meet the dose requirements initially, only 1–2 iterations of modifying the planning parameters were required before an acceptable plan was achieved. While manually created plans usually required 30 minutes to 3 hours to create, KBP can be used to create similar quality plans in 15–20 minutes. Conclusion: KBP for spinal tumors has been shown to greatly decrease the amount of time required to achieve high quality treatment plans with minimal human intervention and could feasibly be used to standardize plan quality between institutions. Supported by Varian Medical Systems.

  4. Novel keyword co-occurrence network-based methods to foster systematic reviews of scientific literature.

    PubMed

    Radhakrishnan, Srinivasan; Erbis, Serkan; Isaacs, Jacqueline A; Kamarthi, Sagar

    2017-01-01

    Systematic reviews of scientific literature are important for mapping the existing state of research and highlighting further growth channels in a field of study, but systematic reviews are inherently tedious, time consuming, and manual in nature. In recent years, keyword co-occurrence networks (KCNs) are exploited for knowledge mapping. In a KCN, each keyword is represented as a node and each co-occurrence of a pair of words is represented as a link. The number of times that a pair of words co-occurs in multiple articles constitutes the weight of the link connecting the pair. The network constructed in this manner represents cumulative knowledge of a domain and helps to uncover meaningful knowledge components and insights based on the patterns and strength of links between keywords that appear in the literature. In this work, we propose a KCN-based approach that can be implemented prior to undertaking a systematic review to guide and accelerate the review process. The novelty of this method lies in the new metrics used for statistical analysis of a KCN that differ from those typically used for KCN analysis. The approach is demonstrated through its application to nano-related Environmental, Health, and Safety (EHS) risk literature. The KCN approach identified the knowledge components, knowledge structure, and research trends that match with those discovered through a traditional systematic review of the nanoEHS field. Because KCN-based analyses can be conducted more quickly to explore a vast amount of literature, this method can provide a knowledge map and insights prior to undertaking a rigorous traditional systematic review. This two-step approach can significantly reduce the effort and time required for a traditional systematic literature review. The proposed KCN-based pre-systematic review method is universal. It can be applied to any scientific field of study to prepare a knowledge map.
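
    A minimal sketch (Python) of the KCN construction described above: each keyword becomes a node and every within-article keyword pair increments the weight of the link between the two keywords. The article keyword lists are made-up placeholders, not the nanoEHS corpus.

        from collections import Counter
        from itertools import combinations

        # Hypothetical keyword lists, one per article (placeholders for a real corpus).
        articles = [
            ["nanoparticle", "toxicity", "exposure"],
            ["nanoparticle", "risk assessment", "exposure"],
            ["toxicity", "risk assessment", "nanoparticle"],
        ]

        edge_weights = Counter()
        for keywords in articles:
            # Every unordered pair of keywords co-occurring in the same article adds 1 to that link.
            for a, b in combinations(sorted(set(keywords)), 2):
                edge_weights[(a, b)] += 1

        node_strength = Counter()      # a simple per-keyword statistic: summed link weights
        for (a, b), w in edge_weights.items():
            node_strength[a] += w
            node_strength[b] += w

        print("strongest links:", edge_weights.most_common(3))
        print("most connected keywords:", node_strength.most_common(3))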

  5. Novel keyword co-occurrence network-based methods to foster systematic reviews of scientific literature

    PubMed Central

    Isaacs, Jacqueline A.

    2017-01-01

    Systematic reviews of scientific literature are important for mapping the existing state of research and highlighting further growth channels in a field of study, but systematic reviews are inherently tedious, time consuming, and manual in nature. In recent years, keyword co-occurrence networks (KCNs) are exploited for knowledge mapping. In a KCN, each keyword is represented as a node and each co-occurrence of a pair of words is represented as a link. The number of times that a pair of words co-occurs in multiple articles constitutes the weight of the link connecting the pair. The network constructed in this manner represents cumulative knowledge of a domain and helps to uncover meaningful knowledge components and insights based on the patterns and strength of links between keywords that appear in the literature. In this work, we propose a KCN-based approach that can be implemented prior to undertaking a systematic review to guide and accelerate the review process. The novelty of this method lies in the new metrics used for statistical analysis of a KCN that differ from those typically used for KCN analysis. The approach is demonstrated through its application to nano-related Environmental, Health, and Safety (EHS) risk literature. The KCN approach identified the knowledge components, knowledge structure, and research trends that match with those discovered through a traditional systematic review of the nanoEHS field. Because KCN-based analyses can be conducted more quickly to explore a vast amount of literature, this method can provide a knowledge map and insights prior to undertaking a rigorous traditional systematic review. This two-step approach can significantly reduce the effort and time required for a traditional systematic literature review. The proposed KCN-based pre-systematic review method is universal. It can be applied to any scientific field of study to prepare a knowledge map. PMID:28328983

  6. An optofluidic approach for gold nanoprobes based-cancer theranostics

    NASA Astrophysics Data System (ADS)

    Panwar, Nishtha; Song, Peiyi; Yang, Chengbin; Yong, Ken-Tye; Tjin, Swee Chuan

    2017-02-01

    Suppression of overexpressed gene mutations in cancer cells through the RNA interference (RNAi) technique is a therapeutically effective modality for oncogene silencing. In general, a transfection agent is needed for siRNA delivery. Also, it is a tedious and time consuming process to analyze gene transfection using current conventional flow cytometry systems and commercially available transfection kits. Therefore, there are two urgent challenges to address in order to understand and monitor, in real time, the delivery of siRNA to cancer cells more effectively. One, nontoxic, biocompatible and stable non-viral transfection agents need to be developed and investigated for gene delivery in cancer cells. Two, new, portable optofluidic methods need to be engineered for determining the transfection efficiency of the nanoformulation in real time. First, we demonstrate the feasibility of using gold nanorods (AuNRs) as nanoprobes for the delivery of Interleukin-8 (IL-8) siRNA in a pancreatic cancer cell line, MiaPaCa-2. An optimum ratio of 10:1 for the AuNRs-siRNA nanoformulation required for efficient loading has been experimentally determined. Promising transfection rates (≈88%) of the nanoprobe-assisted gene delivery are quantified by flow cytometry and fluorescence imaging, which are higher than the commercial control, Oligofectamine. The excellent gene knockdown performance (over 81%) of the proposed model supports in vivo trials for RNAi-based cancer theranostics. In addition to cancer theranostics, our nanoprobe combination can also be applied to disease outbreak monitoring, for example of MERS. Second, we present an optical fiber-integrated microfluidic chip that utilizes simple hydrodynamic and optical setups for miniaturized on-chip flow cytometry. The chip provides a powerful and convenient tool to quantitatively determine siRNA transfection into cancer cells without using a bulky flow cytometer. These studies outline the role of AuNRs as potential non-viral gene delivery vehicles, and their suitability for microfluidics-based lab-on-chip flow cytometry applications.

  7. Biomedical question answering using semantic relations.

    PubMed

    Hristovski, Dimitar; Dinevski, Dejan; Kastrin, Andrej; Rindflesch, Thomas C

    2015-01-16

    The proliferation of the scientific literature in the field of biomedicine makes it difficult to keep abreast of current knowledge, even for domain experts. While general Web search engines and specialized information retrieval (IR) systems have made important strides in recent decades, the problem of accurate knowledge extraction from the biomedical literature is far from solved. Classical IR systems usually return a list of documents that have to be read by the user to extract relevant information. This tedious and time-consuming work can be lessened with automatic Question Answering (QA) systems, which aim to provide users with direct and precise answers to their questions. In this work we propose a novel methodology for QA based on semantic relations extracted from the biomedical literature. We extracted semantic relations with the SemRep natural language processing system from 122,421,765 sentences, which came from 21,014,382 MEDLINE citations (i.e., the complete MEDLINE distribution up to the end of 2012). A total of 58,879,300 semantic relation instances were extracted and organized in a relational database. The QA process is implemented as a search in this database, which is accessed through a Web-based application, called SemBT (available at http://sembt.mf.uni-lj.si ). We conducted an extensive evaluation of the proposed methodology in order to estimate the accuracy of extracting a particular semantic relation from a particular sentence. Evaluation was performed by 80 domain experts. In total 7,510 semantic relation instances belonging to 2,675 distinct relations were evaluated 12,083 times. The instances were evaluated as correct 8,228 times (68%). In this work we propose an innovative methodology for biomedical QA. The system is implemented as a Web-based application that is able to provide precise answers to a wide range of questions. A typical question is answered within a few seconds. The tool has some extensions that make it especially useful for interpretation of DNA microarray results.
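
    A minimal sketch (Python with SQLite) of the underlying idea of answering a question as a search over stored subject-predicate-object relation instances; the table schema, example rows and query are illustrative assumptions, not the actual SemBT database design.

        import sqlite3

        # Toy relation store in the spirit of the approach described above (schema is an assumption).
        con = sqlite3.connect(":memory:")
        con.execute("""CREATE TABLE relation (
                         subject TEXT, predicate TEXT, object TEXT, sentence_id INTEGER)""")
        con.executemany(
            "INSERT INTO relation VALUES (?, ?, ?, ?)",
            [
                ("metformin", "TREATS", "type 2 diabetes", 1),
                ("aspirin", "TREATS", "headache", 2),
                ("metformin", "AFFECTS", "AMPK", 3),
            ],
        )

        # "Question answering" as a database search: what TREATS type 2 diabetes?
        rows = con.execute(
            "SELECT DISTINCT subject FROM relation WHERE predicate = ? AND object = ?",
            ("TREATS", "type 2 diabetes"),
        ).fetchall()
        print([r[0] for r in rows])   # -> ['metformin']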

  8. Viewing and Editing Earth Science Metadata MOBE: Metadata Object Browser and Editor in Java

    NASA Astrophysics Data System (ADS)

    Chase, A.; Helly, J.

    2002-12-01

    Metadata is an important, yet often neglected aspect of successful archival efforts. However, to generate robust, useful metadata is often a time consuming and tedious task. We have been approaching this problem from two directions: first by automating metadata creation, pulling from known sources of data, and in addition, what this (paper/poster?) details, developing friendly software for human interaction with the metadata. MOBE and COBE(Metadata Object Browser and Editor, and Canonical Object Browser and Editor respectively), are Java applications for editing and viewing metadata and digital objects. MOBE has already been designed and deployed, currently being integrated into other areas of the SIOExplorer project. COBE is in the design and development stage, being created with the same considerations in mind as those for MOBE. Metadata creation, viewing, data object creation, and data object viewing, when taken on a small scale are all relatively simple tasks. Computer science however, has an infamous reputation for transforming the simple into complex. As a system scales upwards to become more robust, new features arise and additional functionality is added to the software being written to manage the system. The software that emerges from such an evolution, though powerful, is often complex and difficult to use. With MOBE the focus is on a tool that does a small number of tasks very well. The result has been an application that enables users to manipulate metadata in an intuitive and effective way. This allows for a tool that serves its purpose without introducing additional cognitive load onto the user, an end goal we continue to pursue.

  9. Refinery spreadsheet highlights microcomputer process applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tucker, M.A.

    1984-01-23

    Microcomputer applications in the process areas at Chevron U.S.A. refineries and at the Chevron Research Co. illustrate how the microcomputer has changed the way we do our jobs. This article will describe major uses of the microcomputer as a personal work tool in Chevron process areas. It will also describe how and why many of Chevron's microcomputer applications were developed and their characteristics. One of our earliest microcomputer applications, developed in late 1981, was an electronic spreadsheet program using a small desktop microcomputer. It was designed to help a refinery planner prepare monthly plans for a small portion of one of our major refineries. This particular microcomputer had a tiny 4-in. screen, and the reports were several strips of print-out from the microcomputer's 3-in.-wide internal printer taped together. In spite of these archaic computing conditions, it was a successful application. It automated what had been very tedious and time-consuming calculations with a pencil, a calculator, and a great deal of erasing. It eliminated filling out large 'horseblanket' reports. The electronic spreadsheet was also flexible; the planner could easily change the worksheet to match new operating constraints, new process conditions, and new feeds and products. Fortunately, within just a few months, this application graduated to a similar electronic spreadsheet program on a new, more powerful microcomputer. It had a bigger display screen and a letter-size printer. The same application is still in use today, although it has been greatly enhanced and altered to match extensive plant modifications. And there are plans to expand it again onto yet another, more powerful microcomputer.

  10. SNPflow: A Lightweight Application for the Processing, Storing and Automatic Quality Checking of Genotyping Assays

    PubMed Central

    Schönherr, Sebastian; Neuner, Mathias; Forer, Lukas; Specht, Günther; Kloss-Brandstätter, Anita; Kronenberg, Florian; Coassin, Stefan

    2013-01-01

    Single nucleotide polymorphisms (SNPs) play a prominent role in modern genetics. Current genotyping technologies such as Sequenom iPLEX, ABI TaqMan and KBioscience KASPar made the genotyping of huge SNP sets in large populations straightforward and allow the generation of hundreds of thousands of genotypes even in medium sized labs. While data generation is straightforward, the subsequent data conversion, storage and quality control steps are time-consuming, error-prone and require extensive bioinformatic support. In order to ease this tedious process, we developed SNPflow. SNPflow is a lightweight, intuitive and easily deployable application, which processes genotype data from Sequenom MassARRAY (iPLEX) and ABI 7900HT (TaqMan, KASPar) systems and is extendible to other genotyping methods as well. SNPflow automatically converts the raw output files to ready-to-use genotype lists, calculates all standard quality control values such as call rate, expected and real amount of replicates, minor allele frequency, absolute number of discordant replicates, discordance rate and the p-value of the HWE test, checks the plausibility of the observed genotype frequencies by comparing them to HapMap/1000-Genomes, provides a module for the processing of SNPs, which allow sex determination for DNA quality control purposes and, finally, stores all data in a relational database. SNPflow runs on all common operating systems and comes as both stand-alone version and multi-user version for laboratory-wide use. The software, a user manual, screenshots and a screencast illustrating the main features are available at http://genepi-snpflow.i-med.ac.at. PMID:23527209
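
    A minimal sketch (Python) of three of the quality-control values listed above, call rate, minor allele frequency and the Hardy-Weinberg equilibrium p-value, computed from genotype counts for a single biallelic SNP; the example counts are arbitrary and the HWE test is the standard one-degree-of-freedom chi-square, not necessarily the exact routine used by SNPflow.

        from math import erf, sqrt

        # Example genotype counts for one SNP (AA, AB, BB, missing) -- arbitrary numbers.
        n_AA, n_AB, n_BB, n_missing = 420, 150, 20, 10
        n_called = n_AA + n_AB + n_BB

        call_rate = n_called / (n_called + n_missing)
        p = (2 * n_AA + n_AB) / (2 * n_called)     # frequency of allele A
        maf = min(p, 1 - p)                        # minor allele frequency

        # Hardy-Weinberg equilibrium: chi-square test with 1 degree of freedom.
        expected = [p * p * n_called, 2 * p * (1 - p) * n_called, (1 - p) ** 2 * n_called]
        observed = [n_AA, n_AB, n_BB]
        chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
        # Survival function of chi-square(1 df), i.e. erfc(sqrt(chi2)/sqrt(2)).
        hwe_p = 2 * (1 - 0.5 * (1 + erf(sqrt(chi2) / sqrt(2))))

        print(f"call rate {call_rate:.3f}, MAF {maf:.3f}, HWE p-value {hwe_p:.3g}")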

  11. Improving medical stores management through automation and effective communication.

    PubMed

    Kumar, Ashok; Cariappa, M P; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

    Medical stores management in hospitals is a tedious and time consuming chore with limited resources tasked for the purpose and poor penetration of Information Technology. The process of automation is slow paced due to various inherent factors and is being challenged by the increasing inventory loads and escalating budgets for procurement of drugs. We carried out an in-depth case study at the Medical Stores of a tertiary care health care facility. An iterative six-step Quality Improvement (QI) process was implemented based on the Plan-Do-Study-Act (PDSA) cycle. The QI process was modified as per requirement to fit the medical stores management model. The results were evaluated after six months. After the implementation of the QI process, 55 drugs in the medical store inventory that had expired since 2009 were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to depletion of stocks. The monthly expense summary of drugs was now being done within ten days of the closing month. Improving communication systems within the hospital with vendor database management and reaching out to clinicians is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of stores. Staff training and standardized documentation protocols are the other keystones for optimal medical store management.

  12. Building Construction Progress Monitoring Using Unmanned Aerial System (uas), Low-Cost Photogrammetry, and Geographic Information System (gis)

    NASA Astrophysics Data System (ADS)

    Bognot, J. R.; Candido, C. G.; Blanco, A. C.; Montelibano, J. R. Y.

    2018-05-01

    Monitoring the progress of a building's construction is critical in construction management. However, measuring construction progress is still manual, time consuming and error prone, and imposes a tedious process of analysis that leads to delays, additional costs and effort. The main goal of this research is to develop a methodology for building construction progress monitoring based on a 3D as-built model of the building derived from unmanned aerial system (UAS) images, a 4D as-planned model (with the construction schedule integrated) and GIS analysis. Monitoring was done by capturing videos of the building with a camera-equipped UAS. Still images were extracted, filtered and bundle-adjusted, and a 3D as-built model was generated using open source photogrammetric software. The as-planned model was generated from digitized CAD drawings using GIS. The 3D as-built model was aligned with the 4D as-planned model of the building, formed by extruding the building elements and integrating the construction's planned schedule. Construction progress is visualized by color-coding the building elements in the 3D model. The developed methodology was applied to data obtained from an actual construction site. Accuracy in detecting 'built' or 'not built' building elements ranges from 82-84 % and precision from 50-72 %. Quantified progress in terms of the number of building elements is 21.31 % (November 2016), 26.84 % (January 2017) and 44.19 % (March 2017). The results can be used as input for monitoring the progress performance of construction projects and improving the related decision-making processes.

  13. Automated glioblastoma segmentation based on a multiparametric structured unsupervised classification.

    PubMed

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V; Robles, Montserrat; Aparici, F; Martí-Bonmatí, L; García-Gómez, Juan M

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of supervised methods. Here, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. As non-structured algorithms we evaluated K-means, Fuzzy K-means and the Gaussian Mixture Model (GMM), whereas as a structured classification algorithm we evaluated the Gaussian Hidden Markov Random Field (GHMRF). An automated postprocessing step based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after segmentation. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM improves on the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation.
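
    A minimal sketch of the non-structured GMM variant of such a pipeline, assuming co-registered anatomical MR volumes (T1, T1c, T2, FLAIR) are already loaded as NumPy arrays; the postprocessing that assigns tumour classes is omitted.

    ```python
    # Sketch of voxel-wise GMM clustering on multiparametric MR intensities.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def gmm_segment(t1, t1c, t2, flair, n_classes=5, brain_mask=None):
        vols = np.stack([t1, t1c, t2, flair], axis=-1)            # (X, Y, Z, 4)
        mask = brain_mask if brain_mask is not None else np.ones(t1.shape, bool)
        features = vols[mask].astype(float)                       # (N voxels, 4)
        gmm = GaussianMixture(n_components=n_classes, covariance_type="full",
                              random_state=0).fit(features)
        labels = np.full(t1.shape, -1, dtype=int)
        labels[mask] = gmm.predict(features)
        return labels   # tumour classes still need the statistical post hoc identification
    ```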

  14. Resolution and Assignment of Differential Ion Mobility Spectra of Sarcosine and Isomers.

    PubMed

    Berthias, Francis; Maatoug, Belkis; Glish, Gary L; Moussa, Fathi; Maitre, Philippe

    2018-04-01

    Due to their central role in biochemical processes, fast separation and identification of amino acids (AA) is of importance in many areas of the biomedical field, including the diagnosis and monitoring of inborn errors of metabolism and biomarker discovery. Owing to the large number of AA together with their isomers and isobars, common methods of AA analysis are tedious and time-consuming because they include a chromatographic separation step requiring pre- or post-column derivatization. Here, we propose a rapid method of separation and identification of sarcosine, a biomarker candidate of prostate cancer, from its isomers using differential ion mobility spectrometry (DIMS) interfaced with a tandem mass spectrometer (MS/MS) instrument. Baseline separation of protonated sarcosine from the α- and β-alanine isomers can be easily achieved. Identification of a DIMS peak is performed using an isomer-specific activation mode in which DIMS- and mass-selected ions are irradiated at selected wavenumbers, allowing for specific fragmentation via an infrared multiple photon dissociation (IRMPD) process. Two methods orthogonal to MS/MS are thus added, where MS/MS(IRMPD) is effectively an isomer-specific multiple reaction monitoring (MRM) method. The identification relies on the comparison of DIMS-MS/MS(IRMPD) chromatograms recorded at different wavenumbers. Based on the comparison of the IR spectra of the three isomers, it is shown that specific depletion of the two protonated α- and β-alanine isomers can be achieved, allowing for clear identification of the sarcosine peak. It is also demonstrated that DIMS-MS/MS(IRMPD) spectra in the carboxylic C=O stretching region allow for the resolution of overlapping DIMS peaks.

  15. In vivo Assembly in Escherichia coli of Transformation Vectors for Plastid Genome Engineering

    PubMed Central

    Wu, Yuyong; You, Lili; Li, Shengchun; Ma, Meiqi; Wu, Mengting; Ma, Lixin; Bock, Ralph; Chang, Ling; Zhang, Jiang

    2017-01-01

    Plastid transformation for the expression of recombinant proteins and entire metabolic pathways has become a promising tool for plant biotechnology. However, large-scale application of this technology has been hindered by some technical bottlenecks, including lack of routine transformation protocols for agronomically important crop plants like rice or maize. Currently, there are no standard or commercial plastid transformation vectors available for the scientific community. Construction of a plastid transformation vector usually requires tedious and time-consuming cloning steps. In this study, we describe the adoption of an in vivo Escherichia coli cloning (iVEC) technology to quickly assemble a plastid transformation vector. The method enables simple and seamless build-up of a complete plastid transformation vector from five DNA fragments in a single step. The vector assembled for demonstration purposes contains an enhanced green fluorescent protein (GFP) expression cassette, in which the gfp transgene is driven by the tobacco plastid ribosomal RNA operon promoter fused to the 5′ untranslated region (UTR) from gene10 of bacteriophage T7 and the transcript-stabilizing 3′UTR from the E. coli ribosomal RNA operon rrnB. Successful transformation of the tobacco plastid genome was verified by Southern blot analysis and seed assays. High-level expression of the GFP reporter in the transplastomic plants was visualized by confocal microscopy and Coomassie staining, and GFP accumulation was ~9% of the total soluble protein. The iVEC method represents a simple and efficient approach for construction of plastid transformation vector, and offers great potential for the assembly of increasingly complex vectors for synthetic biology applications in plastids. PMID:28871270

  16. In vivo Assembly in Escherichia coli of Transformation Vectors for Plastid Genome Engineering.

    PubMed

    Wu, Yuyong; You, Lili; Li, Shengchun; Ma, Meiqi; Wu, Mengting; Ma, Lixin; Bock, Ralph; Chang, Ling; Zhang, Jiang

    2017-01-01

    Plastid transformation for the expression of recombinant proteins and entire metabolic pathways has become a promising tool for plant biotechnology. However, large-scale application of this technology has been hindered by some technical bottlenecks, including lack of routine transformation protocols for agronomically important crop plants like rice or maize. Currently, there are no standard or commercial plastid transformation vectors available for the scientific community. Construction of a plastid transformation vector usually requires tedious and time-consuming cloning steps. In this study, we describe the adoption of an in vivo Escherichia coli cloning (iVEC) technology to quickly assemble a plastid transformation vector. The method enables simple and seamless build-up of a complete plastid transformation vector from five DNA fragments in a single step. The vector assembled for demonstration purposes contains an enhanced green fluorescent protein (GFP) expression cassette, in which the gfp transgene is driven by the tobacco plastid ribosomal RNA operon promoter fused to the 5' untranslated region (UTR) from gene10 of bacteriophage T7 and the transcript-stabilizing 3'UTR from the E. coli ribosomal RNA operon rrnB . Successful transformation of the tobacco plastid genome was verified by Southern blot analysis and seed assays. High-level expression of the GFP reporter in the transplastomic plants was visualized by confocal microscopy and Coomassie staining, and GFP accumulation was ~9% of the total soluble protein. The iVEC method represents a simple and efficient approach for construction of plastid transformation vector, and offers great potential for the assembly of increasingly complex vectors for synthetic biology applications in plastids.

  17. Construction and completion of flux balance models from pathway databases.

    PubMed

    Latendresse, Mario; Krummenacker, Markus; Trupp, Miles; Karp, Peter D

    2012-02-01

    Flux balance analysis (FBA) is a well-known technique for genome-scale modeling of metabolic flux. Typically, an FBA formulation requires the accurate specification of four sets: biochemical reactions, biomass metabolites, nutrients and secreted metabolites. The development of FBA models can be time-consuming and tedious because of the difficulty in assembling completely accurate descriptions of these sets and in identifying errors in their composition. For example, the presence of a single non-producible metabolite in the biomass will make the entire model infeasible. Other difficulties in FBA modeling are that model distributions, and predicted fluxes, can be cryptic and difficult to understand. We present a multiple gap-filling method to accelerate the development of FBA models using a new tool, called MetaFlux, based on mixed integer linear programming (MILP). The method suggests corrections to the sets of reactions, biomass metabolites, nutrients and secretions. The method generates FBA models directly from Pathway/Genome Databases. Thus, FBA models developed in this framework are easily queried and visualized using the Pathway Tools software. Predicted fluxes are more easily comprehended by visualizing them on diagrams of individual metabolic pathways or of metabolic maps. MetaFlux can also remove redundant high-flux loops, solve FBA models once they are generated and model the effects of gene knockouts. MetaFlux has been validated through construction of FBA models for Escherichia coli and Homo sapiens. Availability: Pathway Tools with MetaFlux is freely available to academic users, and for a fee to commercial users; download from biocyc.org/download.shtml. Contact: mario.latendresse@sri.com. Supplementary data are available at Bioinformatics online.
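
    A toy flux balance problem posed as a linear program, to illustrate the FBA formalism that MetaFlux builds on (this is a generic sketch, not MetaFlux or its MILP gap-filling): maximise the biomass flux subject to steady state S·v = 0 and flux bounds.

    ```python
    # Toy FBA: three reactions, two internal metabolites; maximise the biomass flux v3.
    import numpy as np
    from scipy.optimize import linprog

    # Reactions: R1 nutrient uptake -> A, R2 A -> B, R3 B -> biomass
    S = np.array([
        [ 1, -1,  0],   # metabolite A balance
        [ 0,  1, -1],   # metabolite B balance
    ])
    c = np.array([0, 0, -1.0])          # linprog minimises, so minimise -v3
    bounds = [(0, 10), (0, 10), (0, None)]

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x)     # expected [10, 10, 10]
    ```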

  18. A tool for calculating binding-site residues on proteins from PDB structures.

    PubMed

    Hu, Jing; Yan, Changhui

    2009-08-03

    In the research on protein functional sites, researchers often need to identify binding-site residues on a protein. A commonly used strategy is to find a complex structure from the Protein Data Bank (PDB) that consists of the protein of interest and its interacting partner(s) and calculate binding-site residues based on the complex structure. However, since a protein may participate in multiple interactions, the binding-site residues calculated from one complex structure usually do not reveal all binding sites on a protein. Thus, researchers must find all PDB complexes that contain the protein of interest and combine the binding-site information gleaned from them. This process is very time-consuming. In particular, combining binding-site information obtained from different PDB structures requires tedious work to align protein sequences. The process becomes overwhelmingly difficult when researchers have a large set of proteins to analyze, which is usually the case in practice. In this study, we have developed a tool for calculating binding-site residues on proteins, TCBRP (http://yanbioinformatics.cs.usu.edu:8080/ppbindingsubmit). For an input protein, TCBRP can quickly find all binding-site residues on the protein by automatically combining the information obtained from all PDB structures that contain the protein of interest. Additionally, TCBRP presents the binding-site residues in different categories according to the interaction type. TCBRP also allows researchers to set the definition of binding-site residues. The developed tool is very useful for research on protein binding site analysis and prediction.
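
    A generic distance-based calculation of binding-site residues from a single PDB complex, sketched with Biopython; the file name, chain identifiers and 5 Å cutoff are assumptions, and TCBRP's actual definition of binding-site residues may differ.

    ```python
    # Residues of chain A with any atom within `cutoff` of chain B are called binding-site residues.
    from Bio.PDB import PDBParser, NeighborSearch, Selection

    def interface_residues(pdb_file, chain_a, chain_b, cutoff=5.0):
        model = PDBParser(QUIET=True).get_structure("cplx", pdb_file)[0]
        partner_atoms = Selection.unfold_entities(model[chain_b], "A")   # atoms of the partner chain
        search = NeighborSearch(partner_atoms)
        binding = set()
        for residue in model[chain_a]:
            for atom in residue:
                if search.search(atom.coord, cutoff, level="R"):
                    binding.add(residue.id[1])       # residue sequence number
                    break
        return sorted(binding)

    # print(interface_residues("complex.pdb", "A", "B"))   # "complex.pdb" is a placeholder
    ```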

  19. An automatic system to detect and extract texts in medical images for de-identification

    NASA Astrophysics Data System (ADS)

    Zhu, Yingxuan; Singh, P. D.; Siddiqui, Khan; Gillam, Michael

    2010-03-01

    Recently, there is an increasing need to share medical images for research purposes. In order to respect and preserve patient privacy, most medical images are de-identified by removing protected health information (PHI) before research sharing. Since manual de-identification is time-consuming and tedious, an automatic de-identification system is necessary and helpful for doctors to remove text from medical images. Many papers have been written about text detection and extraction algorithms; however, little of this work has been applied to de-identification of medical images. Since the de-identification system is designed for end-users, it should be effective, accurate and fast. This paper proposes an automatic system to detect and extract text from medical images for de-identification purposes, while keeping the anatomic structures intact. First, considering that text has a marked contrast with the background, a region-variance-based algorithm is used to detect text regions. In post-processing, geometric constraints are applied to the detected text regions to eliminate over-segmentation, e.g., lines and anatomic structures. After that, a region-based level set method is used to extract text from the detected text regions. A GUI for the prototype application of the text detection and extraction system is implemented, which shows that our method can detect most of the text in the images. Experimental results validate that our method can detect and extract text in medical images with a 99% recall rate. Future research on this system includes algorithm improvement, performance evaluation, and computation optimization.
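
    A sketch of the region-variance detection step described above; the window size, variance threshold and geometric constraints are illustrative values, not those used by the authors.

    ```python
    # Detect candidate text regions as high local-variance areas, then apply crude geometric filters.
    import numpy as np
    from scipy import ndimage

    def detect_text_regions(img, win=15, var_thresh=400.0, min_area=30, max_area=5000):
        img = img.astype(float)
        mean = ndimage.uniform_filter(img, win)
        mean_sq = ndimage.uniform_filter(img ** 2, win)
        local_var = mean_sq - mean ** 2              # variance is high where text overlays background

        labels, _ = ndimage.label(local_var > var_thresh)
        boxes = []
        for sl in ndimage.find_objects(labels):
            h, w = sl[0].stop - sl[0].start, sl[1].stop - sl[1].start
            if min_area <= h * w <= max_area and w >= h:   # drop thin lines and huge structures
                boxes.append(sl)
        return boxes    # each slice pair bounds a putative text region for later extraction
    ```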

  20. Construction and characterization of recombinant adenovirus carrying a mouse TIGIT-GFP gene.

    PubMed

    Zheng, J M; Cui, J L; He, W T; Yu, D W; Gao, Y; Wang, L; Chen, Z K; Zhou, H M

    2015-12-29

    Recombinant adenovirus vector systems have been used extensively in protein research and gene therapy. However, the construction and characterization of recombinant adenovirus is a tedious and time-consuming process. TIGIT is a recently discovered immunosuppressive molecule that plays an important role in maintaining immunological balance. The construction of recombinant adenovirus mediating TIGIT expression must be simplified to facilitate its use in the study of TIGIT. In this study, the TIGIT gene was combined with green fluorescent protein (GFP); the TIGIT-GFP gene was inserted into a gateway plasmid to construct a TIGIT-GFP adenovirus. HEK 293A cells were infected with the adenovirus, which was then purified and subjected to virus titration. The TIGIT-GFP adenovirus was characterized by flow cytometry and immunofluorescence, and its expression in mouse liver was detected by infection through caudal vein injection. The results showed the successful construction of the TIGIT-GFP adenovirus (5 × 10^10 PFU/mL). Co-expression of TIGIT and GFP was identified in 293A and liver cells; synthesis and positioning of TIGIT-GFP were viewed under a fluorescence microscope. TIGIT-GFP was highly expressed on liver cells 1 day (25.53%) after infection and faded 3 days (11.36%) after injection. In conclusion, the fusion of TIGIT with GFP allows easy, rapid, and uncomplicated detection of TIGIT translation. The construction of a TIGIT-GFP adenovirus, mediating TIGIT expression in vitro and in vivo, lays the foundation for further research into TIGIT function and gene therapy. Moreover, the TIGIT-GFP adenovirus is a helpful tool for studying other proteins (which could replace the TIGIT gene).

  1. Development of an automated processing and screening system for the space shuttle orbiter flight test data

    NASA Technical Reports Server (NTRS)

    Mccutchen, D. K.; Brose, J. F.; Palm, W. E.

    1982-01-01

    One nemesis of the structural dynamist is the tedious task of reviewing large quantities of data. This data, obtained from various types of instrumentation, may be represented by oscillogram records, root-mean-squared (rms) time histories, power spectral densities, shock spectra, 1/3 octave band analyses, and various statistical distributions. In an attempt to reduce the laborious task of manually reviewing all of the space shuttle orbiter wideband frequency-modulated (FM) analog data, an automated processing system was developed to perform the screening process based upon predefined or predicted threshold criteria.
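
    A minimal sketch of the kind of threshold screening described above, assuming each channel is a 1-D NumPy array sampled at a known rate; channel names and limits are placeholders, not the orbiter screening criteria.

    ```python
    # Flag channels whose windowed RMS exceeds a predefined threshold; only flagged
    # records would go on to manual review.
    import numpy as np

    def screen_channels(channels, fs, window_s=1.0, thresholds=None):
        flagged = {}
        n_win = int(window_s * fs)
        for name, signal in channels.items():
            n = (len(signal) // n_win) * n_win
            windows = signal[:n].reshape(-1, n_win)
            rms = np.sqrt(np.mean(windows ** 2, axis=1))
            limit = thresholds.get(name, np.inf) if thresholds else np.inf
            if np.any(rms > limit):
                flagged[name] = float(rms.max())
        return flagged
    ```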

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rohe, Daniel Peter

    Sandia National Laboratories has recently purchased a Polytec 3D Scanning Laser Doppler Vibrometer for vibration measurement. This device has proven to be a very nice tool for making vibration measurements, and has a number of advantages over traditional sensors such as accelerometers. The non-contact nature of the laser vibrometer means there is no mass loading due to measuring the response. Additionally, the laser scanning heads can position the laser spot much more quickly and accurately than placing an accelerometer or performing a roving hammer impact. The disadvantage of the system is that a significant amount of time must be invested to align the lasers with each other and the part so that the laser spots can be accurately positioned. The Polytec software includes a number of nice tools to aid in this procedure; however, certain portions are still tedious. Luckily, the Polytec software is readily extensible by programming macros for the system, so tedious portions of the procedure can be made easier by automating the process. The Polytec Software includes a WinWrap (similar to Visual Basic) editor and interface to run macros written in that programming language. The author, however, is much more proficient in Python, and the latter also has a much larger set of libraries that can be used to create very complex macros, while taking advantage of Python's inherent readability and maintainability.

  3. Easing access to R using 'shiny' to create graphical user interfaces: An example for the R package 'Luminescence'

    NASA Astrophysics Data System (ADS)

    Burow, Christoph; Kreutzer, Sebastian; Dietze, Michael; Fuchs, Margret C.; Schmidt, Christoph; Fischer, Manfred; Brückner, Helmut

    2017-04-01

    Since the release of the R package 'Luminescence' (Kreutzer et al., 2012) the functionality of the package has been greatly enhanced by implementing further functions for measurement data processing, statistical analysis and graphical output. Despite its capabilities for complex and non-standard analysis of luminescence data, working with the command-line interface (CLI) of R can be tedious at best and overwhelming at worst, especially for users without experience in programming languages. Even though much work is put into simplifying the usage of the package to continuously lower the entry threshold, at least basic knowledge of R will always be required. Thus, the potential user base of the package cannot be exhausted, at least as long as the CLI is the only means of utilising the 'Luminescence' package. But even experienced users may find it tedious to iteratively run a function until a satisfying result is produced. For example, plotting data is also at least partly subject to personal aesthetic tastes in accordance with the information it is supposed to convey, and iterating through all the possible options in the R CLI can be a time-consuming task. An alternative approach to the CLI is the graphical user interface (GUI), which allows direct, interactive manipulation and interaction with the underlying software. For users with little or no experience with command lines, a GUI offers intuitive access that counteracts the perceived steep learning curve of a CLI. Even though R lacks native support for GUI functions, its capability of linking to other programming languages makes it possible to utilise external frameworks to build graphical user interfaces. A recent attempt to provide a GUI toolkit for R was the introduction of the 'shiny' package (Chang et al., 2016), which allows automatic construction of HTML-, CSS- and JavaScript-based user interfaces straight from R. Here, we give (1) a brief introduction to the 'shiny' framework for R, before we (2) present a GUI for the R package 'Luminescence' in the form of interactive web applications. These applications can be accessed online, so that a user is not even required to have a local installation of R, and they provide access to most of the plotting functions of the R package 'Luminescence'. These functionalities will be demonstrated live during the PICO session. References: Chang, W., Cheng, J., Allaire, JJ., Xie, Y., McPherson, J., 2016. shiny: Web Application Framework for R. R package version 0.13.2. https://CRAN.R-project.org/package=shiny Kreutzer, S., Schmidt, C., Fuchs, M.C., Dietze, M., Fischer, M., Fuchs, M., 2012. Introducing an R package for luminescence dating analysis. Ancient TL, 30: 1-8.

  4. Differentiation of tea varieties using UV-Vis spectra and pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Palacios-Morillo, Ana; Alcázar, Ángela.; de Pablos, Fernando; Jurado, José Marcos

    2013-02-01

    Tea, one of the most consumed beverages all over the world, is of great importance in the economies of a number of countries. Several methods have been developed to classify tea varieties or origins based on pattern recognition techniques applied to chemical data, such as metal profile, amino acids, catechins and volatile compounds. Some of these analytical methods are tedious and expensive to apply in routine work. The use of UV-Vis spectral data as discriminant variables, highly influenced by the chemical composition, can be an alternative to these methods. UV-Vis spectra of methanol-water extracts of tea have been obtained in the interval 250-800 nm. Absorbances have been used as input variables. Principal component analysis was used to reduce the number of variables, and several pattern recognition methods, such as linear discriminant analysis, support vector machines and artificial neural networks, have been applied in order to differentiate the most common tea varieties. A successful classification model was built by combining principal component analysis and multilayer perceptron artificial neural networks, allowing the differentiation between tea varieties. This rapid and simple methodology can be applied to solve classification problems in the food industry, saving economic resources.
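
    A sketch of the PCA plus multilayer-perceptron pipeline described above, assuming X holds the absorbances (samples × wavelengths over 250-800 nm) and y the variety labels; component counts and layer sizes are illustrative.

    ```python
    # Dimensionality reduction of the spectra followed by an MLP classifier.
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score

    def tea_classifier(n_components=10):
        return make_pipeline(
            StandardScaler(),
            PCA(n_components=n_components),          # compress the spectral variables
            MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0))

    # scores = cross_val_score(tea_classifier(), X, y, cv=5)   # X, y supplied by the user
    ```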

  5. Patient-specific surgical planning and hemodynamic computational fluid dynamics optimization through free-form haptic anatomy editing tool (SURGEM).

    PubMed

    Pekkan, Kerem; Whited, Brian; Kanter, Kirk; Sharma, Shiva; de Zelicourt, Diane; Sundareswaran, Kartik; Frakes, David; Rossignac, Jarek; Yoganathan, Ajit P

    2008-11-01

    The first version of an anatomy editing/surgical planning tool (SURGEM) targeting anatomical complexity and patient-specific computational fluid dynamics (CFD) analysis is presented. Novel three-dimensional (3D) shape editing concepts and human-shape interaction technologies have been integrated to facilitate interactive surgical morphology alterations, grid generation and CFD analysis. In order to implement "manual hemodynamic optimization" at the surgery planning phase for patients with congenital heart defects, these tools are applied to design and evaluate possible modifications of patient-specific anatomies. In this context, anatomies involve complex geometric topologies and tortuous 3D blood flow pathways with multiple inlets and outlets. These tools make it possible to freely deform the lumen surface and to bend and position baffles through real-time, direct manipulation of the 3D models with both hands, thus eliminating the tedious and time-consuming phase of entering the desired geometry using traditional computer-aided design (CAD) systems. The 3D models of the modified anatomies are seamlessly exported and meshed for patient-specific CFD analysis. Free-formed anatomical modifications are quantified using an in-house skeletization-based cross-sectional geometry analysis tool. Hemodynamic performance of the systematically modified anatomies is compared with the original anatomy using CFD. CFD results showed the relative importance of the various surgically created features such as pouch size, vena cava to pulmonary artery (PA) flare and PA stenosis. An interactive surgical-patch size estimator is also introduced. The combined design/analysis cycle time is used for comparing and optimizing surgical plans, and improvements are tabulated. The reduced cost of the patient-specific shape design and analysis process made it possible to envision large clinical studies to assess the validity of predictive patient-specific CFD simulations. In this paper, model anatomical design studies are performed on a total of eight different complex patient-specific anatomies. Using SURGEM, more than 30 new anatomical designs (or candidate configurations) are created, and the corresponding user times are presented. CFD performances for eight of these candidate configurations are also presented.

  6. Automatic quality control in clinical (1)H MRSI of brain cancer.

    PubMed

    Pedrosa de Barros, Nuno; McKinley, Richard; Knecht, Urspeter; Wiest, Roland; Slotboom, Johannes

    2016-05-01

    MRSI grids frequently show spectra with poor quality, mainly because of the high sensitivity of MRS to field inhomogeneities. These poor quality spectra are prone to quantification and/or interpretation errors that can have a significant impact on the clinical use of spectroscopic data. Therefore, quality control of the spectra should always precede their clinical use. When performed manually, quality assessment of MRSI spectra is not only a tedious and time-consuming task, but is also affected by human subjectivity. Consequently, automatic, fast and reliable methods for spectral quality assessment are of utmost interest. In this article, we present a new random forest-based method for automatic quality assessment of (1)H MRSI brain spectra, which uses a new set of MRS signal features. The random forest classifier was trained on spectra from 40 MRSI grids that were classified as acceptable or non-acceptable by two expert spectroscopists. To account for the effects of intra-rater reliability, each spectrum was rated for quality three times by each rater. The automatic method classified these spectra with an area under the curve (AUC) of 0.976. Furthermore, in the subset of spectra containing only the cases that were classified every time in the same way by the spectroscopists, an AUC of 0.998 was obtained. Feature importance for the classification was also evaluated. Frequency domain skewness and kurtosis, as well as time domain signal-to-noise ratios (SNRs) in the ranges 50-75 ms and 75-100 ms, were the most important features. Given that the method is able to assess a whole MRSI grid faster than a spectroscopist (approximately 3 s versus approximately 3 min), and without loss of accuracy (agreement between classifier trained with just one session and any of the other labelling sessions, 89.88%; agreement between any two labelling sessions, 89.03%), the authors suggest its implementation in the clinical routine. The method presented in this article was implemented in jMRUI's SpectrIm plugin. Copyright © 2016 John Wiley & Sons, Ltd.
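
    A sketch of a random-forest quality classifier built from the kinds of features named above (frequency-domain skewness and kurtosis, time-domain SNR in fixed windows); the exact feature definitions, windows and labels are assumptions, not those of the published classifier.

    ```python
    # Hand-crafted spectral features feeding a random forest quality classifier.
    import numpy as np
    from scipy.stats import skew, kurtosis
    from sklearn.ensemble import RandomForestClassifier

    def qc_features(fid, dt):
        """fid: complex time-domain MRS signal, dt: dwell time in seconds."""
        spectrum = np.abs(np.fft.fftshift(np.fft.fft(fid)))
        t = np.arange(len(fid)) * dt
        noise = np.std(fid[3 * len(fid) // 4:].real) + 1e-12     # tail taken as noise floor
        snr_50_75 = np.abs(fid[(t >= 0.050) & (t < 0.075)]).mean() / noise
        snr_75_100 = np.abs(fid[(t >= 0.075) & (t < 0.100)]).mean() / noise
        return [skew(spectrum), kurtosis(spectrum), snr_50_75, snr_75_100]

    # X = np.array([qc_features(f, dt) for f in fids]); y = expert accept/reject labels
    # clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
    ```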

  7. Synthesis of 2,4,8,10-tetroxaspiro[5.5]undecane

    NASA Technical Reports Server (NTRS)

    Poshkus, A. C. (Inventor)

    1985-01-01

    Pentaerythritol is converted to its diformal, 2,4,8,10-tetroxaspiro[5.5]undecane, by heating it to a temperature within the range of about 110 to 150 °C, for a period of up to 10 minutes, in the presence of a slight excess of paraformaldehyde and of a catalytic quantity of an acid catalyst such as sulfuric acid. The reaction may be carried out in two steps, by forming first the monoformal, then the diformal. In any case, total reaction time is about 10 minutes, and yields of the diformal are greater than 90%. Previous processes require hours or days, and often tedious operating procedures.

  8. References for scientific papers: why not standardise to one global style?

    PubMed Central

    Kumar, A. M. V.; Satyanarayana, S.; Bissell, K.; Hinderaker, S. G.; Edginton, M.; Reid, A. J.; Zachariah, R.

    2013-01-01

    The different reference styles demanded by journals, both for in-text citations and manuscript bibliographies, require that significant time and attention be paid to minute details that constitute a tedious obstacle on the road to publication for all authors, but especially for those from resource-limited countries and/or writing in a second language. To illustrate this, we highlight different reference styles requested by five popular journals to which operational research papers are often submitted. We call for a simpler, standardised format for in-text and bibliography reference citations, so that researchers can concentrate on the science and its interpretation rather than fonts and punctuation. PMID:26393041

  9. Laplace Transforms without Integration

    ERIC Educational Resources Information Center

    Robertson, Robert L.

    2017-01-01

    Calculating Laplace transforms from the definition often requires tedious integrations. This paper provides an integration-free technique for calculating Laplace transforms of many familiar functions. It also shows how the technique can be applied to probability theory.
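
    As one illustration of the kind of integration-free shortcut such techniques rely on (not necessarily the specific method of the paper), a known transform can be differentiated with respect to a parameter instead of evaluating a new integral:

    ```latex
    % Differentiating a known transform with respect to the parameter a avoids a fresh integration.
    \mathcal{L}\{e^{at}\}(s) = \frac{1}{s-a}
    \quad\Longrightarrow\quad
    \mathcal{L}\{t\,e^{at}\}(s)
      = \frac{\partial}{\partial a}\,\mathcal{L}\{e^{at}\}(s)
      = \frac{1}{(s-a)^{2}}, \qquad s > a .
    ```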

  10. Determination of endocrine disrupting chemicals and antiretroviral compounds in surface water: A disposable sorptive sampler with comprehensive gas chromatography - Time-of-flight mass spectrometry and large volume injection with ultra-high performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Wooding, Madelien; Rohwer, Egmont R; Naudé, Yvette

    2017-05-05

    Many rural dwellers and inhabitants of informal settlements in South Africa are without access to treated water and collect untreated water from rivers and dams for personal use. Endocrine disrupting chemicals (EDCs) have been detected in surface water and wildlife of South Africa. EDCs are often present in complex environmental matrices at ultra-trace levels, complicating detection thereof. We report a simplified multi-residue approach for the detection and quantification of EDCs, emerging EDCs, and antiretroviral drugs in surface water. A low cost (less than one US dollar), disposable, sorptive extraction sampler was prepared in-house. The disposable samplers consisted of polydimethylsiloxane (PDMS) tubing fashioned into a loop which was then placed in water samples to concentrate EDCs and emerging pollutants. The PDMS samplers were thermally desorbed directly in the inlet of a GC, thereby eliminating the need for expensive consumable cryogenics. Comprehensive gas chromatography coupled to time-of-flight mass spectrometry (GC×GC-TOFMS) was used for compound separation and identification. Linear retention indices of EDCs and emerging pollutants were determined on a proprietary Crossbond® phase Rtx®-CLPesticides II GC capillary column. In addition, large volume injection of surface water into an ultra-performance liquid chromatograph tandem mass spectrometer (UPLC-MS/MS) was used as complementary methodology for the detection of less volatile compounds. Large volume injection reduced tedious and costly sample preparation steps. Limits of detection for the GC method ranged from 1 to 98 pg/L and for the LC method from 2 to 135 ng/L. Known and emerging EDCs such as pharmaceuticals, personal care products and pesticides, as well as the antiretroviral compounds, efavirenz and nevirapine, were detected in surface water from South Africa at concentration levels ranging from 0.16 ng/L to 227 ng/L. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Stirring-controlled solidified floating solid-liquid drop microextraction as a new solid phase-enhanced liquid-phase microextraction method by exploiting magnetic carbon nanotube-nickel hybrid.

    PubMed

    Ghazaghi, Mehri; Mousavi, Hassan Zavvar; Shirkhanloo, Hamid; Rashidi, Alimorad

    2017-01-25

    A specific technique is introduced to overcome limitations of classical solidification of floating organic drop microextraction, such as the tedious and time-consuming centrifugation step and the use of a disperser solvent, through the facile and efficient participation of solid and liquid phases. In this proposed method of stirring-controlled solidified floating solid-liquid drop microextraction (SC-SF-SLDME), the magnetic carbon nanotube-nickel hybrid (MNi-CNT), as the solid part of the extractor, is dispersed ultrasonically in the sample solution; the procedure is followed by dispersion of the liquid phase (1-undecanol) through high-rate stirring and easy recollection of MNi-CNT in the organic solvent droplets through hydrophobic forces. With the reduction in stirring speed, one solid-liquid drop is formed on top of the solution. MNi-CNT acts as both extractor and coalescence helper between organic droplets for facile recollection. MNi-CNT was prepared by spray pyrolysis of a nickel oleate/toluene mixture at 1000 °C. Four tyrosine kinase inhibitors were selected as model analytes and the influential parameters were investigated. The results confirmed that the magnetic nanoadsorbent has an important role in the procedure and that complete collection of the dispersed solvent is not achieved in the absence of the solid phase. Also, the short extraction time demonstrated the success of the proposed method and the effect of the dispersed solid/liquid phases. The limits of quantification (LOQs) for imatinib, sunitinib, erlotinib, and nilotinib were determined to be as low as 0.7, 1.7, 0.6, and 1.0 μg L⁻¹, respectively. The intra-day precisions (RSDs) were lower than 4.5%. Method performance was investigated by determination of the mentioned tyrosine kinase inhibitors (TKIs) in human serum and cerebrospinal fluid samples, with good recoveries in the range of 93-98%. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Ground-penetrating radar (GPR) responses for sub-surface salt contamination and solid waste: modeling and controlled lysimeter studies.

    PubMed

    Wijewardana, Y N S; Shilpadi, A T; Mowjood, M I M; Kawamoto, K; Galagedara, L W

    2017-02-01

    The assessment of polluted areas and municipal solid waste (MSW) sites using non-destructive geophysical methods is timely and much needed in the field of environmental monitoring and management. The objectives of this study are (i) to evaluate the ground-penetrating radar (GPR) wave responses as a result of different electrical conductivity (EC) in groundwater and (ii) to conduct MSW stratification using a controlled lysimeter and modeling approach. A GPR wave simulation was carried out using GprMax2D software, and the field test was done on two lysimeters that were filled with sand (Lysimeter-1) and MSW (Lysimeter-2). A Pulse EKKO-Pro GPR system with 200- and 500-MHz center frequency antennae was used to collect GPR field data. Amplitudes of GPR-reflected waves (sub-surface reflectors and water table) were studied under different EC levels injected to the water table. Modeling results revealed that the signal strength of the reflected wave decreases with increasing EC levels and the disappearance of the subsurface reflection and wave amplitude reaching zero at higher EC levels (when EC >0.28 S/m). Further, when the EC level was high, the plume thickness did not have a significant effect on the amplitude of the reflected wave. However, it was also found that reflected signal strength decreases with increasing plume thickness at a given EC level. 2D GPR profile images under wet conditions showed stratification of the waste layers and relative thickness, but it was difficult to resolve the waste layers under dry conditions. These results show that the GPR as a non-destructive method with a relatively larger sample volume can be used to identify highly polluted areas with inorganic contaminants in groundwater and waste stratification. The current methods of MSW dumpsite investigation are tedious, destructive, time consuming, costly, and provide only point-scale measurements. However, further research is needed to verify the results under heterogeneous aquifer conditions and complex dumpsite conditions.

  13. Direct aqueous determination of glyphosate and related compounds by liquid chromatography/tandem mass spectrometry using reversed-phase and weak anion-exchange mixed-mode column.

    PubMed

    Hao, Chunyan; Morse, David; Morra, Franca; Zhao, Xiaoming; Yang, Paul; Nunn, Brian

    2011-08-19

    Analysis of the broad-spectrum herbicide glyphosate and its related compounds is quite challenging. Tedious and time-consuming derivatization is often required for these substances due to their high polarity, high water solubility, low volatility and a molecular structure that lacks either a chromophore or a fluorophore. A novel liquid chromatography/tandem mass spectrometry (LC/MS-MS) method has been developed for the determination of glyphosate, aminomethylphosphonic acid (AMPA) and glufosinate using a reversed-phase and weak anion-exchange mixed-mode Acclaim® WAX-1 column. Aqueous environmental samples are directly injected and analyzed in 12 min with no sample concentration or derivatization steps. Two multiple reaction monitoring (MRM) channels are monitored in the method for each target compound to achieve true positive identification, and ¹³C, ¹⁵N-glyphosate is used as an internal standard to carry out isotope dilution mass spectrometric (IDMS) measurement for glyphosate. The instrument detection limits (IDLs) for glyphosate, AMPA and glufosinate are 1, 2 and 0.9 μg/L, respectively. Linearity of the detector response, with a minimum coefficient of determination (R² > 0.995), was demonstrated in the range of ~10 to 10³ μg/L for each analyte. Spiked drinking water, surface water and groundwater samples were analyzed using this method, and the average recoveries of analytes in the three matrices ranged from 77.0 to 102%, 62.1 to 101%, and 66.1 to 93.7%, while relative standard deviations ranged from 6.3 to 10.2%, 2.7 to 14.8%, and 2.9 to 10.7%, respectively. Factors that may affect method performance, such as metal ions, sample preservation, and storage time, are also discussed. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  14. Defect detection and classification of machined surfaces under multiple illuminant directions

    NASA Astrophysics Data System (ADS)

    Liao, Yi; Weng, Xin; Swonger, C. W.; Ni, Jun

    2010-08-01

    Continuous improvement of product quality is crucial to the successful and competitive automotive manufacturing industry in the 21st century. The presence of surface porosity located on flat machined surfaces such as cylinder heads/blocks and transmission cases may allow leaks of coolant, oil, or combustion gas between critical mating surfaces, thus causing damage to the engine or transmission. Therefore, 100% inline inspection plays an important role in improving product quality. Although the techniques of image processing and machine vision have been applied to machined surface inspection and much improved in the past 20 years, in today's automotive industry, surface porosity inspection is still done by skilled humans, which is costly, tedious, time-consuming and not capable of reliably detecting small defects. In our study, an automated defect detection and classification system for flat machined surfaces has been designed and constructed. In this paper, the importance of the illuminant direction in a machine vision system was first emphasized, and then the surface defect inspection system under multiple directional illuminations was designed and constructed. After that, image processing algorithms were developed to detect and classify five types of 2D or 3D surface defects (pore, 2D blemish, residue dirt, scratch, and gouge). The steps of image processing include: (1) image acquisition and contrast enhancement, (2) defect segmentation and feature extraction, and (3) defect classification. An artificial machined surface and an actual automotive part (a cylinder head surface) were tested; as a result, microscopic surface defects can be accurately detected and assigned to a surface defect class. The cycle time of this system can be sufficiently fast that implementation of 100% inline inspection is feasible. The field of view of this system is 150 mm × 225 mm, and surfaces larger than the field of view can be stitched together in software.
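
    A sketch of the three processing steps listed above (contrast enhancement, segmentation and feature extraction, rule-based classification) using OpenCV; the thresholds and class rules are illustrative, not the calibrated values of the inspection system.

    ```python
    # Enhance contrast, segment dark defects on a bright machined surface (8-bit grayscale),
    # then classify each blob from simple shape features.
    import cv2

    def classify_defects(gray):
        enhanced = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(gray)
        _, binary = cv2.threshold(enhanced, 0, 255,
                                  cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

        defects = []
        for c in contours:
            area = cv2.contourArea(c)
            if area < 5:                      # ignore noise specks
                continue
            x, y, w, h = cv2.boundingRect(c)
            elongation = max(w, h) / max(min(w, h), 1)
            label = "scratch" if elongation > 4 else ("pore" if area < 100 else "blemish")
            defects.append((label, (x, y, w, h)))
        return defects
    ```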

  15. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    PubMed

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial lengths of the imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of diverse quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
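
    A generic active-contour ("snake") sizing sketch with scikit-image, the algorithm family named above; this is not SpheroidSizer itself, and the initialization, smoothing parameters and prolate-spheroid volume formula are assumptions.

    ```python
    # Fit a snake around a spheroid starting from a circular initialization, then
    # derive axial lengths and an approximate volume from the converged contour.
    import numpy as np
    from skimage.filters import gaussian
    from skimage.segmentation import active_contour

    def measure_spheroid(gray, center, radius, n_points=200):
        theta = np.linspace(0, 2 * np.pi, n_points)
        init = np.column_stack([center[0] + radius * np.sin(theta),
                                center[1] + radius * np.cos(theta)])   # (row, col) pairs
        snake = active_contour(gaussian(gray, 3), init,
                               alpha=0.015, beta=10.0, gamma=0.001)
        d = np.linalg.norm(snake - snake.mean(axis=0), axis=1)
        semi_major, semi_minor = d.max(), d.min()
        volume = 4.0 / 3.0 * np.pi * semi_major * semi_minor ** 2      # prolate-spheroid assumption
        return 2 * semi_major, 2 * semi_minor, volume
    ```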

  16. Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System.

    PubMed

    Punjabi, Naresh M; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N

    2015-10-01

    Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. The analysis sample comprised 97 sleep studies collected at clinical sleep laboratories. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events were conducted using the 2007 American Academy of Sleep Medicine criteria. A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI across the four clinical sites was 0.92 (95% confidence interval: 0.90-0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91-0.96). Thus, interscorer correlation between the manually scored results was no different from that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between manually and automatically scored percentages of sleep stages N1, N2, and N3. Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine. Although differences exist between manual versus automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different from that between any two human scorers. In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. © 2015 Associated Professional Sleep Societies, LLC.

  17. Maui-VIA: A User-Friendly Software for Visual Identification, Alignment, Correction, and Quantification of Gas Chromatography–Mass Spectrometry Data

    PubMed Central

    Kuich, P. Henning J. L.; Hoffmann, Nils; Kempa, Stefan

    2015-01-01

    A current bottleneck in GC–MS metabolomics is the processing of raw machine data into a final datamatrix that contains the quantities of identified metabolites in each sample. While there are many bioinformatics tools available to aid the initial steps of the process, their use requires both significant technical expertise and a subsequent manual validation of identifications and alignments if high data quality is desired. The manual validation is tedious and time-consuming, becoming prohibitively so as sample numbers increase. We have, therefore, developed Maui-VIA, a solution based on a visual interface that allows experts and non-experts to simultaneously and quickly process, inspect, and correct large numbers of GC–MS samples. It allows for the visual inspection of identifications and alignments, facilitating a unique and, due to its visualization and keyboard shortcuts, very fast interaction with the data. Therefore, Maui-VIA fills an important niche by (1) providing functionality that optimizes the currently most labor-intensive component of data processing, saving time, and (2) lowering the threshold of expertise required to process GC–MS data. Maui-VIA projects are initiated with baseline-corrected raw data, peaklists, and a database of metabolite spectra and retention indices used for identification. It provides functionality for retention index calculation, a targeted library search, the visual annotation, alignment, and correction interface, and metabolite quantification, as well as the export of the final datamatrix. The high quality of data produced by Maui-VIA is illustrated by its comparison to data attained manually by an expert using vendor software on a previously published dataset concerning the response of Chlamydomonas reinhardtii to salt stress. In conclusion, Maui-VIA provides the opportunity for fast, confident, and high-quality data processing validation of large numbers of GC–MS samples by non-experts. PMID:25654076

  18. Automatic analysis of the micronucleus test in primary human lymphocytes using image analysis.

    PubMed

    Frieauff, W; Martus, H J; Suter, W; Elhajouji, A

    2013-01-01

    The in vitro micronucleus test (MNT) is a well-established test for early screening of new chemical entities in industrial toxicology. For assessing the clastogenic or aneugenic potential of a test compound, micronucleus induction in cells has been shown repeatedly to be a sensitive and a specific parameter. Various automated systems to replace the tedious and time-consuming visual slide analysis procedure as well as flow cytometric approaches have been discussed. The ROBIAS (Robotic Image Analysis System) for both automatic cytotoxicity assessment and micronucleus detection in human lymphocytes was developed at Novartis where the assay has been used to validate positive results obtained in the MNT in TK6 cells, which serves as the primary screening system for genotoxicity profiling in early drug development. In addition, the in vitro MNT has become an accepted alternative to support clinical studies and will be used for regulatory purposes as well. The comparison of visual with automatic analysis results showed a high degree of concordance for 25 independent experiments conducted for the profiling of 12 compounds. For concentration series of cyclophosphamide and carbendazim, a very good correlation between automatic and visual analysis by two examiners could be established, both for the relative division index used as cytotoxicity parameter, as well as for micronuclei scoring in mono- and binucleated cells. Generally, false-positive micronucleus decisions could be controlled by fast and simple relocation of the automatically detected patterns. The possibility to analyse 24 slides within 65h by automatic analysis over the weekend and the high reproducibility of the results make automatic image processing a powerful tool for the micronucleus analysis in primary human lymphocytes. The automated slide analysis for the MNT in human lymphocytes complements the portfolio of image analysis applications on ROBIAS which is supporting various assays at Novartis.

  19. Image based Monte Carlo Modeling for Computational Phantom

    NASA Astrophysics Data System (ADS)

    Cheng, Mengyun; Wang, Wen; Zhao, Kai; Fan, Yanchang; Long, Pengcheng; Wu, Yican

    2014-06-01

    Evaluating the effects of ionizing radiation and the risk of radiation exposure to the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct more realistic computational phantoms. However, manual description and verification of models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed by the FDS Team (Advanced Nuclear Energy Research Team, http://www.fds.org.cn) as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested on several medical and sectioned image datasets, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose were calculated for Rad-HUMAN. Rad-HUMAN can be applied to predict and evaluate dose distributions in treatment planning systems (TPS), as well as radiation exposure of the human body in radiation protection.

  20. A versatile 2A peptide-based bicistronic protein expressing platform for the industrial cellulase producing fungus, Trichoderma reesei

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Subramanian, Venkataramanan; Schuster, Logan A.; Moore, Kyle T.

    Here, the industrial workhorse fungus, Trichoderma reesei, is typically exploited for its ability to produce cellulase enzymes, whereas use of this fungus for over-expression of other proteins (homologous and heterologous) is still very limited. Identifying transformants expressing target protein is a tedious task due to low transformation efficiency, combined with highly variable expression levels between transformants. Routine methods for identification include PCR-based analysis, western blotting, or crude activity screening, all of which are time-consuming techniques. To simplify this screening, we have adapted the 2A peptide system from the foot-and-mouth disease virus (FMDV) to T. reesei to express a readily screenable marker protein that is co-translated with a target protein. The 2A peptide sequence allows multiple independent genes to be transcribed as a single mRNA. Upon translation, the 2A peptide sequence causes a 'ribosomal skip' generating two (or more) independent gene products. When the 2A peptide is translated, the 'skip' occurs between its two C-terminal amino acids (glycine and proline), resulting in the addition of extra amino acids on the C terminus of the upstream protein and a single proline addition to the N terminus of the downstream protein. To test this approach, we have cloned two heterologous proteins on either side of a modified 2A peptide, a secreted cellobiohydrolase enzyme (Cel7A from Penicillium funiculosum) as our target protein, and an intracellular enhanced green fluorescent protein (eGFP) as our marker protein. Using straightforward monitoring of eGFP expression, we have shown that we can efficiently monitor the expression of the target Cel7A protein.

  1. A pulsed injection parahydrogen generator and techniques for quantifying enrichment.

    PubMed

    Feng, Bibo; Coffey, Aaron M; Colon, Raul D; Chekmenev, Eduard Y; Waddell, Kevin W

    2012-01-01

    A device is presented for efficiently enriching parahydrogen by pulsed injection of ambient hydrogen gas. Hydrogen input to the generator is pulsed at high pressure to a catalyst chamber making thermal contact with the cold head of a closed-cycle cryocooler maintained between 15 and 20 K. The system enables fast production (0.9 standard liters per minute) and allows for a wide range of production targets. Production rates can be systematically adjusted by varying the actuation sequence of high-pressure solenoid valves, which are controlled via an open source microcontroller to sample all combinations between fast and thorough enrichment by varying the duration of hydrogen contact in the catalyst chamber. The entire enrichment cycle, from optimization to quantification and storage kinetics, is also described. Conversion of the para spin-isomer to orthohydrogen in borosilicate tubes was measured at 8 min intervals over a period of 64 h with a 12 T NMR spectrometer. These relaxation curves were then used to extract the initial enrichment by exploiting the known equilibrium (relaxed) distribution of spin isomers, with linear least squares fitting to a single exponential decay curve yielding an estimated error less than or equal to 1%. This procedure is time-consuming, but requires only one sample pressurized to atmosphere. Given that tedious matching to external references is unnecessary with this procedure, we find it to be useful for periodic inspection of generator performance. The equipment and procedures offer a variation in generator design that eliminates the need to meter flow while enabling access to increased rates of production. These tools for enriching and quantifying parahydrogen have been in steady use for 3 years and should be helpful as a template or as reference material for building and operating a parahydrogen production facility. Copyright © 2011 Elsevier Inc. All rights reserved.
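
    A sketch of the enrichment-quantification step: least-squares fitting of a single exponential decay toward the known equilibrium para fraction (about 0.25 near room temperature); the data arrays below are synthetic placeholders, not the published measurements.

    ```python
    # Fit x_para(t) = x_eq + (x0 - x_eq) * exp(-t/tau) and report the initial enrichment x0.
    import numpy as np
    from scipy.optimize import curve_fit

    X_EQ = 0.25   # approximate equilibrium para fraction near room temperature

    def para_fraction(t, x0, tau):
        return X_EQ + (x0 - X_EQ) * np.exp(-t / tau)

    # Placeholder data standing in for para fractions inferred from the ortho NMR signal.
    t_hours = np.linspace(0, 64, 9)
    x_para = para_fraction(t_hours, 0.92, 30) + np.random.normal(0, 0.005, t_hours.size)

    (x0_fit, tau_fit), _ = curve_fit(para_fraction, t_hours, x_para, p0=[0.9, 20])
    print(f"initial enrichment ~{x0_fit:.2f}, decay constant ~{tau_fit:.1f} h")
    ```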

  2. Visualizing astronomy data using VRML

    NASA Astrophysics Data System (ADS)

    Beeson, Brett; Lancaster, Michael; Barnes, David G.; Bourke, Paul D.; Rixon, Guy T.

    2004-09-01

    Visualisation is a powerful tool for understanding the large data sets typical of astronomical surveys and can reveal unsuspected relationships and anomalous regions of parameter space which may be difficult to find programatically. Visualisation is a classic information technology for optimising scientific return. We are developing a number of generic on-line visualisation tools as a component of the Australian Virtual Observatory project. The tools will be deployed within the framework of the International Virtual Observatory Alliance (IVOA), and follow agreed-upon standards to make them accessible by other programs and people. We and our IVOA partners plan to utilise new information technologies (such as grid computing and web services) to advance the scientific return of existing and future instrumentation. Here we present a new tool - VOlume - which visualises point data. Visualisation of astronomical data normally requires the local installation of complex software, the downloading of potentially large datasets, and very often time-consuming and tedious data format conversions. VOlume enables the astronomer to visualise data using just a web browser and plug-in. This is achieved using IVOA standards which allow us to pass data between Web Services, Java Servlet Technology and Common Gateway Interface programs. Data from a catalogue server can be streamed in eXtensible Mark-up Language format to a servlet which produces Virtual Reality Modeling Language output. The user selects elements of the catalogue to map to geometry and then visualises the result in a browser plug-in such as Cortona or FreeWRL. Other than requiring an input VOTable format file, VOlume is very general. While its major use will likely be to display and explore astronomical source catalogues, it can easily render other important parameter fields such as the sky and redshift coverage of proposed surveys or the sampling of the visibility plane by a rotation-synthesis interferometer.

  3. Drug-target interaction prediction using ensemble learning and dimensionality reduction.

    PubMed

    Ezzat, Ali; Wu, Min; Li, Xiao-Li; Kwoh, Chee-Keong

    2017-10-01

    Experimental prediction of drug-target interactions is expensive, time-consuming and tedious. Fortunately, computational methods help narrow down the search space for interaction candidates to be further examined via wet-lab techniques. Nowadays, the number of attributes/features for drugs and targets, as well as the amount of their interactions, are increasing, making these computational methods inefficient or occasionally prohibitive. This motivates us to derive a reduced feature set for prediction. In addition, since ensemble learning techniques are widely used to improve the classification performance, it is also worthwhile to design an ensemble learning framework to enhance the performance for drug-target interaction prediction. In this paper, we propose a framework for drug-target interaction prediction leveraging both feature dimensionality reduction and ensemble learning. First, we conducted feature subspacing to inject diversity into the classifier ensemble. Second, we applied three different dimensionality reduction methods to the subspaced features. Third, we trained homogeneous base learners with the reduced features and then aggregated their scores to derive the final predictions. For base learners, we selected two classifiers, namely Decision Tree and Kernel Ridge Regression, resulting in two variants of ensemble models, EnsemDT and EnsemKRR, respectively. In our experiments, we utilized AUC (Area under ROC Curve) as an evaluation metric. We compared our proposed methods with various state-of-the-art methods under 5-fold cross validation. Experimental results showed EnsemKRR achieving the highest AUC (94.3%) for predicting drug-target interactions. In addition, dimensionality reduction helped improve the performance of EnsemDT. In conclusion, our proposed methods produced significant improvements for drug-target interaction prediction. Copyright © 2017 Elsevier Inc. All rights reserved.
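
    The core recipe (random feature subspacing, dimensionality reduction on each subspace, and score averaging over homogeneous base learners) can be sketched with scikit-learn as below. The feature matrix, labels, subspace fraction, PCA dimension and tree depth are illustrative stand-ins rather than the datasets or settings used in the paper; the EnsemDT variant uses Decision Trees, and the analogous EnsemKRR variant would swap in kernel ridge regression.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.standard_normal((2000, 400))              # stand-in drug+target feature vectors
        y = (rng.random(2000) < 0.1).astype(int)          # sparse interaction labels
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0, stratify=y)

        n_learners, subspace = 30, 0.5
        scores = np.zeros(len(y_te))
        for i in range(n_learners):
            cols = rng.choice(X.shape[1], int(subspace * X.shape[1]), replace=False)   # feature subspacing
            pca = PCA(n_components=50, random_state=i).fit(X_tr[:, cols])              # dimensionality reduction
            clf = DecisionTreeClassifier(max_depth=8, random_state=i)
            clf.fit(pca.transform(X_tr[:, cols]), y_tr)
            scores += clf.predict_proba(pca.transform(X_te[:, cols]))[:, 1]            # aggregate scores
        scores /= n_learners
        print("AUC:", roc_auc_score(y_te, scores))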

  4. A versatile 2A peptide-based bicistronic protein expressing platform for the industrial cellulase producing fungus, Trichoderma reesei

    DOE PAGES

    Subramanian, Venkataramanan; Schuster, Logan A.; Moore, Kyle T.; ...

    2017-02-06

    Here, the industrial workhorse fungus, Trichoderma reesei, is typically exploited for its ability to produce cellulase enzymes, whereas use of this fungus for over-expression of other proteins (homologous and heterologous) is still very limited. Identifying transformants expressing target protein is a tedious task due to low transformation efficiency, combined with highly variable expression levels between transformants. Routine methods for identification include PCR-based analysis, western blotting, or crude activity screening, all of which are time-consuming techniques. To simplify this screening, we have adapted the 2A peptide system from the foot-and-mouth disease virus (FMDV) to T. reesei to express a readily screenable marker protein that is co-translated with a target protein. The 2A peptide sequence allows multiple independent genes to be transcribed as a single mRNA. Upon translation, the 2A peptide sequence causes a 'ribosomal skip' generating two (or more) independent gene products. When the 2A peptide is translated, the 'skip' occurs between its two C-terminal amino acids (glycine and proline), resulting in the addition of extra amino acids on the C terminus of the upstream protein and a single proline addition to the N terminus of the downstream protein. To test this approach, we have cloned two heterologous proteins on either side of a modified 2A peptide, a secreted cellobiohydrolase enzyme (Cel7A from Penicillium funiculosum) as our target protein, and an intracellular enhanced green fluorescent protein (eGFP) as our marker protein. Using straightforward monitoring of eGFP expression, we have shown that we can efficiently monitor the expression of the target Cel7A protein.

  5. Automated Glioblastoma Segmentation Based on a Multiparametric Structured Unsupervised Classification

    PubMed Central

    Juan-Albarracín, Javier; Fuster-Garcia, Elies; Manjón, José V.; Robles, Montserrat; Aparici, F.; Martí-Bonmatí, L.; García-Gómez, Juan M.

    2015-01-01

    Automatic brain tumour segmentation has become a key component for the future of brain tumour treatment. Currently, most brain tumour segmentation approaches arise from the supervised learning standpoint, which requires a labelled training dataset from which to infer the models of the classes. The performance of these models is directly determined by the size and quality of the training corpus, whose retrieval becomes a tedious and time-consuming task. On the other hand, unsupervised approaches avoid these limitations but often do not reach results comparable to those of the supervised methods. In this sense, we propose an automated unsupervised method for brain tumour segmentation based on anatomical Magnetic Resonance (MR) images. Four unsupervised classification algorithms, grouped by their structured or non-structured condition, were evaluated within our pipeline. Considering the non-structured algorithms, we evaluated K-means, Fuzzy K-means and Gaussian Mixture Model (GMM), whereas as structured classification algorithms we evaluated Gaussian Hidden Markov Random Field (GHMRF). An automated postprocess based on a statistical approach supported by tissue probability maps is proposed to automatically identify the tumour classes after the segmentations. We evaluated our brain tumour segmentation method with the public BRAin Tumor Segmentation (BRATS) 2013 Test and Leaderboard datasets. Our approach based on the GMM model improves the results obtained by most of the supervised methods evaluated with the Leaderboard set and reaches the second position in the ranking. Our variant based on the GHMRF achieves the first position in the Test ranking of the unsupervised approaches and the seventh position in the general Test ranking, which confirms the method as a viable alternative for brain tumour segmentation. PMID:25978453
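
    As a minimal sketch of the non-structured GMM branch of such a pipeline, the Python snippet below clusters multiparametric voxel feature vectors with a Gaussian Mixture Model; the random feature matrix, the number of components and the omission of the tissue-probability-map postprocessing are all simplifying assumptions.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Stand-in for co-registered multiparametric MR volumes (e.g. T1, T1c, T2, FLAIR)
        # flattened to one feature vector per brain voxel.
        rng = np.random.default_rng(0)
        voxels = rng.standard_normal((50000, 4))

        # One Gaussian component per expected tissue/tumour class; each voxel is assigned
        # its most probable component, and soft posteriors are kept for postprocessing.
        gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
        labels = gmm.fit_predict(voxels)
        posteriors = gmm.predict_proba(voxels)
        print(np.bincount(labels), posteriors.shape)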

  6. ChromatoGate: A Tool for Detecting Base Mis-Calls in Multiple Sequence Alignments by Semi-Automatic Chromatogram Inspection

    PubMed Central

    Alachiotis, Nikolaos; Vogiatzi, Emmanouella; Pavlidis, Pavlos; Stamatakis, Alexandros

    2013-01-01

    Automated DNA sequencers generate chromatograms that contain raw sequencing data. They also generate data that translates the chromatograms into molecular sequences of A, C, G, T, or N (undetermined) characters. Since chromatogram translation programs frequently introduce errors, a manual inspection of the generated sequence data is required. As sequence numbers and lengths increase, visual inspection and manual correction of chromatograms and corresponding sequences on a per-peak and per-nucleotide basis becomes an error-prone, time-consuming, and tedious process. Here, we introduce ChromatoGate (CG), an open-source software that accelerates and partially automates the inspection of chromatograms and the detection of sequencing errors for bidirectional sequencing runs. To provide users full control over the error correction process, a fully automated error correction algorithm has not been implemented. Initially, the program scans a given multiple sequence alignment (MSA) for potential sequencing errors, assuming that each polymorphic site in the alignment may be attributed to a sequencing error with a certain probability. The guided MSA assembly procedure in ChromatoGate detects chromatogram peaks of all characters in an alignment that lead to polymorphic sites, given a user-defined threshold. The threshold value represents the sensitivity of the sequencing error detection mechanism. After this pre-filtering, the user only needs to inspect a small number of peaks in every chromatogram to correct sequencing errors. Finally, we show that correcting sequencing errors is important, because population genetic and phylogenetic inferences can be misled by MSAs with uncorrected mis-calls. Our experiments indicate that estimates of population mutation rates can be affected two- to three-fold by uncorrected errors. PMID:24688709
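
    The pre-filtering idea (flag alignment columns where a rare minority character may indicate a mis-call, then inspect only the corresponding chromatogram peaks) can be sketched in a few lines of Python. The toy alignment and threshold below are illustrative and not ChromatoGate's actual implementation.

        from collections import Counter

        msa = [
            "ACGTACGTAC",
            "ACGTACGTAC",
            "ACGTACGCAC",   # possible mis-call at column 7
            "ACGTACGTAC",
        ]
        threshold = 1       # maximum minority-character count for a site to be flagged

        flagged = []
        for col in range(len(msa[0])):
            counts = Counter(seq[col] for seq in msa if seq[col] != "N")
            if len(counts) > 1:
                minority = sum(counts.values()) - counts.most_common(1)[0][1]
                if minority <= threshold:
                    flagged.append(col)   # candidate sequencing error: inspect this chromatogram peak
        print("columns to inspect:", flagged)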

  7. ChromatoGate: A Tool for Detecting Base Mis-Calls in Multiple Sequence Alignments by Semi-Automatic Chromatogram Inspection.

    PubMed

    Alachiotis, Nikolaos; Vogiatzi, Emmanouella; Pavlidis, Pavlos; Stamatakis, Alexandros

    2013-01-01

    Automated DNA sequencers generate chromatograms that contain raw sequencing data. They also generate data that translates the chromatograms into molecular sequences of A, C, G, T, or N (undetermined) characters. Since chromatogram translation programs frequently introduce errors, a manual inspection of the generated sequence data is required. As sequence numbers and lengths increase, visual inspection and manual correction of chromatograms and corresponding sequences on a per-peak and per-nucleotide basis becomes an error-prone, time-consuming, and tedious process. Here, we introduce ChromatoGate (CG), an open-source software that accelerates and partially automates the inspection of chromatograms and the detection of sequencing errors for bidirectional sequencing runs. To provide users full control over the error correction process, a fully automated error correction algorithm has not been implemented. Initially, the program scans a given multiple sequence alignment (MSA) for potential sequencing errors, assuming that each polymorphic site in the alignment may be attributed to a sequencing error with a certain probability. The guided MSA assembly procedure in ChromatoGate detects chromatogram peaks of all characters in an alignment that lead to polymorphic sites, given a user-defined threshold. The threshold value represents the sensitivity of the sequencing error detection mechanism. After this pre-filtering, the user only needs to inspect a small number of peaks in every chromatogram to correct sequencing errors. Finally, we show that correcting sequencing errors is important, because population genetic and phylogenetic inferences can be misled by MSAs with uncorrected mis-calls. Our experiments indicate that estimates of population mutation rates can be affected two- to three-fold by uncorrected errors.

  8. Some NACA Muroc personnel with snowman

    NASA Technical Reports Server (NTRS)

    1949-01-01

    The late 1940s saw increased flight activity, and more women computers were needed at the NACA Muroc Flight Test Unit than the ones who had originally arrived in 1946. A call went out to the NACA Langley, Lewis, and Ames laboratories for more women computers. Pictured in this photograph with the Snowman are some of the women computers who responded to the call for help in 1948 along with Roxanah, Emily, Dorothy, who were already here. Standing left to right: Mary (Tut) Hedgepeth, from Langley; Lilly Ann Bajus, Lewis; Roxanah Yancey, Emily Stephens, Jane Collons (Procurement), Leona Corbett (Personnel), Angel Dunn, Langley. Kneeling left to right: Dorothy (Dottie) Crawford Roth, Lewis; Dorothy Clift Hughes, and Gertrude (Trudy) Wilken Valentine, Lewis. In National Advisory Committee for Aeronautics (NACA) terminology of 1946, computers were employees who performed laborious and time-consuming mathematical calculations and data reduction from long strips of records generated by onboard aircraft instrumentation. Virtually without exception, computers were female; at least part of the rationale seems to have been the notion that the work was long and tedious, and men were not thought to have the patience to do it. Though equipment changed over the years and most computers eventually found themselves programming and operating electronic computers, as well as doing other data processing tasks, being a computer initially meant long hours with a slide rule, hunched over illuminated light boxes measuring line traces from grainy and obscure strips of oscillograph film. Computers suffered terrible eyestrain, and those who didn't begin by wearing glasses did so after a few years. But they were initially essential employees at the Muroc Flight Test Unit and NACA High-Speed Flight Research Station, taking the oscillograph flight records and 'reducing' the data on them to make them useful to research engineers, who analyzed the data.

  9. Computer vision-based automated peak picking applied to protein NMR spectra.

    PubMed

    Klukowski, Piotr; Walczak, Michal J; Gonczarek, Adam; Boudet, Julien; Wider, Gerhard

    2015-09-15

    A detailed analysis of multidimensional NMR spectra of macromolecules requires the identification of individual resonances (peaks). This task can be tedious and time-consuming and often requires support from experienced users. Automated peak picking algorithms were introduced more than 25 years ago, but there are still major deficiencies that often prevent complete and error-free peak picking of biological macromolecule spectra. The major challenges for automated peak picking algorithms are the distinction of artifacts from real peaks, particularly those with irregular shapes, and the picking of peaks in spectral regions with overlapping resonances, which are very hard to resolve with existing computer algorithms. In both of these cases a visual inspection approach could be more effective than a 'blind' algorithm. We present a novel approach using computer vision (CV) methodology which could be better adapted to the problem of peak recognition. After suitable 'training' we successfully applied the CV algorithm to spectra of medium-sized soluble proteins up to molecular weights of 26 kDa and to a 130 kDa complex of a tetrameric membrane protein in detergent micelles. Our CV approach outperforms commonly used programs. With suitable training datasets the application of the presented method can be extended to automated peak picking in multidimensional spectra of nucleic acids or carbohydrates and adapted to solid-state NMR spectra. CV-Peak Picker is available upon request from the authors. gsw@mol.biol.ethz.ch; michal.walczak@mol.biol.ethz.ch; adam.gonczarek@pwr.edu.pl Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
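
    For contrast with the computer-vision approach, a conventional 'blind' peak picker is essentially a thresholded local-maximum search. The sketch below applies that baseline to a synthetic 2D plane; the peak positions, smoothing and threshold are arbitrary, and this is not the CV-Peak Picker algorithm itself.

        import numpy as np
        from scipy.ndimage import gaussian_filter, maximum_filter

        # Synthetic 2D spectrum: a few broad peaks on top of noise.
        rng = np.random.default_rng(1)
        spec = rng.normal(0, 0.05, (256, 256))
        y, x = np.ogrid[:256, :256]
        for (r, c) in [(40, 60), (100, 180), (200, 90)]:
            spec += 1.0 / (1 + ((y - r) ** 2 + (x - c) ** 2) / 9.0)
        spec = gaussian_filter(spec, 1)

        # A pixel is a peak candidate if it is the maximum of its neighbourhood and
        # exceeds a noise-based intensity threshold.
        local_max = maximum_filter(spec, size=7) == spec
        threshold = spec.mean() + 6 * spec.std()
        peaks = np.argwhere(local_max & (spec > threshold))
        print(peaks)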

  10. Tools for automating the imaging of zebrafish larvae.

    PubMed

    Pulak, Rock

    2016-03-01

    The VAST BioImager system is a set of tools developed for zebrafish researchers who require the collection of images from a large number of 2-7 dpf zebrafish larvae. The VAST BioImager automates larval handling, positioning and orientation tasks. Color images at about 10 μm resolution are collected from the on-board camera of the system. If images of greater resolution and detail are required, this system is mounted on an upright microscope, such as a confocal or fluorescence microscope, to utilize its capabilities. The system loads a larva, positions it in view of the camera, determines its orientation using pattern recognition analysis, and then positions it more precisely to a user-defined orientation for optimal imaging of any desired tissue or organ system. Multiple images of the same larva can be collected. The specific part of each larva and the desired orientation and position are identified by the researcher, and an experiment defining the settings and a series of steps can be saved and repeated for imaging of subsequent larvae. The system captures images, then ejects and loads another larva from either a bulk reservoir, a well of a 96 well plate using the LP Sampler, or individually targeted larvae from a Petri dish or other container using the VAST Pipettor. Alternative manual protocols for handling larvae for image collection are tedious and time-consuming. The VAST BioImager automates these steps to allow for greater throughput of assays and screens requiring high-content image collection of zebrafish larvae such as might be used in drug discovery and toxicology studies. Copyright © 2015 The Author. Published by Elsevier Inc. All rights reserved.

  11. Osteoarthritis classification using self organizing map based on gabor kernel and contrast-limited adaptive histogram equalization.

    PubMed

    Anifah, Lilik; Purnama, I Ketut Eddy; Hariadi, Mochamad; Purnomo, Mauridhi Hery

    2013-01-01

    Localization is the first step in osteoarthritis (OA) classification. Manual classification, however, is time-consuming, tedious, and expensive. The proposed system is designed as a decision support system for medical doctors to classify the severity of knee OA. A method is proposed here to localize the joint space area and then classify OA into KL-Grade 0, KL-Grade 1, KL-Grade 2, KL-Grade 3 and KL-Grade 4 in four steps: preprocessing, segmentation, feature extraction, and classification. In this proposed system, right and left knee detection was performed by employing Contrast-Limited Adaptive Histogram Equalization (CLAHE) and template matching. The Gabor kernel, row sum graph and moment methods were used to localize the joint space area of the knee. CLAHE is used in the preprocessing step, i.e., to normalize the varied intensities. The segmentation process was conducted using the Gabor kernel, template matching, row sum graph and gray level center of mass methods. Here, GLCM (contrast, correlation, energy, and homogeneity) features were employed as training data. Overall, 50 samples were evaluated for training and 258 for testing. Experimental results showed the best performance using a Gabor kernel with parameters α=8, θ=0, Ψ=[0 π/2], γ=0.8, N=4 and with 5000 iterations, a momentum value of 0.5 and α0=0.6 for the classification process. The run gave classification accuracy rates of 93.8% for KL-Grade 0, 70% for KL-Grade 1, 4% for KL-Grade 2, 10% for KL-Grade 3 and 88.9% for KL-Grade 4.
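
    The preprocessing and feature-extraction steps map naturally onto standard image-processing routines. The Python sketch below uses scikit-image (recent versions, where the GLCM helpers are named graycomatrix/graycoprops) on a built-in test photograph as a stand-in for a knee radiograph; the filter parameters and region of interest are placeholders rather than the paper's tuned values.

        import numpy as np
        from skimage import data, exposure, feature, filters

        img = data.camera()                               # stand-in for a knee radiograph

        # CLAHE to normalise the varied intensities (preprocessing step).
        img_eq = exposure.equalize_adapthist(img, clip_limit=0.03)

        # Gabor filtering; the paper sweeps kernel parameters to localise the joint space.
        gabor_real, gabor_imag = filters.gabor(img_eq, frequency=0.2, theta=0)

        # GLCM texture features (contrast, correlation, energy, homogeneity) over an
        # 8-bit quantised region of interest; these would feed the SOM classifier.
        roi = (img_eq[100:228, 100:228] * 255).astype(np.uint8)
        glcm = feature.graycomatrix(roi, distances=[1], angles=[0], levels=256,
                                    symmetric=True, normed=True)
        feats = [feature.graycoprops(glcm, p)[0, 0]
                 for p in ("contrast", "correlation", "energy", "homogeneity")]
        print(gabor_real.shape, feats)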

  12. Osteoarthritis Classification Using Self Organizing Map Based on Gabor Kernel and Contrast-Limited Adaptive Histogram Equalization

    PubMed Central

    Anifah, Lilik; Purnama, I Ketut Eddy; Hariadi, Mochamad; Purnomo, Mauridhi Hery

    2013-01-01

    Localization is the first step in osteoarthritis (OA) classification. Manual classification, however, is time-consuming, tedious, and expensive. The proposed system is designed as a decision support system for medical doctors to classify the severity of knee OA. A method is proposed here to localize the joint space area and then classify OA into KL-Grade 0, KL-Grade 1, KL-Grade 2, KL-Grade 3 and KL-Grade 4 in four steps: preprocessing, segmentation, feature extraction, and classification. In this proposed system, right and left knee detection was performed by employing Contrast-Limited Adaptive Histogram Equalization (CLAHE) and template matching. The Gabor kernel, row sum graph and moment methods were used to localize the joint space area of the knee. CLAHE is used in the preprocessing step, i.e., to normalize the varied intensities. The segmentation process was conducted using the Gabor kernel, template matching, row sum graph and gray level center of mass methods. Here, GLCM (contrast, correlation, energy, and homogeneity) features were employed as training data. Overall, 50 samples were evaluated for training and 258 for testing. Experimental results showed the best performance using a Gabor kernel with parameters α=8, θ=0, Ψ=[0 π/2], γ=0.8, N=4 and with 5000 iterations, a momentum value of 0.5 and α0=0.6 for the classification process. The run gave classification accuracy rates of 93.8% for KL-Grade 0, 70% for KL-Grade 1, 4% for KL-Grade 2, 10% for KL-Grade 3 and 88.9% for KL-Grade 4. PMID:23525188

  13. FPGA wavelet processor design using language for instruction-set architectures (LISA)

    NASA Astrophysics Data System (ADS)

    Meyer-Bäse, Uwe; Vera, Alonzo; Rao, Suhasini; Lenk, Karl; Pattichis, Marios

    2007-04-01

    The design of a microprocessor is a long, tedious, and error-prone task that typically consists of several design phases: architecture exploration, software design (assembler, linker, loader, profiler), architecture implementation (RTL generation for FPGA or cell-based ASIC) and verification. The Language for Instruction-Set Architectures (LISA) allows a microprocessor to be modeled not only from an instruction-set description but also from an architecture description, including pipelining behavior, which enables design and development tool consistency across all levels of the design. To explore the capability of the LISA processor design platform, a.k.a. CoWare Processor Designer, we present in this paper three microprocessor designs that implement an 8/8 wavelet transform processor of the kind used in today's FBI fingerprint compression scheme. We have designed a 3-stage pipelined 16-bit RISC processor (NanoBlaze). Although RISC μPs are usually considered "fast" processors due to design concepts like a constant instruction word size, deep pipelines and many general-purpose registers, it turns out that DSP operations consume substantial processing time in a RISC processor. In a second step we used design principles from programmable digital signal processors (PDSPs) to improve the throughput of the DWT processor. A multiply-accumulate operation along with indirect addressing operations was the key to achieving higher throughput. A further improvement is possible with today's FPGA technology. Today's FPGAs offer a large number of embedded array multipliers and it is now feasible to design a "true" vector processor (TVP). A multiplication of two vectors can be done in just one clock cycle with our TVP, and a complete scalar product in two clock cycles. Code profiling and Xilinx FPGA ISE synthesis results are provided that demonstrate the substantial improvement that a TVP offers compared with traditional RISC or PDSP designs.

  14. Automation technology using Geographic Information System (GIS)

    NASA Technical Reports Server (NTRS)

    Brooks, Cynthia L.

    1994-01-01

    Airport Surface Movement Area is but one of the actions taken to increase the capacity and safety of existing airport facilities. The System Integration Branch (SIB) has designed an integrated system consisting of an electronic moving display in the cockpit, and includes display of taxi routes which will warn controllers and pilots of the position of other traffic and warning information automatically. Although, this system has in test simulation proven to be accurate and helpful; the initial process of obtaining an airport layout of the taxi-routes and designing each of them is a very tedious and time-consuming process. Other methods of preparing the display maps are being researched. One such method is the use of the Geographical Information System (GIS). GIS is an integrated system of computer hardware and software linking topographical, demographic and other resource data that is being referenced. The software can support many areas of work with virtually unlimited information compatibility due to the system's open architecture. GIS will allow us to work faster with increased efficiency and accuracy while providing decision making capabilities. GIS is currently being used at the Langley Research Center with other applications and has been validated as an accurate system for that task. GIS usage for our task will involve digitizing aerial photographs of the topology for each taxi-runway and identifying each position according to its specific spatial coordinates. The information currently being used can be integrated with the GIS system, due to its ability to provide a wide variety of user interfaces. Much more research and data analysis will be needed before this technique will be used, however we are hopeful this will lead to better usage of man-power and technological capabilities for the future.

  15. An immunochromatographic assay for rapid and direct detection of 3-amino-5-morpholino-2-oxazolidone (AMOZ) in meat and feed samples.

    PubMed

    Li, Shuqun; Song, Juan; Yang, Hong; Cao, Biyun; Chang, Huafang; Deng, Anping

    2014-03-15

    Furaltadone (FTD) is a type of nitrofuran and has been banned in many countries as a veterinary drug in food-producing animals owing to its potential carcinogenicity and mutagenicity. FTD is unstable in vivo, rapidly metabolizing to 3-amino-5-methylmorpholino-2-oxazolidinone (AMOZ); thus AMOZ can be used as an indicator for illegal usage of FTD. Usually, for the determination of nitrofurans, the analyte is often a derivative of the metabolite rather than the metabolite itself. In this study, based on the monoclonal antibody (mAb) against AMOZ, a competitive immunochromatographic assay (ICA) using a colloidal gold-mAb probe for rapid and direct detection of AMOZ without a derivatization step in meat and feed samples was developed. The intensity of red color in the test line is inversely related to the analyte concentration and the visual detection limit was found to be 10 ng mL⁻¹. The performance of this assay was simple and convenient because the tedious and time-consuming derivatization step was avoided. The ICA detection was completed within 10 min. The ICA strips could be used for 7 weeks at room temperature without significant loss of activity. The AMOZ spiked samples were detected by ICA and confirmed by enzyme-linked immunosorbent assay. The results of the two methods were in good agreement. The proposed ICA provides a feasible tool for simple, sensitive, rapid, convenient and semi-quantitative detection of AMOZ in meat and feed samples on site. To our knowledge, this is the first report of the ICA for direct detection of AMOZ. © 2013 Society of Chemical Industry.

  16. Ultrasound image-based thyroid nodule automatic segmentation using convolutional neural networks.

    PubMed

    Ma, Jinlian; Wu, Fa; Jiang, Tian'an; Zhao, Qiyu; Kong, Dexing

    2017-11-01

    Delineation of thyroid nodule boundaries from ultrasound images plays an important role in calculation of clinical indices and diagnosis of thyroid diseases. However, accurate and automatic segmentation of thyroid nodules is challenging because of their heterogeneous appearance and components similar to the background. In this study, we employ a deep convolutional neural network (CNN) to automatically segment thyroid nodules from ultrasound images. Our CNN-based method formulates the thyroid nodule segmentation problem as a patch classification task, where the relationship among patches is ignored. Specifically, the CNN used image patches from images of normal thyroids and thyroid nodules as inputs and then generated the segmentation probability maps as outputs. A multi-view strategy is used to improve the performance of the CNN-based model. Additionally, we compared the performance of our approach with that of the commonly used segmentation methods on the same dataset. The experimental results suggest that our proposed method outperforms prior methods on thyroid nodule segmentation. Moreover, the results show that the CNN-based model is able to delineate multiple nodules in thyroid ultrasound images accurately and effectively. In detail, our CNN-based model achieves average values of the overlap metric, Dice ratio, true positive rate, false positive rate, and modified Hausdorff distance of [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text], [Formula: see text] over all folds, respectively. Our proposed method is fully automatic without any user interaction. Quantitative results also indicate that our method is efficient and accurate enough to replace the time-consuming and tedious manual segmentation approach, demonstrating its potential for clinical application.
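
    The patch-classification formulation, in which each pixel is labelled from the patch centred on it and the per-patch outputs are reassembled into a probability map, can be sketched as follows. The trained CNN is replaced here by a trivial intensity rule, and the toy image, patch radius and threshold are illustrative assumptions.

        import numpy as np

        def predict_patch(patch):
            """Hypothetical stand-in for the trained CNN: a simple intensity rule."""
            return float(patch.mean() > 0.5)

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))
        img[20:40, 25:50] += 0.4                      # bright "nodule" region
        half = 8                                      # patch radius
        padded = np.pad(img, half, mode="reflect")

        prob_map = np.zeros_like(img)
        for r in range(img.shape[0]):
            for c in range(img.shape[1]):
                patch = padded[r:r + 2 * half + 1, c:c + 2 * half + 1]
                prob_map[r, c] = predict_patch(patch) # per-pixel nodule probability

        segmentation = prob_map > 0.5
        print(segmentation.sum(), "pixels labelled as nodule")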

  17. AutoIHC-scoring: a machine learning framework for automated Allred scoring of molecular expression in ER- and PR-stained breast cancer tissue.

    PubMed

    Tewary, S; Arun, I; Ahmed, R; Chatterjee, S; Chakraborty, C

    2017-11-01

    In the prognostic evaluation of breast cancer, the immunohistochemical (IHC) markers oestrogen receptor (ER) and progesterone receptor (PR) are widely used. The expert pathologist qualitatively examines the stained tissue slide under the microscope to provide the Allred score, which is used clinically for therapeutic decision making. Such qualitative judgment is time-consuming, tedious and often suffers from interobserver variability, leading to imprecise IHC scores for ER and PR. To overcome this, there is an urgent need to develop a reliable and efficient IHC quantifier for high-throughput decision making. In view of this, our study aims at developing an automated IHC profiler for quantitative assessment of ER and PR molecular expression from stained tissue images. We propose here to use the CMYK colour space to extract positively and negatively stained cells for the proportion score. Colour features are also used for quantitative intensity scoring among the positively stained cells. Five different machine learning models, namely artificial neural network, Naïve Bayes, K-nearest neighbours, decision tree and random forest, are considered for learning the colour features using average red, green and blue pixel values of positively stained cell patches. Fifty cases of ER- and PR-stained tissues have been evaluated for validation against the expert pathologist's score. All five models perform adequately, with random forest showing the best correlation with the expert's score (Pearson's correlation coefficient = 0.9192). In the proposed approach the average variation of diaminobenzidine (DAB) to nuclear area from the expert's score is found to be 7.58%, as compared to 27.83% for the state-of-the-art ImmunoRatio software. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
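
    A minimal sketch of the colour-space step is given below: convert an RGB tile to CMYK, take a crude DAB-positive mask from the yellow channel, and bin the positive fraction. The tile, the 0.5 threshold and the proportion-score bins (the commonly cited Allred bins) are illustrative assumptions, not the paper's trained models or calibration.

        import numpy as np

        def rgb_to_cmyk(rgb):
            """Convert an HxWx3 float RGB image in [0, 1] to C, M, Y, K channels."""
            k = 1.0 - rgb.max(axis=-1)
            denom = np.clip(1.0 - k, 1e-6, None)
            c = (1.0 - rgb[..., 0] - k) / denom
            m = (1.0 - rgb[..., 1] - k) / denom
            y = (1.0 - rgb[..., 2] - k) / denom
            return c, m, y, k

        def proportion_score(frac):
            """Commonly cited Allred proportion bins (an assumption, not from the paper)."""
            if frac == 0:    return 0
            if frac < 0.01:  return 1
            if frac <= 0.10: return 2
            if frac <= 0.33: return 3
            if frac <= 0.66: return 4
            return 5

        # Toy tile: brownish (DAB-positive) patch on a bluish (haematoxylin) background.
        tile = np.full((100, 100, 3), [0.35, 0.45, 0.75])
        tile[30:60, 30:60] = [0.55, 0.35, 0.15]
        c, m, y, k = rgb_to_cmyk(tile)
        positive = y > 0.5                            # crude DAB mask via the yellow channel
        print(positive.mean(), proportion_score(positive.mean()))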

  18. A hetero-micro-seeding strategy for readily crystallizing closely related protein variants.

    PubMed

    Islam, Mohammad M; Kuroda, Yutaka

    2017-11-04

    Protein crystallization remains difficult to rationalize and screening for optimal crystallization conditions is a tedious and time consuming procedure. Here, we report a hetero-micro-seeding strategy for producing high resolution crystals of closely related protein variants, where micro crystals from a readily crystallized variant are used as seeds to develop crystals of other variants less amenable to crystallization. We applied this strategy to Bovine Pancreatic Trypsin Inhibitor (BPTI) variants, which would not crystallize using standard crystallization practice. Out of six variants in our analysis, only one called BPTI-[5,55]A14G formed well behaving crystals; and the remaining five (A14GA38G, A14GA38V, A14GA38L, A14GA38I, and A14GA38K) could be crystallized only using micro-seeds from the BPTI-[5,55]A14G crystal. All hetero-seeded crystals diffracted at high resolution with minimum mosaicity, retaining the same space group and cell dimension. Moreover, hetero-micro-seeding did not introduce any biases into the mutant's structure toward the seed structure, as demonstrated by A14GA38I structures solved using micro-seeds from A14GA38G, A14GA38L and A14GA38I. Though hetero-micro-seeding is a simple and almost naïve strategy, this is the first direct demonstration of its workability. We believe that hetero-micro-seeding, which is contrasting with the popular idea that crystallization requires highly purified proteins, could contribute a new tool for rapidly solving protein structures in mutational analysis studies. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Use of GIS-Based Sampling to Inform Food Security Assessments and Decision Making in Kenya

    NASA Astrophysics Data System (ADS)

    Wahome, A.; Ndubi, A. O.; Ndungu, L. W.; Mugo, R. M.; Flores Cordova, A. I.

    2017-12-01

    Kenya relies on agricultural production for supporting local consumption and other processing value chains. With a changing climate in a rain-fed agricultural production system, cropping zones are shifting, and proper decision making will require updated data. Where up-to-date data are not available, it is important that they are generated and passed on to relevant stakeholders to inform decision making. The process of generating these data should be cost-effective and less time-consuming. The Kenyan State Department of Agriculture (SDA) runs an insurance programme for maize farmers in a number of counties in Kenya. Previously, SDA was using a list of farmers to identify the crop fields for this insurance programme. However, the process of listing all farmers in each Unit Area of Insurance (UAI) proved to be tedious and very costly, hence the need for an alternative, yet acceptable, sampling methodology. Building on existing cropland maps, SERVIR, a joint NASA-USAID initiative that brings Earth observations (EO) to bear on environmental decision making in developing countries, specifically its hub in Eastern and Southern Africa, developed a high-resolution map based on 10 m Sentinel satellite images, from which a GIS-based sampling frame for identifying maize fields was derived. Sampling points were randomly generated in each UAI and navigated to using hand-held GPS units for identification of maize farmers. With GIS-based identification of farmers, SDA covers in one day an area that took one week with list-based identification. Similarly, SDA spends approximately 3,000 USD per sub-county to locate maize fields using GIS-based sampling, as compared to the roughly 10,000 USD spent previously. This has resulted in a 70% cost reduction.
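
    The sampling step itself is straightforward to reproduce: draw random points inside each UAI polygon and hand the coordinates to field teams. The Python sketch below uses shapely with a made-up polygon in projected coordinates; in practice the UAI boundaries would come from the cropland-mask layer and the points would be exported to the hand-held GPS units.

        import random
        from shapely.geometry import Point, Polygon

        uai = Polygon([(0, 0), (4000, 0), (4500, 3000), (500, 3500)])   # hypothetical UAI boundary

        def random_points_in(poly, n, seed=0):
            """Rejection-sample n random points inside a polygon."""
            rng = random.Random(seed)
            minx, miny, maxx, maxy = poly.bounds
            pts = []
            while len(pts) < n:
                p = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
                if poly.contains(p):
                    pts.append(p)
            return pts

        for p in random_points_in(uai, 20)[:3]:
            print(round(p.x, 1), round(p.y, 1))       # coordinates to navigate to in the field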

  20. A new method for the prediction of combustion instability

    NASA Astrophysics Data System (ADS)

    Flanagan, Steven Meville

    This dissertation presents a new approach to the prediction of combustion instability in solid rocket motors. Previous attempts at developing computational tools to solve this problem have been largely unsuccessful, showing very poor agreement with experimental results and having little or no predictive capability. This is due primarily to deficiencies in the linear stability theory upon which these efforts have been based. Recent advances in linear instability theory by Flandro have demonstrated the importance of including unsteady rotational effects, previously considered negligible. Previous versions of the theory also neglected corrections to the unsteady flow field of the first order in the mean flow Mach number. This research explores the stability implications of extending the solution to include these corrections. Also, the corrected linear stability theory based upon a rotational unsteady flow field extended to first order in mean flow Mach number has been implemented in two computer programs developed for the Macintosh platform. A quasi one-dimensional version of the program has been developed which is based upon an approximate solution to the cavity acoustics problem. The three-dimensional program applies Green's Function Discretization (GFD) to the solution for the acoustic mode shapes and frequency. GFD is a recently developed numerical method for finding fully three-dimensional solutions for this class of problems. The analysis of complex motor geometries, previously a tedious and time-consuming task, has also been greatly simplified through the development of a drawing package designed specifically to facilitate the specification of typical motor geometries. The combination of the drawing package, improved acoustic solutions, and new analysis results in a tool which is capable of producing more accurate and meaningful predictions than have been possible in the past.

  1. Intelligent and automatic in vivo detection and quantification of transplanted cells in MRI.

    PubMed

    Afridi, Muhammad Jamal; Ross, Arun; Liu, Xiaoming; Bennewitz, Margaret F; Shuboni, Dorela D; Shapiro, Erik M

    2017-11-01

    Magnetic resonance imaging (MRI)-based cell tracking has emerged as a useful tool for identifying the location of transplanted cells, and even their migration. Magnetically labeled cells appear as dark contrast in T2*-weighted MRI, with sensitivity down to individual cells. One key hurdle to the widespread use of MRI-based cell tracking is the inability to determine the number of transplanted cells based on this contrast feature. In the case of single cell detection, manual enumeration of spots in three-dimensional (3D) MRI in principle is possible; however, it is a tedious and time-consuming task that is prone to subjectivity and inaccuracy on a large scale. This research presents the first comprehensive study on how a computer-based intelligent, automatic, and accurate cell quantification approach can be designed for spot detection in MRI scans. Magnetically labeled mesenchymal stem cells (MSCs) were transplanted into rats using an intracardiac injection, accomplishing single cell seeding in the brain. T2*-weighted MRI of these rat brains was performed, in which labeled MSCs appeared as spots. Using machine learning and computer vision paradigms, approaches were designed to systematically explore the possibility of automatic detection of these spots in MRI. Experiments were validated against known in vitro scenarios. Using the proposed deep convolutional neural network (CNN) architecture, an in vivo accuracy of up to 97.3% and an in vitro accuracy of up to 99.8% were achieved for automated spot detection in MRI data. The proposed approach for automatic quantification of MRI-based cell tracking will facilitate the use of MRI in large-scale cell therapy studies. Magn Reson Med 78:1991-2002, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  2. Development of an Electrochemical DNA Biosensor to Detect a Foodborne Pathogen.

    PubMed

    Nordin, Noordiana; Yusof, Nor Azah; Radu, Son; Hushiarian, Roozbeh

    2018-06-03

    Vibrio parahaemolyticus (V. parahaemolyticus) is a common foodborne pathogen that contributes to a large proportion of public health problems globally, significantly affecting the rate of human mortality and morbidity. Conventional methods for the detection of V. parahaemolyticus such as culture-based methods, immunological assays, and molecular-based methods require complicated sample handling and are time-consuming, tedious, and costly. Recently, biosensors have proven to be a promising and comprehensive detection method with the advantages of fast detection, cost-effectiveness, and practicality. This research focuses on developing a rapid method of detecting V. parahaemolyticus with high selectivity and sensitivity using the principles of DNA hybridization. In the work, characterization of synthesized polylactic acid-stabilized gold nanoparticles (PLA-AuNPs) was achieved using X-ray Diffraction (XRD), Ultraviolet-visible Spectroscopy (UV-Vis), Transmission Electron Microscopy (TEM), Field-emission Scanning Electron Microscopy (FESEM), and Cyclic Voltammetry (CV). We also carried out further testing of stability, sensitivity, and reproducibility of the PLA-AuNPs. We found that the PLA-AuNPs formed a sound structure of stabilized nanoparticles in aqueous solution. We also observed that the sensitivity improved as a result of the smaller charge transfer resistance (Rct) value and an increase of active surface area (0.41 cm 2 ). The development of our DNA biosensor was based on modification of a screen-printed carbon electrode (SPCE) with PLA-AuNPs and using methylene blue (MB) as the redox indicator. We assessed the immobilization and hybridization events by differential pulse voltammetry (DPV). We found that complementary, non-complementary, and mismatched oligonucleotides were specifically distinguished by the fabricated biosensor. It also showed reliably sensitive detection in cross-reactivity studies against various food-borne pathogens and in the identification of V. parahaemolyticus in fresh cockles.

  3. A Pulsed Injection Parahydrogen Generator and Techniques for Quantifying Enrichment

    PubMed Central

    Feng, Bibo; Coffey, Aaron M.; Colon, Raul D.; Chekmenev, Eduard Y.; Waddell, Kevin W.

    2012-01-01

    A device is presented for efficiently enriching parahydrogen by pulsed injection of ambient hydrogen gas. Hydrogen input to the generator is pulsed at high pressure to a catalyst chamber making thermal contact with the cold head of a closed-cycle cryostat maintained between 15 and 20 K. The system enables fast production (0.9 standard liters per minute) and allows for a wide range of production targets. Production rates can be systematically adjusted by varying the actuation sequence of high-pressure solenoid valves, which are controlled via an open source microcontroller to sample all combinations between fast and thorough enrichment by varying the duration of hydrogen contact in the catalyst chamber. The entire enrichment cycle, from optimization to quantification, and the storage kinetics are also described. Conversion of the para spin-isomer to orthohydrogen in borosilicate tubes was measured at 8-minute intervals over a period of 64 hours with a 12 Tesla NMR spectrometer. These relaxation curves were then used to extract initial enrichment by exploiting the known equilibrium (relaxed) distribution of spin isomers with linear least squares fitting to a single exponential decay curve with an estimated error less than or equal to 1%. This procedure is time-consuming, but requires only one sample pressurized to atmosphere. Given that tedious matching to external references is unnecessary with this procedure, we find it to be useful for periodic inspection of generator performance. The equipment and procedures offer a variation in generator design that eliminates the need to meter flow while enabling access to increased rates of production. These tools for enriching and quantifying parahydrogen have been in steady use for 3 years and should be helpful as a template or as reference material for building and operating a parahydrogen production facility. PMID:22188975

  4. Automatic recognition of holistic functional brain networks using iteratively optimized convolutional neural networks (IO-CNN) with weak label initialization.

    PubMed

    Zhao, Yu; Ge, Fangfei; Liu, Tianming

    2018-07-01

    fMRI data decomposition techniques have advanced significantly from shallow models such as Independent Component Analysis (ICA) and Sparse Coding and Dictionary Learning (SCDL) to deep learning models such as Deep Belief Networks (DBN) and Deep Convolutional Autoencoders (DCAE). However, interpretation of the decomposed networks remains an open question due to the lack of functional brain atlases, the absence of correspondence among decomposed or reconstructed networks across different subjects, and significant individual variability. Recent studies showed that deep learning, especially deep convolutional neural networks (CNN), has an extraordinary ability to accommodate spatial object patterns; e.g., our recent work using 3D CNNs for fMRI-derived network classification achieved high accuracy with a remarkable tolerance for mistakenly labelled training brain networks. However, training data preparation is one of the biggest obstacles for these supervised deep learning models in functional brain network map recognition, since manual labelling requires tedious and time-consuming labour and sometimes even introduces label mistakes. For mapping functional networks in large-scale datasets, such as the hundreds of thousands of brain networks used in this paper, manual labelling becomes almost infeasible. In response, in this work we tackled both the network recognition and the training data labelling tasks by proposing a new iteratively optimized deep learning CNN (IO-CNN) framework with automatic weak label initialization, which turns functional brain network recognition into a fully automatic large-scale classification procedure. Our extensive experiments based on fMRI data from 1099 ABIDE-II brains showed the great promise of the IO-CNN framework. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. A heuristic approach to determine an appropriate number of topics in topic modeling

    PubMed Central

    2015-01-01

    Background Topic modelling is an active research field in machine learning. While mainly used to build models from unstructured textual data, it offers an effective means of data mining where samples represent documents, and different biological endpoints or omics data represent words. Latent Dirichlet Allocation (LDA) is the most commonly used topic modelling method across a wide number of technical fields. However, model development can be arduous and tedious, and requires burdensome and systematic sensitivity studies in order to find the best set of model parameters. Often, time-consuming subjective evaluations are needed to compare models. Currently, research has yielded no easy way to choose the proper number of topics in a model beyond a major iterative approach. Methods and results Based on analysis of the variation of statistical perplexity during topic modelling, a heuristic approach is proposed in this study to estimate the most appropriate number of topics. Specifically, the rate of perplexity change (RPC) as a function of the number of topics is proposed as a suitable selector. We test the stability and effectiveness of the proposed method for three markedly different types of ground-truth datasets: Salmonella next generation sequencing, pharmacological side effects, and textual abstracts on computational biology and bioinformatics (TCBB) from PubMed. Conclusion The proposed RPC-based method is demonstrated to choose the best number of topics in three numerical experiments of widely different data types, and for databases of very different sizes. The work required was markedly less arduous than if full systematic sensitivity studies had been carried out with number of topics as a parameter. We understand that additional investigation is needed to substantiate the method's theoretical basis, and to establish its generalizability in terms of dataset characteristics. PMID:26424364
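
    The RPC selector can be prototyped with any off-the-shelf LDA implementation: fit models over a grid of topic numbers, record held-out perplexity, and look at how fast perplexity changes between successive candidates. The scikit-learn sketch below uses a generic text corpus and a small grid as stand-ins; the candidate grid, iteration count and corpus are not those of the study, and the final decision rule applied to the RPC values follows the paper rather than this snippet.

        import numpy as np
        from sklearn.datasets import fetch_20newsgroups
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation
        from sklearn.model_selection import train_test_split

        docs = fetch_20newsgroups(remove=("headers", "footers", "quotes")).data[:1000]
        X = CountVectorizer(max_features=2000, stop_words="english").fit_transform(docs)
        X_train, X_test = train_test_split(X, test_size=0.2, random_state=0)

        topic_counts = [5, 10, 20, 40, 80]
        perplexities = []
        for k in topic_counts:
            lda = LatentDirichletAllocation(n_components=k, random_state=0, max_iter=10)
            lda.fit(X_train)
            perplexities.append(lda.perplexity(X_test))

        # Rate of perplexity change between successive candidate topic numbers.
        rpc = [abs(perplexities[i] - perplexities[i - 1]) / (topic_counts[i] - topic_counts[i - 1])
               for i in range(1, len(topic_counts))]
        for k, r in zip(topic_counts[1:], rpc):
            print(k, round(r, 3))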

  6. Meet Spinky: An Open-Source Spindle and K-Complex Detection Toolbox Validated on the Open-Access Montreal Archive of Sleep Studies (MASS).

    PubMed

    Lajnef, Tarek; O'Reilly, Christian; Combrisson, Etienne; Chaibi, Sahbi; Eichenlaub, Jean-Baptiste; Ruby, Perrine M; Aguera, Pierre-Emmanuel; Samet, Mounir; Kachouri, Abdennaceur; Frenette, Sonia; Carrier, Julie; Jerbi, Karim

    2017-01-01

    Sleep spindles and K-complexes are among the most prominent micro-events observed in electroencephalographic (EEG) recordings during sleep. These EEG microstructures are thought to be hallmarks of sleep-related cognitive processes. Although tedious and time-consuming, their identification and quantification are important for sleep studies in both healthy subjects and patients with sleep disorders. Therefore, procedures for automatic detection of spindles and K-complexes could provide valuable assistance to researchers and clinicians in the field. Recently, we proposed a framework for joint spindle and K-complex detection (Lajnef et al., 2015a) based on a Tunable Q-factor Wavelet Transform (TQWT; Selesnick, 2011a) and morphological component analysis (MCA). Using a wide range of performance metrics, the present article provides critical validation and benchmarking of the proposed approach by applying it to open-access EEG data from the Montreal Archive of Sleep Studies (MASS; O'Reilly et al., 2014). Importantly, the obtained scores were compared to those of alternative methods that were previously tested on the same database. With respect to spindle detection, our method achieved higher performance than most of the alternative methods. This was corroborated with statistical tests that take into account both sensitivity and precision (i.e., the Matthews correlation coefficient (MCC), F1, and Cohen's κ). Our proposed method has been made available to the community via an open-source tool named Spinky (for spindle and K-complex detection). Thanks to a GUI implementation and access to Matlab and Python resources, Spinky is expected to contribute to an open-science approach that will enhance replicability and reliable comparisons of classifier performances for the detection of sleep EEG microstructure in both healthy and patient populations.
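
    The benchmark metrics mentioned above are available directly in scikit-learn; the sketch below computes them for toy per-event detector output against an expert scoring, purely to show the calls involved (the labels are invented, not MASS data).

        from sklearn.metrics import cohen_kappa_score, f1_score, matthews_corrcoef

        expert   = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0]   # 1 = spindle present in the scoring window
        detector = [1, 0, 1, 0, 0, 0, 1, 1, 1, 1, 0, 0]

        print("MCC  :", matthews_corrcoef(expert, detector))
        print("F1   :", f1_score(expert, detector))
        print("kappa:", cohen_kappa_score(expert, detector))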

  7. Synergetic approach for simple and rapid conjugation of gold nanoparticles with oligonucleotides.

    PubMed

    Li, Jiuxing; Zhu, Binqing; Yao, Xiujie; Zhang, Yicong; Zhu, Zhi; Tu, Song; Jia, Shasha; Liu, Rudi; Kang, Huaizhi; Yang, Chaoyong James

    2014-10-08

    Attaching thiolated DNA on gold nanoparticles (AuNPs) has been extremely important in nanobiotechnology because DNA-AuNPs combine the programmability and molecular recognition properties of the biopolymers with the optical, thermal, and catalytic properties of the inorganic nanomaterials. However, current standard protocols to attach thiolated DNA on AuNPs involve time-consuming, tedious steps and do not perform well for large AuNPs, thereby greatly restricting applications of DNA-AuNPs. Here we demonstrate a rapid and facile strategy to attach thiolated DNA on AuNPs based on the excellent stabilization effect of mPEG-SH on AuNPs. AuNPs are first protected by mPEG-SH in the presence of Tween 20, which results in excellent stability of AuNPs in high ionic strength environments and extreme pHs. A high concentration of NaCl can be applied to the mixture of DNA and AuNP directly, allowing highly efficient DNA attachment to the AuNP surface by minimizing electrostatic repulsion. The entire DNA loading process can be completed in 1.5 h with only a few simple steps. DNA-loaded AuNPs are stable for more than 2 weeks at room temperature, and they can precisely hybridize with the complementary sequence, which was applied to prepare core-satellite nanostructures. Moreover, cytotoxicity assay confirmed that the DNA-AuNPs synthesized by this method exhibit lower cytotoxicity than those prepared by current standard methods. The proposed method provides a new way to stabilize AuNPs for rapid and facile loading thiolated DNA on AuNPs and will find wide applications in many areas requiring DNA-AuNPs, including diagnosis, therapy, and imaging.

  8. A novel method for rapidly isolating microbes that suppress soil-borne phytopathogens

    NASA Astrophysics Data System (ADS)

    Cooper, Sarah; Agnew, Linda; Pereg, Lily

    2016-04-01

    Seedling establishment faces a large number of challenges related to soil physical properties as well as to fungal root diseases. It is extremely difficult to eliminate fungal pathogens from soils where their populations are established, due to the persistent nature of their spores and since fumigation of resident fungi is very ineffective in clay-containing soils. Therefore it is necessary to find ways to overcome disease in areas where the soils are infected with fungal phytopathogens. The phenomenon of disease-suppressive soils, where the pathogen is present but no disease is observed, suggests that microbial antagonism in the soil may lead to the suppression of the growth of fungal pathogens. There are also cases in the literature where soil microorganisms were isolated that suppress the growth of phytopathogens. Antibiosis is one of the most important mechanisms responsible for fungal antagonism, with significant antifungal compounds involved, including antibiotics, volatile organic compounds, hydrogen cyanide and lytic enzymes. Isolation of pathogen-suppressive microorganisms from the soil is time-consuming and tedious. We established a simple method for direct isolation of soil microbes (bacteria and fungi) that suppress fungal phytopathogens, as well as procedures for confirmation of disease suppression. We will discuss such methods, which have so far been tested with the cotton fungal pathogens Thielaviopsis basicola, Verticillium dahliae and Fusarium oxysporum, and with Verticillium fungicola. We have isolated a diversity of T. basicola-suppressive fungi and bacteria from two vastly different soil types. Identification of the antagonistic isolates revealed that they are a diverse group; some belong to groups known to be suppressive of a wide range of fungal pathogens, underscoring the power of this technique to rapidly and directly isolate soil-borne microbes antagonistic to a wide variety of fungal pathogens.

  9. Magnetic solid-phase extraction using carbon nanotubes as sorbents: a review.

    PubMed

    Herrero-Latorre, C; Barciela-García, J; García-Martín, S; Peña-Crecente, R M; Otárola-Jiménez, J

    2015-09-10

    Magnetic solid-phase extraction (M-SPE) is a procedure based on the use of magnetic sorbents for the separation and preconcentration of different organic and inorganic analytes from large sample volumes. The magnetic sorbent is added to the sample solution and the target analyte is adsorbed onto the surface of the magnetic sorbent particles (M-SPs). Analyte-M-SPs are separated from the sample solution by applying an external magnetic field and, after elution with the appropriate solvent, the recovered analyte is analyzed. This approach has several advantages over traditional solid phase extraction as it avoids time-consuming and tedious on-column SPE procedures and it provides a rapid and simple analyte separation that avoids the need for centrifugation or filtration steps. As a consequence, in the past few years a great deal of research has been focused on M-SPE, including the development of new sorbents and novel automation strategies. In recent years, the use of magnetic carbon nanotubes (M-CNTs) as a sorption substrate in M-SPE has become an active area of research. These materials have exceptional mechanical, electrical, optical and magnetic properties and they also have an extremely large surface area and varied possibilities for functionalization. This review covers the synthesis of M-CNTs and the different approaches for the use of these compounds in M-SPE. The performance, general characteristics and applications of M-SPE based on magnetic carbon nanotubes for organic and inorganic analysis have been evaluated on the basis of more than 110 references. Finally, some important challenges with respect the use of magnetic carbon nanotubes in M-SPE are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Left ventricle segmentation in cardiac MRI images using fully convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Vázquez Romaguera, Liset; Costa, Marly Guimarães Fernandes; Romero, Francisco Perdigón; Costa Filho, Cicero Ferreira Fernandes

    2017-03-01

    According to the World Health Organization, cardiovascular diseases are the leading cause of death worldwide, accounting for 17.3 million deaths per year, a number that is expected to grow to more than 23.6 million by 2030. Most cardiac pathologies involve the left ventricle; therefore, estimation of several functional parameters from a previous segmentation of this structure can be helpful in diagnosis. Manual delineation is a time-consuming and tedious task that is also prone to high intra- and inter-observer variability. Thus, there exists a need for automated cardiac segmentation methods to help facilitate the diagnosis of cardiovascular diseases. In this work we propose a deep fully convolutional neural network architecture to address this issue and assess its performance. The model was trained end to end in a supervised fashion on whole cardiac MRI images and their ground truth to make a per-pixel classification. The Caffe deep learning framework, running on an NVidia Quadro K4200 Graphics Processing Unit, was used for design, development and experimentation. The net architecture is: Conv64-ReLU (2x) - MaxPooling - Conv128-ReLU (2x) - MaxPooling - Conv256-ReLU (2x) - MaxPooling - Conv512-ReLU-Dropout (2x) - Conv2-ReLU - Deconv - Crop - Softmax. Training and testing processes were carried out using 5-fold cross validation with short-axis cardiac magnetic resonance images from the Sunnybrook Database. We obtained Dice scores of 0.92 and 0.90, Hausdorff distances of 4.48 and 5.43, Jaccard indices of 0.97 and 0.97, sensitivities of 0.92 and 0.90, and specificities of 0.99 and 0.99 (overall mean values with SGD and RMSProp, respectively).
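
    The layer sequence quoted in the abstract translates almost directly into code. The PyTorch sketch below mirrors that sequence (the authors used Caffe); kernel sizes, padding, dropout rate and the transposed-convolution settings are assumptions chosen so that the output matches the input resolution after three poolings.

        import torch
        import torch.nn as nn

        def block(cin, cout, dropout=False):
            layers = [nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
                      nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True)]
            if dropout:                                   # Conv512-ReLU-Dropout (2x)
                layers.insert(2, nn.Dropout2d(0.5))
                layers.append(nn.Dropout2d(0.5))
            return nn.Sequential(*layers)

        class LVSegNet(nn.Module):
            def __init__(self):
                super().__init__()
                self.features = nn.Sequential(
                    block(1, 64),    nn.MaxPool2d(2),
                    block(64, 128),  nn.MaxPool2d(2),
                    block(128, 256), nn.MaxPool2d(2),
                    block(256, 512, dropout=True),
                    nn.Conv2d(512, 2, 1), nn.ReLU(inplace=True),   # Conv2-ReLU: 2-class score map
                )
                # Deconv: transposed convolution recovering input resolution after 3 poolings.
                self.upsample = nn.ConvTranspose2d(2, 2, kernel_size=16, stride=8, padding=4)

            def forward(self, x):
                h, w = x.shape[-2:]
                out = self.upsample(self.features(x))
                out = out[..., :h, :w]                    # Crop to the original image size
                return torch.softmax(out, dim=1)          # Softmax: per-pixel class probabilities

        probs = LVSegNet()(torch.randn(1, 1, 256, 256))
        print(probs.shape)                                # torch.Size([1, 2, 256, 256])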

  11. The effects of automated artifact removal algorithms on electroencephalography-based Alzheimer's disease diagnosis

    PubMed Central

    Cassani, Raymundo; Falk, Tiago H.; Fraga, Francisco J.; Kanda, Paulo A. M.; Anghinah, Renato

    2014-01-01

    Over the last decade, electroencephalography (EEG) has emerged as a reliable tool for the diagnosis of cortical disorders such as Alzheimer's disease (AD). EEG signals, however, are susceptible to several artifacts, such as ocular, muscular, movement, and environmental. To overcome this limitation, existing diagnostic systems commonly depend on experienced clinicians to manually select artifact-free epochs from the collected multi-channel EEG data. Manual selection, however, is a tedious and time-consuming process, rendering the diagnostic system “semi-automated.” Notwithstanding, a number of EEG artifact removal algorithms have been proposed in the literature. The (dis)advantages of using such algorithms in automated AD diagnostic systems, however, have not been documented; this paper aims to fill this gap. Here, we investigate the effects of three state-of-the-art automated artifact removal (AAR) algorithms (both alone and in combination with each other) on AD diagnostic systems based on four different classes of EEG features, namely, spectral, amplitude modulation rate of change, coherence, and phase. The three AAR algorithms tested are statistical artifact rejection (SAR), blind source separation based on second order blind identification and canonical correlation analysis (BSS-SOBI-CCA), and wavelet enhanced independent component analysis (wICA). Experimental results based on 20-channel resting-awake EEG data collected from 59 participants (20 patients with mild AD, 15 with moderate-to-severe AD, and 24 age-matched healthy controls) showed the wICA algorithm alone outperforming other enhancement algorithm combinations across three tasks: diagnosis (control vs. mild vs. moderate), early detection (control vs. mild), and disease progression (mild vs. moderate), thus opening the doors for fully-automated systems that can assist clinicians with early detection of AD, as well as disease severity progression assessment. PMID:24723886
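
    The abstract names the AAR algorithms but does not reproduce them. Purely as an illustration of the wICA idea (unmix the channels with ICA, wavelet-threshold each component to suppress high-amplitude artifactual activity, then remix), here is a minimal sketch using scikit-learn's FastICA and PyWavelets; the wavelet, decomposition level and threshold rule are assumptions rather than the settings used in the study.

    import numpy as np
    import pywt
    from sklearn.decomposition import FastICA

    def wica_clean(eeg, wavelet="sym4", level=5, k=4.0):
        """Crude wavelet-enhanced ICA (wICA) artifact attenuation.
        eeg: array of shape (n_samples, n_channels)."""
        ica = FastICA(n_components=eeg.shape[1], random_state=0, max_iter=500)
        sources = ica.fit_transform(eeg)                      # independent components
        cleaned = np.empty_like(sources)
        for i in range(sources.shape[1]):
            coeffs = pywt.wavedec(sources[:, i], wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745    # robust noise estimate
            thr = k * sigma
            # zero unusually large detail coefficients (treated as artifact), keep the rest
            coeffs = [coeffs[0]] + [np.where(np.abs(c) > thr, 0.0, c) for c in coeffs[1:]]
            rec = pywt.waverec(coeffs, wavelet)
            cleaned[:, i] = rec[: sources.shape[0]]
        return ica.inverse_transform(cleaned)                 # back to channel space

    # toy usage: 20-channel surrogate EEG with an injected ocular-like transient
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal((2000, 20)) * 10e-6
    eeg[500:520, :5] += 200e-6                                # large frontal artifact
    print(wica_clean(eeg).shape)                              # (2000, 20)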

  12. Resolution and Assignment of Differential Ion Mobility Spectra of Sarcosine and Isomers

    NASA Astrophysics Data System (ADS)

    Berthias, Francis; Maatoug, Belkis; Glish, Gary L.; Moussa, Fathi; Maitre, Philippe

    2018-02-01

    Because of their central role in biochemical processes, fast separation and identification of amino acids (AA) is important in many areas of the biomedical field, including the diagnosis and monitoring of inborn errors of metabolism and biomarker discovery. Owing to the large number of AA together with their isomers and isobars, common methods of AA analysis are tedious and time-consuming because they include a chromatographic separation step requiring pre- or post-column derivatization. Here, we propose a rapid method for separating and identifying sarcosine, a candidate biomarker of prostate cancer, from its isomers using differential ion mobility spectrometry (DIMS) interfaced with a tandem mass spectrometer (MS/MS). Baseline separation of protonated sarcosine from the α- and β-alanine isomers can be easily achieved. Identification of a DIMS peak is performed using an isomer-specific activation mode in which DIMS- and mass-selected ions are irradiated at selected wavenumbers, allowing specific fragmentation via an infrared multiple photon dissociation (IRMPD) process. Two methods orthogonal to MS/MS are thus added, and MS/MS(IRMPD) is effectively an isomer-specific multiple reaction monitoring (MRM) method. The identification relies on the comparison of DIMS-MS/MS(IRMPD) chromatograms recorded at different wavenumbers. Based on the comparison of IR spectra of the three isomers, it is shown that specific depletion of the two protonated alanines (α and β) can be achieved, allowing clear identification of the sarcosine peak. It is also demonstrated that DIMS-MS/MS(IRMPD) spectra in the carboxylic C=O stretching region allow the resolution of overlapping DIMS peaks.

  13. A high-throughput seed germination assay for root parasitic plants

    PubMed Central

    2013-01-01

    Background Some root-parasitic plants belonging to the genera Orobanche, Phelipanche or Striga represent one of the most destructive and intractable weed problems for agricultural production in both developed and developing countries. Compared with most other weeds, parasitic weeds are difficult to control by conventional methods because of their life cycle. The main difficulty that currently limits the development of successful control methods is the ability of the parasite to produce a tremendous number of tiny seeds that may remain viable in the soil for more than 15 years. Seed germination requires induction by stimulants present in root exudates of host plants. Research on these minute seeds has so far been tedious and time-consuming because the germination rate is usually evaluated in Petri dishes by counting germinated seeds under a binocular microscope. Results We developed an easy and fast method for determining germination rate based on a standardized 96-well plate test coupled with spectrophotometric reading of tetrazolium salt (MTT) reduction. We adapted Mosmann's protocol for cell cultures to germinating seeds and determined the conditions of seed stimulation and germination, MTT staining and formazan salt solubilization required to obtain a linear relationship between absorbance and germination rate. Dose–response analyses are presented as applications of interest for assessing half-maximal effective or inhibitory concentrations of germination stimulants (strigolactones) or inhibitors (ABA), respectively, using four-parameter logistic curves. Conclusion The developed MTT system is simple and accurate. It yields reproducible results for germination bioassays of parasitic plant seeds. The method is suited to high-throughput screening of allelochemicals (stimulants, inhibitors) or biological extracts on parasitic plant seed germination, and strengthens the investigation of distinctive features of parasitic plant germination. PMID:23915294
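
    As an illustration of the dose-response step, a four-parameter logistic curve can be fitted to absorbance-derived germination rates with SciPy; the concentrations and germination values below are invented placeholders, not data from the study.

    import numpy as np
    from scipy.optimize import curve_fit

    def four_pl(logc, bottom, top, log_ec50, hill):
        """Four-parameter logistic response versus log10(concentration)."""
        return bottom + (top - bottom) / (1.0 + 10 ** (hill * (log_ec50 - logc)))

    # hypothetical germination rates (%) versus strigolactone concentration (M)
    conc = np.array([1e-12, 1e-11, 1e-10, 1e-9, 1e-8, 1e-7, 1e-6])
    germ = np.array([3.0, 5.0, 12.0, 38.0, 70.0, 85.0, 88.0])

    p0 = [germ.min(), germ.max(), -9.0, 1.0]              # rough starting values
    params, _ = curve_fit(four_pl, np.log10(conc), germ, p0=p0)
    bottom, top, log_ec50, hill = params
    print(f"EC50 = {10 ** log_ec50:.2e} M, Hill slope = {hill:.2f}")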

  14. Improving medical stores management through automation and effective communication

    PubMed Central

    Kumar, Ashok; Cariappa, M.P.; Marwaha, Vishal; Sharma, Mukti; Arora, Manu

    2016-01-01

    Background Medical stores management in hospitals is a tedious and time-consuming chore, with limited resources tasked for the purpose and poor penetration of information technology. The process of automation is slow-paced due to various inherent factors and is being challenged by increasing inventory loads and escalating budgets for the procurement of drugs. Methods We carried out an in-depth case study at the medical stores of a tertiary care health care facility. An iterative six-step Quality Improvement (QI) process was implemented based on the Plan–Do–Study–Act (PDSA) cycle. The QI process was modified as required to fit the medical stores management model. The results were evaluated after six months. Results After the implementation of the QI process, 55 drugs in the medical store inventory that had expired from 2009 onwards were replaced with fresh stock by the suppliers as a result of effective communication through upgraded database management. Various pending audit objections were dropped due to the streamlined documentation and processes. Inventory management improved drastically due to automation, with disposal orders being initiated four months prior to the expiry of drugs and correct demands being generated two months prior to depletion of stocks. The monthly expense summary of drugs was now being completed within ten days of the closing month. Conclusion Improving communication systems within the hospital, with vendor database management and reaching out to clinicians, is important. Automation of inventory management needs to be simple and user-friendly, utilizing existing hardware. Physical stores monitoring is indispensable, especially due to the scattered nature of the stores. Staff training and standardized documentation protocols are the other keystones for optimal medical store management. PMID:26900225

  15. Computer-assisted Crystallization.

    ERIC Educational Resources Information Center

    Semeister, Joseph J., Jr.; Dowden, Edward

    1989-01-01

    To avoid the tedious task of recording temperatures by hand, a computer was used to calculate the heat of crystallization for the compound sodium thiosulfate. Described are the computer-interfacing procedures. Provides pictures of laboratory equipment and typical graphs from experiments. (YP)

  16. Creating Simple Admin Tools Using Info*Engine and Java

    NASA Technical Reports Server (NTRS)

    Jones, Corey; Kapatos, Dennis; Skradski, Cory; Felkins, J. D.

    2012-01-01

    PTC has provided a simple way to dynamically interact with Windchill using Info*Engine. This presentation will describe how to create simple Info*Engine tasks capable of saving Windchill 10.0 administrators from tedious work.

  17. Utility installation review (UIR) system training materials.

    DOT National Transportation Integrated Search

    2008-10-01

    The Texas Department of Transportation (TxDOT) issues thousands of approvals every year that : enable new utility installations to occupy the state right-of-way (ROW). The utility installation : review process currently in place is manual, tedious, a...

  18. Automatic measurement of pennation angle and fascicle length of gastrocnemius muscles using real-time ultrasound imaging.

    PubMed

    Zhou, Guang-Quan; Chan, Phoebe; Zheng, Yong-Ping

    2015-03-01

    Muscle imaging is a promising field of research for understanding the biological and bioelectrical characteristics of muscles through the observation of architectural change. Sonomyography (SMG) is a technique that can quantify the real-time architectural change of muscles under different contractions and motions with ultrasound imaging. The pennation angle and fascicle length are two crucial SMG parameters for understanding contraction mechanics at the muscle level, but they have to be manually detected on ultrasound images frame by frame. In this study, we proposed an automatic method to quantitatively identify the pennation angle and fascicle length of the gastrocnemius (GM) muscle based on multi-resolution analysis and line feature extraction, which overcomes the limitations of tedious and time-consuming manual measurement. The method started by convolving the image with a Gabor wavelet specially designed to enhance the detection of line-like structures in GM ultrasound images. The resulting image was then used to detect the fascicles and aponeuroses and to calculate the pennation angle and fascicle length, taking their distribution in the ultrasound image into account. The performance of the method was tested on computer-simulated images and on experimental images obtained in vivo from normal subjects. Tests on synthetic images showed that the method could identify the fascicle orientation with an average error of less than 0.1°. The in vivo experiments showed good agreement between the automatic and the manual measurements (r=0.94±0.03, p<0.001, and r=0.95±0.02, p<0.001). Furthermore, a significant correlation between the ankle angle and the pennation angle (r=0.89±0.05; p<0.001) and fascicle length (r=-0.90±0.04; p<0.001) was found for ankle plantar flexion. This study demonstrated that the proposed method can automatically measure the pennation angle and fascicle length in GM ultrasound images, making it feasible to investigate muscle-level mechanics more comprehensively in vivo. Copyright © 2014 Elsevier B.V. All rights reserved.
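
    A minimal sketch of the line-enhancement step using OpenCV's built-in Gabor kernel; the kernel size, wavelength and orientation sweep are illustrative assumptions, not the parameters used in the paper.

    import cv2
    import numpy as np

    def enhance_lines(gray, thetas=np.deg2rad(np.arange(0, 180, 10))):
        """Keep, per pixel, the maximum response over a bank of Gabor filters,
        which highlights line-like structures such as fascicles and aponeuroses."""
        gray = gray.astype(np.float32) / 255.0
        response = np.zeros_like(gray)
        for theta in thetas:
            kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                        lambd=10.0, gamma=0.5, psi=0.0,
                                        ktype=cv2.CV_32F)
            response = np.maximum(response, cv2.filter2D(gray, cv2.CV_32F, kernel))
        return response

    # toy usage on a synthetic image containing one bright oblique line
    img = np.zeros((128, 128), dtype=np.uint8)
    cv2.line(img, (10, 100), (120, 30), color=255, thickness=2)
    print(enhance_lines(img).shape)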

  19. DEMONSTRATION OF AUTONOMOUS AIR MONITORING THROUGH ROBOTICS

    EPA Science Inventory

    Hazardous and/or tedious functions are often performed by on-site workers during investigation, mitigation and clean-up of hazardous substances. These functions include site surveys, sampling and analysis, excavation, and treatment and preparation of wastes for shipment to chemic...

  20. Maturity assessment of harumanis mango using thermal camera sensor

    NASA Astrophysics Data System (ADS)

    Sa'ad, F. S. A.; Shakaff, A. Y. Md.; Zakaria, A.; Abdullah, A. H.; Ibrahim, M. F.

    2017-03-01

    The perceived quality of fruits, such as mangoes, is greatly dependent on many parameters such as ripeness, shape and size, and is influenced by other factors such as harvesting time. Unfortunately, manual fruit grading has several drawbacks, such as subjectivity, tediousness and inconsistency. Automating the procedure, as well as developing new classification techniques, may solve these problems. This paper presents novel work on using infrared imaging as a tool for quality monitoring of Harumanis mangoes. The histogram of the infrared image was used to distinguish and classify the level of ripeness of the fruits based on the colour spectrum, week by week. The proposed approach using thermal data was able to achieve 90.5% correct classification.

  1. Real-time metabolome profiling of the metabolic switch between starvation and growth.

    PubMed

    Link, Hannes; Fuhrer, Tobias; Gerosa, Luca; Zamboni, Nicola; Sauer, Uwe

    2015-11-01

    Metabolic systems are often the first networks to respond to environmental changes, and the ability to monitor metabolite dynamics is key for understanding these cellular responses. Because monitoring metabolome changes is experimentally tedious and demanding, dynamic data on time scales from seconds to hours are scarce. Here we describe real-time metabolome profiling by direct injection of living bacteria, yeast or mammalian cells into a high-resolution mass spectrometer, which enables automated monitoring of about 300 compounds in 15-30-s cycles over several hours. We observed accumulation of energetically costly biomass metabolites in Escherichia coli in carbon starvation-induced stationary phase, as well as the rapid use of these metabolites upon growth resumption. By combining real-time metabolome profiling with modeling and inhibitor experiments, we obtained evidence for switch-like feedback inhibition in amino acid biosynthesis and for control of substrate availability through the preferential use of the metabolically cheaper one-step salvaging pathway over costly ten-step de novo purine biosynthesis during growth resumption.

  2. Highly Flexible Superhydrophobic and Fire-Resistant Layered Inorganic Paper.

    PubMed

    Chen, Fei-Fei; Zhu, Ying-Jie; Xiong, Zhi-Chao; Sun, Tuan-Wei; Shen, Yue-Qin

    2016-12-21

    Traditional paper made from plant cellulose fibers is easily destroyed by either liquid or fire. In addition, the paper-making industry consumes a large number of trees and thus causes serious environmental problems, including excessive deforestation and pollution. In consideration of the intrinsic flammability of organic materials, and to minimize the impact on the environment and living organisms, biocompatible ultralong hydroxyapatite nanowires are an ideal building material for inorganic fire-resistant paper. Herein, a new kind of free-standing, highly flexible, superhydrophobic, and fire-resistant layered inorganic paper has been successfully prepared using ultralong hydroxyapatite nanowires as building blocks after surface modification with sodium oleate. During the vacuum filtration, ultralong hydroxyapatite nanowires assemble into self-roughened seta-like microfibers, avoiding the tedious fabrication processes otherwise needed to construct a hierarchical structure; the self-roughened microfibers further form the inorganic paper with a nacre-like layered structure. We have demonstrated that the layered structure can significantly improve the resistance of the as-prepared superhydrophobic paper to mechanical destruction. The as-prepared superhydrophobic and fire-resistant inorganic paper shows excellent nonflammability, liquid repellency to various commercial drinks, high thermal stability, and self-cleaning properties. Moreover, we have explored the potential applications of the superhydrophobic and fire-resistant inorganic paper as a highly effective adsorbent for oil/water separation, a fire-shielding protector, and writing paper.

  3. Health monitoring of offshore structures using wireless sensor network: experimental investigations

    NASA Astrophysics Data System (ADS)

    Chandrasekaran, Srinivasan; Chitambaram, Thailammai

    2016-04-01

    This paper presents a detailed methodology for deploying a wireless sensor network on offshore structures for structural health monitoring (SHM). Traditional SHM is carried out by visual inspection and wired systems, which are complicated, require larger installation space to deploy, and are tedious to decommission. Wireless sensor networks can enhance health monitoring through the deployment of scalable, dense sensor networks that occupy less space and consume less power. The proposed methodology focuses mainly on determining the serviceability status of large floating platforms under environmental loads using wireless sensors. The servers analyze the acquired data for exceedance of threshold values. On exceedance, the SHM architecture triggers an alarm or an early warning in the form of alert messages to the engineer-in-charge on board; emergency response plans can then be activated, minimizing the risk involved and mitigating economic losses from accidents. In the present study, wired and wireless sensors are installed in an experimental model and the structural responses acquired by both are compared. The wireless system comprises a Raspberry Pi board, which is programmed to transmit the acquired data to the server using a Wi-Fi adapter. The data are then hosted on a webpage for further post-processing, as desired.
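
    A minimal, hypothetical sketch of the server-side exceedance check described above; the sensor names, threshold values and alert hook are invented for illustration and are not part of the published system.

    from dataclasses import dataclass

    @dataclass
    class Reading:
        sensor: str
        value: float      # e.g. acceleration in m/s^2 or strain in microstrain

    # hypothetical serviceability thresholds per sensor
    THRESHOLDS = {"accel_deck_x": 2.5, "strain_leg_1": 800.0}

    def check_exceedance(readings, alert):
        """Compare each reading against its threshold and raise an early warning on exceedance."""
        for r in readings:
            limit = THRESHOLDS.get(r.sensor)
            if limit is not None and abs(r.value) > limit:
                alert(f"EARLY WARNING: {r.sensor} = {r.value:.2f} exceeds limit {limit:.2f}")

    # usage with a trivial alert hook (a real system might send SMS or e-mail instead)
    check_exceedance([Reading("accel_deck_x", 3.1), Reading("strain_leg_1", 120.0)], print)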

  4. Zbrowse: An interactive GWAS results browser

    USDA-ARS?s Scientific Manuscript database

    The growing number of genotyped populations, the advent of high-throughput phenotyping techniques and the development of GWAS analysis software has rapidly accelerated the number of GWAS experimental results. Candidate gene discovery from these results files is often tedious, involving many manual s...

  5. How to Compute Labile Metal-Ligand Equilibria

    ERIC Educational Resources Information Center

    de Levie, Robert

    2007-01-01

    The different methods used for computing labile metal-ligand complexes, which are suitable for an iterative computer solution, are illustrated. The ligand function has allowed students to relegate otherwise tedious iterations to a computer, while retaining complete control over what is calculated.
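
    The note itself does not reproduce the iteration; as a generic illustration of the kind of calculation being relegated to the computer, the sketch below solves a single 1:1 labile complex equilibrium by fixed-point iteration, with invented totals and stability constant.

    def free_concentrations(C_M, C_L, beta, tol=1e-12, max_iter=1000):
        """Solve M + L <-> ML with beta = [ML]/([M][L]) for the free concentrations,
        given the total metal C_M and total ligand C_L (all in mol/L)."""
        M, L = C_M, C_L                       # initial guess: everything free
        for _ in range(max_iter):
            M_new = C_M / (1.0 + beta * L)    # mass balance: C_M = [M](1 + beta*[L])
            L_new = C_L / (1.0 + beta * M_new)
            if abs(M_new - M) < tol and abs(L_new - L) < tol:
                return M_new, L_new
            M, L = M_new, L_new
        raise RuntimeError("iteration did not converge")

    # example: 1 mM total metal, 2 mM total ligand, log(beta) = 4
    M, L = free_concentrations(1e-3, 2e-3, 1e4)
    print(f"[M] = {M:.3e}, [L] = {L:.3e}, [ML] = {1e4 * M * L:.3e}")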

  6. Bonder for Solar-Cell Strings

    NASA Technical Reports Server (NTRS)

    Garwood, G.; Frasch, W.

    1982-01-01

    String bonder for solar-cell arrays eliminates tedious manual assembly procedure that could damage cell face. Vacuum arm picks up face-down cell from cell-inverting work station and transfers it to string conveyor without changing cell orientation. Arm is activated by signal from microprocessor.

  7. GIS Learning Objects: An Approach to Content Aggregation

    ERIC Educational Resources Information Center

    Govorov, Michael; Gienko, Gennady

    2013-01-01

    Content development and maintenance of geographic information systems (GIS) related courses, especially designed for distance and online delivery, could be a tedious task even for an experienced instructor. The paper outlines application of abstract instructional design techniques for modeling course structure and developing corresponding course…

  8. Trigonometric Integration without Trigonometric Functions

    ERIC Educational Resources Information Center

    Quinlan, James; Kolibal, Joseph

    2016-01-01

    Teaching techniques of integration can be tedious and often uninspired. We present an obvious but underutilized approach for finding antiderivatives of various trigonometric functions using the complex exponential representation of the sine and cosine. The purpose goes beyond providing students an alternative approach to trigonometric integrals.…
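
    For instance (a worked example in the spirit of the approach, not taken from the article), writing the sine with complex exponentials turns a trigonometric integral into an exponential one:

    \[
    \int \sin^{2}x\,dx
      = \int \left(\frac{e^{ix}-e^{-ix}}{2i}\right)^{2} dx
      = -\frac{1}{4}\int \left(e^{2ix}-2+e^{-2ix}\right) dx
      = -\frac{1}{4}\left(\frac{e^{2ix}-e^{-2ix}}{2i}-2x\right)+C
      = \frac{x}{2}-\frac{\sin 2x}{4}+C .
    \]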

  9. Innovative techniques with multi-purpose survey vehicle for automated analysis of cross-slope data.

    DOT National Transportation Integrated Search

    2007-11-02

    Manual surveying methods have long been used in the field of highway engineering to determine : the cross-slope, and longitudinal grade of an existing roadway. However, these methods are : slow, tedious and labor intensive. Moreover, manual survey me...

  10. Detecting dominant motion patterns in crowds of pedestrians

    NASA Astrophysics Data System (ADS)

    Saqib, Muhammad; Khan, Sultan Daud; Blumenstein, Michael

    2017-02-01

    As the population of the world increases, urbanization generates crowding situations which pose challenges to public safety and security. Manual analysis of crowded situations is a tedious job and usually prone to errors. In this paper, we propose a novel technique for crowd analysis, the aim of which is to detect different dominant motion patterns in real-time videos. A motion field is generated by computing the dense optical flow. The motion field is then divided into blocks. For each block, we adopt an Intra-clustering algorithm for detecting different flows within the block. Later on, we employ Inter-clustering for clustering the flow vectors among different blocks. We evaluate the performance of our approach on different real-time videos. The experimental results show that our proposed method is capable of detecting distinct motion patterns in crowded videos. Moreover, our algorithm outperforms state-of-the-art methods.
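
    A rough sketch of the first stages of such a pipeline (dense optical flow, block partitioning, per-block clustering of flow vectors), using OpenCV's Farnebäck flow and k-means from scikit-learn; the block size, cluster count and flow parameters are assumptions for illustration, not the authors' Intra/Inter-clustering algorithms.

    import cv2
    import numpy as np
    from sklearn.cluster import KMeans

    def block_flow_clusters(prev_gray, next_gray, block=32, k=2):
        """Compute dense optical flow and cluster the flow vectors inside each block."""
        flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        h, w = prev_gray.shape
        centers = {}
        for y in range(0, h - block + 1, block):
            for x in range(0, w - block + 1, block):
                vecs = flow[y:y + block, x:x + block].reshape(-1, 2)
                if np.linalg.norm(vecs, axis=1).mean() < 0.1:
                    continue                              # mostly static block, skip
                km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(vecs)
                centers[(y, x)] = km.cluster_centers_     # dominant flows in this block
        return centers

    # toy usage: shift a random texture by 3 pixels to create apparent motion
    rng = np.random.default_rng(0)
    frame1 = (rng.random((128, 128)) * 255).astype(np.uint8)
    frame2 = np.roll(frame1, 3, axis=1)
    print(len(block_flow_clusters(frame1, frame2)))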

  11. Mini Photobioreactors for in Vivo Real-Time Characterization and Evolutionary Tuning of Bacterial Optogenetic Circuit.

    PubMed

    Wang, Hsinkai; Yang, Ya-Tang

    2017-09-15

    The current standard protocols for characterizing the optogenetic circuit of bacterial cells using flow cytometry in light tubes and light exposure of culture plates are tedious, labor-intensive, and cumbersome. In this work, we engineer a bioreactor with working volume of ∼10 mL for in vivo real-time optogenetic characterization of E. coli with a CcaS-CcaR light-sensing system. In the bioreactor, optical density measurements, reporter protein fluorescence detection, and light input stimuli are provided by four light-emitting diode sources and two photodetectors. Once calibrated, the device can cultivate microbial cells and record their growth and gene expression without human intervention. We measure gene expression during cell growth with different organic substrates (glucose, succinate, acetate, pyruvate) as carbon sources in minimal medium and demonstrate evolutionary tuning of the optogenetic circuit by serial dilution passages.

  12. ChoiceKey: a real-time speech recognition program for psychology experiments with a small response set.

    PubMed

    Donkin, Christopher; Brown, Scott D; Heathcote, Andrew

    2009-02-01

    Psychological experiments often collect choice responses using buttonpresses. However, spoken responses are useful in many cases-for example, when working with special clinical populations, or when a paradigm demands vocalization, or when accurate response time measurements are desired. In these cases, spoken responses are typically collected using a voice key, which usually involves manual coding by experimenters in a tedious and error-prone manner. We describe ChoiceKey, an open-source speech recognition package for MATLAB. It can be optimized by training for small response sets and different speakers. We show ChoiceKey to be reliable with minimal training for most participants in experiments with two different responses. Problems presented by individual differences, and occasional atypical responses, are examined, and extensions to larger response sets are explored. The ChoiceKey source files and instructions may be downloaded as supplemental materials for this article from brm.psychonomic-journals.org/content/supplemental.

  13. Real-time Tracking of DNA Fragment Separation by Smartphone.

    PubMed

    Tao, Chunxian; Yang, Bo; Li, Zhenqing; Zhang, Dawei; Yamaguchi, Yoshinori

    2017-06-01

    Slab gel electrophoresis (SGE) is the most common method for the separation of DNA fragments; thus, it is broadly applied to the field of biology and others. However, the traditional SGE protocol is quite tedious, and the experiment takes a long time. Moreover, the chemical consumption in SGE experiments is very high. This work proposes a simple method for the separation of DNA fragments based on an SGE chip. The chip is made by an engraving machine. Two plastic sheets are used for the excitation and emission wavelengths of the optical signal. The fluorescence signal of the DNA bands is collected by smartphone. To validate this method, 50, 100, and 1,000 bp DNA ladders were separated. The results demonstrate that a DNA ladder smaller than 5,000 bp can be resolved within 12 min and with high resolution when using this method, indicating that it is an ideal substitute for the traditional SGE method.

  14. Intelligent navigation to improve obstetrical sonography.

    PubMed

    Yeo, Lami; Romero, Roberto

    2016-04-01

    ‘Manual navigation’ by the operator is the standard method used to obtain information from two-dimensional and volumetric sonography. Two-dimensional sonography is highly operator dependent and requires extensive training and expertise to assess fetal anatomy properly. Most of the sonographic examination time is devoted to acquisition of images, while 'retrieval' and display of diagnostic planes occurs rapidly (essentially instantaneously). In contrast, volumetric sonography has a rapid acquisition phase, but the retrieval and display of relevant diagnostic planes is often time-consuming, tedious and challenging. We propose the term 'intelligent navigation' to refer to a new method of interrogation of a volume dataset whereby identification and selection of key anatomical landmarks allow the system to: 1) generate a geometrical reconstruction of the organ of interest; and 2) automatically navigate, find, extract and display specific diagnostic planes. This is accomplished using operator-independent algorithms that are both predictable and adaptive. Virtual Intelligent Sonographer Assistance (VIS-Assistance®) is a tool that allows operator-independent sonographic navigation and exploration of the surrounding structures in previously identified diagnostic planes. The advantage of intelligent (over manual) navigation in volumetric sonography is the short time required for both acquisition and retrieval and display of diagnostic planes. Intelligent navigation technology automatically realigns the volume, and reorients and standardizes the anatomical position, so that the fetus and the diagnostic planes are consistently displayed in the same manner each time, regardless of the fetal position or the initial orientation. Automatic labeling of anatomical structures, subject orientation and each of the diagnostic planes is also possible. Intelligent navigation technology can operate on conventional computers, and is not dependent on specific ultrasound platforms or on the use of software to perform manual navigation of volume datasets. Diagnostic planes and VIS-Assistance videoclips can be transmitted by telemedicine so that expert consultants can evaluate the images to provide an opinion. The end result is a user-friendly, simple, fast and consistent method of obtaining sonographic images with decreased operator dependency. Intelligent navigation is one approach to improve obstetrical sonography. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.

  15. Extraction and identification of cyclobutanones from irradiated cheese employing a rapid direct solvent extraction method.

    PubMed

    Tewfik, Ihab

    2008-01-01

    2-Alkylcyclobutanones (cyclobutanones) are accepted as chemical markers for irradiated foods containing lipid. However, current extraction procedures (Soxhlet-florisil chromatography) for the isolation of these markers involve a long and tedious clean-up regime prior to gas chromatography-mass spectrophotometry identification. This paper outlines an alternative isolation and clean-up method for the extraction of cyclobutanones in irradiated Camembert cheese. The newly developed direct solvent extraction method enables the efficient screening of large numbers of food samples and is not as resource intensive as the BS EN 1785:1997 method. Direct solvent extraction appears to be a simple, robust method and has the added advantage of a considerably shorter extraction time for the analysis of foods containing lipid.

  16. Quantitative phase imaging of arthropods

    PubMed Central

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-01-01

    Abstract. Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy. PMID:26334858

  17. Multifit / Polydefix : a framework for the analysis of polycrystal deformation using X-rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkel, Sébastien; Hilairet, Nadège

    2015-06-27

    Multifit/Polydefix is an open source IDL software package for the efficient processing of diffraction data obtained in deformation apparatuses at synchrotron beamlines. Multifit allows users to decompose two-dimensional diffraction images into azimuthal slices, fit peak positions, shapes and intensities, and propagate the results to other azimuths and images. Polydefix is for the analysis of deformation experiments. Starting from output files created in Multifit or other packages, it will extract elastic lattice strains, evaluate sample pressure and differential stress, and prepare input files for further texture analysis. The Multifit/Polydefix package is designed to make the tedious data analysis of synchrotron-based plasticity, rheology or other time-dependent experiments very straightforward and accessible to a wider community.

  18. Improving single molecule force spectroscopy through automated real-time data collection and quantification of experimental conditions

    PubMed Central

    Scholl, Zackary N.; Marszalek, Piotr E.

    2013-01-01

    The benefits of single-molecule force spectroscopy (SMFS) clearly outweigh the challenges, which include small sample sizes, tedious data collection and the introduction of human bias during subjective data selection. These difficulties can be partially eliminated through automation of the experimental data collection process for atomic force microscopy (AFM). Automation can be accomplished using an algorithm that quickly triages usable force-extension recordings with positive and negative selection. We implemented an algorithm based on the windowed fast Fourier transform of force-extension traces that identifies peaks by their force-extension regimes and thereby correctly selects usable recordings from proteins composed of repeated domains. This algorithm excels as a real-time diagnostic because it involves <30 ms of computational time, has high sensitivity and specificity, and efficiently detects weak unfolding events. We used the statistics provided by the automated procedure to clearly demonstrate the properties of molecular adhesion and how these properties change with differences in the cantilever tip, protein functional groups and protein age. PMID:24001740
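
    The published selection criteria are not reproduced here; as a loose illustration of the windowed-FFT idea, the sketch below slides a window along a force trace, computes its spectrum, and flags windows whose low-frequency power dominates, a crude proxy for sawtooth-like unfolding peaks. The window length, frequency band and threshold are invented.

    import numpy as np

    def flag_unfolding_windows(force, win=256, step=128, band=(1, 8), ratio_thr=5.0):
        """Return start indices of windows whose low-frequency FFT power dominates."""
        flagged = []
        for start in range(0, len(force) - win + 1, step):
            seg = force[start:start + win] - force[start:start + win].mean()
            power = np.abs(np.fft.rfft(seg)) ** 2
            low = power[band[0]:band[1]].sum()
            rest = power[band[1]:].sum() + 1e-12
            if low / rest > ratio_thr:
                flagged.append(start)
        return flagged

    # toy trace: flat noisy baseline plus a few ramp-shaped "unfolding" events
    rng = np.random.default_rng(1)
    trace = rng.standard_normal(4096) * 0.02
    for pos in (1000, 1400, 1800):
        trace[pos:pos + 200] += np.linspace(0.0, 1.0, 200)   # rising ramp then drop
    print(flag_unfolding_windows(trace))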

  19. Optical-Fiber-Welding Machine

    NASA Technical Reports Server (NTRS)

    Goss, W. C.; Mann, W. A.; Goldstein, R.

    1985-01-01

    Technique yields joints with average transmissivity of 91.6 percent. Electric arc passed over butted fiber ends to melt them together. Maximum optical transmissivity of joint achieved with optimum choice of discharge current, translation speed, and axial compression of fibers. Practical welding machine enables delicate and tedious joining operation performed routinely.

  20. Teaching Statistics Online Using "Excel"

    ERIC Educational Resources Information Center

    Jerome, Lawrence

    2011-01-01

    As anyone who has taught or taken a statistics course knows, statistical calculations can be tedious and error-prone, with the details of a calculation sometimes distracting students from understanding the larger concepts. Traditional statistics courses typically use scientific calculators, which can relieve some of the tedium and errors but…

  1. Marketing Plan Competition for Experiential Learning

    ERIC Educational Resources Information Center

    Civi, Emin; Persinger, Elif S.

    2011-01-01

    Many students find traditional lectures, routine memorization, and restatement of facts and terms tedious and boring (Munoz and Huser, 2008). This requires professors to employ a variety of teaching techniques, for example, live case classroom projects. Such an experiential learning opportunity encourages students to become involved with the…

  2. New weight-handling device for commercial oil pressure balances

    NASA Astrophysics Data System (ADS)

    Woo, S. Y.; Choi, I. M.; Kim, B. S.

    2005-12-01

    This paper presents a new device to automatically handle a large number of weights for the calibration of a pressure gauge. This newly invented weight-handling device is made for use in conjunction with a commercial oil pressure balance. Although the pressure balance is essential as a calibration tool, its use has been generally tedious and labour intensive for a long time. In particular, the process of loading a different combination of weights on the top of a piston requires repetitious manual handling for every new measurement. This inevitably leaves the operator fatigued, and sometimes causes damage to the weights due to careless handling. The newly invented automatic weight-handling device can eliminate such tedious, error-prone and wear-inducing manual weight manipulation. The device consists of a stepping motor, a drive belt, a solenoid valve, three weight-lifting assemblies and three linear-motion guide assemblies. The weight-lifting assembly is composed of a pneumatic actuator, a solid-state switch and a metal finger. It has many advantages compared with the commercial automatic weight-handling device. Firstly, it is not necessary to lift all the weights off the piston in the weight selection process, as it is in the case of the commercial device. Thus it can prevent a permanent deformation of the weight carrier. Secondly, this new device can handle a larger number of weights than the commercial one. This is because the new device adopts a different method in retaining the remaining weights in place. Another advantage of this new device is that there is no possibility of the fingers touching the surface of the weights due to the oscillation of weights. Moreover it uses the general technology of a stepping motor, and is also made up of components that are easily obtainable in the market, thereby being very economical.

  3. High-throughput prediction of Acacia and eucalypt lignin syringyl/guaiacyl content using FT-Raman spectroscopy and partial least squares modeling

    DOE PAGES

    Lupoi, Jason S.; Healey, Adam; Singh, Seema; ...

    2015-01-16

    High-throughput techniques are necessary to efficiently screen potential lignocellulosic feedstocks for the production of renewable fuels, chemicals, and bio-based materials, thereby reducing experimental time and expense while supplanting tedious, destructive methods. The ratio of lignin syringyl (S) to guaiacyl (G) monomers has been routinely quantified as a way to probe biomass recalcitrance. Mid-infrared and Raman spectroscopy have been demonstrated to produce robust partial least squares models for the prediction of lignin S/G ratios in a diverse group of Acacia and eucalypt trees. The most accurate Raman model has now been used to predict the S/G ratio from 269 unknown Acacia and eucalypt feedstocks. This study demonstrates the application of a partial least squares model composed of Raman spectral data and lignin S/G ratios measured using pyrolysis/molecular beam mass spectrometry (pyMBMS) for the prediction of S/G ratios in an unknown data set. The predicted S/G ratios calculated by the model were averaged according to plant species, and the means were not found to differ from the pyMBMS ratios when evaluating the mean values of each method within the 95 % confidence interval. Pairwise comparisons within each data set were employed to assess statistical differences between each biomass species. While some pairwise appraisals failed to differentiate between species, Acacias, in both data sets, clearly display significant differences in their S/G composition which distinguish them from eucalypts. In conclusion, this research shows the power of using Raman spectroscopy to supplant tedious, destructive methods for the evaluation of the lignin S/G ratio of diverse plant biomass materials.
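
    As a generic illustration of the modeling step (not the authors' calibrated model), a partial least squares regression linking Raman spectra to pyMBMS-measured S/G ratios can be set up with scikit-learn as below; the spectra and S/G values are synthetic placeholders.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    # synthetic stand-ins: 120 Raman spectra (1800 wavenumber points) and their S/G ratios
    rng = np.random.default_rng(0)
    X = rng.standard_normal((120, 1800))
    y = 1.5 + 0.8 * X[:, 600] - 0.5 * X[:, 1200] + 0.1 * rng.standard_normal(120)

    pls = PLSRegression(n_components=8)            # number of latent variables is a choice
    r2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()
    pls.fit(X, y)
    predicted_sg = pls.predict(X[:5]).ravel()      # predict S/G for new spectra
    print(f"mean cross-validated R^2 = {r2:.2f}", predicted_sg)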

  4. JournalMap: Discovering location-relevant knowledge from published studies for sustainable land use, preventing degradation, and restoring landscapes

    USDA-ARS?s Scientific Manuscript database

    Finding relevant knowledge and information to prevent land degradation and support restoration has historically involved researchers working from their own knowledge, querying people they know, and tediously searching topical literature reviews.To address this need we created JournalMap (http://www....

  5. Visualizing Vocabulary

    ERIC Educational Resources Information Center

    Skophammer, Karen

    2012-01-01

    Vocabulary can become tedious and a chore if it is approached as such. By making art terms and vocabulary meaningful, students will remember and use them for years to come. In this article, the author describes two vocabulary review projects that work wonderfully and create great works of art: (1) cursive creature rubbings; and (2) bubbling bodies…

  6. Using Machine Learning to Increase Research Efficiency: A New Approach in Environmental Sciences

    USDA-ARS?s Scientific Manuscript database

    Data collection has evolved from tedious in-person fieldwork to automatic data gathering from multiple sensors remotely. Scientists in environmental sciences have not fully exploited this data deluge, including legacy and new data, because the traditional scientific method is focused on small, high qu...

  7. Sawtooth Functions. Classroom Notes

    ERIC Educational Resources Information Center

    Hirst, Keith

    2004-01-01

    Using MAPLE enables students to consider many examples which would be very tedious to work out by hand. This applies to graph plotting as well as to algebraic manipulation. The challenge is to use these observations to develop the students' understanding of mathematical concepts. In this note an interesting relationship arising from inverse…

  8. Conversion of PCDP Dialogs.

    ERIC Educational Resources Information Center

    Bork, Alfred M.

    An introduction to the problems involved in conversion of computer dialogues from one computer language to another is presented. Conversion of individual dialogues by complete rewriting is straightforward, if tedious. To make a general conversion of a large group of heterogeneous dialogue material from one language to another at one step is more…

  9. Headspace analysis of polar organic compounds in biological matrixes using solid phase microextraction (SPME)

    USDA-ARS?s Scientific Manuscript database

    Analysis of biological fluids and waste material is difficult and tedious given the sample matrix. A rapid automated method for the determination of volatile fatty acids and phenolic and indole compounds was developed using a multipurpose sampler (MPS) with solid phase microextraction (SPME) and GC-...

  10. Initiating a Programmatic Assessment Report

    ERIC Educational Resources Information Center

    Berkaliev, Zaur; Devi, Shavila; Fasshauer, Gregory E.; Hickernell, Fred J.; Kartal, Ozgul; Li, Xiaofan; McCray, Patrick; Whitney, Stephanie; Zawojewski, Judith S.

    2014-01-01

    In the context of a department of applied mathematics, a program assessment was conducted to assess the departmental goal of enabling undergraduate students to recognize, appreciate, and apply the power of computational tools in solving mathematical problems that cannot be solved by hand, or would require extensive and tedious hand computation. A…

  11. Toxicity of Selected Mosquito Sprays to Channel Catfish Sac Fry

    USDA-ARS?s Scientific Manuscript database

    In the spring when channel catfish, Ictalurus punctatus, hatcheries are in full operation, the associated moisture and warm temperatures provide a haven for mosquitoes. Large swarms of biting mosquitoes in a hatchery can make the tedious work of egg-picking (i.e., removing dead and fungus-infested e...

  12. Automated detection of insect-damaged sunflower seeds by X-ray imaging

    USDA-ARS?s Scientific Manuscript database

    The development of insect-resistant sunflowers is hindered by the lack of a quick and effective method for scoring samples in terms of insect damage. The current method for scoring insect damage, which involves manual inspection of seeds for holes bored into the shell, is tedious, requiring approxi...

  13. Spreadsheet Analysis of Harvesting Systems

    Treesearch

    R.B. Rummer; B.L. Lanford

    1987-01-01

    Harvesting systems can be modeled and analyzed on microcomputers using commercially available "spreadsheet" software. The effect of system or external variables on the production rate or system cost can be evaluated and alternative systems can be easily examined. The tedious calculations associated with such analyses are performed by the computer. For users...

  14. Two Distinct Approaches for CRISPR-Cas9-Mediated Gene Editing in Cryptococcus neoformans and Related Species.

    PubMed

    Wang, Ping

    2018-06-27

    Cryptococcus neoformans and related species are encapsulated basidiomycetous fungi that cause meningoencephalitis in individuals with immune deficiency. This pathogen has a tractable genetic system; however, gene disruption via electroporation remains difficult, while biolistic transformation is often limited by the lack of multiple genetic markers and the high initial cost of equipment. The approach using clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated protein 9 (Cas9) has become the technology of choice for gene editing in many organisms due to its simplicity, efficiency, and versatility. The technique has been successfully demonstrated in C. neoformans and Cryptococcus deneoformans in which two DNA plasmids expressing either the Streptococcus pyogenes CAS9 gene or the guide RNA (gRNA) were employed. However, potential adverse effects due to constitutive expression and the time-consuming process of constructing vectors to express each gRNA remain a primary barrier for wide adoption. This report describes the delivery of preassembled CRISPR-Cas9-gRNA ribonucleoproteins (RNPs) via electroporation, which is able to generate edited mutant alleles. RNP-mediated CRISPR-Cas9 was used to replace the wild-type GIB2 gene encoding a Gβ-like/RACK1 Gib2 protein with a gib2::NAT allele via homologous recombination in both C. neoformans and C. deneoformans. In addition, a DNA plasmid (pCnCas9:U6-gRNA) that expresses both Cas9 and gRNA, allowing for convenient yet low-cost DNA-mediated gene editing, is described. pCnCas9:U6-gRNA contains an endogenous U6 promoter for gRNA expression and restriction sites for one-step insertion of a gRNA. These approaches and resources provide new opportunities to accelerate genetic studies of Cryptococcus species. IMPORTANCE For genetic studies of the Cryptococcus genus, generation of mutant strains is often hampered by a limited number of selectable genetic markers, the tedious process of vector construction, side effects, and other limitations, such as the high cost of acquiring a particle delivery system. CRISPR-Cas9 technology has been demonstrated in Cryptococcus for genome editing. However, it remains labor-intensive and time-consuming since it requires the identification of a suitable type III RNA polymerase promoter for gRNA expression. In addition, there may be potential adverse effects caused by constitutive expression of Cas9 and gRNA. Here, I report the use of a ribonucleoprotein-mediated CRISPR-Cas9 technique for genome editing of C. neoformans and related species. Together with the custom-constructed pCnCas9:U6-gRNA vector that allows low-cost and time-saving DNA-based CRISPR-Cas9, my approach adds to the molecular toolbox for dissecting the molecular mechanism of pathogenesis in this important group of fungal pathogens. Copyright © 2018 Wang.

  15. Two-Relaxation-Time Lattice Boltzmann Method and its Application to Advective-Diffusive-Reactive Transport

    DOE PAGES

    Yan, Zhifeng; Yang, Xiaofan; Li, Siliang; ...

    2017-09-05

    The lattice Boltzmann method (LBM) based on single-relaxation-time (SRT) or multiple-relaxation-time (MRT) collision operators is widely used in simulating flow and transport phenomena. The LBM based on two-relaxation-time (TRT) collision operators possesses strengths from the SRT and MRT LBMs, such as its simple implementation and good numerical stability, although tedious mathematical derivations and presentations of the TRT LBM hinder its application to a broad range of flow and transport phenomena. This paper describes the TRT LBM clearly and provides a pseudocode for easy implementation. Various transport phenomena were simulated using the TRT LBM to illustrate its applications in subsurface environments. These phenomena include advection-diffusion in uniform flow, Taylor dispersion in a pipe, solute transport in a packed column, reactive transport in uniform flow, and bacterial chemotaxis in porous media. Finally, the TRT LBM demonstrated good numerical performance in terms of accuracy and stability in predicting these transport phenomena. Therefore, the TRT LBM is a powerful tool to simulate various geophysical and biogeochemical processes in subsurface environments.
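
    As an illustration of the TRT idea (split each population pair into symmetric and antisymmetric parts and relax them with two different rates), here is a minimal D1Q3 sketch for the one-dimensional advection-diffusion equation. It is not the pseudocode from the paper; the lattice, the convention that the antisymmetric rate sets the diffusivity, the "magic" parameter value, and all numbers are assumptions chosen for a simple demonstration.

    import numpy as np

    nx, nsteps = 200, 400
    u, D = 0.05, 0.01               # advection velocity and target diffusivity (lattice units)
    cs2 = 1.0 / 3.0                 # lattice sound speed squared for D1Q3
    e = np.array([0, 1, -1])        # discrete velocities
    w = np.array([2/3, 1/6, 1/6])   # lattice weights
    opp = np.array([0, 2, 1])       # index of the opposite velocity

    tau_m = D / cs2 + 0.5                 # antisymmetric relaxation time sets D (ADE convention)
    tau_p = 0.25 / (tau_m - 0.5) + 0.5    # symmetric time from the "magic" parameter 1/4
    om_p, om_m = 1.0 / tau_p, 1.0 / tau_m

    x = np.arange(nx)
    C = np.exp(-0.5 * ((x - nx / 4) / 3.0) ** 2)                # initial Gaussian pulse
    f = w[:, None] * C[None, :] * (1.0 + e[:, None] * u / cs2)  # start from equilibrium

    for _ in range(nsteps):
        C = f.sum(axis=0)
        feq = w[:, None] * C[None, :] * (1.0 + e[:, None] * u / cs2)
        f_s, feq_s = 0.5 * (f + f[opp]), 0.5 * (feq + feq[opp])   # symmetric parts
        f_a, feq_a = 0.5 * (f - f[opp]), 0.5 * (feq - feq[opp])   # antisymmetric parts
        f = f - om_p * (f_s - feq_s) - om_m * (f_a - feq_a)       # TRT collision
        for k in range(3):
            f[k] = np.roll(f[k], e[k])                            # periodic streaming
    print("total concentration after run:", f.sum())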

  16. Two-relaxation-time lattice Boltzmann method and its application to advective-diffusive-reactive transport

    NASA Astrophysics Data System (ADS)

    Yan, Zhifeng; Yang, Xiaofan; Li, Siliang; Hilpert, Markus

    2017-11-01

    The lattice Boltzmann method (LBM) based on single-relaxation-time (SRT) or multiple-relaxation-time (MRT) collision operators is widely used in simulating flow and transport phenomena. The LBM based on two-relaxation-time (TRT) collision operators possesses strengths from the SRT and MRT LBMs, such as its simple implementation and good numerical stability, although tedious mathematical derivations and presentations of the TRT LBM hinder its application to a broad range of flow and transport phenomena. This paper describes the TRT LBM clearly and provides a pseudocode for easy implementation. Various transport phenomena were simulated using the TRT LBM to illustrate its applications in subsurface environments. These phenomena include advection-diffusion in uniform flow, Taylor dispersion in a pipe, solute transport in a packed column, reactive transport in uniform flow, and bacterial chemotaxis in porous media. The TRT LBM demonstrated good numerical performance in terms of accuracy and stability in predicting these transport phenomena. Therefore, the TRT LBM is a powerful tool to simulate various geophysical and biogeochemical processes in subsurface environments.

  17. Label-Free Optofluidic Nanobiosensor Enables Real-Time Analysis of Single-Cell Cytokine Secretion.

    PubMed

    Li, Xiaokang; Soler, Maria; Szydzik, Crispin; Khoshmanesh, Khashayar; Schmidt, Julien; Coukos, George; Mitchell, Arnan; Altug, Hatice

    2018-06-01

    Single-cell analysis of cytokine secretion is essential to understand the heterogeneity of cellular functionalities and develop novel therapies for multiple diseases. Unraveling the dynamic secretion process at single-cell resolution reveals the real-time functional status of individual cells. Fluorescent and colorimetric-based methodologies require tedious molecular labeling that brings inevitable interferences with cell integrity and compromises the temporal resolution. An innovative label-free optofluidic nanoplasmonic biosensor is introduced for single-cell analysis in real time. The nanobiosensor incorporates a novel design of a multifunctional microfluidic system with small volume microchamber and regulation channels for reliable monitoring of cytokine secretion from individual cells for hours. Different interleukin-2 secretion profiles are detected and distinguished from single lymphoma cells. The sensor configuration combined with optical spectroscopic imaging further allows us to determine the spatial single-cell secretion fingerprints in real time. This new biosensor system is anticipated to be a powerful tool to characterize single-cell signaling for basic and clinical research. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Two-Relaxation-Time Lattice Boltzmann Method and its Application to Advective-Diffusive-Reactive Transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Zhifeng; Yang, Xiaofan; Li, Siliang

    The lattice Boltzmann method (LBM) based on single-relaxation-time (SRT) or multiple-relaxation-time (MRT) collision operators is widely used in simulating flow and transport phenomena. The LBM based on two-relaxation-time (TRT) collision operators possesses strengths from the SRT and MRT LBMs, such as its simple implementation and good numerical stability, although tedious mathematical derivations and presentations of the TRT LBM hinder its application to a broad range of flow and transport phenomena. This paper describes the TRT LBM clearly and provides a pseudocode for easy implementation. Various transport phenomena were simulated using the TRT LBM to illustrate its applications in subsurface environments. These phenomena include advection-diffusion in uniform flow, Taylor dispersion in a pipe, solute transport in a packed column, reactive transport in uniform flow, and bacterial chemotaxis in porous media. Finally, the TRT LBM demonstrated good numerical performance in terms of accuracy and stability in predicting these transport phenomena. Therefore, the TRT LBM is a powerful tool to simulate various geophysical and biogeochemical processes in subsurface environments.

  19. Protein Innovations Advance Drug Treatments, Skin Care

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Dan Carter carefully layered the sheets of tracing paper on the light box. On each sheet were renderings of the atomic components of an essential human protein, one whose structure had long been a mystery. With each layer Carter laid down, a never-before-seen image became clearer. Carter joined NASA's Marshall Space Flight Center in 1985 and began exploring processes of protein crystal growth in space. By bouncing intense X-rays off the crystals, researchers can determine the electron densities around the thousands of atoms forming the protein molecules, unveiling their atomic structures. Cultivating crystals of sufficient quality on Earth was problematic; the microgravity conditions of space were far more accommodating. At the time, only a few hundred protein structures had been mapped, and the methods were time consuming and tedious. Carter hoped his work would help reveal the structure of human serum albumin, a major protein in the human circulatory system responsible for ferrying numerous small molecules in the blood. More was at stake than scientific curiosity. "Albumin has a high affinity for most of the world's pharmaceuticals," Carter explains, "and its interaction with drugs can change their safety and efficacy." When a medication enters the bloodstream (a cancer chemotherapy drug, for example), a majority of it can bind with albumin, leaving only a small percentage active for treatment. How a drug interacts with albumin can influence considerations like the necessary effective dosage, playing a significant role in the design and application of therapeutic measures. In spite of numerous difficulties, including having no access to microgravity following the 1986 Space Shuttle Challenger disaster, the image Carter had hoped to see was finally clarifying. In 1988, his lab had acquired specialized X-ray and detection equipment, a tipping point. Carter and his colleagues began to piece together albumin's portrait, the formation of its electron densities coalescing on the sheets of tracing paper he arranged on the light box. While space-grown crystals were ultimately not involved in the achievement, a year later, Carter says, "we were on the cover of Science magazine, having determined the atomic structure of albumin."

  20. An algorithm for automatic crystal identification in pixelated scintillation detectors using thin plate splines and Gaussian mixture models

    NASA Astrophysics Data System (ADS)

    Schellenberg, Graham; Stortz, Greg; Goertzen, Andrew L.

    2016-02-01

    A typical positron emission tomography detector is comprised of a scintillator crystal array coupled to a photodetector array or other position-sensitive detector. Such detectors, which use light sharing to read out crystal elements, require the creation of a crystal lookup table (CLUT) that maps the detector response to the crystal of interaction based on the x-y position of the event calculated through Anger-type logic. It is vital for system performance that these CLUTs be accurate so that the location of events can be accurately identified and so that crystal-specific corrections, such as energy windowing or time alignment, can be applied. While using manual segmentation of the flood image to create the CLUT is a simple and reliable approach, it is both tedious and time consuming for systems with large numbers of crystal elements. In this work we describe the development of an automated algorithm for CLUT generation that uses a Gaussian mixture model paired with thin plate splines (TPS) to iteratively fit a crystal layout template that includes the crystal numbering pattern. Starting from a region of stability, Gaussians are individually fit to data corresponding to crystal locations while simultaneously updating a TPS for predicting future Gaussian locations at the edge of a region of interest that grows as individual Gaussians converge to crystal locations. The algorithm was tested with flood image data collected from 16 detector modules, each consisting of a 409-crystal dual-layer offset LYSO crystal array read out by a 32-pixel SiPM array. For these detector flood images, depending on user-defined input parameters, the algorithm runtime ranged from 17.5 to 82.5 s per detector on a single core of an Intel i7 processor. The method maintained an accuracy above 99.8% across all tests, with the majority of errors being localized to error-prone corner regions. This method can be easily extended for use with other detector types through adjustment of the initial template model used.
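
    A compact sketch of the core idea (fit one Gaussian per crystal to flood-image event positions, then fit a thin-plate spline that maps the nominal crystal grid onto the fitted centroids so the positions of remaining crystals can be predicted); scikit-learn and SciPy stand in for the authors' implementation, and the array geometry, distortion and parameters below are illustrative assumptions.

    import numpy as np
    from sklearn.mixture import GaussianMixture
    from scipy.interpolate import RBFInterpolator

    # synthetic flood data: events scattered around a smoothly distorted 8 x 8 crystal grid
    rng = np.random.default_rng(0)
    nominal = np.array([(i, j) for i in range(8) for j in range(8)], dtype=float)
    true_pos = nominal + 0.15 * np.sin(nominal[:, ::-1])
    events = np.concatenate([p + 0.05 * rng.standard_normal((200, 2)) for p in true_pos])

    # one Gaussian per crystal recovers the response peaks; initialized from the nominal
    # template for simplicity (the published algorithm grows a region of stability instead)
    gmm = GaussianMixture(n_components=len(nominal), means_init=nominal,
                          covariance_type="spherical", random_state=0).fit(events)
    centroids = gmm.means_

    # thin-plate spline mapping nominal grid -> measured centroids; in the real algorithm
    # this spline predicts where not-yet-fitted crystals should appear
    tps = RBFInterpolator(nominal, centroids, kernel="thin_plate_spline")
    print(np.abs(tps(nominal) - centroids).max())    # the spline reproduces the fitted peaks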

  1. Panel discussion on vaccine development to meet U.S. and international needs. Strategies for reducing the disincentives to HIV vaccine development: description of a successful public-private sector international collaboration.

    PubMed

    Bronnenkant, L

    1994-01-01

    A representative of Finishing Enterprises, the world's largest manufacturer of intrauterine contraceptive devices (IUDs), discusses how to alter the balance of incentives-disincentives to expedite the development of HIV vaccines for international evaluation. Three main disincentives exist for private manufacturers in the United States to develop a new HIV vaccine to be used in developing countries, outside the profitable North American and western European markets: 1) low profit margin because of limited money, time, and resources. Medium and large-sized corporations are more concerned with a high return on their investment owing to stockholder pressure than with the human benefit of that investment. 2) Lengthy regulatory approval process. The current regulatory process in the US is tedious, time-consuming, and costly. 3) Liability risk. The United States is the most litigious society in the world. Suits filed against US corporations involved in drug manufacture incur legal defence costs, which make an already low profit margin HIV vaccine even lower. Finishing Enterprises' IUD program aimed at providing the safest and most effective IUD at an affordable price in a socially responsible way. The Population Council developed the Copper T and retained the patent rights. They and other international health authorities, such as the World Health Organization, conducted or monitored international clinical trials to determine safety and efficacy. Private foundations and public donor agencies funded these activities. When donor agencies committed to volume purchases for their commodity programs, Finishing Enterprises could commit to volume pricing. Whenever high-margin private sector sales occur, Population Council receives a royalty payment. Thus, the disincentives were overcome: 1) Low profit margin was less an issue for a small, private company created specifically to manufacture IUDs and guaranteed volume orders. 2) Lengthy regulatory approval processes were avoided by various international clinical trials, generating international interest in the product. 3) Liability risk was minimized by the variety of safety tests the product underwent.

  2. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting.

    PubMed

    Waithe, Dominic; Rennert, Peter; Brostow, Gabriel; Piper, Matthew D W

    2015-01-01

    We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature recognition capabilities of the software proved to be robust to food surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given the average time per vial for manual counting is approximately 40 seconds, our software introduces a timesaving advantage for experiments starting with as few as 20 vials. We also describe an optional acrylic box to be used as a digital camera mount and to provide controlled lighting during image acquisition which will guarantee the conditions used in this study.
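
    A back-of-the-envelope check of the break-even figure quoted above, under the stated timings (about 40 s per vial manually, roughly 10 minutes of one-off training, and a few seconds of automated evaluation per vial); the 3 s per vial figure is an assumption standing in for "a few seconds".

      # Break-even point: smallest number of vials for which one-off training plus
      # automated evaluation beats purely manual counting.
      def break_even_vials(manual_s_per_vial=40, training_s=600, auto_s_per_vial=3):
          n = 1
          while n * manual_s_per_vial < training_s + n * auto_s_per_vial:
              n += 1
          return n

      print(break_even_vials())  # ~17 vials under these assumptions, in line with
                                 # the paper's "as few as 20 vials"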

  3. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting

    PubMed Central

    Waithe, Dominic; Rennert, Peter; Brostow, Gabriel; Piper, Matthew D. W.

    2015-01-01

    We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature recognition capabilities of the software proved to be robust to food surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given the average time per vial for manual counting is approximately 40 seconds, our software introduces a timesaving advantage for experiments starting with as few as 20 vials. We also describe an optional acrylic box to be used as a digital camera mount and to provide controlled lighting during image acquisition which will guarantee the conditions used in this study. PMID:25992957

  4. Computer-Assisted Automated Scoring of Polysomnograms Using the Somnolyzer System

    PubMed Central

    Punjabi, Naresh M.; Shifa, Naima; Dorffner, Georg; Patil, Susheel; Pien, Grace; Aurora, Rashmi N.

    2015-01-01

    Study Objectives: Manual scoring of polysomnograms is a time-consuming and tedious process. To expedite the scoring of polysomnograms, several computerized algorithms for automated scoring have been developed. The overarching goal of this study was to determine the validity of the Somnolyzer system, an automated system for scoring polysomnograms. Design: The analysis sample comprised 97 sleep studies. Each polysomnogram was manually scored by certified technologists from four sleep laboratories and concurrently subjected to automated scoring by the Somnolyzer system. Agreement between manual and automated scoring was examined. Sleep staging and scoring of disordered breathing events was conducted using the 2007 American Academy of Sleep Medicine criteria. Setting: Clinical sleep laboratories. Measurements and Results: A high degree of agreement was noted between manual and automated scoring of the apnea-hypopnea index (AHI). The average correlation between the manually scored AHI across the four clinical sites was 0.92 (95% confidence interval: 0.90–0.93). Similarly, the average correlation between the manual and Somnolyzer-scored AHI values was 0.93 (95% confidence interval: 0.91–0.96). Thus, interscorer correlation between the manually scored results was no different from that derived from manual and automated scoring. Substantial concordance in the arousal index, total sleep time, and sleep efficiency between manual and automated scoring was also observed. In contrast, differences were noted between manually and automatically scored percentages of sleep stages N1, N2, and N3. Conclusion: Automated analysis of polysomnograms using the Somnolyzer system provides results that are comparable to manual scoring for commonly used metrics in sleep medicine. Although differences exist between manual and automated scoring for specific sleep stages, the level of agreement between manual and automated scoring is not significantly different from that between any two human scorers. In light of the burden associated with manual scoring, automated scoring platforms provide a viable complement of tools in the diagnostic armamentarium of sleep medicine. Citation: Punjabi NM, Shifa N, Dorffner G, Patil S, Pien G, Aurora RN. Computer-assisted automated scoring of polysomnograms using the Somnolyzer system. SLEEP 2015;38(10):1555–1566. PMID:25902809

  5. 10 CFR Appendix Y to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Battery Chargers

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... power (i.e., watts) consumed as the time series integral of the power consumed over a 1-hour test period...) consumed as the time series integral of the power consumed over a 1-hour test period, divided by the period...-maintenance mode and standby mode over time periods defined in the test procedure. b. Active mode is the...

  6. 10 CFR Appendix Y to Subpart B of... - Uniform Test Method for Measuring the Energy Consumption of Battery Chargers

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... power (i.e., watts) consumed as the time series integral of the power consumed over a 1-hour test period...) consumed as the time series integral of the power consumed over a 1-hour test period, divided by the period...-maintenance mode and standby mode over time periods defined in the test procedure. b. Active mode is the...

  7. A model selection approach for robust spatio-temporal analysis of dynamics in 4D fluorescence videomicroscopy.

    PubMed

    Bechar, Ikhlef; Trubuil, Alain

    2006-01-01

    We describe a novel automatic approach for vesicle trafficking analysis in 3D+T videomicroscopy. Tracking objects individually over time in 3D+T videomicroscopy is known to be a very tedious job and generally leads to unreliable results. Instead, our method proceeds by first identifying trafficking regions in the 3D volume and then analysing vesicle trafficking within them. Trafficking is viewed as a significant change in the fluorescence of a region in the image. We embed the problem in a model selection framework and solve it using dynamic programming. We applied the proposed approach to analyse the vesicle dynamics related to the trafficking of the RAB6A protein between the Golgi apparatus and ER cell compartments.
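
    The abstract casts trafficking detection as model selection solved by dynamic programming; a generic penalized change-point sketch in that spirit (not the authors' exact formulation) is shown below, where `penalty` is an assumed tuning parameter trading model complexity against fit.

      # Optimal partitioning of a region's fluorescence time series: each candidate
      # segmentation pays a per-segment penalty plus the sum-of-squares cost of
      # modelling each segment by its mean; dynamic programming finds the minimum.
      import numpy as np

      def segment_cost(y, i, j):
          seg = y[i:j]
          return float(np.sum((seg - seg.mean()) ** 2))  # SSE of a constant model

      def change_points(y, penalty):
          n = len(y)
          best = np.full(n + 1, np.inf)   # best[j]: optimal cost of y[:j]
          best[0] = -penalty
          prev = np.zeros(n + 1, dtype=int)
          for j in range(1, n + 1):
              for i in range(j):
                  c = best[i] + penalty + segment_cost(y, i, j)
                  if c < best[j]:
                      best[j], prev[j] = c, i
          cps, j = [], n                  # backtrack segment boundaries
          while j > 0:
              j = prev[j]
              if j > 0:
                  cps.append(j)
          return sorted(cps)

      signal = np.r_[np.ones(50), 4 * np.ones(30), np.ones(40)] + 0.1 * np.random.randn(120)
      print(change_points(signal, penalty=5.0))  # expected boundaries near [50, 80]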

  8. Creation of lumped parameter thermal model by the use of finite elements

    NASA Technical Reports Server (NTRS)

    1978-01-01

    In the finite difference technique, the thermal network is represented by an analogous electrical network. Developing this network model, which is used to describe a physical system, often requires tedious manual data preparation and checkout by the analyst; this effort can be greatly reduced by using computer programs that automatically develop the mathematical model and associated input data and graphically display the analytical model to facilitate model verification. Three separate programs are involved, linked through common mass storage files and data card formats. These programs, SPAR, CINGEN and GEOMPLT, are used to (1) develop thermal models for the MITAS II thermal analyzer program; (2) produce geometry plots of the thermal network; and (3) produce temperature distribution and time history plots.
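
    For readers unfamiliar with lumped-parameter thermal networks, a minimal hand-rolled example is sketched below; the node capacitances, conductances, and loads are illustrative assumptions, and the code is unrelated to the SPAR/CINGEN/GEOMPLT tools or MITAS II input formats.

      # Tiny lumped-parameter thermal network: nodes with capacitances C, pairwise
      # conductances G, and heat loads Q, integrated with explicit Euler steps.
      import numpy as np

      C = np.array([500.0, 300.0, 800.0])          # J/K per node (assumed values)
      Q = np.array([50.0, 0.0, 0.0])               # W applied to node 0
      G = np.array([[0.0, 2.0, 0.5],               # W/K conductances G[i][j]
                    [2.0, 0.0, 1.0],
                    [0.5, 1.0, 0.0]])
      T = np.array([293.0, 293.0, 293.0])          # initial temperatures, K

      dt, t_end = 1.0, 3600.0
      for _ in range(int(t_end / dt)):
          # Net heat flow into node i: Q[i] + sum_j G[i][j] * (T[j] - T[i])
          dT = (Q + (G * (T[None, :] - T[:, None])).sum(axis=1)) / C
          T = T + dt * dT

      print(T)  # node temperatures after one hour of heating node 0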

  9. An SPSS implementation of the nonrecursive outlier deletion procedure with shifting z score criterion (Van Selst & Jolicoeur, 1994).

    PubMed

    Thompson, Glenn L

    2006-05-01

    Sophisticated univariate outlier screening procedures are not yet available in widely used statistical packages such as SPSS. However, SPSS can accept user-supplied programs for executing these procedures. Failing this, researchers tend to rely on simplistic alternatives that can distort data because they do not adjust to cell-specific characteristics. Despite their popularity, these simple procedures may be especially ill suited for some applications (e.g., data from reaction time experiments). A user-friendly SPSS Production Facility implementation of the shifting z-score criterion procedure (Van Selst & Jolicoeur, 1994) is presented to make the procedure easier to use. In addition to outlier screening, optional syntax modules can be added that will perform tedious database management tasks (e.g., restructuring or computing means).
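
    A generic sketch of a non-recursive, sample-size-dependent outlier screen of the kind described, written in Python rather than SPSS syntax; the `criterion` function is a placeholder and does not reproduce the actual Van Selst & Jolicoeur (1994) table values.

      # Non-recursive screen: compute each value's z score once (no iterative
      # re-screening) and drop values beyond a criterion that depends on the
      # cell's sample size. Cutoff values below are illustrative placeholders.
      import numpy as np

      def criterion(n):
          # Placeholder: stricter cutoffs for small cells, approaching a
          # conventional cutoff for large cells.
          return 2.5 if n >= 100 else 2.5 * (1 - 1.0 / n)

      def screen(values):
          x = np.asarray(values, dtype=float)
          z = (x - x.mean()) / x.std(ddof=1)
          return x[np.abs(z) <= criterion(len(x))]

      rts = [412, 455, 430, 520, 1950, 470, 445, 433, 460, 451]
      print(screen(rts))  # the 1950 ms response time is removed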

  10. Remembrance of weaning past: the seminal papers.

    PubMed

    Tobin, Martin J

    2006-10-01

    The approach to ventilator weaning has changed considerably over the past 30 years. Change has resulted from research in three areas: pathophysiology, weaning-predictor testing, and weaning techniques. Physiology research illuminated the mechanisms of weaning failure. It also uncovered markers of weaning success. Through more reliable prediction, patients whose weaning would have been tedious in the 1970s are now weaned more rapidly. The weaning story offers several lessons in metascience: importance of creativity, the asking of heretical questions, serendipity, mental-set psychology, cross-fertilization, and the hazards of precocity. Weaning research also illustrates how Kuhnian normal (me-too) science dominates any field. Making the next quantum leap in weaning will depend on spending less time on normal science and more on the raising (and testing) of maverick ideas.

  11. Development of a Time-Intensity Evaluation System for Consumers: Measuring Bitterness and Retronasal Aroma of Coffee Beverages in 106 Untrained Panelists.

    PubMed

    Gotow, Naomi; Moritani, Ami; Hayakawa, Yoshinobu; Akutagawa, Akihito; Hashimoto, Hiroshi; Kobayakawa, Tatsu

    2015-06-01

    In order to develop products that are acceptable to consumers, it is necessary to incorporate consumers' intentions into products' characteristics. Therefore, investigation of consumers' perceptions of the taste or smell of common beverages provides information that should be useful in predicting market responses. In this study, we sought to develop a time-intensity evaluation system for consumer panels. Using our system, we performed time-intensity evaluation of flavor attributes (bitterness and retronasal aroma) that consumers perceived after swallowing a coffee beverage. Additionally, we developed quantitative evaluation methods for determining whether consumer panelists can properly perform time-intensity evaluation. In every trial, we fitted an exponential function to measured intensity data for bitterness and retronasal aroma. The correlation coefficients between measured time-intensity data and the fitted exponential curves were greater than 0.8 in about 90% of trials, indicating that we had successfully developed a time-intensity system usable by consumer panelists after just a single training trial, even with untrained consumers. We classified participants into two groups based on their consumption of canned coffee beverages. Among the conventional TI parameters, two-way ANOVA revealed a significant difference only in AUC between sensory modalities (bitterness compared with retronasal aroma). However, three-way ANOVA including the time course revealed a significant difference between bitterness and retronasal aroma in the high-consumption group. Moreover, the high-consumption group more easily discriminated between bitterness and retronasal aroma than the low-consumption group. This finding implied that manufacturers should select consumer panelists who are suitable for their concepts of new products. © 2015 Institute of Food Technologists®
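
    A sketch of the per-trial analysis described above: fit an exponential decay to a measured time-intensity curve and check the correlation between the data and the fitted curve. The decay form I0*exp(-t/tau) and the synthetic data are assumptions for illustration.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.stats import pearsonr

      def decay(t, i0, tau):
          return i0 * np.exp(-t / tau)

      t = np.linspace(0, 60, 121)                           # seconds after swallowing
      intensity = decay(t, 8.0, 15.0) + 0.3 * np.random.randn(t.size)

      params, _ = curve_fit(decay, t, intensity, p0=(5.0, 10.0))
      r, _ = pearsonr(intensity, decay(t, *params))
      print(params, r)  # trials with r > 0.8 were counted as well-fitted in the study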

  12. Inquiring into Familiar Objects: An Inquiry-Based Approach to Introduce Scientific Vocabulary

    ERIC Educational Resources Information Center

    Hicks Pries, Caitlin; Hughes, Julie

    2012-01-01

    Learning science vocabulary is an often tedious but important component of many curricula. Frequently, students are expected to learn science vocabulary indirectly, but this method can hinder the success of lower-performing students (Carlisle, Fleming, and Gudbrandsen 2000). We have developed an inquiry-based vocabulary activity wherein students…

  13. Who Needs Lewis Structures to Get VSEPR Geometries?

    ERIC Educational Resources Information Center

    Lindmark, Alan F.

    2010-01-01

    Teaching the VSEPR (valence shell electron-pair repulsion) model can be a tedious process. Traditionally, Lewis structures are drawn and the number of "electron clouds" (groups) around the central atom are counted and related to the standard VSEPR table of possible geometries. A simpler method to deduce the VSEPR structure without first drawing…

  14. Systematic Reviews: Inducting Research Students into Scholarly Conversations?

    ERIC Educational Resources Information Center

    Leung, Janni; Ferrari, Alize; Baxter, Amanda; Schoultz, Mariyana; Beattie, Michelle; Harris, Meredith

    2017-01-01

    Systematic reviews are common in disciplines such as medicine, nursing, and health sciences and students are increasingly being encouraged to conduct them as a component of their thesis (Pickering & Byrne, 2014). Fortunately, the tedious old days of writing a thesis by gathering mountains of hard-copy papers are long behind us. Privileged…

  15. Counting on COUNTER: The Current State of E-Resource Usage Data in Libraries

    ERIC Educational Resources Information Center

    Welker, Josh

    2012-01-01

    Any librarian who has managed electronic resources has experienced the--for want of words--"joy" of gathering and analyzing usage statistics. Such statistics are important for evaluating the effectiveness of resources and for making important budgeting decisions. Unfortunately, the data are usually tedious to collect, inconsistently organized, of…

  16. An Alternative Method to Gauss-Jordan Elimination: Minimizing Fraction Arithmetic

    ERIC Educational Resources Information Center

    Smith, Luke; Powell, Joan

    2011-01-01

    When solving systems of equations by using matrices, many teachers present a Gauss-Jordan elimination approach to row reducing matrices that can involve painfully tedious operations with fractions (which I will call the traditional method). In this essay, I present an alternative method to row reduce matrices that does not introduce additional…

  17. Self-assembled optically transparent cellulose nanofibril films: effect of nanofibril morphology and drying procedure

    Treesearch

    Yan Qing; Ronald Sabo; Yiqiang Wu; J.Y. Zhu; Zhiyong Cai

    2015-01-01

    Cellulose nanofibril (CNF) films currently provide great opportunity in many applications with advantages of excellent mechanical strength, high light transmittance, and good barrier properties. However, processes for preparing CNFs are typically tedious and vary, along with their properties. Here, five preparation methods using various combinations of filtration,...

  18. Engaging Students' Imaginations in Second Language Learning

    ERIC Educational Resources Information Center

    Judson, Gillian; Egan, Kieran

    2013-01-01

    Imagination is rarely acknowledged as one of the main workhorses of learning. Unfortunately, disregarding the imagination has some clearly negative pedagogical impacts: Learning is more ineffective than it should be and much schooling is more tedious than it need be. In this paper, we outline a somewhat new way of thinking about the process of…

  19. Estimating water retention curves for sandy soils at the Doñana National Park, SW Spain

    USDA-ARS?s Scientific Manuscript database

    The determination of soil water retention curves (SWRC) in the laboratory is a slow and tedious task, which is especially challenging for sandy soils due to their low water retention capacity and large water content changes for small pressure head differences. Due to spatial variability within larg...

  20. Can tasks be inherently boring?

    PubMed

    Charney, Evan

    2013-12-01

    Kurzban et al. argue that the experiences of "effort," "boredom," and "fatigue" are indications that the costs of a task outweigh its benefits. Reducing the costs of tasks to "opportunity costs" has the effect of rendering tasks costless and of denying that they can be inherently boring or tedious, something that "vigilance tasks" were intentionally designed to be.

  1. A study of how unmanned aircraft systems can support the Kansas Department of Transportation's efforts to improve efficiency, safety, and cost reduction : final report.

    DOT National Transportation Integrated Search

    2016-08-01

    Regulations for using Unmanned Aircraft Systems (UAS) are not yet standardized by the Federal Aviation : Administration (FAA). This creates tedious obstacles for those who wish to utilize the technology. The goal of this : research is to provide a ju...

  2. Extracting Maximum Total Water Levels from Video "Brightest" Images

    NASA Astrophysics Data System (ADS)

    Brown, J. A.; Holman, R. A.; Stockdon, H. F.; Plant, N. G.; Long, J.; Brodie, K.

    2016-02-01

    An important parameter for predicting storm-induced coastal change is the maximum total water level (TWL). Most studies estimate the TWL as the sum of slowly varying water levels, including tides and storm surge, and the extreme runup parameter R2%, which includes wave setup and swash motions over minutes to seconds. Typically, R2% is measured using video remote sensing data, where cross-shore timestacks of pixel intensity are digitized to extract the horizontal runup timeseries. However, this technique must be repeated at multiple alongshore locations to resolve alongshore variability, and can be tedious and time consuming. We seek an efficient, video-based approach that yields a synoptic estimate of TWL that accounts for alongshore variability and can be applied during storms. In this work, the use of a video product termed the "brightest" image is tested; this represents the highest intensity of each pixel captured during a 10-minute collection period. Image filtering and edge detection techniques are applied to automatically determine the shoreward edge of the brightest region (i.e., the swash zone) at each alongshore pixel. The edge represents the horizontal position of the maximum TWL along the beach during the collection period, and is converted to vertical elevations using measured beach topography. This technique is evaluated using video and topographic data collected every half-hour at Duck, NC, during differing hydrodynamic conditions. Relationships between the maximum TWL estimates from the brightest images and various runup statistics computed using concurrent runup timestacks are examined, and errors associated with mapping the horizontal results to elevations are discussed. This technique is invaluable, as it can be used to routinely estimate maximum TWLs along a coastline from a single brightest image product, and provides a means for examining alongshore variability of TWLs at high alongshore resolution. These advantages will be useful in validating numerical hydrodynamic models and improving coastal change predictions.
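
    A hedged sketch of the image-processing step described above, using scikit-image rather than the authors' exact filtering pipeline; the image orientation (rows cross-shore, columns alongshore, shore toward larger row indices) and the filter settings are assumptions.

      # Extract the shoreward edge of the bright swash band from a "brightest"
      # time-exposure image: smooth, detect edges, then keep the most shoreward
      # edge pixel in each alongshore column.
      import numpy as np
      from skimage import io, color, filters, feature

      def shoreward_edge(image_path):
          img = io.imread(image_path)
          if img.ndim == 3:
              img = color.rgb2gray(img)
          smoothed = filters.gaussian(img.astype(float), sigma=3)
          edges = feature.canny(smoothed, sigma=2)
          edge_row = np.full(edges.shape[1], np.nan)
          for col in range(edges.shape[1]):
              rows = np.flatnonzero(edges[:, col])
              if rows.size:
                  edge_row[col] = rows.max()   # most shoreward edge in this column
          return edge_row  # convert to elevation using surveyed beach topography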

  3. Application of Shuffled Frog Leaping Algorithm and Genetic Algorithm for the Optimization of Urban Stormwater Drainage

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Kaushal, D. R.; Gosain, A. K.

    2017-12-01

    Urban hydrology will have an increasing role to play in the sustainability of human settlements. Expansion of urban areas brings significant changes in the physical characteristics of land use. Problems with the administration of urban flooding have their roots in the concentration of population within a relatively small area. As watersheds are urbanized, infiltration decreases and the pattern of surface runoff changes, generating high peak flows and large runoff volumes from urban areas. Conceptual rainfall-runoff models have become a foremost tool for predicting surface runoff and forecasting floods. Manual calibration is often time-consuming and tedious because of the subjectivity involved, which makes automatic approaches preferable. The calibration of parameters usually includes numerous criteria for evaluating performance with respect to the observed data. Moreover, deriving the objective function associated with the calibration of model parameters is quite challenging. Various studies dealing with optimization methods have steered the adoption of evolution-based optimization algorithms. In this paper, a systematic comparison of two evolutionary approaches to multi-objective optimization, namely the shuffled frog leaping algorithm (SFLA) and genetic algorithms (GA), is done. SFLA is a population-based cooperative search metaphor inspired by natural memetics, while GA is based on the principle of survival of the fittest and natural evolution. SFLA and GA have been employed for optimizing the major parameters, i.e. width, imperviousness, Manning's coefficient and depression storage, for the highly urbanized catchment of Delhi, India. The study summarizes the auto-tuning of a widely used storm water management model (SWMM), by internal coupling of SWMM with SFLA and GA separately. The values of statistical parameters such as Nash-Sutcliffe efficiency (NSE) and percent bias (PBIAS) were found to lie within acceptable limits, indicating reasonably good model performance. Overall, this study proved promising for assessing risk in urban drainage systems and should prove useful for improving the integrity and reliability of the urban system and for guiding inundation preparedness. Keywords: Hydrologic model, SWMM, Urbanization, SFLA and GA.
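
    The two goodness-of-fit statistics named above, computed from observed and simulated flow series; these are the standard definitions an auto-calibration loop would evaluate for each candidate parameter set (sign convention for PBIAS varies between authors).

      import numpy as np

      def nse(obs, sim):
          # Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 is no better than the mean.
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      def pbias(obs, sim):
          # Percent bias: positive values indicate underestimation with this convention.
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 100.0 * np.sum(obs - sim) / np.sum(obs)

      obs = [0.2, 0.8, 2.5, 1.6, 0.9, 0.4]   # toy observed flows
      sim = [0.3, 0.7, 2.2, 1.8, 1.0, 0.5]   # toy simulated flows
      print(nse(obs, sim), pbias(obs, sim))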

  4. Estimation of in-situ bioremediation system cost using a hybrid Extreme Learning Machine (ELM)-particle swarm optimization approach

    NASA Astrophysics Data System (ADS)

    Yadav, Basant; Ch, Sudheer; Mathur, Shashi; Adamowski, Jan

    2016-12-01

    In-situ bioremediation is the most common groundwater remediation procedure used for treating organically contaminated sites. A simulation-optimization approach, which incorporates a simulation model for groundwater flow and transport processes within an optimization program, could help engineers in designing a remediation system that best satisfies management objectives as well as regulatory constraints. In-situ bioremediation is a highly complex, non-linear process, and the modelling of such a complex system requires significant computational effort. Soft computing techniques have a flexible mathematical structure which can generalize complex nonlinear processes. In in-situ bioremediation management, a physically-based model is used for the simulation and the simulated data is utilized by the optimization model to optimize the remediation cost. Repeatedly calling the simulator to satisfy the constraints is an extremely tedious and time-consuming process, and thus there is a need for a surrogate simulator that can reduce the computational burden. This study presents a simulation-optimization approach to achieve an accurate and cost-effective in-situ bioremediation system design for groundwater contaminated with BTEX (Benzene, Toluene, Ethylbenzene, and Xylenes) compounds. In this study, the Extreme Learning Machine (ELM) is used as a proxy simulator to replace BIOPLUME III for the simulation. ELM was selected after a comparative analysis with an Artificial Neural Network (ANN) and a Support Vector Machine (SVM), as these were successfully used in previous studies of in-situ bioremediation system design. Further, a single-objective optimization problem is solved by a coupled Extreme Learning Machine (ELM)-Particle Swarm Optimization (PSO) technique to achieve the minimum cost for the in-situ bioremediation system design. The results indicate that ELM is a faster and more accurate proxy simulator than ANN and SVM. The total cost obtained by the ELM-PSO approach is held to a minimum while successfully satisfying all the regulatory constraints of the contaminated site.
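
    A minimal extreme learning machine regressor, the surrogate idea at the heart of the study: hidden-layer weights are random and fixed, and only the output weights are solved by least squares. This is a generic illustration with toy data, not the study's trained BIOPLUME III replacement.

      import numpy as np

      class ELMRegressor:
          def __init__(self, n_hidden=50, seed=0):
              self.n_hidden = n_hidden
              self.rng = np.random.default_rng(seed)

          def _hidden(self, X):
              return np.tanh(X @ self.W + self.b)

          def fit(self, X, y):
              X = np.asarray(X, float)
              # Random, fixed input-to-hidden weights and biases.
              self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
              self.b = self.rng.normal(size=self.n_hidden)
              H = self._hidden(X)
              # Output weights by ordinary least squares.
              self.beta, *_ = np.linalg.lstsq(H, np.asarray(y, float), rcond=None)
              return self

          def predict(self, X):
              return self._hidden(np.asarray(X, float)) @ self.beta

      # Toy usage: learn a cheap surrogate of an "expensive" response surface.
      X = np.random.rand(200, 3)
      y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 - X[:, 2]
      model = ELMRegressor(n_hidden=100).fit(X, y)
      print(np.abs(model.predict(X) - y).mean())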

  5. Fully automatic region of interest selection in glomerular filtration rate estimation from 99mTc-DTPA renogram.

    PubMed

    Lin, Kun-Ju; Huang, Jia-Yann; Chen, Yung-Sheng

    2011-12-01

    Glomerular filtration rate (GFR) is a commonly accepted standard estimate of renal function. Gamma camera-based methods for estimating renal uptake of (99m)Tc-diethylenetriaminepentaacetic acid (DTPA) without blood or urine sampling have been widely used. Of these, the method introduced by Gates has been the most common. Currently, most gamma cameras are equipped with a commercial program for GFR determination, a semi-quantitative analysis in which a region of interest (ROI) is manually drawn over each kidney. The GFR value can then be computed automatically from the scintigraphic determination of (99m)Tc-DTPA uptake within the kidney. Delineating the kidney area is difficult when applying a fixed threshold value. Moreover, hand-drawn ROIs are tedious, time-consuming, and highly dependent on operator skill. Thus, we developed a fully automatic renal ROI estimation system based on the temporal changes in intensity counts, an intensity-pair distribution image contrast enhancement method, adaptive thresholding, and morphological operations, which can locate the kidney area and obtain the GFR value from a (99m)Tc-DTPA renogram. To evaluate the performance of the proposed approach, 30 clinical dynamic renograms were included. The fully automatic approach failed in one patient with very poor renal function. Four patients had a unilateral kidney, and the others had bilateral kidneys. The automatic contours from the remaining 54 kidneys were compared with manually drawn contours. The 54 kidneys were included for area error and boundary error analyses. There was high correlation between two physicians' manual contours and the contours obtained by our approach. For area error analysis, the mean true positive area overlap is 91%, the mean false negative is 13.4%, and the mean false positive is 9.3%. The boundary error is 1.6 pixels. The GFR calculated using this automatic computer-aided approach is reproducible and may be applied to help nuclear medicine physicians in clinical practice.
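
    A hedged sketch of the thresholding and morphology stage; plain histogram equalization stands in for the authors' intensity-pair distribution enhancement, the structuring-element sizes are assumptions, and `summed_frames` is assumed to be a 2D array of summed dynamic-frame counts.

      # Enhance contrast, apply an automatic (Otsu) threshold, clean the mask with
      # morphological opening and closing, and keep the largest connected
      # components as candidate kidney ROIs.
      import numpy as np
      from skimage import exposure, filters, morphology, measure

      def kidney_rois(summed_frames, n_regions=2):
          img = exposure.equalize_hist(summed_frames.astype(float))
          mask = img > filters.threshold_otsu(img)
          mask = morphology.binary_opening(mask, morphology.disk(2))
          mask = morphology.binary_closing(mask, morphology.disk(3))
          labels = measure.label(mask)
          regions = sorted(measure.regionprops(labels), key=lambda r: r.area,
                           reverse=True)[:n_regions]
          roi = np.zeros_like(mask)
          for r in regions:
              roi[labels == r.label] = True
          return roi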

  6. Analysis of geometric moments as features for firearm identification.

    PubMed

    Md Ghani, Nor Azura; Liong, Choong-Yeun; Jemain, Abdul Aziz

    2010-05-20

    The task of identifying firearms from forensic ballistics specimens has been an exacting part of crime investigation for the last two decades. Every firearm, regardless of its size, make and model, has its own unique 'fingerprint'. When a firearm is fired, these fingerprints are transferred to the bullet and cartridge case. The components involved in producing these unique characteristics are the firing chamber, breech face, firing pin, ejector, extractor and the rifling of the barrel. These unique characteristics are the critical features for identifying firearms, allowing investigators to determine which particular firearm fired a given bullet. Traditionally the comparison of ballistic evidence has been a tedious and time-consuming process requiring highly skilled examiners. Therefore, the main objective of this study is the extraction and identification of suitable features from firing pin impressions in cartridge case images for firearm recognition. Some previous studies have shown that the firing pin impression of a cartridge case is one of the most important characteristics used for identifying an individual firearm. In this study, data are gathered using 747 cartridge case images captured from five different pistols of type 9mm Parabellum Vektor SP1, made in South Africa. All the images of the cartridge cases are then segmented into three regions, forming three different sets of images, i.e. the firing pin impression image, the centre of the firing pin impression, and the ring of the firing pin impression. Then geometric moments up to the sixth order were generated from each part of the images to form a set of numerical features. These 48 features were found to be significantly different using the MANOVA test. This high-dimensional feature set was then reduced to only 11 significant features using correlation analysis. Classification results using cross-validation under discriminant analysis show that 96.7% of the images were classified correctly. These results demonstrate the value of the geometric moments technique for producing a set of numerical features on which firearm identification can be based.
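
    Raw geometric moments of the kind used above to turn a firing pin impression image into numerical features; whether raw, central, or normalized moments were used is not detailed here, so the simplest (raw) case is shown.

      # Raw geometric moments m_pq = sum_x sum_y x^p y^q I(x, y) up to a chosen order.
      import numpy as np

      def geometric_moments(image, max_order=6):
          img = np.asarray(image, dtype=float)
          ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
          feats = {}
          for p in range(max_order + 1):
              for q in range(max_order + 1 - p):
                  feats[(p, q)] = float(np.sum((xs ** p) * (ys ** q) * img))
          return feats  # 28 moments with p + q <= 6 from this function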

  7. A molecular fragment cheminformatics roadmap for mesoscopic simulation.

    PubMed

    Truszkowski, Andreas; Daniel, Mirco; Kuhn, Hubert; Neumann, Stefan; Steinbeck, Christoph; Zielesny, Achim; Epple, Matthias

    2014-12-01

    Mesoscopic simulation studies the structure, dynamics and properties of large molecular ensembles with millions of atoms: Its basic interacting units (beads) are no longer the nuclei and electrons of quantum chemical ab-initio calculations or the atom types of molecular mechanics but molecular fragments, molecules or even larger molecular entities. For its simulation setup and output a mesoscopic simulation kernel software uses abstract matrix (array) representations for bead topology and connectivity. Therefore a pure kernel-based mesoscopic simulation task is a tedious, time-consuming and error-prone venture that limits its practical use and application. A consequent cheminformatics approach tackles these problems and provides solutions for a considerably enhanced accessibility. This study aims at outlining a complete cheminformatics roadmap that frames a mesoscopic Molecular Fragment Dynamics (MFD) simulation kernel to allow its efficient use and practical application. The molecular fragment cheminformatics roadmap consists of four consecutive building blocks: An adequate fragment structure representation (1), defined operations on these fragment structures (2), the description of compartments with defined compositions and structural alignments (3), and the graphical setup and analysis of a whole simulation box (4). The basis of the cheminformatics approach (i.e. building block 1) is a SMILES-like line notation (denoted fSMILES) with connected molecular fragments to represent a molecular structure. The fSMILES notation and the following concepts and methods for building blocks 2-4 are outlined with examples and practical usage scenarios. It is shown that the requirements of the roadmap may be partly covered by already existing open-source cheminformatics software. Mesoscopic simulation techniques like MFD may be considerably alleviated and broadened for practical use with a consequent cheminformatics layer that successfully tackles its setup subtleties and conceptual usage hurdles. Molecular Fragment Cheminformatics may be regarded as a crucial accelerator to propagate MFD and similar mesoscopic simulation techniques in the molecular sciences. Graphical abstract: A molecular fragment cheminformatics roadmap for mesoscopic simulation.

  8. An ensemble deep learning based approach for red lesion detection in fundus images.

    PubMed

    Orlando, José Ignacio; Prokofyeva, Elena; Del Fresno, Mariana; Blaschko, Matthew B

    2018-01-01

    Diabetic retinopathy (DR) is one of the leading causes of preventable blindness in the world. Its earliest signs are red lesions, a general term that groups both microaneurysms (MAs) and hemorrhages (HEs). In daily clinical practice, these lesions are manually detected by physicians using fundus photographs. However, this task is tedious and time-consuming, and requires an intensive effort due to the small size of the lesions and their lack of contrast. Computer-assisted diagnosis of DR based on red lesion detection is being actively explored due to its potential to improve both clinicians' consistency and accuracy. Moreover, it provides comprehensive feedback that is easy for physicians to assess. Several methods for detecting red lesions have been proposed in the literature, most of them based on characterizing lesion candidates using hand-crafted features, and classifying them into true or false positive detections. Deep learning-based approaches, by contrast, are scarce in this domain due to the high expense of annotating the lesions manually. In this paper we propose a novel method for red lesion detection based on combining deep learned features and domain knowledge. Features learned by a convolutional neural network (CNN) are augmented by incorporating hand-crafted features. This ensemble vector of descriptors is then used to identify true lesion candidates using a Random Forest classifier. We empirically observed that combining both sources of information significantly improves results with respect to using each approach separately. Furthermore, our method achieved the highest performance on a per-lesion basis on DIARETDB1 and e-ophtha, and for screening and need for referral on MESSIDOR compared to a second human expert. Results highlight the fact that integrating manually engineered approaches with deep learned features is relevant to improve results when the networks are trained from lesion-level annotated data. An open source implementation of our system is publicly available at https://github.com/ignaciorlando/red-lesion-detection. Copyright © 2017 Elsevier B.V. All rights reserved.
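
    A sketch of the ensemble idea described above: concatenate CNN-derived features with hand-crafted descriptors for each lesion candidate and classify the combined vector with a Random Forest. The feature arrays below are random stand-in placeholders, not the descriptors used in the paper.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def classify_candidates(cnn_features, hand_features, labels):
          X = np.hstack([cnn_features, hand_features])   # one row per candidate
          clf = RandomForestClassifier(n_estimators=300, class_weight="balanced")
          clf.fit(X, labels)                              # true vs false detections
          return clf

      # Usage sketch with random stand-in descriptors.
      cnn = np.random.rand(500, 64)
      hand = np.random.rand(500, 20)
      y = np.random.randint(0, 2, 500)
      model = classify_candidates(cnn, hand, y)
      print(model.predict_proba(np.hstack([cnn, hand]))[:3])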

  9. The Gene Set Builder: collation, curation, and distribution of sets of genes

    PubMed Central

    Yusuf, Dimas; Lim, Jonathan S; Wasserman, Wyeth W

    2005-01-01

    Background: In bioinformatics and genomics, there are many applications designed to investigate the common properties for a set of genes. Often, these multi-gene analysis tools attempt to reveal sequential, functional, and expressional ties. However, while tremendous effort has been invested in developing tools that can analyze a set of genes, minimal effort has been invested in developing tools that can help researchers compile, store, and annotate gene sets in the first place. As a result, the process of making or accessing a set often involves tedious and time consuming steps such as finding identifiers for each individual gene. These steps are often repeated extensively to shift from one identifier type to another; or to recreate a published set. In this paper, we present a simple online tool which – with the help of the gene catalogs Ensembl and GeneLynx – can help researchers build and annotate sets of genes quickly and easily. Description: The Gene Set Builder is a database-driven, web-based tool designed to help researchers compile, store, export, and share sets of genes. This application supports the 17 eukaryotic genomes found in version 32 of the Ensembl database, which includes species from yeast to human. User-created information such as sets and customized annotations are stored to facilitate easy access. Gene sets stored in the system can be "exported" in a variety of output formats – as lists of identifiers, in tables, or as sequences. In addition, gene sets can be "shared" with specific users to facilitate collaborations or fully released to provide access to published results. The application also features a Perl API (Application Programming Interface) for direct connectivity to custom analysis tools. A downloadable Quick Reference guide and an online tutorial are available to help new users learn its functionalities. Conclusion: The Gene Set Builder is an Ensembl-facilitated online tool designed to help researchers compile and manage sets of genes in a user-friendly environment. The application can be accessed via . PMID:16371163

  10. Monitoring grass nutrients and biomass as indicators of rangeland quality and quantity using random forest modelling and WorldView-2 data

    NASA Astrophysics Data System (ADS)

    Ramoelo, Abel; Cho, M. A.; Mathieu, R.; Madonsela, S.; van de Kerchove, R.; Kaszta, Z.; Wolff, E.

    2015-12-01

    Land use and climate change could have huge impacts on food security and the health of various ecosystems. Leaf nitrogen (N) and above-ground biomass are some of the key factors limiting agricultural production and ecosystem functioning. Leaf N and biomass can be used as indicators of rangeland quality and quantity. Conventional methods for assessing these vegetation parameters at the landscape scale are time-consuming and tedious. Remote sensing provides a bird's-eye view of the landscape, which creates an opportunity to assess these vegetation parameters over wider rangeland areas. Estimation of leaf N has been successful during peak productivity or high biomass, and few studies have estimated leaf N in the dry season. The estimation of above-ground biomass has been hindered by signal saturation problems when using conventional vegetation indices. The objective of this study is to monitor leaf N and above-ground biomass as indicators of rangeland quality and quantity using WorldView-2 satellite images and the random forest technique in the north-eastern part of South Africa. A series of field campaigns to collect samples for leaf N and biomass was undertaken in March 2013, April or May 2012 (end of wet season), and July 2012 (dry season). Several conventional and red edge-based vegetation indices were computed. Overall results indicate that random forest models with vegetation indices explained over 89% of leaf N concentrations for grass and trees, and less than 89% for all the years of assessment. The red edge-based vegetation indices were among the important variables for predicting leaf N. For biomass, the random forest model explained over 84% of the variation in all years, and visible bands as well as red edge-based vegetation indices were found to be important. The study demonstrated that leaf N can be monitored using high spatial resolution imagery with red edge band capability, which is important for rangeland assessment and monitoring.
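
    A hedged sketch of the modelling approach: compute a conventional and a red edge-based index from band reflectances and use them as predictors of leaf N in a random forest regression. The band arrays and toy target below are placeholders, not the WorldView-2 field data.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      def ndvi(nir, other):
          # Normalized difference index; with the red band this is NDVI,
          # with the red edge band it is a red edge-based variant.
          return (nir - other) / (nir + other)

      n = 200
      red, red_edge, nir = (np.random.uniform(0.05, 0.6, n) for _ in range(3))
      leaf_n = 1.2 + 2.0 * ndvi(nir, red) + np.random.normal(0, 0.1, n)  # toy target

      X = np.column_stack([ndvi(nir, red), ndvi(nir, red_edge)])
      model = RandomForestRegressor(n_estimators=500, oob_score=True).fit(X, leaf_n)
      print(model.oob_score_)  # out-of-bag R^2, analogous to variance explained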

  11. A potential to monitor nutrients as an indicator of rangeland quality using space borne remote sensing

    NASA Astrophysics Data System (ADS)

    Ramoelo, A.; Cho, M. A.; Madonsela, S.; Mathieu, R.; van der Korchove, R.; Kaszta, Z.; Wolf, E.

    2014-02-01

    Global change consisting of land use and climate change could have huge impacts on food security and the health of various ecosystems. Leaf nitrogen (N) is one of the key factors limiting agricultural production and ecosystem functioning. Leaf N can be used as an indicator of rangeland quality, which could provide information for farmers, decision makers, land planners and managers. Leaf N plays a crucial role in understanding the feeding patterns and distribution of wildlife and livestock. Assessment of this vegetation parameter using conventional methods at the landscape scale is time-consuming and tedious. Remote sensing provides a synoptic view of the landscape, which engenders an opportunity to assess leaf N over wider rangeland areas, from protected to communal areas. Estimation of leaf N has been successful during peak productivity or high biomass, and few studies have estimated leaf N in the dry season. The objective of this study is to monitor leaf N as an indicator of rangeland quality using WorldView 2 satellite images in the north-eastern part of South Africa. A series of field campaigns to collect samples for leaf N was undertaken at the beginning of May (end of wet season) and in July (dry season). Several conventional and red edge-based vegetation indices were computed. Simple regression was used to develop a prediction model for leaf N. Using bootstrapping, indicators of precision and accuracy were analyzed to select the best model for the combined data sets (May and July). The May model for the red edge-based simple ratio explained over 90% of leaf N variation. The model developed from the combined data sets with the normalized difference vegetation index explained 62% of leaf N variation, and this model was used to estimate and map leaf N for the two seasons. The study demonstrated that leaf N could be monitored using high spatial resolution imagery with red edge band capability.

  12. Improved protein-protein interactions prediction via weighted sparse representation model combining continuous wavelet descriptor and PseAA composition.

    PubMed

    Huang, Yu-An; You, Zhu-Hong; Chen, Xing; Yan, Gui-Ying

    2016-12-23

    Protein-protein interactions (PPIs) are essential to most biological processes. Since bioscience has entered the genome and proteome era, there is a growing demand for knowledge about PPI networks. High-throughput biological technologies can be used to identify new PPIs, but they are expensive, time-consuming, and tedious. Therefore, computational methods for predicting PPIs have an important role. In recent years, an increasing number of computational methods such as protein structure-based approaches have been proposed for predicting PPIs. The principal limitation of these methods is their reliance on prior information about the proteins to infer PPIs. Therefore, it is of much significance to develop computational methods which use only the protein amino acid sequence. Here, we report a highly efficient approach for predicting PPIs. The main improvements come from the use of a novel protein sequence representation combining a continuous wavelet descriptor with Chou's pseudo amino acid composition (PseAAC), and from adopting a weighted sparse representation-based classifier (WSRC). This method, cross-validated on the PPI datasets of Saccharomyces cerevisiae, Human and H. pylori, achieves excellent results, with accuracies as high as 92.50%, 95.54% and 84.28%, respectively, significantly better than previously proposed methods. Extensive experiments are performed to compare the proposed method with a state-of-the-art Support Vector Machine (SVM) classifier. The outstanding results yielded by our model indicate that the proposed feature extraction method, combining two kinds of descriptors, has strong expressive ability and is expected to provide comprehensive and effective information for machine learning-based classification models. In addition, the prediction performance in the comparison experiments shows good cooperation between the combined features and WSRC. Thus, the proposed method is a very efficient method to predict PPIs and may be a useful supplementary tool for future proteomics studies.

  13. Langmuir probe analysis in electronegative plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bredin, Jerome, E-mail: jerome.bredin@lpp.polytechnique.fr; Chabert, Pascal; Aanesland, Ane

    2014-12-15

    This paper compares two methods to analyze Langmuir probe data obtained in electronegative plasmas. The techniques are developed to allow investigations in plasmas where the electronegativity α0 = n−/ne (the ratio between the negative ion and electron densities) varies strongly. The first technique uses an analytical model to express the Langmuir probe current-voltage (I-V) characteristic and its second derivative as a function of the electron and ion densities (ne, n+, n−), temperatures (Te, T+, T−), and masses (me, m+, m−). The analytical curves are fitted to the experimental data by adjusting these variables and parameters. To reduce the number of fitted parameters, the ion masses are assumed constant within the source volume, and quasi-neutrality is assumed everywhere. In this theory, Maxwellian distributions are assumed for all charged species. We show that this data analysis can predict the various plasma parameters within 5–10%, including the ion temperatures when α0 > 100. However, the method is tedious and time-consuming, and requires a precise measurement of the energy distribution function. A second technique is therefore developed for easier access to the electron and ion densities, but it does not give access to the ion temperatures. Here, only the measured I-V characteristic is needed. The electron density, temperature, and ion saturation current for positive ions are determined by classical probe techniques. The electronegativity α0 and the ion densities are deduced via an iterative method, since these variables are coupled via the modified Bohm velocity. For both techniques, a Child-Law sheath model for cylindrical probes has been developed and is presented to emphasize the importance of this model for small cylindrical Langmuir probes.

  14. Comparing Chemistry to Outcome: The Development of a Chemical Distance Metric, Coupled with Clustering and Hierarchal Visualization Applied to Macromolecular Crystallography

    PubMed Central

    Bruno, Andrew E.; Ruby, Amanda M.; Luft, Joseph R.; Grant, Thomas D.; Seetharaman, Jayaraman; Montelione, Gaetano T.; Hunt, John F.; Snell, Edward H.

    2014-01-01

    Many bioscience fields employ high-throughput methods to screen multiple biochemical conditions. The analysis of these becomes tedious without a degree of automation. Crystallization, a rate-limiting step in biological X-ray crystallography, is one of these fields. Screening of multiple potential crystallization conditions (cocktails) is the most effective method of probing a protein's phase diagram and guiding crystallization, but the interpretation of results can be time-consuming. To aid this empirical approach, a cocktail distance coefficient was developed to quantitatively compare macromolecule crystallization conditions and outcome. These coefficients were evaluated against an existing similarity metric developed for crystallization, the C6 metric, using virtual crystallization screens and by comparing two related 1,536-cocktail high-throughput crystallization screens. Hierarchical clustering was employed to visualize one of these screens, and the crystallization results from an exopolyphosphatase-related protein from Bacteroides fragilis (BfR192) were overlaid on this clustering. This demonstrated a strong correlation between certain chemically related clusters and crystal lead conditions. While this analysis was not used to guide the initial crystallization optimization, it led to the re-evaluation of unexplained peaks in the electron density map of the protein and to the insertion and correct placement of sodium, potassium and phosphate atoms in the structure. With these in place, the resulting structure of the putative active site demonstrated features consistent with active sites of other phosphatases which are involved in binding the phosphoryl moieties of nucleotide triphosphates. The new distance coefficient, CDcoeff, appears to be robust in this application, and coupled with hierarchical clustering and the overlay of crystallization outcome, reveals information of biological relevance. While tested with a single example, the potential applications related to crystallography appear promising, and the distance coefficient, clustering, and hierarchal visualization of results undoubtedly have applications in wider fields. PMID:24971458
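
    A sketch of the clustering step: given a precomputed matrix of pairwise chemical distances between cocktails (the CDcoeff values are assumed already computed), build a hierarchical clustering and cut it into groups against which crystallization outcomes can be overlaid; scipy.cluster.hierarchy.dendrogram can then draw the tree.

      import numpy as np
      from scipy.spatial.distance import squareform
      from scipy.cluster.hierarchy import linkage, fcluster

      def cluster_cocktails(distance_matrix, n_clusters=10):
          condensed = squareform(distance_matrix, checks=False)
          tree = linkage(condensed, method="average")
          groups = fcluster(tree, t=n_clusters, criterion="maxclust")
          return tree, groups   # `tree` can be passed to dendrogram() for the visual

      # Toy usage with a random symmetric distance matrix for 30 cocktails.
      d = np.random.rand(30, 30)
      d = (d + d.T) / 2
      np.fill_diagonal(d, 0)
      tree, groups = cluster_cocktails(d, n_clusters=5)
      print(groups)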

  15. ToxiM: A Toxicity Prediction Tool for Small Molecules Developed Using Machine Learning and Chemoinformatics Approaches.

    PubMed

    Sharma, Ashok K; Srivastava, Gopal N; Roy, Ankita; Sharma, Vineet K

    2017-01-01

    The experimental methods for the prediction of molecular toxicity are tedious and time-consuming tasks. Thus, the computational approaches could be used to develop alternative methods for toxicity prediction. We have developed a tool for the prediction of molecular toxicity along with the aqueous solubility and permeability of any molecule/metabolite. Using a comprehensive and curated set of toxin molecules as a training set, the different chemical and structural based features such as descriptors and fingerprints were exploited for feature selection, optimization and development of machine learning based classification and regression models. The compositional differences in the distribution of atoms were apparent between toxins and non-toxins, and hence, the molecular features were used for the classification and regression. On 10-fold cross-validation, the descriptor-based, fingerprint-based and hybrid-based classification models showed similar accuracy (93%) and Matthews's correlation coefficient (0.84). The performances of all the three models were comparable (Matthews's correlation coefficient = 0.84-0.87) on the blind dataset. In addition, the regression-based models using descriptors as input features were also compared and evaluated on the blind dataset. Random forest based regression model for the prediction of solubility performed better (R2 = 0.84) than the multi-linear regression (MLR) and partial least square regression (PLSR) models, whereas, the partial least squares based regression model for the prediction of permeability (caco-2) performed better (R2 = 0.68) in comparison to the random forest and MLR based regression models. The performance of final classification and regression models was evaluated using the two validation datasets including the known toxins and commonly used constituents of health products, which attests to its accuracy. The ToxiM web server would be a highly useful and reliable tool for the prediction of toxicity, solubility, and permeability of small molecules.

  16. ToxiM: A Toxicity Prediction Tool for Small Molecules Developed Using Machine Learning and Chemoinformatics Approaches

    PubMed Central

    Sharma, Ashok K.; Srivastava, Gopal N.; Roy, Ankita; Sharma, Vineet K.

    2017-01-01

    The experimental methods for the prediction of molecular toxicity are tedious and time-consuming tasks. Thus, the computational approaches could be used to develop alternative methods for toxicity prediction. We have developed a tool for the prediction of molecular toxicity along with the aqueous solubility and permeability of any molecule/metabolite. Using a comprehensive and curated set of toxin molecules as a training set, the different chemical and structural based features such as descriptors and fingerprints were exploited for feature selection, optimization and development of machine learning based classification and regression models. The compositional differences in the distribution of atoms were apparent between toxins and non-toxins, and hence, the molecular features were used for the classification and regression. On 10-fold cross-validation, the descriptor-based, fingerprint-based and hybrid-based classification models showed similar accuracy (93%) and Matthews's correlation coefficient (0.84). The performances of all the three models were comparable (Matthews's correlation coefficient = 0.84–0.87) on the blind dataset. In addition, the regression-based models using descriptors as input features were also compared and evaluated on the blind dataset. Random forest based regression model for the prediction of solubility performed better (R2 = 0.84) than the multi-linear regression (MLR) and partial least square regression (PLSR) models, whereas, the partial least squares based regression model for the prediction of permeability (caco-2) performed better (R2 = 0.68) in comparison to the random forest and MLR based regression models. The performance of final classification and regression models was evaluated using the two validation datasets including the known toxins and commonly used constituents of health products, which attests to its accuracy. The ToxiM web server would be a highly useful and reliable tool for the prediction of toxicity, solubility, and permeability of small molecules. PMID:29249969

  17. MUDMASTER: A Program for Calculating Crystalline Size Distributions and Strain from the Shapes of X-Ray Diffraction Peaks

    USGS Publications Warehouse

    Eberl, D.D.; Drits, V.A.; Środoń, Jan; Nüesch, R.

    1996-01-01

    Particle size may strongly influence the physical and chemical properties of a substance (e.g. its rheology, surface area, cation exchange capacity, solubility, etc.), and its measurement in rocks may yield geological information about ancient environments (sediment provenance, degree of metamorphism, degree of weathering, current directions, distance to shore, etc.). Therefore mineralogists, geologists, chemists, soil scientists, and others who deal with clay-size material would like to have a convenient method for measuring particle size distributions. Nano-size crystals generally are too fine to be measured by light microscopy. Laser scattering methods give only average particle sizes; therefore particle size can not be measured in a particular crystallographic direction. Also, the particles measured by laser techniques may be composed of several different minerals, and may be agglomerations of individual crystals. Measurement by electron and atomic force microscopy is tedious, expensive, and time consuming. It is difficult to measure more than a few hundred particles per sample by these methods. This many measurements, often taking several days of intensive effort, may yield an accurate mean size for a sample, but may be too few to determine an accurate distribution of sizes. Measurement of size distributions by X-ray diffraction (XRD) solves these shortcomings. An X-ray scan of a sample occurs automatically, taking a few minutes to a few hours. The resulting XRD peaks average diffraction effects from billions of individual nano-size crystals. The size that is measured by XRD may be related to the size of the individual crystals of the mineral in the sample, rather than to the size of particles formed from the agglomeration of these crystals. Therefore one can determine the size of a particular mineral in a mixture of minerals, and the sizes in a particular crystallographic direction of that mineral.

  18. Two-Photon Microscopy Analysis of Gold Nanoparticle Uptake in 3D Cell Spheroids.

    PubMed

    Rane, Tushar D; Armani, Andrea M

    2016-01-01

    Nanomaterials can be synthesized from a wide range of material systems in numerous morphologies, creating an extremely diverse portfolio. As a result of this tunability, these materials are emerging as a new class of nanotherapeutics and imaging agents. One particularly interesting nanomaterial is the gold nanoparticle. Due to its inherent biocompatibility and tunable photothermal behavior, it has made a rapid transition from the lab setting to in vivo testing. In most nanotherapeutic applications, the efficacy of the agent is directly related to the target of interest. However, the optimization of the AuNP size and shape for efficacy in vitro, prior to testing in in vivo models of a disease, has been largely limited to two-dimensional monolayers of cells. Two-dimensional cell cultures are unable to reproduce the conditions experienced by AuNPs in the body. In this article, we systematically investigate the effect of different properties of AuNPs on the penetration depth into 3D cell spheroids using two-photon microscopy. The 3D spheroids are formed from the HCT116 cell line, a colorectal carcinoma cell line. In addition to studying different sizes and shapes of AuNPs, we also study the effect of an oligo surface chemistry. There is a significant difference between AuNP uptake profiles in the 2D monolayers of cells as compared to the 3D cell spheroids. Additionally, the range of sizes and shapes studied here also exhibits marked differences in uptake penetration depth and efficacy. Finally, our results demonstrate that two-photon microscopy enables quantitative AuNP localization and concentration data to be obtained at the single-spheroid level without fluorescent labeling of the AuNPs, thus providing a viable technique for large-scale screening of AuNP properties in 3D cell spheroids as compared to tedious and time-consuming techniques like electron microscopy.

  19. TCP Packet Trace Analysis. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Shepard, Timothy J.

    1991-01-01

    Examination of a trace of packets collected from the network is often the only method available for diagnosing protocol performance problems in computer networks. This thesis explores the use of packet traces to diagnose performance problems of the transport protocol TCP. Unfortunately, manual examination of these traces can be so tedious that effective analysis is not possible. The primary contribution of this thesis is a graphical method of displaying the packet trace which greatly reduces the tediousness of examining a packet trace. The graphical method is demonstrated by the examination of some packet traces of typical TCP connections. The performance of two different implementations of TCP sending data across a particular network path is compared. Traces many thousands of packets long are used to demonstrate how effectively the graphical method simplifies examination of long, complicated traces. In the comparison of the two TCP implementations, the burstiness of the TCP transmitter appeared to be related to the achieved throughput. A method of quantifying this burstiness is presented and its possible relevance to understanding the performance of TCP is discussed.
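
    A minimal sketch of the idea behind such a display, TCP sequence number plotted against time, is given below; the two-column trace format and file name are assumptions made for illustration, not the trace format or tool used in the thesis.

    # Minimal time-sequence style plot: sequence number versus time for one connection.
    # Assumes a simplified trace file with "time seq" per line (an illustrative format).
    import matplotlib.pyplot as plt

    def load_trace(path):
        times, seqs = [], []
        with open(path) as fh:
            for line in fh:
                t, seq = line.split()[:2]
                times.append(float(t))
                seqs.append(int(seq))
        return times, seqs

    times, seqs = load_trace("trace.txt")   # hypothetical file of (time, seq) pairs
    plt.step(times, seqs, where="post")
    plt.xlabel("time (s)")
    plt.ylabel("TCP sequence number")
    plt.title("Time-sequence view of a TCP connection")
    plt.show()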

  20. 12 CFR Supplement I to Part 213 - Official Staff Commentary to Regulation M

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... primary use. 2. Period of time. To be a consumer lease, the initial term of the lease must be more than... lease (unless terminable without penalty at any time by the consumer) under which the consumer: i... time of consummation. The threshold amount in effect during a particular time period is the amount...

  1. 12 CFR Supplement I to Part 213 - Official Staff Commentary to Regulation M

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... primary use. 2. Period of time. To be a consumer lease, the initial term of the lease must be more than... lease (unless terminable without penalty at any time by the consumer) under which the consumer: i... time of consummation. The threshold amount in effect during a particular time period is the amount...

  2. 12 CFR Supplement I to Part 213 - Official Staff Commentary to Regulation M

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... primary use. 2. Period of time. To be a consumer lease, the initial term of the lease must be more than... lease (unless terminable without penalty at any time by the consumer) under which the consumer: i... time of consummation. The threshold amount in effect during a particular time period is the amount...

  3. Estimate of the soil water retention curve from the sorptivity and β parameter calculated from an upward infiltration experiment

    NASA Astrophysics Data System (ADS)

    Moret-Fernández, D.; Latorre, B.

    2017-01-01

    The water retention curve (θ(h)), which defines the relationship between the volumetric water content (θ) and the matric potential (h), is of paramount importance to characterize the hydraulic behaviour of soils. Because current methods to estimate θ(h) are, in general, tedious and time consuming, alternative procedures to determine θ(h) are needed. Using an upward infiltration curve, the main objective of this work is to present a method to determine the parameters of the van Genuchten (1980) water retention curve (α and n) from the sorptivity (S) and the β parameter defined in the 1D infiltration equation proposed by Haverkamp et al. (1994). The first specific objective is to present an equation, based on the Haverkamp et al. (1994) analysis, that describes an upward infiltration process. Second, assuming a known saturated hydraulic conductivity, Ks, calculated on a finite soil column by Darcy's law, a numerical procedure to calculate S and β by inverse analysis of an exfiltration curve is presented. Finally, the α and n values are numerically calculated from Ks, S and β. To accomplish the first specific objective, cumulative upward infiltration curves simulated with HYDRUS-1D for sand, loam, silt and clay soils were compared to those calculated with the proposed equation, after applying the corresponding β and S calculated from the theoretical Ks, α and n. The same curves were used to: (i) study the influence of the exfiltration time on S and β estimations, (ii) evaluate the limits of the inverse analysis, and (iii) validate the feasibility of the method to estimate α and n. Next, the θ(h) parameters estimated with the numerical method on experimental soils were compared to those obtained with pressure cells. The results showed that the upward infiltration curve could be correctly described by the modified Haverkamp et al. (1994) equation. While S was only affected by early-time exfiltration data, the β parameter had a significant influence on the long-time exfiltration curve, and its estimation accuracy increased with time. The 1D infiltration model was only suitable for β < 1.7 (sand, loam and silt). After omitting the clay soil, an excellent relationship (R2 = 0.99, p < 0.005) was observed between the theoretical α and n values of the synthetic soils and those estimated from the inverse analysis. Consistent results, with a significant relationship (p < 0.001) between the n values estimated with the pressure cell and those from the upward infiltration analysis, were also obtained on the experimental soils.
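
    For reference, the van Genuchten (1980) retention function whose α and n parameters the method estimates is commonly written as below, with the usual restriction m = 1 - 1/n; θ_r and θ_s are the residual and saturated water contents, and the paper's exact parameterization may differ in detail.

    \theta(h) = \theta_r + \frac{\theta_s - \theta_r}{\left[ 1 + (\alpha |h|)^{n} \right]^{m}}, \qquad m = 1 - \frac{1}{n}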

  4. Development of a Near Real-Time Hail Damage Swath Identification Algorithm for Vegetation

    NASA Technical Reports Server (NTRS)

    Bell, Jordan R.; Molthan, Andrew L.; Schultz, Kori A.; McGrath, Kevin M.; Burks, Jason E.

    2015-01-01

    Every year in the Midwest and Great Plains, widespread greenness forms in conjunction with the latter part of the spring-summer growing season. This widespread greenness results from the high concentration of agricultural areas whose crops reach maturity before the fall harvest. This time of year also coincides with an enhanced hail frequency for the Great Plains (Cintineo et al. 2012). These severe thunderstorms can bring damaging winds and large hail that can result in damage to the surface vegetation. The spatial extent of the damage can be a relatively small, concentrated area or a vast swath of damage that is visible from space. These large areas of damage have been well documented over the years. In the late 1960s, aerial photography was used to evaluate crop damage caused by hail. As satellite remote sensing technology has evolved, the identification of these hail damage streaks has increased, and satellites have made it possible to view these streaks in additional spectral bands. Parker et al. (2005) documented two streaks that occurred in South Dakota using the Moderate Resolution Imaging Spectroradiometer (MODIS), noting the potential impact that these streaks had on the surface temperature and the associated surface fluxes. Gallo et al. (2012) examined the correlation between radar signatures and ground observations from storms that produced a hail damage swath in central Iowa, also using MODIS. Finally, Molthan et al. (2013) identified hail damage streaks through MODIS, Landsat-7, and SPOT observations of different resolutions for the development of potential near-real-time applications. The manual analysis of hail damage streaks in satellite imagery is both tedious and time consuming, and may be inconsistent from event to event. This study focuses on the development of an objective and automatic algorithm to detect these areas of damage in a more efficient and timely manner. This study utilizes the MODIS sensor aboard the NASA Aqua satellite. Aqua was chosen due to an afternoon orbit over the United States, when land surface temperatures are relatively warm and improve the contrast between damaged and undamaged areas. This orbit is also similar to the orbit of the Suomi National Polar-orbiting Partnership (NPP) satellite. The Suomi NPP satellite hosts the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument, which is the next generation of a MODIS-like sensor in polar orbit.
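
    The abstract does not spell out the detection criterion, so the sketch below is only an assumed illustration of one common approach: flag pixels whose vegetation index (here NDVI) drops sharply between pre-storm and post-storm composites. The band arrays, threshold, and data are placeholders.

    # Assumed illustration (not the paper's algorithm): flag pixels whose NDVI falls
    # by more than a threshold after a storm.
    import numpy as np

    def ndvi(nir, red):
        return (nir - red) / (nir + red + 1e-9)

    def damage_mask(nir_pre, red_pre, nir_post, red_post, drop_threshold=0.15):
        """Boolean mask of pixels whose NDVI fell by more than drop_threshold."""
        return (ndvi(nir_pre, red_pre) - ndvi(nir_post, red_post)) > drop_threshold

    # Example with random placeholder reflectances shaped like a small image tile.
    rng = np.random.default_rng(1)
    pre = rng.uniform(0.0, 1.0, size=(2, 64, 64))    # [nir, red] before the storm
    post = rng.uniform(0.0, 1.0, size=(2, 64, 64))   # [nir, red] after the storm
    mask = damage_mask(pre[0], pre[1], post[0], post[1])
    print(f"flagged {mask.sum()} of {mask.size} pixels")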

  5. 12 CFR Supplement I to Part 1005 - Official Interpretations

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... used to capture the Magnetic Ink Character Recognition (MICR) encoding to initiate a one-time automated clearinghouse (ACH) debit. For example, if a consumer authorizes a one-time ACH debit from the consumer's... involved at the time of the transaction, if the consumer's asset account is subsequently debited for the...

  6. 12 CFR Supplement I to Part 1005 - Official Interpretations

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... used to capture the Magnetic Ink Character Recognition (MICR) encoding to initiate a one-time automated clearinghouse (ACH) debit. For example, if a consumer authorizes a one-time ACH debit from the consumer's... involved at the time of the transaction, if the consumer's asset account is subsequently debited for the...

  7. 12 CFR 205.3 - Coverage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... to initiate a one-time electronic fund transfer from a consumer's account. The consumer must...-time electronic fund transfer (in providing a check to a merchant or other payee for the MICR encoding... information for the transfer shall also provide a notice to the consumer at the same time it provides the...

  8. 12 CFR 1005.3 - Coverage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...-time electronic fund transfer from a consumer's account. The consumer must authorize the transfer. (ii... one-time electronic fund transfer (in providing a check to a merchant or other payee for the MICR... transfer. A consumer authorizes a one-time electronic fund transfer from his or her account to pay the fee...

  9. 12 CFR 205.3 - Coverage.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... to initiate a one-time electronic fund transfer from a consumer's account. The consumer must...-time electronic fund transfer (in providing a check to a merchant or other payee for the MICR encoding... information for the transfer shall also provide a notice to the consumer at the same time it provides the...

  10. 12 CFR 205.3 - Coverage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... to initiate a one-time electronic fund transfer from a consumer's account. The consumer must...-time electronic fund transfer (in providing a check to a merchant or other payee for the MICR encoding... information for the transfer shall also provide a notice to the consumer at the same time it provides the...

  11. Genotype identification of Math1/LacZ knockout mice based on real-time PCR with SYBR Green I dye.

    PubMed

    Krizhanovsky, Valery; Golenser, Esther; Ben-Arie, Nissim

    2004-07-30

    Knockout mice are widely used in all fields of biomedical research. Determining the genotype of every newborn mouse is a tedious task, usually performed by Southern blot hybridization or polymerase chain reaction (PCR). We describe here a quick and simple genotype identification assay based on real-time PCR and SYBR Green I dye, without using fluorescent primers. The discrimination between the wild-type and targeted alleles is based on a PCR design that leads to a different melting temperature for each product. The identification of the genotype is obvious immediately after amplification, and no post-PCR manipulations are needed, reducing cost and time. Therefore, while the real-time PCR amplification increases the sensitivity, the fact that the reaction tubes are never opened after amplification reduces the risk of contamination and eliminates errors, which are common during the repeated handling of dozens of samples from the same mouse line. The protocol we provide was tested on Math1 knockout mice, but is general, and may be utilized for any knockout line and real-time thermocycler, without any further modification, accessories or special reagents. Copyright 2004 Elsevier B.V.
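
    The discrimination logic can be pictured as in the sketch below: each allele's product has its own melting temperature, so a genotype call follows from which Tm peaks are present. The Tm values and tolerance are placeholders, not the Math1/LacZ assay's actual product temperatures.

    # Toy sketch of melt-curve genotyping logic; Tm values are assumed placeholders.
    WT_TM, KO_TM, TOL = 84.0, 78.0, 0.7   # product Tm (deg C) and matching tolerance

    def call_genotype(observed_tms):
        has_wt = any(abs(t - WT_TM) <= TOL for t in observed_tms)
        has_ko = any(abs(t - KO_TM) <= TOL for t in observed_tms)
        if has_wt and has_ko:
            return "heterozygous"
        if has_wt:
            return "wild type"
        if has_ko:
            return "knockout"
        return "no call"

    print(call_genotype([84.1]))          # wild type
    print(call_genotype([78.2, 83.8]))    # heterozygous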

  12. Fundamental Aerodynamic Investigations for Development of Arrow-Stabilized Projectiles

    NASA Technical Reports Server (NTRS)

    Kurzweg, Hermann

    1947-01-01

    The numerous patent applications on arrow-stabilized projectiles indicate that the idea of projectiles without spin is not new, but has appeared in various proposals throughout the last decades. As far as projectiles for subsonic speeds are concerned, suitable shapes have been developed for some time, for example, numerous grenades. Most of the patent applications, though, are not practicable, particularly for projectiles with supersonic speed. This is because the inventor usually does not have any knowledge of the aerodynamic flow around the projectile nor any particular understanding of the practical solution. The lack of wind tunnels for the development of projectiles made it necessary to use firing tests for development. These are obviously extremely tedious and expensive and almost always lead to failures. The often expressed opinion that arrow-stabilized projectiles cannot fly supersonically can be traced to this condition. That this is not the case was shown for the first time by Roechling on long projectiles with foldable fins. Since no aerodynamic investigations were made for the development of these projectiles, only tedious series of firing tests with systematic variation of the fins could lead to satisfactory results. These particular projectiles, though, have a disadvantage which lies in the nature of foldable fins: they occasionally do not open uniformly in flight, causing asymmetry in the flow and greater scatter. The junctions of fins and body are also aerodynamically poor and increase the drag. It must be possible to develop high-performance arrow-stabilized projectiles based on the aerodynamic research conducted during the last few years at Peenemuende and on new construction ideas. Thus the final shape, ready for operational use, could be developed in the wind tunnel without loss of expensive time in firing tests. The principle of arrow stabilization has been applied to a large number of calibers, which were stabilized by various means. Most promising was the development of a subcaliber wing-stabilized projectile with driving disc (Treibspiegel), in which rigid control surfaces extend beyond the caliber of the projectile into the free stream. The development of full-caliber, wing-stabilized projectiles with fins within the caliber is considerably more difficult, and a completely satisfactory solution for the latter has not yet been found.

  13. Implementing Access to Data Distributed on Many Processors

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A reference architecture is defined for an object-oriented implementation of domains, arrays, and distributions written in the programming language Chapel. This technology primarily addresses domains that contain arrays that have regular index sets with the low-level implementation details being beyond the scope of this discussion. What is defined is a complete set of object-oriented operators that allows one to perform data distributions for domain arrays involving regular arithmetic index sets. What is unique is that these operators allow for the arbitrary regions of the arrays to be fragmented and distributed across multiple processors with a single point of access giving the programmer the illusion that all the elements are collocated on a single processor. Today's massively parallel High Productivity Computing Systems (HPCS) are characterized by a modular structure, with a large number of processing and memory units connected by a high-speed network. Locality of access as well as load balancing are primary concerns in these systems that are typically used for high-performance scientific computation. Data distributions address these issues by providing a range of methods for spreading large data sets across the components of a system. Over the past two decades, many languages, systems, tools, and libraries have been developed for the support of distributions. Since the performance of data parallel applications is directly influenced by the distribution strategy, users often resort to low-level programming models that allow fine-tuning of the distribution aspects affecting performance, but, at the same time, are tedious and error-prone. This technology presents a reusable design of a data-distribution framework for data parallel high-performance applications. Distributions are a means to express locality in systems composed of large numbers of processor and memory components connected by a network. Since distributions have a great effect on the performance of applications, it is important that the distribution strategy is flexible, so its behavior can change depending on the needs of the application. At the same time, high productivity concerns require that the user be shielded from error-prone, tedious details such as communication and synchronization.
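
    As one concrete illustration of a regular distribution of the kind described (a sketch only, not the Chapel framework's actual operators), the snippet below maps a global index in a one-dimensional block distribution to its owning processor and local index.

    # Minimal sketch of a 1-D block distribution: each of p processors owns a
    # contiguous slice of the global index range [0, n).
    def block_owner(i: int, n: int, p: int) -> tuple[int, int]:
        """Return (processor, local_index) owning global index i."""
        base, extra = divmod(n, p)   # the first `extra` processors get one extra element
        for proc in range(p):
            size = base + (1 if proc < extra else 0)
            start = proc * base + min(proc, extra)
            if start <= i < start + size:
                return proc, i - start
        raise IndexError(i)

    # Distribute 10 elements over 3 processors: blocks of size 4, 3, 3.
    print([block_owner(i, 10, 3) for i in range(10)])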

  14. Postreinforcement Pause in Grocery Shopping: Comparing Interpurchase Times across Products and Consumers

    ERIC Educational Resources Information Center

    Oliveira-Castro, Jorge M.; James, Victoria K.; Foxall, Gordon R.

    2007-01-01

    Purchase probability as a function of interpurchase time was examined through comparison of findings from laboratory experiments on reinforcement schedules and from marketing investigations of consumers' interpurchase time. Panel data, based on a sample of 80 consumers who purchased nine supermarket food products during 16 weeks, were used. For…

  15. Measuring, managing and maximizing performance of mineral processing plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bascur, O.A.; Kennedy, J.P.

    1995-12-31

    The implementation of continuous quality improvement is the confluence of Total Quality Management, People Empowerment, Performance Indicators and Information Engineering. The supporting information technologies allow a mineral processor to narrow the gap between management business objectives and the process control level. One of the most important contributors is the user friendliness and flexibility of the personal computer in a client/server environment. This synergistic combination, when used for real-time performance monitoring, translates into production cost savings, improved communications and enhanced decision support. Other savings come from reduced time to collect data and perform tedious calculations, the ability to act quickly on fresh new data, and the ability to generate and validate data to be used by others. This paper presents an integrated view of plant management. The selection of the proper tools for continuous quality improvement is described. The process of selecting critical performance monitoring indices for improved plant performance is discussed. The importance of well-balanced technological improvement, personnel empowerment, total quality management and organizational assets is stressed.

  16. Exploring the Notion of Context in Medical Data.

    PubMed

    Mylonas, Phivos

    2017-01-01

    Scientific and technological knowledge and skills are becoming crucial for most data analysis activities. Two rather distinct, but at the same time collaborating, domains are the ones of computer science and medicine; the former offers significant aid towards a more efficient understanding of the latter's research trends. Still, the process of meaningfully analyzing and understanding medical information and data is a tedious one, bound to several challenges. One of them is the efficient utilization of contextual information in the process leading to optimized, context-aware data analysis results. Nowadays, researchers are provided with tools and opportunities to analytically study medical data, but at the same time significant and rather complex computational challenges are yet to be tackled, among others due to the humanistic nature and increased rate of new content and information production imposed by related hardware and applications. So, the ultimate goal of this position paper is to provide interested parties an overview of major contextual information types to be identified within the medical data processing framework.

  17. Spaceport Command and Control System Automated Verification Software Development

    NASA Technical Reports Server (NTRS)

    Backus, Michael W.

    2017-01-01

    For as long as we have walked the Earth, humans have always been explorers. We have visited our nearest celestial body and sent Voyager 1 beyond our solar system and out into interstellar space. Now it is finally time for us to step beyond our home and onto another planet. The Spaceport Command and Control System (SCCS) is being developed along with the Space Launch System (SLS) to take us on a journey further than ever attempted. Within SCCS are separate subsystems and system-level software, each of which has to be tested and verified. Testing is a long and tedious process, so automating it is much more efficient and also helps to remove the possibility of human error from mission operations. I was part of a team of interns and full-time engineers who automated tests for the requirements on SCCS, and with that was able to help verify that the software systems are performing as expected.

  18. The evolution of ecosystem ascendency in a complex systems based model.

    PubMed

    Brinck, Katharina; Jensen, Henrik Jeldtoft

    2017-09-07

    General patterns in ecosystem development can shed light on the driving forces behind ecosystem formation and recovery and have long been of interest. In recent years, the need for integrative and process-oriented approaches to capture ecosystem growth, development and organisation, as well as the scope of information theory as a descriptive tool, has been addressed from various sides. However, data collection on ecological network flows is difficult and tedious, and comprehensive models are lacking. We use a hierarchical version of the Tangled Nature Model of evolutionary ecology to study the relationship between structure, flow and organisation in model ecosystems, their development over evolutionary time scales and their relation to ecosystem stability. Our findings support the validity of ecosystem ascendency as a meaningful measure of ecosystem organisation, which increases over evolutionary time scales and drops significantly during periods of disturbance. The results suggest a general trend towards both higher integrity and increased stability driven by functional and structural ecosystem coadaptation. Copyright © 2017 Elsevier Ltd. All rights reserved.
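
    For reference, ascendency in Ulanowicz's information-theoretic sense can be computed from a matrix of inter-compartment flows as sketched below; whether the hierarchical Tangled Nature Model uses exactly this formulation is an assumption, since the abstract does not say.

    # Ascendency A = sum_ij T_ij * log2(T_ij * T.. / (T_i. * T_.j)) for a flow matrix T.
    import numpy as np

    def ascendency(T: np.ndarray) -> float:
        """T[i, j] = flow from compartment i to compartment j."""
        total = T.sum()
        row = T.sum(axis=1, keepdims=True)   # T_i.
        col = T.sum(axis=0, keepdims=True)   # T_.j
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = T * np.log2(T * total / (row * col))
        return float(np.nansum(terms))       # zero flows contribute nothing

    flows = np.array([[0.0, 5.0, 1.0],
                      [0.0, 0.0, 4.0],
                      [2.0, 0.0, 0.0]])
    print(f"ascendency = {ascendency(flows):.3f}")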

  19. An automatic multi-atlas prostate segmentation in MRI using a multiscale representation and a label fusion strategy

    NASA Astrophysics Data System (ADS)

    Álvarez, Charlens; Martínez, Fabio; Romero, Eduardo

    2015-01-01

    Pelvic magnetic resonance images (MRI) are used in prostate cancer radiotherapy (RT) as part of radiation planning. Modern protocols require a manual delineation, a tedious and variable activity that may take about 20 minutes per patient, even for trained experts. That considerable time is an important workflow burden in most radiology services. Automatic or semi-automatic methods might improve efficiency by decreasing measurement times while conserving the required accuracy. This work presents a fully automatic atlas-based segmentation strategy that selects the most similar templates for a new MRI using a robust multiscale SURF analysis. A new segmentation is then achieved by a linear combination of the selected templates, which are first non-rigidly registered to the new image. The proposed method shows reliable segmentations, obtaining an average Dice coefficient of 79% when compared with the expert manual segmentation under a leave-one-out scheme on the training database.
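
    Two pieces of such a pipeline are easy to sketch: majority-vote fusion of the registered atlas masks and the Dice coefficient used to score the result against the manual delineation. The SURF-based template selection and non-rigid registration steps are not reproduced here, and the arrays are random placeholders.

    # Minimal sketch of label fusion and Dice scoring (not the paper's full pipeline).
    import numpy as np

    def majority_vote(masks: np.ndarray) -> np.ndarray:
        """masks: (n_atlases, ...) binary arrays -> fused binary mask."""
        return (masks.mean(axis=0) >= 0.5).astype(np.uint8)

    def dice(a: np.ndarray, b: np.ndarray) -> float:
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    rng = np.random.default_rng(2)
    atlas_masks = rng.integers(0, 2, size=(5, 32, 32, 32), dtype=np.uint8)  # placeholders
    fused = majority_vote(atlas_masks)
    manual = rng.integers(0, 2, size=(32, 32, 32), dtype=np.uint8)          # placeholder
    print(f"Dice = {dice(fused, manual):.2f}")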

  20. Evaluation of Adaptive Subdivision Method on Mobile Device

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Isa, Siti Aida Mohd; Rehman, Amjad; Saba, Tanzila

    2013-06-01

    Recently, there have been significant improvements in the capabilities of mobile devices, but rendering large 3D objects is still tedious because of the devices' resource constraints. To reduce storage requirements, the 3D object is simplified, but certain areas of curvature are compromised and the surface is not smooth. Therefore a method to smooth selected areas of curvature is implemented. One popular method is adaptive subdivision. Experiments are performed using two datasets, with results based on processing time, rendering speed, and the appearance of the object on the devices. The results show a drop in frame-rate performance due to the increase in the number of triangles with each level of iteration, while the processing time for generating the new mesh also increases significantly. Since the screen sizes of the devices differ, the surface on the iPhone appears to have more triangles and to be more compact than the surface displayed on the iPad.

  1. Elimination sequence optimization for SPAR

    NASA Technical Reports Server (NTRS)

    Hogan, Harry A.

    1986-01-01

    SPAR is a large-scale computer program for finite element structural analysis. The program allows user specification of the order in which the joints of a structure are to be eliminated, since this order can have significant influence on solution performance, in terms of both storage requirements and computer time. An efficient elimination sequence can improve performance by over 50% for some problems. Obtaining such sequences, however, requires the expertise of an experienced user and can take hours of tedious effort to effect. Thus, an automatic elimination sequence optimizer would enhance productivity by reducing the analysts' problem definition time and by lowering computer costs. Two possible methods for automating the elimination sequence specification were examined. Several algorithms based on graph-theory representations of sparse matrices were studied, with mixed results. Significant improvement in program performance was achieved, but sequencing by an experienced user still yields substantially better results. The initial results provide encouraging evidence that the potential benefits of such an automatic sequencer would be well worth the effort.
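
    As one illustration of the graph-based reordering ideas mentioned (not necessarily an algorithm studied in the report), the sketch below applies SciPy's reverse Cuthill-McKee permutation to a random sparse symmetric pattern and compares the matrix bandwidth before and after, bandwidth being a rough proxy for elimination cost.

    # Reverse Cuthill-McKee reordering of a random sparse symmetric pattern.
    import numpy as np
    from scipy.sparse import random as sparse_random
    from scipy.sparse.csgraph import reverse_cuthill_mckee

    def bandwidth(a) -> int:
        rows, cols = a.nonzero()
        return int(np.abs(rows - cols).max())

    a = sparse_random(200, 200, density=0.02, random_state=3, format="csr")
    a = (a + a.T).tocsr()                 # symmetrize the sparsity pattern

    perm = reverse_cuthill_mckee(a, symmetric_mode=True)
    reordered = a[perm][:, perm]
    print(f"bandwidth before: {bandwidth(a)}, after RCM: {bandwidth(reordered)}")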

  2. Zonal and tesseral harmonic coefficients for the geopotential function, from zero to 18th order

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, J. C.

    1976-01-01

    Zonal and tesseral harmonic coefficients for the geopotential function are usually tabulated in normalized form to provide immediate information as to the relative significance of the coefficients in the gravity model. The normalized form of the geopotential coefficients cannot be used for computational purposes unless the gravity model has been modified to receive them. This modification is usually not done because the absolute or unnormalized form of the coefficients can be obtained from the simple mathematical relationship that relates the two forms. This computation can be quite tedious for hand calculation, especially for the higher order terms, and can be costly in terms of storage and execution time for machine computation. In this report, zonal and tesseral harmonic coefficients for the geopotential function are tabulated in absolute or unnormalized form. The report is designed to be used as a ready reference for both hand and machine calculation to save the user time and effort.
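
    The relationship referred to above, stated here in the convention most commonly used for fully normalized geopotential coefficients (conventions differ slightly between gravity models, so this is the usual form rather than necessarily the report's exact one), is

    C_{lm} = \bar{C}_{lm} \sqrt{\frac{(2 - \delta_{0m})\,(2l + 1)\,(l - m)!}{(l + m)!}}, \qquad S_{lm} = \bar{S}_{lm} \sqrt{\frac{(2 - \delta_{0m})\,(2l + 1)\,(l - m)!}{(l + m)!}}

    where the barred quantities are the normalized coefficients and \delta_{0m} equals 1 for m = 0 and 0 otherwise.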

  3. A new rapid method for isolating nucleoli.

    PubMed

    Li, Zhou Fang; Lam, Yun Wah

    2015-01-01

    The nucleolus was one of the first subcellular organelles to be isolated from the cell. The advent of modern proteomic techniques has resulted in the identification of thousands of proteins in this organelle, and live cell imaging technology has allowed the study of the dynamics of these proteins. However, the limitations of current nucleolar isolation methods hinder the further exploration of this structure. In particular, these methods require the use of a large number of cells and tedious procedures. In this chapter we describe a new and improved nucleolar isolation method for cultured adherent cells. In this method cells are snap-frozen before direct sonication and centrifugation onto a sucrose cushion. The nucleoli can be obtained within a time as short as 20 min, and the high yield allows the use of less starting material. As a result, this method can capture rapid biochemical changes in nucleoli by freezing the cells at a precise time, hence faithfully reflecting the protein composition of nucleoli at the specified time point. This protocol will be useful for proteomic studies of dynamic events in the nucleolus and for better understanding of the biology of mammalian cells.

  4. PyMOL mControl: Manipulating Molecular Visualization with Mobile Devices

    ERIC Educational Resources Information Center

    Lam, Wendy W. T.; Siu, Shirley W. I.

    2017-01-01

    Viewing and manipulating three-dimensional (3D) structures in molecular graphics software are essential tasks for researchers and students to understand the functions of molecules. Currently, the way to manipulate a 3D molecular object is mainly based on mouse-and-keyboard control that is usually difficult and tedious to learn. While gesture-based…

  5. CLOSED-LOOP STRIPPING ANALYSIS (CLSA) OF SYNTHETIC MUSK COMPOUNDS FROM FISH TISSUES WITH MEASUREMENT BY GAS CHROMATOGRAPHY-MASS SPECTROMETRY WITH SELECTED-ION MONITORING

    EPA Science Inventory

    Synthetic musk compounds have been found in surface water, fish tissues, and human breast milk. Current techniques for separating these compounds from fish tissues require tedious sample clean-up procedures. A simple method for the determination of these compounds in fish tissues ...

  6. Your Institution in a Global Economy

    ERIC Educational Resources Information Center

    Freund, William

    2009-01-01

    In this article, the author offers his reflections on the American economy and its "slow, gradual, and tedious" recovery. What the American people are experiencing now is not one of the ordinary recessions that have been experienced since World War II. What they have seen is a bursting of a bubble in the credit markets and in financial…

  7. A simple device for dehairing insect egg masses

    Treesearch

    Benjamin J. Cosenza; Edwin A. Boger; Normand R. Dubois; Franklin B. Lewis

    1963-01-01

    The egg masses of some lepidopterous insects are covered by a mat of hairs that for some research purposes must be removed. Doing this by hand is tedious. Besides, the hairs on the egg masses of certain insects such as the gypsy moth (Porthetria dispar [L.] and the browntail moth Nygmia phaeorrhoea [Donov.]) can cause severe...

  8. Temperature-indicating Paints

    NASA Technical Reports Server (NTRS)

    Penzig, F

    1939-01-01

    This report is an attempt at a new method of coating the surface of the cylinder with materials that undergo chemical change at definite temperatures as indicated by a change in color. In this way it was hoped that the substance itself would indicate directly the position of its isotherms, which in measurements with thermocouples requires a tedious amount of labor.

  9. Development and Evaluation of an Analytical Method for the Determination of Total Atmospheric Mercury. Final Report.

    ERIC Educational Resources Information Center

    Chase, D. L.; And Others

    Total mercury in ambient air can be collected in iodine monochloride, but the subsequent analysis is relatively complex and tedious, and contamination from reagents and containers is a problem. A silver wool collector, preceded by a catalytic pyrolysis furnace, gives good recovery of mercury and simplifies the analytical step. An instrumental…

  10. Biomass Determination Using Wood Specific Gravity from Increment Cores

    Treesearch

    Michael C. Wiemann; G. Bruce Williamson

    2013-01-01

    Wood specific gravity (SG) is one of the most important variables used to determine biomass. Measurement of SG is problematic because it requires tedious, and often difficult, sampling of wood from standing trees. Sampling is complicated because the SG usually varies nonrandomly within trees, resulting in systematic errors. Off-center pith and hollow or decayed stems...

  11. How To Proofread and Edit Your Writing: A Guide for Student Writers.

    ERIC Educational Resources Information Center

    Morgan, M.C.

    Proofreading can be tedious and boring, especially if it is approached as correcting errors. But proofreading is not correcting errors so much as reviewing the paper for ideas and for readability. Sometimes classmates can help a student proofread--they can help assess the draft, propose some alternative solutions, and make some choices. This paper…

  12. How Is the Relationship between Entrepreneurship Potential and Student Personality in the Implementation of Science and Technology for Entrepreneurship in Higher Education?

    ERIC Educational Resources Information Center

    Yuliana

    2017-01-01

    Entrepreneurship learning and training are tedious, and educated unemployment among higher education graduates is rising. For this reason, an alternative solution is pursued by higher education institutions through the Science and Technology Program for Entrepreneurship. This research aims to explore and describe the correlation analysis between entrepreneurial potential…

  13. Concert Programming and Performing as a Model for Lesson Planning and Teaching

    ERIC Educational Resources Information Center

    Branscome, Eric E.

    2014-01-01

    For many novice music teachers, creating and implementing effective music lessons can be a tedious process. Moreover, preparing a music lesson is quite different from lesson planning in other areas, creating a disconnect that music educators may feel when trying to make music lessons fit a classroom lesson-plan model. However, most music teachers…

  14. Teaching Reform of Course Group Regarding Theory and Design of Mechanisms Based on MATLAB Technology

    ERIC Educational Resources Information Center

    Shen, Yi; Yuan, Mingxin; Wang, Mingqiang

    2013-01-01

    Considering that the course group regarding theory and design of mechanisms is characterized by strong engineering application background and the students generally feel very boring and tedious during the learning process, some teaching reforms for the theory and design of mechanisms are carried out to improve the teaching effectiveness in this…

  15. Move, Stop, Learn: Illustrating Mitosis through Stop-Motion Animation

    ERIC Educational Resources Information Center

    Kamp, Brandi L.; Deaton, Cynthia C. M.

    2013-01-01

    Learning about microscopic things, such as cells, can often be mundane to students because they are not able to see or manipulate what they are learning about. Students often recall learning about cell division through memorization--thus they find it tedious and dull. Few opportunities exist that allow students to explore and manipulate cells or…

  16. "Long, Boring, and Tedious": Youths' Experiences with Complex, Religious Texts

    ERIC Educational Resources Information Center

    Rackley, Eric D.; Kwok, Michelle

    2016-01-01

    Growing out of the renewed attention to text complexity in the United States and the large population of youth who are deeply committed to reading scripture, this study explores 16 Latter-day Saint and Methodist youths' experiences with complex, religious texts. The study took place in the Midwestern United States. Data consisted of an academic…

  17. Forest Service's Northern Research Station FIA launches 24-state study of forest regeneration

    Treesearch

    Will McWilliams; Shawn Lehman; Paul Roth; Jim. Westfall

    2012-01-01

    Inventory foresters often quake when asked to count tree seedlings, because the work is tedious and sometimes means tallying hundreds of stems. They also know that the density and quality of advance regeneration are key to the success of new stand establishment. Seedling counts provide valuable information on regeneration adequacy, forest diversity, wildlife habitat,...

  18. BAC Libraries from Wheat Chromosome 7D – Efficient Tool for Positional Cloning of Aphid Resistance Genes

    USDA-ARS?s Scientific Manuscript database

    Positional cloning in bread wheat is a tedious task due to its huge genome size (~17 Gbp) and polyploid character. BAC libraries represent an essential tool for positional cloning. However, wheat BAC libraries comprise more than a million clones, which makes their screening very laborious. Here we pres...

  19. Computer programs for optical dendrometer measurements of standing tree profiles

    Treesearch

    Jacob R. Beard; Thomas G. Matney; Emily B. Schultz

    2015-01-01

    Tree profile equations are effective volume predictors. Diameter data for building these equations are collected from felled trees using diameter tapes and calipers or from standing trees using optical dendrometers. Developing and implementing a profile function from the collected data is a tedious and error prone task. This study created a computer program, Profile...

  20. Musings on the State of the ILS in 2006

    ERIC Educational Resources Information Center

    Breeding, Marshall

    2006-01-01

    It is hard to imagine operating a library today without the assistance of an integrated library system (ILS). Without help from it, library work would be tedious, labor would be intensive, and patrons would be underserved in almost all respects. Given the importance of these automation systems, it is essential that they work well and deliver…
