Sample records for quantitative systems analysis

  1. Analysis of a document/reporting system

    NASA Technical Reports Server (NTRS)

    Narrow, B.

    1971-01-01

    An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. This is believed to be the first documented study to apply quantitative measures to full-scale system analysis. The quantitative measures, and the techniques for collecting and qualifying the basic data, are applicable to any information system; the report should therefore be of interest to anyone concerned with the management, design, analysis, or evaluation of information systems.

  2. Deficient Contractor Business Systems: Applying the Value at Risk (VaR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-30

    QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004) ... assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and ... Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC)

  3. Quantitative method of medication system interface evaluation.

    PubMed

    Pingenot, Alleene Anne; Shanteau, James; Pingenot, James D F

    2007-01-01

    The objective of this study was to develop a quantitative method of evaluating the user interface for medication system software. A detailed task analysis provided a description of user goals and essential activity. A structural fault analysis was used to develop a detailed description of the system interface. Nurses experienced with use of the system under evaluation provided estimates of failure rates for each point in this simplified fault tree. Means of the estimated failure rates provided quantitative data for fault analysis. The authors note that, although failures of steps in the program were frequent, participants reported numerous methods of working around these failures, so that overall system failure was rare. However, frequent process failure can affect the time required for processing medications, making a system inefficient. This method of interface analysis, called the Software Efficiency Evaluation and Fault Identification Method, provides quantitative information with which prototypes can be compared and problems within an interface identified.

  4. Farber's Disease

    MedlinePlus

    ... of these disorders. Additional studies will emphasize the quantitative analysis of the central nervous system structure and ...

  5. Deficient Contractor Business Systems: Applying the Value at Risk (VAR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-01

    measuring numerical risk to the government (Galway, 2004). However, quantitative risk analysis is rarely utilized in DoD acquisition programs because the ... quantitative assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost ... Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review

  6. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
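    The effect of imperfect coverage on reliability can be illustrated with a toy model (not the paper's digraph/coverage model): a duplex system whose first fault is tolerated only with probability c, the coverage. A minimal Python sketch, assuming constant failure rates and independent failures:

```python
import math

def duplex_unreliability(lam, c, t):
    """Unreliability at time t of a duplex system with per-unit failure
    rate lam and fault coverage c (probability a fault is handled).

    Illustrative model only: the first fault is tolerated only if
    covered; after a covered first fault the survivor runs alone.
    """
    p_no_fault = math.exp(-2 * lam * t)
    # First fault at time s (density 2*lam*exp(-2*lam*s)), covered with
    # probability c, survivor lasts the remaining t - s; the integral
    # evaluates in closed form to:
    p_one_covered = 2 * c * (math.exp(-lam * t) - math.exp(-2 * lam * t))
    return 1.0 - (p_no_fault + p_one_covered)
```

    Even in this crude model, raising c noticeably lowers the unreliability, which mirrors the paper's conclusion that coverage has a significant effect and should not be omitted from reliability analysis.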

  7. Quantitation of absorbed or deposited materials on a substrate that measures energy deposition

    DOEpatents

    Grant, Patrick G.; Bakajin, Olgica; Vogel, John S.; Bench, Graham

    2005-01-18

    This invention provides a system and method for measuring an energy differential that correlates to a quantitative measurement of the mass of an applied localized material. Such a system and method remains compatible with other methods of analysis, such as, for example, quantitating the elemental or isotopic content, identifying the material, or using the material in biochemical analysis.

  8. Clinical application of a light-pen computer system for quantitative angiography

    NASA Technical Reports Server (NTRS)

    Alderman, E. L.

    1975-01-01

    The paper describes an angiographic analysis system which uses a video disk for recording and playback, a light-pen for data input, minicomputer processing, and an electrostatic printer/plotter for hardcopy output. The method is applied to quantitative analysis of ventricular volumes, sequential ventriculography for assessment of physiologic and pharmacologic interventions, analysis of instantaneous time sequence of ventricular systolic and diastolic events, and quantitation of segmental abnormalities. The system is shown to provide the capability for computation of ventricular volumes and other measurements from operator-defined margins by greatly reducing the tedium and errors associated with manual planimetry.

  9. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    NASA Astrophysics Data System (ADS)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system based on the precautionary principle is needed for workplaces in the nanomaterial industry. One of the problems in such a risk management system is the difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed, with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in the workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is especially suitable for quantitative exposure measurement at facilities with high levels of background nanoparticles.

  10. Quantitative X-ray diffraction and fluorescence analysis of paint pigment systems : final report.

    DOT National Transportation Integrated Search

    1978-01-01

    This study attempted to correlate measured X-ray intensities with concentrations of each member of paint pigment systems, thereby establishing calibration curves for the quantitative analyses of such systems.

  11. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  12. An experimental design for quantification of cardiovascular responses to music stimuli in humans.

    PubMed

    Chang, S-H; Luo, C-H; Yeh, T-L

    2004-01-01

    Several studies have examined the relationship between music and human physiological or psychological responses. However, some cardiovascular index factors have not been explored quantitatively because of the qualitative nature of acoustic stimuli. This study proposes and demonstrates an experimental design for the quantification of cardiovascular responses to music stimuli in humans. The system comprises two components: a unit for generating and monitoring quantitative acoustic stimuli, and a portable autonomic nervous system (ANS) analysis unit for quantitative recording and analysis of the cardiovascular responses. The experimental results indicate that the proposed system achieves full control and measurement of the music stimuli, and effectively supports many quantitative indices of cardiovascular response in humans. The analysis results are also discussed in view of future clinical research.

  13. Gas chromatograph-mass spectrometer (GC/MS) system for quantitative analysis of reactive chemical compounds

    DOEpatents

    Grindstaff, Quirinus G.

    1992-01-01

    Described is a new gas chromatograph-mass spectrometer (GC/MS) system and method for quantitative analysis of reactive chemical compounds. All components of such a GC/MS system external to the oven of the gas chromatograph are programmably temperature controlled to operate at a volatilization temperature specific to the compound(s) sought to be separated and measured.

  14. Quantitative analysis of crystalline pharmaceuticals in powders and tablets by a pattern-fitting procedure using X-ray powder diffraction data.

    PubMed

    Yamamura, S; Momose, Y

    2001-01-16

    A pattern-fitting procedure for quantitative analysis of crystalline pharmaceuticals in solid dosage forms using X-ray powder diffraction data is described. This method is based on a procedure for pattern-fitting in crystal structure refinement: observed X-ray scattering intensities were fitted to analytical expressions including several fitting parameters, i.e., scale factor, peak positions, peak widths, and degree of preferred orientation of the crystallites. All fitting parameters were optimized by a non-linear least-squares procedure. The weight fraction of each component was then determined from the optimized scale factors. In the present study, well-crystallized binary systems, zinc oxide-zinc sulfide (ZnO-ZnS) and salicylic acid-benzoic acid (SA-BA), were used as the samples. In analysis of the ZnO-ZnS system, the weight fraction of ZnO or ZnS could be determined quantitatively in the range of 5-95% for both powders and tablets. In analysis of the SA-BA system, the weight fraction of SA or BA could be determined quantitatively in the range of 20-80% for both powders and tablets. Quantitative analysis applying this pattern-fitting procedure showed better reproducibility than other X-ray methods based on the linear or integral intensities of particular diffraction peaks. Analysis using this pattern-fitting procedure also has the advantage that the preferred orientation of the crystallites in solid dosage forms can be determined in the course of the quantitative analysis.
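    The final step of such a procedure, converting fitted scale factors into weight fractions, can be illustrated with a deliberately simplified linear least-squares sketch. The real method also refines peak positions, widths, and preferred orientation; here the reference patterns are assumed fixed, on a common angular grid, and normalized per unit weight:

```python
import numpy as np

def weight_fractions(observed, components):
    """Fit an observed diffraction pattern as a linear combination of
    pure-component reference patterns (one column per component) and
    convert the fitted scale factors into weight fractions.

    Simplified sketch: assumes reference patterns are measured on the
    same angular grid and normalized per unit weight of material.
    """
    A = np.asarray(components, dtype=float)
    y = np.asarray(observed, dtype=float)
    scales, *_ = np.linalg.lstsq(A, y, rcond=None)
    scales = np.clip(scales, 0.0, None)  # crude non-negativity guard
    return scales / scales.sum()
```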

  15. Quantitative capillary electrophoresis and its application in analysis of alkaloids in tea, coffee, coca cola, and theophylline tablets.

    PubMed

    Li, Mengjia; Zhou, Junyi; Gu, Xue; Wang, Yan; Huang, Xiaojing; Yan, Chao

    2009-01-01

    A quantitative CE (qCE) system with high precision has been developed, in which a 4-port nano-valve was isolated from the electric field and served as the sample injector. An accurately metered amount of sample was introduced into the CE system with high reproducibility. Based on this system, consecutive injections and separations were performed without voltage interruption. Reproducibilities in terms of RSD lower than 0.8% for retention time and 1.7% for peak area were achieved. The effectiveness of the system was demonstrated by the quantitative analysis of caffeine, theobromine, and theophylline in real samples, such as tea leaf, roasted coffee, coca cola, and theophylline tablets.
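    The reproducibility metric quoted above, relative standard deviation, is simply 100 × (sample standard deviation / mean); a one-function sketch:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation in percent: 100 * s / mean, the
    figure of merit used for retention-time and peak-area
    reproducibility (sample standard deviation, n - 1 denominator)."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```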

  16. Conflagration Analysis System II: Bibliography.

    DTIC Science & Technology

    1985-04-01

    Therefore, it is important to examine both the reinforcement and the supplemental considerations for the quantitative methods for conflagration ... and the meaningful quantitative factors for conflagration analysis are determined, the relevant literature will be brought into the mainstream of the ... quantitative methods. Fire Development in Multiple Structures: From a purely analytical view, the research identified in the literature on fire development in ...

  17. Quantitative analysis of major dibenzocyclooctane lignans in Schisandrae fructus by online TLC-DART-MS.

    PubMed

    Kim, Hye Jin; Oh, Myung Sook; Hong, Jongki; Jang, Young Pyo

    2011-01-01

    Direct analysis in real time (DART) is a powerful ionising technique for the quick and easy detection of various organic molecules without any sample preparation steps, but its lack of quantitation capacity limits its extensive use in the field of phytochemical analysis. The objective was to devise a new system that utilizes DART-MS as a hyphenated detector for quantitation. A total extract of Schisandra chinensis fruit was analyzed on a TLC plate, and three major lignan compounds were quantitated by three different methods, UV densitometry, TLC-DART-MS, and HPLC-UV, to compare the efficiency of each method. To introduce the TLC plate into the DART ion source at a constant velocity, a syringe pump was employed. The DART-MS total ion current chromatogram was recorded for the entire TLC plate. The concentration of each lignan compound was calculated from a calibration curve established with the standard compound. Gomisin A, gomisin N and schisandrin were well separated on a silica-coated TLC plate, and the specific ion current chromatograms were successfully acquired from the TLC-DART-MS system. The TLC-DART-MS system for the quantitation of natural products showed better linearity and specificity than TLC densitometry, and consumed less time and solvent than the conventional HPLC method. A hyphenated system for the quantitation of phytochemicals from crude herbal drugs was successfully established. This system was shown to have a powerful analytical capacity for the prompt and efficient quantitation of natural products from crude drugs. Copyright © 2010 John Wiley & Sons, Ltd.

  18. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. PMID:25864377

  19. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
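    QRAS's internals are not spelled out in this abstract, but the core quantification step of any such tool, propagating basic-event probabilities through the AND/OR gates of a fault tree under an independence assumption, can be sketched as follows (an illustrative simplification, not QRAS itself):

```python
def gate_prob(node):
    """Evaluate a miniature fault tree given as nested tuples.

    Leaves are basic-event failure probabilities; internal nodes are
    ('AND', child, ...) or ('OR', child, ...).  Basic events are
    assumed independent -- a common simplification.
    """
    if isinstance(node, (int, float)):
        return float(node)
    kind, *children = node
    probs = [gate_prob(c) for c in children]
    if kind == 'AND':                  # all inputs must fail
        result = 1.0
        for p in probs:
            result *= p
        return result
    if kind == 'OR':                   # any input failing suffices
        result = 1.0
        for p in probs:
            result *= (1.0 - p)
        return 1.0 - result
    raise ValueError(f"unknown gate {kind!r}")
```

    Sensitivity analysis of the kind the abstract describes then amounts to re-evaluating the tree after perturbing individual leaf probabilities.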

  20. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  1. Using multiple PCR and CE with chemiluminescence detection for simultaneous qualitative and quantitative analysis of genetically modified organism.

    PubMed

    Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan

    2008-09-01

    In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system has been demonstrated for qualitative and quantitative analysis of genetically modified organisms: the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, function and two reference genes of RRS were amplified with multiplex PCR simultaneously. After that, the multiplex PCR products were labeled with acridinium ester at the 5'-terminal through an amino modification and then analyzed by the proposed CE-CL system. Reproducibilities of analysis times and peak heights for the CE-CL analysis were determined to be better than 0.91% and 3.07% (RSD, n=15), respectively, over three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and the simulated samples. The evaluation in terms of quantitative analysis of RRS provided by this new method was confirmed by comparing our assay results with those of standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dye. The results showed good coherence between the two methods. This approach demonstrates the possibility of accurate qualitative and quantitative detection of GM plants in a single run.

  2. On the analysis of complex biological supply chains: From Process Systems Engineering to Quantitative Systems Pharmacology.

    PubMed

    Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P

    2017-12-05

    The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.

  3. Quantitative analysis of eyes and other optical systems in linear optics.

    PubMed

    Harris, William F; Evans, Tanya; van Gool, Radboud D

    2017-05-01

    To show that 14-dimensional spaces of augmented point P and angle Q characteristics, matrices obtained from the ray transference, are suitable for quantitative analysis, although only the latter define an inner-product space and only on it can one define distances and angles. The paper examines the nature of the spaces and their relationships to other spaces, including symmetric dioptric power space. The paper makes use of linear optics, a three-dimensional generalization of Gaussian optics. Symmetric 2 × 2 dioptric power matrices F define a three-dimensional inner-product space which provides a sound basis for quantitative analysis (calculation of changes, arithmetic means, etc.) of refractive errors and thin systems. For general systems the optical character is defined by the dimensionally heterogeneous 4 × 4 symplectic matrix S, the transference, or, if explicit allowance is made for heterocentricity, the 5 × 5 augmented symplectic matrix T. Ordinary quantitative analysis cannot be performed on them because matrices of neither of these types constitute vector spaces. Suitable transformations have been proposed, but because the transforms are dimensionally heterogeneous, the spaces are not naturally inner-product spaces. The paper obtains 14-dimensional spaces of augmented point P and angle Q characteristics. The 14-dimensional space defined by the augmented angle characteristics Q is dimensionally homogeneous and an inner-product space. A 10-dimensional subspace of the space of augmented point characteristics P is also an inner-product space. The spaces are suitable for quantitative analysis of the optical character of eyes and many other systems. Distances and angles can be defined in the inner-product spaces. The optical systems may have multiple separated astigmatic and decentred refracting elements. © 2017 The Authors Ophthalmic & Physiological Optics © 2017 The College of Optometrists.

  4. Analysis of airborne MAIS imaging spectrometric data for mineral exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Jinnian; Zheng Lanfen; Tong Qingxi

    1996-11-01

    The high spectral resolution imaging spectrometric system made quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for analysis of surface parameters from imaging spectrometer data. This paper describes the methods and stages of quantitative analysis: (1) extracting surface reflectance from the imaging spectrometer image, where laboratory and in-flight field measurements are conducted to calibrate the imaging spectrometer data, and atmospheric correction is applied to obtain ground reflectance using the empirical line method and radiative transfer modeling; (2) determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals; (3) spectral comparison between the spectra of a spectral library and the spectra derived from the imagery. A wavelet-analysis-based spectrum-matching technique for quantitative analysis of imaging spectrometer data has been developed. Airborne MAIS imaging spectrometer data were used for analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.

  5. Quantitative probe of the transition metal redox in battery electrodes through soft x-ray absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Qinghao; Qiao, Ruimin; Wray, L. Andrew; Chen, Jun; Zhuo, Zengqing; Chen, Yanxue; Yan, Shishen; Pan, Feng; Hussain, Zahid; Yang, Wanli

    2016-10-01

    Most battery positive electrodes operate with a 3d transition-metal (TM) reaction centre. A direct and quantitative probe of the TM states upon electrochemical cycling is valuable for understanding the detailed cycling mechanism and charge diffusion in the electrodes, which relates to many practical parameters of a battery. This review includes a comprehensive summary of our recent demonstrations of five different types of quantitative analysis of the TM states in battery electrodes based on soft x-ray absorption spectroscopy and multiplet calculations. In LiFePO4, a system of the well-known two-phase transformation type, the TM redox could be strictly determined through a simple linear combination of the two end-members. In Mn-based compounds, the Mn states could also be quantitatively evaluated, but a set of reference spectra with all three possible Mn valences needs to be deliberately selected and considered in the fitting. Although the fluorescence signals suffer from self-absorption distortion, the multiplet calculations could account for the distortion effect, which allows a quantitative determination of the overall Ni oxidation state in the bulk. With the aid of multiplet calculations, one could also achieve a quasi-quantitative analysis of the Co redox evolution in LiCoO2 based on the energy position of the spectroscopic peak. The benefit of multiplet calculations is more important for studying electrode materials with TMs of mixed spin states, as exemplified by the quantitative analysis of the mixed-spin Na2-x Fe2(CN)6 system. Finally, we showcase that such quantitative analysis could provide valuable information for optimizing the electrochemical performance of Na0.44MnO2 electrodes for Na-ion batteries. The methodology summarized in this review could be extended to other energy application systems with a TM redox centre, for example, fuel cell and catalytic materials.
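    For a two-phase system such as LiFePO4, the linear-combination analysis reduces to a one-parameter least-squares fit with a closed form. A pure-Python sketch, assuming the measured spectrum and the two end-member reference spectra share the same energy grid and normalization:

```python
def end_member_fraction(spectrum, end_a, end_b):
    """Least-squares fraction x of end-member A, modelling the measured
    spectrum as x*A + (1-x)*B for two reference spectra A and B on the
    same energy grid.  Closed-form solution, clipped to [0, 1].

    For LiFePO4 cycling, A and B would be spectra of the fully
    lithiated and fully delithiated end members (an assumed pairing
    for illustration).
    """
    d = [a - b for a, b in zip(end_a, end_b)]       # A - B
    r = [y - b for y, b in zip(spectrum, end_b)]    # spectrum - B
    x = sum(ri * di for ri, di in zip(r, d)) / sum(di * di for di in d)
    return min(1.0, max(0.0, x))
```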

  6. [The development of a computer model in the quantitative assessment of thallium-201 myocardial scintigraphy].

    PubMed

    Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A

    1993-05-01

    Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by visual analysis or by quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Because of the large amount of data, analysis, interpretation, and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data. We used a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusion; and comparison of different scintigrams for the same patient. Our software comprises four sections: numeric analysis, descriptive analysis, automatic conclusion, and clinical remarks. We introduced into the computer system appropriate information, "logical paths," that use "IF ... THEN" rules. The software executes these rules to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), considering our uptake cutoff, and finally obtains the automatic conclusions. For these reasons, our computer-based system could be considered a real "expert system".

  7. Quantitative analysis of the anti-noise performance of an m-sequence in an electromagnetic method

    NASA Astrophysics Data System (ADS)

    Yuan, Zhe; Zhang, Yiming; Zheng, Qijia

    2018-02-01

    An electromagnetic method with a transmitted waveform coded by an m-sequence achieved better anti-noise performance compared to the conventional manner with a square-wave. The anti-noise performance of the m-sequence varied with multiple coding parameters; hence, a quantitative analysis of the anti-noise performance for m-sequences with different coding parameters was required to optimize them. This paper proposes the concept of an identification system, with the identified Earth impulse response obtained by measuring the system output with the input of the voltage response. A quantitative analysis of the anti-noise performance of the m-sequence was achieved by analyzing the amplitude-frequency response of the corresponding identification system. The effects of the coding parameters on the anti-noise performance are summarized by numerical simulation, and their optimization is further discussed in our conclusions; the validity of the conclusions is further verified by field experiment. The quantitative analysis method proposed in this paper provides a new insight into the anti-noise mechanism of the m-sequence, and could be used to evaluate the anti-noise performance of artificial sources in other time-domain exploration methods, such as the seismic method.
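    The anti-noise benefit of an m-sequence stems from its nearly ideal periodic autocorrelation (N at zero lag, -1 at every other lag), which keeps the correlation-based identification well conditioned. A small LFSR generator in ±1 form (the tap positions below are an illustrative choice, not taken from the paper):

```python
def m_sequence(taps, length):
    """Generate an m-sequence from a Fibonacci LFSR in +/-1 form.

    `taps` are 1-indexed feedback tap positions; (3, 2) corresponds to
    the primitive polynomial x^3 + x^2 + 1 and yields a period-7
    maximal-length sequence.  Register starts in the all-ones state.
    """
    n = max(taps)
    state = [1] * n
    out = []
    for _ in range(length):
        out.append(1 if state[-1] else -1)   # emit the last stage
        fb = 0
        for t in taps:
            fb ^= state[t - 1]               # XOR of tapped stages
        state = [fb] + state[:-1]            # shift, insert feedback
    return out
```

    For a period-7 sequence the circular autocorrelation is 7 at zero shift and -1 at all six other shifts, the two-valued property that underlies the anti-noise analysis.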

  8. Quantitative analysis of defects in silicon. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Smith, J. M.; Bruce, T.; Oidwai, H. A.

    1980-01-01

    One hundred and seventy-four silicon sheet samples were analyzed for twin boundary density, dislocation pit density, and grain boundary length. Procedures were developed for the quantitative analysis of the twin boundary and dislocation pit densities using a QTM-720 Quantitative Image Analyzing system. The QTM-720 system was upgraded with the addition of a PDP 11/03 minicomputer with dual floppy disc drive, a Digital Equipment DECwriter high-speed printer, and a field-image feature interface module. Three versions of a computer program that controls data acquisition and analysis on the QTM-720 were written. Procedures for chemical polishing and etching were also developed.

  9. An introduction to quantitative remote sensing. [data processing

    NASA Technical Reports Server (NTRS)

    Lindenlaub, J. C.; Russell, J.

    1974-01-01

    The quantitative approach to remote sensing is discussed along with the analysis of remote sensing data. Emphasis is placed on the application of pattern recognition in numerically oriented remote sensing systems. A common background and orientation for users of the LARS computer software system is provided.

  10. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.

  11. Evaluation of a web based informatics system with data mining tools for predicting outcomes with quantitative imaging features in stroke rehabilitation clinical trials

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent

    2017-03-01

    Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlations based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features, such as stroke lesion characteristics, from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location, and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.

  12. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and of process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparative analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
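The Monte Carlo approach to baselining and predicting an index score can be sketched as follows; the driver weights, means, and standard deviations below are hypothetical placeholders, not the structure of the actual ACSI model:

```python
import random

def simulate_index(weights, driver_means, driver_sds, n_trials=10000, seed=42):
    """Monte Carlo sketch: propagate uncertainty in satisfaction drivers
    into a weighted index score, yielding a baseline mean and a 90%
    interval.  Drivers are modeled as independent Gaussians clipped to
    the 0-100 scale (an assumption for illustration)."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_trials):
        s = sum(w * max(0.0, min(100.0, rng.gauss(m, sd)))
                for w, m, sd in zip(weights, driver_means, driver_sds))
        scores.append(s)
    scores.sort()
    mean = sum(scores) / n_trials
    lo, hi = scores[int(0.05 * n_trials)], scores[int(0.95 * n_trials)]
    return mean, (lo, hi)
```

A sensitivity analysis then follows naturally: perturb one driver's mean at a time and observe the shift in the simulated baseline.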

  13. Quantitative analysis of arm movement smoothness

    NASA Astrophysics Data System (ADS)

    Szczesna, Agnieszka; Błaszczyszyn, Monika

    2017-07-01

    The paper deals with the problem of quantitative smoothness analysis of motion data. We investigated values of movement units, fluidity, and jerk for the healthy and paralyzed arms of patients with hemiparesis after stroke while the patients performed a drinking task. To validate the approach, the movements of 24 patients were captured using an optical motion capture system.
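Jerk, one of the smoothness measures named above, is the third time derivative of position; lower mean squared jerk indicates smoother movement. A minimal finite-difference sketch (the sampling interval and differencing scheme are illustrative, not the paper's exact procedure):

```python
def mean_squared_jerk(positions, dt):
    """Estimate mean squared jerk from uniformly sampled 1-D positions.

    Uses the third-order forward difference
        p[i+3] - 3*p[i+2] + 3*p[i+1] - p[i]  ~  jerk * dt**3,
    so a perfectly constant-velocity trajectory scores (near) zero.
    """
    jerk = [(positions[i + 3] - 3 * positions[i + 2]
             + 3 * positions[i + 1] - positions[i]) / dt ** 3
            for i in range(len(positions) - 3)]
    return sum(j * j for j in jerk) / len(jerk)
```

In practice the metric is computed per axis from the motion-capture marker trajectories and often normalized by movement duration and amplitude to compare patients.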

  14. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  15. [Quantitative analysis of the structure of neuronal dendritic spines in the striatum using the Leitz-ASM system].

    PubMed

    Leontovich, T A; Zvegintseva, E G

    1985-10-01

    Two principal classes of long-axon neurons of the striatum (sparsely ramified reticular cells and densely ramified dendritic cells) were analyzed quantitatively in four animal species: hedgehog, rabbit, dog, and monkey. The cross-sectional area, total dendritic length, and the area of the dendritic field were measured using the "LEITZ-ASM" system. The two classes of neurons differed significantly in dogs and monkeys, while no differences were noted between hedgehog and rabbit. Reticular neurons varied much more across species than dendritic ones. The quantitative analysis revealed a progressive increase in the complexity of the dendritic tree in mammals from rabbit to monkey.

  16. Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.

    PubMed

    Summers, A E

    2000-01-01

    ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety-related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA that discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) simplified equations and (2) fault tree analysis.
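The "simplified equations" technique reduces fault-tree results to closed-form formulas for the average probability of failure on demand (PFDavg) per voting architecture. A sketch under common textbook assumptions (dangerous-undetected failure rate λDU, proof-test interval TI, common-cause and diagnostic effects neglected):

```python
def pfd_avg_1oo1(lambda_du, ti):
    """Simplified-equation PFDavg for a single (1oo1) channel:
    lambda_DU * TI / 2 (average unavailability between proof tests)."""
    return lambda_du * ti / 2

def pfd_avg_1oo2(lambda_du, ti):
    """Simplified-equation PFDavg for a redundant 1oo2 architecture:
    (lambda_DU * TI)^2 / 3.  Common-cause failures are ignored in this
    sketch, so it is optimistic for real systems."""
    return (lambda_du * ti) ** 2 / 3

def sil_from_pfd(pfd):
    """Map PFDavg to a SIL band per IEC 61508 low-demand mode.
    (Anything below 1e-4 is reported as SIL 4 here; the standard's
    lower bound of 1e-5 is not enforced in this sketch.)"""
    for sil, upper in ((4, 1e-4), (3, 1e-3), (2, 1e-2), (1, 1e-1)):
        if pfd < upper:
            return sil
    return 0
```

For example, a channel with λDU = 1e-6/h tested yearly (TI = 8760 h) gives a 1oo1 PFDavg of about 4.4e-3, i.e. SIL 2; the 1oo2 equation shows why redundancy (absent common cause) buys orders of magnitude.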

  17. Try Fault Tree Analysis, a Step-by-Step Way to Improve Organization Development.

    ERIC Educational Resources Information Center

    Spitzer, Dean

    1980-01-01

    Fault Tree Analysis, a systems safety engineering technology used to analyze organizational systems, is described. Explains the use of logic gates to represent the relationship between failure events, qualitative analysis, quantitative analysis, and effective use of Fault Tree Analysis. (CT)

  18. A fully battery-powered inexpensive spectrophotometric system for high-sensitivity point-of-care analysis on a microfluidic chip

    PubMed Central

    Dou, Maowei; Lopez, Juan; Rios, Misael; Garcia, Oscar; Xiao, Chuan; Eastman, Michael

    2016-01-01

    A cost-effective battery-powered spectrophotometric system (BASS) was developed for quantitative point-of-care (POC) analysis on a microfluidic chip. Using methylene blue as a model analyte, we first compared the performance of the BASS with that of a commercial spectrophotometric system, and then applied the BASS to loop-mediated isothermal amplification (LAMP) detection and subsequent quantitative nucleic acid analysis, which exhibited a limit of detection comparable to that of the NanoDrop. Compared to the commercial spectrophotometric system, our system is lower-cost, consumes fewer reagents, and has higher detection sensitivity. Most importantly, it does not rely on external power supplies. All these features make our spectrophotometric system highly suitable for a variety of POC analyses, such as field detection. PMID:27143408

  19. 76 FR 28819 - NUREG/CR-XXXX, Development of Quantitative Software Reliability Models for Digital Protection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-18

    ... NUCLEAR REGULATORY COMMISSION [NRC-2011-0109] NUREG/CR-XXXX, Development of Quantitative Software..., ``Development of Quantitative Software Reliability Models for Digital Protection Systems of Nuclear Power Plants... of Risk Analysis, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission...

  20. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After numerous kinetic studies, PE found a linear relation between the initial template number and the cycle number at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too great to satisfy the needs of biotechnology development and clinical research; a better quantitative PCR technique is needed. The mathematical model submitted here draws on the achievements of the relevant sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), the initial template number, and other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this relation: the accumulated PCR product quantity can be obtained from the initial template number. When this model is used for quantitative PCR analysis, the result error is related only to the accuracy of the fluorescence intensity measurement, i.e., to the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template size is between 100 and 1,000,000, the accuracy of the quantitative result will exceed 99%. The difference in result error is distinct when the same conditions and the same instrument are used with different analysis methods. Moreover, if the proposed quantitative PCR analysis system is used to process the data, the result will be about 80 times more accurate than with the CT method.
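For contrast, the CT method that the abstract criticizes rests on the simple exponential model N_c = N_0 (1 + E)^c, where E is the amplification efficiency (E = 1 for ideal doubling); inverting it at the threshold cycle Ct recovers the initial template number. A minimal sketch, with the threshold copy number as an illustrative placeholder:

```python
import math

def initial_copies_from_ct(ct, threshold_copies, efficiency=1.0):
    """Invert N_c = N0 * (1 + E)^c at the threshold cycle Ct to
    estimate the starting template number N0 (the classic CT method)."""
    return threshold_copies / (1.0 + efficiency) ** ct

def ct_from_initial_copies(n0, threshold_copies, efficiency=1.0):
    """Forward model: the (fractional) cycle at which N0 copies reach
    the detection threshold."""
    return math.log(threshold_copies / n0, 1.0 + efficiency)
```

The model's weakness, which the abstract's authors target, is that real efficiency is neither constant across cycles nor exactly known, so errors in E compound exponentially into the N0 estimate.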

  1. Dependence of quantitative accuracy of CT perfusion imaging on system parameters

    NASA Astrophysics Data System (ADS)

    Li, Ke; Chen, Guang-Hong

    2017-03-01

    Deconvolution is a popular method for calculating parametric perfusion parameters from four-dimensional CT perfusion (CTP) source images. During the deconvolution process, the four-dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and prior knowledge is often used to suppress the noise associated with the process. These additional complexities complicate the understanding of the deconvolution-based CTP imaging system and of how its quantitative accuracy depends on the parameters and sub-operations involved in the image formation process. Meanwhile, there has been a strong clinical need to answer this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly in emergent clinical situations (e.g., diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of parametric perfusion parameters to CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, the arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide the development of CTP imaging technology for better quantification accuracy and lower radiation dose.
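The interplay of regularization strength and noise described above can be illustrated with a standard Tikhonov-regularized deconvolution, one common way (not necessarily the authors' exact implementation) to recover the flow-scaled residue function from the tissue curve and the arterial input function (AIF):

```python
import numpy as np

def tikhonov_deconvolve(aif, tissue_curve, lam):
    """Regularized deconvolution sketch for CT perfusion.

    Solves  min ||A r - c||^2 + lam^2 ||r||^2  via SVD filter factors,
    where A is the lower-triangular convolution matrix built from the
    AIF and c is the tissue time-attenuation curve.  lam plays the role
    of the regularization strength discussed in the abstract: larger lam
    suppresses noise but biases the recovered residue function.
    """
    c = np.asarray(tissue_curve, dtype=float)
    a = np.asarray(aif, dtype=float)
    n = len(c)
    A = np.array([[a[i - j] if i >= j else 0.0 for j in range(n)]
                  for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    f = s / (s ** 2 + lam ** 2)       # Tikhonov filter factors
    return Vt.T @ (f * (U.T @ c))
```

With noise-free data and a tiny lam the true residue function is recovered almost exactly; sweeping lam against noisy data reproduces the accuracy-versus-regularization trade-off the paper formalizes.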

  2. Investment appraisal using quantitative risk analysis.

    PubMed

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
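The risk-adjusted net present value idea can be sketched numerically: the fire safety system's value is the discounted stream of expected annual losses it avoids, net of the investment. The figures below are illustrative, not from the case study:

```python
def risk_adjusted_npv(investment, annual_risk_reduction, discount_rate, years):
    """Sketch of the abstract's idea: the monetary value of a fire safety
    system is the present value of the expected annual loss it avoids
    (here assumed constant, as produced by a quantitative risk analysis)
    minus the investment cost.  A positive result favors the investment."""
    pv_benefit = sum(annual_risk_reduction / (1 + discount_rate) ** t
                     for t in range(1, years + 1))
    return pv_benefit - investment
```

In the Bayesian framing the paper uses, `annual_risk_reduction` is the difference in expected loss between the two building designs, so uncertainty in fire frequency and consequence feeds directly into the NPV.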

  3. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with the allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, test results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparable components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations from the analysis leading to the Ground Systems Preliminary Design Review milestone.

  4. Quantitative analysis to guide orphan drug development.

    PubMed

    Lesko, L J

    2012-08-01

    The development of orphan drugs for rare diseases has made impressive strides in the past 10 years. There has been a surge in orphan drug designations, but new drug approvals have not kept up. This article presents a three-pronged hierarchical strategy for quantitative analysis of data at the descriptive, mechanistic, and systems levels of the biological system that could represent a standardized and rational approach to orphan drug development. Examples are provided to illustrate the concept.

  5. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

    Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual, and general requirements on a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. Systems Operations Studies for Automated Guideway Transit Systems : Quantitative Analysis of Alternative AGT Operational Control Strategies

    DOT National Transportation Integrated Search

    1981-10-01

    The objectives of the Systems Operation Studies (SOS) for automated guideway transit (AGT) systems are to develop models for the analysis of system operations, to evaluate performance and cost, and to establish guidelines for the design and operation...

  7. Silicon sheet growth development for the large area silicon sheet task of the low cost solar array project. Quantitative analysis of defects in silicon

    NASA Technical Reports Server (NTRS)

    Natesh, R.

    1978-01-01

    The various steps involved in obtaining quantitative information of structural defects in crystalline silicon samples are described. Procedures discussed include: (1) chemical polishing; (2) chemical etching; and (3) automated image analysis of samples on the QTM 720 System.

  8. Analysis and Derivation of Allocations for Fiber Contaminants in Liquid Bipropellant Systems

    NASA Technical Reports Server (NTRS)

    Lowrey, N. M.; Ibrahim, K. Y.

    2012-01-01

    An analysis was performed to identify the engineering rationale for the existing particulate limits in MSFC-SPEC-164, Cleanliness of Components for Use in Oxygen, Fuel, and Pneumatic Systems, determine the applicability of this rationale to fibers, identify potential risks that may result from fiber contamination in liquid oxygen/fuel bipropellant systems, and bound each of these risks. The objective of this analysis was to determine whether fiber contamination exceeding the established quantitative limits for particulate can be tolerated in these systems and, if so, to derive and recommend quantitative allocations for fibers beyond the limits established for other particulate. Knowledge gaps were identified that limit a complete understanding of the risk of promoted ignition from an accumulation of fibers in a gaseous oxygen system.

  9. Quantitative classification of a historic northern Wisconsin (U.S.A.) landscape: mapping forests at regional scales

    Treesearch

    Lisa A. Schulte; David J. Mladenoff; Erik V. Nordheim

    2002-01-01

    We developed a quantitative and replicable classification system to improve understanding of historical composition and structure within northern Wisconsin's forests. The classification system was based on statistical cluster analysis and two forest metrics, relative dominance (% basal area) and relative importance (mean of relative dominance and relative density...

  10. Location precision analysis of stereo thermal anti-sniper detection system

    NASA Astrophysics Data System (ADS)

    He, Yuqing; Lu, Ya; Zhang, Xiaoyan; Jin, Weiqi

    2012-06-01

    Anti-sniper detection devices are an urgent requirement in modern warfare, and the precision of the detection system is especially important. This paper discusses the location precision analysis of an anti-sniper detection system based on a dual-thermal imaging system. It mainly discusses the two aspects that produce error: the digital quantization effects of the cameras, and the error in estimating the coordinates of the bullet trajectory from the infrared images during image matching. The error-analysis formula is derived from the stereovision model and the cameras' digital quantization effects. From this, we obtain the relationship between detection accuracy and the system's parameters. The analysis in this paper provides the theoretical basis for the error-compensation algorithms proposed to improve the accuracy of 3D reconstruction of the bullet trajectory in anti-sniper detection devices.
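The dominant quantization error in stereo ranging follows from Z = fB/d: a disparity error of dd pixels produces a depth error of roughly Z²/(fB)·dd, so precision degrades quadratically with range. A sketch with illustrative camera parameters (not the paper's system values):

```python
def depth_quantization_error(z, focal_px, baseline_m, disp_err_px=0.5):
    """First-order stereo ranging error from disparity quantization.

    From Z = f*B/d, differentiating gives |dZ| ~= Z**2 / (f*B) * |dd|.
    z          : target range (m)
    focal_px   : focal length in pixels
    baseline_m : stereo baseline (m)
    disp_err_px: disparity uncertainty, e.g. half a pixel of quantization
    """
    return z ** 2 / (focal_px * baseline_m) * disp_err_px
```

For example, at f = 1000 px and B = 0.5 m, a half-pixel disparity error yields about 10 m of range error at 100 m but 40 m at 200 m, which is why sub-pixel matching and error compensation matter for trajectory reconstruction.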

  11. An Automated System for Chromosome Analysis

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Melnyk, J. H.

    1976-01-01

    The design, construction, and testing of a complete system to produce karyotypes and chromosome measurement data from human blood samples, and to provide a basis for statistical analysis of quantitative chromosome measurement data are described.

  12. SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph

    2015-01-01

    This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
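The effect of correlated failures on the value of a backup can be illustrated with a simple common-cause mixture model (a sketch, not SMART's internal formulation): with correlation ρ, a fraction ρ of the primary's failures also takes down the backup.

```python
def success_with_backup(p_fail, rho):
    """P(at least one of two parallel units works) when unit failures
    are correlated.

    rho = 0 : independent failures, P(both fail) = p_fail**2
    rho = 1 : fully common-cause, the backup adds nothing
    Modeled as a beta-factor-style mixture of the two extremes.
    """
    p_both_fail = rho * p_fail + (1 - rho) * p_fail ** 2
    return 1 - p_both_fail
```

With p_fail = 0.1, independence gives 0.99 success, while full correlation collapses back to the single-unit 0.9, showing quantitatively how correlation erodes the benefit of redundancy.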

  13. Quantitative Verse in a Quantity-Insensitive Language: Baif's "vers mesures."

    ERIC Educational Resources Information Center

    Bullock, Barbara E.

    1997-01-01

    Analysis of the quantitative metrical verse of French Renaissance poet Jean-Antoine de Baif finds that the metrics, often seen as unscannable and using an incomprehensible phonetic orthography, derive largely from a system that is accentual, with the orthography permitting the poet to encode quantitative distinctions that coincide with the meter.…

  14. Comparison of different approaches to quantitative adenovirus detection in stool specimens of hematopoietic stem cell transplant recipients.

    PubMed

    Kosulin, K; Dworzak, S; Lawitschka, A; Matthes-Leodolter, S; Lion, T

    2016-12-01

    Adenoviruses almost invariably proliferate in the gastrointestinal tract prior to dissemination, and critical threshold concentrations in stool correlate with the risk of viremia. Monitoring of adenovirus loads in stool may therefore be important for timely initiation of treatment in order to prevent invasive infection. Comparison of a manual DNA extraction kit in combination with a validated in-house PCR assay with automated extraction on the NucliSENS-EasyMAG device coupled with the Adenovirus R-gene kit (bioMérieux) for quantitative adenovirus analysis in stool samples. Stool specimens spiked with adenovirus concentrations in a range from 10^2-10^11 copies/g and 32 adenovirus-positive clinical stool specimens from pediatric stem cell transplant recipients were tested along with appropriate negative controls. Quantitative analysis of viral load in adenovirus-positive stool specimens revealed a median difference of 0.5 logs (range 0.1-2.2) between the detection systems tested and a difference of 0.3 logs (range 0.0-1.7) when the comparison was restricted to the PCR assays only. Spiking experiments showed a detection limit of 10^2-10^3 adenovirus copies/g stool, revealing a somewhat higher sensitivity offered by the automated extraction. The dynamic range of accurate quantitative analysis by both systems investigated was between 10^3 and 10^8 virus copies/g. The differences in quantitative analysis of adenovirus copy numbers between the systems tested were primarily attributable to the DNA extraction method used, while the qPCR assays revealed a high level of concordance. Both systems showed adequate performance for detection and monitoring of adenoviral load in stool specimens. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    PubMed

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

    Extensive green-roof systems are expected to have a synergistic effect in mitigating urban runoff, decreasing temperature, and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of the extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with the efficiency of 0.9 for a concrete roof. Therefore, extensive green roofs are an effective stormwater best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. © 2013 Elsevier Ltd. All rights reserved.
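The reported catchment efficiencies translate directly into a rainwater-harvesting yield estimate (1 mm of rain over 1 m² is 1 litre); a sketch using the abstract's approximate figures, with roof area and rainfall depth as illustrative inputs:

```python
def harvested_volume(rainfall_mm, roof_area_m2, efficiency):
    """Rainwater-harvesting yield in litres: depth x area x catchment
    efficiency.  The abstract reports efficiency ~0.44-0.52 for the
    extensive green roof versus ~0.9 for a concrete roof."""
    return rainfall_mm * roof_area_m2 * efficiency
```

For a 10 mm event on a 100 m² roof this gives roughly 480 L from the green roof versus 900 L from concrete, which is exactly the trade-off a tank-sizing algorithm must account for.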

  16. Isobaric Tags for Relative and Absolute Quantitation-Based Proteomic Analysis of Patent and Constricted Ductus Arteriosus Tissues Confirms the Systemic Regulation of Ductus Arteriosus Closure.

    PubMed

    Hong, Haifa; Ye, Lincai; Chen, Huiwen; Xia, Yu; Liu, Yue; Liu, Jinfen; Lu, Yanan; Zhang, Haibo

    2015-08-01

    We aimed to evaluate global changes in protein expression associated with patency by undertaking proteomic analysis of human constricted and patent ductus arteriosus (DA). Ten constricted and 10 patent human DAs were excised from infants with ductal-dependent heart disease during surgery. Using isobaric tags for relative and absolute quantitation-based quantitative proteomics, 132 differentially expressed proteins were identified. Of 132 proteins, voltage-gated sodium channel 1.3 (SCN3A), myosin 1d (Myo1d), Rho GTPase activating protein 26 (ARHGAP26), and retinitis pigmentosa 1 (RP1) were selected for validation by Western blot and quantitative real-time polymerase chain reaction analyses. Significant upregulation of SCN3A, Myo1d, and RP1 messenger RNA, and protein levels was observed in the patent DA group (all P ≤ 0.048). ARHGAP26 messenger RNA and protein levels were decreased in patent DA tissue (both P ≤ 0.018). Immunohistochemistry analysis revealed that Myo1d, ARHGAP26, and RP1 were specifically expressed in the subendothelial region of constricted DAs; however, diffuse expression of these proteins was noted in the patent group. Proteomic analysis revealed global changes in the expression of proteins that regulate oxygen sensing, ion channels, smooth muscle cell migration, nervous system, immune system, and metabolism, suggesting a basis for the systemic regulation of DA patency by diverse signaling pathways, which will be confirmed in further studies.

  17. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  18. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  19. Changes in landscape patterns and associated forest succession on the western slope of the Rocky Mountains, Colorado

    Treesearch

    Daniel J. Manier; Richard D. Laven

    2001-01-01

    Using repeat photography, we conducted a qualitative and quantitative analysis of changes in forest cover on the western slope of the Rocky Mountains in Colorado. For the quantitative analysis, both images in a pair were classified using remote sensing and geographic information system (GIS) technologies. Comparisons were made using three landscape metrics: total...

  20. Microbial-based evaluation of foaming events in full-scale wastewater treatment plants by microscopy survey and quantitative image analysis.

    PubMed

    Leal, Cristiano; Amaral, António Luís; Costa, Maria de Lourdes

    2016-08-01

    Activated sludge systems are prone to foaming occurrences that cause the sludge to rise in the reactor and affect wastewater treatment plant (WWTP) performance. A knowledge gap currently hinders the development of foaming-event prediction tools, one that may be filled by quantitative monitoring of activated sludge biota and sludge characteristics. As such, the present study focuses on the assessment of foaming events in full-scale WWTPs through quantitative analysis of protozoa, metazoa, filamentous bacteria, and sludge characteristics, further used to elucidate the relationships among these parameters. A conventional activated sludge system (CAS) and an oxidation ditch (OD) were surveyed for 2 and 3 months, respectively, with regard to their biota and sludge characteristics. The biota community was monitored by microscopic observation, and a new filamentous bacteria index was developed to quantify their occurrence. Sludge characteristics (aggregated and filamentous biomass contents and aggregate size) were determined by quantitative image analysis (QIA). The obtained data were then processed by principal components analysis (PCA), cross-correlation analysis, and decision trees to assess foaming occurrences and clarify these relationships. Foaming events were best assessed by the combined use of the relative abundance of testate amoebae and a nocardioform filamentous index, with a 92.9% success rate for overall foaming events, and 87.5% and 100%, respectively, for persistent and mild events.
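
    The kind of two-parameter decision rule such an analysis arrives at can be sketched as follows; the thresholds and toy survey records below are invented for illustration, not taken from the study:

```python
# Illustrative decision rule: flag a foaming event when testate amoebae are
# scarce and the nocardioform filamentous index is high. Thresholds and the
# toy survey records are hypothetical; the paper derives its rule from PCA,
# cross-correlation analysis and decision trees on real WWTP data.

AMOEBA_MAX = 0.05        # relative abundance of testate amoebae
NOCARDIOFORM_MIN = 2.0   # nocardioform filamentous index

def predict_foaming(testate_amoeba_abundance, nocardioform_index):
    return (testate_amoeba_abundance < AMOEBA_MAX
            and nocardioform_index >= NOCARDIOFORM_MIN)

# (amoeba abundance, nocardioform index, foaming observed)
survey = [
    (0.01, 3.1, True),
    (0.02, 2.5, True),
    (0.12, 0.8, False),
    (0.08, 1.2, False),
    (0.03, 2.2, True),
    (0.20, 0.5, False),
]

correct = sum(predict_foaming(a, n) == observed for a, n, observed in survey)
success_rate = correct / len(survey)
```

    On this toy survey the rule classifies every record correctly; the paper's 92.9% figure is the analogous success rate on its real data.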

  1. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    There are many factors that influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis showed that the background spectrum and the characteristic spectral lines follow approximately the same trend with changing temperature; therefore, signal-to-background ratio (S/B) measurement combined with regression analysis can compensate for changes in spectral line intensity caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, support vector machine (SVM) regression was used. The experimental results showed that the method improved the stability and accuracy of quantitative LIBS analysis; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. A data-fitting method based on the signal-to-background ratio (S/B) is less susceptible to matrix elements and the background spectrum, and it provides a data-processing reference for real-time online quantitative LIBS analysis.
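
    The compensation idea behind S/B calibration can be sketched in a few lines; the paper fits the relation with SVM regression, and ordinary least squares stands in here on invented calibration data:

```python
# Hypothetical LIBS calibration sketch: regress analyte concentration on the
# signal-to-background ratio (S/B), so that shot-to-shot drifts in laser power
# or collection efficiency, which scale signal and background alike, cancel.
# The paper fits this relation with an SVM; least squares stands in here.

def signal_to_background(peak_intensity, background_intensity):
    """S/B ratio for one spectral line."""
    return peak_intensity / background_intensity

# Synthetic calibration set: (peak, background) at known concentrations (mg/L).
calibration = [
    ((1200.0, 400.0), 10.0),
    ((2300.0, 390.0), 20.0),
    ((3500.0, 410.0), 30.0),
    ((4600.0, 405.0), 40.0),
]

xs = [signal_to_background(p, b) for (p, b), _ in calibration]
ys = [c for _, c in calibration]

# Least-squares line c = a * (S/B) + b.
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
b = my - a * mx

def predict(peak, background):
    """Concentration predicted from one (peak, background) measurement."""
    return a * signal_to_background(peak, background) + b
```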

  2. CCTV Coverage Index Based on Surveillance Resolution and Its Evaluation Using 3D Spatial Analysis

    PubMed Central

    Choi, Kyoungah; Lee, Impyeong

    2015-01-01

    We propose a novel approach to evaluating how effectively a closed-circuit television (CCTV) system can monitor a targeted area. With 3D models of the target area and the camera parameters of the CCTV system, the approach produces a surveillance coverage index, newly defined in this study as a quantitative measure of surveillance performance. This index indicates the proportion of the space being monitored with sufficient resolution relative to the entire space of the target area. It is determined by computing the surveillance resolution at every position and orientation, which indicates how closely a specific object can be monitored with a CCTV system. We present a full mathematical derivation for the resolution, which depends on the location and orientation of the object as well as the geometric model of a camera. With the proposed approach, we quantitatively evaluated the surveillance coverage of a CCTV system in an underground parking area. Our evaluation process provided various quantitative-analysis results, enabling us to examine the design of a CCTV system prior to its installation and to understand the surveillance capability of an existing CCTV system. PMID:26389909
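
    A minimal sketch of such a coverage index, assuming a pinhole-camera ground-sampling-distance model and illustrative camera parameters (the paper's full derivation also accounts for object orientation):

```python
import math

# Toy coverage index: the fraction of sample positions in a target area that a
# camera monitors with sufficient resolution. Resolution is approximated by the
# ground sampling distance (GSD) of a pinhole camera; the camera parameters and
# the 1 cm/pixel requirement are illustrative assumptions, not the paper's.

FOCAL_LENGTH_M = 0.004   # 4 mm lens
PIXEL_PITCH_M = 3e-6     # 3 micron pixels
MAX_GSD_M = 0.01         # require at least 1 cm per pixel on the target
MAX_RANGE_M = 25.0       # nominal working range of the camera

def gsd(distance_m):
    """Metres on the target covered by one pixel at the given distance."""
    return PIXEL_PITCH_M * distance_m / FOCAL_LENGTH_M

def coverage_index(camera_xy, sample_points):
    """Proportion of sample points imaged with GSD <= MAX_GSD_M within range."""
    monitored = 0
    for (x, y) in sample_points:
        d = math.hypot(x - camera_xy[0], y - camera_xy[1])
        if d <= MAX_RANGE_M and gsd(d) <= MAX_GSD_M:
            monitored += 1
    return monitored / len(sample_points)

# Regular grid over a 20 m x 20 m parking area, camera in one corner.
grid = [(x, y) for x in range(0, 21, 2) for y in range(0, 21, 2)]
index = coverage_index((0.0, 0.0), grid)
```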

  3. Quantitative refractive index distribution of single cell by combining phase-shifting interferometry and AFM imaging.

    PubMed

    Zhang, Qinnan; Zhong, Liyun; Tang, Ping; Yuan, Yingjie; Liu, Shengde; Tian, Jindong; Lu, Xiaoxu

    2017-05-31

    Cell refractive index, an intrinsic optical parameter, is closely correlated with intracellular mass and concentration. By combining optical phase-shifting interferometry (PSI) and atomic force microscope (AFM) imaging, we constructed a label-free, non-invasive, quantitative system for measuring the refractive index of single cells, in which the accurate phase map of a single cell is retrieved with the PSI technique and the cell morphology with nanoscale resolution is obtained by AFM imaging. Based on the proposed AFM/PSI system, we obtained quantitative refractive index distributions of a single red blood cell and a Jurkat cell, respectively. Further, the quantitative change in refractive index distribution during daunorubicin (DNR)-induced Jurkat cell apoptosis was presented, from which content changes of intracellular biochemical components were derived. Importantly, these results were consistent with Raman spectral analysis, indicating that the proposed PSI/AFM-based refractive index system is likely to become a useful tool for the analysis of intracellular biochemical components, facilitating its application in revealing cell structure and pathological state from a new perspective.

  4. Quantitative biology of single neurons

    PubMed Central

    Eberwine, James; Lovatt, Ditte; Buckley, Peter; Dueck, Hannah; Francis, Chantal; Kim, Tae Kyung; Lee, Jaehee; Lee, Miler; Miyashiro, Kevin; Morris, Jacqueline; Peritz, Tiina; Schochet, Terri; Spaethling, Jennifer; Sul, Jai-Yoon; Kim, Junhyong

    2012-01-01

    The building blocks of complex biological systems are single cells. Fundamental insights gained from single-cell analysis promise to provide the framework for understanding normal biological systems development as well as the limits on systems/cellular ability to respond to disease. The interplay of cells to create functional systems is not well understood. Until recently, the study of single cells has concentrated primarily on morphological and physiological characterization. With the application of new highly sensitive molecular and genomic technologies, the quantitative biochemistry of single cells is now accessible. PMID:22915636

  5. Using GPS To Teach More Than Accurate Positions.

    ERIC Educational Resources Information Center

    Johnson, Marie C.; Guth, Peter L.

    2002-01-01

    Undergraduate science majors need practice in critical thinking, quantitative analysis, and judging whether their calculated answers are physically reasonable. Develops exercises using handheld Global Positioning System (GPS) receivers. Reinforces students' abilities to think quantitatively, make realistic "back of the envelope"…

  6. Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

    As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example it identifies reactions that are subject to active allosteric or genetic regulation as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
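
    The second-law feasibility check at the core of NET analysis can be sketched as follows, with illustrative values rather than the paper's E. coli or yeast data:

```python
import math

# NET analysis couples metabolite data to the network via the second law:
# a reaction carrying flux in a given direction must have a negative
# transformed Gibbs energy, Delta_rG = Delta_rG'0 + RT * ln(Q), where Q is
# built from measured metabolite concentrations. Values are illustrative.

R = 8.314e-3   # kJ/(mol K)
T = 310.0      # K

def reaction_gibbs_energy(delta_g0, substrates, products):
    """Delta_rG (kJ/mol) from standard energy and concentrations (mol/L)."""
    ln_q = sum(math.log(c) for c in products) - sum(math.log(c) for c in substrates)
    return delta_g0 + R * T * ln_q

def is_feasible(delta_g0, substrates, products):
    """Second law: flux in this direction requires Delta_rG < 0."""
    return reaction_gibbs_energy(delta_g0, substrates, products) < 0.0

# Hypothetical reaction A -> B with Delta_rG'0 = +5 kJ/mol: feasible only if
# the substrate is kept sufficiently above the product concentration.
dg_high_substrate = reaction_gibbs_energy(5.0, substrates=[1e-2], products=[1e-4])
dg_low_substrate = reaction_gibbs_energy(5.0, substrates=[1e-4], products=[1e-2])
```

    A measured concentration set that forces Delta_rG > 0 for a flux-carrying direction is exactly the kind of inconsistency, or hint of active regulation, that NET analysis flags.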

  7. Manpower Systems Integration Factors for Frigate Design in the Turkish Navy

    DTIC Science & Technology

    2016-12-01

    The qualitative and quantitative analyses examine the correlation between ship design specifications and manpower requirements for frigate design in the Turkish Navy. The correlation between the ship design characteristics and the manpower requirements is supported by the quantitative analysis.

  8. Quantitative impact characterization of aeronautical CFRP materials with non-destructive testing methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian, E-mail: Grosse@tum.de

    2015-03-31

    In recent years, an increasing number of safety-relevant structures have been designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damages are liquid- or air-coupled ultrasonic testing (UT), phased-array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system that allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analysis are developed and presented.

  9. An automated system for chromosome analysis. Volume 1: Goals, system design, and performance

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Melnyk, J. H.

    1975-01-01

    The design, construction, and testing of a complete system to produce karyotypes and chromosome measurement data from human blood samples, and a basis for statistical analysis of quantitative chromosome measurement data is described. The prototype was assembled, tested, and evaluated on clinical material and thoroughly documented.

  10. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco

    Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human, and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent safety research on socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work by applying a recent systemic method, the Functional Resonance Analysis Method (FRAM), to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions, and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  11. Spectrochemical analysis of powdered biological samples using transversely excited atmospheric carbon dioxide laser plasma excitation

    NASA Astrophysics Data System (ADS)

    Zivkovic, Sanja; Momcilovic, Milos; Staicu, Angela; Mutic, Jelena; Trtica, Milan; Savovic, Jelena

    2017-02-01

    The aim of this study was to develop a simple laser-induced breakdown spectroscopy (LIBS) method for quantitative elemental analysis of powdered biological materials based on laboratory-prepared calibration samples. The analysis was done using ungated single-pulse LIBS in ambient air at atmospheric pressure. A Transversely Excited Atmospheric pressure (TEA) CO2 laser was used as the energy source for plasma generation on the samples. The material used for the analysis was the blue-green alga Spirulina, widely used in the food and pharmaceutical industries and also in a few biotechnological applications. To demonstrate the analytical potential of this particular LIBS system, the obtained spectra were compared to spectra obtained using a commercial LIBS system based on a pulsed Nd:YAG laser. A single sample of known concentration was used to estimate detection limits for Ba, Ca, Fe, Mg, Mn, Si and Sr and to compare the detection power of the two LIBS systems. TEA CO2 laser-based LIBS was also applied to quantitative analysis of the elements in powdered Spirulina samples. Analytical curves for Ba, Fe, Mg, Mn and Sr were constructed using laboratory-produced matrix-matched calibration samples. Inductively coupled plasma optical emission spectroscopy (ICP-OES) was used as the reference technique for elemental quantification, and reasonably good agreement between ICP and LIBS data was obtained. The results confirm that, with respect to its sensitivity and precision, TEA CO2 laser-based LIBS can be successfully applied to quantitative analysis of macro- and micro-elements in algal samples. The fact that nearly all classes of materials can be prepared as powders implies that the proposed method could easily be extended to quantitative analysis of many kinds of materials, whether organic, biological or inorganic.

  12. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparable components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study discusses the subsystem requirements allocation process, describes the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and presents findings and observations based on the analysis leading to the Ground Operations Project Preliminary Design Review milestone.
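
    The basic availability arithmetic behind such a verification analysis can be sketched as follows, with invented MTBF/MTTR placeholders rather than Constellation data:

```python
# Steady-state availability from failure and repair figures, combined in
# series across subsystems that must all be up for the second launch.
# All MTBF/MTTR values below are hypothetical placeholders.

def availability(mtbf_hours, mttr_hours):
    """Steady-state availability A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

def series_availability(subsystems):
    """All subsystems required: availabilities multiply."""
    a = 1.0
    for mtbf, mttr in subsystems:
        a *= availability(mtbf, mttr)
    return a

# Three hypothetical ground subsystems on the second launch's critical path.
ground_subsystems = [
    (2000.0, 8.0),    # e.g. propellant loading
    (5000.0, 24.0),   # e.g. launch control
    (1000.0, 4.0),    # e.g. environmental control
]

launch_availability = series_availability(ground_subsystems)
```

    The series product makes the design point visible: every additional required subsystem, however reliable, lowers the probability of launching on time.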

  13. Capillary nano-immunoassays: advancing quantitative proteomics analysis, biomarker assessment, and molecular diagnostics.

    PubMed

    Chen, Jin-Qiu; Wakefield, Lalage M; Goldstein, David J

    2015-06-06

    There is an emerging demand for the use of molecular profiling to facilitate biomarker identification and development, and to stratify patients for more efficient treatment decisions with reduced adverse effects. In the past decade, great strides have been made to advance genomic, transcriptomic and proteomic approaches to address these demands. While there has been much progress with these large scale approaches, profiling at the protein level still faces challenges due to limitations in clinical sample size, poor reproducibility, unreliable quantitation, and lack of assay robustness. A novel automated capillary nano-immunoassay (CNIA) technology has been developed. This technology offers precise and accurate measurement of proteins and their post-translational modifications using either charge-based or size-based separation formats. The system not only uses ultralow nanogram levels of protein but also allows multi-analyte analysis using a parallel single-analyte format for increased sensitivity and specificity. The high sensitivity and excellent reproducibility of this technology make it particularly powerful for analysis of clinical samples. Furthermore, the system can distinguish and detect specific protein post-translational modifications that conventional Western blot and other immunoassays cannot easily capture. This review will summarize and evaluate the latest progress to optimize the CNIA system for comprehensive, quantitative protein and signaling event characterization. It will also discuss how the technology has been successfully applied in both discovery research and clinical studies, for signaling pathway dissection, proteomic biomarker assessment, targeted treatment evaluation and quantitative proteomic analysis. Lastly, a comparison of this novel system with other conventional immuno-assay platforms is performed.

  14. Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis.

    PubMed

    Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L; Hwang, Jae Youn

    2016-12-01

    We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis.

  15. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
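
    Once ICA has resolved the component spectra, the quantitation step reduces to the Beer-Lambert-Bouguer law; a minimal sketch with synthetic Gaussian bands standing in for the resolved profiles (the blind ICA resolution itself is not reproduced here):

```python
import numpy as np

# Under Beer-Lambert-Bouguer, a mixture spectrum is a concentration-weighted
# sum of component spectra, so once the components are known (here: synthetic
# Gaussian bands standing in for ICA-resolved UV-vis profiles), analyte
# concentrations follow by linear least squares.

wavelengths = np.linspace(200.0, 400.0, 201)

def band(center, width):
    """Synthetic absorption band (unit-height Gaussian)."""
    return np.exp(-((wavelengths - center) ** 2) / (2.0 * width ** 2))

# Resolved component spectra (columns) for two overlapping analytes.
components = np.column_stack([band(270.0, 15.0), band(300.0, 20.0)])

true_concentrations = np.array([0.8, 0.3])
mixture = components @ true_concentrations          # Beer-Lambert mixing

# Recover the concentrations in the mixture without reference solutions.
estimated, *_ = np.linalg.lstsq(components, mixture, rcond=None)
```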

  16. Analyzing the texture changes in the quantitative phase maps of adipocytes

    NASA Astrophysics Data System (ADS)

    Roitshtain, Darina; Sharabani-Yosef, Orna; Gefen, Amit; Shaked, Natan T.

    2016-03-01

    We present a new analysis tool for studying texture changes in the quantitative phase maps of live cells acquired by wide-field interferometry. The sensitivity of wide-field interferometry systems to small changes in refractive index enables visualizing cells and inner cell organelles without the using fluorescent dyes or other cell-invasive approaches, which may affect the measurement and require external labeling. Our label-free texture-analysis tool is based directly on the optical path delay profile of the sample and does not necessitate decoupling refractive index and thickness in the cell quantitative phase profile; thus, relevant parameters can be calculated using a single-frame acquisition. Our experimental system includes low-coherence wide-field interferometer, combined with simultaneous florescence microscopy system for validation. We used this system and analysis tool for studying lipid droplets formation in adipocytes. The latter demonstration is relevant for various cellular functions such as lipid metabolism, protein storage and degradation to viral replication. These processes are functionally linked to several physiological and pathological conditions, including obesity and metabolic diseases. Quantification of these biological phenomena based on the texture changes in the cell phase map has a potential as a new cellular diagnosis tool.

  17. Inter-rater reliability of motor unit number estimates and quantitative motor unit analysis in the tibialis anterior muscle.

    PubMed

    Boe, S G; Dalton, B H; Harwood, B; Doherty, T J; Rice, C L

    2009-05-01

    To establish the inter-rater reliability of decomposition-based quantitative electromyography (DQEMG) derived motor unit number estimates (MUNEs) and quantitative motor unit (MU) analysis. Using DQEMG, two examiners independently obtained a sample of needle and surface-detected motor unit potentials (MUPs) from the tibialis anterior muscle from 10 subjects. Coupled with a maximal M wave, surface-detected MUPs were used to derive a MUNE for each subject and each examiner. Additionally, size-related parameters of the individual MUs were obtained following quantitative MUP analysis. Test-retest MUNE values were similar with high reliability observed between examiners (ICC=0.87). Additionally, MUNE variability from test-retest as quantified by a 95% confidence interval was relatively low (+/-28 MUs). Lastly, quantitative data pertaining to MU size, complexity and firing rate were similar between examiners. MUNEs and quantitative MU data can be obtained with high reliability by two independent examiners using DQEMG. Establishing the inter-rater reliability of MUNEs and quantitative MU analysis using DQEMG is central to the clinical applicability of the technique. In addition to assessing response to treatments over time, multiple clinicians may be involved in the longitudinal assessment of the MU pool of individuals with disorders of the central or peripheral nervous system.
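
    The standard MUNE arithmetic underlying DQEMG can be sketched as follows, with invented amplitude values rather than data from the study:

```python
# Motor unit number estimate: divide the size of the maximal compound muscle
# action potential (M wave) by the mean size of the surface-detected motor
# unit potentials (S-MUPs). All amplitudes below are hypothetical examples.

def mune(m_wave_amplitude_uv, surface_mup_amplitudes_uv):
    """MUNE = M-wave size / mean S-MUP size."""
    mean_smup = sum(surface_mup_amplitudes_uv) / len(surface_mup_amplitudes_uv)
    return m_wave_amplitude_uv / mean_smup

# Hypothetical tibialis anterior sample from one examiner.
m_wave = 6000.0                                  # microvolts
smups = [38.0, 45.0, 52.0, 31.0, 60.0, 44.0]     # microvolts

estimate = mune(m_wave, smups)
```

    Inter-rater reliability then comes from repeating this with a second examiner's independently sampled S-MUPs and comparing the two estimates across subjects (the study reports ICC = 0.87).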

  18. The Spectral Image Processing System (SIPS) - Interactive visualization and analysis of imaging spectrometer data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1993-01-01

    The Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, has developed a prototype interactive software system called the Spectral Image Processing System (SIPS) using IDL (the Interactive Data Language) on UNIX-based workstations. SIPS is designed to take advantage of the combination of high spectral resolution and spatial data presentation unique to imaging spectrometers. It streamlines analysis of these data by allowing scientists to rapidly interact with entire datasets. SIPS provides visualization tools for rapid exploratory analysis and numerical tools for quantitative modeling. The user interface is X-Windows-based, user friendly, and provides 'point and click' operation. SIPS is being used for multidisciplinary research concentrating on use of physically based analysis methods to enhance scientific results from imaging spectrometer data. The objective of this continuing effort is to develop operational techniques for quantitative analysis of imaging spectrometer data and to make them available to the scientific community prior to the launch of imaging spectrometer satellite systems such as the Earth Observing System (EOS) High Resolution Imaging Spectrometer (HIRIS).

  19. A computer system to be used with laser-based endoscopy for quantitative diagnosis of early gastric cancer.

    PubMed

    Miyaki, Rie; Yoshida, Shigeto; Tanaka, Shinji; Kominami, Yoko; Sanomura, Yoji; Matsuo, Taiji; Oka, Shiro; Raytchev, Bisser; Tamaki, Toru; Koide, Tetsushi; Kaneda, Kazufumi; Yoshihara, Masaharu; Chayama, Kazuaki

    2015-02-01

    To evaluate the usefulness of a newly devised computer system for use with laser-based endoscopy in differentiating between early gastric cancer, reddened lesions, and surrounding tissue. Narrow-band imaging based on laser light illumination has come into recent use. We devised a support vector machine (SVM)-based analysis system to be used with the newly devised endoscopy system to quantitatively identify gastric cancer on images obtained by magnifying endoscopy with blue-laser imaging (BLI). We evaluated the usefulness of the computer system in combination with the new endoscopy system. We evaluated the system as applied to 100 consecutive early gastric cancers in 95 patients examined by BLI magnification at Hiroshima University Hospital. We produced a set of images from the 100 early gastric cancers; 40 flat or slightly depressed, small, reddened lesions; and surrounding tissues, and we attempted to identify gastric cancer, reddened lesions, and surrounding tissue quantitatively. The average SVM output value was 0.846 ± 0.220 for cancerous lesions, 0.381 ± 0.349 for reddened lesions, and 0.219 ± 0.277 for surrounding tissue, with the SVM output value for cancerous lesions being significantly greater than that for reddened lesions or surrounding tissue. The average SVM output value for differentiated-type cancer was 0.840 ± 0.207 and for undifferentiated-type cancer was 0.865 ± 0.259. Although further development is needed, we conclude that our computer-based analysis system used with BLI will identify gastric cancers quantitatively.

  20. Evaluation of quantitative image analysis criteria for the high-resolution microendoscopic detection of neoplasia in Barrett's esophagus

    NASA Astrophysics Data System (ADS)

    Muldoon, Timothy J.; Thekkek, Nadhi; Roblyer, Darren; Maru, Dipen; Harpaz, Noam; Potack, Jonathan; Anandasabapathy, Sharmila; Richards-Kortum, Rebecca

    2010-03-01

    Early detection of neoplasia in patients with Barrett's esophagus is essential to improve outcomes. The aim of this ex vivo study was to evaluate the ability of high-resolution microendoscopic imaging and quantitative image analysis to identify neoplastic lesions in patients with Barrett's esophagus. Nine patients with pathologically confirmed Barrett's esophagus underwent endoscopic examination with biopsies or endoscopic mucosal resection. Resected fresh tissue was imaged with fiber bundle microendoscopy; images were analyzed by visual interpretation or by quantitative image analysis to predict whether the imaged sites were non-neoplastic or neoplastic. The best performing pair of quantitative features were chosen based on their ability to correctly classify the data into the two groups. Predictions were compared to the gold standard of histopathology. Subjective analysis of the images by expert clinicians achieved average sensitivity and specificity of 87% and 61%, respectively. The best performing quantitative classification algorithm relied on two image textural features and achieved a sensitivity and specificity of 87% and 85%, respectively. This ex vivo pilot trial demonstrates that quantitative analysis of images obtained with a simple microendoscope system can distinguish neoplasia in Barrett's esophagus with good sensitivity and specificity when compared to histopathology and to subjective image interpretation.
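
    The sensitivity and specificity figures reported above are computed from a confusion table against the histopathology gold standard; a minimal sketch with invented counts:

```python
# Sensitivity/specificity from per-site classifications compared against the
# histopathology gold standard. The toy result set below is invented; it is
# not the trial's data.

def sensitivity_specificity(results):
    """results: list of (predicted_neoplastic, truly_neoplastic) booleans."""
    tp = sum(p and t for p, t in results)
    fn = sum((not p) and t for p, t in results)
    tn = sum((not p) and (not t) for p, t in results)
    fp = sum(p and (not t) for p, t in results)
    return tp / (tp + fn), tn / (tn + fp)

# Toy set: 8 neoplastic sites (7 caught), 8 non-neoplastic (6 correctly cleared).
toy = ([(True, True)] * 7 + [(False, True)] * 1
       + [(False, False)] * 6 + [(True, False)] * 2)

sens, spec = sensitivity_specificity(toy)
```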

  1. UPLC-MS/MS quantitative analysis and structural fragmentation study of five Parmotrema lichens from the Eastern Ghats.

    PubMed

    Kumar, K; Siva, Bandi; Sarma, V U M; Mohabe, Satish; Reddy, A Madhusudana; Boustie, Joel; Tiwari, Ashok K; Rao, N Rama; Babu, K Suresh

    2018-07-15

Comparative phytochemical analysis of five lichen species [Parmotrema tinctorum (Delise ex Nyl.) Hale, P. andinum (Mull. Arg.) Hale, P. praesorediosum (Nyl.) Hale, P. grayanum (Hue) Hale, P. austrosinense (Zahlbr.) Hale] of the Parmotrema genus was performed using two complementary UPLC-MS systems. The first system consisted of a high-resolution UPLC-QToF-MS/MS spectrometer; the second, of a UPLC-MS/MS instrument in Multiple Reaction Monitoring (MRM) mode for quantitative analysis of the major constituents in the selected lichen species. The individual compounds (47 in total) were identified using Q-ToF-MS/MS by comparing the exact molecular masses from their MS/MS spectra, literature data, and retention times with those of standard compounds isolated from the crude extract of the abundant lichen P. tinctorum. The analysis also allowed us to identify unknown peaks/compounds, which were further characterized by mass fragmentation studies. The quantitative MRM analysis enabled better discrimination of the species according to their chemical profiles. Moreover, determination of antioxidant activity (ABTS+ inhibition) and Advanced Glycation Endproducts (AGEs) inhibition for the crude extracts revealed potential antiglycaemic activity, to be confirmed, for P. austrosinense. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Comparison of longitudinal excursion of a nerve-phantom model using quantitative ultrasound imaging and motion analysis system methods: A convergent validity study.

    PubMed

    Paquette, Philippe; El Khamlichi, Youssef; Lamontagne, Martin; Higgins, Johanne; Gagnon, Dany H

    2017-08-01

Quantitative ultrasound imaging is gaining popularity in research and clinical settings to measure the neuromechanical properties of the peripheral nerves such as their capability to glide in response to body segment movement. Increasing evidence suggests that impaired median nerve longitudinal excursion is associated with carpal tunnel syndrome. To date, psychometric properties of longitudinal nerve excursion measurements using quantitative ultrasound imaging have not been extensively investigated. This study investigates the convergent validity of the longitudinal nerve excursion by comparing measures obtained using quantitative ultrasound imaging with those determined with a motion analysis system. A 38-cm-long rigid nerve-phantom model was used to assess the longitudinal excursion in a laboratory environment. The nerve-phantom model, immersed in a 20-cm-deep container filled with a gelatin-based solution, was moved 20 times using a linear forward and backward motion. Three light-emitting diodes were used to record nerve-phantom excursion with a motion analysis system, while a 5-cm linear transducer allowed simultaneous recording via ultrasound imaging. Both measurement techniques yielded excellent association (r = 0.99) and agreement (mean absolute difference between methods = 0.85 mm; mean relative difference between methods = 7.48%). Small discrepancies were largely found when larger excursions (i.e., >10 mm) were performed, revealing slight underestimation of the excursion by the ultrasound imaging analysis software. Quantitative ultrasound imaging is an accurate method to assess the longitudinal excursion of an in vitro nerve-phantom model and appears relevant for future research protocols investigating the neuromechanical properties of the peripheral nerves.
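
    The agreement statistics reported above (Pearson correlation, mean absolute difference, and mean relative difference between two measurement methods) are standard and straightforward to reproduce. The sketch below is illustrative only; the arrays used in any example are hypothetical values, not the study's data:

```python
def agreement_metrics(method_a, method_b):
    """Convergent-validity statistics between two paired measurement series (mm).

    Returns the Pearson correlation, the mean absolute difference, and the
    mean relative difference (in %, relative to method_b).
    """
    n = len(method_a)
    mean_a = sum(method_a) / n
    mean_b = sum(method_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(method_a, method_b))
    var_a = sum((a - mean_a) ** 2 for a in method_a)
    var_b = sum((b - mean_b) ** 2 for b in method_b)
    r = cov / (var_a * var_b) ** 0.5                  # Pearson correlation
    mad = sum(abs(a - b) for a, b in zip(method_a, method_b)) / n
    mrd = 100 * sum(abs(a - b) / b for a, b in zip(method_a, method_b)) / n
    return r, mad, mrd
```

    A constant offset between methods, as in the study's slight underestimation, leaves the correlation near 1 while showing up in the absolute and relative differences.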

3. Quantitative study of FORC diagrams in thermally corrected Stoner-Wohlfarth nanoparticles systems

    NASA Astrophysics Data System (ADS)

    De Biasi, E.; Curiale, J.; Zysler, R. D.

    2016-12-01

The use of FORC diagrams is becoming increasingly popular among researchers devoted to magnetism and magnetic materials. However, a thorough interpretation of this kind of diagram, in order to extract quantitative information, requires an appropriate model of the studied system; for that reason most FORC studies are limited to qualitative analysis. In magnetic systems, thermal fluctuations "blur" the signatures of the anisotropy, volume, and particle-interaction distributions, so thermal effects in nanoparticle systems conspire against a proper interpretation and analysis of these diagrams. Motivated by this fact, we have quantitatively studied the degree of accuracy of the information extracted from FORC diagrams for the special case of single-domain, thermally corrected Stoner-Wohlfarth (easy axes along the external field orientation) nanoparticle systems. In this work, the starting point is an analytical model that describes the behavior of a magnetic nanoparticle system as a function of field, anisotropy, temperature, and measurement time. To study the quantitative accuracy of our model, we built FORC diagrams for archetypal cases of magnetic nanoparticles. Our results show that, from the quantitative information obtained from the diagrams and under the hypotheses of the proposed model, it is possible to recover the features of the original system with accuracy above 95%. This accuracy improves at low temperatures, and it is also possible to access the anisotropy distribution directly from the FORC coercive-field profile. Indeed, our simulations predict that the volume distribution plays a secondary role: only its mean value and deviation matter. It is therefore possible to obtain accurate results for the inversion and interaction fields regardless of the details of the volume distribution.
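
    For context (standard in the FORC literature, though not restated in this abstract), the FORC distribution from which such diagrams are built is the mixed second derivative of the magnetization measured on first-order reversal curves, with Ha the reversal field and Hb the applied field:

```latex
\rho(H_a, H_b) = -\frac{1}{2}\,
  \frac{\partial^2 M(H_a, H_b)}{\partial H_a\,\partial H_b}
```

    The coercive and interaction (bias) field coordinates used in profiles such as the coercive-field profile then follow from the rotation Hc = (Hb - Ha)/2, Hu = (Ha + Hb)/2.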

  4. LANDSCAPE STRUCTURE AND ESTUARINE CONDITION IN THE MID-ATLANTIC REGION OF THE UNITED STATES: I. DEVELOPING QUANTITATIVE RELATIONSHIPS

    EPA Science Inventory

    In a previously published study, quantitative relationships were developed between landscape metrics and sediment contamination for 25 small estuarine systems within Chesapeake Bay. Nonparametric statistical analysis (rank transformation) was used to develop an empirical relation...

  5. Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides

    USDA-ARS?s Scientific Manuscript database

    In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...

  6. SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena

    PubMed Central

    Tohsato, Yukako; Ho, Kenneth H. L.; Kyoda, Koji; Onami, Shuichi

    2016-01-01

Motivation: Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. Results: We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. Availability and Implementation: SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp. PMID: 27412095

  7. SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena.

    PubMed

    Tohsato, Yukako; Ho, Kenneth H L; Kyoda, Koji; Onami, Shuichi

    2016-11-15

Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp. © The Author 2016. Published by Oxford University Press.

  8. Safety evaluation methodology for advanced coal extraction systems

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.

    1981-01-01

    Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.

  9. Using the phase-space imager to analyze partially coherent imaging systems: bright-field, phase contrast, differential interference contrast, differential phase contrast, and spiral phase contrast

    NASA Astrophysics Data System (ADS)

    Mehta, Shalin B.; Sheppard, Colin J. R.

    2010-05-01

Various methods that use a large illumination aperture (i.e., partially coherent illumination) have been developed for making transparent (i.e., phase) specimens visible. These methods were developed to provide qualitative contrast rather than quantitative measurement; coherent illumination has traditionally been relied upon for quantitative phase analysis. Partially coherent illumination has some important advantages over coherent illumination and can be used for measurement of the specimen's phase distribution. However, quantitative analysis and image computation in partially coherent systems have not been explored fully due to the lack of a general, physically insightful and computationally efficient model of image formation. We have developed a phase-space model that satisfies these requirements. In this paper, we employ this model (called the phase-space imager) to elucidate the five partially coherent systems mentioned in the title. We compute images of an optical fiber under these systems and verify some of them with experimental images. These results and simulated images of a general phase profile are used to compare the contrast and resolution of the imaging systems. We show that, for quantitative phase imaging of a thin specimen with matched illumination, differential phase contrast offers linear transfer of specimen information to the image. We also show that the edge-enhancement properties of spiral phase contrast are compromised significantly as the coherence of illumination is reduced. The results demonstrate that the phase-space imager model provides a useful framework for the analysis, calibration, and design of partially coherent imaging methods.

  10. Meeting Report: Tissue-based Image Analysis.

    PubMed

    Saravanan, Chandra; Schumacher, Vanessa; Brown, Danielle; Dunstan, Robert; Galarneau, Jean-Rene; Odin, Marielle; Mishra, Sasmita

    2017-10-01

    Quantitative image analysis (IA) is a rapidly evolving area of digital pathology. Although not a new concept, the quantification of histological features on photomicrographs used to be cumbersome, resource-intensive, and limited to specialists and specialized laboratories. Recent technological advances like highly efficient automated whole slide digitizer (scanner) systems, innovative IA platforms, and the emergence of pathologist-friendly image annotation and analysis systems mean that quantification of features on histological digital images will become increasingly prominent in pathologists' daily professional lives. The added value of quantitative IA in pathology includes confirmation of equivocal findings noted by a pathologist, increasing the sensitivity of feature detection, quantification of signal intensity, and improving efficiency. There is no denying that quantitative IA is part of the future of pathology; however, there are also several potential pitfalls when trying to estimate volumetric features from limited 2-dimensional sections. This continuing education session on quantitative IA offered a broad overview of the field; a hands-on toxicologic pathologist experience with IA principles, tools, and workflows; a discussion on how to apply basic stereology principles in order to minimize bias in IA; and finally, a reflection on the future of IA in the toxicologic pathology field.

  11. Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis

    PubMed Central

    Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L.; Hwang, Jae Youn

    2016-01-01

We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as smartphones have emerged as healthcare tools and have been applied to the early diagnosis of nonmalignant and malignant skin diseases. In particular, when combined with an advanced optical imaging technique such as multispectral imaging and analysis, they could benefit the early diagnosis of such skin diseases and further quantitative monitoring of prognosis after treatment at home. We therefore demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis. PMID:28018743

  12. Approaching human language with complex networks

    NASA Astrophysics Data System (ADS)

    Cong, Jin; Liu, Haitao

    2014-12-01

Interest in modeling and analyzing human language with complex networks has risen in recent years, and a considerable body of research in this area has already accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).

  13. Leukotriene B4 catabolism: quantitation of leukotriene B4 and its omega-oxidation products by reversed-phase high-performance liquid chromatography.

    PubMed

    Shak, S

    1987-01-01

LTB4 and its omega-oxidation products may be rapidly, sensitively, and specifically quantitated by the methods of solid-phase extraction and reversed-phase high-performance liquid chromatography (HPLC), which are described in this chapter. Although other techniques, such as radioimmunoassay or gas chromatography-mass spectrometry, may be utilized for quantitative analysis of the lipoxygenase products of arachidonic acid, only reversed-phase HPLC can quantitate as many as 10 metabolites in a single analysis without prior derivatization. In this chapter we also review the chromatographic theory we utilized to optimize reversed-phase HPLC analysis of LTB4 and its omega-oxidation products. With this information and a gradient HPLC system, any investigator can develop a powerful assay for the potent inflammatory mediator LTB4, or for any other lipoxygenase product of arachidonic acid.

  14. Knowledge Management for the Analysis of Complex Experimentation.

    ERIC Educational Resources Information Center

    Maule, R.; Schacher, G.; Gallup, S.

    2002-01-01

    Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…

  15. Relating interesting quantitative time series patterns with text events and text features

    NASA Astrophysics Data System (ADS)

    Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.

    2013-12-01

In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber-physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.
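
    The a-priori frequent-pattern step described in this abstract can be illustrated with a minimal Apriori-style itemset miner. This is a generic sketch of the classic algorithm, not the authors' tool, and the transactions in any example are hypothetical:

```python
def apriori(transactions, min_support):
    """Minimal Apriori frequent-itemset miner.

    transactions: list of sets of items; min_support: fraction in [0, 1].
    Returns {frozenset(itemset): support} for all frequent itemsets.
    """
    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / len(transactions)

    # Level 1: frequent single items.
    items = {frozenset([i]) for t in transactions for i in t}
    candidates = {s for s in items if support(s) >= min_support}
    frequent, k = {}, 1
    while candidates:
        frequent.update({c: support(c) for c in candidates})
        # Join step: merge frequent k-itemsets into (k+1)-candidates and
        # keep only those that still meet the support threshold.
        next_candidates = set()
        for a in candidates:
            for b in candidates:
                union = a | b
                if len(union) == k + 1 and support(union) >= min_support:
                    next_candidates.add(union)
        candidates, k = next_candidates, k + 1
    return frequent
```

    In the paper's setting the "items" would be text features or event labels co-occurring within an interval of interest rather than market basket items.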

  16. Evaluation of a rapid quantitative determination method of PSA concentration with gold immunochromatographic strips.

    PubMed

    Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia

    2015-11-03

Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate-specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100%, 96.6%, 89.5%, 100%, and 97.4%, respectively. Reproducibility of the test was 99.2%, interobserver variation was 8%, and the false-positive rate was 3.4%. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so tests using this system can easily be performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min at a cost lower than that of conventional quantitative assays.
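
    The screening statistics quoted above follow directly from a 2x2 confusion matrix. A minimal sketch (the counts in the test are hypothetical, not the study's raw data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard screening-test statistics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),               # true-positive rate
        "specificity": tn / (tn + fp),               # true-negative rate
        "ppv": tp / (tp + fp),                       # positive predictive value
        "npv": tn / (tn + fn),                       # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }
```

    Note that the false-positive rate reported in the abstract is simply 1 - specificity.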

  17. A Data-Processing System for Quantitative Analysis in Speech Production. CLCS Occasional Paper No. 17.

    ERIC Educational Resources Information Center

    Chasaide, Ailbhe Ni; Davis, Eugene

    The data processing system used at Trinity College's Centre for Language and Communication Studies (Ireland) enables computer-automated collection and analysis of phonetic data and has many advantages for research on speech production. The system allows accurate handling of large quantities of data, eliminates many of the limitations of manual…

  18. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft-matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a method based on steady-state analysis with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement but slower than the recursive technique. The parameters of the equivalent system are used to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that must be avoided in the quantitative estimation of material properties.
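
    As a rough illustration of the equivalent-cantilever idea, the steady-state response of a driven, damped harmonic oscillator can be inverted to recover an effective stiffness and damping from a measured amplitude and phase lag. This is a generic textbook model used here as a stand-in, not the authors' estimator:

```python
import math

def equivalent_params(F0, omega, m, A, phi):
    """Invert the steady state of m*x'' + c*x' + k*x = F0*cos(omega*t).

    At steady state: A = F0 / sqrt((k - m*omega^2)^2 + (c*omega)^2)
    and tan(phi) = c*omega / (k - m*omega^2).  Given the measured
    amplitude A and phase lag phi, recover the effective k and c.
    """
    k = m * omega**2 + (F0 / A) * math.cos(phi)
    c = (F0 / A) * math.sin(phi) / omega
    return k, c
```

    In the dynamic-mode AFM setting, shifts in the recovered effective stiffness and damping relative to the free cantilever are what get interpreted as the storage and dissipative properties of the sample.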

  19. A novel CMOS image sensor system for quantitative loop-mediated isothermal amplification assays to detect food-borne pathogens.

    PubMed

    Wang, Tiantian; Kim, Sanghyo; An, Jeong Ho

    2017-02-01

Loop-mediated isothermal amplification (LAMP) is considered one of the alternatives to conventional PCR and offers an inexpensive, portable diagnostic system with minimal power consumption. The present work describes the application of LAMP to real-time photon detection and quantitative analysis of nucleic acids, integrated with a disposable complementary metal-oxide-semiconductor (CMOS) image sensor. This novel system works as an amplification-coupled detection platform, relying on a CMOS image sensor with the aid of computerized circuitry controlling the temperature and light sources. The CMOS image sensor captures the light passing through the sensor surface and converts it into digital units using an analog-to-digital converter (ADC). The system monitors the real-time photon variation caused by the color changes during amplification. Escherichia coli O157 was used as a proof-of-concept target for quantitative analysis, and the results were compared with those for Staphylococcus aureus and Salmonella enterica to confirm the efficiency of the system. The system detected various DNA concentrations of E. coli O157 in a short time (45 min), with a detection limit of 10 fg/μL. The low-cost, simple, and compact design, with low power consumption, represents a significant advance in the development of a portable, sensitive, user-friendly, real-time, quantitative analytical tool for point-of-care diagnosis. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    PubMed

    Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E

    2016-01-01

Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remain challenging for the general community, requiring specialized expertise and ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by, and validate, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  1. The application of high-speed cinematography for the quantitative analysis of equine locomotion.

    PubMed

    Fredricson, I; Drevemo, S; Dalin, G; Hjertën, G; Björne, K

    1980-04-01

Locomotor disorders constitute a serious problem in horse racing that will only be rectified by a better understanding of the causative factors associated with disturbances of gait. This study describes a system for the quantitative analysis of the locomotion of horses at speed. The method is based on high-speed cinematography with a semi-automatic system for analysis of the films. The recordings are made with a 16 mm high-speed camera run at 500 frames per second (fps), and the films are analysed with special film-reading equipment and a mini-computer. The time and linear gait variables are presented in tabular form, and the angles and trajectories of the joints and body segments are presented graphically.

  2. Benchmarking the performance of fixed-image receptor digital radiographic systems part 1: a novel method for image quality analysis.

    PubMed

    Lee, Kam L; Ireland, Timothy A; Bernardo, Michael

    2016-06-01

This is the first part of a two-part study benchmarking the performance of fixed digital radiographic general X-ray systems. This paper concentrates on reporting findings related to the quantitative analysis techniques used to establish comparative image quality metrics. A systematic technical comparison of the evaluated systems is presented in part two of this study. A novel quantitative image quality analysis method is presented with technical considerations addressed for peer review. The novel method was applied to seven general radiographic systems with four different makes of radiographic image receptor (12 image receptors in total). For the System Modulation Transfer Function (sMTF), the use of a grid was found to reduce veiling glare and decrease roll-off. The major contributor to sMTF degradation was found to be focal spot blurring. For the System Normalised Noise Power Spectrum (sNNPS), it was found that all systems examined had similar sNNPS responses. A mathematical model is presented to explain how the use of a stationary grid may cause a difference between horizontal and vertical sNNPS responses.

  3. [Methods of quantitative proteomics].

    PubMed

    Kopylov, A T; Zgoda, V G

    2007-01-01

In modern science, proteomic analysis is inseparable from the other fields of systems biology. With its huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing both chemical modification of proteins and metabolic and enzymatic isotope labeling.

  4. Examining a Paradigm Shift in Organic Depot-Level Software Maintenance for Army Communications and Electronics Equipment

    DTIC Science & Technology

    2015-05-30

    study used quantitative and qualitative analytical methods in the examination of software versus hardware maintenance trends and forecasts, human and...financial resources at TYAD and SEC, and overall compliance with Title 10 mandates (e.g., 10 USC 2466). Quantitative methods were executed by...Systems (PEO EIS). These methods will provide quantitative-based analysis on which to base and justify trends and gaps, as well as qualitative methods

  5. Quantitative Analysis of the Trends Exhibited by the Three Interdisciplinary Biological Sciences: Biophysics, Bioinformatics, and Systems Biology.

    PubMed

    Kang, Jonghoon; Park, Seyeon; Venkat, Aarya; Gopinath, Adarsh

    2015-12-01

    New interdisciplinary biological sciences like bioinformatics, biophysics, and systems biology have become increasingly relevant in modern science. Many papers have suggested the importance of adding these subjects, particularly bioinformatics, to an undergraduate curriculum; however, most of their assertions have relied on qualitative arguments. In this paper, we will show our metadata analysis of a scientific literature database (PubMed) that quantitatively describes the importance of the subjects of bioinformatics, systems biology, and biophysics as compared with a well-established interdisciplinary subject, biochemistry. Specifically, we found that the development of each subject assessed by its publication volume was well described by a set of simple nonlinear equations, allowing us to characterize them quantitatively. Bioinformatics, which had the highest ratio of publications produced, was predicted to grow between 77% and 93% by 2025 according to the model. Due to the large number of publications produced in bioinformatics, which nearly matches the number published in biochemistry, it can be inferred that bioinformatics is almost equal in significance to biochemistry. Based on our analysis, we suggest that bioinformatics be added to the standard biology undergraduate curriculum. Adding this course to an undergraduate curriculum will better prepare students for future research in biology.

  6. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, both causes and effects, were represented as nodes in a Bayesian network. The approach thus transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, the nodes most critical to system reliability can be identified. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can then be determined to reduce their influence and improve system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve product quality.
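
The reliability evaluation described above can be illustrated with a toy two-node network evaluated by exhaustive enumeration. The event names and probabilities below are hypothetical; a real analysis would use tools such as GeNie or AgenaRisk on much larger networks.

```python
from itertools import product

# Hypothetical prior probabilities of two root risk events.
P_ROOT = {"contamination": 0.02, "sterilization_fault": 0.01}

def p_fail(contamination, sterilization_fault):
    # Hypothetical conditional probability of batch failure
    # given the states of the two root events.
    if contamination and sterilization_fault:
        return 0.95
    if contamination:
        return 0.60
    if sterilization_fault:
        return 0.40
    return 0.001

def system_failure_prob(p_root):
    # Law of total probability over all joint states of the roots.
    total = 0.0
    for c, s in product([True, False], repeat=2):
        pc = p_root["contamination"] if c else 1 - p_root["contamination"]
        ps = p_root["sterilization_fault"] if s else 1 - p_root["sterilization_fault"]
        total += pc * ps * p_fail(c, s)
    return total

def node_importance(node):
    # Increase in system risk when one root event is made certain.
    forced = dict(P_ROOT, **{node: 1.0})
    return system_failure_prob(forced) - system_failure_prob(P_ROOT)
```

Ranking nodes by `node_importance` identifies where a risk minimization plan has the most leverage.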

  7. The SCHEIE Visual Field Grading System

    PubMed Central

    Sankar, Prithvi S.; O’Keefe, Laura; Choi, Daniel; Salowe, Rebecca; Miller-Ellis, Eydie; Lehman, Amanda; Addis, Victoria; Ramakrishnan, Meera; Natesh, Vikas; Whitehead, Gideon; Khachatryan, Naira; O’Brien, Joan

    2017-01-01

    Objective: No method of grading visual field (VF) defects has been widely accepted throughout the glaucoma community. The SCHEIE (Systematic Classification of Humphrey visual fields-Easy Interpretation and Evaluation) grading system for glaucomatous visual fields was created to convey qualitative and quantitative information regarding visual field defects in an objective, reproducible, and easily applicable manner for research purposes. Methods: The SCHEIE grading system is composed of a qualitative and a quantitative score. The qualitative score consists of designation in one or more of the following categories: normal, central scotoma, paracentral scotoma, paracentral crescent, temporal quadrant, nasal quadrant, peripheral arcuate defect, expansive arcuate, or altitudinal defect. The quantitative component incorporates the Humphrey visual field index (VFI), the location of visual defects in the superior and inferior hemifields, and blind spot involvement. Accuracy and speed of grading using the qualitative and quantitative components were calculated for non-physician graders. Results: Graders had a median accuracy of 96.67% for their qualitative scores and a median accuracy of 98.75% for their quantitative scores. Graders took a mean of 56 seconds per visual field to assign a qualitative score and 20 seconds per visual field to assign a quantitative score. Conclusion: The SCHEIE grading system is a reproducible tool that combines qualitative and quantitative measurements to grade glaucomatous visual field defects. The system aims to standardize clinical staging and to make specific visual field defects more easily identifiable. Specific patterns of visual field loss may also be associated with genetic variants in future genetic analysis. PMID:28932621

  8. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by immediate detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard.
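
The contrast between a pattern-sensitive and a content-sensitive similarity can be sketched as follows. These are generic formulations for illustration, not necessarily the exact LQPM equations used by the authors.

```python
import math

def qualitative_similarity(ref, sample):
    # Cosine similarity between fingerprint vectors: sensitive to the
    # peak pattern but blind to a uniform change in concentration.
    dot = sum(r * s for r, s in zip(ref, sample))
    norm_r = math.sqrt(sum(r * r for r in ref))
    norm_s = math.sqrt(sum(s * s for s in sample))
    return dot / (norm_r * norm_s)

def quantitative_ratio(ref, sample):
    # Ratio of total peak response: sensitive to overall content.
    return sum(sample) / sum(ref)
```

A sample with every peak doubled scores a perfect qualitative similarity yet a quantitative ratio of 2, which is why a purely qualitative measure cannot discriminate such samples.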

  9. Quantitative analysis of crystalline pharmaceuticals in tablets by pattern-fitting procedure using X-ray diffraction pattern.

    PubMed

    Takehira, Rieko; Momose, Yasunori; Yamamura, Shigeo

    2010-10-15

    A pattern-fitting procedure using an X-ray diffraction pattern was applied to the quantitative analysis of a binary system of crystalline pharmaceuticals in tablets. Orthorhombic crystals of isoniazid (INH) and mannitol (MAN) were used for the analysis. Tablets were prepared under various compression pressures using a direct compression method with various compositions of INH and MAN. Assuming that the X-ray diffraction pattern of the INH-MAN system consists of the diffraction intensities from the respective crystals, the observed diffraction intensities were fitted to an analytic expression based on X-ray diffraction theory and separated into the two intensities from the INH and MAN crystals by a nonlinear least-squares procedure. After separation, the INH contents were determined using the optimized normalization constants for INH and MAN. A correction parameter encompassing all the factors that are beyond experimental control was required for quantitative analysis without a calibration curve. The pattern-fitting procedure made it possible to determine crystalline phases over the range of 10-90% (w/w) INH content. Further, certain characteristics of the crystals in the tablets, such as preferred orientation, crystallite size, and lattice disorder, were determined simultaneously. This method can be adopted to analyze compounds whose crystal structures are known. It is a potentially powerful tool for the quantitative phase analysis and characterization of crystals in tablets and powders using X-ray diffraction patterns. Copyright 2010 Elsevier B.V. All rights reserved.
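
The core idea of separating an observed pattern into two pure-phase contributions can be reduced to a one-parameter least-squares fit. The sketch below (with toy patterns) ignores the preferred-orientation, crystallite-size, and lattice-disorder terms that the full nonlinear procedure refines.

```python
def fit_weight_fraction(observed, phase_a, phase_b):
    # Model: observed[i] ~= w * phase_a[i] + (1 - w) * phase_b[i].
    # Minimising the squared residual over w has a closed-form solution.
    num = sum((o - b) * (a - b)
              for o, a, b in zip(observed, phase_a, phase_b))
    den = sum((a - b) ** 2 for a, b in zip(phase_a, phase_b))
    return num / den
```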

  10. Efficiency and Equity within European Education Systems and School Choice Policy: Bridging Qualitative and Quantitative Approaches

    ERIC Educational Resources Information Center

    Poder, Kaire; Kerem, Kaie; Lauri, Triin

    2013-01-01

    We seek out the good institutional features of the European choice policies that can enhance both equity and efficiency at the system level. For causality analysis we construct the typology of 28 European educational systems by using fuzzy-set analysis. We combine five independent variables to indicate institutional features of school choice…

  11. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  12. Corporatized Higher Education: A Quantitative Study Examining Faculty Motivation Using Self-Determination Theory

    ERIC Educational Resources Information Center

    Brown, Aaron D.

    2016-01-01

    The intent of this research is to offer a quantitative analysis of self-determined faculty motivation within the current corporate model of higher education across public and private research universities. With such a heightened integration of accountability structures, external reward systems, and the ongoing drive for more money and…

  13. Printing 2-dimentional droplet array for single-cell reverse transcription quantitative PCR assay with a microfluidic robot.

    PubMed

    Zhu, Ying; Zhang, Yun-Xia; Liu, Wen-Wen; Ma, Yan; Fang, Qun; Yao, Bo

    2015-04-01

    This paper describes a nanoliter droplet array-based single-cell reverse transcription quantitative PCR (RT-qPCR) assay method for quantifying gene expression in individual cells. By sequentially printing nanoliter-scale droplets on a microchip using a microfluidic robot, all liquid-handling operations, including cell encapsulation, lysis, reverse transcription, and quantitative PCR with real-time fluorescence detection, can be achieved automatically. The inhibitory effect of the cell suspension buffer on the RT-PCR assay was comprehensively studied to achieve high-sensitivity gene quantification. The present system was applied to the quantitative measurement of the expression level of mir-122 in single Huh-7 cells. A wide distribution of mir-122 expression in single cells, from 3061 copies/cell to 79998 copies/cell, was observed, showing a high level of cell heterogeneity. With the advantages of fully automated liquid handling, a simple system structure, and flexibility in achieving multi-step operations, the present method provides a novel liquid-handling mode for single-cell gene expression analysis and has significant potential in transcriptional identification and rare cell analysis.
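
Converting a qPCR quantification cycle into an absolute copy number is conventionally done with a standard curve. The sketch below uses hypothetical calibration constants and is not the calibration reported in this paper.

```python
def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    # Standard curve: Cq = slope * log10(copies) + intercept,
    # solved here for copies. Slope and intercept are hypothetical;
    # a slope of -3.32 corresponds to 100% amplification efficiency.
    return 10 ** ((cq - intercept) / slope)
```

Lower Cq values map exponentially to higher copy numbers, which is how per-cell copy counts such as those quoted above are derived from fluorescence data.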

  14. Printing 2-Dimentional Droplet Array for Single-Cell Reverse Transcription Quantitative PCR Assay with a Microfluidic Robot

    PubMed Central

    Zhu, Ying; Zhang, Yun-Xia; Liu, Wen-Wen; Ma, Yan; Fang, Qun; Yao, Bo

    2015-01-01

    This paper describes a nanoliter droplet array-based single-cell reverse transcription quantitative PCR (RT-qPCR) assay method for quantifying gene expression in individual cells. By sequentially printing nanoliter-scale droplets on a microchip using a microfluidic robot, all liquid-handling operations, including cell encapsulation, lysis, reverse transcription, and quantitative PCR with real-time fluorescence detection, can be achieved automatically. The inhibitory effect of the cell suspension buffer on the RT-PCR assay was comprehensively studied to achieve high-sensitivity gene quantification. The present system was applied to the quantitative measurement of the expression level of mir-122 in single Huh-7 cells. A wide distribution of mir-122 expression in single cells, from 3061 copies/cell to 79998 copies/cell, was observed, showing a high level of cell heterogeneity. With the advantages of fully automated liquid handling, a simple system structure, and flexibility in achieving multi-step operations, the present method provides a novel liquid-handling mode for single-cell gene expression analysis and has significant potential in transcriptional identification and rare cell analysis. PMID:25828383

  15. Systems Biology-Driven Hypotheses Tested In Vivo: The Need to Advancing Molecular Imaging Tools.

    PubMed

    Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe

    2018-01-01

    Processing and interpretation of biological images may provide invaluable insights into complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key quantitative morphological parameters could, at least in principle, help in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentation focused on molecular image processing and analysis. Quantitative data analysis through various open-source software packages and algorithmic protocols will provide a novel approach for modeling the experimental research program. We also highlight predictable future trends in methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical expressions underlying in-silico systems biology models.

  16. Safe Passage Data Analysis: Interim Report

    DOT National Transportation Integrated Search

    1993-04-01

    The purpose of this report is to describe quantitatively the costs and benefits of screener proficiency evaluation and reporting systems (SPEARS) equipment, particularly computer-based instruction (CBI) systems, compared to current methods of tra...

  17. Quantitative analysis for peripheral vascularity assessment based on clinical photoacoustic and ultrasound images

    NASA Astrophysics Data System (ADS)

    Murakoshi, Dai; Hirota, Kazuhiro; Ishii, Hiroyasu; Hashimoto, Atsushi; Ebata, Tetsurou; Irisawa, Kaku; Wada, Takatsugu; Hayakawa, Toshiro; Itoh, Kenji; Ishihara, Miya

    2018-02-01

    Photoacoustic (PA) imaging technology is expected to be applied to the clinical assessment of peripheral vascularity. We started a clinical evaluation with the prototype PA imaging system we recently developed. The prototype PA imaging system was composed of an in-house Q-switched Alexandrite laser system emitting short laser pulses at a 750 nm wavelength, a handheld ultrasound transducer with integrated illumination optics, and signal processing for PA image reconstruction implemented in the clinical ultrasound (US) system. For the quantitative assessment of PA images, an image-analyzing function was developed and applied to clinical PA images. In this analyzing function, vascularity derived from PA signal intensity within a prescribed threshold range was defined as a numerical index of vessel fulfillment and calculated for a prescribed region of interest (ROI). The skin surface was automatically detected by utilizing the B-mode image acquired simultaneously with the PA image. The skin-surface position is utilized to place the ROI objectively while avoiding unwanted signals, such as artifacts caused by melanin pigment in the epidermal layer, which absorbs the laser emission and generates strong PA signals. Multiple images were available to support the scanned image set for 3D viewing. PA images of several fingers of patients with systemic sclerosis (SSc) were quantitatively assessed. Since the artifact region is trimmed off in the PA images, the visibility of vessels with rather low PA signal intensity in the 3D projection image was enhanced and the reliability of the quantitative analysis was improved.

  18. Quantitative Synthesis and Component Analysis of Single-Participant Studies on the Picture Exchange Communication System

    ERIC Educational Resources Information Center

    Tincani, Matt; Devis, Kathryn

    2011-01-01

    The "Picture Exchange Communication System" (PECS) has emerged as the augmentative communication intervention of choice for individuals with autism spectrum disorder (ASD), with a supporting body of single-participant studies. This report describes a meta-analysis of 16 single-participant studies on PECS with percentage of nonoverlapping data…

  19. Analysis of Radio Frequency Surveillance Systems for Air Traffic Control : Volume 1. Text.

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  20. Railroad classification yard technology : assessment of car speed control systems

    DOT National Transportation Integrated Search

    1980-12-01

    The scope of this study has encompassed an evaluation of fourteen yard speed control devices, an identification of four generic speed control systems, a qualitative assessment of the four systems, and finally a quantitative analysis of three hy...

  1. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data on signaling events with high spatio-temporal resolution. A novel technique for obtaining such data is needed for the systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high-precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as reproducibility comparable to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique appears to be a highly quantitative and versatile technique that can be a convenient replacement for conventional techniques including western blotting, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  2. A high-throughput urinalysis of abused drugs based on a SPE-LC-MS/MS method coupled with an in-house developed post-analysis data treatment system.

    PubMed

    Cheng, Wing-Chi; Yau, Tsan-Sang; Wong, Ming-Kei; Chan, Lai-Ping; Mok, Vincent King-Kuen

    2006-10-16

    A rapid urinalysis system based on SPE-LC-MS/MS with an in-house post-analysis data management system has been developed for the simultaneous identification and semi-quantitation of opiates (morphine, codeine), methadone, amphetamines (amphetamine, methylamphetamine (MA), 3,4-methylenedioxyamphetamine (MDA) and 3,4-methylenedioxymethamphetamine (MDMA)), 11 benzodiazepines or their metabolites, and ketamine. The urine samples are subjected to automated solid phase extraction prior to analysis by LC-MS (Finnigan Surveyor LC connected to a Finnigan LCQ Advantage) fitted with an Alltech Rocket Platinum EPS C-18 column. With a single-point calibration at the cut-off concentration for each analyte, simultaneous identification and semi-quantitation of the above-mentioned drugs can be achieved in a 10 min run per urine sample. A computer macro-program package was developed to automatically retrieve the appropriate data from the analytical data files, compare the results with preset values (such as cut-off concentrations and MS matching scores) for each drug being analyzed, and generate user-defined Excel reports that indicate all positive and negative results batch-wise for ease of checking. The final analytical results are automatically copied into an Access database for report generation purposes. Through the use of automated sample preparation, simultaneous identification and semi-quantitation by LC-MS/MS, and a tailor-made post-analysis data management system, this new urinalysis system significantly improves the quality of results, reduces post-analysis data treatment time and data transfer errors, and is suitable for high-throughput laboratories operating batch-wise.
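
Single-point calibration at the cut-off concentration reduces semi-quantitation to a simple proportionality. This sketch assumes a linear detector response; the numbers are hypothetical.

```python
def semi_quantitate(peak_response, cutoff_response, cutoff_conc):
    # One calibrator at the cut-off: concentration is estimated by
    # scaling the cut-off concentration with the response ratio.
    return peak_response / cutoff_response * cutoff_conc
```

A result at or above the cut-off concentration would then be reported as a presumptive positive for confirmation.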

  3. Quantitative Analysis Tools and Digital Phantoms for Deformable Image Registration Quality Assurance.

    PubMed

    Kim, Haksoo; Park, Samuel B; Monroe, James I; Traughber, Bryan J; Zheng, Yiran; Lo, Simon S; Yao, Min; Mansur, David; Ellis, Rodney; Machtay, Mitchell; Sohn, Jason W

    2015-08-01

    This article proposes quantitative analysis tools and digital phantoms to quantify the intrinsic errors of deformable image registration (DIR) systems and establish quality assurance (QA) procedures for clinical use of DIR systems, utilizing local and global error analysis methods with clinically realistic digital image phantoms. Landmark-based image registration verifications are suitable only for images with significant feature points. To address this shortfall, we adapted a deformation vector field (DVF) comparison approach with new analysis techniques to quantify the results. Digital image phantoms are derived from data sets of actual patient images (a reference image set R and a test image set T). Image sets from the same patient taken at different times are registered with deformable methods, producing a reference DVFref. Applying DVFref to the original reference image deforms T into a new image R'. The data set (R', T, and DVFref) forms a realistic truth set and can therefore be used to analyze any DIR system and expose intrinsic errors by comparing DVFref and DVFtest. For quantitative error analysis, two methods were used to calculate and delineate the differences between DVFs: (1) a local error analysis tool that displays deformation error magnitudes with color mapping on each image slice, and (2) a global error analysis tool that calculates a deformation error histogram, which describes a cumulative probability function of errors for each anatomical structure. Three digital image phantoms were generated from three patients with head and neck, lung, and liver cancers. The DIR QA was evaluated using the head and neck case. © The Author(s) 2014.
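
The local and global error analyses can be sketched on toy deformation vector fields: per-voxel error magnitudes for the colour map, and the fraction of voxels under a threshold as one point of the cumulative error histogram. The field shapes and values here are illustrative only.

```python
import math

def error_magnitudes(dvf_ref, dvf_test):
    # Per-voxel magnitude of the difference between two DVFs,
    # each given as a list of (dx, dy, dz) vectors.
    return [math.dist(r, t) for r, t in zip(dvf_ref, dvf_test)]

def cumulative_fraction_below(errors, threshold):
    # One point of the cumulative probability function of errors
    # used for the global (histogram-based) analysis.
    return sum(e < threshold for e in errors) / len(errors)
```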

  4. Approaching human language with complex networks.

    PubMed

    Cong, Jin; Liu, Haitao

    2014-12-01

    The interest in modeling and analyzing human language with complex networks has been on the rise in recent years, and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics). Copyright © 2014 Elsevier B.V. All rights reserved.
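
One common construction in this line of work is a word co-occurrence network. A minimal sketch follows (adjacency within a sliding window; variant details differ across the studies surveyed):

```python
from collections import defaultdict

def cooccurrence_network(sentences, window=1):
    # Words are nodes; co-occurrence within `window` positions
    # creates an undirected edge.
    adj = defaultdict(set)
    for words in sentences:
        for i, w in enumerate(words):
            for j in range(i + 1, min(i + 1 + window, len(words))):
                adj[w].add(words[j])
                adj[words[j]].add(w)
    return adj

def average_degree(adj):
    # A simple system-level quantitative measure of the network.
    return sum(len(nbrs) for nbrs in adj.values()) / len(adj)
```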

  5. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observation, this method of quantitative epistasis enabled the detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
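
A standard way to score such pairwise interactions is a multiplicative null model: the interaction is the deviation of the double perturbation's normalised phenotype from the product of the singles'. The sketch below illustrates this generic score, not the authors' full statistical pipeline.

```python
def epistasis_score(w_a, w_b, w_ab, w_wt=1.0):
    # Normalise phenotypes to wild type; under the multiplicative
    # null model the expected double-perturbation value is the
    # product of the single-perturbation values.
    a, b, ab = w_a / w_wt, w_b / w_wt, w_ab / w_wt
    return ab - a * b
```

A score near zero indicates no interaction; aggravating and alleviating interactions give negative and positive scores, respectively.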

  6. Quantitative analysis of aircraft multispectral-scanner data and mapping of water-quality parameters in the James River in Virginia

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Bahn, G. S.

    1977-01-01

    Statistical analysis techniques were applied to develop quantitative relationships between in situ river measurements and remotely sensed data obtained over the James River in Virginia on 28 May 1974. The remotely sensed data were collected with a multispectral scanner and with photographs taken from an aircraft platform. Concentration differences among water quality parameters such as suspended sediment, chlorophyll a, and nutrients produced significant spectral variations. Calibrated equations from the multiple regression analysis were used to develop maps that indicated the quantitative distributions of water quality parameters and the dispersion characteristics of a pollutant plume entering the turbid river system. Results from further analyses using only three preselected multispectral scanner bands indicated that regression coefficients and standard errors of estimate were not appreciably degraded compared with results from the 10-band analysis.
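
The calibrated equations referred to above are multiple linear regressions of in situ measurements on scanner radiances. A one-band sketch with ordinary least squares (toy numbers) shows the core step; the multiband case generalises via the normal equations.

```python
def ols_fit(xs, ys):
    # Ordinary least squares for y = a + b * x, e.g. one water-quality
    # parameter regressed on one spectral band's radiance.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b
```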

  7. Analysis of Fringe Field Formed Inside LDA Measurement Volume Using Compact Two Hololens Imaging Systems

    NASA Astrophysics Data System (ADS)

    Ghosh, Abhijit; Nirala, A. K.; Yadav, H. L.

    2018-03-01

    We have designed and fabricated four LDA optical setups consisting of aberration-compensated compact two-hololens imaging systems of four different types. We have experimentally investigated and realized a hololens recording geometry which is an interferogram of a converging spherical wavefront with a mutually coherent planar wavefront. The proposed real-time monitoring and actual fringe field analysis techniques allow complete characterization of the fringes formed at the measurement volume and permit evaluation of beam quality, alignment and fringe uniformity with greater precision. After experimentally analyzing the fringes formed at the measurement volume by all four imaging systems, it is found that the fringes obtained using the compact two-hololens imaging systems are improved both qualitatively and quantitatively compared with those obtained using the conventional imaging system. Results indicate a qualitative improvement in the non-uniformity of fringe thickness and in the micro intensity variations perpendicular to the fringes, and a quantitative improvement of 39.25% in the overall average normalized standard deviation of the fringe width formed by the compact two-hololens imaging systems compared with that of the conventional imaging system.
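
The normalized standard deviation of fringe width quoted above is a coefficient of variation. A minimal sketch of that uniformity metric (with toy widths):

```python
import math

def normalized_std(widths):
    # Coefficient of variation of fringe widths: population standard
    # deviation divided by the mean. Lower means more uniform fringes.
    n = len(widths)
    mean = sum(widths) / n
    var = sum((w - mean) ** 2 for w in widths) / n
    return math.sqrt(var) / mean
```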

  8. Global quantitative analysis of phosphorylation underlying phencyclidine signaling and sensorimotor gating in the prefrontal cortex.

    PubMed

    McClatchy, D B; Savas, J N; Martínez-Bartolomé, S; Park, S K; Maher, P; Powell, S B; Yates, J R

    2016-02-01

    Prepulse inhibition (PPI) is an example of sensorimotor gating and deficits in PPI have been demonstrated in schizophrenia patients. Phencyclidine (PCP) suppression of PPI in animals has been studied to elucidate the pathological elements of schizophrenia. However, the molecular mechanisms underlying PCP treatment or PPI in the brain are still poorly understood. In this study, quantitative phosphoproteomic analysis was performed on the prefrontal cortex from rats that were subjected to PPI after being systemically injected with PCP or saline. PCP downregulated phosphorylation events were significantly enriched in proteins associated with long-term potentiation (LTP). Importantly, this data set identifies functionally novel phosphorylation sites on known LTP-associated signaling molecules. In addition, mutagenesis of a significantly altered phosphorylation site on xCT (SLC7A11), the light chain of system xc-, the cystine/glutamate antiporter, suggests that PCP also regulates the activity of this protein. Finally, new insights were also derived on PPI signaling independent of PCP treatment. This is the first quantitative phosphorylation proteomic analysis providing new molecular insights into sensorimotor gating.

  9. A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework

    NASA Astrophysics Data System (ADS)

    Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo

    An efficient Multidisciplinary Design Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, and Data Base Management Systems (DBMS) in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. Identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one) were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improving the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
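    The AHP step described in this record can be illustrated with a short sketch: priority weights for framework criteria are obtained as the normalized principal eigenvector of a pairwise-comparison matrix. The matrix values below are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three framework criteria
# on Saaty's 1-9 scale; values are illustrative only.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

def ahp_weights(A, iters=100):
    """Priority weights = normalized principal eigenvector of A (power iteration)."""
    w = np.ones(A.shape[0]) / A.shape[0]
    for _ in range(iters):
        w = A @ w
        w = w / w.sum()
    return w

w = ahp_weights(A)
# Consistency index: (lambda_max - n) / (n - 1); near zero means the
# pairwise judgments are nearly transitive.
lam = float((A @ w / w).mean())
ci = (lam - len(w)) / (len(w) - 1)
print(w, round(ci, 4))
```

A consistency index near zero indicates nearly transitive judgments; Saaty's usual guideline is to revisit matrices whose consistency ratio exceeds about 0.1.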

  10. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; Zheng, Lu; Jiang, Zhanzhi; Ganesan, Vishal; Wang, Yayu; Lai, Keji

    2018-04-01

    We report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  11. An analysis of radio frequency surveillance systems for air traffic control volume II: appendixes

    DOT National Transportation Integrated Search

    1976-02-01

    Performance criteria that afford quantitative evaluation of a variety of current and proposed configurations of the Air Traffic Control Radar Beacon System (ATCRBS) are described in detail. Two analytic system models are developed to allow applicatio...

  12. An importance-performance analysis of hospital information system attributes: A nurses' perspective.

    PubMed

    Cohen, Jason F; Coleman, Emma; Kangethe, Matheri J

    2016-02-01

    Health workers have numerous concerns about hospital information system (HIS) usage. Addressing these concerns requires understanding the system attributes most important to their satisfaction and productivity. Following a recent HIS implementation, our objective was to identify priorities for managerial intervention based on user evaluations of the performance of the HIS attributes as well as the relative importance of these attributes to user satisfaction and productivity outcomes. We collected data along a set of attributes representing system quality, data quality, information quality, and service quality from 154 nurse users. Their quantitative responses were analysed using the partial least squares approach followed by an importance-performance analysis. Qualitative responses were analysed using thematic analysis to triangulate and supplement the quantitative findings. Two system quality attributes (responsiveness and ease of learning), one information quality attribute (detail), one service quality attribute (sufficient support), and three data quality attributes (records complete, accurate and never missing) were identified as high priorities for intervention. Our application of importance-performance analysis is unique in HIS evaluation and we have illustrated its utility for identifying those system attributes for which underperformance is not acceptable to users and therefore should be high priorities for intervention. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
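    The importance-performance analysis used in this study can be sketched as a quadrant classification: each attribute's mean performance rating is compared with its estimated importance, and attributes are placed in quadrants relative to the grand means of the two axes. The attribute names and scores below are illustrative, not the study's data.

```python
import statistics

# Illustrative attribute scores (hypothetical): performance = mean user
# rating, importance = estimated effect on satisfaction/productivity.
attrs = {
    "responsiveness":   {"performance": 3.1, "importance": 0.62},
    "ease_of_learning": {"performance": 3.0, "importance": 0.55},
    "record_accuracy":  {"performance": 2.8, "importance": 0.70},
    "screen_colours":   {"performance": 4.2, "importance": 0.15},
}

def ipa_quadrants(attrs):
    """Classify each attribute against the grand means of the two axes."""
    p_bar = statistics.mean(a["performance"] for a in attrs.values())
    i_bar = statistics.mean(a["importance"] for a in attrs.values())
    out = {}
    for name, a in attrs.items():
        high_i = a["importance"] >= i_bar
        high_p = a["performance"] >= p_bar
        if high_i and not high_p:
            out[name] = "concentrate here"      # high priority for intervention
        elif high_i and high_p:
            out[name] = "keep up the good work"
        elif not high_i and high_p:
            out[name] = "possible overkill"
        else:
            out[name] = "low priority"
    return out

print(ipa_quadrants(attrs))
```

Attributes landing in the "concentrate here" quadrant (important to users but underperforming) are the ones flagged as priorities for managerial intervention.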

  13. [Quantitative histoenzymatic analysis of the adenohypophysis and adrenal cortex during the early stages of involution].

    PubMed

    Prochukhanov, R A; Rostovtseva, T I

    1977-11-01

    A method of quantitative histoenzymatic analysis was applied to determine involutional changes of the neuroendocrine system. The activity of NAD- and NADP-reductases, acid and alkaline phosphatases, glucose-6-phosphate dehydrogenase, 3-OH-steroid dehydrogenase, and 11-hydroxysteroid dehydrogenases was investigated in the adenohypophysis and the adrenal cortex of rats aged 4 and 12 months. Peculiarities of the structural-metabolic support of physiological rearrangements of the neuroendocrine system over the estrous cycle at the early involution stages were revealed. An initial reduction of cellular-vascular transport with retention of the functional activity of the intracellular organelles was demonstrated in ageing animals.

  14. Qualitative and quantitative processing of side-scan sonar data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dwan, F.S.; Anderson, A.L.; Hilde, T.W.C.

    1990-06-01

    Modern side-scan sonar systems allow vast areas of seafloor to be rapidly imaged and quantitatively mapped in detail. Remote sensing image processing techniques can be used to correct for various distortions inherent in raw sonography. Corrections are possible for water column, slant-range, aspect ratio, speckle and striping noise, multiple returns, power drop-off, and georeferencing. The final products reveal seafloor features and patterns that are geometrically correct, georeferenced, and have an improved signal/noise ratio. These products can be merged with other georeferenced databases for further database management and information extraction. In order to compare data collected by different systems from a common area, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made for calibrated sonar system and bathymetry effects. Such data inversion must account for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area, and grazing angle effects. Seafloor classification can then be performed on the calculated back-scattering strength using Lambert's law and regression analysis. Examples are given using both approaches: image analysis and inversion of data based on the sonar equation.

  15. Automated Simulation For Analysis And Design

    NASA Technical Reports Server (NTRS)

    Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.

    1992-01-01

    Design Assistant Workstation (DAWN) software being developed to facilitate simulation of qualitative and quantitative aspects of behavior of life-support system in spacecraft, chemical-processing plant, heating and cooling system of large building, or any of variety of systems including interacting process streams and processes. Used to analyze alternative design scenarios or specific designs of such systems. Expert system will automate part of design analysis: reason independently by simulating design scenarios and return to designer with overall evaluations and recommendations.

  16. Products of combustion of non-metallic materials

    NASA Technical Reports Server (NTRS)

    Perry, Cortes L.

    1995-01-01

    The objective of this project is to evaluate methodologies for the qualitative and quantitative determination of the gaseous products of combustion of non-metallic materials of interest to the aerospace community. The goal is to develop instrumentation and analysis procedures which qualitatively and quantitatively identify gaseous products evolved by thermal decomposition and provide NASA a detailed system operating procedure.

  17. The Application of Operations Research Techniques to the Evaluation of Military Management Information Systems.

    DTIC Science & Technology

    systems such as management information systems. To provide a methodology yielding quantitative results which may assist a commander and his staff in...this analysis, it is proposed that management information systems be evaluated as a whole by a technique defined as the semantic differential. Each

  18. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as the Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  19. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once the reference fingerprint (RFP) has been developed from a standard preparation in an immediate detection mode and the composition similarities have been calculated, the LQPM can employ a classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425

  20. DEVELOPMENT OF AN IN SITU THERMAL EXTRACTION DETECTION SYSTEM (TEDS) FOR RAPID, ACCURATE, QUANTITATIVE ANALYSIS OF ENVIRONMENTAL POLLUTANTS IN THE SUBSURFACE - PHASE I

    EPA Science Inventory

    Ion Signature Technology, Inc. (IST) will develop and market a collection and analysis system that will retrieve soil-bound pollutants as well as soluble and non-soluble contaminants from groundwater as the probe is pushed by cone penetrometry or Geoprobe into the subsurface. ...

  1. Web-Based Essay Critiquing System and EFL Students' Writing: A Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Lee, Cynthia; Wong, Kelvin C. K.; Cheung, William K.; Lee, Fion S. L.

    2009-01-01

    The paper first describes a web-based essay critiquing system developed by the authors using latent semantic analysis (LSA), an automatic text analysis technique, to provide students with immediate feedback on content and organisation for revision whenever there is an internet connection. It reports on its effectiveness in enhancing adult EFL…

  2. Integrated Approach To Design And Analysis Of Systems

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Iverson, David L.

    1993-01-01

    Object-oriented fault-tree representation unifies evaluation of reliability and diagnosis of faults. Programming/fault tree described more fully in "Object-Oriented Algorithm For Evaluation Of Fault Trees" (ARC-12731). Augmented fault tree object contains more information than fault tree object used in quantitative analysis of reliability. Additional information needed to diagnose faults in system represented by fault tree.

  3. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in the challenges, as well as differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  4. Developing a performance measurement approach to benefit/cost freight project prioritization.

    DOT National Transportation Integrated Search

    2014-10-01

    Future reauthorizations of the federal transportation bill will require a comprehensive and quantitative analysis of the freight benefits : of proposed freight system projects. To prioritize public investments in freight systems and to insure conside...

  5. The Brain Network for Deductive Reasoning: A Quantitative Meta-analysis of 28 Neuroimaging Studies

    PubMed Central

    Prado, Jérôme; Chadha, Angad; Booth, James R.

    2011-01-01

    Over the course of the past decade, contradictory claims have been made regarding the neural bases of deductive reasoning. Researchers have been puzzled by apparent inconsistencies in the literature. Some have even questioned the effectiveness of the methodology used to study the neural bases of deductive reasoning. However, the idea that neuroimaging findings are inconsistent is not based on any quantitative evidence. Here, we report the results of a quantitative meta-analysis of 28 neuroimaging studies of deductive reasoning published between 1997 and 2010, combining 382 participants. Consistent areas of activations across studies were identified using the multilevel kernel density analysis method. We found that results from neuroimaging studies are more consistent than what has been previously assumed. Overall, studies consistently report activations in specific regions of a left fronto-parietal system, as well as in the left basal ganglia. This brain system can be decomposed into three subsystems that are specific to particular types of deductive arguments: relational, categorical, and propositional. These dissociations explain inconsistencies in the literature. However, they are incompatible with the notion that deductive reasoning is supported by a single cognitive system relying either on visuospatial or rule-based mechanisms. Our findings provide critical insight into the cognitive organization of deductive reasoning and need to be accounted for by cognitive theories. PMID:21568632

  6. On the Need for Quantitative Bias Analysis in the Peer-Review Process.

    PubMed

    Fox, Matthew P; Lash, Timothy L

    2017-05-15

    Peer review is central to the process through which epidemiologists generate evidence to inform public health and medical interventions. Reviewers thereby act as critical gatekeepers to high-quality research. They are asked to carefully consider the validity of the proposed work or research findings by paying careful attention to the methodology and critiquing the importance of the insight gained. However, although many have noted problems with the peer-review system for both manuscripts and grant submissions, few solutions have been proposed to improve the process. Quantitative bias analysis encompasses all methods used to quantify the impact of systematic error on estimates of effect in epidemiologic research. Reviewers who insist that quantitative bias analysis be incorporated into the design, conduct, presentation, and interpretation of epidemiologic research could substantially strengthen the process. In the present commentary, we demonstrate how quantitative bias analysis can be used by investigators and authors, reviewers, funding agencies, and editors. By utilizing quantitative bias analysis in the peer-review process, editors can potentially avoid unnecessary rejections, identify key areas for improvement, and improve discussion sections by shifting from speculation on the impact of sources of error to quantification of the impact those sources of bias may have had. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Simultaneous quantitative analysis of main components in linderae reflexae radix with one single marker.

    PubMed

    Wang, Li-Li; Zhang, Yun-Bin; Sun, Xiao-Ya; Chen, Sui-Qing

    2016-05-08

    To establish a quantitative analysis of multi-components by a single marker (QAMS) method for quality evaluation, and to validate its feasibility by the simultaneous quantitative assay of four main components in Linderae Reflexae Radix. Four main components (pinostrobin, pinosylvin, pinocembrin, and 3,5-dihydroxy-2-(1-p-mentheneyl)-trans-stilbene) were selected as analytes to evaluate the quality by RP-HPLC coupled with a UV detector. The method was evaluated by comparing the quantitative results between the external standard method and QAMS on different HPLC systems. The results showed no significant differences in the quantitative results for the four components of Linderae Reflexae Radix determined by the external standard method and QAMS (RSD <3%). The contents of the four analytes (pinosylvin, pinocembrin, pinostrobin, and Reflexanbene I) in Linderae Reflexae Radix were determined using the single marker pinosylvin. The fingerprints were determined on Shimadzu LC-20AT and Waters e2695 HPLC systems equipped with three different columns.
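    The QAMS idea can be illustrated with a small numeric sketch: a relative correction factor links each analyte's detector response to that of the single marker, so at assay time only the marker requires a chemical standard. All peak areas and concentrations below are invented for illustration.

```python
# Hypothetical calibration data: peak areas measured at known concentrations.
marker_area, marker_conc = 1250.0, 50.0   # single marker (e.g. pinosylvin), illustrative
analyte_area, analyte_conc = 980.0, 40.0  # a second analyte standard, illustrative

# Relative correction factor of the analyte with respect to the marker:
# ratio of their response factors (area per unit concentration).
f = (analyte_area / analyte_conc) / (marker_area / marker_conc)

# In a test sample, only the marker is calibrated; the analyte concentration
# follows from its peak area, f, and the marker's response factor.
sample_marker_area = 1100.0
sample_analyte_area = 735.0
marker_in_sample = sample_marker_area * marker_conc / marker_area
analyte_in_sample = sample_analyte_area / (f * (marker_area / marker_conc))
print(round(f, 3), round(marker_in_sample, 1), round(analyte_in_sample, 1))
```

Validation in the record's sense amounts to checking that concentrations computed this way agree (RSD <3%) with those from a conventional external standard calibration.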

  8. A versatile pipeline for the multi-scale digital reconstruction and quantitative analysis of 3D tissue architecture

    PubMed Central

    Morales-Navarrete, Hernán; Segovia-Miranda, Fabián; Klukowski, Piotr; Meyer, Kirstin; Nonaka, Hidenori; Marsico, Giovanni; Chernykh, Mikhail; Kalaidzidis, Alexander; Zerial, Marino; Kalaidzidis, Yannis

    2015-01-01

    A prerequisite for the systems biology analysis of tissues is an accurate digital three-dimensional reconstruction of tissue structure based on images of markers covering multiple scales. Here, we designed a flexible pipeline for the multi-scale reconstruction and quantitative morphological analysis of tissue architecture from microscopy images. Our pipeline includes newly developed algorithms that address specific challenges of thick dense tissue reconstruction. Our implementation allows for a flexible workflow, scalable to high-throughput analysis and applicable to various mammalian tissues. We applied it to the analysis of liver tissue and extracted quantitative parameters of sinusoids, bile canaliculi and cell shapes, recognizing different liver cell types with high accuracy. Using our platform, we uncovered an unexpected zonation pattern of hepatocytes with different size, nuclei and DNA content, thus revealing new features of liver tissue organization. The pipeline also proved effective to analyse lung and kidney tissue, demonstrating its generality and robustness. DOI: http://dx.doi.org/10.7554/eLife.11214.001 PMID:26673893

  9. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. 
CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  10. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-12-15

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. 
CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  11. Noninvasive Biomonitoring Approaches to Determine Dosimetry and Risk Following Acute Chemical Exposure: Analysis of Lead or Organophosphate Insecticide in Saliva

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Chuck; Poet, Torka S.; Kousba, Ahmed A.

    2004-04-01

    There is a need to develop approaches for assessing risk associated with acute exposures to a broad range of chemical agents and to rapidly determine the potential implications for human health. Non-invasive biomonitoring approaches are being developed using reliable portable analytical systems to quantitate dosimetry utilizing readily obtainable body fluids, such as saliva. Saliva has been used to evaluate a broad range of biomarkers, drugs, and environmental contaminants including heavy metals and pesticides. To advance the application of non-invasive biomonitoring, a microfluidic/electrochemical device has also been developed for the analysis of lead (Pb) using square wave anodic stripping voltammetry. The system demonstrates a linear response over a broad concentration range (1-2000 ppb) and is capable of quantitating saliva Pb in rats orally administered acute doses of Pb-acetate. Appropriate pharmacokinetic analyses have been used to quantitate systemic dosimetry based on determination of saliva Pb concentrations. In addition, saliva has recently been used to quantitate dosimetry following exposure to the organophosphate insecticide chlorpyrifos in a rodent model system by measuring the major metabolite, trichloropyridinol, and saliva cholinesterase inhibition following acute exposures. These results suggest that technology developed for non-invasive biomonitoring can provide a sensitive and portable analytical tool capable of assessing exposure and risk in real time. By coupling these non-invasive technologies with pharmacokinetic modeling, it is feasible to rapidly quantitate acute exposure to a broad range of chemical agents. In summary, it is envisioned that once fully developed, these monitoring and modeling approaches will be useful for assessing acute exposure and health risk.
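    Quantitation by anodic stripping voltammetry typically rests on a linear calibration of stripping peak current against standard concentration; an unknown sample is then read off the fitted line. The sketch below uses invented calibration data spanning the 1-2000 ppb range mentioned in the record.

```python
# Hypothetical ASV calibration: stripping peak current (uA) for Pb standards
# (ppb); the values are illustrative and roughly linear by construction.
standards_ppb = [1, 10, 100, 500, 1000, 2000]
peak_current = [0.05, 0.52, 5.1, 25.4, 50.8, 101.2]

# Ordinary least-squares fit of current = slope * conc + intercept.
n = len(standards_ppb)
mean_x = sum(standards_ppb) / n
mean_y = sum(peak_current) / n
slope = (sum((x - mean_x) * (y - mean_y)
             for x, y in zip(standards_ppb, peak_current))
         / sum((x - mean_x) ** 2 for x in standards_ppb))
intercept = mean_y - slope * mean_x

# Quantify an unknown saliva sample from its measured peak current.
unknown_current = 12.7
pb_ppb = (unknown_current - intercept) / slope
print(round(slope, 4), round(pb_ppb, 1))
```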

  12. Use of local noise power spectrum and wavelet analysis in quantitative image quality assurance for EPIDs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Soyoung

    Purpose: To investigate the use of the local noise power spectrum (NPS) to characterize image noise and wavelet analysis to isolate defective pixels and inter-subpanel flat-fielding artifacts for quantitative quality assurance (QA) of electronic portal imaging devices (EPIDs). Methods: A total of 93 image sets including custom-made bar-pattern images and open exposure images were collected from four iViewGT a-Si EPID systems over three years. Global quantitative metrics such as the modulation transfer function (MTF), NPS, and detective quantum efficiency (DQE) were computed for each image set. The local NPS was also calculated for individual subpanels by sampling regions of interest within each subpanel of the EPID. The 1D NPS, obtained by radially averaging the 2D NPS, was fitted to a power-law function. The r-square value of the linear regression analysis was used as a singular metric to characterize the noise properties of individual subpanels of the EPID. The sensitivity of the local NPS was first compared with the global quantitative metrics using historical image sets. It was then compared with two commonly used commercial QA systems with images collected after applying two different EPID calibration methods (single-level gain and multilevel gain). To detect isolated defective pixels and inter-subpanel flat-fielding artifacts, the Haar wavelet transform was applied to the images. Results: Global quantitative metrics including MTF, NPS, and DQE showed little change over the period of data collection. On the contrary, a strong correlation between the local NPS (r-square values) and the variation of the EPID noise condition was observed.
The local NPS analysis indicated image quality improvement, with the r-square value increasing from 0.80 ± 0.03 (before calibration) to 0.85 ± 0.03 (after single-level gain calibration) and to 0.96 ± 0.03 (after multilevel gain calibration), while the commercial QA systems failed to distinguish the image quality improvement between the two calibration methods. With wavelet analysis, defective pixels and inter-subpanel flat-fielding artifacts were clearly identified as spikes after thresholding the inversely transformed images. Conclusions: The proposed local NPS (r-square values) showed superior sensitivity to the noise level variations of individual subpanels compared with global quantitative metrics such as MTF, NPS, and DQE. Wavelet analysis was effective in detecting isolated defective pixels and inter-subpanel flat-fielding artifacts. The proposed methods are promising for the early detection of imaging artifacts of EPIDs.
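    The local-NPS metric described in this record can be sketched as follows: the radially averaged 1D NPS is fitted to a power law in log-log space, and the r-square of that linear fit summarizes how closely a subpanel's noise follows the expected power-law behaviour. The synthetic NPS data below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic radially-averaged 1D NPS (illustrative): a power law plus
# small multiplicative noise, mimicking a well-calibrated subpanel.
f = np.linspace(0.05, 1.0, 40)          # spatial frequency (cycles/mm)
nps = 2.0 * f**-1.5 * (1 + 0.02 * rng.standard_normal(f.size))

def power_law_rsq(f, nps):
    """Fit NPS = a * f^b in log-log space; return (b, r-square).

    A low r-square flags a subpanel whose noise deviates from the
    expected power-law behaviour.
    """
    x, y = np.log(f), np.log(nps)
    b, log_a = np.polyfit(x, y, 1)
    y_hat = b * x + log_a
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return b, 1.0 - ss_res / ss_tot

slope, rsq = power_law_rsq(f, nps)
print(round(slope, 2), round(rsq, 3))
```

In the record, poorly calibrated subpanels produced r-square values near 0.80, while multilevel gain calibration raised them toward 0.96; the sketch's near-ideal data yield an r-square close to 1.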

  13. Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses

    PubMed Central

    Alexander, Elsinore; Wei, Xin; Lee, Shinwook

    2018-01-01

    Purpose: Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods: One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results: The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion: The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526
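As a rough illustration of an AUC-style halo metric: the abstract does not publish the exact formula, so the peak normalization, log scaling, and trapezoidal integration below are assumptions, not the paper's definition:

```python
import numpy as np

def halo_auc(radius_deg, luminance):
    """Assumed metric: area under the log10 peak-normalized radial luminance
    profile (trapezoidal rule). A wider, brighter halo decays more slowly
    with radius, giving a larger (less negative) area."""
    profile = np.log10(np.asarray(luminance, dtype=float) / np.max(luminance))
    return np.sum((profile[1:] + profile[:-1]) / 2.0 * np.diff(radius_deg))
```

Under this convention, an IOL producing a slowly decaying halo profile scores higher than one whose luminance falls off steeply, which is the ordering such a metric would need to correlate with subjective halo ratings.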

  14. Quantitative imaging features of pretreatment CT predict volumetric response to chemotherapy in patients with colorectal liver metastases.

    PubMed

    Creasy, John M; Midya, Abhishek; Chakraborty, Jayasree; Adams, Lauryn B; Gomes, Camilla; Gonen, Mithat; Seastedt, Kenneth P; Sutton, Elizabeth J; Cercek, Andrea; Kemeny, Nancy E; Shia, Jinru; Balachandran, Vinod P; Kingham, T Peter; Allen, Peter J; DeMatteo, Ronald P; Jarnagin, William R; D'Angelica, Michael I; Do, Richard K G; Simpson, Amber L

    2018-06-19

    This study investigates whether quantitative image analysis of pretreatment CT scans can predict volumetric response to chemotherapy for patients with colorectal liver metastases (CRLM). Patients treated with chemotherapy for CRLM (hepatic artery infusion (HAI) combined with systemic or systemic alone) were included in the study. Patients were imaged at baseline and approximately 8 weeks after treatment. Response was measured as the percentage change in tumour volume from baseline. Quantitative imaging features were derived from the index hepatic tumour on pretreatment CT, and features statistically significant on univariate analysis were included in a linear regression model to predict volumetric response. The regression model was constructed from 70% of data, while 30% were reserved for testing. Test data were input into the trained model. Model performance was evaluated with mean absolute prediction error (MAPE) and R². Clinicopathologic factors were assessed for correlation with response. 157 patients were included, split into training (n = 110) and validation (n = 47) sets. MAPE from the multivariate linear regression model was 16.5% (R² = 0.774) and 21.5% in the training and validation sets, respectively. Stratified by HAI utilisation, MAPE in the validation set was 19.6% for HAI and 25.1% for systemic chemotherapy alone. Clinical factors associated with differences in median tumour response were treatment strategy, systemic chemotherapy regimen, age and KRAS mutation status (p < 0.05). Quantitative imaging features extracted from pretreatment CT are promising predictors of volumetric response to chemotherapy in patients with CRLM. Pretreatment predictors of response have the potential to better select patients for specific therapies. • Colorectal liver metastases (CRLM) are downsized with chemotherapy but predicting the patients that will respond to chemotherapy is currently not possible. 
• Heterogeneity and enhancement patterns of CRLM can be measured with quantitative imaging. • A prediction model was constructed that predicts volumetric response with approximately 20% error, suggesting that quantitative imaging holds promise to better select patients for specific treatments.
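The modeling pipeline described above (fit a linear model on a ~70% training split, report mean absolute prediction error on the held-out ~30%) can be sketched generically. Features and response here are synthetic stand-ins, not the study's CT data:

```python
import numpy as np

def fit_linear(X, y):
    """Ordinary least squares with an intercept term."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([np.ones(len(X)), X]) @ coef

def mean_abs_error(y_true, y_pred):
    """MAPE in the paper's sense: mean absolute error of the predicted
    percentage volume change, in percentage points."""
    return float(np.mean(np.abs(y_true - y_pred)))

# Synthetic stand-in for imaging features and volumetric response:
rng = np.random.default_rng(0)
X = rng.normal(size=(157, 3))            # 3 hypothetical CT features
y = 2.0 * X[:, 0] - X[:, 1] + 3.0        # noiseless synthetic response
coef = fit_linear(X[:110], y[:110])      # ~70% training split (n = 110)
error = mean_abs_error(y[110:], predict(coef, X[110:]))  # held-out set (n = 47)
```

With noiseless synthetic data the held-out error is essentially zero; on real imaging features, the residual error (the paper's ~20%) measures how much volumetric response the features fail to explain.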

  15. Manual on performance of traffic signal systems: assessment of operations and maintenance : [summary].

    DOT National Transportation Integrated Search

    2017-05-01

    In this project, Florida Atlantic University researchers developed a methodology and software tools that allow objective, quantitative analysis of the performance of signal systems. The researchers surveyed the state of practice for traffic signal ...

  16. Prototype Development of a Tradespace Analysis Tool for Spaceflight Medical Resources.

    PubMed

    Antonsen, Erik L; Mulcahy, Robert A; Rubin, David; Blue, Rebecca S; Canga, Michael A; Shah, Ronak

    2018-02-01

    The provision of medical care in exploration-class spaceflight is limited by mass, volume, and power constraints, as well as limitations of available skillsets of crewmembers. A quantitative means of exploring the risks and benefits of inclusion or exclusion of onboard medical capabilities may help to inform the development of an appropriate medical system. A pilot project was designed to demonstrate the utility of an early tradespace analysis tool for identifying high-priority resources geared toward properly equipping an exploration mission medical system. Physician subject matter experts identified resources, tools, and skillsets required, as well as associated criticality scores of the same, to meet terrestrial, U.S.-specific ideal medical solutions for conditions concerning for exploration-class spaceflight. A database of diagnostic and treatment actions and resources was created based on this input and weighed against the probabilities of mission-specific medical events to help identify common and critical elements needed in a future exploration medical capability. Analysis of repository data demonstrates the utility of a quantitative method of comparing various medical resources and skillsets for future missions. Directed database queries can provide detailed comparative estimates concerning likelihood of resource utilization within a given mission and the weighted utility of tangible and intangible resources. This prototype tool demonstrates one quantitative approach to the complex needs and limitations of an exploration medical system. While this early version identified areas for refinement in future version development, more robust analysis tools may help to inform the development of a comprehensive medical system for future exploration missions. Antonsen EL, Mulcahy RA, Rubin D, Blue RS, Canga MA, Shah R. Prototype development of a tradespace analysis tool for spaceflight medical resources. Aerosp Med Hum Perform. 2018; 89(2):108-114.
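A directed query of the kind described (weighing resources against the probabilities of mission-specific medical events) can be sketched as follows. The schema, condition names, and numbers are illustrative assumptions, not the tool's actual database:

```python
# Hypothetical schema: each condition carries a mission probability and the
# set of resources its diagnostic/treatment actions require. The expected
# utilization of a resource is the summed probability of the conditions
# that need it. (Illustrative names and values only.)
conditions = {
    "kidney_stone": {"probability": 0.05, "resources": {"ultrasound", "analgesics"}},
    "corneal_abrasion": {"probability": 0.10, "resources": {"slit_lamp", "analgesics"}},
}

def expected_utilization(conditions):
    """Sum event probabilities over the conditions requiring each resource."""
    out = {}
    for info in conditions.values():
        for resource in info["resources"]:
            out[resource] = out.get(resource, 0.0) + info["probability"]
    return out
```

Ranking resources by expected utilization (optionally weighted by criticality scores) is one way such a tradespace tool could prioritize what to fly under mass and volume constraints.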

  17. Refining the quantitative pathway of the Pathways to Mathematics model.

    PubMed

    Sowinski, Carla; LeFevre, Jo-Anne; Skwarchuk, Sheri-Lynn; Kamawar, Deepthi; Bisanz, Jeffrey; Smith-Chant, Brenda

    2015-03-01

    In the current study, we adopted the Pathways to Mathematics model of LeFevre et al. (2010). In this model, there are three cognitive domains--labeled as the quantitative, linguistic, and working memory pathways--that make unique contributions to children's mathematical development. We attempted to refine the quantitative pathway by combining children's (N=141 in Grades 2 and 3) subitizing, counting, and symbolic magnitude comparison skills using principal components analysis. The quantitative pathway was examined in relation to dependent numerical measures (backward counting, arithmetic fluency, calculation, and number system knowledge) and a dependent reading measure, while simultaneously accounting for linguistic and working memory skills. Analyses controlled for processing speed, parental education, and gender. We hypothesized that the quantitative, linguistic, and working memory pathways would account for unique variance in the numerical outcomes; this was the case for backward counting and arithmetic fluency. However, only the quantitative and linguistic pathways (not working memory) accounted for unique variance in calculation and number system knowledge. Not surprisingly, only the linguistic pathway accounted for unique variance in the reading measure. These findings suggest that the relative contributions of quantitative, linguistic, and working memory skills vary depending on the specific cognitive task. Copyright © 2014 Elsevier Inc. All rights reserved.
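Combining several correlated measures into a single composite score via principal components analysis, as done above for the quantitative pathway, can be sketched generically (synthetic data standing in for the subitizing, counting, and magnitude-comparison measures):

```python
import numpy as np

def first_component_scores(measures):
    """Collapse correlated measures (columns) into one composite score:
    z-score each column, then project onto the first principal component
    of the standardized data."""
    Z = (measures - measures.mean(axis=0)) / measures.std(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    return Z @ eigvecs[:, np.argmax(eigvals)]
```

When the measures share a common underlying factor, the first-component score recovers it closely, which is what makes the composite usable as a single "pathway" predictor in later regressions.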

  18. Photo ion spectrometer

    DOEpatents

    Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.

    1989-01-01

    A method and apparatus for extracting for quantitative analysis ions of selected atomic components of a sample. A lens system is configured to provide a slowly diminishing field region for a volume containing the selected atomic components, enabling accurate energy analysis of ions generated in the slowly diminishing field region. The lens system also enables focusing on a sample of a charged particle beam, such as an ion beam, along a path length perpendicular to the sample and extraction of the charged particles along a path length also perpendicular to the sample. Improvement of signal to noise ratio is achieved by laser excitation of ions to selected autoionization states before carrying out quantitative analysis. Accurate energy analysis of energetic charged particles is assured by using a preselected resistive thick film configuration disposed on an insulator substrate for generating predetermined electric field boundary conditions to achieve for analysis the required electric field potential. The spectrometer also is applicable in the fields of SIMS, ISS and electron spectroscopy.

  19. Photo ion spectrometer

    DOEpatents

    Gruen, D.M.; Young, C.E.; Pellin, M.J.

    1989-08-08

    A method and apparatus are described for extracting for quantitative analysis ions of selected atomic components of a sample. A lens system is configured to provide a slowly diminishing field region for a volume containing the selected atomic components, enabling accurate energy analysis of ions generated in the slowly diminishing field region. The lens system also enables focusing on a sample of a charged particle beam, such as an ion beam, along a path length perpendicular to the sample and extraction of the charged particles along a path length also perpendicular to the sample. Improvement of signal to noise ratio is achieved by laser excitation of ions to selected auto-ionization states before carrying out quantitative analysis. Accurate energy analysis of energetic charged particles is assured by using a preselected resistive thick film configuration disposed on an insulator substrate for generating predetermined electric field boundary conditions to achieve for analysis the required electric field potential. The spectrometer also is applicable in the fields of SIMS, ISS and electron spectroscopy. 8 figs.

  20. Advances in Surface Plasmon Resonance Imaging allowing for quantitative measurement of laterally heterogeneous samples

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2012-02-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases: uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  1. Task-based modeling and optimization of a cone-beam CT scanner for musculoskeletal imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prakash, P.; Zbijewski, W.; Gang, G. J.

    2011-10-15

    Purpose: This work applies a cascaded systems model for cone-beam CT imaging performance to the design and optimization of a system for musculoskeletal extremity imaging. The model provides a quantitative guide to the selection of system geometry, source and detector components, acquisition techniques, and reconstruction parameters. Methods: The model is based on cascaded systems analysis of the 3D noise-power spectrum (NPS) and noise-equivalent quanta (NEQ) combined with factors of system geometry (magnification, focal spot size, and scatter-to-primary ratio) and anatomical background clutter. The model was extended to task-based analysis of detectability index (d') for tasks ranging in contrast and frequency content, and d' was computed as a function of system magnification, detector pixel size, focal spot size, kVp, dose, electronic noise, voxel size, and reconstruction filter to examine trade-offs and optima among such factors in multivariate analysis. The model was tested quantitatively versus the measured NPS and qualitatively in cadaver images as a function of kVp, dose, pixel size, and reconstruction filter under conditions corresponding to the proposed scanner. Results: The analysis quantified trade-offs among factors of spatial resolution, noise, and dose. System magnification (M) was a critical design parameter with strong effect on spatial resolution, dose, and x-ray scatter, and a fairly robust optimum was identified at M ~ 1.3 for the imaging tasks considered. The results suggested kVp selection in the range of ~65-90 kVp, the lower end (65 kVp) maximizing subject contrast and the upper end maximizing NEQ (90 kVp). The analysis quantified fairly intuitive results, e.g., ~0.1-0.2 mm pixel size (and a sharp reconstruction filter) optimal for high-frequency tasks (bone detail) compared to ~0.4 mm pixel size (and a smooth reconstruction filter) for low-frequency (soft-tissue) tasks. 
This result suggests a specific protocol for 1 x 1 (full-resolution) projection data acquisition followed by full-resolution reconstruction with a sharp filter for high-frequency tasks along with 2 x 2 binning reconstruction with a smooth filter for low-frequency tasks. The analysis guided selection of specific source and detector components implemented on the proposed scanner. The analysis also quantified the potential benefits and points of diminishing return in focal spot size, reduced electronic noise, finer detector pixels, and low-dose limits of detectability. Theoretical results agreed quantitatively with the measured NPS and qualitatively with evaluation of cadaver images by a musculoskeletal radiologist. Conclusions: A fairly comprehensive model for 3D imaging performance in cone-beam CT combines factors of quantum noise, system geometry, anatomical background, and imaging task. The analysis provided a valuable, quantitative guide to design, optimization, and technique selection for a musculoskeletal extremities imaging system under development.
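In its idealized prewhitening-observer form, the task-based detectability index used above is d'² = ∫ NEQ(f) · |W_task(f)|² df. A 1D sketch follows; the paper's model is 3D and includes system-geometry factors and observer models beyond this idealization:

```python
import numpy as np

def detectability_index(freqs, neq, task_weight):
    """d'^2 = integral of NEQ(f) * |W_task(f)|^2 df (trapezoidal rule),
    for a prewhitening ideal observer on a 1D radial frequency axis."""
    integrand = neq * np.abs(task_weight) ** 2
    d_squared = np.sum((integrand[1:] + integrand[:-1]) / 2.0 * np.diff(freqs))
    return np.sqrt(d_squared)
```

Sweeping system parameters (pixel size, kVp, filter) changes NEQ(f), and comparing d' for a high-frequency task weight versus a low-frequency one reproduces the kind of trade-off analysis the abstract describes.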

  2. Quantitative comparison of the in situ microbial communities in different biomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, D.C.; Ringelberg, D.B.; Palmer, R.J.

    1995-12-31

    A system to define microbial communities in different biomes requires the application of non-traditional methodology. Classical microbiological methods have severe limitations for the analysis of environmental samples. Pure-culture isolation, biochemical testing, and/or enumeration by direct microscopic counting are not well suited for the estimation of total biomass or the assessment of community composition within environmental samples. Such methods provide little insight into the in situ phenotypic activity of the extant microbiota, since these techniques are dependent on microbial growth and thus select against many environmental microorganisms which are nonculturable under a wide range of conditions. It has been repeatedly documented in the literature that viable counts or direct counts of bacteria attached to sediment grains are difficult to quantify and may grossly underestimate the extent of the existing community. The traditional tests provide little indication of the in situ nutritional status or evidence of toxicity within the microbial community. A more recent development, the MIDI Microbial Identification System, measures free and ester-linked fatty acids from isolated microorganisms. Bacterial isolates are identified by comparing their fatty acid profiles to the MIDI database, which contains over 8000 entries. The application of the MIDI system to the analysis of environmental samples, however, has significant drawbacks. The MIDI system was developed to identify clinical microorganisms and requires their isolation and culture on trypticase soy agar at 27°C. Since many isolates are unable to grow under these restrictive growth conditions, the system does not lend itself to identification of some environmental organisms. 
A more applicable methodology for environmental microbial analysis is based on the liquid extraction and separation of microbial lipids from environmental samples, followed by quantitative analysis using gas chromatography/...

  3. Application of automatic image analysis in wood science

    Treesearch

    Charles W. McMillin

    1982-01-01

    In this paper I describe an image analysis system and illustrate with examples the application of automatic quantitative measurement to wood science. Automatic image analysis, a powerful and relatively new technology, uses optical, video, electronic, and computer components to rapidly derive information from images with minimal operator interaction. Such instruments...

  4. Noninvasive Dry Eye Assessment Using High-Technology Ophthalmic Examination Devices.

    PubMed

    Yamaguchi, Masahiko; Sakane, Yuri; Kamao, Tomoyuki; Zheng, Xiaodong; Goto, Tomoko; Shiraishi, Atsushi; Ohashi, Yuichi

    2016-11-01

    Recently, the number of dry eye cases has dramatically increased. Thus, it is important that easy screening, exact diagnoses, and suitable treatments be available. We developed 3 original and noninvasive assessments for this disorder. First, a DR-1 dry eye monitor was used to determine the tear meniscus height quantitatively by capturing a tear meniscus digital image that was analyzed by Meniscus Processor software. The DR-1 meniscus height value significantly correlated with the fluorescein meniscus height (r = 0.06, Bland-Altman analysis). At a cutoff value of 0.22 mm, sensitivity of the dry eye diagnosis was 84.1% with 90.9% specificity. Second, the Tear Stability Analysis System was used to quantitatively measure tear film stability using a topographic modeling system corneal shape analysis device. Tear film stability was objectively and quantitatively evaluated every second during sustained eye openings. The Tear Stability Analysis System is currently installed in an RT-7000 autorefractometer and topographer to automate the diagnosis of dry eye. Third, the Ocular Surface Thermographer uses ophthalmic thermography for diagnosis. The decrease in ocular surface temperature in dry eyes was significantly greater than that in normal eyes (P < 0.001) at 10 seconds after eye opening. The decreased corneal temperature correlated significantly with the tear film breakup time (r = 0.572; P < 0.001). When changes in the ocular surface temperature of the cornea were used as indicators for dry eye, sensitivity was 0.83 and specificity was 0.80 after 10 seconds. This article describes the details and potential of these 3 noninvasive dry eye assessment systems.
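The cutoff-based screening statistics quoted above (e.g., 84.1% sensitivity with 90.9% specificity at a 0.22 mm meniscus-height cutoff) follow the usual definitions; here is a generic sketch with made-up data, not the study's measurements:

```python
import numpy as np

def sensitivity_specificity(values, labels, cutoff):
    """Classify 'dry eye' when the measurement falls below the cutoff
    (e.g. meniscus height < 0.22 mm); score against reference labels
    (True = dry eye by the reference standard)."""
    values = np.asarray(values, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    predicted = values < cutoff
    sensitivity = float(np.mean(predicted[labels]))    # true-positive rate
    specificity = float(np.mean(~predicted[~labels]))  # true-negative rate
    return sensitivity, specificity
```

Sweeping the cutoff over the observed range and recording both rates is how a cutoff like 0.22 mm would be chosen to balance sensitivity against specificity.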

  5. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  6. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE PAGES

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; ...

    2018-04-01

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  7. High-Content Screening for Quantitative Cell Biology.

    PubMed

    Mattiazzi Usaj, Mojca; Styles, Erin B; Verster, Adrian J; Friesen, Helena; Boone, Charles; Andrews, Brenda J

    2016-08-01

    High-content screening (HCS), which combines automated fluorescence microscopy with quantitative image analysis, allows the acquisition of unbiased multiparametric data at the single cell level. This approach has been used to address diverse biological questions and identify a plethora of quantitative phenotypes of varying complexity in numerous different model systems. Here, we describe some recent applications of HCS, ranging from the identification of genes required for specific biological processes to the characterization of genetic interactions. We review the steps involved in the design of useful biological assays and automated image analysis, and describe major challenges associated with each. Additionally, we highlight emerging technologies and future challenges, and discuss how the field of HCS might be enhanced in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews.

    PubMed

    Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique

    2009-04-01

    A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. 
This scoring system may also be used to appraise the methodological quality of qualitative, quantitative and mixed methods components of mixed methods research.

  9. Pharmacometabolomics Informs Quantitative Radiomics for Glioblastoma Diagnostic Innovation.

    PubMed

    Katsila, Theodora; Matsoukas, Minos-Timotheos; Patrinos, George P; Kardamakis, Dimitrios

    2017-08-01

    Applications of omics systems biology technologies have enormous promise for radiology and diagnostics in surgical fields. In this context, the emerging fields of radiomics (a systems scale approach to radiology using a host of technologies, including omics) and pharmacometabolomics (use of metabolomics for patient and disease stratification and guiding precision medicine) offer much synergy for diagnostic innovation in surgery, particularly in neurosurgery. This synthesis of omics fields and applications is timely because diagnostic accuracy in central nervous system tumors still challenges decision-making. Considering the vast heterogeneity in brain tumors, disease phenotypes, and interindividual variability in surgical and chemotherapy outcomes, we believe that diagnostic accuracy can be markedly improved by quantitative radiomics coupled to pharmacometabolomics and related health information technologies while optimizing economic costs of traditional diagnostics. In this expert review, we present an innovation analysis on a systems-level multi-omics approach toward diagnostic accuracy in central nervous system tumors. For this, we suggest that glioblastomas serve as a useful application paradigm. We performed a literature search on PubMed for articles published in English between 2006 and 2016. We used the search terms "radiomics," "glioblastoma," "biomarkers," "pharmacogenomics," "pharmacometabolomics," "pharmacometabonomics/pharmacometabolomics," "collaborative informatics," and "precision medicine." A list of the top 4 insights we derived from this literature analysis is presented in this study. For example, we found that (i) tumor grading needs to be better refined, (ii) diagnostic precision should be improved, (iii) standardization in radiomics is lacking, and (iv) quantitative radiomics needs to prove its value in clinical implementation. 
We conclude with an interdisciplinary call to the metabolomics, pharmacy/pharmacology, radiology, and surgery communities that pharmacometabolomics coupled to information technologies (chemoinformatics tools, databases, collaborative systems) can inform quantitative radiomics, thus translating Big Data and information growth to knowledge growth, rational drug development and diagnostics innovation for glioblastomas, and possibly in other brain tumors.

  10. General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.

    PubMed

    Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng

    2017-05-02

    As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description for the nonlinear behaviour and the systematic and quantitative analysis of the underlying mechanisms of these lasers have not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.

  11. [Urban ecological land in Changsha City: its quantitative analysis and optimization].

    PubMed

    Li, Xiao-Li; Zeng, Guang-Ming; Shi, Lin; Liang, Jie; Cai, Qing

    2010-02-01

    In this paper, a hierarchy index system suitable for the catastrophe progression method was constructed to comprehensively analyze and evaluate the status of ecological land construction in Changsha City in 2007. Based on the evaluation results, the irrationalities of the distribution pattern of Changsha urban ecological land were discussed. With the support of a geographic information system (GIS), the ecological corridors of the urban ecological land were constructed using 'least-cost' modeling, and, combined with conflict analysis, the optimum project for the urban ecological land was put forward, forming an integrated evaluation system. The results indicated that the ecological efficiency of urban ecological land in Changsha in 2007 was at a medium level, with an evaluation value of 0.9416, the quantitative index being relatively high but the coordination index relatively low. The analysis and verification with the software Fragstats showed that the ecological efficiency of the urban ecological land after optimization was higher, with an evaluation value of 0.9618, and the SHDI, CONTAG, and other indices also improved.
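'Least-cost' corridor modeling over a GIS cost surface is, at its core, a shortest-path computation on a raster. A generic Dijkstra sketch on a 4-neighbour grid follows (illustrative costs, not the authors' GIS workflow or data):

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a cost raster: accumulate the cost of every cell
    entered (including the start cell) and return the cheapest path."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1], dist[goal]
```

In an ecological application the raster cost encodes resistance to species movement (land cover, roads, urbanization), and the resulting minimum-cost paths between habitat patches become candidate corridors.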

  12. Performance Analysis of Hospital Information System of the National Health Insurance Corporation Ilsan Hospital

    PubMed Central

    Han, Jung Mi; Boo, Eun Hee; Kim, Jung A; Yoon, Soo Jin; Kim, Seong Woo

    2012-01-01

    Objectives: This study evaluated the qualitative and quantitative performances of the newly developed information system which was implemented on November 4, 2011 at the National Health Insurance Corporation Ilsan Hospital. Methods: Registration waiting time and changes in the satisfaction scores for the key performance indicators (KPI) before and after the introduction of the system were compared, and the economic effects of the system were analyzed using the information economics approach. Results: After the introduction of the system, the waiting time for registration was reduced by 20%, and the waiting time at the internal medicine department was reduced by 15%. The benefit-to-cost ratio increased to 1.34 when all intangible benefits were included in the economic analysis. Conclusions: The economic impact and target satisfaction rates increased due to the introduction of the new system. The results were proven by the quantitative and qualitative analyses carried out in this study. This study was conducted only seven months after the introduction of the system. As such, a follow-up study should be carried out in the future when the system stabilizes. PMID:23115744

  13. Transdisciplinary application of the cross-scale resilience model

    USGS Publications Warehouse

    Sundstrom, Shana M.; Angeler, David G.; Garmestani, Ahjond S.; Garcia, Jorge H.; Allen, Craig R.

    2014-01-01

    The cross-scale resilience model was developed in ecology to explain the emergence of resilience from the distribution of ecological functions within and across scales, and as a tool to assess resilience. We propose that the model and the underlying discontinuity hypothesis are relevant to other complex adaptive systems, and can be used to identify and track changes in system parameters related to resilience. We explain the theory behind the cross-scale resilience model, review the cases where it has been applied to non-ecological systems, and discuss some examples of social-ecological, archaeological/anthropological, and economic systems where a cross-scale resilience analysis could add a quantitative dimension to our current understanding of system dynamics and resilience. We argue that the scaling and diversity parameters suitable for a resilience analysis of ecological systems are appropriate for a broad suite of systems where non-normative quantitative assessments of resilience are desired. Our planet is currently characterized by fast environmental and social change, and the cross-scale resilience model has the potential to quantify resilience across many types of complex adaptive systems.

  14. The Quantitative Basis of the Arabidopsis Innate Immune System to Endemic Pathogens Depends on Pathogen Genetics

    PubMed Central

    Corwin, Jason A.; Copeland, Daniel; Feusier, Julie; Subedy, Anushriya; Eshbaugh, Robert; Palmer, Christine; Maloof, Julin; Kliebenstein, Daniel J.

    2016-01-01

    The most established model of the eukaryotic innate immune system is derived from examples of large-effect monogenic quantitative resistance to pathogens. However, many host-pathogen interactions involve many genes of small to medium effect and exhibit quantitative resistance. We used the Arabidopsis-Botrytis pathosystem to explore the quantitative genetic architecture underlying the host innate immune system in a population of Arabidopsis thaliana. By infecting a diverse panel of Arabidopsis accessions with four phenotypically and genotypically distinct isolates of the fungal necrotroph B. cinerea, we identified a total of 2,982 genes associated with quantitative resistance using lesion area and 3,354 genes associated with camalexin production as measures of the interaction. Most genes were associated with resistance to a specific Botrytis isolate, which demonstrates the influence of pathogen genetic variation in analyzing host quantitative resistance. While known resistance genes, such as receptor-like kinases (RLKs) and nucleotide-binding site leucine-rich repeat proteins (NLRs), were found to be enriched among associated genes, they account for only a small fraction of the total genes associated with quantitative resistance. Using publicly available co-expression data, we condensed the quantitative-resistance-associated genes into co-expressed gene networks. GO analysis of these networks implicated several biological processes commonly connected to disease resistance, including defense hormone signaling and ROS production, as well as novel processes, such as leaf development. Validation of single-gene T-DNA knockouts in a Col-0 background demonstrated a high success rate (60%) when accounting for differences in environmental and Botrytis genetic variation. This study shows that the genetic architecture underlying the host innate immune system is extremely complex and is likely able to sense and respond to differential virulence among pathogen genotypes. PMID:26866607

  15. The Quantitative Basis of the Arabidopsis Innate Immune System to Endemic Pathogens Depends on Pathogen Genetics.

    PubMed

    Corwin, Jason A; Copeland, Daniel; Feusier, Julie; Subedy, Anushriya; Eshbaugh, Robert; Palmer, Christine; Maloof, Julin; Kliebenstein, Daniel J

    2016-02-01

    The most established model of the eukaryotic innate immune system is derived from examples of large-effect monogenic quantitative resistance to pathogens. However, many host-pathogen interactions involve many genes of small to medium effect and exhibit quantitative resistance. We used the Arabidopsis-Botrytis pathosystem to explore the quantitative genetic architecture underlying the host innate immune system in a population of Arabidopsis thaliana. By infecting a diverse panel of Arabidopsis accessions with four phenotypically and genotypically distinct isolates of the fungal necrotroph B. cinerea, we identified a total of 2,982 genes associated with quantitative resistance using lesion area and 3,354 genes associated with camalexin production as measures of the interaction. Most genes were associated with resistance to a specific Botrytis isolate, which demonstrates the influence of pathogen genetic variation in analyzing host quantitative resistance. While known resistance genes, such as receptor-like kinases (RLKs) and nucleotide-binding site leucine-rich repeat proteins (NLRs), were found to be enriched among associated genes, they account for only a small fraction of the total genes associated with quantitative resistance. Using publicly available co-expression data, we condensed the quantitative-resistance-associated genes into co-expressed gene networks. GO analysis of these networks implicated several biological processes commonly connected to disease resistance, including defense hormone signaling and ROS production, as well as novel processes, such as leaf development. Validation of single-gene T-DNA knockouts in a Col-0 background demonstrated a high success rate (60%) when accounting for differences in environmental and Botrytis genetic variation. This study shows that the genetic architecture underlying the host innate immune system is extremely complex and is likely able to sense and respond to differential virulence among pathogen genotypes.

  16. Space Station Freedom Water Recovery test total organic carbon accountability

    NASA Technical Reports Server (NTRS)

    Davidson, Michael W.; Slivon, Laurence; Sheldon, Linda; Traweek, Mary

    1991-01-01

    Marshall Space Flight Center's (MSFC) Water Recovery Test (WRT) addresses the concept of integrated hygiene and potable reuse water recovery systems baselined for Space Station Freedom (SSF). To assess the adequacy of water recovery system designs and the conformance of reclaimed water quality to established specifications, MSFC has initiated an extensive water characterization program. MSFC's goal is to quantitatively account for a large percentage of the organic compounds present in waste and reclaimed hygiene and potable waters from the WRT and in humidity condensate from Spacelab missions. The program is organized into Phases A and B. Phase A's focus is qualitative and semi-quantitative; precise quantitative analyses are not emphasized. Phase B centers on a near-complete quantitative characterization of all water types. Technical approaches, along with Phase A and partial Phase B investigations of the compositional analysis for Total Organic Carbon (TOC) accountability, are presented.

  17. Parameter Requirements for Description of Alternative LINC Systems. Final Report.

    ERIC Educational Resources Information Center

    Center for Applied Linguistics, Washington, DC. Language Information Network and Clearinghouse System.

    This study was undertaken for the Center for Applied Linguistics to survey and analyze its information system program for the language sciences. The study identifies and defines the generalized sets of parameters required for subsequent quantitative analysis of proposed alternative Language Information Network and Clearinghouse Systems by means of…

  18. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost-benefit analysis, such as net present value, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
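As an illustration of the levelized-cost metric mentioned above: LCOE is the present value of lifetime costs divided by the present value of lifetime energy output. A minimal sketch (the plant figures are hypothetical):

```python
def lcoe(capex, annual_cost, annual_mwh, rate, years):
    """Levelized cost of energy ($/MWh): discounted lifetime cost
    divided by discounted lifetime energy production."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    pv_cost = capex + sum(annual_cost * d for d in disc)
    pv_energy = sum(annual_mwh * d for d in disc)
    return pv_cost / pv_energy

# Hypothetical 1 MW plant: $1M capex, $20k/yr O&M,
# 8,000 MWh/yr output, 7% discount rate, 25-year life.
cost_per_mwh = lcoe(1_000_000, 20_000, 8_000, 0.07, 25)
```

Discounting the energy term alongside the cost term is what makes LCOE comparable across technologies with different lifetimes and cost profiles.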

  19. Recent Achievements in Characterizing the Histone Code and Approaches to Integrating Epigenomics and Systems Biology.

    PubMed

    Janssen, K A; Sidoli, S; Garcia, B A

    2017-01-01

    Functional epigenetic regulation occurs by dynamic modification of chromatin, including genetic material (i.e., DNA methylation), histone proteins, and other nuclear proteins. Due to the highly complex nature of the histone code, mass spectrometry (MS) has become the leading technique for identification of single and combinatorial histone modifications. MS has now surpassed antibody-based strategies owing to its automation, high resolution, and accurate quantitation. Moreover, multiple analysis approaches have been developed for global quantitation of posttranslational modifications (PTMs), including large-scale characterization of modification coexistence (middle-down and top-down proteomics), which is not currently possible with any other biochemical strategy. Recently, our group and others have simplified and increased the effectiveness of analyzing histone PTMs by improving multiple MS methods and data analysis tools. This review provides an overview of the major achievements in the analysis of histone PTMs using MS, with a focus on the most recent improvements. We speculate that the state-of-the-art workflow for histone analysis is highly reliable in terms of identification and quantitation accuracy, and that it has the potential to become a routine method for systems biology thanks to the possibility of integrating histone MS results with genomics and proteomics datasets.

  20. Automated quantitative gait analysis during overground locomotion in the rat: its application to spinal cord contusion and transection injuries.

    PubMed

    Hamers, F P; Lankhorst, A J; van Laar, T J; Veldhuis, W B; Gispen, W H

    2001-02-01

    Analysis of locomotion is an important tool in the study of peripheral and central nervous system damage. Most locomotor scoring systems in rodents are based either upon open-field locomotion assessment (for example, the BBB score) or upon footprint analysis. The former yields a semiquantitative description of locomotion as a whole, whereas the latter generates quantitative data on several selected gait parameters. In this paper, we describe the use of a newly developed gait analysis method that allows easy quantitation of a large number of locomotion parameters during walkway crossing. We were able to extract data on interlimb coordination, swing duration, paw print areas (total over stance, and at 20-msec time resolution), stride length, and base of support. Similar data cannot be gathered by any single previously described method. We compare changes in gait parameters induced by two different models of spinal cord injury in rats: transection of the dorsal half of the spinal cord, and spinal cord contusion injury induced by the NYU or MASCIS device. Although we applied this method to rats with spinal cord injury, its usefulness is not limited to rats or to the investigation of spinal cord injuries alone.

  1. High speed quantitative digital microscopy

    NASA Technical Reports Server (NTRS)

    Castleman, K. R.; Price, K. H.; Eskenazi, R.; Ovadya, M. M.; Navon, M. A.

    1984-01-01

    Modern digital image processing hardware makes possible quantitative analysis of microscope images at high speed. This paper describes an application to automatic screening for cervical cancer. The system uses twelve MC6809 microprocessors arranged in a pipeline multiprocessor configuration. Each processor executes one part of the algorithm on each cell image as it passes through the pipeline. Each processor communicates with its upstream and downstream neighbors via shared two-port memory. Thus no time is devoted to input-output operations as such. This configuration is expected to be at least ten times faster than previous systems.

  2. A specialized plug-in software module for computer-aided quantitative measurement of medical images.

    PubMed

    Wang, Q; Zeng, Y J; Huo, P; Hu, J L; Zhang, J H

    2003-12-01

    This paper presents a specialized system for quantitative measurement of medical images. Using Visual C++, we developed computer-aided software based on Image-Pro Plus (IPP), a software development platform. When transferred to the hard disk of a computer by an MVPCI-V3A frame grabber, medical images can be automatically processed by our own IPP plug-in for immunohistochemical analysis, cytomorphological measurement and blood vessel segmentation. In 34 clinical studies, the system has shown high stability, reliability and ease of use.

  3. Development of quantitative security optimization approach for the picture archives and carrying system between a clinic and a rehabilitation center

    NASA Astrophysics Data System (ADS)

    Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko

    2002-05-01

    The target of our study is to analyze the level of necessary security requirements, to search for suitable security measures and to optimize security distribution to every portion of the medical practice. Quantitative expression must be introduced to our study, if possible, to enable simplified follow-up security procedures and easy evaluation of security outcomes or results. Using fault tree analysis (FTA), system analysis showed that system elements subdivided into groups by details result in a much more accurate analysis. Such subdivided composition factors greatly depend on behavior of staff, interactive terminal devices, kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified the methods needed to determine the required level of security and proposed security measures for each medical information system, and the basic events and combinations of events that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, a number of elements for each composition factor, and potential security measures were found. Methods to optimize the security measures for each medical information system were proposed, developing the most efficient distribution of risk factors for basic events.
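The fault tree analysis (FTA) referred to above composes basic-event probabilities through AND/OR gates to estimate the probability of a top event (for example, disclosure of medical data). A minimal sketch with hypothetical events and probabilities, assuming independent basic events:

```python
def or_gate(probs):
    """P(at least one basic event occurs), assuming independence:
    1 - product of (1 - p)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(probs):
    """P(all basic events occur), assuming independence."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Hypothetical tree: the top event occurs if (weak password AND no
# account lockout) OR an unencrypted network route is intercepted.
p_top = or_gate([and_gate([0.1, 0.5]), 0.02])
```

In a real analysis the basic-event probabilities come from risk estimates for staff behavior, terminal devices, and network routes, and security measures are targeted at the gates that dominate `p_top`.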

  4. European Workshop on Industrial Computer Systems approach to design for safety

    NASA Technical Reports Server (NTRS)

    Zalewski, Janusz

    1992-01-01

    This paper presents guidelines on designing systems for safety, developed by the Technical Committee 7 on Reliability and Safety of the European Workshop on Industrial Computer Systems. The focus is on complementing the traditional development process by adding the following four steps: (1) overall safety analysis; (2) analysis of the functional specifications; (3) designing for safety; (4) validation of design. Quantitative assessment of safety is possible by means of a modular questionnaire covering various aspects of the major stages of system development.

  5. Highly hydrogen-sensitive thermal desorption spectroscopy system for quantitative analysis of low hydrogen concentration (˜1 × 1016 atoms/cm3) in thin-film samples

    NASA Astrophysics Data System (ADS)

    Hanna, Taku; Hiramatsu, Hidenori; Sakaguchi, Isao; Hosono, Hideo

    2017-05-01

    We developed a highly hydrogen-sensitive thermal desorption spectroscopy (HHS-TDS) system to detect and quantitatively analyze low hydrogen concentrations in thin films. The system was connected to an in situ sample-transfer chamber system, manipulators, and an rf magnetron sputtering thin-film deposition chamber under an ultra-high-vacuum (UHV) atmosphere of ˜10-8 Pa. The following key requirements were proposed in developing the HHS-TDS: (i) a low hydrogen residual partial pressure, (ii) a low hydrogen exhaust velocity, and (iii) minimization of hydrogen thermal desorption except from the bulk region of the thin films. To satisfy these requirements, appropriate materials and components were selected, and the system was constructed to extract the maximum performance from each component. Consequently, ˜2000 times higher sensitivity to hydrogen than that of a commercially available UHV-TDS system was achieved using H+-implanted Si samples. Quantitative analysis of an amorphous oxide semiconductor InGaZnO4 thin film (1 cm × 1 cm × 1 μm thickness, hydrogen concentration of 4.5 × 1017 atoms/cm3) was demonstrated using the HHS-TDS system. This concentration level cannot be detected using UHV-TDS or secondary ion mass spectroscopy (SIMS) systems. The hydrogen detection limit of the HHS-TDS system was estimated to be ˜1 × 1016 atoms/cm3, which implies ˜2 orders of magnitude higher sensitivity than that of SIMS and resonance nuclear reaction systems (˜1018 atoms/cm3).
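Quantitation in TDS is typically done by integrating the desorption signal over the temperature ramp and scaling against a calibration reference of known hydrogen dose (such as the H+-implanted Si samples mentioned above). This is an assumed, simplified reading of the procedure, with made-up signal arrays:

```python
def integrate(signal, dt):
    """Trapezoidal integral of a sampled desorption signal."""
    return sum((a + b) / 2 * dt for a, b in zip(signal, signal[1:]))

def quantify(sample_signal, ref_signal, ref_atoms, dt):
    """Scale the sample's integrated desorption signal by a reference
    sample whose total hydrogen content (in atoms) is known."""
    return ref_atoms * integrate(sample_signal, dt) / integrate(ref_signal, dt)

# Made-up triangular desorption peaks; the reference holds 1e15 H atoms.
ref = [0, 2, 4, 2, 0]
sample = [0, 1, 2, 1, 0]   # half the integrated signal of the reference
n_h = quantify(sample, ref, 1e15, 1.0)
```

The system requirements listed in the abstract (low residual hydrogen, low exhaust velocity, minimal non-bulk desorption) all serve to keep the integrated background small relative to the peak being quantified.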

  6. Interrelation of structure and operational states in cascading failure of overloading lines in power grids

    NASA Astrophysics Data System (ADS)

    Xue, Fei; Bompard, Ettore; Huang, Tao; Jiang, Lin; Lu, Shaofeng; Zhu, Huaiying

    2017-09-01

    As the modern power system is expected to develop into a more intelligent and efficient version, i.e. the smart grid, or to become the central backbone of an energy internet for free energy interactions, security concerns related to cascading failures have been raised in view of their potentially catastrophic results. Research on topological analysis based on complex networks has contributed greatly to revealing the structural vulnerabilities of power grids, including cascading failure analysis. However, the existing literature, built on inappropriate modeling assumptions, still cannot distinguish the effects of structure from those of the operational state, and so offers little meaningful guidance for system operation. This paper reveals the interrelation between network structure and operational states in cascading failure and gives a quantitative evaluation integrating both perspectives. For the structural analysis, cascading paths are identified by extended betweenness and quantitatively described by cascading drop and cascading gradient; the operational state along a cascading path is described by its loading level. The risk of cascading failure along a specific cascading path can then be quantitatively evaluated from these two factors, and the maximum cascading gradient over all possible cascading paths serves as an overall metric of the entire power grid's susceptibility to cascading failure. The proposed method is tested and verified on the IEEE 30-bus and IEEE 118-bus systems; the simulation evidence presented in this paper suggests that the proposed model can identify the structural causes of cascading failure and is promising for guiding the protection of system operation in the future.

  7. Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is used not as a method separate from or alternative to other data analysis methods, but in addition to them. Together with quantitative analysis of data, such as offered by statistical analysis or image and signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation, and the user interface as major components of the software environment for astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore treated scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, move from one storage medium to another, or switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets.
Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists.

  8. [Study of Cervical Exfoliated Cell's DNA Quantitative Analysis Based on Multi-Spectral Imaging Technology].

    PubMed

    Wu, Zheng; Zeng, Li-bo; Wu, Qiong-shui

    2016-02-01

    Conventional cervical cancer screening methods mainly include TBS (The Bethesda System) classification and cellular DNA quantitative analysis. However, no study had achieved both methods at once on a single slide prepared with a multiple-staining protocol, in which the cytoplasm is stained with Papanicolaou reagent and the nucleus with Feulgen reagent. The difficulty with such multiple staining is that the absorbance of non-DNA material may interfere with the absorbance of DNA. We therefore set up a multi-spectral imaging system and, exploiting the linear superposition of absorbances, established an absorbance unmixing model using multiple linear regression. The model successfully isolates the absorbance of DNA for quantitative analysis, combining the two conventional screening methods. A series of experiments showed no statistically significant difference (at the 1% test level) between the DNA absorbance calculated by the unmixing model and the measured DNA absorbance. In practical application, the 99% confidence interval of the DNA index of tetraploid cells screened by this method did not overlap the DNA-index interval used to judge cancer cells, verifying the accuracy and feasibility of quantitative DNA analysis under multiple staining. This analytical method therefore has broad application prospects and considerable market potential in the early diagnosis of cervical and other cancers.
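The absorbance-unmixing step described above (linear superposition of stain absorbances, solved by multiple linear regression) can be sketched for the two-stain case. The reference spectra and mixture below are synthetic, purely to illustrate the least-squares solve:

```python
def unmix(measured, ref_dna, ref_cyto):
    """Least-squares split of a measured absorbance spectrum into DNA
    (Feulgen) and cytoplasm (Papanicolaou) components, relying on the
    linear superposition of absorbances (Beer-Lambert law).

    Solves the 2x2 normal equations for (c_dna, c_cyto) in
    measured ~= c_dna * ref_dna + c_cyto * ref_cyto.
    """
    sxx = sum(a * a for a in ref_dna)
    syy = sum(b * b for b in ref_cyto)
    sxy = sum(a * b for a, b in zip(ref_dna, ref_cyto))
    sxm = sum(a * m for a, m in zip(ref_dna, measured))
    sym = sum(b * m for b, m in zip(ref_cyto, measured))
    det = sxx * syy - sxy * sxy
    c_dna = (sxm * syy - sym * sxy) / det
    c_cyto = (sxx * sym - sxy * sxm) / det
    return c_dna, c_cyto

# Synthetic check: a mixture of 2 parts DNA stain and 1 part cytoplasm
# stain, sampled at three wavelengths.
ref_dna = [0.9, 0.4, 0.1]
ref_cyto = [0.1, 0.5, 0.8]
measured = [2 * a + 1 * b for a, b in zip(ref_dna, ref_cyto)]
c_dna, c_cyto = unmix(measured, ref_dna, ref_cyto)
```

The recovered `c_dna` coefficient is the quantity the DNA index is computed from; the cytoplasmic component is simply discarded after the fit.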

  9. Analysis of the time structure of synchronization in multidimensional chaotic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarenko, A. V., E-mail: avm.science@mail.ru

    2015-05-15

    A new approach is proposed to the integrated analysis of the time structure of synchronization of multidimensional chaotic systems. The method allows one to diagnose and quantitatively evaluate the intermittency characteristics during synchronization of chaotic oscillations in the T-synchronization mode. A system of two identical logistic mappings with unidirectional coupling that operate in the developed chaos regime is analyzed. It is shown that the widely used approach, in which only synchronization patterns are subjected to analysis while desynchronization areas are considered as a background signal and removed from analysis, should be regarded as methodologically incomplete.
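The analyzed configuration, two identical logistic maps with unidirectional coupling in the developed-chaos regime, can be reproduced in a few lines. The sketch below tracks only the raw synchronization error, not the paper's T-synchronization diagnostics:

```python
def coupled_logistic(r, eps, n, x0=0.3, y0=0.6):
    """Unidirectionally coupled identical logistic maps: the drive x
    evolves freely; the response y is nudged toward the drive with
    coupling strength eps. Returns the error series |x_n - y_n|."""
    f = lambda v: r * v * (1.0 - v)
    x, y, err = x0, y0, []
    for _ in range(n):
        x_next = f(x)
        y_next = (1.0 - eps) * f(y) + eps * f(x)
        x, y = x_next, y_next
        err.append(abs(x - y))
    return err

# Strong coupling synchronizes the pair; zero coupling leaves two
# independent chaotic orbits (r = 4 is the developed-chaos regime).
err_strong = coupled_logistic(4.0, 0.9, 500)
err_none = coupled_logistic(4.0, 0.0, 500)
```

Intermittency analyses of the kind described above then look at the fine time structure of such error series near the synchronization threshold, rather than discarding the desynchronization intervals as background.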

  10. Potential use of combining the diffusion equation with the free Schrödinger equation to improve Optical Coherence Tomography image analysis

    NASA Astrophysics Data System (ADS)

    Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.

    2006-03-01

    Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolution than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of the OCT system, obtained by combining the diffusion equation with the free Schrödinger equation. In this formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed approach performs well in speckle-noise removal and in the enhancement and segmentation of the various cellular layers of the retina using the OCT system.

  11. Deformation analysis of MEMS structures by modified digital moiré methods

    NASA Astrophysics Data System (ADS)

    Liu, Zhanwei; Lou, Xinhao; Gao, Jianxin

    2010-11-01

    Quantitative deformation analysis of micro-fabricated electromechanical systems is important for the design and functional control of microsystems. In this paper, two modified digital moiré processing methods, a Gaussian blurring algorithm combined with digital phase shifting, and a geometric phase analysis (GPA) technique based on the digital moiré method, are developed to quantitatively analyse the deformation behaviour of micro-electro-mechanical system (MEMS) structures. The measuring principles and experimental procedures of the two methods are described in detail. A digital moiré fringe pattern is generated by superimposing a specimen grating etched directly on a microstructure surface with a digital reference grating (DRG). Most of the grating noise is removed from the digital moiré fringes, which enables the phase distribution of the moiré fringes to be obtained directly. Strain measurement results for a MEMS structure demonstrate the feasibility of the two methods.

  12. Convergence Estimates for Multidisciplinary Analysis and Optimization

    NASA Technical Reports Server (NTRS)

    Arian, Eyal

    1997-01-01

    A quantitative analysis of coupling between systems of equations is introduced. This analysis is then applied to problems in multidisciplinary analysis, sensitivity, and optimization. For the sensitivity and optimization problems both multidisciplinary and single discipline feasibility schemes are considered. In all these cases a "convergence factor" is estimated in terms of the Jacobians and Hessians of the system, thus it can also be approximated by existing disciplinary analysis and optimization codes. The convergence factor is identified with the measure for the "coupling" between the disciplines in the system. Applications to algorithm development are discussed. Demonstration of the convergence estimates and numerical results are given for a system composed of two non-linear algebraic equations, and for a system composed of two PDEs modeling aeroelasticity.
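The notion of a convergence factor determined by the system's Jacobians can be illustrated on a toy two-discipline linear system, where block iteration contracts the error by the product of the cross-coupling terms (the scalar "Jacobians" a and b). The coefficients below are illustrative only, not from the paper:

```python
def coupled_iteration(a, b, c1, c2, n=20):
    """Block Gauss-Seidel on the coupled linear 'disciplines'
        u = a*v + c1   (discipline 1)
        v = b*u + c2   (discipline 2).
    Returns successive error-contraction ratios; for this system the
    convergence factor predicted from the Jacobians is |a*b|."""
    # Exact coupled solution, for measuring the error.
    u_star = (c1 + a * c2) / (1 - a * b)
    v_star = b * u_star + c2
    u = v = 0.0
    errs = []
    for _ in range(n):
        u = a * v + c1        # solve discipline 1 with v frozen
        v = b * u + c2        # solve discipline 2 with updated u
        errs.append(abs(u - u_star) + abs(v - v_star))
    return [e2 / e1 for e1, e2 in zip(errs, errs[1:]) if e1 > 0]

ratios = coupled_iteration(0.5, 0.6, 1.0, 2.0)
# The observed contraction approaches the predicted factor |a*b| = 0.3.
```

The paper's estimate generalizes this scalar picture: the "coupling" measure is built from the Jacobians (and Hessians, for optimization) of the full multidisciplinary system, so it can be approximated with existing disciplinary analysis codes.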

  13. Caitlin Murphy bio | NREL

    Science.gov Websites

    … energy system through quantitative analysis methods. Research interests: evaluating the system and … Washington, DC (2014-2017); Postdoctoral Research Fellow, Carnegie Institution for Science, Washington, DC (2012-2014); Graduate Research and Teaching Assistant, California Institute of Technology, Pasadena, CA.

  14. Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy

    NASA Astrophysics Data System (ADS)

    Sindern, Sven; Meyer, F. Michael

    2016-09-01

    Increasing industrial demand of rare earth elements (REEs) stems from the central role they play for advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, which are commercially available, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. 
SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become increasingly important for supply of REEs in the future.

  15. Maturity Curve of Systems Engineering

    DTIC Science & Technology

    2008-12-01

    … Analysis of Data … Fuzzy Logic … the collection and analysis of data (Hart, 1998). Methodology Overview: A qualitative approach in acquiring and managing the data was used for this analysis. A quantitative tool was used to examine and evaluate the data. The qualitative approach was intended to sort the acquired traits …

  16. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  17. ASPECTS: an automation-assisted SPE method development system.

    PubMed

    Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu

    2013-07-01

    A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combination of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly saves the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.

  18. Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.

    ERIC Educational Resources Information Center

    Lindahl, William H.; Gardner, James H.

    Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…

  19. Calibration method for spectroscopic systems

    DOEpatents

    Sandison, David R.

    1998-01-01

    Calibration spots of optically-characterized material placed in the field of view of a spectroscopic system allow calibration of the spectroscopic system. Response from the calibration spots is measured and used to calibrate for varying spectroscopic system operating parameters. The accurate calibration achieved allows quantitative spectroscopic analysis of responses taken at different times, different excitation conditions, and of different targets.
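
    One way a response from such calibration spots might be applied is a simple ratio correction, rescaling each measured spectrum by the ratio of the spot's known response to its currently measured response. This is a hedged sketch of the general idea only, not the patented method:

```python
def calibrate(spectrum, measured_spot_response, known_spot_response):
    """Rescale a measured spectrum by the ratio of the calibration spot's
    known (characterized) response to its currently measured response,
    compensating for drift in system operating parameters such as gain
    or excitation power. Names and the linear model are illustrative."""
    gain_correction = known_spot_response / measured_spot_response
    return [value * gain_correction for value in spectrum]

# If the spot reads half its characterized response, double the spectrum
corrected = calibrate([2.0, 4.0], measured_spot_response=0.5,
                      known_spot_response=1.0)
```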

  20. Calibration method for spectroscopic systems

    DOEpatents

    Sandison, D.R.

    1998-11-17

    Calibration spots of optically-characterized material placed in the field of view of a spectroscopic system allow calibration of the spectroscopic system. Response from the calibration spots is measured and used to calibrate for varying spectroscopic system operating parameters. The accurate calibration achieved allows quantitative spectroscopic analysis of responses taken at different times, different excitation conditions, and of different targets. 3 figs.

  1. Acoustic Facies Analysis of Side-Scan Sonar Data

    NASA Astrophysics Data System (ADS)

    Dwan, Fa Shu

    Acoustic facies analysis methods have allowed the generation of system-independent values for the quantitative seafloor acoustic parameter, backscattering strength, from GLORIA and (TAMU)^2 side-scan sonar data. The resulting acoustic facies parameters enable quantitative comparisons of data collected by different sonar systems, data from different environments, and measurements made with different survey geometries. Backscattering strength values were extracted from the sonar amplitude data by inversion based on the sonar equation. Image processing products reveal seafloor features and patterns of relative intensity. To quantitatively compare data collected at different times or by different systems, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made on any given data set for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified-area contribution, and grazing angle effects. In the sonar equation, backscattering strength is the sonar parameter directly related to seafloor properties. The GLORIA data used in this study are from the edge of a distal lobe of the Monterey Fan. An interfingered region of strong and weak seafloor returns from a flat seafloor region provides an ideal data set for this study. Inversion of imagery data from the region allows the quantitative definition of different acoustic facies. The (TAMU)^2 data used are from a calibration site near the Green Canyon area of the Gulf of Mexico. Acoustic facies analysis techniques were implemented to generate statistical information for acoustic facies based on the estimates of backscattering strength. The backscattering strength values were compared with Lambert's Law and other functions to parameterize the description of the acoustic facies. The resulting Lambertian constant values range from -26 dB to -36 dB.
A modified Lambert relationship, which includes both intercept and slope terms, appears to represent the backscattering strength (BSS) versus grazing angle profiles better, based on chi-square testing and error ellipse generation. Different regression functions, composed of trigonometric functions, were analyzed for different segments of the BSS profiles. A cotangent or sine/cosine function shows promising results for representing the entire grazing angle span of the BSS profiles.
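
    The modified Lambert fit described above amounts to a straight-line regression of backscattering strength against log10(sin θ), with classic Lambert's Law often taken as a fixed slope of 20 dB per decade of sin θ. A minimal sketch with synthetic data (the function and values are illustrative, not from the study):

```python
import math

def fit_modified_lambert(angles_deg, bss_db):
    """Least-squares fit of the modified Lambert relationship
    BSS(theta) = A + B * log10(sin(theta)),
    where A is the intercept (the Lambertian constant when B is
    held at 20) and B the slope in dB per decade of sin(theta)."""
    x = [math.log10(math.sin(math.radians(a))) for a in angles_deg]
    n = len(x)
    mx, my = sum(x) / n, sum(bss_db) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, bss_db))
    B = sxy / sxx          # fitted slope term
    A = my - B * mx        # fitted intercept term
    return A, B

# Synthetic Lambertian profile: constant -30 dB, slope 20
angles = [10, 20, 30, 40, 50, 60]
bss = [-30 + 20 * math.log10(math.sin(math.radians(a))) for a in angles]
A, B = fit_modified_lambert(angles, bss)
```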

  2. Matrix evaluation of science objectives

    NASA Technical Reports Server (NTRS)

    Wessen, Randii R.

    1994-01-01

    The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.

  3. Systems-Level Analysis of Innate Immunity

    PubMed Central

    Zak, Daniel E.; Tam, Vincent C.; Aderem, Alan

    2014-01-01

    Systems-level analysis of biological processes strives to comprehensively and quantitatively evaluate the interactions between the relevant molecular components over time, thereby enabling development of models that can be employed to ultimately predict behavior. Rapid development in measurement technologies (omics), when combined with the accessible nature of the cellular constituents themselves, is allowing the field of innate immunity to take significant strides toward this lofty goal. In this review, we survey exciting results derived from systems biology analyses of the immune system, ranging from gene regulatory networks to influenza pathogenesis and systems vaccinology. PMID:24655298

  4. Broadband external cavity quantum cascade laser based sensor for gasoline detection

    NASA Astrophysics Data System (ADS)

    Ding, Junya; He, Tianbo; Zhou, Sheng; Li, Jinsong

    2018-02-01

    A new type of tunable laser spectroscopy sensor based on an external cavity quantum cascade laser (ECQCL) and a quartz crystal tuning fork (QCTF) was used for quantitative analysis of volatile organic compounds. In this work, the sensor system was tested on the analysis of different gasoline samples. For signal processing, a self-established interpolation algorithm and a multiple linear regression model were used for quantitative analysis of the major volatile organic compounds in the gasoline samples. The results were very consistent with the standard spectra taken from the Pacific Northwest National Laboratory (PNNL) database. In the future, the ECQCL sensor will be used for trace explosive, chemical warfare agent, and toxic industrial chemical detection and spectroscopic analysis.

  5. Modified HS-SPME for determination of quantitative relations between low-molecular oxygen compounds in various matrices.

    PubMed

    Dawidowicz, Andrzej L; Szewczyk, Joanna; Dybowski, Michal P

    2016-09-07

    Quantitative relations between individual constituents of a liquid sample similar to those established by its direct injection can be obtained by applying a polydimethylsiloxane (PDMS) fiber in a headspace solid-phase microextraction (HS-SPME) system containing the examined sample suspended in methyl silicone oil. This paper proves that the analogous system, composed of a sample suspension/emulsion in polyethylene glycol (PEG) and a Carbowax fiber, allows similar quantitative relations between components of the mixture to be obtained as those established by direct analysis, but only for polar constituents. This is demonstrated for essential oil (EO) components of savory, sage, mint and thyme, and for an artificial liquid mixture of polar constituents. The observed differences in the quantitative relations between polar constituents estimated by the two applied procedures are insignificant (Fexp < Fcrit). The presented results indicate that the wider applicability of a system composed of a sample suspended in an oil of the same physicochemical character as the SPME fiber coating strongly depends on the character of the analyte-suspending liquid and analyte-fiber coating interactions. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Quantitative evaluation of haze formation of koji and progression of internal haze by drying of koji during koji making.

    PubMed

    Ito, Kazunari; Gomi, Katsuya; Kariyama, Masahiro; Miyake, Tsuyoshi

    2017-07-01

    The construction of an experimental system that can mimic koji making in the manufacturing setting of a sake brewery is initially required for the quantitative evaluation of mycelia grown on/in koji pellets (haze formation). Koji making with rice was investigated with a solid-state fermentation (SSF) system using a non-airflow box (NAB), which produced uniform conditions in the culture substrate with high reproducibility and allowed for the control of favorable conditions in the substrate during culture. The SSF system using NAB accurately reproduced koji making in a manufacturing setting. To evaluate haze formation during koji making, surfaces and cross sections of koji pellets obtained from koji making tests were observed using a digital microscope. Image analysis was used to distinguish between haze and non-haze sections of koji pellets, enabling the evaluation of haze formation in a batch by measuring the haze rate of a specific number of koji pellets. This method allowed us to obtain continuous and quantitative data on the time course of haze formation. Moreover, drying koji during the late stage of koji making was revealed to cause further penetration of mycelia into koji pellets (internal haze). The koji making test with the SSF system using NAB and quantitative evaluation of haze formation in a batch by image analysis is a useful method for understanding the relations between haze formation and koji making conditions. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
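
    The haze-rate measurement by image analysis can be illustrated with a minimal thresholding sketch. The study used digital-microscope images of pellet surfaces and cross sections; the grayscale encoding, threshold, and function names here are hypothetical assumptions:

```python
def haze_rate(pixels, pellet_mask, threshold):
    """Fraction of pellet pixels classified as haze (mycelium-covered)
    by simple intensity thresholding. `pixels` and `pellet_mask` are
    flat, equal-length lists: grayscale value and pellet membership
    (1 inside the pellet, 0 background)."""
    pellet = [p for p, m in zip(pixels, pellet_mask) if m]
    haze = sum(1 for p in pellet if p >= threshold)  # bright = mycelium
    return haze / len(pellet)

def batch_haze_rate(pellet_images, threshold):
    """Mean haze rate over a specific number of koji pellets in a batch;
    each element of `pellet_images` is a (pixels, mask) pair."""
    rates = [haze_rate(px, mask, threshold) for px, mask in pellet_images]
    return sum(rates) / len(rates)

rate = haze_rate([200, 10, 220, 30], [1, 1, 1, 1], threshold=128)
```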

  7. Image analysis in cytology: DNA-histogramming versus cervical smear prescreening.

    PubMed

    Bengtsson, E W; Nordin, B

    1993-01-01

    The visual inspection of cellular specimens and histological sections through a light microscope plays an important role in clinical medicine and biomedical research. The human visual system is very good at the recognition of various patterns but less efficient at quantitative assessment of these patterns. Some samples are prepared in great numbers, most notably the screening for cervical cancer, the so-called PAP-smears, which results in hundreds of millions of samples each year, creating a tedious mass inspection task. Numerous attempts have been made over the last 40 years to create systems that solve these two tasks, the quantitative supplement to the human visual system and the automation of mass screening. The most difficult task, the total automation, has received the greatest attention with many large scale projects over the decades. In spite of all these efforts, still no generally accepted automated prescreening device exists on the market. The main reason for this failure is the great pattern recognition capabilities needed to distinguish between cancer cells and all other kinds of objects found in the specimens: cellular clusters, debris, degenerate cells, etc. Improved algorithms, the ever-increasing processing power of computers and progress in biochemical specimen preparation techniques make it likely that eventually useful automated prescreening systems will become available. Meanwhile, much less effort has been put into the development of interactive cell image analysis systems. Still, some such systems have been developed and put into use at thousands of laboratories worldwide. In these the human pattern recognition capability is used to select the fields and objects that are to be analysed while the computational power of the computer is used for the quantitative analysis of cellular DNA content or other relevant markers. 
Numerous studies have shown that the quantitative information about the distribution of cellular DNA content is of prognostic significance in many types of cancer. Several laboratories are therefore putting these techniques into routine clinical use. The more advanced systems can also study many other markers and cellular features, some known to be of clinical interest, others useful in research. The advances in computer technology are making these systems more generally available through decreasing cost, increasing computational power and improved user interfaces. We have been involved in research and development of both automated and interactive cell analysis systems during the last 20 years. Here some experiences and conclusions from this work will be presented as well as some predictions about what can be expected in the near future.

  8. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semi-quantitative classification system for determining the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. Its role in clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification, applied by 5 different raters, to experimental MR spectroscopic quantitative fat measurement, in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). When the scale was dichotomized, the correlation was 0.72. The interobserver and intraobserver reliabilities were substantial, with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care with regard to the amount of fatty degeneration.
Spectroscopic MR measurement may increase the accuracy of the Goutallier classification and thus improve the prediction of clinical results after rotator cuff repair. However, these techniques are currently only available in an experimental setting.
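
    Spearman's rank correlation used in the study above is simply the Pearson correlation of the ranks, with ties assigned average ranks (tie handling matters for discrete Goutallier grades). A self-contained sketch; in practice a statistics package would be used:

```python
def ranks(values):
    """1-based average ranks; tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend over the tie group
        avg = (i + j) / 2 + 1           # mean rank of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical example: Goutallier grades vs. spectroscopic fat/water ratio
rho = spearman([0, 1, 1, 2, 3], [0.02, 0.05, 0.04, 0.11, 0.28])
```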

  9. Quantitative Analysis Method of Output Loss due to Restriction for Grid-connected PV Systems

    NASA Astrophysics Data System (ADS)

    Ueda, Yuzuru; Oozeki, Takashi; Kurokawa, Kosuke; Itou, Takamitsu; Kitamura, Kiyoyuki; Miyamoto, Yusuke; Yokota, Masaharu; Sugihara, Hiroyuki

    The voltage of a power distribution line is increased by reverse power flow from grid-connected PV systems. With high-density grid connection, the voltage rise is larger than for a stand-alone grid-connected system. To prevent overvoltage on the distribution line, a PV system's output is restricted when the line voltage approaches the upper limit of the control range. Because of this interaction, the output loss is larger in the high-density case. This research developed a quantitative analysis method for PV system output and losses to clarify the behavior of grid-connected PV systems. All measured data are classified into loss factors using 1-minute averages of 1-second data instead of the typical 1-hour averages. The operating point on the I-V curve is estimated from module temperature, array output voltage, array output current and solar irradiance in order to quantify the loss due to output restriction. As a result, the loss due to output restriction is successfully quantified and the behavior of output restriction is clarified.
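
    The loss-accounting idea can be caricatured with a much simpler linear performance model than the paper's I-V operating-point estimation; the performance coefficient, record layout, and function name below are assumptions for illustration only:

```python
def restriction_loss(records, perf_coeff):
    """Rough quantification of output-restriction loss from 1-minute data.
    Each record is (irradiance_W_per_m2, measured_power_W, restricted_flag).
    Potential power is estimated with a simple linear performance model
    (perf_coeff in W per W/m2, an assumed stand-in for the paper's I-V
    curve estimation); the shortfall is accumulated only while the
    inverter reports output restriction."""
    loss_wh = 0.0
    for irradiance, p_measured, restricted in records:
        p_potential = perf_coeff * irradiance
        if restricted and p_potential > p_measured:
            loss_wh += (p_potential - p_measured) / 60.0  # 1-min samples -> Wh
    return loss_wh

# One restricted minute (1 kW shortfall) and one unrestricted minute
loss = restriction_loss([(1000.0, 3000.0, True),
                         (1000.0, 4000.0, False)], perf_coeff=4.0)
```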

  10. The simultaneous quantitation of ten amino acids in soil extracts by mass fragmentography

    NASA Technical Reports Server (NTRS)

    Pereira, W. E.; Hoyano, Y.; Reynolds, W. E.; Summons, R. E.; Duffield, A. M.

    1972-01-01

    A specific and sensitive method was developed for the identification and simultaneous quantitation, by mass fragmentography, of ten of the amino acids present in soil. The technique uses a computer-driven quadrupole mass spectrometer, with a commercial preparation of deuterated amino acids serving as internal standards for quantitation. The results obtained are comparable with those from an amino acid analyzer. In the quadrupole mass spectrometer-computer system, up to 25 pre-selected ions may be monitored sequentially. This allows a maximum of 12 different amino acids (one specific ion in each of the undeuterated and deuterated amino acid spectra) to be quantitated. The method is relatively rapid (analysis time of approximately one hour) and is capable of quantitating nanogram quantities of amino acids.
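
    Quantitation against a deuterated internal standard reduces, per amino acid, to scaling the unlabeled/labeled ion-abundance ratio by the known amount of standard added. A hedged sketch (the response ratio would come from a calibration run; names are illustrative, not from the paper):

```python
def quantitate(area_analyte, area_internal_std, ng_internal_std_added,
               response_ratio=1.0):
    """Internal-standard quantitation: the monitored-ion abundance ratio
    of the unlabeled analyte to its deuterated analog, scaled by the
    amount of standard added and a relative response factor (assumed 1.0
    here; in practice determined from a calibration mixture)."""
    return (area_analyte / area_internal_std) * ng_internal_std_added / response_ratio

# Analyte ion twice as abundant as 50 ng of deuterated standard -> 100 ng
amount_ng = quantitate(2.0, 1.0, 50.0)
```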

  11. Reliability analysis for the smart grid : from cyber control and communication to physical manifestations of failure.

    DOT National Transportation Integrated Search

    2010-01-01

    The Smart Grid is a cyber-physical system comprised of physical components, such as transmission lines and generators, and a : network of embedded systems deployed for their cyber control. Our objective is to qualitatively and quantitatively analyze ...

  12. Comparison of DIGE and post-stained gel electrophoresis with both traditional and SameSpots analysis for quantitative proteomics.

    PubMed

    Karp, Natasha A; Feret, Renata; Rubtsov, Denis V; Lilley, Kathryn S

    2008-03-01

    2-DE is an important tool in quantitative proteomics. Here, we compare the deep purple (DP) system with DIGE using both a traditional and the SameSpots approach to gel analysis. Missing values in the traditional approach were found to be a significant issue for both systems. SameSpots attempts to address the missing value problem. SameSpots was found to increase the proportion of low volume data for DP but not for DIGE. For all the analysis methods applied in this study, the assumptions of parametric tests were met. Analysis of the same images gave significantly lower noise with SameSpots (over traditional) for DP, but no difference for DIGE. We propose that SameSpots gave lower noise with DP due to the stabilisation of the spot area by the common spot outline, but this was not seen with DIGE due to the co-detection process which stabilises the area selected. For studies where measurement of small abundance changes is required, a cost-benefit analysis highlights that DIGE was significantly cheaper regardless of the analysis methods. For studies analysing large changes, DP with SameSpots could be an effective alternative to DIGE but this will be dependent on the biological noise of the system under investigation.

  13. Quantitative measurement of a candidate serum biomarker peptide derived from α2-HS-glycoprotein, and a preliminary trial of multidimensional peptide analysis in females with pregnancy-induced hypertension.

    PubMed

    Hamamura, Kensuke; Yanagida, Mitsuaki; Ishikawa, Hitoshi; Banzai, Michio; Yoshitake, Hiroshi; Nonaka, Daisuke; Tanaka, Kenji; Sakuraba, Mayumi; Miyakuni, Yasuka; Takamori, Kenji; Nojima, Michio; Yoshida, Koyo; Fujiwara, Hiroshi; Takeda, Satoru; Araki, Yoshihiko

    2018-03-01

    Purpose: We previously attempted to develop quantitative enzyme-linked immunosorbent assay (ELISA) systems for the PDA039/044/071 peptides, potential serum disease biomarkers (DBMs) of pregnancy-induced hypertension (PIH), primarily identified by a peptidomic approach (BLOTCHIP®-mass spectrometry (MS)). However, our methodology did not extend to PDA071 (cysteinyl α2-HS-glycoprotein(341-367)), owing to the difficulty of producing a specific antibody against the peptide. The aim of the present study was to establish an alternative PDA071 quantitation system using liquid chromatography-multiple reaction monitoring (LC-MRM)/MS, to explore the potential utility of PDA071 as a DBM for PIH. Methods: We tested heat/acid denaturation methods in efforts to purify serum PDA071 and developed an LC-MRM/MS method allowing for its specific quantitation. We measured serum PDA071 concentrations, and these results were validated, including by three-dimensional (3D) plotting against PDA039 (kininogen-1(439-456)) and PDA044 (kininogen-1(438-456)) concentrations, followed by discriminant analysis. Results: PDA071 was successfully extracted from serum using a heat denaturation method. Optimum conditions for quantitation via LC-MRM/MS were developed; the assayed serum PDA071 correlated well with the BLOTCHIP® assay values. Although PDA071 alone did not differ significantly between patients and controls, 3D plotting of the PDA039/044/071 peptide concentrations and construction of a Jackknife classification matrix were satisfactory in terms of PIH diagnostic precision. Conclusions: Combined analysis using PDA071 and PDA039/044 concentrations allowed PIH diagnostic accuracy to be attained, and our method will be valuable in future pathophysiological studies of hypertensive disorders of pregnancy.

  14. Quantitative fluorescence correlation spectroscopy on DNA in living cells

    NASA Astrophysics Data System (ADS)

    Hodges, Cameron; Kafle, Rudra P.; Meiners, Jens-Christian

    2017-02-01

    FCS is a fluorescence technique conventionally used to study the kinetics of fluorescent molecules in dilute solution. Being non-invasive, it is drawing increasing interest for the study of more complex systems such as the dynamics of DNA or proteins in living cells. Unlike in an ordinary dye solution, the dynamics of macromolecules such as proteins or entangled DNA in crowded environments is often slow and subdiffusive in nature. This in turn leads to longer residence times of the attached fluorophores in the excitation volume of the microscope, and artifacts from photobleaching abound that can easily obscure the signature of the molecular dynamics of interest and make quantitative analysis challenging. We discuss methods and procedures to make FCS applicable to quantitative studies of the dynamics of DNA in live prokaryotic and eukaryotic cells. The intensity autocorrelation function is computed from weighted arrival times of the photons on the detector in a way that maximizes the information content while simultaneously correcting for the effect of photobleaching, yielding an autocorrelation function that reflects only the underlying dynamics of the sample. This autocorrelation function in turn is used to calculate the mean square displacement of the fluorophores attached to the DNA. The displacement data are more amenable to further quantitative analysis than the raw correlation functions. By using a suitable integral transform of the mean square displacement, we can then determine the viscoelastic moduli of the DNA in its cellular environment. The entire analysis procedure is extensively calibrated and validated using model systems and computational simulations.
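
    The paper's weighted-arrival-time estimator and photobleaching correction are not reproduced here, but the basic pipeline, autocorrelation of binned photon counts followed by inversion of a Gaussian-volume model for the mean square displacement, can be sketched as follows. The 2-D Gaussian model g(tau) = g0 / (1 + MSD(tau) / w_xy^2) is an assumed simplification:

```python
def autocorrelation(counts, max_lag):
    """Normalized intensity autocorrelation from binned photon counts:
    g(tau) = <I(t) I(t + tau)> / <I>^2 - 1, for lags 1..max_lag bins."""
    n = len(counts)
    mean = sum(counts) / n
    g = []
    for lag in range(1, max_lag + 1):
        c = sum(counts[i] * counts[i + lag] for i in range(n - lag)) / (n - lag)
        g.append(c / mean ** 2 - 1.0)
    return g

def msd_from_g(g, g0, w_xy):
    """Invert an assumed 2-D Gaussian-volume FCS model,
    g(tau) = g0 / (1 + MSD(tau) / w_xy^2), for the mean square
    displacement at each lag (w_xy: lateral beam-waist radius)."""
    return [w_xy ** 2 * (g0 / gi - 1.0) for gi in g]

flat = autocorrelation([2, 2, 2, 2], 2)      # constant trace -> zero correlation
msd = msd_from_g([0.5, 0.25], g0=1.0, w_xy=2.0)
```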

  15. Systems microscopy: an emerging strategy for the life sciences.

    PubMed

    Lock, John G; Strömblad, Staffan

    2010-05-01

    Dynamic cellular processes occurring in time and space are fundamental to all physiology and disease. To understand complex and dynamic cellular processes therefore demands the capacity to record and integrate quantitative multiparametric data from the four spatiotemporal dimensions within which living cells self-organize, and to subsequently use these data for the mathematical modeling of cellular systems. To this end, a raft of complementary developments in automated fluorescence microscopy, cell microarray platforms, quantitative image analysis and data mining, combined with multivariate statistics and computational modeling, now coalesce to produce a new research strategy, "systems microscopy", which facilitates systems biology analyses of living cells. Systems microscopy provides the crucial capacities to simultaneously extract and interrogate multiparametric quantitative data at resolution levels ranging from the molecular to the cellular, thereby elucidating a more comprehensive and richly integrated understanding of complex and dynamic cellular systems. The unique capacities of systems microscopy suggest that it will become a vital cornerstone of systems biology, and here we describe the current status and future prospects of this emerging field, as well as outlining some of the key challenges that remain to be overcome. Copyright 2010 Elsevier Inc. All rights reserved.

  16. Simultaneous extraction and quantitation of several bioactive amines in cheese and chocolate.

    PubMed

    Baker, G B; Wong, J T; Coutts, R T; Pasutto, F M

    1987-04-17

    A method is described for simultaneous extraction and quantitation of the amines 2-phenylethylamine, tele-methylhistamine, histamine, tryptamine, m- and p-tyramine, 3-methoxytyramine, 5-hydroxytryptamine, cadaverine, putrescine, spermidine and spermine. This method is based on extractive derivatization of the amines with a perfluoroacylating agent, pentafluorobenzoyl chloride, under basic aqueous conditions. Analysis was done on a gas chromatograph equipped with an electron-capture detector and a capillary column system. The procedure is relatively rapid and provides derivatives with good chromatographic properties. Its application to analysis of the above amines in cheese and chocolate products is described.

  17. Airport Surface Traffic Control Concept Formulation Study : Volume 3. Operations Analysis of O'Hare Airport - Part 2.

    DOT National Transportation Integrated Search

    1975-07-01

    The volume presents the results of the quantitative analyses of the O'Hare ASTC System operations. The operations environments for the periods selected for detailed analysis of the ASDE films and controller communications recording are described. Fol...

  18. Methodology development for quantitative optimization of security enhancement in medical information systems -Case study in a PACS and a multi-institutional radiotherapy database-.

    PubMed

    Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari

    2002-01-01

    The goal of our study is to establish a methodology for analyzing the level of security requirements, for finding suitable security measures, and for optimizing the distribution of security across every portion of medical practice. Quantitative expression is introduced wherever possible, for easy follow-up of security procedures and easy evaluation of security outcomes or results. The results of system analysis by fault tree analysis (FTA) clarified that subdividing system elements in detail contributes to much more accurate analysis. Such subdivided composition factors depended very much on the behavior of staff, interactive terminal devices, kinds of service, and network routes. In conclusion, we found methods to analyze the level of security requirements for each medical information system employing FTA, basic events for each composition factor, and combinations of basic events. Methods for finding suitable security measures were identified; namely, risk factors for each basic event, the number of elements for each composition factor, and candidate security measure elements were determined. A method to optimize the security measures for each medical information system was proposed: the optimum distribution of risk factors in terms of basic events was worked out, and comparison between medical information systems became possible.
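
    The FTA evaluation underlying such a methodology can be sketched as a recursive computation of the top-event probability from independent basic events combined through AND/OR gates; the tree encoding and the probabilities below are illustrative, not taken from the paper:

```python
def fault_tree_probability(node):
    """Top-event probability of a fault tree, assuming independent basic
    events. A node is either a float (basic-event probability) or a
    tuple ('AND' | 'OR', [children])."""
    if isinstance(node, float):
        return node
    gate, children = node
    probs = [fault_tree_probability(child) for child in children]
    p = 1.0
    if gate == 'AND':
        for q in probs:          # all children must fail
            p *= q
        return p
    for q in probs:              # OR: 1 - product of complements
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: breach if both redundant terminals fail OR a staff error
tree = ('OR', [('AND', [0.1, 0.1]), 0.05])
top = fault_tree_probability(tree)
```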

  19. Informational analysis involving application of complex information system

    NASA Astrophysics Data System (ADS)

    Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael

    The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. The approach has been applied in internal audit work that integrates the accounting field with the information systems field. Technological advancements can improve the work performed by internal audit. Thus we aim to find, in complex information systems, priorities for the internal audit work of a high-importance private institution of higher education. The method applied is quali-quantitative: from the definition of strategic linguistic variables it was possible to transform them into quantitative variables through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the institution's strategic planning, it was possible to draw conclusions concerning the points that must be prioritized in internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we conclude that, starting from these information systems, audit can identify priorities for its work program. Together with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward the attainment of the organization's objectives.

  20. Fluvial drainage networks: the fractal approach as an improvement of quantitative geomorphic analyses

    NASA Astrophysics Data System (ADS)

    Melelli, Laura; Liucci, Luisa; Vergari, Francesca; Ciccacci, Sirio; Del Monte, Maurizio

    2014-05-01

    Drainage basins are primary landscape units for geomorphological investigations. Both hillslopes and the river drainage system are fundamental components of drainage basin analysis. Like other geomorphological systems, drainage basins tend toward an equilibrium condition in which the sequence of erosion, transport, and sedimentation approaches a state of minimum energy expenditure. This state is revealed by a characteristic geometry of landforms and of the drainage net. Several morphometric indexes can measure how far a drainage basin is from the theoretical equilibrium configuration, revealing possible external disturbance. In tectonically active areas, drainage basins are of primary importance for highlighting the style, amount, and rate of tectonic impulses, and morphometric indexes allow estimation of the tectonic activity classes of different sectors in a study area. Moreover, drainage networks are characterized by a self-similar structure, which motivates the use of fractal theory to investigate the system. In this study, fractal techniques are employed together with quantitative geomorphological analysis to study the Upper Tiber Valley (UTV), a tectonic intermontane basin located in the northern Apennines (Umbria, central Italy). The area is the result of different tectonic phases. From the Late Pliocene to the present, the UTV has been strongly controlled by regional uplift and by an extensional phase, with different sets of normal faults playing a fundamental role in basin morphology. Thirty-four basins are taken into account for the quantitative analysis, twenty on the left side of the valley and the others on the right side. Using the fractal dimensions of drainage networks, the results of Horton's laws, concavity and steepness indexes, and hypsometric curves, this study aims to obtain an evolutionary model of the UTV in which regional uplift is compared to local subsidence induced by normal fault activity. The results highlight a well-defined difference between western and eastern tributary basins, suggesting greater disequilibrium in the latter. The quantitative analysis points out the segments of the basin boundaries where fault activity is more efficient, and the resulting geomorphological implications.
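    The fractal dimension of a drainage network is commonly estimated by box counting: cover the rasterized network with boxes of decreasing size and regress log N(s) against log (1/s). A minimal sketch follows, assuming the network has been reduced to (x, y) cells in the unit square; the straight-line "network" used to exercise it is illustrative only (real analyses would use mapped channel networks).

```python
import math

def box_count(points, box_size):
    """Count occupied boxes of a given size covering the point set."""
    boxes = {(int(x // box_size), int(y // box_size)) for x, y in points}
    return len(boxes)

def fractal_dimension(points, sizes):
    """Least-squares slope of log N(s) versus log (1/s)."""
    xs = [math.log(1.0 / s) for s in sizes]
    ys = [math.log(box_count(points, s)) for s in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# A straight line should have a dimension close to 1
line = [(i / 1000.0, i / 1000.0) for i in range(1000)]
d = fractal_dimension(line, [0.5, 0.25, 0.125, 0.0625, 0.03125])
```

    Branching river networks typically yield dimensions between 1 and 2, which is what makes the measure useful for comparing tributary basins.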

  1. Soft error evaluation and vulnerability analysis in Xilinx Zynq-7010 system-on chip

    NASA Astrophysics Data System (ADS)

    Du, Xuecheng; He, Chaohui; Liu, Shuhuan; Zhang, Yao; Li, Yonghong; Xiong, Ceng; Tan, Pengkang

    2016-09-01

    Radiation-induced soft errors are an increasingly important threat to the reliability of modern electronic systems. To evaluate a system-on-chip's reliability and soft-error behavior, the fault tree analysis method was used in this work. The system fault tree was constructed for the Xilinx Zynq-7010 All Programmable SoC, and the soft error rates of different components in the Zynq-7010 SoC were measured with an americium-241 alpha radiation source. Parameters used to evaluate the system's reliability and safety, such as failure rate, unavailability, and mean time to failure (MTTF), were then calculated using Isograph Reliability Workbench 11.0. Through qualitative and quantitative fault tree analysis of the system-on-chip, the critical blocks and the overall system reliability were evaluated.
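    The reliability quantities named above relate simply under a constant failure-rate (exponential) model, which is a common assumption in such tools: MTTF = 1/λ, series-system rates add, and steady-state unavailability is λ/(λ+μ). The component rates below are invented for illustration, not measured Zynq-7010 values.

```python
def series_failure_rate(rates):
    """Failure rate of a series system: component rates sum."""
    return sum(rates)

def mttf(rate):
    """Mean time to failure under an exponential failure model."""
    return 1.0 / rate

def unavailability(rate, repair_rate):
    """Steady-state unavailability: lambda / (lambda + mu)."""
    return rate / (rate + repair_rate)

# Invented per-block soft-error rates, in failures per hour
lam = series_failure_rate([1e-6, 2e-6, 5e-7])
m = mttf(lam)  # hours until the first block failure, on average
```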

  2. The quantitation of buffering action II. Applications of the formal & general approach.

    PubMed

    Schmitt, Bernhard M

    2005-03-16

    The paradigm of "buffering" originated in acid-base physiology, but was subsequently extended to other fields and is now used for a wide and diverse set of phenomena. In the preceding article, we have presented a formal and general approach to the quantitation of buffering action. Here, we use that buffering concept for a systematic treatment of selected classical and other buffering phenomena. H+ buffering by weak acids and "self-buffering" in pure water represent "conservative buffered systems" whose analysis reveals buffering properties that contrast in important aspects from classical textbook descriptions. The buffering of organ perfusion in the face of variable perfusion pressure (also termed "autoregulation") can be treated in terms of "non-conservative buffered systems", the general form of the concept. For the analysis of cytoplasmic Ca++ concentration transients (also termed "muffling"), we develop a related unit that is able to faithfully reflect the time-dependent quantitative aspect of buffering during the pre-steady state period. Steady-state buffering is shown to represent the limiting case of time-dependent muffling, namely for infinitely long time intervals and infinitely small perturbations. Finally, our buffering concept provides a stringent definition of "buffering" on the level of systems and control theory, resulting in four absolute ratio scales for control performance that are suited to measure disturbance rejection and setpoint tracking, and both their static and dynamic aspects. Our concept of buffering provides a powerful mathematical tool for the quantitation of buffering action in all its appearances.
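    One hedged reading of the quantitation idea above is to compare the change a perturbation would cause in the unbuffered system with the change actually observed, and express buffering strength as their ratio. This is a simplified illustration of the ratio-scale notion, not the paper's exact formalism; the numbers are invented.

```python
def buffering_ratio(delta_unbuffered, delta_buffered):
    """Ratio > 1 means the system attenuates the perturbation."""
    return delta_unbuffered / delta_buffered

# e.g. an acid load that would shift pH by 1.0 unit in an unbuffered
# system shifts the buffered solution by only 0.05 units
b = buffering_ratio(1.0, 0.05)
```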

  3. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historical and potential future use for air transportation analysis.

  4. Robot and Human Surface Operations on Solar System Bodies

    NASA Technical Reports Server (NTRS)

    Weisbin, C. R.; Easter, R.; Rodriguez, G.

    2001-01-01

    This paper presents a comparison of robot and human surface operations on solar system bodies. The topics include: 1) Long Range Vision of Surface Scenarios; 2) Human and Robots Complement Each Other; 3) Respective Human and Robot Strengths; 4) Need More In-Depth Quantitative Analysis; 5) Projected Study Objectives; 6) Analysis Process Summary; 7) Mission Scenarios Decompose into Primitive Tasks; 8) Features of the Projected Analysis Approach; and 9) The "Getting There Effect" is a Major Consideration. This paper is in viewgraph form.

  5. 77 FR 68773 - FIFRA Scientific Advisory Panel; Notice of Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

    ... for physical chemical properties that cannot be easily tested in in vitro systems or stable enough for.... Quantitative structural-activity relationship (QSAR) models and estrogen receptor (ER) expert systems development. High-throughput data generation and analysis (expertise focused on how this methodology can be...

  6. MIDAS Website. Revised

    NASA Technical Reports Server (NTRS)

    Goodman, Allen; Shively, R. Joy (Technical Monitor)

    1997-01-01

    MIDAS, Man-machine Integration Design and Analysis System, is a unique combination of software tools aimed at reducing design cycle time, supporting quantitative predictions of human-system effectiveness and improving the design of crew stations and their associated operating procedures. This project is supported jointly by the US Army and NASA.

  7. Magnetic fingerprints of rolling cells for quantitative flow cytometry in whole blood

    NASA Astrophysics Data System (ADS)

    Reisbeck, Mathias; Helou, Michael Johannes; Richter, Lukas; Kappes, Barbara; Friedrich, Oliver; Hayden, Oliver

    2016-09-01

    Over the past 50 years, flow cytometry has had a profound impact on preclinical and clinical applications requiring single cell function information for counting, sub-typing and quantification of epitope expression. At the same time, the workflow complexity and high costs of such optical systems still limit flow cytometry applications to specialized laboratories. Here, we present a quantitative magnetic flow cytometer that incorporates in situ magnetophoretic cell focusing for highly accurate and reproducible rolling of the cellular targets over giant magnetoresistance sensing elements. Time-of-flight analysis is used to unveil quantitative single cell information contained in its magnetic fingerprint. Furthermore, we used erythrocytes as a biological model to validate our methodology with respect to precise analysis of the hydrodynamic cell diameter, quantification of binding capacity of immunomagnetic labels, and discrimination of cell morphology. The extracted time-of-flight information should enable point-of-care quantitative flow cytometry in whole blood for clinical applications, such as immunology and primary hemostasis.
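    The time-of-flight (ToF) readout described above can be sketched as follows: a cell rolling at velocity v over a sensing element of known length L produces a signal of duration ToF = L/v, so v = L/ToF. The sensor length and the linear velocity-to-diameter calibration below are invented assumptions purely for illustration; the actual mapping would be established experimentally.

```python
SENSOR_LENGTH_UM = 30.0  # assumed sensing-element length, micrometers

def rolling_velocity(tof_ms):
    """Rolling velocity in um/ms from the measured time of flight."""
    return SENSOR_LENGTH_UM / tof_ms

def diameter_um(tof_ms, slope=2.0, offset=1.0):
    """Invented linear calibration from rolling velocity to diameter."""
    return slope * rolling_velocity(tof_ms) + offset
```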

  8. Quantitative Confocal Microscopy Analysis as a Basis for Search and Study of Potassium Kv1.x Channel Blockers

    NASA Astrophysics Data System (ADS)

    Feofanov, Alexey V.; Kudryashova, Kseniya S.; Nekrasova, Oksana V.; Vassilevski, Alexander A.; Kuzmenkov, Alexey I.; Korolkova, Yuliya V.; Grishin, Eugene V.; Kirpichnikov, Mikhail P.

    Artificial KcsA-Kv1.x (x = 1, 3) receptors were recently designed by transferring the ligand-binding site from human Kv1.x voltage-gated potassium channels into the corresponding domain of the bacterial KcsA channel. We found that KcsA-Kv1.x receptors expressed in E. coli cells are embedded in the cell membrane and bind ligands when the cells are converted to spheroplasts. We reasoned that E. coli spheroplasts with membrane-embedded KcsA-Kv1.x and the fluorescently labeled ligand agitoxin-2 (R-AgTx2) could be used as elements of an advanced analytical system for the search for and study of Kv1-channel blockers. To realize this idea, special procedures were developed for the measurement and quantitative treatment of fluorescence signals obtained from the spheroplast membrane using confocal laser scanning microscopy (CLSM). The resulting "mix and read" analytical systems, supported by quantitative CLSM analysis, were demonstrated to be a reliable alternative to radioligand and electrophysiological techniques in the search for and study of selective Kv1.x channel blockers of high scientific and medical importance.

  9. Spectrally And Temporally Resolved Low-Light Level Video Microscopy

    NASA Astrophysics Data System (ADS)

    Wampler, John E.; Furukawa, Ruth; Fechheimer, Marcus

    1989-12-01

    The IDG low-light video microscope system was designed to aid studies of the localization of subcellular luminescence sources and of stimulus/response coupling in single living cells using luminescent probes. Much of the motivation for the design of this instrument system came from the pioneering efforts of Dr. Reynolds (Reynolds, Q. Rev. Biophys. 5, 295-347; Reynolds and Taylor, Bioscience 30, 586-592), who showed the value of intensified video camera systems for the detection and localization of fluorescence and bioluminescence signals from biological tissues. Our instrument system has essentially two roles: 1) localization and quantitation of very weak bioluminescence signals, and 2) quantitation of intracellular environmental characteristics such as pH and calcium ion concentration using fluorescent and bioluminescent probes. The instrument exhibits an operating range of over one million fold, allowing visualization and enhancement of quantum-limited images with quantum-limited response, spectral analysis of fluorescence signals, and transmitted-light imaging. The computer control of the system implements rapid switching between light regimes, spatially resolved spectral scanning, and digital data processing for spectral shape analysis and for detailed analysis of the statistical distribution of single-cell measurements. The system design and the software algorithms used by the system are summarized. These design criteria are illustrated with examples taken from studies of bioluminescence, applications of bioluminescence to the study of developmental processes and gene expression in single living cells, and applications of fluorescent probes to the study of stimulus/response coupling in living cells.

  10. Quantitative system drift compensates for altered maternal inputs to the gap gene network of the scuttle fly Megaselia abdita

    PubMed Central

    Wotton, Karl R; Jiménez-Guri, Eva; Crombach, Anton; Janssens, Hilde; Alcaine-Colet, Anna; Lemke, Steffen; Schmidt-Ott, Urs; Jaeger, Johannes

    2015-01-01

    The segmentation gene network in insects can produce equivalent phenotypic outputs despite differences in upstream regulatory inputs between species. We investigate the mechanistic basis of this phenomenon through a systems-level analysis of the gap gene network in the scuttle fly Megaselia abdita (Phoridae). It combines quantification of gene expression at high spatio-temporal resolution with systematic knock-downs by RNA interference (RNAi). Initiation and dynamics of gap gene expression differ markedly between M. abdita and Drosophila melanogaster, while the output of the system converges to equivalent patterns at the end of the blastoderm stage. Although the qualitative structure of the gap gene network is conserved, there are differences in the strength of regulatory interactions between species. We term such network rewiring ‘quantitative system drift’. It provides a mechanistic explanation for the developmental hourglass model in the dipteran lineage. Quantitative system drift is likely to be a widespread mechanism for developmental evolution. DOI: http://dx.doi.org/10.7554/eLife.04785.001 PMID:25560971

  11. Qualitative and Quantitative Pedigree Analysis: Graph Theory, Computer Software, and Case Studies.

    ERIC Educational Resources Information Center

    Jungck, John R.; Soderberg, Patti

    1995-01-01

    Presents a series of elementary mathematical tools for re-representing pedigrees, pedigree generators, pedigree-driven database management systems, and case studies for exploring genetic relationships. (MKR)

  12. Quantitative kinetic analysis of lung nodules by temporal subtraction technique in dynamic chest radiography with a flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2007-03-01

    Early detection and treatment of lung cancer is one of the most effective means of reducing cancer mortality, and chest X-ray radiography has been widely used as a screening or health-checkup examination. A new examination method and the development of a computer analysis system make it possible to capture respiratory kinetics with a flat panel detector (FPD), an extension of conventional chest X-ray radiography. Functional evaluation of respiratory kinetics in the chest has thereby become available, and its introduction into clinical practice is expected. In this study, we developed a computer analysis algorithm for detecting lung nodules and evaluating their quantitative kinetics. A breathing chest radiograph sequence obtained with a modified FPD was converted into four static feature images by sequential temporal subtraction, morphologic enhancement, kinetic visualization, and lung region detection, after a breath synchronization process based on diaphragmatic analysis of the vector movement. An artificial neural network analyzing the density patterns detected the true nodules from these static images and drew their kinetic tracks. In an evaluation of algorithm performance and clinical effectiveness with seven normal patients and simulated nodules, the method showed sufficient detection capability and kinetic imaging function, with no statistically significant difference between them. Our technique can quantitatively evaluate the kinetic range of nodules and is effective in detecting a nodule on a breathing chest radiograph. Moreover, this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.
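    Sequential temporal subtraction, the first processing step named above, amounts to frame-by-frame differencing of the breathing image sequence so that moving structures stand out. A minimal sketch, with frames modeled as 2-D lists of pixel intensities (values invented):

```python
def temporal_subtraction(frames):
    """Absolute difference between consecutive frames."""
    diffs = []
    for prev, cur in zip(frames, frames[1:]):
        diffs.append([[abs(c - p) for p, c in zip(pr, cr)]
                      for pr, cr in zip(prev, cur)])
    return diffs

# Two tiny 2x2 frames; one pixel brightens between them
frames = [
    [[10, 10], [10, 10]],
    [[10, 25], [10, 10]],
]
diffs = temporal_subtraction(frames)  # motion shows up as nonzero pixels
```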

  13. Chemical Applications of a Programmable Image Acquisition System

    NASA Astrophysics Data System (ADS)

    Ogren, Paul J.; Henry, Ian; Fletcher, Steven E. S.; Kelly, Ian

    2003-06-01

    Image analysis is widely used in chemistry, both for rapid qualitative evaluations using techniques such as thin layer chromatography (TLC) and for quantitative purposes such as well-plate measurements of analyte concentrations or fragment-size determinations in gel electrophoresis. This paper describes a programmable system for image acquisition and processing that is currently used in the laboratories of our organic and physical chemistry courses. It has also been used in student research projects in analytical chemistry and biochemistry. The potential range of applications is illustrated by brief presentations of four examples: (1) using well-plate optical transmission data to construct a standard concentration absorbance curve; (2) the quantitative analysis of acetaminophen in Tylenol and acetylsalicylic acid in aspirin using TLC with fluorescence detection; (3) the analysis of electrophoresis gels to determine DNA fragment sizes and amounts; and, (4) using color change to follow reaction kinetics. The supplemental material in JCE Online contains information on two additional examples: deconvolution of overlapping bands in protein gel electrophoresis, and the recovery of data from published images or graphs. The JCE Online material also presents additional information on each example, on the system hardware and software, and on the data analysis methodology.
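    Example (1) above converts well-plate optical transmission data to absorbance (A = -log10 T, the Beer-Lambert relation) and fits a standard curve A = m·c + b. The concentrations and transmission values below are invented, idealized data used only to exercise the fit.

```python
import math

def absorbance(transmission):
    """Beer-Lambert absorbance from fractional transmission."""
    return -math.log10(transmission)

def linear_fit(xs, ys):
    """Least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

conc = [0.0, 1.0, 2.0, 4.0]        # invented concentrations, mM
trans = [1.0, 0.5, 0.25, 0.0625]   # ideal Beer-Lambert transmission
m, b = linear_fit(conc, [absorbance(t) for t in trans])
```

    An unknown sample's concentration is then recovered as (A - b) / m.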

  14. Establishment of a sensitive system for analysis of human vaginal microbiota on the basis of rRNA-targeted reverse transcription-quantitative PCR.

    PubMed

    Kurakawa, Takashi; Ogata, Kiyohito; Tsuji, Hirokazu; Kado, Yukiko; Takahashi, Takuya; Kida, Yumi; Ito, Masahiro; Okada, Nobuhiko; Nomoto, Koji

    2015-04-01

    Ten specific primer sets, for Lactobacillus gasseri, Lactobacillus crispatus, Atopobium vaginae, Gardnerella vaginalis, Mobiluncus curtisii, Chlamydia trachomatis/muridarum, Bifidobacterium longum subsp. longum, Bifidobacterium longum subsp. infantis, Bifidobacterium adolescentis, and Bifidobacterium angulatum, were developed for quantitative analysis of vaginal microbiota. rRNA-targeted reverse transcription-quantitative PCR (RT-qPCR) analysis of the vaginal samples from 12 healthy Japanese volunteers using the new primer sets together with 25 existing primer sets revealed the diversity of their vaginal microbiota: lactobacilli such as L. crispatus, L. gasseri, Lactobacillus jensenii, Lactobacillus iners, and Lactobacillus vaginalis, as the major populations at 10^7 cells/ml vaginal fluid, were followed by facultative anaerobes such as Streptococcus and strict anaerobes at lower population levels of 10^4 cells/ml or less. Certain bacterial vaginosis (BV)-related bacteria, such as G. vaginalis, A. vaginae, M. curtisii, and Prevotella, were also detected in some subjects. In one subject in particular, both G. vaginalis and A. vaginae were detected at high population levels of 10^8.8 and 10^8.9 cells/ml vaginal fluid, suggesting that she is an asymptomatic BV patient. These results suggest that the RT-qPCR system is effective for accurate analysis of major vaginal commensals and diagnosis of several vaginal infections. Copyright © 2015 Elsevier B.V. All rights reserved.
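    qPCR-based quantification like the above typically converts a quantification cycle (Cq) to a cell count via a log-linear standard curve fitted per primer set. The slope and intercept below are invented placeholders, not values from this study; a slope near -3.32 corresponds to 100% amplification efficiency.

```python
def cells_per_ml(cq, slope=-3.32, intercept=38.0):
    """Convert a Cq value to cells/ml via a log-linear standard curve."""
    log10_count = (cq - intercept) / slope
    return 10 ** log10_count
```

    With these placeholder parameters, each ~3.32-cycle decrease in Cq corresponds to a tenfold higher population.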

  15. Blackboard architecture for medical image interpretation

    NASA Astrophysics Data System (ADS)

    Davis, Darryl N.; Taylor, Christopher J.

    1991-06-01

    There is a growing interest in using sophisticated knowledge-based systems for biomedical image interpretation. We present a principled attempt to use artificial intelligence methodologies in interpreting lateral skull x-ray images. Such radiographs are routinely used in cephalometric analysis to provide quantitative measurements useful to clinical orthodontists. Manual and interactive methods of analysis are known to be error prone and previous attempts to automate this analysis typically fail to capture the expertise and adaptability required to cope with the variability in biological structure and image quality. An integrated model-based system has been developed which makes use of a blackboard architecture and multiple knowledge sources. A model definition interface allows quantitative models, of feature appearance and location, to be built from examples as well as more qualitative modelling constructs. Visual task definition and blackboard control modules allow task-specific knowledge sources to act on information available to the blackboard in a hypothesise and test reasoning cycle. Further knowledge-based modules include object selection, location hypothesis, intelligent segmentation, and constraint propagation systems. Alternative solutions to given tasks are permitted.

  16. Silica-based ionic liquid coating for 96-blade system for extraction of aminoacids from complex matrixes.

    PubMed

    Mousavi, Fatemeh; Pawliszyn, Janusz

    2013-11-25

    1-Vinyl-3-octadecylimidazolium bromide ionic liquid [C18VIm]Br was prepared and used for the modification of mercaptopropyl-functionalized silica (Si-MPS) through surface radical chain-transfer addition. The synthesized octadecylimidazolium-modified silica (SiImC18) was characterized by thermogravimetric analysis (TGA), infrared spectroscopy (IR), and (13)C and (29)Si NMR spectroscopy, and used as an extraction phase for the automated 96-blade solid-phase microextraction (SPME) system with thin-film geometry using polyacrylonitrile (PAN) glue. The proposed extraction phase was applied to the extraction of amino acids from grape pulp, and an LC-MS/MS method was developed for separation of the model compounds. Extraction efficiency, reusability, linearity, limit of detection, limit of quantitation, and matrix effect were evaluated. The whole sample preparation process for the proposed method requires 270 min for 96 samples simultaneously (60 min preconditioning, 90 min extraction, 60 min desorption, and 60 min for the carryover step) using the 96-blade SPME system. Inter-blade and intra-blade reproducibility were in the respective ranges of 5-13% and 3-10% relative standard deviation (RSD) for all model compounds. Limits of detection and quantitation of the proposed SPME-LC-MS/MS system were found to range from 0.1 to 1.0 and 0.5 to 3.0 μg L^-1, respectively. Standard addition calibration was applied for quantitative analysis of amino acids from grape juice, and the results were validated against a solvent extraction (SE) technique. Copyright © 2013 Elsevier B.V. All rights reserved.
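    Limits of detection and quantitation such as those reported above are commonly estimated from the calibration slope S and the standard deviation σ of blank responses (LOD = 3.3σ/S, LOQ = 10σ/S). Whether the authors used exactly this convention is an assumption; the values below are invented.

```python
def lod(sigma, slope):
    """Limit of detection from blank noise and calibration slope."""
    return 3.3 * sigma / slope

def loq(sigma, slope):
    """Limit of quantitation from blank noise and calibration slope."""
    return 10.0 * sigma / slope

# Invented example: blank noise 0.1 units, slope 2.0 units per ug/L
example_lod = lod(0.1, 2.0)
example_loq = loq(0.1, 2.0)
```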

  17. HuMOVE: a low-invasive wearable monitoring platform in sexual medicine.

    PubMed

    Ciuti, Gastone; Nardi, Matteo; Valdastri, Pietro; Menciassi, Arianna; Basile Fasolo, Ciro; Dario, Paolo

    2014-10-01

    To investigate an accelerometer-based wearable system, named the Human Movement (HuMOVE) platform, designed to enable quantitative and continuous measurement of sexual performance with minimal invasiveness and inconvenience for users. HuMOVE, a wearable platform equipped with an accelerometer for monitoring inertial parameters for sexual performance assessment and diagnosis, was designed, implemented, and developed. The system enables quantitative measurement of movement parameters during sexual intercourse, meeting the requirements of wearability, data storage, sampling rate, and interfacing methods that are fundamental for analyzing human sexual intercourse performance. HuMOVE was validated through characterization on a controlled experimental test bench and evaluated in a human model under simulated sexual intercourse conditions. HuMOVE proved to be a robust quantitative monitoring platform and a reliable candidate for sexual performance evaluation and diagnosis. Characterization on the controlled test bench demonstrated accurate correlation between the HuMOVE system and data from a reference displacement sensor. Experimental tests in the human model under simulated intercourse conditions confirmed the accuracy of the platform and the effectiveness of the selected and derived parameters. The outcomes also met the project expectations in terms of usability and comfort, as evidenced by questionnaires highlighting the low invasiveness and acceptance of the device. To the best of our knowledge, the HuMOVE platform is the first device for human sexual performance analysis compatible with sexual intercourse; the system has the potential to be a helpful tool for physicians to accurately classify sexual disorders, such as premature or delayed ejaculation. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Method and apparatus for transport, introduction, atomization and excitation of emission spectrum for quantitative analysis of high temperature gas sample streams containing vapor and particulates without degradation of sample stream temperature

    DOEpatents

    Eckels, David E.; Hass, William J.

    1989-05-30

    A sample transport, sample introduction, and flame excitation system for spectrometric analysis of high temperature gas streams which eliminates degradation of the sample stream by condensation losses.

  19. An Empirical Study of Education Divide Diminishment through Online Learning Courses

    ERIC Educational Resources Information Center

    Hsieh, Ming-Yuan

    2017-01-01

    Amid the swift development of its education system, the Taiwanese government has been devoted to diminishing the educational divide between rural and urban regions. This research focuses on this educational divide by cross-employing the Grey Relational Analysis (GRA) of quantitative analysis and the Fuzzy Set Qualitative Comparative Analysis…
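    Grey Relational Analysis, named above, compares alternative data sequences to a reference sequence via grey relational coefficients and averages them into a grade. A minimal sketch follows, assuming the sequences are already normalized and using the conventional distinguishing coefficient ρ = 0.5; the data are invented.

```python
def grey_relational_grades(reference, alternatives, rho=0.5):
    """Grey relational grade of each alternative against the reference."""
    deltas = [[abs(r - x) for r, x in zip(reference, alt)]
              for alt in alternatives]
    d_min = min(min(row) for row in deltas)
    d_max = max(max(row) for row in deltas)
    grades = []
    for row in deltas:
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

ref = [1.0, 1.0, 1.0]                       # ideal (reference) sequence
alts = [[0.9, 0.8, 1.0], [0.5, 0.4, 0.6]]   # invented observations
g = grey_relational_grades(ref, alts)       # higher grade = closer to ideal
```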

  20. The use of morphological characteristics and texture analysis in the identification of tissue composition in prostatic neoplasia.

    PubMed

    Diamond, James; Anderson, Neil H; Bartels, Peter H; Montironi, Rodolfo; Hamilton, Peter W

    2004-09-01

    Quantitative examination of prostate histology offers clues for the diagnostic classification of lesions and for the prediction of treatment response and prognosis. To facilitate the collection of quantitative data, the development of machine vision systems is necessary. This study explored the use of imaging for identifying tissue abnormalities in prostate histology. Medium-power histological scenes were recorded from whole-mount radical prostatectomy sections at ×40 objective magnification and assessed by a pathologist as exhibiting stroma, normal tissue (the non-neoplastic epithelial component), or prostatic carcinoma (PCa). A machine vision system was developed that divided the scenes into subregions of 100 × 100 pixels and subjected each to image-processing techniques. Analysis of morphological characteristics allowed the identification of normal tissue. Analysis of image texture demonstrated that Haralick feature 4 was the most suitable for discriminating stroma from PCa. Using these morphological and texture measurements, it was possible to define a classification scheme for each subregion. The machine vision system integrates these classification rules and generates digital maps of tissue composition from the classification of subregions; 79.3% of subregions were correctly classified. The established classification rates demonstrated the validity of the methodology on small scenes; a logical extension was to apply the methodology to whole-slide images via scanning technology, and the machine vision system is capable of classifying these images. The system developed in this project facilitates the exploration of morphological and texture characteristics in quantifying tissue composition, and illustrates the potential of quantitative methods to provide highly discriminatory information in the automated identification of prostatic lesions using computer vision.
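    The texture pipeline above can be sketched as: build a grey-level co-occurrence matrix (GLCM) for an image subregion, then compute a Haralick statistic from it. Haralick's feature 4 is the sum of squares (variance) of the GLCM; the 4 × 4 patch below is invented, and real subregions would be 100 × 100 pixels with many grey levels.

```python
def glcm(patch, levels, dx=1, dy=0):
    """Normalized co-occurrence matrix for one pixel offset."""
    m = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(patch), len(patch[0])
    n = 0
    for i in range(rows):
        for j in range(cols):
            i2, j2 = i + dy, j + dx
            if 0 <= i2 < rows and 0 <= j2 < cols:
                m[patch[i][j]][patch[i2][j2]] += 1
                n += 1
    return [[v / n for v in row] for row in m]

def haralick_variance(p):
    """Haralick feature 4: sum of squares (variance) of the GLCM."""
    levels = len(p)
    mu = sum(i * p[i][j] for i in range(levels) for j in range(levels))
    return sum((i - mu) ** 2 * p[i][j]
               for i in range(levels) for j in range(levels))

patch = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 2, 2, 2],
         [2, 2, 3, 3]]
v = haralick_variance(glcm(patch, levels=4))
```

    A classifier would then threshold such per-subregion feature values to label each tile as stroma, normal tissue, or PCa.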

  1. The PAXgene® Tissue System Preserves Phosphoproteins in Human Tissue Specimens and Enables Comprehensive Protein Biomarker Research

    PubMed Central

    Gündisch, Sibylle; Schott, Christina; Wolff, Claudia; Tran, Kai; Beese, Christian; Viertler, Christian; Zatloukal, Kurt; Becker, Karl-Friedrich

    2013-01-01

    Precise quantitation of protein biomarkers in clinical tissue specimens is a prerequisite for accurate and effective diagnosis, prognosis, and personalized medicine. Although progress is being made, protein analysis from formalin-fixed and paraffin-embedded tissues is still challenging. In previous reports, we showed that the novel formalin-free tissue preservation technology, the PAXgene Tissue System, allows the extraction of intact and immunoreactive proteins from PAXgene-fixed and paraffin-embedded (PFPE) tissues. In the current study, we focused on the analysis of phosphoproteins and the applicability of two-dimensional gel electrophoresis (2D-PAGE) and enzyme-linked immunosorbent assay (ELISA) to the analysis of a variety of malignant and non-malignant human tissues. Using western blot analysis, we found that phosphoproteins are quantitatively preserved in PFPE tissues, and signal intensities are comparable to that in paired, frozen tissues. Furthermore, proteins extracted from PFPE samples are suitable for 2D-PAGE and can be quantified by ELISA specific for denatured proteins. In summary, the PAXgene Tissue System reliably preserves phosphoproteins in human tissue samples, even after prolonged fixation or stabilization times, and is compatible with methods for protein analysis such as 2D-PAGE and ELISA. We conclude that the PAXgene Tissue System has the potential to serve as a versatile tissue fixative for modern pathology. PMID:23555997

  2. Application of social capital by Bali cattle farmers participating in the partnership system in Barru Regency, South Sulawesi Province

    NASA Astrophysics Data System (ADS)

    Sirajuddin, S. N.; Siregar, A. R.; Mappigau, P.

    2018-05-01

    There are four partnership models applied across livestock commodities, including beef cattle: centralized, multipartite, intermediary, and informal. Partnership in the beef cattle business has been implemented in Barru through the cattle showroom program (SRS). This study aimed to examine the application of social capital by beef cattle breeders who joined the partnership system (cattle showroom program) in Barru. The research was conducted in April 2017 in Tanete Riaja district. The population comprised all farmers in Barru Regency who joined the partnership system (showroom program), and the sample consisted of the beef cattle breeders who joined the partnership system in Tanete Riaja district, Barru Regency. The research is quantitative descriptive, using both quantitative and qualitative data from primary and secondary sources. Data were analyzed with descriptive statistics on a Likert scale. The results show that the social capital (trust, linkage, norms) of the beef cattle breeders who joined the partnership system (cattle showroom program) is at a high level.
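    The Likert-scale descriptive analysis can be sketched as scoring responses 1-5 and mapping the mean onto interval labels. The cutoffs and survey responses below are invented assumptions for illustration, not the authors' exact scheme.

```python
def likert_category(responses):
    """Map mean 1-5 Likert responses to an invented interval label."""
    mean = sum(responses) / len(responses)
    if mean >= 3.4:
        return "high"
    if mean >= 2.6:
        return "medium"
    return "low"

trust_scores = [4, 5, 4, 4, 3]  # invented responses on the trust dimension
trust_level = likert_category(trust_scores)
```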

  3. Quantitative phosphoproteomic analysis of host responses in human lung epithelial (A549) cells during influenza virus infection.

    PubMed

    Dapat, Clyde; Saito, Reiko; Suzuki, Hiroshi; Horigome, Tsuneyoshi

    2014-01-22

    The emergence of antiviral drug-resistant influenza viruses highlights the need for alternative therapeutic strategies. Elucidation of host factors required during virus infection provides information not only on the signaling pathways involved but also on the identification of novel drug targets. RNA interference screening has been utilized in several studies to determine these host factors; however, proteomics data on influenza host factors are currently limited. In this study, quantitative phosphoproteomic analysis of a human lung cell line (A549) infected with the 2009 pandemic influenza A (H1N1) virus was performed. Phosphopeptides were enriched from tryptic digests of total protein from infected and mock-infected cells using a titania column on an automated purification system, followed by iTRAQ labeling. Identification and quantitative analysis of iTRAQ-labeled phosphopeptides were performed using LC-MS/MS. We identified 366 phosphorylation sites on 283 proteins. Of these, we detected 43 upregulated and 35 downregulated proteins during influenza virus infection. Gene ontology enrichment analysis showed that the majority of the identified proteins are phosphoproteins involved in RNA processing, immune system processes and the response to infection. Host-virus interaction network analysis identified 23 densely connected subnetworks, 13 of which contained proteins with altered phosphorylation levels during influenza virus infection. Our results will help to identify potential drug targets that can be pursued for influenza antiviral drug development. Copyright © 2013 Elsevier B.V. All rights reserved.

  4. A perspective on bridging scales and design of models using low-dimensional manifolds and data-driven model inference

    PubMed Central

    Zenil, Hector; Kiani, Narsis A.; Ball, Gordon; Gomez-Cabrero, David

    2016-01-01

    Systems in nature capable of collective behaviour are nonlinear, operating across several scales. Yet our ability to account for their collective dynamics differs in physics, chemistry and biology. Here, we briefly review the similarities and differences between mathematical modelling of adaptive living systems versus physico-chemical systems. We find that physics-based chemistry modelling and computational neuroscience have a shared interest in developing techniques for model reductions aiming at the identification of a reduced subsystem or slow manifold, capturing the effective dynamics. By contrast, as relations and kinetics between biological molecules are less characterized, current quantitative analysis under the umbrella of bioinformatics focuses on signal extraction, correlation, regression and machine-learning analysis. We argue that model reduction analysis and the ensuing identification of manifolds bridges physics and biology. Furthermore, modelling living systems presents deep challenges, such as how to reconcile rich molecular data with inherent modelling uncertainties (formalism, variables selection and model parameters). We anticipate a new generative data-driven modelling paradigm constrained by identified governing principles extracted from low-dimensional manifold analysis. The rise of a new generation of models will ultimately connect biology to quantitative mechanistic descriptions, thereby setting the stage for investigating the character of the model language and principles driving living systems. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698038

  5. Grass Grows, the Cow Eats: A Simple Grazing Systems Model with Emergent Properties

    ERIC Educational Resources Information Center

    Ungar, Eugene David; Seligman, Noam G.; Noy-Meir, Imanuel

    2004-01-01

    We describe a simple, yet intellectually challenging model of grazing systems that introduces basic concepts in ecology and systems analysis. The practical is suitable for high-school and university curricula with a quantitative orientation, and requires only basic skills in mathematics and spreadsheet use. The model is based on Noy-Meir's (1975)…

  6. System safety education focused on flight safety

    NASA Technical Reports Server (NTRS)

    Holt, E.

    1971-01-01

    The measures necessary for achieving higher levels of system safety are analyzed with an eye toward maintaining the combat capability of the Air Force. Several education courses were provided for personnel involved in safety management. Data include: (1) Flight Safety Officer Course, (2) Advanced Safety Program Management, (3) Fundamentals of System Safety, and (4) Quantitative Methods of Safety Analysis.

  7. A conductive grating sensor for online quantitative monitoring of fatigue crack.

    PubMed

    Li, Peiyuan; Cheng, Li; Yan, Xiaojun; Jiao, Shengbo; Li, Yakun

    2018-05-01

    Online quantitative monitoring of crack damage due to fatigue is a critical challenge for structural health monitoring systems assessing structural safety. To achieve online quantitative monitoring of fatigue crack, a novel conductive grating sensor based on the principle of electrical potential difference is proposed. The sensor consists of equidistant grating channels to monitor the fatigue crack length and conductive bars to provide the circuit path. An online crack monitoring system is established to verify the sensor's capability. The experimental results prove that the sensor is suitable for online quantitative monitoring of fatigue crack. A finite element model for the sensor is also developed to optimize the sensitivity of crack monitoring, which is defined by the rate of sensor resistance change caused by the break of the first grating channel. Analysis of the model shows that the sensor sensitivity can be enhanced by reducing the number of grating channels and increasing their resistance and reducing the resistance of the conductive bar.
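    The sensitivity trends reported here follow directly from a series-parallel resistance model of the sensor: the grating channels act as equal resistors in parallel, in series with the conductive bar. A minimal sketch (the channel and bar resistance values below are illustrative assumptions, not the paper's measurements):

    ```python
    def sensor_resistance(n_channels, r_channel, r_bar):
        # n equal grating channels in parallel, in series with the conductive bar
        return r_channel / n_channels + r_bar

    def sensitivity(n_channels, r_channel, r_bar):
        # Relative resistance change when the first grating channel breaks
        before = sensor_resistance(n_channels, r_channel, r_bar)
        after = sensor_resistance(n_channels - 1, r_channel, r_bar)
        return (after - before) / before

    # The model reproduces the abstract's conclusions: fewer channels, higher
    # channel resistance, and lower bar resistance all raise the sensitivity.
    print(sensitivity(10, 100.0, 1.0))
    print(sensitivity(5, 100.0, 1.0))    # fewer channels
    print(sensitivity(10, 1000.0, 1.0))  # higher channel resistance
    print(sensitivity(10, 100.0, 0.1))   # lower bar resistance
    ```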

  8. A conductive grating sensor for online quantitative monitoring of fatigue crack

    NASA Astrophysics Data System (ADS)

    Li, Peiyuan; Cheng, Li; Yan, Xiaojun; Jiao, Shengbo; Li, Yakun

    2018-05-01

    Online quantitative monitoring of crack damage due to fatigue is a critical challenge for structural health monitoring systems assessing structural safety. To achieve online quantitative monitoring of fatigue crack, a novel conductive grating sensor based on the principle of electrical potential difference is proposed. The sensor consists of equidistant grating channels to monitor the fatigue crack length and conductive bars to provide the circuit path. An online crack monitoring system is established to verify the sensor's capability. The experimental results prove that the sensor is suitable for online quantitative monitoring of fatigue crack. A finite element model for the sensor is also developed to optimize the sensitivity of crack monitoring, which is defined by the rate of sensor resistance change caused by the break of the first grating channel. Analysis of the model shows that the sensor sensitivity can be enhanced by reducing the number of grating channels and increasing their resistance and reducing the resistance of the conductive bar.

  9. Stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA) for quantitative nanoscale assessment of spatial protein organization.

    PubMed

    Veeraraghavan, Rengasayee; Gourdie, Robert G

    2016-11-07

    The spatial association between proteins is crucial to understanding how they function in biological systems. Colocalization analysis of fluorescence microscopy images is widely used to assess this. However, colocalization analysis performed on two-dimensional images with diffraction-limited resolution merely indicates that the proteins are within 200-300 nm of each other in the xy-plane and within 500-700 nm of each other along the z-axis. Here we demonstrate a novel three-dimensional quantitative analysis applicable to single-molecule positional data: stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA). This method offers significant advantages: 1) STORM imaging affords 20-nm resolution in the xy-plane and <50 nm along the z-axis; 2) STORM-RLA provides a quantitative assessment of the frequency and degree of overlap between clusters of colabeled proteins; and 3) STORM-RLA also calculates the precise distances between both overlapping and nonoverlapping clusters in three dimensions. Thus STORM-RLA represents a significant advance in the high-throughput quantitative assessment of the spatial organization of proteins. © 2016 Veeraraghavan and Gourdie. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
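    The cluster-distance step of such an analysis can be illustrated with plain centroid geometry. This is a simplified stand-in for STORM-RLA itself, using hypothetical localization coordinates (nm):

    ```python
    import math

    def centroid(points):
        """Centre of mass of a cluster of (x, y, z) localizations."""
        n = len(points)
        return tuple(sum(p[i] for p in points) / n for i in range(3))

    def cluster_distances(clusters_a, clusters_b):
        """Pairwise 3D centre-to-centre distances between two sets of clusters."""
        out = []
        for ca in clusters_a:
            for cb in clusters_b:
                out.append(math.dist(centroid(ca), centroid(cb)))
        return out

    # Two hypothetical clusters of single-molecule localizations, in nm
    a = [[(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]]
    b = [[(1.0, 3.0, 0.0), (1.0, 5.0, 0.0)]]
    print(cluster_distances(a, b))  # → [4.0]
    ```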

  10. Widely-targeted quantitative lipidomics methodology by supercritical fluid chromatography coupled with fast-scanning triple quadrupole mass spectrometry.

    PubMed

    Takeda, Hiroaki; Izumi, Yoshihiro; Takahashi, Masatomo; Paxton, Thanai; Tamura, Shohei; Koike, Tomonari; Yu, Ying; Kato, Noriko; Nagase, Katsutoshi; Shiomi, Masashi; Bamba, Takeshi

    2018-05-03

    Lipidomics, the mass spectrometry-based comprehensive analysis of lipids, has attracted attention as an analytical approach to provide novel insight into lipid metabolism and to search for biomarkers. However, an ideal method for both comprehensive and quantitative analysis of lipids has not been fully developed. Herein, we have proposed a practical methodology for widely-targeted quantitative lipidome analysis using supercritical fluid chromatography fast-scanning triple-quadrupole mass spectrometry (SFC/QqQMS) and a theoretically calculated comprehensive lipid multiple reaction monitoring (MRM) library. Lipid classes can be separated by SFC with a normal-phase diethylamine-bonded silica column with high resolution, high throughput, and good repeatability. Structural isomers of phospholipids can be monitored by mass spectrometric separation with fatty acyl-based MRM transitions. SFC/QqQMS analysis with an internal standard-dilution method offers quantitative information for both the lipid class and individual lipid molecular species in the same lipid class. Additionally, data acquired using this method have advantages including reduction of misidentification and acceleration of data analysis. Using the SFC/QqQMS system, alteration of plasma lipid levels in myocardial infarction-prone rabbits in response to eicosapentaenoic acid supplementation was observed for the first time. Our developed SFC/QqQMS method represents a potentially useful tool for in-depth studies focused on complex lipid metabolism and biomarker discovery. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.

  11. Quantitative simultaneous multi-element microprobe analysis using combined wavelength and energy dispersive systems

    NASA Technical Reports Server (NTRS)

    Walter, L. S.; Doan, A. S., Jr.; Wood, F. M., Jr.; Bredekamp, J. H.

    1972-01-01

    A combined WDS-EDS system obviates the severe X-ray peak overlap problems encountered with Na, Mg, Al and Si common to pure EDS systems. By application of easily measured empirical correction factors for pulse pile-up and peak overlaps which are normally observed in the analysis of silicate minerals, the accuracy of analysis is comparable with that expected for WDS electron microprobe analyses. The continuum backgrounds are subtracted for the spectra by a spline fitting technique based on integrated intensities between the peaks. The preprocessed data are then reduced to chemical analyses by existing data reduction programs.

  12. Comparative study between quantitative digital image analysis and fluorescence in situ hybridization of breast cancer equivocal human epidermal growth factor receptors 2 score 2(+) cases.

    PubMed

    Ayad, Essam; Mansy, Mina; Elwi, Dalal; Salem, Mostafa; Salama, Mohamed; Kayser, Klaus

    2015-01-01

    Optimization of the workflow for breast cancer samples with equivocal human epidermal growth factor receptor 2 (HER2)/neu score 2(+) results in routine practice remains a central focus of ongoing efforts to assess HER2 status. According to the College of American Pathologists/American Society of Clinical Oncology guidelines, equivocal HER2/neu score 2(+) cases are subject to further testing, usually by fluorescence in situ hybridization (FISH). It remains an open question whether quantitative digital image analysis of HER2 immunohistochemistry (IHC) stained slides can assist in further refining the HER2 score 2(+). To assess the utility of quantitative digital analysis of IHC stained slides, we compared its performance to FISH in cases of breast cancer with equivocal HER2 score 2(+). Fifteen specimens (previously diagnosed as breast cancer and evaluated as HER2 score 2(+)) represented the study population. New cuts were prepared for re-evaluation by HER2 immunohistochemistry and FISH examination. All cases were digitally scanned by iScan (produced by BioImagene [now Roche-Ventana]). The IHC signals of HER2 were measured using an automated image analyzing system (MECES, www.Diagnomx.eu/meces). Finally, a comparative study was done between the results of FISH and the quantitative analysis of the virtual slides. Three of the 15 cases with equivocal HER2 score 2(+) turned out to be positive (3(+)) by quantitative digital analysis; the remaining 12 were negative by digital analysis and by FISH. Two of the three digitally positive cases proved to be positive with FISH, and one was negative. Quantitative digital analysis is highly sensitive and relatively specific when compared to FISH in detecting HER2/neu overexpression. Therefore, it represents a potentially reliable substitute for FISH in breast cancer cases that require further refinement of equivocal IHC results.
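    Taking FISH as the reference standard, the sensitivity and specificity implied by the counts in this abstract (2 true positives, 1 false positive, 12 true negatives, 0 false negatives) can be checked directly:

    ```python
    def diagnostic_metrics(tp, fp, tn, fn):
        """Sensitivity and specificity from a 2x2 contingency table."""
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        return sensitivity, specificity

    # Counts from the abstract: 2 digital-positive/FISH-positive,
    # 1 digital-positive/FISH-negative, 12 negative by both, 0 missed positives.
    sens, spec = diagnostic_metrics(tp=2, fp=1, tn=12, fn=0)
    print(f"sensitivity={sens:.1%}, specificity={spec:.1%}")
    # → sensitivity=100.0%, specificity=92.3%
    ```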

  13. Investigating Information System Development, Behavior and Business Knowledge Impact on Project Success: Quantitative Analysis

    ERIC Educational Resources Information Center

    Skinner, Ann

    2018-01-01

    Resource-based theory provided the theoretical foundation to investigate the extent that developer knowledge correlated to success of information technology (IT) development projects. Literature indicated there was a knowledge gap in understanding whether developer information system development, behavior and business knowledge contributed to IT…

  14. Quantitative Analysis of Strategic Voting in Anonymous Voting Systems

    ERIC Educational Resources Information Center

    Wang, Tiance

    2016-01-01

    Democratically choosing a single preference from three or more candidate options is not a straightforward matter. There are many competing ideas on how to aggregate rankings of candidates. However, the Gibbard-Satterthwaite theorem implies that no fair voting system (equality among voters and equality among candidates) is immune to strategic…

  15. RAMP: a computer system for mapping regional areas

    Treesearch

    Bradley B. Nickey

    1975-01-01

    Until 1972, the U.S. Forest Service's Individual Fire Reports recorded locations by the section-township-range system. These earlier fire reports, therefore, lacked congruent locations. RAMP (Regional Area Mapping Procedure) was designed to make the reports more useful for quantitative analysis. This computer-based technique converts locations expressed in...

  16. Educational Information Quantization for Improving Content Quality in Learning Management Systems

    ERIC Educational Resources Information Center

    Rybanov, Alexander Aleksandrovich

    2014-01-01

    The article offers the educational information quantization method for improving content quality in Learning Management Systems. The paper considers questions concerning analysis of quality of quantized presentation of educational information, based on quantitative text parameters: average frequencies of parts of speech, used in the text; formal…

  17. Bibliographic Post-Processing with the TIS Intelligent Gateway: Analytical and Communication Capabilities.

    ERIC Educational Resources Information Center

    Burton, Hilary D.

    TIS (Technology Information System) is an intelligent gateway system capable of performing quantitative evaluation and analysis of bibliographic citations using a set of Process functions. Originally developed by Lawrence Livermore National Laboratory (LLNL) to analyze information retrieved from three major federal databases, DOE/RECON,…

  18. The Simulation of an Oxidation-Reduction Titration Curve with Computer Algebra

    ERIC Educational Resources Information Center

    Whiteley, Richard V., Jr.

    2015-01-01

    Although the simulation of an oxidation/reduction titration curve is an important exercise in an undergraduate course in quantitative analysis, that exercise is frequently simplified to accommodate computational limitations. With the use of readily available computer algebra systems, however, such curves for complicated systems can be generated…

  19. Quantitative analysis of lead in aqueous solutions by ultrasonic nebulizer assisted laser induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhong, Shi-Lei; Lu, Yuan; Kong, Wei-Jin; Cheng, Kai; Zheng, Ronger

    2016-08-01

    In this study, an ultrasonic nebulizer unit was established to improve the quantitative analysis ability of laser-induced breakdown spectroscopy (LIBS) for liquid samples detection, using solutions of the heavy metal element Pb as an example. An analytical procedure was designed to guarantee the stability and repeatability of the LIBS signal. A series of experiments were carried out strictly according to the procedure. The experimental parameters were optimized based on studies of the pulse energy influence and temporal evolution of the emission features. The plasma temperature and electron density were calculated to confirm the LTE state of the plasma. Normalizing the intensities by background was demonstrated to be an appropriate method in this work. The linear range of this system for Pb analysis was confirmed over a concentration range of 0-4,150 ppm by measuring 12 samples with different concentrations. The correlation coefficient of the fitted calibration curve was as high as 99.94% in the linear range, and the LOD of Pb was confirmed as 2.93 ppm. Concentration prediction experiments were performed on a further six samples. The excellent quantitative ability of the system was demonstrated by comparison of the real and predicted concentrations of the samples. The lowest relative error was 0.043% and the highest was no more than 7.1%.
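    The calibration-curve and limit-of-detection steps can be sketched with an ordinary least-squares fit and the common 3-sigma LOD criterion. The intensity values below are hypothetical placeholders, not the paper's data:

    ```python
    def linear_fit(x, y):
        """Ordinary least-squares fit y = a*x + b; returns (slope, intercept)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        a = sxy / sxx
        return a, my - a * mx

    def limit_of_detection(slope, sigma_blank):
        # Common 3-sigma criterion: LOD = 3 * (std dev of blank signal) / slope
        return 3 * sigma_blank / slope

    # Hypothetical background-normalized Pb line intensities vs concentration (ppm)
    conc = [0, 500, 1000, 2000, 3000, 4150]
    intensity = [0.02, 1.05, 2.01, 4.00, 6.10, 8.30]
    slope, intercept = linear_fit(conc, intensity)
    print(f"LOD ~ {limit_of_detection(slope, sigma_blank=0.002):.2f} ppm")
    ```

    Once the fit is in hand, predicted concentrations for unknown samples follow from inverting the line: `(measured_intensity - intercept) / slope`.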

  20. Assessing the risk posed by natural hazards to infrastructures

    NASA Astrophysics Data System (ADS)

    Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn

    2015-04-01

    Modern society is increasingly dependent on infrastructures to maintain its function, and disruption in one of the infrastructure systems may have severe consequences. The Norwegian municipalities have, according to legislation, a duty to carry out a risk and vulnerability analysis and to plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, which consists of a semi-quantitative analysis. The purpose of this analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures identified in the level 1 analysis and investigate the need for further analyses, i.e. level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure, the importance of the infrastructure as well as interdependencies between society and infrastructure affecting the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure.
Case studies for two Norwegian municipalities are presented, where risk to primary road, water supply and power network threatened by storm and landslide is assessed. The application examples show that the proposed model provides a useful tool for screening of undesirable events, with the ultimate goal to reduce the societal vulnerability.
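    The level-2 screening described above can be sketched as a simple indicator-aggregation routine. The indicator names, the multiplicative aggregation, and all numbers below are illustrative assumptions, not the authors' exact formula:

    ```python
    def screen_scenario(hazard_frequency, vulnerability_scores, n_users):
        """Semi-quantitative level-2 screening score (illustrative sketch).

        hazard_frequency: quantitative estimate, events per year
        vulnerability_scores: dict of indicator -> rank on a relative 1-5 scale
        n_users: number of users of the infrastructure
        """
        for name, rank in vulnerability_scores.items():
            if not 1 <= rank <= 5:
                raise ValueError(f"indicator {name!r} must be ranked 1-5")
        mean_vulnerability = sum(vulnerability_scores.values()) / len(vulnerability_scores)
        return hazard_frequency * mean_vulnerability * n_users

    # Hypothetical scenario: storm threatening a primary road
    risk = screen_scenario(
        hazard_frequency=0.1,  # one disruptive storm per 10 years
        vulnerability_scores={"robustness": 4, "importance": 5, "interdependency": 3},
        n_users=20_000,
    )
    print(risk)  # higher scores flag scenarios for level-3 quantitative analysis
    ```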

  1. Transdisciplinary Application of Cross-Scale Resilience ...

    EPA Pesticide Factsheets

    The cross-scale resilience model was developed in ecology to explain the emergence of resilience from the distribution of ecological functions within and across scales, and as a tool to assess resilience. We propose that the model and the underlying discontinuity hypothesis are relevant to other complex adaptive systems, and can be used to identify and track changes in system parameters related to resilience. We explain the theory behind the cross-scale resilience model, review the cases where it has been applied to non-ecological systems, and discuss some examples of social-ecological, archaeological/anthropological, and economic systems where a cross-scale resilience analysis could add a quantitative dimension to our current understanding of system dynamics and resilience. We argue that the scaling and diversity parameters suitable for a resilience analysis of ecological systems are appropriate for a broad suite of systems where non-normative quantitative assessments of resilience are desired. Our planet is currently characterized by fast environmental and social change, and the cross-scale resilience model has the potential to quantify resilience across many types of complex adaptive systems. Comparative analyses of complex systems have, in fact, demonstrated commonalities among distinctly different types of systems (Schneider & Kay 1994; Holling 2001; Lansing 2003; Foster 2005; Bullmore et al. 2009). Both biological and non-biological complex systems appear to…

  2. Will Systems Biology Deliver Its Promise and Contribute to the Development of New or Improved Vaccines? What Really Constitutes the Study of "Systems Biology" and How Might Such an Approach Facilitate Vaccine Design.

    PubMed

    Germain, Ronald N

    2017-10-16

    A dichotomy exists in the field of vaccinology about the promise versus the hype associated with application of "systems biology" approaches to rational vaccine design. Some feel it is the only way to efficiently uncover currently unknown parameters controlling desired immune responses or discover what elements actually mediate these responses. Others feel that traditional experimental, often reductionist, methods for incrementally unraveling complex biology provide a more solid way forward, and that "systems" approaches are costly ways to collect data without gaining true insight. Here I argue that both views are inaccurate. This is largely because of confusion about what can be gained from classical experimentation versus statistical analysis of large data sets (bioinformatics) versus methods that quantitatively explain emergent properties of complex assemblies of biological components, with the latter reflecting what was previously called "physiology." Reductionist studies will remain essential for generating detailed insight into the functional attributes of specific elements of biological systems, but such analyses lack the power to provide a quantitative and predictive understanding of global system behavior. But by employing (1) large-scale screening methods for discovery of unknown components and connections in the immune system (omics), (2) statistical analysis of large data sets (bioinformatics), and (3) the capacity of quantitative computational methods to translate these individual components and connections into models of emergent behavior (systems biology), we will be able to better understand how the overall immune system functions and to determine with greater precision how to manipulate it to produce desired protective responses. Copyright © 2017 Cold Spring Harbor Laboratory Press; all rights reserved.

  3. Subsurface imaging and cell refractometry using quantitative phase/ shear-force feedback microscopy

    NASA Astrophysics Data System (ADS)

    Edward, Kert; Farahi, Faramarz

    2009-10-01

    Over the last few years, several novel quantitative phase imaging techniques have been developed for the study of biological cells. However, many of these techniques are encumbered by inherent limitations including 2π phase ambiguities and diffraction-limited spatial resolution. In addition, subsurface information in the phase data is not exploited. We hereby present a novel quantitative phase imaging system without 2π ambiguities, which also allows for subsurface imaging and cell refractometry studies. This is accomplished by utilizing simultaneously obtained shear-force topography information. We will demonstrate how the quantitative phase and topography data can be used for subsurface and cell refractometry analysis and will present results for a fabricated structure and a malaria infected red blood cell.

  4. Visualization techniques to aid in the analysis of multispectral astrophysical data sets

    NASA Technical Reports Server (NTRS)

    Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.

    1993-01-01

    The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. 
Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.

  5. Design of a Virtual Reality System for Affect Analysis in Facial Expressions (VR-SAAFE); Application to Schizophrenia.

    PubMed

    Bekele, E; Bian, D; Peterman, J; Park, S; Sarkar, N

    2017-06-01

    Schizophrenia is a life-long, debilitating psychotic disorder with poor outcome that affects about 1% of the population. Although pharmacotherapy can alleviate some of the acute psychotic symptoms, residual social impairments present a significant barrier that prevents successful rehabilitation. With limited resources and access to social skills training opportunities, innovative technology has emerged as a potentially powerful tool for intervention. In this paper, we present a novel virtual reality (VR)-based system for understanding facial emotion processing impairments that may lead to poor social outcome in schizophrenia. We henceforth call it a VR System for Affect Analysis in Facial Expressions (VR-SAAFE). This system integrates a VR-based task presentation platform that can minutely control facial expressions of an avatar, with or without accompanying verbal interaction, with an eye-tracker to quantitatively measure a participant's real-time gaze and a set of physiological sensors to infer his/her affective states, allowing an in-depth understanding of the emotion recognition mechanism of patients with schizophrenia based on quantitative metrics. A usability study with 12 patients with schizophrenia and 12 healthy controls was conducted to examine processing of the emotional faces. Preliminary results indicated that there were significant differences in the way patients with schizophrenia processed and responded to the emotional faces presented in the VR environment compared with healthy control participants. The preliminary results underscore the utility of such a VR-based system that enables precise and quantitative assessment of social skill deficits in patients with schizophrenia.

  6. Successful downstream application of the Paxgene Blood RNA system from small blood samples in paediatric patients for quantitative PCR analysis

    PubMed Central

    Carrol, Enitan D; Salway, Fiona; Pepper, Stuart D; Saunders, Emma; Mankhambo, Limangeni A; Ollier, William E; Hart, C Anthony; Day, Phillip

    2007-01-01

    Background The challenge of gene expression studies is to reliably quantify levels of transcripts, but this is hindered by a number of factors including sample availability, handling and storage. The PAXgene™ Blood RNA System includes a stabilizing additive in a plastic evacuated tube, but requires 2.5 mL blood, which makes routine implementation impractical for paediatric use. The aim of this study was to modify the PAXgene™ Blood RNA System kit protocol for application to small, sick children, without compromising RNA integrity, and subsequently to perform quantitative analysis of ICAM and interleukin-6 gene expression. Aliquots of 0.86 mL PAXgene™ reagent were put into microtubes and 0.3 mL whole blood added to maintain the same recommended proportions as in the PAXgene™ evacuated tube system. RNA quality was assessed using the Agilent BioAnalyser 2100 and an in-house TaqMan™ assay which measures GAPDH transcript integrity by determining 3' to 5' ratios. qPCR analysis was performed on an additional panel of 7 housekeeping genes. Three reference genes (HPRT1, YWHAZ and GAPDH) were identified using the GeNORM algorithm and subsequently used to normalise target gene expression levels. ICAM-1 and IL-6 gene expression were measured in 87 Malawian children with invasive pneumococcal disease. Results Total RNA yield was between 1,114 and 2,950 ng and the BioAnalyser 2100 demonstrated discernible 18s and 28s bands. The cycle threshold values obtained for the seven housekeeping genes were between 15 and 30 and showed good consistency. Median relative ICAM and IL-6 gene expression were significantly reduced in non-survivors compared to survivors (ICAM: 3.56 vs 4.41, p = 0.04, and IL-6: 2.16 vs 6.73, p = 0.02). Conclusion We have successfully modified the PAXgene™ blood collection system for use in small children and demonstrated preservation of RNA integrity and successful quantitative real-time PCR analysis. PMID:17850649
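    The reference-gene normalization step can be sketched as follows. This is a simplified stand-in for the GeNORM-normalized analysis in the study: it assumes equal amplification efficiency for all genes, and the Ct values are hypothetical:

    ```python
    from math import prod

    def relative_expression(ct_target, ct_refs, efficiency=2.0):
        """Relative expression of a target gene against multiple reference genes.

        Converts Ct values to relative quantities (efficiency ** -Ct) and divides
        the target quantity by a GeNORM-style normalization factor: the geometric
        mean of the reference-gene quantities.
        """
        target_q = efficiency ** (-ct_target)
        ref_qs = [efficiency ** (-ct) for ct in ct_refs]
        norm_factor = prod(ref_qs) ** (1 / len(ref_qs))
        return target_q / norm_factor

    # Hypothetical Ct values: IL-6 target with HPRT1 / YWHAZ / GAPDH references
    print(relative_expression(ct_target=24.0, ct_refs=[22.0, 23.0, 21.0]))  # → 0.25
    ```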

  7. Simplex and duplex event-specific analytical methods for functional biotech maize.

    PubMed

    Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young

    2009-08-26

Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems and in living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis of biotech maize events 3272 and LY 038 on the basis of their respective 3' flanking regions. The qualitative primers showed specificity, yielding a single PCR product, with a limit of detection (LOD) of 0.05%. Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and the two event-specific 3' flanking DNA sequences of events 3272 and LY 038 to serve as a reference molecule. In-house validation of the quantitative methods was performed using six levels of mixing samples, from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of +/-30%. Limits of quantitation (LOQs) were 0.1% for the simplex real-time PCRs of events 3272 and LY 038 and 0.5% for the duplex real-time PCR of LY 038. This study shows that the event-specific analytical methods are applicable to qualitative and quantitative analysis of biotech maize events 3272 and LY 038.
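The in-house validation criteria above (bias from the true value and relative deviation both within +/-30%) amount to a small acceptance check per mixing level. A minimal sketch; the replicate values in the usage note are invented for illustration.

```python
from statistics import mean, stdev

def validate_level(measured, true_value, limit=30.0):
    # Acceptance check for one GMO mixing level: bias of the mean from the
    # true value and relative standard deviation, both within +/-30%.
    m = mean(measured)
    bias_pct = 100.0 * (m - true_value) / true_value
    rsd_pct = 100.0 * stdev(measured) / m
    return bias_pct, rsd_pct, abs(bias_pct) <= limit and rsd_pct <= limit
```

Three replicates of a 0.1% level measuring 0.09, 0.11 and 0.10 give zero bias, a 10% relative deviation, and pass the check.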

  8. Miniaturized Battery-Free Wireless Systems for Wearable Pulse Oximetry.

    PubMed

    Kim, Jeonghyun; Gutruf, Philipp; Chiarelli, Antonio M; Heo, Seung Yun; Cho, Kyoungyeon; Xie, Zhaoqian; Banks, Anthony; Han, Seungyoung; Jang, Kyung-In; Lee, Jung Woo; Lee, Kyu-Tae; Feng, Xue; Huang, Yonggang; Fabiani, Monica; Gratton, Gabriele; Paik, Ungyu; Rogers, John A

    2017-01-05

    Development of unconventional technologies for wireless collection, storage and analysis of quantitative, clinically relevant information on physiological status is of growing interest. Soft, biocompatible systems are widely regarded as important because they facilitate mounting on external (e.g. skin) and internal (e.g. heart, brain) surfaces of the body. Ultra-miniaturized, lightweight and battery-free devices have the potential to establish complementary options in bio-integration, where chronic interfaces (i.e. months) are possible on hard surfaces such as the fingernails and the teeth, with negligible risk for irritation or discomfort. Here we report materials and device concepts for flexible platforms that incorporate advanced optoelectronic functionality for applications in wireless capture and transmission of photoplethysmograms, including quantitative information on blood oxygenation, heart rate and heart rate variability. Specifically, reflectance pulse oximetry in conjunction with near-field communication (NFC) capabilities enables operation in thin, miniaturized flexible devices. Studies of the material aspects associated with the body interface, together with investigations of the radio frequency characteristics, the optoelectronic data acquisition approaches and the analysis methods capture all of the relevant engineering considerations. Demonstrations of operation on various locations of the body and quantitative comparisons to clinical gold standards establish the versatility and the measurement accuracy of these systems, respectively.
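Reflectance pulse oximetry derives blood oxygenation from the pulsatile (AC) and baseline (DC) components of the red and infrared photoplethysmograms. The sketch below uses the common ratio-of-ratios formulation with a generic linear calibration; the constants are textbook placeholders, not the calibration of the reported device.

```python
from statistics import mean

def spo2_estimate(red, ir):
    # Ratio-of-ratios estimate: R = (AC_red/DC_red) / (AC_ir/DC_ir).
    # The linear map SpO2 = 110 - 25*R is a generic illustrative calibration;
    # real devices are calibrated empirically against a reference oximeter.
    ac = lambda x: max(x) - min(x)          # pulsatile amplitude
    r = (ac(red) / mean(red)) / (ac(ir) / mean(ir))
    return 110.0 - 25.0 * r
```

For a red ripple of 2 counts on a baseline of 100 and an infrared ripple of 4 on the same baseline, R = 0.5 and the estimate is 97.5%.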

  9. A low cost mobile phone dark-field microscope for nanoparticle-based quantitative studies.

    PubMed

    Sun, Dali; Hu, Tony Y

    2018-01-15

Dark-field microscope (DFM) analysis of nanoparticle binding signal is highly useful for a variety of research and biomedical applications, but current applications for nanoparticle quantification rely on expensive DFM systems. The cost, size and limited robustness of these DFMs limit their utility in non-laboratory settings. Most nanoparticle analyses use high-magnification DFM images, which are labor intensive to acquire and subject to operator bias. Low-magnification DFM image capture is faster, but is subject to background from surface artifacts and debris, although image processing can partially compensate for background signal. We thus mated an LED light source, a dark-field condenser and a 20× objective lens with a mobile phone camera to create an inexpensive, portable and robust DFM system suitable for use in non-laboratory conditions. This proof-of-concept mobile DFM device weighs less than 400 g and costs less than $2000, but analysis of images captured with this device reveals nanoparticle quantitation results similar to those acquired with a much larger and more expensive desktop DFM system. Our results suggest that similar devices may be useful for nanoparticle-based activity and quantitation assays in resource-limited areas where conventional assay approaches are not practical. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Quantitative analysis and comparative study of four cities green pattern in API system on the background of big data

    NASA Astrophysics Data System (ADS)

    Xin, YANG; Si-qi, WU; Qi, ZHANG

    2018-05-01

Beijing, London, Paris and New York are typical world cities, so a comparative study of their green patterns is valuable for identifying gaps and advantages and for mutual learning. The paper provides a basis and new ideas for the development of metropolises in China. Against the background of big data, API (Application Programming Interface) systems can provide extensive and accurate basic data for studying urban green patterns in different geographical environments at home and abroad. On this basis, the Average Nearest Neighbor, Kernel Density and Standard Deviational Ellipse tools on the ArcGIS platform can process and summarize the data and support quantitative analysis of green patterns. The paper summarizes the unique features of the four cities' green patterns, and the reasons for their formation, on the basis of numerical comparison.
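Of the ArcGIS tools mentioned, the Average Nearest Neighbor statistic is the simplest to reproduce: the observed mean nearest-neighbor distance is divided by the distance expected for a random pattern of the same density. A minimal planar-coordinate sketch, not the ArcGIS implementation (which also reports a z-score and significance):

```python
from math import hypot, sqrt

def ann_ratio(points, area):
    # Average Nearest Neighbor ratio: observed mean nearest-neighbor
    # distance over the value expected for a random pattern of the same
    # density. Values < 1 suggest clustering, > 1 dispersion.
    n = len(points)
    d_obs = sum(
        min(hypot(x1 - x2, y1 - y2)
            for j, (x2, y2) in enumerate(points) if i != j)
        for i, (x1, y1) in enumerate(points)
    ) / n
    d_exp = 0.5 / sqrt(n / area)   # expected mean distance, random pattern
    return d_obs / d_exp
```

Four points at the corners of a unit square (a maximally dispersed pattern for that density) give a ratio of 4.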

  11. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
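Code-structure metrics of the kind used in software complexity analysis are often graph-based. McCabe's cyclomatic complexity, for example, counts the linearly independent paths through the control-flow graph; whether the Space Station team used this particular metric is not stated in the record, so the sketch below is purely illustrative.

```python
def cyclomatic_complexity(edges, nodes, components=1):
    # McCabe's V(G) = E - N + 2P for a control-flow graph with E edges,
    # N nodes and P connected components; equals the number of
    # linearly independent paths through the code.
    return edges - nodes + 2 * components
```

A single if/else (4 nodes, 4 edges, one component) gives V(G) = 2, matching its two execution paths; straight-line code (2 nodes, 1 edge) gives 1.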

  12. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework.

    PubMed

    Anguera, M Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. 
The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts.

  13. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

In order to improve the accuracy of quantitative analysis by AES, we combined XPS with AES and studied a method to reduce the error of AES quantitative analysis. Pt-Co, Cu-Au and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative results by adjusting the Auger sensitivity factors until the two techniques gave consistent compositions. The accuracy of AES quantitative analysis with the revised sensitivity factors was then verified on other samples with different composition ratios, and the results showed that the corrected relative sensitivity factors reduce the error of AES quantitative analysis to less than 10%. Peak definition is difficult in the integral form of the AES spectrum, since choosing the starting and ending points when determining the characteristic Auger peak intensity area involves great uncertainty. To simplify the analysis, we also processed the data in differential form, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and verified the accuracy of quantitative analysis on further samples with different composition ratios. The result showed that the analytical error of AES quantitative analysis was reduced to less than 9%. This shows that the accuracy of AES quantitative analysis can be greatly improved by combining XPS with AES to correct the Auger sensitivity factors, since matrix effects are taken into account. Good consistency was obtained, proving the feasibility of this method.
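The correction scheme amounts to rescaling the Auger relative sensitivity factors so that the AES intensities reproduce the composition measured by XPS. A minimal sketch of standard relative-sensitivity-factor quantification and of such a rescaling; the intensities and compositions in the usage note are invented, not data from the study.

```python
def composition(intensities, factors):
    # Relative atomic concentration X_i = (I_i/S_i) / sum_j (I_j/S_j).
    q = [i / s for i, s in zip(intensities, factors)]
    total = sum(q)
    return [v / total for v in q]

def corrected_factors(aes_intensities, xps_composition):
    # Rescale the Auger sensitivity factors so that AES reproduces the
    # composition measured by XPS; factors are only defined up to a
    # common scale, so they are normalized to the first element.
    s = [i / x for i, x in zip(aes_intensities, xps_composition)]
    return [v / s[0] for v in s]
```

With AES intensities [3, 1] and an XPS composition of 75/25 at.%, the corrected factors come out equal, and feeding them back into the quantification reproduces the XPS result exactly.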

  14. Quantitative transmission Raman spectroscopy of pharmaceutical tablets and capsules.

    PubMed

    Johansson, Jonas; Sparén, Anders; Svensson, Olof; Folestad, Staffan; Claybourn, Mike

    2007-11-01

Quantitative analysis of pharmaceutical formulations using the new approach of transmission Raman spectroscopy has been investigated. For comparison, measurements were also made in conventional backscatter mode. The experimental setup consisted of a Raman probe-based spectrometer with 785 nm excitation for measurements in backscatter mode. In transmission mode the same system was used to detect the Raman scattered light, while an external diode laser of the same type was used as excitation source. Quantitative partial least squares models were developed for both measurement modes. The results for tablets show that the prediction error for an independent test set was lower for the transmission measurements, with a relative root mean square error of about 2.2% as compared with 2.9% for the backscatter mode. Furthermore, the models were simpler in the transmission case, for which only a single partial least squares (PLS) component was required to explain the variation. The main reason for the improvement using the transmission mode is a more representative sampling of the tablets compared with the backscatter mode. Capsules containing mixtures of pharmaceutical powders were also assessed by transmission only. The quantitative results for the capsules' contents were good, with a prediction error of 3.6% w/w for an independent test set. The advantage of transmission Raman over backscatter Raman spectroscopy has been demonstrated for quantitative analysis of pharmaceutical formulations, and the prospects for reliable, lean calibrations for pharmaceutical analysis are discussed.
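The figure of merit quoted above, the relative root-mean-square error of prediction on an independent test set, is straightforward to compute once a calibration model (here PLS) has produced predictions. The values in the usage note are invented for illustration.

```python
from math import sqrt
from statistics import mean

def relative_rmsep(predicted, reference):
    # Root-mean-square error of prediction on an independent test set,
    # expressed in percent of the mean reference value.
    rmsep = sqrt(mean((p - r) ** 2 for p, r in zip(predicted, reference)))
    return 100.0 * rmsep / mean(reference)
```

Predictions of 11 and 9 against reference values of 10 and 10 give a relative RMSEP of 10%.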

  15. Nonlinear optical microscopy: use of second harmonic generation and two-photon microscopy for automated quantitative liver fibrosis studies.

    PubMed

    Sun, Wanxin; Chang, Shi; Tai, Dean C S; Tan, Nancy; Xiao, Guangfa; Tang, Huihuan; Yu, Hanry

    2008-01-01

Liver fibrosis is associated with an abnormal increase in extracellular matrix in chronic liver diseases. Quantitative characterization of fibrillar collagen in intact tissue is essential for both fibrosis studies and clinical applications. Commonly used methods, histological staining followed by either semiquantitative or computerized image analysis, have limited sensitivity and accuracy and suffer from operator-dependent variation. The fibrillar collagen in the sinusoids of normal livers could be observed through second-harmonic generation (SHG) microscopy. The two-photon excited fluorescence (TPEF) images, recorded simultaneously with SHG, clearly revealed the hepatocyte morphology. We have systematically optimized the parameters for quantitative SHG/TPEF imaging of liver tissue and developed fully automated image analysis algorithms to extract information on collagen changes and cell necrosis. Subtle changes in the distribution and amount of collagen and in cell morphology are quantitatively characterized in SHG/TPEF images. Compared with traditional staining, such as Masson's trichrome and Sirius red, SHG/TPEF is a sensitive quantitative tool for automated collagen characterization in liver tissue. Our system allows for enhanced detection and quantification of sinusoidal collagen fibers in fibrosis research and clinical diagnostics.

  16. Shot noise-limited Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry

    NASA Astrophysics Data System (ADS)

    Chen, Shichao; Zhu, Yizheng

    2017-02-01

Sensitivity is a critical index of the temporal fluctuation of the retrieved optical pathlength in quantitative phase imaging systems. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
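For Poisson (shot-noise) data, the Fisher information for the phase is the sum over frames of (dμ/dφ)²/μ, and the Cramér-Rao bound is its inverse. The sketch below evaluates this numerically for a generic N-step phase-shifting fringe model; the model and parameters are illustrative and are not the specific wavelength shifting derivation of the paper.

```python
from math import cos, sin, pi, sqrt

def phase_crb(i0, gamma, n_steps, phi):
    # Shot-noise-limited Cramer-Rao bound on the phase for a generic
    # N-step model mu_k = I0 * (1 + gamma * cos(phi + 2*pi*k/N)).
    # For Poisson data the Fisher information is sum_k (dmu_k/dphi)^2 / mu_k.
    info = 0.0
    for k in range(n_steps):
        delta = 2.0 * pi * k / n_steps
        mu = i0 * (1.0 + gamma * cos(phi + delta))
        dmu = -i0 * gamma * sin(phi + delta)
        info += dmu * dmu / mu
    return sqrt(1.0 / info)   # lower bound on the phase standard deviation
```

The bound scales as 1/sqrt(I0): ten times the photon budget tightens it by a factor of sqrt(10), the familiar shot-noise behavior.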

  17. A Second-Generation Device for Automated Training and Quantitative Behavior Analyses of Molecularly-Tractable Model Organisms

    PubMed Central

    Blackiston, Douglas; Shomrat, Tal; Nicolas, Cindy L.; Granata, Christopher; Levin, Michael

    2010-01-01

    A deep understanding of cognitive processes requires functional, quantitative analyses of the steps leading from genetics and the development of nervous system structure to behavior. Molecularly-tractable model systems such as Xenopus laevis and planaria offer an unprecedented opportunity to dissect the mechanisms determining the complex structure of the brain and CNS. A standardized platform that facilitated quantitative analysis of behavior would make a significant impact on evolutionary ethology, neuropharmacology, and cognitive science. While some animal tracking systems exist, the available systems do not allow automated training (feedback to individual subjects in real time, which is necessary for operant conditioning assays). The lack of standardization in the field, and the numerous technical challenges that face the development of a versatile system with the necessary capabilities, comprise a significant barrier keeping molecular developmental biology labs from integrating behavior analysis endpoints into their pharmacological and genetic perturbations. Here we report the development of a second-generation system that is a highly flexible, powerful machine vision and environmental control platform. In order to enable multidisciplinary studies aimed at understanding the roles of genes in brain function and behavior, and aid other laboratories that do not have the facilities to undergo complex engineering development, we describe the device and the problems that it overcomes. We also present sample data using frog tadpoles and flatworms to illustrate its use. Having solved significant engineering challenges in its construction, the resulting design is a relatively inexpensive instrument of wide relevance for several fields, and will accelerate interdisciplinary discovery in pharmacology, neurobiology, regenerative medicine, and cognitive science. PMID:21179424

  18. Design and development of an ethnically-diverse imaging informatics-based eFolder system for multiple sclerosis patients.

    PubMed

    Ma, Kevin C; Fernandez, James R; Amezcua, Lilyana; Lerner, Alex; Shiroishi, Mark S; Liu, Brent J

    2015-12-01

MRI has been used to visually identify multiple sclerosis (MS) lesions in the brain and spinal cord. Integrating patient information into an electronic patient record system has become key for modern patient care in recent years. Clinically, it is also necessary to track patients' progress in longitudinal studies, in order to provide a comprehensive understanding of disease progression and response to treatment. As the amount of required data increases, there is a need for an efficient, systematic solution to store and analyze MS patient data, disease profiles, and disease tracking for both clinical and research purposes. An imaging informatics based system, called the MS eFolder, has been developed as an integrated patient record system for data storage and analysis of MS patients. The eFolder system, with a DICOM-based database, includes a module for lesion contouring by radiologists, an MS lesion quantification tool that quantifies MS lesion volume in 3D, brain parenchyma fraction analysis, and quantitative analysis and tracking of volume changes in longitudinal studies. Patient data, including MR images, have been collected retrospectively at the University of Southern California Medical Center (USC) and Los Angeles County Hospital (LAC). The MS eFolder utilizes web-based components, such as a browser-based graphical user interface (GUI) and a web-based database. The eFolder database stores patient clinical data (demographics, MS disease history, family history, etc.), MR imaging-related data found in DICOM headers, and lesion quantification results. Lesion quantification results are derived from radiologists' contours on brain MRI studies and quantified into 3-dimensional volumes and locations. Quantified results of white matter lesions are integrated into a structured report based on the DICOM-SR protocol and templates. The user interface displays patient clinical information, original MR images, and structured reports of the quantified results. The GUI also includes a data mining tool to handle unique search queries for MS. System workflow and dataflow steps have been designed based on the IHE post-processing workflow profile, including workflow process tracking, MS lesion contouring and quantification of MR images at a post-processing workstation, and storage of quantitative results as DICOM-SR in a DICOM-based storage system. The web-based GUI is designed to display zero-footprint DICOM web-accessible data objects (WADO) and the SR objects. The MS eFolder system has been designed and developed as an integrated data storage and mining solution for both clinical and research environments, providing unique features such as quantitative lesion analysis and disease tracking over a longitudinal study. The comprehensive integrated image and clinical database provided by the MS eFolder offers a platform for treatment assessment, outcomes analysis and decision support. The proposed system also serves as a platform for future quantitative analysis derived automatically from CAD algorithms, which can be integrated within the system for individual disease tracking and future MS-related research. Ultimately the eFolder provides a decision-support infrastructure that can eventually add value to the overall electronic medical record. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Design and development of an ethnically-diverse imaging informatics-based eFolder system for multiple sclerosis patients

    PubMed Central

    Ma, Kevin C.; Fernandez, James R.; Amezcua, Lilyana; Lerner, Alex; Shiroishi, Mark S.; Liu, Brent J.

    2016-01-01

Purpose MRI has been used to visually identify multiple sclerosis (MS) lesions in the brain and spinal cord. Integrating patient information into an electronic patient record system has become key for modern patient care in recent years. Clinically, it is also necessary to track patients' progress in longitudinal studies, in order to provide a comprehensive understanding of disease progression and response to treatment. As the amount of required data increases, there is a need for an efficient, systematic solution to store and analyze MS patient data, disease profiles, and disease tracking for both clinical and research purposes. Method An imaging informatics based system, called the MS eFolder, has been developed as an integrated patient record system for data storage and analysis of MS patients. The eFolder system, with a DICOM-based database, includes a module for lesion contouring by radiologists, an MS lesion quantification tool that quantifies MS lesion volume in 3D, brain parenchyma fraction analysis, and quantitative analysis and tracking of volume changes in longitudinal studies. Patient data, including MR images, have been collected retrospectively at the University of Southern California Medical Center (USC) and Los Angeles County Hospital (LAC). The MS eFolder utilizes web-based components, such as a browser-based graphical user interface (GUI) and a web-based database. The eFolder database stores patient clinical data (demographics, MS disease history, family history, etc.), MR imaging-related data found in DICOM headers, and lesion quantification results. Lesion quantification results are derived from radiologists' contours on brain MRI studies and quantified into 3-dimensional volumes and locations. Quantified results of white matter lesions are integrated into a structured report based on the DICOM-SR protocol and templates. The user interface displays patient clinical information, original MR images, and structured reports of the quantified results. The GUI also includes a data mining tool to handle unique search queries for MS. System workflow and dataflow steps have been designed based on the IHE post-processing workflow profile, including workflow process tracking, MS lesion contouring and quantification of MR images at a post-processing workstation, and storage of quantitative results as DICOM-SR in a DICOM-based storage system. The web-based GUI is designed to display zero-footprint DICOM web-accessible data objects (WADO) and the SR objects. Summary The MS eFolder system has been designed and developed as an integrated data storage and mining solution for both clinical and research environments, providing unique features such as quantitative lesion analysis and disease tracking over a longitudinal study. The comprehensive integrated image and clinical database provided by the MS eFolder offers a platform for treatment assessment, outcomes analysis and decision support. The proposed system also serves as a platform for future quantitative analysis derived automatically from CAD algorithms, which can be integrated within the system for individual disease tracking and future MS-related research. Ultimately the eFolder provides a decision-support infrastructure that can eventually add value to the overall electronic medical record. PMID:26564667

  20. Application of targeted quantitative proteomics analysis in human cerebrospinal fluid using a liquid chromatography matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometer (LC MALDI TOF/TOF) platform.

    PubMed

    Pan, Sheng; Rush, John; Peskind, Elaine R; Galasko, Douglas; Chung, Kathryn; Quinn, Joseph; Jankovic, Joseph; Leverenz, James B; Zabetian, Cyrus; Pan, Catherine; Wang, Yan; Oh, Jung Hun; Gao, Jean; Zhang, Jianpeng; Montine, Thomas; Zhang, Jing

    2008-02-01

Targeted quantitative proteomics by mass spectrometry aims to selectively detect one or a panel of peptides/proteins in a complex sample and is particularly appealing for novel biomarker verification/validation because it does not require specific antibodies. Here, we demonstrated the application of targeted quantitative proteomics in searching, identifying, and quantifying selected peptides in human cerebrospinal fluid (CSF) using a matrix-assisted laser desorption/ionization time-of-flight tandem mass spectrometer (MALDI TOF/TOF)-based platform. The approach involved two major components: the use of isotopically labeled synthetic peptides as references for targeted identification and quantification, and a highly selective mass spectrometric analysis based on the unique characteristics of the MALDI instrument. The platform provides high confidence for targeted peptide detection in a complex system and can potentially be developed into a high-throughput system. Using the liquid chromatography (LC) MALDI TOF/TOF platform and the complementary identification strategy, we were able to selectively identify and quantify a panel of targeted peptides in the whole proteome of CSF without prior depletion of abundant proteins. The effectiveness and robustness of the approach with respect to different sample complexities and sample preparation strategies, as well as mass spectrometric quantification, were evaluated. Other issues related to chromatographic separation and the feasibility of high-throughput analysis were also discussed. Finally, we applied targeted quantitative proteomics to analyze a subset of previously identified candidate markers in CSF samples of patients with Parkinson's disease (PD) at different stages and Alzheimer's disease (AD), along with normal controls.

  1. A Taiwanese Mandarin Main Concept Analysis (TM-MCA) for Quantification of Aphasic Oral Discourse

    ERIC Educational Resources Information Center

    Kong, Anthony Pak-Hin; Yeh, Chun-Chih

    2015-01-01

    Background: Various quantitative systems have been proposed to examine aphasic oral narratives in English. A clinical tool for assessing discourse produced by Cantonese-speaking persons with aphasia (PWA), namely Main Concept Analysis (MCA), was developed recently for quantifying the presence, accuracy and completeness of a narrative. Similar…

  2. Do Deregulated Cas Proteins Induce Genomic Instability in Early-Stage Ovarian Cancer

    DTIC Science & Technology

    2006-12-01

    use Western blot analysis of tumor lysates to correlate expression of HEF1, p130Cas, Aurora A, and phospho-Aurora A. This analysis is in progress. In...and importantly, evaluated a number of different detection/image analysis systems to ensure reproducible quantitative results. We have used a pilot...reproducible Interestingly, preliminary statistical analysis using Spearman and Pearson correlation indicates at least one striking correlation

  3. A Quantitative Experimental Study of the Effectiveness of Systems to Identify Network Attackers

    ERIC Educational Resources Information Center

    Handorf, C. Russell

    2016-01-01

    This study analyzed the meta-data collected from a honeypot that was run by the Federal Bureau of Investigation for a period of 5 years. This analysis compared the use of existing industry methods and tools, such as Intrusion Detection System alerts, network traffic flow and system log traffic, within the Open Source Security Information Manager…

  4. Computerized EEG analysis for studying the effect of drugs on the central nervous system.

    PubMed

    Rosadini, G; Cavazza, B; Rodriguez, G; Sannita, W G; Siccardi, A

    1977-11-01

Samples of our experience in quantitative pharmaco-EEG are reviewed to discuss and define its applicability and limits. Simple processing systems, such as the computation of Hjorth's descriptors, are useful for on-line monitoring of drug-induced EEG modifications that are also evident on visual analysis. Power spectral analysis is suitable for identifying and quantifying EEG effects not evident on visual inspection. It demonstrated how the EEG effects of compounds in a long-acting formulation vary according to the sampling time and the explored cerebral area. EEG modifications not detected by power spectral analysis can be defined by statistically comparing (F test) the spectral values of the EEG from a single lead at the different samples (longitudinal comparison), or the spectral values from different leads at any sample (intrahemispheric comparison). The presently available procedures of quantitative pharmaco-EEG are effective when applied to the study of multilead EEG recordings in a statistically significant sample of the population. They do not seem reliable for directing neuropsychiatric therapies in single patients, due to individual variability of drug effects.

  5. Systems analysis of the single photon response in invertebrate photoreceptors.

    PubMed

    Pumir, Alain; Graves, Jennifer; Ranganathan, Rama; Shraiman, Boris I

    2008-07-29

Photoreceptors of the Drosophila compound eye employ a G protein-mediated signaling pathway that transduces single photons into transient electrical responses called "quantum bumps" (QB). Although most of the molecular components of this pathway are already known, a system-level understanding of the mechanism of QB generation has remained elusive. Here, we present a quantitative model explaining how QBs emerge from the stochastic nonlinear dynamics of the signaling cascade. The model shows that the cascade acts as an "integrate and fire" device and explains how photoreceptors achieve reliable responses to light while keeping background low in the dark. The model predicts the nontrivial behavior of mutants that enhance or suppress signaling and explains the dependence on external calcium, which controls feedback regulation. The results provide insight into physiological questions such as single-photon response efficiency and the adaptation of the response to high incident-light levels. The system-level analysis enabled by modeling phototransduction provides a foundation for understanding G protein signaling pathways less amenable to quantitative approaches.
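The "integrate and fire" picture can be caricatured as a stochastic accumulator: discrete activation events build up against first-order deactivation until a threshold crossing triggers the bump. All parameters below are invented for illustration and are not fitted to the Drosophila cascade described in the paper.

```python
import random

def quantum_bump_latency(rate=50.0, threshold=25.0, decay=0.2,
                         dt=0.001, t_max=1.0, seed=1):
    # Toy "integrate and fire" accumulator: Poisson-like activation events
    # (probability rate*dt per step) accumulate against first-order
    # deactivation; a bump fires when the signal crosses the threshold.
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    while t < t_max:
        if rng.random() < rate * dt:   # stochastic activation event
            x += 1.0
        x -= decay * x * dt            # first-order deactivation
        if x >= threshold:
            return t                   # latency to bump initiation
        t += dt
    return None                        # no bump: response failure
```

With no activation events the integrator never fires, while a strong stimulus fires reliably; the threshold nonlinearity is what separates clean bumps from dark background.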

  6. Study of distributed learning as a solution to category proliferation in Fuzzy ARTMAP based neural systems.

    PubMed

    Parrado-Hernández, Emilio; Gómez-Sánchez, Eduardo; Dimitriadis, Yannis A

    2003-09-01

    An evaluation of distributed learning as a means to attenuate the category proliferation problem in Fuzzy ARTMAP based neural systems is carried out, from both qualitative and quantitative points of view. The study involves two original winner-take-all (WTA) architectures, Fuzzy ARTMAP and FasArt, and their distributed versions, dARTMAP and dFasArt. A qualitative analysis of the distributed learning properties of dARTMAP is made, focusing on the new elements introduced to endow Fuzzy ARTMAP with distributed learning. In addition, a quantitative study on a selected set of classification problems points out that problems have to present certain features in their output classes in order to noticeably reduce the number of recruited categories and achieve an acceptable classification accuracy. As part of this analysis, distributed learning was successfully adapted to a member of the Fuzzy ARTMAP family, FasArt, and similar procedures can be used to extend distributed learning capabilities to other Fuzzy ARTMAP based systems.

  7. Quantitative phase analysis and microstructure characterization of magnetite nanocrystals obtained by microwave assisted non-hydrolytic sol–gel synthesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sciancalepore, Corrado, E-mail: corrado.sciancalepore@unimore.it; Bondioli, Federica; INSTM Consortium, Via G. Giusti 9, 51121 Firenze

    2015-02-15

    An innovative preparation procedure, based on microwave assisted non-hydrolytic sol–gel synthesis, to obtain spherical magnetite nanoparticles was reported together with a detailed quantitative phase analysis and microstructure characterization of the synthetic products. The nanoparticle growth was analyzed as a function of the synthesis time and was described in terms of crystallization degree employing the Rietveld method on the magnetic nanostructured system for the determination of the amorphous content using hematite as internal standard. Product crystallinity increases as the microwave thermal treatment time increases and reaches very high percentages for synthesis times longer than 1 h. Microstructural evolution of nanocrystals was followed by the integral breadth methods to obtain information on the crystallite size-strain distribution. The results of diffraction line profile analysis were compared with the nanoparticle grain distribution estimated by dimensional analysis of transmission electron microscopy (TEM) images. A variation both in the average grain size and in the distribution of the coherently diffracting domains is evidenced, suggesting a relationship between the two quantities. The traditional integral breadth methods have proven to be valid for a rapid assessment of the diffraction line broadening effects in the above-mentioned nanostructured systems, and the basic assumptions for the correct use of these methods are discussed as well. - Highlights: • Fe3O4 nanocrystals were obtained by MW-assisted non-hydrolytic sol–gel synthesis. • Quantitative phase analysis revealed that crystallinity up to 95% was reached. • The strategy of Rietveld refinements was discussed in detail. • Dimensional analysis showed nanoparticles ranging from 4 to 8 nm. • Results of integral breadth methods were compared with microscopic analysis.
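The internal-standard route to amorphous content mentioned above (hematite spiked into the sample before Rietveld refinement) reduces to one rescaling. A minimal sketch, using the standard internal-standard formula; the function name and argument layout are invented here:

```python
def amorphous_fraction(w_std, r_std):
    """Amorphous content of a sample via the internal-standard method.
    w_std -- weighed weight fraction of the internal standard in the mixture
    r_std -- weight fraction of the standard returned by Rietveld refinement
             (which normalizes over crystalline phases only)
    Returns the amorphous fraction of the original, standard-free sample."""
    if not 0 < w_std < 1 or not 0 < r_std <= 1:
        raise ValueError("fractions must lie in (0, 1]")
    a_mixture = 1.0 - w_std / r_std    # amorphous fraction of the spiked mixture
    return a_mixture / (1.0 - w_std)   # rescale to the standard-free sample
```

For example, if 20% w/w standard is weighed in but refinement reports it as one third of the crystalline material, the standard-free sample is 50% amorphous.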

  8. In-depth Qualitative and Quantitative Profiling of Tyrosine Phosphorylation Using a Combination of Phosphopeptide Immunoaffinity Purification and Stable Isotope Dimethyl Labeling*

    PubMed Central

    Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.

    2010-01-01

    Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. We were able to identify by mass spectrometry, using just two LC-MS/MS runs, more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment, as close to 80% of the identified peptides were tyrosine phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine phosphorylated peptides with large scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis as samples can be combined early on. 
Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167

  9. System architectures for telerobotic research

    NASA Technical Reports Server (NTRS)

    Harrison, F. Wallace

    1989-01-01

    Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of the process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design from requirements analysis to code generation and hardware layout have begun to appear. Activities related to system architecture of telerobots are described, including current activities which are designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.

  10. Genetic variants associated with the root system architecture of oilseed rape (Brassica napus L.) under contrasting phosphate supply.

    PubMed

    Wang, Xiaohua; Chen, Yanling; Thomas, Catherine L; Ding, Guangda; Xu, Ping; Shi, Dexu; Grandke, Fabian; Jin, Kemo; Cai, Hongmei; Xu, Fangsen; Yi, Bin; Broadley, Martin R; Shi, Lei

    2017-08-01

    Breeding crops with ideal root system architecture for efficient absorption of phosphorus is an important strategy to reduce the use of phosphate fertilizers. To investigate genetic variants leading to changes in root system architecture, 405 oilseed rape cultivars were genotyped with a 60K Brassica Infinium SNP array in low and high P environments. A total of 285 single-nucleotide polymorphisms were associated with root system architecture traits at varying phosphorus levels. Nine single-nucleotide polymorphisms corroborate a previous linkage analysis of root system architecture quantitative trait loci in the BnaTNDH population. One peak single-nucleotide polymorphism region on A3 was associated with all root system architecture traits and co-localized with a quantitative trait locus for primary root length at low phosphorus. Two more single-nucleotide polymorphism peaks on A5 for root dry weight at low phosphorus were detected in both growth systems and co-localized with a quantitative trait locus for the same trait. The candidate genes identified on A3 form a haplotype, 'BnA3Hap', which will be important for understanding the phosphorus/root system interaction and for incorporation into Brassica napus breeding programs. © The Author 2017. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.

  11. A color prediction model for imagery analysis

    NASA Technical Reports Server (NTRS)

    Skaley, J. E.; Fisher, J. R.; Hardy, E. E.

    1977-01-01

    A simple model has been devised to selectively construct several points within a scene using multispectral imagery. The model correlates black-and-white density values to color components of diazo film so as to maximize the color contrast of two or three points per composite. The CIE (Commission Internationale de l'Eclairage) color coordinate system is used as a quantitative reference to locate these points in color space. Superimposed on this quantitative reference is a perceptual framework which functionally contrasts color values in a psychophysical sense. This methodology permits a more quantitative approach to the manual interpretation of multispectral imagery while resulting in improved accuracy and lower costs.

  12. An overview of a multifactor-system theory of personality and individual differences: III. Life span development and the heredity-environment issue.

    PubMed

    Powell, A; Royce, J R

    1981-12-01

    In Part III of this three-part series on multifactor-system theory, multivariate, life-span development is approached from the standpoint of a quantitative and qualitative analysis of the ontogenesis of factors in each of the six systems. The pattern of quantitative development (described via the Gompertz equation and three developmental parameters) involves growth, stability, and decline, and qualitative development involves changes in the organization of factors (e.g., factor differentiation and convergence). Hereditary and environmental sources of variation are analyzed via the factor gene model and the concept of heredity-dominant factors, and the factor-learning model and environment-dominant factors. It is hypothesized that the sensory and motor systems are heredity dominant, that the style and value systems are environment dominant, and that the cognitive and affective systems are partially heredity dominant.
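The Gompertz description of quantitative development mentioned above has the closed form f(t) = A·exp(-b·exp(-c·t)). A small sketch; mapping the paper's three developmental parameters onto A, b, c is my assumption, and the curve as written captures growth and stability but not the decline phase:

```python
import math

def gompertz(t, asymptote, displacement, rate):
    """Gompertz growth curve: f(t) = A * exp(-b * exp(-c * t)).
    asymptote    -- A, the mature (plateau) level of the factor
    displacement -- b, sets the initial level relative to the plateau
    rate         -- c, how quickly the plateau is approached"""
    return asymptote * math.exp(-displacement * math.exp(-rate * t))
```

The curve rises monotonically from A·exp(-b) at t = 0 toward the asymptote A; a decline phase would require an additional term.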

  13. ICP-MS as a novel detection system for quantitative element-tagged immunoassay of hidden peanut allergens in foods.

    PubMed

    Careri, Maria; Elviri, Lisa; Mangia, Alessandro; Mucchino, Claudio

    2007-03-01

    A novel ICP-MS-based ELISA immunoassay via element-tagged determination was devised for quantitative analysis of hidden allergens in food. The method was able to detect low amounts of peanuts (down to approximately 2 mg peanuts kg(-1) cereal-based matrix) by using a europium-tagged antibody. Selectivity was proved by the lack of detectable cross-reaction with a number of protein-rich raw materials.

  14. Systematic profiling of Caenorhabditis elegans locomotive behaviors reveals additional components in G-protein Gαq signaling.

    PubMed

    Yu, Hui; Aleman-Meza, Boanerges; Gharib, Shahla; Labocha, Marta K; Cronin, Christopher J; Sternberg, Paul W; Zhong, Weiwei

    2013-07-16

    Genetic screens have been widely applied to uncover genetic mechanisms of movement disorders. However, most screens rely on human observations of qualitative differences. Here we demonstrate the application of an automatic imaging system to conduct a quantitative screen for genes regulating the locomotive behavior in Caenorhabditis elegans. Two hundred twenty-seven neuronal signaling genes with viable homozygous mutants were selected for this study. We tracked and recorded each animal for 4 min and analyzed over 4,400 animals of 239 genotypes to obtain a quantitative, 10-parameter behavioral profile for each genotype. We discovered 87 genes whose inactivation causes movement defects, including 50 genes that had never been associated with locomotive defects. Computational analysis of the high-content behavioral profiles predicted 370 genetic interactions among these genes. Network partition revealed several functional modules regulating locomotive behaviors, including sensory genes that detect environmental conditions, genes that function in multiple types of excitable cells, and genes in the signaling pathway of the G protein Gαq, a protein that is essential for animal life and behavior. We developed quantitative epistasis analysis methods to analyze the locomotive profiles and validated the prediction of the γ isoform of phospholipase C as a component in the Gαq pathway. These results provided a system-level understanding of how neuronal signaling genes coordinate locomotive behaviors. This study also demonstrated the power of quantitative approaches in genetic studies.

  15. Quantitative measurement of indomethacin crystallinity in indomethacin-silica gel binary system using differential scanning calorimetry and X-ray powder diffractometry.

    PubMed

    Pan, Xiaohong; Julian, Thomas; Augsburger, Larry

    2006-02-10

    Differential scanning calorimetry (DSC) and X-ray powder diffractometry (XRPD) methods were developed for the quantitative analysis of the crystallinity of indomethacin (IMC) in IMC and silica gel (SG) binary system. The DSC calibration curve exhibited better linearity than that of XRPD. No phase transformation occurred in the IMC-SG mixtures during DSC measurement. The major sources of error in DSC measurements were inhomogeneous mixing and sampling. Analyzing the amount of IMC in the mixtures using high-performance liquid chromatography (HPLC) could reduce the sampling error. DSC demonstrated greater sensitivity and had less variation in measurement than XRPD in quantifying crystalline IMC in the IMC-SG binary system.
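The workflow implied by this abstract (build a calibration curve from physical mixtures of known crystallinity, check its linearity, then invert it for unknowns) can be sketched as follows. All numbers are invented for illustration and are not the paper's data:

```python
import numpy as np

# Hypothetical calibration data: known crystalline IMC fraction (w/w) in
# IMC-SG physical mixtures vs. measured DSC melting enthalpy (J/g).
fraction = np.array([0.10, 0.25, 0.50, 0.75, 1.00])
enthalpy = np.array([11.2, 27.5, 55.1, 82.0, 110.3])  # invented values

# Linear calibration: enthalpy = slope * fraction + intercept
slope, intercept = np.polyfit(fraction, enthalpy, 1)

# Correlation coefficient as a simple linearity check
r = np.corrcoef(fraction, enthalpy)[0, 1]

def crystallinity(measured_enthalpy):
    """Invert the linear calibration to estimate the crystalline fraction
    of an unknown IMC-SG sample from its DSC melting enthalpy."""
    return (measured_enthalpy - intercept) / slope
```

In practice one would also report the calibration's residuals and confidence band, since (as the abstract notes) mixing and sampling dominate the error budget.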

  16. Quantitative Velocity Field Measurements in Reduced-Gravity Combustion Science and Fluid Physics Experiments

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Wernet, Mark P.

    1999-01-01

    Systems have been developed and demonstrated for performing quantitative velocity measurements in reduced gravity combustion science and fluid physics investigations. The unique constraints and operational environments inherent to reduced-gravity experimental facilities pose special challenges to the development of hardware and software systems. Both point and planar velocimetric capabilities are described, with particular attention being given to the development of systems to support the International Space Station laboratory. Emphasis has been placed on optical methods, primarily arising from the sensitivity of the phenomena of interest to intrusive probes. Limitations on available power, volume, data storage, and attendant expertise have motivated the use of solid-state sources and detectors, as well as efficient analysis capabilities emphasizing interactive data display and parameter control.

  17. Procedures for analysis of debris relative to Space Shuttle systems

    NASA Technical Reports Server (NTRS)

    Kim, Hae Soo; Cummings, Virginia J.

    1993-01-01

    Debris samples collected from various Space Shuttle systems have been submitted to the Microchemical Analysis Branch. This investigation was initiated to develop optimal techniques for the analysis of debris. Optical microscopy provides information about the morphology and size of crystallites, particle sizes, amorphous phases, glass phases, and poorly crystallized materials. Scanning electron microscopy with energy dispersive spectrometry is utilized for information on surface morphology and qualitative elemental content of debris. Analytical electron microscopy with wavelength dispersive spectrometry provides information on the quantitative elemental content of debris.

  18. GLS-Finder: An Automated Data-Mining System for Fast Profiling Glucosinolates and its Application in Brassica Vegetables

    USDA-ARS?s Scientific Manuscript database

    A rapid computer-aided program for profiling glucosinolates, “GLS-Finder", was developed. GLS-Finder is a Matlab script-based expert system capable of qualitative and semi-quantitative analysis of glucosinolates in samples using data generated by ultra-high performance liquid chromatograph...

  19. An Optimized Microplate Assay System for Quantitative Evaluation of Plant Cell Wall Degrading Enzyme Activity of Fungal Culture Extracts

    USDA-ARS?s Scientific Manuscript database

    Developing enzyme cocktails for cellulosic biomass hydrolysis complementary to current cellulase systems is a critical step needed for economically viable biofuels production. Recent genomic analysis indicates that some plant pathogenic fungi are likely a largely untapped resource in which to prospe...

  20. "BRAIN": Baruch Retrieval of Automated Information for Negotiations.

    ERIC Educational Resources Information Center

    Levenstein, Aaron, Ed.

    1981-01-01

    A data processing program that can be used as a research and collective bargaining aid for colleges is briefly described and the fields of the system are outlined. The system, known as BRAIN (Baruch Retrieval of Automated Information for Negotiations), is designed primarily as an instrument for quantitative and qualitative analysis. BRAIN consists…

  1. Monitoring Urban Quality of Life: The Porto Experience

    ERIC Educational Resources Information Center

    Santos, Luis Delfim; Martins, Isabel

    2007-01-01

    This paper describes the monitoring system of the urban quality of life developed by the Porto City Council, a new tool being used to support urban planning and management. The two components of this system--a quantitative approach based on statistical indicators and a qualitative analysis based on the citizens' perceptions of the conditions of…

  2. Energy Cascade in Fermi-Pasta-Ulam Models

    NASA Astrophysics Data System (ADS)

    Ponno, A.; Bambusi, D.

    We show that, for long-wavelength initial conditions, the FPU dynamics is described, up to a certain time, by two KdV-like equations, which represent the resonant Hamiltonian normal form of the system. The energy cascade taking place in the system is then quantitatively characterized by arguments of dimensional analysis based on such equations.
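For orientation, the canonical Korteweg-de Vries form that such resonant normal forms reduce to is, schematically (the coefficients and scalings of the actual normal form in the paper differ),

```latex
\partial_\tau u + 6\, u\, \partial_\xi u + \partial_\xi^3 u = 0 ,
```

with one such equation for each of the two counter-propagating long-wavelength components.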

  3. Mining and Analyzing Circulation and ILL Data for Informed Collection Development

    ERIC Educational Resources Information Center

    Link, Forrest E.; Tosaka, Yuji; Weng, Cathy

    2015-01-01

    The authors investigated quantitative methods of collection use analysis employing library data that are available in ILS and ILL systems to better understand library collection use and user needs. For the purpose of the study, the authors extracted circulation and ILL records from the library's systems using data-mining techniques. By comparing…

  4. Gap Gene Regulatory Dynamics Evolve along a Genotype Network

    PubMed Central

    Crombach, Anton; Wotton, Karl R.; Jiménez-Guri, Eva; Jaeger, Johannes

    2016-01-01

    Developmental gene networks implement the dynamic regulatory mechanisms that pattern and shape the organism. Over evolutionary time, the wiring of these networks changes, yet the patterning outcome is often preserved, a phenomenon known as “system drift.” System drift is illustrated by the gap gene network—involved in segmental patterning—in dipteran insects. In the classic model organism Drosophila melanogaster and the nonmodel scuttle fly Megaselia abdita, early activation and placement of gap gene expression domains show significant quantitative differences, yet the final patterning output of the system is essentially identical in both species. In this detailed modeling analysis of system drift, we use gene circuits which are fit to quantitative gap gene expression data in M. abdita and compare them with an equivalent set of models from D. melanogaster. The results of this comparative analysis show precisely how compensatory regulatory mechanisms achieve equivalent final patterns in both species. We discuss the larger implications of the work in terms of “genotype networks” and the ways in which the structure of regulatory networks can influence patterns of evolutionary change (evolvability). PMID:26796549

  5. KIN-Nav navigation system for kinematic assessment in anterior cruciate ligament reconstruction: features, use, and perspectives.

    PubMed

    Martelli, S; Zaffagnini, S; Bignozzi, S; Lopomo, N F; Iacono, F; Marcacci, M

    2007-10-01

    In this paper a new navigation system, KIN-Nav, developed for research and used during 80 anterior cruciate ligament (ACL) reconstructions, is described. KIN-Nav is a user-friendly navigation system for flexible intraoperative acquisition of anatomical and kinematic data, suitable for validation of biomechanical hypotheses. It performs real-time quantitative evaluation of antero-posterior, internal-external, and varus-valgus knee laxity at any degree of flexion and provides a new interface for this task, also suitable for comparison of pre-operative and post-operative knee laxity and for surgical documentation. The concept and features of KIN-Nav, which represents a new approach to navigation and allows the investigation of new quantitative measurements in ACL reconstruction, are described, and two clinical studies are reported as examples of the clinical potential and correct use of this methodology. A preliminary analysis of KIN-Nav's reliability and clinical efficacy, performed during blinded repeated measures by three independent examiners, is also given. This analysis is the first assessment of the potential of navigation systems for evaluating knee kinematics.

  6. Accurate ECG diagnosis of atrial tachyarrhythmias using quantitative analysis: a prospective diagnostic and cost-effectiveness study.

    PubMed

    Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M

    2010-11-01

    Quantitative ECG Analysis. Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity, and were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but correctly by quantitative analysis in 83% (5/6, P = 0.03). AF diagnosis was also improved through use of the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in an average cost-savings of $1,303 and 0.007 quality-adjusted life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy and results in improved healthcare costs and patient outcomes. © 2010 Wiley Periodicals, Inc.

  7. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  8. An automated machine vision system for the histological grading of cervical intraepithelial neoplasia (CIN).

    PubMed

    Keenan, S J; Diamond, J; McCluggage, W G; Bharucha, H; Thompson, D; Bartels, P H; Hamilton, P W

    2000-11-01

    The histological grading of cervical intraepithelial neoplasia (CIN) remains subjective, resulting in inter- and intra-observer variation and poor reproducibility in the grading of cervical lesions. This study has attempted to develop an objective grading system using automated machine vision. The architectural features of cervical squamous epithelium are quantitatively analysed using a combination of computerized digital image processing and Delaunay triangulation analysis; 230 images digitally captured from cases previously classified by a gynaecological pathologist included normal cervical squamous epithelium (n=30), koilocytosis (n=46), CIN 1 (n=52), CIN 2 (n=56), and CIN 3 (n=46). Intra- and inter-observer variation had kappa values of 0.502 and 0.415, respectively. A machine vision system was developed in KS400 macro programming language to segment and mark the centres of all nuclei within the epithelium. By object-oriented analysis of image components, the positional information of nuclei was used to construct a Delaunay triangulation mesh. Each mesh was analysed to compute triangle dimensions including the mean triangle area, the mean triangle edge length, and the number of triangles per unit area, giving an individual quantitative profile of measurements for each case. Discriminant analysis of the geometric data revealed the significant discriminatory variables from which a classification score was derived. The scoring system distinguished between normal and CIN 3 in 98.7% of cases and between koilocytosis and CIN 1 in 76.5% of cases, but only 62.3% of the CIN cases were classified into the correct group, with the CIN 2 group showing the highest rate of misclassification. Graphical plots of triangulation data demonstrated the continuum of morphological change from normal squamous epithelium to the highest grade of CIN, with overlapping of the groups originally defined by the pathologists. 
This study shows that automated location of nuclei in cervical biopsies using computerized image analysis is possible. Analysis of positional information enables quantitative evaluation of architectural features in CIN using Delaunay triangulation meshes, which is effective in the objective classification of CIN. This demonstrates the future potential of automated machine vision systems in diagnostic histopathology. Copyright 2000 John Wiley & Sons, Ltd.
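The geometric profile this entry describes (mean triangle area, mean edge length, and triangles per unit area computed from a Delaunay mesh over nuclear centres) is straightforward to reproduce. A minimal sketch using `scipy.spatial.Delaunay`; the function name and returned dictionary are my own, not the KS400 implementation:

```python
import numpy as np
from scipy.spatial import Delaunay

def triangulation_profile(points):
    """Quantitative profile of a point mesh (e.g., nuclear centres):
    mean triangle area, mean edge length, and triangle density
    (triangles per unit of total triangulated area)."""
    pts = np.asarray(points, dtype=float)
    tri = Delaunay(pts)
    areas, edges = [], []
    for a, b, c in tri.simplices:
        pa, pb, pc = pts[a], pts[b], pts[c]
        # triangle area via the 2D cross-product determinant
        areas.append(0.5 * abs((pb[0] - pa[0]) * (pc[1] - pa[1])
                               - (pb[1] - pa[1]) * (pc[0] - pa[0])))
        edges += [np.linalg.norm(pb - pa), np.linalg.norm(pc - pb),
                  np.linalg.norm(pa - pc)]
    return {"mean_area": float(np.mean(areas)),
            "mean_edge": float(np.mean(edges)),
            "density": len(areas) / sum(areas)}
```

Per-case profiles like this dictionary are then what a discriminant analysis would consume.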

  9. Quantitative imaging of aggregated emulsions.

    PubMed

    Penfold, Robert; Watson, Andrew D; Mackie, Alan R; Hibberd, David J

    2006-02-28

    Noise reduction, restoration, and segmentation methods are developed for the quantitative structural analysis in three dimensions of aggregated oil-in-water emulsion systems imaged by fluorescence confocal laser scanning microscopy. Mindful of typical industrial formulations, the methods are demonstrated for concentrated (30% volume fraction) and polydisperse emulsions. Following a regularized deconvolution step using an analytic optical transfer function and appropriate binary thresholding, novel application of the Euclidean distance map provides effective discrimination of closely clustered emulsion droplets with size variation over at least 1 order of magnitude. The a priori assumption of spherical nonintersecting objects provides crucial information to combat the ill-posed inverse problem presented by locating individual particles. Position coordinates and size estimates are recovered with sufficient precision to permit quantitative study of static geometrical features. In particular, aggregate morphology is characterized by a novel void distribution measure based on the generalized Apollonius problem. This is also compared with conventional Voronoi/Delauney analysis.
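The Euclidean distance map trick for discriminating closely clustered droplets can be demonstrated on a synthetic image. A sketch with `scipy.ndimage`; the geometry and the local-maximum window size are invented for illustration:

```python
import numpy as np
from scipy import ndimage

# Synthetic binary image: two overlapping disks standing in for touching
# droplets in an aggregate.
yy, xx = np.mgrid[0:60, 0:90]
disk1 = (yy - 30) ** 2 + (xx - 30) ** 2 <= 15 ** 2
disk2 = (yy - 30) ** 2 + (xx - 55) ** 2 <= 15 ** 2
binary = disk1 | disk2

# Euclidean distance map: each foreground pixel gets its distance to the
# nearest background pixel.
edt = ndimage.distance_transform_edt(binary)

# Local maxima of the distance map approximate droplet centres (and their
# values approximate radii), even where the disks merge into one blob.
maxima = (edt == ndimage.maximum_filter(edt, size=9)) & binary
centres = np.argwhere(maxima)
```

The distance map peaks near each true centre and dips at the neck between the disks, which is what lets closely clustered droplets be separated despite the spherical, overlapping geometry.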

  10. Standard Reference Line Combined with One-Point Calibration-Free Laser-Induced Breakdown Spectroscopy (CF-LIBS) to Quantitatively Analyze Stainless and Heat Resistant Steel.

    PubMed

    Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong

    2018-01-01

    Due to the influence of major elements' self-absorption, scarce observable spectral lines of trace elements, and relative efficiency correction of experimental system, accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is in fact not easy. In order to overcome these difficulties, standard reference line (SRL) combined with one-point calibration (OPC) is used to analyze six elements in three stainless-steel and five heat-resistant steel samples. The Stark broadening and Saha-Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL with the OPC method, and intercept with the OPC method. The final calculation results show that the latter two methods can effectively improve the overall accuracy of quantitative analysis and the detection limits of trace elements.
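The plasma-temperature step mentioned above reduces to a linear fit. The sketch below uses an ordinary single-species Boltzmann plot on synthetic line data (the paper uses the Saha-Boltzmann variant, and every line value here is invented):

```python
import numpy as np

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

# Hypothetical emission-line data: upper-level energy E_k (eV),
# degeneracy-weighted transition probability gA (s^-1), wavelength (nm).
T_true = 9000.0
E_k = np.array([3.2, 3.6, 4.1, 4.6, 5.1])
gA  = np.array([2.0e8, 1.5e8, 3.0e8, 1.0e8, 2.5e8])
lam = np.array([380.0, 390.0, 400.0, 410.0, 420.0])

# Synthesize intensities from the Boltzmann distribution: I ∝ (gA/λ) e^{-E/kT}
I = (gA / lam) * np.exp(-E_k / (K_B_EV * T_true))

# Boltzmann plot: ln(I λ / gA) vs. E_k is linear with slope -1/(k_B T)
y = np.log(I * lam / gA)
slope, _ = np.polyfit(E_k, y, 1)
T_est = -1.0 / (K_B_EV * slope)
```

With real spectra, self-absorbed lines distort this plot, which is one reason the SRL and OPC corrections discussed in the abstract are needed.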

  11. Representational Similarity Analysis – Connecting the Branches of Systems Neuroscience

    PubMed Central

    Kriegeskorte, Nikolaus; Mur, Marieke; Bandettini, Peter

    2008-01-01

    A fundamental challenge for systems neuroscience is to quantitatively relate its three major branches of research: brain-activity measurement, behavioral measurement, and computational modeling. Using measured brain-activity patterns to evaluate computational network models is complicated by the need to define the correspondency between the units of the model and the channels of the brain-activity data, e.g., single-cell recordings or voxels from functional magnetic resonance imaging (fMRI). Similar correspondency problems complicate relating activity patterns between different modalities of brain-activity measurement (e.g., fMRI and invasive or scalp electrophysiology), and between subjects and species. In order to bridge these divides, we suggest abstracting from the activity patterns themselves and computing representational dissimilarity matrices (RDMs), which characterize the information carried by a given representation in a brain or model. Building on a rich psychological and mathematical literature on similarity analysis, we propose a new experimental and data-analytical framework called representational similarity analysis (RSA), in which multi-channel measures of neural activity are quantitatively related to each other and to computational theory and behavior by comparing RDMs. We demonstrate RSA by relating representations of visual objects as measured with fMRI in early visual cortex and the fusiform face area to computational models spanning a wide range of complexities. The RDMs are simultaneously related via second-level application of multidimensional scaling and tested using randomization and bootstrap techniques. We discuss the broad potential of RSA, including novel approaches to experimental design, and argue that these ideas, which have deep roots in psychology and neuroscience, will allow the integrated quantitative analysis of data from all three branches, thus contributing to a more unified systems neuroscience. PMID:19104670
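The core RSA computation is compact. A minimal sketch; Pearson-based dissimilarity and a plain correlation between RDM lower triangles stand in here for the rank-based and randomization statistics the paper actually proposes:

```python
import numpy as np

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between the activity patterns of every pair of conditions.
    patterns: (n_conditions, n_channels) array."""
    return 1.0 - np.corrcoef(patterns)

def compare_rdms(rdm_a, rdm_b):
    """Relate two RDMs (e.g., brain vs. model) by correlating their
    lower-triangle entries, abstracting away any channel/unit
    correspondence between the two measurement spaces."""
    iu = np.triu_indices_from(rdm_a, k=1)
    return np.corrcoef(rdm_a[iu], rdm_b[iu])[0, 1]
```

Because `compare_rdms` sees only condition-by-condition dissimilarities, an fMRI RDM, an electrophysiology RDM, and a model RDM are directly comparable even though their native channels never align.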

  12. A quantitative analysis of municipal solid waste disposal charges in China.

    PubMed

    Wu, Jian; Zhang, Weiqian; Xu, Jiaxuan; Che, Yue

    2015-03-01

Rapid industrialization and economic development have caused a tremendous increase in municipal solid waste (MSW) generation in China. China began implementing a policy of MSW disposal fees for household waste management at the end of the last century. Three charging methods were implemented throughout the country: a fixed disposal fee, a potable water-based disposal fee, and a plastic bag-based disposal fee. To date, there has been little qualitative or quantitative analysis of the effectiveness of this relatively new policy. This paper provides a general overview of MSW fee policy in China, attempts to verify whether the policy is successful in reducing the amount of waste collected, and proposes an improved charging system to address current problems. The paper presents an empirical statistical analysis of policy effectiveness derived from an environmental Kuznets curve (EKC) test on panel data for China. EKC tests on the different kinds of MSW charge systems were then examined for individual provinces or cities. A comparison of existing charging systems was conducted using environmental and economic criteria. The results indicate the following: (1) the MSW policies implemented over the study period were effective in reducing waste generation, (2) the household waste discharge fee policy did not act as a strong driver of waste prevention and reduction, and (3) the plastic bag-based disposal fee appeared to perform well according to both qualitative and quantitative analysis. Based on the current situation of waste discharge management in China, a three-stage transitional charging scheme is proposed and its advantages and drawbacks are discussed. Evidence suggests that a transition from a fixed disposal fee to a plastic bag-based disposal fee involving various stakeholders should be the next objective of waste reduction efforts.
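An EKC test of the kind described amounts to regressing waste generation on income and income squared and checking for an inverted U: a positive linear and a negative quadratic coefficient, with the turning point at -b1/(2*b2). A minimal sketch with synthetic data; the numbers are illustrative, not the paper's panel data.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with pivoting."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def ekc_fit(income, waste):
    """OLS fit of waste = b0 + b1*income + b2*income^2 via the normal
    equations for the design matrix [1, x, x^2]."""
    n = len(income)
    s = lambda p: sum(v ** p for v in income)
    A = [[n, s(1), s(2)], [s(1), s(2), s(3)], [s(2), s(3), s(4)]]
    b = [sum(waste),
         sum(x * y for x, y in zip(income, waste)),
         sum(x * x * y for x, y in zip(income, waste))]
    return solve3(A, b)

# Synthetic panel averages: waste rises and then falls with income,
# peaking at income = 10 (an inverted-U, i.e., an EKC pattern).
income = list(range(1, 16))
waste = [5 + 2 * x - 0.1 * x * x for x in income]
b0, b1, b2 = ekc_fit(income, waste)
turning_point = -b1 / (2 * b2)   # income level where waste peaks
```

A significant b1 > 0 with b2 < 0 is the usual evidence for an EKC; real panel analyses add fixed effects and inference, which the sketch omits.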

  13. Accuracy of a remote quantitative image analysis in the whole slide images.

    PubMed

    Słodkowska, Janina; Markiewicz, Tomasz; Grala, Bartłomiej; Kozłowski, Wojciech; Papierz, Wielisław; Pleskacz, Katarzyna; Murawski, Piotr

    2011-03-30

The rationale for choosing a remote quantitative method supporting a diagnostic decision requires some empirical studies and knowledge of scenarios including valid telepathology standards. The tumours of the central nervous system [CNS] are graded on the basis of morphological features and the Ki-67 labelling index [Ki-67 LI]. Various methods have been applied for Ki-67 LI estimation. Recently we introduced the Computerized Analysis of Medical Images [CAMI] software for automated Ki-67 LI counting in digital images. The aim of our study was to explore the accuracy and reliability of a remote assessment of Ki-67 LI with CAMI software applied to whole slide images [WSI]. The WSI representing CNS tumours: 18 meningiomas and 10 oligodendrogliomas were stored on the server of the Warsaw University of Technology. The digital copies of entire glass slides were created automatically by the Aperio ScanScope CS with a 20x or 40x objective. Aperio's ImageScope software provided the functionality for remote viewing of WSI. The Ki-67 LI assessment was carried out within 2 of 20 selected fields of view (40x objective) representing the highest labelling areas in each WSI. The Ki-67 LI counting was performed by three methods: 1) manual reading in the light microscope (LM), 2) automated counting with CAMI software on digital images (DI), and 3) remote quantitation on the WSIs (WSI method). The quality of the WSIs and the technical efficiency of the on-line system were analysed. A comparative statistical analysis was performed for the results obtained by the three methods of Ki-67 LI counting. The preliminary analysis showed that in 18% of WSI the Ki-67 LI results differed from those obtained by the other two counting methods when the quality of the glass slides was below standard. The results of our investigations indicate that remote automated Ki-67 LI analysis performed with the CAMI algorithm on whole slide images of meningiomas and oligodendrogliomas could be used successfully as an alternative both to manual reading and to digital-image quantitation with CAMI software. In our experience, however, remote supervision/consultation and training are necessary for the effective use of remote quantitative WSI analysis.

  14. Quantitative Ultrasound for Measuring Obstructive Severity in Children with Hydronephrosis.

    PubMed

    Cerrolaza, Juan J; Peters, Craig A; Martin, Aaron D; Myers, Emmarie; Safdar, Nabile; Linguraru, Marius George

    2016-04-01

We define sonographic biomarkers for hydronephrotic renal units that can predict the necessity of diuretic nuclear renography. We selected a cohort of 50 consecutive patients with hydronephrosis of varying severity in whom 2-dimensional sonography and diuretic mercaptoacetyltriglycine renography had been performed. A total of 131 morphological parameters were computed using quantitative image analysis algorithms. Machine learning techniques were then applied to identify ultrasound-based safety thresholds that agreed with the t½ for washout. A best-fit model was then derived for each threshold level of t½ that would be clinically relevant at 20, 30 and 40 minutes. Receiver operating characteristic curve analysis was performed. Sensitivity, specificity and area under the receiver operating characteristic curve were determined. The improvement obtained by the quantitative imaging method compared to the Society for Fetal Urology grading system and the hydronephrosis index was statistically verified. For the 3 thresholds considered and at 100% sensitivity, the specificities of the quantitative imaging method were 94%, 70% and 74%, respectively. Corresponding area under the receiver operating characteristic curve values were 0.98, 0.94 and 0.94, respectively. The improvement obtained by the quantitative imaging method over the Society for Fetal Urology grade and hydronephrosis index was statistically significant (p <0.05 in all cases). Quantitative imaging analysis of renal sonograms in children with hydronephrosis can identify thresholds of clinically significant washout times with 100% sensitivity to decrease the number of diuretic renograms in up to 62% of children. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
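The reported operating points (100% sensitivity with specificities of 94%, 70% and 74%) correspond to choosing, for each t½ threshold, the most stringent score cutoff that still flags every positive case, then counting how many negatives are correctly excluded. A sketch with invented scores and labels:

```python
def specificity_at_full_sensitivity(scores, labels):
    """Pick the most stringent threshold that still flags every positive
    case (100% sensitivity), with higher score = more likely obstructive
    and a case flagged when score >= threshold. Return the threshold and
    the resulting specificity (fraction of negatives below it)."""
    thr = min(s for s, l in zip(scores, labels) if l == 1)
    negatives = [s for s, l in zip(scores, labels) if l == 0]
    spec = sum(1 for s in negatives if s < thr) / len(negatives)
    return thr, spec

# Hypothetical morphology-derived risk scores; label 1 = washout t½
# exceeded the clinical threshold (diuretic renogram needed).
scores = [0.9, 0.8, 0.75, 0.78, 0.4, 0.35, 0.3, 0.2, 0.1]
labels = [1,   1,   1,    0,    0,   0,    0,   0,   0]
thr, spec = specificity_at_full_sensitivity(scores, labels)
```

Cases scoring below `thr` could safely skip the renogram; the fraction spared is exactly the specificity at 100% sensitivity, which is the clinical quantity the abstract reports.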

  15. A Study of Cognitive Load for Enhancing Student’s Quantitative Literacy in Inquiry Lab Learning

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahman, T.; Alifiani, D. P.; Khoerunnisa, R. S.

    2017-09-01

Students often find it difficult to appreciate the relevance of quantitative analysis and concept attainment in the science class. This study measured student cognitive load during an inquiry lab on the respiratory system designed to improve quantitative literacy. Participants were 40 11th graders from a senior high school in Indonesia. After the lesson, students rated the mental effort it took to complete the learning tasks on a 28-item self-report instrument using a 4-point Likert scale. A Task Complexity Worksheet was used to assess the processing of quantitative information, and a paper-based test was applied to assess participants' concept achievement. The results showed that the inquiry instruction induced relatively low mental effort, high information processing, and high concept achievement.
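The "standard gain" commonly used in studies like this to quantify increased understanding is presumably Hake's normalized gain: the fraction of the possible improvement actually achieved between pre- and post-test. A sketch with hypothetical class averages:

```python
def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain g = (post - pre) / (max - pre): the share
    of the room for improvement that was actually realized."""
    return (post - pre) / (max_score - pre)

def gain_category(g):
    """Conventional interpretation bands for normalized gain."""
    return "high" if g >= 0.7 else "medium" if g >= 0.3 else "low"

# Hypothetical class averages (percent correct) on the concept test.
g = normalized_gain(45.0, 78.0)   # (78 - 45) / (100 - 45)
band = gain_category(g)
```

Normalizing by the pre-test score makes gains comparable across classes that start at different levels, which a raw post-minus-pre difference does not.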

  16. New insight in quantitative analysis of vascular permeability during immune reaction (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kalchenko, Vyacheslav; Molodij, Guillaume; Kuznetsov, Yuri; Smolyakov, Yuri; Israeli, David; Meglinski, Igor; Harmelin, Alon

    2016-03-01

Fluorescence imaging of vascular permeability has become a gold standard for assessing the inflammation process during experimental immune responses in vivo, providing a simple and useful tool for this purpose. The motivation comes from the need for robust, simple quantification and data presentation of inflammation based on vascular permeability. The change in fluorescence intensity as a function of time is a widely accepted measure of vascular permeability during inflammation related to the immune response. In the present study we bring a new dimension by applying a more sophisticated, quantitative approach to the analysis of the vascular reaction, using methods derived from astronomical observation, in particular space-time Fourier filtering followed by an orthogonal polynomial mode decomposition. We demonstrate that the temporal evolution of the fluorescence intensity observed at certain pixels correlates quantitatively with blood flow circulation under normal conditions. The approach makes it possible to determine the regions of permeability and to monitor both the fast kinetics related to the distribution of the contrast material in the circulatory system and the slow kinetics associated with its extravasation. Thus, we introduce a simple and convenient method for fast quantitative visualization of the leakage related to the inflammatory (immune) reaction in vivo.

  17. A web-based quantitative signal detection system on adverse drug reaction in China.

    PubMed

    Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan

    2009-07-01

To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China. Using the Microsoft Visual Basic and Active Server Pages programming languages and SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation and association detection, and to generate reports. The information component (IC), an internationally recognized disproportionality measure for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected in Guangdong from 1 January 2002 to 30 June 2007. A total of 2,496 associations, including known signals, were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early by using the IC analysis. In addition, 291 drug-ADR associations were alerted for the first time in the second quarter of 2007. The system, the first of its kind in China, can be used for the detection of significant associations in the Guangdong drug-monitoring database and could be an extremely useful adjunct to expert assessment of very large numbers of spontaneously reported ADRs.
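The information component is a disproportionality measure: the base-2 log-ratio of the observed count of a drug-ADR pair to the count expected if drug and reaction were reported independently. The sketch below shows the common point estimate with a +0.5 shrinkage term; the full Bayesian (BCPNN) formulation adds credible intervals, which are omitted here, and the counts are hypothetical.

```python
import math

def information_component(n_xy, n_x, n_y, n):
    """IC point estimate for a drug-ADR pair:
    log2 of observed vs expected pair counts, where expected assumes
    independence (n_x * n_y / n). The +0.5 terms shrink estimates for
    very rare pairs toward zero, damping spurious signals."""
    expected = n_x * n_y / n
    return math.log2((n_xy + 0.5) / (expected + 0.5))

# Hypothetical counts: 40 reports of this drug-reaction pair, out of
# 200 reports for the drug, 500 for the reaction, 100,000 in total.
ic = information_component(40, 200, 500, 100_000)   # strongly positive: signal

# Pair reported exactly as often as independence predicts: IC near 0.
ic_null = information_component(1, 200, 500, 100_000)
```

In routine screening, pairs whose IC (or its lower credibility bound) rises above zero are flagged for expert review rather than treated as confirmed causal associations.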

  18. Methods of quantitative risk assessment: The case of the propellant supply system

    NASA Astrophysics Data System (ADS)

    Merz, H. A.; Bienz, A.

    1984-08-01

As a consequence of the disastrous accident in Lapua (Finland) in 1976, where an explosion in a cartridge loading facility killed 40 and injured more than 70 persons, efforts were undertaken to examine and improve the safety of such installations. An ammunition factory in Switzerland considered replacing the manual supply of propellant hoppers with a new pneumatic supply system. This would reduce the maximum quantity of propellant in the hoppers to a level where an accidental ignition would no longer lead to a detonation, drastically limiting the effects on persons. A quantitative risk assessment of the present and the planned supply systems demonstrated that, in this particular case, the pneumatic supply system would not reduce the risk enough to justify the related costs. In addition, it could be shown that the safety of the existing system can be improved more effectively, and at considerably lower cost, by other safety measures. Based on this practical example, the advantages of a strictly quantitative risk assessment for safety planning in explosives factories are demonstrated. The methodological background of a risk assessment and the steps involved in the analysis are summarized. In addition, problems of quantification are discussed.
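The core of a strictly quantitative risk assessment is an expected-value calculation: accident frequency times escalation probability times consequence, compared across design options and weighed against cost. The numbers below are purely illustrative and are not those of the study:

```python
def annual_risk(ignition_rate, p_detonation, expected_fatalities):
    """Collective risk as an expected value: ignitions per year, times
    the chance an ignition escalates to detonation, times the expected
    fatalities if it does (units: expected fatalities per year)."""
    return ignition_rate * p_detonation * expected_fatalities

# Purely illustrative figures for the two supply options.
current   = annual_risk(1e-3, 0.5,  20)   # manual supply, large hopper load
pneumatic = annual_risk(1e-3, 0.05, 20)   # small load: detonation unlikely

risk_reduction = current - pneumatic       # expected fatalities avoided per year
investment = 2_000_000.0                   # assumed cost of the pneumatic system
cost_per_unit_risk = investment / risk_reduction
```

Comparing `cost_per_unit_risk` across candidate measures is what lets an assessment conclude, as this one did, that a cheaper measure buys more safety per franc than the proposed system.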

  19. North Carolina DOT traffic separation studies. Volume I, Assessment.

    DOT National Transportation Integrated Search

    2004-09-01

    The Federal Railroad Administration requested the Volpe National Transportation Systems Center to assess ten sites in depth that used the Traffic Separation Study (TSS) process. The assessment involved a quantitative and qualitative analysis of the i...

  20. Application of remote sensing to monitoring and studying dispersion in ocean dumping

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.; Ohlhorst, C. W.

    1981-01-01

Remotely sensed, wide-area synoptic data provide information on ocean dumping that is not readily available by other means. A qualitative approach has been used to map features such as river plumes. Results of quantitative analyses have been used to develop maps showing quantitative distributions of one or more water quality parameters, such as suspended solids or chlorophyll a. Joint NASA/NOAA experiments have been conducted at designated dump areas in U.S. coastal zones to determine the applicability of aircraft remote sensing systems to mapping plumes resulting from ocean dumping of sewage sludge and industrial wastes. A second objective is the evaluation of previously developed quantitative analysis techniques for studying dispersion of materials in these plumes. It was found that plumes resulting from the dumping of four waste materials have distinctive spectral characteristics. The development of a technology for use in a routine monitoring system, based on remote sensing techniques, is discussed.

  1. Backup of renewable energy for an electrical island: case study of Israeli electricity system--current status.

    PubMed

    Fakhouri, A; Kuperman, A

    2014-01-01

The paper focuses on quantitative analysis of the Israeli Government's target of 10% renewable energy penetration by 2020 and on determining the appropriate methodology (models) for assessing the effects on the electricity market, addressing the fact that Israel is an electricity island. The main objective is to determine the influence of achieving the Government's goals for renewable energy penetration on the need for backup in the Israeli electricity system. This work presents the current situation of the Israeli electricity market and the steps to be taken to assess the undesirable effects resulting from the intermittency of electricity generated by wind and solar power stations, and presents some solutions for mitigating these phenomena. Future work will focus on quantitative analysis of model runs to determine the amount of backup required relative to the installed capacity from renewable resources.

  2. Biomechanical and mathematical analysis of human movement in medical rehabilitation science using time-series data from two video cameras and force-plate sensor

    NASA Astrophysics Data System (ADS)

    Tsuruoka, Masako; Shibasaki, Ryosuke; Box, Elgene O.; Murai, Shunji; Mori, Eiji; Wada, Takao; Kurita, Masahiro; Iritani, Makoto; Kuroki, Yoshikatsu

    1994-08-01

In medical rehabilitation science, quantitative understanding of patient movement in 3-D space is very important. A patient with any joint disorder will experience its influence on other body parts in daily movement. The alignment of joints in movement can improve over the course of medical therapy. In this study, the newly developed system is composed of two non-metric CCD video cameras and a force-plate sensor, controlled simultaneously by a personal computer. The system provides time-series digital data from 3-D image photogrammetry, together with each foot's pressure and its center position, supplying efficient information for biomechanical and mathematical analysis of human movement. Patient-specific and common points are identified in each patient's movement. This study supports a broader, quantitative understanding in medical rehabilitation science.

  3. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties, such as motor fluctuations and dyskinesia, are taken into account. Least-squares estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020
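The reported agreement with the neurologist (r = 0.98) is a Pearson correlation between sensor-derived tremor amplitudes and clinical ratings. A sketch with invented data for seven patients; the values are illustrative only:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical data for seven patients: sensor-derived tremor amplitude
# (cm) vs. a 0-4 clinician rating (values invented for illustration).
amplitude = [0.1, 0.4, 0.8, 1.5, 2.2, 3.1, 4.0]
rating    = [0,   1,   1,   2,   3,   3,   4]
r = pearson_r(amplitude, rating)
```

A correlation this close to 1 is what justifies using the sensor amplitude as a surrogate for repeated in-clinic rating when tracking medication response.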

  4. Development of the living thing transportation systems worksheet on learning cycle model to increase student understanding

    NASA Astrophysics Data System (ADS)

    Rachmawati, E.; Nurohman, S.; Widowati, A.

    2018-01-01

This study aims to determine: 1) the feasibility of the student worksheet (LKPD) with respect to didactic, construction, and technical requirements and its fit with the Learning Cycle model; 2) the increase in understanding of learners taught with the Learning Cycle model at SMP N 1 Wates using the LKPD; and 3) the response of learners and educators at SMP N 1 Wates to the quality of the Living Thing Transportation Systems LKPD. This is an R&D study following the 4D model (Define, Design, Develop, and Disseminate). Data were analyzed both qualitatively and quantitatively. The qualitative analysis covered the descriptive feedback and assessment scores from all validators, converted to a 4-point scale, while the quantitative analysis calculated the percentage of learning implementation, the standardized gain as a measure of increased understanding, and completeness against the KKM (minimum mastery criterion) evaluation value as an indicator of student understanding. The resulting science LKPD based on the Learning Cycle model, on the theme of Living Thing Transportation Systems, obtained a total score of 108.5 out of a maximum of 128, placing it in the excellent category (A). The developed LKPD demonstrated improved understanding among learners, and the response of learners to its quality was very good.

  5. Highly sensitive and quantitative detection of rare pathogens through agarose droplet microfluidic emulsion PCR at the single-cell level.

    PubMed

    Zhu, Zhi; Zhang, Wenhua; Leng, Xuefei; Zhang, Mingxia; Guan, Zhichao; Lu, Jiangquan; Yang, Chaoyong James

    2012-10-21

Genetic alterations can serve as highly specific biomarkers to distinguish fatal bacteria or cancer cells from their normal counterparts. However, these mutations normally exist in very small amounts in the presence of a large excess of non-mutated analogs. Taking the notorious pathogen E. coli O157:H7 as the target analyte, we have developed an agarose droplet-based microfluidic emulsion PCR (ePCR) method for highly sensitive, specific and quantitative detection of rare pathogens against a high background of normal bacteria. Massively parallel singleplex and multiplex PCR at the single-cell level in agarose droplets have been successfully established. Moreover, we challenged the system with rare pathogen detection and achieved sensitive, quantitative analysis of a single E. coli O157:H7 cell against a background of 100,000 excess normal K12 cells. For the first time, we demonstrated rare pathogen detection through agarose droplet microfluidic ePCR. Such a multiplex single-cell agarose droplet amplification method enables ultra-high-throughput, multi-parameter genetic analysis of large populations of cells at the single-cell level to uncover stochastic variations in biological systems.
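Quantification in droplet (digital) PCR conventionally relies on Poisson statistics: targets distribute randomly across droplets, so the fraction of negative droplets fixes the mean number of copies per droplet. The standard correction is sketched below with hypothetical droplet counts; the paper's exact data processing may differ.

```python
import math

def copies_per_droplet(positive, total):
    """Poisson correction for digital PCR. With targets distributed
    randomly over droplets, P(droplet is negative) = exp(-lambda),
    so lambda = -ln(1 - fraction_positive). This corrects for droplets
    that happened to capture more than one target cell/molecule."""
    p = positive / total
    return -math.log(1 - p)

# Hypothetical run: 5,000 of 20,000 droplets fluoresce after PCR.
lam = copies_per_droplet(5_000, 20_000)   # mean copies per droplet
total_copies = lam * 20_000               # targets in the partitioned sample
```

Without the correction, simply counting positive droplets would undercount targets whenever occupancy is high, which is why digital PCR remains quantitative well beyond the one-target-per-droplet regime.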

  6. CALIPSO: an interactive image analysis software package for desktop PACS workstations

    NASA Astrophysics Data System (ADS)

    Ratib, Osman M.; Huang, H. K.

    1990-07-01

The purpose of this project is to develop a low-cost workstation for quantitative analysis of multimodality images using a Macintosh II personal computer. In the current configuration the Macintosh operates as a stand-alone workstation where images are imported either from a central PACS server through a standard Ethernet network or recorded through a video digitizer board. The CALIPSO software developed contains a large variety of basic image display and manipulation tools. We focused our effort, however, on the design and implementation of quantitative analysis methods that can be applied to images from different imaging modalities. Analysis modules currently implemented include: geometric and densitometric volume and ejection-fraction calculation from radionuclide and cine-angiograms; Fourier analysis of cardiac wall motion; vascular stenosis measurement; color-coded parametric display of regional flow distribution from dynamic coronary angiograms; and automatic analysis of myocardial distribution of radiolabelled tracers from tomoscintigraphic images. Several of these analysis tools were selected because they use similar color-coded and parametric display methods to communicate quantitative data extracted from the images. Rationale and objectives of the project: developments of Picture Archiving and Communication Systems (PACS) in the clinical environment allow physicians and radiologists to assess radiographic images directly through imaging workstations. This convenient access to the images is often limited by the number of workstations available, due in part to their high cost. There is also an increasing need for quantitative analysis of the images.

  7. Immediate drop on demand technology (I-DOT) coupled with mass spectrometry via an open port sampling interface.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos; Boeltz, Harry

    2017-11-01

The aim of this work was to demonstrate and evaluate the analytical performance of coupling immediate drop-on-demand technology to a mass spectrometer via the recently introduced open port sampling interface and ESI. Methodology & results: A maximum sample analysis throughput of 5 s per sample was demonstrated. Signal reproducibility was 10% or better, as demonstrated by the quantitative analysis of propranolol and its stable isotope-labeled internal standard propranolol-d7. The ability of the system to multiply charge and analyze macromolecules was demonstrated using the protein cytochrome c. This immediate drop-on-demand technology/open port sampling interface/ESI-MS combination allowed quantitative analysis of relatively small-mass analytes and was used for the identification of macromolecules such as proteins.

  8. Quantitative carbon detector for enhanced detection of molecules in foods, pharmaceuticals, cosmetics, flavors, and fuels.

    PubMed

    Beach, Connor A; Krumm, Christoph; Spanjers, Charles S; Maduskar, Saurabh; Jones, Andrew J; Dauenhauer, Paul J

    2016-03-07

Analysis of trace compounds, such as pesticides and other contaminants, within consumer products, fuels, and the environment requires quantification of increasingly complex mixtures of difficult-to-quantify compounds. Many compounds of interest are non-volatile and exhibit poor response in current gas chromatography and flame ionization systems. Here we show the conversion of trimethylsilylated chemical analytes to methane using a quantitative carbon detector (QCD; the Polyarc™ reactor) within a gas chromatograph (GC), thereby enabling enhanced detection (up to 10×) of highly functionalized compounds including carbohydrates, acids, drugs, flavorants, and pesticides. Analysis of a complex mixture of compounds shows that the GC-QCD method provides faster and more accurate analysis of the complex mixtures commonly encountered in everyday products and the environment.

  9. Comparison of emerging contaminants in receiving waters downstream of a conventional wastewater treatment plant and a forest-water reuse system.

    PubMed

    McEachran, Andrew D; Hedgespeth, Melanie L; Newton, Seth R; McMahen, Rebecca; Strynar, Mark; Shea, Damian; Nichols, Elizabeth Guthrie

    2018-05-01

Forest-water reuse (FWR) systems treat municipal, industrial, and agricultural wastewaters via land application to forest soils. Previous studies have shown that neither large-scale conventional wastewater treatment plants (WWTPs) nor FWR systems completely remove many contaminants of emerging concern (CECs) before release of treated wastewater. To better characterize CECs and the potential for increased implementation of FWR systems, FWR systems need to be directly compared to conventional WWTPs. In this study, both a quantitative, targeted analysis and a nontargeted analysis were used to better understand how CECs released to waterways from an FWR system compare with those from a conventional treatment system. Quantitatively, greater concentrations and a greater total mass load of CECs were observed downstream of the conventional WWTP than downstream of the FWR. Average summed concentrations of 33 targeted CECs were ~1000 ng/L downstream of the conventional system and ~30 ng/L downstream of the FWR. From a nontargeted standpoint, more tentatively identified chemicals were present, at greater relative abundance, downstream of the conventional system as well. Frequently occurring contaminants included phthalates, pharmaceuticals, and industrial chemicals. These data indicate that FWR systems represent a sustainable wastewater treatment alternative and that emerging contaminant release to waterways was lower at the FWR system than at the conventional WWTP.

  10. Optical demodulation system for digitally encoded suspension array in fluoroimmunoassay

    NASA Astrophysics Data System (ADS)

    He, Qinghua; Li, Dongmei; He, Yonghong; Guan, Tian; Zhang, Yilong; Shen, Zhiyuan; Chen, Xuejing; Liu, Siyu; Lu, Bangrong; Ji, Yanhong

    2017-09-01

A laser-induced breakdown spectroscopy and fluorescence spectroscopy-coupled optical system is reported that demodulates digitally encoded suspension arrays in fluoroimmunoassay. It takes advantage of the plasma emissions of assembled elemental materials to digitally decode the suspension array, providing more stable and accurate recognition of target biomolecules. By separating the decoding of the suspension array and the calculation of the adsorbed quantity of biomolecules into two independent channels, the cross talk between decoding and label signals in traditional methods is successfully avoided, which improves the accuracy of both processes and enables more sensitive quantitative detection of target biomolecules. We carried out multiplexed detection of several types of anti-IgG to verify the quantitative analysis performance of the system. A limit of detection of 1.48×10⁻¹⁰ M was achieved, demonstrating the detection sensitivity of the optical demodulation system.

  11. Operational models of infrastructure resilience.

    PubMed

    Alderson, David L; Brown, Gerald G; Carlyle, W Matthew

    2015-04-01

    We propose a definition of infrastructure resilience that is tied to the operation (or function) of an infrastructure as a system of interacting components and that can be objectively evaluated using quantitative models. Specifically, for any particular system, we use quantitative models of system operation to represent the decisions of an infrastructure operator who guides the behavior of the system as a whole, even in the presence of disruptions. Modeling infrastructure operation in this way makes it possible to systematically evaluate the consequences associated with the loss of infrastructure components, and leads to a precise notion of "operational resilience" that facilitates model verification, validation, and reproducible results. Using a simple example of a notional infrastructure, we demonstrate how to use these models for (1) assessing the operational resilience of an infrastructure system, (2) identifying critical vulnerabilities that threaten its continued function, and (3) advising policymakers on investments to improve resilience. © 2014 Society for Risk Analysis.
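The "operational resilience" idea can be illustrated with a toy operator model: served demand is recomputed with each component removed, and the resulting shortfall ranks component criticality. The component names, capacities, and demand below are notional, in the spirit of the paper's notional-infrastructure example, not taken from it.

```python
def served_demand(capacities, demand, lost=()):
    """Miniature operator model: the demand served is capped by the
    combined capacity of the components that survive the disruption."""
    cap = sum(c for name, c in capacities.items() if name not in lost)
    return min(cap, demand)

# Notional infrastructure: three supply components feeding one demand node.
capacities = {"plant_A": 60.0, "plant_B": 50.0, "line_C": 30.0}
demand = 100.0

baseline = served_demand(capacities, demand)
# Consequence of losing each component, measured in unserved demand.
shortfalls = {name: baseline - served_demand(capacities, demand, lost={name})
              for name in capacities}
worst = max(shortfalls, key=shortfalls.get)   # most critical component
```

Even this tiny model exhibits the paper's three uses: `baseline` assesses operation, `shortfalls` identifies critical vulnerabilities (here losing `line_C` costs nothing because of spare capacity), and comparing `shortfalls` before and after a candidate investment would inform policy.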

  12. Validation of a new UNIX-based quantitative coronary angiographic system for the measurement of coronary artery lesions.

    PubMed

    Bell, M R; Britson, P J; Chu, A; Holmes, D R; Bresnahan, J F; Schwartz, R S

    1997-01-01

We describe a method of validation of computerized quantitative coronary arteriography and report the results for a new UNIX-based quantitative coronary arteriography software program developed for rapid on-line (digital) and off-line (digital or cinefilm) analysis. The UNIX operating system is widely available in computer systems using very fast processors and has excellent graphics capabilities. The system is potentially compatible with any cardiac digital x-ray system for on-line analysis and has been designed to incorporate an integrated database, have on-line and immediate recall capabilities, and provide digital access to all data. The accuracy (mean signed difference of the observed minus the true dimensions) and precision (pooled standard deviation of the measurements) of the program were determined using x-ray vessel phantoms. Intra- and interobserver variabilities were assessed from in vivo studies during routine clinical coronary arteriography. Precision from the x-ray phantom studies (6-in. field of view) was 0.066 mm for digital images and 0.060 mm for digitized cine images. Accuracy was 0.076 mm (overestimation) for digital images compared to 0.008 mm for digitized cine images. Diagnostic coronary catheters were also used for calibration; accuracy varied according to the size of the catheter and whether or not it was filled with iodinated contrast. Intra- and interobserver variabilities were excellent and indicated that coronary lesion measurements were relatively user-independent. Thus, this easy-to-use and very fast UNIX-based program appears to be robust, with optimal accuracy and precision for clinical and research applications.
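The validation statistics are simple to reproduce: accuracy as the mean signed difference (measured minus true) and precision as the spread of those differences. The paper pools standard deviations across phantom sizes; the single-set version is shown here, and the phantom readings are invented for illustration.

```python
import math

def accuracy_and_precision(measured, true):
    """Accuracy = mean signed difference (measured - true), where a
    positive value indicates systematic overestimation; precision =
    sample standard deviation of those differences."""
    d = [m - t for m, t in zip(measured, true)]
    n = len(d)
    mean = sum(d) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (n - 1))
    return mean, sd

# Hypothetical phantom readings (mm) against known vessel diameters.
true     = [0.5, 1.0, 1.5, 2.0, 3.0]
measured = [0.58, 1.05, 1.60, 2.06, 3.11]
bias, sd = accuracy_and_precision(measured, true)
```

Separating the two matters clinically: a constant bias can be calibrated away, whereas poor precision directly limits how small a change in lesion diameter the system can resolve.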

  13. Development of multitissue microfluidic dynamic array for assessing changes in gene expression associated with channel catfish appetite, growth, metabolism, and intestinal health

    USDA-ARS?s Scientific Manuscript database

    Large-scale, gene expression methods allow for high throughput analysis of physiological pathways at a fraction of the cost of individual gene expression analysis. Systems, such as the Fluidigm quantitative PCR array described here, can provide powerful assessments of the effects of diet, environme...

  14. The Main Concept Analysis in Cantonese Aphasic Oral Discourse: External Validation and Monitoring Chronic Aphasia

    ERIC Educational Resources Information Center

    Kong, Anthony Pak-Hin

    2011-01-01

    Purpose: The 1st aim of this study was to further establish the external validity of the main concept (MC) analysis by examining its relationship with the Cantonese Linguistic Communication Measure (CLCM; Kong, 2006; Kong & Law, 2004)--an established quantitative system for narrative production--and the Cantonese version of the Western Aphasia…

  15. An Overview of ACRL "Metrics", Part II: Using NCES and IPEDS Data

    ERIC Educational Resources Information Center

    Stewart, Christopher

    2012-01-01

    This report is the second in a two-part analysis of ACRL "Metrics", an online service that provides access to a range of quantitative data for academic libraries. In this analysis, the inclusion in ACRL "Metrics" of data from the National Center for Education Statistics' Academic Libraries Survey and the Integrated Postsecondary Education Data System is…

  16. Potable water taste enhancement

    NASA Technical Reports Server (NTRS)

    1974-01-01

    An analysis was conducted to determine the causes of and remedies for the unpalatability of potable water in manned spacecraft. Criteria and specifications for palatable water were established, and a quantitative laboratory analysis technique was developed for determining the amounts of volatile organics in good-tasting water. Prototype spacecraft water reclamation systems are evaluated in terms of the essential palatability factors.

  17. The new numerology of immunity mediated by virus-specific CD8(+) T cells.

    PubMed

    Doherty, P C

    1998-08-01

    Our understanding of virus-specific CD8(+) T cell responses is currently being revolutionized by peptide-based assay systems that allow flow cytometric analysis of effector and memory cytotoxic T lymphocyte populations. These techniques are, for the first time, putting the analysis of T-cell-mediated immunity on a quantitative basis.

  18. Practice and Evaluation of Blended Learning with Cross-Cultural Distance Learning in a Foreign Language Class: Using Mix Methods Data Analysis

    ERIC Educational Resources Information Center

    Sugie, Satoko; Mitsugi, Makoto

    2014-01-01

    The Information and Communication Technology (ICT) utilization in Chinese as a "second" foreign language has mainly been focused on Learning Management System (LMS), digital material development, and quantitative analysis of learners' grammatical knowledge. There has been little research that has analyzed the effectiveness of…

  19. Quantitative and Descriptive Comparison of Four Acoustic Analysis Systems: Vowel Measurements

    ERIC Educational Resources Information Center

    Burris, Carlyn; Vorperian, Houri K.; Fourakis, Marios; Kent, Ray D.; Bolt, Daniel M.

    2014-01-01

    Purpose: This study examines the accuracy and comparability of 4 trademarked acoustic analysis software packages (AASPs): Praat, WaveSurfer, TF32, and CSL, using synthesized and natural vowels. Features of the AASPs are also described. Method: Synthesized and natural vowels were analyzed using each AASP's default settings to secure 9…

  20. Quantitative Determination of NTA and Other Chelating Agents in Detergents by Potentiometric Titration with Copper Ion Selective Electrode.

    PubMed

    Ito, Sana; Morita, Masaki

    2016-01-01

    Quantitative analysis of nitrilotriacetate (NTA) in detergents by titration with Cu(2+) solution using a copper ion-selective electrode was achieved. This method tolerates a wide range of pH values and detergent ingredients. In addition to NTA, other chelating agents with relatively lower stability constants toward Cu(2+) were also quantified with sufficient accuracy by this analytical method for model detergent formulations. The titration process was automated using commercially available automatic titrating systems.

  1. Clinical and mathematical introduction to computer processing of scintigraphic images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goris, M.L.; Briandet, P.A.

    The authors state in their preface: "...we believe that there is no book yet available in which computing in nuclear medicine has been approached in a reasonable manner. This book is our attempt to correct the situation." The book is divided into four sections: (1) Clinical Applications of Quantitative Scintigraphic Analysis; (2) Mathematical Derivations; (3) Processing Methods of Scintigraphic Images; and (4) The (Computer) System. Section 1 has chapters on quantitative approaches to congenital and acquired heart diseases, nephrology and urology, and pulmonary medicine.

  2. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms, and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach to enable data sharing and to promote reuse of quantitative imaging data in the community. We performed a survey of the tools currently in use by the QIN member sites for representation and storage of their QIN research data, including images, image metadata, and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image metadata across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms, and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for treatment monitoring of brain disorders. Therefore, a computer-aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarity with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach localizes recovered regions with 93.4% accuracy. Moreover, the quantitative indexes of recovered regions derived from the CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
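    The heart of a change-rate map is a voxel-wise relative difference between registered baseline and follow-up volumes. A hedged sketch (registration, normalization, and the rest of the published CAA pipeline are omitted; the arrays are illustrative):

```python
import numpy as np

def change_rate_map(baseline, followup, eps=1e-6):
    """Voxel-wise change rate between two registered SPECT volumes:
    (follow-up - baseline) / baseline.  eps guards against division
    by zero in background voxels."""
    baseline = np.asarray(baseline, dtype=float)
    followup = np.asarray(followup, dtype=float)
    return (followup - baseline) / np.maximum(baseline, eps)

base = np.array([[100.0, 80.0], [60.0, 50.0]])   # baseline rCBF counts
post = np.array([[120.0, 80.0], [45.0, 60.0]])   # follow-up rCBF counts
crm = change_rate_map(base, post)                # +0.2 means a 20% increase
```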

  4. Qualification Testing Versus Quantitative Reliability Testing of PV - Gaining Confidence in a Rapidly Changing Technology: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L

    Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, the consequence of a possible failure, variability in manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and the prioritization of research to provide a technical basis both for the standards and for the analysis related to their application. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.

  5. Security Events and Vulnerability Data for Cybersecurity Risk Estimation.

    PubMed

    Allodi, Luca; Massacci, Fabio

    2017-08-01

    Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimates (e.g., Basel II in finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
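    As a toy illustration of the two-stage structure (an independence-based simplification, not the authors' actual model), the probability of compromise can be composed from per-exploit success probabilities:

```python
def p_breach(exploit_probs):
    """P(at least one weaponized exploit against the Internet-facing
    system succeeds), assuming independent attempts."""
    p_none = 1.0
    for p in exploit_probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def p_two_stage(perimeter_exploit_probs, p_escalation):
    """Two-stage attack: breach the perimeter, then escalate to an
    internal system via a local vulnerability."""
    return p_breach(perimeter_exploit_probs) * p_escalation

# More "weaponized" exploits (a more powerful attacker) -> higher risk
risk = p_two_stage([0.5, 0.5], 0.2)
```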

  6. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength-dependent variables; even the excitation light sources have been changing. For example, LED light engines may have a significantly different spectral response compared to traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources, and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexed imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
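    The spectral bookkeeping such a tool automates can be sketched as wavelength integrals over products of all the spectra involved (a toy model under a common-grid assumption; the function and the flat placeholder spectra are illustrative, not Semrock's implementation):

```python
import numpy as np

def detected_signal(wl, source, ex_filter, fluor_ex, fluor_em, em_filter, qe):
    """Relative fluorescence signal: excitation efficiency times
    collection efficiency, each a rectangle-rule integral of the
    product of the relevant spectra sampled on the same grid wl."""
    dwl = wl[1] - wl[0]
    excitation = np.sum(source * ex_filter * fluor_ex) * dwl
    collection = np.sum(fluor_em * em_filter * qe) * dwl
    return excitation * collection

wl = np.linspace(400.0, 700.0, 301)   # nm
flat = np.ones_like(wl)               # placeholder flat spectra
sig = detected_signal(wl, flat, flat, flat, flat, flat, flat)
```

    Doubling any single factor (e.g., source brightness) doubles the predicted signal, which is the kind of quantitative comparison an intuitive spectra overlay cannot provide.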

  7. How structurally stable are global socioeconomic systems?

    PubMed Central

    Saavedra, Serguei; Rohr, Rudolf P.; Gilarranz, Luis J.; Bascompte, Jordi

    2014-01-01

    The stability analysis of socioeconomic systems has been centred on answering whether small perturbations when a system is in a given quantitative state will push the system permanently to a different quantitative state. However, the quantitative state of socioeconomic systems is typically subject to constant change. Therefore, a key stability question that has been under-investigated is how strongly the conditions of a system itself can change before the system moves to a qualitatively different behaviour, i.e. how structurally stable the system is. Here, we introduce a framework to investigate the structural stability of socioeconomic systems formed by a network of interactions among agents competing for resources. We measure the structural stability of the system as the range of conditions in the distribution and availability of resources compatible with the qualitative behaviour in which all the constituent agents can be self-sustained across time. To illustrate our framework, we study an empirical representation of the global socioeconomic system formed by countries sharing and competing for multinational companies used as a proxy for resources. We demonstrate that the structural stability of the system is inversely associated with the level of competition and the level of heterogeneity in the distribution of resources. Importantly, we show that the qualitative behaviour of the observed global socioeconomic system is highly sensitive to changes in the distribution of resources. We believe that this work provides a methodological basis to develop sustainable strategies for socioeconomic systems subject to constantly changing conditions. PMID:25165600

  8. Expression and immunoaffinity purification of recombinant soluble human GPR56 protein for the analysis of GPR56 receptor shedding by ELISA.

    PubMed

    Yang, Tai-Yun; Chiang, Nien-Yi; Tseng, Wen-Yi; Pan, Hsiao-Lin; Peng, Yen-Ming; Shen, Jiann-Jong; Wu, Kuo-An; Kuo, Ming-Ling; Chang, Gin-Wen; Lin, Hsi-Hsien

    2015-05-01

    GPR56 is a multi-functional adhesion-class G protein-coupled receptor involved in biological systems as diverse as brain development, male gonad development, myoblast fusion, hematopoietic stem cell maintenance, tumor growth and metastasis, and immune regulation. Ectodomain shedding of the human GPR56 receptor has been demonstrated previously; however, quantitative detection of GPR56 receptor shedding has not been fully investigated due to the lack of appropriate assays. Herein, an efficient system for the expression and immunoaffinity purification of the recombinant soluble extracellular domain of human GPR56 (sGPR56) protein from a stably transduced human melanoma cell line was established. The identity and functionality of the recombinant human sGPR56 protein were verified by Western blotting and mass spectrometry, and by ligand-binding assays, respectively. Combined with the use of two recently generated anti-GPR56 monoclonal antibodies, a sensitive sandwich ELISA was successfully developed for the quantitative detection of the human sGPR56 molecule. We found that GPR56 receptor shedding occurred constitutively and was further increased in activated human melanoma cells expressing endogenous GPR56. In conclusion, we report herein an efficient system for the production and purification of human sGPR56 protein for the establishment of a quantitative ELISA analysis of GPR56 receptor shedding. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Quantitative trace analysis of complex mixtures using SABRE hyperpolarization.

    PubMed

    Eshuis, Nan; van Weerdenburg, Bram J A; Feiters, Martin C; Rutjes, Floris P J T; Wijmenga, Sybren S; Tessari, Marco

    2015-01-26

    Signal amplification by reversible exchange (SABRE) is an emerging nuclear spin hyperpolarization technique that strongly enhances NMR signals of small molecules in solution. However, such signal enhancements have never been exploited for concentration determination, as the efficiency of SABRE can strongly vary between different substrates or even between nuclear spins in the same molecule. The first application of SABRE for the quantitative analysis of a complex mixture is now reported. Despite the inherent complexity of the system under investigation, which involves thousands of competing binding equilibria, analytes at concentrations in the low micromolar range could be quantified from single-scan SABRE spectra using a standard-addition approach. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
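    The standard-addition arithmetic itself is simple: fit signal versus spiked concentration and extrapolate to zero signal. A generic chemometrics sketch (the numbers are hypothetical, not the paper's data):

```python
import numpy as np

def standard_addition_conc(added, signal):
    """Fit signal = a * added + b by least squares; the unknown
    concentration is the magnitude of the x-intercept, b / a."""
    a, b = np.polyfit(added, signal, 1)
    return b / a

added = np.array([0.0, 5.0, 10.0, 15.0])    # spiked concentration (uM)
signal = np.array([2.0, 3.0, 4.0, 5.0])     # hypothetical peak integrals
c0 = standard_addition_conc(added, signal)  # estimated analyte concentration
```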

  10. How to Combine ChIP with qPCR.

    PubMed

    Asp, Patrik

    2018-01-01

    Chromatin immunoprecipitation (ChIP) coupled with quantitative PCR (qPCR) has in the last 15 years become a basic mainstream tool in genomic research. Numerous commercially available ChIP kits, qPCR kits, and real-time PCR systems allow for quick and easy analysis of virtually anything chromatin-related as long as there is an available antibody. However, the highly accurate quantitative dimension added by using qPCR to analyze ChIP samples significantly raises the bar in terms of experimental accuracy, appropriate controls, data analysis, and data presentation. This chapter will address these potential pitfalls by providing protocols and procedures that address the difficulties inherent in ChIP-qPCR assays.
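    One common way to make ChIP-qPCR quantitative is the percent-input method. A minimal sketch assuming 100% primer efficiency (a perfect doubling per cycle); the Ct values are hypothetical:

```python
import math

def percent_input(ct_input, ct_ip, input_fraction=0.01):
    """Percent input for a ChIP-qPCR target: adjust the input Ct for
    the fraction of chromatin saved as input (here 1%), then compare
    to the IP Ct assuming an amplification factor of 2 per cycle."""
    adjusted_input_ct = ct_input - math.log2(1.0 / input_fraction)
    return 100.0 * 2.0 ** (adjusted_input_ct - ct_ip)

enrichment = percent_input(ct_input=24.0, ct_ip=28.0)  # percent of input recovered
```

    In practice such values are compared against a negative-control locus or a mock (IgG) IP, which is the kind of control the chapter emphasizes.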

  11. Computer simulation of schlieren images of rotationally symmetric plasma systems: a simple method.

    PubMed

    Noll, R; Haas, C R; Weikl, B; Herziger, G

    1986-03-01

    Schlieren techniques are commonly used methods for the quantitative analysis of cylindrical or spherical index-of-refraction profiles. Many schlieren objects, however, are characterized by more complex geometries, so we have investigated the more general case of noncylindrical, rotationally symmetric distributions of the index of refraction n(r,z). Assuming straight ray paths in the schlieren object, we have calculated 2-D beam deviation profiles. It is shown that experimental schlieren images of the noncylindrical plasma generated by a plasma focus device can be simulated with these deviation profiles. The computer simulation allows a quantitative analysis of these schlieren images, which yields, for example, the plasma parameters, electron density, and electron density gradients.

  12. Advanced IR System For Supersonic Boundary Layer Transition Flight Experiment

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.

    2008-01-01

    Infrared thermography is a preferred method for investigating transition in flight: a) it is global and non-intrusive; b) it can also be used to visualize and characterize other fluid-mechanic phenomena such as shock impingement and separation. The F-15-based system was updated with a new camera and digital video recorder to support high-Reynolds-number transition tests. Digital recording improves image quality and analysis capability, allows accurate quantitative (temperature) measurements, and, through greater enhancement by image processing, permits analysis of smaller-scale phenomena.

  13. Robustness of reduced-order multivariable state-space self-tuning controller

    NASA Technical Reports Server (NTRS)

    Yuan, Zhuzhi; Chen, Zengqiang

    1994-01-01

    In this paper, we present a quantitative analysis of the robustness of a reduced-order pole-assignment state-space self-tuning controller for a multivariable adaptive control system in which the order of the real process is higher than that of the model used in the controller design. The result of the stability analysis shows that, under a specific bounded modelling error, the adaptively controlled closed-loop real system via the reduced-order state-space self-tuner is BIBO stable in the presence of unmodelled dynamics.

  14. Comparative Analysis of TIAA/CREF and North Dakota Public Employee Retirement System Pension Fund. North Dakota Economic Studies Number 55.

    ERIC Educational Resources Information Center

    Lee, Jeong W.

    Quantitative financial measures were applied to evaluate the performance of the North Dakota Public Employee Retirement System (NDPERS) pension fund portfolios and the Teachers Insurance and Annuity Association (TIAA)/College Retirement Equities Fund (CREF) portfolios, thus providing a relative performance assessment. Ten years of data were…

  15. A Quantitative and Qualitative Analysis of the Effectiveness of Online Homework in First-Semester Calculus

    ERIC Educational Resources Information Center

    Zerr, Ryan

    2007-01-01

    An online homework system created for use by beginning calculus students is described. This system was designed with the specific goal of supporting student engagement outside of class by replicating the attempt-feedback-reattempt sequence of events which often occurs in a teacher's presence. Evidence is presented which indicates that this goal…

  16. MAPPING TOXICANT-INDUCED NERVOUS SYSTEM DAMAGE WITH A CUPRIC SILVER STAIN: A QUANTITATIVE ANALYSIS OF NEURAL DEGENERATION INDUCED BY 3,4-METHYLENEDIOXYMETHAMPETHAMINE (MDMA)

    EPA Science Inventory

    The purpose of structural assessments in neurotoxicology is to provide a convincing picture of the location and extent of damage to the nervous system. Silver stains that selectively reveal neural degeneration hold particular promise in this regard. In this chapter we describe resu...

  17. Visualization of system dynamics using phasegrams

    PubMed Central

    Herbst, Christian T.; Herzel, Hanspeter; Švec, Jan G.; Wyman, Megan T.; Fitch, W. Tecumseh

    2013-01-01

    A new tool for visualization and analysis of system dynamics is introduced: the phasegram. Its application is illustrated with both classical nonlinear systems (logistic map and Lorenz system) and with biological voice signals. Phasegrams combine the advantages of sliding-window analysis (such as the spectrogram) with well-established visualization techniques from the domain of nonlinear dynamics. In a phasegram, time is mapped onto the x-axis, and various vibratory regimes, such as periodic oscillation, subharmonics or chaos, are identified within the generated graph by the number and stability of horizontal lines. A phasegram can be interpreted as a bifurcation diagram in time. In contrast to other analysis techniques, it can be automatically constructed from time-series data alone: no additional system parameter needs to be known. Phasegrams show great potential for signal classification and can act as the quantitative basis for further analysis of oscillating systems in many scientific fields, such as physics (particularly acoustics), biology or medicine. PMID:23697715

  18. Quantitative allochem compositional analysis of Lochkovian-Pragian boundary sections in the Prague Basin (Czech Republic)

    NASA Astrophysics Data System (ADS)

    Weinerová, Hedvika; Hron, Karel; Bábek, Ondřej; Šimíček, Daniel; Hladil, Jindřich

    2017-06-01

    Quantitative allochem compositional trends across the Lochkovian-Pragian boundary Event were examined at three sections recording the proximal to more distal carbonate-ramp environments of the Prague Basin. Multivariate statistical methods (principal component analysis, correspondence analysis, cluster analysis) applied to point-counted thin-section data were used to reconstruct facies stacking patterns and sea-level history. Both the closed allochem percentages and their centred log-ratio (clr) coordinates were used; both approaches allow the lowstand, transgressive, and highstand systems tracts to be distinguished within the Praha Formation, which shows a gradual transition from crinoid-dominated facies deposited above the storm wave base to dacryoconarid-dominated facies of the deep-water environment below the storm wave base. Quantitative compositional data also indicate progradational-retrogradational trends in the macrolithologically monotonous shallow-water succession and enable its stratigraphic correlation with successions from deeper-water environments. Generally, the stratigraphic trends of the clr data are more sensitive to subtle changes in allochem composition than the results based on raw data. A heterozoan-dominated allochem association in shallow-water environments of the Praha Formation supports the carbonate-ramp environment assumed by previous authors.
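    The centred log-ratio transform used here maps a closed composition into unconstrained real coordinates. A minimal sketch (the example proportions are illustrative; zero counts must be replaced before taking logs):

```python
import numpy as np

def clr(composition):
    """Centred log-ratio transform: clr(x)_i = ln(x_i / g(x)),
    where g(x) is the geometric mean of the composition."""
    x = np.asarray(composition, dtype=float)
    g = np.exp(np.mean(np.log(x)))
    return np.log(x / g)

# Illustrative allochem proportions (e.g., crinoids, dacryoconarids, ...)
allochems = np.array([0.60, 0.25, 0.10, 0.05])
z = clr(allochems)   # clr coordinates sum to zero
```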

  19. Cost analysis of oxygen recovery systems

    NASA Technical Reports Server (NTRS)

    Yakut, M. M.

    1973-01-01

    A cost analysis is reported for four leading oxygen recovery subsystems: two carbon dioxide reduction subsystems and two water electrolysis subsystems, namely the solid polymer electrolyte and the circulating KOH electrolyte. The four oxygen recovery systems were quantitatively evaluated. System characteristics, including process flows, performance, and physical characteristics, were also analyzed. Additionally, the development status of each system considered and the advanced technology efforts required to bring conceptual and/or pre-prototype hardware to operational prototype status were defined. Intimate knowledge of the operation, development status, and capability of the systems to meet space mission requirements was found to be essential in establishing cost-estimating relationships for advanced life support systems.

  20. Microscopic optical path length difference and polarization measurement system for cell analysis

    NASA Astrophysics Data System (ADS)

    Satake, H.; Ikeda, K.; Kowa, H.; Hoshiba, T.; Watanabe, E.

    2018-03-01

    In recent years, noninvasive, nonstaining, and nondestructive quantitative cell measurement techniques have become increasingly important in the medical field. These cell measurement techniques enable the quantitative analysis of living cells, and are therefore applied to various cell identification processes, such as those determining the passage number limit during cell culturing in regenerative medicine. To enable cell measurement, we developed a quantitative microscopic phase imaging system based on a Mach-Zehnder interferometer that measures the optical path length difference distribution without phase unwrapping using optical phase locking. The applicability of our phase imaging system was demonstrated by successful identification of breast cancer cells amongst normal cells. However, the cell identification method using this phase imaging system exhibited a false identification rate of approximately 7%. In this study, we implemented a polarimetric imaging system by introducing a polarimetric module to one arm of the Mach-Zehnder interferometer of our conventional phase imaging system. This module was comprised of a quarter wave plate and a rotational polarizer on the illumination side of the sample, and a linear polarizer on the optical detector side. In addition, we developed correction methods for the measurement errors of the optical path length and birefringence phase differences that arose through the influence of elements other than cells, such as the Petri dish. As the Petri dish holding the fluid specimens was transparent, it did not affect the amplitude information; however, the optical path length and birefringence phase differences were affected. Therefore, we proposed correction of the optical path length and birefringence phase for the influence of elements other than cells, as a prerequisite for obtaining highly precise phase and polarimetric images.

  1. Emergy evaluation of water utilization benefits in water-ecological-economic system based on water cycle process

    NASA Astrophysics Data System (ADS)

    Guo, X.; Wu, Z.; Lv, C.

    2017-12-01

    Water utilization benefits are formed by the material, energy, and information flows and the value stream of the whole water cycle, and are reflected in the material circulation within the system. However, most traditional evaluations of water utilization benefits operate at the macro level: they consider only aggregate material input and output and overall energy conversion, and fail to characterize, from the formation mechanism, the benefits that accompany the water cycle process. In addition, most studies take a purely economic perspective, attending to total economic output and the economic investment in sewage treatment while neglecting the ecological function benefits of the water cycle. Therefore, from the perspective of internal material circulation within the whole system, and treating the water cycle as a process of material circulation and energy flow, this study describes the circulation of water together with other ecological-environmental and socio-economic elements, explores the composition of positive and negative water utilization benefits in the water-ecological-economic system, and analyzes the performance of each benefit. On this basis, an emergy calculation method for each benefit is proposed using emergy quantitative analysis, enabling unified measurement and evaluation of water utilization benefits in the water-ecological-economic system. Taking Zhengzhou city as an example, the benefits corresponding to different water cycle links were calculated quantitatively by the emergy method. The results show that the emergy evaluation method of water utilization benefits unifies the ecosystem and the economic system, achieves uniform quantitative analysis, and comprehensively measures the true value of natural resources and human economic activities.

  2. Visual Aggregate Analysis of Eligibility Features of Clinical Trials

    PubMed Central

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-01-01

    Objective To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Methods Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions “hypertension” and “Type 2 diabetes”, respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. PMID:25615940

  3. Visual aggregate analysis of eligibility features of clinical trials.

    PubMed

    He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua

    2015-04-01

    To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions "hypertension" and "Type 2 diabetes", respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. Copyright © 2015 Elsevier Inc. All rights reserved.
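
    The "distribution analysis" module the abstract describes can be sketched in a few lines of standard-library Python: given each trial's permissible range for a quantitative eligibility feature (here BMI), count how many studies admit each value point and compute each range's width. The trial IDs and BMI ranges below are invented for illustration; this is a hedged sketch of the idea, not VITTA's implementation.

```python
# Sketch of aggregate analysis over quantitative eligibility features.
# Trial IDs and BMI ranges are invented for illustration.

def studies_admitting(trials, value):
    """Number of trials whose [lo, hi] eligibility range contains `value`."""
    return sum(1 for lo, hi in trials.values() if lo <= value <= hi)

trials = {  # trial_id -> (BMI lower bound, BMI upper bound)
    "T1": (18.5, 30.0),
    "T2": (25.0, 40.0),
    "T3": (18.5, 24.9),
}

points = [18, 22, 27, 35]  # consecutive value points of the feature
dist = {v: studies_admitting(trials, v) for v in points}
widths = {t: hi - lo for t, (lo, hi) in trials.items()}  # value range widths
# dist == {18: 0, 22: 2, 27: 2, 35: 1}
```

    Plotting `dist` over a fine grid of value points gives the kind of distribution chart the abstract describes rendering with Google Charts.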

  4. Computational, Experimental and Engineering Foundations of Ionic Channels as Miniaturized Sensors, Devices and Systems

    DTIC Science & Technology

    2003-10-01

    made in an ensemble of channels of unknown orientation and number, preventing quantitative analysis. • Currents have been compared among continuum PNP...microfluidic) analysis of ion channels to obtain fundamental insights into the selectivity, conductivity, and sensitivity of ion channels [19], [6...1.1 Develop fast and efficient simulators for steady-state analysis of continuum model for extraction of I-V curves. 1.2 Create

  5. Multicenter study of quantitative computed tomography analysis using a computer-aided three-dimensional system in patients with idiopathic pulmonary fibrosis.

    PubMed

    Iwasawa, Tae; Kanauchi, Tetsu; Hoshi, Toshiko; Ogura, Takashi; Baba, Tomohisa; Gotoh, Toshiyuki; Oba, Mari S

    2016-01-01

    To evaluate the feasibility of automated quantitative analysis with a three-dimensional (3D) computer-aided system (i.e., Gaussian histogram normalized correlation, GHNC) of computed tomography (CT) images from different scanners. Each institution's review board approved the research protocol. Informed patient consent was not required. The participants in this multicenter prospective study were 80 patients (65 men, 15 women) with idiopathic pulmonary fibrosis. Their mean age was 70.6 years. Computed tomography (CT) images were obtained by four different scanners set at different exposures. We measured the extent of fibrosis using GHNC, and used Pearson's correlation analysis, Bland-Altman plots, and kappa analysis to directly compare the GHNC results with manual scoring by radiologists. Multiple linear regression analysis was performed to determine the association between the CT data and forced vital capacity (FVC). For each scanner, the extent of fibrosis as determined by GHNC was significantly correlated with the radiologists' score. In multivariate analysis, the extent of fibrosis as determined by GHNC was significantly correlated with FVC (p < 0.001). There was no significant difference between the results obtained using different CT scanners. Gaussian histogram normalized correlation was feasible, irrespective of the type of CT scanner used.
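
    The two agreement analyses named in the abstract, Pearson correlation and Bland-Altman limits of agreement, can be computed directly with the standard library. The automated (GHNC) and manual fibrosis scores below are invented; this is a sketch of the statistics, not the study's data.

```python
# Pearson r and Bland-Altman bias / 95% limits of agreement between an
# automated fibrosis score and a radiologist score. Values are invented.
from statistics import mean, stdev

def pearson_r(x, y):
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def bland_altman(x, y):
    """Mean difference (bias) and 95% limits of agreement."""
    diffs = [a - b for a, b in zip(x, y)]
    bias, sd = mean(diffs), stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

ghnc   = [10.0, 22.0, 35.0, 41.0, 55.0]  # automated extent of fibrosis (%)
manual = [12.0, 20.0, 33.0, 45.0, 52.0]  # radiologists' visual score (%)
r = pearson_r(ghnc, manual)
bias, lo, hi = bland_altman(ghnc, manual)
```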

  6. Image registration and analysis for quantitative myocardial perfusion: application to dynamic circular cardiac CT.

    PubMed

    Isola, A A; Schmitt, H; van Stevendaal, U; Begemann, P G; Coulon, P; Boussel, L; Grass, M

    2011-09-21

    Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively, ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as it is introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.
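
    Elastic 3D registration is far beyond a short sketch, but the core operation, aligning every frame of a dynamic series to a chosen reference, can be illustrated in 1D with an exhaustive integer-shift search. The signals and the mean-squared-difference criterion are invented for illustration and stand in for the volumetric similarity measure a real registration would use.

```python
# Toy 1D stand-in for frame-to-reference alignment in a dynamic series.

def best_shift(ref, frame, max_shift=5):
    """Integer displacement s minimizing the mean squared difference
    between ref[i] and frame[i + s] over the overlapping samples."""
    def msd(s):
        pairs = [(ref[i], frame[i + s])
                 for i in range(len(ref)) if 0 <= i + s < len(frame)]
        return sum((a - b) ** 2 for a, b in pairs) / len(pairs)
    return min(range(-max_shift, max_shift + 1), key=msd)

ref   = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0]   # reference frame
frame = [0, 0, 0, 0, 1, 4, 9, 4, 1, 0]   # same profile, shifted right by 2
shift = best_shift(ref, frame)           # -> 2; undo it to align the frame
```

    In the paper's setting the same loop runs over reconstructed cardiac volumes with an elastic (non-rigid) deformation model instead of a single integer shift.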

  7. Quantitative analysis of fragrance in selectable one dimensional or two dimensional gas chromatography-mass spectrometry with simultaneous detection of multiple detectors in single injection.

    PubMed

    Tan, Hui Peng; Wan, Tow Shi; Min, Christina Liew Shu; Osborne, Murray; Ng, Khim Hui

    2014-03-14

    A selectable one-dimensional ((1)D) or two-dimensional ((2)D) gas chromatography-mass spectrometry (GC-MS) system coupled with a flame ionization detector (FID) and an olfactory detection port (ODP) was employed in this study to analyze perfume oil and fragrance in shower gel. A split/splitless (SSL) injector and a programmable temperature vaporization (PTV) injector are connected via a 2-way splitter of capillary flow technology (CFT) in this selectable (1)D/(2)D GC-MS/FID/ODP system to facilitate liquid sample injections and thermal desorption (TD) for the stir bar sorptive extraction (SBSE) technique, respectively. The dual-linked injector set-up enables the use of two different injector ports (one at a time) in a single sequence run without having to relocate the (1)D capillary column from one inlet to another. Target analytes were separated in (1)D GC-MS/FID/ODP, followed by further separation of the co-eluting mixture from (1)D in (2)D GC-MS/FID/ODP in a single injection without any instrumental reconfiguration. A (1)D/(2)D quantitative analysis method was developed and validated for its repeatability (tR), calculated linear retention indices (LRI), response ratio in both MS and FID signals, limit of detection (LOD), limit of quantitation (LOQ), and linearity over a concentration range. The method was successfully applied in quantitative analysis of perfume solutions at different concentration levels (RSD≤0.01%, n=5) and of shower gel spiked with perfume at different dosages (RSD≤0.04%, n=5), with good recovery (96-103% for SSL injection; 94-107% for stir bar sorptive extraction-thermal desorption (SBSE-TD)). Copyright © 2014 Elsevier B.V. All rights reserved.
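
    The validation figures listed (repeatability as %RSD, linearity, LOD, LOQ) follow standard formulas. A stdlib sketch, assuming the common 3.3σ/S and 10σ/S conventions for LOD and LOQ (the abstract does not state which definitions were used) and invented calibration data:

```python
# %RSD of replicates, least-squares linearity, and sigma/slope-based LOD/LOQ.
# Concentrations, responses, and noise replicates are invented.
from statistics import mean, stdev

def rsd_percent(values):
    return 100.0 * stdev(values) / mean(values)

def linear_fit(x, y):
    """Least-squares slope, intercept, and R^2."""
    mx, my = mean(x), mean(y)
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

conc = [1.0, 2.0, 5.0, 10.0, 20.0]           # standard concentrations (µg/mL)
area = [10.1, 19.8, 50.3, 99.5, 200.4]       # detector response
slope, intercept, r2 = linear_fit(conc, area)
sigma = stdev([9.9, 10.1, 10.0, 10.2, 9.8])  # response noise at a low level
lod, loq = 3.3 * sigma / slope, 10 * sigma / slope
```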

  8. A probe-based qRT-PCR method to profile immunological gene expression in blood of captive beluga whales (Delphinapterus leucas)

    PubMed Central

    Wang, Jiann-Hsiung; Chou, Shih-Jen; Li, Tsung-Hsien; Leu, Ming-Yih; Ho, Hsiao-Kuan

    2017-01-01

    Cytokines are fundamental for a functioning immune system, and thus potentially serve as important indicators of animal health. Quantitation of mRNA using quantitative reverse transcription polymerase chain reaction (qRT-PCR) is an established immunological technique. It is particularly suitable for detecting the expression of proteins against which monoclonal antibodies are not available. In this study, we developed a probe-based quantitative gene expression assay for immunological assessment of captive beluga whales (Delphinapterus leucas) that is one of the most common cetacean species on display in aquariums worldwide. Six immunologically relevant genes (IL-2Rα, -4, -10, -12, TNFα, and IFNγ) were selected for analysis, and two validated housekeeping genes (PGK1 and RPL4) with stable expression were used as reference genes. Sixteen blood samples were obtained from four animals with different health conditions and stored in RNAlater™ solution. These samples were used for RNA extraction followed by qRT-PCR analysis. Analysis of gene transcripts was performed by relative quantitation using the comparative Cq method with the integration of amplification efficiency and two reference genes. The expression levels of each gene in the samples from clinically healthy animals were normally distributed. Transcript outliers for IL-2Rα, IL-4, IL-12, TNFα, and IFNγ were noticed in four samples collected from two clinically unhealthy animals. This assay has the potential to identify immune system deviation from normal state, which is caused by health problems. Furthermore, knowing the immune status of captive cetaceans could help both trainers and veterinarians in implementing preventive approaches prior to disease onset. PMID:28970970

  9. A probe-based qRT-PCR method to profile immunological gene expression in blood of captive beluga whales (Delphinapterus leucas).

    PubMed

    Tsai, Ming-An; Chen, I-Hua; Wang, Jiann-Hsiung; Chou, Shih-Jen; Li, Tsung-Hsien; Leu, Ming-Yih; Ho, Hsiao-Kuan; Yang, Wei Cheng

    2017-01-01

    Cytokines are fundamental for a functioning immune system, and thus potentially serve as important indicators of animal health. Quantitation of mRNA using quantitative reverse transcription polymerase chain reaction (qRT-PCR) is an established immunological technique. It is particularly suitable for detecting the expression of proteins against which monoclonal antibodies are not available. In this study, we developed a probe-based quantitative gene expression assay for immunological assessment of captive beluga whales (Delphinapterus leucas) that is one of the most common cetacean species on display in aquariums worldwide. Six immunologically relevant genes (IL-2Rα, -4, -10, -12, TNFα, and IFNγ) were selected for analysis, and two validated housekeeping genes (PGK1 and RPL4) with stable expression were used as reference genes. Sixteen blood samples were obtained from four animals with different health conditions and stored in RNAlater™ solution. These samples were used for RNA extraction followed by qRT-PCR analysis. Analysis of gene transcripts was performed by relative quantitation using the comparative Cq method with the integration of amplification efficiency and two reference genes. The expression levels of each gene in the samples from clinically healthy animals were normally distributed. Transcript outliers for IL-2Rα, IL-4, IL-12, TNFα, and IFNγ were noticed in four samples collected from two clinically unhealthy animals. This assay has the potential to identify immune system deviation from normal state, which is caused by health problems. Furthermore, knowing the immune status of captive cetaceans could help both trainers and veterinarians in implementing preventive approaches prior to disease onset.
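
    The comparative Cq arithmetic with efficiency correction and two reference genes can be sketched as follows. This is a hedged illustration in the Pfaffl style, normalizing by the geometric mean of the reference-gene ratios; the Cq values and the 100% amplification efficiencies are invented, and the paper's exact formula may differ.

```python
# Efficiency-corrected relative quantitation with two reference genes.
# Cq values and efficiencies below are invented for illustration.

def ratio(eff, cq_calibrator, cq_sample):
    """Fold change of one gene: E ** (Cq_calibrator - Cq_sample)."""
    return eff ** (cq_calibrator - cq_sample)

def relative_expression(target, references):
    """Target ratio normalized by the geometric mean of reference ratios.
    Each argument is a tuple (efficiency, Cq_calibrator, Cq_sample)."""
    t = ratio(*target)
    refs = [ratio(*r) for r in references]
    geo = 1.0
    for r in refs:
        geo *= r
    geo **= 1.0 / len(refs)
    return t / geo

# Target cytokine vs reference genes PGK1 and RPL4 (efficiency 2.0 = 100%):
fold = relative_expression(
    target=(2.0, 25.0, 22.0),                            # 3 cycles earlier
    references=[(2.0, 20.0, 20.0), (2.0, 18.2, 18.2)],   # unchanged refs
)
# fold == 8.0 (a 3-cycle shift at 100% efficiency is an 8-fold increase)
```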

  10. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.
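
    The "propagation error" the abstract mentions can be illustrated with a toy bound (a simplified illustration, not the paper's DP logic): chaining defaults that each hold with probability at least 1 − ε yields a conclusion holding with probability at least 1 − Σε.

```python
# Additive error bound for chained defaults. The epsilon values and the
# example defaults are invented for illustration.

def chained_lower_bound(epsilons):
    """Lower bound on the probability of a conclusion reached by chaining
    defaults that each hold with probability at least 1 - eps."""
    return max(0.0, 1.0 - sum(epsilons))

# Chaining "birds fly" (eps 0.05) with "flying things migrate" (eps 0.02):
bound = chained_lower_bound([0.05, 0.02])   # about 0.93
```

    The bound degrades with each chained default, which is one way to see why such reasoning is approximate and why tracking the accumulated error matters.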

  11. A Framework for Assessment of Aviation Safety Technology Portfolios

    NASA Technical Reports Server (NTRS)

    Jones, Sharon M.; Reveley, Mary S.

    2014-01-01

    The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.

  12. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale, have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  13. Development of an analytical method for quantitative comparison of the e-waste management systems in Thailand, Laos, and China.

    PubMed

    Liang, Li; Sharp, Alice

    2016-11-01

    This study employed a set of quantitative criteria to analyse three parameters, namely policy, process, and practice, of the respective e-waste management systems adopted in Thailand, Laos, and China. Questionnaire surveys were conducted to determine the current status of the three parameters in relation to mobile phones. A total of five, three, and six variables under Policy (P1), Process (P2), and Practice (P3), respectively, were analysed and their weighted averages were calculated. The results showed that among the three countries surveyed, significant differences at p<0.01 were observed in all the P1, P2, and P3 variables, except P305 (sending e-waste to recovery centres) and P306 (treating e-waste by retailers themselves). Based on the quantitative method developed in this study, Laos' e-waste management system received the highest scores in both the P1 average (0.130) and the P3 average (0.129). However, in the combined Ptotal, China scored the highest (0.141), followed by Laos (0.132) and Thailand (0.121). This method could be used to assist decision makers in performing quantitative analysis of the complex issues associated with e-waste management in a country. © The Author(s) 2016.
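
    The scoring arithmetic described, weighted averages per parameter group plus a combined total, can be sketched as follows. The per-variable scores and the equal weights below are invented; the paper's actual weights and values may differ.

```python
# Weighted-average scoring across Policy (P1), Process (P2), Practice (P3).
# Variable scores and equal weights are invented for illustration.

def weighted_average(scores, weights=None):
    weights = weights or [1.0] * len(scores)
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

country = {
    "P1": [0.12, 0.15, 0.11, 0.14, 0.13],        # five policy variables
    "P2": [0.10, 0.12, 0.11],                    # three process variables
    "P3": [0.13, 0.14, 0.12, 0.13, 0.13, 0.12],  # six practice variables
}
averages = {p: weighted_average(v) for p, v in country.items()}
p_total = sum(averages.values())  # combined score across the three groups
```

    Computing `averages` and `p_total` per country is what allows the cross-country ranking the abstract reports.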

  14. Analysis of physical exercises and exercise protocols for space transportation system operation

    NASA Technical Reports Server (NTRS)

    Coleman, A. E.

    1982-01-01

    A quantitative evaluation of the Thornton-Whitmore treadmill was made so that informed management decisions regarding the role of this treadmill in operational flight crew exercise programs could be made. Specific tasks to be completed were: The Thornton-Whitmore passive treadmill was evaluated as an exercise device at one-g. Hardware, harness and restraint systems for use with the Thornton-Whitmore treadmill in the laboratory and in Shuttle flights were established. The quantitative and qualitative performance of human subjects on the Thornton-Whitmore treadmill with forces in excess of one-g was evaluated. The performance of human subjects on the Thornton-Whitmore treadmill in weightlessness (onboard Shuttle flights) was also determined.

  15. Joint explorative analysis of neuroreceptor subsystems in the human brain: application to receptor-transporter correlation using PET data.

    PubMed

    Cselényi, Zsolt; Lundberg, Johan; Halldin, Christer; Farde, Lars; Gulyás, Balázs

    2004-10-01

    Positron emission tomography (PET) has proved to be a highly successful technique in the qualitative and quantitative exploration of the human brain's neurotransmitter-receptor systems. In recent years, the number of PET radioligands, targeted to different neuroreceptor systems of the human brain, has increased considerably. This development paves the way for a simultaneous analysis of different receptor systems and subsystems in the same individual. The detailed exploration of the versatility of neuroreceptor systems requires novel technical approaches, capable of operating on huge parametric image datasets. An initial step of such explorative data processing and analysis should be the development of novel exploratory data-mining tools to gain insight into the "structure" of complex multi-individual, multi-receptor data sets. For practical reasons, a possible and feasible starting point of multi-receptor research can be the analysis of the pre- and post-synaptic binding sites of the same neurotransmitter. In the present study, we propose an unsupervised, unbiased data-mining tool for this task and demonstrate its usefulness by using quantitative receptor maps, obtained with positron emission tomography, from five healthy subjects on (pre-synaptic) serotonin transporters (5-HTT or SERT) and (post-synaptic) 5-HT(1A) receptors. Major components of the proposed technique include the projection of the input receptor maps to a feature space, the quasi-clustering and classification of projected data (neighbourhood formation), trans-individual analysis of neighbourhood properties (trajectory analysis), and the back-projection of the results of trajectory analysis to normal space (creation of multi-receptor maps). The resulting multi-receptor maps suggest that complex relationships and tendencies in the relationship between pre- and post-synaptic transporter-receptor systems can be revealed and classified by using this method. 
As an example, we demonstrate the regional correlation of the serotonin transporter-receptor systems. These parameter-specific multi-receptor maps can usefully guide the researchers in their endeavour to formulate models of multi-receptor interactions and changes in the human brain.

  16. A review of presented mathematical models in Parkinson's disease: black- and gray-box models.

    PubMed

    Sarbaz, Yashar; Pourakbari, Hakimeh

    2016-06-01

    Parkinson's disease (PD), one of the most common movement disorders, is caused by damage to the central nervous system. Despite all of the studies on PD, the formation mechanism of its symptoms remained unknown. It is still not obvious why damage only to the substantia nigra pars compacta, a small part of the brain, causes a wide range of symptoms. Moreover, the causes of brain damages remain to be fully elucidated. Exact understanding of the brain function seems to be impossible. On the other hand, some engineering tools are trying to understand the behavior and performance of complex systems. Modeling is one of the most important tools in this regard. Developing quantitative models for this disease has begun in recent decades. They are very effective not only in better understanding of the disease, offering new therapies, and its prediction and control, but also in its early diagnosis. Modeling studies include two main groups: black-box models and gray-box models. Generally, in the black-box modeling, regardless of the system information, the symptom is only considered as the output. Such models, besides the quantitative analysis studies, increase our knowledge of the disorders behavior and the disease symptoms. The gray-box models consider the involved structures in the symptoms appearance as well as the final disease symptoms. These models can effectively save time and be cost-effective for the researchers and help them select appropriate treatment mechanisms among all possible options. In this review paper, first, efforts are made to investigate some studies on PD quantitative analysis. Then, PD quantitative models will be reviewed. Finally, the results of using such models are presented to some extent.

  17. Systems Biology, Neuroimaging, Neuropsychology, Neuroconnectivity and Traumatic Brain Injury

    PubMed Central

    Bigler, Erin D.

    2016-01-01

    The patient who sustains a traumatic brain injury (TBI) typically undergoes neuroimaging studies, usually in the form of computed tomography (CT) and magnetic resonance imaging (MRI). In most cases the neuroimaging findings are clinically assessed with descriptive statements that provide qualitative information about the presence/absence of visually identifiable abnormalities; though little if any of the potential information in a scan is analyzed in any quantitative manner, except in research settings. Fortunately, major advances have been made, especially during the last decade, in regards to image quantification techniques, especially those that involve automated image analysis methods. This review argues that a systems biology approach to understanding quantitative neuroimaging findings in TBI provides an appropriate framework for better utilizing the information derived from quantitative neuroimaging and its relation with neuropsychological outcome. Different image analysis methods are reviewed in an attempt to integrate quantitative neuroimaging methods with neuropsychological outcome measures and to illustrate how different neuroimaging techniques tap different aspects of TBI-related neuropathology. Likewise, how different neuropathologies may relate to neuropsychological outcome is explored by examining how damage influences brain connectivity and neural networks. Emphasis is placed on the dynamic changes that occur following TBI and how best to capture those pathologies via different neuroimaging methods. However, traditional clinical neuropsychological techniques are not well suited for interpretation based on contemporary and advanced neuroimaging methods and network analyses. Significant improvements need to be made in the cognitive and behavioral assessment of the brain injured individual to better interface with advances in neuroimaging-based network analyses. 
Viewing both neuroimaging and neuropsychological processes within a systems biology perspective could represent a significant advancement for the field. PMID:27555810

  18. NanoDrop Microvolume Quantitation of Nucleic Acids

    PubMed Central

    Desjardins, Philippe; Conklin, Deborah

    2010-01-01

    Biomolecular assays are continually being developed that use progressively smaller amounts of material, often precluding the use of conventional cuvette-based instruments for nucleic acid quantitation in favor of those that can perform microvolume quantitation. The NanoDrop microvolume sample retention system (Thermo Scientific NanoDrop Products) functions by combining fiber optic technology and natural surface tension properties to capture and retain minute amounts of sample independent of traditional containment apparatus such as cuvettes or capillaries. Furthermore, the system employs shorter path lengths, which result in a broad range of nucleic acid concentration measurements, essentially eliminating the need to perform dilutions. Reducing the volume of sample required for spectroscopic analysis also facilitates the inclusion of additional quality control steps throughout many molecular workflows, increasing efficiency and ultimately leading to greater confidence in downstream results. The need for high-sensitivity fluorescent analysis of limited mass has also emerged with recent experimental advances. Using the same microvolume sample retention technology, fluorescent measurements may be performed with 2 μL of material, allowing fluorescent assay volume requirements to be significantly reduced. Such microreactions of 10 μL or less are now possible using a dedicated microvolume fluorospectrometer. Two microvolume nucleic acid quantitation protocols will be demonstrated that use integrated sample retention systems as practical alternatives to traditional cuvette-based protocols. First, a direct A260 absorbance method using a microvolume spectrophotometer is described. This is followed by a demonstration of a fluorescence-based method that enables reduced-volume fluorescence reactions with a microvolume fluorospectrometer. 
These novel techniques enable the assessment of nucleic acid concentrations ranging from 1 pg/μL to 15,000 ng/μL with minimal consumption of sample. PMID:21189466
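
    The path-length point in the abstract follows from the Beer-Lambert law: concentration scales with absorbance divided by path length, so a shorter path extends the measurable range proportionally. A sketch using the standard 50 ng/µL-per-A260-unit factor for dsDNA, with invented absorbance readings:

```python
# Beer-Lambert-based dsDNA concentration from A260, normalized to a 10 mm
# path. The 50 ng/uL per A260 unit factor for dsDNA is the standard
# convention; the readings are invented.

def dsdna_conc_ng_per_ul(a260, path_mm=10.0, factor=50.0):
    """dsDNA concentration (ng/uL) from an A260 reading at a given path."""
    return a260 * factor * (10.0 / path_mm)

# The same instrument reading at a 1 mm microvolume path corresponds to a
# 10x higher concentration than in a standard 10 mm cuvette:
c_cuvette = dsdna_conc_ng_per_ul(0.5)             # 25.0 ng/uL at 10 mm
c_micro = dsdna_conc_ng_per_ul(0.5, path_mm=1.0)  # 250.0 ng/uL at 1 mm
```

    This is why shorter path lengths let such instruments cover very high concentrations without dilution.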

  19. The need for academic electronic health record systems in nurse education.

    PubMed

    Chung, Joohyun; Cho, Insook

    2017-07-01

    The nursing profession has been slow to incorporate information technology into formal nurse education and practice. The aim of this study was to identify the use of academic electronic health record systems in nurse education and to determine student and faculty perceptions of academic electronic health record systems in nurse education. A quantitative research design with supportive qualitative research was used to gather information on nursing students' perceptions and nursing faculty's perceptions of academic electronic health record systems in nurse education. Eighty-three participants (21 nursing faculty and 62 students), from 5 nursing schools, participated in the study. A purposive sample of 9 nursing faculty was recruited from one university in the Midwestern United States to provide qualitative data for the study. The researcher-designed surveys (completed by faculty and students) were used for quantitative data collection. Qualitative data was taken from interviews, which were transcribed verbatim for analysis. Students and faculty agreed that academic electronic health record systems could be useful for teaching students to think critically about nursing documentation. Quantitative and qualitative findings revealed that academic electronic health record systems regarding nursing documentation could help prepare students for the future of health information technology. Meaningful adoption of academic electronic health record systems will help in building the undergraduate nursing students' competence in nursing documentation with electronic health record systems. Copyright © 2017. Published by Elsevier Ltd.

  20. Two-Photon Flow Cytometry

    NASA Technical Reports Server (NTRS)

    Zhog, Cheng Frank; Ye, Jing Yong; Norris, Theodore B.; Myc, Andrzej; Cao, Zhengyl; Bielinska, Anna; Thomas, Thommey; Baker, James R., Jr.

    2004-01-01

    Flow cytometry is a powerful technique for obtaining quantitative information from fluorescence in cells. Quantitation is achieved by assuring a high degree of uniformity in the optical excitation and detection, generally by using a highly controlled flow such as is obtained via hydrodynamic focusing. In this work, we demonstrate a two-beam, two-channel detection and two-photon excitation flow cytometry (T(3)FC) system that enables multi-dye analysis to be performed very simply, with greatly relaxed requirements on the fluid flow. Two-photon excitation using a femtosecond near-infrared (NIR) laser has the advantages that it enables simultaneous excitation of multiple dyes and achieves very high signal-to-noise ratio through simplified filtering and fluorescence background reduction. By matching the excitation volume to the size of a cell, single-cell detection is ensured. Labeling of cells by targeted nanoparticles with multiple fluorophores enables normalization of the fluorescence signal and thus ratiometric measurements under nonuniform excitation. Quantitative size measurements can also be done even under conditions of nonuniform flow via a two-beam layout. This innovative detection scheme not only considerably simplifies the fluid flow system and the excitation and collection optics, it opens the way to quantitative cytometry in simple and compact microfluidic systems, or in vivo. Real-time detection of fluorescent microbeads in the vasculature of a mouse ear demonstrates the ability to do flow cytometry in vivo. The conditions required to perform quantitative in vivo cytometry on labeled cells will be presented.
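
    The ratiometric normalization the abstract relies on can be shown in two lines: the ratio of two dye channels on the same particle cancels the unknown excitation intensity, so quantitation survives nonuniform excitation. The signal values below are invented.

```python
# Ratiometric normalization: both channels scale with excitation intensity,
# so their ratio is excitation-independent. Signal values are invented.

def ratiometric(signal_a, signal_b):
    """Excitation-independent ratio of two fluorophore channels."""
    return signal_a / signal_b

# The same cell measured at two excitation intensities: the raw signals
# differ by 5x, but the channel ratio does not.
bright = ratiometric(2000.0, 500.0)  # cell passes near the beam center
dim = ratiometric(400.0, 100.0)      # same cell at 20% excitation
# bright == dim == 4.0
```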

  1. A new method for qualitative simulation of water resources systems: 1. Theory

    NASA Astrophysics Data System (ADS)

    Camara, A. S.; Pinheiro, M.; Antunes, M. P.; Seixas, M. J.

    1987-11-01

    A new dynamic modeling methodology, SLIN (Simulação Linguistica), allowing for the analysis of systems defined by linguistic variables, is presented. SLIN applies a set of logical rules avoiding fuzzy theoretic concepts. To make the transition from qualitative to quantitative modes, logical rules are used as well. Extensions of the methodology to simulation-optimization applications and multiexpert system modeling are also discussed.

  2. Study of Cisatracurium and Sufentanil Consumption Using a Closed Loop Computer Control Infusion System

    ClinicalTrials.gov

    2013-02-06

    The Intraoperative Effect of Dexmedetomidine on Cisatracurium Infusion Consumption and Its Recovery Index.; Effect of Dexmedetomidine on Sufentanil Consumption.; Quantitative Analysis of Cisatracurium Infusion Requirements, Sufentanil Consumption and Recovery Index in Different Age Groups.

  3. A Descriptive Guide to Trade Space Analysis

    DTIC Science & Technology

    2015-09-01

    Development QFD Quality Function Deployment RSM Response Surface Method RSE Response Surface Equation SE Systems Engineering SME Subject Matter...surface equations (RSEs) as surrogate models. It uses the RSEs with Monte Carlo simulation to quantitatively explore changes across the surfaces to

  4. Measuring the effects of aborted takeoffs and landings on traffic flow at JFK

    DOT National Transportation Integrated Search

    2012-10-14

    The FAA Office of Accident Investigation and Prevention (AVP) supports research, analysis and demonstration of quantitative air traffic analyses to estimate safety performance and benefits of the Next Generation Air Transportation System (NextGen). T...

  5. Quantitative and Qualitative Analysis of Bacteria in Er(III) Solution by Thin-Film Magnetopheresis

    PubMed Central

    Zborowski, Maciej; Tada, Yoko; Malchesky, Paul S.; Hall, Geraldine S.

    1993-01-01

    Magnetic deposition, quantitation, and identification of bacteria reacting with the paramagnetic trivalent lanthanide ion Er³⁺ were evaluated. The magnetic deposition method was dubbed thin-film magnetopheresis. Optimization of the magnetic deposition protocol was accomplished with Escherichia coli as a model organism in a 150 mM NaCl and 5 mM ErCl₃ solution. Three gram-positive bacteria, Staphylococcus epidermidis, Staphylococcus saprophyticus, and Enterococcus faecalis, and four gram-negative bacteria, E. coli, Pseudomonas aeruginosa, Proteus mirabilis, and Klebsiella pneumoniae, were subsequently investigated. Quantitative analysis consisted of microscopic cell counts and scattered-light scanning of the magnetically deposited material, aided by a computer data acquisition system. Qualitative analysis consisted of Gram stain differentiation and fluorescein isothiocyanate staining in combination with selected antisera against specific types of bacteria on the solid substrate. The magnetic deposition protocol allowed quantitative detection of E. coli down to a concentration of 10⁵ CFU ml⁻¹, significant for clinical diagnosis applications such as urinary tract infections. Er³⁺ interfered neither with the typical appearance of the Gram-stained bacteria nor with antigen recognition by the antibody in the immunohistological evaluations. Indirect antiserum-fluorescein isothiocyanate labelling correctly revealed the presence of E. faecalis and P. aeruginosa in the magnetically deposited material obtained from a mixture of these two bacterial species. On average, gram-positive organisms reacted significantly more strongly to the magnetic field in the presence of Er³⁺ than gram-negative organisms. Thin-film magnetopheresis offers promise as a rapid method for quantitative and qualitative analysis of bacteria in solutions such as urine or environmental water. PMID:16348916

  6. Advances in Surface Plasmon Resonance Imaging enable quantitative measurement of laterally heterogeneous coatings of nanoscale thickness

    NASA Astrophysics Data System (ADS)

    Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John

    2013-03-01

    The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases - uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.

  7. A Novel Method for Tracking Individuals of Fruit Fly Swarms Flying in a Laboratory Flight Arena.

    PubMed

    Cheng, Xi En; Qian, Zhi-Ming; Wang, Shuo Hong; Jiang, Nan; Guo, Aike; Chen, Yan Qiu

    2015-01-01

    The growing interest in studying social behaviours of swarming fruit flies, Drosophila melanogaster, has heightened the need for tools that provide quantitative motion data. To achieve such a goal, multi-camera three-dimensional tracking technology is the key experimental gateway. We have developed a novel tracking system for tracking hundreds of fruit flies flying in a confined cubic flight arena. In addition to the proposed tracking algorithm, this work offers contributions in three further aspects: body detection, orientation estimation, and data validation. To demonstrate the opportunities that the proposed system offers for generating high-throughput quantitative motion data, we conducted experiments on five experimental configurations. We also performed quantitative analysis of the kinematics, the spatial structure, and the motion patterns of fruit fly swarms. We found that there exists an asymptotic distance between fruit flies in swarms as the population density increases. Further, we found evidence of a repulsive response when the distance between fruit flies approached the asymptotic distance. Overall, the proposed tracking system presents a powerful method for studying flight behaviours of fruit flies in a three-dimensional environment.
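    The asymptotic-distance analysis above rests on nearest-neighbour distances computed from tracked 3-D positions. A minimal sketch of that computation (the toy coordinates are illustrative, not data from the study):

```python
import numpy as np

def nearest_neighbour_distances(positions):
    """Distance from each fly to its nearest neighbour.

    positions: (N, 3) array of tracked 3-D coordinates for one frame.
    """
    diff = positions[:, None, :] - positions[None, :, :]
    d = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(d, np.inf)      # exclude self-distance
    return d.min(axis=1)

# toy "swarm": three flies at unit spacing plus one straggler
pts = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [2.0, 0.0, 0.0],
                [4.0, 0.0, 0.0]])
nnd = nearest_neighbour_distances(pts)
```

    Averaging `nnd` over frames at increasing population densities is one way such an asymptotic spacing could be observed; the O(N²) distance matrix is fine for hundreds of flies.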

  8. Reference condition approach to restoration planning

    USGS Publications Warehouse

    Nestler, J.M.; Theiling, C.H.; Lubinski, S.J.; Smith, D.L.

    2010-01-01

    Ecosystem restoration planning requires quantitative rigor to evaluate alternatives, define end states, report progress and perform environmental benefits analysis (EBA). Unfortunately, existing planning frameworks are, at best, semi-quantitative. In this paper, we: (1) describe a quantitative restoration planning approach based on a comprehensive, but simple mathematical framework that can be used to effectively apply knowledge and evaluate alternatives, (2) use the approach to derive a simple but precisely defined lexicon based on the reference condition concept and allied terms and (3) illustrate the approach with an example from the Upper Mississippi River System (UMRS) using hydrologic indicators. The approach supports the development of a scaleable restoration strategy that, in theory, can be expanded to ecosystem characteristics such as hydraulics, geomorphology, habitat and biodiversity. We identify three reference condition types, best achievable condition (ABAC), measured magnitude (MMi, which can be determined at one or many times and places) and desired future condition (ADFC), that, when used with the mathematical framework, provide a complete system of accounts useful for goal-oriented system-level management and restoration. Published in 2010 by John Wiley & Sons, Ltd.

  9. Experiencing the Elicitation of User Requirements and Recording Them in Use Case Diagrams through Role-Play

    ERIC Educational Resources Information Center

    Costain, Gay; McKenna, Brad

    2011-01-01

    This paper describes a role-play exercise used in a second-year tertiary Systems Analysis and Design course, and the quantitative and qualitative analysis of the students' responses to a survey that solicited their perceptions of that role-play experience. The role-play involved students in eliciting user requirements from customers during a Joint…

  10. Quantitative Analysis of Defects in Silicon. Silicon Sheet Growth Development for the Large Area Silicon Sheet Task of the Low-cost Solar Array Project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Smith, J. M.; Qidwai, H. A.

    1979-01-01

    The various steps involved in the chemical polishing and etching of silicon samples are described. Data on twins, dislocation pits, and grain boundaries from thirty-one (31) silicon samples are also discussed. A brief review of the changes made to upgrade the image analysis system is included.

  11. Elucidating dynamic metabolic physiology through network integration of quantitative time-course metabolomics

    DOE PAGES

    Bordbar, Aarash; Yurkovich, James T.; Paglia, Giuseppe; ...

    2017-04-07

    In this study, the increasing availability of metabolomics data necessitates novel methods for deeper data analysis and interpretation. We present a flux balance analysis method that allows for the computation of dynamic intracellular metabolic changes at the cellular scale through integration of time-course absolute quantitative metabolomics. This approach, termed “unsteady-state flux balance analysis” (uFBA), is applied to four cellular systems: three dynamic and one steady-state as a negative control. uFBA and FBA predictions are contrasted, and uFBA is found to be more accurate in predicting dynamic metabolic flux states for red blood cells, platelets, and Saccharomyces cerevisiae. Notably, only uFBA predicts that stored red blood cells metabolize TCA intermediates to regenerate important cofactors, such as ATP, NADH, and NADPH. These pathway usage predictions were subsequently validated through 13C isotopic labeling and metabolic flux analysis in stored red blood cells. Utilizing time-course metabolomics data, uFBA provides an accurate method to predict metabolic physiology at the cellular scale for dynamic systems.
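    The classic (steady-state) flux balance analysis that uFBA extends is a linear program: maximize an objective flux subject to mass balance S v = 0 and flux bounds. A toy three-reaction network solved with SciPy (illustrative only, not the paper's red-blood-cell model):

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix. Rows: metabolites A, B.
# Columns: v1 = uptake of A, v2 = A -> B, v3 = export of B.
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])

# FBA: maximise export flux v3 subject to S v = 0 and uptake limit v1 <= 10.
# linprog minimises, so the objective is negated.
res = linprog(c=[0, 0, -1],
              A_eq=S, b_eq=[0, 0],
              bounds=[(0, 10), (0, None), (0, None)],
              method="highs")
flux = res.x   # optimal flux distribution
```

    The mass-balance constraint forces v1 = v2 = v3, so the uptake bound caps the export at 10. uFBA replaces the strict S v = 0 with terms derived from measured metabolite time courses, but the LP skeleton is the same.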

  12. Elucidating dynamic metabolic physiology through network integration of quantitative time-course metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bordbar, Aarash; Yurkovich, James T.; Paglia, Giuseppe

    In this study, the increasing availability of metabolomics data necessitates novel methods for deeper data analysis and interpretation. We present a flux balance analysis method that allows for the computation of dynamic intracellular metabolic changes at the cellular scale through integration of time-course absolute quantitative metabolomics. This approach, termed “unsteady-state flux balance analysis” (uFBA), is applied to four cellular systems: three dynamic and one steady-state as a negative control. uFBA and FBA predictions are contrasted, and uFBA is found to be more accurate in predicting dynamic metabolic flux states for red blood cells, platelets, and Saccharomyces cerevisiae. Notably, only uFBA predicts that stored red blood cells metabolize TCA intermediates to regenerate important cofactors, such as ATP, NADH, and NADPH. These pathway usage predictions were subsequently validated through 13C isotopic labeling and metabolic flux analysis in stored red blood cells. Utilizing time-course metabolomics data, uFBA provides an accurate method to predict metabolic physiology at the cellular scale for dynamic systems.

  13. 75 FR 54117 - Building Energy Standards Program: Preliminary Determination Regarding Energy Efficiency...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... Response to Comments on Previous Analysis C. Summary of the Comparative Analysis 1. Quantitative Analysis 2... preliminary quantitative analysis are specific building designs, in most cases with specific spaces defined... preliminary determination. C. Summary of the Comparative Analysis DOE carried out both a broad quantitative...

  14. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework

    PubMed Central

    Anguera, M. Teresa; Portell, Mariona; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana

    2018-01-01

    Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet the requirement of flexibility and provide new perspectives on data integration in the study of biopsychosocial aspects in everyday contexts. PMID:29441028

  15. An image analysis system for near-infrared (NIR) fluorescence lymph imaging

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.

    2011-03-01

    Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of a NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
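    The propulsion-frequency step described above amounts to counting transient intensity peaks in a fluorescence time series. A minimal sketch under assumptions of my own (the synthetic trace, the prominence threshold, and the function name are all illustrative, not the paper's algorithm):

```python
import numpy as np
from scipy.signal import find_peaks

def propulsion_frequency(intensity, fps):
    """Estimate propulsion frequency (events per minute) from a fluorescence
    intensity time series by counting prominent transient peaks."""
    peaks, _ = find_peaks(intensity, prominence=0.5 * intensity.std())
    duration_min = len(intensity) / fps / 60.0
    return len(peaks) / duration_min

# synthetic trace: three well-separated propulsion events over 60 s at 10 fps
t = np.linspace(0.0, 60.0, 600)
trace = np.exp(-((t[:, None] - np.array([10.0, 30.0, 50.0])) ** 2) / 4.0).sum(axis=1)
freq = propulsion_frequency(trace, fps=10)
```

    Real traces would first need the motion stabilization the paper describes; a prominence threshold tied to the trace's variability is one simple way to reject noise peaks.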

  16. The iFly Tracking System for an Automated Locomotor and Behavioural Analysis of Drosophila melanogaster

    PubMed Central

    Kohlhoff, Kai J.; Jahn, Thomas R.; Lomas, David A.; Dobson, Christopher M.; Crowther, Damian C.; Vendruscolo, Michele

    2016-01-01

    The use of animal models in medical research provides insights into molecular and cellular mechanisms of human disease, and helps identify and test novel therapeutic strategies. Drosophila melanogaster, the common fruit fly, is one of the most established model organisms, as its study can be performed more readily and with far less expense than for other model animal systems, such as mice, fish, or indeed primates. In the case of fruit flies, standard assays are based on the analysis of longevity and basic locomotor functions. Here we present the iFly tracking system, which increases the amount of quantitative information that can be extracted from these studies while significantly reducing their duration and cost. The iFly system uses a single camera to simultaneously track the trajectories of up to 20 individual flies with about 100 μm spatial and 33 ms temporal resolution. The statistical analysis of fly movements recorded with such accuracy makes it possible to perform a rapid and fully automated quantitative analysis of locomotor changes in response to a range of different stimuli. We anticipate that the iFly method will considerably reduce the costs and duration of testing genetic and pharmacological interventions in Drosophila models, including earlier detection of behavioural changes and a large increase in throughput compared to current longevity and locomotor assays. PMID:21698336

  17. Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.

    DTIC Science & Technology

    1996-08-12

    Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with...up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique...We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and

  18. Reference clock parameters for digital communications systems applications

    NASA Technical Reports Server (NTRS)

    Kartaschoff, P.

    1981-01-01

    The basic parameters relevant to the design of network timing systems describe the random and systematic time departures of the system elements, i.e., master (or reference) clocks, transmission links, and other clocks controlled over the links. The quantitative relations between these parameters were established and illustrated by means of numerical examples based on available measured data. The examples were limited to a simple PLL control system but the analysis can eventually be applied to more sophisticated systems at the cost of increased computational effort.
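    The random time departures of clocks mentioned above are conventionally characterized by the Allan variance. The abstract does not name that statistic, so the following is a hedged illustration rather than the paper's analysis: a non-overlapping Allan deviation of simulated white frequency noise.

```python
import numpy as np

def allan_deviation(y):
    """Non-overlapping Allan deviation of fractional-frequency samples y,
    evaluated at the basic sampling interval (tau = tau0):
    sigma_y = sqrt(0.5 * mean((y[i+1] - y[i])^2))."""
    dy = np.diff(y)
    return np.sqrt(0.5 * np.mean(dy ** 2))

# simulated clock with pure white frequency noise at the 1e-11 level;
# for this noise type the Allan deviation recovers the noise level
rng = np.random.default_rng(1)
y = 1e-11 * rng.standard_normal(100_000)
adev = allan_deviation(y)
```

    Extending the averaging interval tau traces out the familiar stability curve whose slope distinguishes white, flicker, and random-walk noise in master clocks and controlled links.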

  19. Quantitative Analysis of Food and Feed Samples with Droplet Digital PCR

    PubMed Central

    Morisset, Dany; Štebih, Dejan; Milavec, Mojca; Gruden, Kristina; Žel, Jana

    2013-01-01

    In this study, the applicability of droplet digital PCR (ddPCR) for routine analysis of food and feed samples was demonstrated with the quantification of genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (qPCR) is currently used for quantitative molecular analysis of the presence of GMOs in products. However, its use is limited for detecting and quantifying very small numbers of DNA targets, as in some complex food and feed matrices. Using a ddPCR duplex assay, we have measured the absolute numbers of MON810 transgene and hmg maize reference gene copies in DNA samples. Key performance parameters of the assay were determined. The ddPCR system is shown to offer precise absolute and relative quantification of targets, without the need for calibration curves. The sensitivity (five target DNA copies) of the ddPCR assay compares well with those of individual qPCR assays and of the chamber digital PCR (cdPCR) approach. It offers a dynamic range over four orders of magnitude, greater than that of cdPCR. Moreover, when compared to qPCR, the ddPCR assay showed better repeatability at low target concentrations and a greater tolerance to inhibitors. Finally, ddPCR throughput and cost are advantageous relative to those of qPCR for routine GMO quantification. It is thus concluded that ddPCR technology can be applied for routine quantification of GMOs, or any other domain where quantitative analysis of food and feed samples is needed. PMID:23658750
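    The calibration-free absolute quantification in ddPCR comes from Poisson statistics: droplet occupancy is Poisson-distributed, so the mean copies per droplet follows from the fraction of negative droplets alone. A sketch of that standard correction (the droplet volume and counts are assumed example values, not figures from this study):

```python
import math

def copies_per_microlitre(positive, total, droplet_volume_nl=0.85):
    """Estimate target concentration from droplet counts.

    Because droplet occupancy is Poisson-distributed, the mean number of
    copies per droplet is lambda = -ln(fraction of negative droplets).
    droplet_volume_nl (~0.85 nL here) is an assumed instrument constant.
    """
    lam = -math.log(1.0 - positive / total)
    return lam / (droplet_volume_nl * 1e-3)   # nL -> uL

# example: 5,000 positive droplets out of 20,000
conc = copies_per_microlitre(positive=5000, total=20000)
```

    The logarithm corrects for droplets that received more than one copy, which is why ddPCR stays linear over several orders of magnitude without a standard curve.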

  20. [A novel approach to NIR spectral quantitative analysis: semi-supervised least-squares support vector regression machine].

    PubMed

    Li, Lin; Xu, Shuo; An, Xin; Zhang, Lu-Da

    2011-10-01

    In near-infrared spectral quantitative analysis, the precision of the measured samples' chemical values sets the theoretical limit on the precision of quantitative analysis with mathematical models. However, the number of samples whose chemical values can be obtained accurately is small. Many models exclude samples without chemical values, considering only samples with chemical values when modeling the contents of sample compositions. To address this problem, a semi-supervised LS-SVR (S²LS-SVR) model is proposed on the basis of LS-SVR, which can utilize samples without chemical values as well as those with chemical values. As with LS-SVR, training this model is equivalent to solving a linear system. Finally, samples of flue-cured tobacco were taken as the experimental material, and corresponding quantitative analysis models were constructed for the contents of four sample compositions (total sugar, reducing sugar, total nitrogen and nicotine) with PLS regression, LS-SVR and S²LS-SVR. For the S²LS-SVR model, the average relative errors between actual and predicted values for the four compositions' contents are 6.62%, 7.56%, 6.11% and 8.20%, respectively, and the correlation coefficients are 0.9741, 0.9733, 0.9230 and 0.9486, respectively. Experimental results show that the S²LS-SVR model outperforms the other two, which verifies the feasibility and efficiency of the S²LS-SVR model.
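    The claim that training reduces to solving a linear system can be illustrated with plain supervised LS-SVR, whose dual solution comes from one symmetric bordered system; the semi-supervised variant augments this same system with terms for the unlabeled samples. All parameter values below are illustrative.

```python
import numpy as np

def lssvr_fit(X, y, gamma=100.0, sigma=0.3):
    """Plain LS-SVR with an RBF kernel: training amounts to solving one
    linear system for the bias b and dual weights alpha:
        [0   1^T      ] [b    ]   [0]
        [1   K + I/g  ] [alpha] = [y]
    """
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma ** 2))        # RBF kernel matrix
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma         # ridge term 1/gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]

    def predict(Xq):
        sq = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-sq / (2 * sigma ** 2)) @ alpha + b
    return predict

# fit a smooth 1-D function from 20 labelled samples
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = X.ravel() ** 2
fit = lssvr_fit(X, y)(X)
```

    Unlike standard SVR, there is no quadratic program to solve, which is what makes the semi-supervised extension (adding unlabeled rows to the same system) tractable.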

  1. GiA Roots: software for the high throughput analysis of plant root system architecture.

    PubMed

    Galkovskyi, Taras; Mileyko, Yuriy; Bucksch, Alexander; Moore, Brad; Symonova, Olga; Price, Charles A; Topp, Christopher N; Iyer-Pascuzzi, Anjali S; Zurek, Paul R; Fang, Suqin; Harer, John; Benfey, Philip N; Weitz, Joshua S

    2012-07-26

    Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis.

  2. Analytical Chemical Sensing in the Submillimeter/terahertz Spectral Range

    NASA Astrophysics Data System (ADS)

    Moran, Benjamin L.; Fosnight, Alyssa M.; Medvedev, Ivan R.; Neese, Christopher F.

    2012-06-01

    A highly sensitive and selective terahertz sensor used to quantitatively analyze a complex mixture of volatile organic compounds is reported. To best demonstrate the analytical capabilities of THz chemical sensors, we chose to perform quantitative analysis of a certified gas mixture using a novel prototype chemical sensor that couples a commercial preconcentration system (Entech 7100A) to a high-resolution THz spectrometer. We selected a Method TO-14A certified mixture of 39 volatile organic compounds (VOCs) diluted to 1 part per million (ppm) in nitrogen; 26 of the 39 chemicals were identified by us as suitable for THz spectroscopic detection. The Entech 7100A system is designed and marketed as an inlet system for gas chromatography-mass spectrometry (GC-MS) instruments, with a specific focus on the TO-14 and TO-15 EPA sampling methods. Its preconcentration efficiency is high for the 39 chemicals in the mixture used for this study, and our preliminary results confirm this. Here we present the results of this study, which serves as the basis for our ongoing research in environmental sensing and analysis of exhaled human breath.

  3. Parallel labeling experiments for pathway elucidation and (13)C metabolic flux analysis.

    PubMed

    Antoniewicz, Maciek R

    2015-12-01

    Metabolic pathway models provide the foundation for quantitative studies of cellular physiology through the measurement of intracellular metabolic fluxes. For model organisms, metabolic models are well established, with many manually curated genome-scale model reconstructions, gene knockout studies and stable-isotope tracing studies. However, for non-model organisms a similar level of knowledge is often lacking. Compartmentation of cellular metabolism in eukaryotic systems also presents significant challenges for quantitative (13)C-metabolic flux analysis ((13)C-MFA). Recently, innovative (13)C-MFA approaches have been developed based on parallel labeling experiments, the use of multiple isotopic tracers and integrated data analysis, which allow more rigorous validation of pathway models and improved quantification of metabolic fluxes. Applications of these approaches open new research directions in metabolic engineering, biotechnology and medicine. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Combined Falling Drop/Open Port Sampling Interface System for Automated Flow Injection Mass Spectrometry

    DOE PAGES

    Van Berkel, Gary J.; Kertesz, Vilmos; Orcutt, Matt; ...

    2017-11-07

    The aim of this work was to demonstrate and to evaluate the analytical performance of a combined falling drop/open port sampling interface (OPSI) system as a simple noncontact, no-carryover, automated system for flow injection analysis with mass spectrometry. The falling sample drops were introduced into the OPSI using a widely available autosampler platform utilizing low cost disposable pipet tips and conventional disposable microtiter well plates. The volume of the drops that fell onto the OPSI was in the 7–15 μL range with an injected sample volume of several hundred nanoliters. Sample drop height, positioning of the internal capillary on the sampling end of the probe, and carrier solvent flow rate were optimized for maximum signal. Sample throughput, signal reproducibility, matrix effects, and quantitative analysis capability of the system were established using the drug molecule propranolol and its isotope labeled internal standard in water, unprocessed river water and two commercially available buffer matrices. A sample-to-sample throughput of ~45 s with a ~4.5 s base-to-base flow injection peak profile was obtained in these experiments. In addition, quantitation with minimally processed rat plasma samples was demonstrated with three different statin drugs (atorvastatin, rosuvastatin, and fluvastatin). Direct characterization capability of unprocessed samples was demonstrated by the analysis of neat vegetable oils. Employing the autosampler system for spatially resolved liquid extraction surface sampling exemplified by the analysis of propranolol and its hydroxypropranolol glucuronide phase II metabolites from a rat thin tissue section was also illustrated.

  5. Modeling and analysis of cell membrane systems with probabilistic model checking

    PubMed Central

    2011-01-01

    Background Recently there has been a growing interest in the application of Probabilistic Model Checking (PMC) for the formal specification of biological systems. PMC is able to exhaustively explore all states of a stochastic model and can provide valuable insights into its behavior that are more difficult to obtain using only traditional methods for system analysis such as deterministic and stochastic simulation. In this work we propose a stochastic model for the description and analysis of the sodium-potassium exchange pump. The sodium-potassium pump is a membrane transport system present in all animal cells and capable of moving sodium and potassium ions against their concentration gradients. Results We present a quantitative formal specification of the pump mechanism in the PRISM language, taking into consideration a discrete chemistry approach and aspects of the Law of Mass Action. We also present an analysis of the system using quantitative properties in order to verify the pump's reversibility and understand the pump's behavior using trend labels for the transition rates of the pump reactions. Conclusions Probabilistic model checking can be used along with other well-established approaches such as simulation and differential equations to better understand pump behavior. Using PMC we can determine whether specific events happen, for example whether the potassium outside the cell is exhausted in all model traces. We can also gain a more detailed perspective on the pump's behavior, such as determining its reversibility and why its normal operation becomes slow over time. This knowledge can be used to direct experimental research and make it more efficient, leading to faster and more accurate scientific discoveries. PMID:22369714
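    The exhaustive state-space exploration that distinguishes PMC from simulation can be illustrated on a toy continuous-time Markov chain. The birth-death ion-count model below is a hypothetical stand-in for the paper's PRISM specification: build the full generator matrix and solve for the steady-state distribution directly, rather than sampling traces.

```python
import numpy as np

# Toy ion-transport CTMC: state n = number of K+ ions inside the cell,
# pump-in rate k_in per vacancy, leak-out rate k_out per ion.
# (Illustrative model only, not the paper's sodium-potassium pump.)
N, k_in, k_out = 10, 2.0, 1.0
Q = np.zeros((N + 1, N + 1))          # CTMC generator matrix
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = k_in * (N - n)  # pump one ion in
    if n > 0:
        Q[n, n - 1] = k_out * n       # one ion leaks out
    Q[n, n] = -Q[n].sum()             # diagonal balances the row

# Steady state over ALL states: solve pi Q = 0 with sum(pi) = 1
# (replace one balance equation with the normalisation condition).
A = Q.T.copy()
A[-1, :] = 1.0
b = np.zeros(N + 1)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
```

    Because every state is represented in `Q`, quantities such as "probability all internal potassium is gone" are `pi[0]` exactly, with no sampling error; this is the kind of question a model checker answers over the full state space.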

  6. Combined Falling Drop/Open Port Sampling Interface System for Automated Flow Injection Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Berkel, Gary J.; Kertesz, Vilmos; Orcutt, Matt

    The aim of this work was to demonstrate and to evaluate the analytical performance of a combined falling drop/open port sampling interface (OPSI) system as a simple noncontact, no-carryover, automated system for flow injection analysis with mass spectrometry. The falling sample drops were introduced into the OPSI using a widely available autosampler platform utilizing low cost disposable pipet tips and conventional disposable microtiter well plates. The volume of the drops that fell onto the OPSI was in the 7–15 μL range with an injected sample volume of several hundred nanoliters. Sample drop height, positioning of the internal capillary on the sampling end of the probe, and carrier solvent flow rate were optimized for maximum signal. Sample throughput, signal reproducibility, matrix effects, and quantitative analysis capability of the system were established using the drug molecule propranolol and its isotope labeled internal standard in water, unprocessed river water and two commercially available buffer matrices. A sample-to-sample throughput of ~45 s with a ~4.5 s base-to-base flow injection peak profile was obtained in these experiments. In addition, quantitation with minimally processed rat plasma samples was demonstrated with three different statin drugs (atorvastatin, rosuvastatin, and fluvastatin). Direct characterization capability of unprocessed samples was demonstrated by the analysis of neat vegetable oils. Employing the autosampler system for spatially resolved liquid extraction surface sampling exemplified by the analysis of propranolol and its hydroxypropranolol glucuronide phase II metabolites from a rat thin tissue section was also illustrated.

  7. Complete polarization characterization of single plasmonic nanoparticle enabled by a novel Dark-field Mueller matrix spectroscopy system

    PubMed Central

    Chandel, Shubham; Soni, Jalpa; Ray, Subir kumar; Das, Anwesh; Ghosh, Anirudha; Raj, Satyabrata; Ghosh, Nirmalya

    2016-01-01

    Information on the polarization properties of scattered light from plasmonic systems is of paramount importance, both for fundamental interest and for potential applications. However, such studies are severely compromised by the experimental difficulties in recording the full polarization response of plasmonic nanostructures. Here, we report on a novel Mueller matrix spectroscopic system capable of acquiring complete polarization information from a single isolated plasmonic nanoparticle/nanostructure. The outstanding issues pertaining to reliable measurements of full 4 × 4 spectroscopic scattering Mueller matrices from single nanoparticles/nanostructures are overcome by integrating an efficient Mueller matrix measurement scheme and a robust eigenvalue calibration method with a dark-field microscopic spectroscopy arrangement. The feasibility of quantitative Mueller matrix polarimetry and its potential utility are illustrated on a simple plasmonic system, that of gold nanorods. The demonstrated ability to record full polarization information over a broad wavelength range and to quantify the intrinsic plasmon polarimetry characteristics via Mueller matrix inverse analysis should lead to a novel route towards quantitative understanding, analysis, and interpretation of a number of intricate plasmonic effects, and may also prove useful for the development of novel polarization-controlled sensing schemes. PMID:27212687

  8. A simultaneous multimodal imaging system for tissue functional parameters

    NASA Astrophysics Data System (ADS)

    Ren, Wenqi; Zhang, Zhiwu; Wu, Qiang; Zhang, Shiwu; Xu, Ronald

    2014-02-01

    Simultaneous and quantitative assessment of skin functional characteristics in different modalities will facilitate diagnosis and therapy in many clinical applications, such as wound healing. However, many existing clinical practices and multimodal imaging systems are subjective, qualitative, sequential in multimodal data collection, and require co-registration between different modalities. To overcome these limitations, we developed a multimodal imaging system for quantitative, non-invasive, and simultaneous imaging of cutaneous tissue oxygenation and blood perfusion parameters. The imaging system integrated multispectral and laser speckle imaging technologies into one experimental setup. A LabVIEW interface was developed for equipment control, synchronization, and image acquisition. Advanced algorithms based on wide-gap second derivative reflectometry and laser speckle contrast analysis (LASCA) were developed for accurate reconstruction of tissue oxygenation and blood perfusion, respectively. Quantitative calibration experiments and a new style of skin-simulating phantom were designed to verify the accuracy and reliability of the imaging system. The experimental results were compared with a Moor tissue oxygenation and perfusion monitor. For in vivo testing, a post-occlusion reactive hyperemia (PORH) procedure in a human subject and an ongoing wound-healing monitoring experiment using dorsal skinfold chamber models were conducted to validate the usability of our system for dynamic detection of oxygenation and perfusion parameters. In this study, we have not only set up an advanced multimodal imaging system for cutaneous tissue oxygenation and perfusion parameters but also elucidated its potential for wound-healing assessment in clinical practice.

  9. Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.

    PubMed

    Sugino, T; Kawahira, H; Nakamura, R

    2014-09-01

    Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. Division of the surgical procedure, task progress analysis, and task efficiency analysis were done. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate task efficiency during each stage. These analysis methods were evaluated through experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of the groups revealed that skilled surgeons tended to perform the procedure in less time and within smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.

  10. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

    Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)

  11. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. An autofluorescence spectral investigation of 39 tooth samples, classified by International Caries Detection and Assessment System levels, was performed at 405 nm excitation. The major differences among the caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral characteristics (e.g., autofluorescence, optical filter, and spectral sensitivity) of our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was thus established, and it can be applied to similar imaging systems.
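    The grading rule described above reduces to thresholding a per-pixel color ratio. A minimal NumPy sketch using the R/(G + B) cut-points quoted in the abstract; the two-pixel test image is invented for illustration:

```python
import numpy as np

# Grade boundaries for the image component ratio R/(G+B), taken from the
# abstract: <0.66 sound, 0.66-1.06 early decay, 1.06-1.62 established
# decay, >1.62 severe decay.
GRADES = ["sound", "early decay", "established decay", "severe decay"]
BOUNDS = [0.66, 1.06, 1.62]

def component_ratio(rgb):
    """Per-pixel R/(G+B) ratio for an HxWx3 float image."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return r / np.clip(g + b, 1e-9, None)   # guard against division by zero

def grade_pixels(rgb):
    """Map each pixel to a caries grade index (0=sound .. 3=severe)."""
    return np.digitize(component_ratio(rgb), BOUNDS)

# Invented example: a red-dominated pixel ratios high, a green/blue one low.
img = np.array([[[0.9, 0.2, 0.1], [0.2, 0.5, 0.4]]])
print([GRADES[i] for i in grade_pixels(img).ravel()])
```

With right-open bins, `np.digitize` maps each ratio directly to one of the four grade indices.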

  12. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  13. Quantization of liver tissue in dual kVp computed tomography using linear discriminant analysis

    NASA Astrophysics Data System (ADS)

    Tkaczyk, J. Eric; Langan, David; Wu, Xiaoye; Xu, Daniel; Benson, Thomas; Pack, Jed D.; Schmitz, Andrea; Hara, Amy; Palicek, William; Licato, Paul; Leverentz, Jaynne

    2009-02-01

    Linear discriminant analysis (LDA) is applied to dual kVp CT and used for tissue characterization. The potential to quantitatively model both malignant and benign, hypo-intense liver lesions is evaluated by analysis of portal-phase, intravenous CT scan data obtained on human patients. Masses with an a priori classification are mapped to a distribution of points in basis material space. The degree of localization of tissue types in the material basis space is related both to quantum noise and to real compositional differences. The density maps are analyzed with LDA and studied with system simulations to differentiate these factors. The discriminant analysis is formulated so as to incorporate the known statistical properties of the data. Effective kVp separation and mAs relate to the precision of tissue localization. Bias in the material position is related to the degree of X-ray scatter and the partial-volume effect. Experimental data and simulations demonstrate that for single-energy (HU) imaging or image-based decomposition, pixel values of water-like tissues depend on proximity to other iodine-filled bodies. Beam-hardening errors cause a shift in image value on the scale of the very difference sought between cancerous and cystic lesions. In contrast, projection-based decomposition, or its equivalent implemented on a carefully calibrated system, can provide accurate data. On such a system, LDA may provide novel quantitative capabilities for tissue characterization in dual energy CT.
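    The core LDA step, finding the direction in basis material space that best separates two tissue classes, can be sketched with a standard two-class Fisher discriminant. The Gaussian point clouds below are invented stand-ins for the patient-derived density maps; only the discriminant math follows the textbook formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical (water-like, iodine-like) basis-material coordinates for two
# lesion classes; the real study used dual-kVp decompositions of patient data.
cystic = rng.normal([1.00, 0.02], 0.05, size=(200, 2))
cancer = rng.normal([1.05, 0.10], 0.05, size=(200, 2))

def fisher_lda(a, b):
    """Two-class Fisher discriminant: w maximizes between/within scatter."""
    sw = np.cov(a.T) + np.cov(b.T)             # pooled within-class scatter
    w = np.linalg.solve(sw, b.mean(0) - a.mean(0))
    thresh = w @ (a.mean(0) + b.mean(0)) / 2   # midpoint decision threshold
    return w, thresh

w, t = fisher_lda(cystic, cancer)
print(f"cancerous correctly classified: {(cancer @ w > t).mean():.0%}")
```

Classification accuracy here is limited by the overlap of the clouds, mirroring the abstract's point that quantum noise bounds how well tissue types localize in material space.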

  14. Foot and Ankle Kinematics and Dynamic Electromyography: Quantitative Analysis of Recovery From Peroneal Neuropathy in a Professional Football Player.

    PubMed

    Prasad, Nikhil K; Coleman Wood, Krista A; Spinner, Robert J; Kaufman, Kenton R

    The assessment of neuromuscular recovery after peripheral nerve surgery has typically been a subjective physical examination. The purpose of this report was to assess the value of gait analysis in documenting recovery quantitatively. A professional football player underwent gait analysis before and after surgery for a peroneal intraneural ganglion cyst causing a left-sided foot drop. Surface electromyography (SEMG) recording from surface electrodes and motion parameter acquisition from a computerized motion capture system consisting of 10 infrared cameras were performed simultaneously. A comparison between SEMG recordings before and after surgery showed a progression from disorganized activation in the left tibialis anterior and peroneus longus muscles to temporally appropriate activation for the phase of the gait cycle. Kinematic analysis of ankle motion planes showed resolution from a complete foot drop preoperatively to phase-appropriate dorsiflexion postoperatively. Gait analysis with dynamic SEMG and motion capture complements physical examination when assessing postoperative recovery in athletes.

  15. Use of a capillary electrophoresis instrument with laser-induced fluorescence detection for DNA quantitation. Comparison of YO-PRO-1 and PicoGreen assays.

    PubMed

    Guillo, Christelle; Ferrance, Jerome P; Landers, James P

    2006-04-28

    Highly selective and sensitive assays are required for detection and quantitation of the small masses of DNA typically encountered in clinical and forensic settings. High detection sensitivity is achieved using fluorescent labeling dyes and detection techniques such as spectrofluorometers, microplate readers and cytometers. This work describes the use of a laser-induced fluorescence (LIF) detector in conjunction with a commercial capillary electrophoresis instrument for DNA quantitation. PicoGreen and YO-PRO-1, two fluorescent DNA labeling dyes, were used to assess the potential of the system for routine DNA analysis. Linearity, reproducibility, sensitivity, limits of detection and quantitation, and sample stability were examined for the two assays. The LIF detector response was found to be linear (R2 > 0.999) and reproducible (RSD < 9%) in both cases. The PicoGreen assay displayed lower limits of detection and quantitation (20 pg and 60 pg, respectively) than the YO-PRO-1 assay (60 pg and 260 pg, respectively). Although a small variation in fluorescence was observed for the DNA/dye complexes over time, quantitation was not significantly affected and the solutions were found to be relatively stable for 80 min. The advantages of the technique include a 4- to 40-fold reduction in the volume of sample required compared to traditional assays, a 2- to 20-fold reduction in the volume of reagents consumed, fast and automated analysis, and low cost (no specific instrumentation required).
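    Detection and quantitation limits like those quoted above are commonly derived from the calibration curve using the 3.3·σ/slope and 10·σ/slope conventions (ICH-style rules; the paper does not state which rule it used, and the calibration points below are invented):

```python
import numpy as np

# Hypothetical calibration points: DNA mass (pg) vs. fluorescence counts.
mass = np.array([50., 100., 200., 400., 800.])
signal = np.array([1.06e3, 2.02e3, 4.05e3, 8.01e3, 1.60e4])

slope, intercept = np.polyfit(mass, signal, 1)
resid = signal - (slope * mass + intercept)
sigma = np.sqrt(resid @ resid / (len(mass) - 2))  # residual std, 2 dof lost

lod = 3.3 * sigma / slope   # limit of detection
loq = 10 * sigma / slope    # limit of quantitation
print(f"slope={slope:.2f}, LOD={lod:.1f} pg, LOQ={loq:.1f} pg")
```

The same fit also gives the linearity check (R²) reported in the abstract.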

  16. MIiSR: Molecular Interactions in Super-Resolution Imaging Enables the Analysis of Protein Interactions, Dynamics and Formation of Multi-protein Structures.

    PubMed

    Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan

    2015-12-01

    Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.

  17. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) the MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) a new partial volume (PV) model is built into the maximum a posteriori (MAP) segmentation scheme; 3) noise artifacts are minimized by an a priori Markov random field (MRF) penalty reflecting neighborhood correlation in the tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.
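    The MAP-with-MRF-penalty structure can be illustrated with one sweep of iterated conditional modes (ICM) under a Gaussian likelihood and a Potts-style neighborhood penalty. This is only a structural sketch: the paper's algorithm additionally models intensity inhomogeneity and partial volume, and the class means and noise level here are invented:

```python
import numpy as np

# One ICM sweep toward the MAP labeling: data term (Gaussian likelihood)
# plus a Potts-style penalty on disagreeing 4-neighbors.
means = np.array([0.2, 0.5, 0.8])    # assumed CSF / GM / WM mean intensities
sigma, beta = 0.08, 0.7              # assumed noise std and smoothness weight

def icm_sweep(image, labels):
    out = labels.copy()
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            nb = [out[j, i] for j, i in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                  if 0 <= j < image.shape[0] and 0 <= i < image.shape[1]]
            # posterior energy per class: data term + neighbor disagreement
            energy = (image[y, x] - means) ** 2 / (2 * sigma ** 2) \
                     + beta * np.array([sum(k != n for n in nb) for k in range(3)])
            out[y, x] = energy.argmin()
    return out

rng = np.random.default_rng(1)
truth = np.repeat([0, 1, 2], 8).reshape(3, 8)          # three tissue bands
image = means[truth] + rng.normal(0, sigma, truth.shape)
init = np.abs(image[..., None] - means).argmin(-1)     # max-likelihood start
seg = icm_sweep(image, init)
print(f"agreement with truth: {(seg == truth).mean():.0%}")
```

The MRF term is what pulls isolated noisy pixels toward their neighbors' label, the role the abstract assigns to the a priori penalty.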

  18. Mass-spectrometric analysis of hydroperoxy- and hydroxy-derivatives of cardiolipin and phosphatidylserine in cells and tissues induced by pro-apoptotic and pro-inflammatory stimuli

    PubMed Central

    Tyurin, Vladimir A.; Tyurina, Yulia Y.; Jung, Mi-Yeon; Tungekar, Muhammad A.; Wasserloos, Karla J.; Bayir, Hülya; Greenberger, Joel S.; Kochanek, Patrick M.; Shvedova, Anna A.; Pitt, Bruce; Kagan, Valerian E.

    2009-01-01

    Oxidation of two anionic phospholipids - cardiolipin (CL) in mitochondria and phosphatidylserine (PS) in extramitochondrial compartments - are important signaling events, particularly during the execution of programmed cell death and clearance of apoptotic cells. Quantitative analysis of CL and PS oxidation products is central to understanding their molecular mechanisms of action. We combined the identification of diverse phospholipid molecular species by ESI-MS with quantitative assessments of lipid hydroperoxides using a fluorescence HPLC-based protocol. We characterized CL and PS oxidation products formed in a model system (cyt c/H2O2), in apoptotic cells (neurons, pulmonary artery endothelial cells) and mouse lung under inflammatory/oxidative stress conditions (hyperoxia, inhalation of single walled carbon nanotubes). Our results demonstrate the usefulness of this approach for quantitative assessments, identification of individual molecular species and structural characterization of anionic phospholipids that are involved in oxidative modification in cells and tissues. PMID:19328050

  19. Quantitative body fluid proteomics in medicine - A focus on minimal invasiveness.

    PubMed

    Csősz, Éva; Kalló, Gergő; Márkus, Bernadett; Deák, Eszter; Csutak, Adrienne; Tőzsér, József

    2017-02-05

    Identification of new biomarkers specific for various pathological conditions is an important field in medical sciences. Body fluids have emerging potential in biomarker studies especially those which are continuously available and can be collected by non-invasive means. Changes in the protein composition of body fluids such as tears, saliva, sweat, etc. may provide information on both local and systemic conditions of medical relevance. In this review, our aim is to discuss the quantitative proteomics techniques used in biomarker studies, and to present advances in quantitative body fluid proteomics of non-invasively collectable body fluids with relevance to biomarker identification. The advantages and limitations of the widely used quantitative proteomics techniques are also presented. Based on the reviewed literature, we suggest an ideal pipeline for body fluid analyses aiming at biomarkers discoveries: starting from identification of biomarker candidates by shotgun quantitative proteomics or protein arrays, through verification of potential biomarkers by targeted mass spectrometry, to the antibody-based validation of biomarkers. The importance of body fluids as a rich source of biomarkers is discussed. Quantitative proteomics is a challenging part of proteomics applications. The body fluids collected by non-invasive means have high relevance in medicine; they are good sources for biomarkers used in establishing the diagnosis, follow up of disease progression and predicting high risk groups. The review presents the most widely used quantitative proteomics techniques in body fluid analysis and lists the potential biomarkers identified in tears, saliva, sweat, nasal mucus and urine for local and systemic diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. The effects of AVIRIS atmospheric calibration methodology on identification and quantitative mapping of surface mineralogy, Drum Mountains, Utah

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dwyer, John L.

    1993-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different techniques for calibrating AVIRIS data to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-based calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.

  1. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust that has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the considerable promise of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
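    The proposed quantitative link, from mass extracted off a filter to a time-averaged air concentration, amounts to dividing by the volume of air the filter actually processed. A sketch with invented symbols and example values (the abstract does not give an explicit equation):

```python
# Convert contaminant mass extracted from HVAC filter dust into a
# time-averaged airborne concentration. All parameter names and the example
# numbers are illustrative assumptions, not values reported in the study.

def time_averaged_concentration(mass_ug, flow_m3_h, hours, runtime_frac, efficiency):
    """C = m / (Q * t * f * eta): mass over total air volume filtered,
    corrected for system duty cycle and filter capture efficiency."""
    volume_m3 = flow_m3_h * hours * runtime_frac
    return mass_ug / (volume_m3 * efficiency)

# 120 ug extracted; 1500 m3/h system running 30% of a 90-day deployment;
# assumed 60% capture efficiency for the particle size of interest.
c = time_averaged_concentration(120, 1500, 90 * 24, 0.30, 0.60)
print(f"{c * 1000:.3f} ng/m3")
```

The duty-cycle and capture-efficiency corrections are exactly where a simple filter-mass measurement stops being quantitative without additional building data.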

  2. Quantitative analysis of volatile organic compounds using ion mobility spectra and cascade correlation neural networks

    NASA Technical Reports Server (NTRS)

    Harrington, Peter de B.; Zheng, Peng

    1995-01-01

    Ion Mobility Spectrometry (IMS) is a powerful technique for trace organic analysis in the gas phase. Quantitative measurements are difficult, because IMS has a limited linear range. Factors that may affect the instrument response include pressure, temperature, and humidity. Nonlinear calibration methods, such as neural networks, may be ideally suited for IMS. Neural networks have the capability of modeling complex systems, but many suffer from long training times and overfitting. Cascade correlation neural networks train at very fast rates. They also build their own topology, that is, the number of layers and the number of units in each layer. By controlling the decay parameter during training, reproducible and general models may be obtained.

  3. Quantitative analysis of serotonin secreted by human embryonic stem cells-derived serotonergic neurons via pH-mediated online stacking-CE-ESI-MRM.

    PubMed

    Zhong, Xuefei; Hao, Ling; Lu, Jianfeng; Ye, Hui; Zhang, Su-Chun; Li, Lingjun

    2016-04-01

    A CE-ESI-MRM-based assay was developed for targeted analysis of serotonin released by human embryonic stem cell-derived serotonergic neurons in a chemically defined environment. A discontinuous electrolyte system was optimized for pH-mediated online stacking of serotonin. Combined with a liquid-liquid extraction procedure, the LOD of serotonin in Krebs'-Ringer's solution by CE-ESI-MS/MS on a 3D ion trap MS was 0.15 ng/mL. The quantitative results confirmed the serotonergic identity of the in vitro developed neurons and the capacity of these neurons to release serotonin in response to stimulus. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Detecting ordered small molecule drug aggregates in live macrophages: a multi-parameter microscope image data acquisition and analysis strategy

    PubMed Central

    Rzeczycki, Phillip; Yoon, Gi Sang; Keswani, Rahul K.; Sud, Sudha; Stringer, Kathleen A.; Rosania, Gus R.

    2017-01-01

    Following prolonged administration, certain orally bioavailable but poorly soluble small molecule drugs are prone to precipitate out and form crystal-like drug inclusions (CLDIs) within the cells of living organisms. In this research, we present a quantitative multi-parameter imaging platform for measuring the fluorescence and polarization diattenuation signals of cells harboring intracellular CLDIs. To validate the imaging system, the FDA-approved drug clofazimine (CFZ) was used as a model compound. Our results demonstrated that a quantitative multi-parameter microscopy image analysis platform can be used to study drug sequestering macrophages, and to detect the formation of ordered molecular aggregates formed by poorly soluble small molecule drugs in animals. PMID:28270989

  5. Three-dimensional characterization of pigment dispersion in dried paint films using focused ion beam-scanning electron microscopy.

    PubMed

    Lin, Jui-Ching; Heeschen, William; Reffner, John; Hook, John

    2012-04-01

    The combination of integrated focused ion beam-scanning electron microscope (FIB-SEM) serial sectioning and imaging techniques with image analysis provided quantitative characterization of three-dimensional (3D) pigment dispersion in dried paint films. The focused ion beam in a FIB-SEM dual beam system enables great control in slicing paints, and the sectioning process can be synchronized with SEM imaging providing high quality serial cross-section images for 3D reconstruction. Application of Euclidean distance map and ultimate eroded points image analysis methods can provide quantitative characterization of 3D particle distribution. It is concluded that 3D measurement of binder distribution in paints is effective to characterize the order of pigment dispersion in dried paint films.
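    The Euclidean distance map / ultimate eroded points analysis can be sketched in pure NumPy (scipy.ndimage provides fast equivalents of both steps); the toy image of two touching "particles" is invented:

```python
import numpy as np

def distance_map(binary):
    """Brute-force Euclidean distance map: for each foreground pixel, the
    distance to the nearest background pixel."""
    bg = np.argwhere(~binary)
    d = np.zeros(binary.shape)
    for y, x in np.argwhere(binary):
        d[y, x] = np.sqrt(((bg - (y, x)) ** 2).sum(1)).min()
    return d

def ultimate_eroded_points(dist):
    """Strict local maxima of the distance map; these approximate particle
    centers even when particles touch (plateaus are not handled here)."""
    pad = np.pad(dist, 1)
    h, w = dist.shape
    neigh = np.stack([pad[1+dy:1+dy+h, 1+dx:1+dx+w]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                      if (dy, dx) != (0, 0)])
    return (dist > 0) & (dist > neigh.max(0))

# Two square "pigment particles" joined by a thin bridge where they touch.
img = np.zeros((7, 13), bool)
img[2:5, 2:5] = True
img[2:5, 8:11] = True
img[3, 5:8] = True
print(np.argwhere(ultimate_eroded_points(distance_map(img))))
```

Even though the two squares form one connected component, the distance map peaks once inside each, which is why ultimate eroded points can separate touching pigment particles.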

  6. Estimating weak ratiometric signals in imaging data. II. Meta-analysis with multiple, dual-channel datasets.

    PubMed

    Sornborger, Andrew; Broder, Josef; Majumder, Anirban; Srinivasamoorthy, Ganesh; Porter, Erika; Reagin, Sean S; Keith, Charles; Lauderdale, James D

    2008-09-01

    Ratiometric fluorescent indicators are used for making quantitative measurements of a variety of physiological variables. Their utility is often limited by noise. This is the second in a series of papers describing statistical methods for denoising ratiometric data with the aim of obtaining improved quantitative estimates of variables of interest. Here, we outline a statistical optimization method that is designed for the analysis of ratiometric imaging data in which multiple measurements have been taken of systems responding to the same stimulation protocol. This method takes advantage of correlated information across multiple datasets for objectively detecting and estimating ratiometric signals. We demonstrate our method by showing results of its application on multiple, ratiometric calcium imaging experiments.

  7. Detecting ordered small molecule drug aggregates in live macrophages: a multi-parameter microscope image data acquisition and analysis strategy.

    PubMed

    Rzeczycki, Phillip; Yoon, Gi Sang; Keswani, Rahul K; Sud, Sudha; Stringer, Kathleen A; Rosania, Gus R

    2017-02-01

    Following prolonged administration, certain orally bioavailable but poorly soluble small molecule drugs are prone to precipitate out and form crystal-like drug inclusions (CLDIs) within the cells of living organisms. In this research, we present a quantitative multi-parameter imaging platform for measuring the fluorescence and polarization diattenuation signals of cells harboring intracellular CLDIs. To validate the imaging system, the FDA-approved drug clofazimine (CFZ) was used as a model compound. Our results demonstrated that a quantitative multi-parameter microscopy image analysis platform can be used to study drug sequestering macrophages, and to detect the formation of ordered molecular aggregates formed by poorly soluble small molecule drugs in animals.

  8. ELISA-BASE: An Integrated Bioinformatics Tool for Analyzing and Tracking ELISA Microarray Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Collett, James L.; Seurynck-Servoss, Shannon L.

    ELISA-BASE is an open-source database for capturing, organizing and analyzing protein enzyme-linked immunosorbent assay (ELISA) microarray data. ELISA-BASE is an extension of the BioArray Software Environment (BASE) database system, which was developed for DNA microarrays. In order to make BASE suitable for protein microarray experiments, we developed several plugins for importing and analyzing quantitative ELISA microarray data. Most notably, our Protein Microarray Analysis Tool (ProMAT) for processing quantitative ELISA data is now available as a plugin to the database.

  9. N-nitrosamines as "special case" leachables in a metered dose inhaler drug product.

    PubMed

    Norwood, Daniel L; Mullis, James O; Feinberg, Thomas N; Davis, Letha K

    2009-01-01

    N-nitrosamines are chemical entities, some of which are considered to be possible human carcinogens, which can be found at trace levels in some types of foods, tobacco smoke, certain cosmetics, and certain types of rubber. N-nitrosamines are of regulatory concern as leachables in inhalation drug products, particularly metered dose inhalers, which incorporate rubber seals into their container closure systems. The United States Food and Drug Administration considers N-nitrosamines (along with polycyclic aromatic hydrocarbons and 2-mercaptobenzothiazole) to be "special case" leachables in inhalation drug products, meaning that there are no recognized safety or analytical thresholds and these compounds must therefore be identified and quantitated at the lowest practical level. This report presents the development of a quantitative analytical method for target volatile N-nitrosamines in a metered dose inhaler drug product, Atrovent HFA. The method incorporates a target analyte recovery procedure from the drug product matrix with analysis by gas chromatography/thermal energy analysis detection. The capability of the method was investigated with respect to specificity, linearity/range, accuracy (linearity of recovery), precision (repeatability, intermediate precision), limits of quantitation, standard/sample stability, and system suitability. Sample analyses showed that Atrovent HFA contains no target N-nitrosamines at the trace level of 1 ng/canister.

  10. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  11. CAD system for automatic analysis of CT perfusion maps

    NASA Astrophysics Data System (ADS)

    Hachaj, T.; Ogiela, M. R.

    2011-03-01

    In this article, the authors present novel algorithms developed for a computer-aided diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). The methods perform both quantitative analysis [detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions] and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation of visualized symptoms (deciding whether a lesion is ischemic or hemorrhagic, and whether the brain tissue is at risk of infarction) is performed by so-called cognitive inference processes, which allow reasoning about the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (in the C# programming language) and can be used on any standard PC with the .NET Framework installed.
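    The left-right comparison at the heart of such asymmetry detection can be sketched in a few lines. The example below is purely illustrative (synthetic map, arbitrary 30% threshold) and is not the authors' algorithm, which additionally registers the maps against a brain anatomy atlas:

```python
import numpy as np

# Hypothetical sketch of perfusion-asymmetry detection: compare each
# pixel of a CBF map against its mirror pixel across the midline and
# flag regions whose perfusion is much lower than the contralateral
# side. Here the midline is simply the image's vertical axis.

cbf = np.full((8, 8), 50.0)          # synthetic CBF map, mL/100 g/min
cbf[2:5, 5:8] = 15.0                 # simulated right-sided hypoperfusion

mirrored = cbf[:, ::-1]              # reflect across the midline
asym = (mirrored - cbf) / np.maximum(mirrored, cbf)

lesion_mask = asym > 0.3             # >30% lower than contralateral side
print(int(lesion_mask.sum()))        # → 9 flagged pixels
```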

  12. Quantitative analysis of cellular proteome alterations in human influenza A virus-infected mammalian cell lines.

    PubMed

    Vester, Diana; Rapp, Erdmann; Gade, Dörte; Genzel, Yvonne; Reichl, Udo

    2009-06-01

    In recent years, virus-host cell interactions have been investigated in numerous studies. Viral strategies for evading the innate immune response, inhibiting cellular protein synthesis, and enabling viral RNA and protein production have been described. With quantitative proteome technology, comprehensive studies of the impact of viruses on the cellular machinery of their host cells at the protein level are possible. Therefore, 2-D DIGE and nanoHPLC-nanoESI-MS/MS analysis were used to qualitatively and quantitatively determine the dynamic cellular proteome responses of two mammalian cell lines to human influenza A virus infection. A cell line used for vaccine production (MDCK) was compared with a human lung carcinoma cell line (A549) as a reference model. Analyzing 2-D gels of the proteomes of uninfected and influenza-infected host cells, 16 quantitatively altered protein spots (at least +/-1.7-fold change in relative abundance, p<0.001) were identified for both cell lines. The most significant changes were found for keratins, major components of the cytoskeleton, and for Mx proteins, interferon-induced key components of the host cell defense. Time-series analysis of infection processes allowed the identification of further proteins described as being involved in protein synthesis, signal transduction, and apoptosis. Most likely, these proteins are required for supporting functions during the influenza viral life cycle or the host cell stress response. Quantitative proteome-wide profiling of virus infection can provide insights into the complexity and dynamics of virus-host cell interactions and may accelerate antiviral research and support optimization of vaccine manufacturing processes.

  13. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a six-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
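    The standard-curve step common to these kits can be illustrated with a toy calculation. The concentrations and peak-area ratios below are hypothetical, and the regression is plain ordinary least squares rather than the weighted fit a production assay might use:

```python
# Illustrative standard-curve quantitation for an MRM assay.
# Known spiked SIS-peptide concentrations and measured peak-area
# ratios define a linear calibration curve; an unknown sample is
# then read off the fitted line. All numbers are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Calibration points: concentration (fmol/uL) -> peak-area ratio
conc = [1.0, 5.0, 10.0, 50.0, 100.0]
ratio = [0.021, 0.102, 0.199, 1.010, 1.985]

slope, intercept = fit_line(conc, ratio)

def quantify(measured_ratio):
    """Invert the calibration curve to estimate concentration."""
    return (measured_ratio - intercept) / slope

print(round(quantify(0.5), 1))  # → 25.0
```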

  14. Modelling default and likelihood reasoning as probabilistic reasoning

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals, in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  15. Quantitative HDL Proteomics Identifies Peroxiredoxin-6 as a Biomarker of Human Abdominal Aortic Aneurysm

    PubMed Central

    Burillo, Elena; Jorge, Inmaculada; Martínez-López, Diego; Camafeita, Emilio; Blanco-Colio, Luis Miguel; Trevisan-Herraz, Marco; Ezkurdia, Iakes; Egido, Jesús; Michel, Jean-Baptiste; Meilhac, Olivier; Vázquez, Jesús; Martin-Ventura, Jose Luis

    2016-01-01

    High-density lipoproteins (HDLs) are complex protein and lipid assemblies whose composition is known to change in diverse pathological situations. Analysis of the HDL proteome can thus provide insight into the main mechanisms underlying abdominal aortic aneurysm (AAA) and potentially detect novel systemic biomarkers. We performed a multiplexed quantitative proteomics analysis of HDLs isolated from plasma of AAA patients (N = 14) and control study participants (N = 7). Validation was performed by western blot (HDL), immunohistochemistry (tissue), and ELISA (plasma). HDL from AAA patients showed elevated expression of peroxiredoxin-6 (PRDX6), HLA class I histocompatibility antigen (HLA-I), retinol-binding protein 4, and paraoxonase/arylesterase 1 (PON1), whereas α-2 macroglobulin and C4b-binding protein were decreased. The main pathways associated with HDL alterations in AAA were oxidative stress and immune-inflammatory responses. In AAA tissue, PRDX6 colocalized with neutrophils, vascular smooth muscle cells, and lipid oxidation. Moreover, plasma PRDX6 was higher in AAA (N = 47) than in controls (N = 27), reflecting increased systemic oxidative stress. Finally, a positive correlation was recorded between PRDX6 and AAA diameter. The analysis of the HDL proteome demonstrates that redox imbalance is a major mechanism in AAA, identifying the antioxidant PRDX6 as a novel systemic biomarker of AAA. PMID:27934969

  16. Python for Information Theoretic Analysis of Neural Data

    PubMed Central

    Ince, Robin A. A.; Petersen, Rasmus S.; Swan, Daniel C.; Panzeri, Stefano

    2008-01-01

    Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. PMID:19242557
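    As a minimal illustration of the quantities involved, the plug-in estimator of mutual information can be written directly from its definition. This naive estimator suffers from exactly the limited-sampling bias discussed above, and the toy data are hypothetical:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in (maximum-likelihood) estimate of I(S;R) in bits from a
    list of (stimulus, response) samples. The naive estimator is biased
    upward for limited sample sizes, which is one of the problems that
    dedicated toolkits address with bias-correction procedures."""
    n = len(pairs)
    pxy = Counter(pairs)                    # joint counts
    px = Counter(s for s, _ in pairs)       # stimulus marginal counts
    py = Counter(r for _, r in pairs)       # response marginal counts
    mi = 0.0
    for (s, r), c in pxy.items():
        p_sr = c / n
        # p_sr / (p_s * p_r) simplifies to c*n / (count_s * count_r)
        mi += p_sr * math.log2(c * n / (px[s] * py[r]))
    return mi

# A perfectly informative toy code: the response equals the stimulus.
samples = [(0, 0), (1, 1)] * 50
print(mutual_information(samples))  # → 1.0 bit
```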

  17. Delivery of drugs to intracellular organelles using drug delivery systems: Analysis of research trends and targeting efficiencies.

    PubMed

    Maity, Amit Ranjan; Stepensky, David

    2015-12-30

    Targeting of drug delivery systems (DDSs) to specific intracellular organelles (i.e., subcellular targeting) has been investigated in numerous publications, but the targeting efficiency of these systems is seldom reported. We searched scientific publications in the subcellular DDS targeting field and analyzed targeting efficiency and the major formulation parameters that affect it. We identified 77 scientific publications that matched the search criteria. In the majority of these studies nanoparticle-based DDSs were applied, while liposomes, quantum dots and conjugates were used less frequently. The nucleus was the most common intracellular target, followed by the mitochondrion, endoplasmic reticulum and Golgi apparatus. In 65% of the publications, the DDSs' surface was decorated with specific targeting residues, but the efficiency of this surface decoration was not analyzed in the predominant majority of the studies. Moreover, only 23% of the analyzed publications contained quantitative data on the DDSs' subcellular targeting efficiency, while the majority of publications reported qualitative results only. From the analysis of publications in the subcellular targeting field, it appears that insufficient efforts are devoted to quantitative analysis of the major formulation parameters and of the DDSs' intracellular fate. Based on these findings, we provide recommendations for future studies in the field of organelle-specific drug delivery and targeting. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Homogeneity testing and quantitative analysis of manganese (Mn) in vitrified Mn-doped glasses by laser-induced breakdown spectroscopy (LIBS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unnikrishnan, V. K.; Nayak, Rajesh; Kartha, V. B.

    2014-09-15

    Laser-induced breakdown spectroscopy (LIBS), an atomic emission spectroscopy method, has rapidly grown into one of the best elemental analysis techniques over the past two decades. Homogeneity testing and quantitative analysis of manganese (Mn) in manganese-doped glasses have been carried out using an optimized LIBS system employing a nanosecond ultraviolet Nd:YAG laser as the source of excitation. The glass samples have been prepared using conventional vitrification methods. The laser pulse irradiance on the surface of the glass samples placed in air at atmospheric pressure was about 1.7×10^9 W/cm^2. The spatially integrated plasma emission was collected and imaged onto the spectrograph slit using an optical-fiber-based collection system. Homogeneity was checked by recording LIBS spectra from different sites on the sample surface and analyzing the elemental emission intensities for concentration determination. Validation of the observed LIBS results was done by comparison with scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX) surface elemental mapping. The analytical performance of the LIBS system has been evaluated through the correlation of the LIBS-determined concentrations of Mn with its certified values. The results are found to be in very good agreement with the certified concentrations.

  19. Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board

    DTIC Science & Technology

    2017-03-01

    due to Marines being evaluated before the end of their initial service commitment. Our research utilizes quantitative variables to analyze the...not provide detailed information why. B. LIMITATIONS The photograph analysis in this research is strictly limited to a quantitative analysis in...NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis approved for public release; distribution is unlimited.

  20. The SAM framework: modeling the effects of management factors on human behavior in risk analysis.

    PubMed

    Murphy, D M; Paté-Cornell, M E

    1996-08-01

    Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.
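    The SAM chain (management factor → action probability → system risk) can be caricatured in a few lines. All numbers and functional forms below are hypothetical; they only illustrate how a management lever propagates into a probabilistic risk model:

```python
import math

# Hypothetical sketch of the SAM three-level structure: a management
# factor (training investment) shifts the probability of an unsafe
# operator action, which in turn enters a simple risk model of the
# physical system. Numbers and functional forms are illustrative only.

def p_incorrect_action(training_level):
    """Probability an operator executes an unsafe action, assumed to
    decay exponentially with training toward an irreducible floor."""
    base = 0.10              # error rate with no training
    floor = 0.005            # irreducible human error rate
    return floor + (base - floor) * math.exp(-training_level)

def p_system_failure(training_level, p_component_fault=0.01):
    """System fails if the component faults AND the operator's response
    is incorrect (independence assumed for illustration)."""
    return p_component_fault * p_incorrect_action(training_level)

for level in (0.0, 1.0, 2.0):
    print(level, p_system_failure(level))
```

Raising the training level lowers the action-error probability and hence the system failure probability, which is the kind of organizational intervention the framework is designed to evaluate.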

  1. A gradient method for the quantitative analysis of cell movement and tissue flow and its application to the analysis of multicellular Dictyostelium development.

    PubMed

    Siegert, F; Weijer, C J; Nomura, A; Miike, H

    1994-01-01

    We describe the application of a novel image processing method, which allows quantitative analysis of cell and tissue movement in a series of digitized video images. The result is a vector velocity field showing average direction and velocity of movement for every pixel in the frame. We apply this method to the analysis of cell movement during different stages of the Dictyostelium developmental cycle. We analysed time-lapse video recordings of cell movement in single cells, mounds and slugs. The program can correctly assess the speed and direction of movement of either unlabelled or labelled cells in a time series of video images depending on the illumination conditions. Our analysis of cell movement during multicellular development shows that the entire morphogenesis of Dictyostelium is characterized by rotational cell movement. The analysis of cell and tissue movement by the velocity field method should be applicable to the analysis of morphogenetic processes in other systems such as gastrulation and neurulation in vertebrate embryos.
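    A simplified stand-in for this kind of motion analysis is block matching between consecutive frames. The authors' method is gradient-based; this sum-of-squared-differences search merely illustrates recovering a displacement vector, which divided by the frame interval gives a velocity:

```python
import numpy as np

# Estimate the displacement of an 8x8 block between two frames by
# minimizing the sum of squared differences (SSD) over a small search
# window. The second frame is the first shifted by a known (dy, dx),
# so the recovered shift can be checked. Illustrative only.

rng = np.random.default_rng(1)
frame1 = rng.random((32, 32))
dy, dx = 3, 2
frame2 = np.roll(np.roll(frame1, dy, axis=0), dx, axis=1)

block = frame1[8:16, 8:16]                 # reference block in frame 1

best, best_shift = np.inf, (0, 0)
for sy in range(-4, 5):                    # search window of +/-4 px
    for sx in range(-4, 5):
        cand = frame2[8 + sy:16 + sy, 8 + sx:16 + sx]
        ssd = float(((block - cand) ** 2).sum())
        if ssd < best:
            best, best_shift = ssd, (sy, sx)

print(best_shift)  # → (3, 2): the block moved 3 px down and 2 px right
```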

  2. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method making use of multivariate reference signals of the fused silica and sapphire Raman signals generated from a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal-reference Raman signals (e.g., the shoulder of the prominent fused silica boson peak (~130 cm(-1)) and the distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe was used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross-validation (RMSECV) of 2.5 mW was obtained for predicting laser excitation power changes based on leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) was also achieved for real-time laser power prediction when the multivariate method was applied independently to five new subjects (n = 166 spectra). We further applied the multivariate reference technique to quantitative analysis of gelatin tissue phantoms, giving an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be used to monitor and correct variations in laser excitation power and fiber coupling efficiency in situ, standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in challenging Raman endoscopic applications.
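    The multivariate calibration idea can be sketched with ordinary multivariate least squares standing in for PLS. The data are synthetic; in the paper, the predictors are the fused-silica and sapphire reference peak intensities:

```python
import numpy as np

# Simplified stand-in for the PLS approach: predict laser excitation
# power from several internal-reference peak intensities by multivariate
# least squares. Synthetic data for illustration only.

rng = np.random.default_rng(0)
power = np.array([5.0, 15.0, 25.0, 35.0, 45.0, 55.0, 65.0])  # mW
# Each reference peak scales (noisily) with excitation power:
peaks = np.column_stack([power * k for k in (1.0, 0.6, 0.3)])
peaks += rng.normal(scale=0.5, size=peaks.shape)

# Fit power ~ peaks (with intercept) on the "calibration" spectra.
X = np.column_stack([peaks, np.ones(len(power))])
coef, *_ = np.linalg.lstsq(X, power, rcond=None)

# Predict the power for a new spectrum whose peaks correspond to ~30 mW.
new_peaks = np.array([30.0, 18.0, 9.0, 1.0])
print(float(new_peaks @ coef))  # ≈ 30
```

Using several reference peaks jointly is what makes the prediction robust to noise in any single peak, which is the advantage the paper reports over the univariate reference method.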

  3. The brainstem reticular formation is a small-world, not scale-free, network

    PubMed Central

    Humphries, M.D; Gurney, K; Prescott, T.J

    2005-01-01

    Recently, it has been demonstrated that several complex systems may have simple graph-theoretic characterizations as so-called ‘small-world’ and ‘scale-free’ networks. These networks have also been applied to the gross neural connectivity between primate cortical areas and the nervous system of Caenorhabditis elegans. Here, we extend this work to a specific neural circuit of the vertebrate brain—the medial reticular formation (RF) of the brainstem—and, in doing so, we have made three key contributions. First, this work constitutes the first model (and quantitative review) of this important brain structure for over three decades. Second, we have developed the first graph-theoretic analysis of vertebrate brain connectivity at the neural network level. Third, we propose simple metrics to quantitatively assess the extent to which the networks studied are small-world or scale-free. We conclude that the medial RF is configured to create small-world (implying coherent rapid-processing capabilities), but not scale-free, type networks under assumptions which are amenable to quantitative measurement. PMID:16615219
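    The two statistics underlying the small-world assessment, mean clustering coefficient C and characteristic path length L, are easily computed for a toy undirected graph. A network is called small-world when C greatly exceeds, while L roughly matches, the values for an equivalent random graph:

```python
from collections import deque

# Toy undirected graph as adjacency sets (hypothetical, 6 neurons).
graph = {
    0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3},
    3: {0, 2, 4}, 4: {3, 5}, 5: {4},
}

def clustering(g):
    """Mean local clustering coefficient: for each node, the fraction
    of its neighbor pairs that are themselves connected."""
    cs = []
    for v, nbrs in g.items():
        k = len(nbrs)
        if k < 2:
            cs.append(0.0)
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in g[a])
        cs.append(2.0 * links / (k * (k - 1)))
    return sum(cs) / len(cs)

def path_length(g):
    """Mean shortest-path length over all connected node pairs (BFS)."""
    total, pairs = 0, 0
    for src in g:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for w in g[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(d for v, d in dist.items() if v != src)
        pairs += len(dist) - 1
    return total / pairs

print(clustering(graph), path_length(graph))
```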

  4. A database system to support image algorithm evaluation

    NASA Technical Reports Server (NTRS)

    Lien, Y. E.

    1977-01-01

    The design is given of an interactive image database system IMDB, which allows the user to create, retrieve, store, display, and manipulate images through the facility of a high-level, interactive image query (IQ) language. The query language IQ permits the user to define false color functions, pixel value transformations, overlay functions, zoom functions, and windows. The user manipulates the images through generic functions. The user can direct images to display devices for visual and qualitative analysis. Image histograms and pixel value distributions can also be computed to obtain a quantitative analysis of images.

  5. Client-server programs analysis in the EPOCA environment

    NASA Astrophysics Data System (ADS)

    Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano

    1996-09-01

    Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer has first to ensure that the implementation behaves correctly, in particular that it is deadlock free. Second, he has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.
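    The qualitative (deadlock-freedom) half of such an analysis can be sketched by enumerating the reachability graph of a tiny Petri net and checking for dead markings. The net below is a toy lock-sharing example, not EPOCA's GSPN model, and the stochastic timing/performance side is omitted:

```python
# Minimal Petri-net reachability check: explore all reachable markings
# and report "dead" markings where no transition is enabled (a dead
# marking is a deadlock). Toy net: two clients A and B share one
# server/lock token in place "free".

# transition name -> (tokens consumed per place, tokens produced)
transitions = {
    "t_acquire_A": ({"free": 1, "idle_A": 1}, {"busy_A": 1}),
    "t_release_A": ({"busy_A": 1}, {"free": 1, "idle_A": 1}),
    "t_acquire_B": ({"free": 1, "idle_B": 1}, {"busy_B": 1}),
    "t_release_B": ({"busy_B": 1}, {"free": 1, "idle_B": 1}),
}

initial = {"free": 1, "idle_A": 1, "idle_B": 1, "busy_A": 0, "busy_B": 0}

def enabled(marking, consume):
    return all(marking[p] >= n for p, n in consume.items())

def fire(marking, consume, produce):
    m = dict(marking)
    for p, n in consume.items():
        m[p] -= n
    for p, n in produce.items():
        m[p] = m.get(p, 0) + n
    return m

# Exhaustive exploration of the reachability graph.
seen, frontier, dead = set(), [initial], []
while frontier:
    m = frontier.pop()
    key = tuple(sorted(m.items()))
    if key in seen:
        continue
    seen.add(key)
    succs = [fire(m, c, p) for c, p in transitions.values() if enabled(m, c)]
    if not succs:
        dead.append(m)
    frontier.extend(succs)

print(len(seen), len(dead))  # reachable markings, dead markings (0 = deadlock-free)
```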

  6. Reader-Responses of Pregnant Adolescents and Teenage Mothers to Young Adult Novels Portraying Protagonists with Problems Similar and Dissimilar to the Readers'.

    ERIC Educational Resources Information Center

    Poe, Elizabeth Ann

    Applying reader response theory, a study explored the responses of 19 pregnant adolescents and teenage mothers to two dissimilar young adult novels, one about teenage pregnancy and one about adolescent alcoholism. Quantitative analysis, using a modified version of the Purves-Rippere (1968) system, and qualitative analysis of written answers to…

  7. Multistage Deployment of the Army Theater Hospital

    DTIC Science & Technology

    2013-12-01

    analysis of the effects that warfare tactics, casualty timing, and casualty types had on the medical treatment facility (Cecchine et al., 2001). This proved...the 44-bed mobile portion of the current CSH was potentially inadequate to support approximately four brigades in an asymmetrical warfare scenario...systems and its increased force effectiveness on the fleet. He utilized quantitative analysis with Lanchester and Hughes-Salvo models to show that

  8. Understanding responder neurobiology in schizophrenia using a quantitative systems pharmacology model: application to iloperidone.

    PubMed

    Geerts, Hugo; Roberts, Patrick; Spiros, Athan; Potkin, Steven

    2015-04-01

    The concept of targeted therapies remains a holy grail for the pharmaceutical drug industry for identifying responder populations or new drug targets. Here we provide quantitative systems pharmacology as an alternative to the more traditional approach of retrospective responder pharmacogenomics analysis and applied this to the case of iloperidone in schizophrenia. This approach implements the actual neurophysiological effect of genotypes in a computer-based, biophysically realistic model of human neuronal circuits, is parameterized with human imaging and pathology, and is calibrated by clinical data. We kept the drug pharmacology constant but allowed the biological model coupling values to fluctuate in a restricted range around their calibrated values, thereby simulating random genetic mutations and representing variability in patient response. Using hypothesis-free Design of Experiments methods, the dopamine D4R-AMPA (alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid) receptor coupling in cortical neurons was found to drive the beneficial effect of iloperidone, likely corresponding to the rs2513265 polymorphism upstream of the GRIA4 gene identified in a traditional pharmacogenomics analysis. The serotonin 5-HT3 receptor-mediated effect on interneuron gamma-aminobutyric acid conductance was identified as the process that moderately drove the differentiation of iloperidone versus ziprasidone. This paper suggests that reverse-engineered quantitative systems pharmacology is a powerful alternative tool to characterize the underlying neurobiology of a responder population and possibly identify new targets. © The Author(s) 2015.

  9. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.
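    At its linear core, flux estimation combines metabolite balances with measurements in a least-squares fit. The example below is deliberately tiny and purely illustrative; real (13)C-based flux analysis involves nonlinear isotopomer balances solved iteratively:

```python
import numpy as np

# Solve metabolite balances together with measurements by least
# squares. Toy network: one branch point where flux v1 splits into
# v2 and v3, with (noisy, hypothetical) measurements of v1 and v2.

# Rows: [branch-point balance, measurement of v1, measurement of v2]
A = np.array([
    [1.0, -1.0, -1.0],   # v1 - v2 - v3 = 0  (steady-state balance)
    [1.0,  0.0,  0.0],   # v1 measured
    [0.0,  1.0,  0.0],   # v2 measured
])
b = np.array([0.0, 10.1, 5.9])   # balance target, then measurements

v, *_ = np.linalg.lstsq(A, b, rcond=None)
print(v)   # the unmeasured v3 is recovered as v1 - v2
```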

  10. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  11. Public transportation and the nation's economy : a quantitative analysis of public transportation's economic impact

    DOT National Transportation Integrated Search

    1999-10-01

    The relationship between the strength and competitiveness of the nation's economy and the extent, condition and performance of the nation's transportation system is a topic of critical interest. There is mounting evidence that we, as a nation, are se...

  12. High content screening of ToxCast compounds using Vala Sciences’ complex cell culturing systems (SOT)

    EPA Science Inventory

    US EPA’s ToxCast research program evaluates bioactivity for thousands of chemicals utilizing high-throughput screening assays to inform chemical testing decisions. Vala Sciences provides high content, multiplexed assays that utilize quantitative cell-based digital image analysis....

  13. Anticipatory Understanding of Adversary Intent: A Signature-Based Knowledge System

    DTIC Science & Technology

    2009-06-01

    concept of logical positivism has been applied more recently to all human knowledge and reflected in current data fusion research, information mining...this work has been successfully translated into useful analytical tools that can provide a rigorous and quantitative basis for predictive analysis

  14. Using an Integrated, Multi-disciplinary Framework to Support Quantitative Microbial Risk Assessments

    EPA Science Inventory

    The Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) provides the infrastructure to link disparate models and databases seamlessly, giving an assessor the ability to construct an appropriate conceptual site model from a host of modeling choices, so a numbe...

  15. Wake Vortex Systems Cost/Benefits Analysis

    NASA Technical Reports Server (NTRS)

    Crisp, Vicki K.

    1997-01-01

    The goals of cost/benefit assessments are to provide quantitative and qualitative data to aid in the decision-making process. Benefits derived from increased throughput (or decreased delays) are used to balance life-cycle costs. Packaging technologies together may provide greater gains (demonstrating a higher return on investment).

  16. Enzyme-linked immunosorbent assay for the quantitative/qualitative analysis of plant secondary metabolites.

    PubMed

    Sakamoto, Seiichi; Putalun, Waraporn; Vimolmangkang, Sornkanok; Phoolcharoen, Waranyoo; Shoyama, Yukihiro; Tanaka, Hiroyuki; Morimoto, Satoshi

    2018-01-01

    Immunoassays are antibody-based analytical methods for quantitative/qualitative analysis. Since the principle of immunoassays is based on specific antigen-antibody reaction, the assays have been utilized worldwide for diagnosis, pharmacokinetic studies by drug monitoring, and the quality control of commercially available products. Berson and Yalow were the first to develop an immunoassay, known as radioimmunoassay (RIA), for detecting endogenous plasma insulin [1], a development for which Yalow was awarded the Nobel Prize in Physiology or Medicine in 1977. Even today, after half a century, immunoassays are widely utilized with some modifications from the originally proposed system, e.g., radioisotopes have been replaced with enzymes because of safety concerns regarding the use of radioactivity, which is referred to as enzyme immunoassay/enzyme-linked immunosorbent assay (ELISA). In addition, progress has been made in ELISA with the recent advances in recombinant DNA technology, leading to increase in the range of antibodies, probes, and even systems. This review article describes ELISA and its applications for the detection of plant secondary metabolites.
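    Quantitation in ELISA typically runs through a four-parameter logistic (4PL) standard curve; the sketch below inverts such a curve, with hypothetical fitted parameters, to recover a concentration from an absorbance reading:

```python
# 4PL standard curve for ELISA quantitation. Parameters a (maximum
# response), d (minimum response), c (EC50), and b (slope) would come
# from fitting calibrator wells; the values here are hypothetical.

def fourpl(x, a, b, c, d):
    """4PL response (e.g., absorbance) at concentration x."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inverse_4pl(y, a, b, c, d):
    """Concentration that yields response y on the fitted curve."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

a, b, c, d = 2.0, 1.0, 10.0, 0.05    # hypothetical fitted parameters
reading = fourpl(25.0, a, b, c, d)   # simulate a sample at 25 ng/mL
print(round(inverse_4pl(reading, a, b, c, d), 3))  # → 25.0
```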

  17. Reconstruction of the dynamics of the climatic system from time-series data

    PubMed Central

    Nicolis, C.; Nicolis, G.

    1986-01-01

    The oxygen isotope record of the last million years, as provided by a deep-sea sediment core, is analyzed by a method recently developed in the theory of dynamical systems. The analysis suggests that climatic variability is the manifestation of a chaotic dynamics described by an attractor of fractal dimensionality. A quantitative measure of the limited predictability of the climatic system is provided by the evaluation of the time-correlation function and the largest positive Lyapunov exponent of the system. PMID:16593650
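    The largest Lyapunov exponent mentioned above measures the exponential divergence of nearby trajectories. For a system with a known map it can be estimated as the trajectory average of log|f'(x)|; the sketch below uses the chaotic logistic map, whose exact exponent is ln 2 (the climate study instead has to estimate the exponent from the measured time series):

```python
import math

# Largest Lyapunov exponent of the logistic map x -> r*x*(1-x) as the
# long-run average of log|f'(x)| = log|r*(1-2x)| along a trajectory.
# For r = 4 the exact value is ln 2 > 0: nearby trajectories diverge
# exponentially, so predictability is fundamentally limited.

def lyapunov_logistic(r=4.0, x0=0.2, n=100_000, burn=100):
    x = x0
    for _ in range(burn):                 # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n

print(lyapunov_logistic())  # ≈ ln 2 ≈ 0.693
```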

  18. Software For Design Of Life-Support Systems

    NASA Technical Reports Server (NTRS)

    Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.

    1991-01-01

    Design Assistant Workstation (DAWN) computer program is prototype of expert software system for analysis and design of regenerative, physical/chemical life-support systems that revitalize air, reclaim water, produce food, and treat waste. Incorporates both conventional software for quantitative mathematical modeling of physical, chemical, and biological processes and expert system offering user stored knowledge about materials and processes. Constructs task tree as it leads user through simulated process, offers alternatives, and indicates where alternative not feasible. Also enables user to jump from one design level to another.

  19. Learning in Modular Systems

    DTIC Science & Technology

    2010-05-07

    important for deep modular systems is that taking a series of small update steps and stopping before convergence, so-called early stopping, is a form of regularization around the initial parameters of the system. For example, the stochastic gradient descent … 1/u + 1/v = 1, ‖x^2‖_q = ‖x‖_{2q}^2 … Aside from the overall speed of the classifier, no quantitative performance analysis was given, and the role played by the features in the larger system

  20. Improved Protein Arrays for Quantitative Systems Analysis of the Dynamics of Signaling Pathway Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Chin-Rang

    Astronauts and workers in nuclear plants who are repeatedly exposed to low doses of ionizing radiation (IR, <10 cGy) are likely to incur specific changes in signal transduction and gene expression in various tissues of their bodies. Remarkable advances in high-throughput genomics and proteomics technologies enable researchers to broaden their focus from examining single gene/protein kinetics to better understanding global gene/protein expression profiling and biological pathway analyses, namely Systems Biology. An ultimate goal of systems biology is to develop dynamic mathematical models of interacting biological systems capable of simulating living systems in a computer. This Glue Grant complements Dr. Boothman’s existing DOE grant (No. DE-FG02-06ER64186), entitled “The IGF1/IGF-1R-MAPK-Secretory Clusterin (sCLU) Pathway: Mediator of a Low Dose IR-Inducible Bystander Effect”, by developing sensitive, quantitative proteomic technology suitable for low-dose radiobiology research. An improved quantitative protein array platform utilizing linear Quantum dot signaling for systematically measuring protein levels and phosphorylation states for systems biology modeling is presented. The signals are amplified by a confocal laser Quantum dot scanner, resulting in ~1000-fold greater sensitivity than traditional Western blots, and show good linearity that is not achievable with HRP-amplified signals. This improved protein array technology is therefore suitable for detecting the weak responses produced by low-dose radiation. Software was developed to facilitate the quantitative readout of signaling network activities. Kinetics of EGFRvIII mutant signaling was analyzed to quantify cross-talk between EGFR and other signaling pathways.
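    The claimed advantage over HRP chemistry is the linearity of the Quantum-dot signal. A hedged sketch of how a linear dynamic range could be verified from a calibration series; all numbers here are made-up placeholders, not data from the grant:

```python
import numpy as np

# Hypothetical calibration series: Quantum-dot signal vs. spotted protein
# amount (arbitrary units; idealized noise-free linear response).
amounts = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
signal = 50.0 * amounts + 10.0

# Fit a straight line and compute R^2 to confirm the linear dynamic range.
slope, intercept = np.polyfit(amounts, signal, 1)
pred = slope * amounts + intercept
r2 = 1.0 - np.sum((signal - pred) ** 2) / np.sum((signal - signal.mean()) ** 2)
```

    In practice one would run such a fit per analyte and reject calibrations whose R² falls below a preset threshold before quantifying unknowns.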

  1. Quantification of Hepcidin-related Iron Accumulation in the Rat Liver.

    PubMed

    Böser, Preethne; Mordashova, Yulia; Maasland, Mark; Trommer, Isabel; Lorenz, Helga; Hafner, Mathias; Seemann, Dietmar; Mueller, Bernhard K; Popp, Andreas

    2016-02-01

    Hepcidin was originally detected as a liver peptide with antimicrobial activity and it functions as a central regulator in the systemic iron metabolism. Consequently suppression of hepcidin leads to iron accumulation in the liver. AbbVie developed a monoclonal antibody ([mAb]; repulsive guidance molecule [RGMa/c] mAb) that downregulates hepcidin expression by influencing the RGMc/bone morphogenetic protein (BMP)/neogenin receptor complex and causes iron deposition in the liver. In a dose range finding study with RGMa/c mAb, rats were treated with different dose levels for a total of 4 weekly doses. The results of this morphometric analysis in the liver showed that iron accumulation is not homogenous between liver lobes and the left lateral lobe was the most responsive lobe in the rat. Quantitative hepcidin messenger RNA analysis showed that the left lateral lobe was the most responsive lobe showing hepcidin downregulation with increasing antibody dose. In addition, the morphometric analysis had higher sensitivity than the chemical iron extraction and quantification using a colorimetric assay. In conclusion, the Prussian blue stain in combination with semi-quantitative and quantitative morphometric analysis is the most reliable method to demonstrate iron accumulation in the liver compared to direct measurement of iron in unfixed tissue using a colorimetric assay. © The Author(s) 2016.

  2. Integrative Analysis of Subcellular Quantitative Proteomics Studies Reveals Functional Cytoskeleton Membrane-Lipid Raft Interactions in Cancer.

    PubMed

    Shah, Anup D; Inder, Kerry L; Shah, Alok K; Cristino, Alexandre S; McKie, Arthur B; Gabra, Hani; Davis, Melissa J; Hill, Michelle M

    2016-10-07

    Lipid rafts are dynamic membrane microdomains that orchestrate molecular interactions and are implicated in cancer development. To understand the functions of lipid rafts in cancer, we performed an integrated analysis of quantitative lipid raft proteomics data sets modeling progression in breast cancer, melanoma, and renal cell carcinoma. This analysis revealed that cancer development is associated with increased membrane raft-cytoskeleton interactions, with ∼40% of elevated lipid raft proteins being cytoskeletal components. Previous studies suggest a potential functional role for the raft-cytoskeleton in the action of the putative tumor suppressors PTRF/Cavin-1 and Merlin. To extend the observation, we examined lipid raft proteome modulation by an unrelated tumor suppressor opioid binding protein cell-adhesion molecule (OPCML) in ovarian cancer SKOV3 cells. In agreement with the other model systems, quantitative proteomics revealed that 39% of OPCML-depleted lipid raft proteins are cytoskeletal components, with microfilaments and intermediate filaments specifically down-regulated. Furthermore, protein-protein interaction network and simulation analysis showed significantly higher interactions among cancer raft proteins compared with general human raft proteins. Collectively, these results suggest increased cytoskeleton-mediated stabilization of lipid raft domains with greater molecular interactions as a common, functional, and reversible feature of cancer cells.

  3. An approach for quantitative image quality analysis for CT

    NASA Astrophysics Data System (ADS)

    Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe

    2016-03-01

    An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis tool kit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, unlike standard principal component analysis (PCA), yields components with sparse loadings; combined with Hotelling's T2 statistical analysis, it is used to compare, qualify, and detect faults in the tested systems.
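    A minimal sketch of the PCA-plus-Hotelling-T² fault-detection idea on a synthetic matrix of image-quality metrics; the sparse-loading modification is omitted and all data are random placeholders, not measurements from the framework:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))          # 50 scans x 8 image-quality metrics (synthetic)
Xc = X - X.mean(axis=0)               # center each metric

# PCA via SVD; keep the first k components.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
scores = Xc @ Vt[:k].T                # projection onto the principal axes
var = (s[:k] ** 2) / (len(X) - 1)     # variance captured by each component

# Hotelling T^2: Mahalanobis-type distance of each scan in PC space.
t2 = np.sum(scores ** 2 / var, axis=1)
suspect = np.argsort(t2)[::-1][:3]    # scans with the largest T^2 are flagged
```

    Scans whose T² exceeds a control limit (e.g., an F-distribution quantile) would be flagged as potentially faulty and inspected.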

  4. Sensitivity analysis and multidisciplinary optimization for aircraft design: Recent advances and results

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1988-01-01

    Optimization by decomposition, complex system sensitivity analysis, and a rapid growth of disciplinary sensitivity analysis are some of the recent developments that hold promise of a quantum jump in the support engineers receive from computers in the quantitative aspects of design. Review of the salient points of these techniques is given and illustrated by examples from aircraft design as a process that combines the best of human intellect and computer power to manipulate data.

  5. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
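    The multiplexed quantification step in isobaric labeling can be illustrated with a toy reporter-ion intensity matrix; this is a generic sketch with fabricated numbers, not the authors' pipeline:

```python
import numpy as np

# Hypothetical reporter-ion intensities: rows = peptides, cols = isobaric channels.
# Channel 1 carries ~2.1x more total material than channel 0, etc.
intensities = np.array([
    [1.0e5, 2.1e5, 0.9e5, 1.1e5],
    [3.0e4, 6.3e4, 2.7e4, 3.3e4],
])

# Normalize each channel by its column sum to correct for loading differences,
# then express each channel relative to channel 0 on a log2 scale.
norm = intensities / intensities.sum(axis=0)
log2_ratio = np.log2(norm / norm[:, [0]])

# Here both peptides share the same channel pattern, so after normalization
# every log2 ratio is ~0: no differential abundance beyond loading effects.
```

    With real data, nonzero log2 ratios after normalization indicate peptides (and hence proteins) whose abundance differs between the multiplexed samples.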

  6. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  7. Soldier as a System Value Analysis

    DTIC Science & Technology

    2008-09-01

    effort is to use established quantitative methods in the development of the framework and explore possible metrics for the assessment of applicable...and ease of use is an important part of the development of the holistic Soldier system. Small details can make a big difference to a Soldier in harsh...important (Friedl and Allan, 2004). Different kinds of environmental stressors include heat, cold, hypobaric hypoxia, physical work, energy

  8. The Effects of Higher Education Funding Reforms on the Lifetime Incomes of Graduates. CEE DP 78

    ERIC Educational Resources Information Center

    Dearden, Lorraine; Fitzsimons, Emla; Goodman, Alissa; Kaplan, Greg

    2008-01-01

    This paper undertakes a quantitative analysis of substantial reforms to the system of higher education (HE) finance first announced in 2004 and then revised again in July 2007. The reforms introduced deferred fees for HE, payable by graduates through the tax system in the form of income-contingent repayments on loans subsidised by the government.…

  9. Teaching Poverty with Geographic Visualization and Geographic Information Systems (GIS): A Case Study of East Buffalo and Food Access

    ERIC Educational Resources Information Center

    Gjesfjeld, Christopher D.; Jung, Jin-Kyu

    2014-01-01

    Although various methods have been used to teach about poverty in the social work classroom (e.g., quantitative, historical, and qualitative), the use of geographic visualization and geographic information systems (GIS) has become a relatively new method. In our analysis of food access on the East Side of Buffalo, New York, we demonstrate the…

  10. Space Shuttle Main Engine Quantitative Risk Assessment: Illustrating Modeling of a Complex System with a New QRA Software Package

    NASA Technical Reports Server (NTRS)

    Smart, Christian

    1998-01-01

    During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled.
    The groundrules and other criteria were used to screen out the many failure modes that did not contribute significantly to the catastrophic risk. The Hazard Analysis and FMEA for the SSME were also used to build ESDs that show the chain of events leading from the occurrence of a failure mode to one of the following end states: catastrophic failure, engine shutdown, or successful operation (successful with respect to the failure mode under consideration).
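    The way ESD end states aggregate into a catastrophic-failure probability can be sketched numerically. The failure modes and probabilities below are entirely hypothetical placeholders, not values from the Shuttle study; each mode is modeled as an occurrence probability per flight times the conditional probability that shutdown fails:

```python
# Hypothetical event-sequence sketch: mode name -> (occurrence probability
# per flight, conditional probability that a safe shutdown does NOT occur).
modes = {
    "turbopump_blade_crack": (1e-4, 0.30),
    "combustion_instability": (5e-5, 0.80),
    "nozzle_coolant_leak":    (2e-4, 0.05),
}

# Treating modes as independent, multiply the per-mode survival probabilities.
p_ok = 1.0
for name, (p_occur, p_no_shutdown) in modes.items():
    p_ok *= 1.0 - p_occur * p_no_shutdown

p_catastrophic = 1.0 - p_ok   # ~8e-5 for these made-up numbers
```

    A QRA tool like QRAS layers uncertainty distributions and sensitivity analysis on top of this basic roll-up, but the end-state arithmetic is the same.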

  11. Quantitative Analysis of Homogeneous Electrocatalytic Reactions at IDA Electrodes: The Example of [Ni(PPh2NBn2)2]2+

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Fei; Parkinson, B. A.; Divan, Ralu

    Interdigitated array (IDA) electrodes have been applied to study EC’ (electron transfer followed by a catalytic reaction) reactions, and a new method of quantitative analysis of IDA results was developed. In this new method, currents on IDA generator and collector electrodes for an EC’ mechanism are derived from the number of redox cycles and the contribution of non-catalytic current, and the fractions of bipotential-recycling species and catalytically active species are calculated, which aids understanding of the catalytic reaction mechanism. The homogeneous hydrogen evolution reaction catalyzed by the [Ni(PPh2NBn2)2]2+ (where PPh2NBn2 is 1,5-dibenzyl-3,7-diphenyl-1,5-diaza-3,7-diphosphacyclooctane) electrocatalyst was examined and analyzed with IDA electrodes. In addition, the existence of reaction intermediates in the catalytic cycle is inferred from the electrochemical behavior of glassy carbon disk electrodes and carbon IDA electrodes. This quantitative analysis of IDA electrode cyclic voltammetry currents can be used as a simple and straightforward method for determining reaction mechanisms in other catalytic systems as well.

  12. High-throughput SISCAPA quantitation of peptides from human plasma digests by ultrafast, liquid chromatography-free mass spectrometry.

    PubMed

    Razavi, Morteza; Frick, Lauren E; LaMarr, William A; Pope, Matthew E; Miller, Christine A; Anderson, N Leigh; Pearson, Terry W

    2012-12-07

    We investigated the utility of an SPE-MS/MS platform in combination with a modified SISCAPA workflow for chromatography-free MRM analysis of proteotypic peptides in digested human plasma. This combination of SISCAPA and SPE-MS/MS technology allows sensitive, MRM-based quantification of peptides from plasma digests with a sample cycle time of ∼7 s, a 300-fold improvement over typical MRM analyses with analysis times of 30-40 min that use liquid chromatography upstream of MS. The optimized system includes capture and enrichment to near purity of target proteotypic peptides using rigorously selected, high affinity, antipeptide monoclonal antibodies and reduction of background peptides using a novel treatment of magnetic bead immunoadsorbents. Using this method, we have successfully quantitated LPS-binding protein and mesothelin (concentrations of ∼5000 ng/mL and ∼10 ng/mL, respectively) in human plasma. The method eliminates the need for upstream liquid chromatography and can be multiplexed, thus facilitating quantitative analysis of proteins, including biomarkers, in large sample sets. The method is ideal for high-throughput biomarker validation after affinity enrichment and has the potential for applications in clinical laboratories.
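    The ~300-fold throughput claim follows directly from the cycle times quoted in the abstract, taking a mid-range 35 min LC-MRM run as the baseline:

```python
# Throughput arithmetic behind the ~300-fold claim.
cycle_spe = 7            # seconds per sample, chromatography-free SPE-MS/MS
cycle_lc = 35 * 60       # seconds per sample, mid-range 30-40 min LC-MRM run

speedup = cycle_lc / cycle_spe       # 2100 / 7 = 300-fold
per_day = 24 * 3600 // cycle_spe     # samples per instrument-day at 7 s each
```

    At 7 s per sample a single instrument could, in principle, cycle through more than 12,000 injections per day, which is what makes large validation cohorts tractable.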

  13. Quantitative Analysis Of Three-dimensional Branching Systems From X-ray Computed Microtomography Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinney, Adriana L.; Varga, Tamas

    Branching structures such as lungs, blood vessels and plant roots play a critical role in life. The growth, structure, and function of these branching structures have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a no-cost tool that approximates the surface area and volume of branching structures. Our methodology of noninvasive imaging, segmentation and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen were used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface area. This methodology of processing 3D data is applicable to other branching structures, even when the structure of interest has x-ray attenuation similar to its environment and sample segmentation is difficult.
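    Once a binary segmentation exists, volume and surface area can be approximated in a few lines of NumPy. This voxel-counting sketch is a cruder stand-in for the isosurface approach described above, but it shows the kind of quantity being extracted:

```python
import numpy as np

def volume_and_surface(mask, voxel=1.0):
    """Estimate volume and surface area of a binary 3-D segmentation.
    Volume: occupied voxels x voxel volume. Surface: faces shared between
    an occupied voxel and an empty (or out-of-bounds) neighbor."""
    m = np.pad(mask.astype(bool), 1)             # pad so border faces count
    faces = 0
    for axis in range(3):
        # each 0->1 or 1->0 transition along an axis is one exposed face
        faces += np.count_nonzero(np.diff(m.astype(np.int8), axis=axis))
    return mask.sum() * voxel ** 3, faces * voxel ** 2

# A 2x2x2 solid cube: volume 8, surface 24 (6 faces x 4 unit squares).
cube = np.ones((2, 2, 2), dtype=bool)
vol, area = volume_and_surface(cube)
```

    Voxel-face counting overestimates the area of smooth oblique surfaces, which is why marching-cubes isosurfaces (as used in the record above) are preferred for final numbers.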

  14. Automatic registration of ICG images using mutual information and perfusion analysis

    NASA Astrophysics Data System (ADS)

    Kim, Namkug; Seo, Jong-Mo; Lee, June-goo; Kim, Jong Hyo; Park, Kwangsuk; Yu, Hyeong-Gon; Yu, Young Suk; Chung, Hum

    2005-04-01

    Introduction: Indocyanine green fundus angiography (ICGA) of the eyes is a useful method for detecting and characterizing choroidal neovascularization (CNV), the major cause of blindness in people over 65 years of age. To enable quantitative analysis of blood flow on ICGA, a systematic approach for automatic registration using mutual information, together with a quantitative analysis, was developed. Methods: Intermittent sequential images of indocyanine green angiography were acquired by Heidelberg retinal angiography, which uses a laser scanning system for image acquisition. Misalignment of each image, caused by minute eye movements of the patients, was corrected by the mutual information method, because the distribution of the contrast medium in the images changes throughout the time sequence. Several regions of interest (ROIs) were selected by a physician and the intensities of the selected regions were plotted over the time sequence. Results: The registration of ICGA time-sequential images requires not only translation but also rotation. Signal intensities varied according to a gamma-variate function depending on the ROI, and capillary vessels showed more variation in signal intensity than major vessels. CNV showed intermediate variation in signal intensity and a prolonged transit time. Conclusion: The resulting registered images can be used not only for quantitative analysis, but also for perfusion analysis. Various investigative approaches to CNV using this method will be helpful in the characterization of the lesion and in follow-up.
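    Mutual information is well suited here precisely because dye wash-in changes intensities between frames: it rewards consistent intensity co-occurrence rather than identical gray values. A minimal histogram-based sketch of the similarity metric (the paper's registration also searches over translations and rotations, which is omitted):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two equally sized images, computed from
    their joint intensity histogram; higher values mean better alignment."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of image a
    py = pxy.sum(axis=0, keepdims=True)          # marginal of image b
    nz = pxy > 0                                  # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# A perfectly aligned pair scores much higher than an unrelated pair.
rng = np.random.default_rng(1)
img = rng.random((64, 64))
mi_aligned = mutual_information(img, img)
mi_random = mutual_information(img, rng.permutation(img.ravel()).reshape(64, 64))
```

    A registration loop would apply candidate translations/rotations to one frame and keep the transform that maximizes this score over the overlap region.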

  15. A software package to improve image quality and isolation of objects of interest for quantitative stereology studies of rat hepatocarcinogenesis.

    PubMed

    Xu, Yihua; Pitot, Henry C

    2006-03-01

    In studies of the quantitative stereology of rat hepatocarcinogenesis, we have used image analysis technology (automatic particle analysis) to obtain data such as liver tissue area, size and location of altered hepatic focal lesions (AHF), and nuclei counts. These data are then used for three-dimensional estimation of AHF occurrence and nuclear labeling index analysis. These are important parameters for quantitative studies of carcinogenesis, for screening and classifying carcinogens, and for risk estimation. To take such measurements, structures or cells of interest must be separated from the other components based on differences in color and density. Common background problems in captured sample images, such as uneven illumination or color shading, can severely affect the measurements. Two application programs (BK_Correction and Pixel_Separator) have been developed to solve these problems. With BK_Correction, common background problems such as incorrect color temperature setting, color shading, and uneven illumination can be corrected. With Pixel_Separator, objects of different colors can be separated from one another, as in immunohistochemically stained slides. The resultant images of the separated objects are then ready for particle analysis. Objects that have the same darkness but different colors can be accurately differentiated in a grayscale image analysis system after application of these programs.
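    The uneven-illumination correction can be sketched as a flat-field division. This is a generic technique in the spirit of BK_Correction, not the program's actual algorithm; the function name and data are illustrative:

```python
import numpy as np

def flatfield_correct(raw, blank):
    """Divide each pixel by a blank-field (no-specimen) image to remove
    uneven illumination, then rescale to the blank field's mean intensity."""
    blank = blank.astype(float)
    return raw.astype(float) / np.maximum(blank, 1e-9) * blank.mean()

# If the specimen signal is a uniform 5x multiple of the illumination
# pattern, correction returns a perfectly flat image.
blank = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
corrected = flatfield_correct(blank * 5.0, blank)
```

    After flattening, a single global threshold suffices for particle analysis, whereas on the raw image the threshold would have to vary across the field.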

  16. A benchmark for comparison of dental radiography analysis algorithms.

    PubMed

    Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia

    2016-07-01

    Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made on developing computerized dental X-ray image analysis systems for clinical usages. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray image and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
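    Objective evaluation of landmark-detection algorithms of this kind is typically reported as mean radial error and a success rate within a clinical tolerance. The sketch below uses invented coordinates and a 2 mm threshold for illustration; it is the general style of metric, not the challenge's exact protocol:

```python
import numpy as np

# Hypothetical predicted vs. ground-truth cephalometric landmarks (in mm).
truth = np.array([[10.0, 12.0], [34.5, 20.0], [50.0, 48.0]])
pred  = np.array([[10.5, 12.2], [36.0, 21.0], [50.1, 47.8]])

radial = np.linalg.norm(pred - truth, axis=1)   # per-landmark radial error
mre = radial.mean()                             # mean radial error
sdr_2mm = np.mean(radial <= 2.0)                # success detection rate @ 2 mm
```

    Ranking methods by MRE and SDR at several thresholds gives the kind of objective, comparable scores the benchmark aims for.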

  17. Species identification of corynebacteria by cellular fatty acid analysis.

    PubMed

    Van den Velde, Sandra; Lagrou, Katrien; Desmet, Koen; Wauters, Georges; Verhaegen, Jan

    2006-02-01

    We evaluated the usefulness of cellular fatty acid analysis for the identification of corynebacteria. For this purpose, 219 well-characterized strains belonging to 21 Corynebacterium species were analyzed with the Sherlock System of MIDI (Newark, DE). Most Corynebacterium species have qualitatively different fatty acid profiles. Corynebacterium coyleae (subgroup 1), Corynebacterium riegelii, Corynebacterium simulans, and Corynebacterium imitans differ only quantitatively. Corynebacterium afermentans afermentans and C. coyleae (subgroup 2) have similar qualitative and quantitative profiles. The commercially available database (CLIN 40, MIDI) identified only one third of the 219 strains correctly at the species level. We therefore created a new database with these 219 strains. This new database was tested with 34 clinical isolates and identified 29 strains correctly. The strains that remained unidentified were 2 Corynebacterium aurimucosum (not included in our database), 1 C. afermentans afermentans, and 2 Corynebacterium pseudodiphtheriticum. Cellular fatty acid analysis with a self-created database can be used for the identification and differentiation of corynebacteria.
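    Library-based identification of this kind reduces to comparing an unknown's fatty acid composition vector against reference profiles. A hedged sketch using cosine similarity; the percentage values are invented placeholders, not MIDI library data:

```python
import numpy as np

# Illustrative reference library: species -> percent composition of five
# (unspecified) fatty acids. All numbers are made up for this example.
library = {
    "C. coyleae":  np.array([30.0, 25.0, 20.0, 15.0, 10.0]),
    "C. riegelii": np.array([45.0, 20.0, 15.0, 12.0, 8.0]),
}
unknown = np.array([44.0, 21.0, 14.0, 13.0, 8.0])

def cosine(a, b):
    """Cosine similarity between two composition vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

best = max(library, key=lambda k: cosine(unknown, library[k]))
```

    Species that "differ only quantitatively", as in the abstract, are exactly the hard case here: their vectors point in similar directions, so the similarity margin between candidates becomes small.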

  18. Quantitative Effectiveness Analysis of Solar Photovoltaic Policies, Introduction of Socio-Feed-in Tariff Mechanism (SocioFIT) and its Implementation in Turkey

    NASA Astrophysics Data System (ADS)

    Mustafaoglu, Mustafa Sinan

    Some of the main energy issues in developing countries are high dependence on non-renewable energy sources, low energy efficiency levels and, as a result, high CO2 emissions. In addition, a common problem of many countries, including developing countries, is economic inequality. In this study, the solar photovoltaic policies of Germany, Japan and the USA are analyzed quantitatively, and a new renewable energy support mechanism called the Socio Feed-in Tariff Mechanism (SocioFIT) is proposed based on the analysis results to address these issues of developing countries, as well as the economic inequality problem, by using energy savings as a funding source for renewable energy systems. The applicability of the mechanism is demonstrated through calculations for an implementation of the mechanism in Turkey.

  19. Morphological analysis of dendrites and spines by hybridization of ridge detection with twin support vector machine.

    PubMed

    Wang, Shuihua; Chen, Mengmeng; Li, Yang; Shao, Ying; Zhang, Yudong; Du, Sidan; Wu, Jane

    2016-01-01

    Dendritic spines are described as neuronal protrusions. The morphology of dendritic spines and dendrites is strongly related to their function and plays an important role in understanding brain function. Quantitative analysis of dendrites and dendritic spines is essential to an understanding of the formation and function of the nervous system. However, highly efficient tools for the quantitative analysis of dendrites and dendritic spines are currently lacking. In this paper we propose a novel three-step cascaded algorithm, RTSVM, which is composed of ridge detection as the curvature structure identifier for backbone extraction, boundary location based on differences in density, Hu moments as features, and Twin Support Vector Machine (TSVM) classifiers for spine classification. Our data demonstrate that this newly developed algorithm performs better than other available techniques in terms of detection accuracy and false alarm rate. This algorithm can be used effectively in neuroscience research.
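    Hu moments are used as shape features because they are invariant to translation, scale, and rotation, which is what makes them suitable for classifying spines at arbitrary orientations. A minimal NumPy sketch of the first Hu invariant (the full pipeline uses all seven, plus the ridge detector and TSVM, which are omitted here):

```python
import numpy as np

def hu_phi1(img):
    """First Hu invariant, phi_1 = eta_20 + eta_02, from normalized
    central moments of a grayscale (or binary) image."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]].astype(float)
    m00 = img.sum()
    cx, cy = (x * img).sum() / m00, (y * img).sum() / m00   # centroid

    def mu(p, q):            # central moment
        return (((x - cx) ** p) * ((y - cy) ** q) * img).sum()

    def eta(p, q):           # scale-normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    return eta(2, 0) + eta(0, 2)

# A small rectangular blob: phi_1 is unchanged by rotating the image 90 deg.
img = np.zeros((8, 10))
img[2:5, 3:8] = 1.0
```

    Feeding such invariants to a classifier means the spine/non-spine decision does not depend on where in the image the protrusion sits or how it is oriented.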

  20. Quantitative Analysis of Human Pluripotency and Neural Specification by In-Depth (Phospho)Proteomic Profiling.

    PubMed

    Singec, Ilyas; Crain, Andrew M; Hou, Junjie; Tobe, Brian T D; Talantova, Maria; Winquist, Alicia A; Doctor, Kutbuddin S; Choy, Jennifer; Huang, Xiayu; La Monaca, Esther; Horn, David M; Wolf, Dieter A; Lipton, Stuart A; Gutierrez, Gustavo J; Brill, Laurence M; Snyder, Evan Y

    2016-09-13

    Controlled differentiation of human embryonic stem cells (hESCs) can be utilized for precise analysis of cell type identities during early development. We established a highly efficient neural induction strategy and an improved analytical platform, and determined proteomic and phosphoproteomic profiles of hESCs and their specified multipotent neural stem cell derivatives (hNSCs). This quantitative dataset (nearly 13,000 proteins and 60,000 phosphorylation sites) provides unique molecular insights into pluripotency and neural lineage entry. Systems-level comparative analysis of proteins (e.g., transcription factors, epigenetic regulators, kinase families), phosphorylation sites, and numerous biological pathways allowed the identification of distinct signatures in pluripotent and multipotent cells. Furthermore, as predicted by the dataset, we functionally validated an autocrine/paracrine mechanism by demonstrating that the secreted protein midkine is a regulator of neural specification. This resource is freely available to the scientific community, including a searchable website, PluriProt. Published by Elsevier Inc.

  1. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    PubMed

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

    To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-poly-ethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and also to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and by the semi-quantitative index of tumor to non-tumor (T/N) ratio, both of which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of the semi-quantitative index, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery: 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372).
Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher, being: 87.5%, 82.9%, and 85.4%, respectively. The area under the curve was 0.891. Results of the present study suggest that the semi-quantitative and visual analysis statistically showed similar results. The semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. It seems from our results that, when the tumor was located in the medial part of the breast, the semi-quantitative analysis gave better diagnostic results.
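
    The cutoff-based figures above (sensitivity, specificity, accuracy at a chosen T/N ratio or visual grade) follow from a standard confusion-matrix computation. A minimal sketch, using illustrative T/N ratios rather than the study's data:

```python
def confusion_metrics(ratios, malignant, cutoff):
    """Classify a lesion as malignant when its T/N ratio reaches the cutoff,
    then score the calls against pathology labels (True = malignant)."""
    tp = sum(r >= cutoff and m for r, m in zip(ratios, malignant))
    fn = sum(r < cutoff and m for r, m in zip(ratios, malignant))
    tn = sum(r < cutoff and not m for r, m in zip(ratios, malignant))
    fp = sum(r >= cutoff and not m for r, m in zip(ratios, malignant))
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(ratios)
    return sensitivity, specificity, accuracy

# Illustrative T/N ratios and pathology labels (not the study's data)
ratios = [2.5, 1.8, 2.2, 1.2, 3.0, 1.9]
labels = [True, False, True, False, True, True]
sens, spec, acc = confusion_metrics(ratios, labels, cutoff=2.01)
```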

  2. Linguistic Alternatives to Quantitative Research Strategies. Part One: How Linguistic Mechanisms Advance Research Outcomes

    ERIC Educational Resources Information Center

    Yeager, Joseph; Sommer, Linda

    2007-01-01

    Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…

  3. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods; for example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty," that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize the hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
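
    The prioritization the framework describes can be sketched as a scoring rule over the three metrics. The scenario names, ordinal scales, and the specific combining formula below are hypothetical illustrations, not the paper's:

```python
# Hypothetical hazard scenarios scored on the paper's three metrics
# (ordinal 1-5 scales assumed for illustration).
scenarios = [
    {"name": "wake encounter on parallel approach", "severity": 4, "likelihood": 3, "difficulty": 2},
    {"name": "missed-approach conflict", "severity": 5, "likelihood": 2, "difficulty": 4},
    {"name": "runway incursion", "severity": 5, "likelihood": 1, "difficulty": 5},
]

def priority(s):
    # Higher severity and likelihood raise priority; higher modeling
    # difficulty lowers it, so tractable scenarios are analyzed first.
    return s["severity"] * s["likelihood"] / s["difficulty"]

ranked = sorted(scenarios, key=priority, reverse=True)
```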

  4. Implementation of an interactive liver surgery planning system

    NASA Astrophysics Data System (ADS)

    Wang, Luyao; Liu, Jingjing; Yuan, Rong; Gu, Shuguo; Yu, Long; Li, Zhitao; Li, Yanzhao; Li, Zhen; Xie, Qingguo; Hu, Daoyu

    2011-03-01

    Liver tumors, among the most widespread diseases, have a very high mortality in China. To improve the success rates of liver surgeries and the quality of life of such patients, we implement an interactive liver surgery planning system based on contrast-enhanced liver CT images. The system consists of five modules: pre-processing, segmentation, modeling, quantitative analysis and surgery simulation. The Graph Cuts method is utilized to automatically segment the liver, based on the anatomical prior knowledge that the liver is the largest organ and has an almost homogeneous gray value. The system supports users in building patient-specific liver segment and sub-segment models using interactive portal vein branch labeling, and in performing anatomical resection simulation. It also provides several tools to simulate atypical resections, including resection by plane, sphere and curved surface. To match actual surgical resections well and to simulate the process flexibly, we extend our work to develop a virtual scalpel model and simulate scalpel movement in hepatic tissue using multi-plane continuous resection. In addition, the quantitative analysis module makes it possible to assess the risk of a liver surgery. Preliminary results show that the system has the potential to offer an accurate 3D delineation of the liver anatomy, as well as of tumor locations in relation to vessels, and to facilitate liver resection surgeries. Furthermore, we are testing the system in a full-scale clinical trial.

  5. Google glass based immunochromatographic diagnostic test analysis

    NASA Astrophysics Data System (ADS)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
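
    The calibration-curve step that relates RDT test-line intensity to analyte concentration can be sketched as piecewise-linear interpolation. The calibration points and function below are hypothetical, not the paper's PSA data:

```python
# Hypothetical calibration curve: (test-line intensity, PSA ng/mL) pairs,
# sorted by intensity.
calibration = [(0.0, 0.0), (0.2, 10.0), (0.5, 50.0), (0.9, 200.0)]

def intensity_to_concentration(intensity, curve):
    """Piecewise-linear interpolation of concentration from line intensity."""
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        if x0 <= intensity <= x1:
            return y0 + (y1 - y0) * (intensity - x0) / (x1 - x0)
    raise ValueError("intensity outside the calibrated range")

estimate = intensity_to_concentration(0.35, calibration)
```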

  6. The reliability analysis of a separated, dual fail operational redundant strapdown IMU. [inertial measurement unit

    NASA Technical Reports Server (NTRS)

    Motyka, P.

    1983-01-01

    A methodology for quantitatively analyzing the reliability of redundant avionics systems, in general, and the dual, separated Redundant Strapdown Inertial Measurement Unit (RSDIMU), in particular, is presented. The RSDIMU is described and a candidate failure detection and isolation system presented. A Markov reliability model is employed. The operational states of the system are defined and the single-step state transition diagrams discussed. Graphical results, showing the impact of major system parameters on the reliability of the RSDIMU system, are presented and discussed.
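
    A Markov reliability model of the kind described tracks the probability mass over operational states as it evolves through per-step failure transitions. A minimal discrete-time sketch, with illustrative transition probabilities rather than the RSDIMU's:

```python
# States: 0 = both IMUs operational, 1 = one failure absorbed, 2 = failed.
# Per-step transition probabilities are illustrative, not the RSDIMU's.
P = [
    [0.998, 0.002, 0.000],  # dual operational
    [0.000, 0.995, 0.005],  # single operational (failure detected/isolated)
    [0.000, 0.000, 1.000],  # system failure is absorbing
]

def step(state, P):
    """One transition of the state-probability vector: state' = state * P."""
    n = len(P)
    return [sum(state[i] * P[i][j] for i in range(n)) for j in range(n)]

state = [1.0, 0.0, 0.0]
for _ in range(1000):              # e.g. 1000 mission-time increments
    state = step(state, P)
reliability = state[0] + state[1]  # probability the system still operates
```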

  7. FBI fingerprint identification automation study: AIDS 3 evaluation report. Volume 5: Current system evaluation

    NASA Technical Reports Server (NTRS)

    Mulhall, B. D. L.

    1980-01-01

    The performance, costs, organization and other characteristics of both the manual system and AIDS 2 were used to establish a baseline case. The results of the evaluation are to be used to determine the feasibility of the AIDS 3 System, as well as provide a basis for ranking alternative systems during the second phase of the JPL study. The results of the study were tabulated by subject, scope and methods, providing a descriptive, quantitative and qualitative analysis of the current operating systems employed by the FBI Identification Division.

  8. Small-Signal Analysis of Autonomous Hybrid Distributed Generation Systems in Presence of Ultracapacitor and Tie-Line Operation

    NASA Astrophysics Data System (ADS)

    Ray, Prakash K.; Mohanty, Soumya R.; Kishor, Nand

    2010-07-01

    This paper presents a small-signal analysis of isolated as well as interconnected autonomous hybrid distributed generation systems under sudden variations in load demand, wind speed and solar radiation. The hybrid systems comprise different renewable energy resources, such as wind, photovoltaic (PV), fuel cell (FC) and diesel engine generator (DEG), along with energy storage devices such as the flywheel energy storage system (FESS) and battery energy storage system (BESS). Further, ultracapacitors (UC) as an alternative energy storage element and interconnection of the hybrid systems through a tie-line are incorporated for improved performance. A comparative assessment of the frequency deviation profiles of the different hybrid systems with different storage combinations is carried out graphically as well as in terms of a performance index (PI), i.e. the integral square error (ISE). Both qualitative and quantitative analysis reflect the improvement in the frequency deviation profiles in the presence of ultracapacitors (UC) as compared to the other energy storage elements.
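
    The integral square error (ISE) performance index used to compare storage combinations can be approximated from sampled frequency-deviation traces with a Riemann sum. A minimal sketch with made-up traces:

```python
def ise(deviations, dt):
    """Integral square error: Riemann-sum integral of the squared deviation."""
    return sum(d * d for d in deviations) * dt

# Illustrative frequency-deviation traces (Hz) sampled every 0.1 s;
# in this toy example the UC-equipped system damps the transient faster.
trace_without_uc = [0.05, 0.04, 0.02, 0.01, 0.00]
trace_with_uc = [0.05, 0.02, 0.01, 0.00, 0.00]
better = "UC" if ise(trace_with_uc, 0.1) < ise(trace_without_uc, 0.1) else "no UC"
```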

  9. Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.

    2013-12-01

    Radar Quantitative Precipitation Estimation (QPE) has been a very important application of weather radar since it was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it is still a challenging process especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on the rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products will be compared quantitatively. The main emphasis of this study will be on the analysis of uncertainties of the radar and gauge rainfall products that are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data such as reflectivity and by the comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data will also be described. The uncertainty analysis of the gauge rainfall will be focused on the comparison of traditional kriging and conditional bias penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set will again be used as the verification tool for the newly generated rainfall products.
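
    The mean field bias adjustment mentioned above scales the radar field by the ratio of gauge totals to collocated radar totals. A minimal sketch with illustrative values:

```python
def mean_field_bias(gauge_totals, radar_totals):
    """Ratio of summed gauge accumulation to summed radar accumulation at
    collocated sites; multiplying the radar field by it removes the mean bias."""
    return sum(gauge_totals) / sum(radar_totals)

# Illustrative storm totals (mm) at three gauge sites
gauges = [10.0, 12.0, 8.0]
radar_at_gauges = [8.0, 10.0, 7.0]
bias = mean_field_bias(gauges, radar_at_gauges)      # 30 / 25 = 1.2
corrected = [v * bias for v in [5.0, 6.5, 9.0]]      # adjusted radar pixels
```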

  10. Multi-Resolution Analysis of LiDAR data for Characterizing a Stabilized Aeolian Landscape in South Texas

    NASA Astrophysics Data System (ADS)

    Barrineau, C. P.; Dobreva, I. D.; Bishop, M. P.; Houser, C.

    2014-12-01

    Aeolian systems are ideal natural laboratories for examining self-organization in patterned landscapes, as certain wind regimes generate certain morphologies. Topographic information and scale-dependent analysis offer the opportunity to study such systems and characterize process-form relationships. A statistically based methodology for differentiating aeolian features would enable the quantitative association of certain surface characteristics with certain morphodynamic regimes. We conducted a multi-resolution analysis of LiDAR elevation data to assess scale-dependent morphometric variations in an aeolian landscape in South Texas. For each pixel, mean elevation values are calculated along concentric circles moving outward at 100-meter intervals (i.e. 500 m, 600 m, 700 m from the pixel). The average elevation values, plotted as curves against distance from the pixel of interest, are used to differentiate multi-scalar variations in elevation across the landscape. It is hypothesized that these curves can quantitatively differentiate certain morphometries from others, much as a spectral signature may be used to classify paved surfaces versus natural vegetation, for example. After generating multi-resolution curves for all the pixels in a selected area of interest (AOI), a Principal Components Analysis is used to highlight commonalities and singularities among the curves across the AOI. Our findings suggest that the resulting components could be used to identify discrete aeolian features such as open sands, trailing ridges, active dune crests and, in particular, zones of deflation. This new approach to landscape characterization not only mitigates the bias introduced when researchers must select training pixels for morphometric investigations, but can also reveal patterning in aeolian landscapes that would not be as obvious without quantitative characterization.
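
    The per-pixel multi-resolution signature (mean elevation over concentric rings of increasing radius) can be sketched directly; in the study these curves then feed a Principal Components Analysis. The grid, radii, and ring width below are illustrative:

```python
import math
import random

def ring_means(dem, row, col, radii, width=1.0):
    """Mean elevation over concentric rings [r, r + width) around (row, col)."""
    means = []
    for r in radii:
        ring = [z for i, line in enumerate(dem) for j, z in enumerate(line)
                if r <= math.hypot(i - row, j - col) < r + width]
        means.append(sum(ring) / len(ring))
    return means

# Stand-in DEM: a random 50 x 50 grid of elevations in [0, 1)
random.seed(0)
dem = [[random.random() for _ in range(50)] for _ in range(50)]
curve = ring_means(dem, 25, 25, radii=[5, 10, 15, 20])
```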

  11. Comparison analysis between filtered back projection and algebraic reconstruction technique on microwave imaging

    NASA Astrophysics Data System (ADS)

    Ramadhan, Rifqi; Prabowo, Rian Gilang; Aprilliyani, Ria; Basari

    2018-02-01

    The number of victims of acute cancers and tumors grows each year, and cancer has become one of the leading causes of death worldwide. Cancer or tumor cells grow abnormally, taking over and damaging the surrounding tissues. In their early stages, cancers and tumors show no definite symptoms and can attack tissues deep inside the body, where they are not identifiable by visual observation. Therefore, an early detection system that is cheap, quick, simple, and portable is essential to anticipate the further development of a cancer or tumor. Among the available modalities, microwave imaging is considered a cheap, simple, and portable method. At least two simple image reconstruction algorithms, Filtered Back Projection (FBP) and the Algebraic Reconstruction Technique (ART), have been adopted in common modalities. In this paper, both algorithms are compared by reconstructing the image of an artificial tissue model (i.e. a phantom) with two different dielectric distributions. We address two performance comparisons, namely qualitative and quantitative analysis. Qualitative analysis covers the smoothness of the image and the success in distinguishing dielectric differences by observing the image with the human eye. Quantitative analysis comprises histogram, Structural Similarity Index (SSIM), Mean Squared Error (MSE), and Peak Signal-to-Noise Ratio (PSNR) calculations. As a result, the quantitative parameters of FBP might show better values than those of ART. However, ART is likely more capable of distinguishing two different dielectric values than FBP, owing to its higher contrast and wider grayscale distribution.
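
    Two of the quantitative metrics named above, MSE and PSNR, can be computed as follows for 8-bit grayscale images stored as flat lists; the pixel values are illustrative:

```python
import math

def mse(a, b):
    """Mean squared error between two equal-length pixel lists."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB (infinite for identical images)."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * math.log10(peak ** 2 / m)

reference = [10, 20, 30, 40]       # illustrative pixel values
reconstructed = [12, 18, 33, 39]
```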

  12. Extracting Metrics for Three-dimensional Root Systems: Volume and Surface Analysis from In-soil X-ray Computed Tomography Data.

    PubMed

    Suresh, Niraj; Stephens, Sean A; Adams, Lexor; Beck, Anthon N; McKinney, Adriana L; Varga, Tamas

    2016-04-26

    Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and crop management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving plants. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. We aimed to develop a free and efficient tool that approximates the surface and volume of a root, regardless of its shape, from three-dimensional (3D) tomography data. The root structure of a Prairie dropseed (Sporobolus heterolepis) specimen was imaged using XCT. The root was reconstructed, and the primary root structure was extracted from the data using a combination of licensed and open-source software. An isosurface polygonal mesh was then created for ease of analysis. We developed the standalone application imeshJ, written in MATLAB, to calculate root volume and surface area from the mesh. The outputs of imeshJ are surface area (in mm²) and volume (in mm³). The process, utilizing a unique combination of tools from imaging to quantitative root analysis, is described. The combination of XCT and open-source software proved powerful for noninvasively imaging plant root samples, segmenting root data, and extracting quantitative information from the 3D data. This methodology for processing 3D data should be applicable to other material/sample systems where there is connectivity between components of similar X-ray attenuation and segmentation is difficult.
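
    The surface-and-volume computation that imeshJ performs on a closed triangular mesh can be sketched with triangle cross products for area and signed tetrahedron volumes (the divergence theorem) for volume. This is an independent re-implementation of the standard formulas, not imeshJ's code:

```python
import math

def mesh_metrics(vertices, faces):
    """Surface area (cross-product triangle areas) and volume (signed
    tetrahedra summed over faces, i.e. the divergence theorem) of a
    closed, consistently oriented triangle mesh."""
    def sub(u, v): return (u[0] - v[0], u[1] - v[1], u[2] - v[2])
    def cross(u, v):
        return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])
    def dot(u, v): return u[0]*v[0] + u[1]*v[1] + u[2]*v[2]
    area = volume = 0.0
    for i, j, k in faces:
        a, b, c = vertices[i], vertices[j], vertices[k]
        n = cross(sub(b, a), sub(c, a))
        area += math.sqrt(dot(n, n)) / 2.0
        volume += dot(a, cross(b, c)) / 6.0  # signed tetra volume to origin
    return area, abs(volume)

# Toy closed mesh: a unit right tetrahedron standing in for a root isosurface
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
tris = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
area, vol = mesh_metrics(verts, tris)
```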

  13. Weighted Fuzzy Risk Priority Number Evaluation of Turbine and Compressor Blades Considering Failure Mode Correlations

    NASA Astrophysics Data System (ADS)

    Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong

    2014-06-01

    Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although the single-failure-mode case can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis, whereas in fact, due to lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimal cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates this source of error in the reliability analysis. Furthermore, for quantitative analysis, importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The results provide important insights for fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
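
    The core weighted geometric mean RPN idea can be sketched as below; the ratings and weights are hypothetical, and the fuzzification, Copula, and minimal-cut-set propagation steps of the full FPWGM method are omitted:

```python
def wgm_rpn(severity, occurrence, detection, weights=(1/3, 1/3, 1/3)):
    """Weighted geometric mean risk priority number; weights sum to 1.
    With equal weights this reduces to the plain geometric mean of the
    three ratings."""
    ws, wo, wd = weights
    return (severity ** ws) * (occurrence ** wo) * (detection ** wd)

# Hypothetical 1-10 ratings for a blade fatigue failure mode, with
# severity weighted most heavily.
blade_fatigue = wgm_rpn(9, 4, 6, weights=(0.5, 0.3, 0.2))
```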

  14. Quantitative Magnetic Separation of Particles and Cells using Gradient Magnetic Ratcheting

    PubMed Central

    Murray, Coleman; Pao, Edward; Tseng, Peter; Aftab, Shayan; Kulkarni, Rajan; Rettig, Matthew; Di Carlo, Dino

    2016-01-01

    Extraction of rare target cells from biosamples is enabling for life science research. Traditional rare cell separation techniques, such as magnetic activated cell sorting (MACS), are robust but perform coarse, qualitative separations based on surface antigen expression. We report a quantitative magnetic separation technology using high-force magnetic ratcheting over arrays of magnetically soft micro-pillars with gradient spacing, and use the system to separate and concentrate magnetic beads based on iron oxide content (IOC) and cells based on surface expression. The system consists of a microchip of permalloy micro-pillar arrays with increasing lateral pitch and a mechatronic device to generate a cycling magnetic field. Particles with higher IOC separate and equilibrate along the micro-pillar array at larger pitches. We develop a semi-analytical model that predicts behavior for particles and cells. Using the system, LNCaP cells were separated based on the bound quantity of 1 μm anti-EpCAM particles as a metric for expression. The ratcheting cytometry system was able to resolve a ±13 bound-particle differential, successfully distinguishing LNCaP from PC3 populations based on EpCAM expression, correlating with flow cytometry analysis. As a proof of concept, EpCAM-labeled cells from patient blood were isolated with 74% purity, demonstrating potential towards a quantitative magnetic separation instrument. PMID:26890496

  15. [Quantitative determination of biogenic amine from Biomphalaria glabrata nervous system by UPLC MS/MS].

    PubMed

    Tao, Huang; Yun-Hai, Guo; He-Xiang, Liu; Yi, Zhang

    2018-04-19

    To establish a method for the quantitative determination of serotonin and dopamine in the nervous system of Biomphalaria glabrata by ultra-high-performance liquid chromatography-tandem quadrupole mass spectrometry (UPLC MS/MS). The B. glabrata nervous system was dissected out under a microscope and homogenized in pure methanol. After two rounds of high-speed centrifugation, the supernatant containing the target analytes was collected. The extract was separated on an ACQUITY UPLC BEH Amide column and detected with a Waters TQ-XS tandem mass spectrometer, using an ESI source in positive electrospray ionization mode. The detection limit of serotonin was 0.03 ng/ml and the limit of quantification was 0.1 ng/ml. The detection limit of dopamine was 0.05 ng/ml and the limit of quantification was 0.15 ng/ml. The recoveries of serotonin ranged from 90.68% to 94.72% over the range of 1 to 40 ng/ml, and the recoveries of dopamine ranged from 91.68% to 96.12% over the same range. The established UPLC MS/MS method is simple, stable and reproducible, and can be used for the quantitative analysis of serotonin and dopamine in the nervous system of B. glabrata snails.

  16. A lab-on-a-chip system with integrated sample preparation and loop-mediated isothermal amplification for rapid and quantitative detection of Salmonella spp. in food samples.

    PubMed

    Sun, Yi; Quyen, Than Linh; Hung, Tran Quang; Chin, Wai Hoe; Wolff, Anders; Bang, Dang Duong

    2015-04-21

    Foodborne disease is a major public health threat worldwide. Salmonellosis, an infectious disease caused by Salmonella spp., is one of the most common foodborne diseases. Isolation and identification of Salmonella by conventional bacterial culture or molecular-based methods are time-consuming and usually take a few hours to days to complete. In response to the demand for rapid online or on-site detection of pathogens, in this study we describe for the first time an eight-chamber lab-on-a-chip (LOC) system with integrated magnetic bead-based sample preparation and loop-mediated isothermal amplification (LAMP) for rapid and quantitative detection of Salmonella spp. in food samples. The whole diagnostic procedure, including DNA isolation, isothermal amplification, and real-time detection, was accomplished in a single chamber. Up to eight samples could be handled simultaneously, and the system was capable of detecting Salmonella at a concentration of 50 cells per test within 40 min. The simple design, together with the high level of integration, isothermal amplification, and quantitative analysis of multiple samples in a short time, will greatly enhance the practical applicability of the LOC system for rapid on-site screening of Salmonella in food safety control, environmental surveillance, and clinical diagnostics.

  17. Inconsistencies in reporting risk information: a pilot analysis of online news coverage of West Nile Virus.

    PubMed

    Birnbrauer, Kristina; Frohlich, Dennis Owen; Treise, Debbie

    2017-09-01

    West Nile Virus (WNV) has been reported as one of the worst epidemics in US history. This study sought to understand how WNV news stories were framed and how risk information was portrayed from its 1999 arrival in the US through the year 2012. The authors conducted a quantitative content analysis of online news articles obtained through Google News ( N = 428). The results of this analysis were compared to the CDC's ArboNET surveillance system. The following story frames were identified in this study: action, conflict, consequence, new evidence, reassurance and uncertainty, with the action frame appearing most frequently. Risk was communicated quantitatively without context in the majority of articles, and only in 2006, the year with the third-highest reported deaths, was risk reported with statistical accuracy. The results from the analysis indicated that at-risk communities were potentially under-informed as accurate risks were not communicated. This study offers evidence about how disease outbreaks are covered in relation to actual disease surveillance data.

  18. Sugar and acid content of Citrus prediction modeling using FT-IR fingerprinting in combination with multivariate statistical analysis.

    PubMed

    Song, Seung Yeob; Lee, Young Koung; Kim, In-Jung

    2016-01-01

    A high-throughput screening system for Citrus lines with higher sugar and acid contents was established using Fourier transform infrared (FT-IR) spectroscopy in combination with multivariate analysis. FT-IR spectra confirmed typical spectral differences between the frequency regions of 950-1100 cm⁻¹, 1300-1500 cm⁻¹, and 1500-1700 cm⁻¹. Principal component analysis (PCA) and subsequent partial least square-discriminant analysis (PLS-DA) were able to discriminate five Citrus lines into three separate clusters corresponding to their taxonomic relationships. Quantitative predictive modeling of the sugar and acid contents of Citrus fruits was established using partial least squares regression algorithms applied to the FT-IR spectra. The regression coefficients (R²) between predicted and measured sugar and acid content values were 0.99. These results demonstrate that, by applying quantitative prediction modeling to Citrus sugar and acid contents from FT-IR spectra, superior Citrus lines can be detected early and with greater accuracy. Copyright © 2015 Elsevier Ltd. All rights reserved.
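
    The regression-coefficient (R²) check between predicted and measured contents can be illustrated with an ordinary least-squares fit; this stands in for the paper's partial least squares model, and the values are synthetic:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

def r_squared(x, y, slope, intercept):
    """Coefficient of determination of the fitted line."""
    my = sum(y) / len(y)
    ss_res = sum((b - (slope * a + intercept)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1.0 - ss_res / ss_tot

# Synthetic predicted vs. measured sugar contents (hypothetical °Brix)
predicted = [8.1, 9.0, 10.2, 11.1]
measured = [8.0, 9.2, 10.1, 11.0]
slope, intercept = linear_fit(predicted, measured)
r2 = r_squared(predicted, measured, slope, intercept)
```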

  19. Analysis system for characterisation of simple, low-cost microfluidic components

    NASA Astrophysics Data System (ADS)

    Smith, Suzanne; Naidoo, Thegaran; Nxumalo, Zandile; Land, Kevin; Davies, Emlyn; Fourie, Louis; Marais, Philip; Roux, Pieter

    2014-06-01

    There is an inherent trade-off between cost and operational integrity of microfluidic components, especially when intended for use in point-of-care devices. We present an analysis system developed to characterise microfluidic components for performing blood cell counting, enabling the balance between function and cost to be established quantitatively. Microfluidic components for sample and reagent introduction, mixing and dispensing of fluids were investigated. A simple inlet port plugging mechanism is used to introduce and dispense a sample of blood, while a reagent is released into the microfluidic system through compression and bursting of a blister pack. Mixing and dispensing of the sample and reagent are facilitated via air actuation. For these microfluidic components to be implemented successfully, a number of aspects need to be characterised for development of an integrated point-of-care device design. The functional components were measured using a microfluidic component analysis system established in-house. Experiments were carried out to determine: 1. the force and speed requirements for sample inlet port plugging and blister pack compression and release using two linear actuators and load cells for plugging the inlet port, compressing the blister pack, and subsequently measuring the resulting forces exerted, 2. the accuracy and repeatability of total volumes of sample and reagent dispensed, and 3. the degree of mixing and dispensing uniformity of the sample and reagent for cell counting analysis. A programmable syringe pump was used for air actuation to facilitate mixing and dispensing of the sample and reagent. Two high speed cameras formed part of the analysis system and allowed for visualisation of the fluidic operations within the microfluidic device. 
Additional quantitative measures such as microscopy were also used to assess mixing and dilution accuracy, as well as uniformity of fluid dispensing - all of which are important requirements towards the successful implementation of a blood cell counting system.

  20. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.
