NASA Technical Reports Server (NTRS)
Rubin, D. N.; Yazbek, N.; Garcia, M. J.; Stewart, W. J.; Thomas, J. D.
2000-01-01
Harmonic imaging is a new ultrasonographic technique that is designed to improve image quality by exploiting the spontaneous generation of higher frequencies as ultrasound propagates through tissue. We studied 51 difficult-to-image patients with blinded side-by-side cineloop evaluation of endocardial border definition by harmonic versus fundamental imaging. In addition, quantitative intensities from cavity versus wall were compared for harmonic versus fundamental imaging. Harmonic imaging improved left ventricular endocardial border delineation over fundamental imaging (superior: harmonic = 71.1%, fundamental = 18.7%; similar: 10.2%; P <.001). Quantitative analysis of 100 wall/cavity combinations demonstrated brighter wall segments and more strikingly darker cavities during harmonic imaging (cavity intensity on a 0 to 255 scale: fundamental = 15.6 +/- 8.6; harmonic = 6.0 +/- 5.3; P <.0001), which led to enhanced contrast between the wall and cavity (1.89 versus 1.19, P <.0001). Harmonic imaging reduces side-lobe artifacts, resulting in a darker cavity and brighter walls, thereby improving image contrast and endocardial delineation.
Kinetic Analysis of Amylase Using Quantitative Benedict's and Iodine Starch Reagents
ERIC Educational Resources Information Center
Cochran, Beverly; Lunday, Deborah; Miskevich, Frank
2008-01-01
Quantitative analysis of carbohydrates is a fundamental analytical tool used in many aspects of biology and chemistry. We have adapted a technique developed by Mathews et al. using an inexpensive scanner and open-source image analysis software to quantify amylase activity using both the breakdown of starch and the appearance of glucose. Breakdown…
Freddie Mercury-acoustic analysis of speaking fundamental frequency, vibrato, and subharmonics.
Herbst, Christian T; Hertegard, Stellan; Zangger-Borch, Daniel; Lindestad, Per-Åke
2017-04-01
Freddie Mercury was one of the twentieth century's best-known singers of commercial contemporary music. This study presents an acoustical analysis of his voice production and singing style, based on perceptual and quantitative analysis of publicly available sound recordings. Analysis of six interviews revealed a median speaking fundamental frequency of 117.3 Hz, which is typically found for a baritone voice. Analysis of voice tracks isolated from full band recordings suggested that the singing voice range was 37 semitones within the pitch range of F#2 (about 92.2 Hz) to G5 (about 784 Hz). Evidence for higher phonations up to a fundamental frequency of 1,347 Hz was not deemed reliable. Analysis of 240 sustained notes from 21 a-cappella recordings revealed a surprisingly high mean fundamental frequency modulation rate (vibrato) of 7.0 Hz, reaching the range of vocal tremor. Quantitative analysis utilizing a newly introduced parameter to assess the regularity of vocal vibrato corroborated its perceptually irregular nature, suggesting that vibrato (ir)regularity is a distinctive feature of the singing voice. Imitation of subharmonic phonation samples by a professional rock singer, documented by endoscopic high-speed video at 4,132 frames per second, revealed a 3:1 frequency locked vibratory pattern of vocal folds and ventricular folds.
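The reported 37-semitone range can be checked directly from the pitch endpoints given in the abstract. A minimal sketch, using the standard equal-temperament interval formula (the ~92.5 Hz value for F#2 is the equal-tempered reference, close to the ~92.2 Hz quoted above):

```python
import math

def semitones(f_low_hz, f_high_hz):
    """Interval between two frequencies in equal-tempered semitones."""
    return 12 * math.log2(f_high_hz / f_low_hz)

# F#2 (~92.5 Hz) up to G5 (~784 Hz), as reported for the singing range
interval = semitones(92.5, 784.0)
print(round(interval))  # -> 37
```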
Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis
ERIC Educational Resources Information Center
Rubin, Samuel J.; Abrams, Binyomin
2015-01-01
Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…
Quantitative mass spectrometry: an overview
NASA Astrophysics Data System (ADS)
Urban, Pawel L.
2016-10-01
Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.
Quantitative descriptions of generalized arousal, an elementary function of the vertebrate brain
Quinkert, Amy Wells; Vimal, Vivek; Weil, Zachary M.; Reeke, George N.; Schiff, Nicholas D.; Banavar, Jayanth R.; Pfaff, Donald W.
2011-01-01
We review a concept of the most primitive, fundamental function of the vertebrate CNS, generalized arousal (GA). Three independent lines of evidence indicate the existence of GA: statistical, genetic, and mechanistic. Here we ask, is this concept amenable to quantitative analysis? Answering in the affirmative, four quantitative approaches have proven useful: (i) factor analysis, (ii) information theory, (iii) deterministic chaos, and (iv) application of a Gaussian equation. It strikes us that, to date, not just one but at least four different quantitative approaches seem necessary for describing different aspects of scientific work on GA. PMID:21555568
Electric and Magnetic Interactions
NASA Astrophysics Data System (ADS)
Chabay, Ruth W.; Sherwood, Bruce A.
1994-08-01
The curriculum has been restructured so that students will have the necessary fundamental understanding of charges and fields before going on to more complex issues. Qualitative reasoning and quantitative analysis are discussed equally in order to provide a meaningful conceptual framework within which the quantitative work makes more sense. Atomic-level analysis is stressed and electrostatics and circuits are unified. Desktop experiments can be conducted at home or in the classroom and are tightly integrated with the theoretical treatment.
13C-based metabolic flux analysis: fundamentals and practice.
Yang, Tae Hoon
2013-01-01
Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.
Targeted quantitation of proteins by mass spectrometry.
Liebler, Daniel C; Zimmerman, Lisa J
2013-06-04
Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332
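One common standardization method mentioned above is stable-isotope dilution. A minimal sketch of the underlying arithmetic, with hypothetical peak areas and spike amounts (the function name and numbers are illustrative, not from the review):

```python
def siq_concentration(peak_area_analyte, peak_area_standard, spiked_standard_fmol):
    """Stable-isotope dilution: analyte amount from the MRM peak-area ratio
    to a heavy-labeled internal standard spiked at a known amount."""
    return (peak_area_analyte / peak_area_standard) * spiked_standard_fmol

# Hypothetical MRM transition areas for a light/heavy peptide pair,
# with 50 fmol of heavy standard spiked into the digest
amount = siq_concentration(4.2e5, 2.1e5, 50.0)
print(amount)  # -> 100.0 (fmol of endogenous peptide)
```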
Statistical design of quantitative mass spectrometry-based proteomic experiments.
Oberg, Ann L; Vitek, Olga
2009-05-01
We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
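The replicate-number calculation discussed above can be sketched with the standard normal-approximation sample-size formula for a two-group comparison of means. This is a simplified illustration of the idea, not the authors' exact procedure:

```python
from math import ceil
from statistics import NormalDist

def replicates_per_group(effect, sd, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sample comparison of means
    (normal approximation; effect = true mean difference, sd = per-sample SD)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided type I error
    z_beta = z.inv_cdf(power)            # desired power
    return ceil(2 * ((z_alpha + z_beta) * sd / effect) ** 2)

# Detect a one-SD abundance change with 80% power at alpha = 0.05
print(replicates_per_group(effect=1.0, sd=1.0))  # -> 16
```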
NASA Astrophysics Data System (ADS)
Shaltout, Abdallah A.; Moharram, Mohammed A.; Mostafa, Nasser Y.
2012-01-01
This work is the first attempt to quantify trace elements in the Catha edulis plant (Khat) with a fundamental parameter approach. C. edulis is a well-known drug plant in east Africa and the Arabian Peninsula. We have previously confirmed that hydroxyapatite is one of the main inorganic compounds in the leaves and stalks of C. edulis. Comparable leaves from basil, mint, and green tea were included in the present investigation, and trifolium leaves were included as an unrelated plant. Elemental analyses of the plants were performed by Wavelength Dispersive X-Ray Fluorescence (WDXRF) spectroscopy, and standard-less quantitative WDXRF analysis was carried out based on the fundamental parameter approach. The standard-less analysis algorithms require an accurate determination of the amount of organic material in the sample. A new approach, based on differential thermal analysis, was successfully used for the organic material determination; its results were in good agreement with commonly used methods. Using the developed method, quantitative results for eighteen elements (Al, Br, Ca, Cl, Cu, Fe, K, Na, Ni, Mg, Mn, P, Rb, S, Si, Sr, Ti and Zn) were obtained for each plant. Results for the certified reference material of green tea (NCSZC73014, China National Analysis Center for Iron and Steel, Beijing, China) confirmed the validity of the proposed method.
Quantification of Microbial Phenotypes
Martínez, Verónica S.; Krömer, Jens O.
2016-01-01
Metabolite profiling technologies have improved to generate close to quantitative metabolomics data, which can be employed to quantitatively describe the metabolic phenotype of an organism. Here, we review the current technologies available for quantitative metabolomics, present their advantages and drawbacks, and the current challenges to generate fully quantitative metabolomics data. Metabolomics data can be integrated into metabolic networks using thermodynamic principles to constrain the directionality of reactions. Here we explain how to estimate Gibbs energy under physiological conditions, including examples of the estimations, and the different methods for thermodynamics-based network analysis. The fundamentals of the methods and how to perform the analyses are described. Finally, an example applying quantitative metabolomics to a yeast model by 13C fluxomics and thermodynamics-based network analysis is presented. The example shows that (1) these two methods are complementary to each other; and (2) there is a need to take into account Gibbs energy errors. Better estimations of metabolic phenotypes will be obtained when further constraints are included in the analysis. PMID:27941694
Prediction Analysis for Measles Epidemics
NASA Astrophysics Data System (ADS)
Sumi, Ayako; Ohtomo, Norio; Tanaka, Yukio; Sawamura, Sadashi; Olsen, Lars Folke; Kobayashi, Nobumichi
2003-12-01
A newly devised procedure for prediction analysis, a linearized version of the nonlinear least squares method combined with the maximum entropy spectral analysis method, is proposed. The method was applied to time series of measles case notifications in several communities in the UK, USA, and Denmark. The dominant spectral lines observed in each power spectral density (PSD) can be safely assigned as fundamental periods. The optimum least squares fitting (LSF) curve calculated from these fundamental periods essentially reproduces the underlying variation of the measles data, and an extension of the LSF curve can be used to predict measles case notifications quantitatively. Some discussion, including the predictability of chaotic time series, is presented.
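The core LSF step, fitting sinusoids at known fundamental periods to a time series, can be sketched with a discrete Fourier projection, which gives the least-squares amplitude exactly for whole periods of evenly sampled data. The synthetic "case count" series below is purely illustrative:

```python
import math

def fit_period(x, period):
    """Least-squares amplitude of a sinusoid with a known period, via
    discrete Fourier projection (exact over whole periods of evenly
    sampled data)."""
    n = len(x)
    w = 2 * math.pi / period
    a = 2 / n * sum(v * math.cos(w * t) for t, v in enumerate(x))
    b = 2 / n * sum(v * math.sin(w * t) for t, v in enumerate(x))
    return math.hypot(a, b)  # amplitude of the fitted sinusoid

# Synthetic weekly case counts: mean level 10 plus an annual cycle
# (52-week period) of amplitude 3, over two full years
series = [10 + 3 * math.cos(2 * math.pi * t / 52 + 0.7) for t in range(104)]
amp = fit_period(series, 52)
print(round(amp, 6))  # -> 3.0
```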
Fundamentals of quantitative dynamic contrast-enhanced MR imaging.
Paldino, Michael J; Barboriak, Daniel P
2009-05-01
Quantitative analysis of dynamic contrast-enhanced MR imaging (DCE-MR imaging) has the power to provide information regarding physiologic characteristics of the microvasculature and is, therefore, of great potential value to the practice of oncology. In particular, these techniques could have a significant impact on the development of novel anticancer therapies as a promising biomarker of drug activity. Standardization of DCE-MR imaging acquisition and analysis to provide more reproducible measures of tumor vessel physiology is of crucial importance to realize this potential. The purpose of this article is to review the pathophysiologic basis and technical aspects of DCE-MR imaging techniques.
NASA Astrophysics Data System (ADS)
Chen, Shichao; Zhu, Yizheng
2017-02-01
Sensitivity is a critical index of the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts generalize to other quantitative phase imaging techniques as well.
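The shot-noise-limited sensitivity idea can be illustrated with a Monte Carlo sketch of a generic four-step phase-shifting estimator. This is not the paper's wavelength-shifting derivation; all numbers are illustrative, and Poisson shot noise is approximated as Gaussian with variance equal to the mean (valid for large photon counts):

```python
import math
import random

random.seed(1)

A, V, phi = 2500.0, 0.8, 0.6   # mean photons/frame, fringe visibility, true phase (assumed)
B = V * A                      # fringe modulation amplitude

def noisy(mean):
    # Gaussian approximation to Poisson shot noise for large counts
    return random.gauss(mean, math.sqrt(mean))

def estimate_phase():
    # Four frames with quarter-period phase shifts
    i1, i2, i3, i4 = (noisy(A + B * math.cos(phi + k * math.pi / 2)) for k in range(4))
    return math.atan2(i4 - i2, i1 - i3)

errs = [estimate_phase() - phi for _ in range(20000)]
mean_err = sum(errs) / len(errs)
emp_std = math.sqrt(sum((e - mean_err) ** 2 for e in errs) / len(errs))

theory = math.sqrt(A / 2) / B  # first-order shot-noise prediction, radians
print(emp_std, theory)         # both ~0.018 rad
```

The empirical phase noise tracks the analytic shot-noise prediction, which is the kind of agreement the paper exploits to estimate sensitivity directly from measured data.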
2003-10-01
… made in an ensemble of channels of unknown orientation and number, preventing quantitative analysis. Currents have been compared among continuum PNP … (microfluidic) analysis of ion channels to obtain fundamental insights into the selectivity, conductivity, and sensitivity of ion channels [19], [6] … 1.1 Develop fast and efficient simulators for steady-state analysis of the continuum model for extraction of I-V curves. 1.2 Create …
Fundamentals of Structural Geology
NASA Astrophysics Data System (ADS)
Pollard, David D.; Fletcher, Raymond C.
2005-09-01
Fundamentals of Structural Geology provides a new framework for the investigation of geological structures by integrating field mapping and mechanical analysis. Assuming a basic knowledge of physical geology, introductory calculus and physics, it emphasizes the observational data, modern mapping technology, principles of continuum mechanics, and the mathematical and computational skills, necessary to quantitatively map, describe, model, and explain deformation in Earth's lithosphere. By starting from the fundamental conservation laws of mass and momentum, the constitutive laws of material behavior, and the kinematic relationships for strain and rate of deformation, the authors demonstrate the relevance of solid and fluid mechanics to structural geology. This book offers a modern quantitative approach to structural geology for advanced students and researchers in structural geology and tectonics. It is supported by a website hosting images from the book, additional colour images, student exercises and MATLAB scripts. Solutions to the exercises are available to instructors. The book integrates field mapping using modern technology with the analysis of structures based on complete mechanics. MATLAB is used to visualize physical fields and analytical results, and MATLAB scripts can be downloaded from the website to recreate textbook graphics and enable students to explore their choice of parameters and boundary conditions. The supplementary website hosts color images of outcrop photographs used in the text, supplementary color images, and images of textbook figures for classroom presentations. The textbook website also includes student exercises designed to instill the fundamental relationships and to encourage visualization of the evolution of geological structures; solutions are available to instructors.
Quantitative biology of single neurons
Eberwine, James; Lovatt, Ditte; Buckley, Peter; Dueck, Hannah; Francis, Chantal; Kim, Tae Kyung; Lee, Jaehee; Lee, Miler; Miyashiro, Kevin; Morris, Jacqueline; Peritz, Tiina; Schochet, Terri; Spaethling, Jennifer; Sul, Jai-Yoon; Kim, Junhyong
2012-01-01
The building blocks of complex biological systems are single cells. Fundamental insights gained from single-cell analysis promise to provide the framework for understanding normal biological systems development as well as the limits on systems/cellular ability to respond to disease. The interplay of cells to create functional systems is not well understood. Until recently, the study of single cells has concentrated primarily on morphological and physiological characterization. With the application of new highly sensitive molecular and genomic technologies, the quantitative biochemistry of single cells is now accessible. PMID:22915636
Enríquez-Navas, Pedro M; Guzzi, Cinzia; Muñoz-García, Juan C; Nieto, Pedro M; Angulo, Jesús
2015-01-01
Glycan-receptor interactions are of fundamental relevance for a large number of biological processes, and their kinetic properties (medium-to-weak binding affinities) make them well suited to study by ligand-observed NMR techniques, among which saturation transfer difference (STD) NMR spectroscopy has proven to be a very robust and powerful approach. Quantitative analysis of the results of an STD NMR study of a glycan-receptor interaction is essential for translating the resulting spectral intensities into a 3D molecular model of the complex. This chapter describes how to carry out such a quantitative analysis by means of the Complete Relaxation and Conformational Exchange Matrix Approach for STD NMR (CORCEMA-ST) in general terms, and an example from previous work on an antibody-glycan interaction is also shown.
NASA Astrophysics Data System (ADS)
Shakeel, Hira; Haq, S. U.; Aisha, Ghulam; Nadeem, Ali
2017-06-01
Quantitative analysis of a standard aluminum-silicon alloy has been performed using calibration-free laser-induced breakdown spectroscopy (CF-LIBS). The plasma was produced using the fundamental harmonic (1064 nm) of an Nd:YAG laser, and the emission spectra were recorded at a 3.5 μs detector gate delay. Qualitative analysis of the emission spectra confirms the presence of Mg, Al, Si, Ti, Mn, Fe, Ni, Cu, Zn, Sn, and Pb in the alloy. The background-subtracted and self-absorption-corrected emission spectra were used to estimate the plasma temperature as 10,100 ± 300 K. The plasma temperature and the self-absorption-corrected emission lines of each element were then used to determine the concentration of each species present in the alloy. The use of corrected emission intensities and accurate evaluation of the plasma temperature yield reliable quantitative analysis, with a maximum deviation of 2.2% from the reference sample concentrations.
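Plasma temperature in CF-LIBS is commonly estimated from a Boltzmann plot, where the slope of ln(Iλ/gA) versus upper-level energy equals -1/(kT). A minimal sketch with synthetic line data generated at the temperature reported above (the energies and intercept are illustrative, not measured values):

```python
import math

K_B_EV = 8.617333e-5  # Boltzmann constant, eV/K

def boltzmann_temperature(upper_energies_ev, log_reduced_intensities):
    """Fit ln(I*lambda/(g*A)) vs upper-level energy by least squares;
    the slope equals -1/(k_B * T)."""
    n = len(upper_energies_ev)
    mx = sum(upper_energies_ev) / n
    my = sum(log_reduced_intensities) / n
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(upper_energies_ev, log_reduced_intensities))
    sxx = sum((x - mx) ** 2 for x in upper_energies_ev)
    return -1.0 / (K_B_EV * (sxy / sxx))

# Synthetic lines (upper energies in eV) generated at T = 10,100 K
T_true = 10100.0
energies = [3.14, 4.02, 4.83, 5.62]
ys = [12.0 - e / (K_B_EV * T_true) for e in energies]
T_est = boltzmann_temperature(energies, ys)
print(round(T_est))  # -> 10100
```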
Critiquing qualitative research.
Beck, Cheryl Tatano
2009-10-01
The ability to critique research is a valuable skill that is fundamental to a perioperative nurse's ability to base his or her clinical practice on evidence derived from research. Criteria differ for critiquing a quantitative versus a qualitative study (ie, statistics are evaluated in a quantitative study, but not in a qualitative study). This article provides guidelines for assessing qualitative research. Excerpts from a published qualitative research report are summarized and then critiqued. Questions are provided that help evaluate different sections of a research study (eg, sample, data collection methods, data analysis).
Correlation analysis of the physiological factors controlling fundamental voice frequency.
Atkinson, J E
1978-01-01
A technique has been developed to obtain a quantitative measure of correlation between electromyographic (EMG) activity of various laryngeal muscles, subglottal air pressure, and the fundamental frequency of vibration of the vocal folds (Fo). Data were collected and analyzed on one subject, a native speaker of American English. The results show that an analysis of this type can provide a useful measure of correlation between the physiological and acoustical events in speech and, furthermore, can yield detailed insights into the organization and nature of the speech production process. In particular, based on these results, a model is suggested of Fo control involving laryngeal state functions that seems to agree with present knowledge of laryngeal control and experimental evidence.
Qualitative and quantitative mass spectrometry imaging of drugs and metabolites.
Lietz, Christopher B; Gemperline, Erin; Li, Lingjun
2013-07-01
Mass spectrometric imaging (MSI) has rapidly increased its presence in the pharmaceutical sciences. While quantitative whole-body autoradiography and microautoradiography are the traditional techniques for molecular imaging of drug delivery and metabolism, MSI provides advantageous specificity that can distinguish the parent drug from metabolites and modified endogenous molecules. This review begins with the fundamentals of MSI sample preparation/ionization, and then moves on to both qualitative and quantitative applications with special emphasis on drug discovery and delivery. Cutting-edge investigations on sub-cellular imaging and endogenous signaling peptides are also highlighted, followed by perspectives on emerging technology and the path for MSI to become a routine analysis technique. PMID:23603211
Buenrostro, Jason D.; Chircus, Lauren M.; Araya, Carlos L.; Layton, Curtis J.; Chang, Howard Y.; Snyder, Michael P.; Greenleaf, William J.
2015-01-01
RNA-protein interactions drive fundamental biological processes and are targets for molecular engineering, yet quantitative and comprehensive understanding of the sequence determinants of affinity remains limited. Here we repurpose a high-throughput sequencing instrument to quantitatively measure binding and dissociation of MS2 coat protein to >10^7 RNA targets generated on a flow-cell surface by in situ transcription and inter-molecular tethering of RNA to DNA. We decompose the binding energy contributions from primary and secondary RNA structure, finding that differences in affinity are often driven by sequence-specific changes in association rates. By analyzing the biophysical constraints and modeling mutational paths describing the molecular evolution of MS2 from low- to high-affinity hairpins, we quantify widespread molecular epistasis and a long-hypothesized, structure-dependent preference for G:U base pairs over C:A intermediates in evolutionary trajectories. Our results suggest that quantitative analysis of RNA on a massively parallel array (RNA-MaP) can characterize sequence-function relationships across molecular variants. PMID:24727714
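The link between the measured rates and affinity is the standard kinetic relation Kd = k_off/k_on, with binding free energy ΔG = RT ln Kd. A minimal sketch with hypothetical rate constants (not values from the study):

```python
import math

R_KCAL = 1.987e-3  # gas constant, kcal/(mol*K)

def binding_energy(k_on, k_off, temp_k=298.0):
    """Dissociation constant and standard binding free energy from
    association/dissociation rate constants."""
    kd = k_off / k_on                    # molar
    dg = R_KCAL * temp_k * math.log(kd)  # kcal/mol, relative to 1 M standard state
    return kd, dg

# Hypothetical hairpin variant: k_on = 1e6 /M/s, k_off = 0.01 /s
kd, dg = binding_energy(1e6, 0.01)
print(kd)            # -> 1e-08  (a 10 nM binder)
print(round(dg, 1))  # -> -10.9  (kcal/mol)
```

A sequence change that speeds association (larger k_on) at constant k_off lowers Kd, which is the association-rate-driven affinity difference described above.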
A Quantitative Methodology for Vetting Dark Network Intelligence Sources for Social Network Analysis
2012-06-01
… first algorithm by Erdős and Rényi (Erdős & Rényi, 1959). This earliest algorithm suffers from the fact that its degree distribution is not scale-… Reference: Erdős, P., & Rényi, A. (1959). On random graphs. Publicationes Mathematicae, 6, 290-297.
Teaching Expression Proteomics: From the Wet-Lab to the Laptop
ERIC Educational Resources Information Center
Teixeira, Miguel C.; Santos, Pedro M.; Rodrigues, Catarina; Sa-Correia, Isabel
2009-01-01
Expression proteomics has become, in recent years, a key genome-wide expression approach in fundamental and applied life sciences. This postgenomic technology aims the quantitative analysis of all the proteins or protein forms (the so-called proteome) of a given organism in a given environmental and genetic context. It is a challenge to provide…
System safety education focused on flight safety
NASA Technical Reports Server (NTRS)
Holt, E.
1971-01-01
The measures necessary for achieving higher levels of system safety are analyzed with an eye toward maintaining the combat capability of the Air Force. Several education courses were provided for personnel involved in safety management. Data include: (1) Flight Safety Officer Course, (2) Advanced Safety Program Management, (3) Fundamentals of System Safety, and (4) Quantitative Methods of Safety Analysis.
Quantitative risk assessment system (QRAS)
NASA Technical Reports Server (NTRS)
Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)
2001-01-01
A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
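The core quantitative step described above, computing and ranking scenario risks from an event-sequence structure, reduces to multiplying an initiating-event frequency by the failure probabilities of each pivotal event along one path. A toy sketch; the scenario names and numbers are hypothetical, not part of QRAS:

```python
def scenario_probability(initiator_freq, *branch_failure_probs):
    """Point estimate for one event-sequence path: initiating-event
    frequency times the failure probability of each pivotal event."""
    p = initiator_freq
    for q in branch_failure_probs:
        p *= q
    return p

# Hypothetical scenarios: (initiator frequency per mission, pivotal-event
# failure probabilities along the path)
scenarios = {
    "engine shutdown, backup fails": scenario_probability(1e-3, 0.05),
    "leak, isolation valve fails":   scenario_probability(2e-4, 0.10),
}

# Rank scenarios by contribution to mission risk, highest first
for name, p in sorted(scenarios.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {p:.1e}")
```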
NASA Astrophysics Data System (ADS)
Melelli, Laura; Liucci, Luisa; Vergari, Francesca; Ciccacci, Sirio; Del Monte, Maurizio
2014-05-01
Drainage basins are primary landscape units for geomorphological investigations, and both hillslopes and the river drainage network are fundamental components of drainage-basin analysis. Like other geomorphological systems, drainage basins tend toward an equilibrium condition in which the sequence of erosion, transport and sedimentation approaches a state of minimum energy expenditure. This state is revealed by a characteristic geometry of landforms and of the drainage net. Several morphometric indexes can measure how far a drainage basin is from the theoretical equilibrium configuration, revealing possible external disturbance. In tectonically active areas, drainage basins are of primary importance for highlighting the style, amount and rate of tectonic impulses, and morphometric indexes allow the tectonic activity of different sectors of a study area to be classified. Moreover, drainage networks are characterized by a self-similar structure, which promotes the use of fractal theory to investigate the system. In this study, fractal techniques are employed together with quantitative geomorphological analysis to study the Upper Tiber Valley (UTV), a tectonic intermontane basin located in the northern Apennines (Umbria, central Italy). The area is the result of different tectonic phases; from the Late Pliocene to the present, the UTV has been strongly controlled by regional uplift and by an extensional phase in which different sets of normal faults have played a fundamental role in basin morphology. Thirty-four basins are taken into account for the quantitative analysis, twenty on the left side of the valley and the remaining fourteen on the right. Using the fractal dimension of the drainage networks, results from Horton's laws, concavity and steepness indexes, and hypsometric curves, this study aims to obtain an evolutionary model of the UTV in which regional uplift is compared with local subsidence induced by normal-fault activity.
The results highlight a well-defined difference between the western and eastern tributary basins, suggesting greater disequilibrium in the latter. The quantitative analysis also identifies the segments of the basin boundaries where fault activity is most efficient, along with the resulting geomorphological implications.
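The fractal dimension of a drainage network mentioned above is typically estimated by box counting over the digitized network. A minimal sketch of that estimator, assuming the network is available as a point set; the abstract does not specify the authors' algorithm, and the box sizes and test data below are illustrative:

```python
import math

def box_count(points, box_size):
    """Count boxes of side box_size containing at least one network point."""
    occupied = {(int(x // box_size), int(y // box_size)) for x, y in points}
    return len(occupied)

def fractal_dimension(points, box_sizes):
    """Estimate D as the least-squares slope of log N(s) vs log(1/s)."""
    xs = [math.log(1.0 / s) for s in box_sizes]
    ys = [math.log(box_count(points, s)) for s in box_sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope

# Sanity check: a straight channel should have dimension close to 1,
# while a space-filling network approaches 2.
line = [(i * 0.01, i * 0.01) for i in range(1000)]
D = fractal_dimension(line, [0.5, 0.25, 0.125, 0.0625])
```

Real drainage networks give intermediate values, and departures between sub-basins are what the study uses as an equilibrium diagnostic.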
Tannin structural elucidation and quantitative ³¹P NMR analysis. 1. Model compounds.
Melone, Federica; Saladino, Raffaele; Lange, Heiko; Crestini, Claudia
2013-10-02
Tannins and flavonoids are secondary metabolites of plants that display a wide array of biological activities. This activity is related to the inhibition of extracellular enzymes, which occurs through the complexation of peptides by tannins. Not only the nature of these interactions but, more fundamentally, the structure of these heterogeneous polyphenolic molecules is not completely clear. This first paper describes the development of a new analytical method for the structural characterization of tannins, based on tannin model compounds, that employs in situ labeling of all labile H groups (aliphatic OH, phenolic OH, and carboxylic acids) with a phosphorus reagent. ³¹P NMR analysis of the ³¹P-labeled samples allowed the unprecedented quantitative and qualitative structural characterization of hydrolyzable tannins, proanthocyanidins, and catechin tannin model compounds, forming the foundation for the quantitative structural elucidation of a variety of actual tannin samples described in part 2 of this series.
Secondary Ion Mass Spectrometry SIMS XI
NASA Astrophysics Data System (ADS)
Gillen, G.; Lareau, R.; Bennett, J.; Stevie, F.
2003-05-01
This volume contains 252 contributions presented as plenary, invited and contributed poster and oral presentations at the 11th International Conference on Secondary Ion Mass Spectrometry (SIMS XI), held at the Hilton Hotel, Walt Disney World Village, Orlando, Florida, 7-12 September 1997. The book covers a diverse range of research, reflecting the rapid growth in advanced semiconductor characterization, ultra-shallow depth profiling, TOF-SIMS and the new areas in which SIMS techniques are being used, for example in the biological sciences and organic surface characterization. Papers are presented under the following categories: Isotopic SIMS; Biological SIMS; Semiconductor Characterization Techniques and Applications; Ultra-Shallow Depth Profiling; Depth Profiling Fundamentals/Modelling and Diffusion; Sputter-Induced Topography; Fundamentals of Molecular Desorption; Organic Materials; Practical TOF-SIMS; Polyatomic Primary Ions; Materials/Surface Analysis; Postionization; Instrumentation; Geological SIMS; Imaging; Fundamentals of Sputtering; Ion Formation and Cluster Formation; Quantitative Analysis; Environmental/Particle Characterization; Related Techniques. These proceedings provide an invaluable source of reference for both newcomers to the field and experienced SIMS users.
Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang
2016-08-01
Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. Assessment techniques have developed from statistical analysis of the urban-scale thermal environment based on historical weather-station data toward dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing and numerical simulation. Moreover, the potential advantages and disadvantages, the applicability and the development trends of these techniques were summarized, aiming to provide a fundamental basis for understanding urban thermal environment assessment and optimization.
Machado, Ana S; Darmohray, Dana M; Fayad, João; Marques, Hugo G; Carey, Megan R
2015-01-01
The coordination of movement across the body is a fundamental, yet poorly understood aspect of motor control. Mutant mice with cerebellar circuit defects exhibit characteristic impairments in locomotor coordination; however, the fundamental features of this gait ataxia have not been effectively isolated. Here we describe a novel system (LocoMouse) for analyzing limb, head, and tail kinematics of freely walking mice. Analysis of visibly ataxic Purkinje cell degeneration (pcd) mice reveals that while differences in the forward motion of individual paws are fully accounted for by changes in walking speed and body size, more complex 3D trajectories and, especially, inter-limb and whole-body coordination are specifically impaired. Moreover, the coordination deficits in pcd are consistent with a failure to predict and compensate for the consequences of movement across the body. These results isolate specific impairments in whole-body coordination in mice and provide a quantitative framework for understanding cerebellar contributions to coordinated locomotion. DOI: http://dx.doi.org/10.7554/eLife.07892.001 PMID:26433022
Lackey, Daniel P; Carruth, Eric D; Lasher, Richard A; Boenisch, Jan; Sachse, Frank B; Hitchcock, Robert W
2011-11-01
Gap junctions play a fundamental role in intercellular communication in cardiac tissue. Various types of heart disease including hypertrophy and ischemia are associated with alterations of the spatial arrangement of gap junctions. Previous studies applied two-dimensional optical and electron microscopy to visualize gap junction arrangements. In normal cardiomyocytes, gap junctions were primarily found at cell ends, but were also found in more central regions. In this study, we extended these approaches toward three-dimensional reconstruction of gap junction distributions based on high-resolution scanning confocal microscopy and image processing. We developed methods for quantitative characterization of gap junction distributions based on analysis of intensity profiles along the principal axes of myocytes. The analyses characterized gap junction polarization at cell ends and higher-order statistical image moments of intensity profiles. The methodology was tested in rat ventricular myocardium. Our analysis yielded novel quantitative data on gap junction distributions. In particular, the analysis demonstrated that the distributions exhibit significant variability with respect to polarization, skewness, and kurtosis. We suggest that this methodology provides a quantitative alternative to current approaches based on visual inspection, with applications in particular in characterization of engineered and diseased myocardium. Furthermore, we propose that these data provide improved input for computational modeling of cardiac conduction.
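The higher-order moments used above can be computed directly from a 1-D intensity profile. A sketch under the assumption that "image moments" means intensity-weighted moments of pixel position along the cell axis (the abstract does not give the exact definition, and the profile below is synthetic):

```python
def profile_moments(profile):
    """Intensity-weighted mean, SD, skewness and excess kurtosis of pixel
    position along the cell axis (an assumed reading of 'image moments')."""
    total = sum(profile)
    xs = range(len(profile))
    mean = sum(x * w for x, w in zip(xs, profile)) / total
    var = sum(w * (x - mean) ** 2 for x, w in zip(xs, profile)) / total
    sd = var ** 0.5
    skew = sum(w * ((x - mean) / sd) ** 3 for x, w in zip(xs, profile)) / total
    kurt = sum(w * ((x - mean) / sd) ** 4 for x, w in zip(xs, profile)) / total - 3.0
    return mean, sd, skew, kurt

# A profile polarized toward the cell ends (bright at both ends) is
# symmetric (skew ~ 0) but platykurtic (negative excess kurtosis).
end_polarized = [9, 1, 1, 1, 1, 1, 1, 1, 1, 9]
m, sd, skew, kurt = profile_moments(end_polarized)
```

Asymmetric remodeling of gap junctions would show up as nonzero skewness, and loss of end-polarization as a change in kurtosis.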
In silico quantitative structure-toxicity relationship study of aromatic nitro compounds.
Pasha, Farhan Ahmad; Neaz, Mohammad Morshed; Cho, Seung Joo; Ansari, Mohiuddin; Mishra, Sunil Kumar; Tiwari, Sharvan
2009-05-01
Small molecules often have toxicities that are a function of molecular structural features, and minor variations in those features can make a large difference in toxicity. Consequently, in silico techniques may be used to correlate such molecular toxicities with structural features. For nine different sets of aromatic nitro compounds with known observed toxicities against different targets, we developed ligand-based 2D quantitative structure-toxicity relationship models using 20 selected topological descriptors. Topological descriptors have several advantages, such as conformational independence and facile, less time-consuming computation, while still yielding good results. Multiple linear regression analysis was used to correlate variations in toxicity with molecular properties. The information index on molecular size, the lopping centric index and the Kier flexibility index were identified as fundamental descriptors for different kinds of toxicity, further showing that molecular size, branching and molecular flexibility may be particularly important factors in quantitative structure-toxicity relationship analysis. This study revealed that topological descriptor-guided quantitative structure-toxicity relationships provide a useful, cost- and time-efficient in silico tool for describing small-molecule toxicities.
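The multiple linear regression step used above can be sketched with the normal equations; the descriptor values and coefficients below are hypothetical, purely to show the fitting mechanics, not data from the study:

```python
def mlr_fit(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y, solved
    with Gaussian elimination. Each row of X starts with 1 (intercept)."""
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(n)]
    for col in range(n):                      # forward elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n                          # back substitution
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

# Hypothetical: toxicity = 0.5 + 2*size_index - 1*flexibility_index
X = [[1, s, f] for s in (1, 2, 3, 4) for f in (0, 1, 2)]
y = [0.5 + 2 * s - 1 * f for _, s, f in X]
coef = mlr_fit(X, y)
```

With real descriptor data the fit is not exact, and the regression statistics (r², F) judge model quality.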
ERIC Educational Resources Information Center
Muir, Carrie
2012-01-01
The purpose of this study was to compare the performance of first year college students with similar high school mathematics backgrounds in two introductory level college mathematics courses, "Fundamentals and Techniques of College Algebra and Quantitative Reasoning and Mathematical Skills," and to compare the performance of students…
Sumi, A; Luo, T; Zhou, D; Yu, B; Kong, D; Kobayashi, N
2013-05-01
Viral hepatitis is recognized as one of the most frequently reported diseases; in China especially, acute and chronic liver disease due to viral hepatitis has been a major public health problem. The present study aimed to analyse and predict surveillance data on infections of hepatitis A, B, C and E in Wuhan, China, by the method of time-series analysis (MemCalc, Suwa-Trast, Japan). On the basis of spectral analysis, fundamental modes explaining the underlying variation of the data for the years 2004-2008 were identified. A model calculated from these fundamental modes reproduced the underlying variation of the data well, and an extension of the model to the year 2009 predicted the data quantitatively. Our study suggests that the present method allows the temporal pattern of viral hepatitis epidemics to be modelled much more effectively than the artificial neural network used previously.
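The mode-fitting idea above, fitting periodic fundamental modes to surveillance data and extrapolating forward, can be sketched for evenly sampled data spanning whole periods, where the Fourier projections give the coefficients in closed form. MemCalc's actual MEM-based procedure is more elaborate, and the 12-month mode and counts below are illustrative:

```python
import math

def fit_modes(y, periods):
    """Least-squares fit of y(t) = mean + sum_k [a_k cos(2*pi*t/T_k) +
    b_k sin(2*pi*t/T_k)] for evenly sampled data covering whole periods."""
    n = len(y)
    mean = sum(y) / n
    coeffs = []
    for T in periods:
        a = 2.0 / n * sum(v * math.cos(2 * math.pi * t / T) for t, v in enumerate(y))
        b = 2.0 / n * sum(v * math.sin(2 * math.pi * t / T) for t, v in enumerate(y))
        coeffs.append((a, b))
    return mean, coeffs

def predict(t, mean, periods, coeffs):
    """Evaluate the fitted model at (possibly future) time t."""
    return mean + sum(a * math.cos(2 * math.pi * t / T) +
                      b * math.sin(2 * math.pi * t / T)
                      for T, (a, b) in zip(periods, coeffs))

# Synthetic monthly counts with an annual mode over 60 months ("2004-2008")
y = [100 + 30 * math.cos(2 * math.pi * t / 12) for t in range(60)]
mean, coeffs = fit_modes(y, [12])
forecast = predict(60, mean, [12], coeffs)   # extrapolate into month 61 ("2009")
```

Several periods can be passed at once to capture both annual and sub-annual modes identified by the spectral analysis.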
The fundamental parameter method applied to X-ray fluorescence analysis with synchrotron radiation
NASA Astrophysics Data System (ADS)
Pantenburg, F. J.; Beier, T.; Hennrich, F.; Mommsen, H.
1992-05-01
Quantitative X-ray fluorescence analysis applying the fundamental parameter method is usually restricted to monochromatic excitation sources. It is shown here that such analyses can be performed as well with a white synchrotron radiation spectrum. To determine absolute elemental concentration values it is necessary to know the spectral distribution of this spectrum. A newly designed and tested experimental setup, which uses the synchrotron radiation emitted from electrons in a bending magnet of ELSA (the electron stretcher accelerator of the University of Bonn), is presented. The determination of the exciting spectrum, described by the given electron beam parameters, is limited by uncertainties in the vertical electron beam size and divergence. We describe a method that allows us to determine the relative and absolute spectral distributions needed for accurate analysis. First test measurements of different alloys and standards of known composition demonstrate that it is possible to determine exact concentration values in bulk and trace element analysis.
Quadrupole Ion Mass Spectrometer for Masses of 2 to 50 Da
NASA Technical Reports Server (NTRS)
Helms, William; Griffin, Timothy P.; Ottens, Andrew; Harrison, Willard
2005-01-01
A customized quadrupole ion-trap mass spectrometer (QITMS) has been built to satisfy a need for a compact, rugged instrument for measuring small concentrations of hydrogen, helium, oxygen, and argon in a nitrogen atmosphere. This QITMS can also be used to perform quantitative analyses of other gases within its molecular-mass range, which is 2 to 50 daltons (Da). (More precisely, it can be used to perform quantitative analysis of gases that, when ionized, are characterized by m/Z ratios between 2 and 50, where m is the mass of an ion in daltons and Z is the number of fundamental electric charges on the ion.)
Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.
Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young
2016-01-01
We integrated visual and quantitative methods for analyzing the stability of respiration using four methods: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline position. Visual grading was done by inspecting the shape of the diagram and classified into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), by measuring the height of the fundamental peak (A1) in the Fourier spectrum, or by calculating the difference between maximal upward and maximal downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff values were 0.11 for SDx (AUC: 0.982, p < 0.0001), 0.062 for SDv (AUC: 0.847, p < 0.0001), 0.117 for A1 (AUC: 0.876, p < 0.0001), and 0.349 for MUD-MDD (AUC: 0.948, p < 0.0001). This is the first study to analyze multiple aspects of respiration using various mathematical constructs; it provides quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular from irregular respiration.
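One plausible reading of the Poincaré-map indices SDx and SDv above is the spread of once-per-cycle samples of the (position, velocity) trajectory of the respiratory trace. A sketch under that assumption (the exact sectioning used in the study may differ, and the signal below is synthetic):

```python
import math
import statistics

def poincare_sd(signal, period):
    """Sample the (x, v) trajectory once per nominal breathing cycle and
    return the standard deviations of the two coordinates. Velocity is
    approximated by first differences of the sampled signal."""
    v = [signal[i + 1] - signal[i] for i in range(len(signal) - 1)]
    xs = [signal[i] for i in range(0, len(v), period)]
    vs = [v[i] for i in range(0, len(v), period)]
    return statistics.pstdev(xs), statistics.pstdev(vs)

# Perfectly regular breathing: samples at a fixed phase coincide,
# so both spreads collapse toward zero.
regular = [math.sin(2 * math.pi * t / 40) for t in range(400)]
sdx, sdv = poincare_sd(regular, 40)
```

Irregular breathing scatters the per-cycle samples, inflating SDx and SDv past cutoffs like the 0.11/0.062 values reported above.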
NASA Astrophysics Data System (ADS)
Egan, James; McMillan, Normal; Denieffe, David
2011-08-01
Proposals for a review of the limits of measurement for telecommunications are made. The measures are based on adapting work from the area of chemical metrology to the field of telecommunications. Currie has introduced recommendations for defining the limits of measurement in chemical metrology and has identified three key fundamental limits of measurement: the critical level, the detection limit and the determination limit. Measurements on an optical system are used to illustrate the utility of these measures, and the advantages of using these fundamental limits over existing methods are discussed.
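Currie's three limits have standard closed forms for a measurement dominated by blank noise, taking 5% false-positive and false-negative rates and a 10% relative standard deviation at the quantitation limit; a minimal sketch (the blank standard deviation below is illustrative):

```python
def currie_limits(sigma_blank):
    """Currie's measurement limits, in the units of the measured quantity:
      critical level      L_C = 1.645 * sigma  (decision threshold, 5% alpha)
      detection limit     L_D = 3.29  * sigma  (5% alpha and 5% beta)
      determination limit L_Q = 10    * sigma  (10% relative SD)
    sigma_blank is the standard deviation of the blank (noise-only) signal."""
    return 1.645 * sigma_blank, 3.29 * sigma_blank, 10.0 * sigma_blank

# A received optical power measurement with blank noise sigma = 0.2 (arb. units)
lc, ld, lq = currie_limits(0.2)
```

Signals below L_C are indistinguishable from noise, signals above L_D are reliably detected, and quantitative reporting is only justified above L_Q.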
Using normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain and has been broadly used in diagnosing brain disorders by clinical quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from the corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Standardizing the analysis procedure is therefore fundamental to improving analysis outcomes. In this paper, we propose and evaluate a normalization procedure with a standard 3D brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied to realign functional medical images to standard structural medical images. The standard 3D brain model, which shows well-defined brain regions, was then used to replace the manual ROIs in objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method agree with the clinical diagnostic evaluation scores with less than 3% error on average. In summary, the method obtains precise VOI information automatically from the well-defined standard 3D brain model, sparing the traditional procedure of manually drawing ROIs slice by slice on structural medical images. That is, the method not only provides precise analysis results, but also improves the processing rate for large volumes of medical images in clinical practice.
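The mutual information objective used for the registration step above can be computed from a joint grey-level histogram of the two images. A minimal sketch on flattened pixel lists; real registration maximizes this score over spatial transforms, and the bin count and test images here are illustrative:

```python
import math
from collections import Counter

def mutual_information(img_a, img_b, bins=8):
    """Mutual information (nats) between two equal-size images, estimated
    from a binned joint grey-level histogram."""
    def bin_of(v, lo, hi):
        return min(bins - 1, int((v - lo) / (hi - lo + 1e-12) * bins))
    la, ha = min(img_a), max(img_a)
    lb, hb = min(img_b), max(img_b)
    joint = Counter((bin_of(a, la, ha), bin_of(b, lb, hb))
                    for a, b in zip(img_a, img_b))
    n = len(img_a)
    pa, pb = Counter(), Counter()
    for (i, j), c in joint.items():
        pa[i] += c
        pb[j] += c
    mi = 0.0
    for (i, j), c in joint.items():
        pij = c / n
        mi += pij * math.log(pij / (pa[i] / n * pb[j] / n))
    return mi

# An image is maximally informative about itself; MI drops for an unrelated one.
a = [i % 16 for i in range(256)]
b = [(i * 7) % 13 for i in range(256)]
mi_self = mutual_information(a, a)
mi_other = mutual_information(a, b)
```

During registration, the transform that brings the functional image into alignment with the structural image is the one maximizing this quantity.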
[Acoustic voice analysis using the Praat program: comparative study with the Dr. Speech program].
Núñez Batalla, Faustino; González Márquez, Rocío; Peláez González, M Belén; González Laborda, Irene; Fernández Fernández, María; Morato Galán, Marta
2014-01-01
The European Laryngological Society (ELS) basic protocol for functional assessment of voice pathology includes 5 different approaches: perception, videostroboscopy, acoustics, aerodynamics and subjective rating by the patient. In this study we focused on acoustic voice analysis. The purpose of the present study was to correlate the results obtained by the commercial software Dr. Speech and the free software Praat in 2 fields: 1. Narrow-band spectrogram (the presence of noise according to Yanagihara, and the presence of subharmonics) (semi-quantitative). 2. Voice acoustic parameters (jitter, shimmer, harmonics-to-noise ratio, fundamental frequency) (quantitative). We studied a total of 99 voice samples from individuals with Reinke's oedema diagnosed using videostroboscopy. One independent observer used Dr. Speech 3.0 and a second one used the Praat program (Phonetic Sciences, University of Amsterdam). The spectrographic analysis consisted of obtaining a narrow-band spectrogram from the previous digitalised voice samples by the 2 independent observers. They then determined the presence of noise in the spectrogram, using the Yanagihara grades, as well as the presence of subharmonics. As a final result, the acoustic parameters of jitter, shimmer, harmonics-to-noise ratio and fundamental frequency were obtained from the 2 acoustic analysis programs. The results indicated that the sound spectrogram and the numerical values obtained for shimmer and jitter were similar for both computer programs, even though types 1, 2 and 3 voice samples were analysed. The Praat and Dr. Speech programs provide similar results in the acoustic analysis of pathological voices.
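The jitter and shimmer parameters compared above have simple "local" definitions over extracted glottal periods and peak amplitudes; a sketch of those formulas (the period extraction itself, which Praat and Dr. Speech perform internally, is omitted, and the values below are synthetic):

```python
def jitter_local(periods):
    """Local jitter (%): mean absolute difference between consecutive
    glottal periods, relative to the mean period."""
    diffs = [abs(a - b) for a, b in zip(periods, periods[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(periods) / len(periods))

def shimmer_local(amplitudes):
    """Local shimmer (%): the same measure applied to cycle peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    return 100.0 * (sum(diffs) / len(diffs)) / (sum(amplitudes) / len(amplitudes))

# A perfectly periodic voice: zero jitter, and F0 is the reciprocal mean period.
periods = [0.008] * 10                       # 8 ms glottal periods
f0 = 1.0 / (sum(periods) / len(periods))     # fundamental frequency, Hz
j = jitter_local(periods)
```

Pathological voices such as those with Reinke's oedema show elevated jitter and shimmer, which is why both programs report them.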
Analysis of High-Throughput ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Zangar, Richard C.
Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
Yan, Xu; Bishop, David J.
2018-01-01
Gene expression analysis by quantitative PCR in skeletal muscle is routine in exercise studies. The reproducibility and reliability of the data fundamentally depend on how the experiments are performed and interpreted. Despite the popularity of the assay, there is considerable variation in experimental protocols and data analyses between laboratories, and there is a lack of consistent quality control throughout the assay. In this study, we present a number of experiments on various steps of the quantitative PCR workflow and demonstrate how to perform a quantitative PCR experiment with human skeletal muscle samples in an exercise study. We also tested some common mistakes in performing qPCR. Interestingly, we found that mishandling of muscle for a short time span (10 min) before RNA extraction did not affect RNA quality, and isolated total RNA was preserved for up to one week at room temperature. As demonstrated by our data, the use of unstable reference genes leads to substantial differences in the final results. Alternatively, cDNA content can be used for data normalisation; however, complete removal of RNA from cDNA samples is essential for obtaining accurate cDNA content. PMID:29746477
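Reference-gene normalisation of the kind discussed above is commonly done with the 2^-ΔΔCt method; a sketch with hypothetical Ct values (this assumes ~100% amplification efficiency, which real assays only approximate, and the study itself compares several normalisation strategies rather than prescribing this one):

```python
def delta_delta_ct(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by 2^-ddCt: the target Ct is normalised to a
    reference gene, then to a control (e.g. pre-exercise) sample."""
    ddct = (ct_target - ct_ref) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)

# Target crosses threshold 2 cycles earlier post-exercise while the
# reference gene is stable: a 4-fold induction.
fold = delta_delta_ct(ct_target=22.0, ct_ref=18.0,
                      ct_target_ctrl=24.0, ct_ref_ctrl=18.0)
```

An unstable reference gene shifts ct_ref between conditions and directly distorts the fold change, which is the failure mode the study quantifies.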
Photoelectron Spectroscopy for Identification of Chemical States
NASA Technical Reports Server (NTRS)
Novakov, T.
1971-01-01
The technique of X-ray photoelectron spectroscopy and the fundamental electronic interactions constituting the basis of the method will be discussed. The method provides information about chemical states ("oxidation states") of atoms in molecules. In addition, quantitative elemental analysis can be performed using the same method. On the basis of this information identification of chemical species is possible. Examples of applications are discussed with particular references to the study of smog particulate matter.
Mathematics of quantitative kinetic PCR and the application of standard curves.
Rutledge, R G; Côté, C
2003-08-15
Fluorescent monitoring of DNA amplification is the basis of real-time PCR, from which target DNA concentration can be determined from the fractional cycle at which a threshold amount of amplicon DNA is produced. Absolute quantification can be achieved using a standard curve constructed by amplifying known amounts of target DNA. In this study, the mathematics of quantitative PCR are examined in detail, from which several fundamental aspects of the threshold method and the application of standard curves are illustrated. The construction of five replicate standard curves for two pairs of nested primers was used to examine the reproducibility and degree of quantitative variation using SYBR Green I fluorescence. Based upon this analysis, the application of a single, well-constructed standard curve could provide an estimated precision of +/-6-21%, depending on the number of cycles required to reach threshold. A simplified method for absolute quantification is also proposed, in which quantitative scale is determined by DNA mass at threshold.
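The standard-curve method described above amounts to a linear regression of threshold cycle against log10 input quantity, inverted to quantify unknowns. A sketch with an idealized dilution series (real SYBR Green data, as the study shows, carry replicate scatter that sets the +/-6-21% precision):

```python
import math

def fit_standard_curve(quantities, cts):
    """Regress Ct on log10(input quantity); returns (slope, intercept).
    For a perfectly efficient reaction the slope is -log2(10) ~ -3.32
    cycles per decade of template."""
    xs = [math.log10(q) for q in quantities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cts)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert the standard curve to estimate input quantity from a Ct."""
    return 10.0 ** ((ct - intercept) / slope)

# Ideal 10-fold dilution series (efficiency E = 1)
quantities = [1e6, 1e5, 1e4, 1e3]
cts = [15.0 + k * math.log2(10) for k in range(4)]
slope, intercept = fit_standard_curve(quantities, cts)
estimate = quantify(cts[1], slope, intercept)
```

Slopes shallower than -3.32 indicate sub-unity amplification efficiency, one of the fundamental aspects of the threshold method the study examines.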
To label or not to label: applications of quantitative proteomics in neuroscience research.
Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W
2012-02-01
Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside.
NASA Astrophysics Data System (ADS)
Baer, E. M.; Whittington, C.; Burn, H.
2008-12-01
The geological sciences are fundamentally quantitative. However, the diversity of students' mathematical preparation and skills makes the successful use of quantitative concepts difficult in introductory level classes. At Highline Community College, we have implemented a one-credit co-requisite course to give students supplemental instruction for quantitative skills used in the course. The course, formally titled "Quantitative Geology," nicknamed "MathPatch," runs parallel to our introductory Physical Geology course. MathPatch teaches the quantitative skills required for the geology class right before they are needed. Thus, students learn only the skills they need and are given opportunities to apply them immediately. Topics include complex-graph reading, unit conversions, large numbers, scientific notation, scale and measurement, estimation, powers of 10, and other fundamental mathematical concepts used in basic geological concepts. Use of this course over the past 8 years has successfully accomplished the goals of increasing students' quantitative skills, success and retention. Students master the quantitative skills to a greater extent than before the course was implemented, and less time is spent covering basic quantitative skills in the classroom. Because the course supports the use of quantitative skills, the large number of faculty that teach Geology 101 are more comfortable in using quantitative analysis, and indeed see it as an expectation of the course at Highline. Also significant, retention in the geology course has increased substantially, from 75% to 85%. Although successful, challenges persist with requiring MathPatch as a supplementary course. One, we have seen enrollments decrease in Geology 101, which may be the result of adding this co-requisite. Students resist mandatory enrollment in the course, although they are not good at evaluating their own need for the course. 
The logistics of using MathPatch in an evening class with fewer, longer class meetings have been challenging. Finally, in order to better serve our students' needs, we began to offer online sections of MathPatch; this mode of instruction is not as clearly effective, although it is very popular. Through the new The Math You Need project, we hope to improve the effectiveness of the online instruction so that it can provide results comparable to the face-to-face sections of this class.
Bian, Xihui; Li, Shujuan; Lin, Ligang; Tan, Xiaoyao; Fan, Qingjie; Li, Ming
2016-06-21
Accurate model prediction is fundamental to the successful analysis of complex samples. To exploit the abundant information embedded in both the frequency and time domains, a novel regression model is presented for the quantitative analysis of hydrocarbon contents in fuel oil samples. The proposed method, named high- and low-frequency unfolded PLSR (HLUPLSR), integrates empirical mode decomposition (EMD) and an unfolding strategy with partial least squares regression (PLSR). In the proposed method, the original signals are first decomposed by EMD into a finite number of intrinsic mode functions (IMFs) and a residue. Second, the former (high-frequency) IMFs are summed into a high-frequency matrix, and the latter IMFs and the residue are summed into a low-frequency matrix. Finally, the two matrices are unfolded into an extended matrix along the variable dimension, and a PLSR model is built between the extended matrix and the target values. Coupled with ultraviolet (UV) spectroscopy, HLUPLSR has been applied to determine the hydrocarbon contents of light gas oil and diesel fuel samples. Compared with single PLSR and other signal-processing techniques, the proposed method shows superior prediction ability and better model interpretation. HLUPLSR therefore provides a promising tool for the quantitative analysis of complex samples.
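The high/low-frequency unfolding step can be illustrated with a crude stand-in for EMD: here a moving average plays the role of the summed low-frequency IMFs, purely for illustration (the paper uses EMD proper, and the spectrum below is synthetic):

```python
def split_high_low(spectrum, window=5):
    """Split a spectrum into a smooth low-frequency part (moving average)
    and a high-frequency residual. A crude stand-in for summing EMD IMFs."""
    half = window // 2
    low = []
    for i in range(len(spectrum)):
        seg = spectrum[max(0, i - half):i + half + 1]
        low.append(sum(seg) / len(seg))
    high = [v - l for v, l in zip(spectrum, low)]
    return high, low

def unfold(spectra, window=5):
    """Concatenate the high- and low-frequency parts along the variable
    axis, doubling the number of predictors fed to the regression model."""
    rows = []
    for s in spectra:
        high, low = split_high_low(s, window)
        rows.append(high + low)
    return rows

spectrum = [float(i % 7) for i in range(20)]
high, low = split_high_low(spectrum)
row = unfold([spectrum])[0]      # 20 variables become 40
```

The extended matrix lets the regression weight sharp absorption features and broad baseline structure independently, which is the claimed source of HLUPLSR's improved interpretability.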
Principles of Metamorphic Petrology
NASA Astrophysics Data System (ADS)
Williams, Michael L.
2009-05-01
The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.
The phylogeny of swimming kinematics: The environment controls flagellar waveforms in sperm motility
NASA Astrophysics Data System (ADS)
Guasto, Jeffrey; Burton, Lisa; Zimmer, Richard; Hosoi, Anette; Stocker, Roman
2013-11-01
In recent years, phylogenetic and molecular analyses have dominated the study of ecology and evolution. However, physical interactions between organisms and their environment, a fundamental determinant of organism ecology and evolution, are mediated by organism form and function, highlighting the need to understand the mechanics of basic survival strategies, including locomotion. Focusing on spermatozoa, we combined high-speed video microscopy and singular value decomposition analysis to quantitatively compare the flagellar waveforms of eight species, ranging from marine invertebrates to humans. We found striking similarities in sperm swimming kinematics between genetically dissimilar organisms, which could not be uncovered by phylogenetic analysis. The emergence of dominant waveform patterns across species is suggestive of biological optimization for flagellar locomotion and points toward environmental cues as drivers of this convergence. These results reinforce the power of quantitative kinematic analysis to understand the physical drivers of evolution and as an approach to uncover new solutions for engineering applications, such as micro-robotics.
General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.
Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng
2017-05-02
As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and a unified description for the nonlinear behaviour and the systematic and quantitative analysis of the underlying mechanisms of these lasers have not been developed. Here, we present a complexity science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system designs of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.
Mast, Fred D.; Ratushny, Alexander V.
2014-01-01
Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336
New EVSE Analytical Tools/Models: Electric Vehicle Infrastructure Projection Tool (EVI-Pro)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric W; Rames, Clement L; Muratori, Matteo
This presentation addresses the fundamental question of how much charging infrastructure is needed in the United States to support PEVs. It complements ongoing EVSE initiatives by providing a comprehensive analysis of national PEV charging infrastructure requirements. The result is a quantitative estimate for a U.S. network of non-residential (public and workplace) EVSE that would be needed to support broader PEV adoption. The analysis provides guidance to public and private stakeholders who are seeking to provide nationwide charging coverage, improve the EVSE business case by maximizing station utilization, and promote effective use of private/public infrastructure investments.
Lunar terrain mapping and relative-roughness analysis
NASA Technical Reports Server (NTRS)
Rowan, L. C.; Mccauley, J. F.; Holm, E. A.
1971-01-01
Terrain maps of the equatorial zone were prepared at scales of 1:2,000,000 and 1:1,000,000 to classify lunar terrain with respect to roughness and to provide a basis for selecting sites for Surveyor and Apollo landings, as well as for Ranger and Lunar Orbiter photographs. Lunar terrain was described by qualitative and quantitative methods and divided into four fundamental classes: maria, terrae, craters, and linear features. Some 35 subdivisions were defined and mapped throughout the equatorial zone, and, in addition, most of the map units were illustrated by photographs. The terrain types were analyzed quantitatively to characterize and order their relative roughness characteristics. For some morphologically homogeneous mare areas, relative roughness can be extrapolated to the large scales from measurements at small scales.
Gray, Meeghan E; Cameron, Elissa Z
2010-01-01
The efficacy of contraceptive treatments has been extensively tested, and several formulations are effective at reducing fertility in a range of species. However, before a contraceptive is used for population manipulation, it should be shown to have minimal impact on the behavior of individuals and populations; such side effects have received less attention. Potential side effects have been identified theoretically, and we reviewed published studies that investigated side effects on the behavior and physiology of individuals or at the population level, which provided mixed results. Physiological side effects were most prevalent. Most studies reported a lack of secondary effects, but were usually based on qualitative data or anecdotes. A meta-analysis of quantitative studies of side effects showed that secondary effects consistently occur across all categories and all contraceptive types. This contrasts with the qualitative studies, suggesting that anecdotal reports are insufficient to investigate secondary impacts of contraceptive treatment. We conclude that more research is needed to address fundamental questions about secondary effects of contraceptive treatment and that experiments are fundamental to drawing robust conclusions. In addition, researchers are missing a vital opportunity to use contraceptives as an experimental tool to test the influence of reproduction, sex and fertility on the behavior of wildlife species.
The influence of low frequency sound on the changes of EEG signal morphology
NASA Astrophysics Data System (ADS)
Damijan, Z.; Wiciak, J.
2006-11-01
The effects of low-frequency sound (f = 40 Hz, Lp = 110 dB HP) on changes in the morphology of the spectral power density function of EEG signals were studied as part of a research program involving 33 experiments. A quantitative analysis of the driving response effect was conducted for the fundamental frequency and its harmonics to determine how often the driving response effect occurred depending on the sex of the participants.
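As a sketch of how such a driving-response analysis can be quantified, the fragment below compares spectral power in narrow bands at the stimulus fundamental and its harmonics against total signal power. The synthetic signal, band half-width, and sampling rate are illustrative assumptions, not the study's actual data or method:

```python
import numpy as np

def band_power(signal, fs, f_target, half_width=1.0):
    """Power of `signal` within +/- half_width Hz of f_target (via the FFT)."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    mask = np.abs(freqs - f_target) <= half_width
    return spectrum[mask].sum()

def driving_response(signal, fs, f0, n_harmonics=3):
    """Fraction of total power concentrated at f0 and its harmonics."""
    total = band_power(signal, fs, 0.0, half_width=fs / 2)  # whole spectrum
    peaks = sum(band_power(signal, fs, k * f0) for k in range(1, n_harmonics + 1))
    return peaks / total

# Synthetic "EEG" trace: a driven 40 Hz component, a weaker second
# harmonic, and broadband noise (10 s at 1 kHz sampling).
fs, f0 = 1000.0, 40.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = (np.sin(2 * np.pi * f0 * t)
       + 0.3 * np.sin(2 * np.pi * 2 * f0 * t)
       + 0.5 * rng.standard_normal(t.size))
ratio = driving_response(eeg, fs, f0)
```

A ratio well above the noise-floor level in neighbouring bands would count as a driving response in this toy setup.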
Mast, Fred D; Ratushny, Alexander V; Aitchison, John D
2014-09-15
Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. © 2014 Mast et al.
Causal Loop Analysis of coastal geomorphological systems
NASA Astrophysics Data System (ADS)
Payo, Andres; Hall, Jim W.; French, Jon; Sutherland, James; van Maanen, Barend; Nicholls, Robert J.; Reeve, Dominic E.
2016-03-01
As geomorphologists embrace ever more sophisticated theoretical frameworks that shift from simple notions of evolution towards single steady equilibria to recognise the possibility of multiple response pathways and outcomes, morphodynamic modellers are facing the problem of how to keep track of an ever-greater number of system feedbacks. Within coastal geomorphology, capturing these feedbacks is critically important, especially as the focus of activity shifts from reductionist models founded on sediment transport fundamentals to more synthesist ones intended to resolve emergent behaviours at decadal to centennial scales. This paper addresses the challenge of mapping the feedback structure of processes controlling geomorphic system behaviour with reference to illustrative applications of Causal Loop Analysis in two case studies: (1) the erosion-accretion behaviour of graded (mixed) sediment beds, and (2) the local alongshore sediment fluxes of sand-rich shorelines. These case studies are chosen on account of their central role in the quantitative modelling of geomorphological futures and because they illustrate different types of causation. Causal loop diagrams, a form of directed graph, are used to distil the feedback structure to reveal, in advance of more quantitative modelling, multi-response pathways and multiple outcomes. In the graded sediment bed case, up to three different outcomes (no response, and two disequilibrium states) can be derived from a simple qualitative stability analysis. For the sand-rich local shoreline behaviour case, two fundamentally different responses of the shoreline (diffusive and anti-diffusive), triggered by small changes of the shoreline cross-shore position, can be inferred purely through analysis of the causal pathways. Explicit depiction of feedback-structure diagrams is beneficial when developing numerical models to explore coastal morphological futures: by explicitly mapping the feedbacks included and neglected within a model, the modeller can readily assess whether critical feedback loops are included.
Investigation on Law and Economics Based on Complex Network and Time Series Analysis.
Yang, Jian; Qu, Zhao; Chang, Hui
2015-01-01
The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to figure out the quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through evolutionary game. Combining the results of data analysis and current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game in the issue of corporation financing.
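The evolutionary-game component of such an analysis can be illustrated with standard replicator dynamics, in which strategy shares grow when their payoff beats the population average. This is a generic sketch, not the paper's actual model; the 3x3 payoff matrix and starting shares below are hypothetical:

```python
def replicator_step(x, A, dt=0.01):
    """One Euler step of replicator dynamics for strategy shares x
    under payoff matrix A: dx_i/dt = x_i * (fitness_i - average)."""
    n = len(x)
    fitness = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    avg = sum(x[i] * fitness[i] for i in range(n))
    return [x[i] + dt * x[i] * (fitness[i] - avg) for i in range(n)]

# Hypothetical payoffs for three interacting strategies; here the first
# strategy strictly dominates, so its share should converge toward 1.
A = [[2.0, 2.0, 2.0],
     [1.0, 1.0, 1.0],
     [1.0, 1.0, 1.0]]
x = [0.5, 0.3, 0.2]
for _ in range(1000):          # integrate to t = 10
    x = replicator_step(x, A)
```

Shares remain a probability distribution at every step, and the dominant strategy's share approaches one, mirroring the "strategy tendency" language of the abstract.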
Unified Theory for Decoding the Signals from X-Ray Fluorescence and X-Ray Diffraction of Mixtures.
Chung, Frank H
2017-05-01
For research and development or for solving technical problems, we often need to know the chemical composition of an unknown mixture, which is coded and stored in the signals of its X-ray fluorescence (XRF) and X-ray diffraction (XRD). X-ray fluorescence gives chemical elements, whereas XRD gives chemical compounds. The major problem in XRF and XRD analyses is the complex matrix effect. The conventional technique to deal with the matrix effect is to construct empirical calibration lines with standards for each element or compound sought, which is tedious and time-consuming. A unified theory of quantitative XRF analysis is presented here. The idea is to cancel the matrix effect mathematically. It turns out that the decoding equation for quantitative XRF analysis is identical to that for quantitative XRD analysis although the physics of XRD and XRF are fundamentally different. The XRD work has been published and practiced worldwide. The unified theory derives a new intensity-concentration equation of XRF, which is free from the matrix effect and valid for a wide range of concentrations. The linear decoding equation establishes a constant slope for each element sought, hence eliminating the work on calibration lines. The simple linear decoding equation has been verified by 18 experiments.
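The linear decoding idea is the same one behind the author's earlier "matrix-flushing" relation for XRD, in which one constant slope per component converts measured intensities directly into fractions while the matrix effect cancels. The intensities and slope values below are hypothetical, for illustration only:

```python
def decode_fractions(intensities, slopes):
    """Matrix-flushing-style decoding: with one constant slope k_i per
    component, the fractions follow
        X_i = (I_i / k_i) / sum_j (I_j / k_j),
    so any common matrix attenuation divides out of the ratio."""
    ratios = [i / k for i, k in zip(intensities, slopes)]
    total = sum(ratios)
    return [r / total for r in ratios]

# Hypothetical three-component mixture: measured peak intensities I and
# pre-determined constant slopes k (one calibration constant each).
I = [120.0, 45.0, 80.0]
k = [3.0, 1.5, 2.0]
X = decode_fractions(I, k)   # weight fractions summing to 1
```

Because the slopes are constants of each component, no per-sample calibration line is needed, which is the practical payoff the abstract describes.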
The Design of a Quantitative Western Blot Experiment
Taylor, Sean C.; Posch, Anton
2014-01-01
Western blotting, a technique in practice for more than three decades, began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but journal editors and reviewers now request the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots, and this requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055
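A minimal sketch of the fold-change arithmetic that such quantitative densitometry typically reduces to: normalize each lane's target-band density to a loading control, then express lanes relative to a control lane. The density values below are hypothetical, and this is only the final calculation step, not the full workflow of Taylor et al.:

```python
def fold_changes(target, loading_control, control_lane=0):
    """Fold change per lane: target density normalized to the lane's
    loading-control density, referenced to the control lane."""
    norm = [t / lc for t, lc in zip(target, loading_control)]
    ref = norm[control_lane]
    return [n / ref for n in norm]

# Hypothetical densitometry readings (arbitrary units) for four lanes:
# lane 0 is the untreated control.
target = [1000.0, 2400.0, 1800.0, 900.0]
control = [500.0, 600.0, 450.0, 500.0]
fc = fold_changes(target, control)
```

The loading-control normalization is what makes lanes comparable despite unequal protein loading; without it the raw densities would conflate expression with loading.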
Loroch, Stefan; Schommartz, Tim; Brune, Wolfram; Zahedi, René Peiman; Sickmann, Albert
2015-05-01
Quantitative proteomics and phosphoproteomics have become key disciplines in understanding cellular processes. Fundamental research can be done using cell culture providing researchers with virtually infinite sample amounts. In contrast, clinical, pre-clinical and biomedical research is often restricted to minute sample amounts and requires an efficient analysis with only micrograms of protein. To address this issue, we generated a highly sensitive workflow for combined LC-MS-based quantitative proteomics and phosphoproteomics by refining an ERLIC-based 2D phosphoproteomics workflow into an ERLIC-based 3D workflow covering the global proteome as well. The resulting 3D strategy was successfully used for an in-depth quantitative analysis of both, the proteome and the phosphoproteome of murine cytomegalovirus-infected mouse fibroblasts, a model system for host cell manipulation by a virus. In a 2-plex SILAC experiment with 150 μg of a tryptic digest per condition, the 3D strategy enabled the quantification of ~75% more proteins and even ~134% more peptides compared to the 2D strategy. Additionally, we could quantify ~50% more phosphoproteins by non-phosphorylated peptides, concurrently yielding insights into changes on the levels of protein expression and phosphorylation. Beside its sensitivity, our novel three-dimensional ERLIC-strategy has the potential for semi-automated sample processing rendering it a suitable future perspective for clinical, pre-clinical and biomedical research. Copyright © 2015. Published by Elsevier B.V.
Quantum Sensors at the Intersections of Fundamental Science, Quantum Information Science & Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chattopadhyay, Swapan; Falcone, Roger; Walsworth, Ronald
Over the last twenty years, there has been a boom in quantum science - i.e., the development and exploitation of quantum systems to enable qualitatively and quantitatively new capabilities, with high-impact applications and fundamental insights that can range across all areas of science and technology.
Matrix evaluation of science objectives
NASA Technical Reports Server (NTRS)
Wessen, Randii R.
1994-01-01
The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.
Quantifying patterns of research interest evolution
NASA Astrophysics Data System (ADS)
Jia, Tao; Wang, Dashun; Szymanski, Boleslaw
Changing and shifting research interest is an integral part of a scientific career. Despite extensive investigations of various factors that influence a scientist's choice of research topics, quantitative assessments of mechanisms that give rise to macroscopic patterns characterizing research interest evolution of individual scientists remain limited. Here we perform a large-scale analysis of extensive publication records, finding that research interest change follows a reproducible pattern characterized by an exponential distribution. We identify three fundamental features responsible for the observed exponential distribution, which arise from a subtle interplay between exploitation and exploration in research interest evolution. We develop a random walk based model, which adequately reproduces our empirical observations. Our study presents one of the first quantitative analyses of macroscopic patterns governing research interest change, documenting a high degree of regularity underlying scientific research and individual careers.
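A toy version of the exploitation/exploration interplay can be sketched as a random process over topics: each new paper either opens a new topic (exploration) or revisits a past one in proportion to how often it was used (exploitation). This is an illustrative sketch, not the authors' exact model, and all parameters are assumptions:

```python
import random

def simulate_career(n_papers=200, explore_p=0.1, seed=1):
    """Toy exploitation/exploration walk: with probability explore_p a
    paper starts a brand-new topic; otherwise it revisits a past topic
    chosen uniformly from the history, which is equivalent to choosing
    proportionally to past usage (preferential revisiting)."""
    random.seed(seed)
    history = [0]        # topic id of each paper; career starts on topic 0
    next_topic = 1
    for _ in range(n_papers - 1):
        if random.random() < explore_p:
            history.append(next_topic)      # exploration
            next_topic += 1
        else:
            history.append(random.choice(history))  # exploitation
    return history

career = simulate_career()
n_topics = len(set(career))   # breadth of the simulated research interest
```

Varying `explore_p` trades off topic breadth against depth, the same tension the abstract identifies behind the observed exponential pattern.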
Silicon ribbon study program. [dendritic crystals for use in solar cells
NASA Technical Reports Server (NTRS)
Seidensticker, R. G.; Duncan, C. S.
1975-01-01
The feasibility is studied of growing wide, thin silicon dendritic web for solar cell fabrication and conceptual designs are developed for the apparatus required. An analysis of the mechanisms of dendritic web growth indicated that there were no apparent fundamental limitations to the process. The analysis yielded quantitative guidelines for the thermal conditions required for this mode of crystal growth. Crucible designs were then investigated: the usual quartz crucible configurations and configurations in which silicon itself is used for the crucible. The quartz crucible design is feasible and is incorporated into a conceptual design for a laboratory scale crystal growth facility capable of semi-automated quasi-continuous operation.
Dependence of sound characteristics on the bowing position in a violin
NASA Astrophysics Data System (ADS)
Roh, YuJi; Kim, Young H.
2014-12-01
A quantitative analysis of violin sounds produced for different bowing positions over the full length of a violin string has been carried out. An automated bowing machine was employed in order to keep the bowing parameters constant. A 3-dimensional profile of the frequency spectrum was introduced in order to characterize the violin's sound. We found that the fundamental frequency did not change for different bowing positions, whereas the frequencies of the higher harmonics were different. Bowing the string at 30 mm from the bridge produced musical sounds. The middle of the string was confirmed to be a dead zone, as reported in previous works. In addition, the quarter position was also found to be a dead zone. Bowing the string 90 mm from the bridge predominantly produced a fundamental frequency of 864 Hz and its harmonics.
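The spectral measurement behind such results can be sketched as an FFT peak search on a synthesized tone with an 864 Hz fundamental. The crude peak-picking rule assumes the fundamental dominates (as reported for the 90 mm bowing position); the harmonic amplitudes are hypothetical:

```python
import numpy as np

def fundamental_frequency(signal, fs):
    """Estimate the fundamental as the strongest FFT bin (crude: valid
    only when the fundamental dominates the spectrum)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[0] = 0.0              # ignore the DC bin
    return freqs[np.argmax(spectrum)]

# Synthetic 1 s tone at 44.1 kHz: 864 Hz fundamental plus weaker
# second and third harmonics (amplitudes are illustrative).
fs = 44100.0
t = np.arange(0, 1.0, 1 / fs)
tone = (np.sin(2 * np.pi * 864 * t)
        + 0.4 * np.sin(2 * np.pi * 2 * 864 * t)
        + 0.2 * np.sin(2 * np.pi * 3 * 864 * t))
f0 = fundamental_frequency(tone, fs)
```

With a 1 s window the bin spacing is 1 Hz, so the estimate lands on the 864 Hz bin exactly; real recordings would need windowing and interpolation.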
Liu, Yang; Wilson, W David
2010-01-01
Surface plasmon resonance (SPR) technology with biosensor surfaces has become a widely-used tool for the study of nucleic acid interactions without any labeling requirements. The method provides simultaneous kinetic and equilibrium characterization of the interactions of biomolecules as well as small molecule-biopolymer binding. SPR monitors molecular interactions in real time and provides significant advantages over optical or calorimetric methods for systems with strong binding coupled to small spectroscopic signals and/or reaction heats. A detailed and practical guide for nucleic acid interaction analysis using SPR-biosensor methods is presented. Details of the SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips, and samples, as well as extensive information on experimental design, quantitative and qualitative data analysis and presentation. A specific example of the interaction of a minor-groove-binding agent with DNA is evaluated by both kinetic and steady-state SPR methods to illustrate the technique. Since the molecules that bind cooperatively to specific DNA sequences are attractive for many applications, a cooperative small molecule-DNA interaction is also presented.
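Kinetic SPR analysis commonly rests on the 1:1 Langmuir interaction model; the sketch below evaluates its analytical association curve with hypothetical rate constants (not values from this chapter):

```python
import math

def association_response(t, C, ka, kd, Rmax):
    """Analytical 1:1 Langmuir association curve used to fit sensorgrams:
    R(t) = Req * (1 - exp(-k_obs * t)), with k_obs = ka*C + kd and
    Req = ka*C*Rmax / k_obs."""
    k_obs = ka * C + kd
    Req = ka * C * Rmax / k_obs
    return Req * (1.0 - math.exp(-k_obs * t))

# Hypothetical constants for a small molecule-DNA interaction.
ka, kd, Rmax = 1e5, 1e-2, 100.0   # M^-1 s^-1, s^-1, RU
KD = kd / ka                       # equilibrium dissociation constant, M
C = KD                             # at analyte concentration C = KD ...
R_plateau = association_response(1000.0, C, ka, kd, Rmax)
```

At C = KD the plateau settles at Rmax/2, the standard consistency check linking the kinetic fit (ka, kd) to the steady-state analysis the abstract pairs it with.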
Color and texture associations in voice-induced synesthesia
Moos, Anja; Simmons, David; Simner, Julia; Smith, Rachel
2013-01-01
Voice-induced synesthesia, a form of synesthesia in which synesthetic perceptions are induced by the sounds of people's voices, appears to be relatively rare and has not been systematically studied. In this study we investigated the synesthetic color and visual texture perceptions experienced in response to different types of “voice quality” (e.g., nasal, whisper, falsetto). Experiences of three different groups—self-reported voice synesthetes, phoneticians, and controls—were compared using both qualitative and quantitative analysis in a study conducted online. Whilst, in the qualitative analysis, synesthetes used more color and texture terms to describe voices than either phoneticians or controls, only weak differences, and many similarities, between groups were found in the quantitative analysis. Notable consistent results between groups were the matching of higher speech fundamental frequencies with lighter and redder colors, the matching of “whispery” voices with smoke-like textures, and the matching of “harsh” and “creaky” voices with textures resembling dry cracked soil. These data are discussed in the light of current thinking about definitions and categorizations of synesthesia, especially in cases where individuals apparently have a range of different synesthetic inducers. PMID:24032023
Quantitative Reappraisal of the Helmholtz-Guyton Resonance Theory of Frequency Tuning in the Cochlea
Babbs, Charles F.
2011-01-01
To explore the fundamental biomechanics of sound frequency transduction in the cochlea, a two-dimensional analytical model of the basilar membrane was constructed from first principles. Quantitative analysis showed that axial forces along the membrane are negligible, condensing the problem to a set of ordered one-dimensional models in the radial dimension, for which all parameters can be specified from experimental data. Solutions of the radial models for asymmetrical boundary conditions produce realistic deformation patterns. The resulting second-order differential equations, based on the original concepts of Helmholtz and Guyton, and including viscoelastic restoring forces, predict a frequency map and amplitudes of deflections that are consistent with classical observations. They also predict the effects of an observation hole drilled in the surrounding bone, the effects of curvature of the cochlear spiral, as well as apparent traveling waves under a variety of experimental conditions. A quantitative rendition of the classical Helmholtz-Guyton model captures the essence of cochlear mechanics and unifies the competing resonance and traveling wave theories. PMID:22028708
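The place-resonance idea reduces, per radial section, to the textbook resonator frequency f = sqrt(k/m)/(2*pi). The sketch below builds a tonotopic frequency map from an exponentially decreasing stiffness; the stiffness gradient and all constants are hypothetical, chosen only to yield a plausible base-to-apex range, and this is not the paper's full viscoelastic model:

```python
import math

def resonance_frequency(stiffness, mass):
    """Local-resonator frequency f = sqrt(k/m) / (2*pi): each radial
    section of the membrane resonates at its own frequency."""
    return math.sqrt(stiffness / mass) / (2.0 * math.pi)

def frequency_map(n_sections=35, k_base=1.6e4, decay=0.23, mass=1e-6):
    """Illustrative frequency-place map: stiffness falls exponentially
    from base to apex, producing a monotonic tonotopic gradient."""
    return [resonance_frequency(k_base * math.exp(-decay * i), mass)
            for i in range(n_sections)]

fmap = frequency_map()   # high frequencies at the base, low at the apex
```

The monotonic high-to-low gradient is the "frequency map" the model predicts; the paper's second-order equations add damping and boundary conditions on top of this skeleton.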
Decoupled form and function in disparate herbivorous dinosaur clades
NASA Astrophysics Data System (ADS)
Lautenschlager, Stephan; Brassey, Charlotte A.; Button, David J.; Barrett, Paul M.
2016-05-01
Convergent evolution, the acquisition of morphologically similar traits in unrelated taxa due to similar functional demands or environmental factors, is a common phenomenon in the animal kingdom. Consequently, the occurrence of similar form is used routinely to address fundamental questions in morphofunctional research and to infer function in fossils. However, such qualitative assessments can be misleading and it is essential to test form/function relationships quantitatively. The parallel occurrence of a suite of morphologically convergent craniodental characteristics in three herbivorous, phylogenetically disparate dinosaur clades (Sauropodomorpha, Ornithischia, Theropoda) provides an ideal test case. A combination of computational biomechanical models (Finite Element Analysis, Multibody Dynamics Analysis) demonstrates that despite a high degree of morphological similarity between representative taxa (Plateosaurus engelhardti, Stegosaurus stenops, Erlikosaurus andrewsi) from these clades, their biomechanical behaviours are notably different and difficult to predict on the basis of form alone. These functional differences likely reflect dietary specialisations, demonstrating the value of quantitative biomechanical approaches when evaluating form/function relationships in extinct taxa.
ITRAQ-based quantitative proteomic analysis of Cynops orientalis limb regeneration.
Tang, Jie; Yu, Yuan; Zheng, Hanxue; Yin, Lu; Sun, Mei; Wang, Wenjun; Cui, Jihong; Liu, Wenguang; Xie, Xin; Chen, Fulin
2017-09-22
Salamanders regenerate their limbs after amputation. However, the molecular mechanism of this unique regeneration remains unclear. In this study, isobaric tags for relative and absolute quantification (iTRAQ) coupled with liquid chromatography tandem mass spectrometry (LC-MS/MS) was employed to quantitatively identify differentially expressed proteins in regenerating limbs 3, 7, 14, 30 and 42 days post amputation (dpa). Of 2636 proteins detected in total, 253 proteins were differentially expressed during different regeneration stages. Among these proteins, Asporin, Cadherin-13, Keratin, Collagen alpha-1(XI) and Titin were down-regulated. CAPG, Coronin-1A, AnnexinA1, Cathepsin B were up-regulated compared with the control. The identified proteins were further analyzed to obtain information about their expression patterns and functions in limb regeneration. Functional analysis indicated that the differentially expressed proteins were associated with wound healing, immune response, cellular process, metabolism and binding. This work indicated that significant proteome alterations occurred during salamander limb regeneration. The results may provide fundamental knowledge to understand the mechanism of limb regeneration.
Decoupled form and function in disparate herbivorous dinosaur clades.
Lautenschlager, Stephan; Brassey, Charlotte A; Button, David J; Barrett, Paul M
2016-05-20
Convergent evolution, the acquisition of morphologically similar traits in unrelated taxa due to similar functional demands or environmental factors, is a common phenomenon in the animal kingdom. Consequently, the occurrence of similar form is used routinely to address fundamental questions in morphofunctional research and to infer function in fossils. However, such qualitative assessments can be misleading and it is essential to test form/function relationships quantitatively. The parallel occurrence of a suite of morphologically convergent craniodental characteristics in three herbivorous, phylogenetically disparate dinosaur clades (Sauropodomorpha, Ornithischia, Theropoda) provides an ideal test case. A combination of computational biomechanical models (Finite Element Analysis, Multibody Dynamics Analysis) demonstrates that despite a high degree of morphological similarity between representative taxa (Plateosaurus engelhardti, Stegosaurus stenops, Erlikosaurus andrewsi) from these clades, their biomechanical behaviours are notably different and difficult to predict on the basis of form alone. These functional differences likely reflect dietary specialisations, demonstrating the value of quantitative biomechanical approaches when evaluating form/function relationships in extinct taxa.
Decision support intended to improve ecosystem sustainability requires that we link stakeholder priorities directly to quantitative tools and measures of desired outcomes. Actions taken at the community level can have large impacts on production and delivery of ecosystem service...
Implementing online quantitative support modules in an intermediate-level course
NASA Astrophysics Data System (ADS)
Daly, J.
2011-12-01
While instructors typically anticipate that students in introductory geology courses enter a class with a wide range of quantitative ability, we often overlook the fact that this may also be true in upper-level courses. Some students are drawn to the subject and experience success in early courses with an emphasis on descriptive geology, then experience frustration and disappointment in mid- and upper-level courses that are more quantitative. To bolster student confidence in quantitative skills and enhance their performance in an upper-level course, I implemented several modules from The Math You Need (TMYN) online resource with a 200-level geomorphology class. Student facility with basic quantitative skills (rearranging equations, manipulating units, and graphing) was assessed with an online pre- and post-test. During the semester, modules were assigned to complement existing course activities (for example, the module on manipulating units was assigned prior to a lab on measurement of channel area and water velocity, then calculation of discharge). The implementation was designed to be a concise review of relevant skills for students with higher confidence in their quantitative abilities, and to provide a self-paced opportunity for students with less quantitative facility to build skills. This course already includes a strong emphasis on quantitative data collection, analysis, and presentation; in the past, student performance in the course has been strongly influenced by their individual quantitative ability. I anticipate that giving students the opportunity to improve mastery of fundamental quantitative skills will improve their performance on higher-stakes assignments and exams, and will enhance their sense of accomplishment in the course.
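The discharge lab mentioned above reduces to Q = A * v plus unit manipulation, exactly the skills the modules target. A minimal sketch with hypothetical field measurements (rectangular-channel approximation assumed):

```python
def discharge(width_m, depth_m, velocity_m_s):
    """Channel discharge Q = A * v: cross-sectional area (m^2) times
    mean velocity (m/s) gives m^3/s. Rectangular channel assumed."""
    area = width_m * depth_m
    return area * velocity_m_s

def m3s_to_cfs(q_m3s):
    """Unit manipulation: cubic meters per second to cubic feet per
    second, using 1 ft = 0.3048 m exactly."""
    return q_m3s * (1.0 / 0.3048) ** 3

q = discharge(4.0, 0.5, 1.2)   # hypothetical: 4 m wide, 0.5 m deep, 1.2 m/s
q_cfs = m3s_to_cfs(q)
```

Tracking units through the cubed conversion factor is the kind of step the "manipulating units" module rehearses before students meet it in lab.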
Investigation on Law and Economics Based on Complex Network and Time Series Analysis
Yang, Jian; Qu, Zhao; Chang, Hui
2015-01-01
The research focuses on the cooperative relationship and the strategy tendency among three mutually interactive parties in financing: small enterprises, commercial banks and micro-credit companies. Complex network theory and time series analysis were applied to figure out the quantitative evidence. Moreover, this paper built up a fundamental model describing the particular interaction among them through evolutionary game. Combining the results of data analysis and current situation, it is justifiable to put forward reasonable legislative recommendations for regulations on lending activities among small enterprises, commercial banks and micro-credit companies. The approach in this research provides a framework for constructing mathematical models and applying econometrics and evolutionary game in the issue of corporation financing. PMID:26076460
Electrochemistry in hollow-channel paper analytical devices.
Renault, Christophe; Anderson, Morgan J; Crooks, Richard M
2014-03-26
In the present article we provide a detailed analysis of fundamental electrochemical processes in a new class of paper-based analytical devices (PADs) having hollow channels (HCs). Voltammetry and amperometry were applied under flow and no flow conditions yielding reproducible electrochemical signals that can be described by classical electrochemical theory as well as finite-element simulations. The results shown here provide new and quantitative insights into the flow within HC-PADs. The interesting new result is that despite their remarkable simplicity these HC-PADs exhibit electrochemical and hydrodynamic behavior similar to that of traditional microelectrochemical devices.
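One piece of the "classical electrochemical theory" such amperometry analyses lean on is the Cottrell equation for diffusion-limited current after a potential step. The electrode area, concentration, and diffusion coefficient below are hypothetical, not values from the article:

```python
import math

F = 96485.0  # Faraday constant, C/mol

def cottrell_current(t, n, area_cm2, conc_mol_cm3, D_cm2_s):
    """Cottrell equation for diffusion-limited chronoamperometry:
    i(t) = n * F * A * C * sqrt(D / (pi * t)), in amperes for the
    CGS-style units used here (cm^2, mol/cm^3, cm^2/s, s)."""
    return n * F * area_cm2 * conc_mol_cm3 * math.sqrt(D_cm2_s / (math.pi * t))

# Hypothetical values: 1 mM analyte (1e-6 mol/cm^3), 0.01 cm^2 electrode,
# D = 1e-5 cm^2/s, one-electron transfer; current at t = 1 s.
i_1s = cottrell_current(1.0, 1, 0.01, 1e-6, 1e-5)
```

The characteristic 1/sqrt(t) decay (halving the current when t quadruples) is one of the no-flow signatures classical theory predicts for such devices.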
Evolution in students' understanding of thermal physics with increasing complexity
NASA Astrophysics Data System (ADS)
Langbeheim, Elon; Safran, Samuel A.; Livne, Shelly; Yerushalmi, Edit
2013-12-01
We analyze the development in students’ understanding of fundamental principles in the context of learning a current interdisciplinary research topic—soft matter—that was adapted to the level of high school students. The topic was introduced in a program for interested 11th grade high school students majoring in chemistry and/or physics, in an off-school setting. Soft matter was presented in a gradual increase in the degree of complexity of the phenomena as well as in the level of the quantitative analysis. We describe the evolution in students’ use of fundamental thermodynamics principles to reason about phase separation—a phenomenon that is ubiquitous in soft matter. In particular, we examine the impact of the use of free energy analysis, a common approach in soft matter, on the understanding of the fundamental principles of thermodynamics. The study used diagnostic questions and classroom observations to gauge the student’s learning. In order to gain insight on the aspects that shape the understanding of the basic principles, we focus on the responses and explanations of two case-study students who represent two trends of evolution in conceptual understanding in the group. We analyze changes in the two case studies’ management of conceptual resources used in their analysis of phase separation, and suggest how their prior knowledge and epistemological framing (a combination of their personal tendencies and their prior exposure to different learning styles) affect their conceptual evolution. Finally, we propose strategies to improve the instruction of these concepts.
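Free energy analysis of phase separation, as used in the program, can be illustrated with the standard regular-solution (Flory-Huggins-type) mixing free energy: phase separation sets in where the curve loses convexity. The textbook form and the chi values below are illustrative assumptions, not the program's specific materials:

```python
import math

def mixing_free_energy(phi, chi):
    """Regular-solution mixing free energy per site (in units of kT):
    entropy of mixing plus an interaction term chi * phi * (1 - phi)."""
    return (phi * math.log(phi) + (1 - phi) * math.log(1 - phi)
            + chi * phi * (1 - phi))

def is_unstable(phi, chi, eps=1e-4):
    """Spinodal test: the mixture is locally unstable to phase
    separation where d2F/dphi2 < 0 (numerical second derivative)."""
    d2 = (mixing_free_energy(phi + eps, chi)
          - 2 * mixing_free_energy(phi, chi)
          + mixing_free_energy(phi - eps, chi)) / eps ** 2
    return d2 < 0

unstable_low = is_unstable(0.5, 1.5)   # chi < 2: entropy wins, stays mixed
unstable_high = is_unstable(0.5, 3.0)  # chi > 2: separates at phi = 0.5
```

The crossover at chi = 2 for the symmetric composition is the kind of quantitative free-energy reasoning the case-study students were asked to develop.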
Tromberg, B.J.; Tsay, T.T.; Berns, M.W.; Svaasand, L.O.; Haskell, R.C.
1995-06-13
Optical measurements of turbid media, that is, media characterized by multiple light scattering, are provided through an apparatus and method for exposing a sample to a modulated laser beam. The light beam is modulated at a fundamental frequency and at a plurality of integer harmonics thereof. Modulated light is returned from the sample and preferentially detected at cross frequencies slightly higher than the fundamental frequency and at integer harmonics of the same. The received radiance at the beat or cross frequencies is compared against a reference signal to provide a measure of the phase lag of the radiance and the modulation ratio relative to a reference beam. The phase and modulation amplitude are then provided as a frequency spectrum by an array processor to which a computer applies a complete curve fit in the case of highly scattering samples or a linear curve fit below a predetermined frequency in the case of highly absorptive samples. The curve fit in any case is determined by the absorption and scattering coefficients together with the concentration of the active substance in the sample. Therefore, the curve fitting to the frequency spectrum can be used for both qualitative and quantitative analysis of substances in the sample even though the sample is highly turbid. 14 figs.
Quantitative model of super-Arrhenian behavior in glass forming materials
NASA Astrophysics Data System (ADS)
Caruthers, J. M.; Medvedev, G. A.
2018-05-01
The key feature of glass forming liquids is the super-Arrhenian temperature dependence of the mobility, where the mobility can increase by ten orders of magnitude or more as the temperature is decreased if crystallization does not intervene. A fundamental description of the super-Arrhenian behavior has been developed; specifically, the logarithm of the relaxation time is a linear function of 1/U¯x, where U¯x is the independently determined excess molar internal energy and the slope B is a material constant. This one-parameter mobility model quantitatively describes data for 21 glass forming materials, which are all the materials for which there are sufficient experimental data for analysis. The effect of pressure on the log mobility is also described using the same U¯x(T, p) function determined from the difference between the liquid and crystalline internal energies. It is also shown that B is well correlated with the heat of fusion. The prediction of the B/U¯x model is compared to the Adam and Gibbs 1/(T S¯x) model, where the B/U¯x model is significantly better in unifying the full complement of mobility data. The implications of the B/U¯x model for the development of a fundamental description of glass are discussed.
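The one-parameter model described above reduces to a linear fit of the logarithm of the relaxation time against 1/U¯x. A minimal sketch (not the authors' code; all data and constants below are synthetic, chosen only for illustration) shows how B is recovered as the slope of such a fit:

```python
import numpy as np

# Synthetic illustration of the B/Ux mobility model: log(tau) linear in 1/Ux.
# B_true and log_tau_inf are hypothetical values for the demo only.
B_true = 250.0        # hypothetical material constant (same units as Ux)
log_tau_inf = -12.0   # hypothetical high-temperature intercept, log10(tau/s)

Ux = np.linspace(50.0, 500.0, 40)       # excess molar internal energy (arb. units)
log_tau = log_tau_inf + B_true / Ux     # super-Arrhenian mobility data

# Recover B as the slope of log(tau) versus 1/Ux.
slope, intercept = np.polyfit(1.0 / Ux, log_tau, 1)
```

On noise-free synthetic data the fit returns the material constant and intercept exactly; with experimental data the linearity of this plot is itself the test of the model.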
[Image processing applied to the analysis of motion features of cultured rat cardiac myocytes].
Teng, Qizhi; He, Xiaohai; Luo, Daisheng; Wang, Zhengrong; Zhou, Beiyi; Yuan, Zhirun; Tao, Dachang
2007-02-01
Studying the mechanisms of drug action by quantitative analysis of cultured cardiac myocytes is one of the cutting-edge research areas in myocyte dynamics and molecular biology. The ability of cardiac myocytes to beat spontaneously without external stimulation makes this research meaningful. Analyzing the morphology and motion of cardiac myocytes with image analysis can reveal fundamental mechanisms of drug action, increase the accuracy of drug screening, and help design optimal drug formulas for the best medical treatment. A system of hardware and software has been built with a complete set of functions, including living cardiac myocyte image acquisition, image processing, motion image analysis, and image recognition. In this paper, theories and approaches are introduced for analyzing images of living cardiac myocyte motion and implementing quantitative analysis of cardiac myocyte features. A motion estimation algorithm is used to detect motion vectors at particular points and the beating amplitude and frequency of a cardiac myocyte. Beating of cardiac myocytes is sometimes very small; in such cases it is difficult to detect motion vectors at particular points across a time sequence of images, so image correlation theory is employed instead to detect the beating frequency. An active contour algorithm based on an energy function is proposed to approximate the boundary and detect changes in the edge of the myocyte.
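The correlation-based frequency detection described above can be sketched as follows. This is an illustrative reconstruction, not the authors' system: the "frames" are a synthetic 1D intensity profile whose position oscillates by an amount too small for reliable motion-vector tracking, and the frame rate and beating frequency are hypothetical.

```python
import numpy as np

fps = 30.0                                 # hypothetical frame rate (frames/s)
beat_hz = 1.5                              # hypothetical beating frequency
t = np.arange(300) / fps                   # 10 s of synthetic video

# Synthetic "frames": a blob whose position oscillates slightly with the beat.
x = np.linspace(-1.0, 1.0, 64)
shift = 0.1 + 0.05 * np.sin(2.0 * np.pi * beat_hz * t)
frames = np.exp(-((x[None, :] - shift[:, None]) ** 2) / 0.1)

# Correlate every frame against a fixed reference frame; the correlation
# time series carries the beating rhythm, read off via the FFT peak.
ref = np.exp(-x ** 2 / 0.1)
corr = np.array([np.corrcoef(f, ref)[0, 1] for f in frames])
spec = np.abs(np.fft.rfft(corr - corr.mean()))
freqs = np.fft.rfftfreq(corr.size, d=1.0 / fps)
estimated_hz = freqs[np.argmax(spec)]
```

The dominant spectral peak of the correlation series recovers the beating frequency even when per-point displacements are sub-pixel.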
Instrumentation Automation for Concrete Structures; Report 1: Instrumentation Automation Techniques
1986-12-01
The international measuring system sets up independent standards for these fundamental quantities. All other quantities (force, acceleration...measurement systems are typically composed of several fundamental ... performing a special function (Figure 1). ...accuracy of a quantitative measurement is... A fundamental function of an instrumentation system is to present desired measurement data to the user in a form that
How much separation for LC-MS/MS quantitative bioanalysis of drugs and metabolites?
Tan, Aimin; Fanaras, John C
2018-05-01
LC-MS/MS has been the dominant analytical technology for quantitative bioanalysis of drugs and metabolites for more than two decades. Despite this, a very fundamental question like how much separation is required for LC-MS/MS quantitative bioanalysis of drugs and metabolites has not been adequately addressed. Some think that no or only very limited separation is necessary thanks to the unparalleled selectivity offered by tandem mass spectrometry. Others think that the more separation, the better, because of the potential detrimental impact of matrix effect (ion suppression or enhancement). Still others just use a rule-of-thumb approach by keeping the adjusted retention/capacity factor always between 2 and 5. The purpose of this article is to address this fundamental question through rational thinking together with various real case examples drawn from regulated bioanalytical laboratories. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Orlić, Ivica; Mekterović, Darko; Mekterović, Igor; Ivošević, Tatjana
2015-11-01
VIBA-Lab is a computer program originally developed by the author and co-workers at the National University of Singapore (NUS) as an interactive software package for simulation of Particle Induced X-ray Emission and Rutherford Backscattering Spectra. The original program has been redeveloped into VIBA-Lab 3.0, in which the user can perform semi-quantitative analysis by comparing simulated and measured spectra as well as simulate 2D elemental maps for a given 3D sample composition. The latest version has a new and more versatile user interface. It also has the latest data set of fundamental parameters such as Coster-Kronig transition rates, fluorescence yields, mass absorption coefficients and ionization cross sections for K and L lines in a wider energy range than the original program. Our short-term plan is to introduce a routine for quantitative analysis of multiple PIXE and XRF excitations. VIBA-Lab is an excellent teaching tool for students and researchers learning the PIXE and RBS techniques. At the same time the program helps when planning an experiment and when optimizing experimental parameters such as incident ions, their energy, detector specifications, filters, geometry, etc. By "running" a virtual experiment the user can test various scenarios until the optimal PIXE and BS spectra are obtained and in this way save a lot of expensive machine time.
[Acoustic and aerodynamic characteristics of the oesophageal voice].
Vázquez de la Iglesia, F; Fernández González, S
2005-12-01
The aim of the study is to determine the physiology and pathophysiology of esophageal voice according to objective aerodynamic and acoustic parameters (quantitative and qualitative). Our subjects comprised 33 laryngectomized patients (all male) who underwent an aerodynamic, acoustic and perceptual protocol. There is a statistical association between qualitative acoustic and aerodynamic parameters (phonation flow chart type, sound spectrum, perceptual analysis) and quantitative parameters (neoglottic pressure, phonation flow, phonation time, fundamental frequency, maximum sound intensity level, speech rate). Nevertheless, such observations do not always bring practical resources to clinical practice. We consider that the facts studied may enable us to add, pragmatically, new resources for more effective vocal rehabilitation of these patients. The physiology of esophageal voice is well understood by the method we have applied, which also serves rehabilitation by improving oral communication skills in the laryngectomee population.
Masci, Ilaria; Vannozzi, Giuseppe; Bergamini, Elena; Pesce, Caterina; Getchell, Nancy; Cappozzo, Aurelio
2013-04-01
Objective quantitative evaluation of motor skill development is of increasing importance to carefully drive physical exercise programs in childhood. Running is a fundamental motor skill humans adopt to accomplish locomotion, which is linked to physical activity levels, although its assessment is traditionally carried out using qualitative evaluation tests. The present study aimed at investigating the feasibility of using inertial sensors to quantify developmental differences in the running pattern of young children. Qualitative and quantitative assessment tools were adopted to identify a skill-sensitive set of biomechanical parameters for running and to further our understanding of the factors that determine progression to skilled running performance. Running performances of 54 children between the ages of 2 and 12 years were submitted to both qualitative and quantitative analysis, the former using sequences of developmental level, the latter estimating temporal and kinematic parameters from inertial sensor measurements. Discriminant analysis with running developmental level as the dependent variable identified a set of temporal and kinematic parameters, among those obtained with the sensor, that best classified children into the qualitative developmental levels (accuracy higher than 67%). Multivariate analysis of variance with the quantitative parameters as dependent variables identified whether and which specific parameters or parameter subsets were differentially sensitive to specific transitions between contiguous developmental levels. The findings showed that different sets of temporal and kinematic parameters are able to tap all steps of the transitional process in running skill described through qualitative observation and can be prospectively used for applied diagnostic and sport training purposes. Copyright © 2012 Elsevier B.V. All rights reserved.
"Genetically Engineered" Nanoelectronics
NASA Technical Reports Server (NTRS)
Klimeck, Gerhard; Salazar-Lazaro, Carlos H.; Stoica, Adrian; Cwik, Thomas
2000-01-01
The quantum mechanical functionality of nanoelectronic devices such as resonant tunneling diodes (RTDs), quantum well infrared photodetectors (QWIPs), quantum well lasers, and heterostructure field effect transistors (HFETs) is enabled by material variations on an atomic scale. The design and optimization of such devices requires a fundamental understanding of electron transport at these dimensions. The Nanoelectronic Modeling Tool (NEMO) is a general-purpose quantum device design and analysis tool based on a fundamental non-equilibrium electron transport theory. NEMO was combined with a parallelized genetic algorithm package (PGAPACK) to evolve structural and material parameters to match a desired set of experimental data. A numerical experiment that evolves structural variations such as layer widths and doping concentrations is performed to analyze an experimental current-voltage characteristic. The genetic algorithm is found to drive the NEMO simulation parameters close to the experimentally prescribed layer thicknesses and doping profiles. With such quantitative agreement between theory and experiment, design synthesis can be performed.
Wen, Jia-Long; Sun, Shao-Ni; Yuan, Tong-Qi; Xu, Feng; Sun, Run-Cang
2013-12-01
Bamboo (Phyllostachys pubescens) was successfully fractionated using a three-step integrated process: (1) autohydrolysis pretreatment facilitating xylooligosaccharide (XOS) production, (2) organosolv delignification with organic acids to obtain high-purity lignin, and (3) extended delignification with alkaline hydrogen peroxide (AHP) to produce purified pulp. The integrated process was comprehensively evaluated by component analysis, SEM, XRD, and CP-MAS NMR techniques. In particular, the fundamental chemistry of the lignin fragments obtained from the integrated process was thoroughly investigated by gel permeation chromatography and solution-state NMR techniques (quantitative (13)C, 2D-HSQC, and (31)P-NMR spectroscopies). It is believed that the integrated process facilitates the production of XOS, high-purity lignin, and purified pulp. Moreover, the enhanced understanding of the structural features and chemical reactivity of lignin polymers will maximize their utilization in a future biorefinery industry. Copyright © 2013 Elsevier Ltd. All rights reserved.
Multifractal spectrum and lacunarity as measures of complexity of osseointegration.
de Souza Santos, Daniel; Dos Santos, Leonardo Cavalcanti Bezerra; de Albuquerque Tavares Carvalho, Alessandra; Leão, Jair Carneiro; Delrieux, Claudio; Stosic, Tatijana; Stosic, Borko
2016-07-01
The goal of this study is to contribute to a better quantitative description of the early stages of osseointegration, by application of fractal, multifractal, and lacunarity analysis. Fractal, multifractal, and lacunarity analysis are performed on scanning electron microscopy (SEM) images of titanium implants that were first subjected to different treatment combinations of i) sand blasting, ii) acid etching, and iii) exposure to calcium phosphate, and were then submersed in a simulated body fluid (SBF) for 30 days. All three numerical techniques are applied to the implant SEM images before and after SBF immersion, in order to provide a comprehensive set of common quantitative descriptors. It is found that implants subjected to different physicochemical treatments before submersion in SBF exhibit a rather similar level of complexity, while the great variety of crystal forms after SBF submersion reveals rather different quantitative measures (reflecting complexity) for different treatments. In particular, it is found that acid treatment, in most combinations with the other considered treatments, leads to a higher fractal dimension (more uniform distribution of crystals), lower lacunarity (lesser variation in gap sizes), and narrowing of the multifractal spectrum (smaller fluctuations on different scales). The current quantitative description has shown the capacity to capture the main features of complex images of implant surfaces, for several different treatments. Such quantitative description should provide a fundamental tool for future large scale systematic studies, considering the large variety of possible implant treatments and their combinations. Quantitative description of early stages of osseointegration on titanium implants with different treatments should help develop a better understanding of this phenomenon, in general, and provide basis for further systematic experimental studies.
Clinical practice should benefit from such studies in the long term, by more ready access to implants of higher quality.
[Myocardial perfusion scintigraphy - short form of the German guideline].
Lindner, O; Burchert, W; Hacker, M; Schaefer, W; Schmidt, M; Schober, O; Schwaiger, M; vom Dahl, J; Zimmermann, R; Schäfers, M
2013-01-01
This guideline is a short summary of the guideline for myocardial perfusion scintigraphy published by the Association of the Scientific Medical Societies in Germany (AWMF). The purpose of this guideline is to provide practical assistance for indication and examination procedures as well as image analysis, and to present the state of the art of myocardial perfusion scintigraphy. After a short introduction on the fundamentals of imaging, precise and detailed information is given on the indications, patient preparation, stress testing, radiopharmaceuticals, examination protocols and techniques, radiation exposure, data reconstruction as well as information on visual and quantitative image analysis and interpretation. In addition possible pitfalls, artefacts and key elements of reporting are described.
RGB color calibration for quantitative image analysis: the "3D thin-plate spline" warping approach.
Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado
2012-01-01
In the last years the need to numerically define color by its coordinates in n-dimensional space has increased strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines for quantitatively comparing samples' color during workflow with many devices. Several software programmes are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the Thin-Plate Spline interpolation algorithm to calibrate colours in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with two other approaches: the first is based on a commercial calibration system (ProfileMaker) and the second on a Partial Least Squares analysis. Moreover, to explore device variability and resolution, two different cameras were adopted, and for each sensor three consecutive pictures were acquired under four different light conditions. According to our results, the Thin-Plate Spline approach showed very high calibration efficiency, opening the way to much wider in-field application of colour quantification not only in food sciences but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled. Moreover, the approach allows the use of low-cost instruments while still returning scientifically sound quantitative data.
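The warping idea above can be sketched in a few lines. This is a hedged reconstruction, not the paper's Matlab code: it uses scipy's `RBFInterpolator` with a thin-plate-spline kernel to map measured device RGB values onto reference chart values, and the 24-patch chart data are synthetic.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Synthetic colour chart: 24 reference RGB patches and a simulated device
# response (the gain, offset and noise model are hypothetical).
rng = np.random.default_rng(0)
reference = rng.uniform(0.0, 255.0, size=(24, 3))
measured = 0.9 * reference + 10.0 + rng.normal(0.0, 1.0, size=(24, 3))

# 3D thin-plate spline mapping device RGB -> reference RGB; all three output
# channels are fitted at once, and zero smoothing interpolates the patches
# exactly.
tps = RBFInterpolator(measured, reference, kernel='thin_plate_spline')

corrected = tps(measured)               # apply the warp to the patches
err = float(np.abs(corrected - reference).mean())
```

In practice the spline would be fitted on chart patches and then applied to whole images; generalization off the chart, not the (exact) fit at the patches, is what the paper evaluates.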
ERIC Educational Resources Information Center
Cizdziel, James V.
2011-01-01
In this laboratory experiment, students quantitatively determine the concentration of an element (mercury) in an environmental or biological sample while comparing and contrasting the fundamental techniques of atomic absorption spectrometry (AAS) and atomic fluorescence spectrometry (AFS). A mercury analyzer based on sample combustion,…
Good practices for quantitative bias analysis.
Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander
2014-12-01
Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?
We hope that our guide to good practices for conducting and presenting bias analyses will encourage more widespread use of bias analysis to estimate the potential magnitude and direction of biases, as well as the uncertainty in estimates potentially influenced by the biases. © The Author 2014; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
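A minimal example in the spirit of simple bias analysis (one of the methods the good-practice literature covers): correcting an observed exposure-disease 2x2 table for nondifferential exposure misclassification, given assumed sensitivity and specificity. All counts and bias-parameter values below are hypothetical, chosen only to illustrate the arithmetic.

```python
def correct_counts(exposed, unexposed, se, sp):
    """Back-calculate true exposed/unexposed counts from observed ones,
    given sensitivity (se) and specificity (sp) of exposure classification."""
    total = exposed + unexposed
    true_exposed = (exposed - (1.0 - sp) * total) / (se + sp - 1.0)
    return true_exposed, total - true_exposed

se, sp = 0.85, 0.95            # assumed bias parameters
a, b = 215, 1449               # cases: classified exposed, unexposed
c, d = 668, 4296               # controls: classified exposed, unexposed

a1, b1 = correct_counts(a, b, se, sp)     # bias-corrected case counts
c1, d1 = correct_counts(c, d, se, sp)     # bias-corrected control counts

or_observed = (a * d) / (b * c)           # conventional odds ratio
or_corrected = (a1 * d1) / (b1 * c1)      # odds ratio under the bias model
```

The point of the exercise is the comparison: under nondifferential misclassification the corrected odds ratio moves away from the null relative to the observed one, and the size of that shift is what the bias parameters quantify.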
Kockmann, Tobias; Trachsel, Christian; Panse, Christian; Wahlander, Asa; Selevsek, Nathalie; Grossmann, Jonas; Wolski, Witold E; Schlapbach, Ralph
2016-08-01
Quantitative mass spectrometry is a rapidly evolving methodology applied in a large number of omics-type research projects. During the past years, new designs of mass spectrometers have been developed and launched as commercial systems, while in parallel new data acquisition schemes and data analysis paradigms have been introduced. Core facilities provide access to such technologies, but also actively support researchers in finding and applying the best-suited analytical approach. In order to implement a solid foundation for this decision-making process, core facilities need to constantly compare and benchmark the various approaches. In this article we compare the quantitative accuracy and precision of the current state-of-the-art targeted proteomics approaches single reaction monitoring (SRM), parallel reaction monitoring (PRM) and data independent acquisition (DIA) across multiple liquid chromatography mass spectrometry (LC-MS) platforms, using a readily available commercial standard sample. All workflows are able to reproducibly generate accurate quantitative data. However, SRM and PRM workflows show higher accuracy and precision than DIA approaches, especially when analyzing analytes at low concentrations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Quantitative fiber-optic Raman spectroscopy for tissue Raman measurements
NASA Astrophysics Data System (ADS)
Duraipandian, Shiyamala; Bergholt, Mads; Zheng, Wei; Huang, Zhiwei
2014-03-01
Molecular profiling of tissue using near-infrared (NIR) Raman spectroscopy has shown great promise for in vivo detection and prognostication of cancer. The Raman spectra measured from tissue generally contain fundamental information about the absolute biomolecular concentrations in tissue and their changes associated with disease transformation. However, producing analogous tissue Raman spectra presents a great technical challenge. In this preliminary study, we propose a method to ensure reproducible tissue Raman measurements, validated with the in vivo Raman spectra (n=150) of the inner lip acquired using different laser powers (i.e., 30 and 60 mW). A rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe was utilized for tissue Raman measurements. The results showed that the variations between the spectra measured with different laser powers are almost negligible, facilitating quantitative analysis of tissue Raman measurements in vivo.
NASA Astrophysics Data System (ADS)
Florinsky, I. V.
2012-04-01
Predictive digital soil mapping is widely used in soil science. Its objective is the prediction of the spatial distribution of soil taxonomic units and quantitative soil properties via the analysis of spatially distributed quantitative characteristics of soil-forming factors. Western pedometrists stress the scientific priority and principal importance of Hans Jenny's book (1941) for the emergence and development of predictive soil mapping. In this paper, we demonstrate that Vasily Dokuchaev explicitly defined the central idea and statement of the problem of contemporary predictive soil mapping in the year 1886. Then, we reconstruct the history of the soil formation equation from 1899 to 1941. We argue that Jenny adopted the soil formation equation from Sergey Zakharov, who published it in a well-known fundamental textbook in 1927. It is encouraging that this issue was clarified in 2011, the anniversary year for publications of Dokuchaev and Jenny.
A Team Mental Model Perspective of Pre-Quantitative Risk
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
2011-01-01
This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.
Designer drugs: the evolving science of drug discovery.
Wanke, L A; DuBose, R F
1998-07-01
Drug discovery and design are fundamental to drug development. Until recently, most drugs were discovered through random screening or developed through molecular modification. New technologies are revolutionizing this phase of drug development. Rational drug design, using powerful computers and computational chemistry and employing X-ray crystallography, nuclear magnetic resonance spectroscopy, and three-dimensional quantitative structure activity relationship analysis, is creating highly specific, biologically active molecules by virtual reality modeling. Sophisticated screening technologies are eliminating all but the most active lead compounds. These new technologies promise more efficacious, safe, and cost-effective medications, while minimizing drug development time and maximizing profits.
Wang, Jiann-Hsiung; Chou, Shih-Jen; Li, Tsung-Hsien; Leu, Ming-Yih; Ho, Hsiao-Kuan
2017-01-01
Cytokines are fundamental for a functioning immune system, and thus potentially serve as important indicators of animal health. Quantitation of mRNA using quantitative reverse transcription polymerase chain reaction (qRT-PCR) is an established immunological technique. It is particularly suitable for detecting the expression of proteins against which monoclonal antibodies are not available. In this study, we developed a probe-based quantitative gene expression assay for immunological assessment of captive beluga whales (Delphinapterus leucas) that is one of the most common cetacean species on display in aquariums worldwide. Six immunologically relevant genes (IL-2Rα, -4, -10, -12, TNFα, and IFNγ) were selected for analysis, and two validated housekeeping genes (PGK1 and RPL4) with stable expression were used as reference genes. Sixteen blood samples were obtained from four animals with different health conditions and stored in RNAlater™ solution. These samples were used for RNA extraction followed by qRT-PCR analysis. Analysis of gene transcripts was performed by relative quantitation using the comparative Cq method with the integration of amplification efficiency and two reference genes. The expression levels of each gene in the samples from clinically healthy animals were normally distributed. Transcript outliers for IL-2Rα, IL-4, IL-12, TNFα, and IFNγ were noticed in four samples collected from two clinically unhealthy animals. This assay has the potential to identify immune system deviation from normal state, which is caused by health problems. Furthermore, knowing the immune status of captive cetaceans could help both trainers and veterinarians in implementing preventive approaches prior to disease onset. PMID:28970970
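The relative quantitation step described above (comparative Cq with integration of amplification efficiency and two reference genes) can be sketched as follows. The Cq values and efficiencies below are hypothetical, and the normalization uses the geometric mean of the reference-gene ratios, which is the standard way to combine multiple references.

```python
from statistics import geometric_mean

# Efficiency-corrected comparative Cq ratio, normalized to the geometric
# mean of the reference-gene ratios (two references, as in the assay above).
def rel_expression(e_target, cq_target_ctrl, cq_target_sample,
                   e_refs, cq_refs_ctrl, cq_refs_sample):
    target_ratio = e_target ** (cq_target_ctrl - cq_target_sample)
    ref_ratios = [e ** (cc - cs)
                  for e, cc, cs in zip(e_refs, cq_refs_ctrl, cq_refs_sample)]
    return target_ratio / geometric_mean(ref_ratios)

# Hypothetical run: the target gene crosses threshold 2 cycles earlier in the
# sample while both references are unchanged, giving roughly a 3.8-fold
# up-regulation at an amplification efficiency of 1.95.
fold = rel_expression(
    e_target=1.95, cq_target_ctrl=28.0, cq_target_sample=26.0,
    e_refs=[2.0, 1.9], cq_refs_ctrl=[20.0, 21.0], cq_refs_sample=[20.0, 21.0],
)
```

With perfect efficiency (E = 2) this reduces to the familiar 2^(-ΔΔCq) formula; the efficiency-corrected form avoids systematic bias when assays amplify at different rates.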
NASA Astrophysics Data System (ADS)
Neubauer, Jürgen; Mergell, Patrick; Eysholdt, Ulrich; Herzel, Hanspeter
2001-12-01
This report is on direct observation and modal analysis of irregular spatio-temporal vibration patterns of vocal fold pathologies in vivo. The observed oscillation patterns are described quantitatively with multiline kymograms, spectral analysis, and spatio-temporal plots. The complex spatio-temporal vibration patterns are decomposed by empirical orthogonal functions into independent vibratory modes. It is shown quantitatively that biphonation can be induced either by left-right asymmetry or by desynchronized anterior-posterior vibratory modes, and the term ``AP (anterior-posterior) biphonation'' is introduced. The presented phonation examples show that for normal phonation the first two modes sufficiently explain the glottal dynamics. The spatio-temporal oscillation pattern associated with biphonation due to left-right asymmetry can be explained by the first three modes. Higher-order modes are required to describe the pattern for biphonation induced by anterior-posterior vibrations. Spatial irregularity is quantified by an entropy measure, which is significantly higher for irregular phonation than for normal phonation. Two asymmetry measures are introduced: the left-right asymmetry and the anterior-posterior asymmetry, as the ratios of the fundamental frequencies of left and right vocal fold and of anterior-posterior modes, respectively. These quantities clearly differentiate between left-right biphonation and anterior-posterior biphonation. This paper proposes methods to analyze quantitatively irregular vocal fold contour patterns in vivo and complements previous findings of desynchronization of vibration modes in computer models and in in vitro experiments.
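The empirical-orthogonal-function decomposition used above is, in practice, a singular value decomposition of the mean-removed space-time data matrix. A sketch with synthetic data (two desynchronized spatial modes at unrelated frequencies, loosely mimicking "AP biphonation"; all shapes, frequencies and the sampling rate are our choices):

```python
import numpy as np

npts, nt = 40, 400
pos = np.linspace(0.0, 1.0, npts)          # position along the fold (arb.)
t = np.arange(nt) / 1000.0                 # hypothetical 1 kHz sampling

# Two spatial modes oscillating independently (synthetic vibration record).
mode1 = np.sin(np.pi * pos)
mode2 = np.sin(2.0 * np.pi * pos)
data = (np.outer(np.sin(2.0 * np.pi * 100.0 * t), mode1)
        + 0.4 * np.outer(np.sin(2.0 * np.pi * 137.0 * t), mode2)).T

# EOFs: SVD of the mean-removed (space x time) matrix. The left singular
# vectors are the spatial modes; squared singular values give the fraction
# of vibratory energy captured by each mode.
data = data - data.mean(axis=1, keepdims=True)
u, s, vt = np.linalg.svd(data, full_matrices=False)
energy = s ** 2 / np.sum(s ** 2)
```

For this two-mode record the first two EOFs capture essentially all the energy; in the paper's terms, how many EOFs are needed, and whether their temporal coefficients are synchronized, is what distinguishes normal phonation from the two kinds of biphonation.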
NASA Astrophysics Data System (ADS)
Kafle, Amol; Coy, Stephen L.; Wong, Bryan M.; Fornace, Albert J.; Glick, James J.; Vouros, Paul
2014-07-01
A systematic study involving the use and optimization of gas-phase modifiers in quantitative differential mobility-mass spectrometry (DMS-MS) analysis is presented using nucleoside-adduct biomarkers of DNA damage as an important reference point for analysis in complex matrices. Commonly used polar protic and polar aprotic modifiers have been screened for use against two deoxyguanosine adducts of DNA: N-(deoxyguanosin-8-yl)-4-aminobiphenyl (dG-C8-4-ABP) and N-(deoxyguanosin-8-yl)-2-amino-1-methyl-6-phenylimidazo[4,5-b]pyridine (dG-C8-PhIP). Particular attention was paid to compensation voltage (CoV) shifts, peak shapes, and product ion signal intensities while optimizing the DMS-MS conditions. The optimized parameters were then applied to rapid quantitation of the DNA adducts in calf thymus DNA. After a protein precipitation step, adduct levels corresponding to less than one modification in 10^6 normal DNA bases were detected using the DMS-MS platform. Based on DMS fundamentals and ab initio thermochemical results, we interpret the complexity of DMS modifier responses in terms of thermal activation and the development of solvent shells. At very high bulk gas temperature, modifier dipole moment may be the most important factor in cluster formation and cluster geometry, but at lower temperatures, multi-neutral clusters are important and less predictable. This work provides a useful protocol for targeted DNA adduct quantitation and a basis for future work on DMS modifier effects.
Video fluoroscopic techniques for the study of Oral Food Processing
Matsuo, Koichiro; Palmer, Jeffrey B.
2016-01-01
Food oral processing and pharyngeal food passage cannot be observed directly from outside the body without instrumental methods. Videofluoroscopy (x-ray video recording) reveals the movement of oropharyngeal anatomical structures in two dimensions. By adding a radiopaque contrast medium, the motion and shape of the food bolus can also be visualized, providing critical information about the mechanisms of eating, drinking, and swallowing. For quantitative analysis of the kinematics of oral food processing, radiopaque markers are attached to the teeth, tongue, or soft palate. This approach permits kinematic analysis with a variety of textures and consistencies, both solid and liquid. Fundamental mechanisms of food oral processing are clearly observed with videofluoroscopy in lateral and anteroposterior projections. PMID:27213138
Multi-segmental movements as a function of experience in karate.
Zago, Matteo; Codari, Marina; Iaia, F Marcello; Sforza, Chiarella
2017-08-01
Karate is a martial art whose competitive scoring partly depends on subjective evaluation of complex movements. Principal component analysis (PCA)-based methods can identify the fundamental synergies (principal movements) of the motor system, providing a quantitative global analysis of technique. In this study, we aimed to describe the fundamental multi-joint synergies of karate performance, under the hypothesis that they are skill-dependent, and to estimate each karateka's experience level, expressed as years of practice. A motion capture system recorded traditional karate techniques of 10 professional and amateur karateka. At each time point, the 3D coordinates of body markers produced posture vectors that were normalised, concatenated across all karateka, and submitted to a first PCA. Five principal movements described both gross movement synergies and individual differences. A second PCA followed by linear regression estimated the years of practice from the principal movements (eigenpostures and weighting curves) and centre-of-mass kinematics (error: 3.71 years; R2 = 0.91, P ≪ 0.001). Principal movements and eigenpostures varied among karateka and as functions of experience. This approach provides a framework for developing visual tools for the analysis of motor synergies in karate, making it possible to detect the multi-joint motor patterns that should be restored after an injury, or specifically trained to increase performance.
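The pipeline above (PCA of concatenated posture vectors, then linear regression of years of practice on principal-movement scores) can be sketched in simplified form with a single PCA stage. The marker counts, noise model, and synthetic link between posture and experience below are assumptions for illustration, not the study's data.

```python
# Simplified sketch: PCA of posture vectors, then linear regression of
# experience (years of practice) on the principal-movement scores.
# All data are synthetic stand-ins for motion-capture recordings.
import numpy as np

rng = np.random.default_rng(4)
n_athletes, n_features = 10, 60          # e.g. 20 markers x 3 coordinates
years = rng.uniform(1, 25, n_athletes)   # hypothetical years of practice
# Synthetic postures whose dominant direction covaries with experience.
direction = rng.standard_normal(n_features)
postures = (np.outer(years, direction)
            + 0.5 * rng.standard_normal((n_athletes, n_features)))

# PCA: principal-movement scores from centred posture vectors.
centered = postures - postures.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = u[:, :5] * s[:5]                # first five principal movements

# Linear regression of experience on the scores (with intercept).
X = np.column_stack([np.ones(n_athletes), scores])
coef, *_ = np.linalg.lstsq(X, years, rcond=None)
pred = X @ coef
r2 = 1.0 - np.sum((years - pred) ** 2) / np.sum((years - years.mean()) ** 2)
print(f"in-sample R^2 = {r2:.2f}")
```

Because the synthetic postures are constructed to covary with experience, the regression recovers years of practice well; with real data, out-of-sample validation (as the study's 3.71-year error implies) is the meaningful check.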
ERIC Educational Resources Information Center
Bowers, Alex J.
2017-01-01
The quantitative research methods course is a staple of graduate programs in education leadership and administration. Historically, these courses serve to train aspiring district and school leaders in fundamental statistical research topics. This article argues for programs to focus as well in these courses on helping aspiring leaders develop…
Quantitative DNA Methylation Profiling in Cancer.
Ammerpohl, Ole; Haake, Andrea; Kolarova, Julia; Siebert, Reiner
2016-01-01
Epigenetic mechanisms including DNA methylation are fundamental for the regulation of gene expression. Epigenetic alterations can lead to the development and the evolution of malignant tumors as well as the emergence of phenotypically different cancer cells or metastasis from one single tumor cell. Here we describe bisulfite pyrosequencing, a technology to perform quantitative DNA methylation analyses, to detect aberrant DNA methylation in malignant tumors.
Functional Regression Models for Epistasis Analysis of Multiple Quantitative Traits.
Zhang, Futao; Xie, Dan; Liang, Meimei; Xiong, Momiao
2016-04-01
To date, most genetic analyses of phenotypes have focused on analyzing single traits or analyzing each phenotype independently. However, joint epistasis analysis of multiple complementary traits will increase statistical power and improve our understanding of the complicated genetic structure of complex diseases. Despite its importance in uncovering the genetic structure of complex traits, statistical methodology for identifying epistasis in multiple phenotypes remains fundamentally unexplored. To fill this gap, we formulate a test for interaction between two genes in multiple quantitative trait analysis as a multiple functional regression (MFRG) model in which the genotype functions (genetic variant profiles) are defined as a function of the genomic position of the genetic variants. We use large-scale simulations to calculate Type I error rates for testing interaction between two genes with multiple phenotypes and to compare the power with multivariate pairwise interaction analysis and single-trait interaction analysis by a univariate functional regression model. To further evaluate performance, the MFRG for epistasis analysis is applied to five phenotypes of exome sequence data from the NHLBI's Exome Sequencing Project (ESP) to detect pleiotropic epistasis. A total of 267 pairs of genes that formed a genetic interaction network showed significant evidence of epistasis influencing five traits. The results demonstrate that the joint interaction analysis of multiple phenotypes has much higher power to detect interaction than the interaction analysis of a single trait and may open a new direction to fully uncovering the genetic structure of multiple phenotypes.
Fan, Qianrui; Wang, Wenyu; Hao, Jingcan; He, Awen; Wen, Yan; Guo, Xiong; Wu, Cuiyan; Ning, Yujie; Wang, Xi; Wang, Sen; Zhang, Feng
2017-08-01
Neuroticism is a fundamental personality trait with a significant genetic determinant. To identify novel susceptibility genes for neuroticism, we conducted an integrative analysis of genomic and transcriptomic data from genome-wide association study (GWAS) and expression quantitative trait locus (eQTL) studies. GWAS summary data were derived from published studies of neuroticism, involving a total of 170,906 subjects. An eQTL dataset containing 927,753 eQTLs was obtained from an eQTL meta-analysis of 5311 samples. Integrative analysis of GWAS and eQTL data was conducted with the summary data-based Mendelian randomization (SMR) analysis software. To identify neuroticism-associated gene sets, the SMR analysis results were further subjected to gene set enrichment analysis (GSEA). The gene set annotation dataset (containing 13,311 annotated gene sets) of the GSEA Molecular Signatures Database was used. SMR single-gene analysis identified 6 significant genes for neuroticism, including MSRA (p = 2.27×10^-10), MGC57346 (p = 6.92×10^-7), BLK (p = 1.01×10^-6), XKR6 (p = 1.11×10^-6), C17ORF69 (p = 1.12×10^-6) and KIAA1267 (p = 4.00×10^-6). Gene set enrichment analysis observed a significant association for the Chr8p23 gene set (false discovery rate = 0.033). Our results provide novel clues for studies of the genetic mechanisms of neuroticism. Copyright © 2017. Published by Elsevier Inc.
Global spectral graph wavelet signature for surface analysis of carpal bones
NASA Astrophysics Data System (ADS)
Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A.
2018-02-01
Quantitative shape comparison is a fundamental problem in computer vision, geometry processing, and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of the carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose a global spectral graph wavelet (GSGW) descriptor that is isometry-invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework gives a much better performance compared to the global point signature embedding approach for comparing shapes of the carpal bones across populations.
Batres-Mendoza, Patricia; Montoro-Sanjose, Carlos R; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Rostro-Gonzalez, Horacio; Romero-Troncoso, Rene J; Ibarra-Manzano, Mario A
2016-03-05
Quaternions can be used as an alternative to model the fundamental patterns of electroencephalographic (EEG) signals in the time domain. Thus, this article presents a new quaternion-based technique known as quaternion-based signal analysis (QSA) to represent EEG signals obtained using a brain-computer interface (BCI) device to detect and interpret cognitive activity. This quaternion-based signal analysis technique can extract features to represent brain activity related to motor imagery accurately in various mental states. Experimental tests in which users were shown visual graphical cues related to left and right movements were used to collect BCI-recorded signals. These signals were then classified using decision trees (DT), support vector machine (SVM) and k-nearest neighbor (KNN) techniques. The quantitative analysis of the classifiers demonstrates that this technique can be used as an alternative in the EEG-signal modeling phase to identify mental states.
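The core idea of packing four EEG channels into a single quaternion-valued signal can be sketched as below. The feature choices here (per-sample norm and mean unit direction) are simplified illustrative assumptions, not the published QSA feature set, and the random arrays stand in for band-passed EEG recordings.

```python
# Minimal sketch of a quaternion representation of four EEG channels:
# each time sample becomes one quaternion (w, x, y, z), from which
# simple scale and direction features can be extracted.
# Channel data and feature choices are illustrative assumptions.
import numpy as np

def to_quaternion(ch1, ch2, ch3, ch4):
    """Stack four EEG channels as quaternion components (w, x, y, z)."""
    return np.stack([ch1, ch2, ch3, ch4], axis=-1)

def quaternion_features(q):
    """Per-sample norm plus the mean of the unit-quaternion components."""
    norms = np.linalg.norm(q, axis=-1)
    unit = q / norms[:, None]      # direction only, amplitude-invariant
    return norms.mean(), unit.mean(axis=0)

rng = np.random.default_rng(1)
n = 256                                 # samples per trial (hypothetical)
channels = rng.standard_normal((4, n))  # stand-in for band-passed EEG
q = to_quaternion(*channels)
mean_norm, mean_dir = quaternion_features(q)
print(q.shape, float(mean_norm))
```

Features of this kind would then be fed to a classifier (DT, SVM, or KNN in the study) to discriminate mental states.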
Taralova, Ekaterina; Dupre, Christophe; Yuste, Rafael
2018-01-01
Animal behavior has been studied for centuries, but few efficient methods are available to automatically identify and classify it. Quantitative behavioral studies have been hindered by the subjective and imprecise nature of human observation, and the slow speed of annotating behavioral data. Here, we developed an automatic behavior analysis pipeline for the cnidarian Hydra vulgaris using machine learning. We imaged freely behaving Hydra, extracted motion and shape features from the videos, and constructed a dictionary of visual features to classify pre-defined behaviors. We also identified unannotated behaviors with unsupervised methods. Using this analysis pipeline, we quantified 6 basic behaviors and found surprisingly similar behavior statistics across animals within the same species, regardless of experimental conditions. Our analysis indicates that the fundamental behavioral repertoire of Hydra is stable. This robustness could reflect homeostatic neural control of "housekeeping" behaviors which may already have been present in the earliest nervous systems. PMID:29589829
Snyder, Jessica M.; Washington, Ida M.; Birkland, Timothy; Chang, Mary Y.; Frevert, Charles W.
2015-01-01
Versican, a chondroitin sulfate proteoglycan, is important in embryonic development, and disruption of the versican gene is embryonically lethal in the mouse. Although several studies show that versican is increased in various organs during development, a focused quantitative study on versican expression and distribution during lung and central nervous system development in the mouse has not previously been performed. We tracked changes in versican (Vcan) gene expression and in the accumulation and degradation of versican. Vcan expression and quantitative immunohistochemistry performed from embryonic day (E) 11.5 to E15.5 showed peak Vcan expression at E13.5 in the lungs and brain. Quantitative mRNA analysis and versican immunohistochemistry showed differences in the expression of the versican isoforms in the embryonic lung and head. The expression of Vcan mRNA and accumulation of versican in tissues was complementary. Immunohistochemistry demonstrated co-localization of versican accumulation and degradation, suggesting distinct roles of versican deposition and degradation in embryogenesis. Very little versican mRNA or protein was found in the lungs of 12- to 16-week-old mice but versican accumulation was significantly increased in mice with Pseudomonas aeruginosa lung infection. These data suggest that versican plays an important role in fundamental, overlapping cellular processes in lung development and infection. PMID:26385570
Cross-study projections of genomic biomarkers: an evaluation in cancer genomics.
Lucas, Joseph E; Carvalho, Carlos M; Chen, Julia Ling-Yu; Chi, Jen-Tsan; West, Mike
2009-01-01
Human disease studies using DNA microarrays in both clinical/observational and experimental/controlled studies are having increasing impact on our understanding of the complexity of human diseases. A fundamental concept is the use of gene expression as a "common currency" that links the results of in vitro controlled experiments to in vivo observational human studies. Many studies--in cancer and other diseases--have shown promise in using in vitro cell manipulations to improve understanding of in vivo biology, but experiments often simply fail to reflect the enormous phenotypic variation seen in human diseases. We address this with a framework and methods to dissect, enhance and extend the in vivo utility of in vitro derived gene expression signatures. From an experimentally defined gene expression signature we use statistical factor analysis to generate multiple quantitative factors in human cancer gene expression data. These factors retain their relationship to the original, one-dimensional in vitro signature but better describe the diversity of in vivo biology. In a breast cancer analysis, we show that factors can reflect fundamentally different biological processes linked to molecular and clinical features of human cancers, and that in combination they can improve prediction of clinical outcomes.
Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li
2014-12-01
Water quality monitoring is a critical part of environmental management and protection, and the ability to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample preparation, fast and easy operation, and a chemical-free process. It is therefore of great importance to understand the fundamentals of aqueous LIBS analysis and effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis of liquid samples, covering LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting the accuracy of analysis results. Although many research works have focused on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquid by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist readers to better understand the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.
RGB Color Calibration for Quantitative Image Analysis: The “3D Thin-Plate Spline” Warping Approach
Menesatti, Paolo; Angelini, Claudio; Pallottino, Federico; Antonucci, Francesca; Aguzzi, Jacopo; Costa, Corrado
2012-01-01
In recent years the need to numerically define color by its coordinates in n-dimensional space has increased strongly. Colorimetric calibration is fundamental in food processing and other biological disciplines to quantitatively compare samples' color during workflow with many devices. Several software programmes are available to perform standardized colorimetric procedures, but they are often too imprecise for scientific purposes. In this study, we applied the Thin-Plate Spline interpolation algorithm to calibrate colours in sRGB space (the corresponding Matlab code is reported in the Appendix). This was compared with two other approaches. The first is based on a commercial calibration system (ProfileMaker) and the second on a Partial Least Square analysis. Moreover, to explore device variability and resolution, two different cameras were adopted and, for each sensor, three consecutive pictures were acquired under four different light conditions. According to our results, the Thin-Plate Spline approach showed very high calibration efficiency, opening the way to routine in-field colour quantification not only in food sciences, but also in other biological disciplines. These results are of great importance for scientific color evaluation when lighting conditions are not controlled. Moreover, the approach allows the use of low-cost instruments while still returning scientifically sound quantitative data. PMID:22969337
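The paper's calibration is implemented in Matlab, but the same thin-plate spline idea can be sketched in Python with SciPy's RBFInterpolator: learn a smooth map from measured patch colors to their reference coordinates, then apply it to new pixels. The patch values and the simulated device distortion below are synthetic assumptions; a real workflow would use a color chart's measured and nominal sRGB coordinates.

```python
# Thin-plate spline color calibration sketch: fit an RGB->RGB mapping
# from measured chart patches to their reference values.
# Patch colors and the device distortion are synthetic, for illustration.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(2)
reference = rng.uniform(0, 255, size=(24, 3))   # nominal patch colors
# Simulate a device that distorts colors nonlinearly, with sensor noise.
measured = (0.8 * reference
            + 10.0 * np.sin(reference / 40.0)
            + rng.normal(0.0, 1.0, (24, 3)))

# One TPS interpolant maps measured 3D color coordinates to reference ones.
tps = RBFInterpolator(measured, reference, kernel='thin_plate_spline')

calibrated = tps(measured)
rmse = np.sqrt(np.mean((calibrated - reference) ** 2))
print(f"in-sample RMSE after calibration: {rmse:.6f}")
```

With no smoothing, the thin-plate spline interpolates the calibration patches exactly, so the in-sample error is essentially zero; held-out patches are the honest test of calibration quality.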
Quantitative phosphoproteome on the silkworm (Bombyx mori) cells infected with baculovirus.
Shobahah, Jauharotus; Xue, Shengjie; Hu, Dongbing; Zhao, Cui; Wei, Ming; Quan, Yanping; Yu, Wei
2017-06-19
Bombyx mori has become an important model organism for many fundamental studies. Bombyx mori nucleopolyhedrovirus (BmNPV) is a significant pathogen of Bombyx mori, yet also an efficient vector for recombinant protein production. A previous study indicated that acetylation plays many vital roles in several cellular processes of Bombyx mori, while the global phosphorylation pattern upon BmNPV infection remains elusive. Employing tandem mass tag (TMT) labeling and phosphorylation affinity enrichment followed by high-resolution LC-MS/MS analysis and intensive bioinformatics analysis, the quantitative phosphoproteome in Bombyx mori cells infected by BmNPV at 24 hpi with an MOI of 10 was extensively examined. In total, 6480 phosphorylation sites in 2112 protein groups were identified, among which 4764 sites in 1717 proteins were quantified. Among the quantified proteins, 81 up-regulated and 25 down-regulated sites were identified using significance criteria (a quantitative ratio above 1.3 was considered up-regulation and below 0.77 down-regulation) with a significant p-value (p < 0.05). Some proteins of BmNPV were also hyperphosphorylated during infection, such as P6.9, 39K, LEF-6, Ac58-like protein, Ac82-like protein and BRO-D. The phosphorylated proteins were primarily involved in several specific functions, among which we focused on binding activity, protein synthesis, viral replication and apoptosis through kinase activity.
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which costs the industry tremendously for waste treatment and disposal and hinders the further development of the industry. The industry therefore urgently needs to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis on bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation that are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner.
The unique contribution of this research is that it is the first time for the electroplating industry to (i) systematically use available WM strategies, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive WM in the following decade.
Combining formal and functional approaches to topic structure.
Zellers, Margaret; Post, Brechtje
2012-03-01
Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to adopt the insights of both PTI's qualitative analysis and EP's quantitative analysis and combine them into a multiple-methods approach. One realm in which it is possible to combine these frameworks is in the analysis of discourse topic structure and the prosodic cues relevant to it. By combining a quantitative and a qualitative approach to discourse topic structure, it is possible to give a better account of the observed variation in prosody, for example in the case of fundamental frequency (F0) peak timing, which can be explained in terms of pitch accent distribution over different topic structure categories. Similarly, local and global patterns in speech rate variation can be better explained and motivated by adopting insights from both PTI and EP in the study of topic structure. Combining PTI and EP can provide better accounts of speech data as well as opening up new avenues of investigation which would not have been possible in either approach alone.
Extraction and quantitative analysis of iodine in solid and solution matrixes.
Brown, Christopher F; Geiszler, Keith N; Vickerman, Tanya S
2005-11-01
129I is a contaminant of interest in the vadose zone and groundwater at numerous federal and privately owned facilities. Several techniques have been utilized to extract iodine from solid matrixes; however, all of them rely on two fundamental approaches: liquid extraction or chemical/heat-facilitated volatilization. While these methods are typically chosen for their ease of implementation, they do not totally dissolve the solid. We defined a method that produces complete solid dissolution and conducted laboratory tests to assess its efficacy to extract iodine from solid matrixes. Testing consisted of potassium nitrate/potassium hydroxide fusion of the sample, followed by sample dissolution in a mixture of sulfuric acid and sodium bisulfite. The fusion extraction method resulted in complete sample dissolution of all solid matrixes tested. Quantitative analysis of 127I and 129I via inductively coupled plasma mass spectrometry showed better than +/-10% accuracy for certified reference standards, with the linear operating range extending more than 3 orders of magnitude (0.005-5 microg/L). Extraction and analysis of four replicates of standard reference material containing 5 microg/g 127I resulted in an average recovery of 98% with a relative deviation of 6%. This simple and cost-effective technique can be applied to solid samples of varying matrixes with little or no adaptation.
Crocker, Jonny; Bartram, Jamie
2014-07-18
Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduce a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries.
NASA Astrophysics Data System (ADS)
Martin, Madhavi Z.; Allman, Steve; Brice, Deanne J.; Martin, Rodger C.; Andre, Nicolas O.
2012-08-01
Laser-induced breakdown spectroscopy (LIBS) has been used to determine the limits of detection of strontium (Sr) and cesium (Cs), common nuclear fission products. Additionally, detection limits were determined for cerium (Ce), often used as a surrogate for radioactive plutonium in laboratory studies. Results were obtained using a laboratory instrument with a Nd:YAG laser at a fundamental wavelength of 1064 nm, frequency doubled to 532 nm, with an energy of 50 mJ/pulse. The data were compared for different concentrations of Sr and Ce dispersed in a CaCO3 (white) and carbon (black) matrix. We have addressed the sampling errors, limits of detection, reproducibility, and accuracy of measurements as they relate to multivariate analysis in pellets that were doped with the different elements at various concentrations. These results demonstrate that the LIBS technique is inherently well suited for in situ analysis of nuclear materials in hot cells. Three key advantages are evident: (1) small samples (mg) can be evaluated; (2) nuclear materials can be analyzed with minimal sample preparation; and (3) samples can be remotely analyzed very rapidly (milliseconds to seconds). Our studies also show that the methods can be made quantitative. Very robust multivariate models have been used to provide quantitative measurement and statistical evaluation of complex materials derived from our previous research on wood and soil samples.
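The abstract does not state its exact limit-of-detection procedure; a conventional choice is the IUPAC 3-sigma criterion, LOD = 3 * sigma_blank / m, where m is the slope of the calibration curve. The sketch below applies that criterion to hypothetical calibration data, not measurements from the study.

```python
# Limit-of-detection estimate via the IUPAC 3-sigma criterion:
# LOD = 3 * (std. dev. of blank signal) / (calibration-curve slope).
# Concentrations and intensities below are hypothetical, for illustration.
import numpy as np

conc = np.array([0.0, 10.0, 50.0, 100.0, 500.0])        # ppm (assumed)
intensity = np.array([2.1, 55.3, 260.8, 512.0, 2540.0])  # line intensity (a.u.)

# Linear calibration: intensity = slope * concentration + intercept.
slope, intercept = np.polyfit(conc, intensity, 1)
sigma_blank = 1.5   # std. dev. of repeated blank measurements (assumed)
lod = 3.0 * sigma_blank / slope
print(f"slope = {slope:.3f} a.u./ppm, LOD = {lod:.3f} ppm")
```

Multivariate models like those used in the study replace the single-line slope with a regression over many spectral channels, but the 3-sigma logic for reporting a detection limit is the same.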
Fundamental ecology is fundamental.
Courchamp, Franck; Dunne, Jennifer A; Le Maho, Yvon; May, Robert M; Thébaud, Christophe; Hochberg, Michael E
2015-01-01
The primary reasons for conducting fundamental research are satisfying curiosity, acquiring knowledge, and achieving understanding. Here we set out why we believe it is essential to promote basic ecological research, despite increased impetus for ecologists to conduct and present their research in the light of potential applications. These reasons include the understanding of our environment, for intellectual, economic, social, and political ends, and as a major source of innovation. We contend that we should focus less on short-term, objective-driven research and more on creativity and exploratory analyses, quantitatively estimate the benefits of fundamental research for society, and better explain the nature and importance of fundamental ecology to students, politicians, decision makers, and the general public. Our perspective and underlying arguments should also apply to evolutionary biology and to many of the other biological and physical sciences. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Representational Similarity Analysis – Connecting the Branches of Systems Neuroscience
Kriegeskorte, Nikolaus; Mur, Marieke; Bandettini, Peter
2008-01-01
A fundamental challenge for systems neuroscience is to quantitatively relate its three major branches of research: brain-activity measurement, behavioral measurement, and computational modeling. Using measured brain-activity patterns to evaluate computational network models is complicated by the need to define the correspondence between the units of the model and the channels of the brain-activity data, e.g., single-cell recordings or voxels from functional magnetic resonance imaging (fMRI). Similar correspondence problems complicate relating activity patterns between different modalities of brain-activity measurement (e.g., fMRI and invasive or scalp electrophysiology), and between subjects and species. In order to bridge these divides, we suggest abstracting from the activity patterns themselves and computing representational dissimilarity matrices (RDMs), which characterize the information carried by a given representation in a brain or model. Building on a rich psychological and mathematical literature on similarity analysis, we propose a new experimental and data-analytical framework called representational similarity analysis (RSA), in which multi-channel measures of neural activity are quantitatively related to each other and to computational theory and behavior by comparing RDMs. We demonstrate RSA by relating representations of visual objects as measured with fMRI in early visual cortex and the fusiform face area to computational models spanning a wide range of complexities. The RDMs are simultaneously related via second-level application of multidimensional scaling and tested using randomization and bootstrap techniques. We discuss the broad potential of RSA, including novel approaches to experimental design, and argue that these ideas, which have deep roots in psychology and neuroscience, will allow the integrated quantitative analysis of data from all three branches, thus contributing to a more unified systems neuroscience. PMID:19104670
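The RDM comparison at the heart of RSA can be sketched in a few lines. This toy example (random activity patterns with hypothetical dimensions, not data from the study) computes correlation-distance RDMs for two simulated systems and relates them by rank correlation of their upper triangles, sidestepping any unit-to-channel correspondence.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical activity patterns: 8 stimuli x 50 channels (e.g. voxels) for two systems.
patterns_a = rng.standard_normal((8, 50))
patterns_b = patterns_a + 0.5 * rng.standard_normal((8, 50))  # a noisy variant

# Representational dissimilarity matrix: 1 - Pearson correlation between stimulus patterns.
rdm_a = squareform(pdist(patterns_a, metric="correlation"))
rdm_b = squareform(pdist(patterns_b, metric="correlation"))

# Compare the two representations by rank-correlating the upper triangles of their RDMs;
# no mapping between the 50 channels of system A and those of system B is ever needed.
iu = np.triu_indices_from(rdm_a, k=1)
rho, _ = spearmanr(rdm_a[iu], rdm_b[iu])
print(f"RSA similarity (Spearman rho) = {rho:.2f}")
```

The same second-level comparison works between a model RDM and a measured one, which is what lets RSA bridge models, modalities, subjects, and species.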
The dynamics of adapting, unregulated populations and a modified fundamental theorem.
O'Dwyer, James P
2013-01-06
A population in a novel environment will accumulate adaptive mutations over time, and the dynamics of this process depend on the underlying fitness landscape: the fitness of and mutational distance between possible genotypes in the population. Despite its fundamental importance for understanding the evolution of a population, inferring this landscape from empirical data has been problematic. We develop a theoretical framework to describe the adaptation of a stochastic, asexual, unregulated, polymorphic population undergoing beneficial, neutral and deleterious mutations on a correlated fitness landscape. We generate quantitative predictions for the change in the mean fitness and within-population variance in fitness over time, and find a simple, analytical relationship between the distribution of fitness effects arising from a single mutation, and the change in mean population fitness over time: a variant of Fisher's 'fundamental theorem' which explicitly depends on the form of the landscape. Our framework can therefore be thought of in three ways: (i) as a set of theoretical predictions for adaptation in an exponentially growing phase, with applications in pathogen populations, tumours or other unregulated populations; (ii) as an analytically tractable problem to potentially guide theoretical analysis of regulated populations; and (iii) as a basis for developing empirical methods to infer general features of a fitness landscape.
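For the selection-only case (no mutation), the classical identity that the paper modifies can be checked directly: over one discrete generation, the change in mean fitness equals the fitness variance divided by the mean fitness. A minimal sketch with hypothetical genotype frequencies and fitnesses:

```python
import numpy as np

# Toy haploid population: genotype frequencies p and fitnesses w (hypothetical values).
p = np.array([0.5, 0.3, 0.2])
w = np.array([1.0, 1.2, 0.8])

w_bar = np.dot(p, w)                      # mean fitness before selection
p_next = p * w / w_bar                    # frequencies after one round of selection
delta_w_bar = np.dot(p_next, w) - w_bar   # realized change in mean fitness

# Fisher's classical identity for selection alone (no mutation):
# the change in mean fitness equals Var(w) / mean(w).
var_w = np.dot(p, (w - w_bar) ** 2)
print(delta_w_bar, var_w / w_bar)  # the two quantities coincide
```

The paper's variant adds mutation on a correlated landscape, so the right-hand side acquires an explicit landscape-dependent term; the sketch above is only the mutation-free baseline.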
HuMOVE: a low-invasive wearable monitoring platform in sexual medicine.
Ciuti, Gastone; Nardi, Matteo; Valdastri, Pietro; Menciassi, Arianna; Basile Fasolo, Ciro; Dario, Paolo
2014-10-01
To investigate an accelerometer-based wearable system, named Human Movement (HuMOVE) platform, designed to enable quantitative and continuous measurement of sexual performance with minimal invasiveness and inconvenience for users. Design, implementation, and development of HuMOVE, a wearable platform equipped with an accelerometer sensor for monitoring inertial parameters for sexual performance assessment and diagnosis, were performed. The system enables quantitative measurement of movement parameters during sexual intercourse, meeting the requirements of wearability, data storage, sampling rate, and interfacing methods, which are fundamental for human sexual intercourse performance analysis. HuMOVE was validated through characterization using a controlled experimental test bench and evaluated in a human model during simulated sexual intercourse conditions. HuMOVE proved to be a robust and quantitative monitoring platform and a reliable candidate for sexual performance evaluation and diagnosis. Characterization analysis on the controlled experimental test bench demonstrated an accurate correlation between the HuMOVE system and data from a reference displacement sensor. Experimental tests in the human model during simulated intercourse conditions confirmed the accuracy of the sexual performance evaluation platform and the effectiveness of the selected and derived parameters. The outcomes also met the project expectations for usability and comfort, as evidenced by questionnaires highlighting the low invasiveness and high acceptance of the device. To the best of our knowledge, the HuMOVE platform is the first device for human sexual performance analysis compatible with sexual intercourse; the system has the potential to be a helpful tool for physicians to accurately classify sexual disorders, such as premature or delayed ejaculation. Copyright © 2014 Elsevier Inc. All rights reserved.
A two-factor error model for quantitative steganalysis
NASA Astrophysics Data System (ADS)
Böhme, Rainer; Ker, Andrew D.
2006-02-01
Quantitative steganalysis refers to the exercise not only of detecting the presence of hidden stego messages in carrier objects, but also of estimating the secret message length. This problem is well studied, with many detectors proposed but only a sparse analysis of errors in the estimators. A deep understanding of the error model, however, is a fundamental requirement for the assessment and comparison of different detection methods. This paper presents a rationale for a two-factor model for sources of error in quantitative steganalysis, and shows evidence from a dedicated large-scale nested experimental set-up with a total of more than 200 million attacks. Apart from general findings about the distribution functions found in both classes of errors, their respective weight is determined, and implications for statistical hypothesis tests in benchmarking scenarios or regression analyses are demonstrated. The results are based on a rigorous comparison of five different detection methods under many different external conditions, such as size of the carrier, previous JPEG compression, and colour channel selection. We include analyses demonstrating the effects of local variance and cover saturation on the different sources of error, as well as presenting the case for a relative bias model for between-image error.
Systems microscopy: an emerging strategy for the life sciences.
Lock, John G; Strömblad, Staffan
2010-05-01
Dynamic cellular processes occurring in time and space are fundamental to all physiology and disease. To understand complex and dynamic cellular processes therefore demands the capacity to record and integrate quantitative multiparametric data from the four spatiotemporal dimensions within which living cells self-organize, and to subsequently use these data for the mathematical modeling of cellular systems. To this end, a raft of complementary developments in automated fluorescence microscopy, cell microarray platforms, quantitative image analysis and data mining, combined with multivariate statistics and computational modeling, now coalesce to produce a new research strategy, "systems microscopy", which facilitates systems biology analyses of living cells. Systems microscopy provides the crucial capacities to simultaneously extract and interrogate multiparametric quantitative data at resolution levels ranging from the molecular to the cellular, thereby elucidating a more comprehensive and richly integrated understanding of complex and dynamic cellular systems. The unique capacities of systems microscopy suggest that it will become a vital cornerstone of systems biology, and here we describe the current status and future prospects of this emerging field, as well as outlining some of the key challenges that remain to be overcome. Copyright 2010 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.
2013-07-01
The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques for this purpose. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique instead is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted and the resulting model can be used for the validation of the applied imaging and image analysis protocols. We report here an SR μ-CT image analysis approach that effectively and accurately reveals the differences in the pore- and throat-size distributions, as well as the connectivity, of both AM and SCPL scaffolds.
Photochemical Reactions of Tris (Oxalato) Iron (III): A First-Year Chemistry Experiment.
ERIC Educational Resources Information Center
Baker, A. D.; And Others
1980-01-01
Describes a first-year chemistry experiment that illustrates the fundamental concepts of a photoinduced reaction. Qualitative and quantitative parts of the photoreduction of potassium ferrioxalate are detailed. (CS)
Pediatric environmental medicine in Eastern Central Europe.
Muceniece, S; Muszynska, M; Otto, M; Rozentale, G; Rudkowski, Z; Skerliene, B; Slotova, K; Suurorg, L; Tur, I; von Mühlendahl, K E
2007-10-01
Pediatric environmental medicine in Central Eastern Europe needs support and development on a national, institutional, and individual basis. This situation is quantitatively, but not fundamentally, different from that found in Central Europe.
Applications of surface analysis and surface theory in tribology
NASA Technical Reports Server (NTRS)
Ferrante, John
1988-01-01
Tribology, the study of adhesion, friction and wear of materials is a complex field which requires a knowledge of solid state physics, surface physics, chemistry, material science and mechanical engineering. It has been dominated, however, by the more practical need to make equipment work. With the advent of surface analysis and advances in surface and solid state theory, a new dimension has been added to the analysis of interactions at tribological interfaces. In this paper the applications of tribological studies and their limitations are presented. Examples from research at the NASA Lewis Research Center are given. Emphasis is on fundamental studies involving the effects of monolayer coverage and thick films on friction and wear. A summary of the current status of theoretical calculations of defect energetics is presented. In addition, some new theoretical techniques which enable simplified quantitative calculations of adhesion, fracture and friction are discussed.
Application of atomic force microscopy as a nanotechnology tool in food science.
Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng
2007-05-01
Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. First, this review explains the fundamentals of AFM, including principle, manipulation, and analysis. Applications of AFM are then reported in food science and technology research, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interaction, molecular manipulation, surface topography, and nanofood characterization. The results suggest that AFM can provide valuable insight into food properties, and that AFM analysis can be used to illustrate some mechanisms of property changes during processing and storage. However, the current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology, together with the development of corresponding methodologies for complicated food systems, would lead to a more in-depth understanding of food properties at the macromolecular level and broaden the technique's applications. The AFM results could greatly improve food processing and storage technologies.
Burns, P
1986-05-01
An acoustical analysis of the speaking and singing voices of two types of professional singers was conducted. The vowels /i/, /a/, and /o/ were spoken and sung ten times each by seven opera and seven country and western singers. Vowel spectra were derived by computer software techniques allowing quantitative assessment of formant structure (F1-F4), relative amplitude of resonance peaks (F1-F4), fundamental frequency, and harmonic high frequency energy. Formant analysis was the most effective parameter differentiating the two groups. Only opera singers lowered their fourth formant creating a wide-band resonance area (approximately 2,800 Hz) corresponding to the well-known "singing formant." Country and western singers revealed similar resonatory voice characteristics for both spoken and sung output. These results implicate faulty vocal technique in country and western singers as a contributory reason for vocal abuse/fatigue.
NASA Astrophysics Data System (ADS)
Huang, Chun-Yi; Chang, Hsin-Wei; Chang, Che-Chen
2018-03-01
Knowledge about the chemical compositions of meso/nanomaterials is fundamental to the development of their applications in advanced technologies. Auger electron spectroscopy (AES) is an effective analysis method for the characterization of meso/nanomaterial structures. Although a few studies have reported the use of AES for the analysis of the local composition of these structures, none have explored in detail the validity of the meso/nanoanalysis results generated by the AES instrument. This paper addresses the limitations of AES and the corrections necessary to offset them for this otherwise powerful meso/nanoanalysis tool. The results of corrections made to the AES multi-point analysis of high-density copper-based meso/nanostructures provide major insights into their local chemical compositions and technological prospects, which the raw composition output of the AES instrument fails to provide.
Chandel, Shubham; Soni, Jalpa; Ray, Subir Kumar; Das, Anwesh; Ghosh, Anirudha; Raj, Satyabrata; Ghosh, Nirmalya
2016-01-01
Information on the polarization properties of scattered light from plasmonic systems is of paramount importance, owing both to fundamental interest and to potential applications. However, such studies are severely compromised by the experimental difficulties in recording the full polarization response of plasmonic nanostructures. Here, we report on a novel Mueller matrix spectroscopic system capable of acquiring complete polarization information from a single isolated plasmonic nanoparticle/nanostructure. The outstanding issues pertaining to reliable measurements of full 4 × 4 spectroscopic scattering Mueller matrices from single nanoparticles/nanostructures are overcome by integrating an efficient Mueller matrix measurement scheme and a robust eigenvalue calibration method with a dark-field microscopic spectroscopy arrangement. The feasibility of quantitative Mueller matrix polarimetry and its potential utility are illustrated on a simple plasmonic system, that of gold nanorods. The demonstrated ability to record full polarization information over a broad wavelength range and to quantify the intrinsic plasmon polarimetry characteristics via Mueller matrix inverse analysis should lead to a novel route towards quantitative understanding, analysis, and interpretation of a number of intricate plasmonic effects, and may also prove useful for the development of novel polarization-controlled sensing schemes. PMID:27212687
Greco, Todd M.; Guise, Amanda J.; Cristea, Ileana M.
2016-01-01
In biological systems, proteins catalyze the fundamental reactions that underlie all cellular functions, including metabolic processes and cell survival and death pathways. These biochemical reactions are rarely accomplished alone. Rather, they involve a concerted effect from many proteins that may operate in a directed signaling pathway and/or may physically associate in a complex to achieve a specific enzymatic activity. Therefore, defining the composition and regulation of protein complexes is critical for understanding cellular functions. In this chapter, we describe an approach that uses quantitative mass spectrometry (MS) to assess the specificity and the relative stability of protein interactions. Isolation of protein complexes from mammalian cells is performed by rapid immunoaffinity purification, and followed by in-solution digestion and high-resolution mass spectrometry analysis. We employ complementary quantitative MS workflows to assess the specificity of protein interactions using label-free MS and statistical analysis, and the relative stability of the interactions using a metabolic labeling technique. For each candidate protein interaction, scores from the two workflows can be correlated to minimize nonspecific background and profile protein complex composition and relative stability. PMID:26867737
Security of BB84 with weak randomness and imperfect qubit encoding
NASA Astrophysics Data System (ADS)
Zhao, Liang-Yuan; Yin, Zhen-Qiang; Li, Hong-Wei; Chen, Wei; Fang, Xi; Han, Zheng-Fu; Huang, Wei
2018-03-01
The main threats to practical Bennett-Brassard 1984 (BB84) quantum key distribution (QKD) systems are that their encoding is inaccurate and that their measurement devices may be vulnerable to particular attacks. Thus, a general physical model or security proof that tackles these loopholes simultaneously and quantitatively is highly desired. Here we give a framework for the security of BB84 when imperfect qubit encoding and vulnerability of the measurement device are both considered. In our analysis, the potential attacks on the measurement device are generalized by the recently proposed weak randomness model, which assumes the input random numbers are partially biased depending on a hidden variable planted by an eavesdropper. The inevitable encoding inaccuracy is also incorporated. From a fundamental view, our work reveals the potential information leakage due to encoding inaccuracy and weak randomness input. For applications, our result can be viewed as a useful tool to quantitatively evaluate the security of a practical QKD system.
El-Rami, Fadi; Nelson, Kristina; Xu, Ping
2017-01-01
Streptococcus sanguinis is a commensal and early colonizer of the oral cavity, as well as an opportunistic pathogen in infective endocarditis. Extracting the soluble proteome of this bacterium provides deep insights into the physiological dynamic changes under different growth and stress conditions, thus defining "proteomic signatures" as targets for therapeutic intervention. In this protocol, we describe an experimentally verified approach to extract maximal cytoplasmic proteins from the Streptococcus sanguinis SK36 strain. A combination of procedures was adopted that broke the thick cell wall barrier and minimized denaturation of the intracellular proteome, using optimized buffers and a sonication step. The extracted proteome was quantified using the Pierce BCA Protein Quantitation assay, and protein bands were macroscopically assessed by Coomassie Blue staining. Finally, high resolution detection of the extracted proteins was conducted on a Synapt G2Si mass spectrometer, followed by label-free relative quantification via Progenesis QI. In conclusion, this pipeline for proteomic extraction and analysis of soluble proteins provides a fundamental tool for deciphering the biological complexity of Streptococcus sanguinis. PMID:29152022
Investigating the strategic antecedents of agility in humanitarian logistics.
L'Hermitte, Cécile; Brooks, Benjamin; Bowles, Marcus; Tatham, Peter H
2017-10-01
This study investigates the strategic antecedents of operational agility in humanitarian logistics. It began by identifying the particular actions to be taken at the strategic level of a humanitarian organisation to support field-level agility. Next, quantitative data (n=59) were collected on four strategic-level capabilities (being purposeful, action-focused, collaborative, and learning-oriented) and on operational agility (field responsiveness and flexibility). Using a quantitative analysis, the study tested the relationship between organisational capacity building and operational agility and found that the four strategic-level capabilities are fundamental building blocks of agility. Collectively they account for 52 per cent of the ability of humanitarian logisticians to deal with ongoing changes and disruptions in the field. This study emphasises the need for researchers and practitioners to embrace a broader perspective of agility in humanitarian logistics. In addition, it highlights the inherently strategic nature of agility, the development of which involves focusing simultaneously on multiple drivers. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.
NASA Astrophysics Data System (ADS)
Vilar, Jose M. G.; Saiz, Leonor
2006-06-01
DNA looping plays a fundamental role in a wide variety of biological processes, providing the backbone for long range interactions on DNA. Here we develop the first model for DNA looping by an arbitrarily large number of proteins and solve it analytically in the case of identical binding. We uncover a switchlike transition between looped and unlooped phases and identify the key parameters that control this transition. Our results establish the basis for the quantitative understanding of fundamental cellular processes like DNA recombination, gene silencing, and telomere maintenance.
NASA Technical Reports Server (NTRS)
Weaver, David
2008-01-01
Effectively communicate qualitative and quantitative information orally and in writing. Explain the application of fundamental physical principles to various physical phenomena. Apply appropriate problem-solving techniques to practical and meaningful problems using graphical, mathematical, and written modeling tools. Work effectively in collaborative groups.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan, C.G.; De Geronimo, G.; Kirkham, R.
2009-11-13
The fundamental parameter method for quantitative SXRF and PIXE analysis and imaging using the dynamic analysis method is extended to model the changing X-ray yields and detector sensitivity with angle across large detector arrays. The method is implemented in the GeoPIXE software and applied to cope with the large solid angle of the new Maia 384 detector array and its 96-detector prototype developed by CSIRO and BNL for SXRF imaging applications at the Australian and NSLS synchrotrons. Peak-to-background is controlled by mitigating charge sharing between detectors through careful optimization of a patterned molybdenum absorber mask. A geological application demonstrates the capability of the method to produce high-definition elemental images up to ~100 Mpixels in size.
Neutron activation analysis of certified samples by the absolute method
NASA Astrophysics Data System (ADS)
Kadem, F.; Belouadah, N.; Idiri, Z.
2015-07-01
The nuclear reaction analysis technique is mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for the calculated cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the concentrations of the various constituents of certified samples of animal blood, milk, and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. This absolute method allows measurements as accurate as those of the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
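The fundamental activation equation referred to above, A = N σ φ (1 − e^(−λ t_irr)) e^(−λ t_decay), can be inverted to obtain the number of target atoms, and hence the elemental mass, from a measured activity. A sketch for a hypothetical Na-23(n,γ)Na-24 activation; the flux, times, and measured activity below are illustrative values, not data from the study.

```python
import numpy as np

N_A = 6.02214076e23  # Avogadro's number, 1/mol

# Hypothetical irradiation: Na-23(n,gamma) -> Na-24.
sigma = 0.53e-24      # activation cross section, cm^2 (~0.53 barn)
phi = 1.0e13          # thermal neutron flux, n / (cm^2 s)
half_life = 14.96 * 3600.0          # Na-24 half-life, s
lam = np.log(2) / half_life         # decay constant, 1/s
t_irr, t_decay = 3600.0, 1800.0     # irradiation and cooling times, s

# Fundamental activation equation:
#   A = N * sigma * phi * (1 - exp(-lam * t_irr)) * exp(-lam * t_decay)
# Invert it to get the number of target atoms N from the measured activity A.
A_measured = 5.0e4  # Bq, hypothetical
N = A_measured / (sigma * phi * (1.0 - np.exp(-lam * t_irr)) * np.exp(-lam * t_decay))

mass_g = N * 22.99 / N_A  # grams of Na-23 in the sample
print(f"Na atoms: {N:.3e}, mass: {mass_g * 1e6:.2f} micrograms")
```

This is exactly why no standard sample is needed: every quantity on the right-hand side is either measured (A, φ, times) or taken from nuclear data (σ, λ).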
Han, Shuting; Taralova, Ekaterina; Dupre, Christophe; Yuste, Rafael
2018-03-28
Animal behavior has been studied for centuries, but few efficient methods are available to automatically identify and classify it. Quantitative behavioral studies have been hindered by the subjective and imprecise nature of human observation, and the slow speed of annotating behavioral data. Here, we developed an automatic behavior analysis pipeline for the cnidarian Hydra vulgaris using machine learning. We imaged freely behaving Hydra, extracted motion and shape features from the videos, and constructed a dictionary of visual features to classify pre-defined behaviors. We also identified unannotated behaviors with unsupervised methods. Using this analysis pipeline, we quantified 6 basic behaviors and found surprisingly similar behavior statistics across animals within the same species, regardless of experimental conditions. Our analysis indicates that the fundamental behavioral repertoire of Hydra is stable. This robustness could reflect a homeostatic neural control of "housekeeping" behaviors which could have been already present in the earliest nervous systems. © 2018, Han et al.
Lorantfy, Bettina; Seyer, Bernhard; Herwig, Christoph
2014-01-25
Extreme halophilic Archaea are extremophiles that can thrive in hypersaline environments of up to 3-5 M sodium chloride. Although their ecology and physiology are well characterized at the microbiological level, little emphasis has been placed on quantitative bioprocess development with extreme halophiles. The goal of this study was to establish, on the one hand, a methodological basis for quantitative bioprocess analysis of extreme halophilic Archaea, using an extreme halophilic strain as an example. First, a corrosion-resistant bioreactor setup for extreme halophiles was implemented as a novel approach. Then, paying special attention to total bioprocess quantification, an indirect method for biomass quantification using on-line process signals was introduced. Subsequently, robust quantitative data evaluation methods for halophiles could be developed, providing defined and controlled cultivation conditions in the bioreactor and therefore yielding on-line as well as off-line datasets of suitable quality. On the other hand, new physiological results for extreme halophiles in the bioreactor were also obtained using these quantitative methodological tools. For the first time, quantitative data on stoichiometry and kinetics were collected and evaluated on different carbon sources. The results on various substrates were interpreted, with proposed metabolic mechanisms, by linking them to the reported primary carbon metabolism of extreme halophilic Archaea. Moreover, results of chemostat cultures demonstrated that extreme halophilic organisms follow Monod kinetics on different sole carbon sources. A diauxic growth pattern was described on a mixture of substrates in batch cultivations. In addition, the methodologies presented here enable characterization of the utilized strain Haloferax mediterranei (HFX) as a potential new host organism.
Thus, this study offers a strong methodological basis as well as a fundamental physiological assessment for bioreactor quantification of extreme halophiles that can serve as primary knowledge for applications of extreme halophiles in biotechnology. Copyright © 2013 Elsevier B.V. All rights reserved.
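The Monod kinetics reported for the chemostat cultures, μ = μ_max S / (K_s + S), can be illustrated with a minimal batch simulation. All parameter values below are hypothetical placeholders, not the measured values for Haloferax mediterranei.

```python
import numpy as np

# Monod kinetics: specific growth rate as a function of substrate concentration S.
def monod(S, mu_max, Ks):
    return mu_max * S / (Ks + S)

# Hypothetical parameters for a halophilic batch culture.
mu_max, Ks, Y = 0.09, 0.5, 0.4   # 1/h, g/L, g biomass per g substrate

# Simple explicit-Euler batch simulation: biomass X grows, substrate S is consumed.
dt, X, S = 0.01, 0.05, 10.0      # h, g/L, g/L
for _ in range(int(200 / dt)):
    mu = monod(S, mu_max, Ks)
    dX = mu * X * dt
    X += dX
    S = max(S - dX / Y, 0.0)

print(f"final biomass: {X:.2f} g/L, residual substrate: {S:.3f} g/L")
```

Note the defining property μ(K_s) = μ_max/2; in a chemostat the same expression fixes the residual substrate concentration at a given dilution rate, which is how Monod parameters are typically estimated.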
NASA Astrophysics Data System (ADS)
Rubinger, Rero Marques; da Silva, Edna Raimunda; Pinto, Daniel Zaroni; Rubinger, Carla Patrícia Lacerda; Oliveira, Adhimar Flávio; da Costa Bortoni, Edson
2015-01-01
We compared the photometric and radiometric quantities in the visible, ultraviolet, and infrared spectra of white light-emitting diodes (LEDs), incandescent light bulbs, and a compact fluorescent lamp used for home illumination. The color-rendering index and efficiency-related quantities were also used as auxiliary tools in this comparison. LEDs performed better in all aspects except the color-rendering index, which is better for the incandescent light bulb. Compact fluorescent lamps produced results that, in our assessment, do not justify substituting them for the incandescent light bulb. The main contribution of this work is an approach based on fundamental quantities to evaluate LEDs and other light sources.
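The link between the radiometric and photometric quantities compared here is the photopic luminosity function V(λ): luminous flux in lumens is 683 lm/W times the V(λ)-weighted integral of the spectral power. The sketch below uses a crude Gaussian stand-in for V(λ) (an assumption; real evaluations should use the tabulated CIE data) and two hypothetical 1 W sources.

```python
import numpy as np

# Crude Gaussian stand-in for the CIE photopic luminosity function V(lambda),
# peaked at 555 nm (an assumption; real work should use the tabulated CIE data).
def V(lam_nm):
    return np.exp(-0.5 * ((lam_nm - 555.0) / 45.0) ** 2)

def integrate(y, x):
    return float(np.sum(y) * (x[1] - x[0]))  # uniform-grid rectangle rule

def luminous_flux(wavelengths_nm, spectral_power_w_per_nm):
    # Photometric flux (lm) = 683 lm/W * integral of V(lambda) * Phi_e(lambda) dlambda
    return 683.0 * integrate(V(wavelengths_nm) * spectral_power_w_per_nm, wavelengths_nm)

lam = np.linspace(380.0, 780.0, 2001)

# Two hypothetical 1 W sources: flat over 400-700 nm vs. narrow-band near 555 nm.
flat = np.where((lam >= 400.0) & (lam <= 700.0), 1.0 / 300.0, 0.0)
narrow = np.exp(-0.5 * ((lam - 555.0) / 5.0) ** 2)
narrow = narrow / integrate(narrow, lam)

print(f"flat spectrum:    {luminous_flux(lam, flat):.0f} lm")
print(f"narrow at 555 nm: {luminous_flux(lam, narrow):.0f} lm")
```

The narrow-band source approaches the 683 lm/W ceiling while the flat spectrum falls well short, which is exactly why luminous efficacy, not radiant efficiency alone, separates the lamp technologies compared in the paper.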
Sakuraba, Shun; Asai, Kiyoshi; Kameda, Tomoshi
2015-11-05
The dimerization free energies of RNA-RNA duplexes are fundamental values that represent the structural stability of RNA complexes. We report a comparative analysis of RNA-RNA duplex dimerization free-energy changes upon mutations, estimated from a molecular dynamics simulation and experiments. A linear regression for nine pairs of double-stranded RNA sequences, six base pairs each, yielded a mean absolute deviation of 0.55 kcal/mol and an R² value of 0.97, indicating quantitative agreement between simulations and experimental data. The observed accuracy indicates that the molecular dynamics simulation with the current molecular force field is capable of estimating the thermodynamic properties of RNA molecules.
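The agreement statistics quoted above (mean absolute deviation and R²) can be reproduced from paired ΔΔG values with a few lines of NumPy; the nine numbers below are invented placeholders, not the paper's data:

```python
import numpy as np

# Hypothetical dimerization free-energy changes upon mutation (kcal/mol)
ddg_exp = np.array([0.3, 1.1, -0.8, 2.0, -1.5, 0.6, 1.8, -0.2, 0.9])
ddg_sim = np.array([0.5, 1.0, -0.6, 2.3, -1.2, 0.8, 1.6, -0.4, 1.1])

# Mean absolute deviation between simulation and experiment
mad = np.mean(np.abs(ddg_sim - ddg_exp))

# R^2 of a linear fit sim ~ exp
slope, intercept = np.polyfit(ddg_exp, ddg_sim, 1)
pred = slope * ddg_exp + intercept
ss_res = np.sum((ddg_sim - pred) ** 2)
ss_tot = np.sum((ddg_sim - np.mean(ddg_sim)) ** 2)
r2 = 1.0 - ss_res / ss_tot
```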
Recent advances in non-LTE stellar atmosphere models
NASA Astrophysics Data System (ADS)
Sander, Andreas A. C.
2017-11-01
In the last decades, stellar atmosphere models have become a key tool in understanding massive stars. Applied for spectroscopic analysis, these models provide quantitative information on stellar wind properties as well as fundamental stellar parameters. The intricate non-LTE conditions in stellar winds dictate the development of adequately sophisticated model atmosphere codes. Increases in both computational power and our understanding of physical processes in stellar atmospheres have led to increasingly complex models. As a result, codes emerged that can tackle a wide range of stellar and wind parameters. After briefly addressing the fundamentals of stellar atmosphere modeling, the current state of clumped and line-blanketed model atmospheres will be discussed. Finally, the path for the next generation of stellar atmosphere models will be outlined. Apart from discussing multi-dimensional approaches, I will emphasize the coupling of hydrodynamics with a sophisticated treatment of the radiative transfer. This next generation of models will be able to predict wind parameters from first principles, which could open new doors for our understanding of the various facets of massive star physics, evolution, and death.
NASA Technical Reports Server (NTRS)
De Bothezat, George
1920-01-01
This report presents a theory which not only gives a complete picture and an exact quantitative analysis of the whole phenomenon of the working of blade screws, but also unites in a continuous whole the entire scale of working states conceivable for a blade screw. Chapter 1 is devoted to the establishment of the system of fundamental equations relating to the blade screw. Chapter 2 contains a general discussion of the 16 states of work which may establish themselves for a blade screw. The existence of the vortex ring state and the whirling phenomenon are established. All the fundamental functions which enter the blade-screw theory are subjected to a general analytical discussion. The general outline of the curve of the specific function is examined. Two limiting cases of the work of the screw, the screw with zero constructive pitch and the screw with infinite constructive pitch, are pointed out. Chapter 3 is devoted to the study of the propulsive screw or propeller. (author)
An alternative approach based on artificial neural networks to study controlled drug release.
Reis, Marcus A A; Sinisterra, Rubén D; Belchior, Jadson C
2004-02-01
An alternative methodology based on artificial neural networks is proposed as a complementary tool to conventional methods for studying controlled drug release. Two systems are used to test the approach: hydrocortisone in a biodegradable matrix and rhodium(II) butyrate complexes in a bioceramic matrix. Two well-established mathematical models are used to simulate different release profiles as a function of fundamental properties, namely the diffusion coefficient (D), saturation solubility (C(s)), drug loading (A), and the height of the device (h). The models were tested, and the results show that these fundamental properties can be predicted after learning the experimental or model data for controlled drug release systems. The neural network results obtained after the learning stage can be used to quantitatively predict ideal experimental conditions. Overall, the proposed methodology was shown to be efficient for ideal experiments, with a relative average error of <1% in both tests. This approach can be useful in experimental analysis to simulate and design efficient controlled drug-release systems. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association
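As an illustration of the kind of release model such a network could be trained on, here is the classical Higuchi equation for a planar matrix, which relates cumulative release to D, C(s), and A. Whether it is one of the paper's two models is an assumption, and the parameter values below are invented:

```python
import numpy as np

def higuchi_release(t, D, Cs, A):
    """Cumulative drug released per unit area, Q(t) = sqrt(D*Cs*(2A - Cs)*t).

    Classical Higuchi model for a planar matrix with drug loading A
    well above the saturation solubility Cs.
    """
    return np.sqrt(D * Cs * (2.0 * A - Cs) * t)

# Illustrative parameters (cm^2/s, mg/cm^3, mg/cm^3), not taken from the paper
t = np.linspace(0.0, 3600.0, 61)                  # time points, s
profile = higuchi_release(t, D=1e-7, Cs=2.0, A=50.0)
```

Simulated profiles over a grid of (D, Cs, A, h) values are exactly the sort of training set from which a network can learn the inverse map from profile back to fundamental properties.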
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, T. J.; Aker, P. M.; Scharko, N. K.
Using vetted methods for generating quantitative absorption reference data, broadband infrared and near-infrared spectra (total range 11,000–600 cm-1) of pyridine vapor were recorded at 0.1 cm-1 spectral resolution, with the analyte thermostatted at 298 K and pressure-broadened to 1 atmosphere using N2 ballast gas. The quantitative spectrum is reported for the first time, and we have re-assigned some of the 27 fundamental modes. Fundamental assignments were confirmed by IR vapor-phase band shapes, FT-Raman measurements, and comparison with previous analyses. For the 760-Torr vapor-phase IR data, several bands show resolved peaks (Q-branches). We have also assigned for the first time hundreds of combination and overtone bands in the mid- and near-IR. All assignments were made via comparison to theoretically calculated frequencies and intensities: the frequencies were computed with Gaussian03 with the anharmonic option, using MP2 and the cc-pVTZ basis set. The intensities were taken from a VSCF calculation in GAMESS using Hartree-Fock (for overtones and combination bands) or from the harmonic MP2 for fundamentals. Overtone and combination band harmonic and anharmonic frequencies, as well as intensities, were also calculated using the CFOUR program. It is seen in the NIR spectrum near 6000 cm-1 that the very strong bands arise from the C-H first overtones, whereas only much weaker bands are observed for combination bands of C-H stretching modes. Certain features are discussed for their potential utility for atmospheric monitoring.
Statistical genetics and evolution of quantitative traits
NASA Astrophysics Data System (ADS)
Neher, Richard A.; Shraiman, Boris I.
2011-10-01
The distribution and heritability of many traits depend on numerous loci in the genome. In general, the astronomical number of possible genotypes makes systems with large numbers of loci difficult to describe. Multilocus evolution, however, greatly simplifies in the limit of weak selection and frequent recombination. In this limit, populations rapidly reach quasilinkage equilibrium (QLE), in which the dynamics of the full genotype distribution, including correlations between alleles at different loci, can be parametrized by the allele frequencies. This review provides a simplified exposition of the concept and mathematics of QLE, which is central to the statistical description of genotypes in sexual populations. Key results of quantitative genetics, such as the generalized Fisher's “fundamental theorem” along with Wright's adaptive landscape, are shown to emerge within QLE from the dynamics of the genotype distribution. This is followed by a discussion of the circumstances under which QLE is applicable, and of what the breakdown of QLE implies for population structure and the dynamics of selection. Understanding the fundamental aspects of multilocus evolution obtained through simplified models may be helpful in providing conceptual and computational tools to address the challenges arising in studies of complex quantitative phenotypes of practical interest.
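The "fundamental theorem" referenced above, in its simplest textbook form for Malthusian fitness m (a standard statement, not an equation quoted from this review), says that mean fitness grows at a rate set by the additive genetic variance in fitness:

```latex
\frac{d\bar{m}}{dt} = \sigma_A^2(m)
```

The review's generalized version recovers this relation within QLE directly from the dynamics of the genotype distribution.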
Crocker, Jonny; Bartram, Jamie
2014-01-01
Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduces a country's ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across the seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632
Deformation and Failure Mechanisms of Shape Memory Alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daly, Samantha Hayes
2015-04-15
The goal of this research was to understand the fundamental mechanics that drive the deformation and failure of shape memory alloys (SMAs). SMAs are difficult materials to characterize because of the complex phase transformations that give rise to their unique properties, including shape memory and superelasticity. These phase transformations occur across multiple length scales (one example being the martensite-austenite twinning that underlies macroscopic strain localization) and result in a large hysteresis. In order to optimize the use of this hysteretic behavior in energy storage and damping applications, we must first have a quantitative understanding of this transformation behavior. Prior results on shape memory alloys have been largely qualitative (i.e., mapping phase transformations through cracked oxide coatings or surface morphology). The PI developed and utilized new approaches to provide a quantitative, full-field characterization of phase transformation, conducting a comprehensive suite of experiments across multiple length scales and tying these results to theoretical and computational analysis. The research funded by this award utilized new combinations of scanning electron microscopy, diffraction, digital image correlation, and custom testing equipment and procedures to study phase transformation processes at a wide range of length scales, with a focus at small length scales with spatial resolution on the order of 1 nanometer. These experiments probe the basic connections between length scales during phase transformation. In addition to the insights gained on the fundamental mechanisms driving transformations in shape memory alloys, the unique experimental methodologies developed under this award are applicable to a wide range of solid-to-solid phase transformations and other strain localization mechanisms.
Global Analysis of River Planform Change using the Google Earth Engine
NASA Astrophysics Data System (ADS)
Bryk, A.; Dietrich, W. E.; Gorelick, N.; Sargent, R.; Braudrick, C. A.
2014-12-01
Geomorphologists have historically tracked river dynamics using a combination of maps, aerial photographs, and the stratigraphic record. Although stratigraphic records can extend into deep time, maps and aerial photographs often confine our record of change to sparse measurements over the last ~80 years, and in some cases much less. For the first time, Google's Earth Engine (GEE) cloud-based platform gives researchers the means to quantitatively analyze the pattern and pace of river channel change over the last 30 years with high temporal resolution across the entire planet. The GEE provides an application programming interface (API) that enables quantitative analysis of various data sets, including the entire Landsat L1T archive. This allows change detection for channels wider than about 150 m over 30 years of successive, georeferenced imagery. Qualitatively, it becomes immediately evident that the pace of channel morphodynamics for similar planforms varies by orders of magnitude across the planet and downstream along individual rivers. To quantify these rates of change and to explore their controls, we have developed methods for differentiating channels from floodplain along large alluvial rivers. We introduce a new metric of morphodynamics: the ratio of eroded area to channel area per unit time, referred to as "M". We also track depositional areas resulting from channel shifting. To date our quantitative analysis has focused on rivers in the Andean foreland. Our analysis shows that channel bank erosion rates, M, vary by orders of magnitude for these rivers, from 0 to ~0.25 yr-1, yet these rivers have essentially identical curvature and sinuosity and are visually indistinguishable. By tracking both bank paths in time, we find that, for some meandering rivers, a significant fraction of new floodplain is produced through outer-bank accretion rather than point bar deposition.
This process is perhaps more important in generating floodplain stratigraphy than previously recognized. These initial findings indicate a new set of quantitative observations will emerge to further test and advance morphodynamic theory. The Google Earth Engine offers the opportunity to explore river morphodynamics on an unprecedented scale and provides a powerful tool for addressing fundamental questions in river morphodynamics.
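The metric M defined above reduces to simple mask arithmetic on co-registered channel maps. A sketch with synthetic 10×10 masks and an assumed 5-year interval (not data from the study):

```python
import numpy as np

# Binary channel masks from two co-registered scenes: 1 = channel, 0 = floodplain
mask_t0 = np.zeros((10, 10), dtype=bool)
mask_t1 = np.zeros((10, 10), dtype=bool)
mask_t0[4:6, :] = True          # straight two-row channel at time t0
mask_t1[5:7, :] = True          # channel shifted down one row by time t1

dt_years = 5.0                  # assumed interval between scenes

eroded = ~mask_t0 & mask_t1     # floodplain converted to channel (bank erosion)
deposited = mask_t0 & ~mask_t1  # channel converted to floodplain (accretion)
channel_area = mask_t0.sum()

# M: eroded area per unit channel area per unit time (yr^-1)
M = eroded.sum() / (channel_area * dt_years)
```

With Landsat-scale pixels the same arithmetic applies per scene pair; the hard part the abstract alludes to is producing reliable channel masks in the first place.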
Ji, Qinqin; Salomon, Arthur R.
2015-01-01
The activation of T-lymphocytes through antigen-mediated T-cell receptor (TCR) clustering is vital in regulating the adaptive-immune response. Although T cell receptor signaling has been extensively studied, the fundamental mechanisms for signal initiation are not fully understood. Reduced temperature initiated some of the hallmarks of TCR signaling, such as increased phosphorylation and activation of ERK and calcium release from the endoplasmic reticulum, as well as coalescence of T-cell membrane microdomains. The precise mechanism of TCR signaling initiation due to temperature change remains obscure. One critical question is whether signaling initiated by cold treatment of T cells differs from signaling initiated by crosslinking of the T cell receptor. To address this uncertainty, a wide-scale, quantitative mass spectrometry-based phosphoproteomic analysis was performed on T cells stimulated either by temperature shift or through crosslinking of the TCR. Careful statistical comparison between the two stimulations revealed a striking level of identity between the subset of 339 sites that changed significantly with both stimulations. This study demonstrates for the first time, at unprecedented detail, that T cell cold treatment was sufficient to initiate signaling patterns nearly identical to soluble antibody stimulation, shedding new light on the mechanism of activation of these critically important immune cells. PMID:25839225
Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures
de la Fuente, Ildefonso Martínez
2010-01-01
One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111
Lorenz, Alyson; Dhingra, Radhika; Chang, Howard H; Bisanzio, Donal; Liu, Yang; Remais, Justin V
2014-01-01
Extrapolating landscape regression models for use in assessing vector-borne disease risk and other applications requires thoughtful evaluation of fundamental model choice issues. To examine implications of such choices, an analysis was conducted to explore the extent to which disparate landscape models agree in their epidemiological and entomological risk predictions when extrapolated to new regions. Agreement between six literature-drawn landscape models was examined by comparing predicted county-level distributions of either Lyme disease or Ixodes scapularis vector using Spearman ranked correlation. AUC analyses and multinomial logistic regression were used to assess the ability of these extrapolated landscape models to predict observed national data. Three models based on measures of vegetation, habitat patch characteristics, and herbaceous landcover emerged as effective predictors of observed disease and vector distribution. An ensemble model containing these three models improved precision and predictive ability over individual models. A priori assessment of qualitative model characteristics effectively identified models that subsequently emerged as better predictors in quantitative analysis. Both a methodology for quantitative model comparison and a checklist for qualitative assessment of candidate models for extrapolation are provided; both tools aim to improve collaboration between those producing models and those interested in applying them to new areas and research questions.
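The model-agreement step above uses Spearman ranked correlation, which is simply Pearson correlation applied to rank vectors. A minimal sketch (assuming no ties; the county-level predictions are invented):

```python
import numpy as np

def spearman_rho(a, b):
    """Spearman rank correlation: Pearson correlation of the rank vectors.

    Minimal sketch assuming no tied values (ties would require midranks).
    """
    ra = np.argsort(np.argsort(a)).astype(float)   # rank of each element
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb)))

# Hypothetical county-level risk predictions from two landscape models
model_a = np.array([0.1, 0.4, 0.35, 0.8, 0.2])
model_b = np.array([0.15, 0.5, 0.3, 0.9, 0.25])
rho = spearman_rho(model_a, model_b)
```

Because only rank order matters, two models can agree perfectly under this metric even when their absolute risk values differ, which is exactly why it suits comparing heterogeneous literature-drawn models.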
High temperature polymer degradation: Rapid IR flow-through method for volatile quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giron, Nicholas H.; Celina, Mathew C.
Accelerated aging of polymers at elevated temperatures often involves the generation of volatiles. These can be formed as the products of oxidative degradation reactions or intrinsic pyrolytic decomposition as part of polymer scission reactions. A simple analytical method for the quantification of water, CO2, and CO as fundamental signatures of degradation kinetics is required. Here, we describe an analytical framework and develop a rapid mid-IR-based gas analysis methodology to quantify volatiles that are contained in small ampoules after aging exposures. The approach requires identification of unique spectral signatures, systematic calibration with known concentrations of volatiles, and a rapid-acquisition FTIR spectrometer for time-resolved successive spectra. Furthermore, the volatiles are flushed out from the ampoule with dry N2 carrier gas and are then quantified through spectral and time integration. This method is sufficiently sensitive to determine absolute yields of ~50 μg of water or CO2, which corresponds to probing mass losses of less than 0.01% for a 1 g sample, i.e., the early stages of the degradation process. Such quantitative gas analysis is not easily achieved with other approaches. Our approach opens up the possibility of quantitative monitoring of volatile evolution as an avenue to explore polymer degradation kinetics and its dependence on time and temperature.
Bernstein, Lynne E.; Lu, Zhong-Lin; Jiang, Jintao
2008-01-01
A fundamental question about human perception is how the speech perceiving brain combines auditory and visual phonetic stimulus information. We assumed that perceivers learn the normal relationship between acoustic and optical signals. We hypothesized that when the normal relationship is perturbed by mismatching the acoustic and optical signals, cortical areas responsible for audiovisual stimulus integration respond as a function of the magnitude of the mismatch. To test this hypothesis, in a previous study, we developed quantitative measures of acoustic-optical speech stimulus incongruity that correlate with perceptual measures. In the current study, we presented low incongruity (LI, matched), medium incongruity (MI, moderately mismatched), and high incongruity (HI, highly mismatched) audiovisual nonsense syllable stimuli during fMRI scanning. Perceptual responses differed as a function of the incongruity level, and BOLD measures were found to vary regionally and quantitatively with perceptual and quantitative incongruity levels. Each increase in level of incongruity resulted in an increase in overall levels of cortical activity and in additional activations. However, the only cortical region that demonstrated differential sensitivity to the three stimulus incongruity levels (HI > MI > LI) was a subarea of the left supramarginal gyrus (SMG). The left SMG might support a fine-grained analysis of the relationship between audiovisual phonetic input in comparison with stored knowledge, as hypothesized here. The methods here show that quantitative manipulation of stimulus incongruity is a new and powerful tool for disclosing the system that processes audiovisual speech stimuli. PMID:18495091
Martins, Cassio Henrique Taques; Assunção, Catarina De Marchi
2018-01-01
Knowing the frequency composition of brain electrical activity is fundamental to both research and clinical applications of electroencephalography. Quantitative analysis of brain electrical activity uses computational resources to evaluate the electroencephalogram and allows quantification of the data. The contribution of the quantitative perspective is unique, since conventional electroencephalography, based on visual examination of the tracing, is not as objective. A systematic review was performed on the MEDLINE database in October 2017. The authors independently analyzed the studies by title and abstract and selected articles that met the inclusion criteria: comparative studies in English, no older than 30 years, that compared the use of the conventional electroencephalogram (EEG) with the quantitative electroencephalogram (QEEG). One hundred twelve articles were automatically selected by the MEDLINE search engine, but only six met these criteria. The review found that, at a 95% confidence interval, QEEG had no statistically higher sensitivity than EEG in four of the six studies reviewed. However, these results must be viewed with appropriate caution, particularly as groups between studies were not matched on important variables such as gender, age, type of illness, recovery stage, and treatment. The findings of this systematic review suggest the importance of QEEG as an auxiliary tool to traditional EEG, justifying further refinement, standardization, and eventually a head-to-head prospective study comparing the two methods.
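The quantification that QEEG adds over visual EEG reading typically starts from spectral band powers. A minimal sketch on a synthetic 10 s trace containing a 10 Hz alpha rhythm (illustrative only; clinical QEEG pipelines add artifact rejection and windowed spectral estimation):

```python
import numpy as np

fs = 256.0                       # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)     # 10 s of synthetic "EEG"
rng = np.random.default_rng(0)
# 10 Hz alpha rhythm plus white noise, not clinical data
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Minimal periodogram via the real FFT
freqs = np.fft.rfftfreq(x.size, d=1 / fs)
psd = np.abs(np.fft.rfft(x)) ** 2 / x.size

def band_power(lo, hi):
    """Total power in the half-open frequency band [lo, hi) Hz."""
    sel = (freqs >= lo) & (freqs < hi)
    return float(psd[sel].sum())

bands = {"delta": band_power(0.5, 4), "theta": band_power(4, 8),
         "alpha": band_power(8, 13), "beta": band_power(13, 30)}
```

For this synthetic trace the alpha band dominates, which is the kind of objective, numeric summary a visual reading of the tracing cannot provide.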
Development of a theoretical framework for analyzing cerebrospinal fluid dynamics
Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy
2009-01-01
Background To date, hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure-volume models and electric-circuit analogs introduced pressure into volume conservation, but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements had been limited to understanding the relative amplitude and timing of flow, volume, and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Methods Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach can directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results Several examples of meaningful intracranial control volumes, and the particular measurement sets needed for the analysis, are discussed. Conclusion Control volume analysis provides a framework to guide the type and location of measurements and a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
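The control volume approach advocated above rests on the Reynolds transport statement of mass conservation, as found in any fluid mechanics text (not an equation quoted from the paper): the rate of change of mass inside the control volume plus the net mass flux through its surface is zero,

```latex
\frac{d}{dt}\int_{CV} \rho \, dV \;+\; \int_{CS} \rho \,(\mathbf{u}\cdot\mathbf{n})\, dA \;=\; 0 .
```

The surface-flux term is where clinical flow measurements enter the analysis directly, which is why the framework can prescribe both the type and the location of the measurements needed.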
A comment on measuring the Hurst exponent of financial time series
NASA Astrophysics Data System (ADS)
Couillard, Michel; Davison, Matt
2005-03-01
A fundamental hypothesis of quantitative finance is that stock price variations are independent and can be modeled using Brownian motion. In recent years, it was proposed to use rescaled range analysis and its characteristic value, the Hurst exponent, to test for independence in financial time series. Theoretically, independent time series should be characterized by a Hurst exponent of 1/2. However, finite Brownian motion data sets will always give a value of the Hurst exponent larger than 1/2 and without an appropriate statistical test such a value can mistakenly be interpreted as evidence of long term memory. We obtain a more precise statistical significance test for the Hurst exponent and apply it to real financial data sets. Our empirical analysis shows no long-term memory in some financial returns, suggesting that Brownian motion cannot be rejected as a model for price dynamics.
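Rescaled range analysis as described above can be sketched in a few lines; the finite-sample bias the abstract warns about (H estimates above 1/2 even for pure white noise) shows up already in this minimal implementation:

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis.

    A minimal sketch: for each window size n, average R/S over
    non-overlapping windows, then fit log(R/S) ~ H * log(n).
    """
    x = np.asarray(x, dtype=float)
    sizes, rs_vals = [], []
    n = min_chunk
    while n <= len(x) // 2:
        rs_list = []
        for start in range(0, len(x) - n + 1, n):
            chunk = x[start:start + n]
            z = np.cumsum(chunk - chunk.mean())   # cumulative deviations
            r = z.max() - z.min()                 # range
            s = chunk.std()                       # standard deviation
            if s > 0:
                rs_list.append(r / s)
        if rs_list:
            sizes.append(n)
            rs_vals.append(np.mean(rs_list))
        n *= 2
    h, _ = np.polyfit(np.log(sizes), np.log(rs_vals), 1)
    return h

# i.i.d. Gaussian increments: the true H is 1/2, but the finite-sample
# estimate is biased upward, exactly the pitfall discussed above
rng = np.random.default_rng(42)
h = hurst_rs(rng.standard_normal(4096))
```

This bias is why the paper argues that a raw estimate above 1/2 is not, by itself, evidence of long-term memory without a proper significance test.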
Mehrabi, Maryam; Eskandarieh, Sharareh; Khodadost, Mahmoud; Sadeghi, Maneli; Nikfarjam, Ali; Hajebi, Ahmad
2016-01-01
This study is a sociological analysis of the three dimensions of social structure (institutional, relational, and embodied) that have an impact on individuals' deviant behaviors in society. The authors used a mixed-methods design to analyze qualitative and quantitative data from 402 high-risk abandoned substance users in Tehran, the capital of Iran, in 2008. The leading reasons for substance use were categorized into four fundamental themes: stress, deviant social networks, low social capital, and weak social support sources. In addition, a regression-based epidemiological model provides a brief assessment of the association between demographic and etiological variables and the drug users' deviant behaviors. In sum, substance use is discussed as a deviant behavior pattern stemming from a comorbidity of weak social structures.
Structural and conformational determinants of macrocycle cell permeability.
Over, Björn; Matsson, Pär; Tyrchan, Christian; Artursson, Per; Doak, Bradley C; Foley, Michael A; Hilgendorf, Constanze; Johnston, Stephen E; Lee, Maurice D; Lewis, Richard J; McCarren, Patrick; Muncipinto, Giovanni; Norinder, Ulf; Perry, Matthew W D; Duvall, Jeremy R; Kihlberg, Jan
2016-12-01
Macrocycles are of increasing interest as chemical probes and drugs for intractable targets like protein-protein interactions, but the determinants of their cell permeability and oral absorption are poorly understood. To enable rational design of cell-permeable macrocycles, we generated an extensive data set under consistent experimental conditions for more than 200 non-peptidic, de novo-designed macrocycles from the Broad Institute's diversity-oriented screening collection. This revealed how specific functional groups, substituents and molecular properties impact cell permeability. Analysis of energy-minimized structures for stereo- and regioisomeric sets provided fundamental insight into how dynamic, intramolecular interactions in the 3D conformations of macrocycles may be linked to physicochemical properties and permeability. Combined use of quantitative structure-permeability modeling and the procedure for conformational analysis now, for the first time, provides chemists with a rational approach to design cell-permeable non-peptidic macrocycles with potential for oral absorption.
URBAN WATER SYSTEM PATHOGEN ASSESSMENTS: SIGNIFICANCE OF DISTRIBUTION BIOFILMS
Quantitative microbial risk assessment (QMRA), while not new to science is now providing a fundamental role in framing water guidelines internationally as well as identifying research gaps to be filled. Professor Ashbolt has been instrumental in working QMRA concepts into WHO gui...
Agent-Based Computational Modeling to Examine How Individual Cell Morphology Affects Dosimetry
Cell-based models utilizing high-content screening (HCS) data have applications for predictive toxicology. Evaluating concentration-dependent effects on cell fate and state response is a fundamental utilization of HCS data. Although HCS assays may capture quantitative readouts at ...
NASA Astrophysics Data System (ADS)
Li, Jiajia; Li, Rongxi; Zhao, Bangsheng; Guo, Hui; Zhang, Shuan; Cheng, Jinghua; Wu, Xiaoli
2018-04-01
The use of Micro-Laser Raman spectroscopy for quantitatively determining gas carbon isotope composition is presented. In this study, 12CO2 and 13CO2 were mixed with N2 at various molar fraction ratios to obtain Raman quantification factors (F12CO2 and F13CO2), which provide a theoretical basis for calculating the δ13C value. The corresponding values were 0.523 (0 < C12CO2/CN2 < 2) and 1.11998 (0 < C13CO2/CN2 < 1.5), respectively. It is shown that the representative Raman peak area can be used for the determination of δ13C values, with relative errors ranging from 0.076% to 1.154% in 13CO2/12CO2 binary mixtures, when F12CO2/F13CO2 is 0.466972625. In addition, measurements of δ13C values by Micro-Laser Raman analysis were carried out on natural CO2 gas from the Shengli Oil-field at room temperature under different pressures. The δ13C values obtained by Micro-Laser Raman spectroscopy and Isotope Ratio Mass Spectrometry (IRMS) are in good agreement with each other, with relative errors ranging from 1.232% to 6.964%. This research provides a fundamental analytical tool for quantitatively determining gas carbon isotope composition (δ13C values) using Micro-Laser Raman spectroscopy. The experimental results demonstrate that this method has the potential for obtaining δ13C values in natural CO2 gas reservoirs.
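Using the quantification factors reported above, the δ13C computation reduces to a few lines of arithmetic; the peak areas and the VPDB standard ratio below are illustrative assumptions, not measurements from the study:

```python
# delta13C from Raman peak areas, assuming the molar fraction of each
# isotopologue is proportional to its peak area divided by its
# quantification factor F. Areas below are hypothetical.
R_VPDB = 0.0111802          # 13C/12C of the VPDB standard (commonly used value)

F12, F13 = 0.523, 1.11998   # quantification factors from the abstract
A12, A13 = 1000.0, 12.0     # hypothetical integrated Raman peak areas

c12 = A12 / F12             # relative 12CO2 concentration
c13 = A13 / F13             # relative 13CO2 concentration

delta13C = (c13 / c12 / R_VPDB - 1.0) * 1000.0   # per mil vs VPDB
```

Real measurements would substitute calibrated peak areas from the 12CO2 and 13CO2 Fermi dyads, after which the same two-line conversion applies.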
Quantifying the relationship between sequence and three-dimensional structure conservation in RNA
2010-01-01
Background: In recent years, the number of available RNA structures has rapidly grown, reflecting the increased interest in RNA biology. Similarly to the studies carried out two decades ago for proteins, which laid the fundamental grounds for developing comparative protein structure prediction methods, we are now able to quantify the relationship between sequence and structure conservation in RNA. Results: Here we introduce an all-against-all sequence- and three-dimensional (3D) structure-based comparison of a representative set of RNA structures, which has allowed us to quantitatively confirm that: (i) there is a measurable relationship between sequence and structure conservation that weakens for alignments with below 60% sequence identity, (ii) evolution tends to conserve RNA structure more than sequence, and (iii) there is a twilight zone for RNA homology detection. Discussion: The computational analysis presented here quantitatively describes the relationship between sequence and structure for RNA molecules and defines a twilight-zone region for detecting RNA homology. Our work could represent the theoretical basis and limitations for future developments in comparative RNA 3D structure prediction. PMID:20550657
NASA Astrophysics Data System (ADS)
Fabre, Anne-Claire; Salesa, Manuel J.; Cornette, Raphael; Antón, Mauricio; Morales, Jorge; Peigné, Stéphane
2015-06-01
Inferences of function and ecology in extinct taxa have long been a subject of interest because they are fundamental to understanding the evolutionary history of species. In this study, we use a quantitative approach to investigate the locomotor behaviour of Simocyon batalleri, a key taxon related to the ailurid family. To do so, we apply 3D surface geometric morphometric approaches to the three long bones of the forelimb of an extant reference sample. Next, we test the locomotor strategy of S. batalleri using a leave-one-out cross-validated linear discriminant analysis. Our results show that S. batalleri falls within the morphospace of the living species of musteloids. However, each bone of the forelimb appears to carry a different functional signal, suggesting that inferring the lifestyle or locomotor behaviour of fossils can be difficult and dependent on the bone investigated. This highlights the importance of studying, where possible, as many skeletal elements as possible to be able to make robust inferences about the lifestyle of extinct species. Finally, our results suggest that S. batalleri may have been more arboreal than previously suggested.
Krishnan, Ramesh K.; Nolte, Hendrik; Sun, Tianliang; Kaur, Harmandeep; Sreenivasan, Krishnamoorthy; Looso, Mario; Offermanns, Stefan; Krüger, Marcus; Swiercz, Jakub M.
2015-01-01
The inhibitor of the nuclear factor-κB (IκB) kinase (IKK) complex is a key regulator of the canonical NF-κB signalling cascade and is crucial for fundamental cellular functions, including stress and immune responses. The majority of IKK complex functions are attributed to NF-κB activation; however, there is increasing evidence for NF-κB pathway-independent signalling. Here we combine quantitative mass spectrometry with random forest bioinformatics to dissect the TNF-α-IKKβ-induced phosphoproteome in MCF-7 breast cancer cells. In total, we identify over 20,000 phosphorylation sites, of which ∼1% are regulated upon TNF-α stimulation. We identify various potential novel IKKβ substrates, including kinases and regulators of cellular trafficking. Moreover, we show that one of the candidates, AEG-1/MTDH/LYRIC, is directly phosphorylated by IKKβ on serine 298. We provide evidence that IKKβ-mediated AEG-1 phosphorylation is essential for IκBα degradation as well as for NF-κB-dependent gene expression and cell proliferation, which correlate with cancer patient survival in vivo. PMID:25849741
Statistical methods for quantitative mass spectrometry proteomic experiments with labeling.
Oberg, Ann L; Mahoney, Douglas W
2012-01-01
Mass spectrometry utilizing labeling allows multiple specimens to be subjected to mass spectrometry simultaneously; as a result, between-experiment variability is reduced. Here we describe the use of fundamental concepts of statistical experimental design in the labeling framework in order to minimize variability and avoid biases. We demonstrate how to export data in the format that is most efficient for statistical analysis; how to assess the need for normalization, perform normalization, and check whether it worked; and how to build a model explaining the observed values and test for differential protein abundance, along with descriptive statistics and measures of reliability of the findings. Concepts are illustrated through three case studies utilizing the iTRAQ 4-plex labeling protocol.
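The assess/normalize/check loop described above can be sketched as a simple global-median normalization of log-scale intensities. This is a generic illustration, not the chapter's protocol; the data layout (one row per protein, one column per iTRAQ channel) is an assumption.

```python
def median(values):
    """Median of a list of numbers (no external dependencies)."""
    s = sorted(values)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2.0

def normalize_channels(log_intensities):
    """Center each labeling channel (column) on the grand median.

    log_intensities: list of rows, one per protein, each a list of
    log2 reporter-ion intensities (one per iTRAQ channel).
    Returns normalized rows; afterwards every channel shares the
    same median, which is the 'check whether it worked' step.
    """
    n_channels = len(log_intensities[0])
    cols = [[row[c] for row in log_intensities] for c in range(n_channels)]
    channel_medians = [median(col) for col in cols]
    grand = median(channel_medians)
    offsets = [m - grand for m in channel_medians]
    return [[x - offsets[c] for c, x in enumerate(row)]
            for row in log_intensities]
```

After normalization, a constant per-channel offset (a common labeling artifact) is removed while within-channel differences between proteins are preserved.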
Comparison of particular logistic models' adoption in the Czech Republic
NASA Astrophysics Data System (ADS)
Vrbová, Petra; Cempírek, Václav
2016-12-01
Managing inventory is considered one of the most challenging tasks facing supply chain managers and specialists. Decisions related to inventory locations, along with the level of inventory kept throughout the supply chain, have a fundamental impact on the response time, service level, delivery lead time and total cost of the supply chain. The main objective of this paper is to identify and analyse the share of particular logistic models adopted in the Czech Republic (consignment stock, buffer stock, safety stock) and to compare their usage and adoption across different industries. The paper also aims to identify possible reasons for preferring particular logistic models over the others. The analysis is based on a quantitative survey held in the Czech Republic.
Autonomous space processor for orbital debris
NASA Technical Reports Server (NTRS)
Ramohalli, Kumar; Campbell, David; Marine, Micky; Saad, Mohamad; Bertles, Daniel; Nichols, Dave
1990-01-01
Advanced designs are being developed toward the ultimate goal of a Getaway Special to demonstrate economical removal of orbital debris utilizing local resources in orbit. The fundamental technical feasibility was demonstrated in 1988 through theoretical calculations, quantitative computer animation, a solar focal-point cutter, a robotic arm design and a subscale model. Last year, improvements were made to the solar cutter and the robotic arm. Also performed last year was a mission analysis which showed the feasibility of retrieving at least four large (greater than 1500 kg) pieces of debris. Advances made during this reporting period are the incorporation of digital control with the existing placement arm, the development of a new robotic manipulator arm, and the study of debris spin attenuation. These advances are discussed.
Fundamentals and techniques of nonimaging optics for solar energy concentration
NASA Astrophysics Data System (ADS)
Winston, R.; O'Gallagher, J. J.
1980-09-01
Recent progress in basic research into the theoretical understanding of nonimaging optical systems and their application to the design of practical solar concentrators is reviewed. Work was done to extend the previously developed geometrical vector flux formalism with the goal of applying it to the analysis of nonideal concentrators. Both phase-space and vector flux representations for traditional concentrators were generated. Understanding of the thermodynamically derived relationship between concentration and cavity effects led to the design of new lossless and low-loss concentrators for absorbers with gaps. Quantitative measurements of the response of real collector systems and of the distribution of diffuse insolation show that in most cases performance exceeds predictions in solar applications. These developments led to improved nonimaging solar concentrator designs and applications.
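The thermodynamic limit underlying nonimaging concentrator design is the sine law of concentration: for an acceptance half-angle θ, the maximum geometric concentration is 1/sin θ in 2D and 1/sin²θ in 3D (in a medium of refractive index 1). A minimal sketch (the function names are ours):

```python
import math

def c_max_2d(half_angle_deg):
    """Sine-law concentration limit for a 2D (trough) concentrator."""
    return 1.0 / math.sin(math.radians(half_angle_deg))

def c_max_3d(half_angle_deg):
    """Sine-law limit for a 3D (rotationally symmetric) concentrator."""
    return 1.0 / math.sin(math.radians(half_angle_deg)) ** 2
```

For the solar half-angle of roughly 0.27 degrees, c_max_3d gives about 45,000, the familiar upper bound for solar concentration in air; an acceptance half-angle of 90 degrees (no angular selectivity) gives a limit of 1, i.e. no concentration.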
NASA Astrophysics Data System (ADS)
Ohnaka, M.
2004-12-01
For the past four decades, great progress has been made in understanding earthquake source processes. In particular, recent progress in the physics of earthquakes has contributed substantially to unraveling the earthquake generation process in quantitative terms. Yet a fundamental problem remains unresolved in this field. The constitutive law that governs the behavior of earthquake ruptures is the basis of earthquake physics, and this governing law plays a fundamental role in accounting for the entire process of an earthquake rupture, from its nucleation through its dynamic propagation to its arrest, quantitatively in a unified and consistent manner. Without establishing a rational constitutive law, therefore, the physics of earthquakes cannot be a quantitative science in a true sense, and hence establishing such a law is urgent. However, what the constitutive law for earthquake ruptures ought to be, and how it should be formulated, has been controversial over the past two decades and remains so. Resolving this controversy is a necessary step towards a more complete, unified theory of earthquake physics, and the time is now ripe to do so. Because of its fundamental importance, we have to discuss thoroughly and rigorously what the constitutive law ought to be from the standpoint of the physics of rock friction and fracture, on the basis of solid evidence. There are prerequisites for the constitutive formulation. The brittle, seismogenic layer and individual faults therein are characterized by inhomogeneity, and fault inhomogeneity has profound implications for earthquake ruptures. In addition, rupture phenomena including earthquakes are inherently scale dependent; indeed, some of the physical quantities inherent in rupture exhibit scale dependence.
To treat scale-dependent physical quantities inherent in the rupture over a broad scale range quantitatively in a unified and consistent manner, it is critical to formulate the governing law properly so as to incorporate the scaling property. Thus, the properties of fault inhomogeneity and physical scaling are indispensable prerequisites to be incorporated into the constitutive formulation. Thorough discussion in this context necessarily leads to the consistent conclusion that the constitutive law must be formulated in such a manner that the shear traction is a primary function of the slip displacement, with the secondary effect of slip rate or stationary contact time. This constitutive formulation makes it possible to account for the entire process of an earthquake rupture over a broad scale range quantitatively in a unified and consistent manner.
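The conclusion that shear traction should be a primary function of slip can be illustrated with the simplest member of this family, a linear slip-weakening law. This is a generic textbook form, not Ohnaka's full formulation (which also incorporates slip-rate and contact-time effects), and the parameter names are ours.

```python
def shear_traction(slip, tau_peak, tau_residual, d_c):
    """Linear slip-weakening law: traction falls from the peak
    strength tau_peak to the residual level tau_residual as slip
    grows from 0 to the critical slip distance d_c, then stays at
    tau_residual. The critical distance d_c is one of the
    scale-dependent quantities the abstract refers to."""
    if slip >= d_c:
        return tau_residual
    return tau_peak - (tau_peak - tau_residual) * slip / d_c
```

Because traction is written as a function of slip, the same expression governs nucleation (small slip, near-peak strength), dynamic weakening, and arrest (slip beyond d_c), which is the unified treatment the abstract calls for.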
Accumulation of the Antibiotic Phenazine-1-Carboxylic Acid in the Rhizosphere of Dryland Cereals
USDA-ARS?s Scientific Manuscript database
Natural antibiotics are thought to function in the defense, fitness, competitiveness, biocontrol activity, communication and gene regulation of microorganisms. However, the scale and quantitative aspects of antibiotic production in natural settings are poorly understood. We addressed these fundament...
The Student-as-Bricoleur: Making Sense of Research Paradigms
ERIC Educational Resources Information Center
Schnelker, Diane L.
2006-01-01
Although there is consensus that qualitative approaches to social research are distinguished from quantitative approaches by their fundamental philosophical systems, there is resistance to incorporating philosophical distinctions into graduate level research courses. Resistance may be due to the recognition that students have limited experience…
Quantitative molecular orbital energies within a G0W0 approximation
NASA Astrophysics Data System (ADS)
Sharifzadeh, S.; Tamblyn, I.; Doak, P.; Darancet, P. T.; Neaton, J. B.
2012-09-01
Using many-body perturbation theory within a G0W0 approximation, with a plane-wave basis set and a starting point based on density functional theory within the generalized gradient approximation, we explore routes for computing the ionization potential (IP), electron affinity (EA), and fundamental gap of three gas-phase molecules (benzene, thiophene, and 1,4-diaminobenzene) and compare with experiments. We examine the dependence of the IP and fundamental gap on the number of unoccupied states used to represent the dielectric function and the self-energy, as well as on the dielectric-function plane-wave cutoff. We find that with an effective completion strategy for approximating the unoccupied subspace, and a well-converged dielectric-function kinetic energy cutoff, the computed IPs and EAs are in excellent quantitative agreement with available experiment (within 0.2 eV), indicating that a one-shot G0W0 approach can be very accurate for calculating addition/removal energies of small organic molecules.
Takamatsu, Daiko; Yoneyama, Akio; Asari, Yusuke; Hirano, Tatsumi
2018-02-07
A fundamental understanding of salt concentrations in lithium-ion battery electrolytes during battery operation is important for optimal operation and design of lithium-ion batteries. However, few techniques can quantitatively characterize salt concentration distributions in the electrolytes during battery operation. In this paper, we demonstrate that in operando X-ray phase imaging can quantitatively visualize the salt concentration distributions that arise in electrolytes during battery operation. From quantitative evaluation of the concentration distributions at steady state, we obtained the salt diffusivities in electrolytes with different initial salt concentrations. Because it imposes no restrictions on samples and offers high temporal and spatial resolution, X-ray phase imaging will be a versatile technique for evaluating electrolytes, both aqueous and nonaqueous, of many electrochemical systems.
MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis
JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali
2016-01-01
Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) were introduced during the last decade and have fundamentally changed our understanding of knee OA pathology since then. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to their continuous update and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537
Physiological attributes of 11 Northwest conifer species
Ronni L. Korol
2001-01-01
The quantitative description and simulation of the fundamental processes that characterize forest growth are increasing in importance in forestry research. Predicting future forest growth, however, is compounded by the various combinations of temperature, humidity, precipitation, and atmospheric carbon dioxide concentration that may occur. One method of integrating new...
The mass-specific energy cost of human walking is set by stature
USDA-ARS?s Scientific Manuscript database
The metabolic and mechanical requirements of walking are considered to be of fundamental importance to the health, physiological function and even the evolution of modern humans. Although walking energy expenditure and gait mechanics are clearly linked, a direct quantitative relationship has not eme...
Probes, Surveys, and the Ontology of the Social
ERIC Educational Resources Information Center
Collins, Harry; Evans, Robert
2017-01-01
By distinguishing between a survey and--a newly introduced term--a "probe," we recast the relationship between qualitative and quantitative approaches to social science. The difference turns on the "uniformity" of the phenomenon being examined. Uniformity is a fundamental idea underlying all scientific research but is rarely…
Spreading Effect in Industrial Complex Network Based on Revised Structural Holes Theory
Xing, Lizhi; Ye, Qing; Guan, Jun
2016-01-01
This paper analyzes the spreading effect of industrial sectors with a complex network model from the perspective of econophysics. Input-output analysis, an important research tool, focuses more on static analysis; however, the fundamental aim of industry analysis is to figure out how interactions between different industries affect economic development, which is a dynamic process. Thus, an industrial complex network based on input-output tables from WIOD is proposed as a bridge connecting accurate static quantitative analysis with comparable dynamic analysis. Applying a revised structural holes theory, flow betweenness and random walk centrality were chosen to evaluate industrial sectors' long-term and short-term spreading effects, respectively. The results show that industries with higher flow betweenness or random walk centrality bring about a more intensive spreading effect along the industrial chains they stand in, because the value-stream transmission of an industrial sector depends on how many products or services it can obtain from the others, and such sectors act as brokers with greater information superiority and more intermediate interests. PMID:27218468
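The static input-output side of this kind of analysis rests on the Leontief inverse, which propagates a change in final demand through the whole inter-industry network. A minimal sketch using a toy two-sector technical-coefficient matrix (the numbers are invented, not WIOD data):

```python
def leontief_inverse(a):
    """Compute L = (I - A)^-1 for a technical-coefficient matrix A.

    Column sums of L are the output multipliers: total (direct plus
    indirect) production required across all sectors per unit of
    final demand for each sector's goods.
    """
    n = len(a)
    # Build M = [I - A | I], then reduce by Gauss-Jordan elimination.
    m = [[(1.0 if i == j else 0.0) - a[i][j] for j in range(n)] +
         [1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        p = m[col][col]
        m[col] = [x / p for x in m[col]]
        for r in range(n):
            if r != col and m[r][col] != 0.0:
                factor = m[r][col]
                m[r] = [x - factor * y for x, y in zip(m[r], m[col])]
    return [row[n:] for row in m]

# Toy two-sector economy (illustrative coefficients only).
A = [[0.2, 0.3],
     [0.4, 0.1]]
L = leontief_inverse(A)
multipliers = [sum(L[i][j] for i in range(2)) for j in range(2)]
```

Sector 0's multiplier exceeds sector 1's here, meaning a unit of its final demand pulls more total production through the network; centrality measures such as flow betweenness refine this static picture into the dynamic spreading effect the paper studies.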
How multi-segmental patterns in spastic diplegia deviate from typical development.
Zago, Matteo; Sforza, Chiarella; Bona, Alessia; Cimolin, Veronica; Costici, Pier Francesco; Condoluci, Claudia; Galli, Manuela
2017-10-01
The relationship between gait features and coordination in children with cerebral palsy has not yet been sufficiently analyzed. Principal Component Analysis can help in understanding motion patterns by decomposing movement into its fundamental components (Principal Movements). This study aims at quantitatively characterizing the functional connections between multi-joint gait patterns in cerebral palsy. Sixty-five children with spastic diplegia aged 10.6 (SD 3.7) years participated in standardized gait analysis trials; 31 typically developing adolescents aged 13.6 (4.4) years were also tested. To determine whether posture affects gait patterns, patients were split into a Crouch and a knee Hyperextension group according to the knee flexion angle in standing. 3D coordinates of hips, knees, ankles, metatarsal joints, pelvis and shoulders were submitted to Principal Component Analysis. Four Principal Movements accounted for 99% of the global variance; components 1-3 explained the major sagittal patterns, components 4-5 referred to movements in the frontal plane, and component 6 to additional movement refinements. Dimensionality was higher in patients than in controls (p<0.01); the Crouch group differed significantly from controls in the application of components 1 and 4-6 (p<0.05), while the knee Hyperextension group differed in components 1-2 and 5 (p<0.05). Compensatory strategies of children with cerebral palsy (interactions between main and secondary movement patterns) were objectively determined. Principal Movements can reduce the effort in interpreting gait reports, providing an immediate and quantitative picture of the connections between movement components. Copyright © 2017 Elsevier Ltd. All rights reserved.
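The decomposition into Principal Movements is, at its core, a PCA of the time series of concatenated 3D marker coordinates. A minimal sketch of that step on synthetic data (the real study used hip, knee, ankle, metatarsal, pelvis and shoulder markers; our variable names are illustrative):

```python
import numpy as np

def principal_movements(frames):
    """PCA of posture vectors: each row of `frames` is one time
    sample of concatenated 3D marker coordinates. Returns the
    principal components (rows of vt) and the fraction of variance
    each component explains."""
    x = frames - frames.mean(axis=0)          # center each coordinate
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    var = s ** 2
    return vt, var / var.sum()

# Synthetic example: 200 "frames" of 6 coordinates in which most
# variance lies along a single coupled pattern, as in cyclic gait.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
pattern = np.sin(t)[:, None] * np.array([1.0, 0.8, -0.5, 0.3, 0.2, -0.1])
frames = pattern + 0.01 * rng.standard_normal((200, 6))
components, ratios = principal_movements(frames)
```

Here ratios[0] is close to 1 because one coupled pattern dominates, mirroring how only a few Principal Movements explained 99% of the variance in the study.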
Quantification of multiple gene expression in individual cells.
Peixoto, António; Monteiro, Marta; Rocha, Benedita; Veiga-Fernandes, Henrique
2004-10-01
Quantitative gene expression analysis aims to define the gene expression patterns determining cell behavior. So far, such assessments could only be performed at the population level; they therefore determine the average gene expression within a population, overlooking possible cell-to-cell heterogeneity that could lead to different cell behaviors or cell fates. Understanding individual cell behavior requires multiple gene expression analyses of single cells and may be fundamental for the understanding of all types of biological events and differentiation processes. Here we describe a new reverse transcription-polymerase chain reaction (RT-PCR) approach allowing the simultaneous quantification of the expression of 20 genes in the same single cell. This method has broad application, in different species and with any type of gene combination. RT efficiency is evaluated, and uniform and maximized amplification conditions for all genes are provided. Abundance relationships are maintained, allowing the precise quantification of the absolute number of mRNA molecules per cell, ranging from 2 to 1.28 × 10^9 for each individual gene. We evaluated the impact of this approach on functional genetic read-outs by studying an apparently homogeneous population (monoclonal T cells recovered 4 d after antigen stimulation), using either this method or conventional real-time RT-PCR. Single-cell studies revealed considerable cell-to-cell variation: not all T cells expressed all individual genes, gene coexpression patterns were very heterogeneous, and mRNA copy numbers varied between different transcripts and in different cells. As a consequence, this single-cell assay introduces new and fundamental information regarding functional genomic read-outs. By comparison, we also show that conventional quantitative assays determining population averages supply insufficient information and may even be highly misleading.
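Absolute copy numbers in quantitative RT-PCR are conventionally recovered from a standard curve built from serial dilutions of a template of known copy number. This sketch shows that generic arithmetic, not the authors' specific protocol; the function name and parameters are ours.

```python
def copies_from_ct(ct, slope, intercept):
    """Absolute quantification from a qPCR standard curve.

    The standard curve is the linear fit Ct = slope * log10(copies)
    + intercept over serial dilutions of a known template; inverting
    it converts a measured Ct value into an estimated starting copy
    number. A 100%-efficient reaction has slope close to -3.32
    (one 10-fold dilution adds about 3.32 cycles).
    """
    return 10.0 ** ((ct - intercept) / slope)
```

With slope = -3.322 and intercept = 40, a Ct of 40 corresponds to a single starting copy, and each 3.322-cycle decrease in Ct multiplies the estimate by ten.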
Optimizing Ti:Sapphire laser for quantitative biomedical imaging
NASA Astrophysics Data System (ADS)
James, Jeemol; Thomsen, Hanna; Hanstorp, Dag; Alemán Hérnandez, Felipe Ademir; Rothe, Sebastian; Enger, Jonas; Ericson, Marica B.
2018-02-01
Ti:Sapphire lasers are powerful tools in scientific research and industry for a wide range of applications, such as spectroscopic studies and microscopic imaging, where tunable near-infrared light is required. To push the limits of the applicability of Ti:Sapphire lasers, a fundamental understanding of their construction and operation is required. This paper presents two projects: (i) the building and characterization of a custom-built tunable narrow-linewidth Ti:Sapphire laser for fundamental spectroscopy studies; and (ii) the implementation of a fs-pulsed commercial Ti:Sapphire laser in an experimental multiphoton microscopy platform. For the narrow-linewidth laser, a gold-plated diffraction grating in a Littrow geometry was implemented for high-resolution wavelength selection. We demonstrate that the laser is tunable between 700 and 950 nm, operating in pulsed mode with a repetition rate of 1 kHz and a maximum average output power of around 350 mW. The output linewidth was reduced from 6 GHz to 1.5 GHz by inserting an additional 6 mm thick etalon; the bandwidth was measured by means of a scanning Fabry-Perot interferometer. Future work will focus on using a fs-pulsed commercial Ti:Sapphire laser (Tsunami, Spectra-Physics), operating at 80 MHz with a maximum average output power of around 1 W, in an experimental multiphoton microscopy setup dedicated to biomedical applications. Special focus will be on controlling pulse duration and dispersion in the optical components and biological tissue using pulse compression. Furthermore, time-correlated analysis of the biological samples will be performed with a time-correlated single photon counting module (SPCM, Becker & Hickl), which will add a novel dimension to quantitative biomedical imaging.
Microbial Cellulose Utilization: Fundamentals and Biotechnology
Lynd, Lee R.; Weimer, Paul J.; van Zyl, Willem H.; Pretorius, Isak S.
2002-01-01
Fundamental features of microbial cellulose utilization are examined at successively higher levels of aggregation encompassing the structure and composition of cellulosic biomass, taxonomic diversity, cellulase enzyme systems, molecular biology of cellulase enzymes, physiology of cellulolytic microorganisms, ecological aspects of cellulose-degrading communities, and rate-limiting factors in nature. The methodological basis for studying microbial cellulose utilization is considered relative to quantification of cells and enzymes in the presence of solid substrates as well as apparatus and analysis for cellulose-grown continuous cultures. Quantitative description of cellulose hydrolysis is addressed with respect to adsorption of cellulase enzymes, rates of enzymatic hydrolysis, bioenergetics of microbial cellulose utilization, kinetics of microbial cellulose utilization, and contrasting features compared to soluble substrate kinetics. A biological perspective on processing cellulosic biomass is presented, including features of pretreated substrates and alternative process configurations. Organism development is considered for “consolidated bioprocessing” (CBP), in which the production of cellulolytic enzymes, hydrolysis of biomass, and fermentation of resulting sugars to desired products occur in one step. Two organism development strategies for CBP are examined: (i) improve product yield and tolerance in microorganisms able to utilize cellulose, or (ii) express a heterologous system for cellulose hydrolysis and utilization in microorganisms that exhibit high product yield and tolerance. A concluding discussion identifies unresolved issues pertaining to microbial cellulose utilization, suggests approaches by which such issues might be resolved, and contrasts a microbially oriented cellulose hydrolysis paradigm to the more conventional enzymatically oriented paradigm in both fundamental and applied contexts. PMID:12209002
Wood, William E.; Osseward, Peter J.; Roseberry, Thomas K.; Perkel, David J.
2013-01-01
Complex motor skills are more difficult to perform at certain points in the day (for example, shortly after waking), but the daily trajectory of motor-skill error is more difficult to predict. By undertaking a quantitative analysis of the fundamental frequency (FF) and amplitude of hundreds of zebra finch syllables per animal per day, we find that zebra finch song follows a previously undescribed daily oscillation. The FF and amplitude of harmonic syllables rises across the morning, reaching a peak near mid-day, and then falls again in the late afternoon until sleep. This oscillation, although somewhat variable, is consistent across days and across animals and does not require serotonin, as animals with serotonergic lesions maintained daily oscillations. We hypothesize that this oscillation is driven by underlying physiological factors which could be shared with other taxa. Song production in zebra finches is a model system for studying complex learned behavior because of the ease of gathering comprehensive behavioral data and the tractability of the underlying neural circuitry. The daily oscillation that we describe promises to reveal new insights into how time of day affects the ability to accomplish a variety of complex learned motor skills. PMID:24312654
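Fundamental frequency in analyses like this is commonly estimated from the autocorrelation of the sound waveform. The sketch below is our own illustrative estimator, not the study's analysis pipeline; the search band and names are assumptions.

```python
import math

def estimate_f0(samples, sample_rate, f_min=100.0, f_max=1000.0):
    """Estimate the fundamental frequency (Hz) as the lag of the
    autocorrelation peak within the range corresponding to
    [f_min, f_max]. Plain-Python O(n * lags) version for clarity;
    real pipelines would use an FFT-based autocorrelation."""
    n = len(samples)
    lag_lo = int(sample_rate / f_max)
    lag_hi = int(sample_rate / f_min)
    best_lag, best_val = lag_lo, float("-inf")
    for lag in range(lag_lo, min(lag_hi, n - 1) + 1):
        val = sum(samples[i] * samples[i + lag] for i in range(n - lag))
        if val > best_val:
            best_lag, best_val = lag, val
    return sample_rate / best_lag
```

Applied to each harmonic syllable across the day, such per-syllable FF estimates are the kind of measurement from which a daily oscillation can be traced.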
NASA Astrophysics Data System (ADS)
Hilsenbeck-Fajardo, Jacqueline L.
2009-08-01
The research described herein is a multi-dimensional attempt to measure students' abilities to recall, conceptualize, and transfer fundamental and dynamic protein structure concepts as revealed by their own diagrammatic (pictorial) representations and written self-explanations. A total of 120 participants enrolled in a 'Fundamentals of Biochemistry' course contributed to this mixed-methodological study. The population of interest consisted primarily of pre-nursing and sport and exercise science majors. The course is typically associated with a high (>30%) combined drop/failure rate, and thus provided the researcher with an ideal context in which to apply novel transfer assessment strategies. In the past, students within this population have reported very little chemistry background. In this study, student-generated diagrammatic representations and written explanations were coded thematically using a highly objective rubric designed specifically for the study. Responses provided by the students were characterized on the macroscopic, microscopic, molecular-level, and integrated scales. Recall knowledge gain (i.e., knowledge gain measured through multiple-choice questioning techniques) was quantitatively correlated with learning style preferences (i.e., high-object, low-object, and non-object). Quantitative measures revealed that participants tended toward an object-based (i.e., snapshot-based) visualization preference, a potentially limiting factor in their willingness to consider dynamic properties of fundamental biochemical contexts such as heat-induced protein denaturation. When knowledge transfer was carefully assessed within the predefined context, numerous misconceptions pertaining to the fundamental and dynamic nature of protein structure were revealed. Misconceptions tended to increase as the transfer model shifted away from the context presented in the original learning material.
Ultimately, a fundamentally new, novel, and unique measure of knowledge transfer was developed as a main result of this study. It is envisioned by the researcher that this new measure of learning is applicable specifically to physical and chemical science education-based research in the form of deep transfer on the atomic-level scale.
The protein expression landscape of mitosis and meiosis in diploid budding yeast.
Becker, Emmanuelle; Com, Emmanuelle; Lavigne, Régis; Guilleux, Marie-Hélène; Evrard, Bertrand; Pineau, Charles; Primig, Michael
2017-03-06
Saccharomyces cerevisiae is an established model organism for the molecular analysis of fundamental biological processes. The genomes of numerous strains have been sequenced, and the transcriptome and proteome ofmajor phases during the haploid and diploid yeast life cycle have been determined. However, much less is known about dynamic changes of the proteome when cells switch from mitotic growth to meiotic development. We report a quantitative protein profiling analysis of yeast cell division and differentiation based on mass spectrometry. Information about protein levels was integrated with strand-specific tiling array expression data. We identified a total of 2366 proteins in at least one condition, including 175 proteins showing a statistically significant>5-fold change across the sample set, and 136 proteins detectable in sporulating but not respiring cells. We correlate protein expression patterns with biological processes and molecular function by Gene Ontology term enrichment, chemoprofiling, transcription interference and the formation of double stranded RNAs by overlapping sense/antisense transcripts. Our work provides initial quantitative insight into protein expression in diploid respiring and differentiating yeast cells. Critically, it associates developmentally regulated induction of antisense long noncoding RNAs and double stranded RNAs with fluctuating protein concentrations during growth and development. This integrated genomics analysis helps better understand how the transcriptome and the proteome correlate in diploid yeast cells undergoing mitotic growth in the presence of acetate (respiration) versus meiotic differentiation (Meiosis I and II). The study (i) provides quantitative expression data for 2366 proteins and their cognate mRNAs in at least one sample, (ii) shows strongly fluctuating protein levels during growth and differentiation for 175 cases, and (iii) identifies 136 proteins absent in mitotic but present in meiotic yeast cells. 
We have integrated mass spectrometry protein profiling data with tiling array RNA profiling data and with information on double-stranded RNAs (dsRNAs) formed by overlapping sense/antisense transcripts from an RNA-sequencing experiment. This work therefore provides quantitative insight into protein expression during cell division and development, and associates changing protein levels with developmental-stage-specific induction of antisense transcripts and the formation of dsRNAs. Copyright © 2017 Elsevier B.V. All rights reserved.
Vehof, H; Sanders, J; van Dooren, A; Heerdink, E; Das, E
2018-05-04
Researchers have argued that journalistic reporting of medical developments is often characterised by exaggeration or a lack of context, but additional quantitative evidence is needed to support this claim. This study introduces a quantitative approach to assessing coverage of medical innovations by examining the references articles provide to observed clinical effects. Although observed clinical effects reflect increased chances of future medical application, it is unknown to what extent newspaper articles refer to them when spreading health information. We aimed to assess, over a 6-year period, newspaper publication characteristics of diabetes innovations arising from all scientific areas of interest, regarding the total count and the proportion of articles that provide references to demonstrated clinical efficacy. Quantitative content analysis of newspaper articles covering innovative treatments for diabetes. We performed a systematic review of newspaper articles printed between 2011 and 2016 in the six largest Dutch newspapers. By assessing in-article references, it was possible to quickly distinguish between (1) articles that referred to actual clinical efficacy demonstrated in a scientific setting and (2) articles that presented either predictions, fundamental research, preclinical research or personal experiences and recommendations. Proportion differences between scientific areas of interest were analysed using the chi-squared test. A total of 613 articles were categorised. Total newspaper publication frequency increased by 9.9 articles per year (P = .031). In total, 17% of the articles contained a reference to any proven clinical efficacy. Articles about human nutrition science (7%; P = .001) and (neuro)psychology (4.3%; P = .014) less frequently provided a reference to actual clinical efficacy.
Our findings show that less than one in five newspaper articles about diabetes research contains a reference to relevant clinical effects, while the publication count is increasing. These statistics may contribute to feelings of false hope and confusion in patients. Copyright © 2018 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Luchansky, Matthew Sam
In order to guide critical care therapies that are personalized to a patient's unique disease state, a diagnostic or theranostic medical device must quickly provide a detailed biomolecular understanding of disease onset and progression. This detailed molecular understanding of cellular processes and pathways requires the ability to measure multiple analytes in parallel. Though many traditional sensing technologies for biomarker analysis and fundamental biological studies (e.g., enzyme-linked immunosorbent assays and real-time polymerase chain reaction) rely on single-parameter measurements, it has become increasingly clear that the inherent complexity of many human illnesses and pathways necessitates quantitative and multiparameter analysis of biological samples. Currently used analytical methods are deficient in that they often provide either highly quantitative data for a single biomarker or qualitative data for many targets; methods that simultaneously provide highly quantitative analysis of many targets have yet to be adequately developed. Fields such as medical diagnostics and cellular biology would benefit greatly from a technology that enables rapid, quantitative and reproducible assays for many targets within a single sample. In an effort to fill this unmet need, this doctoral dissertation describes the development of a clinically translational biosensing technology based on silicon photonics and developed in the chemistry research laboratory of Ryan C. Bailey. Silicon photonic microring resonators, a class of high-Q optical sensors, represent a promising platform for rapid, multiparameter in vitro measurements. The original device design utilizes 32-ring arrays for real-time biomolecular sensing without fluorescent labels, and these optical biosensors display great potential for more highly multiplexed (100s-1000s) measurements based on the impressive scalability of silicon device fabrication.
Though this technology can be used to detect a variety of molecules, this dissertation establishes the utility of microring resonator chips for multiparameter analysis of several challenging protein targets in cell cultures, human blood sera, and other clinical samples such as cerebrospinal fluid. Various sandwich immunoassay formats for diverse protein analytes are described herein, but the bulk of this dissertation focuses on applying the technology to cytokine analysis. Cytokines are small signaling proteins that are present in serum and cell secretomes at concentrations in the pg/mL or ng/mL range. Cytokines are very challenging to quantitate due to their low abundance and small size, but play important roles in a variety of immune response and inflammatory pathways; cytokine quantitation is thus important in fundamental biological studies and diagnostics, and complex and overlapping cytokine roles make multiplexed measurements especially vital. In a typical experiment, microfluidics are used to spatially control chip functionalization by directing capture antibodies against a variety of protein targets to groups of microring sensors. In each case, binding of analytes to the rings causes a change in the local refractive index that is transduced into a real-time, quantitative optical signal. This photonic sensing modality is based on the interaction of the propagating evanescent field with molecules near the ring surface. Since each microring sensor in the array is monitored independently, this technology allows multiple proteins to be quantified in parallel from a single sample. This dissertation describes the fabrication, characterization, development, and application of silicon photonic microring resonator technology to multiplexed protein measurements in a variety of biological systems. Chapter 1 introduces the field of high-Q optical sensors and places microring resonator technology within the broader context of related whispering gallery mode devices. 
The final stages of cleanroom device fabrication, in which 8" silicon wafers that contain hundreds of ring resonator arrays are transformed into individual functional chips, are described in Chapter 2. Chapter 3 characterizes the physical and optical properties of the microring resonator arrays, especially focusing on the evanescent field profile and mass sensitivity metrics. Chapter 4 demonstrates the ability to apply ring resonator technology to cytokine detection and T cell secretion analysis. Chapter 5 builds on the initial cytokine work to demonstrate the simultaneous detection of multiple cytokines with higher throughput to enable studies of T cell differentiation. In preparation for reaching the goal of cytokine analysis in clinical samples, Chapter 6 describes magnetic bead-based signal enhancement of sandwich immunoassays for serum analysis. Additional examples of the utility of nanoparticles and sub-micron beads for signal amplification are described in Chapter 7, also demonstrating the ability to monitor single bead binding events. Chapter 8 describes an alternative cytokine signal enhancement strategy based on enzymatic amplification for human cerebrospinal fluid (CSF) analysis. Chapter 9 adds work with other CSF protein targets that are relevant to the continuing development of a multiparameter Alzheimer's Disease diagnostic chip. Future directions for multiplexed protein analysis as it pertains to important immunological studies and in vitro diagnostic applications are defined in Chapter 10. (Abstract shortened by UMI.).
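The refractive-index transduction described in the abstract above follows, to first order, the relation Δλ = λ·Δn_eff/n_g. A minimal sketch of that relation appears below; the 1550 nm wavelength and group index of 1.8 are illustrative assumptions, not the specifications of the devices in this dissertation.

```python
def resonance_shift_pm(dn_eff, wavelength_nm=1550.0, n_group=1.8):
    """First-order microring resonance shift d(lambda) = lambda * dn_eff / n_g,
    returned in picometres. Wavelength and group index are illustrative."""
    return wavelength_nm * dn_eff / n_group * 1000.0  # nm -> pm

# A binding-induced effective-index change of 1e-4 shifts the resonance
# by roughly 86 pm under these assumed parameters
shift = resonance_shift_pm(1e-4)
```

Tracking such picometre-scale shifts in real time is what allows each independently monitored ring to report analyte binding quantitatively.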
NASA Astrophysics Data System (ADS)
Castellanos, Milagros; Carrillo, Pablo J. P.; Mateu, Mauricio G.
2015-03-01
Viruses are increasingly being studied from the perspective of fundamental physics at the nanoscale as biologically evolved nanodevices with many technological applications. In viral particles of the minute virus of mice (MVM), folded segments of the single-stranded DNA genome are bound to the capsid inner wall and act as molecular buttresses that increase locally the mechanical stiffness of the particle. We have explored whether a quantitative linkage exists in MVM particles between their DNA-mediated stiffening and impairment of a heat-induced, virus-inactivating structural change. A series of structurally modified virus particles with disrupted capsid-DNA interactions and/or distorted capsid cavities close to the DNA-binding sites were engineered and characterized, both in classic kinetics assays and by single-molecule mechanical analysis using atomic force microscopy. The rate constant of the virus inactivation reaction was found to decrease exponentially with the increase in elastic constant (stiffness) of the regions closer to DNA-binding sites. The application of transition state theory suggests that the height of the free energy barrier of the virus-inactivating structural transition increases linearly with local mechanical stiffness. From a virological perspective, the results indicate that infectious MVM particles may have acquired the biological advantage of increased survival under thermal stress by evolving architectural elements that rigidify the particle and impair non-productive structural changes. 
From a nanotechnological perspective, this study provides proof of principle that determination of mechanical stiffness and its manipulation by protein engineering may be applied for quantitatively probing and tuning the conformational dynamics of virus-based and other protein-based nanoassemblies. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr07046a
Geophysical data analysis and visualization using the Grid Analysis and Display System
NASA Technical Reports Server (NTRS)
Doty, Brian E.; Kinter, James L., III
1995-01-01
Several problems posed by the rapidly growing volume of geophysical data are described, and a selected set of existing solutions to these problems is outlined. A recently developed desktop software tool called the Grid Analysis and Display System (GrADS) is presented. The GrADS user interface is a natural extension of the standard procedures scientists apply to their geophysical data analysis problems. The basic GrADS operations have defaults that naturally map to data analysis actions, and there is a programmable interface for customizing data access and manipulation. The fundamental concept of the GrADS dimension environment, which defines both the space in which the geophysical data reside and the 'slice' of data being analyzed at a given time, is explained. The GrADS data storage and access model is described. An argument is made in favor of describable data formats rather than standard data formats. The manner in which GrADS users may perform operations on their data and display the results is also described. It is argued that two-dimensional graphics provides a powerful quantitative data analysis tool whose value is underestimated in the current development environment, which emphasizes three-dimensional structure modeling.
Biological Components of Colour Preference in Infancy
ERIC Educational Resources Information Center
Franklin, Anna; Bevis, Laura; Ling, Yazhu; Hurlbert, Anya
2010-01-01
Adult colour preference has been summarized quantitatively in terms of weights on the two fundamental neural processes that underlie early colour encoding: the S-(L+M) ("blue-yellow") and L-M ("red-green") cone-opponent contrast channels (Ling, Hurlbert & Robinson, 2006; Hurlbert & Ling, 2007). Here, we investigate whether colour preference in…
A quantitative structure-property relationship (QSPR) was developed and combined with the Polanyi-Dubinin-Manes model to predict adsorption isotherms of emerging contaminants on activated carbons with a wide range of physico-chemical properties. Affinity coefficients (βl
USDA-ARS?s Scientific Manuscript database
Random mating (i.e., panmixis) is a fundamental assumption in quantitative genetics. In outcrossing bee-pollinated perennial forage legume polycrosses, mating is assumed by default to follow theoretical random mating. This assumption informs breeders of expected inbreeding estimates based on polycro...
Using Manipulatives To Teach Quantitative Concepts in Ecology.
ERIC Educational Resources Information Center
Eyster, Linda S.; Tashiro, Jay Shiro
1997-01-01
Describes the use of manipulatives to teach the fundamental concept of limiting factors and presents a series of questions that can be used to test whether students are harboring some of the most common misconceptions about limiting factors. Includes applications to discussions of cultural eutrophication and vegetarianism. (JRH)
The next generation of training for Arabidopsis researchers: bioinformatics and quantitative biology
USDA-ARS?s Scientific Manuscript database
It has been more than 50 years since Arabidopsis (Arabidopsis thaliana) was first introduced as a model organism to understand basic processes in plant biology. A well-organized scientific community has used this small reference plant species to make numerous fundamental plant biology discoveries (P...
Evaluating Sustainable Development Solutions Quantitatively: Competence Modelling for GCE and ESD
ERIC Educational Resources Information Center
Böhm, Marko; Eggert, Sabina; Barkmann, Jan; Bögeholz, Susanne
2016-01-01
To comprehensively address global environmental challenges such as biodiversity loss, citizens need an understanding of the socio-economic fundamentals of human behaviour in relation to natural resources. We argue that Global Citizenship Education and Education for Sustainable Development provide a core set of socio-economic competencies that can…
Quantification of the level of crowdedness for pedestrian movements
NASA Astrophysics Data System (ADS)
Duives, Dorine C.; Daamen, Winnie; Hoogendoorn, Serge P.
2015-06-01
Within the realm of pedestrian research, numerous measures have been proposed to estimate the level of crowdedness experienced by pedestrians. However, within the field of pedestrian traffic flow modelling there does not seem to be consensus on which of these measures performs best. This paper shows that the shape of, and scatter within, the resulting fundamental diagrams differ considerably depending on the measure of crowdedness used. The main aim of the paper is to establish the advantages and disadvantages of the currently existing measures of crowdedness in order to evaluate which measures provide both accurate and consistent results. The assessment is based not only on the theoretical differences, but also on the qualitative and quantitative differences between the fundamental diagrams computed using the crowdedness measures on one and the same data set. The qualitative and quantitative functioning of the classical Grid-based measure is compared with the X-T measure, an Exponentially Weighted Distance measure, and a Voronoi-Diagram measure. The consistency of relating these crowdedness measures to the two macroscopic flow variables, velocity and flow, the computational efficiency, and the amount of scatter present within the fundamental diagrams produced by the different measures are reviewed. It is found that the Voronoi-Diagram and X-T measures are the most efficient and consistent measures of crowdedness.
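The classical grid-based measure mentioned in the abstract can be sketched in a few lines: count pedestrians per square cell and divide by the cell area. The cell size and coordinates below are illustrative, not values from the paper.

```python
def grid_density(positions, cell=2.0):
    """Classical grid-based crowdedness: count pedestrians falling in each
    square cell and divide by the cell area (pedestrians per m^2)."""
    counts = {}
    for x, y in positions:
        key = (int(x // cell), int(y // cell))
        counts[key] = counts.get(key, 0) + 1
    return {k: n / (cell * cell) for k, n in counts.items()}

# Three pedestrians, two sharing the cell at the origin (illustrative data)
densities = grid_density([(0.5, 0.5), (1.0, 1.5), (3.0, 0.5)], cell=2.0)
```

The Voronoi-based alternative instead assigns each pedestrian a density equal to the reciprocal of their Voronoi cell area, which avoids the arbitrary cell-boundary effects that make the grid measure scatter-prone.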
How we hear what is not there: A neural mechanism for the missing fundamental illusion
NASA Astrophysics Data System (ADS)
Chialvo, Dante R.
2003-12-01
How the brain estimates the pitch of a complex sound remains unsolved. Complex sounds are composed of more than one tone. When two tones occur together, a third lower-pitched tone is often heard. This is referred to as the "missing fundamental illusion" because the perceived pitch is a frequency (fundamental) for which there is no actual source vibration. This phenomenon exemplifies a larger variety of problems related to how pitch is extracted from complex tones, music and speech, and thus has been extensively used to test theories of pitch perception. A noisy nonlinear process is presented here as a candidate neural mechanism to explain the majority of reported phenomenology and provide specific quantitative predictions. The two basic premises of this model are as follows: (i) the individual tones composing the complex tone add linearly, producing peaks of constructive interference whose amplitude is always insufficient to fire the neuron; (ii) the spike threshold is reached only with noise, which naturally selects the maximum constructive interferences. The spacing of these maxima, and consequently of the spikes, occurs at a rate identical to the perceived pitch for the complex tone. Comparison with psychophysical and physiological data reveals a remarkable quantitative agreement not dependent on adjustable parameters. In addition, results from numerical simulations across different models are consistent, suggesting relevance to other sensory modalities.
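A deterministic toy version of this mechanism, deliberately omitting the noise term, already shows the interval structure: summing two pure tones and firing a "spike" at each rising threshold crossing yields inter-spike intervals at the missing fundamental. The frequencies (800 and 1000 Hz), sampling rate, and threshold below are illustrative choices, not the paper's parameters.

```python
import math

def spike_intervals(f1, f2, fs=100_000, dur=0.05, thresh=1.9):
    """Sum two pure tones and emit a 'spike' at each rising threshold
    crossing; constructive-interference maxima recur at 1/gcd(f1, f2)."""
    n = int(fs * dur)
    sig = [math.cos(2 * math.pi * f1 * i / fs) +
           math.cos(2 * math.pi * f2 * i / fs) for i in range(n)]
    spikes = [i for i in range(1, n) if sig[i] >= thresh > sig[i - 1]]
    return [(b - a) / fs for a, b in zip(spikes, spikes[1:])]

# 800 Hz + 1000 Hz: intervals of 5 ms, i.e. a 200 Hz missing fundamental
ivals = spike_intervals(800, 1000)
pitch = 1.0 / (sum(ivals) / len(ivals))
```

In the model itself the summed peaks stay below threshold and noise selects the largest interference maxima, but the resulting spike spacing is the same 1/f0 shown here.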
Mapping quantitative trait loci for binary trait in the F2:3 design.
Zhu, Chengsong; Zhang, Yuan-Ming; Guo, Zhigang
2008-12-01
In the analysis of inheritance of quantitative traits with low heritability, an F2:3 design that genotypes plants in F2 and phenotypes plants in the F2:3 progeny is often used in plant genetics. Although statistical approaches for mapping quantitative trait loci (QTL) in the F2:3 design have been well developed, those for binary traits of biological interest and economic importance are seldom addressed. In this study, an attempt was made to map binary trait loci (BTL) in the F2:3 design. The fundamental idea was as follows: the F2 plants were genotyped, all phenotypic values of each F2:3 progeny were measured for the binary trait, and these binary trait values and the marker genotype information were used to detect BTL under the penetrance and liability models. The proposed method was verified by a series of Monte Carlo simulation experiments. The results showed that maximum likelihood approaches under the penetrance and liability models provide accurate estimates of the effects and locations of BTL with high statistical power, even under low heritability. Moreover, the penetrance model is as efficient as the liability model, and the F2:3 design is more efficient than the classical F2 design, even though only a single progeny is collected from each F2:3 family. With the maximum likelihood approaches under the penetrance and liability models developed in this study, we can map binary traits as we do quantitative traits in the F2:3 design.
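As a sketch of the penetrance idea only (not the authors' full interval-mapping likelihood machinery), the per-genotype penetrance estimate and binomial log-likelihood at a single fully informative marker could look like the following; the affected/total counts are hypothetical.

```python
import math

# Hypothetical counts per F2 marker genotype: (affected, total) scored
# across the corresponding F2:3 progeny
counts = {"AA": (12, 40), "Aa": (30, 80), "aa": (34, 40)}

def penetrance_mle(counts):
    """Per-genotype penetrance MLE p = k/n and the binomial log-likelihood
    of the binary trait data at that marker."""
    est, loglik = {}, 0.0
    for geno, (k, n) in counts.items():
        p = k / n
        est[geno] = p
        if 0.0 < p < 1.0:
            loglik += k * math.log(p) + (n - k) * math.log(1.0 - p)
    return est, loglik

est, loglik = penetrance_mle(counts)
```

Comparing such a log-likelihood against a no-BTL null (a common penetrance across genotype classes) gives the likelihood-ratio test statistic used to scan for BTL positions.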
Quali-quantitative analysis (QQA): why it could open new frontiers for holistic health practice.
Bell, Erica
2006-12-15
Holistic health practice is often described as being about understanding the larger contexts of patients, their health services, and their communities. Yet do traditional quantitative and qualitative health research methods produce the best possible evidence for the holistic practices of doctors, nurses, and allied health professionals? This paper argues "no", and examines the potential of a cutting-edge, social science research method--Quali-Quantitative Research (QQA)--for providing better evidence for holistic practice, particularly in small-N populations, such as rural and remote communities. It does so with reference to the international literature on holistic medicine, as well as three holistic health projects conducted in Tasmania: about prevention of falls in older people, adolescent substance abuse, and interventions for children aged 0-5 exposed to domestic violence. The findings suggest that much health research fails to capture rigorously the contextual complexity of holistic health challenges: the multiple different needs of individual patients, and the interprofessional approaches needed to deliver multidisciplinary and multiservice health interventions tailored to meet those needs in particular community contexts. QQA offers a "configurational", case-based, diversity-oriented approach to analysing data that combines qualitative and quantitative techniques to overcome the limitations of both research traditions. The author concludes that QQA could open new frontiers for holistic health by helping doctors, nurses, and allied health professionals answer a fundamental question presented by complex health challenges: "Given this set of whole-of-patient needs, what elements of which interventions in what services would work best in this particular community?"
Li, Jiajia; Li, Rongxi; Zhao, Bangsheng; Guo, Hui; Zhang, Shuan; Cheng, Jinghua; Wu, Xiaoli
2018-04-15
The use of micro-laser Raman spectroscopy for quantitatively determining the carbon isotope composition of gases is presented. In this study, ¹²CO₂ and ¹³CO₂ were mixed with N₂ at various molar fraction ratios to obtain Raman quantification factors (F¹²CO₂ and F¹³CO₂), which provide a theoretical basis for calculating the δ¹³C value. The corresponding values were 0.523 (0
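Assuming molar fractions are obtained as Raman peak area divided by the species' quantification factor, the δ¹³C value follows from the corrected isotope ratio referenced to the VPDB standard. The peak areas and the 0.5 factor values in the example call are placeholders, not the study's calibration.

```python
R_VPDB = 0.0112372  # 13C/12C ratio of the VPDB reference standard

def delta13C(area_13, area_12, f_13, f_12):
    """delta-13C in permil, with molar fractions taken as Raman peak
    area divided by the corresponding quantification factor F."""
    r_sample = (area_13 / f_13) / (area_12 / f_12)
    return (r_sample / R_VPDB - 1.0) * 1000.0
```

A sample whose corrected ratio equals R_VPDB gives δ¹³C = 0‰, and a 1% enrichment gives +10‰, which is how the quantification factors translate measured intensities into the conventional delta notation.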
Leahy, Edmund; Chipchase, Lucy; Blackstock, Felicity
2017-04-17
Learning activities are fundamental for the development of expertise in physiotherapy practice. Continuing professional development (CPD) encompasses formal and informal learning activities undertaken by physiotherapists. Identifying the most efficient and effective learning activities is essential to enable the profession to assimilate research findings and improve clinical skills to ensure the most efficacious care for clients. To date, systematic reviews on the effectiveness of CPD provide limited guidance on the most efficacious models of professional development for physiotherapists. The aim of this systematic review is to evaluate which learning activities enhance physiotherapy practice. A search of Ovid MEDLINE, EMBASE, Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO (Psychological Abstracts), PEDro, Cochrane Library, AMED and Educational Resources and Information Center (ERIC) will be completed. Citation searching and reference list searching will be undertaken to locate additional studies. Quantitative and qualitative studies will be included if they examine the impact of learning activities on clinicians' behaviour, attitudes, knowledge, beliefs, skills, self-efficacy, work satisfaction and patient outcomes. Risk of bias will be assessed by two independent researchers. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) and Confidence in the Evidence from Reviews of Qualitative research (CERQual) will be used to synthesise results where a meta-analysis is possible. Where a meta-analysis is not possible, a narrative synthesis will be conducted. PROSPERO CRD42016050157.
NASA Astrophysics Data System (ADS)
Ghikas, Demetris P. K.; Oikonomou, Fotios D.
2018-04-01
Using generalized entropies that depend on two parameters, we propose a set of quantitative characteristics derived from the information geometry based on these entropies. Our aim, at this stage, is to construct first some fundamental geometric objects which will be used in the development of our geometrical framework. We first establish the existence of a two-parameter family of probability distributions. Then, using this family, we derive the associated metric and state a generalized Cramér-Rao inequality. This gives a first two-parameter classification of complex systems. Finally, computing the scalar curvature of the information manifold, we obtain a further discrimination of the corresponding classes. Our analysis is based on the two-parameter family of generalized entropies of Hanel and Thurner (2011).
A single-cell spiking model for the origin of grid-cell patterns
Kempter, Richard
2017-01-01
Spatial cognition in mammals is thought to rely on the activity of grid cells in the entorhinal cortex, yet the fundamental principles underlying the origin of grid-cell firing are still debated. Grid-like patterns could emerge via Hebbian learning and neuronal adaptation, but current computational models remain too abstract to allow direct confrontation with experimental data. Here, we propose a single-cell spiking model that generates grid firing fields via spike-rate adaptation and spike-timing-dependent plasticity. Through rigorous mathematical analysis applicable in the linear limit, we quantitatively predict the requirements for grid-pattern formation, and we establish a direct link to classical pattern-forming systems of the Turing type. Our study lays the groundwork for biophysically realistic models of grid-cell activity. PMID:28968386
Social in, social out: How the brain responds to social language with more social language.
O'Donnell, Matthew Brook; Falk, Emily B; Lieberman, Matthew D
Social connection is a fundamental human need. As such, people's brains are sensitized to social cues, such as those carried by language, and to promoting social communication. The neural mechanisms of certain key building blocks in this process, such as receptivity to and reproduction of social language, however, are not known. We combined quantitative linguistic analysis and neuroimaging to connect neural activity in brain regions used to simulate the mental states of others with exposure to, and re-transmission of, social language. Our results link findings on successful idea transmission from communication science, sociolinguistics and cognitive neuroscience to prospectively predict the degree of social language that participants use when re-transmitting ideas as a function of (1) initial language inputs and (2) neural activity during idea exposure.
Hierarchy of models: From qualitative to quantitative analysis of circadian rhythms in cyanobacteria
NASA Astrophysics Data System (ADS)
Chaves, M.; Preto, M.
2013-06-01
A hierarchy of models, ranging from high to lower levels of abstraction, is proposed to construct "minimal" but predictive and explanatory models of biological systems. Three hierarchical levels will be considered: Boolean networks, piecewise affine (PWA) differential equations, and a class of continuous ordinary differential equation models derived from the PWA model. This hierarchy provides different levels of approximation of the biological system and, crucially, allows the use of theoretical tools to more exactly analyze and understand the mechanisms of the system. The KaiABC oscillator, which is at the core of the cyanobacterial circadian rhythm, is analyzed as a case study, showing how several fundamental properties—order of oscillations, synchronization when mixing oscillating samples, structural robustness, and entrainment by external cues—can be obtained from basic mechanisms.
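The highest level of the hierarchy can be illustrated with a minimal synchronous Boolean network. The three-node negative-feedback loop below is a generic toy example, not the actual KaiABC model, but it already shows how order of oscillations falls out of purely logical update rules.

```python
def step(state):
    """One synchronous update of a toy three-gene Boolean loop:
    a' = not c (repression closes the loop), b' = a, c' = b."""
    a, b, c = state
    return (not c, a, b)

# From (on, off, off) the loop oscillates with period 6, visiting each
# state of the cycle in a fixed order
traj = [(True, False, False)]
for _ in range(6):
    traj.append(step(traj[-1]))
```

In the hierarchy, each Boolean transition of this kind is refined into a PWA threshold dynamics and then into smooth ODEs, so properties proved at this coarse level can be tracked down to the quantitative models.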
Single Cell Gene Expression Profiling of Skeletal Muscle-Derived Cells.
Gatto, Sole; Puri, Pier Lorenzo; Malecova, Barbora
2017-01-01
Single cell gene expression profiling is a fundamental tool for studying the heterogeneity of a cell population by addressing the phenotypic and functional characteristics of each cell. Technological advances that have coupled microfluidic technologies with high-throughput quantitative RT-PCR analyses have enabled detailed analyses of single cells in various biological contexts. In this chapter, we describe the procedure for isolating the skeletal muscle interstitial cells termed Fibro-Adipogenic Progenitors (FAPs) and their gene expression profiling at the single cell level. Moreover, we accompany our bench protocol with a bioinformatics analysis designed to process raw data as well as to visualize single cell gene expression data. Single cell gene expression profiling is therefore a useful tool in the investigation of FAP heterogeneity and their contribution to muscle homeostasis.
NASA Astrophysics Data System (ADS)
Janssen, Paul; Wouters, Steinar H. W.; Cox, Matthijs; Koopmans, Bert
2013-11-01
In recent years, it was discovered that the current through an organic semiconductor sandwiched between two non-magnetic electrodes can be changed significantly by applying a small magnetic field. This surprisingly large magnetoresistance effect, often dubbed organic magnetoresistance (OMAR), has puzzled the young field of organic spintronics during the last decade. Here, we present a detailed study of the voltage and temperature dependence of OMAR, aiming to unravel the lineshapes of the magnetic field effects and thereby gain a deeper fundamental understanding of the underlying microscopic mechanism. Using a full quantitative analysis of the lineshapes, we are able to extract all linewidth parameters, and the voltage and temperature dependencies are explained by a recently proposed trion mechanism. Moreover, explicit microscopic simulations show qualitative agreement with the experimental results.
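Two lineshapes commonly fitted to organic magnetoresistance data in the literature are the Lorentzian and an empirical non-Lorentzian form; the sketch below simply evaluates both. This is a generic illustration of lineshape analysis under those assumed functional forms, not the trion-model fit performed in this study, and the width value is illustrative.

```python
def lorentzian(B, B0):
    """Lorentzian lineshape: MR(B) proportional to B^2 / (B^2 + B0^2)."""
    return B ** 2 / (B ** 2 + B0 ** 2)

def non_lorentzian(B, B0):
    """Empirical non-Lorentzian lineshape often used for OMAR data:
    MR(B) proportional to B^2 / (|B| + B0)^2, with broader tails."""
    return B ** 2 / (abs(B) + B0) ** 2
```

Both forms saturate at large field, but they differ markedly around B ≈ B0, which is why extracting the linewidth parameter B0 from a fit discriminates between candidate microscopic mechanisms.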
Internet research: an opportunity to revisit classic ethical problems in behavioral research.
Pittenger, David J
2003-01-01
The Internet offers many new opportunities for behavioral researchers to conduct quantitative and qualitative research. Although the ethical guidelines of the American Psychological Association generalize, in part, to research conducted through the Internet, several matters related to Internet research require further analysis. This article reviews several fundamental ethical issues related to Internet research, namely the preservation of privacy, the issuance of informed consent, the use of deception and false feedback, and research methods. In essence, the Internet offers unique challenges to behavioral researchers. Among these are the need to better define the distinction between private and public behavior performed through the Internet, ensure mechanisms for obtaining valid informed consent from participants and performing debriefing exercises, and verify the validity of data collected through the Internet.
Autonomous space processor for orbital debris
NASA Technical Reports Server (NTRS)
1990-01-01
This work continues to develop advanced designs toward the ultimate goal of a Get Away Special to demonstrate economical removal of orbital debris using local resources in orbit. The fundamental technical feasibility was demonstrated in 1988 through theoretical calculations, quantitative computer animation, a solar focal point cutter, a robotic arm design, and a subscale model. Last year improvements were made to the solar cutter and the robotic arm. Also performed last year was a mission analysis that showed the feasibility of retrieving at least four large (greater than 1500-kg) pieces of debris. Advances made during this reporting period are the incorporation of digital control with the existing placement arm, the development of a new robotic manipulator arm, and the study of debris spin attenuation. These advances are discussed here.
Marconcini, Simone; Covani, Ugo; Barone, Antonio; Vittorio, Orazio; Curcio, Michele; Barbuti, Serena; Scatena, Fabrizio; Felli, Lamberto; Nicolini, Claudio
2011-07-01
Periodontitis is a complex multifactorial disease and is typically polygenic in origin. Genes play a fundamental part in each biologic process forming complex networks of interactions. However, only some genes have a high number of interactions with other genes in the network and may, therefore, be considered to play an important role. In a preliminary bioinformatic analysis, five genes that showed a higher number of interactions were identified and termed leader genes. In the present study, we use real-time quantitative polymerase chain reaction (PCR) technology to evaluate the expression levels of leader genes in the leukocytes of 10 patients with refractory chronic periodontitis and compare the expression levels with those of the same genes in 24 healthy subjects. Blood was collected from 24 healthy human subjects and 10 patients with refractory chronic periodontitis and placed into heparinized blood collection tubes by personnel trained in phlebotomy using a sterile technique. Blood leukocyte cells were immediately lysed by using a kit for total RNA purification from human whole blood. Complementary DNA (cDNA) synthesis was obtained from total RNA and then real-time quantitative PCR was performed. PCR efficiencies were calculated with a relative standard curve derived from a five-point cDNA dilution series in triplicate that gave regression coefficients >0.98 and efficiencies >96%. The standard curves were obtained using glyceraldehyde-3-phosphate dehydrogenase (GAPDH) and growth factor receptor binding protein 2 (GRB2), casitas B-lineage lymphoma (CBL), nuclear factor-KB1 (NFKB1), and REL-A (gene for transcription factor p65) gene primers and amplified with 1.6, 8, 40, 200, and 1,000 ng/μL total cDNA. Curves obtained for each sample showed a linear relationship between RNA concentrations and the cycle threshold value of real-time quantitative PCR for all genes. Data were expressed as mean ± SE (SEM). The groups were compared using analysis of variance.
A probability value <0.01 was considered statistically significant. The present study agrees with the preliminary bioinformatics analysis. In our experiments, the association of pathology with the genes was statistically significant for GRB2 and CBL (P <0.01), and it was not statistically significant for REL-A and NFKB1. This article lends support to our preliminary hypothesis that assigned an important role in refractory chronic periodontitis to leader genes.
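The standard-curve efficiencies quoted above (regression coefficients >0.98, efficiencies >96%) follow from the slope of the Ct-versus-log10(input) regression, E = 10^(−1/slope) − 1. A minimal sketch of that calculation; the Ct values below are hypothetical, chosen only to mimic a five-point dilution series like the study's:

```python
import numpy as np

def pcr_efficiency(ng_input, ct_values):
    """Amplification efficiency from a qPCR standard curve: fit Ct vs log10(input);
    E = 10**(-1/slope) - 1, where E = 1.0 means perfect doubling per cycle."""
    slope, intercept = np.polyfit(np.log10(ng_input), ct_values, 1)
    r2 = np.corrcoef(np.log10(ng_input), ct_values)[0, 1] ** 2
    return 10 ** (-1.0 / slope) - 1.0, r2

# five-point dilution series as in the study (1.6 ... 1000 ng/uL); Ct values are invented
dilutions = np.array([1.6, 8.0, 40.0, 200.0, 1000.0])
cts = np.array([30.1, 27.8, 25.4, 23.1, 20.8])  # ~ -3.33 cycles per decade
eff, r2 = pcr_efficiency(dilutions, cts)
```

A slope near −3.32 cycles per decade corresponds to ~100% efficiency, consistent with the >96% figure reported.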
Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?
Gizak, Agnieszka; Rakus, Dariusz
2016-01-11
Molecular and cellular biology methodology is traditionally based on the reasoning called "the mechanistic explanation". In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems' complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to a quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modifications changes within selected proteins. A quantitative proteomics approach gives a possibility of quantitative characterization of the entire proteome of a biological system, in the context of the titer of proteins as well as their post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.
A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.
Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin
2017-06-01
Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data stays difficult because of limited sample size. This limitation also leads to the practice of using a competitive null as common approach; which fundamentally implies genes or proteins as independent units. The independent assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, sample covariance may not be a precise estimation if the sample size is very limited, which is usually the case for the data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed by the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. 
We implemented the T2-statistic into an R package T2GA, which is available at https://github.com/roqe/T2GA.
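The statistic described is a Hotelling-type T² in which the covariance matrix is assembled from external interaction confidence scores rather than estimated from the few available samples. A toy sketch under that assumption; the protein ratios and confidence values below are invented for illustration, and the T2GA package linked above should be consulted for the real implementation:

```python
import numpy as np

def knowledge_t2(log_ratios, sigma):
    """Self-contained multivariate test statistic: T2 = n * xbar' Sigma^{-1} xbar,
    where Sigma is built from external interaction confidence scores (e.g. from
    STRING or HitPredict), not from the (too small) sample covariance."""
    x = np.asarray(log_ratios, dtype=float)   # shape: samples x proteins
    n, p = x.shape
    xbar = x.mean(axis=0)
    return n * xbar @ np.linalg.solve(np.asarray(sigma, dtype=float), xbar)

# toy pathway of 3 proteins, 4 replicates of log2 expression ratios (hypothetical)
ratios = np.array([[1.1, 0.9, 0.2],
                   [1.3, 1.0, 0.1],
                   [0.9, 0.8, 0.3],
                   [1.2, 1.1, 0.0]])
# off-diagonal entries scaled by STRING-like confidence scores (assumed values)
sigma = np.array([[1.0, 0.7, 0.1],
                  [0.7, 1.0, 0.1],
                  [0.1, 0.1, 1.0]])
t2 = knowledge_t2(ratios, sigma)
```

Because Sigma encodes known associations, correlated proteins do not contribute as independent evidence, which is the false-positive problem the abstract describes.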
Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments
NASA Astrophysics Data System (ADS)
Atwal, Gurinder S.; Kinney, Justin B.
2016-03-01
A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.
De Benedetti, Pier G; Fanelli, Francesca
2018-03-21
Simple comparative correlation analyses and quantitative structure-kinetics relationship (QSKR) models highlight the interplay of kinetic rates and binding affinity as an essential feature in drug design and discovery. The choice of the molecular series, and their structural variations, used in QSKR modeling is fundamental to understanding the mechanistic implications of ligand and/or drug-target binding and/or unbinding processes. Here, we discuss the implications of linear correlations between kinetic rates and binding affinity constants and the relevance of the computational approaches to QSKR modeling.
ULTRASONIC STUDIES OF THE FUNDAMENTAL MECHANISMS OF RECRYSTALLIZATION AND SINTERING OF METALS
DOE Office of Scientific and Technical Information (OSTI.GOV)
TURNER, JOSEPH A.
2005-11-30
The purpose of this project was to develop a fundamental understanding of the interaction of an ultrasonic wave with complex media, with specific emphases on recrystallization and sintering of metals. A combined analytical, numerical, and experimental research program was implemented. Theoretical models of elastic wave propagation through these complex materials were developed using stochastic wave field techniques. The numerical simulations focused on finite element wave propagation solutions through complex media. The experimental efforts were focused on corroboration of the models developed and on the development of new experimental techniques. The analytical and numerical research allows the experimental results to be interpreted quantitatively.
Methane spectral line widths and shifts, and dependences on physical parameters
NASA Technical Reports Server (NTRS)
Fox, K.; Quillen, D. T.; Jennings, D. E.; Wagner, J.; Plymate, C.
1991-01-01
A detailed report of recent high-resolution spectroscopic research on widths and shifts measured for a strong infrared-active fundamental of methane is presented. The widths and shifts were measured for collisions with several rare gases and diatomic molecules, in the vibrational-rotational fundamental near 3000 cm⁻¹. These measurements were made at an ambient temperature of 294 K over a range of pressures from 100 to 700 torr. The measurements are discussed in a preliminary but detailed and quantitative manner with reference to masses, polarizabilities, and quadrupole moments. Some functional dependences on these physical parameters are considered. The present data are useful for studies of corresponding planetary spectra.
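Collisional half-widths measured over a pressure range such as 100-700 torr are conventionally reduced to a pressure-broadening coefficient by a linear fit, width ≈ γ·p. A minimal sketch of that reduction with hypothetical numbers, not the measured methane values:

```python
import numpy as np

def broadening_coefficient(p_torr, widths_cm1):
    """Pressure-broadening coefficient from a linear fit of Lorentzian
    half-widths versus perturber pressure: width = gamma * p + offset."""
    gamma, offset = np.polyfit(p_torr, widths_cm1, 1)
    return gamma, offset

# hypothetical half-widths for a CH4 line perturbed by a rare gas, 100-700 torr
p = np.array([100.0, 300.0, 500.0, 700.0])   # torr
w = np.array([0.008, 0.024, 0.040, 0.056])   # cm^-1, illustrative
gamma, off = broadening_coefficient(p, w)    # gamma in cm^-1 / torr
```

The same fit applied to line-center positions versus pressure yields the pressure-shift coefficient.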
A Quantitative Electrochemiluminescence Assay for Clostridium perfringens alpha toxin
2006-08-10
A Review of Web Information Seeking Research: Considerations of Method and Foci of Interest
ERIC Educational Resources Information Center
Martzoukou, Konstantina
2005-01-01
Introduction: This review shows that Web information seeking research suffers from inconsistencies in method and a lack of homogeneity in research foci. Background: Qualitative and quantitative methods are needed to produce a comprehensive view of information seeking. Studies also recommend observation as one of the most fundamental ways of…
USDA-ARS?s Scientific Manuscript database
N-nitroso compounds are recognized as important dietary carcinogens. Accurate assessment of N-nitroso intake is fundamental to advancing research regarding its role with cancer. Previous studies have not used a quantitative database to estimate the intake of these compounds in a US population. To ad...
An Evaluation of Oral Language: The Relationship between Listening, Speaking and Self-Efficacy
ERIC Educational Resources Information Center
Demir, Sezgin
2017-01-01
Listening and speaking skills are fundamental determinants of an individual's academic success. The aim of this research is to establish the relationship between listening and speaking skills, and study how listening predicts and cognitively arranges speaking. The research was carried out using the quantitative pattern in correlational type. The…
ERIC Educational Resources Information Center
Rayno, Marisue
2010-01-01
Nursing student attrition in community colleges negatively affects students, faculty, colleges, and the nursing profession. The purpose of this quantitative correlational retrospective research study was to examine the possible relationships between each of the independent variables of academic preparedness (as measured by NET mathematics and…
Human Abilities and Modes of Attention: The Issue of Stylistic Consistencies in Cognition.
ERIC Educational Resources Information Center
Messick, Samuel
Spearman's notions of mental energy and mental span presage modern conceptions of attentional resources and working memory as fundamental to intelligence. Viewing attention as the conative directing of the intellect, as "the application of intellectual energy," Spearman's quantitative law of mental span deals with limits on the…
Know-how and know-why in biochemical engineering.
von Stockar, U; Valentinotti, S; Marison, I; Cannizzaro, C; Herwig, C
2003-08-01
This contribution analyzes the position of biochemical engineering in general and bioprocess engineering particularly in the force fields between fundamental science and applications, and between academia and industry. By using culture technology as an example, it can be shown that bioprocess engineering has moved slowly but steadily from an empirical art concerned with mainly know-how to a science elucidating the know-why of culture behavior. Highly powerful monitoring tools enable biochemical engineers to understand and explain quantitatively the activity of cellular culture on a metabolic basis. Among these monitoring tools are not just semi-online analyses of culture broth by HPLC, GC and FIA, but, increasingly, also noninvasive methods such as midrange IR, Raman and capacitance spectroscopy, as well as online calorimetry. The detailed and quantitative insight into the metabolome and the fluxome that bioprocess engineers are establishing offers an unprecedented opportunity for building bridges between molecular biology and engineering biosciences. Thus, one of the major tasks of biochemical engineering sciences is not developing new know-how for industrial applications, but elucidating the know-why in biochemical engineering by conducting research on the underlying scientific fundamentals.
Quantitative prediction of perceptual decisions during near-threshold fear detection
NASA Astrophysics Data System (ADS)
Pessoa, Luiz; Padmala, Srikanth
2005-04-01
A fundamental goal of cognitive neuroscience is to explain how mental decisions originate from basic neural mechanisms. The goal of the present study was to investigate the neural correlates of perceptual decisions in the context of emotional perception. To probe this question, we investigated how fluctuations in functional MRI (fMRI) signals were correlated with behavioral choice during a near-threshold fear detection task. fMRI signals predicted behavioral choice independently of stimulus properties and task accuracy in a network of brain regions linked to emotional processing: posterior cingulate cortex, medial prefrontal cortex, right inferior frontal gyrus, and left insula. We quantified the link between fMRI signals and behavioral choice in a whole-brain analysis by determining choice probabilities by means of signal-detection theory methods. Our results demonstrate that voxel-wise fMRI signals can reliably predict behavioral choice in a quantitative fashion (choice probabilities ranged from 0.63 to 0.78) at levels comparable to neuronal data. We suggest that the conscious decision that a fearful face has been seen is represented across a network of interconnected brain regions that prepare the organism to appropriately handle emotionally challenging stimuli and that regulate the associated emotional response. Keywords: decision making; emotion; functional MRI.
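Choice probabilities of the kind reported (0.63 to 0.78) are ROC areas computed with signal-detection methods: the probability that a signal drawn at random from one choice category exceeds a signal drawn from the other. A minimal sketch with invented single-voxel fMRI amplitudes:

```python
import numpy as np

def choice_probability(sig_choice_a, sig_choice_b):
    """ROC-area (signal-detection) choice probability: the probability that a
    randomly drawn signal from choice-A trials exceeds one from choice-B trials.
    Equivalent to the Mann-Whitney U statistic divided by n_a * n_b; ties count 0.5."""
    a = np.asarray(sig_choice_a, dtype=float)
    b = np.asarray(sig_choice_b, dtype=float)
    wins = (a[:, None] > b[None, :]).sum() + 0.5 * (a[:, None] == b[None, :]).sum()
    return wins / (a.size * b.size)

# hypothetical single-voxel amplitudes on 'fear seen' versus 'not seen' trials
seen = np.array([1.2, 0.5, 1.5, 0.8, 0.9])
unseen = np.array([0.4, 0.7, 0.2, 0.6, 0.5])
cp = choice_probability(seen, unseen)
```

A value of 0.5 indicates no relation between the voxel's signal and the behavioral report; values approaching 1.0 indicate strong choice prediction.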
Automatic detection and quantitative analysis of cells in the mouse primary motor cortex
NASA Astrophysics Data System (ADS)
Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui
2014-09-01
Neuronal cells play a very important role in metabolic regulation and mechanism control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired with 1-μm voxel resolution. We have developed a completely automatic pipeline that detects cell centroids and provides three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; and iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum using the output vectorizations of detected cell centroids in the mouse primary motor cortex, and find significant variations in cellular density across layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
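Recall and precision figures such as the 92.1% and 86.2% above come from matching detected centroids to ground-truth centroids within a tolerance radius. A simple greedy-matching sketch; the tolerance and coordinates are illustrative assumptions, not the paper's protocol:

```python
import numpy as np

def detection_scores(detected, truth, tol=2.0):
    """Greedily match detected cell centroids to ground-truth centroids within a
    tolerance radius; each detection matches at most one truth point.
    Returns (recall, precision)."""
    detected = [np.asarray(d, dtype=float) for d in detected]
    unmatched = list(range(len(detected)))
    tp = 0
    for t in np.asarray(truth, dtype=float):
        dists = [np.linalg.norm(detected[i] - t) for i in unmatched]
        if dists and min(dists) <= tol:
            tp += 1
            unmatched.pop(int(np.argmin(dists)))   # consume the matched detection
    return tp / len(truth), tp / len(detected)

truth = [(10, 10, 5), (30, 12, 5), (50, 40, 8)]          # ground-truth centroids (voxels)
detected = [(10.5, 10.2, 5.1), (31, 12, 5), (80, 80, 8)]  # one miss, one false positive
rec, prec = detection_scores(detected, truth)
```

Recall is the fraction of true cells found; precision is the fraction of detections that are real cells.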
A simulation model of IT risk on program trading
NASA Astrophysics Data System (ADS)
Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan
2015-12-01
The biggest difficulty in measuring IT risk for program trading lies in the scarcity of loss data. In view of this, the usual scholarly approach is to collect records of IT incidents at home and abroad from courts, networks, and other public media, and to base quantitative IT-risk analysis on the resulting loss database. However, an IT risk loss database built this way reflects the real situation only loosely and cannot provide a fundamental explanation. In this paper, building on the concepts and steps of Monte Carlo (MC) simulation, we adopt a computer-simulation method: MC simulation is applied within the "Program trading simulation system" developed by our team to simulate real program trading, and IT-risk loss data are obtained through IT-failure experiments; the effectiveness of the experimental data is verified at the end of the article. In this way we overcome the deficiency of the traditional research method and solve the problem of the lack of IT-risk data for quantitative research, and we provide researchers with a template of ideas and procedures for studying such problems by simulation.
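The general shape of such an MC loss experiment can be sketched as follows; the failure rate, loss distribution, and trading-day count are invented placeholders, not parameters of the authors' simulation system:

```python
import random

def simulate_annual_losses(n_years, failure_rate, loss_mu, loss_sigma, seed=42):
    """Monte Carlo sketch of program-trading IT risk: each trading day an IT
    failure occurs with some probability, and each failure draws a lognormal
    loss. Returns one simulated total loss per year."""
    rng = random.Random(seed)                    # seeded for reproducibility
    annual = []
    for _ in range(n_years):
        total = 0.0
        for _day in range(250):                  # trading days per year (assumed)
            if rng.random() < failure_rate:      # an IT failure occurs this day
                total += rng.lognormvariate(loss_mu, loss_sigma)
        annual.append(total)
    return annual

losses = simulate_annual_losses(n_years=1000, failure_rate=0.02,
                                loss_mu=10.0, loss_sigma=0.8)
var_95 = sorted(losses)[int(0.95 * len(losses))]  # 95% value-at-risk estimate
```

Summary statistics such as a value-at-risk quantile can then be read directly from the simulated loss distribution, which is exactly what a sparse empirical database cannot support.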
The genetic architecture of photosynthesis and plant growth-related traits in tomato.
de Oliveira Silva, Franklin Magnum; Lichtenstein, Gabriel; Alseekh, Saleh; Rosado-Souza, Laise; Conte, Mariana; Suguiyama, Vanessa Fuentes; Lira, Bruno Silvestre; Fanourakis, Dimitrios; Usadel, Björn; Bhering, Leonardo Lopes; DaMatta, Fábio M; Sulpice, Ronan; Araújo, Wagner L; Rossi, Magdalena; de Setta, Nathalia; Fernie, Alisdair R; Carrari, Fernando; Nunes-Nesi, Adriano
2018-02-01
To identify genomic regions involved in the regulation of fundamental physiological processes such as photosynthesis and respiration, a population of Solanum pennellii introgression lines was analyzed. We determined phenotypes for physiological, metabolic, and growth related traits, including gas exchange and chlorophyll fluorescence parameters. Data analysis allowed the identification of 208 physiological and metabolic quantitative trait loci with 33 of these being associated to smaller intervals of the genomic regions, termed BINs. Eight BINs were identified that were associated with higher assimilation rates than the recurrent parent M82. Two and 10 genomic regions were related to shoot and root dry matter accumulation, respectively. Nine genomic regions were associated with starch levels, whereas 12 BINs were associated with the levels of other metabolites. Additionally, a comprehensive and detailed annotation of the genomic regions spanning these quantitative trait loci allowed us to identify 87 candidate genes that putatively control the investigated traits. We confirmed 8 of these at the level of variance in gene expression. Taken together, our results allowed the identification of candidate genes that most likely regulate photosynthesis, primary metabolism, and plant growth and as such provide new avenues for crop improvement. © 2017 John Wiley & Sons Ltd.
Analysis of Self-Potential Response beyond the Fixed Geometry Technique
NASA Astrophysics Data System (ADS)
Mahardika, Harry
2018-03-01
The self-potential (SP) method is one of the oldest geophysical methods still in use today. Since its early days, SP data interpretation was done qualitatively, until the emergence of fixed-geometry analysis, which was used to characterize the orientation and electric-dipole properties of a mineral ore structure. Through advances in fundamental theory, computational methods, and field and laboratory experiments over the last fifteen years, the SP method has risen from its modest reputation to become more respectable. It has become a complementary package alongside electric-resistivity tomography (ERT) for detecting groundwater flow in the subsurface, and extends to hydrothermal flow in geothermal areas. As the analysis of SP data becomes more quantitative, its potential applications become more diverse. In this paper, we show examples of our current SP studies, such as groundwater flow characterization inside a fault area. Lastly, we introduce the application of the "active" SP method, that is, the seismoelectric method, which can be used for 4D real-time monitoring systems.
Nahan, Keaton S; Alvarez, Noe; Shanov, Vesselin; Vonderheide, Anne
2017-11-01
Mass spectrometry continues to tackle many complicated tasks, and ongoing research seeks to simplify its instrumentation as well as sampling. The desorption electrospray ionization (DESI) source was the first ambient ionization source to function without extensive gas requirements and chromatography. Electrospray techniques generally have low efficiency for ionization of nonpolar analytes and some researchers have resorted to methods such as direct analysis in real time (DART) or desorption atmospheric pressure chemical ionization (DAPCI) for their analysis. In this work, a carbon nanotube fiber ionization (nanoCFI) source was developed and was found to be capable of solid phase microextraction (SPME) of nonpolar analytes as well as ionization and sampling similar to that of direct probe atmospheric pressure chemical ionization (DP-APCI). Conductivity and adsorption were maintained by utilizing a corona pin functionalized with a multi-walled carbon nanotube (MWCNT) thread. Quantitative work with the nanoCFI source with a designed corona discharge pin insert demonstrated linearity up to 0.97 (R 2 ) of three target PAHs with phenanthrene internal standard.
Film analysis employing subtarget effect using 355 nm Nd-YAG laser-induced plasma at low pressure
NASA Astrophysics Data System (ADS)
Hedwig, Rinda; Budi, Wahyu Setia; Abdulmadjid, Syahrun Nur; Pardede, Marincan; Suliyanti, Maria Margaretha; Lie, Tjung Jie; Kurniawan, Davy Putra; Kurniawan, Koo Hendrik; Kagawa, Kiichiro; Tjia, May On
2006-12-01
The applicability of spectrochemical analysis for liquid and powder samples of minute amount in the form of thin film was investigated using ultraviolet Nd-YAG laser (355 nm) and low-pressure ambient air. A variety of organic samples such as commercial black ink usually used for stamp pad, ginseng extract, human blood, liquid milk and ginseng powder was prepared as film deposited on the surface of an appropriate hard substrate such as copper plate or glass slide. It was demonstrated that in all cases studied, good quality spectra were obtained with very low background and free from undesirable contamination by the substrate elements, featuring ppm or even sub-ppm sensitivity and worthy of application for quantitative analysis of organic samples. The proper preparation of the films was found to be crucial in achieving the high quality spectra. It was further shown that much inferior results were obtained when the atmospheric-pressure (101 kPa) operating condition of laser-induced breakdown spectroscopy or the fundamental wavelength of the Nd-YAG laser was employed due to the excessive or improper laser ablation process.
Lorenz, Kevin S.; Salama, Paul; Dunn, Kenneth W.; Delp, Edward J.
2013-01-01
Digital image analysis is a fundamental component of quantitative microscopy. However, intravital microscopy presents many challenges for digital image analysis. In general, microscopy volumes are inherently anisotropic, suffer from decreasing contrast with tissue depth, lack object edge detail, and characteristically have low signal levels. Intravital microscopy introduces the additional problem of motion artifacts, resulting from respiratory motion and heartbeat from specimens imaged in vivo. This paper describes an image registration technique for use with sequences of intravital microscopy images collected in time-series or in 3D volumes. Our registration method involves both rigid and non-rigid components. The rigid registration component corrects global image translations, while the non-rigid component manipulates a uniform grid of control points defined by B-splines. Each control point is optimized by minimizing a cost function consisting of two parts: a term to define image similarity, and a term to ensure deformation grid smoothness. Experimental results indicate that this approach is promising based on the analysis of several image volumes collected from the kidney, lung, and salivary gland of living rodents.
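The two-part cost function described, an image-similarity term plus a smoothness penalty on the B-spline control grid, can be sketched as below. Sum-of-squared-differences similarity and a squared finite-difference smoothness term are simplifying assumptions; the paper's actual metric and penalty may differ:

```python
import numpy as np

def registration_cost(fixed, moving, grid, alpha=0.1):
    """Sketch of a two-term deformable-registration cost: an image-similarity
    term (sum of squared intensity differences) plus a smoothness penalty on
    the control-point displacement grid (squared finite differences), weighted
    by alpha."""
    ssd = np.mean((fixed - moving) ** 2)
    dy = np.diff(grid, axis=0) ** 2        # displacement differences between rows
    dx = np.diff(grid, axis=1) ** 2        # ... and between columns
    smooth = dy.mean() + dx.mean()
    return ssd + alpha * smooth

fixed = np.zeros((8, 8)); fixed[2:6, 2:6] = 1.0
moving = np.roll(fixed, 1, axis=1)         # translated copy of the fixed image
grid = np.zeros((4, 4, 2))                 # 4x4 control points, 2D displacements
cost_misaligned = registration_cost(fixed, moving, grid)
aligned = np.roll(moving, -1, axis=1)      # undo the translation (rigid step)
cost_aligned = registration_cost(fixed, aligned, grid)
```

An optimizer adjusts the control-point displacements to minimize this cost; the smoothness weight keeps the deformation grid from folding while the similarity term drives alignment.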
Doll, Sophia; Urisman, Anatoly; Oses-Prieto, Juan A; Arnott, David; Burlingame, Alma L
2017-01-01
Glioblastoma multiformes (GBMs) are high-grade astrocytomas and the most common brain malignancies. Primary GBMs are often associated with disturbed RAS signaling, and expression of oncogenic HRAS results in a malignant phenotype in glioma cell lines. Secondary GBMs arise from lower-grade astrocytomas, have slower progression than primary tumors, and contain IDH1 mutations in over 70% of cases. Despite significant amount of accumulating genomic and transcriptomic data, the fundamental mechanistic differences of gliomagenesis in these two types of high-grade astrocytoma remain poorly understood. Only a few studies have attempted to investigate the proteome, phosphorylation signaling, and epigenetic regulation in astrocytoma. In the present study, we applied quantitative phosphoproteomics to identify the main signaling differences between oncogenic HRAS and mutant IDH1-driven glioma cells as models of primary and secondary GBM, respectively. Our analysis confirms the driving roles of the MAPK and PI3K/mTOR signaling pathways in HRAS driven cells and additionally uncovers dysregulation of other signaling pathways. Although a subset of the signaling changes mediated by HRAS could be reversed by a MEK inhibitor, dual inhibition of MEK and PI3K resulted in more complete reversal of the phosphorylation patterns produced by HRAS expression. In contrast, cells expressing mutant IDH1 did not show significant activation of MAPK or PI3K/mTOR pathways. Instead, global downregulation of protein expression was observed. Targeted proteomic analysis of histone modifications identified significant histone methylation, acetylation, and butyrylation changes in the mutant IDH1 expressing cells, consistent with a global transcriptional repressive state. Our findings offer novel mechanistic insight linking mutant IDH1 associated inhibition of histone demethylases with specific histone modification changes to produce global transcriptional repression in secondary glioblastoma. 
Our proteomic datasets are available for download and provide a comprehensive catalogue of alterations in protein abundance, phosphorylation, and histone modifications in oncogenic HRAS and IDH1 driven astrocytoma cells beyond the transcriptomic level. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Mateo, Carlos, E-mail: cgm@cenim.csic.es
Since the major strengthening mechanisms in nanocrystalline bainitic steels arise from the exceptionally small size of the bainitic ferrite plate, accurate determination of this parameter is fundamental for quantitatively relating the microstructure to the mechanical properties. In this work, the thickness of the bainitic ferrite subunits obtained by different bainitic heat treatments was determined in two steels, with carbon contents of 0.3 and 0.7 wt.%, from SEM and TEM micrographs. As these measurements were made on 2D images taken from random sections, the method includes some stereological correction factors to obtain accurate information. Finally, the determined thicknesses of bainitic ferrite plates were compared with the crystallite size calculated from the analysis of X-ray diffraction peak broadening. Although in some cases the values obtained for crystallite size and plate thickness can be similar, this study confirms that they are indeed two different parameters. - Highlights: • Bainitic microstructure in a nanostructured and sub-micron steel • Bainitic ferrite plate thickness measured by SEM and TEM • Crystallite size determined by X-ray analysis.
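The stereological step can be sketched as follows. This assumes the mean-lineal-intercept correction t = (2/π)·L̄ commonly used for randomly sectioned plates; the abstract does not give the paper's exact correction factors, and the measurement values below are hypothetical.

```python
from math import pi
from statistics import mean

def plate_thickness(intercepts):
    """Estimate true bainitic-ferrite plate thickness from apparent
    thicknesses measured on random 2D sections, using the common
    stereological correction t = (2/pi) * L_bar, where L_bar is the
    mean lineal intercept measured normal to the plate length."""
    L_bar = mean(intercepts)
    return (2.0 / pi) * L_bar

# hypothetical apparent thicknesses (nm) from SEM micrographs
apparent = [60.0, 55.0, 70.0, 65.0]
t_true = plate_thickness(apparent)
```

The correction matters because a random section almost never cuts a plate exactly perpendicular to its faces, so raw 2D measurements systematically overestimate the true thickness.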
Investigating student understanding of simple harmonic motion
NASA Astrophysics Data System (ADS)
Somroob, S.; Wattanakasiwich, P.
2017-09-01
This study aimed to investigate students' understanding of simple harmonic motion and to develop instructional materials on the topic. Participants were 60 students taking a course on vibrations and waves, 46 students taking a course on Physics 2, and 28 students taking a course on Fundamental Physics 2 in the second semester of the 2016 academic year. A 16-question conceptual test and tutorial activities were developed from previous research findings and evaluated by three physics experts in teaching mechanics before use in a real classroom. Data collection included both qualitative and quantitative methods. Item analysis and whole-test analysis were determined from student responses to the conceptual test. As a result, most students held misconceptions about the restoring force, and they had problems connecting mathematical solutions to real motions, especially the phase angle. Moreover, they had problems interpreting mechanical energy from graphs and diagrams of the motion. These results were used to develop effective instructional materials to enhance students' understanding of simple harmonic motion in terms of multiple representations.
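The phase-angle difficulty noted above is exactly the step of mapping initial conditions onto x(t) = A·cos(ωt + φ). A short illustration (standard textbook physics, not part of the study's materials):

```python
import math

def shm_params(x0, v0, omega):
    """Amplitude and phase for x(t) = A*cos(omega*t + phi), given
    initial position x0 and initial velocity v0.
    From x(0) = A*cos(phi) and v(0) = -A*omega*sin(phi)."""
    A = math.sqrt(x0**2 + (v0 / omega)**2)
    phi = math.atan2(-v0 / omega, x0)
    return A, phi

# released from rest at x0 = 0.1 m with omega = 2 rad/s:
# phase is zero, amplitude equals the release position
A, phi = shm_params(0.1, 0.0, 2.0)
```

Starting the same oscillator from equilibrium with a positive velocity instead gives φ = -π/2, i.e. a sine rather than cosine motion, which is the connection students reportedly struggle to make.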
Expression of Fox-related genes in the skin follicles of Inner Mongolia cashmere goat.
Han, Wenjing; Li, Xiaoyan; Wang, Lele; Wang, Honghao; Yang, Kun; Wang, Zhixin; Wang, Ruijun; Su, Rui; Liu, Zhihong; Zhao, Yanhong; Zhang, Yanjun; Li, Jinquan
2018-03-01
This study investigated the expression of genes in cashmere goats at different periods of their fetal development. Bioinformatics analysis was used to evaluate data obtained by transcriptome sequencing of fetal skin samples collected from Inner Mongolia cashmere goats on days 45, 55, and 65 of fetal age. We found that the FoxN1, FoxE1, and FoxI3 genes of the Fox gene family were probably involved in the growth and development of the follicle and the formation of hair, which is consistent with previous findings. Real-time quantitative polymerase chain reaction and Western blot analysis were employed to study the differentially expressed genes FoxN1, FoxE1, and FoxI3 in the body skin of cashmere goat fetuses and adult individuals. This study provided new fundamental information for further investigation of the genes related to follicle development and exploration of their roles in hair follicle initiation, growth, and development.
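Relative expression comparisons of this kind are conventionally computed with the Livak 2^(-ΔΔCt) method. A minimal sketch; the Ct values below are hypothetical, and the abstract does not state which normalization scheme the authors used.

```python
def relative_expression(ct_target_sample, ct_ref_sample,
                        ct_target_calibrator, ct_ref_calibrator):
    """Livak 2^-ddCt relative quantification: normalize the target
    gene's Ct to a reference (housekeeping) gene in both the sample
    and the calibrator, then exponentiate the difference."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
    return 2.0 ** -(d_ct_sample - d_ct_calibrator)

# hypothetical Ct values: fetal skin (sample) vs. adult skin (calibrator),
# target gene vs. a housekeeping gene
fold_change = relative_expression(22.0, 18.0, 25.0, 18.0)
```

Here the target amplifies three cycles earlier relative to the reference in the sample than in the calibrator, so expression is 2³ = 8-fold higher.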
Wen, Jia-Long; Sun, Shao-Long; Xue, Bai-Liang; Sun, Run-Cang
2013-01-01
The demand for efficient utilization of biomass calls for a detailed analysis of the fundamental chemical structures of biomass, especially the complex structures of lignin polymers, which have long been recognized for their negative impact on biorefinery. Traditionally, researchers have attempted to reveal the complicated and heterogeneous structure of lignin by a series of chemical analyses, such as thioacidolysis (TA), nitrobenzene oxidation (NBO), and derivatization followed by reductive cleavage (DFRC). Recent advances in nuclear magnetic resonance (NMR) technology have made solution-state NMR the most widely used technique in structural characterization of lignin, due to its versatility in illustrating structural features and structural transformations of lignin polymers. As one of the most promising diagnostic tools, NMR provides unambiguous evidence for specific structures as well as quantitative structural information. The recent advances in two-dimensional solution-state NMR techniques for structural analysis of lignin in isolated and whole cell wall states (in situ), as well as their applications, are reviewed. PMID:28809313
Text Mining in Organizational Research
Kobayashi, Vladimer B.; Berkers, Hannah A.; Kismihók, Gábor; Den Hartog, Deanne N.
2017-01-01
Despite the ubiquity of textual data, so far few researchers have applied text mining to answer organizational research questions. Text mining, which essentially entails a quantitative approach to the analysis of (usually) voluminous textual data, helps accelerate knowledge discovery by radically increasing the amount of data that can be analyzed. This article aims to acquaint organizational researchers with the fundamental logic underpinning text mining, the analytical stages involved, and contemporary techniques that may be used to achieve different types of objectives. The specific analytical techniques reviewed are (a) dimensionality reduction, (b) distance and similarity computing, (c) clustering, (d) topic modeling, and (e) classification. We describe how text mining may extend contemporary organizational research by allowing the testing of existing or new research questions with data that are likely to be rich, contextualized, and ecologically valid. After an exploration of how evidence for the validity of text mining output may be generated, we conclude the article by illustrating the text mining process in a job analysis setting using a dataset composed of job vacancies. PMID:29881248
Text Mining in Organizational Research.
Kobayashi, Vladimer B; Mol, Stefan T; Berkers, Hannah A; Kismihók, Gábor; Den Hartog, Deanne N
2018-07-01
Despite the ubiquity of textual data, so far few researchers have applied text mining to answer organizational research questions. Text mining, which essentially entails a quantitative approach to the analysis of (usually) voluminous textual data, helps accelerate knowledge discovery by radically increasing the amount of data that can be analyzed. This article aims to acquaint organizational researchers with the fundamental logic underpinning text mining, the analytical stages involved, and contemporary techniques that may be used to achieve different types of objectives. The specific analytical techniques reviewed are (a) dimensionality reduction, (b) distance and similarity computing, (c) clustering, (d) topic modeling, and (e) classification. We describe how text mining may extend contemporary organizational research by allowing the testing of existing or new research questions with data that are likely to be rich, contextualized, and ecologically valid. After an exploration of how evidence for the validity of text mining output may be generated, we conclude the article by illustrating the text mining process in a job analysis setting using a dataset composed of job vacancies.
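Two of the reviewed techniques can be illustrated together: TF-IDF weighting (a simple form of feature extraction preceding dimensionality reduction) and cosine similarity (distance and similarity computing). The job-vacancy snippets below are hypothetical, not the article's dataset.

```python
import math
from collections import Counter

def tf_idf(docs):
    """Weight each term by term frequency times inverse document
    frequency, returning one sparse vector (dict) per document."""
    tokenized = [doc.lower().split() for doc in docs]
    n = len(tokenized)
    df = Counter(term for doc in tokenized for term in set(doc))
    vectors = []
    for doc in tokenized:
        tf = Counter(doc)
        vectors.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse TF-IDF vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

vacancies = ["data analyst sql python",
             "python data engineer",
             "registered nurse hospital"]
vecs = tf_idf(vacancies)
```

On this toy corpus the two data-related vacancies score as more similar to each other than either does to the nursing vacancy, which is the kind of structure clustering and topic modeling then exploit.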
Chan, Wei Ping; Wang, Jing-Yuan
2016-08-01
Recently, sludge has attracted great interest as a potential feedstock for thermochemical conversion processes. However, the compositions and thermal degradation behaviours of sludge are highly complex and distinctive compared with traditional feedstocks, which has led to a need for fundamental research on sludge. Comprehensive characterisation of sludge specifically for thermochemical conversion was carried out for all existing Water Reclamation Plants in Singapore. In total, 14 sludge samples were collected based on type, plant, and batch categorisation. Existing characterisation methods for physical and chemical properties were analysed and reviewed using the collected samples. Qualitative similarities and quantitative variations among the different sludge samples were identified and discussed. Oxidation of inorganics in sludge during ash-forming analysis was found to cause significant deviations in proximate and ultimate analysis. Therefore, alternative parameters and comparison bases, including Fixed Residues (FR), Inorganic Matters (IM) and Total Inorganics (TI), were proposed for a better understanding of the thermochemical characteristics of sludge. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Melick, H. C., Jr.; Ybarra, A. H.; Bencze, D. P.
1975-01-01
An inexpensive method is developed to determine the extreme values of instantaneous inlet distortion. This method also provides insight into the basic mechanics of unsteady inlet flow and the associated engine reaction. The analysis is based on fundamental fluid dynamics and statistical methods to provide an understanding of the turbulent inlet flow and quantitatively relate the rms level and power spectral density (PSD) function of the measured time variant total pressure fluctuations to the strength and size of the low pressure regions. The most probable extreme value of the instantaneous distortion is then synthesized from this information in conjunction with the steady state distortion. Results of the analysis show the extreme values to be dependent upon the steady state distortion, the measured turbulence rms level and PSD function, the time on point, and the engine response characteristics. Analytical projections of instantaneous distortion are presented and compared with data obtained by a conventional, highly time correlated, 40 probe instantaneous pressure measurement system.
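One statistical ingredient of such an analysis, the most probable extreme of a zero-mean Gaussian fluctuation observed for a finite time, can be sketched as follows. This is a textbook extreme-value estimate, not the paper's full vortex-model synthesis, and the sample count N ≈ 2BT is an assumed simplification.

```python
import math

def most_probable_extreme(rms, bandwidth_hz, time_on_point_s):
    """Most probable extreme of a zero-mean Gaussian fluctuation with
    the given rms level, observed for a finite time. The number of
    effectively independent samples is taken as N ~ 2*B*T, and the
    most probable extreme of N standard-normal draws is ~ sqrt(2 ln N)."""
    n = max(2.0 * bandwidth_hz * time_on_point_s, 2.0)
    return rms * math.sqrt(2.0 * math.log(n))

# hypothetical values: rms turbulence of 2% of dynamic pressure,
# 1 kHz effective bandwidth, 30 s time on point
peak = most_probable_extreme(0.02, 1000.0, 30.0)
```

The key qualitative behaviour matches the abstract: the expected extreme grows (slowly, as the square root of a logarithm) with time on point and bandwidth, so the same rms level implies larger instantaneous distortion extremes for longer records.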
Spot detection and image segmentation in DNA microarray data.
Qin, Li; Rueda, Luis; Ali, Adnan; Ngom, Alioune
2005-01-01
Following the invention of microarrays in 1994, the development and applications of this technology have grown exponentially. The numerous applications of microarray technology include clinical diagnosis and treatment, drug design and discovery, tumour detection, and environmental health research. One of the key issues in the experimental approaches utilising microarrays is to extract quantitative information from the spots, which represent genes in a given experiment. For this process, the initial stages are important and they influence future steps in the analysis. Identifying the spots and separating the background from the foreground is a fundamental problem in DNA microarray data analysis. In this review, we present an overview of state-of-the-art methods for microarray image segmentation. We discuss the foundations of the circle-shaped approach, adaptive shape segmentation, histogram-based methods and the recently introduced clustering-based techniques. We analytically show that clustering-based techniques are equivalent to the one-dimensional, standard k-means clustering algorithm that utilises the Euclidean distance.
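The equivalence noted in the last sentence can be made concrete: clustering-based segmentation reduces to one-dimensional, two-class k-means on pixel intensities, separating background from spot foreground. A minimal sketch on hypothetical pixel data:

```python
def kmeans_1d(intensities, iters=20):
    """Two-cluster, one-dimensional k-means: pixels are assigned to
    background or foreground (spot) by intensity alone, and the two
    centroids are re-estimated until convergence."""
    c_bg, c_fg = float(min(intensities)), float(max(intensities))
    for _ in range(iters):
        bg = [x for x in intensities if abs(x - c_bg) <= abs(x - c_fg)]
        fg = [x for x in intensities if abs(x - c_bg) > abs(x - c_fg)]
        if bg:
            c_bg = sum(bg) / len(bg)
        if fg:
            c_fg = sum(fg) / len(fg)
    return c_bg, c_fg

# hypothetical pixel intensities: dim background plus a bright spot
pixels = [10, 12, 11, 9, 13, 200, 210, 205, 198]
bg_center, fg_center = kmeans_1d(pixels)
```

With one intensity feature and the Euclidean (here absolute-difference) distance, assignment to the nearer centroid is exactly the thresholding step that histogram-based methods approximate.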
Tertiary structural propensities reveal fundamental sequence/structure relationships.
Zheng, Fan; Zhang, Jian; Grigoryan, Gevorg
2015-05-05
Extracting useful generalizations from the continually growing Protein Data Bank (PDB) is of central importance. We hypothesize that the PDB contains valuable quantitative information on the level of local tertiary structural motifs (TERMs). We show that by breaking a protein structure into its constituent TERMs, and querying the PDB to characterize the natural ensemble matching each, we can estimate the compatibility of the structure with a given amino acid sequence through a metric we term "structure score." Considering submissions from recent Critical Assessment of Structure Prediction (CASP) experiments, we found a strong correlation (R = 0.69) between structure score and model accuracy, with poorly predicted regions readily identifiable. This performance exceeds that of leading atomistic statistical energy functions. Furthermore, TERM-based analysis of two prototypical multi-state proteins rapidly produced structural insights fully consistent with prior extensive experimental studies. We thus find that TERM-based analysis should have considerable utility for protein structural biology. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ciccimaro, Eugene; Ranasinghe, Asoka; D'Arienzo, Celia; Xu, Carrie; Onorato, Joelle; Drexler, Dieter M; Josephs, Jonathan L; Poss, Michael; Olah, Timothy
2014-12-02
Due to the observed inefficiency of collision-induced dissociation (CID) fragmentation, developing sensitive liquid chromatography tandem mass spectrometry (LC-MS/MS) assays for CID-resistant compounds is especially challenging. As an alternative to traditional LC-MS/MS, we present here a methodology that preserves the intact analyte ion for quantification by selectively filtering ions while reducing chemical noise. Utilizing a quadrupole-Orbitrap MS, the target ion is selectively isolated while interfering matrix components undergo MS/MS fragmentation by CID, allowing noise-free detection of the analyte's surviving molecular ion. In this manner, CID affords additional selectivity during high resolution accurate mass analysis by elimination of isobaric interferences, a fundamentally different concept from the traditional approach of monitoring a target analyte's unique fragment following CID. This survivor-selected ion monitoring (survivor-SIM) approach has allowed sensitive and specific detection of disulfide-rich cyclic peptides extracted from plasma.
NASA Astrophysics Data System (ADS)
Crouch, Catherine H.; Heller, Kenneth
2014-05-01
We describe restructuring the introductory physics for life science students (IPLS) course to better support these students in using physics to understand their chosen fields. Our courses teach physics using biologically rich contexts. Specifically, we use examples in which fundamental physics contributes significantly to understanding a biological system to make explicit the value of physics to the life sciences. This requires selecting the course content to reflect the topics most relevant to biology while maintaining the fundamental disciplinary structure of physics. In addition to stressing the importance of the fundamental principles of physics, an important goal is developing students' quantitative and problem solving skills. Our guiding pedagogical framework is the cognitive apprenticeship model, in which learning occurs most effectively when students can articulate why what they are learning matters to them. In this article, we describe our courses, summarize initial assessment data, and identify needs for future research.
Fundamental Design Principles for Transcription-Factor-Based Metabolite Biosensors.
Mannan, Ahmad A; Liu, Di; Zhang, Fuzhong; Oyarzún, Diego A
2017-10-20
Metabolite biosensors are central to current efforts toward precision engineering of metabolism. Although most research has focused on building new biosensors, their tunability remains poorly understood and is fundamental for their broad applicability. Here we asked how genetic modifications shape the dose-response curve of biosensors based on metabolite-responsive transcription factors. Using the lac system in Escherichia coli as a model system, we built promoter libraries with variable operator sites that reveal interdependencies between biosensor dynamic range and response threshold. We developed a phenomenological theory to quantify such design constraints in biosensors with various architectures and tunable parameters. Our theory reveals a maximal achievable dynamic range and exposes tunable parameters for orthogonal control of dynamic range and response threshold. Our work sheds light on fundamental limits of synthetic biology designs and provides quantitative guidelines for biosensor design in applications such as dynamic pathway control, strain optimization, and real-time monitoring of metabolism.
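For a generic activator-type biosensor, the two tunable quantities discussed above can be read directly off a Hill-type dose-response curve. The sketch below is illustrative only: the paper derives mechanistic forms for specific promoter architectures, and the parameter values here are hypothetical.

```python
def dose_response(x, a0, a1, K, n):
    """Generic activator-type dose-response: output rises from the
    basal level a0 to the maximal level a0 + a1 as inducer x increases,
    with half-maximal induction at x = K and Hill coefficient n."""
    return a0 + a1 * x**n / (K**n + x**n)

def dynamic_range(a0, a1):
    """Fold change between fully induced and basal output."""
    return (a0 + a1) / a0

def response_threshold(K):
    """Inducer concentration giving half-maximal induction (EC50)."""
    return K

# hypothetical parameters: basal 10, induced span 990, EC50 = 50, n = 2
y_basal = dose_response(0.0, 10.0, 990.0, 50.0, 2.0)
y_half = dose_response(50.0, 10.0, 990.0, 50.0, 2.0)
y_max = dose_response(1e6, 10.0, 990.0, 50.0, 2.0)
```

The interdependency the paper explores shows up even here: any genetic modification that raises the basal level a0 (e.g. weakening an operator) shrinks the dynamic range (a0 + a1)/a0 while leaving the threshold K untouched only in this idealized form.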
Network Analysis of Earth's Co-Evolving Geosphere and Biosphere
NASA Astrophysics Data System (ADS)
Hazen, R. M.; Eleish, A.; Liu, C.; Morrison, S. M.; Meyer, M.; Consortium, K. D.
2017-12-01
A fundamental goal of Earth science is the deep understanding of Earth's dynamic, co-evolving geosphere and biosphere through deep time. Network analysis of geo- and bio- `big data' provides an interactive, quantitative, and predictive visualization framework to explore complex and otherwise hidden high-dimensional features of diversity, distribution, and change in the evolution of Earth's geochemistry, mineralogy, paleobiology, and biochemistry [1]. Networks also facilitate quantitative comparison of different geological time periods, tectonic settings, and geographical regions, as well as different planets and moons, through network metrics, including density, centralization, diameter, and transitivity. We render networks by employing data related to geographical, paragenetic, environmental, or structural relationships among minerals, fossils, proteins, and microbial taxa. An important recent finding is that the topology of many networks reflects parameters not explicitly incorporated in constructing the network. For example, networks for minerals, fossils, and protein structures reveal embedded qualitative time axes, with additional network geometries possibly related to extinction and/or other punctuation events (see Figure). Other axes related to chemical activities and volatile fugacities, as well as pressure and/or depth of formation, may also emerge from network analysis. These patterns provide new insights into the way planets evolve, especially Earth's co-evolving geosphere and biosphere. 1. Morrison, S.M. et al. (2017) Network analysis of mineralogical systems. American Mineralogist 102, in press. Figure Caption: A network of Phanerozoic Era fossil animals from the past 540 million years includes blue, red, and black circles (nodes) representing family-level taxa and grey lines (links) between coexisting families. Age information was not used in the construction of this network; nevertheless an intrinsic timeline is embedded in the network topology.
In addition, two mass extinction events appear as "pinch points" in the network.
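The network metrics named above have simple standard definitions that can be sketched on a toy coexistence network; the families and links below are hypothetical, not the authors' data.

```python
def density(nodes, edges):
    """Fraction of all possible undirected links that are present."""
    n = len(nodes)
    return 2 * len(edges) / (n * (n - 1)) if n > 1 else 0.0

def transitivity(nodes, edges):
    """Global clustering coefficient: closed triples / connected triples."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    closed = triples = 0
    for v in nodes:
        nbrs = sorted(adj[v])
        k = len(nbrs)
        triples += k * (k - 1) // 2
        for i in range(k):
            for j in range(i + 1, k):
                if nbrs[j] in adj[nbrs[i]]:
                    closed += 1  # each triangle is counted once per corner
    return closed / triples if triples else 0.0

# toy coexistence network of four fossil families
fam = ["A", "B", "C", "D"]
links = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D")]
```

Comparing these single numbers across time periods or tectonic settings is exactly the kind of quantitative comparison the abstract describes.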
NeuronMetrics: Software for Semi-Automated Processing of Cultured-Neuron Images
Narro, Martha L.; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L.
2007-01-01
Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics™ for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch-number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of ~60 2D images is 1.0–2.5 hours, from a folder of images to a table of numeric data. NeuronMetrics’ output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery. PMID:17270152
NeuronMetrics: software for semi-automated processing of cultured neuron images.
Narro, Martha L; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L
2007-03-23
Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of approximately 60 2D images is 1.0-2.5 h, from a folder of images to a table of numeric data. NeuronMetrics' output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery.
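Of the listed parameters, territory is the most self-contained to sketch: the area of the convex polygon bounding the skeleton and cell body, computed here with a standard monotone-chain hull plus the shoelace formula. NeuronMetrics' own implementation is not specified in the abstract, and the coordinates below are hypothetical.

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull of 2D points,
    returned in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def territory(points):
    """Area of the convex polygon bounding the skeleton pixels,
    via the shoelace formula applied to the hull vertices."""
    hull = convex_hull(points)
    area = 0.0
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# hypothetical skeleton coordinates: a unit square plus an interior point
skel = [(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)]
area = territory(skel)
```

Interior skeleton points do not change the result, which is the point of using the convex bound: territory measures the extent of the arbor, not its internal detail.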
Carbonate landscapes evolution: Insights from 36Cl
NASA Astrophysics Data System (ADS)
Godard, Vincent; Thomas, Franck; Ollivier, Vincent; Bellier, Olivier; Shabanian, Esmaeil; Miramont, Cécile; Fleury, Jules; Benedetti, Lucilla; Guillou, Valéry; Aster Team
2017-04-01
Carbonate landscapes cover a significant fraction of the Earth's surface, but their long-term dynamics are still poorly understood. This knowledge gap is particularly notable when compared with areas underlain by quartz-rich lithologies, where the routine use of 10Be-derived denudation rates has delivered fundamental insights into landscape evolution processes. Recent advances in the measurement of 36Cl and a better understanding of its production pathways have opened the way to a similarly physically-based and quantitative analysis of landscape evolution in carbonate settings. However, beyond these methodological considerations, we still face fundamental open geomorphological questions, such as the assessment of the importance of congruent carbonate dissolution in long-wavelength topographic evolution. Such unresolved problems concerning the relative importance of physical and chemical weathering processes lead us to question the applicability of standard slope-dependent Geomorphic Transport Laws in carbonate settings. These issues have been addressed by studying the geomorphological evolution of selected limestone ranges in Provence, SE France, where 36Cl concentration measurements in bedrock and stream sediment samples constrain denudation over a 10 ka timescale. We first identify a significant denudation contrast between the summit surface and the flanks of the ranges, pointing to a substantial contribution of gravity-driven processes to the landscape evolution, in addition to dissolution. Furthermore, a detailed analysis of the relationships between hillslope morphology and hilltop denudation allows us to identify a fundamental transition between two regimes: (1) a dynamics in which hillslope evolution is controlled by linear diffusive downslope regolith transport; and (2) a domain in which denudation is limited by the rate at which physical and chemical weathering processes can produce clasts and lower the hilltop.
Such an abrupt transition toward a weathering-limited dynamics may prevent hillslope denudation from balancing the rate of base level fall imposed by the river network and could potentially explain the development of high local relief observed in many Mediterranean carbonate landscapes.
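The basic inversion from nuclide concentration to denudation rate can be sketched under the standard steady-state, spallation-only approximation. The constants and sample values below are illustrative; the paper's full 36Cl scheme accounts for additional production pathways (muons, thermal neutron capture on Cl).

```python
def denudation_rate(C, P, lam=2.303e-6, att_len=160.0, rho=2.7):
    """Steady-state denudation rate (cm/yr) from a cosmogenic nuclide
    concentration C (atoms/g) and local surface production rate P
    (atoms/g/yr), inverting C = P / (lam + rho*eps/att_len).

    lam     : 36Cl decay constant, ln(2)/301 kyr half-life (1/yr)
    att_len : spallation attenuation length (g/cm^2)
    rho     : limestone density (g/cm^3)
    """
    return (P / C - lam) * att_len / rho

# hypothetical bedrock sample
eps_cm_yr = denudation_rate(C=2.5e5, P=30.0)
eps_mm_kyr = eps_cm_yr * 1.0e4  # convert cm/yr to mm/kyr
```

The inverse relationship is the core intuition: a higher concentration means the rock surface has resided longer in the production zone, i.e. slower denudation.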
NASA Astrophysics Data System (ADS)
Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh
2016-11-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
NASA Astrophysics Data System (ADS)
Poudyal, R.; Singh, M.; Gautam, R.; Gatebe, C. K.
2016-12-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR)- http://car.gsfc.nasa.gov/. Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
A meta-analysis of factors affecting trust in human-robot interaction.
Hancock, Peter A; Billings, Deborah R; Schaefer, Kristin E; Chen, Jessie Y C; de Visser, Ewart J; Parasuraman, Raja
2011-10-01
We evaluate and quantify the effects of human, robot, and environmental factors on perceived trust in human-robot interaction (HRI). To date, reviews of trust in HRI have been qualitative or descriptive. Our quantitative review provides a fundamental empirical foundation to advance both theory and practice. Meta-analytic methods were applied to the available literature on trust and HRI. A total of 29 empirical studies were collected, of which 10 met the selection criteria for correlational analysis and 11 for experimental analysis. These studies provided 69 correlational and 47 experimental effect sizes. The overall correlational effect size for trust was r = +0.26, with an experimental effect size of d = +0.71. The effects of human, robot, and environmental characteristics were examined with an especial evaluation of the robot dimensions of performance and attribute-based factors. The robot performance and attributes were the largest contributors to the development of trust in HRI. Environmental factors played only a moderate role. Factors related to the robot itself, specifically, its performance, had the greatest current association with trust, and environmental factors were moderately associated. There was little evidence for effects of human-related factors. The findings provide quantitative estimates of human, robot, and environmental factors influencing HRI trust. Specifically, the current summary provides effect size estimates that are useful in establishing design and training guidelines with reference to robot-related factors of HRI trust. Furthermore, results indicate that improper trust calibration may be mitigated by the manipulation of robot design. However, many future research needs are identified.
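A pooled correlational effect size like the r = +0.26 above is conventionally computed by inverse-variance weighting of Fisher-z-transformed study correlations. A sketch on hypothetical (r, n) pairs; the paper's moderator analyses go well beyond this.

```python
import math

def fisher_z(r):
    """Fisher's r-to-z transform, which normalizes the sampling
    distribution of a correlation coefficient."""
    return 0.5 * math.log((1 + r) / (1 - r))

def inv_fisher_z(z):
    """Back-transform from z to r (equals tanh(z))."""
    return (math.exp(2 * z) - 1) / (math.exp(2 * z) + 1)

def pooled_r(studies):
    """Fixed-effect pooled correlation from (r, n) pairs, weighting
    each study's z by its inverse variance, n - 3."""
    num = sum((n - 3) * fisher_z(r) for r, n in studies)
    den = sum(n - 3 for _, n in studies)
    return inv_fisher_z(num / den)

# hypothetical study results: (correlation, sample size)
r_bar = pooled_r([(0.30, 40), (0.20, 60), (0.28, 25)])
```

Working in z-space before averaging avoids the bias that comes from averaging bounded r values directly, which matters most when study correlations are large.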
NASA Technical Reports Server (NTRS)
Singh, Manoj K.; Gautam, Ritesh; Gatebe, Charles K.; Poudyal, Rajesh
2016-01-01
The Bidirectional Reflectance Distribution Function (BRDF) is a fundamental concept for characterizing the reflectance property of a surface, and helps in the analysis of remote sensing data from satellite, airborne and surface platforms. Multi-angular remote sensing measurements are required for the development and evaluation of BRDF models for improved characterization of surface properties. However, multi-angular data and the associated BRDF models are typically multidimensional involving multi-angular and multi-wavelength information. Effective visualization of such complex multidimensional measurements for different wavelength combinations is presently somewhat lacking in the literature, and could serve as a potentially useful research and teaching tool in aiding both interpretation and analysis of BRDF measurements. This article describes a newly developed software package in Python (PolarBRDF) to help visualize and analyze multi-angular data in polar and False Color Composite (FCC) forms. PolarBRDF also includes functionalities for computing important multi-angular reflectance/albedo parameters including spectral albedo, principal plane reflectance and spectral reflectance slope. Application of PolarBRDF is demonstrated using various case studies obtained from airborne multi-angular remote sensing measurements using NASA's Cloud Absorption Radiometer (CAR). Our visualization program also provides functionalities for untangling complex surface/atmosphere features embedded in pixel-based remote sensing measurements, such as the FCC imagery generation of BRDF measurements of grasslands in the presence of wildfire smoke and clouds. Furthermore, PolarBRDF also provides quantitative information of the angular distribution of scattered surface/atmosphere radiation, in the form of relevant BRDF variables such as sunglint, hotspot and scattering statistics.
Comparison of fundamental, second harmonic, and superharmonic imaging: a simulation study.
van Neer, Paul L M J; Danilouchkine, Mikhail G; Verweij, Martin D; Demi, Libertario; Voormolen, Marco M; van der Steen, Anton F W; de Jong, Nico
2011-11-01
In medical ultrasound, fundamental imaging (FI) uses the reflected echoes from the same spectral band as that of the emitted pulse. The transmission frequency determines the trade-off between penetration depth and spatial resolution. Tissue harmonic imaging (THI) employs the second harmonic of the emitted frequency band to construct images. Recently, superharmonic imaging (SHI) has been introduced, which uses the third to the fifth (super) harmonics. The harmonic level is determined by two competing phenomena: nonlinear propagation and frequency dependent attenuation. Thus, the transmission frequency yielding the optimal trade-off between the spatial resolution and the penetration depth differs for THI and SHI. This paper quantitatively compares the concepts of fundamental, second harmonic, and superharmonic echocardiography at their optimal transmission frequencies. Forward propagation is modeled using a 3D-KZK implementation and the iterative nonlinear contrast source (INCS) method. Backpropagation is assumed to be linear. Results show that the fundamental lateral beamwidth is the narrowest at focus, while the superharmonic one is narrower outside the focus. The lateral superharmonic roll-off exceeds the fundamental and second harmonic roll-off. Also, the axial resolution of SHI exceeds that of FI and THI. The far-field pulse-echo superharmonic pressure is lower than that of the fundamental and second harmonic. SHI appears suited for echocardiography and is expected to improve its image quality at the cost of a slight reduction in depth-of-field.
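The frequency-dependent attenuation underlying the THI/SHI trade-off can be made concrete with round-trip loss figures. The numbers below are generic assumptions (a typical soft-tissue attenuation coefficient and cardiac imaging depth), not values from the simulation study:

```python
ALPHA = 0.5   # dB/(cm*MHz), typical soft-tissue attenuation (assumed)
DEPTH = 12.0  # cm, plausible cardiac imaging depth (assumed)

def one_way_loss_db(f_mhz, depth_cm=DEPTH):
    """Attenuation grows linearly with both frequency and path length."""
    return ALPHA * f_mhz * depth_cm

f0 = 1.2  # MHz, an illustrative transmission frequency
# The echo travels out at f0; once the n-th harmonic is generated in tissue,
# it travels back at n * f0 and is attenuated accordingly.
losses = {n: one_way_loss_db(f0) + one_way_loss_db(n * f0) for n in (1, 2, 5)}
```

The rapidly growing loss at the higher harmonics is why the optimal transmission frequency for SHI is lower than for THI, and why the far-field superharmonic pressure is weaker.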
NASA Astrophysics Data System (ADS)
Neuland, M. B.; Grimaudo, V.; Mezger, K.; Moreno-García, P.; Riedo, A.; Tulej, M.; Wurz, P.
2016-03-01
A key interest of planetary space missions is the quantitative determination of the chemical composition of the planetary surface material. The chemical composition of surface material (minerals, rocks, soils) yields fundamental information that can be used to answer key scientific questions about the formation and evolution of the planetary body in particular and the Solar System in general. We present a miniature time-of-flight type laser ablation/ionization mass spectrometer (LMS) and demonstrate its capability in measuring the elemental and mineralogical composition of planetary surface samples quantitatively by using a femtosecond laser for ablation/ionization. The small size and weight of the LMS make it a remarkable tool for in situ chemical composition measurements in space research, convenient for operation on a lander or rover exploring a planetary surface. In the laboratory, we measured the chemical composition of four geological standard reference samples, USGS AGV-2 Andesite, USGS SCo-1 Cody Shale, NIST 97b Flint Clay and USGS QLO-1 Quartz Latite, with LMS. These standard samples are used to determine the sensitivity factors of the instrument. One important result is that all sensitivity factors are close to 1. Additionally, it is observed that the sensitivity factor of an element depends on its electron configuration, hence on the electron work function and the elemental group, in agreement with existing theory. Furthermore, the conformity of the sensitivity factors is supported by mineralogical analyses of the USGS SCo-1 and the NIST 97b samples. With the four different reference samples, the consistency of the calibration factors can be demonstrated, which constitutes the fundamental basis for a standardless measurement technique for in situ quantitative chemical composition measurements on planetary surfaces.
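Quantification with sensitivity factors reduces to dividing each measured intensity by its per-element factor and normalizing. A minimal sketch; the element symbols, intensities, and factor values are illustrative, not the paper's data:

```python
def atomic_fractions(intensities, sensitivity):
    """Convert measured ion intensities to atomic fractions by correcting each
    element with its sensitivity factor and normalizing to unity."""
    corrected = {el: i / sensitivity[el] for el, i in intensities.items()}
    total = sum(corrected.values())
    return {el: c / total for el, c in corrected.items()}

# Hypothetical intensities; sensitivity factors close to 1, as reported for LMS.
frac = atomic_fractions({"Si": 100.0, "Al": 30.0, "Fe": 10.0},
                        {"Si": 1.0, "Al": 1.1, "Fe": 0.9})
```

Sensitivity factors near 1 mean the raw intensity ratios are already close to the true composition, which is what makes a standardless technique plausible.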
NASA Astrophysics Data System (ADS)
Rausch, J.; Vonlanthen, P.; Grobety, B. H.
2014-12-01
The quantification of shape parameters in pyroclasts is fundamental to infer the dominant type of magma fragmentation (magmatic vs. phreatomagmatic), as well as the behavior of volcanic plumes and clouds in the atmosphere. In a case study aiming at reconstructing the fragmentation mechanisms triggering maar eruptions in two geologically and compositionally distinctive volcanic fields (West and East Eifel, Germany), the shapes of a large number of ash particle contours obtained from SEM images were analyzed by a dilation-based fractal method. Volcanic particle contours are pseudo-fractals showing mostly two distinct slopes in Richardson plots related to the fractal dimensions D1 (small-scale "textural" dimension) and D2 (large-scale "morphological" dimension). The validity of the data obtained from 2D sections was tested by analysing SEM micro-CT slices of one particle cut in different orientations and positions. Results for West Eifel maar particles yield large D1 values (> 1.023), resembling typical values of magmatic particles, which are characterized by a complex shape, especially at small scales. In contrast, the D1 values of ash particles from one East Eifel maar deposit are much smaller, coinciding with the fractal dimensions obtained from phreatomagmatic end-member particles. These quantitative morphological analyses suggest that the studied maar eruptions were triggered by two different fragmentation processes: phreatomagmatic in the East Eifel and magmatic in the West Eifel. The application of fractal analysis to quantitatively characterize the shape of pyroclasts and the linking of fractal dimensions to specific fragmentation processes has turned out to be a very promising tool for studying the fragmentation history of any volcanic eruption. The next step is to extend morphological analysis of volcanic particles to 3 dimensions. 
SEM micro-CT, already applied in this study, offers the required resolution, but is not suitable for the analysis of a large number of particles. Newly released nano-CT scanners, however, allow the simultaneous analysis of a statistically relevant number of particles (in the hundreds range). Preliminary results of a first trial will be presented.
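The dilation-based fractal method can be sketched as a Minkowski-sausage estimate: the area covered after dilating a contour by radius eps scales as eps**(2 - D), so the slope of the log-log (Richardson) plot yields D. The implementation below is an assumption-laden illustration (grid resolution, radii, and the unit-circle test contour are all invented), not the authors' algorithm:

```python
import numpy as np

def minkowski_dimension(points, radii, grid=256, extent=1.5):
    """Estimate the fractal dimension D of a 2D contour (given as sample
    points) from the scaling of its dilated area with dilation radius."""
    xs = np.linspace(-extent, extent, grid)
    X, Y = np.meshgrid(xs, xs)
    d = np.full(X.shape, np.inf)
    for px, py in points:                     # distance to nearest contour sample
        d = np.minimum(d, np.hypot(X - px, Y - py))
    cell = (xs[1] - xs[0]) ** 2
    areas = np.array([(d <= r).sum() * cell for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(areas), 1)
    return 2.0 - slope                        # area ~ eps**(2 - D)

# Sanity check on a smooth, non-fractal contour (unit circle): D should be ~1.
theta = np.linspace(0.0, 2.0 * np.pi, 600, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
D = minkowski_dimension(circle, np.geomspace(0.05, 0.25, 6))
```

For pseudo-fractal pyroclast contours, fitting the Richardson plot piecewise at small and large scales would give the two dimensions D1 and D2 discussed above.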
Flory-Stockmayer analysis on reprocessable polymer networks
NASA Astrophysics Data System (ADS)
Li, Lingqiao; Chen, Xi; Jin, Kailong; Torkelson, John
Reprocessable polymer networks can undergo structural rearrangement through dynamic chemistries under suitable conditions, making them a promising candidate for recyclable crosslinked materials, e.g. tires. Research in this field has focused on various chemistries; however, an essential physical theory explaining the relationship between the abundance of dynamic linkages and reprocessability has been lacking. Based on the classical Flory-Stockmayer analysis of network gelation, we developed a similar analysis of reprocessable polymer networks to quantitatively predict the critical condition for reprocessability. Our theory indicates that not all bonds need to be dynamic for the resulting network to be reprocessable: as long as there is no percolated permanent network in the system, the material can fully rearrange. To experimentally validate our theory, we used a thiol-epoxy network model system with various dynamic linkage compositions. The stress relaxation behavior of the resulting materials supports our theoretical prediction: only 50% of the linkages between crosslinks need to be dynamic for a tri-arm network to be reprocessable. This analysis therefore provides the first fundamental theoretical platform for designing and evaluating reprocessable polymer networks. We thank the McCormick Research Catalyst Award Fund and an ISEN cluster fellowship (L. L.) for funding support.
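The critical condition follows directly from the Flory-Stockmayer gel point: for crosslinks of functionality f, a permanent sub-network percolates when the permanent-bond fraction exceeds 1/(f - 1), so full rearrangeability requires a dynamic-bond fraction of at least (f - 2)/(f - 1). A one-function sketch of that arithmetic (illustrative, not the authors' code):

```python
def min_dynamic_fraction(f):
    """Minimum fraction of dynamic linkages for an f-functional network to
    remain reprocessable: permanent bonds must stay below the Flory-Stockmayer
    gel point 1/(f - 1), i.e. dynamic fraction >= (f - 2)/(f - 1)."""
    if f < 3:
        raise ValueError("crosslinks need functionality f >= 3")
    return (f - 2) / (f - 1)
```

For a tri-arm network (f = 3) this gives 0.5, matching the 50% threshold validated by the stress-relaxation experiments; higher-functionality networks require a larger dynamic fraction.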
A diagnostic analysis of the VVP single-doppler retrieval technique
NASA Technical Reports Server (NTRS)
Boccippio, Dennis J.
1995-01-01
A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
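The collinearity failure mode for small azimuthal sectors can be demonstrated with a condition-number diagnostic on a stripped-down design matrix. Only the leading (u, v, w) basis functions of a radial-velocity model are kept here; the sampling geometry is invented for illustration and is not the paper's dataset:

```python
import numpy as np

def design_matrix(az_deg, el_deg):
    """Basis functions of a minimal single-Doppler wind regression:
    Vr = u*sin(az)cos(el) + v*cos(az)cos(el) + w*sin(el)."""
    az, el = np.deg2rad(az_deg), np.deg2rad(el_deg)
    return np.column_stack([
        np.sin(az) * np.cos(el),   # u (eastward) basis
        np.cos(az) * np.cos(el),   # v (northward) basis
        np.sin(el),                # w (vertical) basis
    ])

el = np.full(72, 2.0)  # low-elevation scan, deg
cond_full = np.linalg.cond(design_matrix(np.linspace(0.0, 355.0, 72), el))
cond_sector = np.linalg.cond(design_matrix(np.linspace(0.0, 20.0, 72), el))
# Over a 20-degree sector, sin(az) and cos(az) are nearly affine in azimuth and
# cos(az) is nearly constant (collinear with the w column), so the condition
# number inflates and the regression loses robustness.
```

Large condition numbers translate into variance inflation and numerical instability, which is exactly why the abstract establishes minimum reasonable sector limits.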
Can NMR solve some significant challenges in metabolomics?
Gowda, G.A. Nagana; Raftery, Daniel
2015-01-01
The field of metabolomics continues to witness rapid growth driven by fundamental studies, methods development, and applications in a number of disciplines that include biomedical science, plant and nutrition sciences, drug development, energy and environmental sciences, toxicology, etc. NMR spectroscopy is one of the two most widely used analytical platforms in the metabolomics field, along with mass spectrometry (MS). NMR's excellent reproducibility and quantitative accuracy, its ability to identify structures of unknown metabolites, its capacity to generate metabolite profiles using intact biospecimens with no need for separation, and its capabilities for tracing metabolic pathways using isotope labeled substrates offer unique strengths for metabolomics applications. However, NMR's limited sensitivity and resolution continue to pose a major challenge and have restricted both the number and the quantitative accuracy of metabolites analyzed by NMR. Further, the analysis of highly complex biological samples has increased the demand for new methods with improved detection, better unknown identification, and more accurate quantitation of larger numbers of metabolites. Recent efforts have contributed significant improvements in these areas, and have thereby enhanced the pool of routinely quantifiable metabolites. Additionally, efforts focused on combining NMR and MS promise opportunities to exploit the combined strength of the two analytical platforms for direct comparison of the metabolite data, unknown identification and reliable biomarker discovery that continue to challenge the metabolomics field. This article presents our perspectives on the emerging trends in NMR-based metabolomics and NMR's continuing role in the field with an emphasis on recent and ongoing research from our laboratory. PMID:26476597
Bulk magnetic domain stability controls paleointensity fidelity
NASA Astrophysics Data System (ADS)
Paterson, Greig A.; Muxworthy, Adrian R.; Yamamoto, Yuhji; Pan, Yongxin
2017-12-01
Nonideal, nonsingle-domain magnetic grains are ubiquitous in rocks; however, they can have a detrimental impact on the fidelity of paleomagnetic records—in particular the determination of ancient magnetic field strength (paleointensity), a key means of understanding the evolution of the earliest geodynamo and the formation of the solar system. As a consequence, great effort has been expended to link rock magnetic behavior to paleointensity results, but with little quantitative success. Using the most comprehensive rock magnetic and paleointensity data compilations, we quantify a stability trend in hysteresis data that characterizes the bulk domain stability (BDS) of the magnetic carriers in a paleomagnetic specimen. This trend is evident in both geological and archeological materials that are typically used to obtain paleointensity data and is therefore pervasive throughout most paleomagnetic studies. Comparing this trend to paleointensity data from both laboratory and historical experiments reveals a quantitative relationship between BDS and paleointensity behavior. Specimens that have lower BDS values display higher curvature on the paleointensity analysis plot, which leads to more inaccurate results. In-field quantification of BDS therefore reflects low-field bulk remanence stability. Rapid hysteresis measurements can be used to provide a powerful quantitative method for preselecting paleointensity specimens and postanalyzing previous studies, further improving our ability to select high-fidelity recordings of ancient magnetic fields. BDS analyses will enhance our ability to understand the evolution of the geodynamo and can help in understanding many fundamental Earth and planetary science questions that remain shrouded in controversy.
Unpacking the Meaning of the Mole Concept for Secondary School Teachers and Students
ERIC Educational Resources Information Center
Fang, Su-Chi; Hart, Christina; Clarke, David
2014-01-01
The "mole" is a fundamental concept in quantitative chemistry, yet research has shown that the mole is one of the most perplexing concepts in the teaching and learning of chemistry. This paper provides a survey of the relevant literature, identifies the necessary components of a sound understanding of the mole concept, and unpacks and…
ERIC Educational Resources Information Center
Hite, Steven J.
Educational planners and policymakers are rarely able to base their decision-making on sound information and research, according to this book. Because the situation is even more difficult in developing countries, educational policy often is based on research conducted in other parts of the world. This book provides a practical framework that can…
ERIC Educational Resources Information Center
Awan, Mahmood A.
2010-01-01
With the emergence of the Internet and the World Wide Web, economic conditions and business practices have been fundamentally reshaped. The Internet is believed to promote the rapid internationalization of companies, particularly small and medium enterprises (SME). The purpose of this quantitative study was to examine the effect of Electronic…
A Test Method for Monitoring Modulus Changes during Durability Tests on Building Joint Sealants
Christopher C. White; Donald L. Hunston; Kar Tean Tan; Gregory T. Schueneman
2012-01-01
The durability of building joint sealants is generally assessed using a descriptive methodology involving visual inspection of exposed specimens for defects. It is widely known that this methodology has inherent limitations, including that the results are qualitative. A new test method is proposed that provides more fundamental and quantitative information about...
A simple model for pollen-parent fecundity distributions in bee-pollinated forage legume polycrosses
USDA-ARS?s Scientific Manuscript database
Random mating or panmixis is a fundamental assumption in quantitative genetic theory. Random mating is sometimes thought to occur in actual fact although a large body of empirical work shows that this is often not the case in nature. Models have been developed to model many non-random mating phenome...
USDA-ARS?s Scientific Manuscript database
Among the fundamental evolutionary forces, recombination arguably has the largest impact on the practical work of plant breeders. Varying over 1,000-fold across the maize genome, the local meiotic recombination rate limits the resolving power of quantitative trait mapping and the precision of favora...
Thermodynamic Cycle Analysis of Magnetohydrodynamic-Bypass Airbreathing Hypersonic Engines
NASA Technical Reports Server (NTRS)
Litchford, Ron J.; Bityurin, Valentine A.; Lineberry, John T.
1999-01-01
Established analyses of conventional ramjet/scramjet performance characteristics indicate that a considerable decrease in efficiency can be expected at off-design flight conditions. This can be explained, in large part, by the deterioration of intake mass flow and limited inlet compression at low flight speeds and by the onset of thrust degradation effects associated with increased burner entry temperature at high flight speeds. In combination, these effects tend to impose lower and upper Mach number limits for practical flight. It has been noted, however, that Magnetohydrodynamic (MHD) energy management techniques represent a possible means for extending the flight Mach number envelope of conventional engines. By transferring enthalpy between different stages of the engine cycle, it appears that the onset of thrust degradation may be delayed to higher flight speeds. Obviously, the introduction of additional process inefficiencies is inevitable with this approach, but it is believed that these losses are more than compensated through optimization of the combustion process. The fundamental idea is to use MHD energy conversion processes to extract and bypass a portion of the intake kinetic energy around the burner. We refer to this general class of propulsion system as an MHD-bypass engine. In this paper, we quantitatively assess the performance potential and scientific feasibility of MHD-bypass airbreathing hypersonic engines using ideal gasdynamics and fundamental thermodynamic principles.
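The enthalpy-bypass idea can be illustrated with ideal-gas stagnation relations: extracting a fraction of the intake stagnation enthalpy upstream of the burner lowers the burner entry temperature accordingly. All numbers below (flight condition, bypass fraction) are assumptions for illustration, not the paper's cycle analysis:

```python
# Back-of-the-envelope stagnation-temperature effect of MHD enthalpy bypass.
gamma = 1.4          # ratio of specific heats for air
T_static = 220.0     # K, high-altitude static temperature (assumed)
M = 10.0             # flight Mach number (assumed)

# Freestream stagnation temperature: T0 = T * (1 + (gamma-1)/2 * M^2)
T0 = T_static * (1.0 + 0.5 * (gamma - 1.0) * M**2)

bypass = 0.20        # fraction of stagnation enthalpy extracted by the MHD
                     # generator upstream of the burner (assumed)
T0_burner = (1.0 - bypass) * T0
# The extracted enthalpy would be re-injected downstream (e.g. by MHD
# acceleration of the exhaust), trading process losses for a cooler burner.
```

Lowering the burner entry temperature in this way is what delays the onset of thrust degradation to higher flight Mach numbers.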
Coded excitation speeds up the detection of the fundamental flexural guided wave in coated tubes
NASA Astrophysics Data System (ADS)
Song, Xiaojun; Moilanen, Petro; Zhao, Zuomin; Ta, Dean; Pirhonen, Jalmari; Salmi, Ari; Hæggström, Edward; Myllylä, Risto; Timonen, Jussi; Wang, Weiqi
2016-09-01
The fundamental flexural guided wave (FFGW) permits ultrasonic assessment of the wall thickness of solid waveguides, such as tubes or, e.g., long cortical bones. Recently, an optical non-contact method was proposed for ultrasound excitation and detection with the aim of facilitating the FFGW reception by suppressing the interfering modes from the soft coating. This technique suffers from low SNR and requires iterative physical scanning across the source-receiver distance for 2D-FFT analysis. This means that SNR improvement achieved by temporal averaging becomes time-consuming (several minutes) which reduces the applicability of the technique, especially in time-critical applications such as clinical quantitative ultrasound. To achieve sufficient SNR faster, an ultrasonic excitation by a base-sequence-modulated Golay code (BSGC, 64-bit code pair) on coated tube samples (1-5 mm wall thickness and 5 mm soft coating layer) was used. This approach improved SNR by 21 dB and speeded up the measurement by a factor of 100 compared to using a classical pulse excitation with temporal averaging. The measurement now took seconds instead of minutes, while the ability to determine the wall thickness of the phantoms was maintained. The technique thus allows rapid noncontacting assessment of the wall thickness in coated solid tubes, such as the human bone.
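The SNR gain of coded excitation rests on the defining property of complementary (Golay) code pairs: the sidelobes of their autocorrelations cancel exactly, leaving a single peak of height 2N at zero lag. A sketch using the standard concatenation construction (this is the generic property, not the authors' BSGC implementation):

```python
import numpy as np

def golay_pair(n_bits):
    """Build a complementary Golay pair of length n_bits (a power of 2) by the
    recursive rule: if (a, b) is complementary, so is (a|b, a|-b)."""
    a, b = np.array([1.0]), np.array([1.0])
    while len(a) < n_bits:
        a, b = np.concatenate([a, b]), np.concatenate([a, -b])
    return a, b

def acorr(x):
    return np.correlate(x, x, mode="full")

a, b = golay_pair(64)          # 64-bit pair, as in the BSGC excitation
s = acorr(a) + acorr(b)        # sidelobes cancel; only the zero lag (2N) remains
```

Transmitting both codes and summing the matched-filter outputs therefore compresses the long excitation into a sharp pulse, which is how the long coded burst buys SNR without sacrificing resolution.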
NASA Astrophysics Data System (ADS)
Rosero-Zambrano, Carlos Andrés; Avila, Alba; Osorio, Luz Adriana; Aguirre, Sandra
2018-04-01
The coupling of traditional classroom instruction with a virtual learning environment (VLE) in an engineering course is critical to stimulating the learning process and to encouraging students to develop competencies outside the classroom. This can be achieved through planned activities and the use of information and communication technologies (ICTs), resources designed to complement students' autonomous learning needs. A quantitative analysis of students' academic performance, using final course grades, was performed for a fundamentals of electronics course, and students' perception of their autonomy was examined using surveys. The students' progress and attitudes were monitored over four consecutive semesters: the first was devoted to the design of the intervention and the following three to its implementation. The strategy was focused on the development of course competencies through autonomous learning with ICT tools presented in the VLE. Findings indicate that the students who completed the activities in the VLE showed an increase in performance scores in comparison with students who did not. The strategy used in this study, which enhanced perceived autonomy, was associated with a positive effect on the students' learning process. This research shows that a technology-enhanced course supported by ICT activities can both improve academic performance and foster autonomy in students.
NASA Astrophysics Data System (ADS)
Stampanoni, M.; Reichold, J.; Weber, B.; Haberthür, D.; Schittny, J.; Eller, J.; Büchi, F. N.; Marone, F.
2010-09-01
Nowadays, thanks to the high brilliance available at modern, third-generation synchrotron facilities and recent developments in detector technology, it is possible to record volumetric information at the micrometer scale within a few minutes. High signal-to-noise ratio, quantitative information on very complex structures like the brain microvessel architecture, lung airways or fuel cells can be obtained thanks to the combination of dedicated sample preparation protocols, in-situ acquisition schemes and cutting-edge imaging analysis instruments. In this work we report on recent experiments carried out at the TOMCAT beamline of the Swiss Light Source [1], where synchrotron-based tomographic microscopy has been successfully used to obtain fundamental information on preliminary models for cerebral fluid flow [2], to provide an accurate mesh for 3D finite-element simulation of the alveolar structure of the pulmonary acinus [3], and to investigate the complex functional mechanism of fuel cells [4]. Further, we introduce preliminary results on the combination of absorption and phase contrast microscopy for the visualization of high-Z nanoparticles in soft tissues, fundamental information when designing modern drug delivery systems [5]. As an outlook, we briefly discuss the new possibilities offered by high-sensitivity, high-resolution grating interferometry as well as Zernike phase contrast nanotomography [6].
Social in, social out: How the brain responds to social language with more social language
O’Donnell, Matthew Brook; Falk, Emily B.; Lieberman, Matthew D.
2014-01-01
Social connection is a fundamental human need. As such, people’s brains are sensitized to social cues, such as those carried by language, and to promoting social communication. The neural mechanisms of certain key building blocks in this process, such as receptivity to and reproduction of social language, however, are not known. We combined quantitative linguistic analysis and neuroimaging to connect neural activity in brain regions used to simulate the mental states of others with exposure to, and re-transmission of, social language. Our results link findings on successful idea transmission from communication science, sociolinguistics and cognitive neuroscience to prospectively predict the degree of social language that participants utilize when re-transmitting ideas as a function of 1) initial language inputs and 2) neural activity during idea exposure. PMID:27642220
On agent-based modeling and computational social science.
Conte, Rosaria; Paolucci, Mario
2014-01-01
In the first part of the paper, the field of agent-based modeling (ABM) is discussed focusing on the role of generative theories, aiming at explaining phenomena by growing them. After a brief analysis of the major strengths of the field some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and shadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest for Computational Social Science (CSS) is focused upon, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM, reconciling it with the quantitative one, is proposed as a fundamental requirement for a new program of the CSS.
Daepp, Madeleine I. G.; Hamilton, Marcus J.; West, Geoffrey B.; Bettencourt, Luís M. A.
2015-01-01
The firm is a fundamental economic unit of contemporary human societies. Studies on the general quantitative and statistical character of firms have produced mixed results regarding their lifespans and mortality. We examine a comprehensive database of more than 25 000 publicly traded North American companies, from 1950 to 2009, to derive the statistics of firm lifespans. Based on detailed survival analysis, we show that the mortality of publicly traded companies manifests an approximately constant hazard rate over long periods of observation. This regularity indicates that mortality rates are independent of a company's age. We show that the typical half-life of a publicly traded company is about a decade, regardless of business sector. Our results shed new light on the dynamics of births and deaths of publicly traded companies and identify some of the necessary ingredients of a general theory of firms. PMID:25833247
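A constant hazard rate implies exponential survival, S(t) = exp(-lambda*t), with half-life t_half = ln(2)/lambda. A quick simulation of the ~10-year half-life reported above (the hazard value is chosen to match that figure; the simulated firms are synthetic, not the database):

```python
import math
import random

random.seed(0)
hazard = math.log(2) / 10.0                 # constant hazard giving a 10-year half-life
lifespans = [random.expovariate(hazard) for _ in range(100_000)]

# Age-independence of mortality: the fraction of firms outliving the half-life
# should be ~0.5 regardless of when you start counting.
surviving_10y = sum(t > 10.0 for t in lifespans) / len(lifespans)
```

The memoryless property of the exponential distribution is precisely the claim that a company's mortality rate is independent of its age.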
Hannemann, S; van Duijn, E-J; Ubachs, W
2007-10-01
A narrow-band tunable injection-seeded pulsed titanium:sapphire laser system has been developed for application in high-resolution spectroscopic studies at the fundamental wavelengths in the near infrared as well as in the ultraviolet, deep ultraviolet, and extreme ultraviolet after upconversion. Special focus is on the quantitative assessment of the frequency characteristics of the oscillator-amplifier system on a pulse-to-pulse basis. Frequency offsets between continuous-wave seed light and the pulsed output are measured as well as linear chirps attributed mainly to mode pulling effects in the oscillator cavity. Operational conditions of the laser are found in which these offset and chirp effects are minimal. Absolute frequency calibration at the megahertz level of accuracy is demonstrated on various atomic and molecular resonance lines.
Magnetic Propulsion of Microswimmers with DNA-Based Flagellar Bundles.
Maier, Alexander M; Weig, Cornelius; Oswald, Peter; Frey, Erwin; Fischer, Peer; Liedl, Tim
2016-02-10
We show that DNA-based self-assembly can serve as a general and flexible tool to construct artificial flagella of several micrometers in length and only tens of nanometers in diameter. By attaching the DNA flagella to biocompatible magnetic microparticles, we provide a proof of concept demonstration of hybrid structures that, when rotated in an external magnetic field, propel by means of a flagellar bundle, similar to self-propelling peritrichous bacteria. Our theoretical analysis predicts that flagellar bundles that possess a length-dependent bending stiffness should exhibit a superior swimming speed compared to swimmers with a single appendage. The DNA self-assembly method permits the realization of these improved flagellar bundles in good agreement with our quantitative model. DNA flagella with well-controlled shape could fundamentally increase the functionality of fully biocompatible nanorobots and extend the scope and complexity of active materials.
A mechanistic model of tau amyloid aggregation based on direct observation of oligomers
NASA Astrophysics Data System (ADS)
Shammas, Sarah L.; Garcia, Gonzalo A.; Kumar, Satish; Kjaergaard, Magnus; Horrocks, Mathew H.; Shivji, Nadia; Mandelkow, Eva; Knowles, Tuomas P. J.; Mandelkow, Eckhard; Klenerman, David
2015-04-01
Protein aggregation plays a key role in neurodegenerative disease, giving rise to small oligomers that may become cytotoxic to cells. The fundamental microscopic reactions taking place during aggregation, and their rate constants, have been difficult to determine due to lack of suitable methods to identify and follow the low concentration of oligomers over time. Here we use single-molecule fluorescence to study the aggregation of the repeat domain of tau (K18), and two mutant forms linked with familial frontotemporal dementia, the deletion mutant ΔK280 and the point mutant P301L. Our kinetic analysis reveals that aggregation proceeds via monomeric assembly into small oligomers, and a subsequent slow structural conversion step before fibril formation. Using this approach, we have been able to quantitatively determine how these mutations alter the aggregation energy landscape.
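The two-step mechanism described (monomer assembly into oligomers, then a slow structural conversion) can be caricatured as a pair of first-order rate equations. The scheme, rate constants, and time grid below are illustrative assumptions for exposition, not the fitted values from the study:

```python
# Minimal two-step aggregation sketch:
#   monomer M --k1--> oligomer O --k2--> converted (fibril-competent) C
# integrated with forward Euler; k1, k2 and the grid are assumed.
k1, k2 = 0.10, 0.01      # per hour (illustrative, k2 << k1 for slow conversion)
dt, t_end = 0.01, 200.0  # hours

M, O, C = 1.0, 0.0, 0.0  # normalised concentrations
t = 0.0
while t < t_end:
    dM = -k1 * M
    dO = k1 * M - k2 * O
    dC = k2 * O
    M += dM * dt
    O += dO * dt
    C += dC * dt
    t += dt

print(f"M={M:.3f}  O={O:.3f}  C={C:.3f}  total={M + O + C:.3f}")
```

Because conversion is slow (k2 ≪ k1), oligomers accumulate only transiently and at low levels before draining into the converted state, which is why such intermediates are hard to follow without single-molecule methods.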
Ancient Cosmology, superfine structure of the Universe and Anthropological Principle
NASA Astrophysics Data System (ADS)
Arakelyan, Hrant; Vardanyan, Susan
2015-07-01
Modern cosmology, in the spirit of its Big Bang conception, is closer to ancient cosmology than to the cosmological paradigm of the nineteenth century. Repeating the speculations of the ancients, but using subtle mathematical methods and relying on steadily accumulating empirical material, modern theory tends toward a quantitative description of nature in which the numerical ratios between the physical constants play an increasing role. Detailed analysis of the influence of the numerical values of physical quantities on the physical state of the universe has revealed striking relations known as fine and hyperfine tuning. To explain why the observable universe corresponds to a certain set of interrelated fundamental parameters, the essentially speculative anthropic principle was proposed, which centers on the fact of the existence of sentient beings.
On agent-based modeling and computational social science
Conte, Rosaria; Paolucci, Mario
2014-01-01
In the first part of the paper, the field of agent-based modeling (ABM) is discussed, focusing on the role of generative theories, which aim to explain phenomena by growing them. After a brief analysis of the major strengths of the field, some crucial weaknesses are analyzed. In particular, the generative power of ABM is found to have been underexploited, as the pressure for simple recipes has prevailed and overshadowed the application of rich cognitive models. In the second part of the paper, the renewal of interest in Computational Social Science (CSS) is examined, and several of its variants, such as deductive, generative, and complex CSS, are identified and described. In the concluding remarks, an interdisciplinary variant, which takes after ABM while reconciling it with the quantitative variants, is proposed as a fundamental requirement for a new program of the CSS. PMID:25071642
Pyroxene Spectroscopy: Effects of Major Element Composition on Near, Mid and Far-Infrared Spectra
NASA Technical Reports Server (NTRS)
Klima, R. L.; Pieters, C. M.; Dyar, M. D.
2005-01-01
Pyroxene is one of the most common minerals in both evolved and undifferentiated solid bodies of the solar system. Various compositions of pyroxene have been directly studied in meteorites and lunar samples and remotely observed by telescopic and orbital measurements of the moon, Mars, Mercury, and several classes of asteroids. Laboratory studies of pyroxene spectra have shown that absorption features diagnostic of pyroxene in both the near and mid infrared are composition dependent. The challenge for remote analyses has been to reduce the level of ambiguity to allow a quantitative assessment of mineral chemistry. This study focuses on the analysis of a comprehensive set of synthetic Ca-Fe-Mg pyroxenes from the visible through far-IR (0.3-50 μm) to address the fundamental constraints of crystal structure on absorption.
Fundamental study on non-invasive blood glucose sensing.
Xu, K; Li, Q; Lu, Z; Jiang, J
2002-01-01
Diabetes is a disease that severely threatens human health. Unfortunately, current finger-stick monitoring techniques discourage regular use. Noninvasive spectroscopic measurement of blood glucose is a simple and painless technique that, because it requires no reagents, reduces the long-term health-care costs of diabetic patients; it is suitable for home use. Moreover, the methodology established here applies not only to noninvasive blood glucose measurement but can also be extended to noninvasive measurement of other analytes in body fluids, which is of real significance for the development of clinical analysis techniques. In this paper, some fundamental research achieved in our laboratory in the field of noninvasive blood glucose measurement is introduced. 1. Fundamental research was carried out on glucose concentrations in samples ranging from simple to complex with near- and mid-infrared spectroscopy: (1) the relationship between instrument precision and the prediction accuracy of the glucose measurement; (2) how the quantitative measurement results change as sample complexity increases; (3) attempts to increase the prediction accuracy of the glucose measurement by improving the modeling methods. The results showed that noninvasive blood glucose measurement with near- and mid-infrared spectroscopy is feasible in theory, and experiments on samples from simple to complex demonstrated that the methodology, comprising both hardware and software, is effective. 2. Given the characteristics of human-body measurement, the effects of measuring conditions on the results were investigated: (1) the effect of measurement position; (2) the effect of measurement pressure; (3) the effect of measurement site; (4) the effect of the measured individual. Through this fundamental research, the special problems of human-body measurement were solved. In addition, a practical and effective method for noninvasive human blood glucose measurement was proposed.
Fundamentals and Recent Developments in Approximate Bayesian Computation
Lintusaari, Jarno; Gutmann, Michael U.; Dutta, Ritabrata; Kaski, Samuel; Corander, Jukka
2017-01-01
Bayesian inference plays an important role in phylogenetics, evolutionary biology, and in many other branches of science. It provides a principled framework for dealing with uncertainty and quantifying how it changes in the light of new evidence. For many complex models and inference problems, however, only approximate quantitative answers are obtainable. Approximate Bayesian computation (ABC) refers to a family of algorithms for approximate inference that makes a minimal set of assumptions by only requiring that sampling from a model is possible. We explain here the fundamentals of ABC, review the classical algorithms, and highlight recent developments. [ABC; approximate Bayesian computation; Bayesian inference; likelihood-free inference; phylogenetics; simulator-based models; stochastic simulation models; tree-based models.] PMID:28175922
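The "minimal assumptions" of ABC can be made concrete with the classical rejection algorithm: sample parameters from the prior, simulate data, and keep parameters whose simulated summary statistic lands within ε of the observed one. A sketch under assumed toy settings (a normal model with known σ; the prior, ε and sample sizes are illustrative):

```python
import random
import statistics

random.seed(0)

SIGMA, N_DATA = 1.0, 50
true_mu = 3.0
observed = [random.gauss(true_mu, SIGMA) for _ in range(N_DATA)]
s_obs = statistics.mean(observed)           # summary statistic

def simulate(mu):
    """Forward model: sampling from it is the only capability ABC requires."""
    return statistics.mean(random.gauss(mu, SIGMA) for _ in range(N_DATA))

# Rejection ABC: prior U(-10, 10); accept mu if |s_sim - s_obs| < eps.
eps, accepted = 0.1, []
for _ in range(20_000):
    mu = random.uniform(-10, 10)
    if abs(simulate(mu) - s_obs) < eps:
        accepted.append(mu)

posterior_mean = statistics.mean(accepted)
print(f"{len(accepted)} accepted; posterior mean ≈ {posterior_mean:.2f}")
```

No likelihood is ever evaluated; shrinking ε trades acceptance rate for approximation quality, which is the central tuning problem the classical algorithms and their refinements address.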
Silk, Daniel; Kirk, Paul D W; Barnes, Chris P; Toni, Tina; Rose, Anna; Moon, Simon; Dallman, Margaret J; Stumpf, Michael P H
2011-10-04
Chaos and oscillations continue to capture the interest of both the scientific and public domains. Yet despite the importance of these qualitative features, most attempts at constructing mathematical models of such phenomena have taken an indirect, quantitative approach, for example, by fitting models to a finite number of data points. Here we develop a qualitative inference framework that allows us to both reverse-engineer and design systems exhibiting these and other dynamical behaviours by directly specifying the desired characteristics of the underlying dynamical attractor. This change in perspective from quantitative to qualitative dynamics, provides fundamental and new insights into the properties of dynamical systems.
Bortolussi, Silva; Ciani, Laura; Postuma, Ian; Protti, Nicoletta; Luca Reversi; Bruschi, Piero; Ferrari, Cinzia; Cansolino, Laura; Panza, Luigi; Ristori, Sandra; Altieri, Saverio
2014-06-01
The possibility to measure boron concentration with high precision in tissues that will be irradiated represents a fundamental step for a safe and effective BNCT treatment. In Pavia, two techniques have been used for this purpose, a quantitative method based on charged particles spectrometry and a boron biodistribution imaging based on neutron autoradiography. A quantitative method to determine boron concentration by neutron autoradiography has been recently set-up and calibrated for the measurement of biological samples, both solid and liquid, in the frame of the feasibility study of BNCT. This technique was calibrated and the obtained results were cross checked with those of α spectrometry, in order to validate them. The comparisons were performed using tissues taken form animals treated with different boron administration protocols. Subsequently the quantitative neutron autoradiography was employed to measure osteosarcoma cell samples treated with BPA and with new boronated formulations. © 2013 Published by Elsevier Ltd.
Fundamental quantitative security in quantum key generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuen, Horace P.
2010-12-15
We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K when the attacker just gets at K before it is used in a cryptographic context and its composition security when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.
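The two classical criteria in question are easy to state operationally: for a key drawn from the attacker's distribution p over N values, the statistical (variational) distance to uniform is d = ½ Σ|p(k) − 1/N|, and her single-shot probability of identifying the whole key is max_k p(k), bounded by 1/N + d. A minimal sketch (the example distribution is an illustrative assumption, not from any protocol):

```python
def variational_distance(p):
    """d(p, uniform) = 1/2 * sum |p(k) - 1/N| over the key space."""
    n = len(p)
    return 0.5 * sum(abs(pk - 1.0 / n) for pk in p)

def max_guess_prob(p):
    """Attacker's best single-shot probability of identifying the whole key."""
    return max(p)

# A skewed 4-value key distribution (illustrative).
p = [0.5, 0.3, 0.1, 0.1]
d = variational_distance(p)
print(f"d = {d:.3f}, p_guess = {max_guess_prob(p):.3f}, "
      f"bound = {1 / len(p) + d:.3f}")
```

A small d does bound the guessing probability; the abstract's point is that the operational meaning of that bound, especially once the key is composed into further cryptographic use, is what is being questioned.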
Sayles, Jesse S; Baggio, Jacopo A
2017-01-15
Governance silos are settings in which different organizations work in isolation and avoid sharing information and strategies. Silos are a fundamental challenge for environmental planning and problem solving, which generally require collaboration. Silos can be overcome by creating governance networks. Studying the structure and function of these networks is important for understanding how to create institutional arrangements that can respond to the biophysical dynamics of a specific natural resource system (i.e., social-ecological, or institutional, fit). Using the case of salmon restoration in a sub-basin of Puget Sound, USA, we assess network integration, considering three different reasons for network collaborations (i.e., mandated, funded, and shared-interest relationships), and analyze how these different collaboration types relate to productivity based on practitioners' assessments. We also illustrate how specific and targeted network interventions might enhance the network. To do so, we use a mixed-methods approach that combines quantitative social network analysis (SNA) and qualitative interview analysis. Overall, the sub-basin's governance network is fairly well integrated, but several concerning gaps exist. Funded, mandated, and shared-interest relationships lead to different network patterns. Mandated relationships are associated with lower productivity than shared-interest relationships, highlighting the benefit of genuine collaboration in collaborative watershed governance. Lastly, quantitative and qualitative data comparisons strengthen recent calls to incorporate geographic space and the role of individual actors versus organizational culture into natural resource governance research using SNA. Copyright © 2016 Elsevier Ltd. All rights reserved.
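Integration measures of the kind used in such an SNA can be computed from nothing more than an edge list: density quantifies overall connectedness, and connected components expose structural silos. A stdlib-only sketch on a made-up collaboration network (the organizations and ties are invented for illustration):

```python
from collections import defaultdict

# Hypothetical governance ties (invented organizations).
edges = [("county", "tribe"), ("tribe", "ngo"), ("ngo", "county"),
         ("state", "funder")]
nodes = {"county", "tribe", "ngo", "state", "funder", "watershed_group"}

n, m = len(nodes), len(edges)
density = 2 * m / (n * (n - 1))  # fraction of possible undirected ties present

# Connected components via BFS: separate components = structural silos.
adj = defaultdict(set)
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

seen, components = set(), []
for start in sorted(nodes):
    if start in seen:
        continue
    comp, queue = set(), [start]
    while queue:
        v = queue.pop()
        if v in comp:
            continue
        comp.add(v)
        queue.extend(adj[v] - comp)
    seen |= comp
    components.append(comp)

isolates = [sorted(c)[0] for c in components if len(c) == 1]
print(f"density = {density:.2f}; {len(components)} components; "
      f"isolates: {isolates}")
```

A "targeted network intervention" in these terms is an edge added between components: here, a single new tie touching the isolate would cut the component count and close one silo gap.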
Neural network of cognitive emotion regulation — An ALE meta-analysis and MACM analysis
Kohn, N.; Eickhoff, S.B.; Scheller, M.; Laird, A.R.; Fox, P.T.; Habel, U.
2016-01-01
Cognitive regulation of emotions is a fundamental prerequisite for intact social functioning which bears on both well-being and psychopathology. The neural underpinnings of this process have been studied intensively in recent years, without, however, a general consensus. Here we quantitatively summarize the published literature on cognitive emotion regulation using activation likelihood estimation in fMRI and PET (23 studies/479 subjects). In addition, we assessed the particular functional contribution of identified regions and their interactions using quantitative functional inference and meta-analytic connectivity modeling, respectively. In doing so, we developed a model for the core brain network involved in emotion regulation of emotional reactivity. According to this, the superior temporal gyrus, angular gyrus and (pre-)supplementary motor area should be involved in execution of regulation initiated by frontal areas. The dorsolateral prefrontal cortex may be related to regulation of cognitive processes such as attention, while the ventrolateral prefrontal cortex may not necessarily reflect the regulatory process per se, but signals salience and therefore the need to regulate. We also identified a cluster in the anterior middle cingulate cortex as a region, which is anatomically and functionally in an ideal position to influence behavior and subcortical structures related to affect generation. Hence, this area may play a central, integrative role in emotion regulation. By focusing on regions commonly active across multiple studies, this proposed model should provide important a priori information for the assessment of dysregulated emotion regulation in psychiatric disorders. PMID:24220041
Takahashi, Yoichiro; Kubo, Rieko; Sano, Rie; Nakajima, Tamiko; Takahashi, Keiko; Kobayashi, Momoko; Handa, Hiroshi; Tsukada, Junichi; Kominato, Yoshihiko
2017-03-01
The ABO system is of fundamental importance in the fields of transfusion and transplantation and has apparent associations with certain diseases, including cardiovascular disorders. ABO expression is reduced in the late phase of erythroid differentiation in vitro, whereas histone deacetylase inhibitors (HDACIs) are known to promote cell differentiation. Therefore, whether or not HDACIs could reduce the amount of ABO transcripts and A or B antigens is an intriguing issue. Quantitative polymerase chain reactions were carried out for the ABO transcripts in erythroid-lineage K562 and epithelial-lineage KATOIII cells after incubation with HDACIs, such as sodium butyrate, panobinostat, vorinostat, and sodium valproate. Flow cytometric analysis was conducted to evaluate the amounts of antigen in KATOIII cells treated with panobinostat. Quantitative chromatin immunoprecipitation (ChIP) assays and luciferase assays were performed on both cell types to examine the mechanisms of ABO suppression. HDACIs reduced the ABO transcripts in both K562 and KATOIII cells, with panobinostat exerting the most significant effect. Flow cytometric analysis demonstrated a decrease in B-antigen expression on panobinostat-treated KATOIII cells. ChIP assays indicated that panobinostat altered the modification of histones in the transcriptional regulatory regions of ABO, and luciferase assays demonstrated reduced activity of these elements. ABO transcription seems to be regulated by an epigenetic mechanism. Panobinostat appears to suppress ABO transcription, reducing the amount of antigens on the surface of cultured cells. © 2016 AABB.
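The quantitative PCR step described here is typically reported as relative expression via the ΔΔCt method: normalize the target's Ct to a reference gene, compare treated versus control, and convert to fold change as 2^(−ΔΔCt). A sketch with invented Ct values (and assuming roughly 100% amplification efficiency):

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression by the 2^(-ddCt) method."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# Invented example: target Ct rises (fewer transcripts) after HDACI treatment.
fc = fold_change(ct_target_treated=25.0, ct_ref_treated=20.0,
                 ct_target_control=23.0, ct_ref_control=20.0)
print(f"fold change = {fc:.2f}")  # a value below 1 indicates suppression
```

A fold change below 1, as in this invented example, is the quantitative signature of the transcript reduction the abstract reports for HDACI-treated cells.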
Quantitative analysis of protein-ligand interactions by NMR.
Furukawa, Ayako; Konuma, Tsuyoshi; Yanaka, Saeko; Sugase, Kenji
2016-08-01
Protein-ligand interactions have been commonly studied through static structures of the protein-ligand complex. Recently, however, there has been increasing interest in investigating the dynamics of protein-ligand interactions both for fundamental understanding of the underlying mechanisms and for drug development. NMR is a versatile and powerful tool, especially because it provides site-specific quantitative information. NMR has widely been used to determine the dissociation constant (KD), in particular, for relatively weak interactions. The simplest NMR method is a chemical-shift titration experiment, in which the chemical-shift changes of a protein in response to ligand titration are measured. There are other quantitative NMR methods, but they mostly apply only to interactions in the fast-exchange regime. These methods derive the dissociation constant from population-averaged NMR quantities of the free and bound states of a protein or ligand. In contrast, the recent advent of new relaxation-based experiments, including R2 relaxation dispersion and ZZ-exchange, has enabled us to obtain kinetic information on protein-ligand interactions in the intermediate- and slow-exchange regimes. Based on R2 dispersion or ZZ-exchange, methods that can determine the association rate, kon, dissociation rate, koff, and KD have been developed. In these approaches, R2 dispersion or ZZ-exchange curves are measured for multiple samples with different protein and/or ligand concentration ratios, and the relaxation data are fitted to theoretical kinetic models. It is critical to choose an appropriate kinetic model, such as the two- or three-state exchange model, to derive the correct kinetic information. The R2 dispersion and ZZ-exchange methods are suitable for the analysis of protein-ligand interactions with a micromolar or sub-micromolar dissociation constant but not for very weak interactions, which are typical in very fast exchange. 
This contrasts with the NMR methods that are used to analyze population-averaged NMR quantities. Essentially, to apply NMR successfully, both the type of experiment and equation to fit the data must be carefully and specifically chosen for the protein-ligand interaction under analysis. In this review, we first explain the exchange regimes and kinetic models of protein-ligand interactions, and then describe the NMR methods that quantitatively analyze these specific interactions. Copyright © 2016 Elsevier B.V. All rights reserved.
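For the fast-exchange chemical-shift titration described above, the observed shift change follows Δδ_obs = Δδ_max·[L]/(K_D + [L]) when ligand is in excess, and K_D falls out of a least-squares fit. A sketch using a coarse, dependency-free grid search on synthetic noiseless data (the K_D, Δδ_max and ligand concentrations are invented):

```python
# Synthetic titration: ddmax = 0.30 ppm, KD = 50 µM (assumed values).
TRUE_KD, TRUE_DDMAX = 50.0, 0.30
ligand = [5, 10, 25, 50, 100, 250, 500, 1000]          # µM, ligand in excess
shifts = [TRUE_DDMAX * L / (TRUE_KD + L) for L in ligand]

def sse(kd, ddmax):
    """Sum of squared residuals of the 1:1 binding isotherm."""
    return sum((ddmax * L / (kd + L) - s) ** 2
               for L, s in zip(ligand, shifts))

# Brute-force grid search over (KD, ddmax); crude but self-contained.
best = min(((sse(kd / 2, dd / 1000), kd / 2, dd / 1000)
            for kd in range(2, 401)          # KD: 1.0 .. 200.0 µM, 0.5 steps
            for dd in range(100, 501)),      # ddmax: 0.100 .. 0.500 ppm
           key=lambda t: t[0])

_, kd_fit, ddmax_fit = best
print(f"KD ≈ {kd_fit:.1f} µM, ddmax ≈ {ddmax_fit:.3f} ppm")
```

In practice a proper nonlinear least-squares routine replaces the grid, and with real data the fitted K_D carries the uncertainty of the titration; the isotherm itself is the standard fast-exchange model the review describes.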
An Assessment of the State-of-the-Art in Multidisciplinary Aeromechanical Analyses
2008-01-01
monolithic formulations. In summary, for aerospace structures, partitioned formulations provide fundamental advantages over fully coupled ones, in addition...important frequencies of local analysis directly to global analysis using detailed modeling. Performed judiciously, based on a fundamental understanding of...in 2000 has comprehensively described the problem, and reviewed the status of fundamental understanding, experimental data, and analytical
[Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].
Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie
2013-11-01
To improve the accuracy of quantitative AES analysis, we combined XPS with AES and studied how to reduce the error of AES quantification. Pt-Co, Cu-Au and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative results by adjusting the Auger sensitivity factors until the two sets of quantitative results agreed. The accuracy of AES quantification using the revised sensitivity factors was then verified on further samples with different composition ratios; the corrected relative sensitivity factors reduced the error of quantitative AES analysis to less than 10%. Peak definition is difficult in integral-spectrum AES analysis, since choosing the starting and ending points when determining the characteristic Auger peak area introduces considerable uncertainty. To simplify the analysis, we also processed the data in differential-spectrum form, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and again verified the accuracy on samples with different composition ratios. With this approach the analytical error of quantitative AES was reduced to less than 9%. These results show that the accuracy of AES quantitative analysis can be greatly improved by combining XPS with AES to correct the Auger sensitivity factors, since matrix effects are thereby taken into account. The good consistency obtained demonstrates the feasibility of this method.
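The relative-sensitivity-factor quantification being corrected here is the standard formula X_i = (I_i/S_i) / Σ_j (I_j/S_j), applied either to peak areas (integral spectra) or to peak-to-peak heights (differential spectra). A sketch with invented intensities and sensitivity factors:

```python
def atomic_fractions(intensities, sensitivity):
    """Relative-sensitivity-factor quantification:
    X_i = (I_i / S_i) / sum_j (I_j / S_j)."""
    corrected = {el: intensities[el] / sensitivity[el] for el in intensities}
    total = sum(corrected.values())
    return {el: v / total for el, v in corrected.items()}

# Invented Auger peak-to-peak heights and sensitivity factors for a Cu-Au
# film; real factors must be calibrated, e.g. against XPS as in this work.
I = {"Cu": 1200.0, "Au": 800.0}
S = {"Cu": 0.60, "Au": 0.40}

X = atomic_fractions(I, S)
print({el: round(x, 3) for el, x in X.items()})
```

The formula makes the role of the correction obvious: any systematic error in S_i propagates directly into X_i, which is why recalibrating the factors against an independent technique tightens the quantification.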
Characterizing resonant component in speech: A different view of tracking fundamental frequency
NASA Astrophysics Data System (ADS)
Dong, Bin
2017-05-01
Motivated by the nonlinearity, nonstationarity and modulations in speech, the Hilbert-Huang transform and cyclostationarity analysis are employed in sequence to investigate speech resonance in vowels. Cyclostationarity analysis is applied not directly to the target vowel but to its intrinsic mode functions, one by one. Owing to the equivalence between the fundamental frequency in speech and the cyclic frequency in cyclostationarity analysis, the modulation intensity distributions of the intrinsic mode functions provide substantial information for estimating the fundamental frequency. To highlight the relationship between frequency and time, the pseudo-Hilbert spectrum is proposed here to replace the Hilbert spectrum. Contrasting the pseudo-Hilbert spectra with the modulation intensity distributions of the intrinsic mode functions shows that there is usually one intrinsic mode function that acts as the fundamental component of the vowel. Furthermore, the fundamental frequency of the vowel can be determined by tracing the pseudo-Hilbert spectrum of its fundamental component along the time axis. The latter method is more robust for estimating the fundamental frequency in the presence of nonlinear components. Two vowels, [a] and [i], taken from the FAU Aibo Emotion Corpus speech database, are used to validate these findings.
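As a baseline for comparison with the pseudo-Hilbert approach, a plain autocorrelation pitch estimator (a different, classical technique, not the method of the abstract) recovers the fundamental frequency of a synthetic vowel-like frame: the autocorrelation of a voiced frame peaks at the lag equal to one pitch period.

```python
import math

FS = 8000                 # sample rate (Hz)
F0 = 100.0                # true fundamental (Hz) of the synthetic "vowel"
N = 400                   # 50 ms frame

# Harmonic-rich signal standing in for a voiced vowel frame.
x = [sum(a * math.sin(2 * math.pi * k * F0 * n / FS)
         for k, a in ((1, 1.0), (2, 0.6), (3, 0.3)))
     for n in range(N)]

def autocorr_f0(frame, fs, fmin=50.0, fmax=500.0):
    """Pick the lag in [fs/fmax, fs/fmin] maximising the autocorrelation."""
    lo, hi = int(fs / fmax), int(fs / fmin)
    def r(lag):
        return sum(frame[i] * frame[i + lag] for i in range(len(frame) - lag))
    best_lag = max(range(lo, hi + 1), key=r)
    return fs / best_lag

f0_hat = autocorr_f0(x, FS)
print(f"estimated f0 = {f0_hat:.1f} Hz")
```

Autocorrelation estimators of this kind degrade on strongly nonlinear or nonstationary segments, which is precisely the regime the abstract's mode-decomposition approach is designed to handle.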
Stretching of passive tracers and implications for mantle mixing
NASA Astrophysics Data System (ADS)
Conjeepuram, N.; Kellogg, L. H.
2007-12-01
Mid-ocean ridge basalts (MORB) and ocean island basalts (OIB) have fundamentally different geochemical signatures. Understanding this difference requires a fundamental knowledge of the mixing processes that led to their formation. Quantitative methods used to assess mixing include examining the distribution of passive tracers, attaching time-evolution information to simulate the decay of radioactive isotopes, and, for chaotic flows, calculating the Lyapunov exponent, which characterizes whether two nearby particles diverge at an exponential rate. Although effective, these methods are indirect measures of the two fundamental processes associated with mixing, namely stretching and folding. Building on work done by Kellogg and Turcotte, we present a method to compute the stretching and thinning of a passive, ellipsoidal tracer in three orthogonal directions in isoviscous, incompressible, three-dimensional flows. We also compute the Lyapunov exponents associated with the given system based on the quantitative measures of stretching and thinning. We test our method with two analytical and three numerical flow fields which exhibit Lagrangian turbulence. The ABC and STF classes of analytical flows are three- and two-parameter classes of flows, respectively, and have been well studied for fast dynamo action. Since they generate both periodic and chaotic particle paths depending either on the starting point or on the choice of parameters, they provide a good foundation for understanding mixing. The numerical flow fields are similar to the geometries used by Ferrachat and Ricard (1998) and emulate a ridge-transform system. We also compute the stable and unstable manifolds associated with the numerical flow fields to illustrate the directions of rapid and slow mixing. We find that stretching in chaotic flow fields is significantly more effective than in regular or periodic flow fields. Consequently, chaotic mixing is far more efficient than regular mixing.
We also find that in the numerical flow field, there is a fundamental topological difference in the regions exhibiting slow or regular mixing for different model geometries.
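The Lyapunov-exponent diagnostic mentioned above can be sketched directly on the ABC flow: integrate two tracers separated by a tiny offset and measure the divergence rate λ ≈ (1/T) ln(d(T)/d(0)). The parameters below (A, B, C, step size, horizon, initial point) are illustrative choices, with A = √3, B = √2, C = 1 a commonly studied chaotic case:

```python
import math

A, B, C = math.sqrt(3), math.sqrt(2), 1.0   # classic ABC-flow parameters

def abc(p):
    """Velocity field of the steady Arnold-Beltrami-Childress flow."""
    x, y, z = p
    return (A * math.sin(z) + C * math.cos(y),
            B * math.sin(x) + A * math.cos(z),
            C * math.sin(y) + B * math.cos(x))

def rk4_step(p, dt):
    """One fourth-order Runge-Kutta step for the tracer position."""
    k1 = abc(p)
    k2 = abc(tuple(pi + 0.5 * dt * ki for pi, ki in zip(p, k1)))
    k3 = abc(tuple(pi + 0.5 * dt * ki for pi, ki in zip(p, k2)))
    k4 = abc(tuple(pi + dt * ki for pi, ki in zip(p, k3)))
    return tuple(pi + dt / 6 * (a + 2 * b + 2 * c + e)
                 for pi, a, b, c, e in zip(p, k1, k2, k3, k4))

d0 = 1e-8
p, q = (0.1, 0.2, 0.3), (0.1 + d0, 0.2, 0.3)   # two nearby passive tracers
dt, T = 0.01, 50.0
for _ in range(int(T / dt)):
    p, q = rk4_step(p, dt), rk4_step(q, dt)

d = math.dist(p, q)
lam = math.log(d / d0) / T        # finite-time Lyapunov exponent estimate
print(f"finite-time Lyapunov exponent ≈ {lam:.3f}")
```

A clearly positive λ flags exponential stretching along the trajectory, i.e. efficient chaotic mixing; near-zero values indicate the regular, slowly mixing regions the abstract contrasts against.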
Mazzini, Virginia; Craig, Vincent S J
2017-10-01
The importance of electrolyte solutions cannot be overstated. Beyond the ionic strength of electrolyte solutions the specific nature of the ions present is vital in controlling a host of properties. Therefore ion specificity is fundamentally important in physical chemistry, engineering and biology. The observation that the strengths of the effect of ions often follows well established series suggests that a single predictive and quantitative description of specific-ion effects covering a wide range of systems is possible. Such a theory would revolutionise applications of physical chemistry from polymer precipitation to drug design. Current approaches to understanding specific-ion effects involve consideration of the ions themselves, the solvent and relevant interfaces and the interactions between them. Here we investigate the specific-ion effects trends of standard partial molar volumes and electrostrictive volumes of electrolytes in water and eleven non-aqueous solvents. We choose these measures as they relate to bulk properties at infinite dilution, therefore they are the simplest electrolyte systems. This is done to test the hypothesis that the ions alone exhibit a specific-ion effect series that is independent of the solvent and unrelated to surface properties. The specific-ion effects trends of standard partial molar volumes and normalised electrostrictive volumes examined in this work show a fundamental ion-specific series that is reproduced across the solvents, which is the Hofmeister series for anions and the reverse lyotropic series for cations, supporting the hypothesis. This outcome is important in demonstrating that ion specificity is observed at infinite dilution and demonstrates that the complexity observed in the manifestation of specific-ion effects in a very wide range of systems is due to perturbations of solvent, surfaces and concentration on the underlying fundamental series. 
This knowledge will guide a general understanding of specific-ion effects and assist in the development of a quantitative predictive theory of ion specificity.
ERIC Educational Resources Information Center
Grobler, Bennie; Moloi, Connie; Thakhordas, Sunita
2017-01-01
This quantitative study investigates teachers' perceptions of how Emotional Intelligence (EI) was utilised by their school principals to manage mandated curriculum change processes in schools in the Johannesburg North district of Gauteng in South Africa. Research shows that EI consists of a range of fundamental skills that could enable school…
Data Sources for Estimating Environment-Related Diseases
Walker, Bailus
1984-01-01
Relating current morbidity and mortality to environmental and occupational factors requires information on parameters of environmental exposure for practitioners of medicine and other health scientists. A fundamental source of that information is the exposure history recorded in hospitals, clinics, and other points of entry to the health care system. The qualitative and quantitative aspects of this issue are reviewed. PMID:6716500
2009-10-01
that both qualitative and quantitative observations of the spall propagation performance warrant further investigation into the fundamental ...
Quantitative Phase Microscopy for Accurate Characterization of Microlens Arrays
NASA Astrophysics Data System (ADS)
Grilli, Simonetta; Miccio, Lisa; Merola, Francesco; Finizio, Andrea; Paturzo, Melania; Coppola, Sara; Vespini, Veronica; Ferraro, Pietro
Microlens arrays are of fundamental importance in a wide variety of applications in optics and photonics. This chapter deals with an accurate digital holography-based characterization of both liquid and polymeric microlenses fabricated by an innovative pyro-electrowetting process. The actuation of liquid and polymeric films is obtained through the use of pyroelectric charges generated into polar dielectric lithium niobate crystals.
Relationships between net primary productivity and forest stand age in U.S. forests
Liming He; Jing M. Chen; Yude Pan; Richard Birdsey; Jens Kattge
2012-01-01
Net primary productivity (NPP) is a key flux in the terrestrial ecosystem carbon balance, as it summarizes the autotrophic input into the system. Forest NPP varies predictably with stand age, and quantitative information on the NPP-age relationship for different regions and forest types is therefore fundamentally important for forest carbon cycle modeling. We used four...
ERIC Educational Resources Information Center
Davis, Mary M.
2009-01-01
The American Association of Colleges and Universities reports that over 50% of the students entering colleges and universities are academically under prepared; that is, according to Miller and Murray (2005), students "lack basic skills in at least one of the three fundamental areas of reading, writing, and mathematics" (paragraph 4). Furthermore,…
J. E. Winandy; P. K. Lebow
2001-01-01
In this study, we develop models for predicting loss in bending strength of clear, straight-grained pine from changes in chemical composition. Although significant work needs to be done before truly universal predictive models are developed, a quantitative fundamental relationship between changes in chemical composition and strength loss for pine was demonstrated. In...
ERIC Educational Resources Information Center
Pritchard, Jan Teena
2013-01-01
The most basic and fundamental skill for academic success is the ability to read. The purpose of this 1-group pretest and posttest pre-experimental quantitative study was to investigate how a unique instructional approach, called "curriculum in motion" with an emphasis on therapeutic martial arts and Brain Gym exercises influenced…
Mechanisms, functions and ecology of colour vision in the honeybee.
Hempel de Ibarra, N; Vorobyev, M; Menzel, R
2014-06-01
Research in the honeybee has laid the foundations for our understanding of insect colour vision. The trichromatic colour vision of honeybees shares fundamental properties with primate and human colour perception, such as colour constancy, colour opponency, segregation of colour and brightness coding. Laborious efforts to reconstruct the colour vision pathway in the honeybee have provided detailed descriptions of neural connectivity and the properties of photoreceptors and interneurons in the optic lobes of the bee brain. The modelling of colour perception advanced with the establishment of colour discrimination models that were based on experimental data, the Colour-Opponent Coding and Receptor Noise-Limited models, which are important tools for the quantitative assessment of bee colour vision and colour-guided behaviours. Major insights into the visual ecology of bees have been gained combining behavioural experiments and quantitative modelling, and asking how bee vision has influenced the evolution of flower colours and patterns. Recently research has focussed on the discrimination and categorisation of coloured patterns, colourful scenes and various other groupings of coloured stimuli, highlighting the bees' behavioural flexibility. The identification of perceptual mechanisms remains of fundamental importance for the interpretation of their learning strategies and performance in diverse experimental tasks.
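The Receptor Noise-Limited model mentioned reduces to a closed formula for a trichromat: with receptor quantum catches q_i for two stimuli, contrasts Δf_i = ln(q_i^A/q_i^B) and noise values e_i, the chromatic distance is ΔS² = [e₁²(Δf₃−Δf₂)² + e₂²(Δf₃−Δf₁)² + e₃²(Δf₂−Δf₁)²] / [(e₁e₂)² + (e₁e₃)² + (e₂e₃)²]. A sketch (the catches and noise values below are invented; real honeybee noise estimates come from behavioural calibration):

```python
import math

def rnl_distance(qa, qb, e):
    """Trichromatic Receptor Noise-Limited colour distance."""
    f = [math.log(a / b) for a, b in zip(qa, qb)]        # receptor contrasts
    e1, e2, e3 = e
    num = (e1**2 * (f[2] - f[1])**2
           + e2**2 * (f[2] - f[0])**2
           + e3**2 * (f[1] - f[0])**2)
    den = (e1 * e2)**2 + (e1 * e3)**2 + (e2 * e3)**2
    return math.sqrt(num / den)

noise = (0.13, 0.06, 0.12)              # illustrative UV/blue/green noise
flower = (0.8, 0.5, 0.9)                # invented quantum catches, stimulus A
leaf   = (0.4, 0.5, 0.7)                # invented quantum catches, stimulus B

print(f"chromatic distance = {rnl_distance(flower, leaf, noise):.2f} JND")
```

A purely achromatic change (all catches scaled by the same factor) gives ΔS = 0, which is the model's built-in segregation of colour from brightness coding noted in the abstract.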
Development, Validation, and Application of the Microbiology Concept Inventory †
Paustian, Timothy D.; Briggs, Amy G.; Brennan, Robert E.; Boury, Nancy; Buchner, John; Harris, Shannon; Horak, Rachel E. A.; Hughes, Lee E.; Katz-Amburn, D. Sue; Massimelli, Maria J.; McDonald, Ann H.; Primm, Todd P.; Smith, Ann C.; Stevens, Ann M.; Yung, Sunny B.
2017-01-01
If we are to teach effectively, tools are needed to measure student learning. A widely used method for quickly measuring student understanding of core concepts in a discipline is the concept inventory (CI). Using the American Society for Microbiology Curriculum Guidelines (ASMCG) for microbiology, faculty from 11 academic institutions created and validated a new microbiology concept inventory (MCI). The MCI was developed in three phases. In phase one, learning outcomes and fundamental statements from the ASMCG were used to create T/F questions coupled with open responses. In phase two, the 743 responses to MCI 1.0 were examined to find the most common misconceptions, which were used to create distractors for multiple-choice questions. MCI 2.0 was then administered to 1,043 students. The responses of these students were used to create MCI 3.0, a 23-question CI that measures students’ understanding of all 27 fundamental statements. MCI 3.0 was found to be reliable, with a Cronbach’s alpha score of 0.705 and Ferguson’s delta of 0.97. Test item analysis demonstrated good validity and discriminatory power as judged by item difficulty, item discrimination, and point-biserial correlation coefficient. Comparison of pre- and posttest scores showed that microbiology students at 10 institutions showed an increase in understanding of concepts after instruction, except for questions probing metabolism (average normalized learning gain was 0.15). The MCI will enable quantitative analysis of student learning gains in understanding microbiology, help to identify misconceptions, and point toward areas where efforts should be made to develop teaching approaches to overcome them. PMID:29854042
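The "average normalized learning gain" quoted above for the metabolism questions (0.15) is conventionally Hake's gain, the fraction of the possible pre-to-post improvement actually realized. A minimal sketch, with hypothetical class averages rather than MCI data:

```python
def normalized_gain(pre_pct, post_pct):
    """Hake's normalized learning gain: the fraction of the possible
    pre-to-post improvement actually realized."""
    if pre_pct >= 100.0:
        return 0.0  # no room left to improve
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averages (not data from the MCI study):
g = normalized_gain(40.0, 55.0)
print(g)  # 0.25
```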
Moffatt, Suzanne; White, Martin; Mackintosh, Joan; Howel, Denise
2006-03-08
In this methodological paper we document the interpretation of a mixed methods study and outline an approach to dealing with apparent discrepancies between qualitative and quantitative research data in a pilot study evaluating whether welfare rights advice has an impact on health and social outcomes among a population aged 60 and over. Quantitative and qualitative data were collected contemporaneously. Quantitative data were collected from 126 men and women aged over 60 within a randomised controlled trial. Participants received a full welfare benefits assessment which successfully identified additional financial and non-financial resources for 60% of them. A range of demographic, health and social outcome measures were assessed at baseline, 6, 12 and 24 month follow up. Qualitative data were collected from a sub-sample of 25 participants purposively selected to take part in individual interviews to examine the perceived impact of welfare rights advice. Separate analysis of the quantitative and qualitative data revealed discrepant findings. The quantitative data showed little evidence of significant differences of a size that would be of practical or clinical interest, suggesting that the intervention had no impact on these outcome measures. The qualitative data suggested wide-ranging impacts, indicating that the intervention had a positive effect. Six ways of further exploring these data were considered: (i) treating the methods as fundamentally different; (ii) exploring the methodological rigour of each component; (iii) exploring dataset comparability; (iv) collecting further data and making further comparisons; (v) exploring the process of the intervention; and (vi) exploring whether the outcomes of the two components match. The study demonstrates how using mixed methods can lead to different and sometimes conflicting accounts and, using this six step approach, how such discrepancies can be harnessed to interrogate each dataset more fully. 
Not only does this enhance the robustness of the study, it may lead to different conclusions from those that would have been drawn through relying on one method alone and demonstrates the value of collecting both types of data within a single study. More widespread use of mixed methods in trials of complex interventions is likely to enhance the overall quality of the evidence base.
Nadeau, Kyle P; Rice, Tyler B; Durkin, Anthony J; Tromberg, Bruce J
2015-11-01
We present a method for spatial frequency domain data acquisition utilizing a multifrequency synthesis and extraction (MSE) method and binary square wave projection patterns. By illuminating a sample with square wave patterns, multiple spatial frequency components are simultaneously attenuated and can be extracted to determine optical property and depth information. Additionally, binary patterns are projected faster than sinusoids typically used in spatial frequency domain imaging (SFDI), allowing for short (millisecond or less) camera exposure times, and data acquisition speeds an order of magnitude or more greater than conventional SFDI. In cases where sensitivity to superficial layers or scattering is important, the fundamental component from higher frequency square wave patterns can be used. When probing deeper layers, the fundamental and harmonic components from lower frequency square wave patterns can be used. We compared optical property and depth penetration results extracted using square waves to those obtained using sinusoidal patterns on an in vivo human forearm and absorbing tube phantom, respectively. Absorption and reduced scattering coefficient values agree with conventional SFDI to within 1% using both high frequency (fundamental) and low frequency (fundamental and harmonic) spatial frequencies. Depth penetration reflectance values also agree to within 1% of conventional SFDI.
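The property exploited above, that a square wave simultaneously carries a fundamental plus odd harmonics (amplitude 4/(pi*n) for a unit square wave), can be checked with a small DFT sketch. This is a generic illustration of the principle, not the authors' MSE extraction code:

```python
import math

def dft_amplitude(signal, k):
    """One-sided amplitude of DFT bin k of a real signal."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
    return 2.0 * math.hypot(re, im) / n

# A +/-1 square wave with 4 cycles across 512 samples. Its Fourier series
# holds the fundamental plus odd harmonics with amplitudes 4/(pi*n).
N, cycles = 512, 4
square = [1.0 if math.sin(2 * math.pi * cycles * (i + 0.5) / N) >= 0 else -1.0
          for i in range(N)]

a1 = dft_amplitude(square, cycles)      # fundamental
a3 = dft_amplitude(square, 3 * cycles)  # first odd harmonic
print(round(a1, 3), round(a3, 3))       # close to 4/pi = 1.273 and 4/(3*pi) = 0.424
```

Extracting several such attenuated components from one projected pattern is what allows a single binary pattern to probe multiple spatial frequencies at once.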
Preparing Master of Public Health Graduates to Work in Local Health Departments.
Hemans-Henry, Calaine; Blake, Janice; Parton, Hilary; Koppaka, Ram; Greene, Carolyn M
2016-01-01
To identify key competencies and skills that all master of public health (MPH) graduates should have to be prepared to work in a local health department. In 2011-2012, the New York City Department of Health and Mental Hygiene administered electronic surveys to 2 categories of staff: current staff with an MPH as their highest degree, and current hiring managers. In all, 312 (77%) staff members with an MPH as their highest degree and 170 (57%) hiring managers responded to the survey. Of the respondents with an MPH as their highest degree, 85% stated that their MPH program prepared them for work at the New York City Health Department. Skills for which MPH graduates most often stated they were underprepared included facility in using SAS® statistical software, quantitative data analysis/statistics, personnel management/leadership, and data collection/database management/data cleaning. Among the skills hiring managers identified as required of MPH graduates, the following were most often cited as those for which newly hired MPH graduates were inadequately prepared: quantitative data analysis, researching/conducting literature reviews, scientific writing and publication, management skills, and working with contracts/requests for proposals. These findings suggest that MPH graduates could be better prepared to work in a local health department upon graduation. To be successful, new MPH graduate hires should possess fundamental skills and knowledge related to analysis, communication, management, and leadership. Local health departments and schools of public health must each contribute to the development of the current and future public health workforce through both formal learning opportunities and supplementary employment-based training to reinforce prior coursework and facilitate practical skill development.
Application of Elements of TPM Strategy for Operation Analysis of Mining Machine
NASA Astrophysics Data System (ADS)
Brodny, Jaroslaw; Tutak, Magdalena
2017-12-01
The Total Productive Maintenance (TPM) strategy comprises a group of activities intended to keep machines in a failure-free state by limiting failures, unplanned shutdowns, shortages and unplanned servicing. These actions aim to increase the effectiveness with which a company utilizes its machines and equipment. A significant element of the strategy is the coupling of technical measures with changes in how employees perceive them, while its fundamental aim is to improve the economic efficiency of the enterprise. Increasing competition and the need to reduce production costs mean that mining enterprises, too, are compelled to adopt this strategy. The paper presents examples of using the OEE model for the quantitative evaluation of selected mining machines. The OEE model is a quantitative tool of the TPM strategy and can serve as the basis for further work on its implementation. The OEE indicator is the product of three components: the availability and performance of the studied machine and the quality of the obtained product. The paper presents the results of an effectiveness analysis of a set of mining machines in the longwall system, the first and most important link in the technological line of coal production. The analyzed set comprised the longwall shearer, armored face conveyor and crusher; from a reliability standpoint, this set is a system with a serial structure. The analysis was based on data recorded by the industrial automation system used in the mines, a method of data acquisition that ensured high credibility and full time synchronization. Conclusions from the research and analyses should be used to reduce breakdowns, failures and unplanned downtime, increase performance and improve production quality.
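The OEE indicator described above is simply the product of availability, performance and quality. A minimal sketch with hypothetical shift figures (not data from the paper):

```python
def oee(availability, performance, quality):
    """Overall Equipment Effectiveness: product of the three TPM factors."""
    for factor in (availability, performance, quality):
        if not 0.0 <= factor <= 1.0:
            raise ValueError("each factor must lie in [0, 1]")
    return availability * performance * quality

# Hypothetical shift figures for a longwall shearer (not data from the paper):
availability = 420 / 480  # ran 420 of 480 scheduled minutes
performance = 0.80        # actual vs. nominal cutting rate
quality = 0.95            # share of output meeting the quality criterion
print(round(oee(availability, performance, quality), 3))  # 0.665
```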
Breitbart, Eckhard; Köberlein-Neu, Juliane
2017-01-01
Introduction Owing to ultraviolet radiation, declining ozone levels, uncritical sun exposure and the use of tanning beds, an increasing number of people are affected by different types of skin cancer. Yet preventive interventions such as skin cancer screening still lack evidence of effectiveness and are therefore criticised. Fundamental to an appropriate course of action is a critical appraisal of the parameters that are defined as measures of effectiveness. This research seeks to establish, through the available literature, the effects and conditions that demonstrate the effectiveness of skin cancer prevention strategies. Method and analysis A mixed-method approach is employed, combining quantitative and qualitative methods, to answer which effects can demonstrate effectiveness with respect to time horizon, perspective and organisational level, and which conditions are essential and sufficient to establish the effectiveness and cost-effectiveness of skin cancer prevention strategies. A systematic review will be performed to identify studies of any design and assess the data both quantitatively and qualitatively. Included studies for each key question will be summarised by characteristics such as population, intervention, comparison, outcomes, study design, endpoints and effect estimator. Alongside the statistical synthesis of the systematic review, the qualitative method of qualitative comparative analysis (QCA) will be applied. The expected outcomes of the review and QCA are the presence and absence of effects that are appropriate for use in effectiveness assessments and subsequent cost-effectiveness assessment. Ethics and dissemination Formal ethical approval is not required as primary data will not be collected. Trial registration number International Prospective Register for Systematic Reviews number CRD42017053859. PMID:28877950
Calibration Issues and Operating System Requirements for Electron-Probe Microanalysis
NASA Technical Reports Server (NTRS)
Carpenter, P.
2006-01-01
Instrument purchase requirements and dialogue with manufacturers have established hardware parameters for alignment, stability, and reproducibility, which have helped improve the precision and accuracy of electron microprobe analysis (EPMA). The development of correction algorithms and the accurate solution to quantitative analysis problems requires the minimization of systematic errors and relies on internally consistent data sets. Improved hardware and computer systems have resulted in better automation of vacuum systems, stage and wavelength-dispersive spectrometer (WDS) mechanisms, and x-ray detector systems which have improved instrument stability and precision. Improved software now allows extended automated runs involving diverse setups and better integrates digital imaging and quantitative analysis. However, instrumental performance is not regularly maintained, as WDS are aligned and calibrated during installation but few laboratories appear to check and maintain this calibration. In particular, detector deadtime (DT) data is typically assumed rather than measured, due primarily to the difficulty and inconvenience of the measurement process. This is a source of fundamental systematic error in many microprobe laboratories and is unknown to the analyst, as the magnitude of DT correction is not listed in output by microprobe operating systems. The analyst must remain vigilant to deviations in instrumental alignment and calibration, and microprobe system software must conveniently verify the necessary parameters. Microanalysis of mission critical materials requires an ongoing demonstration of instrumental calibration. Possible approaches to improvements in instrument calibration, quality control, and accuracy will be discussed. Development of a set of core requirements based on discussions with users, researchers, and manufacturers can yield documents that improve and unify the methods by which instruments can be calibrated. 
These results can be used to drive continued improvement of EPMA.
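One way to make the dead-time correction explicit rather than assumed is the standard non-paralyzable detector model. The sketch below is a generic illustration with made-up count rates, not a procedure taken from this abstract:

```python
def true_count_rate(measured_cps, deadtime_s):
    """Non-paralyzable dead-time correction: N = n / (1 - n * tau),
    where n is the measured rate and tau the detector dead time."""
    loss = measured_cps * deadtime_s
    if loss >= 1.0:
        raise ValueError("measured rate saturates the detector")
    return measured_cps / (1.0 - loss)

# 50 kcps measured with a 1.5 microsecond dead time (illustrative values):
print(round(true_count_rate(50_000, 1.5e-6)))  # 54054
```

At 50 kcps the correction is already about 8%, which is exactly the kind of silent systematic error the abstract warns about when tau is assumed rather than measured.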
Quick, Christopher M; Venugopal, Arun M; Dongaonkar, Ranjeet M; Laine, Glen A; Stewart, Randolph H
2008-05-01
To return lymph to the great veins of the neck, it must be actively pumped against a pressure gradient. Mean lymph flow in a portion of a lymphatic network has been characterized by an empirical relationship (P(in) - P(out) = -P(p) + R(L)Q(L)), where P(in) - P(out) is the axial pressure gradient and Q(L) is mean lymph flow. R(L) and P(p) are empirical parameters characterizing the effective lymphatic resistance and pump pressure, respectively. The relation of these global empirical parameters to the properties of lymphangions, the segments of a lymphatic vessel bounded by valves, has been problematic. Lymphangions have a structure like blood vessels but cyclically contract like cardiac ventricles; they are characterized by a contraction frequency (f) and the slopes of the end-diastolic pressure-volume relationship [minimum value of resulting elastance (E(min))] and end-systolic pressure-volume relationship [maximum value of resulting elastance (E(max))]. Poiseuille's law provides a first-order approximation relating the pressure-flow relationship to the fundamental properties of a blood vessel. No analogous formula exists for a pumping lymphangion. We therefore derived an algebraic formula predicting lymphangion flow from fundamental physical principles and known lymphangion properties. Quantitative analysis revealed that lymph inertia and resistance to lymph flow are negligible and that lymphangions act like a series of interconnected ventricles. For a single lymphangion, P(p) = P(in) (E(max) - E(min))/E(min) and R(L) = E(max)/f. The formula was tested against a validated, realistic mathematical model of a lymphangion and found to be accurate. Predicted flows were within the range of flows measured in vitro. The present work therefore provides a general solution that makes it possible to relate fundamental lymphangion properties to lymphatic system function.
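The single-lymphangion relations stated in the abstract can be turned directly into a small calculator. The numerical values below are illustrative, not measurements from the study:

```python
def lymphangion_pump(P_in, P_out, E_min, E_max, f):
    """Mean lymph flow from the single-lymphangion relations in the abstract:
       P_p = P_in * (E_max - E_min) / E_min   (effective pump pressure)
       R_L = E_max / f                        (effective lymphatic resistance)
       P_in - P_out = -P_p + R_L * Q_L  =>  Q_L = (P_in - P_out + P_p) / R_L
    Units are left to the caller; the values below are illustrative."""
    P_p = P_in * (E_max - E_min) / E_min
    R_L = E_max / f
    Q_L = (P_in - P_out + P_p) / R_L
    return P_p, R_L, Q_L

P_p, R_L, Q_L = lymphangion_pump(P_in=5.0, P_out=8.0, E_min=2.0, E_max=10.0, f=0.25)
print(P_p, R_L, Q_L)  # 20.0 40.0 0.425
```

Note that flow stays positive here even though P_out exceeds P_in, which is the point of the pump-pressure term.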
Chatterjee, Paramita; Chakraborty, Arup; Mukherjee, Alok K
2018-07-05
Pathological calcification in human urinary tract (kidney stones) is a common problem affecting an increasing number of people around the world. Analysis of such minerals or compounds is of fundamental importance for understanding their etiology and for the development of prophylactic measures. In the present study, structural characterization, phase quantification and morphological behaviour of thirty three (33) human kidney stones from eastern India have been carried out using IR spectroscopy (FT-IR), powder X-ray diffraction (PXRD) and scanning electron microscopy (SEM). Quantitative phase composition of kidney stones has been analyzed following the Rietveld method. Based on the quantitative estimates of constituent phases, the calculi samples have been classified into oxalate (OX), uric acid (UA), phosphate (PH) and mixed (MX) groups. Rietveld analysis of PXRD patterns showed that twelve (36%) of the renal calculi were composed exclusively of whewellite (calcium oxalate monohydrate, COM). The remaining twenty one (64%) stones were mixture of phases with oxalate as the major constituent in fourteen (67%) of these stones. The average crystallite size of whewellite in oxalate stones, as determined from the PXRD analysis, varies between 93 (1) nm and 202 (3) nm, whereas the corresponding sizes for the uric acid and struvite crystallites in UA and PH stones are 79 (1)-155 (4) nm and 69 (1)-123(1) nm, respectively. The size of hydroxyapatite crystallites, 10 (1)-21 (1) nm, is smaller by about one order of magnitude compared to other minerals in the kidney stones. A statistical analysis using fifty (50) kidney stones (33 calculi from the present study and 17 calculi reported earlier from our laboratory) revealed that the oxalate group (whewellite, weddellite or mixture of whewellite and weddellite as the major constituent) is the most prevalent (82%) kidney stone type in eastern India. Copyright © 2018 Elsevier B.V. All rights reserved.
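Crystallite sizes of the kind quoted above are commonly estimated from PXRD line broadening; a simplified Scherrer-equation sketch is shown below. The study itself used Rietveld refinement, and the peak parameters here are made up:

```python
import math

def scherrer_size_nm(wavelength_nm, fwhm_deg, two_theta_deg, K=0.9):
    """Scherrer estimate: tau = K * lambda / (beta * cos(theta)), with beta
    the peak FWHM in radians and theta half the diffraction angle."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation, a 0.1-degree-wide peak at 2theta = 15 degrees
# (illustrative numbers, not peaks from the study):
size = scherrer_size_nm(0.15406, 0.1, 15.0)
print(round(size, 1))  # about 80 nm
```

Broader peaks give smaller sizes, which is why the nanocrystalline hydroxyapatite (10-21 nm) stands out by an order of magnitude from the other phases.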
Shikina, Shinya; Chung, Yi-Jou; Chiu, Yi-Ling; Huang, Yi-Jie; Lee, Yan-Horn; Chang, Ching-Fong
2016-03-01
Sex steroids play a fundamental role not only in reproduction but also in various other biological processes in vertebrates. Although the presence of sex steroids has been confirmed in cnidarians (e.g., coral, sea anemone, jellyfish, and hydra), which are basal metazoans, only a few studies to date have characterized steroidogenesis-related genes in cnidarians. Based on a transcriptomic analysis of the stony coral Euphyllia ancora, we identified the steroidogenic enzyme 17β-hydroxysteroid dehydrogenase type 14 (17beta-hsd 14), an oxidative enzyme that catalyzes the NAD(+)-dependent inactivation of estrogen/androgen (estradiol to estrone and testosterone to androstenedione) in mammals. Phylogenetic analysis showed that E. ancora 17beta-Hsd 14 (Ea17beta-Hsd 14) clusters with other animal 17beta-HSD 14s but not with other members of the 17beta-HSD family. Subsequent quantitative RT-PCR analysis revealed a lack of correlation of Ea17beta-hsd 14 transcript levels with the coral's reproductive cycle. In addition, Ea17beta-hsd 14 transcript and protein were detected in all tissues examined, such as the tentacles, mesenterial filaments, and gonads, at similar levels in both sexes, as determined by quantitative RT-PCR analysis and Western blotting with an anti-Ea17beta-Hsd 14 antibody. Immunohistochemical analysis revealed that Ea17beta-Hsd 14 is mainly distributed in the endodermal regions of the polyps, but the protein was also observed in all tissues examined. These results suggest that Ea17beta-Hsd 14 is involved in important functions that commonly occur in endodermal cells or has multiple functions in different tissues. Our data provide information for comparison with advanced animals as well as insight into the evolution of steroidogenesis-related genes in metazoans. Copyright © 2016 Elsevier Inc. All rights reserved.
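Quantitative RT-PCR transcript levels such as those compared above are often reported as fold changes via the Livak 2^-ddCt method. The sketch below is a generic illustration with hypothetical Ct values, not the paper's analysis:

```python
def relative_expression(ct_target_s, ct_ref_s, ct_target_c, ct_ref_c):
    """Livak 2^-ddCt relative quantification: ddCt is the target-minus-
    reference Ct of the sample minus that of the calibrator."""
    ddct = (ct_target_s - ct_ref_s) - (ct_target_c - ct_ref_c)
    return 2.0 ** -ddct

# Hypothetical Ct values (not data from the E. ancora study):
fold = relative_expression(24.0, 18.0, 26.0, 18.0)
print(fold)  # 4.0
```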
Models of volcanic eruption hazards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wohletz, K.H.
1992-01-01
Volcanic eruptions pose an ever present but poorly constrained hazard to life and property for geothermal installations in volcanic areas. Because eruptions occur sporadically and may limit field access, quantitative and systematic field studies of eruptions are difficult to complete. Circumventing this difficulty, laboratory models and numerical simulations are pivotal in building our understanding of eruptions. For example, the results of fuel-coolant interaction experiments show that magma-water interaction controls many eruption styles. Applying these results, increasing numbers of field studies now document and interpret the role of external water eruptions. Similarly, numerical simulations solve the fundamental physics of high-speed fluid flow and give quantitative predictions that elucidate the complexities of pyroclastic flows and surges. A primary goal of these models is to guide geologists in searching for critical field relationships and making their interpretations. Coupled with field work, modeling is beginning to allow more quantitative and predictive volcanic hazard assessments.
Sensitivity analysis of bi-layered ceramic dental restorations.
Zhang, Zhongpu; Zhou, Shiwei; Li, Qing; Li, Wei; Swain, Michael V
2012-02-01
The reliability and longevity of ceramic prostheses have become a major concern. Existing studies have focused on critical issues from clinical perspectives, but more research is needed on fundamental science and fabrication issues to ensure the longevity and durability of ceramic prostheses. The aim of this paper was to explore, in a quantitative way, how "sensitive" the thermal and mechanical responses of bi-layered ceramic systems and crown models, in terms of changes in temperature and thermal residual stress, are to perturbations of the chosen design variables (e.g. layer thickness and heat transfer coefficient). In this study, three bi-layered ceramic models with different geometries are considered: (i) a simple bi-layered plate, (ii) a simple bi-layered triangle, and (iii) an axisymmetric bi-layered crown. The layer thickness and convective heat transfer coefficient (or cooling rate) appear to be the more sensitive variables for the porcelain-fused-on-zirconia substrate models. The resultant sensitivities indicate the critical importance of the heat transfer coefficient and the core-to-veneer thickness ratio for the temperature distributions and residual stresses in each model. The findings provide a quantitative basis for assessing the effects of fabrication uncertainties and optimizing the design of ceramic prostheses. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
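A dimensionless sensitivity coefficient (percent change in response per percent perturbation of a design variable) is one common way to quantify the "sensitivity" discussed above. The sketch uses a made-up power-law response, not the paper's finite element models:

```python
def normalized_sensitivity(f, x0, rel_step=0.01):
    """Dimensionless sensitivity S = (dF/dx) * x0 / F(x0), estimated with a
    central finite difference: percent change in the response per percent
    change in the design variable."""
    h = rel_step * x0
    dfdx = (f(x0 + h) - f(x0 - h)) / (2.0 * h)
    return dfdx * x0 / f(x0)

# Made-up response: residual stress falling off as ratio**-2 (illustrative
# form only, not the paper's model):
s = normalized_sensitivity(lambda r: 100.0 * r ** -2.0, 0.5)
print(round(s, 2))  # -2.0 (a 1% larger ratio gives roughly 2% lower stress)
```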
Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin
2015-02-01
When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
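The overestimation effect described above can be reproduced in a few lines: with a nonlinear dose-response, a deterministic concentration model and a discrete-cell model give very different risks even when their mean doses agree. All parameter values are illustrative:

```python
import math
import random

random.seed(1)  # fixed seed so the comparison is reproducible

def risk(dose, r=0.01):
    """Exponential dose-response: P(illness) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

# A drastic scenario of the kind flagged by the sensitivity analysis: a
# 3-log inactivation followed by 4-log growth, starting from 10 cells per
# contaminated unit. All parameter values are illustrative.
n0, p_survive, growth = 10, 1e-3, 1e4

# Concentration model: every unit deterministically ends at n0*p*growth cells.
risk_conc = risk(n0 * p_survive * growth)  # dose = 100 in every unit

# Integer model: each cell survives inactivation with probability p
# (binomial thinning); only units with >= 1 actual survivor can regrow.
runs = 20_000
total = 0.0
for _ in range(runs):
    survivors = sum(1 for _ in range(n0) if random.random() < p_survive)
    total += risk(survivors * growth)
risk_count = total / runs

print(round(risk_conc, 3))          # 0.632
print(risk_conc / risk_count > 10)  # True: concentration model >10x higher here
```

The means match, but in the integer model almost all units end up sterile while a few carry a huge dose, and the concave dose-response makes that mixture far less risky than the uniform 100-cell dose.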
Lehmann, Sylvain; Hoofnagle, Andrew; Hochstrasser, Denis; Brede, Cato; Glueckmann, Matthias; Cocho, José A; Ceglarek, Uta; Lenz, Christof; Vialaret, Jérôme; Scherl, Alexander; Hirtz, Christophe
2013-05-01
Proteomics studies typically aim to exhaustively detect peptides/proteins in a given biological sample. Over the past decade, the number of publications using proteomics methodologies has exploded. This was made possible due to the availability of high-quality genomic data and many technological advances in the fields of microfluidics and mass spectrometry. Proteomics in biomedical research was initially used in 'functional' studies for the identification of proteins involved in pathophysiological processes, complexes and networks. Improved sensitivity of instrumentation facilitated the analysis of even more complex sample types, including human biological fluids. It is at that point the field of clinical proteomics was born, and its fundamental aim was the discovery and (ideally) validation of biomarkers for the diagnosis, prognosis, or therapeutic monitoring of disease. Eventually, it was recognized that the technologies used in clinical proteomics studies [particularly liquid chromatography-tandem mass spectrometry (LC-MS/MS)] could represent an alternative to classical immunochemical assays. Prior to deploying MS in the measurement of peptides/proteins in the clinical laboratory, it seems likely that traditional proteomics workflows and data management systems will need to adapt to the clinical environment and meet in vitro diagnostic (IVD) regulatory constraints. This defines a new field, as reviewed in this article, that we have termed quantitative Clinical Chemistry Proteomics (qCCP).
Dynamics of water bound to crystalline cellulose
DOE Office of Scientific and Technical Information (OSTI.GOV)
O’Neill, Hugh; Pingali, Sai Venkatesh; Petridis, Loukas
Interactions of water with cellulose are of both fundamental and technological importance. Here, we characterize the properties of water associated with cellulose using deuterium labeling, neutron scattering and molecular dynamics simulation. Quasi-elastic neutron scattering provided quantitative details about the dynamical relaxation processes that occur and was supported by structural characterization using small-angle neutron scattering and X-ray diffraction. We can unambiguously detect two populations of water associated with cellulose. The first is “non-freezing bound” water that gradually becomes mobile with increasing temperature and can be related to surface water. The second population is consistent with confined water that abruptly becomes mobile at ~260 K, and can be attributed to water that accumulates in the narrow spaces between the microfibrils. Quantitative analysis of the QENS data showed that, at 250 K, the water diffusion coefficient was 0.85 ± 0.04 × 10⁻¹⁰ m² s⁻¹ and increased to 1.77 ± 0.09 × 10⁻¹⁰ m² s⁻¹ at 265 K. MD simulations are in excellent agreement with the experiments and support the interpretation that water associated with cellulose exists in two dynamical populations. Our results provide clarity to previous work investigating the states of bound water and provide a new approach for probing water interactions with lignocellulose materials.
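As a rough illustration of how such diffusion coefficients are sometimes summarized, a two-point Arrhenius estimate can be formed from the reported values. Note that the ~260 K mobility transition falls inside this temperature interval, so Arrhenius behavior is questionable here; this is illustrative arithmetic, not an analysis from the paper:

```python
import math

R_GAS = 8.314  # gas constant, J mol^-1 K^-1

def arrhenius_activation_energy(d1, t1, d2, t2):
    """Two-point apparent activation energy, assuming D = D0*exp(-Ea/(R*T)):
    Ea = R * ln(d2/d1) / (1/t1 - 1/t2)."""
    return R_GAS * math.log(d2 / d1) / (1.0 / t1 - 1.0 / t2)

# Using the two QENS diffusion coefficients reported in the abstract:
ea = arrhenius_activation_energy(0.85e-10, 250.0, 1.77e-10, 265.0)
print(round(ea / 1000.0))  # roughly 27 (kJ/mol)
```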
Mapping morphological shape as a high-dimensional functional curve
Fu, Guifang; Huang, Mian; Bo, Wenhao; Hao, Han; Wu, Rongling
2018-01-01
Detecting how genes regulate biological shape has become a multidisciplinary research interest because of its wide application in many disciplines. Despite its fundamental importance, the challenges of accurately extracting information from an image, statistically modeling the high-dimensional shape and meticulously locating shape quantitative trait loci (QTL) affect the progress of this research. In this article, we propose a novel integrated framework that incorporates shape analysis, statistical curve modeling and genetic mapping to detect significant QTLs regulating variation of biological shape traits. After quantifying morphological shape via a radius centroid contour approach, each shape, as a phenotype, was characterized as a high-dimensional curve, varying as angle θ runs clockwise with the first point starting from angle zero. We then modeled the dynamic trajectories of three mean curves and variation patterns as functions of θ. Our framework led to the detection of a few significant QTLs regulating the variation of leaf shape collected from a natural population of poplar, Populus szechuanica var tibetica. This population, distributed at altitudes 2000–4500 m above sea level, is an evolutionarily important plant species. This is the first work in the quantitative genetic shape mapping area that emphasizes a sense of ‘function’ instead of decomposing the shape into a few discrete principal components, as the majority of shape studies do. PMID:28062411
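The radius-centroid-contour descriptor used above maps a closed outline to a one-dimensional curve r(θ). A minimal generic sketch (not the authors' pipeline), with a circle as a sanity check:

```python
import math

def radius_centroid_curve(contour, n_angles=36):
    """Radius-centroid-contour descriptor: distance from the shape's
    centroid to its outline, sampled as a function of angle theta."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    buckets = [[] for _ in range(n_angles)]
    for x, y in contour:
        theta = math.atan2(y - cy, x - cx) % (2 * math.pi)
        idx = int(theta / (2 * math.pi) * n_angles) % n_angles
        buckets[idx].append(math.hypot(x - cx, y - cy))
    # mean radius per angular bin (None where a bin caught no points)
    return [sum(b) / len(b) if b else None for b in buckets]

# Sanity check: a circle of radius 2 gives the flat curve r(theta) = 2.
circle = [(2 * math.cos(2 * math.pi * k / 360), 2 * math.sin(2 * math.pi * k / 360))
          for k in range(360)]
curve = radius_centroid_curve(circle)
print(all(r is not None and abs(r - 2.0) < 1e-6 for r in curve))  # True
```

A non-circular outline would instead produce a bumpy r(θ) curve, and it is that curve, treated as a functional phenotype, that the mapping framework models.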
Fundamentals of Enzyme-Based Sensors
NASA Astrophysics Data System (ADS)
Moreno-Bondi, María C.; Benito-Peña, Elena
One of the major breakthroughs in the development of analytical measurement techniques was the introduction, in the mid-twentieth century, of bioprobes for the analysis of chemical and biochemical compounds in real samples. The first devices, developed in the 1950s and 1960s by Clark et al., were based on electrochemical measurements and allowed the determination of oxygen and glucose in tissues and blood samples. Later on, in the 1970s, optical transduction was coupled to enzymatically catalyzed reactions, and since those early days the field of application of optical biosensors has broadened considerably. According to the definition proposed by the International Union of Pure and Applied Chemistry (IUPAC): "A biosensor is a self-contained integrated device which is capable of providing specific quantitative or semi-quantitative analytical information using a biological recognition element (biochemical receptor) which is in direct spatial contact with a transducer element. A biosensor should be clearly distinguished from a bioanalytical system, which requires additional processing steps, such as reagent addition. Furthermore, a biosensor should be distinguished from a bioprobe which is either disposable after one measurement, i.e. single use, or unable to continuously monitor the analyte concentration". The general scheme of a biosensor configuration is shown in Figure 1. Biosensors that include transducers based on integrated circuit microchips are known as biochips.
Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei
2017-09-01
The laser induced breakdown spectroscopy (LIBS) technique is an effective method to detect material composition by obtaining the plasma emission spectrum. The overlapping peaks in the spectrum are a fundamental problem in the qualitative and quantitative analysis of LIBS. Based on a curve fitting method, this paper studies an error compensation method to achieve the decomposition and correction of overlapping peaks. The vital step is that the fitting residual is fed back to the overlapping peaks and multiple curve fitting passes are performed to obtain a lower-residual result. In quantitative experiments on Cu, the Cu-Fe overlapping peaks in the range of 321-327 nm obtained from the LIBS spectra of five different concentrations of CuSO₄·5H₂O solution were decomposed and corrected using curve fitting and error compensation methods. Compared with the curve fitting method alone, the error compensation reduced the fitting residual by about 18.12-32.64% and improved the correlation by about 0.86-1.82%. Then, the calibration curve between the intensity and concentration of Cu was established. The error compensation method exhibits a higher linear correlation between the intensity and concentration of Cu, and can be applied to the decomposition and correction of overlapping peaks in the LIBS spectrum.
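A minimal sketch of residual-feedback peak decomposition is shown below, assuming Gaussian line shapes with known centers and widths. The paper's actual fitting model, peak parameters, and data are not given in the abstract; everything here is synthetic and illustrative.

```python
import math

def gaussian(x, center, width):
    """Unit-amplitude Gaussian line shape."""
    return math.exp(-((x - center) ** 2) / (2 * width ** 2))

def fit_overlapping_peaks(xs, ys, peaks, n_passes=5):
    """Decompose a spectrum into Gaussians with known centers and widths
    by repeatedly refitting each amplitude against the residual left by
    the other peaks (residual feedback over multiple passes)."""
    amps = [0.0] * len(peaks)
    for _ in range(n_passes):
        for i, (c, w) in enumerate(peaks):
            g = [gaussian(x, c, w) for x in xs]
            # residual spectrum with peak i removed
            resid = [y - sum(amps[j] * gaussian(x, pc, pw)
                             for j, (pc, pw) in enumerate(peaks) if j != i)
                     for x, y in zip(xs, ys)]
            # least-squares amplitude for peak i against that residual
            amps[i] = (sum(gi * ri for gi, ri in zip(g, resid))
                       / sum(gi * gi for gi in g))
    return amps

# synthetic overlap on a 321-327 nm grid: true amplitudes 2.0 and 1.0
xs = [321.0 + 0.05 * i for i in range(121)]
peaks = [(322.5, 0.6), (324.0, 0.6)]
ys = [2.0 * gaussian(x, 322.5, 0.6) + 1.0 * gaussian(x, 324.0, 0.6) for x in xs]
amps = fit_overlapping_peaks(xs, ys, peaks)
print(amps)
```

Each pass shrinks the cross-talk between the two overlapping peaks, so the recovered amplitudes converge to the generating values after a few iterations.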
Liu, Xin; Li, Weiyi; Chong, Tzyy Haur; Fane, Anthony G
2017-03-01
Spacer design plays an important role in improving the performance of membrane processes for water/wastewater treatment. This work focused on a fundamental issue of spacer design, i.e., investigating the effects of spacer orientation on the fouling behavior during a membrane process. A series of fouling experiments with different spacer orientations was carried out to characterize in situ the formation of a cake layer in a spacer unit cell via 3D optical coherence tomography (OCT) imaging. The cake layers formed at different times were digitalized to quantitatively analyze the variation in cake morphology as a function of time. In particular, the local deposition rates were evaluated to determine the active regions where the instantaneous changes in deposit thickness were significant. The characterization results indicate that varying the spacer orientation could substantially change the evolution of membrane fouling by particulate foulants and thereby result in cake layers with various morphologies; the competition between growth and erosion at different locations responds instantaneously to the micro-hydrodynamic environment, which may change with time. This work confirms that the OCT-based characterization method is a powerful tool for exploring novel spacer designs. Copyright © 2016 Elsevier Ltd. All rights reserved.
Hydrodynamic Interactions in Active and Passive Matter
NASA Astrophysics Data System (ADS)
Krafnick, Ryan C.
Active matter is present at all biological length scales, from molecular apparatuses interior to cells, to swimming microscopic organisms, to birds, fish, and people. Its properties are varied and its applications diverse, but our understanding of the fundamental driving forces of systems with these constituents remains incomplete. This thesis examines active matter suspensions, exploring the role of hydrodynamic interactions on the unique and emergent properties therein. Both qualitative and quantitative impacts are considered, and care is taken in determining the physical origin of the results in question. It is found that fluid dynamical interactions are fundamentally, qualitatively important, and many of the properties of a system can be explained with an effective energy density defined via the fluid fields arising from the embedded self-propelling entities themselves.
Electronic and Optical properties of Graphene Nanoribbons
NASA Astrophysics Data System (ADS)
Molinari, Elisa; Ferretti, Andrea; Cardoso, Claudia; Prezzi, Deborah; Ruini, Alice
Narrow graphene nanoribbons (GNRs) exhibit substantial electronic band gaps and optical properties expected to be fundamentally different from those of their parent material, graphene. Unlike graphene, the optical response of GNRs may be tuned by the ribbon width and the directly related electronic band gap. We have addressed the optical properties of chevron-like and finite-size armchair nanoribbons by computing the fundamental and optical gaps from ab initio methods. Our results are in very good agreement with the experimental values obtained by STS, ARPES, and differential reflectance spectroscopy, indicating that this computational scheme can be quantitatively predictive for electronic and optical spectroscopies of nanostructures. This study has been partly supported by the EU Centre of Excellence ''MaX - MAterials design at the eXascale''.
An experimental approach to the fundamental principles of hemodynamics.
Pontiga, Francisco; Gaytán, Susana P
2005-09-01
An experimental model has been developed to give students hands-on experience with the fundamental laws of hemodynamics. The proposed experimental setup is of simple construction but permits precise measurement of the physical variables involved in the experiments. The model consists of a series of experiments in which different basic phenomena are quantitatively investigated, such as the pressure drop in a long straight vessel and in an obstructed vessel, the transition from laminar to turbulent flow, the association of vessels in vascular networks, or the generation of a critical stenosis. Through these experiments, students acquire a direct appreciation of the importance of the parameters involved in the relationship between pressure and flow rate, thus facilitating the comprehension of more complex problems in hemodynamics.
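The pressure-drop and laminar-to-turbulent transition experiments rest on two standard relations, the Hagen-Poiseuille law and the Reynolds number, which can be sketched as follows. The numerical values are illustrative choices, not measurements from the article.

```python
import math

def poiseuille_pressure_drop(flow_rate, radius, length, viscosity):
    """Hagen-Poiseuille pressure drop (Pa) for steady laminar flow in a
    straight rigid vessel: dP = 8 * mu * L * Q / (pi * r**4)."""
    return 8.0 * viscosity * length * flow_rate / (math.pi * radius ** 4)

def reynolds_number(flow_rate, radius, density, viscosity):
    """Re = rho * v * D / mu, with mean velocity v = Q / (pi * r**2);
    values above roughly 2000 suggest transition toward turbulence."""
    velocity = flow_rate / (math.pi * radius ** 2)
    return density * velocity * (2.0 * radius) / viscosity

# illustrative case: water-like fluid (rho = 1000 kg/m^3, mu = 1e-3 Pa s)
# in a 4 mm diameter, 0.5 m long tube at Q = 1e-6 m^3/s
dp = poiseuille_pressure_drop(1e-6, 0.002, 0.5, 1e-3)
re = reynolds_number(1e-6, 0.002, 1000.0, 1e-3)
print(f"pressure drop ~ {dp:.1f} Pa, Re ~ {re:.0f} (laminar)")
```

The fourth-power dependence on radius in the first function is what makes a critical stenosis so consequential: halving the radius raises the pressure drop sixteen-fold at fixed flow rate.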
NASA Technical Reports Server (NTRS)
Dum, C. T.
1990-01-01
The generation of waves with frequencies downshifted from the plasma frequency, as observed in the electron foreshock, is analyzed by particle simulation. Wave excitation differs fundamentally from the familiar excitation of the plasma eigenmodes by a gentle bump-on-tail electron distribution. Beam modes are destabilized by resonant interaction with bulk electrons, provided the beam velocity spread is very small. These modes are stabilized, starting with the higher frequencies, as the beam is broadened and slowed down by the interaction with the wave spectrum. Initially a very cold beam is also capable of exciting frequencies considerably above the plasma frequency, but such oscillations are quickly stabilized. Low-frequency modes persist for a long time, until the bump in the electron distribution is completely 'ironed' out. This diffusion process also is quite different from the familiar case of well-separated beam and bulk electrons. A quantitative analysis of these processes is carried out.
A Review on Segmentation of Positron Emission Tomography Images
Foster, Brent; Bagci, Ulas; Mansoor, Awais; Xu, Ziyue; Mollura, Daniel J.
2014-01-01
Positron Emission Tomography (PET), a non-invasive functional imaging method at the molecular level, images the distribution of biologically targeted radiotracers with high sensitivity. PET imaging provides detailed quantitative information about many diseases and is often used to evaluate inflammation, infection, and cancer by detecting emitted photons from a radiotracer localized to abnormal cells. In order to differentiate abnormal tissue from surrounding areas in PET images, image segmentation methods play a vital role; therefore, accurate image segmentation is often necessary for proper disease detection, diagnosis, treatment planning, and follow-ups. In this review paper, we present state-of-the-art PET image segmentation methods, as well as the recent advances in image segmentation techniques. In order to make this manuscript self-contained, we also briefly explain the fundamentals of PET imaging, the challenges of diagnostic PET image analysis, and the effects of these challenges on the segmentation results. PMID:24845019
Universal Power Law Governing Pedestrian Interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karamouzas, Ioannis; Skinner, Brian; Guy, Stephen J.
2014-12-01
Human crowds often bear a striking resemblance to interacting particle systems, and this has prompted many researchers to describe pedestrian dynamics in terms of interaction forces and potential energies. The correct quantitative form of this interaction, however, has remained an open question. Here, we introduce a novel statistical-mechanical approach to directly measure the interaction energy between pedestrians. This analysis, when applied to a large collection of human motion data, reveals a simple power-law interaction that is based not on the physical separation between pedestrians but on their projected time to a potential future collision, and is therefore fundamentally anticipatory in nature. Remarkably, this simple law is able to describe human interactions across a wide variety of situations, speeds, and densities. We further show, through simulations, that the interaction law we identify is sufficient to reproduce many known crowd phenomena.
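The anticipatory interaction can be illustrated by computing the projected time to collision for two pedestrians on constant-velocity courses and evaluating a power-law energy in that time. The abstract specifies only "a simple power-law"; the inverse-square exponent and the prefactor below are illustrative assumptions of this sketch.

```python
import math

def time_to_collision(p1, v1, p2, v2, radius_sum):
    """Smallest positive time at which two discs with combined radius
    `radius_sum` touch, assuming both keep their current velocities;
    math.inf if no collision is projected."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]
    # solve |dp + dv*t| = radius_sum, a quadratic a*t^2 + 2*b*t + c = 0
    a = dvx * dvx + dvy * dvy
    b = dx * dvx + dy * dvy
    c = dx * dx + dy * dy - radius_sum ** 2
    disc = b * b - a * c
    if a == 0.0 or disc < 0.0:
        return math.inf
    t = (-b - math.sqrt(disc)) / a
    return t if t > 0.0 else math.inf

def interaction_energy(tau, k=1.0):
    """Anticipatory pairwise energy E = k / tau**2 (illustrative form)."""
    return k / (tau * tau) if math.isfinite(tau) else 0.0

# two walkers 4 m apart heading straight at each other at 1 m/s each,
# with a combined body radius of 0.4 m
tau = time_to_collision((0.0, 0.0), (1.0, 0.0), (4.0, 0.0), (-1.0, 0.0), 0.4)
print(tau, interaction_energy(tau))
```

The key point is that the energy depends on the projected collision time, not the current separation, so two nearby pedestrians walking in parallel interact far more weakly than two distant ones on a collision course.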
Cascaes, Andreia Morales; Camargo, Maria Beatriz Junqueira de; Castilhos, Eduardo Dickie de; Silva, Alexandre Emidio Ribeiro; Barros, Aluísio J D
2017-12-01
The aim was to analyze Brazilians' private spending on dental care and oral hygiene products. Data were analyzed from 55,970 households in the Family Budgets Survey, 2008-2009. Expenditures were analyzed by major geographic region, state, state capital, and household socioeconomic and demographic characteristics (sex, age, head-of-household's skin color and schooling, per capita household income, and presence of elderly in the household). Brazilians spent an average of BRL 42.19 per year on dental care and BRL 10.27 on oral hygiene products. The study detected social inequalities in the distribution of these expenditures according to household residents' characteristics and the different geographic regions, states, and state capitals. The current study provided quantitative and specific details on Brazilians' spending on dental care and oral hygiene products. Monitoring and assessment of these expenditures are fundamental for evaluating and orienting public policies in oral health.
Single-cell technologies to study the immune system.
Proserpio, Valentina; Mahata, Bidesh
2016-02-01
The immune system is composed of a variety of cells that act in a coordinated fashion to protect the organism against a multitude of different pathogens. The great variability of existing pathogens corresponds to a similarly high heterogeneity of the immune cells. The study of individual immune cells, the fundamental unit of immunity, has recently transformed from qualitative microscopic imaging to nearly complete quantitative transcriptomic analysis. This shift has been driven by the rapid development of multiple single-cell technologies. These new advances are expected to boost the detection of less frequent cell types and transient or intermediate cell states. They will highlight the individuality of each single cell and greatly expand the resolution of currently available classifications and differentiation trajectories. In this review we discuss the recent advancement and application of single-cell technologies, their limitations and future applications to study the immune system. © 2015 The Authors. Immunology Published by John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monet, Giath; Bacon, David J; Osetskiy, Yury N
2010-01-01
Given the time and length scales in molecular dynamics (MD) simulations of dislocation-defect interactions, quantitative MD results cannot be used directly in larger scale simulations or compared directly with experiment. A method to extract fundamental quantities from MD simulations is proposed here. The first quantity is a critical stress defined to characterise the obstacle resistance. This mesoscopic parameter, rather than the obstacle 'strength' designed for a point obstacle, is to be used for an obstacle of finite size. At finite temperature, our analyses of MD simulations allow the activation energy to be determined as a function of temperature. The results confirm the proportionality between activation energy and temperature that is frequently observed by experiment. By coupling the data for the activation energy and the critical stress as functions of temperature, we show how the activation energy can be deduced at a given value of the critical stress.
NASA Astrophysics Data System (ADS)
Wang, Baoming; Haque, M. A.
2015-08-01
With atomic-scale imaging and analytical capabilities such as electron diffraction and energy-loss spectroscopy, the transmission electron microscope has allowed access to the internal microstructure of materials like no other microscopy. It has been mostly a passive or post-mortem analysis tool, but that trend is changing with in situ straining, heating and electrical biasing. In this study, we design and demonstrate a multi-functional microchip that integrates actuators, sensors, heaters and electrodes with freestanding electron transparent specimens. In addition to mechanical testing at elevated temperatures, the chip can actively control microstructures (grain growth and phase change) of the specimen material. Using nano-crystalline aluminum, nickel and zirconium as specimen materials, we demonstrate these novel capabilities inside the microscope. Our approach of active microstructural control and quantitative testing with real-time visualization can influence mechanistic modeling by providing direct and accurate evidence of the fundamental mechanisms behind materials behavior.
Label-Free Imaging and Biochemical Characterization of Bovine Sperm Cells
Ferrara, Maria Antonietta; Di Caprio, Giuseppe; Managò, Stefano; De Angelis, Annalisa; Sirleto, Luigi; Coppola, Giuseppe; De Luca, Anna Chiara
2015-01-01
A full label-free morphological and biochemical characterization is desirable to select spermatozoa during preparation for artificial insemination. In order to study these fundamental parameters, we take advantage of two attractive techniques: digital holography (DH) and Raman spectroscopy (RS). DH presents new opportunities for studying the morphological aspects of cells and tissues non-invasively, quantitatively and without the need for staining or tagging, while RS is a very specific technique allowing the biochemical analysis of cellular components with a spatial resolution in the sub-micrometer range. In this paper, morphological and biochemical bovine sperm cell alterations were studied using these techniques. In addition, a complementary DH and RS study was performed to identify X- and Y-chromosome-bearing sperm cells. We demonstrate that the two techniques together are a powerful and highly efficient tool for elucidating important criteria for sperm morphological selection and sex identification, overcoming many of the limitations associated with existing protocols. PMID:25836358
Measurement of locus copy number by hybridisation with amplifiable probes
Armour, John A. L.; Sismani, Carolina; Patsalis, Philippos C.; Cross, Gareth
2000-01-01
Despite its fundamental importance in genome analysis, it is only recently that systematic approaches have been developed to assess copy number at specific genetic loci, or to examine genomic DNA for submicroscopic deletions of unknown location. In this report we show that short probes can be recovered and amplified quantitatively following hybridisation to genomic DNA. This simple observation forms the basis of a new approach to determining locus copy number in complex genomes. The power and specificity of multiplex amplifiable probe hybridisation is demonstrated by the simultaneous assessment of copy number at a set of 40 human loci, including detection of deletions causing Duchenne muscular dystrophy and Prader–Willi/Angelman syndromes. Assembly of other probe sets will allow novel, technically simple approaches to a wide variety of genetic analyses, including the potential for extension to high resolution genome-wide screens for deletions and amplifications. PMID:10606661
Measurement of locus copy number by hybridisation with amplifiable probes.
Armour, J A; Sismani, C; Patsalis, P C; Cross, G
2000-01-15
Despite its fundamental importance in genome analysis, it is only recently that systematic approaches have been developed to assess copy number at specific genetic loci, or to examine genomic DNA for submicroscopic deletions of unknown location. In this report we show that short probes can be recovered and amplified quantitatively following hybridisation to genomic DNA. This simple observation forms the basis of a new approach to determining locus copy number in complex genomes. The power and specificity of multiplex amplifiable probe hybridisation is demonstrated by the simultaneous assessment of copy number at a set of 40 human loci, including detection of deletions causing Duchenne muscular dystrophy and Prader-Willi/Angelman syndromes. Assembly of other probe sets will allow novel, technically simple approaches to a wide variety of genetic analyses, including the potential for extension to high resolution genome-wide screens for deletions and amplifications.
3D electron tomography of pretreated biomass informs atomic modeling of cellulose microfibrils.
Ciesielski, Peter N; Matthews, James F; Tucker, Melvin P; Beckham, Gregg T; Crowley, Michael F; Himmel, Michael E; Donohoe, Bryon S
2013-09-24
Fundamental insights into the macromolecular architecture of plant cell walls will elucidate new structure-property relationships and facilitate optimization of catalytic processes that produce fuels and chemicals from biomass. Here we introduce computational methodology to extract nanoscale geometry of cellulose microfibrils within thermochemically treated biomass directly from electron tomographic data sets. We quantitatively compare the cell wall nanostructure in corn stover following two leading pretreatment strategies: dilute acid with iron sulfate co-catalyst and ammonia fiber expansion (AFEX). Computational analysis of the tomographic data is used to extract mathematical descriptions for longitudinal axes of cellulose microfibrils from which we calculate their nanoscale curvature. These nanostructural measurements are used to inform the construction of atomistic models that exhibit features of cellulose within real, process-relevant biomass. By computational evaluation of these atomic models, we propose relationships between the crystal structure of cellulose Iβ and the nanoscale geometry of cellulose microfibrils.
Cockbain, Ella; Ashby, Matthew; Brayley, Helen
2017-10-01
Child sexual exploitation is increasingly recognized nationally and internationally as a pressing child protection, crime prevention, and public health issue. In the United Kingdom, for example, a recent series of high-profile cases has fueled pressure on policy makers and practitioners to improve responses. Yet, prevailing discourse, research, and interventions around child sexual exploitation have focused overwhelmingly on female victims. This study was designed to help redress fundamental knowledge gaps around boys affected by sexual exploitation. This was achieved through rigorous quantitative analysis of individual-level data for 9,042 users of child sexual exploitation services in the United Kingdom. One third of the sample were boys, and gender was associated with statistically significant differences on many variables. The results of this exploratory study highlight the need for further targeted research and more nuanced and inclusive counter-strategies.
Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.
Montalvo-Acosta, Joel José; Cecchini, Marco
2016-12-01
The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Work probability distribution for a ferromagnet with long-ranged and short-ranged correlations
NASA Astrophysics Data System (ADS)
Bhattacharjee, J. K.; Kirkpatrick, T. R.; Sengers, J. V.
2018-04-01
Work fluctuations and work probability distributions are fundamentally different in systems with short-ranged versus long-ranged correlations. Specifically, in systems with long-ranged correlations the work distribution is extraordinarily broad compared to systems with short-ranged correlations. This difference profoundly affects the possible applicability of fluctuation theorems like the Jarzynski fluctuation theorem. The Heisenberg ferromagnet, well below its Curie temperature, is a system with long-ranged correlations in very low magnetic fields due to the presence of Goldstone modes. As the magnetic field is increased the correlations gradually become short ranged. Hence, such a ferromagnet is an ideal system for elucidating the changes of the work probability distribution as one goes from a domain with long-ranged correlations to a domain with short-ranged correlations by tuning the magnetic field. A quantitative analysis of this crossover behavior of the work probability distribution and the associated fluctuations is presented.
A changing climate of skepticism: The factors shaping climate change coverage in the US press.
Schmid-Petri, Hannah; Adam, Silke; Schmucki, Ivo; Häussler, Thomas
2017-05-01
Skepticism toward climate change has a long tradition in the United States. We focus on mass media as the conveyors of the image of climate change and ask: Is climate change skepticism still a characteristic of US print media coverage? If so, to what degree and in what form? And which factors might pave the way for skeptics entering mass media debates? We conducted a quantitative content analysis of US print media during one year (1 June 2012 to 31 May 2013). Our results show that the debate has changed: fundamental forms of climate change skepticism (such as denial of anthropogenic causes) have been abandoned in the coverage, being replaced by more subtle forms (such as the goal to avoid binding regulations). We find no evidence for the norm of journalistic balance, nor do our data support the idea that it is the conservative press that boosts skepticism.
White, M D; Bissiere, S; Alvarez, Y D; Plachta, N
2016-01-01
Compaction is a critical first morphological event in the preimplantation development of the mammalian embryo. Characterized by the transformation of the embryo from a loose cluster of spherical cells into a tightly packed mass, compaction is a key step in the establishment of the first tissue-like structures of the embryo. Although early investigation of the mechanisms driving compaction implicated changes in cell-cell adhesion, recent work has identified essential roles for cortical tension and a compaction-specific class of filopodia. During the transition from 8 to 16 cells, as the embryo is compacting, it must also make fundamental decisions regarding cell position, polarity, and fate. Understanding how these and other processes are integrated with compaction requires further investigation. Emerging imaging-based techniques that enable quantitative analysis from the level of cell-cell interactions down to the level of individual regulatory molecules will provide a greater understanding of how compaction shapes the early mammalian embryo. © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sunden, Fanny; Peck, Ariana; Salzman, Julia
Enzymes enable life by accelerating reaction rates to biological timescales. Conventional studies have focused on identifying the residues that have a direct involvement in an enzymatic reaction, but these so-called 'catalytic residues' are embedded in extensive interaction networks. Although fundamental to our understanding of enzyme function, evolution, and engineering, the properties of these networks have yet to be quantitatively and systematically explored. We dissected an interaction network of five residues in the active site of Escherichia coli alkaline phosphatase. Analysis of the complex catalytic interdependence of specific residues identified three energetically independent but structurally interconnected functional units with distinct modes of cooperativity. From an evolutionary perspective, this network is orders of magnitude more probable to arise than a fully cooperative network. From a functional perspective, new catalytic insights emerge. Further, such comprehensive energetic characterization will be necessary to benchmark the algorithms required to rationally engineer highly efficient enzymes.
Dong, Shan; Zhang, Anmin; Liu, Kai; ...
2016-02-26
The recent renaissance of black phosphorus (BP) as a two-dimensional (2D) layered material has generated tremendous interest, but its unique structural characters underlying many of its outstanding properties still need elucidation. Here we report Raman measurements that reveal an ultralow-frequency collective compression mode (CCM) in BP, which is unprecedented among similar 2D layered materials. This novel CCM indicates an unusually strong interlayer coupling, and this result is quantitatively supported by a phonon frequency analysis and first-principles calculations. Moreover, the CCM and another branch of low-frequency Raman modes shift sensitively with changing number of layers, allowing an accurate determination of the thickness up to tens of atomic layers, which is considerably higher than previously achieved by using high-frequency Raman modes. Lastly, these findings offer fundamental insights and practical tools for further exploration of BP as a highly promising new 2D semiconductor.
Effects of feedstock characteristics on microwave-assisted pyrolysis - A review.
Zhang, Yaning; Chen, Paul; Liu, Shiyu; Peng, Peng; Min, Min; Cheng, Yanling; Anderson, Erik; Zhou, Nan; Fan, Liangliang; Liu, Chenghui; Chen, Guo; Liu, Yuhuan; Lei, Hanwu; Li, Bingxi; Ruan, Roger
2017-04-01
Microwave-assisted pyrolysis is an important approach to obtain bio-oil from biomass. Similar to conventional electrical heating pyrolysis, microwave-assisted pyrolysis is significantly affected by feedstock characteristics. However, microwave heating has its unique features, which strongly depend on the physical and chemical properties of the biomass feedstock. In this review, the relationships among heating, bio-oil yield, and feedstock particle size, moisture content, inorganics, and organics in microwave-assisted pyrolysis are discussed and compared with those in conventional electrical heating pyrolysis. The quantitative analysis of data reported in the literature showed a strong contrast between the conventional processes and microwave-based processes. Microwave-assisted pyrolysis is a relatively new process with limited research compared with conventional electrical heating pyrolysis. The lack of understanding of some observed results warrants more in-depth fundamental research. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Vasisht, Vishwas V.; Dutta, Sudeep K.; Del Gado, Emanuela; Blair, Daniel L.
2018-01-01
We use a combination of confocal microscopy, rheology, and molecular dynamics simulations to investigate jammed emulsions under shear, by analyzing the 3D droplet rearrangements in the shear frame. Our quantitative analysis of local dynamics reveals elementary nonaffine rearrangements that underlie the onset of flow at small strains. We find that the mechanism of unjamming and the upturn in the material flow curve are associated with a qualitative change in the spatiotemporal correlations of such rearrangements with the applied shear rate. At high shear rates, droplet clusters follow coordinated, stringlike motion. Conversely, at low shear rates, the elementary nonaffine rearrangements exhibit longer-ranged correlations, with complex spatiotemporal patterns. The 3D microscopic details provide novel insights into the specific features of the material flow curve, common to a large class of technologically relevant soft disordered solids, and new fundamental ingredients for constitutive models.
Understanding the Global Structure and Evolution of Coronal Mass Ejections in the Solar Wind
NASA Technical Reports Server (NTRS)
Riley, Pete
2004-01-01
This report summarizes the technical progress made during the first six months of the second year of the NASA Living with a Star program contract "Understanding the global structure and evolution of coronal mass ejections in the solar wind", between NASA and Science Applications International Corporation, and covers the period November 18, 2003 - May 17, 2004. Under this contract SAIC has conducted numerical and data analysis related to fundamental issues concerning the origin, intrinsic properties, global structure, and evolution of coronal mass ejections in the solar wind. During this working period we have focused on a quantitative assessment of 5 flux rope fitting techniques. In the following sections we summarize the main aspects of this work and our proposed investigation plan for the next reporting period. Thus far, our investigation has resulted in 6 refereed scientific publications and we have presented the results at a number of scientific meetings and workshops.
Aiken, Leona S; West, Stephen G; Millsap, Roger E
2008-01-01
In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD programs (86%) participated. This survey replicated and extended a previous survey (L. S. Aiken, S. G. West, L. B. Sechrest, & R. R. Reno, 1990), permitting examination of curriculum development. Most training supported laboratory rather than field research. The median of 1.6 years of training in statistics and measurement was mainly devoted to the introductory statistics course (modally one year long), leaving little room for advanced study. Curricular enhancements were noted in statistics and, to a minor degree, in measurement. Additional coverage of both fundamental and innovative quantitative methodology is needed. The research design curriculum has largely stagnated, a cause for great concern. Elite programs showed no overall advantage in quantitative training. Forces that support curricular innovation are characterized. Human capital challenges to quantitative training, including recruiting and supporting young quantitative faculty, are discussed. Steps must be taken to bring innovations in quantitative methodology into the curriculum of PhD programs in psychology. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Mohammadfam, Iraj; Bastani, Susan; Esaghi, Mahbobeh; Golmohamadi, Rostam; Saee, Ali
2015-03-01
The purpose of this study was to examine the cohesion status of coordination within the emergency response team (ERT) of a refinery. For this study, cohesion indicators from social network analysis (SNA; density, degree centrality, reciprocity, and transitivity) were utilized to examine the coordination of the response teams as a whole network. The ERT in this case study comprised seven teams with 152 members in total. The required data were collected through structured interviews and were analyzed using the UCINET 6.0 Social Network Analysis Program. The results revealed a relatively low number of triple connections, poor coordination with key members, and a high level of mutual relations in a network with low density, all implying low cohesion of coordination in the ERT. The results showed that SNA provides a quantitative and logical approach for examining the coordination status among response teams, and it gives managers and planners an opportunity to gain a clear understanding of that status. The research concluded that fundamental efforts are needed to improve the situation.
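The cohesion indicators named above (density, reciprocity, transitivity) can be computed directly from a directed adjacency matrix. A minimal sketch on a hypothetical five-member coordination network (toy data for illustration, not the refinery ERT studied here):

```python
# Toy directed coordination network: adj[i][j] = 1 if member i reports
# coordinating with member j. Hypothetical data, illustration only.
adj = [
    [0, 1, 1, 0, 0],
    [1, 0, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 0, 1, 0],
]
n = len(adj)

# Density: observed ties / possible ties in a directed network.
ties = sum(sum(row) for row in adj)
density = ties / (n * (n - 1))

# Reciprocity: fraction of ties that are returned.
reciprocated = sum(
    1 for i in range(n) for j in range(n) if adj[i][j] and adj[j][i]
)
reciprocity = reciprocated / ties

# Transitivity: closed two-paths / all two-paths (i->j, j->k; is i->k?).
triples = closed = 0
for i in range(n):
    for j in range(n):
        for k in range(n):
            if i != k and adj[i][j] and adj[j][k]:
                triples += 1
                closed += adj[i][k]
transitivity = closed / triples if triples else 0.0

print(density, reciprocity, transitivity)
```

On this toy network the pattern resembles the study's finding: high reciprocity (0.75) alongside low density (0.4) and no closed triples.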
Chen, Qingshan; Lazennec, Jean Yves; Guyen, Olivier; Kinbrum, Amy; Berry, Daniel J; An, Kai-Nan
2005-07-01
Tripolar total hip arthroplasty (THA) prostheses have been suggested as a method to reduce the occurrence of hip dislocation and microseparation. Precisely measuring the motion of the intermediate component in vitro would provide fundamental knowledge for understanding its mechanism. The present study validates the accuracy and repeatability of a three-dimensional motion analysis system to quantitatively measure the relative motion of the intermediate component of tripolar THA prostheses. Static and dynamic validations of the system were made by comparing its measurements to those of a potentiometer. Differences between the mean system-calculated angle and the angle measured by the potentiometer were within ±1°. The mean within-trial variability was less than 1°. The mean slope was 0.9-1.02 for different angular velocities. The dynamic noise was within 1°. The system was then applied to measure the relative motion of an eccentric THA prosthesis. The study shows that this motion analysis system provides an accurate and practical method for measuring the relative motion of the tripolar THA prosthesis in vitro, a necessary first step towards understanding its in vivo kinematics.
Hwang, Hyundoo; Barnes, Dawn E; Matsunaga, Yohei; Benian, Guy M; Ono, Shoichiro; Lu, Hang
2016-01-29
The sarcomere, the fundamental unit of muscle contraction, is a highly ordered complex of hundreds of proteins. Despite decades of genetics work, the functional relationships and the roles of those sarcomeric proteins in animal behaviors remain unclear. In this paper, we demonstrate that optogenetic activation of the motor neurons that induce muscle contraction can facilitate quantitative studies of muscle kinetics in C. elegans. To increase the throughput of the study, we trapped multiple worms in parallel in a microfluidic device and illuminated them to photoactivate channelrhodopsin-2, inducing contractions in body wall muscles. Using image processing, the change in body size was quantified over time. A total of five parameters, including rate constants for contraction and relaxation, were extracted from the optogenetic assay as descriptors of sarcomere function. To potentially relate the genes encoding the sarcomeric proteins functionally, a hierarchical clustering analysis was conducted on the basis of those parameters. Because it assesses a physiological output different from conventional assays, this method complements the phenotypic analysis of C. elegans muscle mutants currently performed in many labs; the clusters may provide new insights and drive new hypotheses for functional relationships among the many sarcomere components.
Purcell, Maureen K.; Getchell, Rodman G.; McClure, Carol A.; Weber, S.E.; Garver, Kyle A.
2011-01-01
Real-time, or quantitative, polymerase chain reaction (qPCR) is quickly supplanting other molecular methods for detecting the nucleic acids of human and other animal pathogens owing to the speed and robustness of the technology. As the aquatic animal health community moves toward implementing national diagnostic testing schemes, it will need to evaluate how qPCR technology should be employed. This review outlines the basic principles of qPCR technology, considerations for assay development, standards and controls, assay performance, diagnostic validation, implementation in the diagnostic laboratory, and quality assurance and control measures. These factors are fundamental for ensuring the validity of qPCR assay results obtained in the diagnostic laboratory setting.
Sacci, Robert L; Black, Jennifer M.; Wisinger, Nina; ...
2015-02-23
The performance characteristics of Li-ion batteries are intrinsically linked to evolving nanoscale interfacial electrochemical reactions. To probe the mechanisms of solid electrolyte interphase (SEI) formation and Li electrodeposition from a standard battery electrolyte, we use in situ electrochemical scanning transmission electron microscopy for controlled potential sweep-hold electrochemical measurements with simultaneous bright-field (BF) and annular dark-field (ADF) STEM image acquisition. Through a combined quantitative electrochemical measurement and quantitative STEM imaging approach based upon electron scattering theory, we show that chemically sensitive ADF STEM imaging can be used to estimate the density of evolving SEI constituents and to distinguish contrast mechanisms of Li-bearing components in the liquid cell.
Fraunhofer line-depth sensing applied to water
NASA Technical Reports Server (NTRS)
Stoertz, G. E.
1969-01-01
An experimental Fraunhofer line discriminator (FLD) is basically an airborne fluorometer, capable of quantitatively measuring the concentration of fluorescent substances dissolved in water. It must be calibrated against standards and supplemented by ground-truth data on turbidity and on the approximate vertical distribution of the fluorescent substance. Quantitative use requires that it be known in advance what substance is the source of the luminescence emission; qualitative sensing, or detection of luminescence, is also possible. The two approaches are fundamentally different, having different purposes, different applications, and different instruments. When used for sensing of Rhodamine WT dye in coastal waters and estuaries, the FLD senses in the spectral region permitting nearly maximum depth of light penetration.
NASA Astrophysics Data System (ADS)
Obersteiner, F.; Bönisch, H.; Engel, A.
2016-01-01
We present the characterization and application of a new gas chromatography time-of-flight mass spectrometry instrument (GC-TOFMS) for the quantitative analysis of halocarbons in air samples. The setup comprises three fundamental enhancements compared to our earlier work (Hoker et al., 2015): (1) full automation, (2) a mass resolving power R = m/Δm of the TOFMS (Tofwerk AG, Switzerland) increased up to 4000 and (3) a fully accessible data format for the mass spectrometric data. Automation in combination with the accessible data allowed an in-depth characterization of the instrument. Mass accuracy was found to be approximately 5 ppm on average after automatic recalibration of the mass axis in each measurement. A TOFMS configuration giving R = 3500 was chosen to provide a resolution-to-sensitivity ratio suitable for our purpose. Calculated detection limits are as low as a few femtograms by means of the accurate mass information. The precision for substance quantification was 0.15 % at best for an individual measurement and in general mainly determined by the signal-to-noise ratio of the chromatographic peak. Detector non-linearity was found to be insignificant up to a mixing ratio of roughly 150 ppt at 0.5 L sampled volume. At higher concentrations, non-linearities of a few percent were observed (precision level: 0.2 %) but could be attributed to a potential source within the detection system. A straightforward correction for those non-linearities was applied in data processing, again by exploiting the accurate mass information. Based on the overall characterization results, the GC-TOFMS instrument was found to be very well suited for the task of quantitative halocarbon trace gas observation and a big step forward compared to scanning quadrupole MS with low mass resolving power and a TOFMS technique reported to be non-linear and restricted to a small dynamic range.
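The two figures of merit quoted above, mass resolving power R = m/Δm and mass accuracy in parts per million, are simple ratios. A brief sketch with made-up peak values (the m/z and peak width below are hypothetical, not taken from the instrument's data):

```python
# Resolving power: peak m/z divided by its full width at half maximum (FWHM).
def resolving_power(mz, fwhm):
    return mz / fwhm

# Mass accuracy: deviation of measured from theoretical m/z, in ppm.
def mass_accuracy_ppm(measured, theoretical):
    return (measured - theoretical) / theoretical * 1e6

# Hypothetical peak: theoretical m/z 85.0000, measured 85.00042, FWHM 0.0243.
R = resolving_power(85.0000, 0.0243)        # ≈ 3500, near the chosen setting
ppm = mass_accuracy_ppm(85.00042, 85.0000)  # ≈ 4.9 ppm, near the quoted ~5 ppm
print(round(R), round(ppm, 1))
```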
Schultz-Coulon, H J
1975-07-01
The applicability of a newly developed fundamental frequency analyzer to diagnosis in phoniatrics is reviewed. During routine voice examination, the analyzer allows a quick and accurate measurement of the fundamental frequency and sound level of the speaking voice, and of vocal range and maximum phonation time. By computing fundamental frequency histograms, the median fundamental frequency and the total pitch range can be better determined and compared. Objective studies of certain technical faculties of the singing voice, which usually are estimated subjectively by the speech therapist, may now be done by means of this analyzer. Several examples demonstrate the differences between correct and incorrect phonation. These studies compare the pitch perturbations during the crescendo and decrescendo of a swell tone, and show typical traces of staccato, trill, and yodel. The study concludes that fundamental frequency analysis is a valuable supplemental method for objective voice examination.
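The histogram-based measures described above (median fundamental frequency, total pitch range) are straightforward to compute from a sequence of F0 estimates. A sketch on made-up F0 samples, with the range expressed in semitones (the data and the 10 Hz bin width are illustrative assumptions):

```python
import math
from collections import Counter
from statistics import median

# Hypothetical F0 track (Hz) extracted from a speaking-voice sample.
f0 = [118, 124, 131, 127, 122, 140, 135, 129, 126, 133, 145, 120]

# Median fundamental frequency of the sample.
median_f0 = median(f0)

# Total pitch range in semitones: 12 * log2(f_max / f_min).
pitch_range_st = 12 * math.log2(max(f0) / min(f0))

# A simple F0 histogram in 10 Hz bins, as used for the histograms above.
histogram = Counter(10 * (f // 10) for f in f0)

print(median_f0, round(pitch_range_st, 1))
```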
Can NMR solve some significant challenges in metabolomics?
Nagana Gowda, G A; Raftery, Daniel
2015-11-01
The field of metabolomics continues to witness rapid growth driven by fundamental studies, methods development, and applications in a number of disciplines that include biomedical science, plant and nutrition sciences, drug development, energy and environmental sciences, toxicology, etc. NMR spectroscopy is one of the two most widely used analytical platforms in the metabolomics field, along with mass spectrometry (MS). NMR's excellent reproducibility and quantitative accuracy, its ability to identify structures of unknown metabolites, its capacity to generate metabolite profiles using intact bio-specimens with no need for separation, and its capabilities for tracing metabolic pathways using isotope labeled substrates offer unique strengths for metabolomics applications. However, NMR's limited sensitivity and resolution continue to pose a major challenge and have restricted both the number and the quantitative accuracy of metabolites analyzed by NMR. Further, the analysis of highly complex biological samples has increased the demand for new methods with improved detection, better unknown identification, and more accurate quantitation of larger numbers of metabolites. Recent efforts have contributed significant improvements in these areas, and have thereby enhanced the pool of routinely quantifiable metabolites. Additionally, efforts focused on combining NMR and MS promise opportunities to exploit the combined strength of the two analytical platforms for direct comparison of the metabolite data, unknown identification and reliable biomarker discovery that continue to challenge the metabolomics field. This article presents our perspectives on the emerging trends in NMR-based metabolomics and NMR's continuing role in the field with an emphasis on recent and ongoing research from our laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.
Anomalous chiral transport in heavy ion collisions from Anomalous-Viscous Fluid Dynamics
NASA Astrophysics Data System (ADS)
Shi, Shuzhe; Jiang, Yin; Lilleskov, Elias; Liao, Jinfeng
2018-07-01
Chiral anomaly is a fundamental aspect of quantum theories with chiral fermions. How such a microscopic anomaly manifests itself in a macroscopic many-body system with chiral fermions is a highly nontrivial question that has recently attracted significant interest. As it turns out, unusual transport currents can be induced by the chiral anomaly under suitable conditions in such systems, with the notable example of the Chiral Magnetic Effect (CME), where a vector current (e.g. electric current) is generated along an external magnetic field. A lot of effort has been made to search for the CME in heavy ion collisions by measuring the charge separation effect induced by CME transport. A crucial challenge in this effort is the quantitative prediction of the CME signal. In this paper, we develop the Anomalous-Viscous Fluid Dynamics (AVFD) framework, which implements anomalous fluid dynamics to describe the evolution of fermion currents in the QGP, on top of the neutral bulk background described by the VISH2+1 hydrodynamic simulations for heavy ion collisions. With this new tool, we quantitatively and systematically investigate the dependence of the CME signal on a series of theoretical inputs and associated uncertainties. With realistic estimates of initial conditions and magnetic field lifetime, the predicted CME signal is quantitatively consistent with measured charge separation data in 200 GeV Au-Au collisions. Based on the analysis of Au-Au collisions, we further make predictions for the CME observable to be measured in the planned isobaric (Ru-Ru vs. Zr-Zr) collision experiment, which could provide a decisive test of the CME in heavy ion collisions.
[Building mathematics in imagination].
Patras, Frédéric
2015-01-01
The extraordinary quantitative achievements of contemporary science often hide their qualitative dimension. In mathematics, the understanding of fundamental theoretical phenomena we have today goes far beyond that achieved in previous periods. This also holds when it comes to the theorisation of mathematical practice. Philosophically, these changes largely remain to be properly analyzed. The present article addresses this issue from the point of view of Bachelard's epistemology.
Low-Energy Positron-Matter Interactions Using Trap-Based Beams
2002-06-24
qualitatively by the recent exploitation of nonneutral plasma physics techniques to produce antimatter plasmas and beams in new regimes of parameter space...a quantitative antimatter-matter chemistry, important not only in obtaining a fundamental understanding of nature, but also in using antimatter in...ANNIHILATION MEASUREMENTS The fate of all antimatter in our world is annihilation with ordinary matter. Thus understanding the details of these annihilation
Comparative genomics of defense systems in archaea and bacteria
Makarova, Kira S.; Wolf, Yuri I.; Koonin, Eugene V.
2013-01-01
Our knowledge of prokaryotic defense systems has vastly expanded as the result of comparative genomic analysis, followed by experimental validation. This expansion is both quantitative, including the discovery of diverse new examples of known types of defense systems, such as restriction-modification or toxin-antitoxin systems, and qualitative, including the discovery of fundamentally new defense mechanisms, such as the CRISPR-Cas immunity system. Large-scale statistical analysis reveals that the distribution of different defense systems in bacterial and archaeal taxa is non-uniform, with four groups of organisms distinguishable with respect to the overall abundance and the balance between specific types of defense systems. The genes encoding defense system components in bacteria and archaea typically cluster in defense islands. In addition to genes encoding known defense systems, these islands contain numerous uncharacterized genes, which are candidates for new types of defense systems. The tight association of the genes encoding immunity systems and dormancy- or cell death-inducing defense systems in prokaryotic genomes suggests that these two major types of defense are functionally coupled, providing for effective protection at the population level. PMID:23470997
A practical guide to single-cell RNA-sequencing for biomedical research and clinical applications.
Haque, Ashraful; Engel, Jessica; Teichmann, Sarah A; Lönnberg, Tapio
2017-08-18
RNA sequencing (RNA-seq) is a genomic approach for the detection and quantitative analysis of messenger RNA molecules in a biological sample and is useful for studying cellular responses. RNA-seq has fueled much discovery and innovation in medicine over recent years. For practical reasons, the technique is usually conducted on samples comprising thousands to millions of cells. However, this has hindered direct assessment of the fundamental unit of biology: the cell. Since the first single-cell RNA-sequencing (scRNA-seq) study was published in 2009, many more have been conducted, mostly by specialist laboratories with unique skills in wet-lab single-cell genomics, bioinformatics, and computation. However, with the increasing commercial availability of scRNA-seq platforms, and the rapid ongoing maturation of bioinformatics approaches, a point has been reached where any biomedical researcher or clinician can use scRNA-seq to make exciting discoveries. In this review, we present a practical guide to help researchers design their first scRNA-seq studies, including introductory information on experimental hardware, protocol choice, quality control, data analysis and biological interpretation.
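One of the quality-control steps mentioned above is filtering out low-quality cell barcodes before downstream analysis. A minimal sketch of threshold-based cell QC on a toy gene-by-cell count matrix (the matrix and cutoffs are illustrative assumptions, not recommended defaults; real pipelines also screen mitochondrial fraction and doublets):

```python
# Toy gene-by-cell count matrix: rows are genes, columns are cells.
counts = [
    [5, 0, 120, 3],   # gene A
    [2, 0,  80, 1],   # gene B
    [0, 1,  40, 0],   # gene C
    [7, 0, 200, 2],   # gene D
]
n_genes, n_cells = len(counts), len(counts[0])

# Per-cell totals and number of detected genes.
total_counts = [sum(counts[g][c] for g in range(n_genes)) for c in range(n_cells)]
genes_detected = [sum(counts[g][c] > 0 for g in range(n_genes)) for c in range(n_cells)]

# Keep cells passing both thresholds (illustrative cutoffs).
MIN_COUNTS, MIN_GENES = 10, 3
keep = [c for c in range(n_cells)
        if total_counts[c] >= MIN_COUNTS and genes_detected[c] >= MIN_GENES]
print(keep)  # indices of cells passing QC
```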
Non-Lambertian reflectance modeling and shape recovery of faces using tensor splines.
Kumar, Ritwik; Barmpoutis, Angelos; Banerjee, Arunava; Vemuri, Baba C
2011-03-01
Modeling illumination effects and pose variations of a face is of fundamental importance in the field of facial image analysis. Most of the conventional techniques that simultaneously address both of these problems work with the Lambertian assumption and thus fall short of accurately capturing the complex intensity variation that the facial images exhibit or recovering their 3D shape in the presence of specularities and cast shadows. In this paper, we present a novel Tensor-Spline-based framework for facial image analysis. We show that, using this framework, the facial apparent BRDF field can be accurately estimated while seamlessly accounting for cast shadows and specularities. Further, using local neighborhood information, the same framework can be exploited to recover the 3D shape of the face (to handle pose variation). We quantitatively validate the accuracy of the Tensor Spline model using a more general model based on the mixture of single-lobed spherical functions. We demonstrate the effectiveness of our technique by presenting extensive experimental results for face relighting, 3D shape recovery, and face recognition using the Extended Yale B and CMU PIE benchmark data sets.
Utilizing visual art to enhance the clinical observation skills of medical students.
Jasani, Sona K; Saks, Norma S
2013-07-01
Clinical observation is fundamental in practicing medicine, but these skills are rarely taught. Currently no evidence-based exercises/courses exist for medical student training in observation skills. The goal was to develop and teach a visual arts-based exercise for medical students, and to evaluate its usefulness in enhancing observation skills in clinical diagnosis. A pre- and posttest and evaluation survey were developed for a three-hour exercise presented to medical students just before starting clerkships. Students were provided with questions to guide discussion of both representational and non-representational works of art. Quantitative analysis revealed that the mean number of observations between pre- and posttests was not significantly different (n=70: 8.63 vs. 9.13, p=0.22). Qualitative analysis of written responses identified four themes: (1) use of subjective terminology, (2) scope of interpretations, (3) speculative thinking, and (4) use of visual analogies. Evaluative comments indicated that students felt the exercise enhanced both mindfulness and skills. Using visual art images with guided questions can train medical students in observation skills. This exercise can be replicated without specially trained personnel or art museum partnerships.
Hunt, Kristopher A.; Jennings, Ryan deM.; Inskeep, William P.; Carlson, Ross P.
2017-01-01
Assimilatory and dissimilatory utilisation of autotroph biomass by heterotrophs is a fundamental mechanism for the transfer of nutrients and energy across trophic levels. Metagenome data from a tractable, thermoacidophilic microbial community in Yellowstone National Park was used to build an in silico model to study heterotrophic utilisation of autotroph biomass using elementary flux mode analysis and flux balance analysis. Assimilatory and dissimilatory biomass utilisation was investigated using 29 forms of biomass-derived dissolved organic carbon (DOC) including individual monomer pools, individual macromolecular pools and aggregate biomass. The simulations identified ecologically competitive strategies for utilizing DOC under conditions of varying electron donor, electron acceptor or enzyme limitation. The simulated growth environment affected which form of DOC was the most competitive use of nutrients; for instance, oxygen limitation favoured utilisation of less reduced and fermentable DOC while carbon-limited environments favoured more reduced DOC. Additionally, metabolism was studied considering two encompassing metabolic strategies: simultaneous versus sequential use of DOC. Results of this study bound the transfer of nutrients and energy through microbial food webs, providing a quantitative foundation relevant to most microbial ecosystems. PMID:27387069
Pulsed Magnetic Field Improves the Transport of Iron Oxide Nanoparticles through Cell Barriers
Min, Kyoung Ah; Shin, Meong Cheol; Yu, Faquan; Yang, Meizhu; David, Allan E.; Yang, Victor C.; Rosania, Gus R.
2013-01-01
Understanding how a magnetic field affects the interaction of magnetic nanoparticles (MNPs) with cells is fundamental to any potential downstream applications of MNPs as gene and drug delivery vehicles. Here, we present a quantitative analysis of how a pulsed magnetic field influences the manner in which MNPs interact with, and penetrate across a cell monolayer. Relative to a constant magnetic field, the rate of MNP uptake and transport across cell monolayers was enhanced by a pulsed magnetic field. MNP transport across cells was significantly inhibited at low temperature under both constant and pulsed magnetic field conditions, consistent with an active mechanism (i.e. endocytosis) mediating MNP transport. Microscopic observations and biochemical analysis indicated that, in a constant magnetic field, transport of MNPs across the cells was inhibited due to the formation of large (>2 μm) magnetically-induced MNP aggregates, which exceeded the size of endocytic vesicles. Thus, a pulsed magnetic field enhances the cellular uptake and transport of MNPs across cell barriers relative to a constant magnetic field by promoting accumulation while minimizing magnetically-induced MNP aggregates at the cell surface. PMID:23373613
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lindenmaier, Rodica; Scharko, Nicole K.; Tonkyn, Russell G.
Xylenes contain a blend of the ortho-, meta-, and para- isomers, and all are abundant contaminants in the ground, surface waters, and air. To better characterize xylene and to better enable its detection, we report high quality quantitative vapor-phase infrared spectra of all three isomers over the 540-6500 cm-1 range. All fundamental vibrational modes are assigned based on these vapor-phase infrared spectra, liquid-phase infrared and Raman spectra, along with density functional theory (DFT), ab initio MP2 and high energy-accuracy compound theoretical model (W1BD) calculations. Both MP2 and DFT predict a single conformer with C2v symmetry for ortho-xylene, and two conformers each for meta- and para-xylene, depending on the preferred orientations of the methyl groups. For meta-xylene the two conformers have Cs and C2 symmetry, and for para-xylene these conformers have C2v or C2h symmetry. Since the relative population of the two conformers is approximately 50% for both isomers and predicted frequencies and intensities are very similar for each conformer, we made an arbitrary choice to discuss the Cs conformer for meta-xylene and the C2v conformer for para-xylene. We report integrated band intensities for all isomers. Using the quantitative infrared data, we determine the global warming potential values of each isomer and discuss potential bands for atmospheric monitoring.
NASA Astrophysics Data System (ADS)
Consonni, Viviana; Todeschini, Roberto
In the last decades, much scientific research has focused on how to encode and convert - by a theoretical pathway - the information contained in the molecular structure into one or more numbers used to establish quantitative relationships between structures and properties, biological activities, or other experimental properties. Molecular descriptors are formally mathematical representations of a molecule, obtained by a well-specified algorithm applied to a defined molecular representation or by a well-specified experimental procedure. They play a fundamental role in chemistry, pharmaceutical sciences, environmental protection policy, toxicology, ecotoxicology, health research, and quality control. Evidence of the scientific community's interest in molecular descriptors is provided by the huge number of descriptors proposed to date: more than 5000 descriptors derived from different theories and approaches are defined in the literature, and most of them can be calculated by means of dedicated software applications. Molecular descriptors are of outstanding importance in the research fields of quantitative structure-activity relationships (QSARs) and quantitative structure-property relationships (QSPRs), where they are the independent chemical information used to predict the properties of interest. Along with the definition of appropriate molecular descriptors, the molecular structure representation and the mathematical tools for deriving and assessing models are other fundamental components of the QSAR/QSPR approach. The remarkable progress during the last few years in chemometrics and chemoinformatics has led to new strategies for finding mathematically meaningful relationships between the molecular structure and biological activities, physico-chemical, toxicological, and environmental properties of chemicals.
Different approaches for deriving molecular descriptors are reviewed here, and some of the most relevant descriptors are presented in detail with numerical examples.
NASA Astrophysics Data System (ADS)
Lindenmaier, Rodica; Scharko, Nicole K.; Tonkyn, Russell G.; Nguyen, Kiet T.; Williams, Stephen D.; Johnson, Timothy J.
2017-12-01
Xylenes contain a blend of the ortho-, meta-, and para- isomers, and all are abundant contaminants in the ground, surface waters, and air. To better characterize xylene and to better enable its detection, high quality quantitative vapor-phase infrared spectra of all three isomers over the 6500 - 540 cm-1 range are reported. All fundamental vibrational modes are assigned based on these vapor-phase infrared spectra, liquid-phase infrared and Raman spectra, along with density functional theory (DFT), ab initio MP2 and high energy-accuracy compound theoretical model (W1BD) calculations. Both MP2 and DFT predict a single conformer with C2v symmetry for ortho-xylene, and two conformers each for meta- and para-xylene, depending on the preferred orientations of the methyl groups. For meta-xylene the two conformers have Cs and C2 symmetry, and for para-xylene these conformers have C2v or C2h symmetry. Since the relative population of the two conformers is approximately 50% for both isomers and predicted frequencies and intensities are very similar for each conformer, an arbitrary choice to discuss the Cs conformer for meta-xylene and the C2v conformer for para-xylene is made. Integrated band intensities for all isomers are reported. Using the quantitative infrared data, the global warming potential values of each isomer are determined. Potential bands for atmospheric monitoring are also discussed.
Macrostrat: A Platform for Geological Data Integration and Deep-Time Earth Crust Research
NASA Astrophysics Data System (ADS)
Peters, Shanan E.; Husson, Jon M.; Czaplewski, John
2018-04-01
Characterizing the lithology, age, and physical-chemical properties of rocks and sediments in the Earth's upper crust is necessary to fully assess energy, water, and mineral resources and to address many fundamental questions. Although a large number of geological maps, regional geological syntheses, and sample-based measurements have been produced, there is no openly available database that integrates rock record-derived data, while also facilitating large-scale, quantitative characterization of the volume, age, and material properties of the upper crust. Here we describe Macrostrat, a relational geospatial database and supporting cyberinfrastructure that is designed to enable quantitative spatial and geochronological analyses of the entire assemblage of surface and subsurface sedimentary, igneous, and metamorphic rocks. Macrostrat contains general, comprehensive summaries of the age and properties of 33,903 lithologically and chronologically defined geological units distributed across 1,474 regions in North and South America, the Caribbean, New Zealand, and the deep sea. Sample-derived data, including fossil occurrences in the Paleobiology Database, more than 180,000 geochemical and outcrop-derived measurements, and more than 2.3 million bedrock geologic map units from over 200 map sources, are linked to specific Macrostrat units and/or lithologies. Macrostrat has generated numerous quantitative results and its infrastructure is used as a data platform in several independently developed mobile applications. It is necessary to expand geographic coverage and to refine age models and material properties to arrive at a more precise characterization of the upper crust globally and test fundamental hypotheses about the long-term evolution of Earth systems.
Contract Actions for Leased Equipment
1999-06-30
Fundamentals, Fundamentals of Contract Pricing, and Government Contract Law courses. The additional instruction should emphasize the contracting officers...Contracting Fundamentals, Fundamentals of Contract Pricing, and Government Contract Law courses. This additional instruction should emphasize the important...FAR 107.401 and 207.470 in the Basics of Contracting and Government Contract Law courses, and that price analysis in assessing lease versus purchase
The Role for an Evaluator: A Fundamental Issue for Evaluation of Education and Social Programs
ERIC Educational Resources Information Center
Luo, Heng
2010-01-01
This paper discusses one of the fundamental issues in education and social program evaluation: the proper role for an evaluator. Based on respective and comparative analysis of five theorists' positions on this fundamental issue, this paper reveals how different perspectives on other fundamental issues in evaluation such as value, methods, use and…
Transcriptomic analysis of flower development in tea (Camellia sinensis (L.)).
Liu, Feng; Wang, Yu; Ding, Zhaotang; Zhao, Lei; Xiao, Jun; Wang, Linjun; Ding, Shibo
2017-10-05
Flowering is a critical and complicated process in plant development, involving interactions of numerous endogenous and environmental factors, but little is known about the complex network regulating flower development in tea plants. In this study, de novo transcriptome assembly and gene expression analysis were performed using Illumina sequencing technology. The transcriptomic analysis assembled gene-related information involved in the reproductive growth of C. sinensis. Gene Ontology (GO) analysis of the annotated unigenes revealed that the majority of sequenced genes were associated with metabolic and cellular processes, cell and cell parts, catalytic activity, and binding. Kyoto Encyclopedia of Genes and Genomes (KEGG) pathway analysis indicated that metabolic pathways, biosynthesis of secondary metabolites, and plant hormone signal transduction were enriched among the DEGs. Furthermore, 207 flowering-associated unigenes were identified from our database. Some transcription factors, such as WRKY, ERF, bHLH, MYB, and MADS-box, were shown to be up-regulated during floral transition and might play a role in the progression of flowering. In addition, 14 genes were selected for confirmation of expression levels using quantitative real-time PCR (qRT-PCR). The comprehensive transcriptomic analysis presents fundamental information on the genes and pathways involved in flower development in C. sinensis. Our data also provide a useful database for further research on tea and other plant species. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhao, Yang; Zhang, Lei; Zhao, Shu-Xia; Li, Yu-Fang; Gong, Yao; Dong, Lei; Ma, Wei-Guang; Yin, Wang-Bao; Yao, Shun-Chun; Lu, Ji-Dong; Xiao, Lian-Tuan; Jia, Suo-Tang
2016-12-01
Laser-induced breakdown spectroscopy (LIBS) is an emerging analytical spectroscopy technique. This review presents the main recent developments in China regarding the implementation of LIBS for coal analysis. The paper focuses mainly on the progress of the past few years in the fundamentals, data pretreatment, calibration models, and experimental issues of LIBS and its application to coal analysis. Many important domestic studies focusing on coal quality analysis have been conducted. For example, a proposed novel hybrid quantification model can provide more reproducible quantitative analytical results; the model obtained average absolute errors (AREs) of 0.42%, 0.05%, 0.07%, and 0.17% for carbon, hydrogen, volatiles, and ash, respectively, and 0.07 MJ/kg for heat value. Atomic/ionic emission lines and molecular bands, such as CN and C2, have been employed to generate more accurate analysis results, achieving an ARE of 0.26% and a 0.16% limit of detection (LOD) for the prediction of unburned carbon in fly ashes. Both laboratory and on-line LIBS apparatuses have been developed for field application in coal-fired power plants. We consider that both the accuracy and the repeatability of the elemental and proximate analysis of coal have increased significantly, and further efforts will be devoted to realizing large-scale commercialization of coal quality analyzers in China.
Ocean wavenumber estimation from wave-resolving time series imagery
Plant, N.G.; Holland, K.T.; Haller, M.C.
2008-01-01
We review several approaches that have been used to estimate ocean surface gravity wavenumbers from wave-resolving remotely sensed image sequences. Two fundamentally different approaches that utilize these data exist. A power spectral density approach identifies wavenumbers where image intensity variance is maximized. Alternatively, a cross-spectral correlation approach identifies wavenumbers where intensity coherence is maximized. We develop a solution to the latter approach based on a tomographic analysis that utilizes a nonlinear inverse method. The solution is tolerant to noise and other forms of sampling deficiency and can be applied to arbitrary sampling patterns, as well as to full-frame imagery. The solution includes error predictions that can be used for data retrieval quality control and for evaluating sample designs. A quantitative analysis of the intrinsic resolution of the method indicates that the cross-spectral correlation fitting improves resolution by a factor of about ten times as compared to the power spectral density fitting approach. The resolution analysis also provides a rule of thumb for nearshore bathymetry retrievals-short-scale cross-shore patterns may be resolved if they are about ten times longer than the average water depth over the pattern. This guidance can be applied to sample design to constrain both the sensor array (image resolution) and the analysis array (tomographic resolution). ?? 2008 IEEE.
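The power-spectral-density approach described above, picking the wavenumber at which intensity variance peaks, can be sketched in a few lines. The synthetic transect and the plain DFT below are purely illustrative and are not the paper's tomographic cross-spectral method.

```python
import cmath
import math

def dominant_wavenumber(samples, dx):
    """Return the wavenumber (rad/m) of the strongest nonzero DFT bin."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]  # remove the DC component
    best_k, best_power = 0, -1.0
    for k in range(1, n // 2 + 1):          # positive frequencies only
        coef = sum(c * cmath.exp(-2j * math.pi * k * i / n)
                   for i, c in enumerate(centered))
        power = abs(coef) ** 2              # spectral power ~ intensity variance
        if power > best_power:
            best_k, best_power = k, power
    return 2 * math.pi * best_k / (n * dx)

# A monochromatic "wave" of wavelength 50 m sampled every 1 m:
dx, wavelength = 1.0, 50.0
signal = [math.cos(2 * math.pi * x / wavelength) for x in range(200)]
k_est = dominant_wavenumber(signal, dx)     # expect 2*pi/50 rad/m
```

Real imagery is far noisier, which is precisely why the cross-spectral correlation approach discussed in the paper resolves wavenumbers so much better.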
Student Learning about Biomolecular Self-Assembly Using Two Different External Representations
Höst, Gunnar E.; Larsson, Caroline; Olson, Arthur; Tibell, Lena A. E.
2013-01-01
Self-assembly is the fundamental but counterintuitive principle that explains how ordered biomolecular complexes form spontaneously in the cell. This study investigated the impact of using two external representations of virus self-assembly, an interactive tangible three-dimensional model and a static two-dimensional image, on student learning about the process of self-assembly in a group exercise. A conceptual analysis of self-assembly into a set of facets was performed to support study design and analysis. Written responses were collected in a pretest/posttest experimental design with 32 Swedish university students. A quantitative analysis of close-ended items indicated that the students improved their scores between pretest and posttest, with no significant difference between the conditions (tangible model/image). A qualitative analysis of an open-ended item indicated students were unfamiliar with self-assembly prior to the study. Students in the tangible model condition used the facets of self-assembly in their open-ended posttest responses more frequently than students in the image condition. In particular, it appears that the dynamic properties of the tangible model may support student understanding of self-assembly in terms of the random and reversible nature of molecular interactions. A tentative difference was observed in response complexity, with more multifaceted responses in the tangible model condition. PMID:24006395
Wang, Shan-Ning; Peng, Yong; Lu, Zi-Yun; Dhiloo, Khalid Hussain; Zheng, Yao; Shan, Shuang; Li, Rui-Jun; Zhang, Yong-Jun; Guo, Yu-Yuan
2016-07-01
Ionotropic receptors (IRs) mainly detect the acids and amines having great importance in many insect species, representing an ancient olfactory receptor family in insects. In the present work, we performed RNAseq of Microplitis mediator antennae and identified seventeen IRs. Full-length MmedIRs were cloned and sequenced. Phylogenetic analysis of the Hymenoptera IRs revealed that ten MmedIR genes encoded "antennal IRs" and seven encoded "divergent IRs". Among the IR25a orthologous groups, two genes, MmedIR25a.1 and MmedIR25a.2, were found in M. mediator. Gene structure analysis of MmedIR25a revealed a tandem duplication of IR25a in M. mediator. The tissue distribution and development specific expression of the MmedIR genes suggested that these genes showed a broad expression profile. Quantitative gene expression analysis showed that most of the genes are highly enriched in adult antennae, indicating the candidate chemosensory function of this family in parasitic wasps. Using immunocytochemistry, we confirmed that one co-receptor, MmedIR8a, was expressed in the olfactory sensory neurons. Our data will supply fundamental information for functional analysis of the IRs in parasitoid wasp chemoreception. Copyright © 2016 Elsevier Ltd. All rights reserved.
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R. Chad; Bonifas, Andrew P.; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A.; Huang, Yonggang; West, Dennis P.; Paller, Amy S.; Alam, Murad
2014-01-01
Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here we report a skin-like electronics platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of ‘epidermal’ electronics system in a realistic, clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management. PMID:24668927
Timmins, Fiona; Neill, Freda; Murphy, Maryanne; Begley, Thelma; Sheaf, Greg
2015-11-01
Spirituality is receiving unprecedented attention in the nursing literature. Both the volume and scope of literature on the topic are expanding, and it is clear that this topic is of interest to nurses. There is consensus that the spiritual care required by clients receiving health care ought to be an integrated effort across the health care team. Although undergraduate nurses receive some education on the topic, this is ad hoc and inconsistent across universities. Textbooks are clearly a key resource in this area; however, the extent to which they form a comprehensive guide for nursing students and nurses is unclear. This study provides a hitherto unperformed analysis of core nursing textbooks to ascertain spirituality-related content. In total, 543 books were examined, providing a range of useful information about inclusions and omissions in this field. Findings revealed that spirituality is not strongly portrayed as a component of holistic care and that specific direction for the provision of spiritual care is lacking. Fundamental textbooks used by nurses and nursing students ought to inform and guide integrated spiritual care and reflect a more holistic approach to nursing care. The religious and/or spiritual needs of an increasingly diverse community need to be taken seriously within scholarly texts so that this commitment to individual clients' needs can be mirrored in practice. Copyright © 2015 Elsevier Ltd. All rights reserved.
Generation mechanisms of fundamental rogue wave spatial-temporal structure.
Ling, Liming; Zhao, Li-Chen; Yang, Zhan-Ying; Guo, Boling
2017-08-01
We discuss the generation mechanism of fundamental rogue wave structures in N-component coupled systems, based on analytical solutions of the nonlinear Schrödinger equation and modulational instability analysis. Our analysis discloses that the pattern of a fundamental rogue wave is determined by the evolution energy and growth rate of the resonant perturbation that is responsible for forming the rogue wave. This finding allows one to predict the rogue wave pattern without the need to solve the N-component coupled nonlinear Schrödinger equation. Furthermore, our results show that N-component coupled nonlinear Schrödinger systems may possess N different fundamental rogue wave patterns at most. These results can be extended to evaluate the type and number of fundamental rogue wave structure in other coupled nonlinear systems.
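For the single-component special case of the systems discussed above, the fundamental rogue wave has a closed form: the Peregrine soliton of the focusing nonlinear Schrödinger equation. The dimensionless normalization below is one common convention, chosen here for illustration.

```latex
% Focusing scalar NLS in dimensionless form:
i\,\psi_t + \tfrac{1}{2}\,\psi_{xx} + |\psi|^2\psi = 0
% Peregrine soliton: the fundamental rogue wave on a plane-wave background,
% localized in both x and t, with peak amplitude three times the background:
\psi(x,t) = \left[\,1 - \frac{4\,(1 + 2\,i\,t)}{1 + 4x^{2} + 4t^{2}}\,\right] e^{\,i t}
```

At the origin $|\psi(0,0)| = 3$, the threefold amplification that makes this solution the standard prototype of a rogue wave; the N-component systems of the paper admit up to N distinct generalizations of this pattern.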
Astrophysical properties of star clusters in the Magellanic Clouds homogeneously estimated by ASteCA
NASA Astrophysics Data System (ADS)
Perren, G. I.; Piatti, A. E.; Vázquez, R. A.
2017-06-01
Aims: We seek to produce a homogeneous catalog of astrophysical parameters of 239 resolved star clusters, located in the Small and Large Magellanic Clouds, observed in the Washington photometric system. Methods: The cluster sample was processed with the recently introduced Automated Stellar Cluster Analysis (ASteCA) package, which ensures both an automatized and a fully reproducible treatment, together with a statistically based analysis of their fundamental parameters and associated uncertainties. The fundamental parameters determined for each cluster with this tool, via a color-magnitude diagram (CMD) analysis, are metallicity, age, reddening, distance modulus, and total mass. Results: We generated a homogeneous catalog of structural and fundamental parameters for the studied cluster sample and performed a detailed internal error analysis along with a thorough comparison with values taken from 26 published articles. We studied the distribution of cluster fundamental parameters in both Clouds and obtained their age-metallicity relationships. Conclusions: The ASteCA package can be applied to an unsupervised determination of fundamental cluster parameters, which is a task of increasing relevance as more data becomes available through upcoming surveys. A table with the estimated fundamental parameters for the 239 clusters analyzed is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/602/A89
Single Case Method in Psychology: How to Improve as a Possible Methodology in Quantitative Research.
Krause-Kjær, Elisa; Nedergaard, Jensine I
2015-09-01
Awareness of including the Single-Case Method (SCM) as a possible methodology in quantitative research in the field of psychology has been argued as useful, e.g., by Hurtado-Parrado and López-López (IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). Their article introduces a historical and conceptual analysis of SCMs and proposes changing the often prevailing tendency of neglecting SCM as an alternative to Null Hypothesis Significance Testing (NHST). This article contributes by putting a new light on SCM as an equally important methodology in psychology. The intention of the present article is to elaborate this point of view further by discussing one of the most fundamental requirements and main characteristics of SCM, namely temporality: "…performance is assessed continuously over time and under different conditions…" (Hurtado-Parrado and López-López, IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). When defining principles for particular units of analysis, both synchronic (spatial) and diachronic (temporal) elements should be incorporated. In this article, misunderstandings of the SCM are adduced, and temporality is described in order to propose how the SCM could have broader usability in psychological research. It is further discussed how to implement SCM in psychological methodology. It is suggested that one solution might be to reconsider the notion of time in psychological research to cover more than a variable of control, and in this respect also to include the notion of time as an irreversible unity within life.
The mettle of moral fundamentalism: a reply to Robert Baker.
Beauchamp, Tom L
1998-12-01
This article is a reply to Robert Baker's attempt to rebut moral fundamentalism, while grounding international bioethics in a form of contractarianism. Baker is mistaken in several of his interpretations of the alleged moral fundamentalism and findings of the Advisory Committee on Human Radiation Experiments. He also misunderstands moral fundamentalism generally and wrongly categorizes it as morally bankrupt. His negotiated contract model is, in the final analysis, itself a form of the moral fundamentalism he declares bankrupt.
NASA Astrophysics Data System (ADS)
Monroe, Roberta Lynn
The intrinsic fundamental frequency effect among vowels is a vocalic phenomenon of adult speech in which high vowels have higher fundamental frequencies relative to low vowels. Acoustic investigations of children's speech have shown that variability of the speech signal decreases as children's ages increase. Fundamental frequency measures have been suggested as an indirect metric for the development of laryngeal stability and coordination. Studies of the intrinsic fundamental frequency effect have been conducted among 8- and 9-year-old children and in infants. The present study investigated this effect among 2- and 4-year-old children. Eight 2-year-old and eight 4-year-old children produced four vowels, /ae/, /i/, /u/, and /a/, in CVC syllables. Three measures of fundamental frequency were taken: mean fundamental frequency, the intra-utterance standard deviation of the fundamental frequency, and the extent to which the cycle-to-cycle pattern of the fundamental frequency was predicted by a linear trend. An analysis of variance was performed to compare the two age groups, the four vowels, and the earlier and later repetitions of the CVC syllables. A significant difference between the two age groups was detected using the intra-utterance standard deviation of the fundamental frequency. Mean fundamental frequencies and linear trend analysis showed that voicing of the preceding consonant determined the statistical significance of the age-group comparisons. Statistically significant differences among the fundamental frequencies of the four vowels were not detected for either age group.
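The three fundamental-frequency measures named above can be computed directly from a cycle-to-cycle F0 track; a minimal sketch follows. The F0 values are hypothetical illustration data, not measurements from the study.

```python
def f0_measures(track):
    """Mean F0, intra-utterance SD, and R^2 of a linear trend over cycle index."""
    n = len(track)
    mean = sum(track) / n
    var = sum((f - mean) ** 2 for f in track) / n
    sd = var ** 0.5
    # Least-squares line over cycle indices 0..n-1 (the "linear trend").
    xbar = (n - 1) / 2
    sxx = sum((i - xbar) ** 2 for i in range(n))
    slope = sum((i - xbar) * (f - mean) for i, f in enumerate(track)) / sxx
    ss_res = sum((f - (mean + slope * (i - xbar))) ** 2
                 for i, f in enumerate(track))
    r2 = 1.0 - ss_res / (var * n) if var else 1.0
    return mean, sd, r2

# A perfectly linear (declining) five-cycle F0 track, in Hz:
mean_f0, sd_f0, trend_r2 = f0_measures([262.0, 260.5, 259.0, 257.5, 256.0])
```

For this track the linear trend explains all the variance (R² = 1); a jittery track from a younger child would show a larger SD and a lower R².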
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-03
... Response to Comments on Previous Analysis C. Summary of the Comparative Analysis 1. Quantitative Analysis 2... preliminary quantitative analysis are specific building designs, in most cases with specific spaces defined... preliminary determination. C. Summary of the Comparative Analysis DOE carried out both a broad quantitative...
Quantized thermal transport in single-atom junctions
NASA Astrophysics Data System (ADS)
Cui, Longji; Jeong, Wonho; Hur, Sunghoon; Matt, Manuel; Klöckner, Jan C.; Pauly, Fabian; Nielaba, Peter; Cuevas, Juan Carlos; Meyhofer, Edgar; Reddy, Pramod
2017-03-01
Thermal transport in individual atomic junctions and chains is of great fundamental interest because of the distinctive quantum effects expected to arise in them. By using novel, custom-fabricated, picowatt-resolution calorimetric scanning probes, we measured the thermal conductance of gold and platinum metallic wires down to single-atom junctions. Our work reveals that the thermal conductance of gold single-atom junctions is quantized at room temperature and shows that the Wiedemann-Franz law relating thermal and electrical conductance is satisfied even in single-atom contacts. Furthermore, we quantitatively explain our experimental results within the Landauer framework for quantum thermal transport. The experimental techniques reported here will enable thermal transport studies in atomic and molecular chains, which will be key to investigating numerous fundamental issues that thus far have remained experimentally inaccessible.
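The two quantities behind the result above, the thermal conductance quantum and the Wiedemann-Franz Lorenz number, follow from textbook formulas and can be checked numerically. This is a back-of-envelope sketch, not the Landauer calculation of the paper.

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J s
E = 1.602176634e-19  # elementary charge, C

def thermal_conductance_quantum(temperature):
    """Heat conduction per ballistic channel, g0 = (pi^2/3) kB^2 T / h, in W/K."""
    return (math.pi ** 2 / 3) * KB ** 2 * temperature / H

# Sommerfeld value of the Lorenz number, L0 = (pi^2/3)(kB/e)^2:
lorenz_number = (math.pi ** 2 / 3) * (KB / E) ** 2  # ~2.44e-8 W Ohm / K^2

g0_room = thermal_conductance_quantum(300.0)        # ~2.8e-10 W/K per channel
```

A gold single-atom contact carrying one (spin-degenerate) conduction channel is thus expected to conduct heat in steps of roughly 0.3 nW/K at room temperature, which is what the picowatt-resolution probes resolve.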
Ţarălungă, Dragoş-Daniel; Ungureanu, Georgeta-Mihaela; Gussi, Ilinca; Strungaru, Rodica; Wolf, Werner
2014-01-01
Interference of power line (PLI) (fundamental frequency and its harmonics) is usually present in biopotential measurements. Despite all countermeasures, the PLI still corrupts physiological signals, for example, electromyograms (EMG), electroencephalograms (EEG), and electrocardiograms (ECG). When analyzing the fetal ECG (fECG) recorded on the maternal abdomen, the PLI represents a particular strong noise component, being sometimes 10 times greater than the fECG signal, and thus impairing the extraction of any useful information regarding the fetal health state. Many signal processing methods for cancelling the PLI from biopotentials are available in the literature. In this review study, six different principles are analyzed and discussed, and their performance is evaluated on simulated data (three different scenarios), based on five quantitative performance indices.
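One of the simplest countermeasures against the PLI fundamental is a fixed second-order IIR notch at the line frequency; a self-contained sketch is below. This is a generic textbook filter for illustration, not one of the six principles evaluated in the review.

```python
import math

def notch_coefficients(f0, fs, r=0.95):
    """Biquad notch at f0 Hz for sample rate fs; pole radius r sets bandwidth."""
    w0 = 2 * math.pi * f0 / fs
    b = [1.0, -2 * math.cos(w0), 1.0]          # zeros on the unit circle at f0
    a = [1.0, -2 * r * math.cos(w0), r * r]    # poles just inside, same angle
    return b, a

def filter_signal(b, a, x):
    """Direct-form I difference equation y[n] = sum(b x) - sum(a y)."""
    y = []
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(3) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, 3) if n - k >= 0)
        y.append(acc)
    return y

# Pure 50 Hz "hum" sampled at 1 kHz is suppressed to near zero:
fs, f0 = 1000.0, 50.0
b, a = notch_coefficients(f0, fs)
hum = [math.sin(2 * math.pi * f0 * n / fs) for n in range(2000)]
cleaned = filter_signal(b, a, hum)
residual = max(abs(v) for v in cleaned[1000:])  # after the filter transient
```

Harmonics of the fundamental need their own notches (or a comb filter), and, as the review discusses, such fixed filters also distort fECG components near 50/60 Hz, which motivates the more elaborate cancellation principles compared there.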
ERIC Educational Resources Information Center
Chubbuck, Kay; Curley, W. Edward; King, Teresa C.
2016-01-01
This study gathered quantitative and qualitative evidence concerning gender differences in performance by using critical reading material on the "SAT"® test with sports and science content. The fundamental research questions guiding the study were: If sports and science are to be included in a skills test, what kinds of material are…
Curbing "Math Anxiety" with Galileo While Teaching Physicists, too
NASA Astrophysics Data System (ADS)
Schwartz, Brian P.
2006-12-01
Carthage College's introductory physics course caters to both freshmen in our program and students in general education. While "Understandings of Physics" is a conceptual overview of our discipline, physical science is necessarily quantitative. Galileo's "Dialogue Concerning the Two New Sciences" provides us with a novel way to teach the fundamentals of motion both to students who "fear" mathematics, as well as those who are adept at solving algebraic equations.
NASA Technical Reports Server (NTRS)
Kuehl, H.
1947-01-01
The basic principles of the control of TL engines are developed on the basis of a quantitative investigation of the behavior of these engines under various operating conditions, with particular consideration of the simplifications permissible in each case. Various possible means of control of jet engines are suggested and are illustrated by schematic designs.
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction, and evaluation of corrective actions. Quality assessment is a measurable entity, for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome, and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods, and are closely related to processes and the main targets of quality improvement. The three types of methods to analyse the problems (indicators) are: peer review, quantitative methods, and risk-management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histograms, Pareto diagrams, control charts). The risk-management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, which, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, checks all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process).
Definition and implementation of corrective measures, based on the findings of the two previous stages, are the third step of the evaluation cycle. The Hawthorne effect is an outcome improvement, before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
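The Pareto diagram mentioned among the quantitative methods above boils down to ranking indicator counts and accumulating percentages; a minimal sketch follows. The incident categories and counts are hypothetical illustration data.

```python
def pareto(counts):
    """Return (cause, count, cumulative %) rows in descending-frequency order."""
    total = sum(counts.values())
    rows, running = [], 0
    for cause, count in sorted(counts.items(), key=lambda kv: -kv[1]):
        running += count
        rows.append((cause, count, 100.0 * running / total))
    return rows

# Hypothetical incident tallies grouped by the four cause-effect components:
incidents = {"equipment": 12, "process": 30, "persons": 50, "regulations": 8}
table = pareto(incidents)
```

Reading the cumulative column off such a table identifies the "vital few" causes that corrective actions should target first.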
Rodrigues, Luis Monteiro; Pinto, Pedro Contreiras; Pereira, Luis Marcelo
2003-02-01
In vivo water assessment would greatly benefit from a dynamical approach since the evaluation of common related variables such as trans-epidermal water loss or "capacitance" measurements is always limited to instantaneous data. Mathematical modelling is still an attractive alternative already attempted with bi-exponential empirical models. A classical two-compartment interpretation of such models raises a number of questions about the underlying fundamentals, which can hardly be experimentally confirmed. However, in a system analysis sense, skin water dynamics may be approached as an ensemble of many factors, impossible to discretize, but conceptually grouped in terms of feasible properties of the system. The present paper explores the applicability of this strategy to the in vivo water dynamics assessment. From the plastic occlusion stress test (POST) skin water balance is assessed by modelling trans-epidermal water loss (TEWL) and "capacitance" data obtained at skin's surface. With system analysis (disposition-decomposition analysis) the distribution function, H(t), modelled as a sum of exponential terms, covers only the distribution characteristics of water molecules traversing the skin. This may correspond macroscopically to the experimental data accessed by "corneometry". Separately, the hyperbolic elimination function Q(TEWL) helps to characterise the dynamic aspects of water influx through the skin. In the observable range there seems to be a linear relationship between the net amount of water lost at the surface by evaporation, and the capability of the system to replenish that loss. This may be a specific characteristic of the system related to what may be described as the skin's "intrinsic hydration capacity" (IHC) a new functional parameter only identified by this strategy. 
These new quantitative tools are expected to find different applicabilities (from the in vivo skin characterisation to efficacy testing) contributing to disclose the dynamical nature of the skin water balance process. Copyright Blackwell Munksgaard 2003
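The "sum of exponential terms" used for the distribution function above can be written generically as a bi-exponential; the two-term form and the symbols below are assumptions for illustration, not the authors' fitted model.

```latex
% Generic bi-exponential form of the distribution function H(t);
% A_i and \lambda_i are empirical constants estimated from the POST data:
H(t) = A_1\,e^{-\lambda_1 t} + A_2\,e^{-\lambda_2 t}, \qquad A_i > 0,\ \lambda_1 > \lambda_2 > 0
```

In the disposition-decomposition reading, such a form describes only the distribution of water traversing the skin, while the influx dynamics are carried separately by the hyperbolic elimination function.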
Loss of Chromosome 18 in Neuroendocrine Tumors of the Small Intestine: The Enigma Remains.
Nieser, Maike; Henopp, Tobias; Brix, Joachim; Stoß, Laura; Sitek, Barbara; Naboulsi, Wael; Anlauf, Martin; Schlitter, Anna M; Klöppel, Günter; Gress, Thomas; Moll, Roland; Bartsch, Detlef K; Heverhagen, Anna E; Knoefel, Wolfram T; Kaemmerer, Daniel; Haybaeck, Johannes; Fend, Falko; Sperveslage, Jan; Sipos, Bence
2017-01-01
Neuroendocrine tumors of the small intestine (SI-NETs) exhibit an increasing incidence and high mortality rate. Until now, no fundamental molecular event has been linked to the tumorigenesis and progression of these tumors. Only the loss of chromosome 18 (Chr18) has been shown in up to two thirds of SI-NETs, whereby the significance of this alteration is still not understood. We therefore performed the first comprehensive study to identify Chr18-related events at the genetic, epigenetic and gene/protein expression levels. We did expression analysis of all seven putative Chr18-related tumor suppressors by quantitative real-time PCR (qRT-PCR), Western blot and immunohistochemistry. Next-generation exome sequencing and SNP array analysis were performed with five SI-NETs with (partial) loss of Chr18. Finally, we analyzed all microRNAs (miRNAs) located on Chr18 by qRT-PCR, comparing Chr18+/- and Chr18+/+ SI-NETs. Only DCC (deleted in colorectal cancer) revealed loss of/greatly reduced expression in 6/21 cases (29%). No relevant loss of SMAD2, SMAD4, elongin A3 and CABLES was detected. PMAIP1 and maspin were absent at the protein level. Next-generation sequencing did not reveal relevant recurrent somatic mutations on Chr18 either in an exploratory cohort of five SI-NETs, or in a validation cohort (n = 30). SNP array analysis showed no additional losses. The quantitative analysis of all 27 Chr18-related miRNAs revealed no difference in expression between Chr18+/- and Chr18+/+ SI-NETs. DCC seems to be the only Chr18-related tumor suppressor affected by the monoallelic loss of Chr18 resulting in a loss of DCC protein expression in one third of SI-NETs. No additional genetic or epigenetic alterations were present on Chr18. © 2016 S. Karger AG, Basel.
Aligned fibers direct collective cell migration to engineer closing and nonclosing wound gaps
Sharma, Puja; Ng, Colin; Jana, Aniket; Padhi, Abinash; Szymanski, Paige; Lee, Jerry S. H.; Behkam, Bahareh; Nain, Amrinder S.
2017-01-01
Cell emergence onto damaged or organized fibrous extracellular matrix (ECM) is a crucial precursor to collective cell migration in wound closure and cancer metastasis, respectively. However, there is a fundamental gap in our quantitative understanding of the role of local ECM size and arrangement in cell emergence–based migration and local gap closure. Here, using ECM-mimicking nanofibers bridging cell monolayers, we describe a method to recapitulate and quantitatively describe these in vivo behaviors over multispatial (single cell to cell sheets) and temporal (minutes to weeks) scales. On fiber arrays with large interfiber spacing, cells emerge (invade) either singularly by breaking cell–cell junctions analogous to release of a stretched rubber band (recoil), or in groups of few cells (chains), whereas on closely spaced fibers, multiple chains emerge collectively. Advancing cells on fibers form cell streams, which support suspended cell sheets (SCS) of various sizes and curvatures. SCS converge to form local gaps that close based on both the gap size and shape. We document that cell stream spacing of 375 µm and larger hinders SCS advancement, thus providing abilities to engineer closing and nonclosing gaps. Altogether we highlight the importance of studying cell-fiber interactions and matrix structural remodeling in fundamental and translational cell biology. PMID:28747440
NASA Astrophysics Data System (ADS)
El Koussaifi, R.; Tikan, A.; Toffoli, A.; Randoux, S.; Suret, P.; Onorato, M.
2018-01-01
Rogue waves are extreme and rare fluctuations of the wave field that have been discussed in many physical systems. Their presence substantially influences the statistical properties of a partially coherent wave field, i.e., a wave field characterized by a finite band spectrum with random Fourier phases. Their understanding is fundamental for the design of ships and offshore platforms. In many meteorological conditions waves in the ocean are characterized by the so-called Joint North Sea Wave Project (JONSWAP) spectrum. Here we compare two unique experimental results: the first one has been performed in a 270 m wave tank and the other in optical fibers. In both cases, waves characterized by a JONSWAP spectrum and random Fourier phases have been launched at the input of the experimental device. The quantitative comparison, based on an appropriate scaling of the two experiments, shows a very good agreement between the statistics in hydrodynamics and optics. Spontaneous emergence of heavy tails in the probability density function of the wave amplitude is observed in both systems. The results demonstrate the universal features of rogue waves and provide a fundamental and explicit bridge between two important fields of research. Numerical simulations are also compared with experimental results.
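The input condition described above, a wave field with a JONSWAP spectrum and random Fourier phases, can be synthesized numerically. The sketch below (Python/NumPy) uses assumed peak frequency, scale and peak-enhancement parameters, not the parameters of the tank or fiber experiments:

```python
import numpy as np

def jonswap(f, fp=0.1, alpha=0.01, gamma=3.3, g=9.81):
    """JONSWAP spectral density S(f) [m^2/Hz]; fp, alpha, gamma are assumed values."""
    sigma = np.where(f <= fp, 0.07, 0.09)
    r = np.exp(-((f - fp) ** 2) / (2.0 * sigma**2 * fp**2))
    return (alpha * g**2 * (2.0 * np.pi) ** -4 * f**-5
            * np.exp(-1.25 * (fp / f) ** 4) * gamma**r)

rng = np.random.default_rng(0)
n, df = 4096, 1.0 / 2048.0                      # samples and frequency step [Hz]
f = np.arange(1, n // 2) * df                   # positive frequencies (no DC)
amp = np.sqrt(2.0 * jonswap(f) * df)            # component amplitudes from S(f)
phases = rng.uniform(0.0, 2.0 * np.pi, f.size)  # random Fourier phases

half = np.zeros(n // 2 + 1, dtype=complex)      # one-sided spectrum for irfft
half[1:n // 2] = amp * np.exp(1j * phases) * (n / 2)
eta = np.fft.irfft(half, n)                     # surface-elevation time series
```

A linear superposition like this is near-Gaussian by construction; the heavy tails reported above emerge only after nonlinear propagation in the wave tank or fiber.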
Trends in plant research using molecular markers.
Garrido-Cardenas, Jose Antonio; Mesa-Valle, Concepción; Manzano-Agugliaro, Francisco
2018-03-01
A deep bibliometric analysis was carried out, yielding valuable parameters that facilitate an understanding of plant research using molecular markers. Progress in the field of agronomy is fundamental for its adaptation to the new demands of the current world context. Within these improvements, this article focuses on those related to the biotechnology sector, and more specifically on the use of DNA markers that allow researchers to identify the set of genes associated with a particular quantitative trait locus (QTL). The use of molecular markers is widely extended, including restriction fragment length polymorphisms, random-amplified polymorphic DNA, amplified fragment length polymorphisms, microsatellites, and single-nucleotide polymorphisms. In addition to classical methodology, new approaches based on next-generation sequencing are proving to be fundamental. This article presents a historical review of the molecular markers traditionally used in plants, from their origin to how new molecular tools facilitate the work of plant breeders. The evolution of the most studied crops from the point of view of molecular markers is also reviewed, and other parameters whose prior knowledge can facilitate researchers' approach to this field are analyzed. The bibliometric analysis of molecular markers in plants shows that the top five countries in this research are the US, China, India, France, and Germany, and that since 2013 this research has been led by China. On the other hand, basic research using Arabidopsis is stronger in France and Germany, while other countries focus their efforts on their main crops: the US on wheat and maize, and China and India on wheat and rice.
Wick, Kristin; Leeger-Aschmann, Claudia S; Monn, Nico D; Radtke, Thomas; Ott, Laura V; Rebholz, Cornelia E; Cruz, Sergio; Gerber, Natalie; Schmutz, Einat A; Puder, Jardena J; Munsch, Simone; Kakebeeke, Tanja H; Jenni, Oskar G; Granacher, Urs; Kriemler, Susi
2017-10-01
Proficiency in fundamental movement skills (FMS) lays the foundation for being physically active and developing more complex motor skills. Improving these motor skills may provide enhanced opportunities for the development of a variety of perceptual, social, and cognitive skills. The objective of this systematic review and meta-analysis was to assess the effects of FMS interventions on actual FMS, targeting typically developing young children. Searches in seven databases (CINAHL, Embase, MEDLINE, PsycINFO, PubMed, Scopus, Web of Science) up to August 2015 were completed. Trials with children (aged 2-6 years) in childcare or kindergarten settings that applied FMS-enhancing intervention programs of at least 4 weeks and meeting the inclusion criteria were included. Standardized data extraction forms were used. Risk of bias was assessed using a standard scoring scheme (Effective Public Health Practice Project-Quality Assessment Tool for Quantitative Studies [EPHPP]). We calculated effects on overall FMS and on the object control and locomotor subscales (OCS and LMS) by weighted standardized mean differences (SMD_between) using random-effects models. Certainty in training effects was evaluated using GRADE (Grading of Recommendations Assessment, Development, and Evaluation System). Thirty trials (15 randomized controlled trials and 15 controlled trials) involving 6126 preschoolers (aged 3.3-5.5 years) revealed significant differences among groups in favor of the intervention group, with small-to-large effects on overall FMS (SMD_between = 0.46), OCS (SMD_between = 1.36), and LMS (SMD_between = 0.94). Our certainty in the treatment estimates based on GRADE is very low. Although the programs show relevant effectiveness in improving FMS proficiency in healthy young children, the results need to be interpreted with care, as they are based on low-quality evidence and on immediate post-intervention effects without long-term follow-up.
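The random-effects pooling step used in meta-analyses of this kind can be sketched with the DerSimonian-Laird estimator. The effect sizes and variances below are invented for illustration, not the review's data:

```python
import numpy as np

def random_effects_pool(smd, var):
    """DerSimonian-Laird random-effects pooling of standardized mean differences."""
    smd, var = np.asarray(smd, float), np.asarray(var, float)
    w = 1.0 / var                                  # fixed-effect (inverse-variance) weights
    theta_fe = np.sum(w * smd) / np.sum(w)
    q = np.sum(w * (smd - theta_fe) ** 2)          # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (smd.size - 1)) / c)      # between-study variance estimate
    w_re = 1.0 / (var + tau2)                      # random-effects weights
    pooled = np.sum(w_re * smd) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Invented effect sizes and variances for three hypothetical trials:
pooled, se, tau2 = random_effects_pool([0.3, 0.6, 0.5], [0.04, 0.05, 0.03])
```

When Q is below its degrees of freedom, as here, the between-study variance is truncated to zero and the pooled estimate reduces to the fixed-effect value.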
Relationship of physical activity to fundamental movement skills among adolescents.
Okely, A D; Booth, M L; Patterson, J W
2001-11-01
To determine the relationship of participation in organized and nonorganized physical activity with fundamental movement skills among adolescents. Male and female children in Grade 8 (mean age, 13.3 yr) and Grade 10 (mean age, 15.3 yr) were assessed on six fundamental movement skills (run, vertical jump, catch, overhand throw, forehand strike, and kick). Physical activity was assessed using a self-report recall measure where students reported the type, duration, and frequency of participation in organized physical activity and nonorganized physical activity during a usual week. Multiple regression analysis indicated that fundamental movement skills significantly predicted time in organized physical activity, although the percentage of variance it could explain was small. This prediction was stronger for girls than for boys. Multiple regression analysis showed no relationship between time in nonorganized physical activity and fundamental movement skills. Fundamental movement skills are significantly associated with adolescents' participation in organized physical activity, but predict only a small portion of it.
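A multiple regression of this kind, predicting time in organized physical activity from several movement-skill scores, can be sketched with ordinary least squares. All numbers below are synthetic placeholders, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
skills = rng.normal(50.0, 10.0, size=(n, 6))     # six movement-skill scores (invented)
# Hypothetical: activity time depends weakly on skills plus large noise,
# mimicking the "small portion of variance explained" finding.
true_coefs = np.array([0.5, 0.3, 0.2, 0.4, 0.1, 0.2])
pa_minutes = 60.0 + skills @ true_coefs + rng.normal(0.0, 40.0, n)

X = np.column_stack([np.ones(n), skills])        # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, pa_minutes, rcond=None)
pred = X @ beta
r2 = 1.0 - np.sum((pa_minutes - pred) ** 2) / np.sum((pa_minutes - pa_minutes.mean()) ** 2)
```

The small R² on this synthetic data mirrors the qualitative finding that skills predict only a small portion of organized activity time.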
Quantitative Imaging in Cancer Clinical Trials
Yankeelov, Thomas E.; Mankoff, David A.; Schwartz, Lawrence H.; Lieberman, Frank S.; Buatti, John M.; Mountz, James M.; Erickson, Bradley J.; Fennessy, Fiona M.M.; Huang, Wei; Kalpathy-Cramer, Jayashree; Wahl, Richard L.; Linden, Hannah M.; Kinahan, Paul; Zhao, Binsheng; Hylton, Nola M.; Gillies, Robert J.; Clarke, Laurence; Nordstrom, Robert; Rubin, Daniel L.
2015-01-01
As anti-cancer therapies designed to target specific molecular pathways have been developed, it has become critical to develop methods to assess the response induced by such agents. While traditional, anatomic CT and MRI exams are useful in many settings, there is increasing evidence that these methods cannot answer the fundamental biological and physiological questions essential for assessment and, eventually, prediction of treatment response in the clinical trial setting, especially in the critical period soon after treatment is initiated. To optimally apply advances in quantitative imaging methods to trials of targeted cancer therapy, new infrastructure improvements are needed that incorporate these emerging techniques into the settings where they are most likely to have impact. In this review, we first elucidate the needs for therapeutic response assessment in the era of molecularly targeted therapy and describe how quantitative imaging can most effectively provide scientifically and clinically relevant data. We then describe the tools and methods required to apply quantitative imaging and provide concrete examples of work making these advances practically available for routine application in clinical trials. We conclude by proposing strategies to surmount barriers to wider incorporation of these quantitative imaging methods into clinical trials and, eventually, clinical practice. Our goal is to encourage and guide the oncology community to deploy standardized quantitative imaging techniques in clinical trials to further personalize care for cancer patients, and to provide a more efficient path for the development of improved targeted therapies. PMID:26773162
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chan, Chi Yuet X’avia; Gritsenko, Marina A.; Smith, Richard D.
Protein phosphorylation is a fundamental regulatory mechanism in many cellular processes, and aberrant phosphorylation has been revealed in various human diseases. Kinases and their cognate inhibitors have been hotspots for drug development. Therefore, emerging tools that enable system-wide quantitative profiling of the phosphoproteome offer a powerful impetus for unveiling novel signaling pathways, drug targets and/or biomarkers for the disease of interest. In this review, we highlight recent advances in phosphoproteomics, the current state of the art of the technologies, and the challenges and future perspectives of this research area. Finally, we underscore some exemplary applications of phosphoproteomics in diabetes research.
Quantitative Interferometry in the Severe Acoustic Environment of Resonant Supersonic Jets
NASA Technical Reports Server (NTRS)
Mercer, Carolyn R.; Raman, Ganesh
1999-01-01
Understanding fundamental fluid dynamic and acoustic processes in high-speed jets requires quantitative velocity, density and temperature measurements. In this paper we demonstrate a new, robust Liquid Crystal Point Diffraction Interferometer (LCPDI) that includes phase stepping and can provide accurate data even in the presence of intense acoustic fields. This novel common-path interferometer was developed to overcome difficulties with the Mach-Zehnder interferometer in vibratory environments and is applied here to the case of a supersonic shock-containing jet. The environmentally insensitive LCPDI, which is easy to align and capable of measuring optical wavefronts with high accuracy, is briefly described; then integrated line-of-sight density data from the LCPDI for two underexpanded jets are presented.
Wang, Shuo; Poon, Gregory M K; Wilson, W David
2015-01-01
Biosensor-surface plasmon resonance (SPR) technology has emerged as a powerful label-free approach for the study of nucleic acid interactions in real time. The method provides simultaneous equilibrium and kinetic characterization for biomolecular interactions with low sample requirements and without the need for external probes. A detailed and practical guide for protein-DNA interaction analyses using biosensor-SPR methods is presented. Details of SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips and samples, experimental design, quantitative and qualitative data analyses and presentation. A specific example of the interaction of a transcription factor with DNA is provided with results evaluated by both kinetic and steady-state SPR methods.
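The kinetic evaluation described above typically rests on a 1:1 Langmuir interaction model, which can be sketched as follows. The rate constants, analyte concentration and Rmax below are hypothetical, chosen only to produce a plausible sensorgram:

```python
import numpy as np

def spr_1to1(t_assoc, t_dissoc, conc, ka, kd, rmax):
    """Sensorgram for a 1:1 Langmuir interaction: association then dissociation."""
    kobs = ka * conc + kd                       # observed association rate
    req = rmax * ka * conc / kobs               # steady-state response at this conc
    r_assoc = req * (1.0 - np.exp(-kobs * t_assoc))
    r_dissoc = r_assoc[-1] * np.exp(-kd * t_dissoc)
    return r_assoc, r_dissoc

ka, kd = 1.0e5, 1.0e-2                          # assumed rate constants (M^-1 s^-1, s^-1)
KD = kd / ka                                    # equilibrium dissociation constant (M)
t = np.linspace(0.0, 300.0, 301)                # seconds
ra, rd = spr_1to1(t, t, conc=1.0e-7, ka=ka, kd=kd, rmax=100.0)
```

Fitting measured sensorgrams at several concentrations to these curves yields ka and kd (kinetic analysis), while the plateau responses versus concentration give KD directly (steady-state analysis), the two routes the abstract compares.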
Kappen, Claudia
2016-01-01
The process of patterning along the anterior-posterior axis in vertebrates is highly conserved. The function of Hox genes in the axis patterning process is particularly well documented for bone development in the vertebral column and the limbs. We here show that Hoxb6, in skeletal elements at the cervico-thoracic junction, controls multiple independent aspects of skeletal pattern, implicating discrete developmental pathways as substrates for this transcription factor. In addition, we demonstrate that Hoxb6 function is subject to modulation by genetic factors. These results establish Hox-controlled skeletal pattern as a quantitative trait modulated by gene-gene interactions, and provide evidence that distinct modifiers influence the function of conserved developmental genes in fundamental patterning processes. PMID:26800342
ERIC Educational Resources Information Center
Anderson, James L.; And Others
1980-01-01
Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)
NASA Astrophysics Data System (ADS)
Baietto, Oliviero; Amodeo, Francesco; Giorgis, Ilaria; Vitaliti, Martina
2017-04-01
The quantification of NOA (Naturally Occurring Asbestos) in a rock or soil matrix is complex and subject to numerous errors. The purpose of this study is to compare two fundamental methodologies used for the analysis: the first uses Phase Contrast Optical Microscopy (PCOM), while the second uses Scanning Electron Microscopy (SEM). Although the two methods provide the same result, the asbestos mass to total mass ratio, they have completely different characteristics, and both present pros and cons. Current legislation in Italy allows the use of SEM, DRX, FTIR and PCOM (DM 6/9/94) for the quantification of asbestos in bulk materials and soils, and the threshold beyond which the material is considered hazardous waste is an asbestos fiber concentration of 1000 mg/kg (DM 161/2012). The most widely used technology is SEM, which has the best analytical sensitivity among these (120 mg/kg, DM 6/9/94). The fundamental differences among the analyses mainly concern: amount of analyzed sample portion, representativeness of the sample, resolution, analytical precision, uncertainty of the methodology, and operator errors. Due to the quantification limit of DRX and FTIR (1%, DM 6/9/94), our Asbestos Laboratory (DIATI POLITO) has applied the PCOM methodology for more than twenty years and, in recent years, the SEM methodology for quantification of asbestos content. The aim of our research is to compare the results obtained from PCOM analysis with the results provided by SEM analysis on the basis of more than 100 natural samples, both from cores (tunnel boring or exploratory drilling) and from tunnelling excavation. The results obtained show, in most cases, a good correlation between the two techniques. Of particular relevance is the fact that both techniques are reliable for very low quantities of asbestos, even lower than the analytical sensitivity.
This work highlights the comparison between the two techniques, emphasizing the strengths and weaknesses of the two procedures, and suggests that an integrated approach, together with the skills and experience of the operator, may be the best way forward to obtain a constructive improvement of analysis techniques.
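A comparison like the one described, paired PCOM and SEM determinations on the same samples, reduces to correlating the two mass fractions. The sketch below uses synthetic log-normal concentrations with invented measurement scatter, not the laboratory's data:

```python
import numpy as np

rng = np.random.default_rng(2)
true_mass = rng.lognormal(mean=6.0, sigma=1.0, size=100)   # asbestos content, mg/kg
pcom = true_mass * rng.normal(1.0, 0.15, 100)              # hypothetical PCOM scatter
sem = true_mass * rng.normal(1.0, 0.10, 100)               # hypothetical SEM scatter

# Concentrations span orders of magnitude, so correlate on a log scale.
r = np.corrcoef(np.log10(pcom), np.log10(sem))[0, 1]

# Agreement on the regulatory classification (1000 mg/kg, DM 161/2012):
above_threshold = (sem > 1000.0) & (pcom > 1000.0)
```

Beyond the correlation coefficient, checking whether both methods agree on the hazardous-waste classification near the 1000 mg/kg threshold is the practically relevant comparison.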
NASA Technical Reports Server (NTRS)
Carpenter, Paul; Curreri, Peter A. (Technical Monitor)
2002-01-01
This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.
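One of the listed topics, peak deconvolution and fitting, can be sketched for the simple case of two overlapping Gaussian peaks whose centers and widths are known: the peak areas then enter the model linearly, so ordinary least squares suffices. The energies approximate the S and Cl K-alpha lines, and the widths, areas and noise level are assumed for illustration:

```python
import numpy as np

def gaussian(e, area, center, fwhm):
    """Gaussian peak scaled to a given integrated area."""
    sigma = fwhm / 2.3548
    return area / (sigma * np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * ((e - center) / sigma) ** 2)

e = np.linspace(2.0, 3.0, 500)                         # energy axis (keV)
basis = np.column_stack([gaussian(e, 1.0, 2.307, 0.13),  # ~S K-alpha (assumed width)
                         gaussian(e, 1.0, 2.622, 0.13)])  # ~Cl K-alpha (assumed width)

rng = np.random.default_rng(3)
spectrum = basis @ np.array([500.0, 300.0]) + rng.normal(0.0, 2.0, e.size)

# With centers and widths fixed, the unknown areas are a linear least-squares fit:
areas, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
```

Real EDS fitting additionally handles background, detector response and nonlinear center/width refinement, but the linear-areas step is the core of the deconvolution.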
Quantitative proteomics in Giardia duodenalis-Achievements and challenges.
Emery, Samantha J; Lacey, Ernest; Haynes, Paul A
2016-08-01
Giardia duodenalis (syn. G. lamblia and G. intestinalis) is a protozoan parasite of vertebrates and a major contributor to the global burden of diarrheal diseases and gastroenteritis. The publication of multiple genome sequences in the G. duodenalis species complex has provided important insights into parasite biology, and made post-genomic technologies, including proteomics, significantly more accessible. The aims of proteomics are to identify and quantify proteins present in a cell, and assign functions to them within the context of dynamic biological systems. In Giardia, proteomics in the post-genomic era has transitioned from reliance on gel-based systems to utilisation of a diverse array of techniques based on bottom-up LC-MS/MS technologies. Together, these have generated crucial foundations for subcellular proteomes, elucidated intra- and inter-assemblage isolate variation, and identified pathways and markers in differentiation, host-parasite interactions and drug resistance. However, in Giardia, proteomics remains an emerging field, with considerable shortcomings evident from the published research. These include a bias towards assemblage A, a lack of emphasis on quantitative analytical techniques, and limited information on post-translational protein modifications. Additionally, there are multiple areas of research for which proteomic data is not available to add value to published transcriptomic data. The challenge of amalgamating data in the systems biology paradigm necessitates the further generation of large, high-quality quantitative datasets to accurately model parasite biology. This review surveys the current proteomic research available for Giardia and evaluates their technical and quantitative approaches, while contextualising their biological insights into parasite pathology, isolate variation and eukaryotic evolution. 
Finally, we propose areas of priority for the generation of future proteomic data to explore fundamental questions in Giardia, including the analysis of post-translational modifications, and the design of MS-based assays for validation of differentially expressed proteins in large datasets. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.
2016-11-01
Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of the lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
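The final step of such an assessment, turning a risk curve into an annualized figure, can be sketched as the area under the loss versus annual-exceedance-probability curve. The return periods and scenario losses below are invented placeholders, not values from the Fella River study:

```python
import numpy as np

# Hypothetical scenario losses (EUR) for assumed return periods (years).
return_periods = np.array([10.0, 50.0, 100.0, 300.0])
losses = np.array([0.5e6, 4.0e6, 9.0e6, 20.0e6])
p_exceed = 1.0 / return_periods                 # annual exceedance probability

# Sort by increasing exceedance probability and integrate the risk curve
# with the trapezoidal rule to obtain the average annual loss (AAL).
order = np.argsort(p_exceed)
p_sorted, loss_sorted = p_exceed[order], losses[order]
aal = np.sum(np.diff(p_sorted) * (loss_sorted[:-1] + loss_sorted[1:]) / 2.0)
```

Repeating this with the minimum, average and maximum inputs gives the three risk curves, and hence a range for the annual risk, as described in the abstract.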
Pre-Participation Screening: The Use of Fundamental Movements as an Assessment of Function – Part 1
Burton, Lee; Hoogenboom, Barb
2006-01-01
To prepare an athlete for the wide variety of activities needed to participate in their sport, the analysis of fundamental movements should be incorporated into pre-participation screening in order to determine who possesses, or lacks, the ability to perform certain essential movements. In a series of two articles, the background and rationale for the analysis of fundamental movement will be provided. In addition, one such evaluation tool that attempts to assess the fundamental movement patterns performed by an individual, the Functional Movement Screen (FMS™), will be described. Three of the seven fundamental movement patterns that comprise the FMS™ are described in detail in Part I: deep squat, hurdle step, and in-line lunge. Part II of this series, which will be published in the August issue of NAJSPT, will provide a brief review of the analysis of fundamental movements, as well a detailed description of the four additional patterns that complement those presented in Part I (to complete the total of seven fundamental movement patterns which comprise the FMS™): shoulder mobility, active straight leg raise, trunk stability push-up, and rotary stability. The intent of this two part series is to introduce the concept of the evaluation of fundamental movements, whether it is the FMS™ system or a different system devised by another clinician. Such a functional assessment should be incorporated into pre-participation screening in order to determine whether the athlete has the essential movements needed to participate in sports activities with a decreased risk of injury. PMID:21522216
NASA Astrophysics Data System (ADS)
Araneda, A.; Larocque-Tobler, I.; Torrejon, F.; Grosjean, M.; Jana-Pinninghoff, P.; Ortega, C.; Urrutia, R.
2012-12-01
Quantitative climate reconstructions of the last two millennia are fundamental in order to place currently observed climate trends in context. At the global scale, most climate reconstructions have been developed for the Northern Hemisphere, while for the Southern Hemisphere quantitative reconstructions are very rare and geographically limited. The recognition of this disparity has generated, among other research initiatives, LOTRED-SA (Long-Term climate Reconstruction and Dynamics of (southern) South America), a collaborative, high-resolution multi-proxy approach within the framework of the IGBP-PAGES program. In this context, our work presents the results of a 50-lake training set in Central-Southern Chile, developed with the aim of generating a basis for quantitative chironomid-inferred temperature reconstructions for this part of the continent. Chironomids (Insecta: Diptera) are aquatic insects that spend a great proportion of their life cycle as larvae in aquatic ecosystems. Several studies, developed mainly in the Northern Hemisphere, have proven their usefulness in reconstructing past climate due to the larvae's relationship to temperature. The training set developed here includes lakes located between 34°S and 48°S, covering a broad temperature (and latitudinal) gradient. The surface (0-1 cm) sediment of each lake was sampled, and chironomids, organic matter and nutrients were analyzed. Water analyses included the measurement of 10 variables (AirT, WBT, WST, N-tot, P-tot, Fe, Na and pH, among others). Ordination analyses were performed to identify the variables explaining the highest variance in the chironomid assemblages. A preliminary DCA analysis, with gradient lengths smaller than 3 SD, indicated that a linear model was more appropriate for further analysis.
Hence, an RDA analysis was applied to the environmental and species data, indicating that the most important variables determining chironomid assemblages are water surface temperature (WST) and organic matter (OM). The statistical performance of the first model evaluated for WST was relatively weak (r²_boot = 0.46, RMSEP = 0.79) compared with other models. Nevertheless, the results indicate that temperature is still an important predictor of chironomid distribution and that increasing the number of lakes along the environmental gradient will further improve the predictive performance and contribute to climate reconstruction in the region. Funding for this research is from Fondecyt project No. 11080158 and from the cooperation projects CONICYT-SER-01 and CJRP 1001 between Switzerland and Chile. Partial funding from Fondecyt projects 1120765 and 1120807 is also acknowledged.
Environmental Law: Fundamentals for Schools.
ERIC Educational Resources Information Center
Day, David R.
This booklet outlines the environmental problems most likely to arise in schools. An overview provides a fundamental analysis of environmental issues rather than comprehensive analysis and advice. The text examines the concerns that surround superfund cleanups, focusing on the legal framework, and furnishes some practical pointers, such as what to…
2009-05-12
56 RBC Financial Group, Daily Forex Fundamentals, February 27, 2009. [ http...www.actionforex.com/fundamental-analysis/daily-forex-fundamentals/canada%27s-fourth%11quarter-current-account-moves-into-deficit-after-nine-years-of-surpluses...sharing, infrastructure improvements, improvement of compatible immigration databases, visa policy coordination, common biometric identifiers in
NASA Astrophysics Data System (ADS)
Köhler, Reinhard
2014-12-01
We have long been used to the domination of qualitative methods in modern linguistics. Indeed, qualitative methods have advantages such as ease of use and wide applicability to many types of linguistic phenomena. However, this shall not overshadow the fact that a great part of human language is amenable to quantification. Moreover, qualitative methods may lead to over-simplification by employing the rigid yes/no scale. When variability and vagueness of human language must be taken into account, qualitative methods will prove inadequate and give way to quantitative methods [1, p. 11]. In addition to such advantages as exactness and precision, quantitative concepts and methods make it possible to find laws of human language which are just like those in natural sciences. These laws are fundamental elements of linguistic theories in the spirit of the philosophy of science [2,3]. Theorization effort of this type is what quantitative linguistics [1,4,5] is devoted to. The review of Cong and Liu [6] has provided an informative and insightful survey of linguistic complex networks as a young field of quantitative linguistics, including the basic concepts and measures, the major lines of research with linguistic motivation, and suggestions for future research.
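One of the best-known quantitative laws of language of the kind the passage alludes to is Zipf's rank-frequency law, which can be checked in a few lines. The toy corpus below is purely illustrative and far too small for a meaningful exponent estimate:

```python
import numpy as np
from collections import Counter

text = ("the quick brown fox jumps over the lazy dog " * 3
        + "the fox and the dog").split()
freqs = np.array(sorted(Counter(text).values(), reverse=True), dtype=float)
ranks = np.arange(1, freqs.size + 1, dtype=float)

# Zipf's law: frequency ~ C * rank**(-s); fit s by least squares in log-log space.
slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
zipf_exponent = -slope   # positive when frequency decays with rank
```

On large natural-language corpora the fitted exponent is typically close to 1; deviations from the power law are themselves an object of study in quantitative linguistics.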
Second Harmonic Generation of Unpolarized Light
NASA Astrophysics Data System (ADS)
Ding, Changqin; Ulcickas, James R. W.; Deng, Fengyuan; Simpson, Garth J.
2017-11-01
A Mueller tensor mathematical framework was applied for predicting and interpreting the second harmonic generation (SHG) produced with an unpolarized fundamental beam. In deep-tissue imaging through SHG and multiphoton fluorescence, partial or complete depolarization of the incident light complicates polarization analysis. The proposed framework has the distinct advantage of seamlessly merging the purely polarized theory based on the Jones or Cartesian susceptibility tensors with a more general Mueller tensor framework capable of handling a partially depolarized fundamental and/or the SHG produced. The predictions of the model are in excellent agreement with experimental measurements of z-cut quartz and mouse tail tendon obtained with polarized and depolarized incident light. The polarization-dependent SHG produced with an unpolarized fundamental allowed determination of collagen fiber orientation in agreement with orthogonal methods based on image analysis. This method has the distinct advantage of being immune to birefringence or depolarization of the fundamental beam for structural analysis of tissues.
Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board
2017-03-01
due to Marines being evaluated before the end of their initial service commitment. Our research utilizes quantitative variables to analyze the...not provide detailed information why. B. LIMITATIONS: The photograph analysis in this research is strictly limited to a quantitative analysis in...
Mazzini, Virginia
2017-01-01
The importance of electrolyte solutions cannot be overstated. Beyond the ionic strength of electrolyte solutions, the specific nature of the ions present is vital in controlling a host of properties. Therefore, ion specificity is fundamentally important in physical chemistry, engineering, and biology. The observation that the strength of the effect of ions often follows well-established series suggests that a single predictive and quantitative description of specific-ion effects covering a wide range of systems is possible. Such a theory would revolutionise applications of physical chemistry from polymer precipitation to drug design. Current approaches to understanding specific-ion effects involve consideration of the ions themselves, the solvent, the relevant interfaces, and the interactions between them. Here we investigate the specific-ion effects trends of standard partial molar volumes and electrostrictive volumes of electrolytes in water and eleven non-aqueous solvents. We choose these measures because they relate to bulk properties at infinite dilution and therefore represent the simplest electrolyte systems. This is done to test the hypothesis that the ions alone exhibit a specific-ion effect series that is independent of the solvent and unrelated to surface properties. The specific-ion effects trends of standard partial molar volumes and normalised electrostrictive volumes examined in this work show a fundamental ion-specific series that is reproduced across the solvents, namely the Hofmeister series for anions and the reverse lyotropic series for cations, supporting the hypothesis. This outcome is important in demonstrating that ion specificity is observed at infinite dilution, and it shows that the complexity observed in the manifestation of specific-ion effects in a very wide range of systems is due to perturbations by solvent, surfaces, and concentration of the underlying fundamental series.
This knowledge will guide a general understanding of specific-ion effects and assist in the development of a quantitative predictive theory of ion specificity. PMID:29147533
Development and Experimental Evaluation of an Automated Multi-Media Course on Transistors.
ERIC Educational Resources Information Center
Whitted, J.H., Jr.; And Others
A completely automated multi-media self-study program for teaching a portion of electronic solid-state fundamentals was developed. The subject matter areas included were fundamental theory of transistors, transistor amplifier fundamentals, and simple mathematical analysis of transistors including equivalent circuits, parameters, and characteristic…
Fundamental movement skills and motivational factors influencing engagement in physical activity.
Kalaja, Sami; Jaakkola, Timo; Liukkonen, Jarmo; Watt, Anthony
2010-08-01
To assess whether subgroups based on children's fundamental movement skills, perceived competence, and self-determined motivation toward physical education vary with current self-reported physical activity, a sample of 316 Finnish Grade 7 students completed fundamental movement skills measures and self-report questionnaires assessing perceived competence, self-determined motivation toward physical education, and current physical activity. Cluster analysis indicated a three-cluster structure: "Low motivation/low skills profile," "High skills/low motivation profile," and "High skills/high motivation profile." Analysis of variance indicated that students in the third cluster engaged in significantly more physical activity than students of clusters one and two. These results provide support for previous claims regarding the importance of the relationship of fundamental movement skills with continuing engagement in physical activity. High fundamental movement skills, however, may represent only one element in maintaining adolescents' engagement in physical activity.
Vowel selection and its effects on perturbation and nonlinear dynamic measures.
Maccallum, Julia K; Zhang, Yu; Jiang, Jack J
2011-01-01
Acoustic analysis of voice is typically conducted on recordings of sustained vowel phonation. This study applied perturbation and nonlinear dynamic analyses to the vowels /a/, /i/, and /u/ in order to determine vowel selection effects on analysis. Forty subjects (20 males and 20 females) with normal voices participated in recording. Traditional parameters of fundamental frequency, signal-to-noise ratio, percent jitter, and percent shimmer were calculated for the signals using CSpeech. Nonlinear dynamic parameters of correlation dimension and second-order entropy were also calculated. Perturbation analysis results were largely incongruous in this study and in previous research. Fundamental frequency results corroborated previous work, indicating higher fundamental frequency for /i/ and /u/ and lower fundamental frequency for /a/. Signal-to-noise ratio results showed that /i/ and /u/ have greater harmonic levels than /a/. Results of nonlinear dynamic analysis suggested that more complex activity may be evident in /a/ than in /i/ or /u/. Percent jitter and percent shimmer may not be useful for description of acoustic differences between vowels. Fundamental frequency, signal-to-noise ratio, and nonlinear dynamic parameters may be applied to characterize /a/ as having lower frequency, higher noise, and greater nonlinear components than /i/ and /u/. Copyright © 2010 S. Karger AG, Basel.
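The perturbation parameters named above have standard local definitions (CSpeech's exact implementation may differ): percent jitter is the mean absolute difference between consecutive glottal periods relative to the mean period, and percent shimmer is the same measure applied to peak amplitudes. A minimal sketch with a hypothetical period track:

```python
import numpy as np

def percent_jitter(periods):
    """Local jitter (%): mean absolute cycle-to-cycle period difference,
    normalized by the mean period."""
    p = np.asarray(periods, dtype=float)
    return 100.0 * np.mean(np.abs(np.diff(p))) / np.mean(p)

def percent_shimmer(amplitudes):
    """Local shimmer (%): same measure applied to cycle peak amplitudes."""
    a = np.asarray(amplitudes, dtype=float)
    return 100.0 * np.mean(np.abs(np.diff(a))) / np.mean(a)

# Hypothetical period track (seconds) around a 200 Hz fundamental:
periods = [0.00500, 0.00502, 0.00498, 0.00501, 0.00499]
print(f"F0 = {1.0 / np.mean(periods):.1f} Hz, jitter = {percent_jitter(periods):.2f}%")
```

The fundamental frequency is simply the reciprocal of the mean period, which is how the vowel-dependent F0 differences reported above would be computed from the same track.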
Kitson, Alison L.; Muntlin Athlin, Åsa
2013-01-01
Aim. To develop and test a framework describing the interrelationship of three key dimensions (physical, psychosocial, and relational) in the provision of the fundamentals of care to patients. Background. There are few conceptual frameworks to help healthcare staff, particularly nurses, know how to provide direct care around fundamental needs such as eating, drinking, and going to the toilet. Design. Deductive development of a conceptual framework and qualitative analysis of secondary interview data. Method. Framework development followed by a secondary in-depth analysis of primary narrative interview data from three stroke survivors. Results. Using the physical, psychosocial and relational dimensions to develop a conceptual framework, it was possible to identify a number of “archetypes” or scenarios that could explain stroke survivors' positive experiences of their care. Factors contributing to suboptimal care were also identified. Conclusions. This way of thinking about how the fundamentals of care are experienced by patients may help to elucidate the complex processes involved around providing high quality fundamentals of care. This analysis illustrates the multiple dimensions at play. However, more systematic investigation is required with further refining and testing with wider healthcare user groups. The framework has potential to be used as a predictive, evaluative, and explanatory tool. PMID:23864946
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labuda, Aleksander; Proksch, Roger
An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity, a longstanding goal in the electromechanical community.
Focal Point Theory Models for Dissecting Dynamic Duality Problems of Microbial Infections
Huang, S.-H.; Zhou, W.; Jong, A.
2008-01-01
Extending along the dynamic continuum from conflict to cooperation, microbial infections always involve symbiosis (Sym) and pathogenesis (Pat). There exists a dynamic Sym-Pat duality (DSPD) in microbial infection that is the most fundamental problem in infectomics. DSPD is encoded by the genomes of both the microbes and their hosts. Three focal point (FP) theory-based game models (pure cooperative, dilemma, and pure conflict) are proposed for resolving those problems. Our health is associated with the dynamic interactions of three microbial communities (nonpathogenic microbiota (NP) (Cooperation), conditional pathogens (CP) (Dilemma), and unconditional pathogens (UP) (Conflict)) with the hosts at different health statuses. Sym and Pat can be quantitated by measuring symbiotic index (SI), which is quantitative fitness for the symbiotic partnership, and pathogenic index (PI), which is quantitative damage to the symbiotic partnership, respectively. Symbiotic point (SP), which bears analogy to FP, is a function of SI and PI. SP-converting and specific pathogen-targeting strategies can be used for the rational control of microbial infections. PMID:18350122
Quantitative evaluation of the voice range profile in patients with voice disorder.
Ikeda, Y; Masuda, T; Manako, H; Yamashita, H; Yamamoto, T; Komiyama, S
1999-01-01
In 1953, Calvet first displayed the fundamental frequency (pitch) and sound pressure level (intensity) of a voice on a two-dimensional plane and created a voice range profile. This profile has been used to evaluate clinically various vocal disorders, although such evaluations to date have been subjective without quantitative assessment. In the present study, a quantitative system was developed to evaluate the voice range profile utilizing a personal computer. The area of the voice range profile was defined as the voice volume. This volume was analyzed in 137 males and 175 females who were treated for various dysphonias at Kyushu University between 1984 and 1990. Ten normal subjects served as controls. The voice volume in cases with voice disorders significantly decreased irrespective of the disease and sex. Furthermore, cases having better improvement after treatment showed a tendency for the voice volume to increase. These findings illustrated the voice volume as a useful clinical test for evaluating voice control in cases with vocal disorders.
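The abstract does not give the exact computation behind the "voice volume"; one plausible operationalization, sketched below, measures the area of the voice range profile by counting occupied cells on a (semitone, dB) grid. The grid resolution, reference pitch, and sample values are all assumptions for illustration:

```python
import numpy as np

def voice_volume(f0_hz, spl_db, f0_bin_semitones=1.0, spl_bin_db=1.0):
    """Approximate the area of the voice range profile as the total area of
    occupied (semitone, dB) grid cells. This is one plausible quantification
    of 'voice volume'; the paper's exact definition may differ."""
    semitones = 12.0 * np.log2(np.asarray(f0_hz, dtype=float) / 55.0)  # re: A1 = 55 Hz
    cells = {(int(s // f0_bin_semitones), int(l // spl_bin_db))
             for s, l in zip(semitones, np.asarray(spl_db, dtype=float))}
    return len(cells) * f0_bin_semitones * spl_bin_db

# Hypothetical phonation samples (F0 in Hz, SPL in dB):
f0 = [110, 110, 220, 220, 440]
spl = [60, 70, 60, 70, 80]
print(voice_volume(f0, spl))
```

A shrunken profile (fewer occupied cells) would then register as the reduced voice volume observed in the dysphonia cases, and post-treatment improvement as an increase.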
Novel cardiac magnetic resonance biomarkers: native T1 and extracellular volume myocardial mapping.
Cannaò, Paola Maria; Altabella, Luisa; Petrini, Marcello; Alì, Marco; Secchi, Francesco; Sardanelli, Francesco
2016-04-28
Cardiac magnetic resonance (CMR) is a non-invasive diagnostic tool playing a key role in the assessment of cardiac morphology and function as well as in tissue characterization. Late gadolinium enhancement is a fundamental CMR technique for detecting focal or regional abnormalities such as scar tissue, replacement fibrosis, or inflammation using qualitative, semi-quantitative, or quantitative methods, but not allowing for evaluating the whole myocardium in the presence of diffuse disease. The novel T1 mapping approach permits a quantitative assessment of the entire myocardium providing a voxel-by-voxel map of native T1 relaxation time, obtained before the intravenous administration of gadolinium-based contrast material. Combining T1 data obtained before and after contrast injection, it is also possible to calculate the voxel-by-voxel extracellular volume (ECV), resulting in another myocardial parametric map. This article describes technical challenges and clinical perspectives of these two novel CMR biomarkers: myocardial native T1 and ECV mapping.
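The ECV calculation from pre- and post-contrast T1 values is standard: ECV = (1 − hematocrit) × (ΔR1 of myocardium / ΔR1 of blood), where R1 = 1/T1. A short sketch (the T1 values below are illustrative, not reference ranges):

```python
def extracellular_volume(t1_myo_pre, t1_myo_post,
                         t1_blood_pre, t1_blood_post, hematocrit):
    """ECV from native (pre) and post-contrast T1 (ms) of myocardium and
    blood pool: ECV = (1 - Hct) * (dR1_myocardium / dR1_blood), R1 = 1/T1."""
    dr1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
    dr1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
    return (1.0 - hematocrit) * dr1_myo / dr1_blood

# Plausible 1.5 T values (ms); numbers are invented for illustration:
ecv = extracellular_volume(t1_myo_pre=1000.0, t1_myo_post=450.0,
                           t1_blood_pre=1550.0, t1_blood_post=300.0,
                           hematocrit=0.42)
print(f"ECV = {ecv:.1%}")
```

Applied voxel-by-voxel to the registered pre- and post-contrast T1 maps, the same formula yields the parametric ECV map described above.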
Comparative assessment of fluorescent transgene methods for quantitative imaging in human cells.
Mahen, Robert; Koch, Birgit; Wachsmuth, Malte; Politi, Antonio Z; Perez-Gonzalez, Alexis; Mergenthaler, Julia; Cai, Yin; Ellenberg, Jan
2014-11-05
Fluorescence tagging of proteins is a widely used tool to study protein function and dynamics in live cells. However, the extent to which different mammalian transgene methods faithfully report on the properties of endogenous proteins has not been studied comparatively. Here we use quantitative live-cell imaging and single-molecule spectroscopy to analyze how different transgene systems affect imaging of the functional properties of the mitotic kinase Aurora B. We show that the transgene method fundamentally influences level and variability of expression and can severely compromise the ability to report on endogenous binding and localization parameters, providing a guide for quantitative imaging studies in mammalian cells. © 2014 Mahen et al. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
NASA Astrophysics Data System (ADS)
Shvelidze, T. D.; Malyuto, V. D.
Quantitative spectral classification of F, G and K stars with the 70-cm telescope of the Abastumani Astrophysical Observatory has been performed in areas of the main meridional section of the Galaxy for which proper motion data are available. Fundamental parameters have been obtained for 333 stars in four areas. Space densities of stars of different spectral types, the stellar luminosity function, and the relationships between the kinematics and metallicity of stars have been studied. The results confirm and extend the conclusions drawn from previous spectroscopic and photometric surveys. Many plates have been obtained for other important directions in the sky: the Kapteyn areas, the Galactic anticentre and the main meridional section of the Galaxy. These data can be treated with the same quantitative method applied here. The method may also be applied to other available and future spectroscopic data of similar resolution, notably that obtained with large-format CCD detectors on Schmidt-type telescopes.
Quantitative fluorescence microscopy and image deconvolution.
Swedlow, Jason R
2013-01-01
Quantitative imaging and image deconvolution have become standard techniques for the modern cell biologist because they can form the basis of an increasing number of assays for molecular function in a cellular context. There are two major types of deconvolution approaches--deblurring and restoration algorithms. Deblurring algorithms remove blur but treat a series of optical sections as individual two-dimensional entities and therefore sometimes mishandle blurred light. Restoration algorithms determine an object that, when convolved with the point-spread function of the microscope, could produce the image data. The advantages and disadvantages of these methods are discussed in this chapter. Image deconvolution in fluorescence microscopy has usually been applied to high-resolution imaging to improve contrast and thus detect small, dim objects that might otherwise be obscured. Their proper use demands some consideration of the imaging hardware, the acquisition process, fundamental aspects of photon detection, and image processing. This can prove daunting for some cell biologists, but the power of these techniques has been proven many times in the works cited in the chapter and elsewhere. Their usage is now well defined, so they can be incorporated into the capabilities of most laboratories. A major application of fluorescence microscopy is the quantitative measurement of the localization, dynamics, and interactions of cellular factors. The introduction of green fluorescent protein and its spectral variants has led to a significant increase in the use of fluorescence microscopy as a quantitative assay system. For quantitative imaging assays, it is critical to consider the nature of the image-acquisition system and to validate its response to known standards. Any image-processing algorithms used before quantitative analysis should preserve the relative signal levels in different parts of the image. 
A very common image-processing algorithm, image deconvolution, is used to remove blurred signal from an image. Copyright © 1998 Elsevier Inc. All rights reserved.
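The restoration approach described above can be sketched with a minimal Richardson-Lucy iteration, which seeks an estimate that, convolved with the point-spread function, reproduces the data. This is a numpy-only 1-D toy (real microscopy deconvolution is 3-D and uses a measured PSF):

```python
import numpy as np

def richardson_lucy(image, psf, iterations=50):
    """Minimal 1-D Richardson-Lucy restoration: iteratively refine an
    estimate so that estimate * psf matches the observed image."""
    estimate = np.full_like(image, image.mean(), dtype=float)
    psf_mirror = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = image / np.maximum(blurred, 1e-12)  # guard against divide-by-zero
        estimate *= np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: two point sources blurred by a Gaussian PSF.
psf = np.exp(-0.5 * (np.arange(-3, 4) / 1.0) ** 2)
psf /= psf.sum()
truth = np.zeros(32)
truth[10] = 1.0
truth[20] = 0.5
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(blurred, psf, iterations=100)
print(int(np.argmax(restored)))  # index of the brightest restored sample
```

Note the multiplicative update keeps the estimate non-negative, one reason restoration algorithms are preferred for quantitative work: deblurring filters can produce negative intensities that distort subsequent measurements.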
Moffatt, Suzanne; White, Martin; Mackintosh, Joan; Howel, Denise
2006-01-01
Background In this methodological paper we document the interpretation of a mixed methods study and outline an approach to dealing with apparent discrepancies between qualitative and quantitative research data in a pilot study evaluating whether welfare rights advice has an impact on health and social outcomes among a population aged 60 and over. Methods Quantitative and qualitative data were collected contemporaneously. Quantitative data were collected from 126 men and women aged over 60 within a randomised controlled trial. Participants received a full welfare benefits assessment which successfully identified additional financial and non-financial resources for 60% of them. A range of demographic, health and social outcome measures were assessed at baseline, 6, 12 and 24 month follow up. Qualitative data were collected from a sub-sample of 25 participants purposively selected to take part in individual interviews to examine the perceived impact of welfare rights advice. Results Separate analysis of the quantitative and qualitative data revealed discrepant findings. The quantitative data showed little evidence of significant differences of a size that would be of practical or clinical interest, suggesting that the intervention had no impact on these outcome measures. The qualitative data suggested wide-ranging impacts, indicating that the intervention had a positive effect. Six ways of further exploring these data were considered: (i) treating the methods as fundamentally different; (ii) exploring the methodological rigour of each component; (iii) exploring dataset comparability; (iv) collecting further data and making further comparisons; (v) exploring the process of the intervention; and (vi) exploring whether the outcomes of the two components match. 
Conclusion The study demonstrates how using mixed methods can lead to different and sometimes conflicting accounts and, using this six step approach, how such discrepancies can be harnessed to interrogate each dataset more fully. Not only does this enhance the robustness of the study, it may lead to different conclusions from those that would have been drawn through relying on one method alone and demonstrates the value of collecting both types of data within a single study. More widespread use of mixed methods in trials of complex interventions is likely to enhance the overall quality of the evidence base. PMID:16524479
Wu, Shu-Han; Karmenyan, Artashes; Chiou, Arthur
2015-01-01
Very late antigen-4 (VLA-4), a member of the integrin superfamily, interacts with its major counter-ligand vascular cell adhesion molecule-1 (VCAM-1) and plays an important role in leukocyte adhesion to vascular endothelium and immunological synapse formation. However, irregular expression of these proteins may also lead to several autoimmune diseases and to cancer metastasis. Thus, quantifying the affinity of the VCAM-1/VLA-4 interaction is of fundamental importance in further understanding the nature of this interaction and in drug discovery. In this study, we report an ‘in solution’ steady-state, organic-fluorophore-based quantitative fluorescence resonance energy transfer (FRET) assay to quantify this interaction in terms of the dissociation constant (Kd). In our FRET assay, we used the Alexa Fluor 488-VLA-4 conjugate as the donor and Alexa Fluor 546-VCAM-1 as the acceptor. From the FRET signal analysis, the Kd of this interaction was determined to be 41.82 ± 2.36 nM. To further confirm this estimate, we employed the surface plasmon resonance (SPR) technique and obtained Kd = 39.60 ± 1.78 nM, in good agreement with the FRET result. This is the first reported work applying an organic-fluorophore-based ‘in solution’ quantitative FRET assay to obtain the dissociation constant of the VCAM-1/VLA-4 interaction, and it is also the first quantification of this interaction. Moreover, the value of Kd can serve as an indicator of abnormal protein-protein interactions; hence, this assay can potentially be developed into a drug-screening platform for VLA-4/VCAM-1 as well as other protein-ligand interactions. PMID:25793408
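A dissociation constant is typically extracted by fitting the titration signal to a single-site binding isotherm, signal = Fmax·[L]/(Kd + [L]). The sketch below fits synthetic data generated at Kd = 40 nM, chosen only to mimic the scale reported above; it is not the authors' analysis pipeline:

```python
import numpy as np
from scipy.optimize import curve_fit

def saturation(conc, fmax, kd):
    """Single-site binding isotherm: signal = Fmax * [L] / (Kd + [L])."""
    return fmax * conc / (kd + conc)

# Hypothetical FRET titration: ligand concentration (nM) vs. normalized signal,
# generated noise-free from Kd = 40 nM for illustration.
conc = np.array([5, 10, 20, 40, 80, 160, 320], dtype=float)
signal = saturation(conc, fmax=1.0, kd=40.0)

(fmax_fit, kd_fit), _ = curve_fit(saturation, conc, signal, p0=(1.0, 10.0))
print(f"fitted Kd = {kd_fit:.1f} nM")
```

With real data the covariance matrix returned by `curve_fit` supplies the uncertainty on Kd, analogous to the ± values quoted in the abstract.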
Accurate, Sensitive, and Precise Multiplexed Proteomics Using the Complement Reporter Ion Cluster
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sonnett, Matthew; Yeung, Eyan; Wuhr, Martin
2018-03-09
Quantitative analysis of proteomes across multiple time points, organelles, and perturbations is essential for understanding both fundamental biology and disease states. The development of isobaric tags (e.g. TMT) has enabled the simultaneous measurement of peptide abundances across several different conditions. These multiplexed approaches are promising in principle because of advantages in throughput and measurement quality. However, in practice existing multiplexing approaches suffer from key limitations. In its simple implementation (TMT-MS2), measurements are distorted by chemical noise, leading to poor measurement accuracy. The current state of the art (TMT-MS3) addresses this but requires specialized quadrupole-ion-trap-Orbitrap instrumentation. The complement reporter ion approach (TMTc) produces high-accuracy measurements and is compatible with many more instruments, such as quadrupole-Orbitraps. However, the required deconvolution of the TMTc cluster leads to poor measurement precision. Here, we introduce TMTc+, which adds modeling of the MS2 isolation step to the deconvolution algorithm. The resulting measurements are comparable in precision to TMT-MS3/MS2. The improved duty cycle and lower filtering requirements make TMTc+ more sensitive than TMT-MS3 and comparable with TMT-MS2. At the same time, unlike TMT-MS2, TMTc+ is exquisitely able to distinguish signal from chemical noise, even outperforming TMT-MS3. Lastly, we compare TMTc+ to quantitative label-free proteomics of total HeLa lysate and find that TMTc+ quantifies 7.8k versus 3.9k proteins in a 5-plex sample, while the median coefficient of variation improves from 13% to 4%. TMTc+ thus advances quantitative proteomics by enabling accurate, sensitive, and precise multiplexed experiments on more commonly used instruments.
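The deconvolution at the heart of complement-reporter quantification can be illustrated generically: each channel contributes a mass-shifted isotopic envelope to the observed cluster, and channel abundances are recovered by solving a non-negative linear system. This sketch is not the TMTc+ algorithm (which additionally models the MS2 isolation step); the envelope, shifts, and abundances are invented for illustration:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical isotope pattern shared by all channels:
envelope = np.array([0.70, 0.25, 0.05])
n_channels, n_bins = 3, 6

# Design matrix: column ch is the envelope shifted by ch mass bins.
A = np.zeros((n_bins, n_channels))
for ch in range(n_channels):
    A[ch:ch + len(envelope), ch] = envelope

true_abundances = np.array([10.0, 5.0, 2.0])
observed = A @ true_abundances  # noise-free observed cluster

# Recover channel abundances by non-negative least squares.
recovered, _ = nnls(A, observed)
print(np.round(recovered, 3))
```

In the noise-free, well-conditioned toy case the recovery is exact; the precision problems described above arise because real clusters are noisy and the columns of the design matrix overlap heavily, which is why refining the forward model (as TMTc+ does) improves the solution.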
Revealing and analyzing networks of environmental systems
NASA Astrophysics Data System (ADS)
Eveillard, D.; Bittner, L.; Chaffron, S.; Guidi, L.; Raes, J.; Karsenti, E.; Bowler, C.; Gorsky, G.
2015-12-01
Understanding the interactions between microbial communities and their environment well enough to predict diversity on the basis of physicochemical parameters is a fundamental pursuit of microbial ecology that still eludes us. However, modeling microbial communities is a complicated task, because (i) communities are complex, (ii) most are described only qualitatively, and (iii) quantitative understanding of the way communities interact with their surroundings remains incomplete. In this seminar, we will illustrate two complementary approaches that aim to overcome these limitations in different ways. First, we will present a network analysis that focuses on the biological carbon pump in the global ocean. The biological carbon pump is the process by which photosynthesis transforms CO2 to organic carbon, which sinks to the deep ocean as particles where it is sequestered. While the intensity of the pump correlates with plankton community composition, the underlying ecosystem structure and interactions driving this process remain largely uncharacterized. Here we use environmental and metagenomic data gathered during the Tara Oceans expedition to improve understanding of these drivers. We show that specific plankton communities correlate with carbon export and highlight unexpected and overlooked taxa such as Radiolaria, alveolate parasites and bacterial pathogens, as well as Synechococcus and their phages, as key players in the biological pump. Additionally, we show that the abundances of just a few bacterial and viral genes predict most of the variability in global ocean carbon export. Together these findings help elucidate ecosystem drivers of the biological carbon pump and present a case study for scaling from genes to ecosystems. Second, we will show preliminary results on a probabilistic modeling approach that predicts microbial community structure across observed physicochemical data, starting from a putative network and partial quantitative knowledge.
This modeling shows that, despite distinct quantitative environmental perturbations, the constraints on community structure could remain stable.
Quantitative trait loci that control the oil content variation of rapeseed (Brassica napus L.).
Jiang, Congcong; Shi, Jiaqin; Li, Ruiyuan; Long, Yan; Wang, Hao; Li, Dianrong; Zhao, Jianyi; Meng, Jinling
2014-04-01
This report describes an integrative analysis of seed-oil-content quantitative trait loci (QTL) in Brassica napus, using a high-density genetic map to align QTL among different populations. Rapeseed (Brassica napus) is an important source of edible oil and sustainable energy. Given the challenge involved in using only a few genes to substantially increase the oil content of rapeseed without affecting the fatty acid composition, exploitation of a greater number of genetic loci that regulate the oil content variation among rapeseed germplasm is of fundamental importance. In this study, we investigated variation in the seed-oil content among two related genetic populations of Brassica napus, the TN double-haploid population and its derivative reconstructed-F2 population. Each population was grown in multiple experiments under different environmental conditions. Mapping of quantitative trait loci (QTL) identified 41 QTL in the TN populations. Furthermore, of the 20 pairs of epistatic interaction loci detected, approximately one-third were located within the QTL intervals. The use of common markers on different genetic maps and the TN genetic map as a reference enabled us to project QTL from an additional three genetic populations onto the TN genetic map. In summary, we used the TN genetic map of the B. napus genome to identify 46 distinct QTL regions that control seed-oil content on 16 of the 19 linkage groups of B. napus. Of these, 18 were each detected in multiple populations. The present results are of value for ongoing efforts to breed rapeseed with high oil content, and alignment of the QTL makes an important contribution to the development of an integrative system for genetic studies of rapeseed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gritsenko, Marina A.; Xu, Zhe; Liu, Tao; Smith, Richard D.
2016-01-01
Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high-resolution and high-mass-accuracy MS analysis for high-confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
Dynamics of water bound to crystalline cellulose.
O'Neill, Hugh; Pingali, Sai Venkatesh; Petridis, Loukas; He, Junhong; Mamontov, Eugene; Hong, Liang; Urban, Volker; Evans, Barbara; Langan, Paul; Smith, Jeremy C; Davison, Brian H
2017-09-19
Interactions of water with cellulose are of both fundamental and technological importance. Here, we characterize the properties of water associated with cellulose using deuterium labeling, neutron scattering, and molecular dynamics (MD) simulation. Quasi-elastic neutron scattering (QENS) provided quantitative details about the dynamical relaxation processes that occur, supported by structural characterization using small-angle neutron scattering and X-ray diffraction. We can unambiguously detect two populations of water associated with cellulose. The first is "non-freezing bound" water that gradually becomes mobile with increasing temperature and can be related to surface water. The second population is consistent with confined water that abruptly becomes mobile at ~260 K and can be attributed to water that accumulates in the narrow spaces between the microfibrils. Quantitative analysis of the QENS data showed that, at 250 K, the water diffusion coefficient was 0.85 ± 0.04 × 10⁻¹⁰ m² s⁻¹, increasing to 1.77 ± 0.09 × 10⁻¹⁰ m² s⁻¹ at 265 K. MD simulations are in excellent agreement with the experiments and support the interpretation that water associated with cellulose exists in two dynamical populations. Our results provide clarity to previous work investigating the states of bound water and provide a new approach for probing water interactions with lignocellulose materials.
Acar, Evrim; Plopper, George E.; Yener, Bülent
2012-01-01
The structure/function relationship is fundamental to our understanding of biological systems at all levels, and drives most, if not all, techniques for detecting, diagnosing, and treating disease. However, at the tissue level of biological complexity we encounter a gap in the structure/function relationship: having accumulated an extraordinary amount of detailed information about biological tissues at the cellular and subcellular level, we cannot assemble it in a way that explains the correspondingly complex biological functions these structures perform. To help close this information gap we define here several quantitative temperospatial features that link tissue structure to its corresponding biological function. Both histological images of human tissue samples and fluorescence images of three-dimensional cultures of human cells are used to compare the accuracy of in vitro culture models with their corresponding human tissues. To the best of our knowledge, there is no prior work on a quantitative comparison of histology and in vitro samples. Features are calculated from graph theoretical representations of tissue structures and the data are analyzed in the form of matrices and higher-order tensors using matrix and tensor factorization methods, with a goal of differentiating between cancerous and healthy states of brain, breast, and bone tissues. We also show that our techniques can differentiate between the structural organization of native tissues and their corresponding in vitro engineered cell culture models. PMID:22479315
Lowry, David B.; Logan, Tierney L.; Santuari, Luca; Hardtke, Christian S.; Richards, James H.; DeRose-Wilson, Leah J.; McKay, John K.; Sen, Saunak; Juenger, Thomas E.
2013-01-01
The regulation of gene expression is crucial for an organism’s development and response to stress, and an understanding of the evolution of gene expression is of fundamental importance to basic and applied biology. To improve this understanding, we conducted expression quantitative trait locus (eQTL) mapping in the Tsu-1 (Tsushima, Japan) × Kas-1 (Kashmir, India) recombinant inbred line population of Arabidopsis thaliana across soil drying treatments. We then used genome resequencing data to evaluate whether genomic features (promoter polymorphism, recombination rate, gene length, and gene density) are associated with genes responding to the environment (E) or with genes with genetic variation (G) in gene expression in the form of eQTLs. We identified thousands of genes that responded to soil drying and hundreds of main-effect eQTLs. However, we identified very few statistically significant eQTLs that interacted with the soil drying treatment (GxE eQTL). Analysis of genome resequencing data revealed associations of several genomic features with G and E genes. In general, E genes had lower promoter diversity and local recombination rates. By contrast, genes with eQTLs (G) had significantly greater promoter diversity and were located in genomic regions with higher recombination. These results suggest that genomic architecture may play an important role in the evolution of gene expression. PMID:24045022
How to integrate quantitative information into imaging reports for oncologic patients.
Martí-Bonmatí, L; Ruiz-Martínez, E; Ten, A; Alberich-Bayarri, A
2018-05-01
Nowadays, the images and information generated in imaging tests, as well as the reports that are issued, are digital and represent a reliable source of data. According to their content and the type of information they include, reports can be classified into three main types: organized (free text in natural language); predefined (templates and guidelines written in predetermined natural language, like that used in BI-RADS and PI-RADS); and structured (drop-down menus of questions with possible answers agreed on with the rest of the multidisciplinary team, using standardized lexicons and organized as a database whose data can be traced and exploited with statistical tools and data mining). The structured report, compatible with Management of Radiology Report Templates (MRRT), makes it possible to incorporate quantitative information derived from digital analysis of the acquired images to describe the properties and behavior of tissues accurately and precisely by means of radiomics (characteristics and parameters). In conclusion, structured digital information (images, text, measurements, radiomic features, and imaging biomarkers) should be integrated into computerized reports so that it can be indexed in large repositories. Radiologic databanks are fundamental for exploiting health information, phenotyping lesions and diseases, and extracting conclusions in personalized medicine.
Complexity in Acid–Base Titrations: Multimer Formation Between Phosphoric Acids and Imines
Malm, Christian; Kim, Heejae; Wagner, Manfred
2017-01-01
Solutions of Brønsted acids with bases in aprotic solvents are not only common model systems for studying the fundamentals of proton transfer pathways but are also highly relevant to Brønsted acid catalysis. Despite their importance, the light nature of the proton makes characterization of acid-base aggregates challenging. Here, we track such acid-base interactions over a broad range of relative compositions between diphenyl phosphoric acid and the base quinaldine in dichloromethane, using a combination of dielectric relaxation and NMR spectroscopy. In contrast to what one would expect for an acid-base titration, we find strong deviations from quantitative proton transfer from the acid to the base. Even with an excess of base, multimers consisting of one base and at least two acid molecules are formed, in addition to proton transfer from the acid to the base and simultaneous formation of ion pairs. For equimolar mixtures, such multimers constitute about one third of all intermolecular aggregates. Quantitative analysis of our results shows that the acid-base association constant is only around six times larger than that for the acid binding to an acid-base dimer, that is, to an already protonated base. Our findings have implications for the interpretation of previous studies of reactive intermediates in organocatalysis and provide a rationale for previously observed nonlinear effects in phosphoric acid catalysis. PMID:28597513
Interspecific competition in plants: how well do current methods answer fundamental questions?
Connolly, J; Wayne, P; Bazzaz, F A
2001-02-01
Accurately quantifying and interpreting the processes and outcomes of competition among plants is essential for evaluating theories of plant community organization and evolution. We argue that many current experimental approaches to quantifying competitive interactions introduce size bias, which may significantly impact the quantitative and qualitative conclusions drawn from studies. Size bias generally arises when estimates of competitive ability are erroneously influenced by the initial size of competing individuals. We employ a series of quantitative thought experiments to demonstrate the potential for size bias in analysis of four traditional experimental designs (pairwise, replacement series, additive series, and response surfaces) either when only final measurements are available or when both initial and final measurements are collected. We distinguish three questions relevant to describing competitive interactions: Which species dominates? Which species gains? and How do species affect each other? The choice of experimental design and measurements greatly influences the scope of inference permitted. Conditions under which the latter two questions can give biased information are tabulated. We outline a new approach to characterizing competition that avoids size bias and that improves the concordance between research question and experimental design. The implications of the choice of size metrics used to quantify both the initial state and the responses of elements in interspecific mixtures are discussed. The relevance of size bias in competition studies with organisms other than plants is also discussed.
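The size bias described in this abstract can be made concrete with a small numerical thought experiment (illustrative numbers invented here, not taken from the study): two species grow at identical relative growth rates, so neither is competitively superior, yet an index computed from final size alone declares the initially larger species dominant.

```python
import numpy as np

# Two species grow exponentially at the SAME relative growth rate (RGR),
# but species A starts with twice the biomass of species B.
rgr = 0.05                  # identical relative growth rate (per day)
days = 30
init_A, init_B = 2.0, 1.0   # initial biomass (g)

final_A = init_A * np.exp(rgr * days)
final_B = init_B * np.exp(rgr * days)

# Final-size-only index: inherits the initial-size difference (size bias).
biased_index = final_A / final_B                       # = 2.0

# RGR-based index using BOTH initial and final measurements: no spurious
# difference, because growth rates are identical.
unbiased_index = (np.log(final_A) - np.log(init_A)) / \
                 (np.log(final_B) - np.log(init_B))    # = 1.0
```

This mirrors the paper's point that collecting initial as well as final measurements, and choosing an appropriate size metric, changes the qualitative conclusion drawn from the same experiment.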
Bizouarn, Francisco
2014-01-01
Digital PCR (dPCR) is a molecular biology technique going through a renaissance. With the arrival of new instrumentation, dPCR can now be performed as a routine molecular biology assay. This exciting technique provides quantitative and detection capabilities that far surpass other methods currently in use. This chapter gives an overview of some applications currently performed using dPCR, as well as the fundamental concepts and techniques on which this technology is based.
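The fundamental concept behind dPCR quantification is Poisson statistics on partitioned reactions: from the fraction of positive partitions, the mean number of template copies per partition follows as λ = -ln(1 - p). A minimal sketch (the droplet count and partition volume below are illustrative assumptions, not values from this chapter):

```python
import math

def dpcr_copies_per_partition(n_positive, n_total):
    """Mean template copies per partition (lambda), assuming Poisson loading:
    P(negative partition) = exp(-lambda)  =>  lambda = -ln(1 - p_positive)."""
    p = n_positive / n_total
    return -math.log(1.0 - p)

def dpcr_concentration(n_positive, n_total, partition_volume_nl):
    """Template concentration in copies per microliter of reaction mix."""
    lam = dpcr_copies_per_partition(n_positive, n_total)
    return lam / (partition_volume_nl * 1e-3)  # convert nL -> uL

# Hypothetical run: 4,000 of 20,000 droplets positive, 0.85 nL droplets.
lam = dpcr_copies_per_partition(4000, 20000)       # ~0.223 copies/droplet
conc = dpcr_concentration(4000, 20000, 0.85)       # ~262 copies/uL
```

Because the estimate comes from counting partitions rather than from a calibration curve, this is the sense in which dPCR is an absolute quantification method.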
Defining Soldier Intent in a Human-Robot Natural Language Interaction Context
2017-10-01
this burden on the human and expand the scope of human–robot operations, this project investigates fundamental research issues in the autonomous...attempted to devise a quantitative metric for the Shared Interpretation of Commander’s Intent (SICI). The authors’ background research indicated that...Another interesting set of results were the cases where the battalion and company commanders disagreed on the meaning of key terms, such as “delay”, which
Non-destructive evaluation of composite materials using ultrasound
NASA Technical Reports Server (NTRS)
Miller, J. G.
1984-01-01
Investigation of the nondestructive evaluation of advanced composite laminates is summarized. Indices derived from the measurement of fundamental acoustic parameters are used to quantitatively estimate the local material properties of the laminate. The following sections describe ongoing studies of phase-insensitive attenuation measurements and discuss several phenomena that influence the previously reported technique of polar backscatter. A simple and effective programmable gate circuit designed for use in estimating attenuation from backscatter is described.
NASA Astrophysics Data System (ADS)
Brereton, Margot Felicity
A series of short engineering exercises and design projects was created to help students learn to apply abstract knowledge to physical experiences with hardware. The exercises involved designing machines from kits of materials and dissecting and analyzing familiar household products. Students worked in teams. During the activities students brought their knowledge of engineering fundamentals to bear. Videotape analysis was used to identify and characterize the ways in which hardware contributed to learning fundamental concepts. Structural and qualitative analyses of videotaped activities were undertaken. Structural analysis involved counting the references to theory and hardware and the extent of interleaving of references in activity. The analysis found that there was much more discussion linking fundamental concepts to hardware in some activities than in others. The analysis showed that the interleaving of references to theory and hardware in activity is observable and quantifiable. Qualitative analysis was used to investigate the dialog linking concepts and hardware. Students were found to advance their designs and their understanding of engineering fundamentals through a negotiation process in which they pitted abstract concepts against hardware behavior. Through this process students sorted out theoretical assumptions and causal relations. In addition they discovered design assumptions, functional connections and physical embodiments of abstract concepts in hardware, developing a repertoire of familiar hardware components and machines. Hardware was found to be integral to learning, affecting the course of inquiry and the dynamics of group interaction. Several case studies are presented to illustrate the processes at work. The research illustrates the importance of working across the boundary between abstractions and experiences with hardware in order to learn engineering and physical sciences. 
The research findings are: (a) the negotiation process by which students discover fundamental concepts in hardware (and three central causes of negotiation breakdown); (b) a characterization of the ways that material systems contribute to learning activities (the seven roles of hardware in learning); (c) the characteristics of activities that support discovering fundamental concepts in hardware (plus several engineering exercises); (d) a research methodology to examine how students learn in practice.
Retrieval of haze properties and HCN concentrations from the three-micron spectrum of Titan
NASA Astrophysics Data System (ADS)
Kim, Sang J.; Lee, D. W.; Sim, C. K.; Seon, K. I.; Courtin, R.; Geballe, T. R.
2018-05-01
The 3 μm spectrum of Titan contains line emission and absorption as well as a significant haze continuum. The line emission has been previously analyzed in the literature, but that analysis has not properly included the influence of haze on the line emission. We report a new analysis of the 3 μm HCN emission spectrum using radiative transfer equations that include scattering and absorption by molecules and haze particles at altitudes lower than 500 km, where the influence of haze on the emergent spectrum becomes significant. Taking advantage of the dominance of resonant single scattering in the HCN ν3 fundamental and of the moderate haze optical thickness of the atmosphere around 3 μm, we adopt single dust and molecular scattering and present a formulation for the radiative transfer process. We evaluate the quantitative influence of haze scattering on the emission line intensities, and derive vertically-resolved single scattering albedos of the haze from model fits. We also present the resulting concentrations of HCN for altitudes below 500 km, where we find that the haze scattering significantly influences the retrieval of the concentrations of HCN. We conclude that the formulation we present is useful for the analysis of the HCN line emission from Titan and other similar hazy planetary or celestial objects.
Disentangling Time-series Spectra with Gaussian Processes: Applications to Radial Velocity Analysis
NASA Astrophysics Data System (ADS)
Czekala, Ian; Mandel, Kaisey S.; Andrews, Sean M.; Dittmann, Jason A.; Ghosh, Sujit K.; Montet, Benjamin T.; Newton, Elisabeth R.
2017-05-01
Measurements of radial velocity variations from the spectroscopic monitoring of stars and their companions are essential for a broad swath of astrophysics; these measurements provide access to the fundamental physical properties that dictate all phases of stellar evolution and facilitate the quantitative study of planetary systems. The conversion of those measurements into both constraints on the orbital architecture and individual component spectra can be a serious challenge, however, especially for extreme flux ratio systems and observations with relatively low sensitivity. Gaussian processes define sampling distributions of flexible, continuous functions that are well-motivated for modeling stellar spectra, enabling proficient searches for companion lines in time-series spectra. We introduce a new technique for spectral disentangling, where the posterior distributions of the orbital parameters and intrinsic, rest-frame stellar spectra are explored simultaneously without needing to invoke cross-correlation templates. To demonstrate its potential, this technique is deployed on red-optical time-series spectra of the mid-M-dwarf binary LP661-13. We report orbital parameters with improved precision compared to traditional radial velocity analysis and successfully reconstruct the primary and secondary spectra. We discuss potential applications for other stellar and exoplanet radial velocity techniques and extensions to time-variable spectra. The code used in this analysis is freely available as an open-source Python package.
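The Gaussian-process machinery referred to above can be illustrated with a minimal sketch: conditioning a zero-mean GP with a squared-exponential kernel on a noisy toy "spectrum." This is illustrative only; the authors' method jointly models orbital parameters and component spectra, which is omitted here, and all numbers below are assumptions.

```python
import numpy as np

def rbf_kernel(x1, x2, amp=1.0, ls=1.0):
    """Squared-exponential covariance between two coordinate vectors."""
    d = x1[:, None] - x2[None, :]
    return amp**2 * np.exp(-0.5 * (d / ls)**2)

def gp_posterior_mean(x_train, y_train, x_test, noise=1e-2, amp=1.0, ls=1.0):
    """Posterior mean of a zero-mean GP conditioned on noisy observations."""
    K = rbf_kernel(x_train, x_train, amp, ls) + noise**2 * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train, amp, ls)
    return K_star @ np.linalg.solve(K, y_train)

# Toy "spectrum": a smooth absorption-line-like dip sampled with noise.
rng = np.random.default_rng(0)
wl = np.linspace(-3, 3, 40)                       # wavelength-like axis
flux = 1.0 - 0.5 * np.exp(-0.5 * wl**2) + 0.01 * rng.normal(size=wl.size)
fit = gp_posterior_mean(wl, flux - flux.mean(), wl, noise=0.01, ls=0.5) + flux.mean()
```

The appeal for spectral disentangling is that the GP defines a distribution over smooth continuous functions, so a rest-frame stellar spectrum can be inferred without committing to a cross-correlation template.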
Thermodynamic Cycle Analysis of Magnetohydrodynamic-Bypass Hypersonic Airbreathing Engines
NASA Technical Reports Server (NTRS)
Litchford, R. J.; Cole, J. W.; Bityurin, V. A.; Lineberry, J. T.
2000-01-01
The prospects for realizing a magnetohydrodynamic (MHD) bypass hypersonic airbreathing engine are examined from the standpoint of fundamental thermodynamic feasibility. The MHD-bypass engine, first proposed as part of the Russian AJAX vehicle concept, is based on the idea of redistributing energy between various stages of the propulsion system flow train. The system uses an MHD generator to extract a portion of the aerodynamic heating energy from the inlet and an MHD accelerator to reintroduce this power as kinetic energy in the exhaust stream. In this way, the combustor entrance Mach number can be limited to a specified value even as the flight Mach number increases. Thus, the fuel and air can be efficiently mixed and burned within a practical combustor length, and the flight Mach number operating envelope can be extended. In this paper, we quantitatively assess the performance potential and scientific feasibility of MHD-bypass engines using a simplified thermodynamic analysis. This cycle analysis, based on a thermally and calorically perfect gas, incorporates a coupled MHD generator-accelerator system and accounts for aerodynamic losses and thermodynamic process efficiencies in the various engine components. It is found that the flight Mach number range can be significantly extended; however, overall performance is hampered by non-isentropic losses in the MHD devices.
Applications of mid-infrared spectroscopy in the clinical laboratory setting.
De Bruyne, Sander; Speeckaert, Marijn M; Delanghe, Joris R
2018-01-01
Fourier transform mid-infrared (MIR-FTIR) spectroscopy is a nondestructive, label-free, highly sensitive and specific technique that provides complete information on the chemical composition of biological samples. The technique can both offer fundamental structural information and serve as a quantitative analysis tool. Therefore, it has many potential applications in different fields of clinical laboratory science. Although considerable technological progress has been made to promote biomedical applications of this powerful analytical technique, most clinical laboratory analyses are based on spectroscopic measurements in the visible or ultraviolet (UV) spectrum, and the potential role of FTIR spectroscopy remains largely unexplored. In this review, we present some general principles of FTIR spectroscopy as a useful method to study molecules in specimens by MIR radiation, together with a short overview of methods to interpret spectral data. We aim to illustrate the wide range of potential applications of the technique in the clinical laboratory setting, with a focus on its advantages and limitations, and we discuss future directions. The reviewed applications of MIR spectroscopy include (1) quantification of clinical parameters in body fluids, (2) diagnosis and monitoring of cancer and other diseases by analysis of body fluids, cells, and tissues, (3) classification of clinically relevant microorganisms, and (4) analysis of kidney stones, nails, and faecal fat.
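Quantification of clinical parameters from MIR spectra ultimately rests on the Beer-Lambert law, A = εlc: band absorbance is linear in analyte concentration, so a calibration line fitted to standards can be inverted for unknowns. A minimal sketch with synthetic calibration numbers (all values below are invented for illustration):

```python
import numpy as np

# Synthetic calibration standards: concentration vs. peak absorbance of an
# analyte band (Beer-Lambert: A = epsilon * l * c + baseline offset).
conc_std = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # mmol/L standards
absorb   = np.array([0.02, 0.13, 0.24, 0.46, 0.90])   # band absorbance (a.u.)

slope, intercept = np.polyfit(conc_std, absorb, 1)    # least-squares line

def predict_concentration(a_measured):
    """Invert the calibration line to estimate an unknown concentration."""
    return (a_measured - intercept) / slope

c_unknown = predict_concentration(0.35)   # -> 3.0 mmol/L for this data
```

In practice, multivariate methods (e.g. partial least squares over many wavenumbers) replace this single-band fit, but the underlying linearity assumption is the same.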
Epidemics in Ming and Qing China: Impacts of changes of climate and economic well-being.
Pei, Qing; Zhang, David D; Li, Guodong; Winterhalder, Bruce; Lee, Harry F
2015-07-01
We investigated the mechanism of epidemics under the impacts of climate change and socio-economic fluctuations in the Ming and Qing Dynasties in China (AD 1368-1901). Using long-term, high-quality datasets, this study is the first quantitative research to verify the 'climate change → economy → epidemics' mechanism in historical China by statistical methods, including correlation analysis, Granger causality analysis, ARX, and Poisson-ARX modeling. The analysis provides evidence that, at the national and long-term scale in historical China, climate change was only the fundamental, underlying driver of epidemic spread and occurrence, whereas depressed economic well-being was the direct trigger. Moreover, statistical modeling shows that economic well-being is more important than population pressure in the mechanism of epidemics. However, population pressure remains a key element in determining social vulnerability to epidemic occurrence under climate change. Notably, the findings not only support adaptation theories but also strengthen confidence that climatic shocks can be addressed if economic buffering capacity is promoted steadily. The findings can serve as a basis for scientists and policymakers in addressing global and regional environmental changes.
A Flexible Analysis Tool for the Quantitative Acoustic Assessment of Infant Cry
Reggiannini, Brian; Sheinkopf, Stephen J.; Silverman, Harvey F.; Li, Xiaoxue; Lester, Barry M.
2015-01-01
Purpose: In this article, the authors describe and validate the performance of a modern acoustic analyzer specifically designed for infant cry analysis. Method: Utilizing known algorithms, the authors developed a method to extract acoustic parameters describing infant cries from standard digital audio files. They used a frame rate of 25 ms with a frame advance of 12.5 ms. Cepstral-based acoustic analysis proceeded in 2 phases, computing frame-level data and then organizing and summarizing this information within cry utterances. Using signal detection methods, the authors evaluated the accuracy of the automated system to determine voicing and to detect fundamental frequency (F0) as compared to voiced segments and pitch periods manually coded from spectrogram displays. Results: The system detected F0 with 88% to 95% accuracy, depending on tolerances set at 10 to 20 Hz. Receiver operating characteristic analyses demonstrated very high accuracy at detecting voicing characteristics in the cry samples. Conclusions: This article describes an automated infant cry analyzer with high accuracy to detect important acoustic features of cry. A unique and important aspect of this work is the rigorous testing of the system’s accuracy as compared to ground-truth manual coding. The resulting system has implications for basic and applied research on infant cry development. PMID:23785178
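The framing and cepstral F0 steps described above can be sketched as follows. This is a crude illustration, not the authors' analyzer: the 25 ms / 12.5 ms framing matches the abstract, while the Hann window, the 200-600 Hz search band, and the synthetic harmonic test tone are assumptions introduced here.

```python
import numpy as np

def frame_signal(x, fs, frame_ms=25.0, advance_ms=12.5):
    """Slice a signal into overlapping frames (25 ms frames, 12.5 ms advance)."""
    flen = int(fs * frame_ms / 1000)
    step = int(fs * advance_ms / 1000)
    n = 1 + max(0, (len(x) - flen) // step)
    return np.stack([x[i * step : i * step + flen] for i in range(n)])

def cepstral_f0(frame, fs, fmin=200.0, fmax=600.0):
    """Crude cepstral pitch estimate: peak quefrency within an F0 search band."""
    spectrum = np.fft.rfft(frame * np.hanning(len(frame)))
    cep = np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))
    qmin, qmax = int(fs / fmax), int(fs / fmin)
    q = qmin + np.argmax(cep[qmin:qmax])
    return fs / q

fs = 8000
t = np.arange(0, 1.0, 1 / fs)
# Synthetic harmonic "cry" with a 400 Hz fundamental (cries sit near this range).
cry = sum(np.sin(2 * np.pi * 400 * k * t) / k for k in range(1, 9))
frames = frame_signal(cry, fs)
f0 = cepstral_f0(frames[0], fs)
```

A full analyzer would additionally classify each frame as voiced or unvoiced before trusting its F0 estimate, which is the signal-detection evaluation the abstract describes.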
Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie
2015-01-01
To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-polyethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and also to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and by semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of the semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery: 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity, and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity, and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372).
Compared with visual analysis or semi-quantitative analysis alone, visual analysis combined with semi-quantitative analysis gave higher sensitivity, specificity, and accuracy in diagnosing primary breast cancer: 87.5%, 82.9%, and 85.4%, respectively, with an area under the curve of 0.891. The results of the present study suggest that semi-quantitative and visual analysis give statistically similar results, but that semi-quantitative analysis provides incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. In particular, when the tumor was located in the medial part of the breast, semi-quantitative analysis gave better diagnostic results.
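The reported visual-analysis figures (sensitivity 81.3%, specificity 70.7%, accuracy 76.4% over 48 malignant and 41 benign lesions) are arithmetically consistent with 39 true positives and 29 true negatives. The sketch below just reproduces that arithmetic; the individual counts are inferred here, not stated in the abstract.

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Standard confusion-matrix summaries for a binary diagnostic test."""
    sensitivity = tp / (tp + fn)              # true-positive rate
    specificity = tn / (tn + fp)              # true-negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Inferred counts: 39/48 malignant lesions detected, 29/41 benign correctly ruled out.
sens, spec, acc = diagnostic_metrics(tp=39, fn=9, tn=29, fp=12)
# sens = 0.8125 (81.3%), spec ~= 0.7073 (70.7%), acc ~= 0.7640 (76.4%)
```

The same function applied to the combined visual plus semi-quantitative reading would use the counts behind the 87.5%/82.9%/85.4% figures.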
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; ...
2014-03-26
Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. In this paper, an electronic sensor platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing is reported. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of “epidermal” electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. Finally, the results have the potential to address important unmet needs in chronic wound management.
Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R Chad; Bonifas, Andrew P; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A; Huang, Yonggang; West, Dennis P; Paller, Amy S; Alam, Murad; Yeo, Woon-Hong; Rogers, John A
2014-10-01
Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here, an electronic sensor platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing is reported. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of "epidermal" electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management.
Yap, John Stephen; Fan, Jianqing; Wu, Rongling
2009-12-01
Estimation of the covariance structure of longitudinal processes is a fundamental prerequisite for the practical deployment of functional mapping designed to study the genetic regulation and network of quantitative variation in dynamic complex traits. We present a nonparametric approach for estimating the covariance structure of a quantitative trait measured repeatedly at a series of time points. Specifically, we adopt Huang et al.'s (2006, Biometrika 93, 85-98) approach of invoking the modified Cholesky decomposition and converting the problem into modeling a sequence of regressions of responses. A regularized covariance estimator is obtained using a normal penalized likelihood with an L2 penalty. This approach, embedded within a mixture likelihood framework, leads to enhanced accuracy, precision, and flexibility of functional mapping while preserving its biological relevance. Simulation studies are performed to reveal the statistical properties and advantages of the proposed method. A real example from a mouse genome project is analyzed to illustrate the utilization of the methodology. The new method will provide a useful tool for genome-wide scanning for the existence and distribution of quantitative trait loci underlying a dynamic trait important to agriculture, biology, and health sciences.
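The regression view of the modified Cholesky decomposition can be sketched as follows: regress each time point on its predecessors with an L2 penalty, collect the coefficients in a unit lower-triangular matrix T and the residual variances in a diagonal D, and recover the covariance from T Σ Tᵀ = D. This is a simplified ridge-regression sketch of that idea, not the authors' penalized-likelihood estimator (their mixture-likelihood and functional-mapping machinery is omitted).

```python
import numpy as np

def cholesky_ridge_covariance(Y, lam=0.1):
    """Covariance estimate for longitudinal data Y (subjects x time points)
    via the modified Cholesky decomposition with ridge-regularized
    autoregressive coefficients."""
    n, T = Y.shape
    Yc = Y - Y.mean(axis=0)
    Tm = np.eye(T)                       # unit lower-triangular T
    d = np.empty(T)
    d[0] = Yc[:, 0].var()
    for t in range(1, T):
        X = Yc[:, :t]                    # predecessors of time point t
        phi = np.linalg.solve(X.T @ X + lam * np.eye(t), X.T @ Yc[:, t])
        Tm[t, :t] = -phi                 # so (Tm @ y)_t is the regression residual
        d[t] = (Yc[:, t] - X @ phi).var()
    Tinv = np.linalg.inv(Tm)
    return Tinv @ np.diag(d) @ Tinv.T    # Sigma-hat = T^-1 D T^-T

# Check on synthetic longitudinal data with an AR(1)-like true covariance.
rng = np.random.default_rng(1)
true_cov = 0.5 ** np.abs(np.subtract.outer(np.arange(4), np.arange(4)))
Y = rng.multivariate_normal(np.zeros(4), true_cov, size=500)
Sigma_hat = cholesky_ridge_covariance(Y, lam=0.5)
```

Because D has positive entries and T is invertible, the estimate is positive definite by construction, which is one practical attraction of the Cholesky-based parameterization.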
Imaging mRNA In Vivo, from Birth to Death.
Tutucci, Evelina; Livingston, Nathan M; Singer, Robert H; Wu, Bin
2018-05-20
RNA is the fundamental information transfer system in the cell. The ability to follow single messenger RNAs (mRNAs) from transcription to degradation with fluorescent probes gives quantitative information about how information is transferred from DNA to proteins. This review focuses on the latest technological developments in the field of single-mRNA detection and their usage to study gene expression in both fixed and live cells. By describing the application of these imaging tools, we follow the journey of mRNA from transcription to decay in single cells, with single-molecule resolution. We review current theoretical models for describing transcription and translation that were generated by single-molecule and single-cell studies. These methods provide a basis to study how single-molecule interactions generate phenotypes, fundamentally changing our understanding of gene expression regulation.
Kelvin–Helmholtz instability in an ultrathin air film causes drop splashing on smooth surfaces
Liu, Yuan; Tan, Peng; Xu, Lei
2015-01-01
When a fast-moving drop impacts onto a smooth substrate, splashing will be produced at the edge of the expanding liquid sheet. This ubiquitous phenomenon lacks a fundamental understanding. Combining experiment with model, we illustrate that the ultrathin air film trapped under the expanding liquid front triggers splashing. Because this film is thinner than the mean free path of air molecules, the interior airflow transfers momentum with an unusually high velocity comparable to the speed of sound and generates a stress 10 times stronger than the airflow in common situations. Such a large stress initiates Kelvin–Helmholtz instabilities at small length scales and effectively produces splashing. Our model agrees quantitatively with experimental verifications and brings a fundamental understanding to the ubiquitous phenomenon of drop splashing on smooth surfaces. PMID:25713350
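The regime argument above, that the trapped film is thinner than the mean free path of air, amounts to a Knudsen-number estimate. The numbers below are illustrative textbook values, not measurements from the study:

```python
# Knudsen number Kn = lambda / h: Kn > 1 means air molecules cross the
# film ballistically, so continuum lubrication theory no longer applies.
mean_free_path_air = 68e-9  # m, textbook value for air at ~1 atm, 20 C
film_thickness = 10e-9      # m, illustrative sub-mean-free-path film

Kn = mean_free_path_air / film_thickness
print(f"Kn = {Kn:.1f}")
assert Kn > 1  # free-molecular regime, as the paper's argument requires
```

In this free-molecular regime, momentum is carried across the film at molecular speeds (comparable to the speed of sound), which is what produces the anomalously large stress described above.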
Evidence-based dentistry: fundamentals for the dentist.
Bauer, Janet; Chiappelli, Francesco; Spackman, Sue; Prolo, Paolo; Stevenson, Richard
2006-06-01
This article explains the fundamentals of evidence-based dentistry for the dentist. Evidence-based dentistry is a discipline whose primary participant is the translational researcher. Recent developments have emphasized the importance of this discipline (clinical and translational research) for improving health care. The process of evidence-based dentistry is the reciprocation of new and existing evidence between dentists and quantitative and qualitative researchers, facilitated by the translational researcher. The product of this reciprocation is the clinical practice guideline, or best evidence, that provides the patient options in choosing treatments or services. These options are quantified and qualified by decision, utility, and cost data. Using shared decision-making, the dentist and patient arrive at a mutual understanding of which option best meets an acceptable and preferred treatment course that is cost effective. This option becomes the clinical decision.
Primordial Evolution in the Finitary Process Soup
NASA Astrophysics Data System (ADS)
Görnerup, Olof; Crutchfield, James P.
A general and basic model of primordial evolution—a soup of reacting finitary and discrete processes—is employed to identify and analyze fundamental mechanisms that generate and maintain complex structures in prebiotic systems. The processes—ɛ-machines as defined in computational mechanics—and their interaction networks both provide well defined notions of structure. This enables us to quantitatively demonstrate hierarchical self-organization in the soup in terms of complexity. We found that replicating processes evolve the strategy of successively building higher levels of organization by autocatalysis. Moreover, this is facilitated by local components that have low structural complexity, but high generality. In effect, the finitary process soup spontaneously evolves a selection pressure that favors such components. In light of the finitary process soup's generality, these results suggest a fundamental law of hierarchical systems: global complexity requires local simplicity.
Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
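A minimal sketch of the prioritization idea follows. The scenario names, the 1-5 scales, and the ratio-style combination rule are all illustrative stand-ins, not the paper's actual metric:

```python
from dataclasses import dataclass

@dataclass
class HazardScenario:
    name: str
    severity: int             # 1 (minor) .. 5 (catastrophic)
    likelihood: int           # 1 (rare) .. 5 (frequent)
    modeling_difficulty: int  # 1 (easy to model) .. 5 (intractable)

def priority(s: HazardScenario) -> float:
    # High risk (severity x likelihood) and low modeling difficulty make
    # a scenario the most attractive target for quantitative analysis.
    return (s.severity * s.likelihood) / s.modeling_difficulty

scenarios = [
    HazardScenario("wake encounter on parallel approach", 5, 3, 2),
    HazardScenario("runway incursion", 5, 2, 4),
    HazardScenario("minor taxi deviation", 1, 4, 1),
]
for s in sorted(scenarios, key=priority, reverse=True):
    print(f"{s.name}: priority {priority(s):.1f}")
```

The point of the third metric is visible here: a severe, likely scenario that is very hard to model can rank below a milder one where quantitative analysis is actually feasible.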
Muntlin Athlin, Åsa
2018-06-01
To examine and map research on minimum data sets linked to nursing practice and the fundamentals of care. Another aim was to identify gaps in the evidence to suggest future research questions to highlight the need for standardisation of terminology around nursing practice and fundamental care. Addressing fundamental care has been highlighted internationally as a response to missed nursing care. Systematic performance measurements are needed to capture nursing practice outcomes. Overview of the literature framed by the scoping study methodology. PubMed and CINAHL were searched using the following inclusion criteria: peer-reviewed empirical quantitative and qualitative studies related to minimum data sets and nursing practice published in English. No time restrictions were set. Exclusion criteria were as follows: no available full text, reviews and methodological and discursive studies. Data were categorised into one of the fundamentals of care elements. The review included 20 studies published in 1999-2016. Settings were mainly nursing homes or hospitals. Of 14 elements of the fundamentals of care, 11 were identified as measures in the included studies, but their frequency varied. The most commonly identified elements concerned safety, prevention and medication (n = 11), comfort (n = 6) and eating and drinking (n = 5). Studies have used minimum data sets and included variables linked to nursing practices and fundamentals of care. However, the relations of these variables to nursing practice were not always clearly described and the main purpose of the studies was seldom to measure the outcomes of nursing interventions. More robust studies focusing on nursing practice and patient outcomes are warranted. Using minimum data sets can highlight the nurses' work and what impact it has on direct patient care. Appropriate models, systems and standardised terminology are needed to facilitate the documentation of nursing activities. © 2017 John Wiley & Sons Ltd.
Siebert, Uwe; Rochau, Ursula; Claxton, Karl
2013-01-01
Decision analysis (DA) and value-of-information (VOI) analysis provide a systematic, quantitative methodological framework that explicitly considers the uncertainty surrounding the currently available evidence to guide healthcare decisions. In medical decision making under uncertainty, there are two fundamental questions: 1) What decision should be made now given the best available evidence (and its uncertainty)?; 2) Subsequent to the current decision and given the magnitude of the remaining uncertainty, should we gather further evidence (i.e., perform additional studies), and if yes, which studies should be undertaken (e.g., efficacy, side effects, quality of life, costs), and what sample sizes are needed? Using the currently best available evidence, VOI analysis focuses on the likelihood of making a wrong decision if the new intervention is adopted. The value of performing further studies and gathering additional evidence is based on the extent to which the additional information will reduce this uncertainty. A quantitative framework allows for the valuation of the additional information that is generated by further research, and considers the decision maker's objectives and resource constraints. Claxton et al. summarise: "Value of information analysis can be used to inform a range of policy questions including whether a new technology should be approved based on existing evidence, whether it should be approved but additional research conducted or whether approval should be withheld until the additional evidence becomes available." [Claxton K. Value of information entry in Encyclopaedia of Health Economics, Elsevier, forthcoming 2014.] The purpose of this tutorial is to introduce the framework of systematic VOI analysis to guide further research. In our tutorial article, we explain the theoretical foundations and practical methods of decision analysis and value-of-information analysis.
To illustrate, we use a simple case example of a foot ulcer (e.g., with diabetes) as well as key references from the literature, including examples for the use of the decision-analytic VOI framework by health technology assessment agencies to guide further research. These concepts may guide stakeholders involved or interested in how to determine whether or not and, if so, which additional evidence is needed to make decisions. Copyright © 2013. Published by Elsevier GmbH.
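The core VOI quantity, the expected value of perfect information (EVPI), can be sketched by simulation: it is the gap between deciding per-draw with perfect knowledge and deciding once on expected values. The two-option net-benefit distributions below are purely illustrative, not figures from the tutorial:

```python
import numpy as np

rng = np.random.default_rng(0)

# Net benefit of two treatment options under parameter uncertainty,
# represented by simulated posterior draws (numbers are invented).
n = 100_000
nb_standard = rng.normal(loc=10_000, scale=2_000, size=n)
nb_new      = rng.normal(loc=10_500, scale=4_000, size=n)
nb = np.column_stack([nb_standard, nb_new])

# Decide now: pick the option with the highest expected net benefit.
value_current_info = nb.mean(axis=0).max()

# With perfect information we could pick the best option in every draw.
value_perfect_info = nb.max(axis=1).mean()

evpi = value_perfect_info - value_current_info
print(f"EVPI per patient: {evpi:.0f}")
assert evpi >= 0  # additional information can never have negative value
```

A large EVPI (relative to the cost of research, scaled to the affected population) signals that further studies may be worthwhile; an EVPI near zero says the decision is already robust to the remaining uncertainty.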
Modeling activity patterns of wildlife using time-series analysis.
Zhang, Jindong; Hull, Vanessa; Ouyang, Zhiyun; He, Liang; Connor, Thomas; Yang, Hongbo; Huang, Jinyan; Zhou, Shiqiang; Zhang, Zejun; Zhou, Caiquan; Zhang, Hemin; Liu, Jianguo
2017-04-01
The study of wildlife activity patterns is an effective approach to understanding fundamental ecological and evolutionary processes. However, traditional statistical approaches used to conduct quantitative analysis have thus far had limited success in revealing underlying mechanisms driving activity patterns. Here, we combine wavelet analysis, a type of frequency-based time-series analysis, with high-resolution activity data from accelerometers embedded in GPS collars to explore the effects of internal states (e.g., pregnancy) and external factors (e.g., seasonal dynamics of resources and weather) on activity patterns of the endangered giant panda (Ailuropoda melanoleuca). Giant pandas exhibited higher frequency cycles during the winter when resources (e.g., water and forage) were relatively poor, as well as during spring, which includes the giant panda's mating season. During the summer and autumn when resources were abundant, pandas exhibited a regular activity pattern with activity peaks every 24 hr. A pregnant individual showed distinct differences in her activity pattern from other giant pandas for several months following parturition. These results indicate that animals adjust activity cycles to adapt to seasonal variation of the resources and unique physiological periods. Wavelet coherency analysis also verified the synchronization of giant panda activity level with air temperature and solar radiation at the 24-hr band. Our study also shows that wavelet analysis is an effective tool for analyzing high-resolution activity pattern data and its relationship to internal and external states, an approach that has the potential to inform wildlife conservation and management across species.
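As a simpler frequency-domain cousin of the wavelet analysis (which additionally localizes cycles in time), a plain FFT periodogram can recover a dominant 24-hr cycle from activity data. The hourly series below is synthetic, invented purely for illustration:

```python
import numpy as np

# Synthetic hourly activity: a 24-h cycle plus noise over 60 days.
rng = np.random.default_rng(1)
hours = np.arange(24 * 60)
activity = (1.0 + 0.5 * np.sin(2 * np.pi * hours / 24)
            + rng.normal(0, 0.2, hours.size))

# Periodogram of the mean-removed series; frequencies in cycles/hour.
spectrum = np.abs(np.fft.rfft(activity - activity.mean()))
freqs = np.fft.rfftfreq(hours.size, d=1.0)
dominant_period = 1.0 / freqs[spectrum.argmax()]
print(f"dominant period: {dominant_period:.1f} h")
```

A wavelet transform goes further by showing *when* the 24-hr band is strong, which is what lets the study detect seasonal shifts to higher-frequency cycles and the post-parturition change in one individual.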
Conroy, Tiffany
2017-11-17
To explore the factors described by nurses and consumer representatives influencing the delivery of the fundamentals of care. An ongoing challenge facing nursing is ensuring the "basics" or fundamentals of care are delivered optimally. The way nurses and patients perceive the delivery of the fundamentals of care had not been explored. Once identified, the factors that promote the delivery of the fundamentals of care may be facilitated. Inductive content analysis of scenario-based focus groups. A qualitative approach was taken using three stages, including direct observation, focus groups and interviews. This paper reports the second stage. Focus groups discussed four patient care scenarios derived from the observational data. Focus groups were conducted separately for registered nurses, nurses in leadership roles and consumer representatives. Content analysis was used. The analysis of the focus group data resulted in three themes: Organisational factors; Individual nurse or patient factors; and Interpersonal factors. Organisational factors include nursing leadership, the context of care delivery and the availability of time. Individual nurse and patient factors include the specific care needs of the patient and the individual nurse and patient characteristics. Interpersonal factors include the nurse-patient relationship; involving the patient in their care, ensuring understanding and respecting choices; communication; and setting care priorities. Seeking the perspective of the people involved in delivering and receiving the fundamentals of care showed a shared understanding of the factors influencing the delivery of the fundamentals of care. The influence of nursing leadership and the quality of the nurse-patient relationship were perceived as important factors. Nurses and consumers share a common perspective of the factors influencing the delivery of the fundamentals of care and both value a therapeutic nurse-patient relationship.
Clinical nursing leaders must understand the impact of their role in shaping the delivery of the fundamentals of care. © 2017 John Wiley & Sons Ltd.
On the role of electron-driven processes in planetary atmospheres and comets
NASA Astrophysics Data System (ADS)
Campbell, L.; Brunger, M. J.
2009-11-01
After the presence of ionized layers in the Earth's atmosphere was inferred, it took 50 years to quantitatively understand them. The electron density could not be accounted for until Sir David Bates first suggested (along with Sir Harrie Massey) that the main electron-loss process was dissociative recombination with molecular ions, and he and colleagues then developed a theory to predict those rates of dissociative recombination. However, electron impact processes, particularly excitation, have been considered insignificant in most situations, in both planetary and cometary atmospheres. Here we describe cases where recent calculations have shown that electron impact excitation of molecules is important, suggesting that, just as in the time of Sir David Bates, electron-driven processes remain fundamental to our quantitative understanding of atmospheric and cometary phenomena.
Nick, Todd G
2007-01-01
Statistics is defined by the Medical Subject Headings (MeSH) thesaurus as the science and art of collecting, summarizing, and analyzing data that are subject to random variation. The two broad categories of summarizing and analyzing data are referred to as descriptive and inferential statistics. This chapter considers the science and art of summarizing data where descriptive statistics and graphics are used to display data. In this chapter, we discuss the fundamentals of descriptive statistics, including describing qualitative and quantitative variables. For describing quantitative variables, measures of location and spread, for example the standard deviation, are presented along with graphical presentations. We also discuss distributions of statistics, for example the variance, as well as the use of transformations. The concepts in this chapter are useful for uncovering patterns within the data and for effectively presenting the results of a project.
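The quantitative and qualitative summaries described above can be sketched with Python's standard library. The blood-pressure and smoking-status values are invented for illustration:

```python
import statistics
from collections import Counter

# A quantitative variable: measures of location and spread
systolic_bp = [118, 122, 130, 125, 141, 119, 128]  # mmHg, invented sample
print("mean:", round(statistics.mean(systolic_bp), 1))  # location
print("median:", statistics.median(systolic_bp))        # robust location
print("sd:", round(statistics.stdev(systolic_bp), 1))   # spread

# A qualitative variable: a frequency table
smoking_status = ["never", "former", "never", "current", "never"]
print(Counter(smoking_status))
```

Reporting a measure of location together with a measure of spread, and a frequency table for categorical variables, is the minimal descriptive summary the chapter recommends before any inferential analysis.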
Strength of signal: a fundamental mechanism for cell fate specification.
Hayes, Sandra M; Love, Paul E
2006-02-01
How equipotent cells develop into complex tissues containing many diverse cell types is still a mystery. However, evidence is accumulating from different tissue systems in multiple organisms that many of the specific receptor families known to regulate cell fate decisions target conserved signaling pathways. A mechanism for preserving specificity in the cellular response that has emerged from these studies is one in which quantitative differences in receptor signaling regulate the cell fate decision. A signal strength model has recently gained support as a means to explain alphabeta/gammadelta lineage commitment. In this review, we compare the alphabeta/gammadelta fate decision with other cell fate decisions that occur outside of the lymphoid system to attain a better picture of the quantitative signaling mechanism for cell fate specification.
Embryoids, organoids and gastruloids: new approaches to understanding embryogenesis
2017-01-01
Cells have an intrinsic ability to self-assemble and self-organize into complex and functional tissues and organs. By taking advantage of this ability, embryoids, organoids and gastruloids have recently been generated in vitro, providing a unique opportunity to explore complex embryological events in a detailed and highly quantitative manner. Here, we examine how such approaches are being used to answer fundamental questions in embryology, such as how cells self-organize and assemble, how the embryo breaks symmetry, and what controls timing and size in development. We also highlight how further improvements to these exciting technologies, based on the development of quantitative platforms to precisely follow and measure subcellular and molecular events, are paving the way for a more complete understanding of the complex events that help build the human embryo. PMID:28292844
Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia
2013-05-30
In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.
Sub-aperture switching based ptychographic iterative engine (sasPIE) method for quantitative imaging
NASA Astrophysics Data System (ADS)
Sun, Aihui; Kong, Yan; Jiang, Zhilong; Yu, Wei; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng
2018-03-01
Although the ptychographic iterative engine (PIE) has been widely adopted for quantitative micro-imaging with various illuminations such as visible light, X-rays and electron beams, mechanical inaccuracy in the raster scanning of the sample relative to the illumination seriously degrades reconstruction quality and keeps the achievable resolution well below that determined by the numerical aperture of the optical system. To overcome this disadvantage, the sub-aperture switching based PIE method is proposed: the mechanical scanning of common PIE is replaced by sub-aperture switching, so the reconstruction error related to positioning inaccuracy is avoided entirely. The proposed technique remarkably improves reconstruction quality, reduces the complexity of the experimental setup, and fundamentally accelerates data acquisition and reconstruction.
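The object-update step at the heart of PIE-type reconstructions can be sketched as follows. This is the generic ePIE-style update (in the Maiden-Rodenburg form), not the sasPIE-specific sub-aperture machinery, and the function name is illustrative:

```python
import numpy as np

def pie_object_update(obj, probe, psi_corrected, psi, alpha=1.0):
    """One ePIE-style object update.

    obj                 : current object estimate over the illuminated region
    probe               : complex illumination function
    psi / psi_corrected : exit wave before / after enforcing the measured
                          diffraction modulus in the detector plane
    """
    step = alpha * np.conj(probe) / (np.abs(probe) ** 2).max()
    return obj + step * (psi_corrected - psi)

# Sanity check: with a unit probe the update just adds the exit-wave change
o = np.ones(4, dtype=complex)
p = np.ones(4, dtype=complex)
assert np.allclose(pie_object_update(o, p, o + 0.1, o), o + 0.1)
```

In conventional PIE this update is applied at each scan position, so any error in the assumed position corrupts where the correction lands; switching sub-apertures electronically instead of moving the sample removes that error source.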
Research on evaluating water resource resilience based on projection pursuit classification model
NASA Astrophysics Data System (ADS)
Liu, Dong; Zhao, Dan; Liang, Xu; Wu, Qiuchen
2016-03-01
Water is a fundamental natural resource, and agricultural water use underpins grain output, so the utilization and management of water resources have significant practical importance. Regional agricultural water resource systems are unpredictable, self-organizing, and non-linear, which makes evaluating their resilience difficult. Current research on water resource resilience remains largely qualitative, and quantitative analysis is still at an early stage; to address this, a projection pursuit classification model is proposed. The artificial fish-swarm algorithm (AFSA) is applied to optimize the projection index function and search for the optimal projection direction, and AFSA itself is improved through a self-adaptive artificial fish step and crowding factor. Taking the Hongxinglong Administration of Heilongjiang as the study area, and building on the improved AFSA, a projection pursuit classification model is established to evaluate the resilience of the agricultural water resource system, alongside an analysis of a projection pursuit classification model based on an accelerating genetic algorithm. The research shows that Hongxinglong has the highest water resource resilience, followed by Raohe Farm, with 597 Farm last. Further analysis shows that the key driving factors influencing agricultural water resource resilience are precipitation and agricultural water consumption. The results reveal the recovery capacity of the local water resource system and provide a foundation for agricultural water resource management.
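A minimal stand-in for the projection pursuit classification step is sketched below: random search over unit directions replaces the improved AFSA optimizer, the sample variance of the projected data stands in for the paper's projection index, and the three-farm indicator matrix is invented for illustration:

```python
import numpy as np

def projection_pursuit(X, n_trials=2000, seed=0):
    """Find a unit direction a maximizing a simple projection index
    (variance of the standardized data projected onto a).

    Random search is a crude stand-in for the AFSA optimizer used in
    the paper; both just maximize the projection index over directions.
    """
    rng = np.random.default_rng(seed)
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    best_score, best_a = -np.inf, None
    for _ in range(n_trials):
        a = rng.normal(size=X.shape[1])
        a /= np.linalg.norm(a)
        score = (Xs @ a).var()  # projection index: spread of projections
        if score > best_score:
            best_score, best_a = score, a
    if best_a.sum() < 0:        # a direction's sign is arbitrary; fix it
        best_a = -best_a
    return best_a, best_score, Xs @ best_a

# Invented indicator matrix: rows = three hypothetical farms,
# columns = resilience indicators (higher = better), already in [0, 1].
X = np.array([[0.8, 0.7, 0.9],
              [0.6, 0.5, 0.7],
              [0.3, 0.4, 0.2]])
a, score, proj = projection_pursuit(X)
print("direction:", np.round(a, 3), " projected scores:", np.round(proj, 3))
```

The projected scores give a one-dimensional ranking of the samples, which is how the classification model orders regions by resilience; the real method's projection index and AFSA search are more elaborate than this sketch.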