NASA Astrophysics Data System (ADS)
Sader, John E.; Uchihashi, Takayuki; Higgins, Michael J.; Farrell, Alan; Nakayama, Yoshikazu; Jarvis, Suzanne P.
2005-03-01
Use of the atomic force microscope (AFM) in quantitative force measurements inherently requires a theoretical framework enabling conversion of the observed deflection properties of the cantilever to an interaction force. In this paper, the theoretical foundations of using frequency modulation atomic force microscopy (FM-AFM) in quantitative force measurements are examined and rigorously elucidated, with consideration being given to both 'conservative' and 'dissipative' interactions. This includes a detailed discussion of the underlying assumptions involved in such quantitative force measurements, the presentation of globally valid explicit formulae for evaluation of so-called 'conservative' and 'dissipative' forces, discussion of the origin of these forces, and analysis of the applicability of FM-AFM to quantitative force measurements in liquid.
QTest: Quantitative Testing of Theories of Binary Choice.
Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
Measuring Aircraft Capability for Military and Political Analysis
1976-03-01
challenged in 1932 when a panel of distinguished British scientists discussed the feasibility of quantitatively estimating sensory events... Quantitative Analysis of Social Problems, E.R. Tufte (ed.), p. 407, Addison-Wesley, 1970. 17 "artificial" boundaries are imposed on the data. Less...of arms transfers in various parts of the world as well. Quantitative research (and hence measurement) contributes to theoretical development by
EnviroLand: A Simple Computer Program for Quantitative Stream Assessment.
ERIC Educational Resources Information Center
Dunnivant, Frank; Danowski, Dan; Timmens-Haroldson, Alice; Newman, Meredith
2002-01-01
Introduces the Enviroland computer program which features lab simulations of theoretical calculations for quantitative analysis and environmental chemistry, and fate and transport models. Uses the program to demonstrate the nature of linear and nonlinear equations. (Author/YDS)
ERIC Educational Resources Information Center
Dunlop, David Livingston
The purpose of this study was to use an information theoretic memory model to quantitatively investigate classification sorting and recall behaviors of various groups of students. The model provided theorems for the determination of information theoretic measures from which inferences concerning mental processing were made. The basic procedure…
Collegiate Grading Practices and the Gender Pay Gap.
ERIC Educational Resources Information Center
Dowd, Alicia C.
2000-01-01
Presents a theoretical analysis showing that relatively low-grading quantitative fields and high-grading verbal fields create a disincentive for college women to invest in quantitative study. Extends research by R. Sabot and J. Wakeman-Linn. Models pressures on grading practices using higher education production functions. (Author/SLD)
ERIC Educational Resources Information Center
Shandra, John M.; Nobles, Jenna E.; London, Bruce; Williamson, John B.
2005-01-01
This study presents quantitative, sociological models designed to account for cross-national variation in child mortality. We consider variables linked to five different theoretical perspectives that include the economic modernization, social modernization, political modernization, ecological-evolutionary, and dependency perspectives. The study is…
Quantitation in chiral capillary electrophoresis: theoretical and practical considerations.
D'Hulst, A; Verbeke, N
1994-06-01
Capillary electrophoresis (CE) represents a decisive step forward in stereoselective analysis. The present paper deals with the theoretical aspects of the quantitation of peak separation in chiral CE. Because peak shape in CE is very different from that in high performance liquid chromatography (HPLC), the resolution factor Rs, commonly used to describe the extent of separation between enantiomers as well as unrelated compounds, is demonstrated to be of limited value for the assessment of chiral separations in CE. Instead, the conjunct use of a relative chiral separation factor (RCS) and the percent chiral separation (%CS) is advocated, and an array of examples is given to illustrate this. The practical aspects of method development using maltodextrins--which have been proposed previously as a major innovation in chiral selectors applicable in CE--are documented with the stereoselective analysis of coumarinic anticoagulant drugs. The possibilities of quantitation using CE were explored under two extreme conditions. Using ibuprofen, it is demonstrated that enantiomeric excess determinations are possible down to a 1% level of optical contamination, and that stereoselective determinations remain possible with good precision near the detection limit by increasing the sample load through very long injection times. The theoretical aspects of this possibility are addressed in the discussion.
Development of Nomarski microscopy for quantitative determination of surface topography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartman, J. S.; Gordon, R. L.; Lessor, D. L.
1979-01-01
The use of Nomarski differential interference contrast (DIC) microscopy has been extended to provide nondestructive, quantitative analysis of a sample's surface topography. Theoretical modeling has determined the dependence of the image intensity on the microscope's optical components, the sample's optical properties, and the sample's surface orientation relative to the microscope. Results include expressions to allow the inversion of image intensity data to determine sample surface slopes. A commercial Nomarski system has been modified and characterized to allow the evaluation of the optical model. Data have been recorded with smooth, planar samples that verify the theoretical predictions.
Electric and Magnetic Interactions
NASA Astrophysics Data System (ADS)
Chabay, Ruth W.; Sherwood, Bruce A.
1994-08-01
The curriculum has been restructured so that students will have the necessary fundamental understanding of charges and fields before going on to more complex issues. Qualitative reasoning and quantitative analysis are discussed equally in order to provide a meaningful conceptual framework within which the quantitative work makes more sense. Atomic-level analysis is stressed and electrostatics and circuits are unified. Desktop experiments can be conducted at home or in the classroom and are tightly integrated with the theoretical treatment.
NASA Astrophysics Data System (ADS)
Chen, Shichao; Zhu, Yizheng
2017-02-01
Sensitivity is a critical index of the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis of sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot-noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, a major category of on-axis interferometry techniques in quantitative phase imaging. Based on these derivations, we show that the shot-noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
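The shot-noise scaling analyzed in this abstract can be probed with a simple Monte Carlo sketch. The code below uses a generic four-step temporal phase-shifting estimator (shifts of 0, pi/2, pi, 3pi/2), not the paper's wavelength-shifting algorithm, and approximates Poisson shot noise as Gaussian with variance equal to the mean; all parameter values and function names are illustrative.

```python
import math
import random

def retrieve_phase(i1, i2, i3, i4):
    """Four-step phase-shifting estimator for shifts 0, pi/2, pi, 3pi/2."""
    return math.atan2(i4 - i2, i1 - i3)

def phase_noise_std(phi, n_photons=100_000, trials=2000, seed=0):
    """Std of the retrieved phase under shot noise (Gaussian approximation),
    for n_photons total photons split over four frames, unit visibility."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        frames = []
        for k in range(4):
            mean = (n_photons / 4) * (1 + math.cos(phi + k * math.pi / 2))
            frames.append(rng.gauss(mean, math.sqrt(mean)))
        estimates.append(retrieve_phase(*frames))
    mu = sum(estimates) / trials
    return math.sqrt(sum((e - mu) ** 2 for e in estimates) / trials)

# For this estimator, error propagation predicts a shot-noise-limited
# std near sqrt(2 / n_photons) at unit fringe visibility.
print(phase_noise_std(0.7), math.sqrt(2 / 100_000))
```

The simulated standard deviation should land close to the propagated shot-noise value, which is the kind of direct-from-data sensitivity estimate the abstract describes.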
Python for Information Theoretic Analysis of Neural Data
Ince, Robin A. A.; Petersen, Rasmus S.; Swan, Daniel C.; Panzeri, Stefano
2008-01-01
Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. PMID:19242557
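The plug-in (maximum likelihood) estimator underlying such analyses can be sketched in a few lines of Python. The function names here are illustrative, not the authors' toolkit API, and the sketch omits the limited-sampling bias corrections the article discusses.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Plug-in (maximum-likelihood) entropy estimate, in bits."""
    n = len(xs)
    return -sum((c / n) * log2(c / n) for c in Counter(xs).values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), plug-in estimate in bits."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

# A stimulus that fully determines the response carries H(X) bits:
stim = [0, 1, 0, 1, 0, 1, 0, 1]
resp = [1, 0, 1, 0, 1, 0, 1, 0]
print(mutual_information(stim, resp))  # → 1.0
```

With real neural data the naive estimate above is biased upward when trials are few, which is exactly the limited-sampling problem the abstract highlights.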
Statistical design of quantitative mass spectrometry-based proteomic experiments.
Oberg, Ann L; Vitek, Olga
2009-05-01
We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
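As a toy illustration of the class-comparison setting the review describes (not the authors' code), the one-way ANOVA F statistic can be computed directly from its sum-of-squares definition; the group data below are invented.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group over within-group
    mean squares, with k - 1 and n - k degrees of freedom."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical log-intensities of one peptide in two biological groups
control = [10.1, 10.4, 10.2, 10.3]
disease = [11.0, 11.3, 10.9, 11.2]
print(one_way_anova_f([control, disease]))
```

In a designed proteomic experiment the same comparison would be blocked by run or batch, as the review emphasizes; the sketch keeps only the bare two-group comparison.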
The Intercultural Component in Textbooks for Teaching a Service Technical Writing Course
ERIC Educational Resources Information Center
Matveeva, Natalia
2007-01-01
This research article investigates new developments in the representation of the intercultural component in textbooks for a service technical writing course. Through textual analysis, using quantitative and qualitative techniques, I report discourse analysis of 15 technical writing textbooks published during 1993-2006. The theoretical and…
Remmersmann, Christian; Stürwald, Stephan; Kemper, Björn; Langehanenberg, Patrik; von Bally, Gert
2009-03-10
In temporal phase-shifting-based digital holographic microscopy, high-resolution phase contrast imaging requires optimized conditions for hologram recording and phase retrieval. To optimize the phase resolution, for the example of a variable three-step algorithm, a theoretical analysis on statistical errors, digitalization errors, uncorrelated errors, and errors due to a misaligned temporal phase shift is carried out. In a second step the theoretically predicted results are compared to the measured phase noise obtained from comparative experimental investigations with several coherent and partially coherent light sources. Finally, the applicability for noise reduction is demonstrated by quantitative phase contrast imaging of pancreas tumor cells.
Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy
Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong
2016-01-01
Local surface charge density of lipid membranes influences membrane–protein interactions leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite the significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids with results accurately matching theoretical values. PMID:27561322
Multi-frequency local wavenumber analysis and ply correlation of delamination damage.
Juarez, Peter D; Leckey, Cara A C
2015-09-01
Wavenumber domain analysis through use of scanning laser Doppler vibrometry has been shown to be effective for non-contact inspection of damage in composites. Qualitative and semi-quantitative local wavenumber analysis of realistic delamination damage and quantitative analysis of idealized damage scenarios (Teflon inserts) have been performed previously in the literature. This paper presents a new methodology based on multi-frequency local wavenumber analysis for quantitative assessment of multi-ply delamination damage in carbon fiber reinforced polymer (CFRP) composite specimens. The methodology is presented and applied to a real world damage scenario (impact damage in an aerospace CFRP composite). The methodology yields delamination size and also correlates local wavenumber results from multiple excitation frequencies to theoretical dispersion curves in order to robustly determine the delamination ply depth. Results from the wavenumber based technique are validated against a traditional nondestructive evaluation method. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Sakashita, Tatsuo; Chazono, Hirokazu; Pezzotti, Giuseppe
2007-12-01
A quantitative determination of domain distribution in polycrystalline barium titanate (BaTiO3, henceforth BT) ceramics has been pursued with the aid of a microprobe polarized Raman spectrometer. The crystallographic texture and domain orientation distribution of BT ceramics, which switched upon applying stress according to ferroelasticity principles, were determined from the relative intensity of selected phonon modes, taking into consideration a theoretical analysis of the angular dependence of phonon mode intensity for the tetragonal BT phase. Furthermore, the angular dependence of Raman intensity measured in polycrystalline BT depended on the statistical distribution of domain angles in the laser microprobe, which was explicitly taken into account in this work for obtaining a quantitative analysis of domain orientation for in-plane textured BT polycrystalline materials.
Dehmer, Matthias; Kurt, Zeyneb; Emmert-Streib, Frank; Them, Christa; Schulc, Eva; Hofer, Sabine
2015-01-01
In this paper, we investigate treatment cycles inferred from diabetes data by means of graph theory. We define the term treatment cycles graph-theoretically and perform a descriptive as well as quantitative analysis thereof. Also, we interpret our findings in terms of nursing and clinical management. PMID:26030296
Theoretical Analysis of an Iron Mineral-Based Magnetoreceptor Model in Birds
Solov'yov, Ilia A.; Greiner, Walter
2007-01-01
Sensing the magnetic field has been established as an essential part of navigation and orientation of various animals for many years. Only recently has the first detailed receptor concept for magnetoreception been published based on histological and physical results. The considered mechanism involves two types of iron minerals (magnetite and maghemite) that were found in subcellular compartments within sensory dendrites of the upper beak of several bird species. But so far a quantitative evaluation of the proposed receptor is missing. In this article, we develop a theoretical model to quantitatively and qualitatively describe the magnetic field effects among particles containing iron minerals. The analysis of forces acting between these subcellular compartments shows a particular dependence on the orientation of the external magnetic field. The iron minerals in the beak are found in the form of crystalline maghemite platelets and assemblies of magnetite nanoparticles. We demonstrate that the pull or push to the magnetite assemblies, which are connected to the cell membrane, may reach a value of 0.2 pN—sufficient to excite specific mechanoreceptive membrane channels in the nerve cell. The theoretical analysis of the assumed magnetoreceptor system in the avian beak skin clearly shows that it might indeed be a sensitive biological magnetometer providing an essential part of the magnetic map for navigation. PMID:17496012
Qualitative Research in Career Development: Content Analysis from 1990 to 2009
ERIC Educational Resources Information Center
Stead, Graham B.; Perry, Justin C.; Munka, Linda M.; Bonnett, Heather R.; Shiban, Abbey P.; Care, Esther
2012-01-01
A content analysis of 11 journals that published career, vocational, and work-related articles from 1990 to 2009 was conducted. Of 3,279 articles analyzed, 55.9% used quantitative methods and 35.5% were theoretical/conceptual articles. Only 6.3% used qualitative research methods. Among the qualitative empirical studies, standards of academic rigor…
ERIC Educational Resources Information Center
Roessger, Kevin M.
2017-01-01
Translating theory to practice has been a historical concern of adult education. It remains unclear, though, if adult education's theoretical and epistemological focus on meaning making transcends the academy. A manifest content analysis was conducted to determine if the frequency of meaning making language differed between the field's U.S.…
Liang, Lihua; Sun, Mingxiao; Shi, Hongyu; Luan, Tiantian
2017-01-01
Fin-angle feedback control is usually used in conventional fin stabilizers, and its actual anti-rolling effect is difficult to reach theoretical design requirements. Primarily, lift of control torque is a theoretical value calculated by static hydrodynamic characteristics of fin. However, hydrodynamic characteristics of fin are dynamic while fin is moving in waves. As a result, there is a large deviation between actual value and theoretical value of lift. Firstly, the reasons of deviation are analyzed theoretically, which could avoid a variety of interference factors and complex theoretical derivations. Secondly, a new device is designed for direct measurement of actual lift, which is composed of fin-shaft combined mechanism and sensors. This new device can make fin-shaft not only be the basic function of rotating fin, but also detect actual lift. Through analysis using stiffness matrix of Euler-Bernoulli beam, displacement of shaft-core end is measured instead of lift which is difficult to measure. Then quantitative relationship between lift and displacement is defined. Three main factors are analyzed with quantitative relationship. What is more, two installation modes of sensors and a removable shaft-end cover are proposed according to hydrodynamic characteristics of fin. Thus the new device contributes to maintenance and measurement. Lastly, the effectiveness and accuracy of device are verified by contrasting calculation and simulation on the basis of actual design parameters. And the new measuring lift method can be proved to be effective through experiments. The new device is achieved from conventional fin stabilizers. Accordingly, the reliability of original equipment is inherited. The alteration of fin stabilizers is minor, which is suitable for engineering application. In addition, the flexural properties of fin-shaft are digitized with analysis of stiffness matrix. 
This method provides theoretical support for engineering application by carrying out finite element analysis with computers. PMID:28046122
Borgese, L; Salmistraro, M; Gianoncelli, A; Zacco, A; Lucchini, R; Zimmerman, N; Pisani, L; Siviero, G; Depero, L E; Bontempi, E
2012-01-30
This work improves on a recently introduced method for the analysis of airborne particulate matter (PM) filters [1]. X-ray standing wave (XSW) and total reflection X-ray fluorescence (TXRF) measurements were performed with new dedicated laboratory instrumentation. The main advantage of performing both XSW and TXRF is the ability to determine the nature of the sample: a small dried droplet residue, a thin-film-like layer, or a bulk sample. A further advantage is the ability to select the angle of total reflection for the TXRF measurements. Finally, the possibility of switching the X-ray source (for example, from a Mo to a Cu anode) allows lighter and heavier elements to be measured with greater accuracy. The aim of the present study is to lay the theoretical foundation of the proposed method for quantitative analysis of airborne PM filters, improving the accuracy and efficiency of quantification by means of an external standard. The theoretical model presented and discussed demonstrates that airborne PM filters can be treated as thin layers. A set of reference samples was prepared in the laboratory and used to obtain a calibration curve. Our results demonstrate that the proposed method for quantitative analysis of airborne PM filters is affordable and reliable without the need to digest the filters to obtain quantitative chemical analysis, and that the use of XSW improves the accuracy of the TXRF analysis. Copyright © 2011 Elsevier B.V. All rights reserved.
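External-standard quantification of the kind described reduces, in the thin-film regime, to a linear calibration: fluorescence counts are proportional to concentration, the slope (elemental sensitivity) is fitted from the reference samples, and unknowns are recovered by inversion. The sketch below is a generic illustration with invented numbers, not the authors' calibration data.

```python
def fit_sensitivity(concentrations, counts):
    """Least-squares slope through the origin for counts = k * concentration
    (valid in the thin-film regime, where absorption corrections vanish)."""
    num = sum(c * y for c, y in zip(concentrations, counts))
    den = sum(c * c for c in concentrations)
    return num / den

def quantify(measured_counts, sensitivity):
    """Invert the calibration to recover the concentration of an unknown."""
    return measured_counts / sensitivity

# Hypothetical reference samples: concentration (ng/cm^2) vs. net counts
ref_conc = [10.0, 20.0, 40.0, 80.0]
ref_counts = [1500.0, 3000.0, 6000.0, 12000.0]
k = fit_sensitivity(ref_conc, ref_counts)
print(quantify(4500.0, k))  # → 30.0
```

A slope constrained through the origin is the simplest choice; a real calibration would also check linearity and background subtraction across the reference set.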
The Definition, Rationale, and Effects of Thresholding in OCT Angiography.
Cole, Emily D; Moult, Eric M; Dang, Sabin; Choi, WooJhon; Ploner, Stefan B; Lee, ByungKun; Louzada, Ricardo; Novais, Eduardo; Schottenhamml, Julia; Husvogt, Lennart; Maier, Andreas; Fujimoto, James G; Waheed, Nadia K; Duker, Jay S
2017-01-01
To examine the definition, rationale, and effects of thresholding in OCT angiography (OCTA). A theoretical description of OCTA thresholding in combination with qualitative and quantitative analysis of the effects of OCTA thresholding in eyes from a retrospective case series. Four eyes were qualitatively examined: 1 from a 27-year-old control, 1 from a 78-year-old exudative age-related macular degeneration (AMD) patient, 1 from a 58-year-old myopic patient, and 1 from a 77-year-old nonexudative AMD patient with geographic atrophy (GA). One eye from a 75-year-old nonexudative AMD patient with GA was quantitatively analyzed. A theoretical thresholding model and a qualitative and quantitative description of the dependency of OCTA on thresholding level. Due to the presence of system noise, OCTA thresholding is a necessary step in forming OCTA images; however, thresholding can complicate the relationship between blood flow and OCTA signal. Thresholding in OCTA can cause significant artifacts, which should be considered when interpreting and quantifying OCTA images.
The brainstem reticular formation is a small-world, not scale-free, network
Humphries, M.D; Gurney, K; Prescott, T.J
2005-01-01
Recently, it has been demonstrated that several complex systems may have simple graph-theoretic characterizations as so-called ‘small-world’ and ‘scale-free’ networks. These networks have also been applied to the gross neural connectivity between primate cortical areas and the nervous system of Caenorhabditis elegans. Here, we extend this work to a specific neural circuit of the vertebrate brain—the medial reticular formation (RF) of the brainstem—and, in doing so, we have made three key contributions. First, this work constitutes the first model (and quantitative review) of this important brain structure for over three decades. Second, we have developed the first graph-theoretic analysis of vertebrate brain connectivity at the neural network level. Third, we propose simple metrics to quantitatively assess the extent to which the networks studied are small-world or scale-free. We conclude that the medial RF is configured to create small-world (implying coherent rapid-processing capabilities), but not scale-free, type networks under assumptions which are amenable to quantitative measurement. PMID:16615219
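The two graph statistics at the heart of such a small-world analysis, the mean clustering coefficient C and the characteristic path length L, can be computed directly; a small-world index then compares both to a degree-matched random graph, roughly (C/C_rand)/(L/L_rand). The code below is a generic sketch on a toy undirected graph, not the authors' reticular-formation model.

```python
from collections import deque

def mean_clustering(adj):
    """Average local clustering coefficient of an undirected graph,
    given as {node: set_of_neighbours}."""
    total = 0.0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue  # clustering is taken as 0 for degree < 2
        links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def char_path_length(adj):
    """Mean shortest-path length over connected ordered pairs, via BFS."""
    total, pairs = 0, 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            v = queue.popleft()
            for u in adj[v]:
                if u not in dist:
                    dist[u] = dist[v] + 1
                    queue.append(u)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

# Toy 5-node ring with one shortcut (edge 0-2): C ≈ 0.33, L = 1.4
adj = {0: {1, 2, 4}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4}, 4: {0, 3}}
print(mean_clustering(adj), char_path_length(adj))
```

A small-world network scores high C (like a lattice) but low L (like a random graph); computing both against a randomized null model is what makes the assessment quantitative.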
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, X; Arbique, G; Guild, J
Purpose: To evaluate the quantitative image quality of spectral reconstructions of phantom data from a spectral CT scanner. Methods: The spectral CT scanner (IQon Spectral CT, Philips Healthcare) is equipped with a dual-layer detector and generates conventional 80-140 kVp images and a variety of spectral reconstructions, e.g., virtual monochromatic (VM) images, virtual non-contrast (VNC) images, iodine maps, and effective atomic number (Z) images. A cylindrical solid water phantom (Gammex 472, 33 cm diameter and 5 cm thick) with iodine (2.0-20.0 mg I/ml) and calcium (50-600 mg/ml) rod inserts was scanned at 120 kVp and 27 mGy CTDIvol. Spectral reconstructions were evaluated by comparing image measurements with theoretical values calculated from nominal rod compositions provided by the phantom manufacturer. The theoretical VNC was calculated using water and iodine basis material decomposition, and the theoretical Z was calculated using two common methods, the chemical formula method (Z1) and the dual-energy ratio method (Z2). Results: Beam-hardening-like artifacts between high-attenuation calcium rods (≥300 mg/ml, >800 HU) influenced quantitative measurements, so the quantitative analysis was performed only on iodine rods, using images from the scan with all the calcium rods removed. The CT numbers of the iodine rods in the VM images (50-150 keV) were close to theoretical values, with an average difference of 2.4±6.9 HU. Compared with theoretical values, the average differences for iodine concentration, VNC CT number, and effective Z of the iodine rods were −0.10±0.38 mg/ml, −0.1±8.2 HU, 0.25±0.06 (Z1), and −0.23±0.07 (Z2). Conclusion: The results indicate that the spectral CT scanner generates quantitatively accurate spectral reconstructions at clinically relevant iodine concentrations. Beam-hardening-like artifacts still exist when high-attenuation objects are present, and their impact on patient images needs further investigation.
YY is an employee of Philips Healthcare.
Screening hypochromism (sieve effect) in red blood cells: a quantitative analysis
Razi Naqvi, K.
2014-01-01
Multiwavelength UV-visible spectroscopy, Kramers-Kronig analysis, and several other experimental and theoretical tools have been applied over the last several decades to fathom absorption and scattering of light by suspensions of micron-sized pigmented particles, including red blood cells, but a satisfactory quantitative analysis of the difference between the absorption spectra of suspension of intact and lysed red blood cells is still lacking. It is stressed that such a comparison is meaningful only if the pertinent spectra are free from, or have been corrected for, scattering losses, and it is shown that Duysens’ theory can, whereas that of Vekshin cannot, account satisfactorily for the observed hypochromism of suspensions of red blood cells. PMID:24761307
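The flattening attributed to the sieve effect already appears in the simplest one-layer form of Duysens' treatment: if the absorber is confined to particles covering a fraction f of the beam, the suspension transmits T = 1 - f(1 - 10^(-A_p)), giving a lower apparent absorbance than the same pigment dispersed uniformly (absorbance f·A_p). The sketch below is a minimal numerical illustration of that one-layer limit, with invented values, not the full multilayer analysis of the paper.

```python
import math

def suspension_absorbance(f, a_particle):
    """One-layer Duysens model: absorbance of a suspension whose particles
    cover a fraction f of the beam, each with internal absorbance a_particle."""
    transmittance = 1.0 - f * (1.0 - 10.0 ** (-a_particle))
    return -math.log10(transmittance)

f, a_p = 0.5, 1.0
uniform = f * a_p                      # same pigment dispersed uniformly
sieve = suspension_absorbance(f, a_p)  # pigment packaged in particles
print(sieve < uniform)  # → True: packaging lowers the apparent absorbance
```

This hypochromism grows with the internal absorbance of the particles, which is why strongly absorbing cells such as erythrocytes show the effect so clearly.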
ERIC Educational Resources Information Center
Mikkelsen, Kim Sass
2017-01-01
Contemporary case studies rely on verbal arguments and set theory to build or evaluate theoretical claims. While existing procedures excel in the use of qualitative information (information about kind), they ignore quantitative information (information about degree) at central points of the analysis. Effectively, contemporary case studies rely on…
The Application of Cybernetics in Pedagogy.
ERIC Educational Resources Information Center
ATUTOV, P.R.
The application of cybernetics to pedagogy can create a precise science of instruction and education through the time-consuming but inevitable transition from identification of qualitative relationships among pedagogical objects to quantitative analysis of these objects. The theoretical utility of mathematical models and formulae for explanatory…
Balaev, Mikhail
2014-07-01
The author examines how time-delayed effects of economic development, education, and gender equality influence political democracy. A literature review shows inadequate understanding of lagged effects, which raises methodological and theoretical issues with current quantitative studies of democracy. Using country-years as the unit of analysis, the author estimates a series of OLS PCSE models for each predictor with a systematic analysis of the distributions of the lagged effects. A second set of multiple OLS PCSE regressions is estimated including all three independent variables. The results show that economic development, education, and gender equality have three unique trajectories of time-delayed effects: economic development has long-term effects, education produces continuous effects regardless of timing, and gender equality has the most prominent immediate and short-term effects. The results call for a reassessment of model specifications and theoretical setups in the quantitative studies of democracy. Copyright © 2014 Elsevier Inc. All rights reserved.
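The idea of a time-delayed regression effect can be sketched with plain OLS on a synthetic series. This is a minimal sketch: the PCSE corrections and country-year panel structure the study uses are omitted, and the data are simulated, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_slope(x, y):
    """Slope of y on x with an intercept, via least squares."""
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1]

def lagged_effect(x, y, lag):
    """Regress y_t on x_{t-lag} to estimate the effect at a given lag."""
    return ols_slope(x[:-lag], y[lag:])

# Synthetic series where the predictor affects the outcome 5 periods later.
T = 500
x = rng.normal(size=T)
y = np.concatenate([rng.normal(size=5), 2.0 * x[:-5]]) \
    + rng.normal(scale=0.1, size=T)
```

Scanning `lagged_effect` over a range of lags gives the kind of lag-distribution profile the study analyzes; here the slope is near 2 at lag 5 and near 0 elsewhere.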
NASA Technical Reports Server (NTRS)
Wilcox, W. R.; Subramanian, R. S.; Meyyappan, M.; Smith, H. D.; Mattox, D. M.; Partlow, D. P.
1981-01-01
Thermal fining, thermal migration of bubbles under reduced gravity conditions, and data to verify current theoretical models of bubble location and temperatures as a function of time are discussed. A sample, sodium borate glass, was tested during 5 to 6 minutes of zero gravity during rocket flight. The test cell contained a heater strip; thermocouples were in the sample. At present quantitative data are insufficient to confirm results of theoretical calculations.
Wright, Kevin A; Bouffard, Leana A
2016-02-01
The qualitative analysis of individual cases has a prominent place in the development of criminological theory, yet progress in the scientific study of crime has largely been viewed as a distinctly quantitative endeavor. In the process, much of the theoretical depth and precision supplied by earlier methods of criminological knowledge production has been sacrificed. The current work argues for a return to our criminological roots by supplementing quantitative analyses with the qualitative inspection of individual cases. We provide a specific example of a literature (i.e., criminal specialization/versatility) that has become increasingly quantitative and could benefit from the proposed approach. We conclude by offering additional areas of research that might be advanced by the framework presented here. © The Author(s) 2014.
Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh
2011-03-01
It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a subject of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified, and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers clarify their theoretical propositions and the basis of their results.
This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.
A collection of flow visualization techniques used in the Aerodynamic Research Branch
NASA Technical Reports Server (NTRS)
1984-01-01
Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that the visualization technique is a precursor to thorough quantitative analysis and subsequent physical understanding of these flow fields.
Ernst, Marielle; Kriston, Levente; Romero, Javier M; Frölich, Andreas M; Jansen, Olav; Fiehler, Jens; Buhk, Jan-Hendrik
2016-01-01
We sought to develop a standardized curriculum capable of assessing key competencies in Interventional Neuroradiology by the use of models and simulators in an objective, quantitative, and efficient way. In this evaluation we analyzed the associations between the practical experience, theoretical knowledge, and the skills lab performance of interventionalists. We evaluated the endovascular skills of 26 participants of the Advanced Course in Endovascular Interventional Neuroradiology of the European Society of Neuroradiology with a set of three tasks (aneurysm coiling and thrombectomy in a virtual simulator and placement of an intra-aneurysmal flow disruptor in a flow model). Practical experience was assessed by a survey. Participants completed a written and oral examination to evaluate theoretical knowledge. Bivariate and multivariate analyses were performed. In multivariate analysis knowledge of materials and techniques in Interventional Neuroradiology was moderately associated with skills in aneurysm coiling and thrombectomy. Experience in mechanical thrombectomy was moderately associated with thrombectomy skills, while age was negatively associated with thrombectomy skills. We found no significant association between age, sex, or work experience and skills in aneurysm coiling. Our study gives an example of how an integrated curriculum for reasonable and cost-effective assessment of key competences of an interventional neuroradiologist could look. In addition to traditional assessment of theoretical knowledge practical skills are measured by the use of endovascular simulators yielding objective, quantitative, and constructive data for the evaluation of the current performance status of participants as well as the evolution of their technical competency over time.
Joseph, Paul; Tretsiakova-McNally, Svetlana
2015-01-01
Polymeric materials often exhibit complex combustion behaviours encompassing several stages and involving solid phase, gas phase and interphase. A wide range of qualitative, semi-quantitative and quantitative testing techniques are currently available, both at the laboratory scale and for commercial purposes, for evaluating the decomposition and combustion behaviours of polymeric materials. They include, but are not limited to, techniques such as: thermo-gravimetric analysis (TGA), oxygen bomb calorimetry, limiting oxygen index measurements (LOI), Underwriters Laboratory 94 (UL-94) tests, cone calorimetry, etc. However, none of the above mentioned techniques are capable of quantitatively deciphering the underpinning physiochemical processes leading to the melt flow behaviour of thermoplastics. Melt-flow of polymeric materials can constitute a serious secondary hazard in fire scenarios, for example, if they are present as component parts of a ceiling in an enclosure. In recent years, more quantitative attempts to measure the mass loss and melt-drip behaviour of some commercially important chain- and step-growth polymers have been accomplished. The present article focuses, primarily, on the experimental and some theoretical aspects of melt-flow behaviours of thermoplastics under heat/fire conditions. PMID:28793746
Xu, Y.; Xia, J.; Miller, R.D.
2006-01-01
Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitatively evaluating a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys in near-surface applications. © 2005 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhang, Chao; Qin, Ting Xin; Huang, Shuai; Wu, Jian Song; Meng, Xin Yan
2018-06-01
Some factors can affect the consequences of an oil pipeline accident, and their effects should be analyzed to improve emergency preparation and emergency response. Although there are some qualitative models of risk factors' effects, quantitative models still need to be researched. In this study, we introduce a Bayesian network (BN) model for analyzing risk factors' effects in an oil pipeline accident case that happened in China. The incident evolution diagram is built to identify the risk factors, and the BN model is built based on the deployment rule for factor nodes in the BN and on expert knowledge combined through Dempster-Shafer evidence theory. The probabilities of incident consequences and of risk factors' effects can then be calculated. The most likely consequences given by this model are consistent with the case. Meanwhile, the quantitative estimates of risk factors' effects may provide a theoretical basis for taking optimal risk treatment measures in oil pipeline management, which can be used in emergency preparation and emergency response.
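Computing consequence probabilities in a Bayesian network of this kind can be sketched with inference by enumeration over a toy chain. The structure and conditional probabilities below are invented for illustration only; the paper's network, built from the incident evolution diagram and Dempster-Shafer-weighted expert judgments, is far larger:

```python
from itertools import product

# Toy three-node chain (Leak -> Fire -> Casualty) with invented CPTs.
P_leak = {True: 0.3, False: 0.7}
# P_fire[leak][fire]: probability of fire given leak state
P_fire = {True: {True: 0.6, False: 0.4}, False: {True: 0.05, False: 0.95}}
# P_cas[fire][casualty]: probability of casualty given fire state
P_cas = {True: {True: 0.5, False: 0.5}, False: {True: 0.01, False: 0.99}}

def p_casualty():
    """P(Casualty=True) by summing the joint over all parent assignments."""
    total = 0.0
    for leak, fire in product([True, False], repeat=2):
        total += P_leak[leak] * P_fire[leak][fire] * P_cas[fire][True]
    return total
```

Enumeration scales exponentially with network size, so practical BN tools use variable elimination or junction-tree algorithms, but the probability semantics are the same.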
The Internationalization of Creativity as a Learning Competence
ERIC Educational Resources Information Center
Jules, Tavis D.; Sundberg, Kelly Cebold
2018-01-01
This study uses a quantitative content analysis of learning competences -- as described and prescribed in 21st century frameworks -- and those competences evaluated by international assessments to explore the nexus between recommendation and reality. In drawing insights from the theoretical underpinnings of human capital theory we argue, with…
ERIC Educational Resources Information Center
Skinner, Ann
2018-01-01
Resource-based theory provided the theoretical foundation to investigate the extent that developer knowledge correlated to success of information technology (IT) development projects. Literature indicated there was a knowledge gap in understanding whether developer information system development, behavior and business knowledge contributed to IT…
Mass Communication Research Trends from 1980 to 1999.
ERIC Educational Resources Information Center
Kamhawi, Rasha; Weaver, David
2003-01-01
Uses thematic meta-analysis to examine study method, medium and area of focus, theoretical approach, funding source, and time period covered in research articles published in 10 major mass communications journals during the 1980 to 1999 period. Finds that qualitative research methods continued to be much less common than quantitative methods…
Toward Validation of the Genius Discipline-Specific Literacy Model
ERIC Educational Resources Information Center
Ellis, Edwin S.; Wills, Stephen; Deshler, Donald D.
2011-01-01
An analysis of the rationale and theoretical foundations of the Genius Discipline-specific Literacy Model and its use of SMARTvisuals to cue information-processing skills and strategies and focus attention on essential informational elements in high-frequency topics in history and the English language arts are presented. Quantitative data…
Representations of Scientists in Canadian High School and College Textbooks
ERIC Educational Resources Information Center
van Eijck, Michiel; Roth, Wolff-Michael
2008-01-01
This study investigated the representations of a select group of scientists (n = 10) in a sample of Canadian high school and college textbooks. Drawing on semiotic and cultural-historical activity theoretical frameworks, we conducted two analyses. A coarse-grained, quantitative analysis of the prevalence and structure of these representations…
Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan
2015-06-01
Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and the stability analysis of the estimator is given. Theoretical analysis and simulation experimental results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer spectra. The derivative spectra of the beer and the marzipan are used to build the calibration model using partial least squares (PLS) modeling. The results show that the PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
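The Savitzky-Golay derivative that serves as the comparison baseline above can be sketched as follows. The proposed SPSE estimator itself is not reproduced here, and the signal is synthetic rather than a real NIR spectrum:

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic "spectrum": a smooth curve plus small measurement noise.
x = np.linspace(0, 1, 201)
dx = x[1] - x[0]
rng = np.random.default_rng(1)
signal = np.sin(2 * np.pi * x) + rng.normal(scale=0.002, size=x.size)

# First-derivative estimate: fit a cubic in a sliding 21-point window
# and differentiate the fit (window_length must be odd, polyorder < window).
deriv = savgol_filter(signal, window_length=21, polyorder=3,
                      deriv=1, delta=dx)
true_deriv = 2 * np.pi * np.cos(2 * np.pi * x)
```

The window length trades noise suppression against smoothing of sharp spectral features, which is exactly the regime where the paper argues an alternative estimator can help.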
Size Dependent Mechanical Properties of Monolayer Densely Arranged Polystyrene Nanospheres.
Huang, Peng; Zhang, Lijing; Yan, Qingfeng; Guo, Dan; Xie, Guoxin
2016-12-13
In contrast to macroscopic materials, the mechanical properties of polymer nanospheres show fascinating scientific and application value. However, experimental measurements on individual nanospheres, and quantitative analysis of the underlying theoretical mechanisms, remain less well performed and understood. We provide a highly efficient and accurate method, using monolayer densely arranged honeycomb polystyrene (PS) nanospheres, for the quantitative mechanical characterization of individual nanospheres on the basis of atomic force microscopy (AFM) nanoindentation. The efficiency is improved by 1-2 orders of magnitude, and the accuracy is enhanced by almost half an order of magnitude. The elastic modulus measured in the experiments increases with decreasing radius down to the smallest nanospheres (25-35 nm in radius). A core-shell model is introduced to predict the size-dependent elasticity of PS nanospheres; the theoretical prediction agrees reasonably well with the experimental results and also shows a peak modulus value.
Analysis of Market Opportunities for Chinese Private Express Delivery Industry
NASA Astrophysics Data System (ADS)
Jiang, Changbing; Bai, Lijun; Tong, Xiaoqing
China's express delivery market has become an arena of fierce competition among express enterprises, owing to huge potential demand and highly profitable prospects. Qualitative and quantitative forecasts of future changes in China's express delivery market will therefore help enterprises understand market conditions and shifts in social demand, and adjust their business activities in time to enhance their competitiveness. The development of China's express delivery industry is first introduced in this chapter. Then the theoretical basis of the regression model is reviewed. We also predict the demand trends of China's express delivery market using Pearson correlation analysis and regression analysis, from qualitative and quantitative perspectives respectively. Finally, we draw some conclusions and recommendations for China's express delivery industry.
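The two-step workflow the chapter describes, Pearson correlation screening followed by a regression forecast, can be sketched on synthetic data. The market figures below are assumptions for illustration, not the chapter's actual series:

```python
import numpy as np

# Assumed toy data: a macroeconomic driver and express parcel volume.
gdp = np.array([10.0, 11.2, 12.5, 13.9, 15.4, 17.1])   # driver (assumed units)
parcels = np.array([2.0, 2.3, 2.7, 3.0, 3.5, 3.9])     # express volume

# Step 1: screen the candidate predictor with Pearson correlation.
r = np.corrcoef(gdp, parcels)[0, 1]

# Step 2: fit a simple linear regression and forecast demand.
slope, intercept = np.polyfit(gdp, parcels, 1)

def forecast(x):
    """Predicted parcel volume at driver level x."""
    return slope * x + intercept
```

In practice one would screen several candidate drivers by correlation and keep only the strongly correlated ones in the regression, which is the qualitative-plus-quantitative pairing the chapter outlines.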
Interactive 3D visualization for theoretical virtual observatories
NASA Astrophysics Data System (ADS)
Dykes, T.; Hassan, A.; Gheller, C.; Croton, D.; Krokos, M.
2018-06-01
Virtual observatories (VOs) are online hubs of scientific knowledge. They encompass a collection of platforms dedicated to the storage and dissemination of astronomical data, from simple data archives to e-research platforms offering advanced tools for data exploration and analysis. Whilst the more mature platforms within VOs primarily serve the observational community, there are also services fulfilling a similar role for theoretical data. Scientific visualization can be an effective tool for analysis and exploration of data sets made accessible through web platforms for theoretical data, which often contain spatial dimensions and properties inherently suitable for visualization via e.g. mock imaging in 2D or volume rendering in 3D. We analyse the current state of 3D visualization for big theoretical astronomical data sets through scientific web portals and virtual observatory services. We discuss some of the challenges for interactive 3D visualization and how it can augment the workflow of users in a virtual observatory context. Finally we showcase a lightweight client-server visualization tool for particle-based data sets, allowing quantitative visualization via data filtering, highlighting two example use cases within the Theoretical Astrophysical Observatory.
NASA Astrophysics Data System (ADS)
Förtsch, Christian; Dorfner, Tobias; Baumgartner, Julia; Werner, Sonja; von Kotzebue, Lena; Neuhaus, Birgit J.
2018-04-01
The German National Education Standards (NES) for biology were introduced in 2005. The content part of the NES emphasizes fostering conceptual knowledge. However, there are hardly any indications of what such an instructional implementation could look like. We introduce a theoretical framework of an instructional approach to foster students' conceptual knowledge as demanded in the NES (Fostering Conceptual Knowledge) including instructional practices derived from research on single core ideas, general psychological theories, and biology-specific features of instructional quality. First, we aimed to develop a rating manual, which is based on this theoretical framework. Second, we wanted to describe current German biology instruction according to this approach and to quantitatively analyze its effectiveness. And third, we aimed to provide qualitative examples of this approach to triangulate our findings. In a first step, we developed a theoretically devised rating manual to measure Fostering Conceptual Knowledge in videotaped lessons. Data for quantitative analysis included 81 videotaped biology lessons of 28 biology teachers from different German secondary schools. Six hundred forty students completed a questionnaire on their situational interest after each lesson and an achievement test. Results from multilevel modeling showed significant positive effects of Fostering Conceptual Knowledge on students' achievement and situational interest. For qualitative analysis, we contrasted instruction of four teachers, two with high and two with low student achievement and situational interest using the qualitative method of thematic analysis. Qualitative analysis revealed five main characteristics describing Fostering Conceptual Knowledge. Therefore, implementing Fostering Conceptual Knowledge in biology instruction seems promising. Examples of how to implement Fostering Conceptual Knowledge in instruction are shown and discussed.
Analysis of nonlinear internal waves observed by Landsat thematic mapper
NASA Astrophysics Data System (ADS)
Artale, V.; Levi, D.; Marullo, S.; Santoleri, R.
1990-09-01
In this work we test the compatibility between the theoretical parameters of a nonlinear wave model and the quantitative information that one can deduce from satellite-derived data. The theoretical parameters are obtained by applying an inverse problem to the solution of the Cauchy problem for the Korteweg-de Vries equation. Our results are applied to the case of internal wave patterns elaborated from two different satellite sensors at the south of Messina (the thematic mapper) and at the north of Messina (the synthetic aperture radar).
A new theoretical approach to analyze complex processes in cytoskeleton proteins.
Li, Xin; Kolomeisky, Anatoly B
2014-03-20
Cytoskeleton proteins are filament structures that support a large number of important biological processes. These dynamic biopolymers exist in nonequilibrium conditions stimulated by hydrolysis chemical reactions in their monomers. Current theoretical methods provide a comprehensive picture of biochemical and biophysical processes in cytoskeleton proteins. However, the description is only qualitative under biologically relevant conditions because the theoretical mean-field models employed neglect correlations. We develop a new theoretical method to describe dynamic processes in cytoskeleton proteins that takes into account spatial correlations in the chemical composition of these biopolymers. Our approach is based on an analysis of the probabilities of different clusters of subunits. It allows us to obtain exact analytical expressions for a variety of dynamic properties of cytoskeleton filaments. By comparing theoretical predictions with Monte Carlo computer simulations, it is shown that our method provides a fully quantitative description of complex dynamic phenomena in cytoskeleton proteins under all conditions.
Quantitative analysis of intermolecular interactions in orthorhombic rubrene
Hathwar, Venkatesha R.; Sist, Mattia; Jørgensen, Mads R. V.; ...
2015-08-14
Rubrene is one of the most studied organic semiconductors to date due to its high charge carrier mobility, which makes it a potentially applicable compound in modern electronic devices. Previous electronic device characterizations and first principles theoretical calculations assigned the semiconducting properties of rubrene to the presence of a large overlap of the extended π-conjugated core between molecules. We present here the electron density distribution in rubrene at 20 K and at 100 K obtained using a combination of high-resolution X-ray and neutron diffraction data. The topology of the electron density and energies of intermolecular interactions are studied quantitatively. Specifically, the presence of Cπ...Cπ interactions between neighbouring tetracene backbones of the rubrene molecules is experimentally confirmed from a topological analysis of the electron density, Non-Covalent Interaction (NCI) analysis and the calculated interaction energy of molecular dimers. A significant contribution to the lattice energy of the crystal is provided by H-H interactions. The electron density features of H-H bonding, and the interaction energy of molecular dimers connected by H-H interactions, clearly demonstrate the importance of these weak interactions in the stabilization of the crystal structure. Finally, the quantitative nature of the intermolecular interactions is virtually unchanged between 20 K and 100 K, suggesting that any changes in carrier transport at these low temperatures would have a different origin. The obtained experimental results are further supported by theoretical calculations.
A Method for Quantifying, Visualising, and Analysing Gastropod Shell Form
Liew, Thor-Seng; Schilthuizen, Menno
2016-01-01
Quantitative analysis of organismal form is an important component for almost every branch of biology. Although generally considered an easily-measurable structure, the quantification of gastropod shell form is still a challenge because many shells lack homologous structures and have a spiral form that is difficult to capture with linear measurements. In view of this, we adopt the idea of theoretical modelling of shell form, in which the shell form is the product of aperture ontogeny profiles in terms of aperture growth trajectory that is quantified as curvature and torsion, and of aperture form that is represented by size and shape. We develop a workflow for the analysis of shell forms based on the aperture ontogeny profile, starting from the procedure of data preparation (retopologising the shell model), via data acquisition (calculation of aperture growth trajectory, aperture form and ontogeny axis), and data presentation (qualitative comparison between shell forms) and ending with data analysis (quantitative comparison between shell forms). We evaluate our methods on representative shells of the genera Opisthostoma and Plectostoma, which exhibit great variability in shell form. The outcome suggests that our method is a robust, reproducible, and versatile approach for the analysis of shell form. Finally, we propose several potential applications of our methods in functional morphology, theoretical modelling, taxonomy, and evolutionary biology. PMID:27280463
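The trajectory quantities at the heart of the workflow above, curvature and torsion of the aperture growth curve, can be sketched numerically and checked against a helix, for which the analytic values are kappa = r/(r² + c²) and tau = c/(r² + c²). This is a generic finite-difference sketch, not the authors' retopologising pipeline:

```python
import numpy as np

def curvature_torsion(p, t):
    """Curvature and torsion of a sampled 3D curve p(t) via finite differences."""
    d1 = np.gradient(p, t, axis=0)          # first derivative of position
    d2 = np.gradient(d1, t, axis=0)         # second derivative
    d3 = np.gradient(d2, t, axis=0)         # third derivative
    cross = np.cross(d1, d2)
    cn = np.linalg.norm(cross, axis=1)
    kappa = cn / np.linalg.norm(d1, axis=1) ** 3
    tau = np.einsum('ij,ij->i', cross, d3) / cn ** 2
    return kappa, tau

# Validation curve: a helix of radius r and pitch parameter c.
r, c = 2.0, 0.5
t = np.linspace(0, 4 * np.pi, 2001)
helix = np.column_stack([r * np.cos(t), r * np.sin(t), c * t])
kappa, tau = curvature_torsion(helix, t)
```

For a real shell model the sampled aperture ontogeny axis would replace the helix, and endpoint estimates (where one-sided differences are used) should be discarded.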
ERIC Educational Resources Information Center
Olaniran, Bolanle; Austin, Katherine A.
2009-01-01
Purpose: This paper aims to describe the incorporation of technologies into two upper division Communication Studies courses at Texas Tech University. Design/methodology/approach: The article discusses the methodological and pedagogical rationale used to select the appropriate technologies and to effectively incorporate them into the classroom. An…
How fast is fisheries-induced evolution? Quantitative analysis of modelling and empirical studies
Audzijonyte, Asta; Kuparinen, Anna; Fulton, Elizabeth A
2013-01-01
A number of theoretical models, experimental studies and time-series studies of wild fish have explored the presence and magnitude of fisheries-induced evolution (FIE). While most studies agree that FIE is likely to be happening in many fished stocks, there are disagreements about its rates and implications for stock viability. To address these disagreements in a quantitative manner, we conducted a meta-analysis of FIE rates reported in theoretical and empirical studies. We discovered that rates of phenotypic change observed in wild fish are about four times higher than the evolutionary rates reported in modelling studies, but the correlation between the rate of change and instantaneous fishing mortality (F) was very similar in the two types of studies. Mixed-model analyses showed that in the modelling studies traits associated with reproductive investment and growth evolved more slowly than traits related to maturation. In empirical observations age-at-maturation was changing faster than other life-history traits. We also found that, despite different assumptions and modelling approaches, rates of evolution for a given F value reported in 10 of 13 modelling studies were not significantly different. PMID:23789026
A method to explore the quantitative interactions between metal and ceria for M/CeO2 catalysts
NASA Astrophysics Data System (ADS)
Zhu, Kong-Jie; Liu, Jie; Yang, Yan-Ju; Xu, Yu-Xing; Teng, Bo-Tao; Wen, Xiao-Dong; Fan, Maohong
2018-03-01
Exploring the quantitative relationship between a metal and ceria plays a key role in the theoretical design of M/CeO2 catalysts, especially for the currently hot topic of atomically dispersed catalysts. A method to quantitatively explore the interactions between metal and ceria is proposed in the present work on the basis of a qualitative analysis of the effects of different factors on metal adsorption at different ceria surfaces, using Ag/CeO2 as a case study. Two parameters are first presented: Ep, which converts the total adsorption energy into the interaction energy per Ag-O bond, and θdiff, which measures the deviation of the Ag-O-Ce bond angle from the angle of the sp3 orbital hybridization of the O atom. Using these two parameters, a quantitative relationship for the interaction energy between Ag and ceria is established: Ep and the Ag-O bond length d(Ag-O) correlate linearly with θdiff. The higher θdiff, the weaker Ep and the longer the Ag-O bond. This method is also suitable for other metals on ceria (Cu, Ni, Pd, Rh, etc.). This is the first time a quantitative relationship for the metal-ceria interaction has been established, and it sheds light on the theoretical design of M/CeO2 catalysts.
Pezzotti, Giuseppe; Zhu, Wenliang; Boffelli, Marco; Adachi, Tetsuya; Ichioka, Hiroaki; Yamamoto, Toshiro; Marunaka, Yoshinori; Kanamura, Narisato
2015-05-01
The Raman spectroscopic method has quantitatively been applied to the analysis of local crystallographic orientation in both single-crystal hydroxyapatite and human teeth. Raman selection rules for all the vibrational modes of the hexagonal structure were expanded into explicit functions of Euler angles in space and six Raman tensor elements (RTE). A theoretical treatment has also been put forward according to the orientation distribution function (ODF) formalism, which allows one to resolve the statistical orientation patterns of the nm-sized hydroxyapatite crystallite comprised in the Raman microprobe. Close-form solutions could be obtained for the Euler angles and their statistical distributions resolved with respect to the direction of the average texture axis. Polarized Raman spectra from single-crystalline hydroxyapatite and textured polycrystalline (teeth enamel) samples were compared, and a validation of the proposed Raman method could be obtained through confirming the agreement between RTE values obtained from different samples.
Probing lipid membrane electrostatics
NASA Astrophysics Data System (ADS)
Yang, Yi
The electrostatic properties of lipid bilayer membranes play a significant role in many biological processes. Atomic force microscopy (AFM) is highly sensitive to membrane surface potential in electrolyte solutions. With fully characterized probe tips, AFM can perform quantitative electrostatic analysis of lipid membranes. Electrostatic interactions between silicon nitride probes and supported zwitterionic dioleoylphosphatidylcholine (DOPC) bilayers with a variable fraction of anionic dioleoylphosphatidylserine (DOPS) were measured by AFM. Classical Gouy-Chapman theory was used to model the membrane electrostatics. The nonlinear Poisson-Boltzmann equation was solved numerically with the finite element method to provide the potential distribution around the AFM tips. Theoretical tip-sample electrostatic interactions were calculated from the surface integral of both the Maxwell and osmotic stress tensors over the tip surface. The measured forces were interpreted using these theoretical forces, and the resulting surface charge densities of the membrane surfaces were in quantitative agreement with the Gouy-Chapman-Stern model of membrane charge regulation. It was demonstrated that the AFM can quantitatively detect membrane surface potential at a separation of several screening lengths, and that the AFM probe only perturbs the membrane surface potential by <2%. One important application of this technique is to estimate the dipole density of lipid membranes. Electrostatic analysis of DOPC lipid bilayers with the AFM reveals a repulsive force between the negatively charged probe tips and the zwitterionic lipid bilayers. This unexpected interaction has been analyzed quantitatively to reveal that the repulsion is due to a weak external field created by the internal membrane dipole moment. The analysis yields a dipole moment of 1.5 Debye per lipid with a dipole potential of +275 mV for supported DOPC membranes.
This new ability to quantitatively measure the membrane dipole density in a noninvasive manner will be useful in identifying the biological effects of the dipole potential. Finally, heterogeneous model membranes were studied with fluid electric force microscopy (FEFM). Electrostatic mapping was demonstrated with 50 nm resolution. The capabilities of quantitative electrostatic measurement and lateral charge density mapping make AFM a unique and powerful probe of membrane electrostatics.
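The screening behaviour underlying these measurements follows from Gouy-Chapman theory. A minimal sketch of the standard Debye length and Grahame relations for a 1:1 electrolyte (constants rounded to four figures; not the authors' finite element code):

```python
import math

# Physical constants (SI), rounded
E0 = 8.854e-12   # vacuum permittivity, F/m
KB = 1.381e-23   # Boltzmann constant, J/K
QE = 1.602e-19   # elementary charge, C
NA = 6.022e23    # Avogadro constant, 1/mol

def debye_length(c_molar, eps_r=78.5, T=298.0):
    """Debye screening length (m) of a 1:1 electrolyte of
    concentration c_molar (mol/L)."""
    n = c_molar * 1000 * NA  # ions per m^3, per species
    kappa_sq = 2 * n * QE**2 / (eps_r * E0 * KB * T)
    return 1.0 / math.sqrt(kappa_sq)

def grahame_sigma(psi0, c_molar, eps_r=78.5, T=298.0):
    """Gouy-Chapman (Grahame) relation between surface potential
    psi0 (V) and surface charge density (C/m^2), 1:1 salt."""
    n = c_molar * 1000 * NA
    return (math.sqrt(8 * n * eps_r * E0 * KB * T)
            * math.sinh(QE * psi0 / (2 * KB * T)))

# 100 mM NaCl: screening length of roughly a nanometre
print(round(debye_length(0.1) * 1e9, 2))  # 0.96
```

At probe-sample separations of several such lengths the electrostatic force is already strongly screened, which is what makes the reported long-range sensitivity notable.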
ERIC Educational Resources Information Center
Murtonen, Mari
2015-01-01
University research education in many disciplines is frequently confronted by problems with students' weak level of understanding of research concepts. A mind map technique was used to investigate how students understand central methodological concepts of empirical, theoretical, qualitative and quantitative. The main hypothesis was that some…
NASA Astrophysics Data System (ADS)
Chen, Ying; Yuan, Jianghong; Zhang, Yingchao; Huang, Yonggang; Feng, Xue
2017-10-01
The interfacial failure of integrated circuit (IC) chips integrated on flexible substrates under bending deformation has been studied theoretically and experimentally. A compressive buckling test is used to impose the bending deformation onto the interface between the IC chip and the flexible substrate quantitatively, after which the failed interface is investigated using scanning electron microscopy. A theoretical model is established based on beam theory and a bi-layer interface model, from which an analytical expression for the critical curvature associated with interfacial failure is obtained. The relationships between the critical curvature and the material and geometric parameters of the device are discussed in detail, providing guidance for the future optimization of flexible circuits based on IC chips.
A theoretical study of alpha star populations in loaded nuclear emulsions
Senftle, F.E.; Farley, T.A.; Stieff, L.R.
1954-01-01
This theoretical study of the alpha star populations in loaded emulsions was undertaken in an effort to find a quantitative method for the analysis of less than microgram amounts of thorium in the presence of larger amounts of uranium. Analytical expressions for each type of star from each of the significantly contributing members of the uranium and thorium series, as well as summation formulas for the whole series, have been computed. The analysis for thorium may be made by determining the abundance of five-branched stars in a loaded nuclear emulsion and comparing observed and predicted star populations. The comparison may also be used to check the half-lives of several members of the uranium and thorium series.
ERIC Educational Resources Information Center
Dalanon, Junhel; Diano, Liz Muriel; Belarmino, Ma Paciencia; Hayama, Rika; Miyagi, Mayu; Matsuka, Yoshizo
2018-01-01
This 2016 cross-sectional inquiry used quantitative and thematic content analysis to determine the organizational climate (OC) with empirical and theoretical relation to the teachers' performance (TP) and management competencies (MC) of a rural, K-12, private school in the Philippines. Analyses from a focus group discussion (FGD) was done using…
ERIC Educational Resources Information Center
Sochos, Antigonos
2014-01-01
The couple relationship is an essential source of support for individuals undergoing psychological treatment and the aim of this study was to apply a new methodology in assessing the quality of such support. A theoretically informed thematic analysis of interview transcripts was conducted, triangulated by quantitative data. Twenty-one brief…
ERIC Educational Resources Information Center
Kearney, W. Sean; Webb, Michael; Goldhorn, Jeff; Peters, Michelle L.
2013-01-01
This article presents a quantitative study utilizing HLM to analyze classroom walkthrough data completed by principals within 87 secondary mathematics classrooms across 9 public schools in Texas. This research is based on the theoretical framework of learner engagement as established by Argryis & Schon (1996), and refined by Marks (2000). It…
ERIC Educational Resources Information Center
Murakami, Yusuke
2013-01-01
There are two types of qualitative research that analyze a small number of cases or a single case: idiographic differentiation and nomothetic/generalization. There are few case studies of generalization. This is because theoretical inclination is weak in the field of education, and the binary framework of quantitative versus qualitative research…
Critical Quantitative Inquiry in Context
ERIC Educational Resources Information Center
Stage, Frances K.; Wells, Ryan S.
2014-01-01
This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…
Bade, Richard; White, Jason M; Gerber, Cobus
2018-01-01
The combination of qualitative and quantitative bimonthly analysis of pharmaceuticals and illicit drugs using liquid chromatography coupled to mass spectrometry is presented. A liquid chromatography-quadrupole time of flight instrument equipped with Sequential Window Acquisition of all THeoretical fragment-ion spectra (SWATH) was used to qualitatively screen 346 compounds in influent wastewater from two wastewater treatment plants in South Australia over a 14-month period. A total of 100 compounds were confirmed and/or detected using this strategy, with 61 confirmed in all samples including antidepressants (amitriptyline, dothiepin, doxepin), antipsychotics (amisulpride, clozapine), illicit drugs (cocaine, methamphetamine, amphetamine, 3,4-methylenedioxymethamphetamine (MDMA)), and known drug adulterants (lidocaine and tetramisole). A subset of these compounds was also included in a quantitative method, analyzed on a liquid chromatography-triple quadrupole mass spectrometer. The use of illicit stimulants (methamphetamine) showed a clear decrease, levels of opioid analgesics (morphine and methadone) remained relatively stable, while the use of new psychoactive substances (methylenedioxypyrovalerone (MDPV) and Alpha PVP) varied with no visible trend. This work demonstrates the value that high-frequency sampling combined with quantitative and qualitative analysis can deliver. Graphical abstract Temporal analysis of licit and illicit drugs in South Australia.
Optical Basicity and Nepheline Crystallization in High Alumina Glasses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez, Carmen P.; McCloy, John S.; Schweiger, M. J.
2011-02-25
The purpose of this study was to find compositions that increase waste loading of high-alumina wastes beyond what is currently acceptable while avoiding crystallization of nepheline (NaAlSiO4) on slow cooling. Nepheline crystallization has been shown to have a large impact on the chemical durability of high-level waste glasses. It was hypothesized that there would be some composition regions where high alumina would not result in nepheline crystal production, compositions not currently allowed by the nepheline discriminator. Optical basicity (OB) and the nepheline discriminator (ND) are two ways of describing a given complex glass composition. This report presents the theoretical and experimental basis for these models. They are being studied together in a quadrant system as metrics to explore nepheline crystallization and chemical durability as a function of waste glass composition. These metrics were calculated for glasses with existing data and also for theoretical glasses to explore nepheline formation in Quadrant IV (passes the OB metric but fails the ND metric), where glasses are presumed to have good chemical durability. Several of these compositions were chosen, and glasses were made to fill poorly represented regions in Quadrant IV. To evaluate nepheline formation and chemical durability of these glasses, quantitative X-ray diffraction (XRD) analysis and the Product Consistency Test were conducted. A large amount of quantitative XRD data is collected here, both from new glasses and from glasses of previous studies for which quantitative XRD of the phase assemblage had not previously been performed. Appendix A critically discusses a large dataset to be considered for future quantitative studies on nepheline formation in glass. Appendix B provides a theoretical justification for the choice of the oxide coefficients used to compute the OB criterion for nepheline formation.
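Optical basicity for a multi-oxide glass is conventionally computed as an oxygen-weighted average of single-oxide basicities. A minimal sketch; the Duffy-style coefficients and the example composition are assumed for illustration, not taken from the report:

```python
# Representative single-oxide optical basicities (assumed here;
# published sources differ slightly in the exact numbers).
LAMBDA = {"SiO2": 0.48, "Al2O3": 0.60, "Na2O": 1.15, "B2O3": 0.42}
OXYGENS = {"SiO2": 2, "Al2O3": 3, "Na2O": 1, "B2O3": 3}

def optical_basicity(mole_fractions):
    """Oxygen-weighted average of single-oxide basicities:
    Lambda = sum(x_i * n_i * Lambda_i) / sum(x_i * n_i),
    where n_i is the number of O atoms per formula unit."""
    num = sum(x * OXYGENS[o] * LAMBDA[o]
              for o, x in mole_fractions.items())
    den = sum(x * OXYGENS[o] for o, x in mole_fractions.items())
    return num / den

glass = {"SiO2": 0.60, "Al2O3": 0.15, "Na2O": 0.25}
print(round(optical_basicity(glass), 3))  # 0.597
```

A composition is then placed in a quadrant by comparing values like this one, and the corresponding ND value, against their respective thresholds.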
Jarnuczak, Andrew F.; Eyers, Claire E.; Schwartz, Jean‐Marc; Grant, Christopher M.
2015-01-01
Molecular chaperones play an important role in protein homeostasis and the cellular response to stress. In particular, the HSP70 chaperones in yeast mediate a large volume of protein folding through transient associations with their substrates. This chaperone interaction network can be disturbed by various perturbations, such as environmental stress or a gene deletion. Here, we consider deletions of two major chaperone proteins, SSA1 and SSB1, from the chaperone network in Saccharomyces cerevisiae. We employ a SILAC-based approach to examine changes in global and local protein abundance and rationalise our results via network analysis and graph theoretical approaches. Although the deletions result in an overall increase in intracellular protein content, correlated with an increase in cell size, this is not matched by substantial changes in individual protein concentrations. The phenotypic robustness to deletion of these major hub proteins cannot simply be explained by the presence of paralogues. Instead, network analysis and a theoretical consideration of folding workload suggest that the robustness to perturbation is a product of the overall network structure. This highlights how quantitative proteomics and systems modelling can be used to rationalise emergent network properties, and how the HSP70 system can accommodate the loss of major hubs. PMID:25689132
Chinese Interpreting Studies: a data-driven analysis of a dynamic field of enquiry
Pekelis, Leonid
2015-01-01
Over the five decades since its beginnings, Chinese Interpreting Studies (CIS) has evolved into a dynamic field of academic enquiry with more than 3,500 scholars and 4,200 publications. Using quantitative and qualitative analysis, this scientometric study delves deep into CIS citation data to examine some of the noteworthy trends and patterns of behavior in the field: how can the field’s progress be quantified by means of citation analysis? Do its authors tend repeatedly to cite ‘classic’ papers or are they more drawn to their colleagues’ latest research? What different effects does the choice of empirical vs. theoretical research have on the use of citations in the various research brackets? The findings show that the field is steadily moving forward with new papers continuously being cited, although a number of influential papers stand out, having received a stream of citations in all the years examined. CIS scholars also have a tendency to cite much older English than Chinese publications across all document types, and empirical research has the greatest influence on the citation behavior of doctoral scholars, while theoretical studies have the largest impact on that of article authors. The goal of this study is to demonstrate the merits of blending quantitative and qualitative analyses to uncover hidden trends. PMID:26401459
Method for a quantitative investigation of the frozen flow hypothesis
Schock; Spillar
2000-09-01
We present a technique to test the frozen flow hypothesis quantitatively, using data from wave-front sensors such as those found in adaptive optics systems. Detailed treatments of the theoretical background of the method and of the error analysis are presented. Analyzing data from the 1.5-m and 3.5-m telescopes at the Starfire Optical Range, we find that the frozen flow hypothesis is an accurate description of the temporal development of atmospheric turbulence on time scales of the order of 1-10 ms but that significant deviations from the frozen flow behavior are present for longer time scales.
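The core of such a test is that, under frozen flow, a downwind wave-front sensor should record a time-shifted copy of the upwind signal, so the cross-correlation peaks at the lag set by the wind speed. A toy illustration with synthetic data (not the authors' estimator):

```python
import random

def best_lag(a, b, max_lag):
    """Lag (in samples) at which series b best matches series a,
    found by maximising the normalised cross-correlation."""
    def corr(lag):
        pairs = [(a[i], b[i + lag]) for i in range(len(a) - lag)]
        n = len(pairs)
        ma = sum(x for x, _ in pairs) / n
        mb = sum(y for _, y in pairs) / n
        num = sum((x - ma) * (y - mb) for x, y in pairs)
        da = sum((x - ma) ** 2 for x, _ in pairs) ** 0.5
        db = sum((y - mb) ** 2 for _, y in pairs) ** 0.5
        return num / (da * db)
    return max(range(max_lag + 1), key=corr)

# Synthetic 'frozen' turbulence: the downwind sensor sees the
# same slope signal 5 samples later.
random.seed(1)
upwind = [random.gauss(0, 1) for _ in range(300)]
downwind = [0.0] * 5 + upwind[:-5]
print(best_lag(upwind, downwind, 20))  # 5
```

In real data the correlation at the recovered lag decays with time scale, which is how deviations from frozen flow beyond ~10 ms show up.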
Demonstration of brain noise on human EEG signals in perception of bistable images
NASA Astrophysics Data System (ADS)
Grubov, Vadim V.; Runnova, Anastasiya E.; Kurovskaya, Maria K.; Pavlov, Alexey N.; Koronovskii, Alexey A.; Hramov, Alexander E.
2016-03-01
In this report we studied human brain activity during bistable visual perception. We propose a new approach for the quantitative characterization of this activity based on the analysis of EEG oscillatory patterns and evoked potentials. Drawing on the theoretical background, the experimental EEG data obtained, and the results of their analysis, we studied the characteristics of brain activity during decision-making. We also show that the decision-making process produces distinctive patterns in the EEG data.
Deng, Ruixiang; Li, Meiling; Muneer, Badar; Zhu, Qi; Shi, Zaiying; Song, Lixin; Zhang, Tao
2018-01-01
Optically Transparent Microwave Metamaterial Absorbers (OTMMA) are of significant use in both civil and military fields. In this paper, an equivalent circuit model is adopted as a springboard to navigate the design of an OTMMA. The physical model and absorption mechanisms of an ideal lightweight ultrathin OTMMA are comprehensively researched. Both the theoretical value of the equivalent resistance and the quantitative relation between the equivalent inductance and equivalent capacitance are derived for design. The frequency-dependent characteristics of the theoretical equivalent resistance are also investigated. Based on this theoretical work, an effective and controllable design approach is proposed. To validate the approach, a wideband OTMMA is designed, fabricated, analyzed and tested. The results reveal that absorption of more than 90% can be achieved across the whole 6-18 GHz band. The fabricated OTMMA also has an optical transparency of up to 78% at 600 nm and is much thinner and lighter than its counterparts. PMID:29324686
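The design logic can be illustrated with the simplest version of such an equivalent circuit: a series RLC branch terminating free space absorbs perfectly when its resistance matches the wave impedance and its reactances cancel at resonance. A hedged sketch (component values are invented, and the paper's actual circuit topology may differ):

```python
import math

Z0 = 376.73  # free-space wave impedance, ohms

def absorption(freq_hz, R, L, C):
    """Fraction of incident power absorbed by a highly
    simplified series-RLC equivalent circuit terminating
    free space: A = 1 - |Gamma|^2."""
    w = 2 * math.pi * freq_hz
    z_in = R + 1j * w * L + 1 / (1j * w * C)
    gamma = (z_in - Z0) / (z_in + Z0)
    return 1 - abs(gamma) ** 2

# Choose L and C to resonate at 12 GHz with R = Z0: at resonance
# the reactances cancel, Gamma = 0 and absorption is total.
f0 = 12e9
L = 1e-9                                # assumed 1 nH
C = 1 / ((2 * math.pi * f0) ** 2 * L)   # sets w0 = 1/sqrt(LC)
print(round(absorption(f0, Z0, L, C), 6))  # 1.0
```

Broadband designs like the paper's shape the frequency dependence of the equivalent resistance so that this matching condition holds approximately across the whole band.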
A quantitative dynamic systems model of health-related quality of life among older adults
Roppolo, Mattia; Kunnen, E Saskia; van Geert, Paul L; Mulasso, Anna; Rabaglietti, Emanuela
2015-01-01
Health-related quality of life (HRQOL) is a person-centered concept. The analysis of HRQOL is highly relevant in the aged population, which is generally suffering from health decline. Starting from a conceptual dynamic systems model that describes the development of HRQOL in individuals over time, this study aims to develop and test a quantitative dynamic systems model, in order to reveal the possible dynamic trends of HRQOL among older adults. The model is tested in different ways: first, with a calibration procedure to test whether the model produces theoretically plausible results, and second, with a preliminary validation procedure using empirical data of 194 older adults. This first validation tested the prediction that given a particular starting point (first empirical data point), the model will generate dynamic trajectories that lead to the observed endpoint (second empirical data point). The analyses reveal that the quantitative model produces theoretically plausible trajectories, thus providing support for the calibration procedure. Furthermore, the analyses of validation show a good fit between empirical and simulated data. In fact, no differences were found in the comparison between empirical and simulated final data for the same subgroup of participants, whereas the comparison between different subgroups of people resulted in significant differences. These data provide an initial basis of evidence for the dynamic nature of HRQOL during the aging process. Therefore, these data may give new theoretical and applied insights into the study of HRQOL and its development with time in the aging population. PMID:26604722
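As a purely illustrative stand-in for such a model, two coupled logistic growers can generate the kind of interdependent trajectories described: each HRQOL component grows toward a ceiling at a rate boosted by the other. All rates, couplings, and the integration scheme below are assumptions for illustration, not the authors' model:

```python
def simulate(h0, p0, steps=1000, dt=0.01,
             r_h=0.2, r_p=0.15, c_hp=0.05, c_ph=0.08, K=1.0):
    """Euler integration of two coupled logistic growers, a toy
    stand-in for interacting HRQOL components (e.g. physical and
    mental health), both bounded by a carrying capacity K."""
    h, p = h0, p0
    for _ in range(steps):
        dh = (r_h * h + c_hp * p) * (1 - h / K)
        dp = (r_p * p + c_ph * h) * (1 - p / K)
        h, p = h + dt * dh, p + dt * dp
    return h, p

# Starting points play the role of the first empirical data point.
h, p = simulate(0.3, 0.2)
print(round(h, 2), round(p, 2))
```

Calibration in the paper's sense amounts to checking that trajectories like these stay in theoretically plausible ranges; validation compares the simulated endpoint with the second empirical measurement.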
Takeda, Hiroaki; Izumi, Yoshihiro; Takahashi, Masatomo; Paxton, Thanai; Tamura, Shohei; Koike, Tomonari; Yu, Ying; Kato, Noriko; Nagase, Katsutoshi; Shiomi, Masashi; Bamba, Takeshi
2018-05-03
Lipidomics, the mass spectrometry-based comprehensive analysis of lipids, has attracted attention as an analytical approach to provide novel insight into lipid metabolism and to search for biomarkers. However, an ideal method for both comprehensive and quantitative analysis of lipids has not been fully developed. Herein, we have proposed a practical methodology for widely-targeted quantitative lipidome analysis using supercritical fluid chromatography fast-scanning triple-quadrupole mass spectrometry (SFC/QqQMS) and a theoretically calculated comprehensive lipid multiple reaction monitoring (MRM) library. Lipid classes can be separated by SFC with a normal-phase diethylamine-bonded silica column with high resolution, high throughput, and good repeatability. Structural isomers of phospholipids can be monitored by mass spectrometric separation with fatty acyl-based MRM transitions. SFC/QqQMS analysis with an internal standard-dilution method offers quantitative information for both lipid class and individual lipid molecular species in the same lipid class. Additionally, data acquired using this method has advantages including reduction of misidentification and acceleration of data analysis. Using the SFC/QqQMS system, alteration of plasma lipid levels in myocardial infarction-prone rabbits in response to supplementation with eicosapentaenoic acid was observed for the first time. Our developed SFC/QqQMS method represents a potentially useful tool for in-depth studies focused on complex lipid metabolism and biomarker discovery. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
ERIC Educational Resources Information Center
Harvey, Stephen; Cushion, Christopher J.; Wegis, Heidi M.; Massa-Gonzalez, Ada N.
2010-01-01
Background: Previous research examining the effectiveness of the Teaching Games for Understanding (TGfU) approach has been equivocal. This has been hampered by a dependence on a comparative (i.e., "which method is best?") theoretical framework. An alternative "practice-referenced" framework has the potential to examine the effectiveness of TGfU…
Theoretical Framework for Interaction Game Design
2016-05-19
modeling. We take a data-driven quantitative approach to understand conversational behaviors by measuring conversational behaviors using advanced sensing… current state of the art, human computing is considered to be a reasonable approach to break through the current limitation. To solicit high quality and… proper resources in conversation to enable smooth and effective interaction. The last technique is about conversation measurement, analysis, and
Applications of surface analysis and surface theory in tribology
NASA Technical Reports Server (NTRS)
Ferrante, John
1988-01-01
Tribology, the study of adhesion, friction and wear of materials, is a complex field which requires a knowledge of solid state physics, surface physics, chemistry, material science and mechanical engineering. It has been dominated, however, by the more practical need to make equipment work. With the advent of surface analysis and advances in surface and solid state theory, a new dimension has been added to the analysis of interactions at tribological interfaces. In this paper the applications of tribological studies and their limitations are presented. Examples from research at the NASA Lewis Research Center are given. Emphasis is on fundamental studies involving the effects of monolayer coverage and thick films on friction and wear. A summary of the current status of theoretical calculations of defect energetics is presented. In addition, some new theoretical techniques which enable simplified quantitative calculations of adhesion, fracture and friction are discussed.
Applications of surface analysis and surface theory in tribology
NASA Technical Reports Server (NTRS)
Ferrante, John
1989-01-01
Tribology, the study of adhesion, friction and wear of materials, is a complex field which requires a knowledge of solid state physics, surface physics, chemistry, material science, and mechanical engineering. It has been dominated, however, by the more practical need to make equipment work. With the advent of surface analysis and advances in surface and solid-state theory, a new dimension has been added to the analysis of interactions at tribological interfaces. In this paper the applications of tribological studies and their limitations are presented. Examples from research at the NASA Lewis Research Center are given. Emphasis is on fundamental studies involving the effects of monolayer coverage and thick films on friction and wear. A summary of the current status of theoretical calculations of defect energetics is presented. In addition, some new theoretical techniques which enable simplified quantitative calculations of adhesion, fracture, and friction are discussed.
Optimally weighted least-squares steganalysis
NASA Astrophysics Data System (ADS)
Ker, Andrew D.
2007-02-01
Quantitative steganalysis aims to estimate the amount of payload in a stego object, and such estimators seem to arise naturally in steganalysis of Least Significant Bit (LSB) replacement in digital images. However, as with all steganalysis, the estimators are subject to errors, and their magnitude seems heavily dependent on properties of the cover. In very recent work we have given the first derivation of estimation error, for a certain method of steganalysis (the Least-Squares variant of Sample Pairs Analysis) of LSB replacement steganography in digital images. In this paper we make use of our theoretical results to find an improved estimator and detector. We also extend the theoretical analysis to another (more accurate) steganalysis estimator (Triples Analysis) and hence derive an improved version of that estimator too. Experimental results show that the new steganalyzers have improved accuracy, particularly in the difficult case of never-compressed covers.
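A building block of such estimator improvement is the minimum-variance combination of several unbiased payload estimates, weighting each by its inverse variance. A minimal sketch (the numbers are hypothetical, and this is not the paper's specific estimator):

```python
def combine(estimates, variances):
    """Minimum-variance unbiased combination of independent
    payload estimates: weights proportional to 1/variance.
    Returns the pooled estimate and its variance."""
    inv = [1.0 / v for v in variances]
    s = sum(inv)
    pooled = sum(w * e for w, e in zip(inv, estimates)) / s
    return pooled, 1.0 / s

# Two hypothetical payload estimates for the same stego image,
# e.g. from Sample Pairs and Triples style estimators:
est, var = combine([0.10, 0.16], [0.0004, 0.0016])
print(round(est, 3), round(var, 5))  # 0.112 0.00032
```

The pooled variance is always smaller than the smallest input variance, which is why a theoretically derived error model directly yields a better combined estimator.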
Quantitative genetic models of sexual conflict based on interacting phenotypes.
Moore, Allen J; Pizzari, Tommaso
2005-05-01
Evolutionary conflict arises between reproductive partners when alternative reproductive opportunities are available. Sexual conflict can generate sexually antagonistic selection, which mediates sexual selection and intersexual coevolution. However, despite intense interest, the evolutionary implications of sexual conflict remain unresolved. We propose a novel theoretical approach to study the evolution of sexually antagonistic phenotypes based on quantitative genetics and the measure of social selection arising from male-female interactions. We consider the phenotype of one sex as both a genetically influenced evolving trait as well as the (evolving) social environment in which the phenotype of the opposite sex evolves. Several important points emerge from our analysis, including the relationship between direct selection on one sex and indirect effects through selection on the opposite sex. We suggest that the proposed approach may be a valuable tool to complement other theoretical approaches currently used to study sexual conflict. Most importantly, our approach highlights areas where additional empirical data can help clarify the role of sexual conflict in the evolutionary process.
Tallarida, Ronald J.; Raffa, Robert B.
2014-01-01
In this review we show that the concept of dose equivalence for two drugs, the theoretical basis of the isobologram, has a wider use in the analysis of pharmacological data derived from single and combination drug use. In both its application to drug combination analysis with isoboles and certain other actions, listed below, the determination of doses, or receptor occupancies, that yield equal effects provides useful metrics that can be used to obtain quantitative information on drug actions without postulating any intimate mechanism of action. These other drug actions discussed here include (1) combinations of agonists that produce opposite effects, (2) analysis of inverted U-shaped dose effect curves of single agents, (3) analysis on the effect scale as an alternative to isoboles and (4) the use of occupation isoboles to examine competitive antagonism in the dual receptor case. New formulas derived to assess the statistical variance for additive combinations are included, and the more detailed mathematical topics are included in the appendix. PMID:20546783
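The additive isobole itself reduces to a one-line check: a dose pair (a, b) is Loewe-additive when the sum of its fractional equi-effective doses equals one. A minimal sketch with invented doses:

```python
def interaction_index(a, b, A, B):
    """Loewe additivity interaction index for a dose pair (a, b),
    given single-drug doses A and B that each alone produce the
    target effect: < 1 synergy, = 1 additivity, > 1 sub-additivity."""
    return a / A + b / B

# Hypothetical single-drug doses for a 50% effect: A = 10, B = 40.
print(interaction_index(5, 20, 10, 40))   # 1.0  (additive)
print(interaction_index(2, 10, 10, 40))   # 0.45 (synergistic)
```

Plotting all pairs with index 1 gives the straight-line isobole; combinations falling below that line indicate synergy.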
Nonlinear Analysis of Experimental Measurements 7.6. Theoretical Chemistry
2015-01-26
Jianshu Cao, Robert J. Silbey, Jaeyoung Sung. Quantitative Interpretation of the Randomness in Single Enzyme Turnover Times. Biophysical Journal. Universality of Poisson Indicator and Fano Factor of Transport Event Statistics in Ion Channels and Enzyme Kinetics. J. Phys. B: At. Mol. Opt. Phys. Jianshu Cao, Jianlan Wu. Generalized Michaelis-Menten Equation for Conformation-Modulated Monomeric Enzymes.
2018-02-15
models and approaches are also valid using other invasive and non-invasive technologies. Finally, we illustrate and experimentally evaluate this… 2017 Project Outline: pattern formation diversity in wild microbial societies; experimental and mathematical analysis methodology; skeleton… chemotaxis, nutrient degradation, and the exchange of amino acids between cells. Using both quantitative experimental methods and several theoretical
Quantitative characterisation of sedimentary grains
NASA Astrophysics Data System (ADS)
Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.
2016-04-01
Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse the shape and size of sand-sized sedimentary grains is presented. Grain boundaries are manually traced from thin-section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for lithified samples are limited to high-definition thin-section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain-shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed and produced high-quality results supporting the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments in the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till from County Cork, Ireland and aeolian sediments from Rajasthan, India have also been collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
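Two of the listed measures are easy to sketch from a traced boundary: circularity from the shoelace area and perimeter, and scale-normalised Fourier descriptors from the DFT of the boundary treated as complex numbers. A minimal sketch (not the authors' Mathematica implementation):

```python
import cmath
import math

def circularity(points):
    """4*pi*Area / Perimeter^2: 1 for a circle, smaller for
    irregular outlines (shoelace area, closed complex polygon)."""
    n = len(points)
    area = 0.5 * abs(sum(points[i].real * points[(i + 1) % n].imag
                         - points[(i + 1) % n].real * points[i].imag
                         for i in range(n)))
    perim = sum(abs(points[(i + 1) % n] - points[i]) for i in range(n))
    return 4 * math.pi * area / perim ** 2

def fourier_descriptors(points, k=8):
    """Magnitudes of the first k DFT coefficients of the complex
    boundary, normalised by |c_1| for scale invariance."""
    n = len(points)
    coeffs = [sum(points[t] * cmath.exp(-2j * math.pi * f * t / n)
                  for t in range(n)) / n for f in range(k + 1)]
    return [abs(c) / abs(coeffs[1]) for c in coeffs[1:]]

# A 64-gon approximating a circle: circularity close to 1 and
# spectral energy concentrated in the first descriptor.
circle = [cmath.exp(2j * math.pi * t / 64) for t in range(64)]
print(round(circularity(circle), 3))  # 0.999
fd = fourier_descriptors(circle)
print(round(fd[0], 3))                # 1.0
```

Higher-order descriptor magnitudes grow as the outline becomes more angular, which is one route to a single combined shape parameter.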
Empirical analysis of storm-time energetic electron enhancements
NASA Astrophysics Data System (ADS)
O'Brien, Thomas Paul, III
This Ph.D. thesis documents a program for studying the appearance of energetic electrons in the Earth's outer radiation belts that is associated with many geomagnetic storms. The dynamic evolution of the electron radiation belts is an outstanding empirical problem in both theoretical space physics and its applied sibling, space weather. The project emphasizes the development of empirical tools and their use in testing several theoretical models of the energization of the electron belts. First, I develop the Statistical Asynchronous Regression technique to provide proxy electron fluxes throughout the parts of the radiation belts explored by geosynchronous and GPS spacecraft. Next, I show that a theoretical adiabatic model can relate the local time asymmetry of the proxy geosynchronous fluxes to the asymmetry of the geomagnetic field. Then, I perform a superposed epoch analysis on the proxy fluxes at local noon to identify magnetospheric and interplanetary precursors of relativistic electron enhancements. Finally, I use statistical and neural network phase space analyses to determine the hourly evolution of flux at a virtual stationary monitor. The dynamic equation quantitatively identifies the importance of different drivers of the electron belts. This project provides empirical constraints on theoretical models of electron acceleration.
Jarnuczak, Andrew F; Eyers, Claire E; Schwartz, Jean-Marc; Grant, Christopher M; Hubbard, Simon J
2015-09-01
Molecular chaperones play an important role in protein homeostasis and the cellular response to stress. In particular, the HSP70 chaperones in yeast mediate a large volume of protein folding through transient associations with their substrates. This chaperone interaction network can be disturbed by various perturbations, such as environmental stress or a gene deletion. Here, we consider deletions of two major chaperone proteins, SSA1 and SSB1, from the chaperone network in Saccharomyces cerevisiae. We employ a SILAC-based approach to examine changes in global and local protein abundance and rationalise our results via network analysis and graph-theoretical approaches. Although the deletions result in an overall increase in intracellular protein content, correlated with an increase in cell size, this is not matched by substantial changes in individual protein concentrations. The phenotypic robustness to deletion of these major hub proteins cannot be simply explained by the presence of paralogues. Instead, network analysis and a theoretical consideration of folding workload suggest that the robustness to perturbation is a product of the overall network structure. This highlights how quantitative proteomics and systems modelling can be used to rationalise emergent network properties, and how the HSP70 system can accommodate the loss of major hubs. © 2015 The Authors. PROTEOMICS published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Accurate airway centerline extraction based on topological thinning using graph-theoretic analysis.
Bian, Zijian; Tan, Wenjun; Yang, Jinzhu; Liu, Jiren; Zhao, Dazhe
2014-01-01
The quantitative analysis of the airway tree is of critical importance in the CT-based diagnosis and treatment of common pulmonary diseases. The extraction of the airway centerline is a precursor to identifying the airway's hierarchical structure, measuring geometrical parameters, and guiding visualized detection. Traditional methods suffer from extra branches and circles due to incomplete segmentation results, which induce false analysis in applications. This paper proposes an automatic and robust centerline extraction method for the airway tree. First, the centerline is located based on the topological thinning method; border voxels are deleted symmetrically and iteratively to preserve topological and geometrical properties. Second, the structural information is generated using graph-theoretic analysis. Then inaccurate circles are removed with a distance-weighting strategy, and extra branches are pruned according to clinical anatomic knowledge. The centerline region without false appendices is eventually determined after the described phases. Experimental results show that the proposed method identifies more than 96% of branches, keeps consistency across different cases, and achieves a superior circle-free structure and centrality.
Dependence of quantitative accuracy of CT perfusion imaging on system parameters
NASA Astrophysics Data System (ADS)
Li, Ke; Chen, Guang-Hong
2017-03-01
Deconvolution is a popular method to calculate parametric perfusion parameters from four-dimensional CT perfusion (CTP) source images. During the deconvolution process, the four-dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and prior knowledge is often used to suppress noise associated with the process. These additional complexities complicate understanding of the deconvolution-based CTP imaging system and how its quantitative accuracy depends on the parameters and sub-operations involved in the image formation process. Meanwhile, there has been a strong clinical need to answer this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly during an emergent clinical situation (e.g. diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of parametric perfusion parameters to CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide developments of CTP imaging technology for better quantification accuracy and lower radiation dose.
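The regularized deconvolution step that the abstract's "regularization strength" refers to can be sketched as a minimal Tikhonov example (hypothetical gamma-variate arterial input and exponential residue function, not the paper's cascaded-systems model):

```python
import numpy as np

# Discretized perfusion model: tissue curve = AIF convolved with F*R(t).
n, dt = 64, 0.5
t = dt * np.arange(1, n + 1)
aif = t * np.exp(-t / 2.0)                    # gamma-variate arterial input
F_true, mtt = 0.6, 4.0
residue = np.exp(-(t - t[0]) / mtt)           # exponential residue, R(0) = 1

# Lower-triangular convolution matrix built from the AIF samples.
A = dt * np.array([[aif[i - j] if i >= j else 0.0
                    for j in range(n)] for i in range(n)])
c_tissue = A @ (F_true * residue)             # noiseless tissue curve

# Tikhonov-regularized deconvolution; lam trades noise against bias.
lam = 1e-3
k = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ c_tissue)
F_est = k.max()                               # flow = peak of recovered F*R(t)
```

With noiseless data and a small `lam` the recovered flow matches the true value; increasing `lam` (or adding source-image noise) biases the estimate, which is exactly the accuracy/parameter coupling the framework quantifies.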
Quantitative model of diffuse speckle contrast analysis for flow measurement.
Liu, Jialin; Zhang, Hongchao; Lu, Jian; Ni, Xiaowu; Shen, Zhonghua
2017-07-01
Diffuse speckle contrast analysis (DSCA) is a noninvasive optical technique capable of monitoring deep tissue blood flow. However, a detailed study of the speckle contrast model for DSCA has yet to be presented. We deduced the theoretical relationship between speckle contrast and exposure time and further simplified it to a linear approximation model. The feasibility of this linear model was validated by the liquid phantoms which demonstrated that the slope of this linear approximation was able to rapidly determine the Brownian diffusion coefficient of the turbid media at multiple distances using multiexposure speckle imaging. Furthermore, we have theoretically quantified the influence of optical property on the measurements of the Brownian diffusion coefficient which was a consequence of the fact that the slope of this linear approximation was demonstrated to be equal to the inverse of correlation time of the speckle.
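The linear-approximation idea in the abstract can be illustrated with the standard single-exponential speckle contrast model (the coherence factor and correlation time below are hypothetical values, not the phantom measurements of the paper):

```python
import math

beta, tau_c = 0.5, 1e-3   # hypothetical coherence factor and correlation time

def speckle_contrast_sq(T):
    """Standard single-exponential model K^2(T) for exposure time T."""
    x = T / tau_c
    return beta * (math.exp(-2 * x) - 1 + 2 * x) / (2 * x**2)

# For long exposures K^2 ~ beta*tau_c/T, so 1/K^2 is linear in T with
# slope 1/(beta*tau_c): the slope alone recovers the dynamics, which is
# the rapid multi-exposure determination described in the abstract.
T1, T2 = 50 * tau_c, 100 * tau_c
slope = (1 / speckle_contrast_sq(T2) - 1 / speckle_contrast_sq(T1)) / (T2 - T1)
tau_est = 1.0 / (beta * slope)
```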
Giraud, Nicolas; Blackledge, Martin; Goldman, Maurice; Böckmann, Anja; Lesage, Anne; Penin, François; Emsley, Lyndon
2005-12-28
A detailed analysis of nitrogen-15 longitudinal relaxation times in microcrystalline proteins is presented. A theoretical model to quantitatively interpret relaxation times is developed in terms of motional amplitude and characteristic time scale. Different averaging schemes are examined in order to propose an analysis of relaxation curves that takes into account the specificity of MAS experiments. In particular, it is shown that magic angle spinning averages the relaxation rate experienced by a single spin over one rotor period, resulting in individual relaxation curves that are dependent on the orientation of their corresponding carousel with respect to the rotor axis. Powder averaging thus leads to a nonexponential behavior in the observed decay curves. We extract dynamic information from experimental decay curves, using a diffusion in a cone model. We apply this study to the analysis of spin-lattice relaxation rates of the microcrystalline protein Crh at two different fields and determine differential dynamic parameters for several residues in the protein.
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2013-04-01
Critical analysis of the standard foundations of differential and integral calculus -- as the mathematical formalism of theoretical physics -- is proposed. The methodological basis of the analysis is the unity of formal logic and rational dialectics. It is shown that: (a) the foundations (i.e. dy/dx = lim (Δx -> 0) Δy/Δx = lim (Δx -> 0) [f(x + Δx) - f(x)]/Δx, dx = Δx, dy = Δy, where y = f(x) is a continuous function of one argument x, Δx and Δy are increments, and dx and dy are differentials) do not satisfy a law of formal logic -- the law of identity; (b) the infinitesimal quantities dx, dy are fictitious quantities: they have neither algebraic meaning nor geometrical meaning, because these quantities do not take numerical values and, therefore, have no quantitative measure; (c) expressions of the kind x + dx are erroneous because x (a finite quantity) and dx (an infinitely diminished quantity) have different sense, different qualitative determinacy; since x = const under Δx -> 0, a derivative does not contain the variable quantity x and depends only on the constant c. Consequently, the standard concepts ``infinitesimal quantity (uninterruptedly diminishing quantity)'', ``derivative'' and ``derivative as function of a variable quantity'' represent an incorrect basis of mathematics and theoretical physics.
What physicists should learn about finance (if they want to)
NASA Astrophysics Data System (ADS)
Schmidt, Anatoly
2006-03-01
There has been growing interest among physicists in Econophysics, i.e. the analysis and modeling of financial and economic processes using the concepts of theoretical physics. There has also been a perception that the financial industry is a viable alternative for those physicists who are not able or not willing to pursue a career in their major field. However, these days Wall Street expects applicants for quantitative positions to know not only stochastic calculus and the methods of time series analysis, but also such concepts as option pricing, portfolio management, and risk measurement. Here I describe a synthetic course based on my book ``Quantitative Finance for Physicists'' (Elsevier, 2004) that outlines both worlds: Econophysics and Mathematical Finance. This course may be offered as an elective for senior undergraduate or graduate Physics majors.
Delsignore, Ann Marie; Petrova, Elena; Harper, Amney; Stowe, Angela M; Mu'min, Ameena S; Middleton, Renée A
2010-07-01
An exploratory qualitative analysis of the critical incidents and assistance-seeking behaviors of White mental health psychologists and professional counselors was performed in an effort to examine a theoretical supposition presented within a Person(al)-as-Profession(al) transtheoretical framework (P-A-P). A concurrent nested strategy was used in which both quantitative and qualitative data were collected simultaneously (Creswell, 2003). In this nested strategy, qualitative data was embedded in a predominant (quantitative) method of analysis from an earlier study (see Middleton et al., 2005). Critical incidents categorized as informal (i.e., personal) experiences were cited more often than those characterized as formal (i.e., professional) experiences as influencing the professional perspectives of White mental health practitioners regarding multicultural diversity. Implications for the counseling and psychology professions are discussed.
Quantitative image analysis of WE43-T6 cracking behavior
NASA Astrophysics Data System (ADS)
Ahmad, A.; Yahya, Z.
2013-06-01
Environment-assisted cracking of WE43 cast magnesium (4.2 wt.% Y, 2.3 wt.% Nd, 0.7% Zr, 0.8% HRE) in the T6 peak-aged condition was induced in ambient air in notched specimens. The mechanism of fracture was studied using electron backscatter diffraction, serial sectioning and in situ observations of crack propagation. The intermetallic material (rare-earth-enriched divorced intermetallic retained at grain boundaries, predominantly at triple points) was found to play a significant role in initiating the cracks that lead to failure of this material. Quantitative measurements were required for this project. The populations of the intermetallic and clusters of intermetallic particles were analyzed using image analysis of metallographic images. This is part of the work to generate a theoretical model of the effect of notch geometry on the static fatigue strength of this material.
Analysis techniques for tracer studies of oxidation. M. S. Thesis Final Report
NASA Technical Reports Server (NTRS)
Basu, S. N.
1984-01-01
Analysis techniques to obtain quantitative diffusion data from tracer concentration profiles were developed. Mass balance ideas were applied to determine the mechanism of oxide growth and to separate the fraction of inward and outward growth of oxide scales. The process of inward oxygen diffusion with exchange was theoretically modelled and the effect of lattice diffusivity, grain boundary diffusivity and grain size on the tracer concentration profile was studied. The development of the tracer concentration profile in a growing oxide scale was simulated. The double oxidation technique was applied to a FeCrAl-Zr alloy using O-18 as a tracer. SIMS was used to obtain the tracer concentration profile. The formation of lacey oxide on the alloy was discussed. Careful consideration was given to the quality of data required to obtain quantitative information.
Quadrant photodetector sensitivity.
Manojlović, Lazo M
2011-07-10
A quantitative theoretical analysis of the quadrant photodetector (QPD) sensitivity in position measurement is presented. A Gaussian light spot irradiance distribution on the QPD surface was assumed, to meet most real-life applications of this sensor. As a result of the mathematical treatment of the problem, we obtained, in closed form, the sensitivity function versus the ratio of the light spot 1/e radius to the QPD radius. The obtained result is valid for the full range of the ratios. To check the influence of the finite light spot radius on the interaxis cross talk and linearity, we also performed a mathematical analysis to quantitatively measure these types of errors. An optimal range of the ratio of light spot radius to QPD radius has been found to simultaneously achieve low interaxis cross talk and high linearity of the sensor. © 2011 Optical Society of America
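The basic signal model can be sketched in the infinite-detector limit (the paper's closed form additionally accounts for the finite QPD radius; the spot size below is an arbitrary illustrative value): for a Gaussian spot I ~ exp(-r^2/w^2) centred at x0, the normalised left-right signal is erf(x0/w), giving a centre sensitivity of 2/(w*sqrt(pi)).

```python
import math

def x_signal(x0, w):
    """Normalised left-right QPD signal (P_R - P_L)/(P_R + P_L) for a
    Gaussian spot I ~ exp(-r^2/w^2) centred at x0, infinite detector."""
    return math.erf(x0 / w)

w = 0.3                     # 1/e spot radius (arbitrary units)

# Numerical sensitivity at the centre vs the closed form 2/(w*sqrt(pi)).
h = 1e-6
sens_numeric = (x_signal(h, w) - x_signal(-h, w)) / (2 * h)
sens_theory = 2.0 / (w * math.sqrt(math.pi))
```

Note the 1/w scaling: a tighter spot gives higher position sensitivity, which is why the ratio of spot radius to detector radius is the governing parameter of the analysis.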
Determination of neutron flux distribution in an Am-Be irradiator using the MCNP.
Shtejer-Diaz, K; Zamboni, C B; Zahn, G S; Zevallos-Chávez, J Y
2003-10-01
A neutron irradiator has been assembled at IPEN facilities to perform qualitative-quantitative analysis of many materials using thermal and fast neutrons outside the nuclear reactor premises. To establish the prototype specifications, the neutron flux distribution and the absorbed dose rates were calculated using the MCNP computer code. These theoretical predictions then allow one to discuss the optimum irradiator design and its performance.
ERIC Educational Resources Information Center
Detering, Brad
2017-01-01
This research study, grounded in the theoretical framework of education change, used the Concerns-Based Adoption Model of change to examine the concerns of Illinois high school teachers and administrators regarding the implementation of 1:1 computing programs. A quantitative study of educators investigated the stages of concern and the mathematics…
A new method for qualitative simulation of water resources systems: 1. Theory
NASA Astrophysics Data System (ADS)
Camara, A. S.; Pinheiro, M.; Antunes, M. P.; Seixas, M. J.
1987-11-01
A new dynamic modeling methodology, SLIN (Simulação Linguistica), allowing for the analysis of systems defined by linguistic variables, is presented. SLIN applies a set of logical rules avoiding fuzzy theoretic concepts. To make the transition from qualitative to quantitative modes, logical rules are used as well. Extensions of the methodology to simulation-optimization applications and multiexpert system modeling are also discussed.
Ali, Ashehad A.; Medlyn, Belinda E.; Aubier, Thomas G.; ...
2015-10-06
Differential species responses to atmospheric CO2 concentration (Ca) could lead to quantitative changes in competition among species and community composition, with flow-on effects for ecosystem function. However, there has been little theoretical analysis of how elevated Ca (eCa) will affect plant competition, or how the composition of plant communities might change. Such theoretical analysis is needed for developing testable hypotheses to frame experimental research. Here, we investigated theoretically how plant competition might change under eCa by implementing two alternative competition theories, resource use theory and resource capture theory, in a plant carbon and nitrogen cycling model. The model makes several novel predictions for the impact of eCa on plant community composition. Using resource use theory, the model predicts that eCa is unlikely to change species dominance in competition, but is likely to increase coexistence among species. Using resource capture theory, the model predicts that eCa may increase community evenness. Collectively, both theories suggest that eCa will favor coexistence and hence that species diversity should increase with eCa. Our theoretical analysis leads to a novel hypothesis for the impact of eCa on plant community composition. In this study, the hypothesis has potential to help guide the design and interpretation of eCa experiments.
Spreading of a granular droplet.
Sánchez, Iván; Raynaud, Franck; Lanuza, José; Andreotti, Bruno; Clément, Eric; Aranson, Igor S
2007-12-01
The influence of controlled vibrations on the granular rheology is investigated in a specifically designed experiment in which a granular film spreads under the action of horizontal vibrations. A nonlinear diffusion equation is derived theoretically that describes the evolution of the deposit shape. A self-similar parabolic shape (the "granular droplet") and a spreading dynamics are predicted that both agree quantitatively with the experimental results. The theoretical analysis is used to extract effective friction coefficients between the base and the granular layer under sustained and controlled vibrations. A shear thickening regime characteristic of dense granular flows is evidenced at low vibration energy, both for glass beads and natural sand. Conversely, shear thinning is observed at high agitation.
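The nonlinear diffusion dynamics described above can be sketched with the simplest equation of this type, h_t = (h h_x)_x, whose self-similar solution has the parabolic "droplet" shape (this specific equation and all parameters are illustrative assumptions, not the exact equation derived in the paper):

```python
import numpy as np

# Explicit conservative finite-difference scheme for h_t = (h h_x)_x,
# starting from a box-shaped granular deposit.
nx, dx, dt = 200, 0.05, 1e-4
x = dx * (np.arange(nx) - nx / 2)
h = np.where(np.abs(x) < 1.0, 1.0, 0.0)
mass0 = h.sum() * dx

for _ in range(5000):                        # integrate to t = 0.5
    hm = 0.5 * (h[1:] + h[:-1])              # face-centred h
    flux = hm * (h[1:] - h[:-1]) / dx        # h * h_x at cell faces
    h[1:-1] += dt / dx * (flux[1:] - flux[:-1])

mass = h.sum() * dx                          # conserved grain volume
width = dx * np.count_nonzero(h > 1e-3)      # support of the droplet
```

The conservative flux form keeps the total deposit volume fixed while the support spreads, mirroring the spreading dynamics measured in the experiment.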
Development of a theoretical framework for analyzing cerebrospinal fluid dynamics
Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy
2009-01-01
Background: To date, hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure-volume models and electric circuit analogs introduced pressure into volume conservation, but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms: qualitative approaches without a clear framework for meaningful quantitative comparison. Methods: Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and progression to disorder. Results: Several examples of meaningful intracranial control volumes and the particular measurement sets needed for the analysis are discussed. Conclusion: Control volume analysis provides a framework to guide the type and location of measurements and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective
Jacobs, Arthur M.
2017-01-01
In this paper I would like to lay the groundwork for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA), and a novel metric for quantifying word beauty, the aesthetic potential, is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials. PMID:29311877
Man-machine analysis of translation and work tasks of Skylab films
NASA Technical Reports Server (NTRS)
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
1979-01-01
An objective approach to determining the concurrent validity of computer-graphic models is real-time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer-assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
Mason, Deanna M
2014-01-01
This study employed a grounded theory research design to develop a theoretical model focused on the maturation of spirituality and its influence on behavior during late adolescence. Quantitative research studies have linked spirituality with decreased health-risk behaviors and increased health-promotion behaviors during late adolescence. Qualitative, theoretical data are needed to discover the underlying reasons this relationship exists and to increase the ability to apply this knowledge to practice. Twenty-one adolescents, age 16-21 years, were interviewed by e-mail and the transcripts analyzed using a conceptual lens of Blumer's symbolic interactionism. From this analysis, a theoretical model emerged with the core concept, "finding myself," which represents 5 core process concepts. Implications of this study illustrate that late adolescents are aware of their personal spiritual maturation as well as its influence on behavior. In addition, a distinction between the generic concept of spirituality, personal spirituality, and religion emerged.
Li, Lin; Xu, Shuo; An, Xin; Zhang, Lu-Da
2011-10-01
In near-infrared spectral quantitative analysis, the precision of the samples' measured chemical values sets the theoretical limit on the precision of quantitative analysis with mathematical models. However, the number of samples whose chemical values can be obtained accurately is small. Many models exclude samples without chemical values, and consider only those with chemical values when modeling sample compositions' contents. To address this problem, a semi-supervised LS-SVR (S2 LS-SVR) model is proposed on the basis of LS-SVR, which can utilize samples without chemical values as well as those with chemical values. As with the LS-SVR, training this model is equivalent to solving a linear system. Finally, samples of flue-cured tobacco were taken as experimental material, and corresponding quantitative analysis models were constructed for four sample compositions' contents (total sugar, reducing sugar, total nitrogen and nicotine) with PLS regression, LS-SVR and S2 LS-SVR. For the S2 LS-SVR model, the average relative errors between actual values and predicted ones for the four sample compositions' contents are 6.62%, 7.56%, 6.11% and 8.20%, respectively, and the correlation coefficients are 0.9741, 0.9733, 0.9230 and 0.9486, respectively. Experimental results show the S2 LS-SVR model outperforms the other two, which verifies the feasibility and efficiency of the S2 LS-SVR model.
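The "training is equivalent to solving a linear system" property can be illustrated with the plain, fully supervised LS-SVR that the S2 LS-SVR builds on (the semi-supervised extension is not shown; the 1-D toy data and kernel width are assumptions, not the paper's tobacco spectra):

```python
import numpy as np

# Toy regression data: y = sin(x) on 30 points.
X = np.linspace(0, 3, 30)[:, None]
y = np.sin(X[:, 0])

gamma, sigma = 100.0, 0.5                    # regularisation, RBF width
K = np.exp(-(X - X.T)**2 / (2 * sigma**2))   # RBF kernel (1-D inputs)
n = len(y)

# LS-SVR dual problem: one (n+1)x(n+1) linear system in (b, alpha).
M = np.block([[np.zeros((1, 1)), np.ones((1, n))],
              [np.ones((n, 1)), K + np.eye(n) / gamma]])
sol = np.linalg.solve(M, np.concatenate([[0.0], y]))
b, alpha = sol[0], sol[1:]

y_hat = K @ alpha + b                        # predictions at training points
rmse = np.sqrt(np.mean((y_hat - y)**2))
```

Unlike standard SVR, there is no quadratic program: the equality-constrained least-squares loss reduces training to this single solve, which is what makes the semi-supervised variant tractable as well.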
Enzyme Active Site Interactions by Raman/FTIR, NMR, and Ab Initio Calculations
Deng, Hua
2017-01-01
Characterization of enzyme active site structure and interactions at high resolution is important for the understanding of enzyme catalysis. Vibrational frequency and NMR chemical shift measurements of enzyme-bound ligands are often used for this purpose when X-ray structures are not available or when higher-resolution active site structures are desired. This review is focused on how ab initio calculations may be integrated with vibrational and NMR chemical shift measurements to quantitatively determine high-resolution ligand structures (up to 0.001 Å for bond length and 0.01 Å for hydrogen bonding distance) and how interaction energies between a bound ligand and its surroundings at the active site may be determined. Quantitative characterization of substrate ionic states, bond polarizations, tautomeric forms, conformational changes and interactions with surroundings in enzyme complexes that mimic the ground state or transition state can provide snapshots for visualizing the substrate's structural evolution along the enzyme-catalyzed reaction pathway. Our results have shown that the integration of spectroscopic studies with theoretical computation greatly enhances our ability to interpret experimental data and significantly increases the reliability of the theoretical analysis. PMID:24018325
Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance.
Romeo, Elizabeth M
2010-07-01
The concept of critical thinking has been influential in several disciplines. Both education and nursing in general have been attempting to define, teach, and measure this concept for decades. Nurse educators realize that critical thinking is the cornerstone of the objectives and goals for nursing students. The purpose of this article is to review and analyze quantitative research findings relevant to the measurement of critical thinking abilities and skills in undergraduate nursing students and the usefulness of critical thinking as a predictor of National Council Licensure Examination-Registered Nurse (NCLEX-RN) performance. The specific issues that this integrative review examined include assessment and analysis of the theoretical and operational definitions of critical thinking, theoretical frameworks used to guide the studies, instruments used to evaluate critical thinking skills and abilities, and the role of critical thinking as a predictor of NCLEX-RN outcomes. A list of key assumptions related to critical thinking was formulated. The limitations and gaps in the literature were identified, as well as the types of future research needed in this arena. Copyright 2010, SLACK Incorporated.
NASA Astrophysics Data System (ADS)
D'Angelo, Paola; Migliorati, Valentina; Mancini, Giordano; Barone, Vincenzo; Chillemi, Giovanni
2008-02-01
The structural and dynamic properties of the solvated Hg2+ ion in aqueous solution have been investigated by a combined experimental-theoretical approach employing x-ray absorption spectroscopy and molecular dynamics (MD) simulations. This method allows one to perform a quantitative analysis of the x-ray absorption near-edge structure (XANES) spectra of ionic solutions using a proper description of the thermal and structural fluctuations. XANES spectra have been computed starting from the MD trajectory, without carrying out any minimization in the structural parameter space. The XANES experimental data are accurately reproduced by a first-shell heptacoordinated cluster only if the second hydration shell is included in the calculations. These results confirm at the same time the existence of a sevenfold first hydration shell for the Hg2+ ion in aqueous solution and the reliability of the potentials used in the MD simulations. The combination of MD and XANES is found to be very helpful to get important new insights into the quantitative estimation of structural properties of disordered systems.
Quantitative analysis of diffusion tensor orientation: theoretical framework.
Wu, Yu-Chien; Field, Aaron S; Chung, Moo K; Badie, Benham; Alexander, Andrew L
2004-11-01
Diffusion-tensor MRI (DT-MRI) yields information about the magnitude, anisotropy, and orientation of water diffusion of brain tissues. Although white matter tractography and eigenvector color maps provide visually appealing displays of white matter tract organization, they do not easily lend themselves to quantitative and statistical analysis. In this study, a set of visual and quantitative tools for the investigation of tensor orientations in the human brain was developed. Visual tools included rose diagrams, which are spherical coordinate histograms of the major eigenvector directions, and 3D scatterplots of the major eigenvector angles. A scatter matrix of major eigenvector directions was used to describe the distribution of major eigenvectors in a defined anatomic region. A measure of eigenvector dispersion was developed to describe the degree of eigenvector coherence in the selected region. These tools were used to evaluate directional organization and the interhemispheric symmetry of DT-MRI data in five healthy human brains and two patients with infiltrative diseases of the white matter tracts. In normal anatomical white matter tracts, a high degree of directional coherence and interhemispheric symmetry was observed. The infiltrative diseases appeared to alter the eigenvector properties of affected white matter tracts, showing decreased eigenvector coherence and interhemispheric symmetry. This novel approach distills the rich, 3D information available from the diffusion tensor into a form that lends itself to quantitative analysis and statistical hypothesis testing. (c) 2004 Wiley-Liss, Inc.
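The scatter-matrix and dispersion ideas above can be sketched numerically (an illustrative construction under assumed conventions, e.g. taking dispersion as one minus the largest scatter-matrix eigenvalue, which need not match the paper's exact definition):

```python
import numpy as np

def dispersion(vecs):
    """1 - largest eigenvalue of the scatter matrix (1/N) sum v v^T.
    Near 0 for tightly aligned unit vectors, near 2/3 for isotropic ones;
    using v v^T makes the measure insensitive to eigenvector sign."""
    T = vecs.T @ vecs / len(vecs)
    return 1.0 - np.linalg.eigvalsh(T)[-1]

rng = np.random.default_rng(42)

# Coherent "tract": major eigenvectors clustered around the z axis.
coherent = rng.normal([0, 0, 1], 0.05, size=(500, 3))
coherent /= np.linalg.norm(coherent, axis=1, keepdims=True)

# Incoherent region: directions uniform on the sphere.
random = rng.normal(size=(500, 3))
random /= np.linalg.norm(random, axis=1, keepdims=True)

d_coh, d_rand = dispersion(coherent), dispersion(random)
```

A low dispersion value flags an organized white matter tract; infiltrative disease, as described above, would push the value toward the isotropic end.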
A quantum probability perspective on borderline vagueness.
Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter
2013-10-01
The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier's theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon. © 2013 Cognitive Science Society, Inc.
NASA Astrophysics Data System (ADS)
Takahashi, Tomoko; Thornton, Blair
2017-12-01
This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of the compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS), and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions under which calibration curves are applicable to quantification of the compositions of solid samples, and their limitations, are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and the Saha equation, has been applied in a number of studies, several requirements need to be satisfied for the calculated chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) and partial least squares (PLS) regression, which can extract composition-related information from the entire spectrum, are widely established methods and have been applied to various fields, including in-situ applications in air and planetary exploration. Artificial neural networks (ANNs), which can model non-linear effects, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that accuracy be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square error (NRMSE), when comparing the accuracy obtained from different setups and analytical methods.
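The NRMSE figure of merit recommended in the review is straightforward to compute. A minimal sketch follows; the normalisation convention is an assumption, since range- and mean-normalisation are both in common use:

```python
import numpy as np

def nrmse(predicted, reference, normalisation="range"):
    """Normalised root mean square error between predicted and reference
    compositions. Both range- and mean-normalisation are in common use,
    so the convention should be reported alongside the value."""
    p = np.asarray(predicted, dtype=float)
    r = np.asarray(reference, dtype=float)
    rmse = np.sqrt(np.mean((p - r) ** 2))
    if normalisation == "range":
        return rmse / (r.max() - r.min())
    return rmse / r.mean()

# Illustrative element concentrations (wt%) from a reference method vs. LIBS
ref  = np.array([10.0, 20.0, 30.0, 40.0])
pred = np.array([11.0, 19.0, 31.0, 39.0])
err = nrmse(pred, ref)   # RMSE = 1.0, range = 30 -> 1/30
```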
NBSGSC - a FORTRAN program for quantitative x-ray fluorescence analysis. Technical note (final)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tao, G.Y.; Pella, P.A.; Rousseau, R.M.
1985-04-01
A FORTRAN program (NBSGSC) was developed for performing quantitative analysis of bulk specimens by x-ray fluorescence spectrometry. This program corrects for x-ray absorption/enhancement phenomena using the comprehensive alpha coefficient algorithm proposed by Lachance (COLA). NBSGSC is a revision of the programs ALPHA and CARECAL originally developed by R.M. Rousseau of the Geological Survey of Canada. Part one of the program (CALCO) performs the calculation of theoretical alpha coefficients, and part two (CALCOMP) computes the composition of the analyte specimens. The analysis of alloys, pressed minerals, and fused specimens can currently be treated by the program. In addition to using measured x-ray tube spectral distributions, spectra from seven commonly used x-ray tube targets can also be calculated with an NBS algorithm included in the program. NBSGSC is written in FORTRAN IV for a Digital Equipment Corporation (DEC PDP-11/23) minicomputer using RL02 disks and an RSX-11M operating system.
Project risk management in the construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya
2018-03-01
This paper presents project risk management methods that make it possible to better identify risks in the construction of high-rise buildings and to manage them throughout the life cycle of the project. One of the project risk management processes is a quantitative analysis of risks. The quantitative analysis usually includes the assessment of the potential impact of project risks and of their probabilities. This paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the tasks of regression analysis of project data. The suggested algorithms for assessing the parameters of statistical models yield reliable estimates. A review of the theoretical problems in the development of robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.
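Huber-type robust regression of the kind the abstract invokes is commonly implemented as iteratively reweighted least squares. The sketch below is that standard textbook scheme, not the authors' extended algorithm; the data and outlier are illustrative:

```python
import numpy as np

def huber_regression(x, y, delta=1.345, n_iter=50):
    """Robust straight-line fit via iteratively reweighted least squares with
    Huber weights (w = 1 for small residuals, delta*s/|r| beyond the cutoff).
    delta = 1.345 is the conventional choice (~95% efficiency at the normal)."""
    X = np.column_stack([np.ones(len(x)), np.asarray(x, float)])  # intercept col
    y = np.asarray(y, float)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)          # ordinary LS start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # robust scale (MAD)
        s = s if s > 0 else 1.0
        w = np.clip(delta * s / np.maximum(np.abs(r), 1e-12), None, 1.0)
        sw = np.sqrt(w)
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta  # [intercept, slope]

# Line y = 2x with one grossly 'contaminated' observation
x = np.arange(10.0)
y = 2.0 * x
y[9] = 100.0
b = huber_regression(x, y)  # slope recovered close to 2 despite the outlier
```

Ordinary least squares on the same data would be pulled far off the true slope by the single contaminated point; the Huber weights shrink its influence toward zero.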
General description and understanding of the nonlinear dynamics of mode-locked fiber lasers.
Wei, Huai; Li, Bin; Shi, Wei; Zhu, Xiushan; Norwood, Robert A; Peyghambarian, Nasser; Jian, Shuisheng
2017-05-02
As a type of nonlinear system with complexity, mode-locked fiber lasers are known for their complex behaviour. It is a challenging task to understand the fundamental physics behind such complex behaviour, and neither a unified description of the nonlinear behaviour nor a systematic and quantitative analysis of the underlying mechanisms of these lasers has been developed. Here, we present a complexity-science-based theoretical framework for understanding the behaviour of mode-locked fiber lasers by going beyond reductionism. This hierarchically structured framework provides a model with variable dimensionality, resulting in a simple view that can be used to systematically describe complex states. Moreover, research into the attractors' basins reveals the origin of stochasticity, hysteresis and multistability in these systems and presents a new method for quantitative analysis of these nonlinear phenomena. These findings pave the way for dynamics analysis and system design of mode-locked fiber lasers. We expect that this paradigm will also enable potential applications in diverse research fields related to complex nonlinear phenomena.
Quantitative analysis of intra-Golgi transport shows intercisternal exchange for all cargo
Dmitrieff, Serge; Rao, Madan; Sens, Pierre
2013-01-01
The mechanisms controlling the transport of proteins through the Golgi stack of mammalian and plant cells are the subject of intense debate, with two models, cisternal progression and intercisternal exchange, emerging as major contenders. A variety of transport experiments have claimed support for each of these models. We reevaluate these experiments using a single quantitative coarse-grained framework of intra-Golgi transport that accounts for both transport models and their many variants. Our analysis makes a definitive case for the existence of intercisternal exchange both for small membrane proteins and for large protein complexes; this implies that membrane structures larger than the typical protein-coated vesicles must be involved in transport. Notwithstanding, we find that current observations on protein transport cannot rule out cisternal progression as contributing significantly to the transport process. To discriminate between the different models of intra-Golgi transport, we suggest experiments and an analysis based on our extended theoretical framework that compare the dynamics of transiting and resident proteins. PMID:24019488
Stevanović, Nikola R; Perušković, Danica S; Gašić, Uroš M; Antunović, Vesna R; Lolić, Aleksandar Đ; Baošić, Rada M
2017-03-01
The objectives of this study were to gain insight into structure-retention relationships and to propose a model for estimating retention. Chromatographic investigation of a series of 36 Schiff bases and their copper(II) and nickel(II) complexes was performed under both normal- and reverse-phase conditions. Chemical structures of the compounds were characterized by molecular descriptors, which were calculated from the structure and related to the chromatographic retention parameters by multiple linear regression analysis. Effects of chelation on the retention parameters of the investigated compounds, under normal- and reverse-phase chromatographic conditions, were analyzed by principal component analysis. Quantitative structure-retention relationship and quantitative structure-activity relationship models were developed on the basis of theoretical molecular descriptors, calculated exclusively from molecular structure, and parameters of retention and lipophilicity. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Isono, Hiroshi; Hirata, Shinnosuke; Hachiya, Hiroyuki
2015-07-01
In medical ultrasonic images of liver disease, a texture with a speckle pattern indicates a microscopic structure such as nodules surrounded by fibrous tissues in hepatitis or cirrhosis. We have been applying texture analysis based on a co-occurrence matrix to ultrasonic images of fibrotic liver for quantitative tissue characterization. A co-occurrence matrix consists of the probability distribution of brightness of pixel pairs specified with spatial parameters and gives new information on liver disease. Ultrasonic images of different types of fibrotic liver were simulated and the texture-feature contrast was calculated to quantify the co-occurrence matrices generated from the images. The results show that the contrast converges with a value that can be theoretically estimated using a multi-Rayleigh model of echo signal amplitude distribution. We also found that the contrast value increases as liver fibrosis progresses and fluctuates depending on the size of fibrotic structure.
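The co-occurrence matrix and its texture-feature contrast, as used above, can be sketched directly. The offset, number of grey levels, and toy image are illustrative assumptions; contrast here is the standard Haralick definition, the sum of (i-j)² weighted by the matrix entries:

```python
import numpy as np

def cooccurrence_matrix(img, dx=1, dy=0, levels=8):
    """Grey-level co-occurrence matrix for pixel pairs at offset (dx, dy),
    normalised so that the entries sum to 1."""
    img = np.asarray(img)
    P = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(h - dy):
        for j in range(w - dx):
            P[img[i, j], img[i + dy, j + dx]] += 1
    return P / P.sum()

def contrast(P):
    """Texture-feature contrast: sum over (i, j) of (i - j)**2 * P[i, j]."""
    i, j = np.indices(P.shape)
    return float(((i - j) ** 2 * P).sum())

# Toy 'speckle' image: 4 grey levels, uncorrelated pixels
rng = np.random.default_rng(0)
img = rng.integers(0, 4, size=(64, 64))
c = contrast(cooccurrence_matrix(img, levels=4))
# for independent uniform levels 0..3 the expected contrast is 2.5
```

Spatially correlated textures (such as speckle from structured tissue) concentrate probability near the diagonal of P and lower the contrast, which is why the feature responds to fibrotic structure size.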
Amide I vibrational circular dichroism of dipeptide: Conformation dependence and fragment analysis
NASA Astrophysics Data System (ADS)
Choi, Jun-Ho; Cho, Minhaeng
2004-03-01
The amide I vibrational circular dichroic response of alanine dipeptide analog (ADA) was theoretically investigated and the density functional theory calculation and fragment analysis results are presented. A variety of vibrational spectroscopic properties, local and normal mode frequencies, coupling constant, dipole, and rotational strengths, are calculated by varying two dihedral angles determining the three-dimensional ADA conformation. Considering two monopeptide fragments separately, we show that the amide I vibrational circular dichroism of the ADA can be quantitatively predicted. For several representative conformations of the model ADA, vibrational circular dichroism spectra are calculated by using both the density functional theory calculation and fragment analysis methods.
Neural electrical activity and neural network growth.
Gafarov, F M
2018-05-01
The development of the central and peripheral nervous systems depends in part on the emergence of the correct functional connectivity in their input and output pathways. It is now generally accepted that molecular factors guide neurons to establish a primary scaffold that undergoes activity-dependent refinement to build a fully functional circuit. However, a number of experimental results obtained recently show that neuronal electrical activity plays an important role in establishing initial interneuronal connections. Nevertheless, these processes are rather difficult to study experimentally, owing to the absence of a theoretical description and of quantitative parameters for estimating the influence of neuronal activity on growth in neural networks. In this work we propose a general framework for a theoretical description of activity-dependent neural network growth. The theoretical description incorporates a closed-loop growth model in which neural activity can affect neurite outgrowth, which in turn can affect neural activity. We carried out a detailed quantitative analysis of spatiotemporal activity patterns and studied the relationship between individual cells and the network as a whole to explore the relationship between developing connectivity and activity patterns. The model developed in this work will allow us to develop new experimental techniques for studying and quantifying the influence of neuronal activity on growth processes in neural networks, and may lead to novel techniques for constructing large-scale neural networks by self-organization. Copyright © 2018 Elsevier Ltd. All rights reserved.
Quantitative Theoretical and Conceptual Framework Use in Agricultural Education Research
ERIC Educational Resources Information Center
Kitchel, Tracy; Ball, Anna L.
2014-01-01
The purpose of this philosophical paper was to articulate the disciplinary tenets for consideration when using theory in agricultural education quantitative research. The paper clarified terminology around the concept of theory in social sciences and introduced inaccuracies of theory use in agricultural education quantitative research. Finally,…
NASA Astrophysics Data System (ADS)
Hunt, Allen G.; Sahimi, Muhammad
2017-12-01
We describe the most important developments in the application of three theoretical tools to modeling of the morphology of porous media and flow and transport processes in them. One tool is percolation theory. Although it was over 40 years ago that the possibility of using percolation theory to describe flow and transport processes in porous media was first raised, new models and concepts, as well as new variants of the original percolation model are still being developed for various applications to flow phenomena in porous media. The other two approaches, closely related to percolation theory, are the critical-path analysis, which is applicable when porous media are highly heterogeneous, and the effective medium approximation—poor man's percolation—that provide a simple and, under certain conditions, quantitatively correct description of transport in porous media in which percolation-type disorder is relevant. Applications to topics in geosciences include predictions of the hydraulic conductivity and air permeability, solute and gas diffusion that are particularly important in ecohydrological applications and land-surface interactions, and multiphase flow in porous media, as well as non-Gaussian solute transport, and flow morphologies associated with imbibition into unsaturated fractures. We describe new applications of percolation theory of solute transport to chemical weathering and soil formation, geomorphology, and elemental cycling through the terrestrial Earth surface. Wherever quantitatively accurate predictions of such quantities are relevant, so are the techniques presented here. Whenever possible, the theoretical predictions are compared with the relevant experimental data. In practically all the cases, the agreement between the theoretical predictions and the data is excellent. Also discussed are possible future directions in the application of such concepts to many other phenomena in geosciences.
Phylogenetic Properties of RNA Viruses
Pompei, Simone; Loreto, Vittorio; Tria, Francesca
2012-01-01
A new word, phylodynamics, was coined to emphasize the interconnection between phylogenetic properties, as observed for instance in a phylogenetic tree, and the epidemic dynamics of viruses, where selection, mediated by the host immune response, and transmission play a crucial role. The challenges faced when investigating the evolution of RNA viruses call for a virtuous loop of data collection, data analysis and modeling. This has already resulted both in the collection of massive sequence databases and in the formulation of hypotheses on the main mechanisms driving the qualitative differences observed in the (reconstructed) evolutionary patterns of different RNA viruses. Qualitatively, it has been observed that selection driven by the host immune response induces an uneven survival ability among co-existing strains. As a consequence, the imbalance level of the phylogenetic tree is markedly more pronounced than in the case where the interaction with the host immune system does not play a central role in the evolutionary dynamics. While many imbalance metrics have been introduced, reliable methods to discriminate quantitatively between different levels of imbalance are still lacking. In our work, we reconstruct and analyze the phylogenetic trees of six RNA viruses, with a special emphasis on the human Influenza A virus, due to its relevance for vaccine preparation as well as the theoretical challenges it poses through its peculiar evolutionary dynamics. We focus in particular on topological properties. We point out the limitations of standard imbalance metrics, and we introduce a new methodology with which we assign the correct imbalance level of the phylogenetic trees, in agreement with the phylodynamics of the viruses.
Our thorough quantitative analysis allows for a deeper understanding of the evolutionary dynamics of the considered RNA viruses, which is crucial in order to provide a valuable framework for a quantitative assessment of theoretical predictions. PMID:23028645
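The Colless index is one of the standard imbalance metrics whose limitations the authors point out. A minimal sketch of that textbook metric (not the authors' new methodology); the nested-tuple tree encoding is an illustrative choice:

```python
def colless_index(tree):
    """Colless imbalance of a rooted binary tree: the sum over internal
    nodes of |leaves(left subtree) - leaves(right subtree)|.
    Trees are encoded as nested 2-tuples with strings as leaves."""
    def walk(node):
        if not isinstance(node, tuple):          # leaf
            return 1, 0                          # (leaf count, partial index)
        (nl, il), (nr, ir) = walk(node[0]), walk(node[1])
        return nl + nr, il + ir + abs(nl - nr)
    return walk(tree)[1]

# Fully balanced four-leaf tree vs. the maximally imbalanced 'caterpillar'
balanced    = (("a", "b"), ("c", "d"))
caterpillar = ((("a", "b"), "c"), "d")
i_bal = colless_index(balanced)      # 0
i_cat = colless_index(caterpillar)   # 3, the maximum for four leaves
```

Caterpillar-like trees, with one long surviving trunk, are the shape immune-driven selection tends to produce in influenza phylogenies, which is what imbalance metrics attempt to quantify.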
NASA Astrophysics Data System (ADS)
Döveling, Katrin
2015-04-01
In an age of rising impact of online communication in social network sites (SNS), emotional interaction is neither limited nor restricted by time or space. Bereavement extends to the anonymity of cyberspace. What role does virtual interaction play in SNS in dealing with the basic human emotion of grief caused by the loss of a beloved person? The analysis laid out in this article provides answers in light of an interdisciplinary perspective on online bereavement. Relevant lines of research are scrutinized. After laying out the theoretical spectrum for the study, hypotheses based on a prior in-depth qualitative content analysis of 179 postings in three different German online bereavement platforms are proposed and scrutinized in a quantitative content analysis (2127 postings from 318 users). Emotion regulation patterns in SNS and similarities as well as differences in online bereavement of children, adolescents and adults are revealed. Large-scale quantitative findings into central motives, patterns, and restorative effects of online shared bereavement in regulating distress, fostering personal empowerment, and engendering meaning are presented. The article closes with implications for further analysis in memorialization practices.
Note: Eddy current displacement sensors independent of target conductivity.
Wang, Hongbo; Li, Wei; Feng, Zhihua
2015-01-01
Eddy current sensors (ECSs) are widely used for non-contact displacement measurement. In this note, the quantitative error of an ECS caused by target conductivity was analyzed using a complex image method. The response curves (L-x) of the ECS with different targets were similar and could be superimposed by shifting the curves along the x direction by √2δ/2. Both finite element analysis and experiments match the theoretical analysis well, which indicates that the measurement error of high-precision ECSs caused by target conductivity can be completely eliminated, and that ECSs can measure different materials precisely without calibration.
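The x-shift in the note is set by the skin depth δ of each target. A sketch of the skin-depth calculation for two non-magnetic targets; the excitation frequency and the reading that the relative offset between two targets is √2/2 times their skin-depth difference are illustrative assumptions:

```python
import math

def skin_depth(resistivity, freq, mu_r=1.0):
    """Electromagnetic skin depth: delta = sqrt(2 * rho / (omega * mu))."""
    mu = mu_r * 4e-7 * math.pi            # permeability (H/m)
    omega = 2.0 * math.pi * freq          # angular frequency (rad/s)
    return math.sqrt(2.0 * resistivity / (omega * mu))

# Skin depths at an illustrative 1 MHz excitation
f = 1.0e6
delta_cu = skin_depth(1.7e-8, f)   # copper, roughly 66 um
delta_al = skin_depth(2.8e-8, f)   # aluminium, roughly 84 um

# Per the note, each L-x curve shifts by sqrt(2)*delta/2, so two targets'
# curves superimpose after a relative offset of sqrt(2)/2 * (delta difference)
shift = math.sqrt(2.0) / 2.0 * (delta_al - delta_cu)
```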
1985-11-01
Kappus, 1985; Tyson and Green, in press). When ethane evolution was quantitated, the experimental conditions were modified to maximize sensitivity, as... commonly used and convenient technique for that purpose (Kappus, 1985). Since MDA, the lipid breakdown product that the TBA reaction primarily... cytochrome P-450(c) reductase. Mol. Pharmacol. 20, 669-673 (1981). Kappus, H. (1985). Lipid peroxidation: mechanisms, analysis, enzymology and
Day, Charles A.; Kraft, Lewis J.; Kang, Minchul; Kenworthy, Anne K.
2012-01-01
Fluorescence recovery after photobleaching (FRAP) is a powerful, versatile and widely accessible tool to monitor molecular dynamics in living cells that can be performed using modern confocal microscopes. Although the basic principles of FRAP are simple, quantitative FRAP analysis requires careful experimental design, data collection and analysis. In this review we discuss the theoretical basis for confocal FRAP, followed by step-by-step protocols for FRAP data acquisition using a laser scanning confocal microscope for (1) measuring the diffusion of a membrane protein, (2) measuring the diffusion of a soluble protein, and (3) analysis of intracellular trafficking. Finally, data analysis procedures are discussed and an equation for determining the diffusion coefficient of a molecular species undergoing pure diffusion is presented. PMID:23042527
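The review presents an equation for the diffusion coefficient under pure diffusion. As an illustration, the widely used Soumpasis relation for a uniform circular bleach spot is sketched below; whether this is the exact equation in the review is an assumption, and the synthetic recovery curve is purely illustrative:

```python
import math

def half_time(t, F, F0, Finf):
    """Interpolate the time at which recovery first reaches (F0 + Finf) / 2."""
    target = 0.5 * (F0 + Finf)
    for i in range(1, len(t)):
        if F[i] >= target:
            frac = (target - F[i - 1]) / (F[i] - F[i - 1])
            return t[i - 1] + frac * (t[i] - t[i - 1])
    raise ValueError("recovery never reaches the half level")

def diffusion_coefficient(w, t_half):
    """Soumpasis relation for a uniform circular bleach spot of radius w:
    D = 0.224 * w**2 / t_half (pure diffusion, no binding)."""
    return 0.224 * w ** 2 / t_half

# Synthetic recovery curve F(t) = Finf - (Finf - F0) * exp(-t / tau)
tau, F0, Finf = 2.0, 0.2, 1.0
ts = [0.1 * k for k in range(200)]
Fs = [Finf - (Finf - F0) * math.exp(-t / tau) for t in ts]
th = half_time(ts, Fs, F0, Finf)             # analytically tau * ln 2
D = diffusion_coefficient(w=1.0, t_half=th)  # um^2/s if w is in um, t in s
```

Note that the relation only applies when recovery is diffusion-limited; binding or trafficking contributions, as the review stresses, require a different analysis.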
Symbolic interactionism as a theoretical perspective for multiple method research.
Benzies, K M; Allen, M N
2001-02-01
Qualitative and quantitative research rely on different epistemological assumptions about the nature of knowledge. However, the majority of nurse researchers who use multiple method designs do not address the problem of differing theoretical perspectives. Traditionally, symbolic interactionism has been viewed as one perspective underpinning qualitative research, but it is also the basis for quantitative studies. Rooted in social psychology, symbolic interactionism has a rich intellectual heritage that spans more than a century. Underlying symbolic interactionism is the major assumption that individuals act on the basis of the meaning that things have for them. The purpose of this paper is to present symbolic interactionism as a theoretical perspective for multiple method designs with the aim of expanding the dialogue about new methodologies. Symbolic interactionism can serve as a theoretical perspective for conceptually clear and soundly implemented multiple method research that will expand the understanding of human health behaviour.
How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography
Jørgensen, J. S.; Sidky, E. Y.
2015-01-01
We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization. PMID:25939620
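An empirical phase-diagram sweep of the kind described can be sketched in a compressed-sensing toy setting. This uses Gaussian sensing matrices and orthogonal matching pursuit as stand-ins for the X-ray CT geometry and total-variation reconstruction studied in the paper; all problem sizes are illustrative:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedy recovery of a k-sparse x from y = A @ x."""
    r, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ r))))          # best-matching column
        x_sub, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        r = y - A[:, idx] @ x_sub                            # update residual
    x = np.zeros(A.shape[1])
    x[idx] = x_sub
    return x

def success_rate(n, m, k, trials=20, rng=np.random.default_rng(1)):
    """Fraction of trials in which a random k-sparse signal is recovered
    from m Gaussian measurements (relative error < 1e-4)."""
    ok = 0
    for _ in range(trials):
        A = rng.standard_normal((m, n)) / np.sqrt(m)
        x = np.zeros(n)
        x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
        ok += np.linalg.norm(omp(A, A @ x, k) - x) < 1e-4 * np.linalg.norm(x)
    return ok / trials

# One slice of a phase diagram: fix signal size n and sparsity k,
# sweep the number of measurements m and watch the success rate rise
n, k = 64, 4
rates = [success_rate(n, m, k) for m in (8, 16, 32)]
```

A full phase diagram repeats this sweep over a grid of (sparsity, sampling) pairs and plots the empirical transition boundary between failure and success.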
ERIC Educational Resources Information Center
Dodd, Bucky J.
2013-01-01
Online course design is an emerging practice in higher education, yet few theoretical models currently exist to explain or predict how the diffusion of innovations occurs in this space. This study used a descriptive, quantitative survey research design to examine theoretical relationships between decision-making style and resistance to change…
NASA Technical Reports Server (NTRS)
Fu, L. S.
1980-01-01
The three main topics covered are: (1) fracture toughness and microstructure; (2) quantitative ultrasonics and microstructure; and (3) scattering and related mathematical methods. Literature in these areas is reviewed to give insight into the search for a theoretical foundation for quantitative ultrasonic measurement of fracture toughness. The literature review shows that fracture toughness is inherently related to the microstructure; in particular, it depends upon the spacing of inclusions or second-phase particles and the aspect ratio of second-phase particles. There are indications that ultrasonic velocity and attenuation measurements can be used to determine fracture toughness. This leads to a review of the mathematical models available for solving boundary value problems related to the microstructural factors that govern fracture toughness and wave motion. A framework for the theoretical study of the quantitative determination of fracture toughness is described and suggestions for future research are proposed.
Proposal for a quantitative index of flood disasters.
Feng, Lihua; Luo, Gaoyuan
2010-07-01
Drawing on the calculation of wind scale and earthquake magnitude, this paper develops a new quantitative method for measuring flood magnitude and disaster intensity. Flood magnitude is the quantitative index that describes the scale of a flood; flood disaster intensity is the quantitative index describing the losses caused. Both indices rest on clearly definable concepts and are simple to apply, which gives them numerous theoretical and practical advantages and key practical significance.
Guo, Yujie; Shen, Jie; Ye, Xuchun; Chen, Huali; Jiang, Anli
2013-08-01
This paper aims to report the design, and test the effectiveness, of an innovative caring teaching model based on the theoretical framework of caring in the Chinese context. Since the 1970s, caring has been a core value in nursing education. In a previous study, a theoretical framework of caring in the Chinese context was explored in a grounded theory study and considered beneficial for caring education. A caring teaching model was designed theoretically, and a one-group pre- and post-test quasi-experimental study was administered to test its effectiveness. From October 2009 to July 2010, a cohort of second-year undergraduate nursing students (n=64) in a Chinese medical school was recruited to participate in the study. Data were gathered through quantitative and qualitative methods to evaluate the effectiveness of the caring teaching model. The caring teaching model created an esthetic situation and an experiential learning style for teaching caring that was integrated within the curricula. Quantitative data from the quasi-experimental study showed that the post-test scores of each item were higher than those on the pre-test (p<0.01). Thematic analysis of 1220 narratives from students' caring journals and reports of participant class observation revealed two main thematic categories, which reflected, from the students' points of view, the development of student caring character and the impact that the caring teaching model had in this regard. The model could be used as an integrated approach to teaching caring in nursing curricula. It would also be beneficial for nursing administrators in cultivating caring nurse practitioners. Copyright © 2012 Elsevier Ltd. All rights reserved.
Faiola, C. L.; Erickson, M. H.; Fricaud, V. L.; ...
2012-08-10
Biogenic volatile organic compounds (BVOCs) are emitted into the atmosphere by plants and include isoprene, monoterpenes, sesquiterpenes, and their oxygenated derivatives. These BVOCs are among the principal factors influencing the oxidative capacity of the atmosphere in forested regions. BVOC emission rates are often measured by collecting samples onto adsorptive cartridges in the field and then transporting these samples to the laboratory for chromatographic analysis. One of the most commonly used detectors in chromatographic analysis is the flame ionization detector (FID). For quantitative analysis with an FID, relative response factors may be estimated using the effective carbon number (ECN) concept. The purpose of this study was to determine the ECN for a variety of terpenoid compounds to enable improved quantification of BVOC measurements. A dynamic dilution system was developed to make quantitative gas standards of VOCs with mixing ratios from 20–55 ppb. For each experiment using this system, one terpene standard was co-injected with an internal reference, n-octane, and analyzed via an automated cryofocusing system interfaced to a gas chromatograph flame ionization detector and mass spectrometer (GC/MS/FID). The ECNs of 16 compounds (14 BVOCs) were evaluated with this approach, with each test compound analyzed at least three times. The difference between the actual carbon number and measured ECN ranged from -24% to -2%. Furthermore, the difference between theoretical ECN and measured ECN ranged from -22% to 9%.
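The ECN approach compares per-mole FID response against the internal reference: assuming response per mole is proportional to effective carbon number, ECN_x = ECN_ref × (A_x/c_x)/(A_ref/c_ref). A minimal sketch with made-up peak areas (the numbers are illustrative, not the paper's data; n-octane with ECN 8 is the reference, as in the study):

```python
def measured_ecn(area_x, ppb_x, area_ref, ppb_ref, ecn_ref=8.0):
    """Effective carbon number of a test compound from FID peak areas,
    assuming FID response per mole is proportional to ECN:
        ECN_x = ECN_ref * (A_x / c_x) / (A_ref / c_ref)
    with c the molar mixing ratios and n-octane (ECN 8) as the reference."""
    return ecn_ref * (area_x / ppb_x) / (area_ref / ppb_ref)

# Illustrative (made-up) numbers: a C10 terpene at 30 ppb whose response
# corresponds to only 9 effective carbons
ecn = measured_ecn(area_x=33750.0, ppb_x=30.0, area_ref=30000.0, ppb_ref=30.0)
deviation = (ecn - 10.0) / 10.0   # -0.10, i.e. 10% below the actual carbon number
```

Negative deviations of this size are consistent with the -24% to -2% range reported above for terpenoids, whose oxygenated and branched structures respond below their nominal carbon count.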
Testing adaptive toolbox models: a Bayesian hierarchical approach.
Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
Contact thermal shock test of ceramics
NASA Technical Reports Server (NTRS)
Rogers, W. P.; Emery, A. F.
1992-01-01
A novel quantitative thermal shock test of ceramics is described. The technique employs contact between a metal cooling rod and a hot disk-shaped specimen. In contrast with traditional techniques, the well-defined thermal boundary condition allows for accurate analyses of heat transfer, stress, and fracture. Uniform equibiaxial tensile stresses are induced in the center of the test specimen. Transient specimen temperature and acoustic emission are monitored continuously during the thermal stress cycle. The technique is demonstrated with soda-lime glass specimens. Experimental results are compared with theoretical predictions based on a finite-element method thermal stress analysis combined with a statistical model of fracture. Material strength parameters are determined using concentric ring flexure tests. Good agreement is found between experimental results and theoretical predictions of failure probability as a function of time and initial specimen temperature.
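The statistical model of fracture referenced above is typically a two-parameter Weibull description of brittle failure; a minimal sketch with hypothetical strength parameters (characteristic strength sigma0 and Weibull modulus m, of the kind obtained from concentric ring flexure tests):

```python
import math

# Weibull failure probability under an equibiaxial tensile stress.
# sigma0 and m below are invented for illustration, not the measured
# soda-lime glass parameters.

def weibull_failure_probability(stress, sigma0, m):
    """P_f = 1 - exp(-(stress/sigma0)**m) for stress >= 0."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

p_low = weibull_failure_probability(50.0, sigma0=100.0, m=7.0)
p_at_sigma0 = weibull_failure_probability(100.0, sigma0=100.0, m=7.0)
print(p_low, p_at_sigma0)  # small probability, then 1 - 1/e ~ 0.632
```

Feeding the transient thermal stress from a finite-element analysis into such a curve yields failure probability as a function of time, which is what the experiment checks.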
Hötzel, Fabian; Seino, Kaori; Huck, Christian; Skibbe, Olaf; Bechstedt, Friedhelm; Pucci, Annemarie
2015-06-10
The metal-atom chains on the Si(111) - 5 × 2 - Au surface represent an exceedingly interesting system for the understanding of one-dimensional electrical interconnects. While other metal-atom chain structures on silicon suffer from metal-to-insulator transitions, Si(111) - 5 × 2 - Au stays metallic at least down to 20 K, as we have demonstrated by the anisotropic absorption from localized plasmon polaritons in the infrared. A quantitative analysis of the infrared plasmonic signal, performed here for the first time, yields valuable band structure information in agreement with the theoretically derived data. The experimental and theoretical results are consistently explained in the framework of the atomic geometry, electronic structure, and IR spectra of the recent Kwon-Kang model.
Bidirectional selection between two classes in complex social networks.
Zhou, Bin; He, Zhe; Jiang, Luo-Luo; Wang, Nian-Xin; Wang, Bing-Hong
2014-12-19
Bidirectional selection between two classes emerges widely in social life, for example in commercial trading and mate choosing. Until now, discussion of bidirectional selection in structured human societies has been quite limited. We demonstrated theoretically that the rate of successful matching is affected greatly by individuals' neighborhoods in social networks, regardless of the type of network. Furthermore, it is found that a high average degree of the network contributes to increasing rates of successful matches. The matching performance in different types of networks has been quantitatively investigated, revealing that small-world networks reinforce the matching rate more than scale-free networks at a given average degree. In addition, our analysis is consistent with the modeling results, which provides a theoretical understanding of the underlying mechanisms of matching in complex networks.
NASA Astrophysics Data System (ADS)
Liu, Lixian; Mandelis, Andreas; Huan, Huiting; Melnikov, Alexander
2016-10-01
A step-scan differential Fourier transform infrared photoacoustic spectroscopy (DFTIR-PAS) system based on a commercial FTIR spectrometer was developed theoretically and experimentally for air contaminant monitoring. The configuration comprises two identical, small-size, low-resonance-frequency T cells satisfying the conflicting requirements of low chopping frequency and limited space in the sample compartment. Carbon dioxide (CO2) IR absorption spectra were used to demonstrate the capability of the DFTIR-PAS method to detect ambient pollutants. A linear amplitude response to CO2 concentrations from 100 to 10,000 ppmv was observed, leading to a theoretical detection limit of 2 ppmv. The differential mode was able to suppress the coherent noise, thereby imparting the DFTIR-PAS method with a better signal-to-noise ratio and lower theoretical detection limit than the single mode. The results indicate that it is possible to use step-scan DFTIR-PAS with T cells as a quantitative method for high sensitivity analysis of ambient contaminants.
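A theoretical detection limit of the kind quoted above follows directly from a linear calibration: it is the concentration whose signal equals some multiple (commonly 3) of the baseline noise. A minimal sketch with a hypothetical calibration slope and noise level, chosen only to illustrate the arithmetic:

```python
# Detection limit from a linear photoacoustic calibration,
# LOD = k * sigma_noise / slope (k = 3 is the usual convention).
# Slope and noise values are invented, not instrument data.

def detection_limit(slope_signal_per_ppmv, noise_sigma, k=3.0):
    return k * noise_sigma / slope_signal_per_ppmv

lod_ppmv = detection_limit(slope_signal_per_ppmv=1.5e-3, noise_sigma=1.0e-3)
print(lod_ppmv)  # 2.0
```

Suppressing coherent noise in the differential mode lowers noise_sigma, which is exactly why it improves the theoretical detection limit.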
Inverse transport problems in quantitative PAT for molecular imaging
NASA Astrophysics Data System (ADS)
Ren, Kui; Zhang, Rongting; Zhong, Yimin
2015-12-01
Fluorescence photoacoustic tomography (fPAT) is a molecular imaging modality that combines photoacoustic tomography with fluorescence imaging to obtain high-resolution imaging of fluorescence distributions inside heterogeneous media. The objective of this work is to study inverse problems in the quantitative step of fPAT where we intend to reconstruct physical coefficients in a coupled system of radiative transport equations using internal data recovered from ultrasound measurements. We derive uniqueness and stability results on the inverse problems and develop some efficient algorithms for image reconstructions. Numerical simulations based on synthetic data are presented to validate the theoretical analysis. The results we present here complement these in Ren K and Zhao H (2013 SIAM J. Imaging Sci. 6 2024-49) on the same problem but in the diffusive regime.
FDTD-based quantitative analysis of terahertz wave detection for multilayered structures.
Tu, Wanli; Zhong, Shuncong; Shen, Yaochun; Zhou, Qing; Yao, Ligang
2014-10-01
Experimental investigations have shown that terahertz pulsed imaging (TPI) is able to quantitatively characterize a range of multilayered media (e.g., biological tissues, pharmaceutical tablet coatings, layered polymer composites, etc.). Advanced modeling of the interaction of terahertz radiation with a multilayered medium is required to enable the wide application of terahertz technology in a number of emerging fields, including nondestructive testing. Indeed, there have already been many theoretical analyses performed on the propagation of terahertz radiation in various multilayered media. However, to date, most of these studies used 1D or 2D models, and the dispersive nature of the dielectric layers was not considered or was simplified. In the present work, the theoretical framework of using terahertz waves for the quantitative characterization of multilayered media was established. A 3D model based on the finite difference time domain (FDTD) method is proposed. A batch of pharmaceutical tablets with a single coating layer of different coating thicknesses and different refractive indices was modeled. The reflected terahertz wave from such a sample was computed using the FDTD method, assuming that the incident terahertz wave is broadband, covering a frequency range up to 3.5 THz. The simulated results for all of the pharmaceutical-coated tablets considered were found to be in good agreement with the experimental results obtained using a commercial TPI system. In addition, we studied a three-layered medium to mimic the occurrence of defects in the sample.
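The quantitative link between a TPI waveform and a coating thickness is, at its simplest, a time-of-flight relation: an echo from the coating/core interface arriving a delay dt after the surface reflection implies a thickness d = c*dt/(2n). A minimal sketch with an illustrative refractive index and delay (not values from the tablets modeled above):

```python
# Coating thickness from terahertz echo delay, assuming normal incidence
# and a known, non-dispersive refractive index (both simplifications).

C = 299_792_458.0  # speed of light in vacuum, m/s

def coating_thickness(delay_s, refractive_index):
    """Thickness implied by a round-trip echo delay through the coating."""
    return C * delay_s / (2.0 * refractive_index)

d_m = coating_thickness(delay_s=1.0e-12, refractive_index=1.5)
d_um = d_m * 1e6  # a 1 ps delay at n = 1.5 is roughly 100 micrometers
```

The FDTD model in the paper goes beyond this by handling dispersion and full 3D propagation, but this relation is the zeroth-order check on any simulated or measured waveform.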
NASA Astrophysics Data System (ADS)
Zepeda-Ruiz, Luis A.; Pelzel, Rodney I.; Nosho, Brett Z.; Weinberg, W. Henry; Maroudas, Dimitrios
2001-09-01
A comprehensive, quantitative analysis is presented of the deformation behavior of coherently strained InAs/GaAs(111)A heteroepitaxial systems. The analysis combines a hierarchical theoretical approach with experimental measurements. Continuum linear elasticity theory is linked with atomic-scale calculations of structural relaxation for detailed theoretical studies of deformation in systems consisting of InAs thin films on thin GaAs(111)A substrates that are mechanically unconstrained at their bases. Molecular-beam epitaxy is used to grow very thin InAs films on both thick and thin GaAs buffer layers on epi-ready GaAs(111)A substrates. The deformation state of these samples is characterized by x-ray diffraction (XRD). The interplanar distances of thin GaAs buffer layers along the [220] and [111] crystallographic directions obtained from the corresponding XRD spectra indicate clearly that thin buffer layers deform parallel to the InAs/GaAs(111)A interfacial plane, thus aiding in the accommodation of the strain induced by lattice mismatch. The experimental measurements are in excellent agreement with the calculated lattice interplanar distances and the corresponding strain fields in the thin mechanically unconstrained substrates considered in the theoretical analysis. Therefore, this work contributes direct evidence in support of our earlier proposal that thin buffer layers in layer-by-layer semiconductor heteroepitaxy exhibit mechanical behavior similar to that of compliant substrates [see, e.g., B. Z. Nosho, L. A. Zepeda-Ruiz, R. I. Pelzel, W. H. Weinberg, and D. Maroudas, Appl. Phys. Lett. 75, 829 (1999)].
Liu, Yan; Song, Yang; Madahar, Vipul; Liao, Jiayu
2012-03-01
Förster resonance energy transfer (FRET) technology has been widely used in biological and biomedical research, and it is a very powerful tool for elucidating protein interactions in either dynamic or steady state. SUMOylation (the process of SUMO [small ubiquitin-like modifier] conjugation to substrates) is an important posttranslational protein modification with critical roles in multiple biological processes. Conjugating SUMO to substrates requires an enzymatic cascade. Sentrin/SUMO-specific proteases (SENPs) act as endopeptidases to process pre-SUMO or as isopeptidases to deconjugate SUMO from its substrate. To fully understand the roles of SENPs in the SUMOylation cycle, it is critical to understand their kinetics. Here, we report a novel development of a quantitative FRET-based protease assay for SENP1 kinetic parameter determination. The assay is based on the quantitative analysis of the FRET signal from the total fluorescent signal at acceptor emission wavelength, which consists of three components: donor (CyPet-SUMO1) emission, acceptor (YPet) emission, and FRET signal during the digestion process. Subsequently, we developed novel theoretical and experimental procedures to determine the kinetic parameters, k(cat), K(M), and catalytic efficiency (k(cat)/K(M)) of the catalytic domain of SENP1 toward pre-SUMO1. Importantly, the general principles of this quantitative FRET-based protease kinetic determination can be applied to other proteases. Copyright © 2011 Elsevier Inc. All rights reserved.
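Once the FRET signal has been decomposed into initial rates at a series of substrate concentrations, extracting k_cat and K_M is a standard Michaelis-Menten fit. A hedged sketch (not the paper's actual procedure) using synthetic rates generated from assumed "true" parameters and recovered with a double-reciprocal (Lineweaver-Burk) linear fit:

```python
# Recover K_M and k_cat from initial rates v(S) via 1/v = (K_M/Vmax)(1/S) + 1/Vmax.
# All concentrations and rates are synthetic illustration values.

def linefit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

KM_true, kcat_true, E0 = 2.0, 5.0, 0.01          # uM, 1/s, uM enzyme
S = [0.5, 1.0, 2.0, 4.0, 8.0]                    # substrate, uM
v = [kcat_true * E0 * s / (KM_true + s) for s in S]

slope, intercept = linefit([1.0 / s for s in S], [1.0 / r for r in v])
Vmax = 1.0 / intercept            # intercept = 1/Vmax
KM = slope * Vmax                 # slope = K_M/Vmax
kcat = Vmax / E0
```

With noise-free synthetic data the fit recovers the input parameters exactly; with real FRET-derived rates a direct nonlinear fit is usually preferred over the double-reciprocal form.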
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
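Because the hybrid proteome samples have a defined quantitative composition, accuracy and precision can be scored per species: accuracy as the bias of measured log2 ratios from the known mixing ratio, precision as their spread. A generic sketch of this idea (a reimplementation in Python, not the LFQbench R package's actual API), with invented intensities:

```python
import math

# Score measured A/B ratios against a known 2:1 spike-in
# (expected log2 ratio = 1.0). All intensities are made up.

def log2_ratios(a, b):
    return [math.log2(x / y) for x, y in zip(a, b)]

expected_log2 = 1.0
sample_a = [20.0, 44.0, 19.0, 41.0]   # protein intensities, condition A
sample_b = [10.0, 20.0, 10.0, 20.0]   # same proteins, condition B

r = log2_ratios(sample_a, sample_b)
mean_r = sum(r) / len(r)
bias = mean_r - expected_log2                                  # accuracy
spread = math.sqrt(sum((x - mean_r) ** 2 for x in r) / len(r))  # precision
```

Ranking tools by such per-species bias and spread is what lets a benchmark separate identification performance from quantification reliability.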
A study on the correlation between the dewetting temperature of Ag film and SERS intensity.
Quan, Jiamin; Zhang, Jie; Qi, Xueqiang; Li, Junying; Wang, Ning; Zhu, Yong
2017-11-07
Thermally dewetted metal nano-islands have been actively investigated as cost-effective SERS-active substrates offering a large area, good reproducibility and repeatability via a simple fabrication process. However, the correlation between the dewetting temperature of the metal film and SERS intensity has not been systematically studied. In this work, taking Ag nano-islands (AgNIs) as an example, we reported a strategy to investigate the correlation between the dewetting temperature of the metal film and SERS intensity. We described the morphology evolution of AgNIs on a planar SiO2 substrate at different temperatures and obtained quantitative information on the surface-limited diffusion process (SLDP) as a function of annealing temperature via classical mean-field nucleation theory. These functions were further used in electromagnetic field simulations to obtain the correlation between the dewetting temperature of the Ag film and the theoretical SERS response. In addition, Raman mapping was performed on samples annealed at different temperatures, with R6G as an analyte, to analyze the correlation between the dewetting temperature of the Ag film and SERS intensity, which is consistent with the theoretical analysis. For the SLDP, we used the morphological characterization of five samples prepared at different annealing temperatures to illustrate the change in SERS intensity with temperature, obtaining a small deviation between the experimental results and the theoretical prediction.
Flory-Stockmayer analysis on reprocessable polymer networks
NASA Astrophysics Data System (ADS)
Li, Lingqiao; Chen, Xi; Jin, Kailong; Torkelson, John
Reprocessable polymer networks can undergo structural rearrangement through dynamic chemistries under the proper conditions, making them a promising candidate for recyclable crosslinked materials, e.g. tires. Research in this field has focused on various chemistries; however, it has lacked an essential physical theory explaining the relationship between the abundance of dynamic linkages and reprocessability. Based on the classical Flory-Stockmayer analysis of network gelation, we developed a similar analysis of reprocessable polymer networks to quantitatively predict the critical condition for reprocessability. Our theory indicates that it is unnecessary for all bonds to be dynamic to make the resulting network reprocessable. As long as there is no percolated permanent network in the system, the material can fully rearrange. To experimentally validate our theory, we used a thiol-epoxy network model system with various dynamic linkage compositions. The stress relaxation behavior of the resulting materials supports our theoretical prediction: only 50% of linkages between crosslinks need to be dynamic for a tri-arm network to be reprocessable. Therefore, this analysis provides the first fundamental theoretical platform for designing and evaluating reprocessable polymer networks. We thank the McCormick Research Catalyst Award Fund and an ISEN cluster fellowship (L. L.) for funding support.
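The percolation argument above has a compact numerical form under ideal Flory-Stockmayer statistics: a permanent sub-network of f-functional crosslinks gels when the permanent-bond fraction exceeds 1/(f-1), so full rearrangement requires the dynamic-linkage fraction to exceed 1 - 1/(f-1). A minimal sketch of that criterion:

```python
# Critical dynamic-linkage fraction for reprocessability, assuming ideal
# Flory-Stockmayer statistics (no loops, equal bond reactivity).

def min_dynamic_fraction(f):
    """Dynamic fraction above which no permanent network percolates."""
    if f < 3:
        raise ValueError("need crosslink functionality f >= 3")
    return 1.0 - 1.0 / (f - 1)

tri_arm = min_dynamic_fraction(3)    # 0.5 -> the 50% figure quoted above
four_arm = min_dynamic_fraction(4)   # 2/3 for tetrafunctional crosslinks
```

Note how the threshold rises with functionality: more highly connected networks need a larger share of dynamic bonds to stay reprocessable.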
Quantitative interpretations of Visible-NIR reflectance spectra of blood.
Serebrennikova, Yulia M; Smith, Jennifer M; Huffman, Debra E; Leparc, German F; García-Rubio, Luis H
2008-10-27
This paper illustrates the implementation of a new theoretical model for rapid quantitative analysis of the Vis-NIR diffuse reflectance spectra of blood cultures. This new model is based on the photon diffusion theory and Mie scattering theory that have been formulated to account for multiple scattering populations and absorptive components. This study stresses the significance of the thorough solution of the scattering and absorption problem in order to accurately resolve for optically relevant parameters of blood culture components. With advantages of being calibration-free and computationally fast, the new model has two basic requirements. First, wavelength-dependent refractive indices of the basic chemical constituents of blood culture components are needed. Second, multi-wavelength measurements or at least the measurements of characteristic wavelengths equal to the degrees of freedom, i.e. number of optically relevant parameters, of blood culture system are required. The blood culture analysis model was tested with a large number of diffuse reflectance spectra of blood culture samples characterized by an extensive range of the relevant parameters.
Statistical image quantification toward optimal scan fusion and change quantification
NASA Astrophysics Data System (ADS)
Potesil, Vaclav; Zhou, Xiang Sean
2007-03-01
Recent advances in imaging technology have brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical errors. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, study their interactions, and introduce a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error; and there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we will achieve a lower variance than naïve averaging. Simulated experiments are used to validate theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
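The optimal linear fusion point above is the classical inverse-variance weighting result: weighting each scan's measurement by the reciprocal of its error variance gives a fused estimate whose variance is never worse than the plain average. A minimal sketch with hypothetical per-scan variances (the better-aligned scan gets the smaller variance, hence the larger weight):

```python
# Inverse-variance fusion of per-scan measurements of the same quantity.
# Values and variances are illustrative, not from the paper's simulations.

def fuse(values, variances):
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * x for w, x in zip(weights, values)) / total
    return estimate, 1.0 / total  # fused value, fused variance

# Scan 1: voxel anisotropy aligned with lesion elongation (low variance).
# Scan 2: poorly aligned (high variance).
est, var_fused = fuse([10.2, 9.4], [0.25, 1.0])
var_naive = (0.25 + 1.0) / 4.0    # variance of the unweighted average
```

Here var_fused (0.2) beats var_naive (0.3125), which is the quantitative content of "a lower variance than naïve averaging".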
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartman, J.S.; Gordon, R.L.; Lessor, D.L.
1980-09-01
The application of reflective Nomarski differential interference contrast microscopy for the determination of quantitative sample topography data is presented. The discussion includes a review of key theoretical results presented previously plus the experimental implementation of the concepts using a commercial Nomarski microscope. The experimental work included the modification and characterization of a commercial microscope to allow its use for obtaining quantitative sample topography data. System usage for the measurement of slopes on flat planar samples is also discussed. The discussion has been designed to provide the theoretical basis, a physical insight, and a cookbook procedure for implementation to allow these results to be of value to both those interested in the microscope theory and its practical usage in the metallography laboratory.
Critical Nucleation Length for Accelerating Frictional Slip
NASA Astrophysics Data System (ADS)
Aldam, Michael; Weikamp, Marc; Spatschek, Robert; Brener, Efim A.; Bouchbinder, Eran
2017-11-01
The spontaneous nucleation of accelerating slip along slowly driven frictional interfaces is central to a broad range of geophysical, physical, and engineering systems, with particularly far-reaching implications for earthquake physics. A common approach to this problem associates nucleation with an instability of an expanding creep patch upon surpassing a critical length Lc. The critical nucleation length Lc is conventionally obtained from a spring-block linear stability analysis extended to interfaces separating elastically deformable bodies using model-dependent fracture mechanics estimates. We propose an alternative approach in which the critical nucleation length is obtained from a related linear stability analysis of homogeneous sliding along interfaces separating elastically deformable bodies. For elastically identical half-spaces and rate-and-state friction, the two approaches are shown to yield Lc that features the same scaling structure, but with substantially different numerical prefactors, resulting in a significantly larger Lc in our approach. The proposed approach is also shown to be naturally applicable to finite-size systems and bimaterial interfaces, for which various analytic results are derived. To quantitatively test the proposed approach, we performed inertial Finite-Element-Method calculations for a finite-size two-dimensional elastically deformable body in rate-and-state frictional contact with a rigid body under sideway loading. We show that the theoretically predicted Lc and its finite-size dependence are in reasonably good quantitative agreement with the full numerical solutions, lending support to the proposed approach. These results offer a theoretical framework for predicting rapid slip nucleation along frictional interfaces.
Ranking and validation of spallation models for isotopic production cross sections of heavy residua
NASA Astrophysics Data System (ADS)
Sharma, Sushil K.; Kamys, Bogusław; Goldenbaum, Frank; Filges, Detlef
2017-07-01
The production cross sections of isotopically identified residual nuclei of spallation reactions induced by 136Xe projectiles at 500A MeV on a hydrogen target were analyzed in a two-step model. The first stage of the reaction was described by the INCL4.6 model of an intranuclear cascade of nucleon-nucleon and pion-nucleon collisions, whereas the second stage was analyzed by means of four different models: ABLA07, GEM2, GEMINI++ and SMM. The quality of the data description was judged quantitatively using two statistical deviation factors: the H-factor and the M-factor. It was found that the present analysis leads to a different ranking of models as compared to that obtained from the qualitative inspection of the data reproduction. The disagreement was caused by sensitivity of the deviation factors to large statistical errors present in some of the data. A new deviation factor, the A-factor, was proposed that is not sensitive to the statistical errors of the cross sections. The quantitative ranking of models performed using the A-factor agreed well with the qualitative analysis of the data. It was concluded that using deviation factors weighted by statistical errors may lead to erroneous conclusions in the case when the data cover a large range of values. The quality of data reproduction by the theoretical models is discussed. Some systematic deviations of the theoretical predictions from the experimental results are observed.
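The sensitivity the paper describes can be illustrated with two toy deviation factors. The H-factor form below (root of the error-weighted mean squared deviation) is a commonly used definition; the "A-like" factor is sketched here simply as an error-independent mean absolute log-ratio, to show why quoted errors can distort an error-weighted ranking. All cross sections and errors are invented numbers, not data from the 136Xe analysis:

```python
import math

def h_factor(calc, exp, err):
    """Error-weighted deviation: sqrt(mean(((calc-exp)/err)^2))."""
    return math.sqrt(sum(((c - e) / s) ** 2
                         for c, e, s in zip(calc, exp, err)) / len(exp))

def log_deviation(calc, exp):
    """Error-independent deviation: mean |log10(calc/exp)|."""
    return sum(abs(math.log10(c / e)) for c, e in zip(calc, exp)) / len(exp)

exp = [10.0, 5.0, 1.0]          # "experimental" cross sections (toy)
calc = [12.0, 4.0, 1.5]         # "model" predictions (toy)
err_small = [0.5, 0.5, 0.1]     # precise data
err_large = [5.0, 5.0, 1.0]     # poorly measured data

# Same predictions, same data: the error-weighted factor changes by an
# order of magnitude with the quoted errors; the log-based one does not.
h_small = h_factor(calc, exp, err_small)
h_large = h_factor(calc, exp, err_large)
a_like = log_deviation(calc, exp)
```

This is the mechanism behind the paper's conclusion: when errors vary wildly across a data set spanning many orders of magnitude, an error-weighted factor can rank models by the quality of the error bars rather than of the physics.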
Shandra, John M; Nobles, Jenna; London, Bruce; Williamson, John B
2004-07-01
This study presents quantitative, sociological models designed to account for cross-national variation in infant mortality rates. We consider variables linked to four different theoretical perspectives: the economic modernization, social modernization, political modernization, and dependency perspectives. The study is based on a panel regression analysis of a sample of 59 developing countries. Our preliminary analysis based on additive models replicates prior studies to the extent that we find that indicators linked to economic and social modernization have beneficial effects on infant mortality. We also find support for hypotheses derived from the dependency perspective suggesting that multinational corporate penetration fosters higher levels of infant mortality. Subsequent analysis incorporating interaction effects suggests that the level of political democracy conditions the effects of dependency relationships based upon exports, investments from multinational corporations, and international lending institutions. Transnational economic linkages associated with exports, multinational corporations, and international lending institutions adversely affect infant mortality more strongly at lower levels of democracy than at higher levels of democracy: intranational, political factors interact with the international, economic forces to affect infant mortality. We conclude with some brief policy recommendations and suggestions for the direction of future research.
Shao, Shiying; Guo, Tiannan; Gross, Vera; Lazarev, Alexander; Koh, Ching Chiek; Gillessen, Silke; Joerger, Markus; Jochum, Wolfram; Aebersold, Ruedi
2016-06-03
The reproducible and efficient extraction of proteins from biopsy samples for quantitative analysis is a critical step in biomarker and translational research. Recently, we described a method consisting of pressure-cycling technology (PCT) and sequential window acquisition of all theoretical fragment ion spectra mass spectrometry (SWATH-MS) for the rapid quantification of thousands of proteins from biopsy-size tissue samples. As an improvement of the method, we have incorporated the PCT-MicroPestle into the PCT-SWATH workflow. The PCT-MicroPestle is a novel, miniaturized, disposable mechanical tissue homogenizer that fits directly into the microTube sample container. We optimized the pressure-cycling conditions for tissue lysis with the PCT-MicroPestle and benchmarked the performance of the system against the conventional PCT-MicroCap method using mouse liver, heart, brain, and human kidney tissues as test samples. The data indicate that the digestion of the PCT-MicroPestle-extracted proteins yielded 20-40% more MS-ready peptide mass from all tissues tested with a comparable reproducibility when compared to the conventional PCT method. Subsequent SWATH-MS analysis identified a higher number of biologically informative proteins from a given sample. In conclusion, we have developed a new device that can be seamlessly integrated into the PCT-SWATH workflow, leading to increased sample throughput and improved reproducibility at both the protein extraction and proteomic analysis levels when applied to the quantitative proteomic analysis of biopsy-level samples.
ERIC Educational Resources Information Center
Petty, John T.
1995-01-01
Describes an experiment that uses air to test Charles' law. Reinforces the student's intuitive feel for Charles' law with quantitative numbers they can see, introduces the idea of extrapolating experimental data to obtain a theoretical value, and gives a physical quantitative meaning to the concept of absolute zero. (JRH)
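The extrapolation that the experiment above teaches can be sketched numerically: fit gas volume against Celsius temperature and extend the line to V = 0 to estimate absolute zero. The data here are ideal synthetic points generated from Charles' law, so the fit recovers -273.15 °C exactly; real student data would scatter around it:

```python
# Extrapolate a V-vs-T line to V = 0 to estimate absolute zero.
# Volumes are ideal synthetic values, not classroom measurements.

def linefit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

temps_c = [0.0, 25.0, 50.0, 75.0, 100.0]
volumes = [100.0 * (1.0 + t / 273.15) for t in temps_c]  # Charles' law

slope, intercept = linefit(temps_c, volumes)
absolute_zero_c = -intercept / slope  # temperature where V extrapolates to 0
print(round(absolute_zero_c, 2))  # -273.15
```

This mirrors the pedagogical point of the experiment: absolute zero appears not as a given constant but as the x-intercept of the students' own data.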
Xiao, Huapan; Chen, Zhi; Wang, Hairong; Wang, Jiuhong; Zhu, Nan
2018-02-19
Based on micro-indentation mechanics and kinematics of grinding processes, theoretical formulas are deduced to calculate surface roughness (SR) and subsurface damage (SSD) depth. The SRs and SSD depths of a series of fused silica samples, which are prepared under different grinding parameters, are measured. By experimental and theoretical analysis, the relationship between SR and SSD depth is discussed. The effect of grinding parameters on SR and SSD depth is investigated quantitatively. The results show that SR and SSD depth decrease with the increase of wheel speed or the decrease of feed speed as well as cutting depth. The interaction effect between wheel speed and feed speed should be emphasized greatly. Furthermore, a relationship model between SSD depth and grinding parameters is established, which could be employed to evaluate SSD depth efficiently.
NASA Astrophysics Data System (ADS)
Wuhrer, R.; Moran, K.
2014-03-01
Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has reduced considerably, with the ability to map minor and trace elements very accurately due to the larger detector area and higher count rate detectors. Live X-ray imaging can now be performed with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps. This includes elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM) and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to easily generate atomic number (Z), absorption (A), fluorescence (F), theoretical back scatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow the user to predict and verify where problems are likely to arise in the images, and are especially helpful for examining possible interface artefacts. The post-processing techniques to improve the quantitation of X-ray map data and the development of post processing techniques for improved characterisation are covered in this paper.
The Causal Effects of Cultural Relevance: Evidence from an Ethnic Studies Curriculum
ERIC Educational Resources Information Center
Dee, Thomas S.; Penner, Emily K.
2017-01-01
An extensive theoretical and qualitative literature stresses the promise of instructional practices and content aligned with minority students' experiences. Ethnic studies courses provide an example of such "culturally relevant pedagogy" (CRP). Despite theoretical support, quantitative evidence on the effectiveness of these courses is…
Tian, Tongde; Chen, Chuanliang; Yang, Feng; Tang, Jingwen; Pei, Junwen; Shi, Bian; Zhang, Ning; Zhang, Jianhua
2017-03-01
This paper aimed to screen genetic markers applicable to the early diagnosis of colorectal cancer, establish an apoptotic regulatory network model for colorectal cancer, and analyze the current state of traditional Chinese medicine (TCM) targets, thereby providing theoretical evidence for early diagnosis and targeted therapy of colorectal cancer. Taking databases including CNKI, VIP, Wanfang Data, PubMed, and MEDLINE as the main sources of literature retrieval, literature on genetic markers applied to the early diagnosis of colorectal cancer was collected and subjected to comprehensive quantitative meta-analysis, thereby screening genetic markers for use in the early diagnosis of colorectal cancer. KEGG analysis was employed to establish an apoptotic regulatory network model based on the screened genetic markers, and optimization was conducted on TCM targets. Through meta-analysis, seven genetic markers were screened out, including WWOX, K-ras, COX-2, P53, APC, DCC and PTEN, among which DCC has the highest diagnostic efficiency. The apoptotic regulatory network was built by KEGG analysis. TCM has been reported to have a regulatory function on gene loci in this apoptotic regulatory network. The apoptotic regulatory model of colorectal cancer established in this study provides theoretical evidence for the early clinical diagnosis and TCM-targeted therapy of colorectal cancer.
Quantitative pathology in virtual microscopy: history, applications, perspectives.
Kayser, Gian; Kayser, Klaus
2013-07-01
With the emerging success of commercially available personal computers and the rapid progress in the development of information technologies, morphometric analyses of static histological images have been introduced to improve our understanding of the biology of diseases such as cancer. The first applications were quantifications of immunohistochemical expression patterns. In addition to object counting and feature extraction, laws of thermodynamics have been applied in morphometric calculations termed syntactic structure analysis. Here, one has to consider that the information of an image can be calculated for separate hierarchical layers such as single pixels, clusters of pixels, segmented small objects, clusters of small objects, and objects of higher order composed of several small objects. Using syntactic structure analysis in histological images, functional states can be extracted and the efficiency of labor in tissues can be quantified. Image standardization procedures, such as shading correction and color normalization, can overcome artifacts blurring clear thresholds. Morphometric techniques are not only useful for learning more about the biological features of growth patterns; they can also be helpful in routine diagnostic pathology. In such cases, entropy calculations are applied in analogy to theoretical considerations concerning information content. Thus, regions with high information content can be highlighted automatically. Analysis of these "regions of high diagnostic value", in the context of clinical information, site of involvement, and patient data (e.g. age, sex), can support histopathological differential diagnoses. It can be expected that quantitative virtual microscopy will open new possibilities for automated histological support. Automated integrated quantification of histological slides also serves quality assurance.
The development and theoretical background of morphometric analyses in histopathology are reviewed, as well as their application and potential future implementation in virtual microscopy. Copyright © 2012 Elsevier GmbH. All rights reserved.
The aluminum ordering in aluminosilicates: a dipolar 27Al NMR spectroscopy study.
Gee, Becky A
2004-01-01
The spatial ordering of aluminum atoms in CsAl(SiO3)2 and 3Al2O3·2SiO2 was probed by 27Al dipolar solid-state NMR spectroscopy. The 27Al response to a Hahn spin-echo pulse sequence in a series of aluminum-containing model crystalline compounds demonstrates that quantitative 27Al homonuclear dipolar second moments can be obtained to within ±20% of the theoretical values, if evaluation of the spin-echo response curve is limited to short evolution periods (2t1 ≤ 0.10 ms). Additionally, selective excitation of the central transition m = 1/2 → -1/2 is necessary to ensure quantitative results. Restriction of spin exchange affecting the dephasing of the magnetization may decelerate the spin-echo decay at longer evolution periods. Considering these constraints, the method was used to probe the spatial distribution of aluminum atoms among the tetrahedral sites in two aluminosilicate materials. Experimental 27Al spin-echo response data for the aluminosilicates CsAl(SiO3)2 (synthetic pollucite) and 3Al2O3·2SiO2 (mullite) are compared with theoretical data based on (I) various degrees of aluminum-oxygen-aluminum bond formation among tetrahedrally coordinated aluminum atoms (Al(Td)-O-Al(Td)) and (II) the maximum avoidance of Al(Td)-O-Al(Td) bonding. Analysis of the second-moment values and resulting echo-decay responses suggests that partial suppression of spin exchange among aluminum atoms in crystallographically distinct sites may contribute to the 27Al spin-echo decay in 3Al2O3·2SiO2, thus complicating quantitative analysis of the data. Silicon-29 and aluminum-27 magic-angle spinning (MAS) NMR spectra of 3Al2O3·2SiO2 are consistent with those previously reported. The experimental 27Al spin-echo response behavior of CsAl(SiO3)2 differs from the theoretical response behavior based on the maximum avoidance of Al-O-Al bonding between tetrahedral aluminum sites in CsAl(SiO3)2.
A single unresolved resonance is observed in both the silicon-29 and aluminum-27 MAS spectra of CsAl(SiO3)2. Copyright 2003 John Wiley & Sons, Ltd.
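The short-evolution-period analysis described above (fitting only the initial, Gaussian portion of the Hahn spin-echo decay to extract a dipolar second moment) can be sketched numerically. The M2 value, time grid, and noise level below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic echo decay with an assumed second moment M2 (rad^2 / ms^2);
# at short evolution periods the decay is Gaussian: E(2t1) = exp(-M2 (2t1)^2 / 2).
M2_true = 50.0
t = np.linspace(0.01, 0.10, 10)                 # evolution period 2t1 in ms
echo = np.exp(-M2_true * t**2 / 2)
echo_noisy = echo * (1 + 0.01 * rng.standard_normal(t.size))  # 1% noise

# ln E = -(M2/2) t^2 is linear in t^2, so a least-squares slope recovers M2.
slope = np.polyfit(t**2, np.log(echo_noisy), 1)[0]
M2_fit = -2 * slope
print(M2_fit)
```

Restricting the fit to short times, as the paper does, is what keeps the Gaussian approximation (and hence the extracted second moment) valid.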
Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. 
Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists.
Wang, Z C; Zhong, X Y; Jin, L; Chen, X F; Moritomo, Y; Mayer, J
2017-05-01
Electron energy-loss magnetic chiral dichroism (EMCD) spectroscopy, which is similar to the well-established X-ray magnetic circular dichroism spectroscopy (XMCD), can determine the quantitative magnetic parameters of materials with high spatial resolution. One of the major obstacles in quantitative analysis using the EMCD technique is the relatively poor signal-to-noise ratio (SNR), compared to XMCD. Here, in the example of a double perovskite Sr2FeMoO6, we predicted the optimal dynamical diffraction conditions such as sample thickness, crystallographic orientation and detection aperture position by theoretical simulations. By using the optimized conditions, we showed that the SNR of experimental EMCD spectra can be significantly improved and the error of quantitative magnetic parameters determined by the EMCD technique can be remarkably lowered. Our results demonstrate that, with enhanced SNR, the EMCD technique can be a unique tool to understand the structure-property relationship of magnetic materials, particularly in high-density magnetic recording and spintronic devices, by quantitatively determining magnetic structure and properties at the nanometer scale. Copyright © 2017 Elsevier B.V. All rights reserved.
Linking agent-based models and stochastic models of financial markets
Feng, Ling; Li, Baowen; Podobnik, Boris; Preis, Tobias; Stanley, H. Eugene
2012-01-01
It is well-known that financial asset returns exhibit fat-tailed distributions and long-term memory. These empirical features are the main objectives of modeling efforts using (i) stochastic processes to quantitatively reproduce these features and (ii) agent-based simulations to understand the underlying microscopic interactions. After reviewing selected empirical and theoretical evidence documenting the behavior of traders, we construct an agent-based model to quantitatively demonstrate that “fat” tails in return distributions arise when traders share similar technical trading strategies and decisions. Extending our behavioral model to a stochastic model, we derive and explain a set of quantitative scaling relations of long-term memory from the empirical behavior of individual market participants. Our analysis provides a behavioral interpretation of the long-term memory of absolute and squared price returns: They are directly linked to the way investors evaluate their investments by applying technical strategies at different investment horizons, and this quantitative relationship is in agreement with empirical findings. Our approach provides a possible behavioral explanation for stochastic models for financial systems in general and provides a method to parameterize such models from market data rather than from statistical fitting. PMID:22586086
Quantitative two-dimensional HSQC experiment for high magnetic field NMR spectrometers
NASA Astrophysics Data System (ADS)
Koskela, Harri; Heikkilä, Outi; Kilpeläinen, Ilkka; Heikkinen, Sami
2010-01-01
The finite RF power available on the carbon channel in proton-carbon correlation experiments leads to a non-uniform cross-peak intensity response across the carbon chemical-shift range. Several classes of broadband pulses are available that alleviate this problem. Adiabatic pulses provide excellent magnetization inversion over a large bandwidth, and, very recently, novel phase-modulated pulses have been proposed that perform 90° and 180° magnetization rotations with good offset tolerance. Here, we present a study of how these broadband pulses (adiabatic and phase-modulated) can improve the quantitative application of the heteronuclear single quantum coherence (HSQC) experiment on high-field NMR spectrometers. Theoretical and experimental examinations of the quantitative, offset-compensated, CPMG-adjusted HSQC (Q-OCCAHSQC) experiment are presented. The proposed experiment offers a considerable improvement in offset performance; the 13C offset-dependent standard deviation of the peak intensity was below 6% over a range of ±20 kHz. This covers a carbon chemical-shift range of 150 ppm, which contains the protonated carbons excluding aldehydes, for 22.3 T NMR magnets. A demonstration of the quantitative analysis of a fasting blood plasma sample obtained from a healthy volunteer is given.
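A quick back-of-envelope check of the quoted offset numbers, assuming the standard 13C gyromagnetic ratio:

```python
GAMMA_13C_MHZ_PER_T = 10.7084   # 13C gyromagnetic ratio / 2*pi, MHz/T
B0 = 22.3                       # magnetic field strength, tesla

f_13C = GAMMA_13C_MHZ_PER_T * B0          # 13C Larmor frequency, MHz
shift_range_ppm = 150                     # protonated carbons excluding aldehydes
range_khz = f_13C * shift_range_ppm / 1000  # 1 ppm corresponds to f_13C Hz

print(f_13C, range_khz)
```

A 150 ppm shift range at 22.3 T spans roughly 36 kHz, i.e. about ±18 kHz around the carrier, which indeed fits inside the reported ±20 kHz compensation window.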
Analysis of high-order languages for use on space station application software
NASA Technical Reports Server (NTRS)
Knoebel, A.
1986-01-01
This study considers the general and not easily resolved problem of how to choose the right programming language for a particular task. This is specialized to the question of which versions of which languages should be chosen for the multitude of tasks that the Marshall Space Flight Center will be responsible for in the Space Station. Four criteria are presented: theoretical considerations, quantitative matrices, qualitative benchmarks, and the monitoring of programmers. Specific recommendations for future studies are given to resolve these questions for the Space Station.
Common Lognormal Behavior in Legal Systems
NASA Astrophysics Data System (ADS)
Yamamoto, Ken
2017-07-01
This study characterizes a statistical property of legal systems: the distribution of the number of articles in a law follows a lognormal distribution. This property is common to Japanese, German, and Singaporean law. To explain this lognormal behavior, the tree structure of laws is analyzed. If the depth of a tree follows a normal distribution, the lognormal distribution of the number of articles can be derived theoretically. We analyze the structure of Japanese laws using chapters, sections, and other levels of organization, and this analysis demonstrates that the proposed model is quantitatively reasonable.
Direct and ultrasonic measurements of macroscopic piezoelectricity in sintered hydroxyapatite
NASA Astrophysics Data System (ADS)
Tofail, S. A. M.; Haverty, D.; Cox, F.; Erhart, J.; Hána, P.; Ryzhenko, V.
2009-03-01
Macroscopic piezoelectricity in hydroxyapatite (HA) ceramic was measured by a direct quasistatic method and an ultrasonic interference technique. The effective symmetry of the polycrystalline aggregate was established, and a detailed theoretical analysis was carried out to determine the shear piezoelectric coefficient, d14, of HA by these two methods. The piezoelectric nature of HA was demonstrated qualitatively, although a specific quantitative value for the d14 coefficient could not be established. The ultrasonic method was also employed to measure anisotropic elastic constants, which agreed well with those obtained from first principles.
NASA Astrophysics Data System (ADS)
Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg
2014-06-01
A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.
ERIC Educational Resources Information Center
LoPresto, Michael C.
2014-01-01
What follows is a description of a theoretical model designed to calculate the playing frequencies of the musical pitches produced by a trombone. The model is based on quantitative treatments that demonstrate the effects of the flaring bell and cup-shaped mouthpiece sections on these frequencies and can be used to calculate frequencies that…
Establishment of apoptotic regulatory network for genetic markers of colorectal cancer.
Hao, Yibin; Shan, Guoyong; Nan, Kejun
2017-03-01
Our purpose is to screen out genetic markers applicable to early diagnosis of colorectal cancer and to establish an apoptotic regulatory network model for colorectal cancer, thereby providing theoretical evidence for early diagnosis and targeted therapy of colorectal cancer. Taking databases including CNKI, VIP, Wanfang Data, PubMed, and MEDLINE as the main sources of literature retrieval, literature on genetic markers applied to early diagnosis of colorectal cancer was searched and subjected to comprehensive quantitative meta-analysis, hence screening genetic markers used in early diagnosis of colorectal cancer. Gene Ontology (GO) analysis and Kyoto Encyclopedia of Genes and Genomes (KEGG) analysis were employed to establish an apoptotic regulatory network model based on the screened genetic markers, and a verification experiment was then conducted. Through meta-analysis, seven genetic markers were screened out: WWOX, K-ras, COX-2, p53, APC, DCC and PTEN, among which DCC shows the highest diagnostic efficiency. GO analysis of the genetic markers found that six of them played roles in biological process, molecular function and cellular component. The apoptotic regulatory network built by KEGG analysis and the verification experiment indicated that WWOX could promote tumor cell apoptosis in colorectal cancer and elevate the expression level of p53. The apoptotic regulatory model of colorectal cancer established in this study provides clinically relevant theoretical evidence for early diagnosis and targeted therapy of colorectal cancer.
NASA Astrophysics Data System (ADS)
Lugez, C. L.; Lovas, F. J.; Hougen, J. T.; Ohashi, N.
1999-03-01
Spectral data on K = 0 and 1 levels of the methanol dimer available from previous and present Fourier transform microwave measurements have been interpreted globally, using a group-theoretically derived effective Hamiltonian and corresponding tunneling matrix elements to describe the splittings arising from a large number of tunneling motions. In the present work, 302 new measurements (40 K = 1-1 and 262 K = 1-0 transitions) were added to the previous data set to give a total of 584 assigned transitions with J ≤ 6. As a result of the rather complete K = 0, 1 data set for J ≤ 4, the lone-pair exchange tunneling splittings were obtained experimentally. Matrix element expansions in J(J + 1) used in the previous K = 0 formalism were modified to apply to K > 0, essentially by making a number of real coefficients complex, as required by the generalized internal-axis-method tunneling formalism. To reduce the number of adjustable parameters to an acceptable level in both the K = 0 and K = 1 effective Hamiltonians (used in separate K = 0 and K = 1 least-squares fits), a rather large number of assumptions concerning probably negligible parameters had to be made. The present fitting results should thus be considered as providing assurance of the group-theoretical line assignments as well as a nearly quantitative global interpretation of the tunneling splittings, even though they do not yet unambiguously determine the relative contributions from all 25 group-theoretically inequivalent tunneling motions in this complex, nor do they permit quantitative extrapolation to higher K levels.
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, in which two key questions must be faced: one is how to see different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning work and to find a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework of land-use planning based on the integration of the scenario analysis (SA) method and multiagent system (MAS) simulation and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, emphasizing a quantitative process, effectively compensated for the SA method, emphasizing a qualitative process, which realized the organic combination of qualitative and quantitative land-use planning work and thus provided a new idea and method for land-use planning and the sustainable management of land resources.
Quantitative dual-probe microdialysis: mathematical model and analysis.
Chen, Kevin C; Höistad, Malin; Kehr, Jan; Fuxe, Kjell; Nicholson, Charles
2002-04-01
Steady-state microdialysis is a widely used technique to monitor the concentration changes and distributions of substances in tissues. To obtain more information about brain tissue properties from microdialysis, a dual-probe approach was applied to infuse and sample the radiotracer, [3H]mannitol, simultaneously both in agar gel and in the rat striatum. Because the molecules released by one probe and collected by the other must diffuse through the interstitial space, the concentration profile exhibits dynamic behavior that permits the assessment of the diffusion characteristics in the brain extracellular space and the clearance characteristics. In this paper a mathematical model for dual-probe microdialysis was developed to study brain interstitial diffusion and clearance processes. Theoretical expressions for the spatial distribution of the infused tracer in the brain extracellular space and the temporal concentration at the probe outlet were derived. A fitting program was developed using the simplex algorithm, which finds local minima of the standard deviations between experiments and theory by adjusting the relevant parameters. The theoretical curves accurately fitted the experimental data and generated realistic diffusion parameters, implying that the mathematical model is capable of predicting the interstitial diffusion behavior of [3H]mannitol and that it will be a valuable quantitative tool in dual-probe microdialysis.
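For a point source with first-order clearance, the steady-state concentration field has the standard closed form C(r) = Q/(4*pi*D*r) * exp(-r*sqrt(k/D)). The paper's dual-probe model is more detailed (finite probes, transient behavior), and the parameter values below are only illustrative:

```python
import math

def point_source_conc(r, Q=1.0, D=7.5e-6, k=1e-4):
    """Steady-state concentration at distance r (cm) from a point source
    releasing Q (arbitrary units/s) into a medium with effective diffusion
    coefficient D (cm^2/s) and first-order clearance rate k (1/s).
    Standard result: C(r) = Q / (4*pi*D*r) * exp(-r * sqrt(k / D))."""
    return Q / (4 * math.pi * D * r) * math.exp(-r * math.sqrt(k / D))

print(point_source_conc(0.05), point_source_conc(0.1))
```

Without clearance (k = 0) this reduces to the familiar 1/r diffusion profile, so the concentration ratio between two sampling distances depends only on the distances; clearance steepens the fall-off, which is what makes dual-probe data sensitive to both D and k.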
Lerner, Eitan; Ploetz, Evelyn; Hohlbein, Johannes; Cordes, Thorben; Weiss, Shimon
2016-07-07
Single-molecule, protein-induced fluorescence enhancement (PIFE) serves as a molecular ruler at molecular distances inaccessible to other spectroscopic rulers such as Förster-type resonance energy transfer (FRET) or photoinduced electron transfer. In order to provide two simultaneous measurements of two distances on different molecular length scales for the analysis of macromolecular complexes, we and others recently combined measurements of PIFE and FRET (PIFE-FRET) on the single molecule level. PIFE relies on steric hindrance of the fluorophore Cy3, which is covalently attached to a biomolecule of interest, to rotate out of an excited-state trans isomer to the cis isomer through a 90° intermediate. In this work, we provide a theoretical framework that accounts for relevant photophysical and kinetic parameters of PIFE-FRET, show how this framework allows the extraction of the fold-decrease in isomerization mobility from experimental data, and show how these results provide information on changes in the accessible volume of Cy3. The utility of this model is then demonstrated for experimental results on PIFE-FRET measurement of different protein-DNA interactions. The proposed model and extracted parameters could serve as a benchmark to allow quantitative comparison of PIFE effects in different biological systems.
Validation of PCR methods for quantitation of genetically modified plants in food.
Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P
2001-01-01
For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific, real-time PCR was experimentally determined to be 30-50 target molecules, which is close to the theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sampling plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples relative to reference material (calibrants), high priority must be given to international agreements on, and standardization of, certified reference materials.
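The reported dependence of the relative LOQ on genome size follows from simple copy-number arithmetic: 200 ng of template contains fewer haploid genome copies for a large genome, so the same absolute LOQ of 30-50 target molecules corresponds to a larger percentage. A back-of-envelope sketch (the 1C genome masses for rice and wheat below are assumed round numbers, not the paper's figures):

```python
def relative_loq_percent(genome_pg, loq_copies=50, dna_ng=200.0):
    """Relative LOQ (% GMO) given a haploid (1C) genome mass in picograms,
    an absolute LOQ in target copies, and the amount of template DNA.
    Illustrative back-of-envelope calculation, not the validated protocol."""
    total_copies = dna_ng * 1000.0 / genome_pg   # 1 ng = 1000 pg
    return 100.0 * loq_copies / total_copies

print(relative_loq_percent(0.5))    # assumed rice genome, ~0.5 pg
print(relative_loq_percent(17.0))   # assumed hexaploid wheat genome, ~17 pg
```

The results land in the right order of magnitude: hundredths of a percent for rice versus a few tenths of a percent for wheat, consistent with the 0.02% and 0.7% figures quoted above.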
Dual-Enrollment High-School Graduates' College-Enrollment Considerations
ERIC Educational Resources Information Center
Damrow, Roberta J.
2017-01-01
This quantitative study examined college enrollment considerations of dual-enrollment students enrolling at one Wisconsin credit-granting technical college. A combined college-choice theoretical framework guided this quantitative study that addressed two research questions: To what extent, if any, did the number of dual credits predict likelihood…
ERIC Educational Resources Information Center
Tan, Cheng Yong
2017-01-01
The present study reviewed quantitative empirical studies examining the relationship between cultural capital and student achievement. Results showed that researchers had conceptualized and measured cultural capital in different ways. It is argued that the more holistic understanding of the construct beyond highbrow cultural consumption must be…
Cycling Empirical Antibiotic Therapy in Hospitals: Meta-Analysis and Models
Abel, Sören; Viechtbauer, Wolfgang; Bonhoeffer, Sebastian
2014-01-01
The rise of resistance together with the shortage of new broad-spectrum antibiotics underlines the urgency of optimizing the use of available drugs to minimize disease burden. Theoretical studies suggest that coordinating empirical usage of antibiotics in a hospital ward can contain the spread of resistance. However, theoretical and clinical studies have come to different conclusions regarding the usefulness of rotating first-line therapy (cycling). Here, we performed a quantitative pathogen-specific meta-analysis of clinical studies comparing cycling to standard practice. We searched PubMed and Google Scholar and identified 46 clinical studies addressing the effect of cycling on nosocomial infections, of which 11 met our selection criteria. We employed a method for multivariate meta-analysis using incidence rates as endpoints and found that cycling reduced the incidence rate per 1000 patient-days of both total infections, by 4.95 [9.43–0.48], and resistant infections, by 7.2 [14.00–0.44]. This positive effect was observed in most pathogens despite a large variance between individual species. Our findings remain robust in uni- and multivariate meta-regressions. We used theoretical models that reflect various infections and hospital settings to compare cycling to random assignment to different drugs (mixing). We make the realistic assumption that therapy is changed when first-line treatment is ineffective, which we call "adjustable cycling/mixing". In concordance with earlier theoretical studies, we find that in strict regimens, cycling is detrimental. However, in adjustable regimens single resistance is suppressed and cycling is successful in most settings. Both a meta-regression and our theoretical model indicate that "adjustable cycling" is especially useful to suppress the emergence of multiple resistance. While our model predicts that cycling periods of one month perform well, we expect that overly long cycling periods are detrimental.
Our results suggest that “adjustable cycling” suppresses multiple resistance and warrants further investigations that allow comparing various diseases and hospital settings. PMID:24968123
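A univariate simplification of the incidence-rate pooling used in such meta-analyses can be sketched with the classic DerSimonian-Laird random-effects estimator. The study effects and variances below are invented for illustration, not the 11 studies analyzed in the paper:

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate (DerSimonian-Laird) of per-study effect
    sizes, e.g. incidence-rate differences per 1000 patient-days."""
    effects = np.asarray(effects, float)
    v = np.asarray(variances, float)
    w = 1.0 / v                                   # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)        # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se

# Invented example: four studies, each reporting the change in infection
# incidence per 1000 patient-days under cycling, with sampling variances.
pooled, se = dersimonian_laird([-6.1, -3.2, -8.0, -4.4], [4.0, 2.5, 6.0, 3.0])
print(pooled, se)
```

The pooled estimate is a variance-weighted average that always lies within the range of the individual study effects; the between-study variance tau2 widens the standard error when studies disagree.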
NASA Astrophysics Data System (ADS)
Alver, Özgür; Kaya, Mehmet Fatih; Dikmen, Gökhan
2015-12-01
Structural elucidation of 3-(acrylamido)phenylboronic acid (C9H10BNO3) was carried out with 1H, 13C and HETCOR NMR techniques. Solvent effects on nuclear magnetic shielding tensors were examined with deuterated dimethyl sulfoxide, acetone, methanol and water as solvents. The correct order of appearance of carbon and hydrogen atoms on the NMR scale, from the highest magnetic field region to the lowest, was investigated using different types of theoretical levels, and the details of the levels are presented in this study. Stable structural conformers and vibrational band analysis of the title molecule (C9H10BNO3) were studied from both experimental and theoretical viewpoints using FT-IR and Raman spectroscopic methods and density functional theory (DFT). FT-IR and Raman spectra were obtained in the regions of 4000-400 cm-1 and 3700-10 cm-1, respectively. The Becke-3-Lee-Yang-Parr (B3LYP) hybrid density functional theory method with the 6-31++G(d,p) basis set was used in the search for optimized structures and vibrational wavenumbers. Experimental and theoretical results show that, after application of a suitable scaling factor, the density functional B3LYP method gives acceptable results for predicting vibrational wavenumbers, except for the OH and NH stretching modes; this most likely arises from increasing anharmonicity in the high-wavenumber region and from possible intra- and intermolecular interactions at the OH edges, which are not fully taken into account in the theoretical treatment. To make the vibrational assignments more quantitative, potential energy distribution (PED) values were calculated using the VEDA 4 (Vibrational Energy Distribution Analysis) program.
NASA Technical Reports Server (NTRS)
Hasselman, D. P. H.; Singh, J. P.; Satyamurthy, K.
1980-01-01
An analysis was conducted of the possible modes of thermal stress failure of brittle ceramics for potential use in point-focussing solar receivers. The pertinent materials properties which control thermal stress resistance were identified for conditions of steady-state and transient heat flow, convective and radiative heat transfer, thermal buckling and thermal fatigue as well as catastrophic crack propagation. Selection rules for materials with optimum thermal stress resistance for a particular thermal environment were identified. Recommendations for materials for particular components were made. The general requirements for a thermal shock testing program quantitatively meaningful for point-focussing solar receivers were outlined. Recommendations for follow-on theoretical analyses were made.
Dynamic Forces Between Two Deformable Oil Droplets in Water
NASA Astrophysics Data System (ADS)
Dagastine, Raymond R.; Manica, Rogério; Carnie, Steven L.; Chan, D. Y. C.; Stevens, Geoffrey W.; Grieser, Franz
2006-07-01
The understanding of static interactions in colloidal suspensions is well established, whereas dynamic interactions more relevant to biological and other suspended soft-matter systems are less well understood. We present the direct force measurement and quantitative theoretical description for dynamic forces for liquid droplets in another immiscible fluid. Analysis of this system demonstrates the strong link between interfacial deformation, static surface forces, and hydrodynamic drainage, which govern dynamic droplet-droplet interactions over the length scale of nanometers and over the time scales of Brownian collisions. The results and analysis have direct bearing on the control and manipulation of suspended droplets in soft-matter systems ranging from the emulsions in shampoo to cellular interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zacharia, T.; David, S.A.; Vitek, J.M.
1989-12-01
A computational and experimental study was carried out to quantitatively understand the influence of the heat flow and the fluid flow in the transient development of the weld pool during gas tungsten arc (GTA) and laser beam welding of Type 304 stainless steel. Stationary gas tungsten arc and laser beam welds were made on two heats of Type 304 austenitic stainless steels containing 90 ppm sulfur and 240 ppm sulfur. A transient heat transfer model was utilized to simulate the heat flow and fluid flow in the weld pool. In this paper, the results of the heat flow and fluid flow analysis are presented.
Wide-Field Imaging of Single-Nanoparticle Extinction with Sub-nm² Sensitivity
NASA Astrophysics Data System (ADS)
Payne, Lukas M.; Langbein, Wolfgang; Borri, Paola
2018-03-01
We report on a highly sensitive wide-field imaging technique for quantitative measurement of the optical extinction cross section σext of single nanoparticles. The technique is simple and high speed, and it enables the simultaneous acquisition of hundreds of nanoparticles for statistical analysis. Using rapid referencing, fast acquisition, and a deconvolution analysis, a shot-noise-limited sensitivity down to 0.4 nm² is achieved. Measurements on a set of individual gold nanoparticles of 5 nm diameter using this method yield σext = (10.0 ± 3.1) nm², which is consistent with theoretical expectations and well above the background fluctuations of 0.9 nm².
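In the simplest picture, the extinction cross section follows from a background-referenced transmission image by integrating the fractional extinction over the image area; the sketch below assumes this simple picture (the paper's rapid-referencing and deconvolution steps are omitted), with a made-up 3×3 image and pixel area:

```python
# sigma_ext = sum over pixels of (1 - T) * pixel_area, where T is the
# locally referenced transmission. Image values and pixel size are
# hypothetical, not the paper's data.
def extinction_cross_section(transmission, pixel_area_nm2):
    return sum((1.0 - t) * pixel_area_nm2
               for row in transmission for t in row)

# a small particle imaged over a 3x3 patch (made-up numbers)
image = [
    [1.00, 0.99, 1.00],
    [0.99, 0.92, 0.99],
    [1.00, 0.99, 1.00],
]
sigma = extinction_cross_section(image, pixel_area_nm2=100.0)
```

With these numbers the summed fractional extinction is 0.12, giving a cross section of 12 nm², comparable in scale to the values quoted in the abstract.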
Rayleigh-Taylor mixing in supernova experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swisher, N. C.; Abarzhi, S. I., E-mail: snezhana.abarzhi@gmail.com; Kuranz, C. C.
We report a scrupulous analysis of data in supernova experiments that are conducted at high-power laser facilities in order to study the core-collapse supernova SN1987A. Parameters of the experimental system are properly scaled to investigate the interaction of a blast wave with a helium-hydrogen interface, and the induced Rayleigh-Taylor instability and Rayleigh-Taylor mixing of the denser and lighter fluids with time-dependent acceleration. We analyze all available experimental images of the Rayleigh-Taylor flow in supernova experiments and measure delicate features of the interfacial dynamics. A new scaling is identified for calibration of experimental data to enable their accurate analysis and comparisons. By properly accounting for the imprint of the experimental conditions, the data set size and statistics are substantially increased. New theoretical solutions are reported to describe asymptotic dynamics of Rayleigh-Taylor flow with time-dependent acceleration by applying theoretical analysis that considers symmetries and momentum transport. Good qualitative and quantitative agreement is achieved between the experimental data, the theory, and simulations. Our study indicates that in supernova experiments Rayleigh-Taylor flow is in the mixing regime, the interface amplitude contributes substantially to the characteristic length scale for energy dissipation, and Rayleigh-Taylor mixing keeps order.
Interface Pattern Selection in Directional Solidification
NASA Technical Reports Server (NTRS)
Trivedi, Rohit; Tewari, Surendra N.
2001-01-01
The central focus of this research is to establish key scientific concepts that govern the selection of cellular and dendritic patterns during the directional solidification of alloys. Ground-based studies have established that the conditions under which cellular and dendritic microstructures form are precisely those where convection effects are dominant in bulk samples. Thus, experimental data cannot be obtained terrestrially under a purely diffusive regime. Furthermore, no reliable theoretical models yet exist that can quantitatively incorporate fluid flow in the pattern selection criterion. Consequently, microgravity experiments on cellular and dendritic growth are designed to obtain benchmark data under diffusive growth conditions that can be quantitatively analyzed and compared with a rigorous theoretical model to establish the fundamental principles that govern the selection of a specific microstructure and its length scales. In the cellular structure, different cells in an array are strongly coupled, so that cellular pattern evolution is controlled by complex interactions between thermal diffusion, solute diffusion and interface effects. These interactions admit an infinity of solutions, and the system selects only a narrow band of them. The aim of this investigation is to obtain benchmark data and develop a rigorous theoretical model that will allow us to quantitatively establish the physics of this selection process.
Quantitative confirmation of diffusion-limited oxidation theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillen, K.T.; Clough, R.L.
1990-01-01
Diffusion-limited (heterogeneous) oxidation effects are often important for studies of polymer degradation. Such effects are common in polymers subjected to ionizing radiation at relatively high dose rates. To better understand the underlying oxidation processes and to aid in the planning of accelerated aging studies, it would be desirable to be able to monitor and quantitatively understand these effects. In this paper, we briefly review a theoretical diffusion approach which derives model profiles for oxygen-surrounded sheets of material by combining oxygen permeation rates with kinetically based oxygen consumption expressions. The theory leads to a simple governing expression involving the oxygen consumption and permeation rates together with two model parameters α and β. To test the theory, gamma-initiated oxidation of a sheet of commercially formulated EPDM rubber was performed under conditions which led to diffusion-limited oxidation. Profile shapes from the theoretical treatments are shown to accurately fit experimentally derived oxidation profiles. In addition, direct measurements on the same EPDM material of the oxygen consumption and permeation rates, together with values of α and β derived from the fitting procedure, allow us to quantitatively confirm for the first time the governing theoretical relationship. 17 refs., 3 figs.
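A minimal numerical version of such a diffusion-limited profile can be obtained by assuming a reaction-diffusion balance of the form d²C/dX² = αC/(1 + βC) on a normalized sheet with air-saturated surfaces; this specific functional form and the parameter values are our assumption for illustration, not taken from the paper:

```python
# Finite-difference Gauss-Seidel relaxation for the two-parameter
# reaction-diffusion balance d^2C/dX^2 = alpha*C/(1 + beta*C) on
# X in [0, 1], with surfaces held at the saturated value C = 1.
# A large alpha (fast consumption relative to permeation) produces
# the classic oxygen-starved interior of diffusion-limited oxidation.
def oxidation_profile(alpha, beta, n=101, sweeps=20000):
    h = 1.0 / (n - 1)
    c = [1.0] * n
    for _ in range(sweeps):
        for i in range(1, n - 1):
            c[i] = 0.5 * (c[i - 1] + c[i + 1]
                          - h * h * alpha * c[i] / (1.0 + beta * c[i]))
    return c

profile = oxidation_profile(alpha=50.0, beta=1.0)
```

The resulting profile is symmetric about the sheet center and drops well below the surface value in the interior, mimicking the heterogeneous oxidation profiles discussed in the abstract.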
Nakao, Takashi; Ohira, Hideki; Northoff, Georg
2012-01-01
Most experimental studies of decision-making have specifically examined situations in which a single less-predictable correct answer exists (externally guided decision-making under uncertainty). Along with such externally guided decision-making, there are instances of decision-making in which no correct answer based on external circumstances is available for the subject (internally guided decision-making). Such decisions are usually made in the context of moral decision-making as well as in preference judgment, where the answer depends on the subject’s own, i.e., internal, preferences rather than on external, i.e., circumstantial, criteria. The neuronal and psychological mechanisms that allow guidance of decisions based on more internally oriented criteria in the absence of external ones remain unclear. This study was undertaken to compare decision-making of these two kinds empirically and theoretically. First, we reviewed studies of decision-making to clarify experimental–operational differences between externally guided and internally guided decision-making. Second, using multi-level kernel density analysis, a whole-brain-based quantitative meta-analysis of neuroimaging studies was performed. Our meta-analysis revealed that the neural network used predominantly for internally guided decision-making differs from that for externally guided decision-making under uncertainty. This result suggests that studying only externally guided decision-making under uncertainty is insufficient to account for decision-making processes in the brain. Finally, based on the review and results of the meta-analysis, we discuss the differences and relations between decision-making of these two types in terms of their operational, neuronal, and theoretical characteristics. PMID:22403525
Controlling and prevention of surface wrinkling via size-dependent critical wrinkling strain.
Han, Xue; Zhao, Yan; Cao, Yanping; Lu, Conghua
2015-06-14
Surface wrinkling may occur in a film-substrate system when the applied strain exceeds a critical value. However, the strain practically required for the onset of surface wrinkling can differ from the theoretically predicted value. Here we investigate the film-size-dependent critical strain for mechanical strain-induced surface wrinkling via a combination of experiments and theoretical analysis. In a poly(dimethylsiloxane)-based system fabricated by combining mechanical straining with selective O2 plasma (OP) exposure through Cu grids, the film-size effect on the critical wrinkling strain is systematically studied by varying the OP exposure duration and the mesh number and geometry of the Cu grids. Meanwhile, a simple analytical solution capturing the film-size effect is established, which shows good agreement with the experimental results. This study provides an experimental and theoretical basis for finely tuning the critical wrinkling strain in a simple and quantitative manner, which can find a wide range of applications in fields such as microelectronic circuits and optical devices, where control and/or prevention of surface wrinkling is of great importance.
Systemic Analysis Approaches for Air Transportation
NASA Technical Reports Server (NTRS)
Conway, Sheila
2005-01-01
Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven to be an extensive, complex system whose behavior is difficult to describe, much less predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.
Tholen, Danny; Zhu, Xin-Guang
2011-05-01
Photosynthesis is limited by the conductance of carbon dioxide (CO2) from intercellular spaces to the sites of carboxylation. Although the concept of internal conductance (gi) has been known for over 50 years, shortcomings in the theoretical description of this process may have resulted in a limited understanding of the underlying mechanisms. To tackle this issue, we developed a three-dimensional reaction-diffusion model of photosynthesis in a typical C3 mesophyll cell that includes all major components of the CO2 diffusion pathway and associated reactions. Using this novel systems model, we systematically and quantitatively examined the mechanisms underlying gi. Our results identify the resistances of the cell wall and chloroplast envelope as the most significant limitations to photosynthesis. In addition, the concentration of carbonic anhydrase in the stroma may also be limiting for the photosynthetic rate. Our analysis demonstrated that higher levels of photorespiration increase the apparent resistance to CO2 diffusion, an effect that has thus far been ignored when determining gi. Finally, we show that outward bicarbonate leakage through the chloroplast envelope could contribute to the observed decrease in gi under elevated CO2. Our analysis suggests that physiological and anatomical features associated with gi have been evolutionarily fine-tuned to benefit CO2 diffusion and photosynthesis. The model presented here provides a novel theoretical framework to further analyze the mechanisms underlying diffusion processes in the mesophyll.
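The ranking of diffusional limitations described above can be caricatured as conductances in series along the CO2 pathway; the component values below are hypothetical round numbers chosen for illustration, not outputs of the authors' three-dimensional model:

```python
# For resistances in series, the total internal conductance g_i satisfies
# 1/g_i = sum(1/g_k) over the pathway components, so g_i is always smaller
# than the smallest component conductance. Units here would be e.g.
# mol m^-2 s^-1 bar^-1; the numbers are invented.
def series_conductance(components):
    return 1.0 / sum(1.0 / g for g in components)

g_components = {
    "cell_wall": 0.8,
    "plasma_membrane": 3.0,
    "cytosol": 5.0,
    "chloroplast_envelope": 1.0,
    "stroma": 4.0,
}
g_i = series_conductance(g_components.values())
```

With these illustrative values, the cell wall and chloroplast envelope together account for most of the total resistance, mirroring the abstract's conclusion about which steps limit photosynthesis.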
NASA Technical Reports Server (NTRS)
Wilson, L. B., III; Sibeck, D. G.; Breneman, A.W.; Le Contel, O.; Cully, C.; Turner, D. L.; Angelopoulos, V.; Malaspina, D. M.
2014-01-01
We present a detailed outline and discussion of the analysis techniques used to compare the relevance of different energy dissipation mechanisms at collisionless shock waves. We show that the low-frequency, quasi-static fields contribute less to ohmic energy dissipation, (-j · E ) (minus current density times measured electric field), than their high-frequency counterparts. In fact, we found that high-frequency, large-amplitude (greater than 100 millivolts per meter and/or greater than 1 nanotesla) waves are ubiquitous in the transition region of collisionless shocks. We quantitatively show that their fields, through wave-particle interactions, cause enough energy dissipation to regulate the global structure of collisionless shocks. The purpose of this paper, part one of two, is to outline and describe in detail the background, analysis techniques, and theoretical motivation for our new results presented in the companion paper. The companion paper presents the results of our quantitative energy dissipation rate estimates and discusses the implications. Together, the two manuscripts present the first study quantifying the contribution that high-frequency waves provide, through wave-particle interactions, to the total energy dissipation budget of collisionless shock waves.
Improving the geological interpretation of magnetic and gravity satellite anomalies
NASA Technical Reports Server (NTRS)
Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.
1987-01-01
Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.
Multiblob coarse-graining for mixtures of long polymers and soft colloids
NASA Astrophysics Data System (ADS)
Locatelli, Emanuele; Capone, Barbara; Likos, Christos N.
2016-11-01
Soft nanocomposites represent both a theoretical and an experimental challenge due to the high number of microscopic constituents that strongly influence the behaviour of the systems. An effective theoretical description of such systems invokes a reduction of the degrees of freedom to be analysed, hence requiring the introduction of an efficient, quantitative, coarse-grained description. We here report on a novel coarse-graining approach based on a set of transferable potentials that quantitatively reproduces properties of mixtures of linear and star-shaped homopolymeric nanocomposites. By renormalizing groups of monomers into a single effective potential between an f-functional star polymer and a homopolymer of length N0, and through a scaling argument, it is shown how a substantial reduction of the degrees of freedom allows for a full quantitative description of the system. Our methodology is tested against full-monomer simulations for systems of different molecular weight, proving its full predictive potential.
Identifying, Analyzing, and Communicating Rural: A Quantitative Perspective
ERIC Educational Resources Information Center
Koziol, Natalie A.; Arthur, Ann M.; Hawley, Leslie R.; Bovaird, James A.; Bash, Kirstie L.; McCormick, Carina; Welch, Greg W.
2015-01-01
Defining rural is a critical task for rural education researchers, as it has implications for all phases of a study. However, it is also a difficult task due to the many ways in which rural can be theoretically, conceptually, and empirically operationalized. This article provides researchers with specific guidance on important theoretical and…
Faraday's first dynamo: A retrospective
NASA Astrophysics Data System (ADS)
Smith, Glenn S.
2013-12-01
In the early 1830s, Michael Faraday performed his seminal experimental research on electromagnetic induction, in which he created the first electric dynamo—a machine for continuously converting rotational mechanical energy into electrical energy. His machine was a conducting disc, rotating between the poles of a permanent magnet, with the voltage/current obtained from brushes contacting the disc. In his first dynamo, the magnetic field was asymmetric with respect to the axis of the disc. This is to be contrasted with some of his later symmetric designs, which are the ones almost invariably discussed in textbooks on electromagnetism. In this paper, a theoretical analysis is developed for Faraday's first dynamo. From this analysis, the eddy currents in the disc and the open-circuit voltage for arbitrary positioning of the brushes are determined. The approximate analysis is verified by comparing theoretical results with measurements made on an experimental recreation of the dynamo. Quantitative results from the analysis are used to elucidate Faraday's qualitative observations, from which he learned so much about electromagnetic induction. For the asymmetric design, the eddy currents in the disc dissipate energy that makes the dynamo inefficient, prohibiting its use as a practical generator of electric power. Faraday's experiments with his first dynamo provided valuable insight into electromagnetic induction, and this insight was quickly used by others to design practical generators.
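For the symmetric textbook configuration mentioned above (a uniform axial field B over a disc of radius a spinning at angular velocity ω), the open-circuit voltage between axis and rim follows by integrating the motional EMF along a radius; this is the standard symmetric-limit result, not the asymmetric-field analysis developed in the paper:

```latex
V = \int_{0}^{a} \omega r B \, \mathrm{d}r = \frac{\omega B a^{2}}{2}
```

Faraday's asymmetric first dynamo offers no such shortcut: the open-circuit voltage depends on the brush positions and requires solving for the eddy-current distribution in the disc, which is the subject of the paper's analysis.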
Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).
Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan
2015-01-01
Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, i-QTaG using MALDI-TOF MS was evaluated with various molar ratios (1:1, 1:2, 1:5) of (13)C6/(12)C6-2-aminobenzoic acid-labeled glycans from normal human serum. Finally, this method was applied to a direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R(2) > 0.99) with the amount of the bovine fetuin glycoprotein. The relative intensity ratios between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to be used to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.
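The relative-quantification step reduces, at its core, to intensity ratios between heavy- and light-labeled peaks of the same glycan; the peak intensities below are invented for illustration and do not come from the paper:

```python
# Each N-glycan yields a light (12C6) and a heavy (13C6) 2-AA-labeled
# peak pair; the heavy/light intensity ratio estimates the molar ratio
# between the two compared samples.
def relative_ratio(light_intensity, heavy_intensity):
    return heavy_intensity / light_intensity

# hypothetical (light, heavy) peak intensities for two glycans
pairs = {"glycan_A": (1.0e5, 2.1e5), "glycan_B": (5.0e4, 9.8e4)}
ratios = {name: relative_ratio(*p) for name, p in pairs.items()}
```

In a real workflow these ratios would be compared against the known mixing ratios (1:1, 1:2, 1:5 in the abstract) to validate the labeling and acquisition.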
Suslov, Sergey A; Bozhko, Alexandra A; Sidorov, Alexander S; Putin, Gennady F
2012-07-01
Flow patterns arising in a vertical differentially heated layer of nonconducting ferromagnetic fluid placed in an external uniform transverse magnetic field are studied experimentally and discussed from the point of view of the perturbation energy balance. A quantitative criterion for detecting the parametric point where the dominant role in generating a flow instability is transferred between the thermogravitational and thermomagnetic mechanisms is suggested, based on the disturbance energy balance analysis. A comprehensive experimental study of various flow patterns is undertaken, and the existence of oblique thermomagnetic waves, theoretically predicted by Suslov [Phys. Fluids 20, 084101 (2008)] and superposed onto the previously known stationary magnetoconvective pattern, is demonstrated. It is found that the wave number of the detected convection patterns depends sensitively on the temperature difference across the layer and on the applied magnetic field. In unsteady regimes its value varies periodically by a factor of almost 2, indicating the appearance of two different competing wave modes. The wave numbers and spatial orientation of the observed dominant flow patterns are found to be in good agreement with theoretical predictions.
Estimation of π-π Electronic Couplings from Current Measurements.
Trasobares, J; Rech, J; Jonckheere, T; Martin, T; Aleveque, O; Levillain, E; Diez-Cabanes, V; Olivier, Y; Cornil, J; Nys, J P; Sivakumarasamy, R; Smaali, K; Leclere, P; Fujiwara, A; Théron, D; Vuillaume, D; Clément, N
2017-05-10
The π-π interactions between organic molecules are among the most important parameters for optimizing the transport and optical properties of organic transistors, light-emitting diodes, and (bio-)molecular devices. Despite substantial theoretical progress, direct experimental measurement of the π-π electronic coupling energy parameter t has remained a long-standing challenge due to molecular structural variability and the large number of parameters that affect charge transport. Here, we propose a study of π-π interactions from electrochemical and current measurements on a large array of ferrocene-thiolated gold nanocrystals. We confirm the theoretical prediction that t can be assessed from a statistical analysis of current histograms. The extracted value of t ≈ 35 meV is in the expected range based on our density functional theory analysis. Furthermore, the t distribution is not necessarily Gaussian and could be used as an ultrasensitive technique to assess intermolecular distance fluctuations at the sub-angström level. The present work establishes a direct bridge between quantum chemistry, electrochemistry, organic electronics, and mesoscopic physics, all of which were used to discuss results and perspectives in a quantitative manner.
Instability of elliptic liquid jets: Temporal linear stability theory and experimental analysis
NASA Astrophysics Data System (ADS)
Amini, Ghobad; Lv, Yu; Dolatabadi, Ali; Ihme, Matthias
2014-11-01
The instability dynamics of inviscid liquid jets issuing from elliptical orifices is studied, and effects of the surrounding gas and the liquid surface tension on the stability behavior are investigated. A dispersion relation for the zeroth azimuthal (axisymmetric) instability mode is derived. Consistency of the analysis is confirmed by demonstrating that these equations reduce to the well-known dispersion equations for the limiting cases of round and planar jets. It is shown that the effect of the ellipticity is to increase the growth rate over a large range of wavenumbers in comparison to those of a circular jet. For higher Weber numbers, at which capillary forces have a stabilizing effect, the growth rate decreases with increasing ellipticity. Similar to circular and planar jets, increasing the density ratio between gas and liquid increases the growth of disturbances significantly. These theoretical investigations are complemented by experiments to validate the local linear stability results. Comparisons of predicted growth rates with measurements over a range of jet ellipticities confirm that the theoretical model provides a quantitatively accurate description of the instability dynamics in the Rayleigh and first wind-induced regimes.
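As a consistency check of the circular-jet limit mentioned above, Rayleigh's classical dispersion relation for the axisymmetric mode of an inviscid capillary jet can be evaluated numerically; in dimensionless form the growth rate squared is x·I1(x)/I0(x)·(1 − x²) with x = ka, and the fastest-growing wavenumber is known to fall near x ≈ 0.70:

```python
import math

# Modified Bessel functions of the first kind from their power series,
# accurate for the small arguments (x < 1) needed here.
def bessel_i(n, x, terms=30):
    return sum((x / 2.0) ** (2 * m + n)
               / (math.factorial(m) * math.factorial(m + n))
               for m in range(terms))

# Dimensionless squared growth rate of Rayleigh's axisymmetric jet mode:
# omega^2 * (rho a^3 / sigma) = x * I1(x)/I0(x) * (1 - x^2), x = k*a.
def growth_rate_sq(x):
    return x * bessel_i(1, x) / bessel_i(0, x) * (1.0 - x * x)

# scan 0 < x < 1 for the fastest-growing wavenumber
xs = [0.01 * i for i in range(1, 100)]
x_max = max(xs, key=growth_rate_sq)
```

Disturbances with x > 1 are stabilized by surface tension (the growth rate squared changes sign at x = 1), consistent with the Rayleigh-regime behavior the paper's elliptical analysis reduces to.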
Quantitative analysis of soil chromatography. I. Water and radionuclide transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reeves, M.; Francis, C.W.; Duguid, J.O.
Soil chromatography has been used successfully to evaluate relative mobilities of pesticides and nuclides in soils. Its major advantage over the commonly used suspension technique is that it more accurately simulates field conditions. Under such conditions the number of potential exchange sites is limited both by the structure of the soil matrix and by the manner in which the carrier fluid moves through this structure. The major limitation of the chromatographic method, however, has been its qualitative nature. This document represents an effort to counter this objection. A theoretical basis is specified for the transport both of the carrier eluting fluid and of the dissolved constituent. A computer program based on this theory is developed which optimizes the fit of theoretical data to experimental data by automatically adjusting the transport parameters, one of which is the distribution coefficient k_d. This analysis procedure thus constitutes an integral part of the soil chromatographic method, by means of which mobilities of nuclides and other dissolved constituents in soils may be quantified.
Surface potential extraction from electrostatic and Kelvin-probe force microscopy images
NASA Astrophysics Data System (ADS)
Xu, Jie; Chen, Deyuan; Li, Wei; Xu, Jun
2018-05-01
A comprehensive comparative study of electrostatic force microscopy (EFM) and Kelvin probe force microscopy (KPFM) is conducted in this manuscript. First, it is theoretically demonstrated that for metallic or semiconductor samples, both the EFM and KPFM signals are a convolution of the sample surface potential with their respective transfer functions. Then, an equivalent point-mass model describing cantilever deflection under distributed loads is developed to reevaluate the cantilever's influence on the detection signals; it is shown that the cantilever has no influence on the EFM signal, while it affects the KPFM signal intensity but does not change the resolution. Finally, EFM and KPFM experiments are carried out, and the surface potential is extracted from the EFM and KPFM images by deconvolution processing. The extracted potential intensities are consistent with each other, and the detection resolution also complies with the theoretical analysis. Our work is helpful for performing quantitative analysis of EFM and KPFM signals, and the developed point-mass model can also be used for other cantilever-beam deflection problems.
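The convolution picture described above can be written compactly; here S is the measured EFM or KPFM image, V the surface potential, and H the corresponding transfer function (the symbols are ours, chosen for illustration). Deconvolution then recovers the potential in Fourier space, with regularization needed in practice to suppress noise amplification:

```latex
S(x,y) = (V * H)(x,y)
\quad\Longrightarrow\quad
V = \mathcal{F}^{-1}\!\left[ \frac{\mathcal{F}[S]}{\mathcal{F}[H]} \right]
```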
Metabolic Compartmentation – A System Level Property of Muscle Cells
Saks, Valdur; Beraud, Nathalie; Wallimann, Theo
2008-01-01
Problems of quantitative investigation of intracellular diffusion and compartmentation of metabolites are analyzed. Principal controversies in recently published analyses of these problems for living cells are discussed. It is shown that formal theoretical analysis of the diffusion of metabolites based on Fick's equation, using fixed diffusion coefficients for dilute homogeneous aqueous solutions but applied to biological systems in vivo without any comparison with experimental results, may lead to misleading conclusions that contradict most biological observations. However, if the same theoretical methods are used for analysis of actual experimental data, the apparent diffusion constants obtained are orders of magnitude lower than those in dilute aqueous solutions. Thus, it can be concluded that local restrictions of the diffusion of metabolites in a cell are a system-level property caused by the complex structural organization of cells, macromolecular crowding, cytoskeletal networks and the organization of metabolic pathways into multienzyme complexes and metabolons. This results in microcompartmentation of metabolites, in their channeling between enzymes, and in the modular organization of cellular metabolic networks. The perspectives for further studies of these complex intracellular interactions in the framework of Systems Biology are discussed. PMID:19325782
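The analysis based on Fick's equation referred to above amounts to fitting measured concentration spreads to the diffusion equation and reading off an apparent diffusion coefficient; in one dimension this is, as a sketch in our notation:

```latex
\frac{\partial C}{\partial t} = D_{\mathrm{app}} \frac{\partial^{2} C}{\partial x^{2}},
\qquad
\langle x^{2}(t) \rangle = 2 D_{\mathrm{app}} t
```

The abstract's point is that the D_app fitted to in vivo data comes out orders of magnitude below dilute-solution values, reflecting crowding and structural restrictions rather than a failure of the diffusion formalism itself.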
Hybrid density-functional calculations of phonons in LaCoO3
NASA Astrophysics Data System (ADS)
Gryaznov, Denis; Evarestov, Robert A.; Maier, Joachim
2010-12-01
Phonon frequencies at the Γ point in the nonmagnetic rhombohedral phase of LaCoO3 were calculated using density-functional theory with the hybrid exchange-correlation functional PBE0. The calculations compared results for two types of basis functions commonly used in ab initio calculations, namely the plane-wave approach and the linear combination of atomic orbitals, as implemented in the VASP and CRYSTAL computer codes, respectively. Good qualitative, and within an error margin of less than 30% also quantitative, agreement was observed not only between the two formalisms but also between theoretical and experimental phonon frequencies. Moreover, the correlation between the phonon symmetries in the cubic and rhombohedral phases is discussed in detail on the basis of group-theoretical analysis. It is concluded that the hybrid PBE0 functional correctly predicts the phonon properties of LaCoO3.
Coherent beam combination of fiber lasers with a strongly confined waveguide: numerical model.
Tao, Rumao; Si, Lei; Ma, Yanxing; Zhou, Pu; Liu, Zejin
2012-08-20
Self-imaging properties of fiber lasers in a strongly confined waveguide (SCW) and their application in coherent beam combination (CBC) are studied theoretically. Analytical formulas are derived for the positions, amplitudes, and phases of the N images at the end of an SCW, which is important for quantitative analysis of waveguide CBC. The formulas are verified against experimental results and numerical simulation with a finite difference beam propagation method (BPM). The error of our analytical formulas is less than 6%, which can be reduced to below 1.5% when the Goos-Hänchen penetration depth is considered. Based on the theoretical model and the BPM, we studied the combination of two laser beams in an SCW. The effects of the waveguide refractive index and the Gaussian beam waist are examined. We also simulated the CBC of nine and 16 fiber lasers, and a single beam without side lobes was achieved.
NASA Technical Reports Server (NTRS)
Holley, W. R.; Chatterjee, A.
1994-01-01
A theoretical framework is presented which provides a quantitative analysis of radiation-induced translocations between the abl oncogene on chromosome 9q34 and a breakpoint cluster region, bcr, on chromosome 22q11. Such translocations are frequently associated with chronic myelogenous leukemia. The theory is based on the assumption that incorrect or unfaithful rejoining of double strand breaks produced concurrently within the 200 kbp intron region upstream of the second abl exon and the 16.5 kbp region between bcr exon 2 and exon 6 results in a fusion gene. For an x-ray dose of 100 Gy, there is good agreement between the theoretical estimate and the one available experimental result. The theory has been extended to provide dose-response curves for these types of translocations. These curves are quadratic at low doses and become linear at high doses.
Stability and Hopf Bifurcation in an HIV-1 System with Multitime Delays
NASA Astrophysics Data System (ADS)
Zhao, Lingyan; Liu, Haihong; Yan, Fang
In this paper, we propose a mathematical model for HIV-1 infection with three time delays. The model examines a viral therapy for controlling infection by using an engineered virus to selectively eliminate infected cells. In our model, the three time delays represent the latent period of the pathogen virus, the pathogen virus production period, and the recombinant (genetically modified) virus production period, respectively. Detailed theoretical analysis demonstrates that the values of the three delays can affect the stability of equilibrium solutions and can also lead to Hopf bifurcation and oscillatory solutions of the system. Moreover, we give conditions for the existence of a stable positive equilibrium solution and of Hopf bifurcation, and the properties of the Hopf bifurcation are discussed. These theoretical results indicate that the delays play a quantitatively important role in determining the dynamic behavior; they should therefore not be neglected in controlling HIV-1 infection.
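The way a delay can destabilize an equilibrium and produce a Hopf bifurcation can be illustrated on a much simpler system than the three-delay HIV-1 model above. A sketch with Hutchinson's delayed logistic equation; the Euler discretization and parameter values are my choices for demonstration, not the authors' model:

```python
import numpy as np

def delayed_logistic(r, tau, x0=0.5, t_end=200.0, dt=0.01):
    """Euler-integrate Hutchinson's equation dx/dt = r*x(t)*(1 - x(t - tau))."""
    lag = int(round(tau / dt))
    n = int(round(t_end / dt))
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x_delayed = x[i - lag] if i >= lag else x0  # constant pre-history
        x[i + 1] = x[i] + dt * r * x[i] * (1.0 - x_delayed)
    return x

# Below the Hopf threshold (r*tau < pi/2) the equilibrium x* = 1 is stable;
# above it a limit cycle (sustained oscillation) appears.
tail = lambda traj: traj[-2000:]  # last 20 time units, after transients
print(np.ptp(tail(delayed_logistic(1.0, 0.5))))  # settles to x* = 1
print(np.ptp(tail(delayed_logistic(1.0, 2.0))))  # sustained oscillation
```

The same qualitative mechanism underlies the delay-induced oscillations in the infection model: increasing a delay pushes eigenvalues of the linearization across the imaginary axis.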
Effect of deposition rate and NNN interactions on adatoms mobility in epitaxial growth
NASA Astrophysics Data System (ADS)
Hamouda, Ajmi B. H.; Mahjoub, B.; Blel, S.
2017-07-01
This paper provides a detailed analysis of the surface diffusion problem during epitaxial step-flow growth using a simple theoretical model for the diffusion equation of the adatom concentration. Within this framework, an analytical expression for the adatom mobility as a function of the deposition rate and the next-nearest-neighbor (NNN) interactions is derived and compared with the effective mobility computed from kinetic Monte Carlo (kMC) simulations. For the small step velocities, i.e., the relatively weak deposition rates commonly used for copper growth, excellent quantitative agreement with the theoretical prediction is found. The effective adatom mobility is shown to decrease exponentially with the NNN interaction strength and to increase roughly linearly with the deposition rate F. The effective step stiffness and the adatom mobility are also shown to be closely related to the concentration of kinks.
NASA Astrophysics Data System (ADS)
Li, Kun-Dar; Huang, Po-Yu
2017-12-01
To simulate a process of directional vapor deposition, a numerical approach was applied in this study to model the growth and evolution of surface morphologies of crystallographic thin-film structures. The critical factors affecting surface morphology in a deposition process, such as crystallographic symmetry, anisotropic interfacial energy, the shadowing effect, and the deposition rate, were all included in the theoretical model. By altering the crystallographic symmetry parameters of the structures, faceted nano-columns with rectangular and hexagonal shapes were obtained in the simulations. Furthermore, to reveal the influence of the anisotropy strength and the deposition rate on the formation of crystallographic structures, various parameters of the numerical calculations were also investigated. Not only the morphologies but also the surface roughnesses for different processing conditions were distinctly demonstrated through quantitative analysis of the simulations.
NASA's upper atmosphere research satellite: A program to study global ozone change
NASA Technical Reports Server (NTRS)
Luther, Michael R.
1992-01-01
The Upper Atmosphere Research Satellite (UARS) is a major initiative in the NASA Office of Space Science and Applications and is the prototype for NASA's Earth Observing System (EOS) planned for launch in the 1990s. The UARS combines a balanced program of experimental and theoretical investigations to perform diagnostic studies, qualitative model analysis, and quantitative measurements and comparative studies of the upper atmosphere. UARS provides theoretical and experimental investigations that pursue four specific research topics: atmospheric energy budget, chemistry, dynamics, and coupling processes. An international cadre of investigators was assembled by NASA to accomplish these scientific objectives. The observatory, its complement of ten state-of-the-art instruments, and the ground system are nearing flight readiness. The timely UARS program will play a major role in providing data to understand the complex physical and chemical processes occurring in the upper atmosphere and in answering many questions regarding the health of the ozone layer.
Ryan, Gillian L; Watanabe, Naoki; Vavylonis, Dimitrios
2012-04-01
A characteristic feature of motile cells as they undergo a change in motile behavior is the development of fluctuating exploratory motions of the leading edge, driven by actin polymerization. We review quantitative models of these protrusion and retraction phenomena. Theoretical studies have been motivated by advances in experimental and computational methods that allow controlled perturbations, single molecule imaging, and analysis of spatiotemporal correlations in microscopic images. To explain oscillations and waves of the leading edge, most theoretical models propose nonlinear interactions and feedback mechanisms among different components of the actin cytoskeleton system. These mechanisms include curvature-sensing membrane proteins, myosin contraction, and autocatalytic biochemical reaction kinetics. We discuss how the combination of experimental studies with modeling promises to quantify the relative importance of these biochemical and biophysical processes at the leading edge and to evaluate their generality across cell types and extracellular environments. Copyright © 2012 Wiley Periodicals, Inc.
Tunable quasiparticle trapping in Meissner and vortex states of mesoscopic superconductors.
Taupin, M; Khaymovich, I M; Meschke, M; Mel'nikov, A S; Pekola, J P
2016-03-16
Nowadays, superconductors serve in numerous applications, from high-field magnets to ultrasensitive detectors of radiation. Mesoscopic superconducting devices, referring to those with nanoscale dimensions, are in a special position as they are easily driven out of equilibrium under typical operating conditions. The out-of-equilibrium superconductors are characterized by non-equilibrium quasiparticles. These extra excitations can compromise the performance of mesoscopic devices by introducing, for example, leakage currents or decreased coherence time in quantum devices. By applying an external magnetic field, one can conveniently suppress or redistribute the population of excess quasiparticles. In this article, we present an experimental demonstration and a theoretical analysis of such effective control of quasiparticles, resulting in electron cooling both in the Meissner and vortex states of a mesoscopic superconductor. We introduce a theoretical model of quasiparticle dynamics, which is in quantitative agreement with the experimental data.
Photonic band gap spectra in Octonacci metamaterial quasicrystals
NASA Astrophysics Data System (ADS)
Brandão, E. R.; Vasconcelos, M. S.; Albuquerque, E. L.; Fulco, U. L.
2017-02-01
In this work we study theoretically the photonic band gap spectra of a one-dimensional quasicrystal made up of SiO2 (layer A) and a metamaterial (layer B) arranged following the Octonacci sequence, whose nth stage Sn is given by the inflation rule Sn = Sn-1 Sn-2 Sn-1 for n ≥ 3, with initial conditions S1 = A and S2 = B. The metamaterial is characterized by a frequency-dependent electric permittivity ε(ω) and magnetic permeability μ(ω). The polariton dispersion relation is obtained analytically by a theoretical calculation based on a transfer-matrix approach. A quantitative analysis of the spectra is then presented, stressing the distribution of the allowed photonic band widths for high generations of the Octonacci structure, which exhibit self-similar scaling behavior with a power law depending on the common in-plane wavevector kx.
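The inflation rule quoted in the abstract is easy to make concrete; a short sketch generating the layer sequence (the function name is mine):

```python
def octonacci(n):
    """Octonacci word: S1 = "A", S2 = "B", Sn = S(n-1) S(n-2) S(n-1) for n >= 3."""
    if n == 1:
        return "A"
    if n == 2:
        return "B"
    prev2, prev1 = "A", "B"
    for _ in range(3, n + 1):
        prev2, prev1 = prev1, prev1 + prev2 + prev1
    return prev1

# The layer count grows as len(Sn) = 2*len(S(n-1)) + len(S(n-2)),
# i.e. the Pell numbers 1, 1, 3, 7, 17, 41, ...
print([octonacci(n) for n in range(1, 5)])  # ['A', 'B', 'BAB', 'BABBBAB']
```

Each letter then selects the material (SiO2 or metamaterial) of the corresponding layer in the transfer-matrix calculation.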
Indirect scaling methods for testing quantitative emotion theories.
Junge, Martin; Reisenzein, Rainer
2013-01-01
Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.
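A toy version of the idea underlying such indirect scaling, recovering an interval scale from pairwise difference judgments by least squares, can be sketched as follows. This is a deliberate simplification, not the AFM or MLDS estimators used in the paper, and the item values are illustrative:

```python
import numpy as np

def scale_from_pairs(n_items, pairs):
    """Least-squares scale values s from judgments d_ij ~ s_i - s_j."""
    rows, d = [], []
    for i, j, diff in pairs:
        r = np.zeros(n_items)
        r[i], r[j] = 1.0, -1.0  # each judgment constrains a difference
        rows.append(r)
        d.append(diff)
    rows.append(np.ones(n_items))  # fix the free origin: sum(s) = 0
    d.append(0.0)
    s, *_ = np.linalg.lstsq(np.array(rows), np.array(d), rcond=None)
    return s

true = np.array([0.0, 1.0, 3.0, 3.5]) - 1.875  # centered "true" intensities
pairs = [(i, j, true[i] - true[j]) for i in range(4) for j in range(i)]
print(scale_from_pairs(4, pairs))  # recovers the centered scale
```

Probabilistic methods such as MLDS replace the least-squares step with maximum-likelihood estimation under a noise model for the graded comparisons.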
Guetterman, Timothy C; Fetters, Michael D; Creswell, John W
2015-11-01
Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.
From information theory to quantitative description of steric effects.
Alipour, Mojtaba; Safari, Zahra
2016-07-21
Immense efforts have been made in the literature to apply information theory descriptors to the electronic structure theory of various systems. In the present study, information theoretic quantities, such as the Fisher information, Shannon entropy, Onicescu information energy, and Ghosh-Berkowitz-Parr entropy, are used to provide a quantitative description of one of the most widely used concepts in chemistry, namely steric effects. Taking experimental steric scales for different compounds as benchmark sets, reasonable linear relationships are found between the experimental steric scales and the theoretical steric energies calculated from information theory functionals. Among the information theoretic quantities examined, in both the electron density and shape function representations, the Shannon entropy performs best for this purpose. The usefulness of considering the steric energies and geometries of functional groups, as well as of dissecting the effects of global and local information measures simultaneously, is also explored. Furthermore, the utility of the information functionals for describing steric effects in several chemical transformations, such as electrophilic and nucleophilic reactions and host-guest chemistry, is analyzed. The information theory functionals correlate remarkably well with the stability of the systems and with the experimental scales. Overall, these findings show that information theoretic quantities can serve as quantitative measures of steric effects and provide further evidence of the capacity of information theory to help theoreticians and experimentalists interpret problems in real systems.
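As a small numerical illustration of one of the descriptors named above: the Shannon entropy S = -∫ ρ(x) ln ρ(x) dx of a normalized 1D density grows as the density becomes more diffuse, which is the intuition linking it to steric bulk. The grid and the Gaussian test densities are my choices, not the paper's molecular systems:

```python
import numpy as np

def shannon_entropy(rho, dx):
    """S = -sum(rho * ln rho) * dx for a density sampled on a uniform grid."""
    rho = np.clip(rho, 1e-300, None)  # avoid log(0) in the far tails
    return -np.sum(rho * np.log(rho)) * dx

x = np.linspace(-20.0, 20.0, 200_001)
dx = x[1] - x[0]
gauss = lambda s: np.exp(-x**2 / (2 * s * s)) / (s * np.sqrt(2 * np.pi))

print(shannon_entropy(gauss(1.0), dx))  # 0.5*ln(2*pi*e) ~ 1.4189
print(shannon_entropy(gauss(2.0), dx))  # broader density: adds ln 2 ~ 0.6931
```

For a Gaussian the analytic value is 0.5 ln(2πeσ²), so doubling σ raises S by exactly ln 2, which the grid sum reproduces.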
NASA Astrophysics Data System (ADS)
Reineker, P.; Kenkre, V. M.; Kühne, R.
1981-08-01
A quantitative comparison is given of a simple theoretical prediction for the drift mobility of photo-electrons in organic molecular crystals, calculated within the model of coupled band-like and hopping motion, with experiments in naphthalene by Schein et al. and Karl et al.
ERIC Educational Resources Information Center
Castillo, Alan F.
2014-01-01
The purpose of this quantitative correlational cross-sectional research study was to examine a theoretical model consisting of leadership practice, attitudes of business process outsourcing, and strategic intentions of leaders to use cloud computing and to examine the relationships between each of the variables respectively. This study…
Quantitative test for concave aspheric surfaces using a Babinet compensator.
Saxena, A K
1979-08-15
A quantitative test for the evaluation of surface figures of concave aspheric surfaces using a Babinet compensator is reported. The theoretical estimate of the sensitivity is 0.002λ for a minimum detectable phase change of 2π × 10^-3 rad over a segment length of 1.0 cm.
NASA Astrophysics Data System (ADS)
Ivanova, Bojidarka; Spiteller, Michael
2017-12-01
The present paper deals with the quantitative kinetics and thermodynamics of collision-induced dissociation (CID) reactions of piperazines under different experimental conditions, together with a systematic description of the effect of counter-ions on common MS fragmentation reactions of piperazines and of the intra-molecular effect of quaternary cyclization of substituted piperazines yielding quaternary salts. Quantitative model equations are discussed for the rate constants and free Gibbs energies of a series of m-independent CID fragmentation processes in the gas phase (GP), which have been evidenced experimentally. Both kinetic and thermodynamic parameters are also predicted by computational density functional theory (DFT) and by ab initio static and dynamic methods. The paper also examines quantitatively the validity of the Maxwell-Boltzmann distribution for non-Boltzmann CID processes. The experiments conducted within the latter framework show excellent correspondence with theoretical quantum chemical modeling. An important property of the presented model equations of reaction kinetics is their applicability to predicting unknown, and assigning known, mass spectrometric (MS) patterns. The nature of the 'GP' continuum in the CID-MS measurement scheme with an electrospray ionization (ESI) source is discussed by performing parallel computations in the gas phase and in a polar continuum at different temperatures and ionic strengths. The effect of pressure is presented. The study contributes significantly to the methodological and phenomenological development of CID-MS and its analytical implementation for quantitative and structural analyses. It also demonstrates the great promise of complementary application of experimental CID-MS and computational quantum chemistry for studying chemical reactivity, among other problems. To a considerable extent, this work underlines the place of computational quantum chemistry in experimental analytical chemistry, in particular for structural analysis.
Hutchings, Maggie; Scammell, Janet; Quinney, Anne
2013-09-01
While there is growing evidence of theoretical perspectives adopted in interprofessional education, learning theories tend to foreground the individual, focusing on psycho-social aspects of individual differences and professional identity to the detriment of considering social-structural factors at work in social practices. Conversely socially situated practice is criticised for being context-specific, making it difficult to draw generalisable conclusions for improving interprofessional education. This article builds on a theoretical framework derived from earlier research, drawing on the dynamics of Dewey's experiential learning theory and Archer's critical realist social theory, to make a case for a meta-theoretical framework enabling social-constructivist and situated learning theories to be interlinked and integrated through praxis and reflexivity. Our current analysis is grounded in an interprofessional curriculum initiative mediated by a virtual community peopled by health and social care users. Student perceptions, captured through quantitative and qualitative data, suggest three major disruptive themes, creating opportunities for congruence and disjuncture and generating a model of zones of interlinked praxis associated with professional differences and identity, pedagogic strategies and technology-mediated approaches. This model contributes to a framework for understanding the complexity of interprofessional learning and offers bridges between individual and structural factors for engaging with the enablements and constraints at work in communities of practice and networks for interprofessional education.
Integrated stoichiometric, thermodynamic and kinetic modelling of steady state metabolism
Fleming, R.M.T.; Thiele, I.; Provan, G.; Nasheuer, H.P.
2010-01-01
The quantitative analysis of biochemical reactions and metabolites is at the frontier of the biological sciences. The recent availability of high-throughput data sets in biology has paved the way for new modelling approaches at various levels of complexity, including the metabolome of a cell or an organism. Understanding the metabolism of single-cell and multi-cell organisms will provide the knowledge needed for the rational design of growth conditions to produce commercially valuable reagents in biotechnology. Here, we demonstrate how equations representing steady state mass conservation, energy conservation, the second law of thermodynamics, and reversible enzyme kinetics can be formulated as a single system of linear equalities and inequalities, in addition to linear equalities on exponential variables. Even though the feasible set is non-convex, the reformulation is exact and amenable to large-scale numerical analysis, a prerequisite for computationally feasible genome-scale modelling. Integrating flux, concentration and kinetic variables in a unified constraint-based formulation is aimed at increasing the quantitative predictive capacity of flux balance analysis. Incorporation of experimental and theoretical bounds on thermodynamic and kinetic variables ensures that the predicted steady state fluxes are both thermodynamically and biochemically feasible. The resulting in silico predictions are tested against fluxomic data for central metabolism in E. coli and compare favourably with in silico predictions by flux balance analysis. PMID:20230840
Probability Density Functions of Observed Rainfall in Montana
NASA Technical Reports Server (NTRS)
Larsen, Scott D.; Johnson, L. Ronald; Smith, Paul L.
1995-01-01
The question of whether a rain rate probability density function (PDF) can vary uniformly between precipitation events is examined. Image analysis of large samples of radar echoes is possible because of advances in technology. The data provided by such an analysis readily allow development of distributions of radar reflectivity factors (and, by extension, rain rate). Finding a PDF becomes a matter of finding a function that describes the curve approximating the resulting distributions. Ideally, one PDF would exist for all cases, or many PDFs would exist that have the same functional form with only systematic variations in parameters (such as size or shape). Satisfying either of these cases will validate the theoretical basis of the Area Time Integral (ATI). Using the method of moments and Elderton's curve selection criteria, the Pearson Type 1 equation was identified as a potential fit for 89 percent of the observed distributions. Further analysis indicates that the Type 1 curve does approximate the shape of the distributions but quantitatively does not produce a great fit.
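Elderton's selection procedure can be sketched numerically: from the sample's central moments one forms β1 (squared skewness), β2 (kurtosis), and the κ criterion, with κ < 0 indicating a Pearson Type I curve. The Beta-distributed test sample below is my illustration, not the Montana radar data:

```python
import numpy as np

def pearson_kappa(data):
    """Elderton's kappa criterion from sample central moments (Type I when kappa < 0)."""
    x = np.asarray(data, dtype=float)
    d = x - x.mean()
    m2, m3, m4 = (d**2).mean(), (d**3).mean(), (d**4).mean()
    beta1 = m3**2 / m2**3  # squared skewness
    beta2 = m4 / m2**2     # kurtosis
    kappa = beta1 * (beta2 + 3)**2 / (
        4 * (4*beta2 - 3*beta1) * (2*beta2 - 3*beta1 - 6))
    return beta1, beta2, kappa

# A Beta(2, 5) sample follows a Pearson Type I law, so kappa should be negative.
rng = np.random.default_rng(0)
sample = rng.beta(2.0, 5.0, size=100_000)
beta1, beta2, kappa = pearson_kappa(sample)
print(beta1, beta2, kappa)
```

Fitting the Type I parameters themselves then proceeds by matching the first four moments, as in the study's method-of-moments approach.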
Towards a neuro-computational account of prism adaptation.
Petitet, Pierre; O'Reilly, Jill X; O'Shea, Jacinta
2017-12-14
Prism adaptation has a long history as an experimental paradigm used to investigate the functional and neural processes that underlie sensorimotor control. In the neuropsychology literature, prism adaptation behaviour is typically explained by reference to a traditional cognitive psychology framework that distinguishes putative functions, such as 'strategic control' versus 'spatial realignment'. This theoretical framework lacks conceptual clarity, quantitative precision and explanatory power. Here, we advocate for an alternative computational framework that offers several advantages: 1) an algorithmic explanatory account of the computations and operations that drive behaviour; 2) expressed in quantitative mathematical terms; 3) embedded within a principled theoretical framework (Bayesian decision theory, state-space modelling); 4) that offers a means to generate and test quantitative behavioural predictions. This computational framework offers a route towards mechanistic neurocognitive explanations of prism adaptation behaviour. Thus it constitutes a conceptual advance compared to the traditional theoretical framework. In this paper, we illustrate how Bayesian decision theory and state-space models offer principled explanations for a range of behavioural phenomena in the field of prism adaptation (e.g. visual capture, magnitude of visual versus proprioceptive realignment, spontaneous recovery and dynamics of adaptation memory). We argue that this explanatory framework can advance understanding of the functional and neural mechanisms that implement prism adaptation behaviour, by enabling quantitative tests of hypotheses that go beyond merely descriptive mapping claims that 'brain area X is (somehow) involved in psychological process Y'. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
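One standard state-space formulation of the kind invoked above is a two-rate (fast/slow) error-driven learner. A sketch with illustrative retention and learning rates; the parameter values and the 10-degree prism perturbation are assumptions for demonstration, not fitted data:

```python
import numpy as np

# Two-state error-driven learner: a fast process that learns and forgets
# quickly, and a slow process that learns slowly but retains well.
A_f, B_f = 0.59, 0.21    # fast retention / learning rate (illustrative)
A_s, B_s = 0.992, 0.02   # slow retention / learning rate (illustrative)

def simulate(perturbation):
    xf = xs = 0.0
    out = []
    for p in perturbation:
        error = p - (xf + xs)        # visual error on this trial
        xf = A_f * xf + B_f * error  # both states update in proportion to error
        xs = A_s * xs + B_s * error
        out.append(xf + xs)
    return np.array(out)

# Adapt to a 10-degree prism for 200 trials, then remove it (washout):
y = simulate([10.0] * 200 + [0.0] * 100)
print(y[199], y[-1])  # adaptation level after training, then after washout
```

In this framework, phenomena such as spontaneous recovery fall out of the differing time constants of the two states rather than requiring a separate 'strategic' module.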
UV absorption in metal decorated boron nitride flakes: a theoretical analysis of excited states
NASA Astrophysics Data System (ADS)
Chopra, Siddheshwar; Plasser, Felix
2017-10-01
The excited states of single metal atom (X = Co, Al, and Cu) doped boron nitride flakes (MBNF) B15N14H14-X and of the pristine boron nitride flake (B15N15H14) are studied by time-dependent density functional theory. The immediate effect of metal doping is a red shift of the onset of absorption from about 220 nm for the pristine BNF to above 300 nm for all metal-doped variants, with the biggest effect for MBNF-Co, which shows appreciable intensity even above 400 nm. These energy shifts are analysed by detailed wavefunction analysis protocols using visualisation methods, such as natural transition orbital analysis and electron-hole correlation plots, as well as quantitative analysis of the exciton size and electron-hole populations. The analysis shows that the Co and Cu atoms contribute strongly to the relevant states, whereas the aluminium atom is involved only to a lesser extent.
Review on investigations of antisense oligonucleotides with the use of mass spectrometry.
Studzińska, Sylwia
2018-01-01
Antisense oligonucleotides have been investigated as potential drugs for years. They inhibit target gene or protein expression. The present review summarizes their modifications, modes of action, and applications of liquid chromatography coupled with mass spectrometry for qualitative and quantitative analysis of these compounds. The most recent reports on a given topic were given prominence, while some early studies were reviewed in order to provide a theoretical background. The present review covers the issues of using ion-exchange chromatography, ion-pair reversed-phase high performance liquid chromatography and hydrophilic interaction chromatography for the separation of antisense oligonucleotides. The application of mass spectrometry was described with regard to the ionization type used for the determination of these potential therapeutics. Moreover, the current approaches and applications of mass spectrometry for quantitative analysis of antisense oligonucleotides and their metabolites as well as their impurities during in vitro and in vivo studies were discussed. Finally, certain conclusions and perspectives on the determination of therapeutic oligonucleotides in various samples were briefly described. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Baliukin, I. I.; Izmodenov, V. V.; Möbius, E.; Alexashov, D. B.; Katushkina, O. A.; Kucharek, H.
2017-12-01
Quantitative analysis of the interstellar heavy (oxygen and neon) atom fluxes obtained by the Interstellar Boundary Explorer (IBEX) suggests the existence of the secondary interstellar oxygen component. This component is formed near the heliopause due to charge exchange of interstellar oxygen ions with hydrogen atoms, as was predicted theoretically. A detailed quantitative analysis of the fluxes of interstellar heavy atoms is only possible with a model that takes into account both the filtration of primary and the production of secondary interstellar oxygen in the boundary region of the heliosphere as well as a detailed simulation of the motion of interstellar atoms inside the heliosphere. This simulation must take into account photoionization, charge exchange with the protons of the solar wind and solar gravitational attraction. This paper presents the results of modeling interstellar oxygen and neon atoms through the heliospheric interface and inside the heliosphere based on a three-dimensional kinetic-MHD model of the solar wind interaction with the local interstellar medium and a comparison of these results with the data obtained on the IBEX spacecraft.
Blank, Hartmut
2005-02-01
Traditionally, the causes of interference phenomena were sought in "real" or "hard" memory processes such as unlearning, response competition, or inhibition, which serve to reduce the accessibility of target items. I propose an alternative approach which does not deny the influence of such processes but highlights a second, equally important, source of interference-the conversion (Tulving, 1983) of accessible memory information into memory performance. Conversion is conceived as a problem-solving-like activity in which the rememberer tries to find solutions to a memory task. Conversion-based interference effects are traced to different conversion processes in the experimental and control conditions of interference designs. I present a simple theoretical model that quantitatively predicts the resulting amount of interference. In two paired-associate learning experiments using two different types of memory tests, these predictions were corroborated. Relations of the present approach to traditional accounts of interference phenomena and implications for eyewitness testimony are discussed.
Description and Application of a Mathematical Method for the Analysis of Harmony
Zuo, Qiting; Jin, Runfang; Ma, Junxia
2015-01-01
Harmony issues are widespread in human society and nature. To analyze these issues, harmony theory has been proposed as the main theoretical approach for the study of interpersonal relationships and relationships between humans and nature. Therefore, it is of great importance to study harmony theory. After briefly introducing the basic concepts of harmony theory, this paper expounds the five elements that are essential for the quantitative description of harmony issues in water resources management: harmony participant, harmony objective, harmony regulation, harmony factor, and harmony action. A basic mathematical equation for the harmony degree, that is, a quantitative expression of harmony issues, is introduced in the paper: HD = ai − bj, where a is the uniform degree, b is the difference degree, i is the harmony coefficient, and j is the disharmony coefficient. This paper also discusses harmony assessment and harmony regulation and introduces some application examples. PMID:26167535
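The harmony degree equation can be illustrated with a minimal numeric sketch. The function name, the bounds check, and the example values below are ours; only the formula HD = a·i − b·j and the meanings of its symbols come from the abstract:

```python
def harmony_degree(a, b, i, j):
    """Harmony degree HD = a*i - b*j, where a is the uniform degree,
    b is the difference degree, i is the harmony coefficient, and
    j is the disharmony coefficient."""
    if not (0.0 <= a <= 1.0 and 0.0 <= b <= 1.0):
        raise ValueError("degrees a and b must lie in [0, 1]")
    return a * i - b * j

# Hypothetical example: a mostly uniform system (a=0.75, b=0.25)
# with fully harmonious agreement (i=1) and fully disharmonious
# difference (j=1) gives HD = 0.75 - 0.25 = 0.5.
print(harmony_degree(0.75, 0.25, 1.0, 1.0))
```

Higher HD indicates a more harmonious state; the paper develops assessment and regulation methods on top of this basic quantity.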
Quantitative Study on Corrosion of Steel Strands Based on Self-Magnetic Flux Leakage.
Xia, Runchuan; Zhou, Jianting; Zhang, Hong; Liao, Leng; Zhao, Ruiqiang; Zhang, Zeyu
2018-05-02
This paper proposes a new computing method to quantitatively and non-destructively determine the corrosion of steel strands by analyzing the self-magnetic flux leakage (SMFL) signals from them. The magnetic dipole model and three growth models (Logistic, Exponential, and Linear) were proposed to theoretically analyze the characteristic value of the SMFL. An experimental study of corrosion detection with a magnetic sensor was then carried out, and the setup of the magnetic scanning device and the signal collection method are introduced. The results show that the Logistic growth model is the optimal model for calculating the magnetic field, with a good fitting effect. Combined with the experimental data analysis, the amplitudes of the calculated values (the B_xL(x,z) curves) generally agree with the measured values. This method offers significant application prospects for evaluating the corrosion and residual bearing capacity of steel strands.
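A hedged sketch of the model comparison the abstract describes: fitting Logistic, Exponential, and Linear growth models to corrosion-signal data and selecting the best by residual error. The parameterizations, the synthetic data, and the use of SciPy's curve_fit are our illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np
from scipy.optimize import curve_fit

# The three candidate growth models named in the paper
# (parameter names are ours).
def logistic(t, K, A, r):
    return K / (1.0 + A * np.exp(-r * t))

def exponential(t, c, r):
    # Clip the exponent so the optimizer cannot overflow.
    return c * np.exp(np.clip(r * t, -50.0, 50.0))

def linear(t, m, c):
    return m * t + c

# Hypothetical signal-vs-time data following a logistic trend.
t = np.linspace(0.0, 10.0, 50)
rng = np.random.default_rng(0)
y = logistic(t, 1.0, 20.0, 1.2) + rng.normal(0.0, 0.01, t.size)

# Fit each model and compare residual sums of squares.
fits = {}
for name, f, p0 in [("logistic", logistic, (1.0, 10.0, 1.0)),
                    ("exponential", exponential, (0.1, 0.3)),
                    ("linear", linear, (0.1, 0.0))]:
    p, _ = curve_fit(f, t, y, p0=p0, maxfev=10000)
    fits[name] = float(np.sum((f(t, *p) - y) ** 2))

best = min(fits, key=fits.get)
print(best)  # the logistic model should win on S-shaped data
```

On saturating (S-shaped) data neither a line nor a pure exponential can match the plateau, which is why the logistic model wins the residual comparison, mirroring the paper's conclusion.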
Estimating sub-surface dispersed oil concentration using acoustic backscatter response.
Fuller, Christopher B; Bonner, James S; Islam, Mohammad S; Page, Cheryl; Ojo, Temitope; Kirkey, William
2013-05-15
The recent Deepwater Horizon disaster resulted in a dispersed oil plume at an approximate depth of 1000 m. Several methods were used to characterize this plume with respect to concentration and spatial extent, including surface-supported sampling and autonomous underwater vehicles with in situ instrument payloads. Additionally, echo sounders were used to track the plume location, demonstrating the potential for remote detection using acoustic backscatter (ABS). This study evaluated the use of an Acoustic Doppler Current Profiler (ADCP) to quantitatively detect oil-droplet suspensions from the ABS response in a controlled laboratory setting. Results from this study showed log-linear ABS responses to oil-droplet volume concentration. However, the inability to reproduce ABS response factors suggests the difficulty of developing meaningful calibration factors for quantitative field analysis. Evaluation of theoretical ABS intensity derived from the particle size distribution provided insight regarding method sensitivity in the presence of interfering ambient particles. Copyright © 2013 Elsevier Ltd. All rights reserved.
Khatib, Rasha; Schwalm, Jon-David; Yusuf, Salim; Haynes, R. Brian; McKee, Martin; Khan, Maheer; Nieuwlaat, Robby
2014-01-01
Background Although the importance of detecting, treating, and controlling hypertension has been recognized for decades, the majority of patients with hypertension remain uncontrolled. The path from evidence to practice contains many potential barriers, but their role has not been reviewed systematically. This review aimed to synthesize and identify important barriers to hypertension control as reported by patients and healthcare providers. Methods Electronic databases MEDLINE, EMBASE and Global Health were searched systematically up to February 2013. Two reviewers independently selected eligible studies. Two reviewers categorized barriers based on a theoretical framework of behavior change. The theoretical framework suggests that a change in behavior requires a strong commitment to change [intention], the necessary skills and abilities to adopt the behavior [capability], and an absence of health system and support constraints. Findings Twenty-five qualitative studies and 44 quantitative studies met the inclusion criteria. In qualitative studies, health system barriers were most commonly discussed in studies of patients and health care providers. Quantitative studies identified disagreement with clinical recommendations as the most common barrier among health care providers. Quantitative studies of patients yielded different results: lack of knowledge was the most common barrier to hypertension awareness. Stress, anxiety and depression were most commonly reported as barriers that hindered or delayed adoption of a healthier lifestyle. In terms of hypertension treatment adherence, patients mostly reported forgetting to take their medication. Finally, priority setting barriers were most commonly reported by patients in terms of following up with their health care providers. Conclusions This review identified a wide range of barriers facing patients and health care providers pursuing hypertension control, indicating the need for targeted multi-faceted interventions. 
More methodologically rigorous studies that encompass the range of barriers and that include low- and middle-income countries are required in order to inform policies to improve hypertension control. PMID:24454721
QMRA for Drinking Water: 2. The Effect of Pathogen Clustering in Single-Hit Dose-Response Models.
Nilsen, Vegard; Wyller, John
2016-01-01
Spatial and/or temporal clustering of pathogens will invalidate the commonly used assumption of Poisson-distributed pathogen counts (doses) in quantitative microbial risk assessment. In this work, the theoretically predicted effect of spatial clustering in conventional "single-hit" dose-response models is investigated by employing the stuttering Poisson distribution, a very general family of count distributions that naturally models pathogen clustering and contains the Poisson and negative binomial distributions as special cases. The analysis is facilitated by formulating the dose-response models in terms of probability generating functions. It is shown formally that the theoretical single-hit risk obtained with a stuttering Poisson distribution is lower than that obtained with a Poisson distribution, assuming identical mean doses. A similar result holds for mixed Poisson distributions. Numerical examples indicate that the theoretical single-hit risk is fairly insensitive to moderate clustering, though the effect tends to be more pronounced for low mean doses. Furthermore, using Jensen's inequality, an upper bound on risk is derived that tends to better approximate the exact theoretical single-hit risk for highly overdispersed dose distributions. The bound holds with any dose distribution (characterized by its mean and zero inflation index) and any conditional dose-response model that is concave in the dose variable. Its application is exemplified with published data from Norovirus feeding trials, for which some of the administered doses were prepared from an inoculum of aggregated viruses. The potential implications of clustering for dose-response assessment as well as practical risk characterization are discussed. © 2016 Society for Risk Analysis.
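The core formal result can be reproduced in a few lines. In a single-hit model the risk is 1 − G(1 − r), where G is the probability generating function (PGF) of the dose distribution and r the per-pathogen infection probability. The numbers below are illustrative, and the negative binomial stands in for the more general stuttering Poisson family:

```python
import math

def poisson_pgf(s, mu):
    # PGF of Poisson(mu): G(s) = exp(mu * (s - 1))
    return math.exp(mu * (s - 1.0))

def negbin_pgf(s, mu, k):
    # PGF of a negative binomial with mean mu and dispersion k;
    # clustering grows as k -> 0, and Poisson is the k -> inf limit.
    return (1.0 + mu * (1.0 - s) / k) ** (-k)

def single_hit_risk(pgf, r, **params):
    # Single-hit dose-response: P(infection) = 1 - G(1 - r),
    # with r the probability that one pathogen initiates infection.
    return 1.0 - pgf(1.0 - r, **params)

mu, r = 10.0, 0.1
risk_poisson = single_hit_risk(poisson_pgf, r, mu=mu)          # 1 - e^(-1)
risk_clustered = single_hit_risk(negbin_pgf, r, mu=mu, k=0.5)  # lower
print(risk_clustered < risk_poisson)
```

At equal mean dose the clustered (overdispersed) distribution yields the lower single-hit risk, which is exactly the inequality proved in the paper.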
Maximizing the Biochemical Resolving Power of Fluorescence Microscopy
Esposito, Alessandro; Popleteeva, Marina; Venkitaraman, Ashok R.
2013-01-01
Most recent advances in fluorescence microscopy have focused on achieving spatial resolutions below the diffraction limit. However, the inherent capability of fluorescence microscopy to non-invasively resolve different biochemical or physical environments in biological samples has not yet been formally described, because an adequate and general theoretical framework is lacking. Here, we develop a mathematical characterization of the biochemical resolution in fluorescence detection using Fisher information analysis. To improve the precision and the resolution of quantitative imaging methods, we demonstrate strategies for the optimization of fluorescence lifetime, fluorescence anisotropy and hyperspectral detection, as well as different multi-dimensional techniques. We describe optimized imaging protocols, provide optimization algorithms, and characterize precision and resolving power in biochemical imaging through analysis of the general properties of Fisher information in fluorescence detection. These strategies enable the optimal use of the information content available within the limited photon budget typically available in fluorescence microscopy. This theoretical foundation leads to a generalized strategy for the optimization of multi-dimensional optical detection, and demonstrates how the parallel detection of all properties of fluorescence can maximize the biochemical resolving power of fluorescence microscopy, an approach we term Hyper Dimensional Imaging Microscopy (HDIM). Our work provides a theoretical framework for the description of the biochemical resolution in fluorescence microscopy, irrespective of spatial resolution, and for the development of a new class of microscopes that exploit multi-parametric detection systems. PMID:24204821
Ling, Gilbert N.
1970-01-01
A theoretical equation is presented for the control of cooperative adsorption on proteins and other linear macromolecules by hormones, drugs, ATP, and other "cardinal adsorbents." With reasonable accuracy, this equation describes quantitatively the control of oxygen binding to hemoglobin by 2,3-diphosphoglycerate and by inosine hexaphosphate. PMID:5272319
1987-10-01
durability test at 800 °C, 95% r.h. SEM photomicrograph at 1600x of E-8385 film spun-coat from a 2 wt% solution onto a ferrotype plate. Theoretical ... TiO2 to the high energy side. While Auger line shapes theoretically yield oxidation-state information, stoichiometry conclusions from experimental ... the justification for the methods chosen in this work. Fadley et al. [37] present a detailed theoretical discussion on quantitative XPS.
Visualization techniques to aid in the analysis of multispectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. 
Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding of the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.
On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment
NASA Astrophysics Data System (ADS)
Guterres, Rui M.
The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. 
Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is studied qualitatively and quantitatively. Its impact on the accuracy of the computations, as well as the incompatibility of wall drag with the theoretical model followed, is discussed. The newly developed quantitative analysis provides significantly increased accuracy: the aerodynamic drag coefficient is computed to within one percent of the balance-measured value for the best cases.
Compound analysis via graph kernels incorporating chirality.
Brown, J B; Urata, Takashi; Tamura, Takeyuki; Arai, Midori A; Kawabata, Takeo; Akutsu, Tatsuya
2010-12-01
High accuracy is paramount when predicting biochemical characteristics using Quantitative Structural-Property Relationships (QSPRs). Although existing graph-theoretic kernel methods combined with machine learning techniques are efficient for QSPR model construction, they cannot distinguish topologically identical chiral compounds which often exhibit different biological characteristics. In this paper, we propose a new method that extends the recently developed tree pattern graph kernel to accommodate stereoisomers. We show that Support Vector Regression (SVR) with a chiral graph kernel is useful for target property prediction by demonstrating its application to a set of human vitamin D receptor ligands currently under consideration for their potential anti-cancer effects.
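The pattern of coupling a precomputed (graph) kernel with Support Vector Regression can be sketched as follows. The chiral tree-pattern kernel itself is not reproduced here; a plain linear kernel on random stand-in "compound" features is substituted, and scikit-learn's precomputed-kernel interface is assumed:

```python
import numpy as np
from sklearn.svm import SVR

# Stand-in for the paper's chiral graph kernel: any positive
# semidefinite similarity matrix between compounds can be plugged
# in the same way. Here we fake one from random feature vectors.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))            # 40 "compounds", 5 features
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(0.0, 0.1, 40)

K_train = X[:30] @ X[:30].T             # Gram matrix, train vs train
K_test = X[30:] @ X[:30].T              # kernel values, test vs train

model = SVR(kernel="precomputed", C=10.0)
model.fit(K_train, y[:30])
pred = model.predict(K_test)
print(np.corrcoef(pred, y[30:])[0, 1])  # high correlation expected
```

Because SVR only ever touches the kernel matrix, swapping in a stereoisomer-aware graph kernel changes nothing else in the pipeline, which is what makes the paper's extension practical.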
Liquid-Crystal Point-Diffraction Interferometer for Wave-Front Measurements
NASA Technical Reports Server (NTRS)
Mercer, Carolyn R.; Creath, Katherine
1996-01-01
A new instrument, the liquid-crystal point-diffraction interferometer (LCPDI), is developed for the measurement of phase objects. This instrument maintains the compact, robust design of Linnik's point-diffraction interferometer and adds to it a phase-stepping capability for quantitative interferogram analysis. The result is a compact, simple-to-align, environmentally insensitive interferometer capable of accurately measuring optical wave fronts with very high data density and with automated data reduction. We describe the theory and design of the LCPDI. A focus shift was measured with the LCPDI, and the results are compared with theoretical results.
The size effects upon shock plastic compression of nanocrystals
NASA Astrophysics Data System (ADS)
Malygin, G. A.; Klyavin, O. V.
2017-10-01
For the first time, a theoretical analysis of size effects in the shock plastic compression of nanocrystals is carried out within a dislocation kinetic approach, based on the equations and relationships of dislocation kinetics. The yield point of the crystals τ_y is established as a quantitative function of their cross-section size D and the shock deformation rate ε̇, in the form τ_y ∼ ε̇^(2/3) D. This dependence is valid in the case of elastic stress relaxation by emission of dislocations from single-pole Frank-Read sources near the crystal surface.
Visualization of the hot chocolate sound effect by spectrograms
NASA Astrophysics Data System (ADS)
Trávníček, Z.; Fedorchenko, A. I.; Pavelka, M.; Hrubý, J.
2012-12-01
We present an experimental and a theoretical analysis of the hot chocolate effect. The sound effect is evaluated using time-frequency signal processing, resulting in a quantitative visualization by spectrograms. This method allows us to capture the whole phenomenon, namely to quantify the dynamics of the rising pitch. A general form of the time dependence of the volume fraction of the bubbles is proposed. We show that the effect occurs due to the nonlinear dependence of the speed of sound in the gas/liquid mixture on the volume fraction of the bubbles and the nonlinear time dependence of the volume fraction of the bubbles.
How to make Raman-inactive helium visible in Raman spectra of tritium-helium gas mixtures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schloesser, M.; Pakari, O.; Rupp, S.
2015-03-15
Raman spectroscopy, a powerful method for the quantitative compositional analysis of molecular gases, e.g. mixtures of hydrogen isotopologues, is not able to detect monoatomic species like helium. This deficit can be overcome by using radioluminescence emission from helium atoms induced by β-electrons from tritium decay. We present theoretical considerations and combined Raman/radioluminescence spectra. Furthermore, we discuss the linearity of the method together with validation measurements for determining the pressure dependence. Finally, we conclude how this technique can be used for samples of helium with traces of tritium, and vice versa. (authors)
Modelisation and distribution of neutron flux in radium-beryllium source (226Ra-Be)
NASA Astrophysics Data System (ADS)
Didi, Abdessamad; Dadouch, Ahmed; Jai, Otman
2017-09-01
The Monte Carlo N-Particle code (MCNP-6) is used to analyze the thermal, epithermal, and fast neutron fluxes of a 3 millicurie radium-beryllium source, with the aim of determining the qualitative and quantitative composition of many materials by the method of neutron activation analysis. The radium-beryllium neutron source is set up for practical work and research in the nuclear field. The main objective of this work was to characterize the flux profile of radium-beryllium irradiation; this theoretical study makes it possible to discuss the design of optimal irradiation and performance, enhancing the facility's research and education in nuclear physics.
Quantitative polarized light microscopy using spectral multiplexing interferometry.
Li, Chengshuai; Zhu, Yizheng
2015-06-01
We propose an interferometric spectral multiplexing method for measuring birefringent specimens with simple configuration and high sensitivity. The retardation and orientation of sample birefringence are simultaneously encoded onto two spectral carrier waves, generated interferometrically by a birefringent crystal through polarization mixing. A single interference spectrum hence contains sufficient information for birefringence determination, eliminating the need for mechanical rotation or electrical modulation. The technique is analyzed theoretically and validated experimentally on cellulose film. System simplicity permits the possibility of mitigating system birefringence background. Further analysis demonstrates the technique's exquisite sensitivity as high as ∼20 pm for retardation measurement.
Mean intensity of the vortex Bessel-Gaussian beam in turbulent atmosphere
NASA Astrophysics Data System (ADS)
Lukin, Igor P.
2017-11-01
In this work, the stability of vortex Bessel-Gaussian optical beams formed in turbulent atmosphere is considered theoretically. The spatial structure of the mean-intensity distribution of vortex Bessel-Gaussian optical beams in turbulent atmosphere is analyzed in detail. A quantitative criterion for the possibility of forming vortex Bessel-Gaussian optical beams in turbulent atmosphere is derived. It is shown that the stability of the form of a vortex Bessel-Gaussian optical beam during propagation in turbulent atmosphere increases with the value of the topological charge of the beam.
Dynamical Systems in Psychology: Linguistic Approaches
NASA Astrophysics Data System (ADS)
Sulis, William
Major goals for psychoanalysis and psychology are the description, analysis, prediction, and control of behaviour. Natural language has long provided the medium for the formulation of our theoretical understanding of behavior. But with the advent of nonlinear dynamics, a new language has appeared which offers promise to provide a quantitative theory of behaviour. In this paper, some of the limitations of natural and formal languages are discussed. Several approaches to understanding the links between natural and formal languages, as applied to the study of behavior, are discussed. These include symbolic dynamics, Moore's generalized shifts, Crutchfield's ɛ machines, and dynamical automata.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sokaras, D.; Andrianis, M.; Lagoyannis, A.
The cascade L-shell x-ray emission of metallic Fe, as the incident polarized and unpolarized monochromatic radiation passes over the 1s ionization threshold, is investigated by means of moderate-resolution, quantitative x-ray spectrometry. A full ab initio theoretical investigation of the L-shell x-ray emission processes is performed, based on a detailed, straightforward construction of the cascade decay trees within the Pauli-Fock approximation. The agreement obtained between experiment and the presented theory is discussed with respect to the accuracy of advanced atomic models, as well as its significance for the characterization capabilities of x-ray fluorescence (XRF) analysis.
Critical behavior of subcellular density organization during neutrophil activation and migration.
Baker-Groberg, Sandra M; Phillips, Kevin G; Healy, Laura D; Itakura, Asako; Porter, Juliana E; Newton, Paul K; Nan, Xiaolin; McCarty, Owen J T
2015-12-01
Physical theories of active matter continue to provide a quantitative understanding of dynamic cellular phenomena, including cell locomotion. Although various investigations of the rheology of cells have identified important viscoelastic and traction force parameters for use in these theoretical approaches, a key variable has remained elusive both in theoretical and experimental approaches: the spatiotemporal behavior of the subcellular density. The evolution of the subcellular density has been qualitatively observed for decades as it provides the source of image contrast in label-free imaging modalities (e.g., differential interference contrast, phase contrast) used to investigate cellular specimens. While these modalities directly visualize cell structure, they do not provide quantitative access to the structures being visualized. We present an established quantitative imaging approach, non-interferometric quantitative phase microscopy, to elucidate the subcellular density dynamics in neutrophils undergoing chemokinesis following uniform bacterial peptide stimulation. Through this approach, we identify a power law dependence of the neutrophil mean density on time with a critical point, suggesting a critical density is required for motility on 2D substrates. Next we elucidate a continuum law relating mean cell density, area, and total mass that is conserved during neutrophil polarization and migration. Together, our approach and quantitative findings will enable investigators to define the physics coupling cytoskeletal dynamics with subcellular density dynamics during cell migration.
Pütter, Carolin; Pechlivanis, Sonali; Nöthen, Markus M; Jöckel, Karl-Heinz; Wichmann, Heinz-Erich; Scherag, André
2011-01-01
Genome-wide association studies have identified robust associations between single nucleotide polymorphisms and complex traits. As the proportion of phenotypic variance explained is still limited for most of the traits, larger and larger meta-analyses are being conducted to detect additional associations. Here we investigate the impact of the study design and the underlying assumption about the true genetic effect in a bimodal mixture situation on the power to detect associations. We performed simulations of quantitative phenotypes analysed by standard linear regression and dichotomized case-control data sets from the extremes of the quantitative trait analysed by standard logistic regression. Using linear regression, markers with an effect in the extremes of the traits were almost undetectable, whereas analysing extremes by case-control design had superior power even for much smaller sample sizes. Two real data examples are provided to support our theoretical findings and to explore our mixture and parameter assumption. Our findings support the idea to re-analyse the available meta-analysis data sets to detect new loci in the extremes. Moreover, our investigation offers an explanation for discrepant findings when analysing quantitative traits in the general population and in the extremes. Copyright © 2011 S. Karger AG, Basel.
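The paper's central contrast can be mimicked in a small simulation; all parameter values and the mixture mechanism below are our illustrative assumptions. A marker that only shifts membership in an extreme trait component is hard to detect by linear regression on a random genotyped subsample, but easy to detect by an extreme-sampling case-control design with the same genotyping budget:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def simulate(n=2000, maf=0.3):
    """Bimodal mixture: the marker raises the probability of falling
    into a high-shifted 'extreme' component, not the overall mean."""
    g = rng.binomial(2, maf, n)                      # SNP genotype 0/1/2
    in_extreme = rng.random(n) < (0.02 + 0.06 * g)   # component membership
    y = rng.normal(0.0, 1.0, n) + 4.0 * in_extreme   # quantitative trait
    return g, y

def p_linear(g, y, n_geno=400):
    # Standard linear regression on a random genotyped subsample
    # (same genotyping budget as the extremes design below).
    idx = rng.choice(g.size, n_geno, replace=False)
    return stats.linregress(g[idx], y[idx]).pvalue

def p_extremes(g, y, frac=0.1):
    # Case-control design: genotype only the two tails of the trait
    # distribution and compare genotype means between them.
    lo, hi = np.quantile(y, [frac, 1.0 - frac])
    return stats.ttest_ind(g[y >= hi], g[y <= lo]).pvalue

reps, alpha = 100, 0.05
power_lin = np.mean([p_linear(*simulate()) < alpha for _ in range(reps)])
power_ext = np.mean([p_extremes(*simulate()) < alpha for _ in range(reps)])
print(power_ext, power_lin)  # extremes design should win clearly
```

Under this mixture the regression slope is diluted by the 90% of unaffected individuals, while the upper tail is strongly enriched for carriers, reproducing the qualitative power gap the simulations in the paper report.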
Wang, Ji; Zhou, Chuang; Zhang, Wei; Yao, Jun; Lu, Haojie; Dong, Qiongzhu; Zhou, Haijun; Qin, Lunxiu
2014-01-15
The complexity of protein glycosylation makes it difficult to characterize glycosylation patterns on a proteomic scale. In this study, we developed an integrated strategy for quantitative comparative analysis of N-glycosylation/glycoproteins in complex biological samples in a high-throughput manner. This strategy entailed separating and enriching glycopeptides/glycoproteins using lectin affinity chromatography, then tandem labeling them with 18O/16O to generate a mass shift of 6 Da between paired glycopeptides, and finally analyzing them with liquid chromatography-mass spectrometry (LC-MS) and the automatic quantitative method we developed based on Mascot Distiller. The accuracy and repeatability of this strategy were first verified using standard glycoproteins; linearity was maintained within a range of 1:10-10:1. The peptide concentration ratios obtained by the self-built quantitative method were similar to both the manually calculated and theoretical values, with a standard deviation (SD) of 0.023-0.186 for glycopeptides. The feasibility of the strategy was further confirmed with serum from hepatocellular carcinoma (HCC) patients and healthy individuals; the expression of 44 glycopeptides and 30 glycoproteins differed significantly between HCC patient and control serum. This strategy is accurate, repeatable, and efficient, and may be a useful tool for identifying disease-related N-glycosylation/glycoprotein changes.
A Taylor weak-statement algorithm for hyperbolic conservation laws
NASA Technical Reports Server (NTRS)
Baker, A. J.; Kim, J. W.
1987-01-01
Finite element analysis, applied to computational fluid dynamics (CFD) problem classes, presents a formal procedure for establishing the ingredients of a discrete approximation numerical solution algorithm. A classical Galerkin weak-statement formulation, formed on a Taylor series extension of the conservation law system, is developed herein that embeds a set of parameters eligible for constraint according to specification of suitable norms. The derived family of Taylor weak statements is shown to contain, as special cases, over one dozen independently derived CFD algorithms published over the past several decades for the high speed flow problem class. A theoretical analysis is completed that facilitates direct qualitative comparisons. Numerical results for definitive linear and nonlinear test problems permit direct quantitative performance comparisons.
NASA Astrophysics Data System (ADS)
Melelli, Laura; Liucci, Luisa; Vergari, Francesca; Ciccacci, Sirio; Del Monte, Maurizio
2014-05-01
Drainage basins are primary landscape units for geomorphological investigations. Both hillslopes and the river drainage system are fundamental components of drainage basin analysis. Like other geomorphological systems, drainage basins tend toward an equilibrium condition in which the sequence of erosion, transport and sedimentation approaches a state of minimum energy expenditure. This state is revealed by a characteristic geometry of landforms and of the drainage net. Several morphometric indexes can measure how far a drainage basin is from its theoretical equilibrium configuration, revealing possible external perturbations. In tectonically active areas, drainage basins are of primary importance for highlighting the style, amount and rate of tectonic impulses, and morphometric indexes allow the tectonic activity of different sectors of a study area to be classified. Moreover, drainage networks are characterized by a self-similar structure, which motivates the use of fractal theory to investigate the system. In this study, fractal techniques are employed together with quantitative geomorphological analysis to study the Upper Tiber Valley (UTV), a tectonic intermontane basin located in the northern Apennines (Umbria, central Italy). The area is the result of different tectonic phases. From the Late Pliocene to the present, the UTV has been strongly controlled by regional uplift and by an extensional phase in which different sets of normal faults play a fundamental role in basin morphology. Thirty-four basins are taken into account for the quantitative analysis, twenty on the left side of the basin and the remaining fourteen on the right side. Using the fractal dimension of the drainage networks, Horton's law results, concavity and steepness indexes, and hypsometric curves, this study aims to obtain an evolutionary model of the UTV in which regional uplift is compared with local subsidence induced by normal fault activity.
The results highlight a well-defined difference between the western and eastern tributary basins, suggesting a greater disequilibrium in the latter. The quantitative analysis points out the segments of the basin boundaries where fault activity is most efficient, and the resulting geomorphological implications.
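One of the morphometric tools mentioned above, the fractal dimension of a drainage network, is commonly estimated by box counting. The following is a minimal sketch on a synthetic binary grid (not the UTV drainage data): the dimension is the slope of log N(s) versus log(1/s), where N(s) is the number of s-by-s boxes touched by the pattern.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a binary image.

    N(s) = number of s x s boxes containing part of the pattern; the
    dimension is the slope of log N(s) against log(1/s).
    """
    n = mask.shape[0]
    counts = []
    for s in sizes:
        # Count occupied s x s boxes by block-reshaping the mask.
        blocks = mask[: n - n % s, : n - n % s].reshape(n // s, s, -1, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check on a straight "channel": a 1-D feature should give D ~ 1,
# while real dendritic drainage networks fall between 1 and 2.
grid = np.zeros((256, 256), dtype=bool)
np.fill_diagonal(grid, True)
print(round(box_counting_dimension(grid), 2))
```

Applied to a rasterized channel network, values of D closer to 2 indicate a more space-filling (better developed) network, which is one way tributary basins on the two sides of a valley can be compared quantitatively.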
ERIC Educational Resources Information Center
Smeyers, Paul
2008-01-01
Generally educational research is grounded in the empirical traditions of the social sciences (commonly called quantitative and qualitative methods) and is as such distinguished from other forms of scholarship such as theoretical, conceptual or methodological essays, critiques of research traditions and practices and those studies grounded in the…
ERIC Educational Resources Information Center
Yilmaz, Kaya
2013-01-01
There has been much discussion about quantitative and qualitative approaches to research in different disciplines. In the behavioural and social sciences, these two paradigms are compared to reveal their relative strengths and weaknesses. But the debate about both traditions has commonly taken place in academic books. It is hard to find an article…
ERIC Educational Resources Information Center
Balarabe Kura, Sulaiman Y.
2012-01-01
There is a germane relationship between qualitative and quantitative approaches to social science research. The relationship is empirically and theoretically demonstrated by poverty researchers. The study of poverty, as argued in this article, is a study of both numbers and contextualities. This article provides a general overview of qualitative…
Sneck, Sami; Saarnio, Reetta; Isola, Arja; Boigu, Risto
2016-01-01
Medication administration is an important task of registered nurses. According to previous studies, nurses lack theoretical knowledge and drug calculation skills, and knowledge-based mistakes do occur in clinical practice. Finnish health care organizations started to develop systematic verification processes for medication competence at the end of the last decade, but no studies have yet examined nurses' theoretical knowledge and drug calculation skills according to these online exams. The aim of this study was to describe the medication competence of Finnish nurses according to theoretical and drug calculation exams. A descriptive correlational design was adopted. All nurses who participated in the online exam in three Finnish hospitals between 1 January 2009 and 31 May 2014 were included in the study (n=2479). Quantitative methods such as Pearson's chi-squared tests, analysis of variance (ANOVA) with post hoc Tukey tests, and Pearson's correlation coefficient were used to test for relationships between dependent and independent variables. The majority of nurses mastered the theoretical knowledge needed in medication administration, but 5% of the nurses struggled to pass the drug calculation exam. Theoretical knowledge and drug calculation skills were better in acute care units than in the other units, and younger nurses achieved better results in both exams than their older colleagues. The differences found in this study were statistically significant but small. Nevertheless, even small deficiencies in theoretical knowledge and drug calculation skills should be addressed. It is important to identify the nurses who struggle in the exams and to plan targeted educational interventions to support them. The next step is to study whether verification of medication competence has an effect on patient safety. Copyright © 2015 Elsevier Ltd. All rights reserved.
Moreno-Poyato, Antonio R; Delgado-Hito, Pilar; Suárez-Pérez, Raquel; Leyva-Moral, Juan M; Aceña-Domínguez, Rosa; Carreras-Salvador, Regina; Roldán-Merino, Juan F; Lluch-Canut, Teresa; Montesó-Curto, Pilar
2017-01-01
Psychiatric nurses are aware of the importance of the therapeutic relationship in psychiatric units. Nevertheless, a review of the scientific evidence indicates that theoretical knowledge alone is insufficient to establish an adequate therapeutic alliance. Therefore, strategies are required to promote changes to enhance the establishment of the working relationship. The aims of the study are to generate changes in how nurses establish the therapeutic relationship in acute psychiatric units, based on participative action research, and to evaluate the effectiveness of the implementation of evidence through this method. The study will use a mixed-methods design. Qualitative methodology, through participative action research, will be employed to implement scientific evidence on the therapeutic relationship. A quasi-experimental, one-group, pre-test/post-test design will also be used to quantitatively measure the effectiveness of the implementation of the evidence. Participants will consist of nurses and patients from two psychiatric units in Barcelona. Nurses will be selected by theoretical sampling, and patients assigned to each nurse will be selected by consecutive sampling. Qualitative data will be gathered through discussion groups and field diaries. Quantitative data will be collected through the Working Alliance Inventory and the Interpersonal Reactivity Index. Qualitative data will be analysed through the technique of content analysis, and quantitative data through descriptive and inferential statistics. This study will help to understand the process of change in a nursing team working in an inpatient psychiatric ward and will allow nurses to generate knowledge, identify difficulties, and establish strategies to implement change, as well as to assess whether the quality of the care they provide shows a qualitative improvement.
2017-01-01
Cell size distribution is highly reproducible, whereas the size of individual cells often varies greatly within a tissue. This is obvious in a population of Arabidopsis thaliana leaf epidermal cells, which ranged from 1,000 to 10,000 μm2 in size. Endoreduplication is a specialized cell cycle in which nuclear genome size (ploidy) is doubled in the absence of cell division. Although epidermal cells require endoreduplication to enhance cellular expansion, the issue of whether this mechanism is sufficient for explaining cell size distribution remains unclear due to a lack of quantitative understanding linking the occurrence of endoreduplication with cell size diversity. Here, we addressed this question by quantitatively summarizing ploidy profile and cell size distribution using a simple theoretical framework. We first found that endoreduplication dynamics is a Poisson process through cellular maturation. This finding allowed us to construct a mathematical model to predict the time evolution of a ploidy profile with a single rate constant for endoreduplication occurrence in a given time. We reproduced experimentally measured ploidy profile in both wild-type leaf tissue and endoreduplication-related mutants with this analytical solution, further demonstrating the probabilistic property of endoreduplication. We next extended the mathematical model by incorporating the element that cell size is determined according to ploidy level to examine cell size distribution. This analysis revealed that cell size is exponentially enlarged 1.5 times every endoreduplication round. Because this theoretical simulation successfully recapitulated experimentally observed cell size distributions, we concluded that Poissonian endoreduplication dynamics and exponential size-boosting are the sources of the broad cell size distribution in epidermal tissue. 
More generally, this study contributes to a quantitative understanding whereby stochastic dynamics generate steady-state biological heterogeneity. PMID:28926847
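The two quantitative ingredients identified above, Poissonian endoreduplication and exponential size boosting, can be sketched in a few lines. The rate parameter and base cell size below are assumed for illustration only; the 1.5-fold size boost per round and the roughly 1,000-10,000 μm2 size range are the figures reported in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters for illustration (not fitted values from the paper):
lam_t = 2.0     # expected number of endoreduplication rounds by maturity
s0 = 1000.0     # size (um^2) of a 2C cell, near the observed minimum
boost = 1.5     # size multiplier per endoreduplication round (from the study)

rounds = rng.poisson(lam_t, 10000)   # Poissonian endoreduplication counts
ploidy = 2 * 2 ** rounds             # 2C, 4C, 8C, ... per cell
size = s0 * boost ** rounds          # exponential size boosting

print("ploidy classes:", sorted(set(int(p) for p in ploidy[rounds <= 3])))
print(f"size range: {size.min():.0f}-{size.max():.0f} um^2")
```

Even with a single rate constant, the Poisson spread in round number combined with the multiplicative size rule produces the broad, right-skewed size distribution the study reports.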
A Quantitative Approach to Assessing System Evolvability
NASA Technical Reports Server (NTRS)
Christian, John A., III
2004-01-01
When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.
Quantitative analysis on PUVA-induced skin photodamages using optical coherence tomography
NASA Astrophysics Data System (ADS)
Zhai, Juan; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Zeng, Changchun; Jin, Ying
2009-08-01
Psoralen plus ultraviolet A radiation (PUVA) therapy is an important clinical treatment for skin diseases such as vitiligo and psoriasis, but it is associated with an increased risk of skin photodamage, especially photoaging. Since skin biopsy alters the original skin morphology and always entails iatrogenic trauma, optical coherence tomography (OCT) appears to be a promising technique for studying skin damage in vivo. In this study, Balb/c mice treated with 8-methoxypsoralen (8-MOP) prior to UVA irradiation were used as the PUVA-induced photodamage model. In vivo OCT images of dorsal skin were obtained from the photodamaged (model) and normal (control) groups at 0, 24, 48 and 72 hours after irradiation, and the results were quantitatively analyzed together with histological information. The experimental results showed that PUVA-damaged skin had an increased epidermal thickness (ET), a reduced attenuation coefficient of the OCT signal, and increased brightness of the epidermal layer compared with the control group. In conclusion, noninvasive high-resolution imaging techniques such as OCT may be a promising tool for photobiological studies aimed at assessing photodamage and repair processes in vivo. OCT can be used for quantitative analysis of changes in photodamaged skin, such as the ET and the dermal collagen, providing a theoretical basis for the treatment and prevention of skin photodamage.
Study of the Electronic Surface States of III-V Compounds and Silicon.
1981-03-31
region in metal/Si interfaces is thus at most a quantitative , with increasing intermixing going from Ag/Si, Cu/Si, Ni/Si, PdSi to Au/Si. This...to- At the present time, the above argument on cross sections noise ratio better than 102. can not be put in a completely quantitative way since the...of the intensity (0 - 23 in Fig. 2) when the system search effort (also theoretical) is made for a more quantitative becomes richer in the metal
Theoretical foundations for a quantitative approach to paleogenetics. I, II.
NASA Technical Reports Server (NTRS)
Holmquist, R.
1972-01-01
It is shown that neglecting the phenomena of multiple hits, back mutation, and chance coincidence can introduce errors larger than 100% into the calculated value of the average number of nucleotide base differences expected between two homologous polynucleotides. Mathematical formulas are derived to correct quantitatively for these effects. It is pointed out that these effects materially change the quantitative aspects of phylogenies, such as the branch lengths of the trees. A number of problems are solved without approximation.
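The kind of correction described above can be illustrated with the standard one-parameter (Jukes-Cantor) formula, which corrects an observed fraction of differing sites for multiple hits, back mutation, and chance coincidence. This is a textbook stand-in for illustration, not Holmquist's own formulas.

```python
import math

def jc_corrected_distance(p_observed):
    """Correct an observed fraction of differing sites for multiple hits,
    back mutation and chance coincidence (Jukes-Cantor one-parameter model,
    a standard correction in the same spirit as the paper's formulas)."""
    if p_observed >= 0.75:
        raise ValueError("saturated: the correction diverges at p = 3/4")
    return -0.75 * math.log(1.0 - 4.0 * p_observed / 3.0)

for p in (0.10, 0.30, 0.50, 0.70):
    d = jc_corrected_distance(p)
    print(f"observed {p:.2f} -> corrected {d:.3f} ({100 * (d - p) / p:.0f}% larger)")
```

At large observed divergence the corrected value exceeds the naive count by well over 100%, which is exactly the magnitude of error the paper warns about.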
NASA Astrophysics Data System (ADS)
Labate, Demetrio; Negi, Pooran; Ozcan, Burcin; Papadakis, Manos
2015-09-01
As advances in imaging technologies make more and more data available for biomedical applications, there is an increasing need to develop efficient quantitative algorithms for the analysis and processing of imaging data. In this paper, we introduce an innovative multiscale approach called the Directional Ratio, which is especially effective at distinguishing isotropic from anisotropic structures. This task is especially useful in the analysis of images of neurons, the main units of the nervous system, which consist of a main cell body called the soma and many elongated processes called neurites. We analyze the theoretical properties of our method on idealized models of neurons and develop a numerical implementation of this approach for the analysis of fluorescent images of cultured neurons. We show that this algorithm is very effective for the detection of somas and the extraction of neurites in images of small circuits of neurons.
Commentary on factors affecting transverse vibration using an idealized theoretical equation
Joseph F. Murphy
2000-01-01
An idealized theoretical equation to calculate flexural stiffness using transverse vibration of a simply end-supported beam is being considered by the American Society of Testing and Materials (ASTM) Wood Committee D07 to determine lumber modulus of elasticity. This commentary provides the user a quantitative view of six factors that affect the accuracy of using the...
Shutin, Dmitriy; Zlobinskaya, Olga
2010-02-01
The goal of this contribution is to apply model-based information-theoretic measures to the quantification of relative differences between immunofluorescent signals. Several models for approximating the empirical fluorescence intensity distributions are considered, namely Gaussian, Gamma, Beta, and kernel densities. As a distance measure the Hellinger distance and the Kullback-Leibler divergence are considered. For the Gaussian, Gamma, and Beta models the closed-form expressions for evaluating the distance as a function of the model parameters are obtained. The advantages of the proposed quantification framework as compared to simple mean-based approaches are analyzed with numerical simulations. Two biological experiments are also considered. The first is the functional analysis of the p8 subunit of the TFIIH complex responsible for a rare hereditary multi-system disorder--trichothiodystrophy group A (TTD-A). In the second experiment the proposed methods are applied to assess the UV-induced DNA lesion repair rate. A good agreement between our in vivo results and those obtained with an alternative in vitro measurement is established. We believe that the computational simplicity and the effectiveness of the proposed quantification procedure will make it very attractive for different analysis tasks in functional proteomics, as well as in high-content screening. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Polverino, Pierpaolo; Frisk, Erik; Jung, Daniel; Krysander, Mattias; Pianese, Cesare
2017-07-01
The present paper proposes an advanced approach for fault detection and isolation in Polymer Electrolyte Membrane Fuel Cell (PEMFC) systems through a model-based diagnostic algorithm. The algorithm is developed upon a lumped-parameter model simulating a whole PEMFC system oriented towards automotive applications. This model is inspired by other models available in the literature, with further attention to stack thermal dynamics and water management. The developed model is analysed by means of Structural Analysis to identify the correlations among the involved physical variables, the defined equations, and a set of faults which may occur in the system (related to both auxiliary component malfunctions and stack degradation phenomena). Residual generators are designed by means of Causal Computation analysis, and the maximum theoretical fault isolability achievable with a minimal number of installed sensors is investigated. The achieved results prove the capability of the algorithm to theoretically detect and isolate almost all faults using only stack voltage and temperature sensors, with significant advantages from an industrial point of view. The effective fault isolability is demonstrated through fault simulations at a specific fault magnitude with an advanced residual evaluation technique that considers quantitative residual deviations from normal conditions and achieves unambiguous fault isolation.
Isotropic differential phase contrast microscopy for quantitative phase bio-imaging.
Chen, Hsi-Hsun; Lin, Yu-Zi; Luo, Yuan
2018-05-16
Quantitative phase imaging (QPI) has been investigated to retrieve the optical phase information of an object and has been applied to biological microscopy and related medical studies. In recent examples, differential phase contrast (DPC) microscopy can recover the phase image of a thin sample from multi-axis intensity measurements in a wide-field scheme. Unlike conventional DPC, we propose a new method, based on a theoretical treatment of partially coherent imaging, to achieve isotropic differential phase contrast (iDPC) with high accuracy and stability for phase recovery in a simple and high-speed fashion. The iDPC is implemented with a partially coherent microscope and a programmable thin-film transistor (TFT) shield to digitally modulate structured illumination patterns for QPI. In this article, simulation results show the consistency of our theoretical approach for iDPC under partial coherence. In addition, we demonstrate experimental quantitative phase images of a standard micro-lens array, as well as label-free live human cell samples. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Fitness to work of astronauts in conditions of action of the extreme emotional factors
NASA Astrophysics Data System (ADS)
Prisniakova, L. M.
2004-01-01
A theoretical model for the quantitative determination of the influence of the level of emotional exertion on the success of human activity is presented. Learning curves for memorized words in groups with different levels of emotional exertion are analyzed. The obtained magnitudes of the time constant T, which depend on the type of emotional exertion, provide a quantitative measure of that exertion. The time constants could also be used to predict an astronaut's fitness to work under extreme conditions. An inversion of the sign of the influence on the efficiency of human activity is detected. The paper offers a mathematical model of the relation between successful activity and motivation or emotional exertion (the Yerkes-Dodson law). The proposed models can serve as a theoretical basis for quantitative characteristics used to assess astronaut performance under emotional factors during selection.
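The time-constant description above is consistent with a first-order learning-curve model, sketched below. The functional form, word limits, and time constants are illustrative assumptions, not the paper's fitted values.

```python
import math

def learning_curve(t, limit, T):
    """First-order learning model: recall approaches its limit with time
    constant T; a larger T (slower learning) would reflect a higher level
    of emotional exertion in the framework described above."""
    return limit * (1.0 - math.exp(-t / T))

# Two hypothetical groups memorizing 20 words: T is the quantitative
# measure separating low from high emotional exertion.
for label, T in (("low exertion", 2.0), ("high exertion", 5.0)):
    recalled = [learning_curve(trial, 20, T) for trial in range(1, 6)]
    print(label, [round(r, 1) for r in recalled])
```

At t = T the curve reaches about 63% of its limit, so comparing fitted T values across groups gives a single-number summary of how exertion slows learning.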
Noninvasive identification of the total peripheral resistance baroreflex
NASA Technical Reports Server (NTRS)
Mukkamala, Ramakrishna; Toska, Karin; Cohen, Richard J.
2003-01-01
We propose two identification algorithms for quantitating the total peripheral resistance (TPR) baroreflex, an important contributor to short-term arterial blood pressure (ABP) regulation. Each algorithm analyzes beat-to-beat fluctuations in ABP and cardiac output, which may both be obtained noninvasively in humans. For a theoretical evaluation, we applied both algorithms to a realistic cardiovascular model. The results contrasted with only one of the algorithms proving to be reliable. This algorithm was able to track changes in the static gains of both the arterial and cardiopulmonary TPR baroreflex. We then applied both algorithms to a preliminary set of human data and obtained contrasting results much like those obtained from the cardiovascular model, thereby making the theoretical evaluation results more meaningful. This study suggests that, with experimental testing, the reliable identification algorithm may provide a powerful, noninvasive means for quantitating the TPR baroreflex. This study also provides an example of the role that models can play in the development and initial evaluation of algorithms aimed at quantitating important physiological mechanisms.
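The static relation underlying TPR identification is Ohm's-law-like: resistance is roughly mean arterial pressure divided by cardiac output. A minimal beat-to-beat sketch with hypothetical values follows; the paper's algorithms go well beyond this, identifying the baroreflex from the dynamic fluctuations around this relation.

```python
# Beat-to-beat estimate TPR_n = MAP_n / CO_n, neglecting venous pressure.
# All values below are hypothetical, for illustration only.
map_mmHg = [92.0, 95.5, 90.1, 88.7]   # mean arterial pressure per beat (mmHg)
co_l_min = [5.1, 5.4, 5.0, 4.8]       # cardiac output per beat (L/min)

tpr = [p / q for p, q in zip(map_mmHg, co_l_min)]  # mmHg per (L/min)
print([round(r, 2) for r in tpr])
```

Both inputs can be obtained noninvasively, which is what makes the fluctuation-based identification of the TPR baroreflex attractive in humans.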
Sahoo, Sagarika; Adhikari, Chandana; Kuanar, Minati; Mishra, Bijay K
2016-01-01
The synthesis of organic compounds with specific biological activity or physicochemical characteristics requires a thorough analysis of the innumerable data sets available in the literature. Quantitative structure-property/activity relationships (QSP/AR) simplify this by predicting the structure of a compound with an optimized activity; for this purpose a vast set of molecular descriptors (MDs) exists. This review surveys the generation of molecular descriptors and their probable applications in QSP/AR. The literature has been collected from a wide range of research journals, citable web reports, seminar proceedings and books. The MDs are classified according to their generation, and their applications to QSP/AR are also reported in this review. MDs can be classified into experimental and theoretical types, with the latter subdivided into structural and quantum chemical descriptors. The structural parameters are derived from molecular graphs or the topology of the molecules; even the pixels of a molecular image can be used as molecular descriptors. In QSPR studies, the physicochemical properties include boiling point, heat capacity, density, refractive index, molar volume, surface tension, heat of formation, octanol-water partition coefficient, solubility, and chromatographic retention indices. Among biological activities, toxicity, antimalarial activity, sensory irritancy, local anesthetic potency, tadpole narcosis, antifungal activity, and enzyme-inhibiting activity are some important parameters in QSAR studies. The classification of MDs is mostly generic in nature, and their application in QSP/AR also has a generic link. Experimental MDs are more suitable for correlation analysis than theoretical ones, but are more expensive to generate.
With the advent of sophisticated computational tools and experimental design, the proliferation of MDs is inevitable; but for a highly optimized MD, the study of descriptor generation is an unending process.
Kia, Seyed Mostafa; Vega Pons, Sandro; Weisz, Nathan; Passerini, Andrea
2016-01-01
Brain decoding is a popular multivariate approach for hypothesis testing in neuroimaging. Linear classifiers are widely employed in the brain decoding paradigm to discriminate among experimental conditions. Then, the derived linear weights are visualized in the form of multivariate brain maps to further study spatio-temporal patterns of underlying neural activities. It is well known that the brain maps derived from weights of linear classifiers are hard to interpret because of high correlations between predictors, low signal to noise ratios, and the high dimensionality of neuroimaging data. Therefore, improving the interpretability of brain decoding approaches is of primary interest in many neuroimaging studies. Despite extensive studies of this type, at present, there is no formal definition for interpretability of multivariate brain maps. As a consequence, there is no quantitative measure for evaluating the interpretability of different brain decoding methods. In this paper, first, we present a theoretical definition of interpretability in brain decoding; we show that the interpretability of multivariate brain maps can be decomposed into their reproducibility and representativeness. Second, as an application of the proposed definition, we exemplify a heuristic for approximating the interpretability in multivariate analysis of evoked magnetoencephalography (MEG) responses. Third, we propose to combine the approximated interpretability and the generalization performance of the brain decoding into a new multi-objective criterion for model selection. Our results, for the simulated and real MEG data, show that optimizing the hyper-parameters of the regularized linear classifier based on the proposed criterion results in more informative multivariate brain maps. 
More importantly, the presented definition provides the theoretical background for quantitative evaluation of interpretability, and hence, facilitates the development of more effective brain decoding algorithms in the future.
Beatty, Garrett F; Cranley, Nicole M; Carnaby, Giselle; Janelle, Christopher M
2016-03-01
Emotions motivate individuals to attain appetitive goals and avoid aversive consequences. Empirical investigations have detailed how broad approach and avoidance orientations are reflected in fundamental movement attributes such as the speed, accuracy, and variability of motor actions. Several theoretical perspectives propose explanations for how emotional states influence the speed with which goal-directed movements are initiated. These perspectives include biological predisposition, muscle activation, distance regulation, cognitive evaluation, and evaluative response coding accounts. A comprehensive review of the literature and a meta-analysis were undertaken to quantify empirical support for these theoretical perspectives. The systematic review yielded 34 studies that contained 53 independent experiments producing 128 effect sizes used to evaluate the predictions of existing theories. The central tenets of the biological predisposition (Hedges' g = -0.356), distance regulation (g = -0.293; g = 0.243), and cognitive evaluation (g = -0.249; g = -0.405; g = -0.174) accounts were supported. Partial support was also identified for the evaluative response coding (g = -0.255) framework. Our findings provide quantitative evidence that substantiates existing theoretical perspectives and provide potential direction for conceptual integration of these independent perspectives. Recommendations for future empirical work in this area are discussed. (c) 2016 APA, all rights reserved.
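The pooled effect sizes quoted above are Hedges' g values. A minimal sketch of how a single g is computed from two group summaries follows; the reaction-time numbers are hypothetical, not data from the review.

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g: standardized mean difference with the small-sample
    correction factor J applied to Cohen's d (pooled SD)."""
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                        # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)   # small-sample correction
    return j * d

# Hypothetical example: faster movement initiation toward pleasant stimuli
# (smaller reaction time in group 1) yields a negative g.
print(round(hedges_g(310.0, 40.0, 25, 325.0, 42.0, 25), 3))
```

Negative g values in the review thus indicate faster initiation under the emotion condition, matching the sign conventions of the quoted results.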
NASA Technical Reports Server (NTRS)
Palosz, B.; Grzanka, E.; Stelmakh, S.; Pielaszek, R.; Bismayer, U.; Weber, H. P.; Janik, J. F.; Palosz, W.; Curreri, Peter A. (Technical Monitor)
2002-01-01
The effect of the chemical state of the surface of nanoparticles on the relaxation in the near-surface layer was examined using the concept of the apparent lattice parameter (alp) determined for different diffraction vectors Q. The apparent lattice parameter is a lattice parameter determined either from an individual Bragg reflection, or from a selected region of the diffraction pattern. At low diffraction vectors the Bragg peak positions are affected mainly by the structure of the near-surface layer, while at high Q-values only the interior of the nano-grain contributes to the diffraction pattern. Following the measurements on raw (as prepared) powders we investigated powders cleaned by annealing at 400C under vacuum, and the same powders wetted with water. Theoretical alp-Q plots showed that the structure of the surface layer depends on the sample treatment. Semi-quantitative analysis based on the comparison of the experimental and theoretical alp-Q plots was performed. Theoretical alp-Q relations were obtained from the diffraction patterns calculated for models of nanocrystals with a strained surface layer using the Debye functions.
Eddy, Nnabuk O; Ita, Benedict I
2011-02-01
Experimental aspects of the inhibition of the corrosion of mild steel in HCl solutions by some carbozones were studied using gravimetric, thermometric and gasometric methods, while a theoretical study was carried out using density functional theory, a quantitative structure-activity relation, and quantum chemical principles. The results obtained indicated that the studied carbozones are good adsorption inhibitors for the corrosion of mild steel in HCl. The inhibition efficiencies of the studied carbozones were found to increase with increasing concentration of the respective inhibitor. A strong correlation was found between the average inhibition efficiency and some quantum chemical parameters, and also between the experimental and theoretical inhibition efficiencies (obtained from the quantitative structure-activity relation).
Zavaglia, Melissa; Hilgetag, Claus C
2016-06-01
Spatial attention is a prime example of the distributed network functions of the brain. Lesion studies in animal models have been used to investigate intact attentional mechanisms as well as perspectives for rehabilitation in the injured brain. Here, we systematically analyzed behavioral data from cooling deactivation and permanent lesion experiments in the cat, where unilateral deactivation of the posterior parietal cortex (in the vicinity of the posterior middle suprasylvian cortex, pMS) or the superior colliculus (SC) causes severe neglect in the contralateral hemifield. Counterintuitively, additional deactivation of structures in the opposite hemisphere reverses the deficit. Using such lesion data, we employed a game-theoretical approach, multi-perturbation Shapley value analysis (MSA), for inferring functional contributions and network interactions of bilateral pMS and SC from behavioral performance in visual attention studies. The approach provides an objective theoretical strategy for lesion inferences and allows a unique quantitative characterization of regional functional contributions and interactions on the basis of multi-perturbations. The quantitative analysis demonstrated that the right posterior parietal cortex and superior colliculus made the strongest positive contributions to left-field orienting, while left-hemisphere regions had negative contributions, implying that their perturbation may reverse the effects of contralateral lesions or improve normal function. An analysis of functional modulations and interactions among the regions revealed redundant interactions (implying functional overlap) between regions within each hemisphere, and synergistic interactions between bilateral regions. 
To assess the reliability of the MSA method in the face of variable and incomplete input data, we performed a sensitivity analysis, investigating how much the contribution values of the four regions depended on the performance of specific configurations and on the prediction of unknown performances. The results suggest that the MSA approach is sensitive to categorical, but insensitive to gradual changes in the input data. Finally, we created a basic network model that was based on the known anatomical interactions among cortical-tectal regions and reproduced the experimentally observed behavior in visual orienting. We discuss the structural organization of the network model relative to the causal modulations identified by MSA, to aid a mechanistic understanding of the attention network of the brain.
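The core computation behind multi-perturbation Shapley value analysis can be sketched independently of the lesion data. The snippet below computes exact Shapley values for a coalition value function; in MSA the "players" would be the four regions (bilateral pMS and SC) and `value(s)` the behavioral performance when exactly the regions in `s` are intact. The region labels and toy value function in the test are illustrative assumptions, not the study's data.

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley decomposition of value(grand coalition):
    each player's average marginal contribution over all join orders."""
    n = len(players)
    contributions = {}
    for p in players:
        others = [q for q in players if q != p]
        phi = 0.0
        for k in range(n):
            for coalition in combinations(others, k):
                s = frozenset(coalition)
                # Probability that exactly this coalition precedes p
                # in a uniformly random ordering of the players.
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi += weight * (value(s | {p}) - value(s))
        contributions[p] = phi
    return contributions
```

For a toy value function such as `len(s)`, each of four players receives a contribution of exactly 1, and the contributions always sum to the grand-coalition value.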
The role of risk-based prioritization in total quality management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, C.T.
1994-10-01
The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet, managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper will argue for the requirement to institutionalize an evolutionary, decision theoretic approach to the policy analysis of the Department of Energy's Program Budget.
NASA Technical Reports Server (NTRS)
Glassgold, Alfred E.; Huggins, Patrick J.
1987-01-01
The study of the outer envelopes of cool evolved stars has become an active area of research. The physical properties of circumstellar (CS) envelopes are presented. Observations in many wavelength bands are relevant. Observations are summarized and theoretical considerations concerning the chemistry are discussed. Recent theoretical considerations show that the thermal equilibrium model is of limited use for understanding the chemistry of the outer CS envelopes. The theoretical modeling of the chemistry of CS envelopes provides a quantitative test of chemical concepts which have a broader interest than the envelopes themselves.
Combining formal and functional approaches to topic structure.
Zellers, Margaret; Post, Brechtje
2012-03-01
Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to adopt the insights of both PTI's qualitative analysis and EP's quantitative analysis and combine them into a multiple-methods approach. One realm in which it is possible to combine these frameworks is in the analysis of discourse topic structure and the prosodic cues relevant to it. By combining a quantitative and a qualitative approach to discourse topic structure, it is possible to give a better account of the observed variation in prosody, for example in the case of fundamental frequency (F0) peak timing, which can be explained in terms of pitch accent distribution over different topic structure categories. Similarly, local and global patterns in speech rate variation can be better explained and motivated by adopting insights from both PTI and EP in the study of topic structure. Combining PTI and EP can provide better accounts of speech data as well as opening up new avenues of investigation which would not have been possible in either approach alone.
Functionalization of SBA-15 mesoporous silica by Cu-phosphonate units: Probing of synthesis route
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laskowski, Lukasz, E-mail: lukasz.laskowski@kik.pcz.pl; Czestochowa University of Technology, Institute of Physics, Al. Armii Krajowej 19, 42-201 Czestochowa; Laskowska, Magdalena, E-mail: magdalena.laskowska@onet.pl
2014-12-15
Mesoporous silica SBA-15 containing propyl-copper phosphonate units was investigated. The structure of the mesoporous samples was tested by N2 isothermal sorption (BET and BJH analysis), TEM microscopy, and X-ray scattering. Quantitative EDX analysis gave information about the proportions of the component atoms in the sample, and quantitative elemental analysis was carried out to support the EDX results. To examine the bonding between copper atoms and phosphonic units, Raman spectroscopy was carried out. To support the Raman measurements, theoretical calculations were performed using density functional theory with the B3LYP functional. By comparing the calculated vibrational spectra of the molecule with the experimental results, the distribution of the active units inside the silica matrix was determined. - Graphical abstract: The present study is devoted to mesoporous silica SBA-15 containing propyl-copper phosphonate units. The species were investigated by the micro-Raman technique combined with DFT numerical simulations to confirm the correctness of the synthesis procedure. Complementary measurements were carried out to test the structure of the mesoporous samples. - Highlights: • SBA-15 silica functionalized with propyl-copper phosphonate units was synthesized. • Synthesis efficiency was probed by a Raman study supported with DFT simulations. • A homogeneous distribution of active units was demonstrated. • The synthesis route enables precise control of the distance between copper ions.
Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye
2016-01-13
A framework for establishing a standard reference scale for texture is proposed, based on multivariate statistical analysis of instrumental measurements and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with the characteristics of universality, representativeness, stability, substitutability, and traceability. The soundness of the framework is verified by establishing a standard reference scale for the texture attribute of hardness using well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with cluster analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were selected to construct the hardness standard reference scale. The results indicate that the regression between the estimated sensory value and the instrumentally measured value is significant (R(2) = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
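The selection pipeline described above (instrumental replicates, grouping by level, stability screening) can be caricatured in a few lines. This is a hedged sketch, not the paper's procedure: it replaces the cluster analysis with a simple one-dimensional split of sorted means into contiguous hardness levels, and uses the relative standard deviation of replicate measurements as the stability criterion; the sample names and data are invented.

```python
import numpy as np

def select_references(samples, n_levels):
    """Pick one low-variability reference sample per hardness level.
    samples: dict mapping sample name -> replicate hardness measurements."""
    names = list(samples)
    means = np.array([np.mean(samples[n]) for n in names])
    # Assumption: stand-in for cluster analysis -- split the samples,
    # ordered by mean hardness, into n_levels contiguous groups.
    order = np.argsort(means)
    groups = np.array_split(order, n_levels)
    refs = []
    for g in groups:
        # Relative standard deviation = std / mean; lower = more stable.
        rsd = [np.std(samples[names[i]]) / np.mean(samples[names[i]]) for i in g]
        refs.append(names[g[int(np.argmin(rsd))]])
    return refs
```

With three replicate measurements per sample, the sample whose replicates scatter least around their mean is kept as the reference at each hardness level.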
Qualitative research and the epidemiological imagination: a vital relationship.
Popay, J
2003-01-01
This paper takes as its starting point the assumption that the 'Epidemiological Imagination' has a central role to play in the future development of policies and practice to improve population health and reduce health inequalities within and between states, but suggests that, by neglecting the contribution qualitative research can make, epidemiology is failing to deliver on this potential. The paper briefly considers what qualitative research is, touching on epistemological questions (what type of "knowledge" is generated) and questions of method (what approaches to data collection, analysis, and interpretation are involved). Following this, the paper presents two different models of the relationship between qualitative and quantitative research. The enhancement model (which assumes that qualitative research findings add something extra to the findings of quantitative research) suggests three related roles for qualitative research: generating hypotheses to be tested by quantitative research, helping to construct more sophisticated measures of social phenomena, and explaining unexpected findings from quantitative research. In contrast, the epistemological model treats qualitative research as equal to but different from quantitative research, making a unique contribution by researching parts other research approaches cannot reach, increasing understanding by adding conceptual and theoretical depth to knowledge, shifting the balance of power between researchers and researched, and challenging traditional epidemiological ways of "knowing" the social world. The paper illustrates these different types of contribution with examples of qualitative research and finally discusses ways in which the "trustworthiness" of qualitative research can be assessed.
An Analysis of Categorical and Quantitative Methods for Planning Under Uncertainty
Langlotz, Curtis P.; Shortliffe, Edward H.
1988-01-01
Decision theory and logical reasoning are both methods for representing and solving medical decision problems. We analyze the usefulness of these two approaches to medical therapy planning by establishing a simple correspondence between decision theory and non-monotonic logic, a formalization of categorical logical reasoning. The analysis indicates that categorical approaches to planning can be viewed as comprising two decision-theoretic concepts: probabilities (degrees of belief in planning hypotheses) and utilities (degrees of desirability of planning outcomes). We present and discuss examples of the following lessons from this decision-theoretic view of categorical (nonmonotonic) reasoning: (1) Decision theory and artificial intelligence techniques are intended to solve different components of the planning problem. (2) When considered in the context of planning under uncertainty, nonmonotonic logics do not retain the domain-independent characteristics of classical logical reasoning for planning under certainty. (3) Because certain nonmonotonic programming paradigms (e.g., frame-based inheritance, rule-based planning, protocol-based reminders) are inherently problem-specific, they may be inappropriate to employ in the solution of certain types of planning problems. We discuss how these conclusions affect several current medical informatics research issues, including the construction of “very large” medical knowledge bases.
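The decision-theoretic decomposition the analysis rests on, degrees of belief (probabilities) and degrees of desirability (utilities), can be made concrete with a toy therapy-planning sketch. The plan names, probabilities, and utilities below are invented for illustration; the point is only that a plan's ranking depends jointly on both ingredients, which a purely categorical rule collapses.

```python
def expected_utility(outcome_probs, outcome_utils):
    """Degrees of belief times degrees of desirability, summed over outcomes."""
    return sum(p * u for p, u in zip(outcome_probs, outcome_utils))

def best_plan(plans):
    """plans: name -> (outcome probabilities, outcome utilities).
    Returns the plan with maximal expected utility."""
    return max(plans, key=lambda name: expected_utility(*plans[name]))
```

For example, a plan that probably works but has a mildly undesirable outcome can still dominate a safe plan that achieves little, something a categorical "if condition then act" rule cannot express.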
Van den Bulcke, Marc; Lievens, Antoon; Barbau-Piednoir, Elodie; MbongoloMbella, Guillaume; Roosens, Nancy; Sneyers, Myriam; Casi, Amaya Leunda
2010-03-01
The detection of genetically modified (GM) materials in food and feed products is a complex multi-step analytical process involving screening, identification, and often quantification of the genetically modified organisms (GMO) present in a sample. "Combinatory qPCR SYBRGreen screening" (CoSYPS) is a matrix-based approach for determining the presence of GM plant materials in products. The CoSYPS decision-support system (DSS) interprets the analytical results of SYBRGreen qPCR analysis based on four values: the C(t) and T(m) values and the LOD and LOQ for each method. A theoretical explanation of the different concepts applied in CoSYPS analysis is given (GMO Universe, "prime number tracing", matrix/combinatory approach) and documented using Roundup Ready soy GTS40-3-2 as an example. By applying a limited set of SYBRGreen qPCR methods and a newly developed "prime number"-based algorithm, the subset of GMOs potentially present in a sample can be determined. Together, these analyses provide guidance for semi-quantitative estimation of GMO presence in a food and feed product.
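The "prime number tracing" idea lends itself to a compact sketch. Assigning each screening target a distinct prime makes a set of targets equivalent to a product of primes, so a divisibility test determines which GMO signatures are consistent with the detected elements. The element names, prime assignment, and one-entry GMO database below are illustrative assumptions; only the GTS40-3-2 example comes from the abstract.

```python
# Hypothetical prime assignment to screening targets (illustrative only).
PRIMES = {"p35S": 2, "tNOS": 3, "EPSPS": 5, "lectin": 7}

# Illustrative one-entry "GMO universe": each GMO maps to the set of
# screening targets its construct should produce a signal for.
GMO_DB = {"GTS40-3-2": {"p35S", "tNOS", "EPSPS", "lectin"}}

def signature(elements):
    """Encode a set of screening targets as a product of distinct primes."""
    product = 1
    for element in elements:
        product *= PRIMES[element]
    return product

def compatible_gmos(detected_elements):
    """A GMO is compatible with the screening result iff its signature
    divides the signature of the detected element set."""
    detected = signature(detected_elements)
    return [gmo for gmo, elems in GMO_DB.items()
            if detected % signature(elems) == 0]
```

Because prime factorizations are unique, the product of the detected targets loses no information about which combinations of targets, and hence which GMO subsets, it can contain.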
Zhu, Fu-Yuan; Chen, Mo-Xian; Su, Yu-Wen; Xu, Xuezhong; Ye, Neng-Hui; Cao, Yun-Ying; Lin, Sheng; Liu, Tie-Yuan; Li, Hao-Xuan; Wang, Guan-Qun; Jin, Yu; Gu, Yong-Hai; Chan, Wai-Lung; Lo, Clive; Peng, Xinxiang; Zhu, Guohui; Zhang, Jianhua
2016-01-01
Modern rice cultivars have large panicles, but their yield potential is often not fully achieved due to poor grain-filling of late-flowering inferior spikelets (IS). Our earlier work suggested a broad transcriptional reprogramming during grain filling and showed a difference in gene expression between IS and earlier-flowering superior spikelets (SS). However, the links between the abundances of transcripts and their corresponding proteins are unclear. In this study, a SWATH-MS (sequential window acquisition of all theoretical spectra-mass spectrometry) based quantitative proteomic analysis was applied to investigate the SS and IS proteomes. A total of 304 proteins of widely differing functionality were observed to be differentially expressed between IS and SS. Detailed gene ontology analysis indicated that several biological processes, including photosynthesis, protein metabolism, and energy metabolism, are differentially regulated. Further correlation analysis revealed that the abundances of most of the differentially expressed proteins are not correlated with the respective transcript levels, indicating that an extra layer of gene regulation may exist during rice grain filling. Our findings raise the intriguing possibility that these candidate proteins may be crucial in determining the poor grain-filling of IS. We therefore hypothesize that the regulation of proteome changes occurs not only at the transcriptional but also at the post-transcriptional level during grain filling in rice. PMID:28066479
NASA Technical Reports Server (NTRS)
Kojima, Jun; Nguyen, Quang-Viet
2004-01-01
We present a theoretical study of the spectral interferences in the spontaneous Raman scattering spectra of major combustion products in 30-atm fuel-rich hydrogen-air flames. An effective methodology is introduced to choose an appropriate line-shape model for simulating Raman spectra in high-pressure combustion environments. The Voigt profile with the additive approximation assumption was found to provide a reasonable model of the spectral line shape for the present analysis. The rotational/vibrational Raman spectra of H2, N2, and H2O were calculated using an anharmonic-oscillator model with the latest collisional broadening coefficients. The calculated spectra were validated against data obtained in a 10-atm fuel-rich H2-air flame and showed excellent agreement. Our quantitative spectral analysis for equivalence ratios ranging from 1.5 to 5.0 revealed substantial amounts of spectral cross-talk between the rotational H2 lines and the N2 O-/Q-branch, and between the vibrational H2 O(0,3) line and the vibrational H2O spectrum. We also address the temperature dependence of the spectral cross-talk and extend our analysis to include a cross-talk compensation technique that removes the interference arising from the H2 Raman spectra onto the N2 or H2O spectra.
Hu, Li; Tian, Xiaorui; Huang, Yingzhou; Fang, Liang; Fang, Yurui
2016-02-14
Plasmonic chirality has drawn much attention because of tunable circular dichroism (CD) and the enhancement of chiral molecule signals. Although various mechanisms have been proposed to explain plasmonic CD, a quantitative explanation, like the ab initio mechanism available for chiral molecules, is still unavailable. In this study, a mechanism similar to the mechanisms associated with chiral molecules was analyzed. The giant extrinsic circular dichroism of a plasmonic splitting rectangle ring was quantitatively investigated from a theoretical standpoint. The interplay of the electric and magnetic modes of the meta-structure is proposed to explain the giant CD. We analyzed the interplay using both an analytical coupled electric-magnetic dipole model and a finite element method model. The surface charge distributions showed that the circular current sustained by the splitting rectangle ring causes it to behave like a magnetic dipole at some resonant modes, which then interacts with the electric modes, resulting in a mixing of the two mode types. The strong interplay of the two mode types is primarily responsible for the giant CD. The analysis of the chiral near-field of the structure suggests potential applications in chiral molecule sensing.
2010-01-01
Background: Methods for the calculation and application of quantitative electromyographic (EMG) statistics for the characterization of EMG data detected from forearm muscles of individuals with and without pain associated with repetitive strain injury are presented. Methods: A classification procedure using a multi-stage application of Bayesian inference is presented that characterizes a set of motor unit potentials acquired using needle electromyography. The utility of this technique in characterizing EMG data obtained from both normal individuals and those presenting with symptoms of "non-specific arm pain" is explored and validated. The efficacy of the Bayesian technique is compared with simple voting methods. Results: The aggregate Bayesian classifier presented is found to perform with accuracy equivalent to that of majority voting on the test data, with an overall accuracy greater than 0.85. Theoretical foundations of the technique are discussed and related to the observations. Conclusions: Aggregation of motor unit potential conditional probability distributions estimated using quantitative electromyographic analysis may be successfully used to perform electrodiagnostic characterization of "non-specific arm pain." It is expected that these techniques will also be applicable to other types of electrodiagnostic data. PMID:20156353
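The contrast between aggregate Bayesian classification and simple voting can be sketched as follows. This is a schematic stand-in for the paper's multi-stage procedure, under the assumption that each motor unit potential yields a pair of class probabilities, which are either multiplied (log-summed) into a joint posterior or reduced to one vote per potential.

```python
import numpy as np

def aggregate_bayes(per_potential_probs, prior=(0.5, 0.5)):
    """Combine per-motor-unit-potential class probabilities by summing
    log probabilities, then normalize into a posterior over two classes."""
    log_post = np.log(np.asarray(prior, dtype=float))
    for probs in per_potential_probs:
        log_post += np.log(np.asarray(probs, dtype=float))
    post = np.exp(log_post - log_post.max())  # stabilize before normalizing
    return post / post.sum()

def majority_vote(per_potential_probs):
    """Each potential casts one vote for its most probable class."""
    votes = [int(np.argmax(p)) for p in per_potential_probs]
    return max(set(votes), key=votes.count)
```

The Bayesian aggregate weighs how confident each potential is, while the vote only counts which class each potential prefers; on confident, consistent data the two typically agree, as the abstract's accuracy comparison suggests.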
The post-buckling behavior of a beam constrained by springy walls
NASA Astrophysics Data System (ADS)
Katz, Shmuel; Givli, Sefi
2015-05-01
The post-buckling behavior of a beam subjected to lateral constraints is of practical importance in a variety of applications, such as stent procedures, filopodia growth in living cells, endoscopic examination of internal organs, and deep drilling. Even though in reality the constraining surfaces are often deformable, the literature has focused mainly on rigid and fixed constraints. In this paper, we take a first step toward bridging this gap through a theoretical and experimental examination of the post-buckling behavior of a beam constrained by a fixed wall and a springy wall, i.e., one that moves laterally against a spring. The response exhibited by the proposed system is much richer than that of the fixed-wall system and can be tuned by choosing the spring stiffness. Based on a small-deformation analysis, we obtained closed-form analytical solutions and quantitative insights. The accuracy of these results was examined by comparison to a large-deformation analysis. We concluded that the closed-form solution of the small-deformation analysis provides an excellent approximation, except in the highest attainable mode. There, the system exhibits non-intuitive behavior and non-monotonic force-displacement relations that can only be captured by large-deformation theories. Although closed-form solutions cannot be derived for the large-deformation analysis, we were able to reveal general properties of the solution. In the last part of the paper, we present experimental results that demonstrate various features obtained from the theoretical analysis.
The THz fingerprint spectra of the active ingredients of a TCM medicine: Herba Ephedrae
NASA Astrophysics Data System (ADS)
Ma, Shihua; Liu, Guifeng; Zhang, Peng; Song, Xiyu; Ji, Te; Wang, Wenfeng
2008-12-01
In this paper, THz time-domain spectroscopy (THz-TDS) has been used to measure the spectral properties of two active ingredients of Herba Ephedrae, ephedrine and pseudoephedrine, both in the form of hydrochloride salts. The THz spectra of single-ingredient, two-ingredient, and three-ingredient compounds are studied. We obtained the fingerprint spectra of the pure active ingredients of the medicine and also measured mixtures of two or three active ingredients at different ratios. At the same time, theoretical and quantitative analyses are applied to predict the different THz spectra, identify the ingredients, and infer the contents of the principal components in the samples. THz spectroscopy is a potential and promising technique for evaluating and inspecting the quality of drugs in the TCM field.
Theory of sampling: four critical success factors before analysis.
Wagner, Claas; Esbensen, Kim H
2015-01-01
Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.
Jiménez-Fernández, J
2018-01-01
This paper investigates the dependence of the subharmonic response in a signal scattered by contrast agent microbubbles on ambient pressure to provide quantitative estimations of local blood pressure. The problem is formulated by assuming a gas bubble encapsulated by a shell of finite thickness with dynamic behavior modeled by a nonlinear viscoelastic constitutive equation. For ambient overpressure compatible with the clinical range, the acoustic pressure intervals where the subharmonic signal may be detected (above the threshold for the onset and below the limit value for the first chaotic transition) are determined. The analysis shows that as the overpressure is increased, all harmonic components are displaced to higher frequencies. This displacement is significant for the subharmonic of order 1/2 and explains the increase or decrease in the subharmonic amplitude with ambient pressure described in previous works. Thus, some questions related to the monotonic dependence of the subharmonic amplitude on ambient pressure are clarified. For different acoustic pressures, quantitative conditions for determining the intervals where the subharmonic amplitude is a monotonic or non-monotonic function of the ambient pressure are provided. Finally, the influence of the ambient pressure on the subharmonic resonance frequency is analyzed.
A quantitative analysis of hydraulic interaction processes in stream-aquifer systems
Wang, Wenke; Dai, Zhenxue; Zhao, Yaqian; Li, Junting; Duan, Lei; Wang, Zhoufeng; Zhu, Lin
2016-01-01
The hydraulic relationship between the stream and aquifer can be altered from hydraulic connection to disconnection when the pumping rate exceeds the maximum seepage flux of the streambed. This study proposes to quantitatively analyze the physical processes of stream-aquifer systems from connection to disconnection. A free water table equation is adopted to clarify under what conditions a stream starts to separate hydraulically from an aquifer. Both the theoretical analysis and laboratory tests have demonstrated that the hydraulic connectedness of the stream-aquifer system can reach a critical disconnection state when the horizontal hydraulic gradient at the free water surface is equal to zero and the vertical is equal to 1. A boundary-value problem for movement of the critical point of disconnection is established for an analytical solution of the inverted water table movement beneath the stream. The result indicates that the maximum distance or thickness of the inverted water table is equal to the water depth in the stream, and at a steady state of disconnection, the maximum hydraulic gradient at the streambed center is 2. This study helps us to understand the hydraulic phenomena of water flow near streams and accurately assess surface water and groundwater resources. PMID:26818442
Research in health sciences library and information science: a quantitative analysis.
Dimitroff, A
1992-01-01
A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas. PMID:1422504
Analysis of Sequence Data Under Multivariate Trait-Dependent Sampling.
Tao, Ran; Zeng, Donglin; Franceschini, Nora; North, Kari E; Boerwinkle, Eric; Lin, Dan-Yu
2015-06-01
High-throughput DNA sequencing allows for the genotyping of common and rare variants for genetic association studies. At the present time and for the foreseeable future, it is not economically feasible to sequence all individuals in a large cohort. A cost-effective strategy is to sequence those individuals with extreme values of a quantitative trait. We consider the design under which the sampling depends on multiple quantitative traits. Under such trait-dependent sampling, standard linear regression analysis can result in bias of parameter estimation, inflation of type I error, and loss of power. We construct a likelihood function that properly reflects the sampling mechanism and utilizes all available data. We implement a computationally efficient EM algorithm and establish the theoretical properties of the resulting maximum likelihood estimators. Our methods can be used to perform separate inference on each trait or simultaneous inference on multiple traits. We pay special attention to gene-level association tests for rare variants. We demonstrate the superiority of the proposed methods over standard linear regression through extensive simulation studies. We provide applications to the Cohorts for Heart and Aging Research in Genomic Epidemiology Targeted Sequencing Study and the National Heart, Lung, and Blood Institute Exome Sequencing Project.
[Prediction of the molecular response to pertubations from single cell measurements].
Remacle, Françoise; Levine, Raphael D
2014-12-01
The response of protein signaling networks to perturbations is analysed from single-cell measurements. This experimental approach allows the fluctuations in protein expression levels from cell to cell to be characterized. The analysis is based on an information-theoretic approach grounded in thermodynamics, leading to a quantitative version of the Le Chatelier principle that allows the molecular response to be predicted. Two systems are investigated: human macrophages subjected to lipopolysaccharide challenge, analogous to the immune response against Gram-negative bacteria, and the response of the proteins of the mTOR signaling network of GBM cancer cells to changes in partial oxygen pressure. © 2014 médecine/sciences – Inserm.
NASA Technical Reports Server (NTRS)
Rosner, Daniel E.; Nagarajan, R.
1987-01-01
An analysis is undertaken of aerodynamically- and centrifugally-driven liquid condensate layers on nonisothermal combustion turbines' stator vanes and rotor blades. Attention is given to the quantitative consequences of one possible mechanism for the initiation of 'hot corrosion' in the underlying blade material through a 'fluxing' of the protective oxide coating by the molten salt of the Newtonian condensate film. Illustrative calculations are presented for the condensate streamline pattern and the distributions of the steady-state condensate layer thickness, together with the corresponding oxide dissolution rate, for a test turbine blade.
NASA Astrophysics Data System (ADS)
Miranda, Henrique P. C.; Reichardt, Sven; Froehlicher, Guillaume; Molina-Sánchez, Alejandro; Berciaud, Stéphane; Wirtz, Ludger
2017-04-01
We present a combined experimental and theoretical study of resonant Raman spectroscopy in single- and triple-layer MoTe$_2$. Raman intensities are computed entirely from first principles by calculating finite differences of the dielectric susceptibility. In our analysis, we investigate the role of quantum interference effects and the electron-phonon coupling. With this method, we explain the experimentally observed intensity inversion of the $A'_1$ vibrational modes in triple-layer MoTe$_2$ with increasing laser photon energy. Finally, we show that a quantitative comparison with experimental data requires the proper inclusion of excitonic effects.
Electromagnetic Pumps for Liquid Metal-Fed Electric Thrusters
NASA Technical Reports Server (NTRS)
Polzin, Kurt A.; Markusic, Thomas E.
2007-01-01
Prototype designs of two separate pumps for use in electric propulsion systems with liquid lithium and bismuth propellants are presented. Both pumps are required to operate at elevated temperatures, and the lithium pump must additionally withstand the corrosive nature of the propellant. Compatibility of the pump materials and seals with lithium and bismuth were demonstrated through proof-of-concept experiments followed by post-experiment visual inspections. The pressure rise produced by the bismuth pump was found to be linear with input current and ranged from 0-9 kPa for corresponding input current levels of 0-30 A, showing good quantitative agreement with theoretical analysis.
NASA Astrophysics Data System (ADS)
Alkorta, Ibon; Elguero, José; Elguero, Eric
2017-11-01
The Cambridge Structural Database contains 1125 X-ray structures of nitroxide free radicals presenting intermolecular hydrogen bonds. In this paper we report a qualitative and quantitative analysis of these bonds. The excluded region observed in some plots was statistically analyzed using convex-hull and kernel-smoothing methodologies. A theoretical study at the MP2 level with different basis sets has been carried out, indicating that the nitronyl nitroxide radicals (five electrons) lie just in between nitroso compounds (four electrons) and amine N-oxides (six electrons) as far as hydrogen-bond basicity is concerned.
Redistribution by insurance market regulation: Analyzing a ban on gender-based retirement annuities.
Finkelstein, Amy; Poterba, James; Rothschild, Casey
2009-01-01
We illustrate how equilibrium screening models can be used to evaluate the economic consequences of insurance market regulation. We calibrate and solve a model of the United Kingdom's compulsory annuity market and examine the impact of gender-based pricing restrictions. We find that the endogenous adjustment of annuity contract menus in response to such restrictions can undo up to half of the redistribution from men to women that would occur with exogenous Social Security-like annuity contracts. Our findings indicate the importance of endogenous contract responses and illustrate the feasibility of employing theoretical insurance market equilibrium models for quantitative policy analysis.
Photoacoustic resonance spectroscopy for biological tissue characterization
NASA Astrophysics Data System (ADS)
Gao, Fei; Feng, Xiaohua; Zheng, Yuanjin; Ohl, Claus-Dieter
2014-06-01
By "listening to photons," photoacoustics allows the probing of chromophores at depths beyond the optical diffusion limit. Here we report the photoacoustic resonance effect induced by multiburst-modulated laser illumination, which is theoretically modeled as a damped mass-spring oscillator and a resistor-inductor-capacitor (RLC) circuit. By sweeping the frequency of the multiburst-modulated laser, the photoacoustic resonance effect is observed experimentally on phantoms and porcine tissues. Experimental results demonstrate a distinct spectrum for each phantom and tissue sample, showing significant potential for spectroscopic analysis that fuses optical absorption and mechanical vibration properties. Unique RLC circuit parameters are extracted to quantitatively characterize phantom and biological tissues.
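The damped-oscillator picture can be sketched numerically: sweeping the drive frequency of a damped mass-spring system traces out a resonance curve whose peak identifies the sample's acoustic resonance. The parameter values below are illustrative only, not taken from the paper.

```python
import numpy as np

# Amplitude response of a damped mass-spring oscillator driven at frequency f:
# |H(f)| = 1 / sqrt((f0**2 - f**2)**2 + (gamma * f)**2)
f0 = 1.0e6       # resonance frequency of the sample [Hz] (illustrative)
gamma = 2.0e5    # damping rate [Hz] (illustrative)

f = np.linspace(0.1e6, 2.0e6, 2001)   # swept burst-modulation frequency
H = 1.0 / np.sqrt((f0**2 - f**2)**2 + (gamma * f)**2)

f_peak = f[np.argmax(H)]   # damping pulls the peak slightly below f0
```

Fitting such a curve to a measured sweep is what yields the oscillator (or equivalent RLC) parameters used to characterize each sample.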
Optical actuators for fly-by-light applications
NASA Astrophysics Data System (ADS)
Chee, Sonny H. S.; Liu, Kexing; Measures, Raymond M.
1993-04-01
A review of optomechanical interfaces is presented. A detailed quantitative and qualitative analysis of the University of Toronto Institute for Aerospace Studies (UTIAS) box, optopneumatics, optical activation of a bimetal, optical activation of the shape memory effect, and optical activation of the pyroelectric effect is given. The UTIAS box is found to display a good conversion efficiency and a high bandwidth. A preliminary UTIAS box design has achieved a conversion efficiency of about 1/6 of the theoretical limit and a bandwidth of 2 Hz. In comparison to previous optomechanical interfaces, the UTIAS box has the highest pressure development to optical power ratio (at least an order of magnitude greater).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavda, Bhavin R., E-mail: chavdabhavin9@gmail.com; Dubey, Rahul P.; Patel, Urmila H.
The novel chalcone derivatives have widespread applications in materials science and the medicinal industry. Density functional theory (DFT) is used to optimize the molecular structures of the three chalcone derivatives (M-I, II, III). The observed discrepancies between the theoretical and experimental (X-ray) results are attributed to the different environments of the molecules: the experimental values correspond to the molecule in the solid state, where it is subject to intermolecular forces such as non-bonded hydrogen-bond interactions, whereas the theoretical studies treat an isolated molecule in the gas phase. The lattice energy of each molecule has been calculated using the PIXELC module in the Coulomb-London-Pauli (CLP) package and is partitioned into the corresponding coulombic, polarization, dispersion and repulsion contributions. The lattice-energy data confirm and strengthen the X-ray finding that weak but significant intermolecular interactions such as C-H…O, π-π and C-H…π play an important role in the stabilization of the crystal packing.
NASA Astrophysics Data System (ADS)
Yi, X.; Duan, H. L.
2009-08-01
Surface stress is widely used to characterize the adsorption effect on the mechanical response of nanomaterials and nanodevices. However, quantitative relations between continuum-level descriptions of surface stress and molecular-level descriptions of adsorbate interactions are not well established. In this paper, we first obtain the relations between the adsorption-induced surface stress and the van der Waals and Coulomb interactions in terms of the physical and chemical interactions between adsorbates and solid surfaces. Then, we present a theoretical framework to predict the deflection and resonance frequencies of microcantilevers with the simultaneous effects of the eigenstrain, surface stress and adsorption mass. Finally, the adsorption-induced deflection and resonance frequency shift of microcantilevers are numerically analyzed for the van der Waals and Coulomb interactions. The present theoretical framework quantifies the mechanisms of the adsorption-induced surface stress, and thus provides guidelines to the analysis of the sensitivities, and the identification of the detected substance in the design and application of micro- and nanocantilever sensors.
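As a rough orientation to one of the effects treated above, and only that one, the mass-loading contribution to a cantilever's resonance shift follows from first-order perturbation of f ∝ √(k/m). The surface-stress and eigenstrain terms that are central to the paper are deliberately omitted, and all numbers are invented for illustration.

```python
# First-order mass-loading frequency shift of a cantilever: df/f0 = -dm/(2m).
# Surface-stress and eigenstrain contributions are neglected in this sketch.
f0 = 75e3     # fundamental resonance [Hz] (illustrative)
m = 4e-12     # effective cantilever mass [kg] (illustrative)
dm = 1e-15    # added adsorbate mass [kg] (illustrative)

df = -0.5 * (dm / m) * f0   # frequency shift [Hz], negative for added mass
```

The paper's point is precisely that this mass-only estimate is incomplete: adsorption-induced surface stress can shift the frequency in either direction.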
Broadband frequency and angular response of a sinusoidal bull’s eye antenna
NASA Astrophysics Data System (ADS)
Beaskoetxea, U.; Navarro-Cía, M.; Beruete, M.
2016-07-01
A thorough experimental study of the frequency and beaming angle response of a metallic leaky-wave bull’s eye antenna working at 77 GHz with a sinusoidally corrugated profile is presented. The beam scanning property of these antennas as frequency is varied is experimentally demonstrated and corroborated through theoretical and numerical results. From the experimental results the dispersion diagram of the n = -1 and n = -2 space harmonics is extracted, and the operation at different frequency regimes is identified and discussed. In order to show the contribution of each half of the antenna, numerical examples of the near-field behavior are also displayed. Overall, experimental results are in good qualitative and quantitative agreement with theoretical and numerical calculations. Finally, an analysis of the beamwidth as a function of frequency is performed, showing that it can achieve values below 1.5° in a fractional bandwidth of 4% around the operation frequency, which is an interesting frequency-stable broadside radiation.
Schaper, Louise; Pervan, Graham
2007-01-01
The research reported in this paper describes the development, empirical validation and analysis of a model of technology acceptance by Australian occupational therapists. The study described involved the collection of quantitative data through a national survey. The theoretical significance of this work is that it uses a thoroughly constructed research model, with one of the largest sample sizes ever tested (n=1605), to extend technology acceptance research into the health sector. Results provide strong support for the model. This work reveals the complexity of the constructs and relationships that influence technology acceptance and highlights the need to include sociotechnical and system issues in studies of technology acceptance in healthcare to improve information system implementation success in this arena. The results of this study have practical and theoretical implications for health informaticians and researchers in the field of health informatics and information systems, tertiary educators, Commonwealth and State Governments and the allied health professions.
Kanazawa, Kiyoshi; Sueshige, Takumi; Takayasu, Hideki; Takayasu, Misako
2018-03-30
A microscopic model is established for financial Brownian motion from the direct observation of the dynamics of high-frequency traders (HFTs) in a foreign exchange market. Furthermore, a theoretical framework parallel to molecular kinetic theory is developed for the systematic description of the financial market from microscopic dynamics of HFTs. We report first on a microscopic empirical law of traders' trend-following behavior by tracking the trajectories of all individuals, which quantifies the collective motion of HFTs but has not been captured in conventional order-book models. We next introduce the corresponding microscopic model of HFTs and present its theoretical solution paralleling molecular kinetic theory: Boltzmann-like and Langevin-like equations are derived from the microscopic dynamics via the Bogoliubov-Born-Green-Kirkwood-Yvon hierarchy. Our model is the first microscopic model that has been directly validated through data analysis of the microscopic dynamics, exhibiting quantitative agreements with mesoscopic and macroscopic empirical results.
Palaeostress perturbations near the El Castillo de las Guardas fault (SW Iberian Massif)
NASA Astrophysics Data System (ADS)
García-Navarro, Encarnación; Fernández, Carlos
2010-05-01
Use of stress inversion methods on faults measured at 33 sites in the northwestern part of the South Portuguese Zone (Variscan Iberian Massif), together with analysis of basic dyke attitudes in the same region, has revealed a prominent perturbation of the stress trajectories around some large, crustal-scale faults, such as the El Castillo de las Guardas fault. The results are compared with the predictions of theoretical models of palaeostress deviations near master faults. According to this comparison, the El Castillo de las Guardas fault, an old structure that probably reversed its slip sense several times, can be considered a sinistral strike-slip fault during the Moscovian. These results also point out the main shortcomings that still hinder a rigorous quantitative use of theoretical models of stress perturbations around major faults: the spatial variation in the parameters governing the brittle behaviour of the continental crust, and the possibility of oblique slip along outcrop-scale faults in regions subjected to general, non-plane strain.
Growth of wormlike micelles in nonionic surfactant solutions: Quantitative theory vs. experiment.
Danov, Krassimir D; Kralchevsky, Peter A; Stoyanov, Simeon D; Cook, Joanne L; Stott, Ian P; Pelan, Eddie G
2018-06-01
Despite the considerable advances of the molecular-thermodynamic theory of micelle growth, agreement between theory and experiment has been achieved only in isolated cases. A general theory that can provide a self-consistent quantitative description of the growth of wormlike micelles in mixed surfactant solutions, including the experimentally observed high peaks in viscosity and aggregation number, is still missing. As a step toward the creation of such a theory, here we consider the simplest system: nonionic wormlike surfactant micelles from polyoxyethylene alkyl ethers, CiEj. Our goal is to construct a molecular-thermodynamic model that agrees with the available experimental data. To this end, we systematized data for the micelle mean mass aggregation number, from which the micelle growth parameter was determined at various temperatures. None of the available models can give a quantitative description of these data. We constructed a new model, based on theoretical expressions for the interfacial-tension, headgroup-steric and chain-conformation components of the micelle free energy, along with appropriate expressions for the parameters of the model, including their temperature and curvature dependencies. Special attention was paid to the surfactant chain-conformation free energy, for which a new, more general formula was derived. As a result, relatively simple theoretical expressions are obtained. All parameters that enter these expressions are known, which facilitates the theoretical modeling of micelle growth for various nonionic surfactants in excellent agreement with experiment. The constructed model can serve as a basis that can be further upgraded to obtain a quantitative description of micelle growth in more complicated systems, including binary and ternary mixtures of nonionic, ionic and zwitterionic surfactants, which determine the viscosity and stability of various formulations in personal-care and household detergency.
Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
Feng, Chenghong; Bi, Zhe; Tang, Hongxiao
2015-01-06
Electrospray mass spectrometry has been reported as a novel technique for Al species identification, but to date, the working mechanism is not clear and no unanimous method exists for spectrum analysis of traditional Al salt flocculants, let alone for analysis of polyaluminum chloride (PAC) flocculants. Therefore, this paper introduces a novel theoretical calculation method to identify Al species from a mass spectrum, based on deducing changes in m/z (mass-to-charge ratio) and molecular formulas of oligomers in five typical PAC flocculants. The use of reference chemical species was specially proposed in the method to guarantee the uniqueness of the assigned species. The charge and mass reduction of the Al cluster was found to proceed by hydrolysis, gasification, and change of hydroxyl on the oxy bridge. The novel method was validated both qualitatively and quantitatively by comparing the results to those obtained with the (27)Al NMR spectrometry.
X-ray microbeam three-dimensional topography for dislocation strain-field analysis of 4H-SiC
NASA Astrophysics Data System (ADS)
Tanuma, R.; Mori, D.; Kamata, I.; Tsuchida, H.
2013-07-01
This paper describes the strain-field analysis of threading edge dislocations (TEDs) and basal-plane dislocations (BPDs) in 4H-SiC using x-ray microbeam three-dimensional (3D) topography. This 3D topography enables quantitative strain-field analysis, which measures images of effective misorientations (Δω maps) around the dislocations. A deformation-matrix-based simulation algorithm is developed to theoretically evaluate the Δω mapping. Systematic linear calculations can provide simulated Δω maps (Δωsim maps) of dislocations with different Burgers vectors, directions, and reflection vectors for the desired cross-sections. For TEDs and BPDs, Δω maps are compared with Δωsim maps, and their excellent correlation is demonstrated. Two types of asymmetric reflections, high- and low-angle incidence types, are compared. Strain analyses are also conducted to investigate BPD-TED conversion near an epilayer/substrate interface in 4H-SiC.
MTF measurements on real time for performance analysis of electro-optical systems
NASA Astrophysics Data System (ADS)
Stuchi, Jose Augusto; Signoreto Barbarini, Elisa; Vieira, Flavio Pascoal; dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fatima Maria Mitsue; Castro Neto, Jarbas C.; Linhari Rodrigues, Evandro Luis
2012-06-01
The need for methods and tools that assist in determining the performance of optical systems is increasing. One of the most widely used methods for analyzing optical systems is measurement of the Modulation Transfer Function (MTF), which provides a direct and quantitative verification of image quality. This paper presents the implementation of software to calculate the MTF of electro-optical systems. The software was used to calculate the MTF of a digital fundus camera, a thermal imager and an ophthalmologic surgery microscope. The MTF information aids the analysis of alignment and the measurement of optical quality, and also defines the limiting resolution of optical systems. The results obtained with the fundus camera and thermal imager were compared with theoretical values. For the microscope, the results were compared with the MTF measured for a Zeiss microscope, which is the quality standard for ophthalmological microscopes.
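A common way to compute an MTF in software (not necessarily the authors' implementation) is to Fourier-transform the measured line spread function and normalize at zero frequency. A Gaussian LSF is used below so the numerical result can be checked against the known analytic Gaussian MTF; all widths and pitches are illustrative.

```python
import numpy as np

# MTF as the normalized magnitude of the Fourier transform of the line
# spread function (LSF). For a Gaussian LSF of width sigma the analytic
# result is MTF(f) = exp(-2 * (pi * sigma * f)**2).
n, dx = 1024, 1e-3                     # samples and pixel pitch [mm]
sigma = 5e-3                           # LSF width [mm] (illustrative)
x = (np.arange(n) - n // 2) * dx
lsf = np.exp(-x**2 / (2 * sigma**2))

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                          # normalize to 1 at zero frequency
freq = np.fft.rfftfreq(n, d=dx)        # spatial frequency [cycles/mm]
```

In practice the LSF is obtained from a measured slit or differentiated edge image rather than a synthetic Gaussian, but the transform-and-normalize step is the same.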
NASA Astrophysics Data System (ADS)
Ye, Z.; Meng, Q.; Mohamadian, H. P.; Wang, J. T.; Chen, L.; Zhu, L.
2007-06-01
The formation of SI engine combustion deposits is a complex phenomenon that depends on various factors related to the fuel, oil, additives, and engine. The goal of this study is to examine the effects of operating conditions, gasoline, lubricating oil, and additives on deposit formation. Both an experimental investigation and a theoretical analysis are conducted on a single-cylinder engine. As a result, the impact of deposits on engine performance and exhaust emissions (HC, NOx) is indicated. Using samples from a cylinder head and exhaust pipe, and switching gases via the dual-gas method (N2, O2), the deposit formation mechanism is thoroughly investigated with the thermogravity analysis approach, where the roles of the organic, inorganic, and volatile components of fuel, additives, and oil in deposit formation are identified from thermogravity curves. A sustainable feedback control design is then proposed for potential emission control and performance optimization.
Error Analysis and Validation for Insar Height Measurement Induced by Slant Range
NASA Astrophysics Data System (ADS)
Zhang, X.; Li, T.; Fan, W.; Geng, X.
2018-04-01
The InSAR technique is an important method for large-area DEM extraction, and several factors significantly influence the accuracy of height measurement. In this research, the effect of slant-range error on InSAR height measurement was analyzed and discussed. Based on the theory of InSAR height measurement, an error propagation model was derived assuming no coupling among the different factors, which directly characterizes the relationship between slant-range error and height-measurement error. A theory-based analysis combined with TanDEM-X parameters was then implemented to quantitatively evaluate the influence of slant-range error on height measurement. In addition, a simulation validation of the slant-range-induced InSAR error model was performed on the basis of the SRTM DEM and TanDEM-X parameters. The spatial distribution characteristics and error propagation rule of InSAR height measurement were further discussed and evaluated.
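As a minimal first-order sketch of how a slant-range error maps into a height error, one can use only the simple side-looking geometry h = H - r·cosθ at a fixed look angle; the paper's full propagation model and the actual TanDEM-X geometry are more involved, and the numbers below are illustrative.

```python
import math

# h = H - r*cos(theta): to first order, a slant-range error dr produces a
# height error dh = -cos(theta)*dr at fixed look angle theta.
# Values are illustrative, loosely in the spaceborne SAR regime.
theta = math.radians(35.0)   # look angle (assumed)
dr = 1.0                     # slant-range error [m]

dh = -math.cos(theta) * dr   # resulting height error [m]
```

Even this crude model conveys the key scaling: height error grows linearly with range error, with a geometry-dependent factor of order one.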
2014-01-01
Background The complexity of protein glycosylation makes it difficult to characterize glycosylation patterns on a proteomic scale. In this study, we developed an integrated strategy for quantitative comparative analysis of N-glycosylation/glycoproteins from complex biological samples in a high-throughput manner. This strategy entailed separating and enriching glycopeptides/glycoproteins using lectin affinity chromatography, then tandem labeling them with 18O/16O to generate a mass shift of 6 Da between the paired glycopeptides, and finally analyzing them with liquid chromatography-mass spectrometry (LC-MS) and the automatic quantitative method we developed based on Mascot Distiller. Results The accuracy and repeatability of this strategy were first verified using standard glycoproteins; linearity was maintained within a range of 1:10–10:1. The peptide concentration ratios obtained by the self-built quantitative method were similar to both the manually calculated and theoretical values, with a standard deviation (SD) of 0.023–0.186 for glycopeptides. The feasibility of the strategy was further confirmed with serum from hepatocellular carcinoma (HCC) patients and healthy individuals; the expression of 44 glycopeptides and 30 glycoproteins were significantly different between HCC patient and control serum. Conclusions This strategy is accurate, repeatable, and efficient, and may be a useful tool for identification of disease-related N-glycosylation/glycoprotein changes. PMID:24428921
Rapid surface enhanced Raman scattering detection method for chloramphenicol residues
NASA Astrophysics Data System (ADS)
Ji, Wei; Yao, Weirong
2015-06-01
Chloramphenicol (CAP) is a widely used amide alcohol antibiotic that has been banned from use in food-producing animals in many countries. In this study, surface-enhanced Raman scattering (SERS) coupled with gold colloidal nanoparticles was used for the rapid analysis of CAP. Density functional theory (DFT) calculations were conducted with Gaussian 03 at the B3LYP level using the 3-21G(d) and 6-31G(d) basis sets to analyze the assignment of vibrations. The theoretical Raman spectrum of CAP was in complete agreement with the experimental spectrum: both exhibited three strong peaks characteristic of CAP at 1104 cm-1, 1344 cm-1 and 1596 cm-1, which were used for rapid qualitative analysis of CAP residues in food samples. The use of SERS for the measurement of CAP was explored by comparing different solvents, gold colloidal nanoparticle concentrations and absorption times. The detection limit of the method was determined to be 0.1 μg/mL under optimum conditions. The Raman peak at 1344 cm-1 was used as the index for quantitative analysis of CAP in food samples, with a linear correlation of R2 = 0.9802. Quantitative analysis of CAP residues in foods revealed that the SERS technique with gold colloidal nanoparticles is sensitive, of good stability and linearity, and suited for rapid analysis of CAP residues in a variety of food samples.
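The quantitative step, fitting peak intensity against concentration and reporting R², is ordinary linear regression. A sketch with invented intensity values (not the paper's data):

```python
import numpy as np

# Linear calibration of SERS peak intensity (1344 cm-1 band) vs. CAP
# concentration. Intensity values are invented for illustration only.
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0, 10.0])           # ug/mL
intensity = np.array([130., 540., 1020., 2080., 5150., 10050.])

slope, intercept = np.polyfit(conc, intensity, 1)
pred = slope * conc + intercept
ss_res = np.sum((intensity - pred)**2)
ss_tot = np.sum((intensity - intensity.mean())**2)
r2 = 1.0 - ss_res / ss_tot   # coefficient of determination
```

An unknown sample's concentration is then read off the fitted line as (measured_intensity - intercept) / slope, valid only inside the calibrated range.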
Hellen, Adam; Mandelis, Andreas; Finer, Yoav; Amaechi, Bennett T
2011-11-01
Human molars were subjected to demineralization in acid gel followed by incubation in remineralization solutions without or with fluoride (1 or 1000 ppm). Photothermal radiometry (PTR) and modulated luminescence (LUM) frequency scans were performed prior to and during de/remineralization treatments. Transverse Micro-Radiography (TMR) analysis followed at treatment conclusion to determine mineral loss and lesion depth. The remineralization process illustrated a complex interplay between surface and subsurface mineral deposition, confining the thermal-wave centroid toward the dominating layer. Experimental amplitudes and phases were fitted to a coupled diffuse-photon-density-wave and thermal-wave theoretical model used to quantitatively evaluate evolving changes in thermal and optical properties of de/remineralized enamel lesions. Additional information obtained from the LUM data corroborated the remineralization kinetics affecting the PTR signals. The results pointed to enhanced effectiveness of subsurface lesion remineralization in the presence of fluoride. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Theoretical analysis of field emission from a metal diamond cold cathode emitter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lerner, P.; Cutler, P.H.; Miskovsky, N.M.
Recently, Geis et al. [J. Vac. Sci. Technol. B 14, 2060 (1996)] proposed a cold cathode emitter based on a Spindt-type design using a diamond film doped by substitutional nitrogen. The device is characterized by high field emission currents at very low power. Two properties, the rough surface of the metallic injector and the negative electron affinity of the (111) surface of the diamond, are essential for its operation. We present a first consistent quantitative theory of the operation of a Geis-Spindt diamond field emitter. Its essential features are predicated on nearly zero-field conditions in the diamond beyond the depletion layer, quasiballistic transport in the conduction band, and applicability of a modified Fowler-Nordheim equation to the transmission of electrons through the Schottky barrier at the metal-diamond interface. Calculated results are in good qualitative and quantitative agreement with the experimental results of Geis et al. © 1997 American Vacuum Society.
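For orientation, the elementary (uncorrected) Fowler-Nordheim form shows the exponential field sensitivity characteristic of such emitters; the constants are the standard elementary FN values, and the work function is an arbitrary illustrative choice, not the barrier of the metal-diamond system analyzed here.

```python
import math

# Elementary Fowler-Nordheim current density (no image-charge correction):
#   J = (a * F**2 / phi) * exp(-b * phi**1.5 / F)
a = 1.54e-6   # A eV V^-2 (standard elementary FN constant)
b = 6.83e9    # V m^-1 eV^-1.5 (standard elementary FN constant)
phi = 4.5     # barrier height [eV] (illustrative)

def fn_current_density(F):
    """Current density [A/m^2] for local surface field F [V/m]."""
    return a * F**2 / phi * math.exp(-b * phi**1.5 / F)

# A 25% field increase raises J by more than an order of magnitude
ratio = fn_current_density(5e9) / fn_current_density(4e9)
```

This exponential dependence is why field-emission data are customarily analyzed on a Fowler-Nordheim plot, ln(J/F²) versus 1/F, where the elementary form is a straight line.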
Electrostatic Effects in Filamentous Protein Aggregation
Buell, Alexander K.; Hung, Peter; Salvatella, Xavier; Welland, Mark E.; Dobson, Christopher M.; Knowles, Tuomas P.J.
2013-01-01
Electrostatic forces play a key role in mediating interactions between proteins. However, gaining quantitative insights into the complex effects of electrostatics on protein behavior has proved challenging, due to the wide palette of scenarios through which both cations and anions can interact with polypeptide molecules in a specific manner or can result in screening in solution. In this article, we have used a variety of biophysical methods to probe the steady-state kinetics of fibrillar protein self-assembly in a highly quantitative manner to detect how it is modulated by changes in solution ionic strength. Due to the exponential modulation of the reaction rate by electrostatic forces, this reaction represents an exquisitely sensitive probe of these effects in protein-protein interactions. Our approach, which involves a combination of experimental kinetic measurements and theoretical analysis, reveals a hierarchy of electrostatic effects that control protein aggregation. Furthermore, our results provide a highly sensitive method for the estimation of the magnitude of binding of a variety of ions to protein molecules. PMID:23473495
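The ionic-strength dependence probed in this kind of study is often summarized by the Debye screening length; for a 1:1 electrolyte in water at 25 °C it reduces to the familiar textbook approximation λ_D ≈ 0.304 nm / √I, which is not a result of this paper.

```python
import math

# Debye screening length for a 1:1 aqueous electrolyte at 25 C:
# lambda_D [nm] ~ 0.304 / sqrt(I), with I the ionic strength in mol/L
# (standard textbook approximation).
def debye_length_nm(ionic_strength):
    return 0.304 / math.sqrt(ionic_strength)

low_salt = debye_length_nm(0.01)    # ~3 nm: electrostatics long-ranged
high_salt = debye_length_nm(1.0)    # ~0.3 nm: interactions strongly screened
```

Because the aggregation rate depends exponentially on the electrostatic barrier, even this modest change in screening length translates into the large rate modulations the authors exploit.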
The image of mathematics held by Irish post-primary students
NASA Astrophysics Data System (ADS)
Lane, Ciara; Stynes, Martin; O'Donoghue, John
2014-08-01
The image of mathematics held by Irish post-primary students was examined and a model for the image found was constructed. Initially, a definition for 'image of mathematics' was adopted with image of mathematics hypothesized as comprising attitudes, beliefs, self-concept, motivation, emotions and past experiences of mathematics. Research focused on students studying ordinary level mathematics for the Irish Leaving Certificate examination - the final examination for students in second-level or post-primary education. Students were aged between 15 and 18 years. A questionnaire was constructed with both quantitative and qualitative aspects. The questionnaire survey was completed by 356 post-primary students. Responses were analysed quantitatively using Statistical Package for the Social Sciences (SPSS) and qualitatively using the constant comparative method of analysis and by reviewing individual responses. Findings provide an insight into Irish post-primary students' images of mathematics and offer a means for constructing a theoretical model of image of mathematics which could be beneficial for future research.
Kheifets, Aaron; Freestone, David; Gallistel, C R
2017-07-01
In three experiments with mice (Mus musculus) and rats (Rattus norvegicus), we used a switch paradigm to measure quantitative properties of the interval-timing mechanism. We found that: 1) Rodents adjusted the precision of their timed switches in response to changes in the interval between the short and long feed latencies (the temporal goalposts). 2) The variability in the timing of the switch response was reduced or unchanged in the face of large trial-to-trial random variability in the short and long feed latencies. 3) The adjustment in the distribution of switch latencies in response to changes in the relative frequency of short and long trials was sensitive to the asymmetry in the Kullback-Leibler divergence. The three results suggest that durations are represented with adjustable precision, that they are timed by multiple timers, and that there is a trial-by-trial (episodic) record of feed latencies in memory. © 2017 Society for the Experimental Analysis of Behavior.
NASA Astrophysics Data System (ADS)
LoPresto, Michael C.
2014-09-01
What follows is a description of a theoretical model designed to calculate the playing frequencies of the musical pitches produced by a trombone. The model is based on quantitative treatments that demonstrate the effects of the flaring bell and cup-shaped mouthpiece sections on these frequencies and can be used to calculate frequencies that compare well to both the desired frequencies of the musical pitches and those actually played on a real trombone.
Chong, Bin; Yu, Dongliang; Jin, Rong; Wang, Yang; Li, Dongdong; Song, Ye; Gao, Mingqi; Zhu, Xufei
2015-04-10
Anodic TiO2 nanotubes have been studied extensively for many years; however, their growth kinetics remain unclear. A systematic study of the current transient under constant anodizing voltage has not been reported in the original literature. Here, a derivation and its corresponding theoretical formula are proposed to overcome this challenge. In this paper, theoretical expressions for the time-dependent ionic current and electronic current are derived to explore the anodizing process of Ti. The anodizing current-time curves under different anodizing voltages and different temperatures are investigated experimentally in the anodization of Ti. Furthermore, the quantitative relationship between the thickness of the barrier layer and the anodizing time, and the relationships between the ionic/electronic currents and temperature, are proposed in this paper. All of the current-transient plots can be fitted consistently by the proposed theoretical expressions. Additionally, this is the first time that the coefficient A of the exponential relationship (ionic current j_ion = A exp(BE)) has been determined under various temperatures and voltages. The results indicate that as temperature and voltage increase, the ionic and electronic currents both increase, and that temperature has a larger effect on the electronic current than on the ionic current. These results can advance the study of growth kinetics from a qualitative to a quantitative level.
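The exponential high-field law at the heart of this record can be linearized for fitting, which is presumably how coefficients such as A and B are extracted from current-transient data. A minimal sketch in Python, with purely illustrative A and B values (assumed for the example, not the paper's fitted coefficients):

```python
import numpy as np

# Hypothetical data obeying the high-field law j_ion = A * exp(B * E).
# A_true and B_true are illustrative assumptions, not values from the paper.
A_true, B_true = 1.0e-6, 2.5e-8        # units assumed: A/cm^2 and cm/V
E = np.linspace(5.0e6, 7.0e6, 20)      # electric field across the barrier layer
j_ion = A_true * np.exp(B_true * E)

# Linearize: ln(j_ion) = ln(A) + B * E, then recover A and B by least squares.
B_fit, lnA_fit = np.polyfit(E, np.log(j_ion), 1)
A_fit = np.exp(lnA_fit)
```

In practice the measured transients are noisy, so the fit would be run on many (E, j) pairs per temperature and voltage to tabulate A across conditions.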
NASA Astrophysics Data System (ADS)
Chong, Bin; Yu, Dongliang; Jin, Rong; Wang, Yang; Li, Dongdong; Song, Ye; Gao, Mingqi; Zhu, Xufei
2015-04-01
Anodic TiO2 nanotubes have been studied extensively for many years; however, their growth kinetics remain unclear. A systematic study of the current transient under constant anodizing voltage has not been reported in the original literature. Here, a derivation and its corresponding theoretical formula are proposed to overcome this challenge. In this paper, theoretical expressions for the time-dependent ionic current and electronic current are derived to explore the anodizing process of Ti. The anodizing current-time curves under different anodizing voltages and different temperatures are investigated experimentally in the anodization of Ti. Furthermore, the quantitative relationship between the thickness of the barrier layer and the anodizing time, and the relationships between the ionic/electronic currents and temperature, are proposed in this paper. All of the current-transient plots can be fitted consistently by the proposed theoretical expressions. Additionally, this is the first time that the coefficient A of the exponential relationship (ionic current j_ion = A exp(BE)) has been determined under various temperatures and voltages. The results indicate that as temperature and voltage increase, the ionic and electronic currents both increase, and that temperature has a larger effect on the electronic current than on the ionic current. These results can advance the study of growth kinetics from a qualitative to a quantitative level.
Wrinkle-stabilized metal-graphene hybrid fibers with zero temperature coefficient of resistance.
Fang, Bo; Xi, Jiabin; Liu, Yingjun; Guo, Fan; Xu, Zhen; Gao, Weiwei; Guo, Daoyou; Li, Peigang; Gao, Chao
2017-08-24
The interfacial adhesion between graphene and metals is poor, as metals tend to exhibit superlubricity on smooth graphene surfaces. This problem makes the free assembly of graphene and metals a major challenge, and therefore some desirable conducting properties (e.g., stable metal-like conductivities in air, lightweight yet flexible conductors, and ultralow temperature coefficient of resistance, TCR) that could in principle be realized by integrating the merits of graphene and metals remain at a theoretical level. This work proposes a wrinkle-stabilized approach to address the poor adhesion between graphene surfaces and metals. Cyclic voltammetry (CV) tests and theoretical analysis with Scharifker-Hills models demonstrate that multiscale wrinkles effectively induce nucleation of metal particles, locking in metal nuclei and guiding the continuous growth of metal islands in an instantaneous mode on the rough graphene surface. The universality and practicability of the wrinkle-stabilized approach are verified by our investigation of the electrodeposition of nine kinds of metals on graphene fibers (GF). The strong interface bonding permits metal-graphene hybrid fibers to show metal-level conductivities (up to 2.2 × 10^7 S m^-1, a record high value for GF in air), reliable weatherability, and favorable flexibility. Because of the negative TCR of graphene and the positive TCR of metals, the TCR of Cu- and Au-coated GFs reaches zero over a wide temperature range (15-300 K). For this layered model, quantitative analysis based on classical theories shows that the thickness ratio of the graphene layer to the metal layer required for zero TCR is 0.2, agreeing well with our experimental results. This wrinkle-stabilized approach and our theoretical analysis of the zero-TCR behavior of the graphene-metal system are conducive to the design of high-performance conducting materials based on graphene and metals.
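The zero-TCR condition for a layered conductor can be sketched as two sheets conducting in parallel. The numbers below are illustrative bulk-like assumptions, not the paper's fitted fiber properties, so the resulting ratio is not expected to reproduce the reported value of 0.2; the sign structure of the condition is the point of the sketch.

```python
# Two layers in parallel: sheet conductance G ∝ sigma_g*t_g + sigma_m*t_m.
# With sigma(T) ≈ sigma0 * (1 - tcr * dT) for small dT, dG/dT = 0 requires
#   sigma_g * t_g * tcr_g + sigma_m * t_m * tcr_m = 0,
# which fixes the thickness ratio t_g / t_m.
sigma_g, tcr_g = 8.0e5, -1.0e-3   # graphene layer: S/m and 1/K (assumed values)
sigma_m, tcr_m = 5.9e7, 3.9e-3    # copper layer:   S/m and 1/K (bulk-like values)

ratio = -(sigma_m * tcr_m) / (sigma_g * tcr_g)   # t_g / t_m for zero net TCR
```

A zero crossing exists only because one layer's TCR is negative; the actual ratio depends entirely on the layer conductivities and TCRs, which for the fibers in this work yield 0.2.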
Zhao, Daqiu; Gong, Saijie; Hao, Zhaojun; Meng, Jiasong; Tao, Jun
2015-01-01
Herbaceous peony (Paeonia lactiflora Pall.) is an emerging high-grade cut flower worldwide, which is usually used in wedding bouquets and known as the “wedding flower”. However, abundant lateral branches appear frequently in some excellent cultivars, and the lack of any method for removing Paeonia lactiflora lateral branches other than inefficient manual removal is an obstacle to improving the quality of its cut flowers. In this study, paclobutrazol (PBZ) application was found to inhibit the growth of lateral branches in Paeonia lactiflora for the first time, including a 96.82% decrease in lateral bud number per branch, 77.79% and 42.31% decreases in the length and diameter of lateral branches, respectively, reduced cell wall materials, and altered microstructures. Subsequently, isobaric tags for relative and absolute quantitation (iTRAQ) technology was used for quantitative proteomic analysis of lateral branches under PBZ application and control conditions. The results indicated that 178 differentially expressed proteins (DEPs) were successfully obtained, of which 98 were up-regulated and 80 were down-regulated. Thereafter, 34 candidate DEPs associated with the inhibited growth of lateral branches were screened according to their function and classification. These PBZ-stress-responsive candidate DEPs were involved in eight biological processes, which played a very important role in the growth and development of lateral branches together with the response to PBZ stress. These results provide a better understanding of the molecular theoretical basis for removing Paeonia lactiflora lateral branches using PBZ application. PMID:26473855
Work Site-Based Environmental Interventions to Reduce Sedentary Behavior: A Systematic Review.
Hutcheson, Amanda K; Piazza, Andrew J; Knowlden, Adam P
2018-01-01
The purpose of this investigation was to systematically review work site-based, environmental interventions to reduce sedentary behavior following preferred reporting items for systematic reviews and meta-analyses guidelines. Data were extracted from Medical Literature Analysis and Retrieval System Online, Cochrane Central Register of Controlled Trials, and Web of Science between January 2005 and December 2015. Inclusion criteria were work site interventions, published in peer-reviewed journals, employing environmental modalities, targeting sedentary behavior, and using any quantitative design. Exclusion criteria were noninterventions and non-English publications. Data extracted included study design, population, intervention dosage, intervention activities, evaluation measures, and intervention effects. Data were tabulated quantitatively and synthesized qualitatively. A total of 15 articles were identified for review and 14 reported statistically significant decreases in sedentary behavior. The majority of studies employed a randomized controlled trial design (n = 7), used inclinometers to measure sedentary behavior (n = 9), recruited predominantly female samples (n = 15), and utilized sit-to-stand desks as the primary intervention modality (n = 10). The mean methodological quality score was 6.2 out of 10. Environmental work site interventions to reduce sedentary behavior show promise because work sites often have more control over environmental factors. Limitations of this intervention stream include inconsistent measurement of sedentary behavior, absence of theoretical frameworks to guide program development, and absence of long-term evaluation. Future studies should include clear reporting of intervention strategies and explicit operationalization of theoretical constructs.
Multi-scale Modeling of Chromosomal DNA in Living Cells
NASA Astrophysics Data System (ADS)
Spakowitz, Andrew
The organization and dynamics of chromosomal DNA play a pivotal role in a range of biological processes, including gene regulation, homologous recombination, replication, and segregation. Establishing a quantitative theoretical model of DNA organization and dynamics would be valuable in bridging the gap between the molecular-level packaging of DNA and genome-scale chromosomal processes. Our research group utilizes analytical theory and computational modeling to establish a predictive theoretical model of chromosomal organization and dynamics. In this talk, I will discuss our efforts to develop multi-scale polymer models of chromosomal DNA that are both sufficiently detailed to address specific protein-DNA interactions while capturing experimentally relevant time and length scales. I will demonstrate how these modeling efforts are capable of quantitatively capturing aspects of behavior of chromosomal DNA in both prokaryotic and eukaryotic cells. This talk will illustrate that capturing dynamical behavior of chromosomal DNA at various length scales necessitates a range of theoretical treatments that accommodate the critical physical contributions that are relevant to in vivo behavior at these disparate length and time scales. National Science Foundation, Physics of Living Systems Program (PHY-1305516).
Quantitative interpretation of heavy ion effects: Comparison of different systems and endpoints
NASA Astrophysics Data System (ADS)
Kiefer, J.
For a quantitative interpretation of biological heavy ion action the following parameters have to be taken into account: variations of energy depositions in microscopical sites, the dependence of primary lesion formation on local energy density and changes in repairability. They can be studied in objects of different size and with different sensitivities. Results on survival and mutation induction in yeast and in mammalian cells will be compared with theoretical predictions. It is shown that shouldered survival curves of diploid yeast can be adequately described if the final slope is adjusted according to the varying production of primary lesions. This is not the case for mammalian cells where the experiments show a rapid loss of the shoulder with LET, contrary to theoretical expectations. This behaviour is interpreted to mean that the repairability of heavy ion lesions is different in the two systems. Mutation induction is theoretically expected to decrease with higher LET. This is found in yeast but not in mammalian cells where it actually increases. These results suggest a higher rate of misrepair in mammalian cells.
Otani, Takashi
2017-01-01
The article is an in-depth explanation of qualitative research, an approach increasingly prevalent among today's research communities. After discussing its present spread within the health sciences, the author addresses: 1. Its definition. 2. Its characteristics, as well as its theoretical and procedural background. 3. Its procedures. 4. Differences between qualitative and quantitative approaches. 5. Mixed methods incorporating quantitative research. And in conclusion: 6. The importance of establishing an epistemological perspective in qualitative research.
Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.
Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C
2016-07-21
Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods
NASA Astrophysics Data System (ADS)
Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C. W.; Lipiński, Wojciech; Bischof, John C.
2016-07-01
Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
Underestimating extreme events in power-law behavior due to machine-dependent cutoffs
NASA Astrophysics Data System (ADS)
Radicchi, Filippo
2014-11-01
Power-law distributions are typical macroscopic features occurring in almost all complex systems observable in nature. As a result, researchers in quantitative analyses must often generate random synthetic variates obeying power-law distributions. The task is usually performed through standard methods that map uniform random variates into the desired probability space. Whereas all these algorithms are theoretically solid, in this paper we show that they are subject to severe machine-dependent limitations. As a result, two dramatic consequences arise: (i) the sampling in the tail of the distribution is not random but deterministic; (ii) the moments of the sample distribution, which are theoretically expected to diverge as functions of the sample sizes, converge instead to finite values. We provide quantitative indications for the range of distribution parameters that can be safely handled by standard libraries used in computational analyses. Whereas our findings indicate possible reinterpretations of numerical results obtained through flawed sampling methodologies, they also pave the way for the search for a concrete solution to this central issue shared by all quantitative sciences dealing with complexity.
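The machine-dependent cutoff described here can be reproduced in a few lines: inverse-transform sampling maps a 53-bit uniform variate into the power-law tail, so the largest generable value is capped by the largest double strictly below 1. A sketch under these assumptions (function name and parameters are illustrative):

```python
import numpy as np

def power_law_variates(n, alpha, x_min=1.0, seed=None):
    """Standard inverse-transform sampling for p(x) ~ x^(-alpha), x >= x_min."""
    rng = np.random.default_rng(seed)
    u = rng.random(n)                    # uniform on [0, 1) with 53-bit resolution
    return x_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

# Since u never exceeds 1 - 2^-53, the sampler can never emit a value above
# this hard, machine-dependent cap; the tail beyond it is unreachable:
alpha, x_min = 2.5, 1.0
x_cap = x_min * (2.0 ** -53) ** (-1.0 / (alpha - 1.0))
```

Because no sample ever lands beyond x_cap, empirical moments that should diverge with sample size instead saturate at finite values, which is the effect quantified in the paper.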
Comparative Kinetic Analysis of Closed-Ended and Open-Ended Porous Sensors
NASA Astrophysics Data System (ADS)
Zhao, Yiliang; Gaur, Girija; Mernaugh, Raymond L.; Laibinis, Paul E.; Weiss, Sharon M.
2016-09-01
Efficient mass transport through porous networks is essential for achieving rapid response times in sensing applications utilizing porous materials. In this work, we show that open-ended porous membranes can overcome diffusion challenges experienced by closed-ended porous materials in a microfluidic environment. A theoretical model including both transport and reaction kinetics is employed to study the influence of flow velocity, bulk analyte concentration, analyte diffusivity, and adsorption rate on the performance of open-ended and closed-ended porous sensors integrated with flow cells. The analysis shows that open-ended pores enable analyte flow through the pores and greatly reduce the response time and analyte consumption for detecting large molecules with slow diffusivities compared with closed-ended pores for which analytes largely flow over the pores. Experimental confirmation of the results was carried out with open- and closed-ended porous silicon (PSi) microcavities fabricated in flow-through and flow-over sensor configurations, respectively. The adsorption behavior of small analytes onto the inner surfaces of closed-ended and open-ended PSi membrane microcavities was similar. However, for large analytes, PSi membranes in a flow-through scheme showed significant improvement in response times due to more efficient convective transport of analytes. The experimental results and theoretical analysis provide quantitative estimates of the benefits offered by open-ended porous membranes for different analyte systems.
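The advantage reported for flow-through membranes follows from a simple timescale comparison: in a flow-over cell the analyte must diffuse across the channel to reach the porous surface, and that diffusion time scales as h^2/D. The numbers below are illustrative assumptions, not the paper's parameters.

```python
# Back-of-the-envelope sketch of why flow-through pores help large,
# slowly diffusing analytes (all values assumed for illustration).
D_small = 1.0e-10   # m^2/s, assumed small-molecule diffusivity
D_large = 1.0e-11   # m^2/s, assumed diffusivity of a large protein
h = 50e-6           # m, assumed channel half-height

t_small = h**2 / D_small   # flow-over: time to reach the surface by diffusion
t_large = h**2 / D_large   # 10x longer for the slowly diffusing analyte
```

Flow-through replaces this diffusion-limited step with convective transport through the pores, which is why the membranes' advantage grows with analyte size.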
2017-05-26
AFRL-AFOSR-VA-TR-2017-0108: A Proposal to Perform New Theoretical and Experimental Research on Human Efficiency Through Developments Within Systems... AF Office of Scientific Research (AFOSR)/RTA2, Arlington, Virginia 22203; Air Force Research Laboratory, Air Force Materiel Command. Distribution A: approved for public release. Cited: Mathematical psychology. In APA Handbook of Research Methods in Psychology, Vol. 2: Research Designs: Quantitative, Qualitative.
An overview of quantitative approaches in Gestalt perception.
Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H
2016-09-01
Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Esposito, Alessandro
2006-05-01
This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These techniques have the capability to quantitatively probe the biochemical environment of fluorophores. An automated microscope capable of unsupervised operation has been developed that enables the investigation of molecular and cellular properties at high throughput levels and the analysis of cellular heterogeneity. State-of-the-art Förster Resonance Energy Transfer imaging, Fluorescence Lifetime Imaging Microscopy, Confocal Laser Scanning Microscopy and the newly developed tools have been combined with cellular and molecular biology techniques for the investigation of protein-protein interactions, oligomerization and post-translational modifications of α-Synuclein and Tau, two proteins involved in Parkinson’s and Alzheimer’s disease, respectively. The high inter-disciplinarity of this project required the merging of the expertise of both the Molecular Biophysics Group at the Debye Institute - Utrecht University and the Cell Biophysics Group at the European Neuroscience Institute - Göttingen University. This project was conducted also with the support and the collaboration of the Center for the Molecular Physiology of the Brain (Göttingen), particularly with the groups associated with the Molecular Quantitative Microscopy and Parkinson’s Disease and Aggregopathies areas. This work demonstrates that molecular and cellular quantitative microscopy can be used in combination with high-throughput screening as a powerful tool for the investigation of the molecular mechanisms of complex biological phenomena like those occurring in neurodegenerative diseases.
Dou, Shuping; Virostko, John; Greiner, Dale L.; Powers, Alvin C.; Liu, Guozheng
2016-01-01
Quantitative prediction of in vivo behavior using an in vitro assay would dramatically accelerate pharmaceutical development. However, studies quantitatively correlating in vivo properties with in vitro assay results are rare because of the difficulty in quantitatively understanding the in vivo behavior of an agent. We now demonstrate such a correlation as a case study based on our quantitative understanding of the in vivo chemistry. In an ongoing pretargeting project, we designed a trifunctional antibody (Ab) that concomitantly carried a biotin and a DNA analogue (hereafter termed MORF). The biotin and the MORF were fused into one structure prior to conjugation to the Ab for the concomitant attachment. Because it was known that avidin-bound Ab molecules leave the circulation rapidly, this design would theoretically allow complete clearance by avidin. The clearability of the trifunctional Ab was determined by calculating the blood MORF concentration ratio of avidin-treated Ab to non-avidin-treated Ab using mice injected with these compounds. In theory, any compromised clearability should be due to the presence of impurities. In vitro, we measured the biotinylated percentage of the Ab-reacting (MORF-biotin)⊃-NH2 modifier, by addition of streptavidin to the radiolabeled (MORF-biotin)⊃-NH2 samples and subsequent high-performance liquid chromatography (HPLC) analysis. On the basis of our previous quantitative understanding, we predicted that the clearability of the Ab would be equal to the biotinylation percentage measured via HPLC. We validated this prediction within a 3% difference. In addition to the high avidin-induced clearability of the trifunctional Ab (up to ~95%) achieved by the design, we were able to predict the required quality of the (MORF-biotin)⊃-NH2 modifier for any given in vivo clearability. 
This approach may greatly reduce the steps and time currently required in pharmaceutical development in the process of synthesis, chemical analysis, in vitro cell study, and in vivo validation. PMID:26103429
Does gratitude enhance prosociality?: A meta-analytic review.
Ma, Lawrence K; Tunney, Richard J; Ferguson, Eamonn
2017-06-01
Theoretical models suggest that gratitude is linked to increased prosociality. To date, however, there is a lack of a comprehensive quantitative synthesis of results to support this claim. In this review we aimed to (a) examine the overall strength of the association between gratitude and prosociality, and (b) identify the theoretical and methodological variables that moderate this link. We identified 252 effect sizes from 91 studies across 65 papers (total N = 18,342 participants). The present meta-analysis revealed a statistically significant, moderate positive correlation between gratitude and prosociality (r = .374). This association was significantly larger among studies that assessed reciprocal outcomes relative to nonreciprocal outcomes, and in particular among studies that examined direct, as compared with indirect, reciprocity. Studies that examined gratitude as an affective state reported significantly larger effect sizes than studies assessing gratitude as a trait. Studies that examined benefit-triggered gratitude (in response to another's kindness) had a stronger effect than generalized gratitude, which focuses on the appreciation of what is valued and cherished in life. Finally, studies that manipulated gratitude in vivo (e.g., economic games) had larger effect sizes compared with those based on recalled incidents when the person felt grateful. We describe the theoretical and practical significance of the results. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
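Pooling correlations in a meta-analysis of this kind conventionally goes through the Fisher z-transform, sketched below as background; the only number taken from the review is the pooled r.

```python
import math

# Fisher z-transform: correlations are combined on the z scale (where the
# sampling distribution is approximately normal), then transformed back.
r = 0.374                                 # pooled gratitude-prosociality correlation
z = 0.5 * math.log((1.0 + r) / (1.0 - r))
r_back = math.tanh(z)                     # inverse transform recovers r
```

Study weights (typically n - 3 per study) and moderator comparisons would then operate on the z values rather than on raw correlations.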
Field-theoretic approach to fluctuation effects in neural networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buice, Michael A.; Cowan, Jack D.; Mathematics Department, University of Chicago, Chicago, Illinois 60637
A well-defined stochastic theory for neural activity, which permits the calculation of arbitrary statistical moments and equations governing them, is a potentially valuable tool for theoretical neuroscience. We produce such a theory by analyzing the dynamics of neural activity using field-theoretic methods for nonequilibrium statistical processes. Assuming that neural network activity is Markovian, we construct the effective spike model, which describes both neural fluctuations and response. This analysis leads to a systematic expansion of corrections to mean field theory, which for the effective spike model is a simple version of the Wilson-Cowan equation. We argue that neural activity governed by this model exhibits a dynamical phase transition which is in the universality class of directed percolation. More general models (which may incorporate refractoriness) can exhibit other universality classes, such as dynamic isotropic percolation. Because of the extremely high connectivity in typical networks, it is expected that higher-order terms in the systematic expansion are small for experimentally accessible measurements, and thus, consistent with measurements in neocortical slice preparations, we expect mean field exponents for the transition. We provide a quantitative criterion for the relative magnitude of each term in the systematic expansion, analogous to the Ginzburg criterion. Experimental identification of dynamic universality classes in vivo is an outstanding and important question for neuroscience.
The dynamics of adapting, unregulated populations and a modified fundamental theorem.
O'Dwyer, James P
2013-01-06
A population in a novel environment will accumulate adaptive mutations over time, and the dynamics of this process depend on the underlying fitness landscape: the fitness of and mutational distance between possible genotypes in the population. Despite its fundamental importance for understanding the evolution of a population, inferring this landscape from empirical data has been problematic. We develop a theoretical framework to describe the adaptation of a stochastic, asexual, unregulated, polymorphic population undergoing beneficial, neutral and deleterious mutations on a correlated fitness landscape. We generate quantitative predictions for the change in the mean fitness and within-population variance in fitness over time, and find a simple, analytical relationship between the distribution of fitness effects arising from a single mutation, and the change in mean population fitness over time: a variant of Fisher's 'fundamental theorem' which explicitly depends on the form of the landscape. Our framework can therefore be thought of in three ways: (i) as a set of theoretical predictions for adaptation in an exponentially growing phase, with applications in pathogen populations, tumours or other unregulated populations; (ii) as an analytically tractable problem to potentially guide theoretical analysis of regulated populations; and (iii) as a basis for developing empirical methods to infer general features of a fitness landscape.
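As a point of reference for the variant derived in this record, the classical relationship can be sketched as follows; the mutational-input term written here, with genomic beneficial mutation rate $U$ and mean fitness effect $\langle s \rangle$ of a new mutation, is an assumption of this sketch rather than the paper's landscape-dependent form:

```latex
\frac{d\bar{w}}{dt} \;=\; \mathrm{Var}(w) \;+\; U\,\langle s \rangle
```

Fisher's original statement keeps only the variance term; the paper's modification makes the second term depend explicitly on the distribution of fitness effects set by the correlated landscape.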
Morrison, P; Burnard, P
1989-04-01
The theoretical framework known as Six Category Intervention Analysis is described. This framework has been used in the teaching of interpersonal skills in various settings but there appears to be little or no empirical work to test out the theory. In the present study, an instrument was devised for assessing student nurses' perceptions of their interpersonal skills based on the category analysis. The findings of the study are presented and a quantitative comparison is made with the results of an earlier study of trained nurses' perceptions. Marked similarities were noted between the two sets of findings. The key trend to emerge was that both groups of nurses tended to perceive themselves as being more authoritative and less facilitative in their interpersonal relationships, in terms of the category analysis. This trend and others are discussed and suggestions made for future directions in research and training in the field of interpersonal skills in nursing. Implications for the theory of six category intervention analysis are also discussed.
Hydrodynamics Analysis and CFD Simulation of Portal Venous System by TIPS and LS.
Wang, Meng; Zhou, Hongyu; Huang, Yaozhen; Gong, Piyun; Peng, Bing; Zhou, Shichun
2015-06-01
In cirrhotic patients, portal hypertension is often associated with hyperdynamic changes. Transjugular intrahepatic portosystemic shunt (TIPS) and laparoscopic splenectomy are both treatments for liver cirrhosis due to portal hypertension. However, the two interventions have different postoperative effects on hemodynamics, and their likelihood of triggering portal vein thrombosis (PVT) differs. How the hemodynamics of the portal venous system evolve after the two operations remains unknown. Based on ultrasound data and established numerical methods, CFD techniques are applied to analyze hemodynamic changes after TIPS and laparoscopic splenectomy. In this paper, we applied two 3-D flow models to the hemodynamic analysis of two patients who received a TIPS and a laparoscopic splenectomy, respectively, both therapies for treating diseases induced by portal hypertension. The current computer simulations give a quantitative analysis of the interplay between hemodynamics and TIPS or splenectomy. In conclusion, the presented computational model can be used for the theoretical analysis of TIPS and laparoscopic splenectomy, and clinical decisions for properly personalized treatment could be based on the simulation results.
NASA Astrophysics Data System (ADS)
Sreejith, S. S.; Mohan, Nithya; Kurup, M. R. Prathapachandra
2018-02-01
A trinuclear Zn2La Schiff base complex was synthesized by a slow solvent evaporation technique from a Zn(II) mononuclear metalloligand by 2:1 addition with La(NO3)3 salt. Single crystal XRD analysis revealed a rare nitrato-bridged trinuclear entity which is seldom seen in this class of ligand systems. Qualitative and quantitative analyses of intermolecular interactions/short contacts were carried out using Hirshfeld surface and 2D fingerprint analysis. The thermally stable, blue-luminescent compound exhibits an internal heavy atom effect, thereby quenching the emission intensity of the ligand. DFT calculations were performed on the compound to analyze frontier orbitals, and ESP plots were used to identify nucleophilic/electrophilic regions on the compound and their implications for hydrogen bonding. A comparison of the bond orders and atomic charges of the trinuclear compound and the Zn(II) metalloligand precursor was performed to substantiate the formation of the trinuclear product through ligand exchange.
Thermally induced oscillations in fluid flow
NASA Technical Reports Server (NTRS)
Zuber, N.
1970-01-01
Theoretical investigation distinguishes the various mechanisms responsible for oscillations of pressure, temperature, and flow velocity, derives a quantitative description of the most troublesome mechanisms, and develops a capability to predict the occurrence of unstable flow.
Development of test methodology for dynamic mechanical analysis instrumentation
NASA Technical Reports Server (NTRS)
Allen, V. R.
1982-01-01
Dynamic mechanical analysis instrumentation was used to develop specific test methodology for determining engineering parameters of selected materials, especially plastics and elastomers, over a broad range of temperatures in selected environments. The methodology for routine procedures was established with specific attention given to sample geometry, sample size, and mounting techniques. The basic software of the duPont 1090 thermal analyzer was used for data reduction, which simplified the theoretical interpretation. Clamps were developed which allowed 'relative' damping during the cure cycle to be measured for the fiberglass-supported resin. The correlation of fracture energy 'toughness' (or impact strength) with the low-temperature (glassy) relaxation responses for a 'rubber-modified' epoxy system was negative in result, because the low-temperature dispersion mode (-80 C) of the modifier coincided with that of the epoxy matrix, making quantitative comparison unrealistic.
Comprehensive proteomic analysis of Penicillium verrucosum.
Nöbauer, Katharina; Hummel, Karin; Mayrhofer, Corina; Ahrens, Maike; Setyabudi, Francis M C; Schmidt-Heydt, Markus; Eisenacher, Martin; Razzazi-Fazeli, Ebrahim
2017-05-01
Mass spectrometric identification of proteins in species lacking validated sequence information is a major problem in veterinary science. In the present study, we used ochratoxin A-producing Penicillium verrucosum to identify and quantitatively analyze proteins of an organism for which no protein information is yet available. The work presented here aimed to provide a comprehensive protein identification of P. verrucosum using shotgun proteomics. We were able to identify 3631 proteins in an "ab initio" translated database from DNA sequences of P. verrucosum. Additionally, a sequential window acquisition of all theoretical fragment-ion spectra analysis was done to find differentially regulated proteins at two different time points of the growth curve. We compared the proteins at the beginning (day 3) and at the end of the log phase (day 12). © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
HPTLC in Herbal Drug Quantification
NASA Astrophysics Data System (ADS)
Shinde, Devanand B.; Chavan, Machindra J.; Wakte, Pravin S.
For the past few decades, compounds from natural sources have been gaining importance because of the vast chemical diversity they offer. This has led to a phenomenal increase in the demand for herbal medicines in the last two decades, and a need has been felt for ensuring the quality, safety, and efficacy of herbal drugs. Phytochemical evaluation is one of the tools for quality assessment, and includes preliminary phytochemical screening, chemoprofiling, and marker compound analysis using modern analytical techniques. High-performance thin-layer chromatography (HPTLC) has emerged as an important tool for the qualitative, semiquantitative, and quantitative phytochemical analysis of herbal drugs and formulations. This includes developing TLC fingerprinting profiles and estimating biomarkers. This review attempts to focus on the theoretical considerations of HPTLC and some examples of herbal drugs and formulations analyzed by HPTLC.
Solar Prominence Fine Structure and Dynamics
NASA Astrophysics Data System (ADS)
Berger, Thomas
2014-01-01
We review recent observational and theoretical results on the fine structure and dynamics of solar prominences, beginning with an overview of prominence classifications, the proposal of a possible new "funnel prominence" classification, and a discussion of the recent "solar tornado" findings. We then focus on quiescent prominences to review formation, down-flow dynamics, and the "prominence bubble" phenomenon. We show new observations of the prominence bubble Rayleigh-Taylor instability triggered by a Kelvin-Helmholtz shear flow instability occurring along the bubble boundary. Finally, we review recent studies on the plasma composition of bubbles, emphasizing that differential emission measure (DEM) analysis offers a more quantitative analysis than photometric comparisons. In conclusion, we discuss the relation of prominences to coronal magnetic flux ropes, proposing that prominences can be understood as partially ionized condensations of plasma forming the return flow of a general magneto-thermal convection in the corona.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tralshawala, Nilesh; Howard, Don; Knight, Bryon
2008-02-28
In conventional infrared thermography, determination of thermal diffusivity requires thickness information. Recently GE has been experimenting with the use of lateral heat flow to determine thermal diffusivity without thickness information. This work builds on previous work at NASA Langley and Wayne State University, but we incorporate thermal time-of-flight (tof) analysis rather than curve fitting to obtain quantitative information. We have developed appropriate theoretical models and a tof-based data analysis framework to experimentally determine all components of thermal diffusivity from the time-temperature measurements. Initial validation was carried out using finite difference simulations. Experimental validation was done using anisotropic carbon fiber reinforced polymer (CFRP) composites. We found that in the CFRP samples used, the in-plane component of diffusivity is about eight times larger than the through-thickness component.
NASA Astrophysics Data System (ADS)
Cheng, Tian-Le; Ma, Fengde D.; Zhou, Jie E.; Jennings, Guy; Ren, Yang; Jin, Yongmei M.; Wang, Yu U.
2012-01-01
Diffuse scattering contains rich information on various structural disorders, thus providing a useful means to study the nanoscale structural deviations from the average crystal structures determined by Bragg peak analysis. Extraction of maximal information from diffuse scattering requires concerted efforts in high-quality three-dimensional (3D) data measurement, quantitative data analysis and visualization, theoretical interpretation, and computer simulations. Such an endeavor is undertaken to study the correlated dynamic atomic position fluctuations caused by thermal vibrations (phonons) in precursor state of shape-memory alloys. High-quality 3D diffuse scattering intensity data around representative Bragg peaks are collected by using in situ high-energy synchrotron x-ray diffraction and two-dimensional digital x-ray detector (image plate). Computational algorithms and codes are developed to construct the 3D reciprocal-space map of diffuse scattering intensity distribution from the measured data, which are further visualized and quantitatively analyzed to reveal in situ physical behaviors. Diffuse scattering intensity distribution is explicitly formulated in terms of atomic position fluctuations to interpret the experimental observations and identify the most relevant physical mechanisms, which help set up reduced structural models with minimal parameters to be efficiently determined by computer simulations. Such combined procedures are demonstrated by a study of phonon softening phenomenon in precursor state and premartensitic transformation of Ni-Mn-Ga shape-memory alloy.
Rastogi, Tushar; Leder, Christoph; Kümmerer, Klaus
2014-09-01
The presence of micro-pollutants (active pharmaceutical ingredients, APIs) is increasingly seen as a challenge for the sustainable management of water resources worldwide, due to ineffective effluent treatment and other shortfalls in input prevention. Therefore, novel approaches are needed, such as designing greener pharmaceuticals with better biodegradability in the environment. This study addresses a tiered approach of implementing green and sustainable chemistry principles for theoretically designing better biodegradable and pharmacologically improved pharmaceuticals. A photodegradation process coupled with LC-MS(n) analysis and in silico tools such as quantitative structure-activity relationship (QSAR) analysis and molecular docking proved to be a very significant approach for the preliminary stages of designing chemical structures that would fit the "benign by design" concept in the direction of green and sustainable pharmacy. Metoprolol (MTL) was used as an example, as it is itself not readily biodegradable under conditions found in sewage treatment and the aquatic environment. The study provides the theoretical design of new derivatives of MTL which might have the same or improved pharmacological activity and are more degradable in the environment than MTL. However, in silico toxicity prediction by QSAR indicated that a few of the photo-TPs might be mutagenic and require further testing. This novel approach of theoretically designing 'green' pharmaceuticals can be considered a step forward for the green and sustainable pharmacy field. However, more knowledge and further experience have to be collected on the full scope, opportunities and limitations of this approach. Copyright © 2014 Elsevier Ltd. All rights reserved.
Li, Junhua; Sun, Runguang; Hao, Changchun; He, Guangxiao; Zhang, Lei; Wang, Juan
2015-10-01
Cytochrome c (Cyt c) is an essential component of the inner mitochondrial respiratory chain because of its function of transferring electrons. This feature is closely related to the interaction between Cyt c and membrane lipids. We used the Langmuir-Blodgett monolayer technique combined with AFM to study the interaction of Cyt c with lipid monolayers at the air-buffer interface. In our work, by comparing the mixed Cyt c-anionic (DPPS) and Cyt c-zwitterionic (DPPC/DPPE) monolayers, the adsorption capacity of Cyt c on lipid monolayers is DPPS>DPPE>DPPC, which is attributed to their different headgroup structures. π-A isotherm data show that Cyt c reaches its maximum adsorption quantity on the lipid monolayer at v = 2.5 μL. Moreover, Cyt c molecules form aggregates and drag some lipids with them into the subphase if the protein exceeds the maximum adsorption quantity. The π-T curve indicates that it takes more time for the Cyt c molecular conformation to rearrange on a DPPE monolayer than on DPPC. The compressibility study reveals that the adsorption or intermolecular aggregation of Cyt c molecules on a lipid monolayer changes membrane fluidity. In order to quantitatively estimate the adsorption properties of Cyt c molecules on lipid monolayers, we fit the experimental isotherm with a simple surface state equation. A theoretical model is also introduced to analyze the liquid expanded (LE) to liquid condensed (LC) phase transition of the DPPC monolayer. The results of the theoretical analysis are in good agreement with the experiment. Copyright © 2015 Elsevier B.V. All rights reserved.
Mitani, Yuji; Kubo, Mamoru; Muramoto, Ken-ichiro; Fukuma, Takeshi
2009-08-01
We have developed a wideband digital frequency detector for high-speed frequency modulation atomic force microscopy (FM-AFM). We used a subtraction-based phase comparator (PC) in a phase-locked loop circuit instead of a commonly used multiplication-based PC, which has enhanced the detection bandwidth to 100 kHz. The quantitative analysis of the noise performance revealed that the internal noise from the developed detector is small enough to provide the theoretically limited noise performance in FM-AFM experiments in liquid. FM-AFM imaging of mica in liquid was performed with the developed detector, showing its stability and applicability to true atomic-resolution imaging in liquid.
Analytical method for thermal stress analysis of plasma facing materials
NASA Astrophysics Data System (ADS)
You, J. H.; Bolt, H.
2001-10-01
The thermo-mechanical response of plasma facing materials (PFMs) to heat loads from the fusion plasma is one of the crucial issues in fusion technology. In this work, a fully analytical description of the thermal stress distribution in armour tiles of plasma facing components is presented which is expected to occur under typical high heat flux (HHF) loads. The method of stress superposition is applied considering the temperature gradient and thermal expansion mismatch. Several combinations of PFMs and heat sink metals are analysed and compared. In the framework of the present theoretical model, plastic flow and the effect of residual stress can be quantitatively assessed. Possible failure features are discussed.
Magnetic field induced transition in superconducting LaTiO3/SrTiO3 interfaces
NASA Astrophysics Data System (ADS)
Biscaras, J.; Bergeal, N.; Hurand, S.; Feuillet-Palma, C.; Rastogi, A.; Budhani, R. C.; Grilli, M.; Caprara, S.; Lesueur, J.
2013-07-01
Superconductivity at the LaTiO3/SrTiO3 interface is studied by low temperature and high magnetic field measurements as a function of a back-gate voltage. We show that it is intimately related to the appearance of a low density (a few 10^12 cm^-2) of high mobility carriers, in addition to low mobility ones always present in the system. These carriers form superconducting puddles coupled by a metallic two-dimensional electron gas, as revealed by the analysis of the phase transition driven by a perpendicular magnetic field. Two critical fields are evidenced, and a quantitative comparison with a recent theoretical model is made.
Stochasticity in the signalling network of a model microbe
NASA Astrophysics Data System (ADS)
Bischofs, Ilka; Foley, Jonathan; Battenberg, Eric; Fontaine-Bodin, Lisa; Price, Gavin; Wolf, Denise; Arkin, Adam
2007-03-01
The soil-dwelling bacterium Bacillus subtilis is an excellent model organism for studying stochastic stress response induction in an isoclonal population. Subjected to the same stressor, cells undergo different cell fates, including sporulation, competence, degradative enzyme synthesis and motility. For example, under conditions of nutrient deprivation and high cell density, only a portion of the cell population forms an endospore. Here we use a combined experimental and theoretical approach to study stochastic sporulation induction in Bacillus subtilis. Using several fluorescent reporter strains, we apply time-lapse fluorescence microscopy in combination with quantitative image analysis to study cell fate progression on a single-cell basis and elucidate key noise generators in the underlying cellular network.
Classification of large-sized hyperspectral imagery using fast machine learning algorithms
NASA Astrophysics Data System (ADS)
Xia, Junshi; Yokoya, Naoto; Iwasaki, Akira
2017-07-01
We present a framework of fast machine learning algorithms for the classification of large-sized hyperspectral images, from a theoretical to a practical viewpoint. In particular, we assess the performance of random forest (RF), rotation forest (RoF), and extreme learning machine (ELM), as well as ensembles of RF and ELM. These classifiers are applied to two large-sized hyperspectral images and compared to support vector machines. To give a quantitative analysis, we pay attention to comparing these methods when working with high input dimensions and a limited/sufficient training set. Moreover, other important issues, such as the computational cost and robustness against noise, are also discussed.
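One of the fast classifiers compared above, the extreme learning machine, can be sketched in a few lines: a fixed random hidden layer followed by a least-squares readout. The synthetic two-class "pixel" data, layer sizes, and split below are invented for the example and do not reproduce the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "pixels": two classes in a 10-band feature space,
# separable by the sum of the first two bands (illustrative data).
n, d = 400, 10
X = rng.normal(size=(n, d))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Extreme learning machine: random hidden layer, least-squares readout.
def elm_fit(X, y, n_hidden=200, seed=1):
    r = np.random.default_rng(seed)
    W = r.normal(size=(X.shape[1], n_hidden))
    b = r.normal(size=n_hidden)
    H = np.tanh(X @ W + b)        # random feature map (never trained)
    T = np.eye(2)[y]              # one-hot targets
    beta = np.linalg.pinv(H) @ T  # Moore-Penrose least-squares solution
    return W, b, beta

def elm_predict(X, model):
    W, b, beta = model
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

model = elm_fit(X[:300], y[:300])
acc = np.mean(elm_predict(X[300:], model) == y[300:])
```

Training reduces to one pseudoinverse, which is the source of the speed advantage discussed in the abstract.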
Self-organized plasmonic metasurfaces for all-optical modulation
NASA Astrophysics Data System (ADS)
Della Valle, G.; Polli, D.; Biagioni, P.; Martella, C.; Giordano, M. C.; Finazzi, M.; Longhi, S.; Duò, L.; Cerullo, G.; Buatier de Mongeot, F.
2015-06-01
We experimentally demonstrate a self-organized metasurface with a polarization dependent transmittance that can be dynamically controlled by optical means. The configuration consists of tightly packed plasmonic nanowires with a large dispersion of width and height produced by the defocused ion-beam sputtering of a thin gold film supported on a silica glass. Our results are quantitatively interpreted according to a theoretical model based on the thermomodulational nonlinearity of gold and a finite-element numerical analysis of the absorption and scattering cross-sections of the nanowires. We found that the polarization sensitivity of the metasurface can be strongly enhanced by pumping with ultrashort laser pulses, leading to potential applications in ultrafast all-optical modulation and switching of light.
Analysis of cellular signal transduction from an information theoretic approach.
Uda, Shinsuke; Kuroda, Shinya
2016-03-01
Signal transduction processes the information of various cellular functions, including cell proliferation, differentiation, and death. The information for controlling cell fate is transmitted by concentrations of cellular signaling molecules. However, how much information is transmitted in signaling pathways has thus far not been investigated. Shannon's information theory paves the way to quantitatively analyze information transmission in signaling pathways. The theory has recently been applied to signal transduction, and mutual information of signal transduction has been determined to be a measure of information transmission. We review this work and provide an overview of how signal transduction transmits informational input and exerts biological output. Copyright © 2015 Elsevier Ltd. All rights reserved.
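The central quantity in such an information-theoretic analysis, the mutual information between signal and response, can be sketched for discrete distributions; the two joint tables below are textbook toy channels, not data from the paper.

```python
import numpy as np

def mutual_information(p_joint):
    """I(X;Y) in bits from a joint probability table p(x, y)."""
    px = p_joint.sum(axis=1, keepdims=True)  # marginal p(x)
    py = p_joint.sum(axis=0, keepdims=True)  # marginal p(y)
    nz = p_joint > 0                         # skip log(0) terms
    return float(np.sum(p_joint[nz] * np.log2(p_joint[nz] / (px @ py)[nz])))

# A noiseless binary channel transmits exactly one bit...
ident = np.array([[0.5, 0.0], [0.0, 0.5]])
# ...while a channel with independent input and output transmits none.
noise = np.array([[0.25, 0.25], [0.25, 0.25]])
```

Applied to signaling data, the joint table would be estimated from measured stimulus-response pairs.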
Ezard, Thomas H.G.; Jørgensen, Peter S.; Zimmerman, Naupaka; Chamberlain, Scott; Salguero-Gómez, Roberto; Curran, Timothy J.; Poisot, Timothée
2014-01-01
Proficiency in mathematics and statistics is essential to modern ecological science, yet few studies have assessed the level of quantitative training received by ecologists. To do so, we conducted an online survey. The 937 respondents were mostly early-career scientists who studied biology as undergraduates. We found a clear self-perceived lack of quantitative training: 75% were not satisfied with their understanding of mathematical models; 75% felt that the level of mathematics was “too low” in their ecology classes; 90% wanted more mathematics classes for ecologists; and 95% more statistics classes. Respondents thought that 30% of classes in ecology-related degrees should be focused on quantitative disciplines, which is likely higher than for most existing programs. The main suggestion to improve quantitative training was to relate theoretical and statistical modeling to applied ecological problems. Improving quantitative training will require dedicated, quantitative classes for ecology-related degrees that contain good mathematical and statistical practice. PMID:24688862
Theoretical model for plasmonic photothermal response of gold nanostructures solutions
NASA Astrophysics Data System (ADS)
Phan, Anh D.; Nga, Do T.; Viet, Nguyen A.
2018-03-01
Photothermal effects of gold core-shell nanoparticles and nanorods dispersed in water are theoretically investigated using the transient bioheat equation and the extended Mie theory. Properly calculating the absorption cross section is an extremely crucial step in determining the elevation of the solution temperature. The nanostructures are assumed to be randomly and uniformly distributed in the solution. Compared to previous experiments, our theoretical temperature increase during laser light illumination shows reasonable qualitative and quantitative agreement across various systems. This approach can be a highly reliable tool to predict photothermal effects in experimentally unexplored structures. We also validate our approach and discuss its limitations.
Acioli, Paulo H.; Jellinek, Julius
2017-07-14
A theoretical/computational description and analysis of the spectra of electron binding energies of Al12^-, Al13^- and Al12Ni^- clusters, which differ in size and/or composition by a single atom yet possess strikingly different measured photoelectron spectra, is presented. It is shown that the measured spectra can not only be reproduced computationally with quantitative fidelity (this is achieved through a combination of state-of-the-art density functional theory with a highly accurate scheme for conversion of the Kohn-Sham eigenenergies into electron binding energies) but also explained in terms of the effects of size, structure/symmetry and composition. Furthermore, a new methodology is developed and applied that provides for disentanglement and differential assignment of the separate roles played by size, structure/symmetry and composition in defining the observed differences in the measured spectra. The methodology is general and applicable to any finite system, homogeneous or heterogeneous. Finally, we project that in combination with advances in synthesis techniques this methodology will become an indispensable computation-based aid in the design of controlled synthesis protocols for manufacture of nanosystems and nanodevices with precisely desired electronic and other characteristics.
NASA Astrophysics Data System (ADS)
Groby, Jean-Philippe; Wirgin, Armand
2008-02-01
We address the problem of the response to a seismic wave of an urban site consisting of Nb blocks overlying a soft layer underlain by a hard substratum. The results of a theoretical analysis, appealing to a space-frequency mode-matching (MM) technique, are compared to those obtained by a space-time finite-element (FE) technique. The two methods are shown to give rise to the same prediction of the seismic response for Nb = 1, 2 and 40 blocks. The mechanism of the interaction between blocks and the ground, as well as that of the mutual interaction between blocks, are studied. It is shown, in the first part of this paper, that the presence of a small number of blocks modifies the seismic disturbance in a manner which evokes qualitatively, but not quantitatively, what was observed during the 1985 Michoacan earthquake in Mexico City. Anomalous earthquake response at a much greater level, in terms of duration, peak and cumulative amplitude of motion, is shown, by a theoretical and numerical analysis in the second part of this paper, to be induced by the presence of a large (>=10) number of identical equi-spaced blocks that are present in certain districts of many cities.
The Sonic Altimeter for Aircraft
NASA Technical Reports Server (NTRS)
Draper, C S
1937-01-01
Discussed here are results already achieved with sonic altimeters in light of the theoretical possibilities of such instruments. From the information gained in this investigation, a procedure is outlined to determine whether or not a further development program is justified by the value of the sonic altimeter as an aircraft instrument. The information available in the literature is reviewed and condensed into a summary of sonic altimeter developments. Various methods of receiving the echo and timing the interval between the signal and the echo are considered. A theoretical discussion is given of sonic altimeter errors due to uncertainties in timing, variations in sound velocity, aircraft speed, location of the sending and receiving units, and inclinations of the flight path with respect to the ground surface. Plots are included which summarize the results in each case. An analysis is given of the effect of an inclined flight path on the frequency of the echo. A brief study of the acoustical phases of the sonic altimeter problem is carried through. The results of this analysis are used to predict approximately the maximum operating altitudes of a reasonably designed sonic altimeter under very good and very bad conditions. A final comparison is made between the estimated and experimental maximum operating altitudes which shows good agreement where quantitative information is available.
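The core timing relation of a sonic altimeter, altitude as half the echo delay times the local speed of sound, can be sketched as follows; the ideal-gas temperature correction is standard and the numbers are illustrative.

```python
# Echo-timing altitude estimate: h = v * t / 2, with a simple
# ideal-gas temperature correction for the speed of sound in air.
def sonic_altitude(echo_delay_s, temp_c=15.0):
    v = 331.3 * (1.0 + temp_c / 273.15) ** 0.5  # speed of sound, m/s
    return v * echo_delay_s / 2.0

h = sonic_altitude(1.0)  # a 1 s echo delay at 15 C is roughly 170 m
```

The sound-velocity term is one of the error sources the report analyzes: an error in assumed temperature propagates linearly into the altitude estimate.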
Urban, Jan; Hrouzek, Pavel; Stys, Dalibor; Martens, Harald
2013-01-01
Responsivity is a conversion qualification of a measurement device, given by the functional dependence between the input and output quantities. A concentration-response-dependent calibration curve represents the simplest experiment for the measurement of responsivity in mass spectrometry. The cyanobacterial hepatotoxin microcystin-LR content in complex biological matrices of food additives was chosen as a model example of a typical problem. The calibration curves for pure microcystin and its mixtures with extracts of green alga and fish meat were reconstructed from the series of measurements. A novel approach for the quantitative estimation of ion competition in ESI is proposed in this paper. We define the correlated responsivity offset in the intensity values using the approximation of minimal correlation given by the matrix to the target mass values of the analyte. The estimation of the matrix influence enables the approximation of the position of an a priori unknown responsivity, and was easily evaluated using a simple algorithm. The method itself is directly derived from the basic attributes of the theory of measurements. There is sufficient agreement between the theoretical and experimental values. However, some theoretical issues are discussed to avoid misinterpretations and excessive expectations.
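The simplest responsivity experiment described above, a linear calibration curve, amounts to a least-squares fit whose intercept can absorb a constant matrix offset. The concentration and intensity values below are invented for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical calibration points: known analyte concentrations against
# the measured MS intensity (all numbers invented for the example).
conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # e.g. ng/mL
intensity = np.array([0.1, 2.2, 4.1, 8.0, 16.2])  # arbitrary counts

# Linear response model I = a*c + b: the slope a is the responsivity,
# while a nonzero intercept b plays the role of a constant matrix offset.
a, b = np.polyfit(conc, intensity, 1)
```

A matrix-dependent shift of the whole curve would show up as a change in b at unchanged a, which is the kind of offset the paper's approach aims to estimate.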
Hydrodynamic Properties of Planing Surfaces and Flying Boats
NASA Technical Reports Server (NTRS)
Sokolov, N. A.
1950-01-01
The study of the hydrodynamic properties of the planing bottoms of flying boats and seaplane floats is at the present time based exclusively on the curves of towing tests conducted in tanks. In order to provide a rational basis for the test procedure in tanks and practical design data, a theoretical study must be made of the flow at the step, and relations must be derived that show not only qualitatively but quantitatively the inter-relations of the various factors involved. The general solution of the problem of the development of hydrodynamic forces during the motion of a seaplane float or flying boat is very difficult, for it requires a three-dimensional solution, which does not always permit reducing the analysis to workable computation formulas. On the other hand, the problem is complicated by the fact that the analysis is concerned with two fluid mediums, namely air and water, which have a surface of density discontinuity between them. The theoretical and experimental investigations on the hydrodynamics of a ship cannot be completely carried over to the design of floats and flying-boat hulls, because of the difference in the shape of the contour lines of the bodies and because of the entirely different flow conditions from the hydrodynamic viewpoint.
Darvasi, A.; Soller, M.
1994-01-01
Selective genotyping is a method to reduce costs in marker-quantitative trait locus (QTL) linkage determination by genotyping only those individuals with extreme, and hence most informative, quantitative trait values. The DNA pooling strategy (termed "selective DNA pooling") takes this one step further by pooling DNA from the selected individuals at each of the two phenotypic extremes and basing the test for linkage on marker allele frequencies as estimated from the pooled samples only. This can reduce the genotyping costs of marker-QTL linkage determination by up to two orders of magnitude. Theoretical analysis of selective DNA pooling shows that for experiments involving backcross, F2, and half-sib designs, the power of selective DNA pooling for detecting genes with large effect can be the same as that obtained by individual selective genotyping. Power for detecting genes with small effect, however, was found to decrease strongly with increasing technical error in estimating allele frequencies in the pooled samples. The effect of technical error can, however, be markedly reduced by replication of the technical procedures. It is also shown that a selected proportion of 0.1 at each tail will be appropriate for a wide range of experimental conditions. PMID:7896115
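The pooled-tail linkage test described in this abstract can be illustrated with a toy z-statistic. The variance partition below (binomial sampling variance plus technical error, with replication shrinking the technical term) is a hypothetical simplification for illustration, not the paper's exact derivation:

```python
import math

def pool_linkage_z(p_high, p_low, n_per_tail, tech_se=0.0, replicates=1):
    """Illustrative z-statistic for marker-QTL linkage from pooled allele
    frequencies (a sketch, not Darvasi & Soller's exact test).

    p_high, p_low : marker allele frequency estimates in the two tail pools
    n_per_tail    : individuals pooled at each phenotypic extreme
    tech_se       : technical standard error of one frequency estimate
    replicates    : technical replicates; replication reduces technical error
    """
    # Binomial sampling variance from 2n chromosomes per pool, plus the
    # technical (measurement) variance averaged over replicates.
    p_bar = (p_high + p_low) / 2
    samp_var = p_bar * (1 - p_bar) / (2 * n_per_tail)
    var = 2 * (samp_var + tech_se**2 / replicates)
    return (p_high - p_low) / math.sqrt(var)
```

Consistent with the abstract, adding technical error lowers the statistic, and replicating the technical procedure recovers part of the lost power.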
Tanase, Mihai; Waliszewski, Przemyslaw
2015-12-01
We propose a novel approach for the quantitative evaluation of aggressiveness in prostate carcinomas. The spatial distribution of cancer cell nuclei was characterized by the global spatial fractal dimensions D0, D1, and D2. Two hundred eighteen prostate carcinomas were stratified into the classes of equivalence using results of ROC analysis. A simulation of the cellular automata mix defined a theoretical frame for a specific geometric representation of the cell nuclei distribution called a local structure correlation diagram (LSCD). The LSCD and dispersion Hd were computed for each carcinoma. Data mining generated some quantitative criteria describing tumor aggressiveness. Alterations in tumor architecture along progression were associated with some changes in both shape and the quantitative characteristics of the LSCD consistent with those in the automata mix model. Low-grade prostate carcinomas with low complexity and very low biological aggressiveness are defined by the condition D0 < 1.545 and Hd < 38. High-grade carcinomas with high complexity and very high biological aggressiveness are defined by the condition D0 > 1.764 and Hd < 38. The novel homogeneity measure Hd identifies carcinomas with very low aggressiveness within the class of complexity C1 or carcinomas with very high aggressiveness in the class C7. © 2015 Wiley Periodicals, Inc.
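The capacity dimension D0 used to stratify the carcinomas can be estimated from nucleus centroid coordinates by box counting. The sketch below is a generic toy estimator, assuming a 2-D point pattern; the paper's global fractal dimensions and LSCD are computed from histology images by a more elaborate pipeline:

```python
import numpy as np

def box_counting_d0(points, box_sizes):
    """Estimate the capacity (box-counting) dimension D0 of a 2-D point
    pattern, e.g. cancer-cell nucleus centroids. Illustrative only."""
    pts = np.asarray(points, dtype=float)
    # Normalize coordinates to the unit square.
    pts = (pts - pts.min(axis=0)) / np.ptp(pts, axis=0)
    counts = []
    for eps in box_sizes:
        # Count the boxes of side eps that contain at least one point.
        idx = np.floor(pts / eps).astype(int)
        counts.append(len({tuple(i) for i in idx}))
    # D0 is the slope of log N(eps) versus log(1/eps).
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)),
                          np.log(counts), 1)
    return slope
```

A space-filling random pattern yields D0 near 2, while values such as the abstract's thresholds (1.545, 1.764) indicate progressively more complex, less uniform nuclear distributions.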
Wang, Xueding; Xu, Yilian; Yang, Lu; Lu, Xiang; Zou, Hao; Yang, Weiqing; Zhang, Yuanyuan; Li, Zicheng; Ma, Menglin
2018-03-01
A series of 1,3,5-triazines were synthesized and their UV absorption properties were measured. Computational chemistry methods were used to construct a quantitative structure-property relationship (QSPR), which was applied to the computer-aided design of new 1,3,5-triazine ultraviolet absorber compounds. The experimental UV absorption data are in good agreement with those predicted using time-dependent density functional theory (TD-DFT) [B3LYP/6-311+G(d,p)]. A suitable forecasting model (R > 0.8, P < 0.0001) was obtained. A predictive three-dimensional quantitative structure-property relationship (3D-QSPR) model was established using the multifit molecular alignment rule of the Sybyl program, whose conclusions are consistent with the TD-DFT calculations. The exceptional photostability of such ultraviolet absorber compounds was studied and attributed principally to their ability to undergo excited-state deactivation via ultrafast excited-state intramolecular proton transfer (ESIPT). The intramolecular hydrogen bond (IMHB) of the 1,3,5-triazine compounds is the basis for the excited-state proton transfer; it was explored by IR spectroscopy, UV spectra, the structural and energetic aspects of different conformers, and frontier molecular orbital analysis.
Input-output characterization of fiber reinforced composites by P waves
NASA Technical Reports Server (NTRS)
Renneisen, John D.; Williams, James H., Jr.
1990-01-01
Input-output characterization of fiber composites is studied theoretically by tracing P waves in the media. A new path notation is developed to aid in tracing the P-wave and reflection-generated SV-wave paths in the continuum plate. A theoretical output voltage from the receiving transducer is calculated for a tone burst. The study enhances the quantitative and qualitative understanding of the nondestructive evaluation of fiber composites that can be modeled as transversely isotropic media.
Classical and quantum magnetism in giant Keplerate magnetic molecules.
Müller, A; Luban, M; Schröder, C; Modler, R; Kögerler, P; Axenovich, M; Schnack, J; Canfield, P; Bud'ko, S; Harrison, N
2001-09-17
Complementary theoretical modeling methods are presented for the classical and quantum Heisenberg model to explain the magnetic properties of nanometer-sized magnetic molecules. Excellent quantitative agreement is achieved between our experimental data down to 0.1 K and for fields up to 60 Tesla and our theoretical results for the giant Keplerate species {Mo72Fe30}, by far the largest paramagnetic molecule synthesized to date. © 2001 WILEY-VCH Verlag GmbH, Weinheim, Fed. Rep. of Germany.
Mechanism of unpinning spirals by a series of stimuli
NASA Astrophysics Data System (ADS)
Gao, Xiang; Zhang, Hong
2014-06-01
Antitachycardia pacing (ATP) is widely used to terminate tachycardia before it proceeds to lethal fibrillation. The important prerequisite for successful ATP is the unpinning of spirals anchored to an obstacle by a series of stimuli. Here, to understand the mechanism of unpinning spirals by ATP, we propose a theoretical explanation based on a nonlinear eikonal relation and a kinematical model. The theoretical results are quantitatively consistent with the numerical simulations at both weak and strong excitabilities.
Watkins, Herschel M.; Vallée-Bélisle, Alexis; Ricci, Francesco; Makarov, Dmitrii E.; Plaxco, Kevin W.
2012-01-01
Surface-tethered biomolecules play key roles in many biological processes and biotechnologies. However, while the physical consequences of such surface attachment have seen significant theoretical study, to date this issue has seen relatively little experimental investigation. In response, we present here a quantitative experimental and theoretical study of the extent to which attachment to a charged (but otherwise apparently inert) surface alters the folding free energy of a simple biomolecule. Specifically, we have measured the folding free energy of a DNA stem-loop both in solution and when site-specifically attached to a negatively charged, hydroxyl-alkane-coated gold surface. We find that, whereas surface attachment is destabilizing at low ionic strength, it becomes stabilizing at ionic strengths above ~130 mM. This behavior presumably reflects two competing mechanisms: excluded-volume effects, which stabilize the folded conformation by reducing the entropy of the unfolded state, and electrostatics, which, at lower ionic strengths, destabilizes the more compact folded state via repulsion from the negatively charged surface. To test this hypothesis we have employed existing theories of the electrostatics of surface-bound polyelectrolytes and the entropy of surface-bound polymers to model both effects. Despite lacking any fitted parameters, these theoretical models quantitatively fit our experimental results, suggesting that, for this system, current knowledge of both surface electrostatics and excluded-volume effects is reasonably complete and accurate. PMID:22239220
Hu, Rujun; Gao, Huiming; Ye, Yansheng; Ni, Zhihong; Jiang, Ning; Jiang, Xiaolian
2018-03-01
In recent years, the flipped classroom approach has been broadly applied to nursing courses in China. However, a systematic and quantitative assessment of the outcomes of this approach has not been conducted. The purpose of this meta-analysis is to evaluate the effectiveness of flipped classroom pedagogy in Chinese baccalaureate nursing education. Meta-analysis of randomized controlled studies. All randomized controlled trials relevant to the use of flipped classrooms in Chinese nursing education were retrieved from the following databases from their date of inception through September 23, 2017: PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, CINAHL, the China National Knowledge Infrastructure, the Wanfang Database, and the Chinese Scientific Journals Database. Search terms including "flipp*", "inverted", "classroom", and "nurs*" were used to identify potential studies. We also manually searched the reference lists of the retrieved articles to identify potentially relevant studies. Two reviewers independently assessed the eligibility of each study and extracted the data. The Cochrane risk-of-bias tool was used to evaluate the quality of the studies. RevMan (Version 5.3) was used to analyze the data. Theoretical knowledge scores and skill scores (continuous data) were synthesized using the standardized mean difference (SMD) and 95% confidence interval (CI). The statistical heterogeneity of the included studies was analyzed by calculating the I² statistic and applying a chi-square test. Publication bias was assessed by funnel plots. The quality of the combined results was evaluated using the Grading of Recommendations Assessment, Development and Evaluation system. Eleven randomized controlled trials published between 2015 and 2017 were selected. All the included studies had a moderate possibility of bias due to low methodological quality.
The meta-analysis indicated that the theoretical knowledge scores and skill scores were significantly higher in the flipped classroom group than in the traditional lecture group (SMD = 1.06, 95% CI: 0.70-1.41, P < 0.001, and SMD = 1.40, 95% CI: 0.46-2.34, P < 0.001). There was no significant publication bias indicated in the primary analysis. Sensitivity analysis showed that the results of our meta-analysis were reliable. The evidence grades of the results regarding the theoretical knowledge and skill scores were low and very low, respectively. Flipped classroom pedagogy is more effective than traditional lectures at improving students' theoretical knowledge and skill scores. Given the limitations of the included studies, more robust randomized controlled trials are warranted in a variety of educational settings to confirm our findings. Copyright © 2017 Elsevier Ltd. All rights reserved.
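The SMD pooling reported above can be sketched with Hedges' g per trial and an inverse-variance fixed-effect combination. The trial numbers in the test are hypothetical; this is the standard textbook calculation, not the review's actual data:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference (Hedges' g) and its variance for one
    trial: group 1 (e.g. flipped classroom) vs group 2 (e.g. lectures)."""
    # Pooled standard deviation across the two arms.
    s_pool = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / s_pool
    j = 1 - 3 / (4 * (n1 + n2) - 9)          # small-sample correction
    g = j * d
    var = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))
    return g, var

def pool_fixed(effects):
    """Inverse-variance fixed-effect pooled SMD with a 95% CI."""
    w = [1 / v for _, v in effects]
    smd = sum(wi * g for wi, (g, _) in zip(w, effects)) / sum(w)
    se = math.sqrt(1 / sum(w))
    return smd, (smd - 1.96 * se, smd + 1.96 * se)
```

An SMD of 1.06 as reported in the abstract corresponds to a large effect by the usual Cohen benchmarks.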
Publication Trends in Thanatology: An Analysis of Leading Journals.
Wittkowski, Joachim; Doka, Kenneth J; Neimeyer, Robert A; Vallerga, Michael
2015-01-01
To identify important trends in thanatology as a discipline, the authors analyzed over 1,500 articles that appeared in Death Studies and Omega over a 20-year period, coding the category of articles (e.g., theory, application, empirical research), their content focus (e.g., bereavement, death attitudes, end-of-life), and for empirical studies, their methodology (e.g., quantitative, qualitative). In general, empirical research predominates in both journals, with quantitative methods outnumbering qualitative procedures 2 to 1 across the period studied, despite an uptick in the latter methods in recent years. Purely theoretical articles, in contrast, decline in frequency. Research on grief and bereavement is the most commonly occurring (and increasing) content focus of this work, with a declining but still substantial body of basic research addressing death attitudes. Suicidology is also well represented in the corpus of articles analyzed. In contrast, publications on topics such as death education, medical ethics, and end-of-life issues occur with lower frequency, in the latter instances likely due to the submission of such work to more specialized medical journals. Differences in emphasis of Death Studies and Omega are noted, and the analysis of publication patterns is interpreted with respect to overall trends in the discipline and the culture, yielding a broad depiction of the field and some predictions regarding its possible future.
Zhang, Tisheng; Niu, Xiaoji; Ban, Yalong; Zhang, Hongping; Shi, Chuang; Liu, Jingnan
2015-01-01
A GNSS/INS deeply-coupled system can improve satellite signal tracking performance under dynamics by INS aiding of the tracking loops. However, no literature was available on the complete modeling of the INS branch in the INS-aided tracking loop, leaving no theoretical tool to guide the selection of inertial sensors, parameter optimization, and quantitative analysis of INS-aided PLLs. This paper focuses on modeling the INS branch and optimizing the parameters of phase-locked loops (PLLs) in the scalar-based GNSS/INS deeply-coupled system. It establishes the transfer function between all known error sources and the PLL tracking error, which can be used to quantitatively evaluate how a candidate inertial measurement unit (IMU) affects the carrier phase tracking error. Based on that, a steady-state error model is proposed to design INS-aided PLLs and to analyze their tracking performance. Building on the modeling and error analysis, an integrated deeply-coupled hardware prototype is developed, with optimization of the aiding information. Finally, the performance of the INS-aided PLLs designed with the proposed steady-state error model is evaluated through simulation and road tests of the hardware prototype. PMID:25569751
Characterization methods for liquid interfacial layers
NASA Astrophysics Data System (ADS)
Javadi, A.; Mucic, N.; Karbaschi, M.; Won, J. Y.; Lotfi, M.; Dan, A.; Ulaganathan, V.; Gochev, G.; Makievski, A. V.; Kovalchuk, V. I.; Kovalchuk, N. M.; Krägel, J.; Miller, R.
2013-05-01
Liquid interfaces are encountered everywhere in our daily life. The corresponding interfacial properties and their modification play an important role in many modern technologies. The most prominent examples are the processes involved in the formation of foams and emulsions, as they are based on the fast creation of new surfaces, often of immense extent. During the formation of an emulsion, for example, all freshly created and already existing interfaces are permanently subject to all types of deformation. This clearly entails the need for quantitative knowledge of the relevant dynamic interfacial properties and their changes under conditions pertinent to the technological processes. We report on the state of the art of interfacial layer characterization, including the determination of thermodynamic quantities as a baseline for further quantitative analysis of the more important dynamic interfacial characteristics. The main focus of the present work is on the experimental possibilities currently available for obtaining dynamic interfacial parameters, such as interfacial tensions, adsorbed amounts, interfacial composition, and visco-elastic parameters, at the shortest available surface ages and the fastest possible interfacial perturbations. The experimental opportunities are presented along with examples for selected systems and theoretical models for the best data analysis. We also report on simulation results and concepts for the necessary refinements and developments in this important field of interfacial dynamics.
A quantitative analysis of hydraulic interaction processes in stream-aquifer systems
Wang, Wenke; Dai, Zhenxue; Zhao, Yaqian; ...
2016-01-28
The hydraulic relationship between the stream and aquifer can be altered from hydraulic connection to disconnection when the pumping rate exceeds the maximum seepage flux of the streambed. This study proposes to quantitatively analyze the physical processes of stream-aquifer systems from connection to disconnection. A free water table equation is adopted to clarify under what conditions a stream starts to separate hydraulically from an aquifer. Both the theoretical analysis and laboratory tests have demonstrated that the hydraulic connectedness of the stream-aquifer system can reach a critical disconnection state when the horizontal hydraulic gradient at the free water surface is equal to zero and the vertical gradient is equal to 1. A boundary-value problem for movement of the critical point of disconnection is established for an analytical solution of the inverted water table movement beneath the stream. The result indicates that the maximum distance or thickness of the inverted water table is equal to the water depth in the stream, and at a steady state of disconnection, the maximum hydraulic gradient at the streambed center is 2. In conclusion, this study helps us to understand the hydraulic phenomena of water flow near streams and accurately assess surface water and groundwater resources.
Humpback whale bioacoustics: From form to function
NASA Astrophysics Data System (ADS)
Mercado, Eduardo, III
This thesis investigates how humpback whales produce, perceive, and use sounds from a comparative and computational perspective. Biomimetic models are developed within a systems-theoretic framework and then used to analyze the properties of humpback whale sounds. First, sound transmission is considered in terms of possible production mechanisms and the propagation characteristics of shallow water environments frequented by humpback whales. A standard source-filter model (used to describe human sound production) is shown to be well suited for characterizing sound production by humpback whales. Simulations of sound propagation based on normal mode theory reveal that optimal frequencies for long range propagation are higher than the frequencies used most often by humpbacks, and that sounds may contain spectral information indicating how far they have propagated. Next, sound reception is discussed. A model of human auditory processing is modified to emulate humpback whale auditory processing as suggested by cochlear anatomical dimensions. This auditory model is used to generate visual representations of humpback whale sounds that more clearly reveal what features are likely to be salient to listening whales. Additionally, the possibility that an unusual sensory organ (the tubercle) plays a role in acoustic processing is assessed. Spatial distributions of tubercles are described that suggest tubercles may be useful for localizing sound sources. Finally, these models are integrated with self-organizing feature maps to create a biomimetic sound classification system, and a detailed analysis of individual sounds and sound patterns in humpback whale 'songs' is performed. This analysis provides evidence that song sounds and sound patterns vary substantially in terms of detectability and propagation potential, suggesting that they do not all serve the same function. 
New quantitative techniques are also presented that allow for more objective characterizations of the long term acoustic features of songs. The quantitative framework developed in this thesis provides a basis for theoretical consideration of how humpback whales (and other cetaceans) might use sound. Evidence is presented suggesting that vocalizing humpbacks could use sounds not only to convey information to other whales, but also to collect information about other whales. In particular, it is suggested that some sounds currently believed to be primarily used as communicative signals, might be primarily used as sonar signals. This theoretical framework is shown to be generalizable to other baleen whales and to toothed whales.
NASA Astrophysics Data System (ADS)
Li, Jiajia; Li, Rongxi; Zhao, Bangsheng; Guo, Hui; Zhang, Shuan; Cheng, Jinghua; Wu, Xiaoli
2018-04-01
The use of micro-laser Raman spectroscopy for quantitatively determining gas carbon isotope composition is presented. In this study, 12CO2 and 13CO2 were mixed with N2 at various molar fraction ratios to obtain the Raman quantification factors F12CO2 and F13CO2, which provide a theoretical basis for calculating the δ13C value. The corresponding values were 0.523 (0 < C12CO2/CN2 < 2) and 1.11998 (0 < C13CO2/CN2 < 1.5), respectively. It was shown that the representative Raman peak areas can be used to determine δ13C values within a relative error range of 0.076% to 1.154% in 13CO2/12CO2 binary mixtures when F12CO2/F13CO2 is 0.466972625. In addition, measurements of δ13C values by micro-laser Raman analysis were carried out on natural CO2 gas from the Shengli oil field at room temperature under different pressures. The δ13C values obtained by micro-laser Raman spectroscopy and isotope ratio mass spectrometry (IRMS) are in good agreement with each other, with a relative error range of 1.232%-6.964%. This research provides a fundamental analysis tool for quantitatively determining gas carbon isotope composition (δ13C values) by micro-laser Raman spectroscopy. The experimental results demonstrate that this method has the potential for obtaining δ13C values in natural CO2 gas reservoirs.
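The δ13C calculation implied by the quantification factors can be sketched as follows. The conversion area = F × concentration and the F values (0.523, 1.11998) come from the abstract; the VPDB reference ratio is the standard one, and the peak areas in the test are hypothetical:

```python
R_VPDB = 0.0112372  # standard 13C/12C ratio of the VPDB reference

def delta13C(area_13co2, area_12co2, f13=1.11998, f12=0.523):
    """delta13C (per mil) from the 13CO2 and 12CO2 Raman peak areas.

    Relative molar amounts follow from area = F * concentration,
    so C = A / F for each isotopologue (illustrative sketch)."""
    r_sample = (area_13co2 / f13) / (area_12co2 / f12)
    return (r_sample / R_VPDB - 1) * 1000.0
```

Because 13CO2 scatters more strongly per mole (F13 > F12), ignoring the factor ratio of 0.466972625 would bias the apparent isotope ratio by more than a factor of two.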
ERIC Educational Resources Information Center
Psacharopoulos, George
1990-01-01
Replies to commentaries on the relationship of comparative education theory and practice, addressing the beneficiaries of educational planning and policymaking, the planner's role, evaluation criteria for educational planning, choice of discipline, simplification of theory, analytical versus quantitative research, theoretical foundations of…
NASA Astrophysics Data System (ADS)
Sun, Qiming; Melnikov, Alexander; Wang, Jing; Mandelis, Andreas
2018-04-01
A rigorous treatment of the nonlinear behavior of photocarrier radiometric (PCR) signals is presented theoretically and experimentally for the quantitative characterization of semiconductor photocarrier recombination and transport properties. A frequency-domain model based on the carrier rate equation and the classical carrier radiative recombination theory was developed. The derived concise expression reveals different functionalities of the PCR amplitude and phase channels: the phase bears direct quantitative correlation with the carrier effective lifetime, while the amplitude versus the estimated photocarrier density dependence can be used to extract the equilibrium majority carrier density and thus, resistivity. An experimental ‘ripple’ optical excitation mode (small modulation depth compared to the dc level) was introduced to bypass the complicated ‘modulated lifetime’ problem so as to simplify theoretical interpretation and guarantee measurement self-consistency and reliability. Two Si wafers with known resistivity values were tested to validate the method.
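The abstract states that the PCR phase bears a direct quantitative correlation with the carrier effective lifetime. A minimal sketch of that inversion, assuming the single-pole frequency response phase = -atan(2πfτ) that follows from a simple carrier rate equation (a common simplification, not the authors' full nonlinear model):

```python
import math

def effective_lifetime_from_phase(phase_rad, mod_freq_hz):
    """Carrier effective lifetime from the PCR phase lag at modulation
    frequency f, assuming phase = -atan(2*pi*f*tau). Illustrative only."""
    return math.tan(-phase_rad) / (2 * math.pi * mod_freq_hz)
```

In the 'ripple' excitation mode described above, the dc-dominated carrier density keeps the lifetime effectively constant over a modulation cycle, which is what makes such a closed-form inversion self-consistent.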
NASA Astrophysics Data System (ADS)
Kumar, Amit; Kumar, Rajesh; Gupta, Archana; Tandon, Poonam; D'silva, E. Deepak
2017-12-01
A collective experimental and theoretical study was conducted on the molecular structure and vibrational spectra of the nonlinear optical chalcone derivative (2E)-3-[4-(methylsulfanyl) phenyl]-1-(3-bromophenyl) prop-2-en-1-one (3Br4MSP). The FT-IR and FT-Raman spectra of the molecule in the solid phase have been recorded. Density functional theory (DFT) calculations at the B3LYP level with the 6-311++G(d,p) basis set have been carried out to derive useful information about the molecular structure and to assign the relevant electronic and vibrational features. These calculations reveal that the optimized geometry closely resembles the experimental XRD data. The vibrational spectra were analyzed on the basis of the potential energy distribution (PED) of each vibrational mode, which allowed us to obtain a quantitative as well as qualitative interpretation of the FT-IR and FT-Raman spectra. The UV-vis spectrum was recorded in methanol solution. The excited-state properties have been determined by the TD-DFT method and the effect of solvent was analyzed with the PCM model. The most prominent transition corresponds to π→π*. The reactivity parameters, such as chemical potential, global hardness, and electrophilicity index, have also been calculated. To provide an explicit assignment and analysis of the 13C and 1H NMR spectra, theoretical calculations of the chemical shifts of the title compound were done using the GIAO method at the B3LYP/6-311++G(d,p) level. The Mulliken population analysis gives one of the simplest pictures of the charge distribution. The standard statistical thermodynamic functions, namely the heat capacity at constant pressure (C°p,m), entropy (S°m), and enthalpy (H°m), were obtained from the theoretical harmonic frequencies for the optimized molecule. The nonlinear optical properties of the title molecule are also addressed theoretically.
Two contributions, vibrational and electronic, to the electrical properties, the polarizability and the first-order hyperpolarizability of 3Br4MSP, have been evaluated using self-consistent field wave functions within the double harmonic oscillator approximation.
Alagoz, Baris Baykant; Deniz, Furkan Nur; Keles, Cemal; Tan, Nusret
2015-03-01
This study investigates the disturbance rejection capacity of closed loop control systems by means of the reference to disturbance ratio (RDR). The RDR analysis calculates the ratio of reference signal energy to disturbance signal energy at the system output and provides a quantitative evaluation of the disturbance rejection performance of control systems on the basis of communication channel limitations. Essentially, RDR provides a straightforward analytical method for the comparison and improvement of the implicit disturbance rejection capacity of closed loop control systems. Theoretical analyses demonstrate that the RDR of a negative feedback closed loop control system is determined by the energy spectral density of the controller transfer function. On this basis, the authors derive design criteria for the disturbance rejection performance of PID and fractional order PID (FOPID) controller structures. RDR spectra are calculated to investigate the frequency dependence of disturbance rejection capacity, and spectral RDR analyses are carried out for PID and FOPID controllers. For the validation of the theoretical results, simulation examples are presented. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
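The spectral analysis described above can be sketched for a PID controller, taking the RDR spectrum as the controller's energy spectral density, RDR(ω) = |C(jω)|². This reading of the criterion and the gains used below are assumptions for illustration, not the paper's worked example:

```python
import numpy as np

def rdr_spectrum_pid(kp, ki, kd, omega):
    """RDR spectrum of a unity-feedback loop with a PID controller
    C(s) = kp + ki/s + kd*s, evaluated on the jw axis (sketch)."""
    c = kp + ki / (1j * omega) + kd * (1j * omega)
    return np.abs(c) ** 2
```

The integral term makes |C(jω)|² diverge as ω → 0, reflecting the familiar fact that integral action gives asymptotically perfect rejection of constant disturbances.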
NASA Astrophysics Data System (ADS)
Guisasola, Jenaro; Ceberio, Mikel; Zubimendi, José Luis
2006-09-01
The study we present explores how first-year engineering students formulate hypotheses in order to construct their own problem-solving structure when confronted with problems in physics. Under the constructivist perspective of the teaching-learning process, the formulation of hypotheses plays a key role in contrasting the coherence of the students' ideas with the theoretical frame. The main research instrument used to identify students' reasoning is the students' written reports on how they attempted four problem-solving tasks in which they were asked explicitly to formulate hypotheses. The protocols used in the assessment of the solutions consisted of a semi-quantitative study based on grids designed for the analysis of written answers. In this paper we have included two of the tasks used and the corresponding scheme for the categorisation of the answers. Details of the other two tasks are also outlined. According to our findings, the majority of students judge a hypothesis to be plausible if it is congruent with their previous knowledge, without rigorously checking it against the theoretical framework explained in class.
Collective behavior in animal groups: theoretical models and empirical studies
Giardina, Irene
2008-01-01
Collective phenomena in animal groups have attracted much attention in recent years, becoming one of the hottest topics in ethology. There are various reasons for this. On the one hand, animal grouping provides a paradigmatic example of self-organization, where collective behavior emerges in the absence of centralized control. The mechanism of group formation, where local rules for the individuals lead to a coherent global state, is very general and transcends the detailed nature of its components. In this respect, collective animal behavior is a subject of great interdisciplinary interest. On the other hand, there are several important issues related to the biological function of grouping and its evolutionary success. Research in this field boasts a number of theoretical models, but far fewer empirical results to compare with. For this reason, even if the general mechanisms through which self-organization is achieved are qualitatively well understood, a quantitative test of the models' assumptions is still lacking. New analyses of large groups, which require sophisticated technological procedures, can provide the necessary empirical data. PMID:19404431
Layer contributions to the nonlinear acoustic radiation from stratified media.
Vander Meulen, François; Haumesser, Lionel
2016-12-01
This study presents a thorough investigation of the second harmonic generation scenario in a three-fluid-layer system. Emphasis is placed on the evaluation of the nonlinear parameter B/A in each layer from remote measurements. A theoretical approach to the propagation of a finite-amplitude acoustic wave in a multilayered medium is developed. In the frame of the KZK equation, the weak nonlinearity of the media, attenuation, and diffraction effects are computed for the fundamental and second harmonic waves propagating back and forth in each of the layers of the system. The model uses a Gaussian expansion to describe the beam propagation in order to quantitatively evaluate the contribution of each part of the system (layers and interfaces) to its nonlinearity. The model is validated through measurements on a water/aluminum/water system. Transmission as well as reflection configurations are studied. Good agreement is found between the theoretical results and the experimental data. The analysis of the second harmonic field sources measured by the transducers from outside the stratified medium highlights the factors that favor the cumulative effects. Copyright © 2016 Elsevier B.V. All rights reserved.
Tunneling magnetic force microscopy
NASA Technical Reports Server (NTRS)
Burke, Edward R.; Gomez, Romel D.; Adly, Amr A.; Mayergoyz, Isaak D.
1993-01-01
We have developed a powerful new tool for studying the magnetic patterns on magnetic recording media. This was accomplished by modifying a conventional scanning tunneling microscope. The fine-wire probe that is used to image surface topography was replaced with a flexible magnetic probe. Images obtained with these probes reveal both the surface topography and the magnetic structure. We have made a thorough theoretical analysis of the interaction between the probe and the magnetic fields emanating from a typical recorded surface. Quantitative data about the constituent magnetic fields can then be obtained. We have employed these techniques in studies of two of the most important issues of magnetic recording: data overwrite and maximizing data density. These studies have shown that (1) overwritten data can be retrieved under certain conditions, and (2) improvements in data density will require new magnetic materials. In the course of these studies we have developed new techniques to analyze the magnetic fields of recorded media. These studies are both theoretical and experimental and, combined with the use of our magnetic force scanning tunneling microscope, should lead to further breakthroughs in the field of magnetic recording.
Diffraction contrast near heterostructure boundaries--its nature and its application.
Bangert, U; Harvey, A J
1993-03-01
Two phenomena of diffraction contrast arising at or near III-V compound heterostructure boundaries are described and quantitatively analyzed. In the first study, alpha/delta fringe contrast at boundaries inclined to the electron beam is discussed. Theoretical fringe profiles are generated according to the theory of Gevers et al. (1964) and compared with experimental profiles. Applications to the characterization of AlGaAs/GaAs and InGaAsP/InP interfaces regarding composition, abruptness, and lattice tilt are presented. In the second study, a new and very sensitive characterization technique for the direct determination of the strain in strained-layer structures is described. The method uses electron microscope images of 90° wedges, which exhibit a shift in the thickness contours due to strain relaxation at the edge, and compares these to images obtained theoretically by implementing finite element strain calculations for wedges in the dynamical theory of diffraction contrast. The considerable potential of this method is demonstrated on the strain analysis of strained GaInAs/GaAs structures.
NASA Astrophysics Data System (ADS)
Chithiraikumar, S.; Gandhimathi, S.; Neelakantan, M. A.
2017-06-01
A heterocyclic Schiff base, (E)-4-(1-((pyridin-2-ylmethyl)imino)ethyl)benzene-1,3-diol (L), was synthesized and isolated as single crystals. Its structure was characterized by FT-IR, UV, 1H and 13C NMR, and further confirmed by X-ray crystallography. The various interactions in the crystal structure of L have been analyzed qualitatively and quantitatively by Hirshfeld surfaces and 2D fingerprint plots. Noncovalent interactions have been studied by the electron localization function (ELF) and mapped with reduced density gradient (RDG) analysis. The molecular structure was studied computationally by DFT-B3LYP/6-311G(d,p) calculations. HOMO-LUMO energy levels, chemical reactivity descriptors and thermodynamic parameters have been investigated at the same level of theory. The antioxidant potential of L was evaluated experimentally by measuring the DPPH free radical scavenging effect using UV-visible spectroscopy and theoretically by DFT. Theoretical parameters, such as the bond dissociation enthalpy (BDE) and spin density, suggest that the antioxidant potential of L is due to H-atom abstraction from the -OH group.
High sensitivity phase retrieval method in grating-based x-ray phase contrast imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Zhao; Gao, Kun; Chen, Jian
2015-02-15
Purpose: Grating-based x-ray phase contrast imaging is considered one of the most promising techniques for future medical imaging. Many different methods have been developed to retrieve the phase signal, among which the phase stepping (PS) method is widely used. However, further practical implementations are hindered due to its complex scanning mode and high radiation dose. In contrast, the reverse projection (RP) method is a novel fast and low-dose extraction approach. In this contribution, the authors present a quantitative analysis of the noise properties of the refraction signals retrieved by the two methods and compare their sensitivities. Methods: Using the error propagation formula, the authors analyze theoretically the signal-to-noise ratios (SNRs) of the refraction images retrieved by the two methods. Then, the sensitivities of the two extraction methods are compared under an identical exposure dose. Numerical experiments are performed to validate the theoretical results and provide some quantitative insight. Results: The SNRs of the two methods are both dependent on the system parameters, but in different ways. Comparison between their sensitivities reveals that for the refraction signal, the RP method possesses a higher sensitivity, especially in the case of high visibility and/or at the edge of the object. Conclusions: Compared with the PS method, the RP method has a superior sensitivity and provides refraction images with a higher SNR. Therefore, one can obtain highly sensitive refraction images in grating-based phase contrast imaging. This is very important for future preclinical and clinical implementations.
Evaporation, diffusion and self-assembly at drying interfaces.
Roger, K; Sparr, E; Wennerström, H
2018-04-18
Water evaporation from complex aqueous solutions leads to the build-up of structure and composition gradients at their interface with air. We recently introduced an experimental setup for quantitatively studying such gradients and discussed how structure formation can lead to a self-regulation mechanism for controlling water evaporation through self-assembly. Here, we provide a detailed theoretical analysis using an advection/diffusion transport equation that takes into account thermodynamically non-ideal conditions, and we directly relate the theoretical description to quantitative experimental data. We derive that the concentration profile develops according to a general square-root-of-time scaling law, which fully agrees with experimental observations. The evaporation rate notably decreases with time as t^(-1/2), which shows that diffusion in the liquid phase is the rate-limiting step for this system, in contrast to pure water evaporation. For the particular binary system that was investigated experimentally, composed of water and a sugar-based surfactant (α-dodecylmaltoside), the interfacial layer consists of a sequence of liquid crystalline phases of different mesostructures. We extract values for the mutual diffusion coefficients of lamellar, hexagonal and micellar cubic phases, which are consistent with previously reported values and simple models. We thus provide a method to estimate the transport properties of oriented mesophases. The macroscopic humidity-independence of the evaporation rate up to 85% relative humidity is shown to result from both an extremely low mutual diffusion coefficient and the large range of water activities, corresponding to relative humidities below 85%, at which the lamellar phase exists. Such a humidity self-regulation mechanism is expected for a large variety of complex systems.
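The t^(-1/2) decay of the evaporation rate follows from diffusion-limited transport out of the liquid. The sketch below is not the authors' advection/diffusion model (which includes thermodynamic non-ideality); it is a minimal explicit finite-difference solution of the 1-D diffusion equation with the surface held at zero concentration, verifying that the surface flux follows the square-root-of-time law. All parameter values are illustrative.

```python
import numpy as np

# Explicit finite-difference solution of dc/dt = D d2c/dx2 in a slab whose
# surface (x = 0) is held at zero concentration, mimicking water removal by
# evaporation.  For a semi-infinite medium the analytic surface flux is
# J(t) = c0 * sqrt(D / (pi * t)), i.e. J decays as t^(-1/2).
D, dx, dt = 1.0, 0.1, 0.004       # dt < dx^2 / (2 D) for numerical stability
c = np.ones(400)                  # uniform bulk concentration c0 = 1
c[0] = 0.0                        # dried-out surface
fluxes, times = [], []
t = 0.0
for step in range(1, 20001):
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0] = 0.0                    # re-impose the surface boundary condition
    t += dt
    if step % 5000 == 0:
        fluxes.append(D * (c[1] - c[0]) / dx)   # discrete surface flux
        times.append(t)

# If J ~ t^(-1/2), then J * sqrt(t) is constant (analytically 1/sqrt(pi) here).
products = [f * tt ** 0.5 for f, tt in zip(fluxes, times)]
```

Sampling the flux at several times and checking that J * sqrt(t) stays constant is the numerical counterpart of the scaling law reported in the abstract.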
Pérez-Payá, E; Porcar, I; Gómez, C M; Pedrós, J; Campos, A; Abad, C
1997-08-01
A thermodynamic approach is proposed to quantitatively analyze the binding isotherms of peptides to model membranes as a function of one adjustable parameter, the actual peptide charge in solution z(p)+. The main features of this approach are a theoretical expression for the partition coefficient calculated from the molar free energies of the peptide in the aqueous and lipid phases, an equation proposed by S. Stankowski [(1991) Biophysical Journal, Vol. 60, p. 341] to evaluate the activity coefficient of the peptide in the lipid phase, and the Debye-Hückel equation that quantifies the activity coefficient of the peptide in the aqueous phase. To assess the validity of this approach we have studied, by means of steady-state fluorescence spectroscopy, the interaction of basic amphipathic peptides such as melittin and its dansylcadaverine analogue (DNC-melittin), as well as a new fluorescent analogue of substance P, SP (DNC-SP) with neutral phospholipid membranes. A consistent quantitative analysis of each binding curve was achieved. The z(p)+ values obtained were always found to be lower than the physical charge of the peptide. These z(p)+ values can be rationalized by considering that the peptide charged groups are strongly associated with counterions in buffer solution at a given ionic strength. The partition coefficients theoretically derived using the z(p)+ values were in agreement with those deduced from the Gouy-Chapman formalism. Ultimately, from the z(p)+ values the molar free energies for the free and lipid-bound states of the peptides have been calculated.
Origin of traps and charge transport mechanism in hafnia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Islamov, D. R., E-mail: damir@isp.nsc.ru; Gritsenko, V. A., E-mail: grits@isp.nsc.ru; Novosibirsk State University, Novosibirsk 630090
2014-12-01
In this study, we demonstrated experimentally and theoretically that oxygen vacancies are responsible for the charge transport in HfO2. Based on the model of phonon-assisted tunneling between traps, and assuming that the electron traps are oxygen vacancies, good quantitative agreement between the experimental and theoretical current-voltage characteristics was achieved. A thermal trap energy of 1.25 eV in HfO2 was determined from the charge transport experiments.
1975-05-01
Finally, diagnostics for quantitative measurements of all these properties are necessary for meaningful comparison of the experiments with theoretical ...width (FWHM) of 120 μrad. For comparison, a beam which fills the last amplifier rod has a corresponding theoretical divergence angle of 108 μrad... hydrogen the protons produced by photoionization do not absorb). Also shown are the spontaneous lifetimes tu of the upper laser level, of use for self
Buchner, Ginka S; Murphy, Ronan D; Buchete, Nicolae-Viorel; Kubelka, Jan
2011-08-01
The problem of spontaneous folding of amino acid chains into highly organized, biologically functional three-dimensional protein structures continues to challenge modern science. Understanding how proteins fold requires characterization of the underlying energy landscapes as well as the dynamics of the polypeptide chains in all stages of the folding process. In recent years, important advances toward these goals have been achieved owing to the rapidly growing interdisciplinary interest and significant progress in both experimental techniques and theoretical methods. Improvements in experimental time resolution led to determination of the timescales of the important elementary events in folding, such as formation of secondary structure and tertiary contacts. Sensitive single-molecule methods made it possible to probe the distributions of the unfolded and folded states and to follow the folding reaction of individual protein molecules. The discovery of proteins that fold in microseconds opened the possibility of atomic-level theoretical simulations of folding and their direct comparison with experimental data, as well as of direct experimental observation of the barrier-less folding transition. Ultrafast folding also brought new questions concerning the intrinsic limits of folding rates and experimental signatures of barrier-less "downhill" folding. These problems will require novel approaches for even more detailed experimental investigations of the folding dynamics as well as for the analysis of the folding kinetic data. For theoretical simulations of folding, a main challenge is how to extract the relevant information from overwhelmingly detailed atomistic trajectories.
New theoretical methods have been devised to allow a systematic approach towards a quantitative analysis of the kinetic network of folding-unfolding transitions between various configuration states of a protein, revealing the transition states and the associated folding pathways at multiple levels, from atomistic to coarse-grained representations. This article is part of a Special Issue entitled: Protein Dynamics: Experimental and Computational Approaches. Copyright © 2010 Elsevier B.V. All rights reserved.
Zhou, Yun; Sojkova, Jitka; Resnick, Susan M; Wong, Dean F
2012-04-01
Both the standardized uptake value ratio (SUVR) and the Logan plot result in biased distribution volume ratios (DVRs) in ligand-receptor dynamic PET studies. The objective of this study was to use a recently developed relative equilibrium-based graphical (RE) plot method to improve and simplify the 2 commonly used methods for quantification of (11)C-Pittsburgh compound B ((11)C-PiB) PET. The overestimation of DVR in SUVR was analyzed theoretically using the Logan and the RE plots. A bias-corrected SUVR (bcSUVR) was derived from the RE plot. Seventy-eight (11)C-PiB dynamic PET scans (66 from controls and 12 from participants with mild cognitive impaired [MCI] from the Baltimore Longitudinal Study of Aging) were acquired over 90 min. Regions of interest (ROIs) were defined on coregistered MR images. Both the ROI and the pixelwise time-activity curves were used to evaluate the estimates of DVR. DVRs obtained using the Logan plot applied to ROI time-activity curves were used as a reference for comparison of DVR estimates. Results from the theoretic analysis were confirmed by human studies. ROI estimates from the RE plot and the bcSUVR were nearly identical to those from the Logan plot with ROI time-activity curves. In contrast, ROI estimates from DVR images in frontal, temporal, parietal, and cingulate regions and the striatum were underestimated by the Logan plot (controls, 4%-12%; MCI, 9%-16%) and overestimated by the SUVR (controls, 8%-16%; MCI, 16%-24%). This bias was higher in the MCI group than in controls (P < 0.01) but was not present when data were analyzed using either the RE plot or the bcSUVR. The RE plot improves pixelwise quantification of (11)C-PiB dynamic PET, compared with the conventional Logan plot. The bcSUVR results in lower bias and higher consistency of DVR estimates than of SUVR. The RE plot and the bcSUVR are practical quantitative approaches that improve the analysis of (11)C-PiB studies.
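The reference-tissue Logan estimate of DVR that serves as the benchmark in this study can be illustrated with a toy calculation. The sketch below is not the RE plot or the bcSUVR; it fits the standard Logan graphical relation to synthetic time-activity curves constructed so that the target region is exactly proportional to the reference (true DVR = 1.5). The frame times, curve shapes, and t* cutoff are all illustrative.

```python
import numpy as np

def logan_dvr(ct, cref, times, t_star=30.0):
    """Reference-tissue Logan plot: for t > t_star, regress
    int_0^t Ct dt / Ct(t)  on  int_0^t Cref dt / Ct(t); the slope is DVR."""
    # trapezoidal running integrals of the target and reference curves
    int_ct = np.concatenate(([0.0], np.cumsum(0.5 * (ct[1:] + ct[:-1]) * np.diff(times))))
    int_cr = np.concatenate(([0.0], np.cumsum(0.5 * (cref[1:] + cref[:-1]) * np.diff(times))))
    mask = times > t_star
    y = int_ct[mask] / ct[mask]
    x = int_cr[mask] / ct[mask]
    slope, _ = np.polyfit(x, y, 1)
    return slope

times = np.linspace(0.0, 90.0, 91)      # minutes; illustrative 1-min frames
cref = times * np.exp(-times / 60.0)    # toy reference-region curve
ct = 1.5 * cref                         # target proportional to reference -> DVR = 1.5
dvr = logan_dvr(ct, cref, times)
```

With real noisy pixel curves the noise in Ct(t) appears in both regression variables, which is the origin of the Logan underestimation bias the study quantifies.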
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Pengpeng; Zheng, Xiaojing, E-mail: xjzheng@xidian.edu.cn; Jin, Ke
2016-04-14
Weak magnetic nondestructive testing (e.g., the metal magnetic memory method) concerns the magnetization variation of ferromagnetic materials due to applied load and the weak magnetic field surrounding them. One key issue in these nondestructive technologies is the magnetomechanical effect for quantitative evaluation of the magnetization state from the stress-strain condition. A representative phenomenological model was proposed by Jiles in 1995 to explain the magnetomechanical effect. However, Jiles' model has some deficiencies in quantification; for instance, there is a visible difference between theoretical predictions and experimental measurements on the stress-magnetization curve, especially in the compression case. Based on thermodynamic relations and the approach law of irreversible magnetization, a nonlinear coupled model is proposed to improve the quantitative evaluation of the magnetomechanical effect. Excellent agreement has been achieved between the predictions from the present model and previous experimental results. In comparison with Jiles' model, the prediction accuracy is improved greatly by the present model, particularly for the compression case. A detailed study has also been performed to reveal the effects of initial magnetization status, cyclic loading, and demagnetization factor on the magnetomechanical effect. Our theoretical model reveals that the stable weak magnetic signals of nondestructive testing after multiple cyclic loads are attributed to the first few cycles eliminating most of the irreversible magnetization. Remarkably, the existence of a demagnetization field can weaken the magnetomechanical effect and therefore significantly reduce the testing capability. This theoretical model can be adopted to quantitatively analyze magnetic memory signals, and can then be applied in weak magnetic nondestructive testing.
NASA Technical Reports Server (NTRS)
Tobak, Murray
1954-01-01
The concept of indicial aerodynamic functions is applied to the analysis of the short-period pitching mode of aircraft. By the use of simple physical relationships associated with the indicial-function concept, quantitative studies are made of the separate effects on the damping in pitch of changes in Mach number, aspect ratio, plan-form shape, and frequency. The concept is further shown to be of value in depicting physically the induced effects on a tail surface which follows in the wake of a starting forward surface. Considerable effort is devoted to the development of theoretical techniques whereby the transient response in lift at the tail to the wing wake may be estimated. Numerical results for several representative cases are presented, and these are analyzed to reassess the importance of the contribution to the rotary damping moment of the interference lift at the tail.
A method to correct coordinate distortion in EBSD maps
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Y.B., E-mail: yubz@dtu.dk; Elbrønd, A.; Lin, F.X.
2014-10-15
Drift during electron backscatter diffraction mapping leads to coordinate distortions in the resulting orientation maps, which affects, in some cases significantly, the accuracy of analysis. A method, the thin plate spline, is introduced and tested to correct such coordinate distortions in the maps after the electron backscatter diffraction measurements. The accuracy of the correction as well as theoretical and practical aspects of using the thin plate spline method are discussed in detail. By comparing with other correction methods, it is shown that the thin plate spline method is most efficient at correcting different local distortions in the electron backscatter diffraction maps. Highlights: • A new method is suggested to correct nonlinear spatial distortion in EBSD maps. • The method corrects EBSD maps more precisely than presently available methods. • Errors less than 1-2 pixels are typically obtained. • Direct quantitative analysis of dynamic data is available after this correction.
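A minimal sketch of the underlying machinery, assuming the standard 2-D thin plate spline with kernel U(r) = r^2 log r: control points measured before and after drift define a smooth correction map that can then be applied to every map pixel. This is a textbook formulation, not the authors' implementation, and the grid layout and affine test distortion are illustrative.

```python
import numpy as np

def tps_fit(src, dst):
    """Fit a 2-D thin plate spline (kernel U(r) = r^2 log r) mapping the
    control points src onto their drift-corrected positions dst."""
    n = len(src)
    d2 = ((src[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    K = 0.5 * d2 * np.log(np.where(d2 > 0, d2, 1.0))   # r^2 log r, 0 at r = 0
    P = np.hstack([np.ones((n, 1)), src])              # affine part [1, x, y]
    A = np.zeros((n + 3, n + 3))
    A[:n, :n], A[:n, n:], A[n:, :n] = K, P, P.T
    b = np.zeros((n + 3, 2))
    b[:n] = dst
    return np.linalg.solve(A, b)                       # bending weights + affine coefficients

def tps_apply(params, src, pts):
    """Map arbitrary points through the fitted spline."""
    n = len(src)
    w, a = params[:n], params[n:]
    d2 = ((pts[:, None, :] - src[None, :, :]) ** 2).sum(-1)
    U = 0.5 * d2 * np.log(np.where(d2 > 0, d2, 1.0))
    return U @ w + np.hstack([np.ones((len(pts), 1)), pts]) @ a

# Illustrative check: control points on a grid, distortion chosen as a simple
# affine map; the spline reproduces it exactly at unseen points.
src = np.array([[i, j] for i in range(4) for j in range(4)], dtype=float)
dst = src * 1.1 + np.array([0.3, -0.2])
params = tps_fit(src, dst)
corrected = tps_apply(params, src, np.array([[0.25, 0.4], [2.7, 1.3]]))
```

In practice the weights w absorb the nonlinear (local) part of the drift while the affine term handles the global shift and scaling, which is why the method handles different local distortions well.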
Excitonic terahertz photoconductivity in intrinsic semiconductor nanowires.
Yan, Jie-Yun
2018-06-13
Excitonic terahertz photoconductivity in intrinsic semiconductor nanowires is studied. Based on the excitonic theory, the numerical method to calculate the photoconductivity spectrum in the nanowires is developed, which can simulate optical pump terahertz-probe spectroscopy measurements on real nanowires and thereby calculate the typical photoconductivity spectrum. With the help of the energetic structure deduced from the calculated linear absorption spectrum, the numerically observed shift of the resonant peak in the photoconductivity spectrum is found to result from the dominant exciton transition between excited or continuum states to the ground state, and the quantitative analysis is in good agreement with the quantum plasmon model. Besides, the dependence of the photoconductivity on the polarization of the terahertz field is also discussed. The numerical method and supporting theoretical analysis provide a new tool for experimentalists to understand the terahertz photoconductivity in intrinsic semiconductor nanowires at low temperatures or for nanowires subjected to below bandgap photoexcitation, where excitonic effects dominate.
Signal and noise modeling in confocal laser scanning fluorescence microscopy.
Herberich, Gerlind; Windoffer, Reinhard; Leube, Rudolf E; Aach, Til
2012-01-01
Fluorescence confocal laser scanning microscopy (CLSM) has revolutionized imaging of subcellular structures in biomedical research by enabling the acquisition of 3D time-series of fluorescently tagged proteins in living cells, hence forming the basis for an automated quantification of their morphological and dynamic characteristics. Due to the inherently weak fluorescence, CLSM images exhibit a low SNR. We present a novel model for the transfer of signal and noise in CLSM that is both theoretically sound and corroborated by a rigorous analysis of the pixel intensity statistics via measurement of the 3D noise power spectra, signal dependence and distribution. Our model provides a better fit to the data than previously proposed models. Further, it forms the basis for (i) the simulation of the CLSM imaging process, indispensable for the quantitative evaluation of CLSM image analysis algorithms, (ii) the application of Poisson denoising algorithms and (iii) the reconstruction of the fluorescence signal.
Tug-of-war lacunarity—A novel approach for estimating lacunarity
NASA Astrophysics Data System (ADS)
Reiss, Martin A.; Lemmerer, Birgit; Hanslmeier, Arnold; Ahammer, Helmut
2016-11-01
Modern instrumentation provides us with massive repositories of digital images that will likely only grow in the future. Therefore, it has become increasingly important to automate the analysis of digital images, e.g., with methods from pattern recognition. These methods aim to quantify the visual appearance of captured textures with quantitative measures. As such, lacunarity is a useful multi-scale measure of a texture's heterogeneity but demands high computational effort. Here we investigate a novel approach based on the tug-of-war algorithm, which estimates lacunarity in a single pass over the image. We computed lacunarity for theoretical and real-world sample images, and found that the investigated approach is able to estimate lacunarity with low uncertainties. We conclude that the proposed method combines low computational effort with high accuracy, and that its application may have utility in the analysis of high-resolution images.
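For reference, the conventional gliding-box lacunarity that the tug-of-war algorithm approximates can be written directly. This brute-force version, a sketch with illustrative test images, has exactly the high computational cost (one full pass per box size) that motivates a single-pass estimator:

```python
import numpy as np

def lacunarity(image, box_size):
    """Gliding-box lacunarity: Lambda = <M^2> / <M>^2, where M is the mass
    (pixel sum) in each box position; equivalently 1 + var(M) / mean(M)^2."""
    img = np.asarray(image, dtype=float)
    h, w = img.shape
    masses = np.array([
        img[i:i + box_size, j:j + box_size].sum()
        for i in range(h - box_size + 1)
        for j in range(w - box_size + 1)
    ])
    return 1.0 + masses.var() / masses.mean() ** 2

rng = np.random.default_rng(1)
flat = np.ones((32, 32))                             # homogeneous: Lambda = 1
sparse = (rng.random((32, 32)) < 0.1).astype(float)  # gappy texture: Lambda > 1
```

A homogeneous texture gives Lambda = 1 at every scale, while gappy (heterogeneous) textures give larger values; repeating the computation over a range of box sizes yields the multi-scale lacunarity curve.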
NASA Technical Reports Server (NTRS)
Sutherland, Betsy M.; Georgakilas, Alexandros G.; Bennett, Paula V.; Laval, Jacques; Sutherland, John C.; Gewirtz, A. M. (Principal Investigator)
2003-01-01
Assessing DNA damage induction, repair and the consequences of such damages requires measurement of specific DNA lesions by methods that are independent of biological responses to such lesions. Lesions affecting one DNA strand (altered bases, abasic sites, single strand breaks (SSB)) as well as damages affecting both strands (clustered damages, double strand breaks) can be quantified by direct measurement of DNA using gel electrophoresis, gel imaging and number average length analysis. Damage frequencies as low as a few sites per gigabase pair (10^9 bp) can be quantified by this approach in about 50 ng of non-radioactive DNA, and single molecule methods may allow such measurements in DNA from single cells. This review presents the theoretical basis, biochemical requirements and practical aspects of this approach, and shows examples of their applications in the identification and quantitation of complex clustered damages.
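The quantification rests on lesions being randomly (Poisson) distributed along the molecules, so the lesion frequency follows from the fraction of full-length molecules surviving a lesion-specific cleavage. A minimal sketch of this relation, with illustrative numbers that are not from the review:

```python
import math

def lesion_frequency(intact_fraction, length_bp):
    """Assuming lesions are Poisson-distributed along the DNA, the fraction
    of full-length (lesion-free) molecules of length L is exp(-phi * L),
    so phi = -ln(f) / L lesions per base pair."""
    return -math.log(intact_fraction) / length_bp

# Illustrative: if 90% of 1-Gbp molecules remain full length after cleavage
# at the lesion sites, the damage frequency is about 0.105 sites per Gbp.
phi = lesion_frequency(0.90, 1e9)   # lesions per base pair
sites_per_gbp = phi * 1e9
```

This is why frequencies of only a few sites per gigabase pair are measurable: even a small deficit in the full-length fraction maps directly onto a lesion frequency through the exponential.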
Devolatilization Analysis in a Twin Screw Extruder by using the Flow Analysis Network (FAN) Method
NASA Astrophysics Data System (ADS)
Tomiyama, Hideki; Takamoto, Seiji; Shintani, Hiroaki; Inoue, Shigeki
We derived theoretical formulas for three mechanisms of devolatilization in a twin screw extruder: flash, surface refreshment and forced expansion. The model for flash devolatilization is based on the equation of equilibrium concentration, which shows that volatiles break off from the polymer when it is relieved from a high-pressure condition. For surface refreshment devolatilization, we applied Latinen's model to allow estimation of polymer behavior in the unfilled screw conveying condition. Forced expansion devolatilization is based on the expansion theory, in which foams are generated under reduced pressure and volatiles are diffused at the exposed surface layer after mixing with the injected devolatilization agent. Based on these models, we developed twin-screw extrusion simulation software using the FAN method, which allows volatile concentration and polymer temperature to be estimated quantitatively with high accuracy for the actual multi-vent extrusion process for LDPE + n-hexane.
Effect of different analyte diffusion/adsorption protocols on SERS signals
NASA Astrophysics Data System (ADS)
Li, Ruoping; Petschek, Rolfe G.; Han, Junhe; Huang, Mingju
2018-07-01
The effect of different analyte diffusion/adsorption protocols, which is often overlooked in the surface-enhanced Raman scattering (SERS) technique, was studied. Three protocols were examined: the highly concentrated dilution (HCD) protocol, the half-half dilution (HHD) protocol and the layered adsorption (LA) protocol; the SERS substrates were monolayer films of 80 nm Ag nanoparticles (NPs) modified by polyvinylpyrrolidone. The diffusion/adsorption mechanisms were modelled using the diffusion equation, and the electromagnetic field distribution of two adjacent Ag NPs was simulated by the finite-difference time-domain method. All experimental data and theoretical analysis suggest that different diffusion/adsorption behaviour of analytes causes different SERS signal enhancements. The HHD protocol produced the most uniform and reproducible samples, and the corresponding signal intensity of the analyte was the strongest. This study will help to understand and promote the use of the SERS technique in quantitative analysis.
Factors of empowerment for women in recovery from substance use.
Hunter, Bronwyn A; Jason, Leonard A; Keys, Christopher B
2013-03-01
Empowerment is an interdisciplinary construct heavily grounded in the theories of community psychology. Although empowerment has a strong theoretical foundation, few context-specific quantitative measures have been designed to evaluate empowerment for specific populations. The present study explored the factor structure of a modified empowerment scale with a cross-sectional sample of 296 women in recovery from substance use who lived in recovery homes located throughout the United States. Results from an exploratory factor analysis identified three factors of psychological empowerment which were closely related to previous conceptualizations of psychological empowerment: self-perception, resource knowledge and participation. Further analyses demonstrated a hierarchical relationship among the three factors, with resource knowledge predicting participation when controlling for self-perception. Finally, a correlational analysis demonstrated the initial construct validity of each factor, as each factor of empowerment was significantly and positively related to self-esteem. Implications for the application of psychological empowerment theory and research are discussed.
A comment on measuring the Hurst exponent of financial time series
NASA Astrophysics Data System (ADS)
Couillard, Michel; Davison, Matt
2005-03-01
A fundamental hypothesis of quantitative finance is that stock price variations are independent and can be modeled using Brownian motion. In recent years, it was proposed to use rescaled range analysis and its characteristic value, the Hurst exponent, to test for independence in financial time series. Theoretically, independent time series should be characterized by a Hurst exponent of 1/2. However, finite Brownian motion data sets will always give a value of the Hurst exponent larger than 1/2 and without an appropriate statistical test such a value can mistakenly be interpreted as evidence of long term memory. We obtain a more precise statistical significance test for the Hurst exponent and apply it to real financial data sets. Our empirical analysis shows no long-term memory in some financial returns, suggesting that Brownian motion cannot be rejected as a model for price dynamics.
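A minimal rescaled-range sketch illustrates the paper's point: even for independent Gaussian increments, the finite-sample Hurst estimate typically lands above 1/2, which is why a proper significance test is needed before claiming long-term memory. The chunk sizes and series length below are illustrative.

```python
import numpy as np

def hurst_rs(series, min_chunk=8):
    """Estimate the Hurst exponent by rescaled-range (R/S) analysis.

    The series is split into non-overlapping chunks of size n; for each
    chunk, the range of the cumulative mean-adjusted sum is divided by the
    chunk standard deviation.  The slope of log(R/S) vs log(n) estimates H.
    """
    x = np.asarray(series, dtype=float)
    N = len(x)
    sizes, rs_means = [], []
    n = min_chunk
    while n <= N // 2:
        rs = []
        for start in range(0, N - n + 1, n):
            chunk = x[start:start + n]
            z = np.cumsum(chunk - chunk.mean())   # mean-adjusted cumulative sum
            s = chunk.std()
            if s > 0:
                rs.append((z.max() - z.min()) / s)
        sizes.append(n)
        rs_means.append(np.mean(rs))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs_means), 1)
    return slope

rng = np.random.default_rng(0)
h = hurst_rs(rng.standard_normal(4096))  # independent increments: H near 0.5
```

Running this across many independent simulated series gives the finite-sample distribution of H against which an empirical estimate should be tested, rather than comparing it to the asymptotic value 1/2 directly.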
Population and Star Formation Histories from the Outer Limits Survey
NASA Astrophysics Data System (ADS)
Brondel, Brian Joseph; Saha, Abhijit; Olszewski, Edward
2015-08-01
The Outer Limits Survey (OLS) is a deep survey of selected fields in the outlying areas of the Magellanic Clouds based on the MOSAIC-II instrument on the Blanco 4-meter Telescope at CTIO. OLS is designed to probe the outer disk and halo structures of the Magellanic System. The survey comprises ~50 fields obtained in Landolt R, I and Washington C, M and DDO51 filters, extending to a depth of about 24th magnitude in I. While qualitative examination of the resulting data has yielded interesting published results, we report here on quantitative analysis through matching of Hess diagrams to theoretical isochrones. We present analysis based on techniques developed by Dolphin (e.g., 2002, MNRAS, 332, 91) for fields observed by OLS. Our results broadly match those found by qualitative examination of the CMDs, but interesting details emerge from isochrone fitting.
Electrochemical performance investigations on the hydrogen depolarized CO2 concentrator
NASA Technical Reports Server (NTRS)
Aylward, J. R.
1976-01-01
An extensive investigation of anode and cathode polarization in complete cells and half cells was conducted to determine the factors affecting HDC electrode polarization and the nature of this polarization. Matrix-electrolyte-electrode interactions and cell electrolyte composition were also investigated. The electrodes were found to have normal performance capabilities. The HDC anode polarization characteristics were correlated with a theoretical kinetic analysis; and, except for some quantitative details, a rather complete understanding of the causes of HDC electrode polarization was formulated. One of the important findings resulting from the kinetic analysis was that platinum appears to catalyze the decomposition of carbonic acid to carbon dioxide and water. It was concluded that the abnormal voltage performance of the One Man ARS HDC cells was caused by insufficient cell electrolyte volume under normal operating conditions, due to deficiencies in the reservoir-to-cell interfacing.
NASA Astrophysics Data System (ADS)
Adamo, M.; Nappi, C.; Sarnelli, E.
2010-09-01
The use of a scanning magnetic microscope (SMM) with a high temperature superconducting quantum interference device (SQUID) for quantitative measurements in eddy current nondestructive analysis (NDA) is presented. The SQUID has been used to detect the weak magnetic field variations around a small defect, close to a structural part generating an intense magnetic field. The experimental data for a deep crack close to a rivet in a multilayer conducting plate have been taken in an RF-shielded environment and discussed in the light of the theoretical predictions. The results show that eddy current NDA can distinguish subsurface crack signals from wider structural signals, with defects located 10 mm below the surface. Moreover, in order to visualize the structure of the probing current when a circular induction coil is used, a simulation of eddy currents in a thick unflawed conducting plate has been carried out.
Danov, Krassimir D; Georgiev, Mihail T; Kralchevsky, Peter A; Radulova, Gergana M; Gurkov, Theodor D; Stoyanov, Simeon D; Pelan, Eddie G
2018-01-01
Suspensions of colloid particles possess the remarkable property of solidifying upon the addition of a minimal amount of a second liquid that preferentially wets the particles. The hardening is due to the formation of capillary bridges (pendular rings), which connect the particles. Here, we review works on the mechanical properties of such suspensions and related works on the capillary-bridge force, and present new rheological data for the weakly studied concentration range of 30-55 vol% particles. The mechanical strength of the solidified capillary suspensions, characterized by the yield stress Y, is measured at the elastic limit for various volume fractions of the particles and of the preferentially wetting liquid. A quantitative theoretical model is developed, which relates Y to the maximum of the capillary-bridge force projected on the shear plane. A semi-empirical expression for the mean number of capillary bridges per particle is proposed. The model agrees very well with the experimental data and gives a quantitative description of the yield stress, which increases with the rise of interfacial tension and with the volume fractions of particles and capillary bridges, but decreases with the rise of particle radius and contact angle. The quantitative description of the capillary force is based on the exact theory and numerical calculation of the capillary bridge profile at various bridge volumes and contact angles. An analytical formula for Y is also derived. The comparison of the theoretical and experimental strain at the elastic limit reveals that the fluidization of the capillary suspension takes place only in a deformation zone of thickness up to several hundred particle diameters, which is adjacent to the rheometer's mobile plate. The reported experimental results refer to water-continuous suspensions with hydrophobic particles and oily capillary bridges.
The comparison of data for bridges from soybean oil and hexadecane surprisingly indicates that the yield strength is greater for the suspension with soybean oil, despite its lower interfacial tension against water. The result can be explained by the different contact angles of the two oils, in agreement with the theoretical predictions. The results could contribute to a better understanding, quantitative prediction, and control of the mechanical properties of three-phase (solid/liquid/liquid) capillary suspensions. Copyright © 2017 Elsevier B.V. All rights reserved.
Optical analysis of electro-optical systems by MTF calculus
NASA Astrophysics Data System (ADS)
Barbarini, Elisa Signoreto; Dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fátima Maria Mitsue; Castro Neto, Jarbas C.; Rodrigues, Evandro Luís Linhari
2011-08-01
One of the widely used methods for performance analysis of an optical system is the determination of the Modulation Transfer Function (MTF). The MTF represents a quantitative and direct measure of image quality and, besides being an objective test, it can be used on concatenated optical systems. This paper presents the application of a software package called SMTF (software modulation transfer function), built on C++ and OpenCV platforms, for MTF calculation on electro-optical systems. Through this technique, it is possible to develop specific methods to measure the real-time performance of a digital fundus camera, an infrared sensor and an ophthalmological surgery microscope. Each optical instrument mentioned has a particular device to measure the MTF response, which is being developed. The MTF information then assists the analysis of the optical system alignment, and also defines the resolution limit from the MTF graph. The results obtained from the implemented software are compared with the theoretical MTF curves of the analyzed systems.
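The SMTF package itself is not described in detail here; as a rough illustration of the underlying computation, an MTF can be obtained as the normalized Fourier magnitude of a measured line spread function (LSF). The sketch below uses a synthetic Gaussian LSF in place of real instrument data and is an assumption about the general technique, not the authors' implementation.

```python
import numpy as np

def mtf_from_lsf(lsf, dx):
    """MTF as the normalized magnitude of the Fourier transform of a
    sampled line spread function; dx is the sampling step (mm)."""
    otf = np.fft.rfft(lsf)
    mtf = np.abs(otf) / np.abs(otf[0])       # normalize so MTF(0) = 1
    freqs = np.fft.rfftfreq(len(lsf), d=dx)  # spatial frequency, cycles/mm
    return freqs, mtf

# Synthetic Gaussian LSF of std sigma: the analytic MTF is exp(-2*(pi*sigma*f)**2)
dx = 0.001                                   # mm per sample
x = np.arange(-2.0, 2.0, dx)
sigma = 0.01                                 # mm
lsf = np.exp(-x**2 / (2 * sigma**2))
freqs, mtf = mtf_from_lsf(lsf, dx)
theory = np.exp(-2 * (np.pi * sigma * freqs)**2)
```

In practice the LSF would come from imaging a slit or differentiating an edge response; comparing the computed curve against a design MTF is exactly the kind of check the abstract describes.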
NASA Technical Reports Server (NTRS)
Bruno, G. V.; Harrington, J. K.; Eastman, M. P.
1978-01-01
An analysis of EPR line shapes by the method of Polnaszek, Bruno, and Freed is made for slowly tumbling vanadyl spin probes in viscous nematic liquid crystals. The use of typical vanadyl complexes as spin probes for nematic liquid crystals is shown to simplify the theoretical analysis and the subsequent interpretation. A plot indicates the rotational correlation times tau and orientational ordering parameters S_Z for which slow-tumbling effects are expected to be observed in vanadyl EPR spectra. Analysis of the inertial effects on the probe reorientation, which are induced by slowly fluctuating torque components of the local solvent structure, yields quantitative values for tau and S_Z. The weakly ordered probe VOAA is in the slow-tumbling region and displays these inertial effects throughout the nematic range of BEPC and Phase V. VOAA exhibits different reorientation behavior near the isotropic-nematic transition temperature than it displays far below this transition temperature.
Ernst, Dominique; Köhler, Jürgen
2013-01-21
We provide experimental results on the accuracy of diffusion coefficients obtained by a mean squared displacement (MSD) analysis of single-particle trajectories. We have recorded very long trajectories comprising more than 1.5 × 10^5 data points and decomposed these long trajectories into shorter segments, providing us with ensembles of trajectories of variable lengths. This enabled a statistical analysis of the resulting MSD curves as a function of the lengths of the segments. We find that the relative error of the diffusion coefficient can be minimized by taking an optimum number of points into account for fitting the MSD curves, and that this optimum does not depend on the segment length. Yet, the magnitude of the relative error for the diffusion coefficient does, and achieving an accuracy on the order of 10% requires the recording of trajectories with about 1000 data points. Finally, we compare our results with theoretical predictions and find very good qualitative and quantitative agreement between experiment and theory.
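As a minimal sketch of this kind of analysis (not the authors' code, and omitting their segment-length study), the following generates a long synthetic 2D Brownian trajectory, computes the time-averaged MSD, and fits the diffusion coefficient over a small number of lag points; `n_fit = 4` stands in for the "optimum number of points" the paper discusses.

```python
import numpy as np

def msd(track, max_lag):
    """Time-averaged mean squared displacement of an (N, 2) trajectory
    for lags 1..max_lag (in units of frames)."""
    return np.array([np.mean(np.sum((track[lag:] - track[:-lag])**2, axis=1))
                     for lag in range(1, max_lag + 1)])

def fit_diffusion(msd_vals, dt, n_fit):
    """Estimate D from a linear fit MSD = 4*D*t over the first n_fit lags
    (free diffusion in two dimensions)."""
    lags = dt * np.arange(1, n_fit + 1)
    slope = np.polyfit(lags, msd_vals[:n_fit], 1)[0]
    return slope / 4.0

# Synthetic Brownian trajectory with known diffusion coefficient
rng = np.random.default_rng(0)
D_true, dt, n_steps = 0.5, 0.05, 100_000      # um^2/s, s, data points
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_steps, 2))
track = np.cumsum(steps, axis=0)
D_est = fit_diffusion(msd(track, 10), dt, n_fit=4)
```

Repeating the fit on short segments of `track`, and varying `n_fit`, reproduces the qualitative behavior described in the abstract: the scatter of `D_est` across segments shrinks with segment length, with an optimum number of fit points.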
Quantifying the relationship between sequence and three-dimensional structure conservation in RNA
2010-01-01
Background In recent years, the number of available RNA structures has rapidly grown, reflecting the increased interest in RNA biology. Similarly to the studies carried out two decades ago for proteins, which laid the fundamental grounds for developing comparative protein structure prediction methods, we are now able to quantify the relationship between sequence and structure conservation in RNA. Results Here we introduce an all-against-all sequence- and three-dimensional (3D) structure-based comparison of a representative set of RNA structures, which has allowed us to quantitatively confirm that: (i) there is a measurable relationship between sequence and structure conservation that weakens for alignments resulting in below 60% sequence identity, (ii) evolution tends to conserve more RNA structure than sequence, and (iii) there is a twilight zone for RNA homology detection. Discussion The computational analysis here presented quantitatively describes the relationship between sequence and structure for RNA molecules and defines a twilight zone region for detecting RNA homology. Our work could represent the theoretical basis and limitations for future developments in comparative RNA 3D structure prediction. PMID:20550657
From double-slit interference to structural information in simple hydrocarbons
Kushawaha, Rajesh Kumar; Patanen, Minna; Guillemin, Renaud; Journel, Loic; Miron, Catalin; Simon, Marc; Piancastelli, Maria Novella; Skates, C.; Decleva, Piero
2013-01-01
Interferences in coherent emission of photoelectrons from two equivalent atomic centers in a molecule are the microscopic analogies of the celebrated Young’s double-slit experiment. By considering inner-valence shell ionization in the series of simple hydrocarbons C2H2, C2H4, and C2H6, we show that double-slit interference is widespread and has built-in quantitative information on geometry, orbital composition, and many-body effects. A theoretical and experimental study is presented over the photon energy range of 70–700 eV. A strong dependence of the oscillation period on the C–C distance is observed, which can be used to determine bond lengths between selected pairs of equivalent atoms with an accuracy of at least 0.01 Å. Furthermore, we show that the observed oscillations are directly informative of the nature and atomic composition of the inner-valence molecular orbitals and that observed ratios are quantitative measures of elusive many-body effects. The technique and analysis can be immediately extended to a large class of compounds. PMID:24003155
Fluctuations and Noise in Stochastic Spread of Respiratory Infection Epidemics in Social Networks
NASA Astrophysics Data System (ADS)
Yulmetyev, Renat; Emelyanova, Natalya; Demin, Sergey; Gafarov, Fail; Hänggi, Peter; Yulmetyeva, Dinara
2003-05-01
For the analysis of the complexity of epidemic and disease dynamics, it is necessary to understand the basic principles and notions of its spreading in long-time memory media. Here we consider the problem from a theoretical and practical viewpoint, presenting quantitative evidence confirming the existence of stochastic long-range memory and robust chaos in a real time series of infections of the human upper respiratory tract. In this work we present a new statistical method of analyzing the epidemic spread of grippe and acute infections of the human upper respiratory tract by means of the theory of discrete non-Markov stochastic processes. We use the results of our recent theory (Phys. Rev. E 65, 046107 (2002)) for the study of statistical effects of memory in real data series describing the epidemic dynamics of human acute respiratory tract infections and grippe. The obtained results testify to the possibility of a strict quantitative description of the regular and stochastic components in the epidemic dynamics of social networks, taking into account time discreteness and statistical-memory effects.
Thurber, Greg M; Wittrup, K Dane
2008-05-01
Antibody-based cancer treatment depends upon distribution of the targeting macromolecule throughout tumor tissue, and spatial heterogeneity could significantly limit efficacy in many cases. Antibody distribution in tumor tissue is a function of drug dosage, antigen concentration, binding affinity, antigen internalization, drug extravasation from blood vessels, diffusion in the tumor extracellular matrix, and systemic clearance rates. We have isolated the effects of a subset of these variables by live-cell microscopic imaging of single-chain antibody fragments against carcinoembryonic antigen in LS174T tumor spheroids. The measured rates of scFv penetration and retention were compared with theoretical predictions based on simple scaling criteria. The theory predicts that antibody dose must be large enough to drive a sufficient diffusive flux of antibody to overcome cellular internalization, and exposure time must be long enough to allow penetration to the spheroid center. The experimental results in spheroids are quantitatively consistent with these predictions. Therefore, simple scaling criteria can be applied to accurately predict antibody and antibody fragment penetration distance in tumor tissue.
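The two requirements stated above (a dose large enough to outrun internalization, and an exposure time long enough to reach the spheroid center) can be put in rough dimensionless form. The sketch below is an order-of-magnitude illustration with hypothetical parameter values and simplified stand-in formulas, not the paper's actual scaling criteria.

```python
# Illustrative (hypothetical) parameters, not taken from the paper.
D   = 1e-11       # antibody diffusivity in tissue, m^2/s
Ag  = 1e-6        # antigen concentration, M
ke  = 1e-5        # net antigen internalization rate constant, 1/s
eps = 0.1         # accessible extracellular volume fraction
R   = 100e-6      # target penetration depth, m

def supply_vs_internalization(Ab):
    """Thiele-modulus-like ratio of the internalization sink to the
    diffusive antibody supply at surface concentration Ab (M); values
    well below 1 mean the dose can outrun cellular internalization."""
    return (R**2 * ke * Ag) / (D * Ab * eps)

def penetration_time(Ab):
    """Order-of-magnitude time for the binding front to reach depth R:
    the free diffusion time scaled up by the binding capacity Ag/(eps*Ab)."""
    return (R**2 / (2 * D)) * (1 + Ag / (eps * Ab))

low_dose, high_dose = 1e-8, 1e-6     # surface antibody concentrations, M
```

Both expressions capture the qualitative behavior the abstract describes: raising the dose `Ab` both relaxes the internalization criterion and shortens the required exposure time.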
Quantitative Aspects of Single Molecule Microscopy
Ober, Raimund J.; Tahmasbi, Amir; Ram, Sripad; Lin, Zhiping; Ward, E. Sally
2015-01-01
Single molecule microscopy is a relatively new optical microscopy technique that allows the detection of individual molecules such as proteins in a cellular context. This technique has generated significant interest among biologists, biophysicists and biochemists, as it holds the promise to provide novel insights into subcellular processes and structures that otherwise cannot be gained through traditional experimental approaches. Single molecule experiments place stringent demands on experimental and algorithmic tools due to the low signal levels and the presence of significant extraneous noise sources. Consequently, this has necessitated the use of advanced statistical signal and image processing techniques for the design and analysis of single molecule experiments. In this tutorial paper, we provide an overview of single molecule microscopy from early works to current applications and challenges. Specific emphasis will be on the quantitative aspects of this imaging modality, in particular single molecule localization and resolvability, which will be discussed from an information theoretic perspective. We review the stochastic framework for image formation, different types of estimation techniques and expressions for the Fisher information matrix. We also discuss several open problems in the field that demand highly non-trivial signal processing algorithms. PMID:26167102
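The localization-accuracy results surveyed here have a simple limiting case that can be checked numerically: for a Gaussian image profile of width s built from N detected photons, with no background noise or pixelation, the Fisher information gives a lower bound of s/sqrt(N) on the standard deviation of any unbiased position estimate, and the photon centroid attains it for a Gaussian profile. The numbers below are illustrative, not from the paper.

```python
import numpy as np

def localization_limit(psf_sigma, n_photons):
    """Cramer-Rao lower bound on the std of a single-molecule position
    estimate for a Gaussian PSF and pure shot noise (no background,
    no pixelation): sigma_x >= psf_sigma / sqrt(N)."""
    return psf_sigma / np.sqrt(n_photons)

# Monte Carlo check: the mean of N photon positions drawn from the PSF
# attains this bound when the profile is Gaussian.
rng = np.random.default_rng(1)
sigma, N, trials = 130.0, 400, 2000            # nm, photons per image, repeats
estimates = rng.normal(0.0, sigma, size=(trials, N)).mean(axis=1)
empirical = estimates.std()
bound = localization_limit(sigma, N)           # 130 / sqrt(400) = 6.5 nm
```

Adding background photons or finite pixels enlarges the bound, which is where the full Fisher information expressions reviewed in the paper come in.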
Conversion of evanescent Lamb waves into propagating waves via a narrow aperture edge.
Yan, Xiang; Yuan, Fuh-Gwo
2015-06-01
This paper presents a quantitative study of the conversion of evanescent Lamb waves into propagating waves in isotropic plates. The conversion is substantiated by prescribing time-harmonic Lamb displacements/tractions through a narrow aperture at an edge of a semi-infinite plate. Complex-valued dispersion and group velocity curves are employed to characterize the conversion process. The amplitude coefficient of the propagating Lamb modes converted from evanescent modes is quantified based on the complex reciprocity theorem via a finite element analysis. The power flow generated into the plate can be separated into radiative and reactive parts on the basis of propagating and evanescent Lamb waves: propagating Lamb waves are theoretically proved to radiate purely real power flow, while evanescent Lamb waves carry purely imaginary, reactive power flow. A propagating power conversion efficiency is then defined to describe the conversion quantitatively. The conversion efficiency is strongly frequency dependent and can be significant. With the propagating waves converted from evanescent ones, sensors in the far field can recapture some localized damage information that is generally carried by evanescent waves, which may have potential application in structural health monitoring.
NASA Astrophysics Data System (ADS)
Snakowska, Anna; Jurkiewicz, Jerzy; Gorazd, Łukasz
2017-05-01
The paper presents a derivation of the impedance matrix based on the rigorous solution of the wave equation obtained by the Wiener-Hopf technique for a semi-infinite unflanged cylindrical duct. The impedance matrix allows, in turn, calculation of the acoustic impedance along the duct and, as a special case, the radiation impedance. The analysis is carried out for a multimode incident wave, accounting for mode coupling at the duct outlet not only qualitatively but also quantitatively for a selected source operating inside. The quantitative evaluation of the acoustic impedance requires knowledge of the mode amplitudes, which has been obtained by applying the mode decomposition method to far-field pressure radiation measurements and theoretical formulae for single-mode directivity characteristics of an unflanged duct. Calculation of the acoustic impedance for a non-uniform distribution of the sound pressure and the sound velocity over a duct cross section requires determination of the acoustic power transmitted along/radiated from the duct. In the paper, the impedance matrix, the power, and the acoustic impedance are derived as functions of Helmholtz number and distance from the outlet.
Coupling biology and oceanography in models.
Fennel, W; Neumann, T
2001-08-01
The dynamics of marine ecosystems, i.e. the changes of observable chemical-biological quantities in space and time, are driven by biological and physical processes. Predictions of future developments of marine systems need a theoretical framework, i.e. models, solidly based on research and understanding of the different processes involved. The natural way to describe marine systems theoretically seems to be the embedding of chemical-biological models into circulation models. However, while circulation models are relatively advanced the quantitative theoretical description of chemical-biological processes lags behind. This paper discusses some of the approaches and problems in the development of consistent theories and indicates the beneficial potential of the coupling of marine biology and oceanography in models.
No breakdown of the radiatively driven wind theory in low-metallicity environments
NASA Astrophysics Data System (ADS)
Bouret, J.-C.; Lanz, T.; Hillier, D. J.; Martins, F.; Marcolino, W. L. F.; Depagne, E.
2015-05-01
We present a spectroscopic analysis of Hubble Space Telescope/Cosmic Origins Spectrograph observations of three massive stars in the low-metallicity dwarf galaxies IC 1613 and WLM. These stars were previously observed with Very Large Telescope (VLT)/X-shooter by Tramper et al., who claimed that their mass-loss rates are higher than expected from theoretical predictions for the underlying metallicity. A comparison of the far ultraviolet (FUV) spectra with those of stars of similar spectral types/luminosity classes in the Galaxy and the Magellanic Clouds provides a direct, model-independent check of the mass-loss-metallicity relation. Then, a quantitative spectroscopic analysis is carried out using the non-LTE (NLTE) stellar atmosphere code CMFGEN. We derive the photospheric and wind characteristics, benefiting from a much better sensitivity of the FUV lines to wind properties than Hα. Iron and CNO abundances are measured, providing an independent check of the stellar metallicity. The spectroscopic analysis indicates that Z/Z⊙ = 1/5, similar to a Small Magellanic Cloud-type environment, and higher than usually quoted for IC 1613 and WLM. The mass-loss rates are smaller than the empirical ones by Tramper et al., and than those predicted by the widely used theoretical recipe by Vink et al. On the other hand, we show that the empirical, FUV-based mass-loss rates are in good agreement with those derived from mass fluxes computed by Lucy. We do not concur with Tramper et al. that there is a breakdown in the mass-loss-metallicity relation.
Wind tunnel seeding particles for laser velocimeter
NASA Technical Reports Server (NTRS)
Ghorieshi, Anthony
1992-01-01
The design of an optimal airfoil has been a major challenge for the aerospace industry. The main objective is to reduce the drag force while increasing the lift force in various environmental air conditions. Experimental verification of theoretical and computational results is a crucial part of the analysis because of errors buried in the solutions, due to the assumptions made in theoretical work. Experimental studies are an integral part of a good design procedure; however, empirical data are not always error free, due to environmental obstacles, poor execution, etc. The reduction of errors in empirical data is a major challenge in wind tunnel testing. One of the recent advances of particular interest is the use of a non-intrusive measurement technique known as laser velocimetry (LV), which allows quantitative flow data to be obtained without introducing flow-disturbing probes. The laser velocimeter technique is based on measuring light scattered by particles present in the flow; it therefore measures particle velocity, not the flow velocity itself. For an accurate flow velocity measurement with laser velocimeters, two criteria are investigated: (1) how well the particles track the local flow field, and (2) whether the particles scatter light efficiently enough to produce usable LV signals. In order to demonstrate the concept of predicting the flow velocity from velocity measurements of particle seeding, the theoretical velocity of the gas flow is computed and compared with the experimentally obtained velocity of the particle seeding.
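The first criterion, particle tracking fidelity, is commonly estimated from the Stokes-drag relaxation time of the seeding particle. The sketch below is the generic textbook estimate, not a calculation from this report, and the particle and fluid properties are illustrative.

```python
def particle_relaxation_time(d_p, rho_p, mu):
    """Stokes-drag response time of a seeding particle,
    tau = rho_p * d_p**2 / (18 * mu).  The particle follows flow
    fluctuations of angular frequency w faithfully only while
    w * tau << 1, the usual criterion for seeding a laser velocimeter."""
    return rho_p * d_p**2 / (18.0 * mu)

mu_air = 1.8e-5          # dynamic viscosity of room-temperature air, Pa*s
rho_oil = 900.0          # typical seeding-droplet density, kg/m^3

# Tracking fidelity degrades with the square of particle diameter:
tau_1um = particle_relaxation_time(1e-6, rho_oil, mu_air)
tau_10um = particle_relaxation_time(10e-6, rho_oil, mu_air)
```

The d^2 scaling is the heart of the seeding trade-off: smaller particles track the flow better (criterion 1) but scatter far less light (criterion 2), since scattering cross sections also fall steeply with diameter.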
NASA Astrophysics Data System (ADS)
Shuai, Kang; Yang, Xiaozhi
2017-03-01
Infrared spectroscopy is a powerful technique for probing H-species in nominally anhydrous minerals, and a particular goal of considerable effort has been to provide a simple yet accurate method for the quantification. The available methods, with either polarized or unpolarized analyses, are usually time-consuming or, in some cases, subject to larger uncertainty than theoretically expected. It is shown here that an empirical approach for measuring the concentration, by determining three polarized infrared spectra along any three mutually perpendicular directions, is theoretically and, in particular, experimentally correct. The theoretical background is established by considering the integrated absorbance, and the experimental measurements are based on a careful evaluation of the species and content of H in a series of gem-quality orthorhombic, monoclinic and triclinic crystals, including olivine, orthopyroxene, clinopyroxene, orthoclase and albite (natural and H-annealed). The results demonstrate that the sum of the integrated absorbance from two polarized spectra along two perpendicular directions in any given plane is a constant, and that the sum of the integrated absorbance from three polarized spectra along any three orthogonal directions is of essentially the same accuracy as that along the principal axes. It is also shown that this method works well, with a relative accuracy within 10%, even in some extreme cases where the sample absorption bands are both intense and strongly anisotropic.
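The orientation independence of the three-spectrum sum follows from treating the integrated absorbance as a quadratic form in the polarization direction (a second-rank tensor): summed over any orthonormal triple of directions, it equals the tensor trace. A small numerical check, with made-up principal absorbances:

```python
import numpy as np

rng = np.random.default_rng(2)

# Principal integrated absorbances (hypothetical values) of an anisotropic
# crystal, written as a symmetric tensor in its principal frame.
M = np.diag([12.0, 3.5, 0.8])

def integrated_absorbance(e):
    """Integrated absorbance for light polarized along unit vector e."""
    return e @ M @ e

# Any three mutually perpendicular directions = columns of a random
# orthogonal matrix (here from a QR decomposition).
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
total = sum(integrated_absorbance(Q[:, i]) for i in range(3))
# total equals tr(M) = 12.0 + 3.5 + 0.8 = 16.3 however the orthogonal
# triple is oriented -- the basis of the three-spectrum method.
```

The same identity restricted to a plane explains the two-direction result quoted in the abstract: the sum over two perpendicular in-plane polarizations is tr(M) minus the absorbance along the plane normal, a constant for a fixed plane.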
Incorporation of membrane potential into theoretical analysis of electrogenic ion pumps.
Reynolds, J A; Johnson, E A; Tanford, C
1985-01-01
The transport rate of an electrogenic ion pump, and therefore also the current generated by the pump, depends on the potential difference (Δψ) between the two sides of the membrane. This dependence arises from at least three sources: (i) charges carried across the membrane by the transported ions; (ii) protein charges in the ion binding sites that alternate between exposure to (and therefore electrical contact with) the two sides of the membrane; (iii) protein charges or dipoles that move within the domain of the membrane as a result of conformational changes linked to the transport cycle. Quantitative prediction of these separate effects requires presently unavailable molecular information, so that there is great freedom in assigning voltage dependence to individual steps of a transport cycle when one attempts to make theoretical calculations of physiological behavior for an ion pump for which biochemical data (mechanism, rate constants, etc.) are already established. The need to make kinetic behavior consistent with thermodynamic laws, however, limits this freedom, and in most cases two points on a curve of rate versus Δψ will be fixed points independent of how voltage dependence is assigned. Theoretical discussion of these principles is illustrated by reference to ATP-driven Na,K pumps. Physiological data for this system suggest that all three of the possible mechanisms for generating voltage dependence do in fact make significant contributions. PMID:2413447
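The fixed-point argument can be made concrete with a toy model. In the sketch below (a generic one-step cycle translocating a single charge, with hypothetical rate constants, not a model from the paper), the fraction `delta` of the electrical distance assigned to the forward step is arbitrary, yet the zero-flux (reversal) voltage is pinned by thermodynamics alone.

```python
import numpy as np

def cycle_flux(u, delta, kf0=50.0, kb0=2.0):
    """Net turnover rate of a minimal one-step electrogenic cycle that
    moves one elementary charge across the membrane.  u = zF*psi/RT is
    the reduced membrane potential; delta in [0, 1] is the (unknown)
    fraction of the electrical distance felt by the forward step -- the
    'freedom' discussed in the abstract.  Rate constants are hypothetical."""
    return kf0 * np.exp(delta * u) - kb0 * np.exp(-(1.0 - delta) * u)

# Detailed balance fixes the zero-flux voltage from kf0/kb0 alone:
# kf0 * exp(u_rev) = kb0  =>  u_rev = -ln(kf0/kb0), for ANY delta.
u_rev = -np.log(50.0 / 2.0)
```

Changing `delta` reshapes the current-voltage curve everywhere except at such thermodynamically fixed points, which is why they survive the freedom in assigning voltage dependence to individual steps.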
Tucker, George; Loh, Po-Ru; Berger, Bonnie
2013-10-04
Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to high false-positive and false-negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays.
This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
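The sample-then-aggregate idea can be sketched in a few lines. Everything below is a schematic stand-in: the count-to-probability map and the binary scoring method are placeholders for the schemes the paper leaves open, not its actual choices.

```python
import numpy as np

rng = np.random.default_rng(3)

def count_to_prob(counts, k=1.0):
    """Map spectral counts to the probability that a co-purification is
    treated as observed.  The saturating form c/(c + k) is only an
    illustrative assumption; the paper discusses choosing such schemes."""
    return counts / (counts + k)

def binary_ppi_method(adj):
    """Stand-in for any existing binary-input PPI scoring method; here it
    simply returns the observation matrix as a float score."""
    return adj.astype(float)

counts = np.array([[0.0, 1.0], [8.0, 30.0]])   # toy bait-by-prey spectral counts
p = count_to_prob(counts)

# Sample an ensemble of alternative binary experimental outcomes, apply the
# binary method to each, and aggregate by averaging the resulting scores.
ensemble = rng.random((500,) + counts.shape) < p
scores = np.mean([binary_ppi_method(a) for a in ensemble], axis=0)
```

Pairs with high counts survive in nearly every sampled outcome and keep their score, while low-count pairs are down-weighted in proportion to their uncertainty, which is the intended effect of the ensemble.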
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zudilin, W W
2001-08-31
An arithmetical property allowing an improvement of some number-theoretic estimates is studied. Previous results were mostly qualitative. Application of quantitative results of the paper to the class of generalized hypergeometric G-functions extends the set of irrational numbers representable as values of these functions.
Ding, Xiangyan; Li, Feilong; Zhao, Youxuan; Xu, Yongmei; Hu, Ning; Cao, Peng; Deng, Mingxi
2018-04-23
This paper investigates the propagation of Rayleigh surface waves in structures with randomly distributed surface micro-cracks using numerical simulations. The results reveal a significant ultrasonic nonlinear effect caused by the surface micro-cracks, manifested mainly as a second harmonic, with even more distinct third and fourth harmonics. Based on statistical analysis of the numerous results from random micro-crack models, it is clearly found that the acoustic nonlinearity parameter increases linearly with micro-crack density, the proportion of surface cracks, the size of the micro-crack zone, and the excitation frequency. This study theoretically shows that nonlinear Rayleigh surface waves are feasible for quantitatively identifying the physical characteristics of surface micro-cracks in structures.
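In such studies the nonlinearity is usually quantified by a relative acoustic nonlinearity parameter proportional to A2/A1^2, the second-harmonic amplitude normalized by the squared fundamental. The sketch below extracts this quantity from a synthetic waveform via an FFT; it illustrates the standard measurement, not the paper's simulation code.

```python
import numpy as np

def nonlinearity_parameter(signal, fs, f0):
    """Relative acoustic nonlinearity parameter, beta ~ A2 / A1**2,
    read off the FFT of a received waveform (fs = sampling rate,
    f0 = excitation frequency)."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal)) * 2.0 / n   # single-sided amplitudes
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    a1 = spec[np.argmin(np.abs(freqs - f0))]       # fundamental
    a2 = spec[np.argmin(np.abs(freqs - 2.0 * f0))] # second harmonic
    return a2 / a1**2

# Synthetic received waveform: fundamental plus a weak second harmonic whose
# relative size plays the role of the crack-induced distortion.
fs, f0, n = 100e6, 1e6, 10000                 # Hz, Hz, samples
t = np.arange(n) / fs                         # exactly 100 cycles of f0
wave = np.sin(2 * np.pi * f0 * t) + 0.02 * np.sin(4 * np.pi * f0 * t)
beta_est = nonlinearity_parameter(wave, fs, f0)
```

Plotting `beta_est` against crack density for a family of simulated waveforms is precisely how the linear trends reported in the abstract would be established.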
The Effect of the Density Ratio on the Nonlinear Dynamics of the Unstable Fluid Interface
NASA Technical Reports Server (NTRS)
Abarzhi, S. I.
2003-01-01
Here we report multiple-harmonic theoretical solutions for a complete system of conservation laws, which describe the large-scale coherent dynamics in Rayleigh-Taylor instability (RTI) and Richtmyer-Meshkov instability (RMI) for fluids with a finite density ratio in the general three-dimensional case. The analysis yields new properties of the bubble front dynamics. In either RTI or RMI, the obtained dependencies of the bubble velocity and curvature on the density ratio differ qualitatively and quantitatively from those suggested by the models of Sharp (1984), Oron et al. (2001), and Goncharov (2002). We show explicitly that these models violate the conservation laws. For the first time, our theory reveals an important qualitative distinction between the dynamics of RT and RM bubbles.
Data analysis and theoretical studies for atmospheric Explorer C, D and E
NASA Technical Reports Server (NTRS)
Dalgarno, A.
1983-01-01
The research concentrated on construction of a comprehensive model of the chemistry of the ionosphere. It proceeded by comparing detailed predictions of the atmospheric parameters observed by the instrumentation on board the Atmospheric Explorer Satellites with the measured values and modifying the chemistry to bring about consistency. Full account was taken of laboratory measurements of the processes identified as important. The research programs were made available to the AE team members. Regularly updated tables of recommended values of photoionization cross sections and electron impact excitation and ionization cross sections were provided. The research did indeed lead to a chemistry model in which the main pathways are quantitatively secure. The accuracy was sufficient that remaining differences are small.
NASA Astrophysics Data System (ADS)
Nikitin, S. Yu.; Priezzhev, A. V.; Lugovtsov, A. E.; Ustinov, V. D.; Razgulin, A. V.
2014-10-01
The paper is devoted to development of the laser ektacytometry technique for evaluating the statistical characteristics of inhomogeneous ensembles of red blood cells (RBCs). We have theoretically analyzed laser beam scattering by inhomogeneous ensembles of elliptical discs modeling red blood cells in the ektacytometer. The analysis shows that the laser ektacytometry technique allows quantitative evaluation of such population characteristics of RBCs as the mean cell shape, the variance of cell deformability, and the asymmetry of the cell distribution in deformability. Moreover, we show that the deformability distribution itself can be retrieved by solving a specific Fredholm integral equation of the first kind. At this stage we do not take into account the scatter in RBC sizes.
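Fredholm equations of the first kind are ill-posed, so retrieving a distribution from measured data in practice requires regularization. The sketch below shows a generic Tikhonov (ridge) inversion of a discretized first-kind equation; the kernel and the "true" distribution are made-up placeholders, not the ektacytometer's actual scattering kernel.

```python
import numpy as np

# Discretized Fredholm equation of the first kind, g = K @ f, where f is
# the unknown deformability distribution and K a smooth forward kernel.
n = 60
s = np.linspace(0.0, 1.0, n)                        # deformability axis
K = np.exp(-((s[:, None] - s[None, :]) / 0.1)**2)   # hypothetical kernel

f_true = np.exp(-((s - 0.4) / 0.08)**2)             # toy distribution
g = K @ f_true + 1e-4 * np.random.default_rng(4).normal(size=n)  # noisy data

# A direct solve of the ill-conditioned system amplifies the noise; the
# Tikhonov term lam * I stabilizes the inversion at the cost of smoothing.
lam = 1e-3
f_rec = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)
```

The regularization parameter `lam` trades noise amplification against blurring of the recovered distribution; choosing it (e.g. by the discrepancy principle) is the main practical difficulty of such inversions.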
NASA Astrophysics Data System (ADS)
Lai, Szu Cheng; Sharifzadeh Mirshekarloo, Meysam; Yao, Kui
2017-05-01
Piezoelectric shunt damping (PSD) utilizes an electrically-shunted piezoelectric damper attached on a panel structure to suppress the transmission of acoustic noise. The paper develops an understanding of the effects of the equivalent series resistance (ESR) of the piezoelectric damper in a PSD system on noise mitigation performance, and demonstrates that an increased ESR leads to a significant rise in noise transmissibility due to a reduction in the system's mechanical damping. It is further demonstrated with experimental results that ESR effects can be compensated in the shunt circuit to significantly improve the noise mitigation performance. A theoretical electrical equivalent model of the PSD incorporating the ESR is established for quantitative analysis of ESR effects on noise mitigation.
Autonomous space processor for orbital debris
NASA Technical Reports Server (NTRS)
Ramohalli, Kumar; Campbell, David; Marine, Micky; Saad, Mohamad; Bertles, Daniel; Nichols, Dave
1990-01-01
Advanced designs continue toward the ultimate goal of a Getaway Special payload to demonstrate economical removal of orbital debris utilizing local resources in orbit. The fundamental technical feasibility was demonstrated in 1988 through theoretical calculations, quantitative computer animation, a solar focal point cutter, a robotic arm design, and a subscale model. Last year improvements were made to the solar cutter and the robotic arm. Also performed last year was a mission analysis which showed the feasibility of retrieving at least four large (greater than 1500 kg) pieces of debris. Advances made during this reporting period are the incorporation of digital control with the existing placement arm, the development of a new robotic manipulator arm, and the study of debris spin attenuation. These advances are discussed.
Fundamentals and techniques of nonimaging optics for solar energy concentration
NASA Astrophysics Data System (ADS)
Winston, R.; O'Gallagher, J. J.
1980-09-01
Recent progress in basic research into the theoretical understanding of nonimaging optical systems and their application to the design of practical solar concentrators is reviewed. Work was done to extend the previously developed geometrical vector flux formalism with the goal of applying it to the analysis of nonideal concentrators. Both phase space and vector flux representations for traditional concentrators were generated. Understanding of the thermodynamically derived relationship between concentration and cavity effects led to the design of new lossless and low-loss concentrators for absorbers with gaps. Quantitative measurements of the response of real collector systems and the distribution of diffuse insolation show that in most cases performance exceeds predictions in solar applications. These developments led to improved nonimaging solar concentrator designs and applications.
Using representations in geometry: a model of students' cognitive and affective performance
NASA Astrophysics Data System (ADS)
Panaoura, Areti
2014-05-01
Self-efficacy beliefs in mathematics, as a dimension of the affective domain, are related to students' performance in solving tasks, and mainly in overcoming cognitive obstacles. The present study investigated the interrelations of cognitive performance in geometry and young students' self-efficacy beliefs about using representations for solving geometrical tasks. The emphasis was on confirming a theoretical model for primary-school and secondary-school students and identifying the differences and similarities between the two ages. A quantitative study was developed and data were collected from 1086 students in Grades 5-8. Confirmatory factor analysis affirmed the existence of a coherent model of affective dimensions about the use of representations for understanding geometrical concepts, which becomes more stable across the educational levels.
A mixed methods study of Canadian adolescents’ perceptions of health
Pickett, William; Vandemeer, Eleanor; Taylor, Brian; Davison, Colleen
2016-01-01
Health perceptions adopted during childhood lay foundations for adult health trajectories and experiences. This study used a sequential mixed methods design to generate new evidence about child perceptions of health in two samples of Canadian children. A core qualitative study was followed by a complementary quantitative analysis to aid interpretation. Generational theory was used as a lens through which to interpret all data. Findings suggested that good health is perceived as customized and subjective. The strengths and liabilities of these perceptions are discussed, as well as implications for health promotion and prevention strategies. Through intentional consideration of the perspectives of this population group, this study makes both empirical and theoretical contributions to appreciating how cultural environments shape health perceptions. PMID:27741955
A mathematical model for CTL effect on a latently infected cell inclusive HIV dynamics and treatment
NASA Astrophysics Data System (ADS)
Tarfulea, N. E.
2017-10-01
This paper investigates theoretically and numerically the effect of immune effectors, such as the cytotoxic lymphocyte (CTL), in modeling HIV pathogenesis (via a newly developed mathematical model); our results suggest the significant impact of the immune response on the control of the virus during primary infection. Qualitative aspects (including positivity, boundedness, stability, uncertainty, and sensitivity analysis) are addressed. Additionally, by introducing drug therapy, we analyze numerically the model to assess the effect of treatment consisting of a combination of several antiretroviral drugs. Our results show that the inclusion of the CTL compartment produces a higher rebound for an individual's healthy helper T-cell compartment than drug therapy alone. Furthermore, we quantitatively characterize successful drugs or drug combination scenarios.
Measuring safety climate in health care.
Flin, R; Burns, C; Mearns, K; Yule, S; Robertson, E M
2006-04-01
To review quantitative studies of safety climate in health care and examine the psychometric properties of the questionnaires designed to measure this construct. A systematic literature review was undertaken to study sample and questionnaire design characteristics (source, number of items, scale type), construct validity (content validity, factor structure and internal reliability, concurrent validity), within-group agreement, and level of analysis. Twelve studies were examined. There was a lack of explicit theoretical underpinning for most questionnaires, and some instruments did not report standard psychometric criteria. Where this information was available, several questionnaires appeared to have limitations. More consideration should be given to psychometric factors in the design of healthcare safety climate instruments, especially as these are beginning to be used in large scale surveys across healthcare organisations.
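Internal reliability of the kind this review checks for is commonly reported as Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with hypothetical questionnaire scores:

```python
def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(it) for it in items)
    totals = [sum(it[r] for it in items) for r in range(n)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Five respondents answering a three-item scale (hypothetical scores).
scores = [
    [4, 5, 3, 4, 2],   # item 1
    [4, 4, 3, 5, 2],   # item 2
    [5, 5, 2, 4, 3],   # item 3
]
alpha = cronbach_alpha(scores)   # ~0.886 for this toy data
```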
Simultaneous two-wavelength tri-window common-path digital holography
NASA Astrophysics Data System (ADS)
Liu, Lei; Shan, Mingguang; Zhong, Zhi
2018-06-01
Two-wavelength common-path off-axis digital holography is proposed with a tri-window in a single shot. It is established using a standard 4f optical imaging system with a 2D Ronchi grating placed outside the Fourier plane. The input plane consists of three windows: one for the object and the other two for the reference beams. Aided by a spatial filter together with two orthogonal linear polarizers in the Fourier plane, the two-wavelength information is encoded into a multiplexed hologram with two orthogonal spatial frequencies that enable full separation of spectral information in the digital Fourier space without resolution loss. Theoretical analysis and experimental results illustrate that our approach can simultaneously perform quantitative phase imaging at two wavelengths.
Ding, Xiangyan; Li, Feilong; Xu, Yongmei; Cao, Peng; Deng, Mingxi
2018-01-01
This paper investigates the propagation of Rayleigh surface waves in structures with randomly distributed surface micro-cracks using numerical simulations. The results revealed a significant ultrasonic nonlinear effect caused by the surface micro-cracks, manifested mainly as a second harmonic, with even more distinct third and fourth harmonics. Based on statistical analysis of the numerous results of random micro-crack models, it is found that the acoustic nonlinear parameter increases linearly with micro-crack density, the proportion of surface cracks, the size of the micro-crack zone, and the excitation frequency. This study theoretically shows that nonlinear Rayleigh surface waves are feasible for quantitatively identifying the physical characteristics of surface micro-cracks in structures. PMID:29690580
Mean-field theory for pedestrian outflow through an exit.
Yanagisawa, Daichi; Nishinari, Katsuhiro
2007-12-01
The average pedestrian flow through an exit is one of the most important indices in evaluating pedestrian dynamics. In order to study the flow in detail, the floor field model, a crowd model using cellular automata, is extended by taking into account realistic behavior of pedestrians around the exit. The model is studied by both numerical simulations and cluster analysis to obtain a theoretical expression for the average pedestrian flow through the exit. It is found quantitatively that exit door width, the wall, and the pedestrians' competitive or cooperative mood significantly influence the average flow. The results show that there is a suitable width and position of the exit according to the pedestrians' mood.
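The effect of a competitive versus cooperative mood on exit flow can be caricatured with a friction parameter mu, as used in floor-field-style cellular automata: when several pedestrians contend for the exit cell, with probability mu the conflict is unresolved and nobody moves. This toy Monte Carlo is a drastic simplification of the paper's model, with all parameters illustrative:

```python
import random

def mean_exit_flow(mu, contenders=3, steps=20000, seed=1):
    """Toy contention model: each step, `contenders` pedestrians target
    the exit cell; with probability mu (the 'friction', standing in for
    a competitive mood) the conflict blocks everyone, otherwise exactly
    one pedestrian passes through. Returns pedestrians per time step."""
    rng = random.Random(seed)
    exits = 0
    for _ in range(steps):
        if contenders == 1 or rng.random() >= mu:
            exits += 1
    return exits / steps

cooperative = mean_exit_flow(mu=0.0)   # conflicts always resolved: flow 1.0
competitive = mean_exit_flow(mu=0.6)   # most conflicts block the exit
```

Even this crude model reproduces the qualitative finding that competition at the door reduces the average flow.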
Hospital library resources in Massachusetts: data collection and analysis.
McGrath, P J
1980-10-01
Hospitals in the Commonwealth of Massachusetts were surveyed to establish some ranges and baseline statistics for hospital medical information resources. The data were evaluated in terms of theoretical compliance with the Joint Commission on Accreditation of Hospitals standards as well as the more specific proposed appendices to the Canadian Standards for Hospital Libraries. The study quantifies hospital library resources and services in a state with a substantial number of acute care facilities. Of the study universe, 67.6% were judged as meeting either the revised JCAH or the Canadian criteria. The central finding is that the 100- to 299-bed institutions reflect a significant number of deficiencies when evaluated against either quantitative or nonquantitative standards. Further areas of study are suggested.
"Parents a dead end life": The main experiences of parents of children with leukemia.
Jadidi, Rahmatollah; Hekmatpou, Davood; Eghbali, Aziz; Memari, Fereshteh; Anbari, Zohreh
2014-11-01
Quantitative studies show that, owing to its widespread prevalence, high death rate, high treatment expenses, and long hospital stays, leukemia influences families and their children to a great extent. In this regard, no qualitative study had been conducted in Iran, so this study was conducted in Arak in 2011 with the aim of expressing the experiences of parents whose children suffered from leukemia. Using a qualitative research approach with the content analysis method, 22 participants were interviewed in two educational hospitals over 2 months. The study was started with purposive sampling and continued with theoretical sampling. The data were analyzed by the content analysis method. Data analysis showed that insolvency, knapsack problems, cancer secrecy, trust in God, self-sacrifice, adaptation, medical malpractice, and hospital facilities were the level 3 codes of the parents' experiences, and "parents a dead end life" was the main theme of this study. In this study, the experiences of parents whose children suffered from cancer were studied in depth by qualitative methods, especially by combining multiple sources, rather than quantitatively. "Parents a dead end life" emerged as the main theme, emphasizing the necessity of paying further attention to the parents. On the other hand, making more use of parents' experiences and encouraging them helps make the treatment more effective. It is suggested that these experiences be shared with parents in the form of pamphlets distributed at the beginning of the treatment process.
Assessing Psychodynamic Conflict.
Simmonds, Joshua; Constantinides, Prometheas; Perry, J Christopher; Drapeau, Martin; Sheptycki, Amanda R
2015-09-01
Psychodynamic psychotherapies suggest that symptomatic relief is provided, in part, by the resolution of psychic conflicts. Clinical researchers have used innovative methods to investigate this phenomenon. This article reviews the literature on quantitative psychodynamic conflict rating scales. An electronic search of the literature was conducted to retrieve quantitative observer-rated scales used to assess conflict, noting each measure's theoretical model, information source, and required training and clinical experience. Scales were also examined for levels of reliability and validity. Five quantitative observer-rated conflict scales were identified. Reliability varied from poor to excellent, with each measure demonstrating good validity. However, the small number of studies and limited links to current conflict theory suggest that further clinical research is needed.
Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun
2017-01-30
Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision for proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently identified proteins were recovered, and the peptide-spectrum-match FDR was recalculated and controlled at a confident level of FDR ≤ 1%, while protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. A >60% increase in total quantified spectra/peptides was achieved for a spike-in sample set and for a public dataset from CPTAC. Incorporating the peptide retrieval strategy significantly improved quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, and it can be readily implemented in a broad range of quantitative proteomics techniques, including label-free and labeling approaches. We hypothesized that more quantifiable spectra and peptides for a protein, even including less confident peptides, could help reduce variation and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches.
The list of confidently identified proteins obtained with the standard target-decoy search strategy was fixed, and additional spectra/peptides of lower confidence matched to these confident proteins were retrieved. The total peptide-spectrum-match false discovery rate (PSM FDR) after the retrieval analysis was still controlled at a confident level of FDR ≤ 1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications was negligible in comparison with the improvements in quantitative performance. More quantifiable peptides, a lower missing-value rate, and better quantitative accuracy and precision were achieved for the same protein identifications by this simple strategy. This strategy is theoretically applicable to any quantitative approach in proteomics and thereby provides more quantitative information, especially on low-abundance proteins. Published by Elsevier B.V.
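The target-decoy bookkeeping behind this strategy is simple: the decoy-based FDR of an accepted PSM set is decoys/targets, the standard filter keeps PSMs above the score giving FDR ≤ 1%, and the retrieval step pools in lower-scoring PSMs that map to already-confident proteins before re-checking the FDR. A minimal sketch with entirely synthetic scores:

```python
def fdr(psms):
    """psms: list of (score, is_decoy). Decoy-estimated FDR = decoys/targets."""
    decoys = sum(1 for _, d in psms if d)
    targets = len(psms) - decoys
    return decoys / targets if targets else 0.0

def threshold_at_fdr(psms, level=0.01):
    """Lowest observed score whose accepted set has FDR <= level."""
    for score, _ in sorted(psms):
        accepted = [p for p in psms if p[0] >= score]
        if fdr(accepted) <= level:
            return score
    return float("inf")

# Synthetic PSMs: (score, is_decoy). High scorers are mostly targets.
psms = ([(9.0, False)] * 200 + [(9.0, True)] * 1 +
        [(5.0, False)] * 50 + [(5.0, True)] * 2)

cutoff = threshold_at_fdr(psms)              # standard 1% FDR filter
confident = [p for p in psms if p[0] >= cutoff]

# Retrieval: lower-scoring PSMs mapping to already-confident proteins
# (simulated here as a separate list) are pooled and the FDR re-checked.
recovered = [(6.0, False)] * 40
pooled = confident + recovered
```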
Chou, Ting-Chao
2006-09-01
The median-effect equation derived from the mass-action law principle at equilibrium-steady state via mathematical induction and deduction for different reaction sequences and mechanisms and different types of inhibition has been shown to be the unified theory for the Michaelis-Menten equation, Hill equation, Henderson-Hasselbalch equation, and Scatchard equation. It is shown that dose and effect are interchangeable via defined parameters. This general equation for the single drug effect has been extended to the multiple drug effect equation for n drugs. These equations provide the theoretical basis for the combination index (CI)-isobologram equation that allows quantitative determination of drug interactions, where CI < 1, = 1, and > 1 indicate synergism, additive effect, and antagonism, respectively. Based on these algorithms, computer software has been developed to allow automated simulation of synergism and antagonism at all dose or effect levels. It displays the dose-effect curve, median-effect plot, combination index plot, isobologram, dose-reduction index plot, and polygonogram for in vitro or in vivo studies. This theoretical development, experimental design, and computerized data analysis have facilitated dose-effect analysis for single drug evaluation or carcinogen and radiation risk assessment, as well as for drug or other entity combinations in a vast field of disciplines of biomedical sciences. In this review, selected examples of applications are given, and step-by-step examples of experimental designs and real data analysis are also illustrated. The merging of the mass-action law principle with mathematical induction-deduction has been proven to be a unique and effective scientific method for general theory development. 
The median-effect principle and its mass-action law based computer software are gaining increased applications in biomedical sciences, from how to effectively evaluate a single compound or entity to how to beneficially use multiple drugs or modalities in combination therapies.
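The core computation can be sketched directly from the equations named above: the median-effect equation fa/fu = (D/Dm)^m linearizes to log(fa/(1-fa)) = m·log(D) − m·log(Dm) (the median-effect plot), and the combination index at a given effect is CI = d1/Dx1 + d2/Dx2. A minimal Python sketch with synthetic dose-effect data (not the software described in the review):

```python
import math

def fit_median_effect(doses, fas):
    """Least-squares fit of the median-effect plot; returns (m, Dm)."""
    xs = [math.log10(d) for d in doses]
    ys = [math.log10(fa / (1 - fa)) for fa in fas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    b = my - m * mx
    return m, 10 ** (-b / m)

def dose_for_effect(fa, m, Dm):
    """Invert the median-effect equation: Dx = Dm * (fa/(1-fa))**(1/m)."""
    return Dm * (fa / (1 - fa)) ** (1 / m)

# Synthetic single-drug data generated with m = 2, Dm = 10.
doses = [2.0, 5.0, 10.0, 20.0, 50.0]
fas = [(d / 10.0) ** 2 / (1 + (d / 10.0) ** 2) for d in doses]
m, Dm = fit_median_effect(doses, fas)        # recovers m = 2, Dm = 10

# CI for a hypothetical combination (d1, d2) = (4, 2) at fa = 0.5,
# with drug 2 assumed to have m = 1, Dm = 5:
ci = (4.0 / dose_for_effect(0.5, m, Dm) +
      2.0 / dose_for_effect(0.5, 1.0, 5.0))  # 0.8 < 1: synergism
```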
[New method of mixed gas infrared spectrum analysis based on SVM].
Bai, Peng; Xie, Wen-Jun; Liu, Jun-Hua
2007-07-01
A new method of infrared spectrum analysis based on the support vector machine (SVM) was proposed for gas mixtures. The kernel function in the SVM was used to map the seriously overlapping absorption spectra into a high-dimensional space; after transformation, the high-dimensional data could be processed in the original space, so a regression calibration model was established and then applied to determine the concentration of each component gas. It was also shown that the SVM regression calibration model can be used for component recognition of the gas mixture. The method was applied to the analysis of different data samples, and factors affecting the model, such as scan interval, wavelength range, kernel function, and penalty coefficient C, were discussed. Experimental results show that the maximal mean absolute error of component concentration is 0.132%, and the component recognition accuracy is higher than 94%. The problems of overlapping absorption spectra, of using the same method for qualitative and quantitative analysis, and of a limited number of training samples were solved. The method could be used in other mixed-gas infrared spectrum analyses and has promising theoretical and practical value.
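The kernel idea at the heart of the method can be sketched with kernel ridge regression, a close cousin of SVM regression that shares the same RBF-kernel mapping: training spectra appear only through pairwise kernel values, so the regression is fitted without ever forming the high-dimensional features. The synthetic "spectra" below are illustrative, not real absorption data:

```python
import math

def rbf(u, v, gamma=50.0):
    """RBF kernel between two spectra (lists of absorbances)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_krr(X, y, lam=1e-6):
    """Kernel ridge fit: solve (K + lam*I) alpha = y."""
    n = len(X)
    K = [[rbf(X[i], X[j]) + (lam if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    return solve(K, y)

def predict(alpha, X, x):
    return sum(a * rbf(xi, x) for a, xi in zip(alpha, X))

# Toy calibration set: absorbances at 3 wavelengths, linear in concentration.
X = [[0.9 * c, 0.5 * c, 0.2 * c] for c in (0.1, 0.3, 0.5, 0.7, 0.9)]
y = [0.1, 0.3, 0.5, 0.7, 0.9]
alpha = fit_krr(X, y)
pred_train = predict(alpha, X, X[2])   # reproduces the calibration point
```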
Use of Multivariate Linkage Analysis for Dissection of a Complex Cognitive Trait
Marlow, Angela J.; Fisher, Simon E.; Francks, Clyde; MacPhie, I. Laurence; Cherny, Stacey S.; Richardson, Alex J.; Talcott, Joel B.; Stein, John F.; Monaco, Anthony P.; Cardon, Lon R.
2003-01-01
Replication of linkage results for complex traits has been exceedingly difficult, owing in part to the inability to measure the precise underlying phenotype, small sample sizes, genetic heterogeneity, and statistical methods employed in analysis. Often, in any particular study, multiple correlated traits have been collected, yet these have been analyzed independently or, at most, in bivariate analyses. Theoretical arguments suggest that full multivariate analysis of all available traits should offer more power to detect linkage; however, this has not yet been evaluated on a genomewide scale. Here, we conduct multivariate genomewide analyses of quantitative-trait loci that influence reading- and language-related measures in families affected with developmental dyslexia. The results of these analyses are substantially clearer than those of previous univariate analyses of the same data set, helping to resolve a number of key issues. These outcomes highlight the relevance of multivariate analysis for complex disorders for dissection of linkage results in correlated traits. The approach employed here may aid positional cloning of susceptibility genes in a wide spectrum of complex traits. PMID:12587094
Elzanfaly, Eman S; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A
2015-12-05
A comparative study was established between two signal processing techniques showing the theoretical algorithm for each method and making a comparison between them to indicate the advantages and limitations. The methods under study are Numerical Differentiation (ND) and Continuous Wavelet Transform (CWT). These methods were studied as spectrophotometric resolution tools for simultaneous analysis of binary and ternary mixtures. To present the comparison, the two methods were applied for the resolution of Bisoprolol (BIS) and Hydrochlorothiazide (HCT) in their binary mixture and for the analysis of Amlodipine (AML), Aliskiren (ALI) and Hydrochlorothiazide (HCT) as an example for ternary mixtures. By comparing the results in laboratory prepared mixtures, it was proven that CWT technique is more efficient and advantageous in analysis of mixtures with severe overlapped spectra than ND. The CWT was applied for quantitative determination of the drugs in their pharmaceutical formulations and validated according to the ICH guidelines where accuracy, precision, repeatability and robustness were found to be within the acceptable limit. Copyright © 2015 Elsevier B.V. All rights reserved.
Exploring the use of storytelling in quantitative research fields using a multiple case study method
NASA Astrophysics Data System (ADS)
Matthews, Lori N. Hamlet
The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data, following the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight into why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five thematic findings emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes, consistent with the existing literature, were: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Winfree, William P.
2000-01-01
Wall thinning in utility boiler waterwall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proved to be very labor intensive and slow, resulting in a "spot check" approach to inspections, with thickness measurements made over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing, followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper presents a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source, coupled with this analysis technique, represents a significant improvement in inspection speed for large structures such as boiler waterwalls while still providing high-resolution thickness measurements. A theoretical basis for the technique is presented, demonstrating its quantitative nature. Further, results of laboratory experiments on flat panel specimens with fabricated material loss regions are presented.
Doing Research on Education for Sustainable Development
ERIC Educational Resources Information Center
Reunamo, Jyrki; Pipere, Anita
2011-01-01
Purpose: The purpose of this paper is to describe the research preferences and differences of education for sustainable development (ESD) researchers. A model with the continuums assimilation-accommodation and adaptation-agency was applied resulting in quantitative, qualitative, theoretic and participative research orientations.…
Trends in Child Maltreatment Literature.
ERIC Educational Resources Information Center
Behl, Leah E.; Conyngham, Heather A.; May, Patricia F.
2003-01-01
Child maltreatment articles (n=2090) published from 1977-1998 were reviewed. Across the period studied, quantitative articles and articles on child sexual abuse increased and theoretical articles and articles on physical abuse decreased. Articles examining child neglect or emotional abuse remained consistently low. Participant recruitment from…
NASA Technical Reports Server (NTRS)
Benoit, P. H.; Sears, D. W. G.
2000-01-01
Natural thermoluminescence (TL) of ordinary chondrites reflects their irradiation and thermal history. We discuss the quantitative aspects of TL interpretation, with an emphasis on the terrestrial history of Antarctic meteorites and the orbital history of modern falls.
Radiofrequency recombination lines as diagnostics of the cool interstellar medium.
NASA Technical Reports Server (NTRS)
Dupree, A. K.
1971-01-01
Quantitative details are given of a new diagnostic technique for the carbon and hydrogen (H I) recombination lines. Theoretical results are presented for conditions expected in H I clouds, and are compared with available observations for Orion A and NGC 2024.
ERIC Educational Resources Information Center
Glass, Gene V.
1992-01-01
Questions that beginning educational researchers and evaluators will have to answer are discussed concerning: (1) which intellectual tradition to select; (2) the roles of basic and theoretical inquiry; (3) qualitative and quantitative distinctions; (4) the theory-practice relationship; (5) political influences; and (6) the correct emphasis on…
Representativeness of laboratory sampling procedures for the analysis of trace metals in soil.
Dubé, Jean-Sébastien; Boudreault, Jean-Philippe; Bost, Régis; Sona, Mirela; Duhaime, François; Éthier, Yannic
2015-08-01
This study was conducted to assess the representativeness of laboratory sampling protocols for purposes of trace metal analysis in soil. Five laboratory protocols were compared, including conventional grab sampling, to assess the influence of sectorial splitting, sieving, and grinding on measured trace metal concentrations and their variability. It was concluded that grinding was the most important factor in controlling the variability of trace metal concentrations. Grinding increased the reproducibility of sample mass reduction by rotary sectorial splitting by up to two orders of magnitude. Combined with rotary sectorial splitting, grinding increased the reproducibility of trace metal concentrations by almost three orders of magnitude compared to grab sampling. Moreover, results showed that if grinding is used as part of a mass reduction protocol by sectorial splitting, the effect of sieving on reproducibility became insignificant. Gy's sampling theory and practice was also used to analyze the aforementioned sampling protocols. While the theoretical relative variances calculated for each sampling protocol qualitatively agreed with the experimental variances, their quantitative agreement was very poor. It was assumed that the parameters used in the calculation of theoretical sampling variances may not correctly estimate the constitutional heterogeneity of soils or soil-like materials. Finally, the results have highlighted the pitfalls of grab sampling, namely, the fact that it does not exert control over incorrect sampling errors and that it is strongly affected by distribution heterogeneity.
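The theoretical sampling variances referred to here come from Gy's fundamental sampling error, whose standard form for the relative variance is s² = f·g·c·l·d³·(1/Ms − 1/Ml), with shape factor f, granulometric factor g, mineralogical factor c, liberation factor l, top particle size d (cm), and sample/lot masses in grams. A minimal sketch with illustrative parameter values (not the study's soils) showing why grinding, which reduces d, dominates the variance:

```python
def gy_relative_variance(f, g, c, l, d_cm, sample_g, lot_g):
    """Gy's fundamental sampling error (relative variance)."""
    return f * g * c * l * d_cm ** 3 * (1.0 / sample_g - 1.0 / lot_g)

# Same 50 g split from a 5 kg lot, before and after grinding:
coarse = gy_relative_variance(0.5, 0.25, 200.0, 0.8, 0.20, 50.0, 5000.0)
fine = gy_relative_variance(0.5, 0.25, 200.0, 0.8, 0.02, 50.0, 5000.0)
# a tenfold reduction in particle size cuts the variance a thousandfold
```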
[Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].
Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie
2013-11-01
In order to improve the accuracy of AES quantitative analysis, we combined XPS with AES and studied a method to reduce the error of AES quantitative analysis. Pt-Co, Cu-Au and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative analysis results by adjusting the Auger sensitivity factors until the two techniques gave consistent compositions. We then verified the accuracy of AES quantitative analysis with the revised sensitivity factors on other samples with different composition ratios, and the results showed that the corrected relative sensitivity factors reduce the error of AES quantitative analysis to less than 10%. Peak definition is difficult in integral-spectrum AES analysis, since choosing the starting and ending points when determining the characteristic Auger peak area involves great uncertainty. To make the analysis easier, we also processed the data in differential form, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and again verified the accuracy on other samples with different composition ratios. The result showed that the analytical error of AES quantitative analysis was reduced to less than 9%. Thus the accuracy of AES quantitative analysis can be greatly improved by combining XPS with AES to correct the Auger sensitivity factors, since matrix effects are taken into account. Good consistency was obtained, proving the feasibility of this method.
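The correction scheme can be sketched from the standard relative-sensitivity-factor formula C_i = (I_i/S_i) / Σ_j (I_j/S_j): given the XPS composition of the same film, the AES factors are rescaled so AES reproduces it. Intensities and compositions below are illustrative:

```python
def composition(intensities, factors):
    """Relative-sensitivity quantification: C_i = (I_i/S_i) / sum(I_j/S_j)."""
    ratios = [i / s for i, s in zip(intensities, factors)]
    total = sum(ratios)
    return [r / total for r in ratios]

def correct_factors(aes_intensities, xps_composition):
    """Rescale AES sensitivity factors so AES reproduces the XPS
    composition (normalized to the first element's factor)."""
    s = [i / c for i, c in zip(aes_intensities, xps_composition)]
    return [v / s[0] for v in s]

aes_I = [120.0, 80.0]   # AES peak-to-peak heights for a binary film
xps_C = [0.55, 0.45]    # composition of the same film from XPS
S_corr = correct_factors(aes_I, xps_C)
C_check = composition(aes_I, S_corr)   # now matches the XPS composition
```

In practice the corrected factors are then validated on independent samples of different composition, as the abstract describes.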
Absolute Helmholtz free energy of highly anharmonic crystals: theory vs Monte Carlo.
Yakub, Lydia; Yakub, Eugene
2012-04-14
We discuss the problem of the quantitative theoretical prediction of the absolute free energy for classical highly anharmonic solids. Helmholtz free energy of the Lennard-Jones (LJ) crystal is calculated accurately while accounting for both the anharmonicity of atomic vibrations and the pair and triple correlations in displacements of the atoms from their lattice sites. The comparison with most precise computer simulation data on sublimation and melting lines revealed that theoretical predictions are in excellent agreement with Monte Carlo simulation data in the whole range of temperatures and densities studied.
NASA Astrophysics Data System (ADS)
Poumellec, B.; Kraizman, V.; Aifa, Y.; Cortès, R.; Novakovich, A.; Vedrinskii, R.
1998-09-01
Angular dependence of the vanadium K-edge x-ray appearance near-edge structure (XANES) for the VOPO4.2H2O xerogel is thoroughly studied both experimentally and theoretically. The main attention is paid to the pre-edge fine structure (PEFS) of the spectra, which was shown earlier to be a useful tool for investigating short-range atomic order. Good quantitative agreement between theory and experiment, obtained for both dipole and quadrupole contributions to the spectra, proves the validity of the calculation method developed and enables us to ascertain the nature of all the features in the PEFS. The p-d mixing effect due to distortion of the central coordination octahedron and the quadrupole transitions are proved to be the only mechanisms responsible for the PEFS formation in the case considered. We show that in order to achieve quantitative agreement between experimental and theoretical spectra, it is necessary to include the effect of atomic vibrations, which makes the forbidden transitions to molecular orbitals of the central octahedron (MOCOs) dipole allowed, and to take into account deviation of the crystal layers from the substrate plane, which is not a single crystal but a texture.
NASA Astrophysics Data System (ADS)
Reddy, Pramod; Washiyama, Shun; Kaess, Felix; Kirste, Ronny; Mita, Seiji; Collazo, Ramon; Sitar, Zlatko
2017-12-01
A theoretical framework that provides a quantitative relationship between point defect formation energies and growth process parameters is presented. It enables systematic point defect reduction by chemical potential control in metalorganic chemical vapor deposition (MOCVD) of III-nitrides. Experimental corroboration is provided by a case study of C incorporation in GaN. The theoretical model is shown to provide quantitative predictions of C_N defect incorporation in GaN as a function of growth parameters and offers valuable insights into boundary phases and other impurity chemical reactions. The metal supersaturation is found to be the primary factor determining the chemical potential of III/N and, consequently, the incorporation or formation of point defects that involve exchange of III or N atoms with the reservoir. The framework is general and may be extended to other defect systems in (Al)GaN. The utility of the equilibrium formalism typically employed in density functional theory for predicting defect incorporation in non-equilibrium, high-temperature MOCVD growth is confirmed. Furthermore, the proposed theoretical framework may be used to determine optimal growth conditions that achieve minimum compensation within any given constraints, such as growth rate, crystal quality, and other practical system limitations.
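The equilibrium formalism mentioned above links a defect's formation energy (itself a function of the chemical potentials set by growth conditions) to its concentration via Boltzmann statistics. A minimal sketch, with illustrative site densities and formation energies rather than values from the paper:

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant, eV/K

def defect_concentration(n_sites, e_form, temperature):
    """Equilibrium point-defect concentration c = N_sites * exp(-E_f / kT).
    E_f depends on the chemical potentials, which the growth parameters
    (e.g. metal supersaturation) control."""
    return n_sites * math.exp(-e_form / (K_B * temperature))

# Hypothetical numbers: lowering E_f by 1 eV at a 1300 K growth
# temperature raises the defect concentration by several orders of magnitude.
c_low_ef = defect_concentration(4.4e22, 1.5, 1300.0)   # sites/cm^3, eV, K
c_high_ef = defect_concentration(4.4e22, 2.5, 1300.0)
```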
NASA Astrophysics Data System (ADS)
van de Wiel, B. J. H.; Moene, A. F.; Hartogensis, O. K.; de Bruin, H. A. R.; Holtslag, A. A. M.
2003-10-01
In this paper a classification of stable boundary layer regimes is presented, based on observations of near-surface turbulence during the Cooperative Atmosphere-Surface Exchange Study-1999 (CASES-99). It is found that the different nights can be divided into three subclasses: a turbulent regime, an intermittent regime, and a radiative regime, which confirms the findings of two companion papers that use a simplified theoretical model (its simplified structure limits the model's generality to near-surface flows). The papers predict the occurrence of stable boundary layer regimes in terms of external forcing parameters such as the (effective) pressure gradient and radiative forcing. The classification in the present work supports these predictions and shows that they are robust in a qualitative sense. For example, it is shown that intermittent turbulence is most likely to occur in clear-sky conditions with a moderately weak effective pressure gradient. The quantitative features of the theoretical classification are, however, rather sensitive to (often uncertain) local parameter estimates, such as the bulk heat conductance of the vegetation layer. This sensitivity limits the applicability of the theoretical classification in a strict quantitative sense, apart from its conceptual value.
The interaction of moderately strong shock waves with thick perforated walls of low porosity
NASA Technical Reports Server (NTRS)
Grant, D. J.
1972-01-01
A theoretical prediction is given of the flow through thick perforated walls of low porosity resulting from the impingement of a moderately strong traveling shock wave. The model was a flat plate positioned normal to the direction of the flow. Holes bored in the plate parallel to the direction of the flow provided nominal hole length-to-diameter ratios of 10:1 and an axial porosity of 25 percent of the flow channel cross section. The flow field behind the reflected shock wave was assumed to behave as a reservoir producing a quasi-steady duct flow through the model. Rayleigh and Fanno duct flow computations for each of the three possible auxiliary wave patterns that can be associated with the transmitted shock (to satisfy contact surface compatibility) were used to provide bounding solutions as an alternative to the more complex influence-coefficients method. Qualitative and quantitative behavior was verified in a 1.5- by 2.0-in. helium shock tube. High-speed schlieren photography, piezoelectric pressure-time histories, and electronic-counter wave speed measurements were used to assess the extent of correlation with the theoretical flow models. Reduced data indicated the adequacy of the bounding-theory approach to predict wave phenomena and quantitative response.
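The Fanno branch of the duct-flow computation above rests on the classic Fanno-flow relation for adiabatic flow with wall friction. A minimal sketch of that relation (the dimensionless friction length needed to choke the flow at a given Mach number); gamma defaults to 1.4 for air, while the experiments above used helium (gamma = 5/3):

```python
import math

def fanno_friction_length(mach, gamma=1.4):
    """Fanno-flow relation 4 f L*/D: dimensionless duct length over which
    wall friction drives adiabatic flow at the given Mach number to M = 1."""
    m2 = mach * mach
    return ((1.0 - m2) / (gamma * m2)
            + (gamma + 1.0) / (2.0 * gamma)
            * math.log((gamma + 1.0) * m2 / (2.0 + (gamma - 1.0) * m2)))

# Sonic flow needs zero additional length; subsonic flow needs a finite one.
l_sonic = fanno_friction_length(1.0)       # 0
l_half = fanno_friction_length(0.5)        # ~1.069 (standard tabulated value)
```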
Boiler Tube Corrosion Characterization with a Scanning Thermal Line
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Jacobstein, Ronald; Reilly, Thomas
2001-01-01
Wall thinning due to corrosion in utility boiler water wall tubing is a significant operational concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. Unfortunately, ultrasonic inspection is very labor intensive and slow. Therefore, thickness measurements are typically taken over a relatively small percentage of the total boiler wall, and statistical analysis is used to determine the overall condition of the boiler tubing. Other inspection techniques, such as the electromagnetic acoustic transducer (EMAT), have recently been evaluated; however, they provide only a qualitative evaluation, identifying areas or spots where corrosion has significantly reduced the wall thickness. NASA Langley Research Center, in cooperation with ThermTech Services, has developed a thermal NDE technique designed to quantitatively measure the wall thickness and thus determine the amount of material thinning present in steel boiler tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing, followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in inspection speed and accuracy for large structures such as boiler water walls. A theoretical basis for the technique will be presented to establish its quantitative nature. Further, a dynamic calibration system will be presented that allows the extraction of thickness information from the temperature data.
Additionally, the results of the application of this technology to actual water wall tubing samples and in-situ inspections will be presented.
Li, Jiajia; Li, Rongxi; Zhao, Bangsheng; Guo, Hui; Zhang, Shuan; Cheng, Jinghua; Wu, Xiaoli
2018-04-15
The use of micro-laser Raman spectroscopy for quantitatively determining the gas carbon isotope composition is presented. In this study, ¹²CO₂ and ¹³CO₂ were mixed with N₂ at various molar fraction ratios to obtain the Raman quantification factors (F(¹²CO₂) and F(¹³CO₂)), which provide a theoretical basis for calculating the δ¹³C value. The corresponding values were 0.523 (0
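The quantification factors enter the delta calculation by converting Raman peak areas to molar abundances. A hedged sketch of the standard delta-notation arithmetic; the peak areas and factor values below are hypothetical, not the paper's calibration:

```python
R_VPDB = 0.0111802  # 13C/12C ratio of the VPDB reference standard

def delta13C(area_12, area_13, f_12, f_13, r_std=R_VPDB):
    """Scale each peak area by its species' quantification factor to get a
    molar 13C/12C ratio, then express it in delta notation (permil vs VPDB)."""
    r_sample = (area_13 / f_13) / (area_12 / f_12)
    return (r_sample / r_std - 1.0) * 1000.0

# A sample whose corrected ratio equals the standard gives delta = 0 permil.
d_zero = delta13C(1000.0, 1000.0 * R_VPDB, 1.0, 1.0)
```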
Video measurement of the muzzle velocity of a potato gun
NASA Astrophysics Data System (ADS)
Jasperson, Christopher; Pollman, Anthony
2011-09-01
Using first principles, a theoretical equation for the maximum and actual muzzle velocities of a pneumatic cannon was recently derived. For a fixed barrel length, this equation suggests that the muzzle velocity can be enhanced by maximizing the product of the initial pressure and volume of the propellant gas and by decreasing the projectile mass. The present paper describes experiments conducted to verify this theoretical equation. A high-speed video camera was used to quantify the muzzle velocity of potatoes of varying mass exiting a pneumatic cannon at gauge pressures ranging from 310 to 830 kPa. The experiments verified that a friction-modified version of the theoretical equation is qualitatively and quantitatively accurate for potato masses above 100 g.
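The pressure-volume scaling described above can be illustrated with a simple energy-balance upper bound: if all stored gas energy (approximated as gauge pressure times reservoir volume) went into projectile kinetic energy, then (1/2) m v² = P V. This is a sketch of the scaling only; the paper's actual equation also involves barrel length and friction, and the reservoir volume here is an assumed number:

```python
import math

def max_muzzle_velocity(p_gauge, volume, mass):
    """Energy-balance upper bound: (1/2) m v^2 = P * V  =>  v = sqrt(2 P V / m).
    SI units: Pa, m^3, kg -> m/s."""
    return math.sqrt(2.0 * p_gauge * volume / mass)

# 830 kPa gauge, a 2-litre reservoir (assumed), 150 g potato: ~149 m/s.
v_upper = max_muzzle_velocity(830e3, 2.0e-3, 0.150)
```

Doubling either the pressure or the reservoir volume raises the bound by a factor of sqrt(2), while doubling the projectile mass lowers it by the same factor, matching the qualitative trends the experiments confirmed.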
Mazzucca, Stephanie; Tabak, Rachel G; Pilar, Meagan; Ramsey, Alex T; Baumann, Ana A; Kryzer, Emily; Lewis, Ericka M; Padek, Margaret; Powell, Byron J; Brownson, Ross C
2018-01-01
The need for optimal study designs in dissemination and implementation (D&I) research is increasingly recognized. Despite the wide range of study designs available for D&I research, we lack understanding of the types of designs and methodologies that are routinely used in the field. This review assesses the designs and methodologies in recently proposed D&I studies and provides resources to guide design decisions. We reviewed 404 study protocols published in the journal Implementation Science from 2/2006 to 9/2017. Eligible studies tested the efficacy or effectiveness of D&I strategies (i.e., not the effectiveness of the underlying clinical or public health intervention); had a comparison by group and/or time; and used ≥1 quantitative measure. Several design elements were extracted: design category (e.g., randomized); design type [e.g., cluster randomized controlled trial (RCT)]; data type (e.g., quantitative); D&I theoretical framework; levels of treatment assignment, intervention, and measurement; and country in which the research was conducted. Each protocol was double-coded, and discrepancies were resolved through discussion. Of the 404 protocols reviewed, 212 (52%) studies tested one or more implementation strategies across 208 manuscripts, thereby meeting the inclusion criteria. Of the included studies, 77% utilized randomized designs, primarily cluster RCTs. The use of alternative designs (e.g., stepped wedge) increased over time. Fewer studies were quasi-experimental (17%) or observational (6%). Many study design categories (e.g., controlled pre-post, matched pair cluster design) were represented by only one or two studies. Most articles proposed quantitative and qualitative methods (61%), with the remaining 39% proposing only quantitative methods. Half of the protocols (52%) reported using a theoretical framework to guide the study.
The four most frequently reported frameworks were the Consolidated Framework for Implementation Research and RE-AIM (n = 16 each), followed by Promoting Action on Research Implementation in Health Services and the Theoretical Domains Framework (n = 12 each). While several novel designs for D&I research have been proposed (e.g., stepped wedge, adaptive designs), the majority of the studies in our sample employed RCT designs. Alternative study designs are increasing in use but may be underutilized for a variety of reasons, including funder preferences or lack of awareness of these designs. Promisingly, the prevalent use of quantitative and qualitative methods together reflects methodological innovation in newer D&I research.