Defining Success in Adult Basic Education Settings: Multiple Stakeholders, Multiple Perspectives
ERIC Educational Resources Information Center
Tighe, Elizabeth L.; Barnes, Adrienne E.; Connor, Carol M.; Steadman, Sharilyn C.
2013-01-01
This study employed quantitative and qualitative research approaches to investigate what constitutes success in adult basic education (ABE) programs from the perspectives of multiple educational stakeholders: the state funding agency, the teachers, and the students. Success was defined in multiple ways. In the quantitative section of the study, we…
Transforming Verbal Counts in Reports of Qualitative Descriptive Studies Into Numbers
Chang, YunKyung; Voils, Corrine I.; Sandelowski, Margarete; Hasselblad, Vic; Crandell, Jamie L.
2009-01-01
Reports of qualitative studies typically do not offer much information on the numbers of respondents linked to any one finding. This information may be especially useful in reports of basic, or minimally interpretive, qualitative descriptive studies focused on surveying a range of experiences in a target domain, and its lack may limit the ability to synthesize the results of such studies with quantitative results in systematic reviews. Accordingly, the authors illustrate strategies for deriving plausible ranges of respondents expressing a finding in a set of reports of basic qualitative descriptive studies on antiretroviral adherence and suggest how the results might be used. These strategies have limitations and are never appropriate for use with findings from interpretive qualitative studies. Yet they offer a temporary workaround for preserving and maximizing the value of information from basic qualitative descriptive studies for systematic reviews. They show also why quantitizing is never simply quantitative. PMID:19448052
NASA Astrophysics Data System (ADS)
Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.
2017-01-01
Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess future non-scientists' attitudes toward science, as well as basic scientific knowledge and scientific analysis skills that may remain unchanged after college. Through a series of studies, we have been able to evaluate students' basic science knowledge, attitudes toward science, quantitative literacy, and information literacy. In the fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends in each of these studies as well as the comparison case study. In general, we have found that students' basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between students' attitudes and their science and quantitative knowledge and skills. Additionally, students' information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.
ERIC Educational Resources Information Center
Papaphotis, Georgios; Tsaparlis, Georgios
2008-01-01
Part 1 of the findings are presented of a quantitative study (n = 125) on basic quantum chemical concepts taught in the twelfth grade (age 17-18 years) in Greece. A paper-and-pencil test of fourteen questions was used. The study compared performance in five questions that tested recall of knowledge or application of algorithmic procedures (type-A…
ERIC Educational Resources Information Center
Cotreau Berube, Elyse A.
2011-01-01
The purpose of this quantitative research study was to investigate the use of rote learning in basic skills of mathematics and spelling of 12 high school students, from a career and technical high school, in an effort to learn if the pedagogy of rote fits in the frameworks of today's education. The study compared the accuracy of…
What Are We Doing When We Translate from Quantitative Models?
Critchfield, Thomas S; Reed, Derek D
2009-01-01
Although quantitative analysis (in which behavior principles are defined in terms of equations) has become common in basic behavior analysis, translational efforts often examine everyday events through the lens of narrative versions of laboratory-derived principles. This approach to translation, although useful, is incomplete because equations may convey concepts that are difficult to capture in words. To support this point, we provide a nontechnical introduction to selected aspects of quantitative analysis; consider some issues that translational investigators (and, potentially, practitioners) confront when attempting to translate from quantitative models; and discuss examples of relevant translational studies. We conclude that, where behavior-science translation is concerned, the quantitative features of quantitative models cannot be ignored without sacrificing conceptual precision, scientific and practical insights, and the capacity of the basic and applied wings of behavior analysis to communicate effectively. PMID:22478533
Basic sciences agonize in Turkey!
NASA Astrophysics Data System (ADS)
Akdemir, Fatma; Araz, Asli; Akman, Ferdi; Durak, Rıdvan
2016-04-01
This study traces changes over time in the departments of physics, chemistry, biology, and mathematics, which are considered the basic sciences in Turkey. The importance of basic science for the country is emphasized, and the country's status is discussed from a critical perspective. The number of academic staff, the number of students, and the admission quotas opened each year for these four departments at universities were calculated, and the resulting changes were analyzed. The trends were similar across the four departments, with a particularly significant change observed in physics. The lack of jobs for young people who have graduated in the basic sciences is also an issue that must be discussed: psychological problems caused by unemployment have become a disease among young people. Although the study also yields qualitative observations, it focuses on quantitative results; we attempt to explain the causes of the results obtained and to propose solutions.
A QUANTITATIVE APPROACH FOR ESTIMATING EXPOSURE TO PESTICIDES IN THE AGRICULTURAL HEALTH STUDY
We developed a quantitative method to estimate chemical-specific pesticide exposures in a large prospective cohort study of over 58,000 pesticide applicators in North Carolina and Iowa. An enrollment questionnaire was administered to applicators to collect basic time- and inten...
Yin, Hongyao; Feng, Yujun; Liu, Hanbin; Mu, Meng; Fei, Chenhong
2014-08-26
Owing to its wide availability, nontoxicity, and low cost, CO2 has attracted increasing attention in recent years as a trigger to reversibly switch material properties, including polarity, ionic strength, hydrophilicity, viscosity, surface charge, and degree of polymerization or cross-linking. However, the quantitative correlation between the basicity of these materials and their CO2 switchability has been little documented, even though it is of great importance for fabricating switchable systems. In this work, the "switch-on" and "switch-off" abilities of melamine and its amino-substituted derivatives upon introducing and removing CO2 are studied, and their quantitative relationship with basicity is established, so that the performance of other organobases can be quantitatively predicted. These findings are beneficial for forecasting the CO2 stimuli-responsive behavior of other organobases and for the design of CO2-switchable materials.
Li, Dong-tao; Ling, Chang-quan; Zhu, De-zeng
2007-07-01
To establish a quantitative model for evaluating the degree of the basic TCM syndromes often encountered in patients with primary liver cancer (PLC), medical literature concerning the clinical investigation and TCM syndromes of PLC was collected and analyzed using an expert symposium method, and 100-millimeter scaling was applied, combined with scoring of symptom degree, to establish a quantitative criterion for classifying the degree of symptoms and signs in patients with PLC. Two models, an additive model and an additive-multiplicative model, were established using the analytic hierarchy process (AHP) as the mathematical tool to estimate the weights of the criteria for evaluating basic syndromes at various layers by specialists. The two models were then verified in clinical practice, and the outcomes were compared with fuzzy evaluations by specialists. Verification on 459 cases/visits of PLC showed that the coincidence rate between the specialists' outcomes and those of the additive model was 84.53%, versus 62.75% for the additive-multiplicative model, a statistically significant difference (P < 0.01). The additive model is therefore the model of choice for quantitative evaluation of the degree of basic TCM syndromes in patients with PLC.
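As a rough illustration of the two model forms named in this abstract, a minimal sketch follows. The weights, symptom scores, and grouping below are invented for illustration; the paper's actual AHP-derived weights are not given in the abstract.

```python
# Hypothetical sketch of the two syndrome-scoring model forms described above.
# All weights and symptom-degree scores are invented for illustration.

def additive_score(weights, scores):
    """Weighted sum of symptom-degree scores (additive model)."""
    return sum(w * s for w, s in zip(weights, scores))

def additive_multiplicative_score(weight_groups, score_groups):
    """Product over symptom groups of within-group weighted sums
    (one reading of an additive-multiplicative combination)."""
    total = 1.0
    for weights, scores in zip(weight_groups, score_groups):
        total *= additive_score(weights, scores)
    return total

weights = [0.5, 0.3, 0.2]   # hypothetical AHP weights (sum to 1)
scores = [80, 40, 60]       # symptom degrees on a 0-100 scale
print(additive_score(weights, scores))  # 64.0
```

The additive form simply accumulates weighted evidence; the multiplicative form makes the overall score drop sharply when any one group scores low.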
Shortage of Mathematics Teachers in Thai Basic Education Level
ERIC Educational Resources Information Center
Puncreobutr, Vichian; Rattanatumma, Tawachai
2016-01-01
The objective of this study was to identify the reasons for the shortage of mathematics teachers at the Thai basic education level. This research is both quantitative and qualitative in nature. For the purpose of the study, a survey was conducted with senior high school students in order to find out their willingness to pursue mathematics in Bachelor of…
New Statistical Techniques for Evaluating Longitudinal Models.
ERIC Educational Resources Information Center
Murray, James R.; Wiley, David E.
A basic methodological approach in developmental studies is the collection of longitudinal data. Behavioral data can take at least two forms, qualitative (or discrete) and quantitative. Both types are fallible. Measurement errors can occur in quantitative data, and measures of these are based on error variance. Qualitative or discrete data can…
Guidelines for Reporting Quantitative Methods and Results in Primary Research
ERIC Educational Resources Information Center
Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob
2015-01-01
Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…
A quantitative comparison of corrective and perfective maintenance
NASA Technical Reports Server (NTRS)
Henry, Joel; Cain, James
1994-01-01
This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.
ERIC Educational Resources Information Center
Tsaparlis, Georgios; Papaphotis, Georgios
2009-01-01
This study tested for deep understanding and critical thinking about basic quantum chemical concepts taught at 12th grade (age 17-18). Our aim was to achieve conceptual change in students. A quantitative study was conducted first (n = 125), and following this 23 selected students took part in semi-structured interviews either individually or in…
Structure and Properties of Energetic Materials
1992-12-02
basic research is needed. First, a quantitative study of friction effects on propellants with varying particle sizes can be conducted. Second, using...Army position, policy, or decision, unless so designated by other documentation. Mat. Res. Soc. Symp. Proc. Vol. 296. © 1993 Materials Research Society...further observations and analysis. INTRODUCTION Recently, a study group sponsored by the Army Research Office developed and published an overall basic
ERIC Educational Resources Information Center
Alsuwaileh, Bader Ghannam; Russ-Eft, Darlene F.; Alshurai, Saad R.
2016-01-01
The research herein used a sequential mixed-methods design to investigate why academic dishonesty is widespread among students at the College of Basic Education in Kuwait. Qualitative interviews were conducted to generate research hypotheses. Then, using a questionnaire survey, the research hypotheses were quantitatively tested. The findings…
The Impact of the Digital Divide on First-Year Community College Students
ERIC Educational Resources Information Center
Mansfield, Malinda
2017-01-01
Some students do not possess the learning management system (LMS) skills and basic computer skills needed for success in first-year experience (FYE) courses. The purpose of this quantitative study, based on the Integrative Learning Design Framework and the theory of transactional distance, was to identify what basic computer skills and LMS skills are needed…
ERIC Educational Resources Information Center
Esia-Donkoh, Kweku; Baffoe, Stella
2018-01-01
The study examined the supervisory practices of headteachers and how these practices relate to teacher motivation in public basic schools in the Anomabo Education Circuit of the Mfantseman Municipality in the Central Region of Ghana. A quantitative approach with a cross-sectional survey design was adopted. Using purposive and…
ERIC Educational Resources Information Center
Frazier Varner, Debrah
2010-01-01
Many adult basic education (ABE) programs do not achieve a high success rate in meeting student academic needs. Rooted in Knowles' theory of andragogy and Bandura's theory of modeling, this quantitative causal comparative study examined the effects of individualized instruction (IGI) and of facilitated, participatory group programs (SPOKES) on the…
ERIC Educational Resources Information Center
Haegele, Justin A.; Hodge, Samuel R.
2015-01-01
Emerging professionals, particularly senior-level undergraduate and graduate students in kinesiology who have an interest in physical education for individuals with and without disabilities, should understand the basic assumptions of the quantitative research paradigm. Knowledge of basic assumptions is critical for conducting, analyzing, and…
Appraising Quantitative Research in Health Education: Guidelines for Public Health Educators
Hayes, Sandra C.; Scharalda, Jeanfreau G.; Stetson, Barbara; Jones-Jack, Nkenge H.; Valliere, Matthew; Kirchain, William R.; Fagen, Michael; LeBlanc, Cris
2010-01-01
Many practicing health educators do not feel they possess the skills necessary to critically appraise quantitative research. This publication is designed to help provide practicing health educators with basic tools helpful to facilitate a better understanding of quantitative research. This article describes the major components—title, introduction, methods, analyses, results and discussion sections—of quantitative research. Readers will be introduced to information on the various types of study designs and seven key questions health educators can use to facilitate the appraisal process. Upon reading, health educators will be in a better position to determine whether research studies are well designed and executed. PMID:20400654
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.D.; Beck, R.N.
1988-06-01
This document describes several years of research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)
ERIC Educational Resources Information Center
Ocloo, Mark Anthony; Subbey, Michael
2008-01-01
The purpose of this study is to investigate the perception of basic school teachers toward inclusive education in the Hohoe District of Ghana. The research uses a descriptive survey design, which engaged both qualitative and quantitative research methodologies. A sample size of 100 respondents, comprising 60 male teachers and 40 female…
Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari
2002-01-01
The aim of our study is to establish a methodology for analyzing the level of security requirements, for identifying suitable security measures, and for optimizing the distribution of security effort across every portion of medical practice. Quantitative expression is introduced wherever possible, to allow easy follow-up of security procedures and easy evaluation of security outcomes. System analysis by fault tree analysis (FTA) showed that subdividing system elements in detail contributes to much more accurate analysis; such subdivided composition factors depend heavily on staff behavior, interactive terminal devices, kinds of service, and network routes. In conclusion, we found methods for analyzing the level of security requirements of each medical information system by employing FTA, basic events for each composition factor, and combinations of basic events. Methods for identifying suitable security measures were also found: risk factors for each basic event, the number of elements for each composition factor, and candidate security-measure elements. A method to optimize the security measures for each medical information system is proposed, in which the optimum distribution of risk factors in terms of basic events is worked out, enabling comparison across medical information systems.
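The abstract above builds on fault tree analysis (FTA) over basic events. A generic sketch of how a top-event probability is computed from basic-event probabilities, assuming statistically independent events, might look like the following; this is standard FTA arithmetic, not the paper's actual tree, and the probabilities are hypothetical.

```python
# Generic fault-tree gate evaluation (not the paper's actual tree).
# Assumes statistically independent basic events.

def and_gate(probs):
    """Probability that ALL input events occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Probability that AT LEAST ONE input event occurs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical basic-event probabilities for a small security fault tree:
# the top event fires if (e1 AND e2) OR e3.
e1, e2, e3 = 0.1, 0.2, 0.05
top = or_gate([and_gate([e1, e2]), e3])
print(round(top, 4))  # 0.069
```

Subdividing composition factors, as the abstract recommends, corresponds to replacing a single coarse basic event with a subtree of finer-grained events, which sharpens the estimate of the top-event probability.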
Critical thinking skills of basic baccalaureate and Accelerated second-degree nursing students.
Newton, Sarah E; Moore, Gary
2013-01-01
The purpose of this study was to describe the critical thinking (CT) skills of basic baccalaureate (basic-BSN) and accelerated second-degree (ASD) nursing students at nursing program entry. Many authors propose that CT in nursing should be viewed as a developmental process that increases as students' experiences with it change. However, there is a dearth of literature describing basic-BSN and ASD students' CT skills from an evolutionary perspective. The study design was exploratory descriptive. The results indicated that ASD students had higher CT scores on a quantitative critical thinking assessment at program entry than basic-BSN students. CT data are needed across the nursing curriculum from basic-BSN and ASD students in order for nurse educators to develop cohort-specific pedagogical approaches that facilitate critical thinking in nursing and produce nurses with good CT skills for the future.
Principles of Quantitative MR Imaging with Illustrated Review of Applicable Modular Pulse Diagrams.
Mills, Andrew F; Sakai, Osamu; Anderson, Stephan W; Jara, Hernan
2017-01-01
Continued improvements in diagnostic accuracy with magnetic resonance (MR) imaging will require methods for tissue analysis that complement traditional qualitative MR imaging studies. Quantitative MR imaging is based on measurement and interpretation of tissue-specific parameters independent of experimental design, whereas qualitative MR imaging relies on interpretation of tissue contrast that results from experimental pulse sequence parameters. Quantitative MR imaging represents a natural next step in the evolution of MR imaging practice, since quantitative data can be acquired using currently available qualitative imaging pulse sequences without modifications to imaging equipment. While this approach offers the potential for improved diagnostic accuracy and workflow, awareness of this extension to qualitative imaging is generally low. This article reviews the basic physical concepts in MR imaging, describes commonly measured tissue parameters in quantitative MR imaging, and presents the major available pulse sequences, organized hierarchically into conventional, hybrid, and multispectral sequences capable of calculating the main tissue parameters of T1, T2, and proton density. © RSNA, 2017.
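The abstract above centers on mapping tissue-specific parameters such as T2. As an illustration of the underlying arithmetic only (not the article's specific pulse sequences), a mono-exponential T2 estimate can be computed from signals at two echo times; all numbers below are synthetic.

```python
import math

def t2_from_two_echoes(s1, s2, te1, te2):
    """Mono-exponential T2 from two spin-echo signals:
    S(TE) = S0 * exp(-TE / T2)  =>  T2 = (TE2 - TE1) / ln(S1 / S2)."""
    return (te2 - te1) / math.log(s1 / s2)

# Synthetic signals for a tissue with T2 = 80 ms and S0 = 1000:
s0, t2_true = 1000.0, 80.0
te1, te2 = 20.0, 100.0  # echo times in ms
s1 = s0 * math.exp(-te1 / t2_true)
s2 = s0 * math.exp(-te2 / t2_true)
print(round(t2_from_two_echoes(s1, s2, te1, te2), 2))  # 80.0
```

Note that S0 (which carries the proton-density weighting) cancels in the ratio, which is why a relaxation parameter can be recovered independent of overall signal scaling.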
United States Marine Corps Basic Reconnaissance Course: Predictors of Success
2017-03-01
PAGE INTENTIONALLY LEFT BLANK 81 VI. CONCLUSIONS AND RECOMMENDATIONS A. CONCLUSIONS The objective of my research is to provide quantitative ...percent over the last three years, illustrating there is room for improvement. This study conducts a quantitative and qualitative analysis of the...criteria used to select candidates for the BRC. The research uses multi-variate logistic regression models and survival analysis to determine to what
2012-01-01
Background: F1 hybrid clones of Eucalyptus grandis and E. urophylla are widely grown for pulp and paper production in tropical and subtropical regions. Volume growth and wood quality are priority objectives in Eucalyptus tree improvement. The molecular basis of quantitative variation and trait expression in eucalypt hybrids, however, remains largely unknown. The recent availability of a draft genome sequence (http://www.phytozome.net) and genome-wide genotyping platforms, combined with high levels of genetic variation and high linkage disequilibrium in hybrid crosses, greatly facilitates the detection of quantitative trait loci (QTLs) as well as underlying candidate genes for growth and wood property traits. In this study, we used Diversity Arrays Technology markers to assess the genetic architecture of volume growth (diameter at breast height, DBH) and wood basic density in four-year-old progeny of an interspecific backcross pedigree of E. grandis and E. urophylla. In addition, we used Illumina RNA-Seq expression profiling in the E. urophylla backcross family to identify cis- and trans-acting polymorphisms (eQTLs) affecting transcript abundance of genes underlying QTLs for wood basic density.

Results: A total of five QTLs for DBH and 12 for wood basic density were identified in the two backcross families. Individual QTLs for DBH and wood basic density explained 3.1 to 12.2% of phenotypic variation. Candidate genes underlying QTLs for wood basic density on linkage groups 8 and 9 were found to share trans-acting eQTLs located on linkage groups 4 and 10, which in turn coincided with QTLs for wood basic density, suggesting that these QTLs represent segregating components of an underlying transcriptional network.

Conclusion: This is the first demonstration of the use of next-generation expression profiling to quantify transcript abundance in a segregating tree population and identify candidate genes potentially affecting wood property variation. The QTLs identified in this study provide a resource for identifying candidate genes and developing molecular markers for marker-assisted breeding of volume growth and wood basic density. Our results suggest that integrated analysis of transcript and trait variation in eucalypt hybrids can be used to dissect the molecular basis of quantitative variation in wood property traits. PMID:22817272
Nuclear medicine and imaging research (quantitative studies in radiopharmaceutical science)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, M.D.; Beck, R.N.
1990-09-01
This is a report of progress in Year Two (January 1, 1990--December 31, 1990) of Grant FG02-86ER60438, "Quantitative Studies in Radiopharmaceutical Science," awarded for the three-year period January 1, 1989--December 31, 1991 as a competitive renewal following a site visit in the fall of 1988. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 25 refs., 13 figs., 1 tab.
Analysis of a document/reporting system
NASA Technical Reports Server (NTRS)
Narrow, B.
1971-01-01
An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system; it is believed that this is the first documented study that utilizes quantitative measures for full-scale system analysis. The quantitative measures, and the techniques for collecting and quantifying the basic data, are described and are applicable to any information system. This report should therefore be of interest to anyone concerned with the management, design, analysis, or evaluation of information systems.
Description and Application of a Mathematical Method for the Analysis of Harmony
Zuo, Qiting; Jin, Runfang; Ma, Junxia
2015-01-01
Harmony issues are widespread in human society and nature. To analyze these issues, harmony theory has been proposed as the main theoretical approach for the study of interpersonal relationships and relationships between humans and nature. Therefore, it is of great importance to study harmony theory. After briefly introducing the basic concepts of harmony theory, this paper expounds the five elements that are essential for the quantitative description of harmony issues in water resources management: harmony participant, harmony objective, harmony regulation, harmony factor, and harmony action. A basic mathematical equation for the harmony degree, that is, a quantitative expression of harmony issues, is introduced in the paper: HD = ai − bj, where a is the uniform degree, b is the difference degree, i is the harmony coefficient, and j is the disharmony coefficient. This paper also discusses harmony assessment and harmony regulation and introduces some application examples. PMID:26167535
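The harmony-degree equation quoted in the abstract above can be sketched directly. The abstract defines the symbols but gives no numerical example, so the values below are hypothetical.

```python
def harmony_degree(a, b, i, j):
    """Harmony degree HD = a*i - b*j, where a is the uniform degree,
    b the difference degree, i the harmony coefficient, and j the
    disharmony coefficient (symbols as defined in the abstract)."""
    return a * i - b * j

# Hypothetical values: mostly uniform system, no disharmony weight.
print(harmony_degree(a=0.8, b=0.2, i=1.0, j=0.0))  # 0.8
```

With i = 1 and j = 0 the difference degree contributes nothing, so HD reduces to the uniform degree a; larger b or j pulls HD down toward disharmony.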
Parameterizing the Supernova Engine and Its Effect on Remnants and Basic Yields
NASA Astrophysics Data System (ADS)
Fryer, Chris L.; Andrews, Sydney; Even, Wesley; Heger, Alex; Safi-Harb, Samar
2018-03-01
Core-collapse supernova science is now entering an era in which engine models are beginning to make both qualitative and, in some cases, quantitative predictions. Although the evidence in support of the convective engine for core-collapse supernova continues to grow, it is difficult to place quantitative constraints on this engine. Some studies have made specific predictions for the remnant distribution from the convective engine, but the results differ between different groups. Here we use a broad parameterization for the supernova engine to understand the differences between distinct studies. With this broader set of models, we place error bars on the remnant mass and basic yields from the uncertainties in the explosive engine. We find that, even with only three progenitors and a narrow range of explosion energies, we can produce a wide range of remnant masses and nucleosynthetic yields.
ERIC Educational Resources Information Center
Papaphotis, Georgios; Tsaparlis, Georgios
2008-01-01
Part 2 of the findings are presented of a quantitative study (n = 125) on basic quantum chemical concepts taught at twelfth grade (age 17-18 years) in Greece. A paper-and-pencil test of fourteen questions was used that were of two kinds: five questions that tested recall of knowledge or application of algorithmic procedures (type-A questions);…
Oja, M; Maran, U
2015-01-01
Absorption varies across gastrointestinal-tract compartments and is largely influenced by pH. Considering pH in studies and analyses of membrane permeability therefore provides an opportunity to better understand the behaviour of compounds and to obtain good permeability estimates for prediction purposes. This study concentrates on relationships between chemical structure and the membrane permeability of acidic and basic drugs and drug-like compounds. The membrane permeability of 36 acidic and 61 basic compounds was measured using the parallel artificial membrane permeability assay (PAMPA) at pH 3, 5, 7.4 and 9. Descriptive and/or predictive single-parameter quantitative structure-permeability relationships were derived for all pH values. For acidic compounds, membrane permeability is mainly influenced by hydrogen-bond donor properties, as revealed by models with r^2 > 0.8 for pH 3 and pH 5. For basic compounds, the best (r^2 > 0.7) structure-permeability relationships are obtained with the octanol-water distribution coefficient for pH 7.4 and pH 9, indicating the importance of partition properties. In addition to the validation set, the prediction quality of the developed models was tested with folic acid and astemizole, showing good matches between experimental and calculated membrane permeabilities at key pHs. Selected QSAR models are available at the QsarDB repository (http://dx.doi.org/10.15152/QDB.166).
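The single-parameter structure-permeability models described above are ordinary least-squares fits scored by the coefficient of determination. A minimal sketch follows; the descriptor and permeability values are synthetic, since the paper's measured data are not reproduced here.

```python
# Single-parameter linear fit scored by r-squared, the model form used
# in single-descriptor QSPR studies. All data below are synthetic.

def fit_line(x, y):
    """Ordinary least squares for y = slope*x + intercept;
    returns (slope, intercept, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1.0 - ss_res / ss_tot

logd = [1.2, 2.0, 2.9, 3.5, 4.1]         # hypothetical descriptor values
log_pe = [-6.8, -6.1, -5.5, -5.0, -4.6]  # hypothetical log permeabilities
slope, intercept, r2 = fit_line(logd, log_pe)
print(r2 > 0.7)  # True
```

A fit like this, with r^2 above the threshold quoted in the abstract, is what would mark a descriptor (such as a distribution coefficient) as predictive at a given pH.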
A structural equation modeling analysis of students' understanding in basic mathematics
NASA Astrophysics Data System (ADS)
Oktavia, Rini; Arif, Salmawaty; Ferdhiana, Ridha; Yuni, Syarifah Meurah; Ihsan, Mahyus
2017-11-01
This research aims to identify incoming students' understanding and misconceptions of several basic concepts in mathematics. The participants are the 2015 incoming students of the Faculty of Mathematics and Natural Science of Syiah Kuala University, Indonesia. Using an instrument developed from anecdotal and empirical evidence on students' misconceptions, a survey involving 325 participants was administered, and several quantitative and qualitative analyses of the survey data were conducted. In this article, we discuss the confirmatory factor analysis, using structural equation modeling (SEM), of factors that determine the new students' overall understanding of basic mathematics. The results showed that students' understanding of algebra, arithmetic, and geometry were significant predictors of their overall understanding of basic mathematics, supporting the view that arithmetic and algebra are not the only predictors of students' understanding of basic mathematics.
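As a simplified stand-in for the SEM analysis described above, the "topic scores predict overall understanding" idea can be sketched with ordinary least squares on synthetic data. All numbers below are invented; this is not the study's instrument or data.

```python
import numpy as np

# Synthetic illustration: overall understanding regressed on algebra,
# arithmetic, and geometry scores (OLS, a simpler stand-in for SEM).
rng = np.random.default_rng(0)
n = 325  # matches the abstract's sample size; scores are invented
algebra = rng.uniform(0, 100, n)
arithmetic = rng.uniform(0, 100, n)
geometry = rng.uniform(0, 100, n)
overall = (0.4 * algebra + 0.35 * arithmetic + 0.25 * geometry
           + rng.normal(0, 5, n))  # noisy linear combination

# Design matrix with an intercept column, solved by least squares.
X = np.column_stack([np.ones(n), algebra, arithmetic, geometry])
coef, *_ = np.linalg.lstsq(X, overall, rcond=None)
print(np.round(coef[1:], 2))  # close to the true weights [0.4, 0.35, 0.25]
```

Recovering all three weights as clearly nonzero is the OLS analogue of the abstract's finding that algebra, arithmetic, and geometry are each significant predictors.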
Informal Adoption Among Black Families.
ERIC Educational Resources Information Center
Hill, Robert B.
This study of informal adoption patterns among black families throughout the United States examines the role of the extended family and the functioning of a kinship network which includes foster care of children by relatives other than parents. The study's basic mode of investigation was secondary analysis of existing data: quantitative national…
Henwood, Benjamin F.; Derejko, Katie-Sue; Couture, Julie; Padgett, Deborah K.
2014-01-01
This mixed-methods study uses Maslow’s hierarchy as a theoretical lens to investigate the experiences of 63 newly enrolled clients of housing first and traditional programs for adults with serious mental illness who have experienced homelessness. Quantitative findings suggest that identifying self-actualization goals is associated with not having one’s basic needs met rather than with the fulfillment of basic needs. Qualitative findings suggest a more complex relationship between basic needs, goal setting, and the meaning of self-actualization. Transforming mental health care into a recovery-oriented system will require further consideration of person-centered care planning as well as the impact of limited resources, especially for those living in poverty. PMID:24518968
Fritz, Nora E; Keller, Jennifer; Calabresi, Peter A; Zackowski, Kathleen M
2017-01-01
At least 85% of individuals with multiple sclerosis report walking dysfunction as their primary complaint. Walking and strength measures are common clinical measures to mark increasing disability or improvement with rehabilitation. Previous studies have shown an association between strength or walking ability and spinal cord MRI measures, and between strength measures and brainstem corticospinal tract magnetization transfer ratio. However, the relationship between walking performance and brain corticospinal tract magnetization transfer imaging measures, and the contribution of clinical measurements of walking and strength to the underlying integrity of the corticospinal tract, has not been explored in multiple sclerosis. The objectives of this study were to explore the relationship of quantitative measures of walking and strength to whole-brain corticospinal tract-specific MRI measures and to determine the contribution of quantitative measures of function, in addition to basic clinical measures (age, gender, symptom duration and Expanded Disability Status Scale), to structural imaging measures of the corticospinal tract. We hypothesized that quantitative walking and strength measures would be related to brain corticospinal tract-specific measures, and would provide insight into the heterogeneity of brain pathology. Twenty-nine individuals with relapsing-remitting multiple sclerosis (mean (SD) age 48.7 (11.5) years; symptom duration 11.9 (8.7) years; 17 females; median [range] Expanded Disability Status Scale 4.0 [1.0-6.5]) and 29 age- and gender-matched healthy controls (age 50.8 (11.6) years; 20 females) participated in clinical tests of strength and walking (Timed Up and Go, Timed 25 Foot Walk, Two Minute Walk Test) as well as 3 T imaging including diffusion tensor imaging and magnetization transfer imaging. Individuals with multiple sclerosis were weaker (p = 0.0024) and walked slower (p = 0.0013) compared to controls. 
Quantitative measures of walking and strength were significantly related to corticospinal tract fractional anisotropy (r > 0.26; p < 0.04) and magnetization transfer ratio (r > 0.29; p < 0.03) measures. Although the Expanded Disability Status Scale was highly correlated with walking measures, it was not significantly related to either corticospinal tract fractional anisotropy or magnetization transfer ratio (p > 0.05). Walk velocity was a significant contributor to magnetization transfer ratio (p = 0.006) and fractional anisotropy (p = 0.011) in regression modeling that included both quantitative measures of function and basic clinical information. Quantitative measures of strength and walking are associated with brain corticospinal tract pathology. The addition of these quantitative measures to basic clinical information explains more of the variance in corticospinal tract fractional anisotropy and magnetization transfer ratio than the basic clinical information alone. Outcome measurement for multiple sclerosis clinical trials has been notoriously challenging; the use of quantitative measures of strength and walking along with tract-specific imaging methods may improve our ability to monitor disease change over time, with intervention, and provide needed guidelines for developing more effective targeted rehabilitation strategies.
Zhang, Yin; Wang, Lei; Diao, Tianxi
2013-01-01
The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has a significant referential meaning for the decision making of global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivities of the CTSA program had a stable increase since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits from the CTSA program were assisting its members to build a robust academic home for the Clinical and Translational Science and to attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689
Profile of Scientific Ability of Chemistry Education Students in Basic Physics Course
NASA Astrophysics Data System (ADS)
Suastika, K. G.; Sudyana, I. N.; Lasiani, L.; Pebriyanto, Y.; Kurniawati, N.
2017-09-01
The weakness of college students' scientific ability has become a concern, especially with respect to laboratory activities that support laboratory-based education. Scientific ability is a basic skill that students must master in the basic physics lecture process as part of the scientific method. This research aims to explore which indicators of scientific ability emerge among students of the Chemistry Education Study Program, Faculty of Teaching and Education, University of Palangka Raya, through inquiry-based learning in basic physics courses. This is a quantitative study using a descriptive method (descriptive-quantitative). Students were divided into three groups: an excellent group, a low group, and a heterogeneous group. The results show that the excellent and low groups both exhibited a decrease in the percentage of scientific ability achieved, whereas the heterogeneous group improved. These differences are attributable to the students' level of enthusiasm in each group, as reflected in the tables of scientific ability achievement aspects. We hope these results will serve as a reference for further research on innovative learning strategies and models that improve scientific ability and scientific reasoning, especially for prospective science teachers.
NASA Astrophysics Data System (ADS)
Follette, K.; McCarthy, D.
2012-08-01
Current trends in the teaching of high school and college science avoid numerical engagement because nearly all students lack basic arithmetic skills and experience anxiety when encountering numbers. Nevertheless, such skills are essential to science and vital to becoming savvy consumers, citizens capable of recognizing pseudoscience, and discerning interpreters of statistics in ever-present polls, studies, and surveys in which our society is awash. Can a general-education collegiate course motivate students to value numeracy and to improve their quantitative skills in what may well be their final opportunity in formal education? We present a tool to assess whether skills in numeracy/quantitative literacy can be fostered and improved in college students through the vehicle of non-major introductory courses in astronomy. Initial classroom applications define the magnitude of this problem and indicate that significant improvements are possible. Based on these initial results we offer this tool online and hope to collaborate with other educators, both formal and informal, to develop effective mechanisms for encouraging all students to value and improve their skills in basic numeracy.
Building a Database for a Quantitative Model
NASA Technical Reports Server (NTRS)
Kahn, C. Joseph; Kleinhammer, Roger
2014-01-01
A database can greatly benefit a quantitative analysis. The defining characteristic of a quantitative risk, or reliability, model is the use of failure estimate data. Models can easily contain a thousand Basic Events, relying on hundreds of individual data sources. Obviously, entering so much data by hand will eventually lead to errors. Less obviously, entering data this way does nothing to link the Basic Events to their data sources. The best way to organize large amounts of data on a computer is with a database. But a model does not require a large, enterprise-level database with dedicated developers and administrators. A database built in Excel can be quite sufficient. A simple spreadsheet database can link every Basic Event to the individual data source selected for it. This database can also contain the manipulations appropriate for how the data is used in the model. These manipulations include stressing factors based on use and maintenance cycles, dormancy, unique failure modes, the modeling of multiple items as a single "Super component" Basic Event, and Bayesian updating based on flight and testing experience. A simple, unique metadata field in both the model and database provides a link from any Basic Event in the model to its data source and all relevant calculations. The credibility of the entire model often rests on the credibility and traceability of the data.
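A minimal sketch of the linkage described above: each Basic Event carries a unique metadata key that resolves to one data-source record, and the manipulation (here, a use-cycle stressing factor feeding an exponential failure model) lives alongside the data. All identifiers, rates, and source names are hypothetical.

```python
import math

# Hypothetical spreadsheet-style database: one row per data source,
# keyed by the same metadata field the model's Basic Events carry.
data_sources = {
    "DS-0042": {"source": "MIL-HDBK-217F", "base_rate": 2.0e-6},   # failures/hour
    "DS-0107": {"source": "Vendor test report", "base_rate": 8.0e-7},
}

basic_events = [
    {"id": "BE-PUMP-FAIL",   "data_key": "DS-0042", "duty_factor": 0.25},
    {"id": "BE-VALVE-STUCK", "data_key": "DS-0107", "duty_factor": 1.00},
]

def resolve(event, mission_hours):
    """Join a Basic Event to its data source and apply a stressing manipulation."""
    src = data_sources[event["data_key"]]           # traceable link via metadata key
    rate = src["base_rate"] * event["duty_factor"]  # e.g., stress by use cycle
    return {"id": event["id"], "source": src["source"],
            "p_fail": 1.0 - math.exp(-rate * mission_hours)}

resolved = [resolve(be, mission_hours=1000.0) for be in basic_events]
```

The point is traceability: given any Basic Event identifier, the metadata key leads directly to its source and to every calculation applied to it.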
Zhang, Yin; Diao, Tianxi; Wang, Lei
2014-12-01
Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for the decision making of global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, a specific core field and institute, and outstanding academic status and benefit. While not considered in this study, patent data are another important indicator that should be integrated into related research in the future. © 2014 Wiley Periodicals, Inc.
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived, each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned using quantitative information extracted from the mathematical model and from online computations performed with it.
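Dempster's rule itself is compact. A sketch, with two hypothetical mass functions representing evidence from two diagnostic models over a two-fault frame of discernment:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions given as {frozenset: mass}."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc           # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {a: m / (1.0 - conflict) for a, m in combined.items()}

F1, F2 = frozenset({"fault1"}), frozenset({"fault2"})
theta = F1 | F2                           # frame of discernment
m_model_a = {F1: 0.6, theta: 0.4}         # evidence from diagnostic model A
m_model_b = {F2: 0.3, theta: 0.7}         # evidence from diagnostic model B
m12 = dempster_combine(m_model_a, m_model_b)
```

Mass on intersecting hypotheses is multiplied and accumulated; the mass assigned to the empty set is the conflict K, and the surviving masses are renormalized by 1 − K.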
[Some comments on ecological field].
Wang, D
2000-06-01
Based on data from plant ecological field studies, this paper reviewed the concept of the ecological field, field eigenfunctions, graphs of the ecological field, and the application of ecological field theory to explaining plant interactions. It is suggested that the basic character of the ecological field is material, and that, at the current level of research, it is not clear whether the ecological field is a specific field distinct from general physical fields. The author comments on the formulation and parameter estimation of the basic field function (the ecological potential model) of the ecological field. Both models have their own characteristics and advantages under specific conditions. The author emphasizes that the ecological field carries broader methodological meaning for ecology, and that applying ecological field theory to describe the types and processes of plant interactions has three characteristics: it is quantitative, synthetic, and intuitive. Field graphing may provide a new approach to ecological studies; in particular, ecological field theory may give an appropriate quantitative explanation for the dynamic processes of plant populations (coexistence and interference competition).
Li, Jie; Sun, Jin; Cui, Shengmiao; He, Zhonggui
2006-11-03
Linear solvation energy relationships (LSERs) amended by the introduction of a molecular electronic factor were employed to establish quantitative structure-retention relationships using immobilized artificial membrane (IAM) chromatography, in particular for ionizable solutes. The chromatographic indices, log k(IAM), were determined by HPLC on an IAM.PC.DD2 column for 53 structurally diverse compounds, including neutral, acidic and basic compounds. Unlike neutral compounds, the IAM chromatographic retention of ionizable compounds was affected by their molecular charge state. When the mean net charge per molecule (delta) was introduced into the amended LSER as the sixth variable, the LSER regression coefficient was significantly improved for the test set including ionizable solutes. The delta coefficients of acidic and basic compounds were quite different, indicating that the molecular electronic factor had a markedly different impact on the retention of acidic and basic compounds on the IAM column. Ionization of acidic compounds containing a carboxylic group tended to impair their retention on IAM, while the ionization of basic compounds did not have such a marked effect. In addition, the extra interaction with the polar head of phospholipids might cause a certain change in the retention of basic compounds. A comparison of calculated and experimental retention indices suggested that the semi-empirical LSER amended by the addition of a molecular electronic factor was able to reproduce adequately the experimental retention factors of the structurally diverse solutes investigated.
Using mixed methods research in medical education: basic guidelines for researchers.
Schifferdecker, Karen E; Reed, Virginia A
2009-07-01
Mixed methods research involves the collection, analysis and integration of both qualitative and quantitative data in a single study. The benefits of a mixed methods approach are particularly evident when studying new questions or complex initiatives and interactions, which is often the case in medical education research. Basic guidelines for when to use mixed methods research and how to design a mixed methods study in medical education research are not readily available. The purpose of this paper is to remedy that situation by providing an overview of mixed methods research, research design models relevant for medical education research, examples of each research design model in medical education research, and basic guidelines for medical education researchers interested in mixed methods research. Mixed methods may prove superior in increasing the integrity and applicability of findings when studying new or complex initiatives and interactions in medical education research. They deserve an increased presence and recognition in medical education research.
NASA Astrophysics Data System (ADS)
Haneda, Kiyofumi; Kajima, Toshio; Koyama, Tadashi; Muranaka, Hiroyuki; Dojo, Hirofumi; Aratani, Yasuhiko
2002-05-01
The goal of our study is to analyze the level of security required, to search for suitable security measures, and to optimize the distribution of security across every part of medical practice. Where possible, quantitative expression is introduced to simplify follow-up security procedures and the evaluation of security outcomes. Using fault tree analysis (FTA), system analysis showed that subdividing system elements into detailed groups yields a much more accurate analysis. These subdivided composition factors depend heavily on staff behavior, interactive terminal devices, the kinds of services provided, and network routes. Security measures were then implemented based on the analysis results. In conclusion, we identified methods for determining the required level of security and proposed security measures for each medical information system, along with the basic events, and combinations of events, that comprise the threat composition factors. Methods for identifying suitable security measures were found and implemented. Risk factors for each basic event, the number of elements for each composition factor, and potential security measures were identified. Methods to optimize the security measures for each medical information system were proposed, yielding the most efficient distribution of risk factors across basic events.
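In a fault tree, the top-event probability follows from the basic-event probabilities through the gate logic. A sketch under an independence assumption, with hypothetical event names and probabilities (not values from the study):

```python
def p_and(ps):
    """AND gate: all inputs must occur (assumes independent events)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(ps):
    """OR gate: any input occurring triggers the event (assumes independence)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical basic-event probabilities for a data-exposure top event:
p_network_sniffed   = 0.05   # traffic intercepted on a network route
p_encryption_absent = 0.30   # service transmits without encryption
p_terminal_stolen   = 0.02   # interactive terminal device stolen

# Exposure requires (interception AND no encryption) OR terminal theft.
p_top = p_or([p_and([p_network_sniffed, p_encryption_absent]), p_terminal_stolen])
```

Subdividing composition factors, as the study recommends, amounts to refining these basic events so each probability can be estimated and mitigated separately.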
Defining Success in Adult Basic Education Settings: Multiple Stakeholders, Multiple Perspectives
Tighe, Elizabeth L.; Barnes, Adrienne E.; Connor, Carol M.; Steadman, Sharilyn C.
2015-01-01
This study employed quantitative and qualitative research approaches to investigate what constitutes “success” in Adult Basic Education (ABE) programs from the perspectives of multiple educational stakeholders: the state funding agency, the teachers, and the students. Success was defined in multiple ways. In the quantitative section of the study, we computed classroom value-added scores (used as a metric of the state’s definition of success) to identify more and less effective ABE classrooms in two Florida counties. In the qualitative section of the study, we observed and conducted interviews with teachers and students in the selected classrooms to investigate how these stakeholders defined success in ABE. Iterative consideration of the qualitative data revealed three principal markers of success: (a) instructional strategies and teacher-student interactions; (b) views on standardized testing; and (c) student motivational factors. In general, classrooms with higher value-added scores were characterized by multiple instructional approaches, positive and collaborative teacher-student interactions, and students engaging in goal setting and citing motivational factors such as family and personal fulfillment. The implications for ABE programs are discussed. PMID:26279590
ERIC Educational Resources Information Center
Nikmah; Ardi, Muhammad; Yahya, Mohamad; Upa, Muhamad D. Pua; Dirawan, Gufran Darma
2017-01-01
The objective of the research is to describe the knowledge and attitudes of the community regarding basic sanitation management in Kupang City. This is survey research using a quantitative approach. Data were collected using an instrument consisting of a test of basic sanitation management knowledge and an attitude questionnaire. The data was then…
ERIC Educational Resources Information Center
Gülpinar, Mehmet Ali; Isoglu-Alkaç, Ümmühan; Yegen, Berrak Çaglayan
2015-01-01
Recently, integrated and contextual learning models such as problem-based learning (PBL) and brain/mind learning (BML) have become prominent. The present study aimed to develop and evaluate a PBL program enriched with BML principles. In this study, participants were 295 first-year medical students. The study used both quantitative and qualitative…
Reading and Readability Affect on E-Learning Success in a Fortune 100 Company: A Correlational Study
ERIC Educational Resources Information Center
Finnegan, Denis Michael Thomas
2010-01-01
The purpose of this quantitative correlational study was to examine the relationship between employees' reading skills, E-learning readability, student learning, and student satisfaction. The Tests of Adult Basic Education (TABE) form 10 Level A instrument evaluated student-reading skills. The Flesch-Kincaid Grade Level Index course assessed…
Utilization and acceptance of virtual patients in veterinary basic sciences - the vetVIP-project.
Kleinsorgen, Christin; Kankofer, Marta; Gradzki, Zbigniew; Mandoki, Mira; Bartha, Tibor; von Köckritz-Blickwede, Maren; Naim, Hassan Y; Beyerbach, Martin; Tipold, Andrea; Ehlers, Jan P
2017-01-01
Context: In medical and veterinary medical education the use of problem-based and case-based learning has steadily increased over time. At veterinary faculties, this development has mainly been evident in the clinical phase of the veterinary education. Therefore, a consortium of teachers of biochemistry and physiology together with technical and didactical experts launched the EU-funded project "vetVIP", to create and implement veterinary virtual patients and problems for basic science instruction. In this study the implementation and utilization of virtual patients occurred at the veterinary faculties in Budapest, Hannover and Lublin. Methods: This report describes the investigation of the utilization and acceptance of students studying veterinary basic sciences using optional online learning material concurrently to regular biochemistry and physiology didactic instruction. The reaction of students towards this offer of clinical case-based learning in basic sciences was analysed using quantitative and qualitative data. Quantitative data were collected automatically within the chosen software system CASUS as user log files. Responses regarding the quality of the virtual patients were obtained using an online questionnaire. Furthermore, subjective evaluation by authors was performed using a focus group discussion and an online questionnaire. Results: Implementation as well as usage and acceptance varied between the three participating locations. High approval was documented in Hannover and Lublin based upon the high proportion of voluntary students (>70%) using optional virtual patients. However, in Budapest the participation rate was below 1%. Usage patterns suggest that students prefer virtual patients and problems created in their native language and developed at their own university. In addition, the statement that assessment drives learning was supported by the observation that utilization peaked just prior to summative examinations. 
Conclusion: Veterinary virtual patients in basic sciences can be introduced and used for the presentation of integrative clinical case scenarios. Student post-course comments also supported the conclusion that overall the virtual cases increased their motivation for learning veterinary basic sciences.
Optical Basicity and Nepheline Crystallization in High Alumina Glasses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodriguez, Carmen P.; McCloy, John S.; Schweiger, M. J.
2011-02-25
The purpose of this study was to find compositions that increase waste loading of high-alumina wastes beyond what is currently acceptable while avoiding crystallization of nepheline (NaAlSiO4) on slow cooling. Nepheline crystallization has been shown to have a large impact on the chemical durability of high-level waste glasses. It was hypothesized that there would be some composition regions where high-alumina would not result in nepheline crystal production, compositions not currently allowed by the nepheline discriminator. Optical basicity (OB) and the nepheline discriminator (ND) are two ways of describing a given complex glass composition. This report presents the theoretical and experimental basis for these models. They are being studied together in a quadrant system as metrics to explore nepheline crystallization and chemical durability as a function of waste glass composition. These metrics were calculated for glasses with existing data and also for theoretical glasses to explore nepheline formation in Quadrant IV (passes OB metric but fails ND metric), where glasses are presumed to have good chemical durability. Several of these compositions were chosen, and glasses were made to fill poorly represented regions in Quadrant IV. To evaluate nepheline formation and chemical durability of these glasses, quantitative X-ray diffraction (XRD) analysis and the Product Consistency Test were conducted. A large amount of quantitative XRD data is collected here, both from new glasses and from glasses of previous studies that had not previously performed quantitative XRD on the phase assemblage. Appendix A critically discusses a large dataset to be considered for future quantitative studies on nepheline formation in glass. Appendix B provides a theoretical justification for choice of the oxide coefficients used to compute the OB criterion for nepheline formation.
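Optical basicity for a multi-component glass is conventionally computed as an oxygen-equivalent-fraction weighted sum of per-oxide basicity moderating values. A sketch using illustrative Duffy-type Λ values and a hypothetical composition (neither taken from the report):

```python
# Illustrative Duffy-type basicity moderating values (Lambda) and oxygen counts
# per formula unit; these numbers are assumptions for the sketch, not values
# from the report's Appendix B.
OXIDES = {
    #         Lambda  oxygens per formula unit
    "SiO2":  (0.48, 2),
    "Al2O3": (0.60, 3),
    "Na2O":  (1.15, 1),
    "CaO":   (1.00, 1),
}

def optical_basicity(mole_fractions):
    """Lambda_glass = sum over oxides of (oxygen-equivalent fraction) * Lambda."""
    oxygen = {ox: frac * OXIDES[ox][1] for ox, frac in mole_fractions.items()}
    total = sum(oxygen.values())
    return sum(oxygen[ox] / total * OXIDES[ox][0] for ox in mole_fractions)

glass = {"SiO2": 0.55, "Al2O3": 0.15, "Na2O": 0.20, "CaO": 0.10}  # hypothetical
ob = optical_basicity(glass)
```

The glass-level value necessarily falls between the smallest and largest per-oxide Λ, so alkali-rich compositions shift OB upward while silica-rich ones pull it down.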
Using Alien Coins to Test Whether Simple Inference Is Bayesian
ERIC Educational Resources Information Center
Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.
2016-01-01
Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…
An Investigation of Basic Design Capacity Performance in Different Background Students
ERIC Educational Resources Information Center
Cheng, Chu-Yu; Ou, Yang-Kun
2017-01-01
The technological and vocational higher education system in Taiwan is offering an undergraduate degree for design-based vocational high school students and general high school students whose qualitative and quantitative abilities are evaluated through a student selection examination. This study focused on the conceptual understandings of 64…
ERIC Educational Resources Information Center
Norris, John M.
2015-01-01
Traditions of statistical significance testing in second language (L2) quantitative research are strongly entrenched in how researchers design studies, select analyses, and interpret results. However, statistical significance tests using "p" values are commonly misinterpreted by researchers, reviewers, readers, and others, leading to…
Verb Errors of Bilingual and Monolingual Basic Writers
ERIC Educational Resources Information Center
Griswold, Olga
2017-01-01
This study analyzed the grammatical control of verbs exercised by 145 monolingual English and Generation 1.5 bilingual developmental writers in narrative essays using quantitative and qualitative methods. Generation 1.5 students made more errors than their monolingual peers in each category investigated, albeit in only 2 categories was the…
34 CFR 668.146 - Criteria for approving tests.
Code of Federal Regulations, 2010 CFR
2010-07-01
... approved under this subpart, a test shall— (1) Assess secondary school level basic verbal and quantitative... verbal and quantitative skills with sufficient numbers of questions to— (i) Adequately represent each... the American Educational Research Association, the American Psychological Association, and the...
ERIC Educational Resources Information Center
Deever, Walter Thomas
2012-01-01
More than half of adults in the USA have quantitative literacy ratings at or below a basic level. This lack of literacy often becomes a barrier to employability. To overcome this barrier, adults are returning to college to improve their quantitative skills and complete an undergraduate education, often through an accelerated degree program. A…
Mixed-methods research in pharmacy practice: basics and beyond (part 1).
Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle
2013-10-01
This is the first of two papers which explore the use of mixed-methods research in pharmacy practice. In an era of evidence-based medicine and policy, high-quality research evidence is essential for the development of effective pharmacist-led services. Over the past decade, the use of mixed-methods research has become increasingly common in healthcare, although to date its use has been relatively limited in pharmacy practice research. In this article, the basic concepts of mixed-methods research including its definition, typologies and advantages in relation to pharmacy practice research are discussed. Mixed-methods research brings together qualitative and quantitative methodologies within a single study to answer or understand a research problem. There are a number of mixed-methods designs available, but the selection of an appropriate design must always be dictated by the research question. Importantly, mixed-methods research should not be seen as a 'tool' to collect qualitative and quantitative data; rather, there should be some degree of 'integration' between the two data sets. If conducted appropriately, mixed-methods research has the potential to generate quality research evidence by combining strengths and overcoming the respective limitations of qualitative and quantitative methodologies. © 2012 Royal Pharmaceutical Society.
Fu, Rongwei; Gartlehner, Gerald; Grant, Mark; Shamliyan, Tatyana; Sedrakyan, Art; Wilt, Timothy J; Griffith, Lauren; Oremus, Mark; Raina, Parminder; Ismaila, Afisi; Santaguida, Pasqualina; Lau, Joseph; Trikalinos, Thomas A
2011-11-01
This article establishes recommendations for conducting quantitative synthesis, or meta-analysis, using study-level data in comparative effectiveness reviews (CERs) for the Evidence-based Practice Center (EPC) program of the Agency for Healthcare Research and Quality. We focused on recurrent issues in the EPC program; the recommendations were developed through group discussion and consensus based on current knowledge in the literature. We first discussed considerations for deciding whether to combine studies, followed by discussions on indirect comparison and incorporation of indirect evidence. Then, we described our recommendations on choosing effect measures and statistical models, giving special attention to combining studies with rare events, and on testing and exploring heterogeneity. Finally, we briefly presented recommendations on combining studies of mixed design and on sensitivity analysis. Quantitative synthesis should be conducted in a transparent and consistent way. Inclusion of multiple alternative interventions in CERs increases the complexity of quantitative synthesis, whereas the basic issues in quantitative synthesis remain crucial considerations for a CER. We will cover more issues in future versions and update and improve recommendations with the accumulation of new research to advance the goal of transparency and consistency. Copyright © 2011 Elsevier Inc. All rights reserved.
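As one concrete instance of the statistical-model choices discussed above, random-effects pooling in the DerSimonian-Laird style can be sketched as follows; the study-level estimates and variances are hypothetical:

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooling (DerSimonian-Laird) from study-level data."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]          # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, estimates)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical log odds ratios and within-study variances from four studies
pooled, se, tau2 = dersimonian_laird([0.32, 0.18, 0.45, 0.05],
                                     [0.04, 0.09, 0.06, 0.12])
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, the between-study variance is truncated at zero and the estimate reduces to the fixed-effect result, which is why testing and exploring heterogeneity is treated as a distinct step.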
ERIC Educational Resources Information Center
Sevim, Oguzhan
2014-01-01
The aim of this study is to determine the effects of the drama method on the speaking anxieties of pre-service teachers and their opinions about the method. The study used a mixed-methods design combining a quantitative experimental design with basic qualitative research. The study was carried out with 77 first grade students from day-time and evening…
Code of Federal Regulations, 2010 CFR
2010-07-01
... procedure or instrument measures both basic verbal and quantitative skills at the secondary school level. (2... verbal and quantitative skills at the secondary school level; and (4) The passing scores and the methods...
Perceived mood, health, and burden in female Mexican American family cancer caregivers.
Wells, Jo Nell; Cagle, Carolyn Spence; Marshall, David; Hollen, Mary Luna
2009-07-01
Female family caregivers of various global cultures provide basic care in health, social, emotional, and financial domains for family members with cancer and may sacrifice their own health to do so. To learn about role-related mood, health status self-perceptions, and burden of one cultural group, we used qualitative and quantitative approaches to study 34 Mexican American (MA) women who provided care for an ill family member with cancer. We report quantitative data on study variables and make comparisons with caregiver qualitative reports. Implications for health planning, service delivery, and future research with underserved, minority female caregivers are presented.
Kinetic Studies and Product Characterization during the Basic Hydrolysis of Glyceryl Nitrate Esters
1979-08-01
elaborate instrumentation, which was considered too time consuming and costly to construct for this study. The quantitative estimation of the calcium… [Figure 21: Formation of NO3⁻ during hydrolysis of 1-MNG in aqueous Ca(OH)2 solution at 25°C, with a fitted curve of the form 1 − exp(−kt) (see Tables 2 and 8).]
ERIC Educational Resources Information Center
Kumedzro, Felix Kwame; Otube, Nelly; Wamunyi, Chomba; Runo, Mary
2016-01-01
The study aimed at establishing the relationship between the leadership style of head teachers and the retention of special education teachers in Southern Ghana. The study was purely quantitative and utilized a descriptive correlational design, which allowed the researcher to establish the strength and direction of the relationship between the independent variable…
A Laboratory Activity on the Eddy Current Brake
ERIC Educational Resources Information Center
Molina-Bolivar, J. A.; Abella-Palacios, A. J.
2012-01-01
The aim of this paper is to introduce a simple and low-cost experimental setup that can be used to study the eddy current brake, which considers the motion of a sliding magnet on an inclined conducting plane in terms of basic physical principles. We present a set of quantitative experiments performed to study the influence of the geometrical and…
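The setup described above is commonly modeled with a braking force linear in velocity, F = −bv, which yields a terminal speed on the incline. A minimal sketch of that model (the mass, angle, and drag coefficient below are hypothetical values for illustration, not measurements from the paper):

```python
import math

def terminal_velocity(m, theta_deg, b, g=9.81):
    """Terminal speed of a magnet sliding down an inclined conducting
    plane, assuming a linear eddy-current drag force F = -b*v:
    at terminal speed, m*g*sin(theta) = b*v_t."""
    return m * g * math.sin(math.radians(theta_deg)) / b

def speed(t, m, theta_deg, b, g=9.81):
    """Speed at time t starting from rest: v(t) = v_t * (1 - exp(-b*t/m))."""
    vt = terminal_velocity(m, theta_deg, b, g)
    return vt * (1.0 - math.exp(-b * t / m))

# Hypothetical parameters: 100 g magnet, 30 degree incline, b = 0.05 kg/s
vt = terminal_velocity(0.1, 30, 0.05)
```

Fitting the measured velocity-time curve to this exponential form is one way such a laboratory activity can extract the drag coefficient b.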
Remote sensing programs and courses in engineering and water resources
NASA Technical Reports Server (NTRS)
Kiefer, R. W.
1981-01-01
The content of typical basic and advanced remote sensing and image interpretation courses are described and typical remote sensing graduate programs of study in civil engineering and in interdisciplinary environmental remote sensing and water resources management programs are outlined. Ideally, graduate programs with an emphasis on remote sensing and image interpretation should be built around a core of five courses: (1) a basic course in fundamentals of remote sensing upon which the more specialized advanced remote sensing courses can build; (2) a course dealing with visual image interpretation; (3) a course dealing with quantitative (computer-based) image interpretation; (4) a basic photogrammetry course; and (5) a basic surveying course. These five courses comprise up to one-half of the course work required for the M.S. degree. The nature of other course work and thesis requirements vary greatly, depending on the department in which the degree is being awarded.
Genetic and Environmental Influences on Behavior: Capturing All the Interplay
ERIC Educational Resources Information Center
Johnson, Wendy
2007-01-01
Basic quantitative genetic models of human behavioral variation have made clear that individual differences in behavior cannot be understood without acknowledging the importance of genetic influences. Yet these basic models estimate average, population-level genetic and environmental influences, obscuring differences that might exist within the…
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2014 CFR
2014-07-01
.... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...
34 CFR 668.142 - Special definitions.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... General learned abilities: Cognitive operations, such as deductive reasoning, reading comprehension, or translation from graphic to numerical representation, that may be learned in both school and non-school...,” “curricula,” or “basic verbal and quantitative skills,” the basic knowledge or skills generally learned in...
Technology Integration and the Effect on Mathematics Fact Fluency in the Middle East
ERIC Educational Resources Information Center
Letwinsky, Karim Medico; Berry, Michael David
2017-01-01
This quantitative, quasi-experimental study investigated the effect of the Mathletics.com technology on basic multiplication fact fluency in fourth grade students in the Middle East. The treatment group received three weeks of scheduled time using Mathletics.com, while the control group practiced multiplication facts using only traditional…
DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.
ERIC Educational Resources Information Center
KUSEWITT, J.B.
The purpose of this study was to develop a method for determining objective measures of trainer aircraft effectiveness to evaluate program alternatives for training pilots for fleet fighter and attack-type aircraft. The training syllabus was based on average student ability. The basic problem was to establish quantitative time-difficulty…
Perceptions of Students towards ICT Competencies at the University
ERIC Educational Resources Information Center
Torres-Gastelú, Carlos Arturo; Kiss, Gábor
2016-01-01
The purpose of this study is to identify the perceptions of university students towards their ICT Competencies from two universities, one in Mexico and the other in Hungary. The research type is quantitative and exploratory. The instrument consists of 14 questions related to three types of competencies: Basic, Application and Ethical. The sample…
ERIC Educational Resources Information Center
Fisher, Ronald J.; Andrews, John J.
1976-01-01
A co-educational living-learning center for the arts was studied through participant observation and quantitative assessment. The results document the importance of full self-selection into a membership group and demonstrate the relationships between reference group identification, basic interests in personality, and social behavior. (Author)
Lexical Properties of Slovene Sign Language: A Corpus-Based Study
ERIC Educational Resources Information Center
Vintar, Špela
2015-01-01
Slovene Sign Language (SZJ) has as yet received little attention from linguists. This article presents some basic facts about SZJ, its history, current status, and a description of the Slovene Sign Language Corpus and Pilot Grammar (SIGNOR) project, which compiled and annotated a representative corpus of SZJ. Finally, selected quantitative data…
A Comparative Analysis of Selected Mechanical Aspects of the Ice Skating Stride.
ERIC Educational Resources Information Center
Marino, G. Wayne
This study quantitatively analyzes selected aspects of the skating strides of above-average and below-average ability skaters. Subproblems were to determine how stride length and stride rate are affected by changes in skating velocity, to ascertain whether the basic assumption that stride length accurately approximates horizontal movement of the…
Comparing Levels of School Performance to Science Teachers' Reports on Knowledge
ERIC Educational Resources Information Center
Kerr, Rebecca
2013-01-01
The purpose of this descriptive quantitative and basic qualitative study was to examine fifth and eighth grade science teachers' responses, perceptions of the role of technology in the classroom, and how they felt that computer applications, tools, and the Internet influence student understanding. The purposeful sample included survey and…
ERIC Educational Resources Information Center
Thomson, Jennifer Barbara
2010-01-01
Student performance in basic math and reading skills in the United States trails behind other developed countries, providing the rationale for more research to determine how performance might be improved. Following evidence to conclude that multilingualism enhances cognitive, neuro-linguistic and meta-linguistic development, it is proposed that…
Using Linguistics in the Teaching of Developmental and Remedial Algebra.
ERIC Educational Resources Information Center
Lesnak, Richard J.
Basic algebra at Robert Morris College (RMC) in Pittsburgh, Pennsylvania, is a remedial course for students with virtually no algebra background, and for students whose previous experiences with algebra have created math blocks and math anxiety. A study was conducted in an effort to measure quantitatively the benefits of using linguistic methods…
34 CFR 668.146 - Criteria for approving tests.
Code of Federal Regulations, 2011 CFR
2011-07-01
... basic verbal and quantitative skills and general learned abilities; (2) Sample the major content domains of secondary school level verbal and quantitative skills with sufficient numbers of questions to— (i... Educational and Psychological Testing, prepared by a joint committee of the American Educational Research...
34 CFR 668.146 - Criteria for approving tests.
Code of Federal Regulations, 2013 CFR
2013-07-01
... basic verbal and quantitative skills and general learned abilities; (2) Sample the major content domains of secondary school level verbal and quantitative skills with sufficient numbers of questions to— (i... Educational and Psychological Testing, prepared by a joint committee of the American Educational Research...
34 CFR 668.146 - Criteria for approving tests.
Code of Federal Regulations, 2014 CFR
2014-07-01
... basic verbal and quantitative skills and general learned abilities; (2) Sample the major content domains of secondary school level verbal and quantitative skills with sufficient numbers of questions to— (i... Educational and Psychological Testing, prepared by a joint committee of the American Educational Research...
34 CFR 668.146 - Criteria for approving tests.
Code of Federal Regulations, 2012 CFR
2012-07-01
... basic verbal and quantitative skills and general learned abilities; (2) Sample the major content domains of secondary school level verbal and quantitative skills with sufficient numbers of questions to— (i... Educational and Psychological Testing, prepared by a joint committee of the American Educational Research...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1990-09-01
This report summarizes goals and accomplishments of the research program supported under DOE Grant No. FG02-86ER60418, entitled Instrumentation and Quantitative Methods of Evaluation, with R. Beck, P.I., and M. Cooper, Co-P.I., during the period January 15, 1990 through September 1, 1990. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 7 figs.
McIlvane, William J; Kledaras, Joanne B; Gerard, Christophe J; Wilde, Lorin; Smelson, David
2018-07-01
A few noteworthy exceptions notwithstanding, quantitative analyses of relational learning are most often simple descriptive measures of study outcomes. For example, studies of stimulus equivalence have made much progress using measures such as percentage consistent with equivalence relations, discrimination ratio, and response latency. Although procedures may have ad hoc variations, they remain fairly similar across studies. Comparison studies of training variables that lead to different outcomes are few. Yet to be developed are tools designed specifically for dynamic and/or parametric analyses of relational learning processes. This paper focuses on recent studies to develop (1) quality computer-based programmed instruction for supporting relational learning in children with autism spectrum disorders and intellectual disabilities and (2) formal algorithms that permit ongoing, dynamic assessment of learner performance and procedure changes to optimize instructional efficacy and efficiency. Because these algorithms have a strong basis in evidence and in theories of stimulus control, they may also have utility for basic and translational research. We present an overview of the research program, details of algorithm features, and summary results that illustrate their possible benefits. We also argue that such algorithm development may encourage parametric research, help integrate new research findings, and support in-depth quantitative analyses of stimulus control processes in relational learning. Such algorithms may also serve to model control of basic behavioral processes that is important to the design of effective programmed instruction for human learners with and without functional disabilities. Copyright © 2018 Elsevier B.V. All rights reserved.
Crovelli, R.A.
1988-01-01
The geologic appraisal model that is selected for a petroleum resource assessment depends upon the purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study area or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. © 1988 International Association for Mathematical Geology.
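The Monte Carlo play-analysis approach that the abstract contrasts with analytic methods can be sketched as follows; every distribution and parameter below is a hypothetical illustration, not a USGS value:

```python
import random

def simulate_play(n_trials=100_000, p_success=0.3,
                  field_count_range=(1, 10),
                  mean_log_size=2.0, sd_log_size=1.0, seed=42):
    """Monte Carlo play analysis sketch: each trial draws whether the
    play is productive, then a number of fields, then lognormal field
    sizes; returns the mean aggregate resource over all trials."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        if rng.random() > p_success:
            continue  # dry play contributes zero
        n_fields = rng.randint(*field_count_range)
        total += sum(rng.lognormvariate(mean_log_size, sd_log_size)
                     for _ in range(n_fields))
    return total / n_trials

mean_resource = simulate_play()
```

The analytic alternative the abstract describes computes the same expectation in closed form from the play probability, expected field count, and lognormal moments, avoiding the sampling noise and run time of simulation.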
Quantitative interpretations of Visible-NIR reflectance spectra of blood.
Serebrennikova, Yulia M; Smith, Jennifer M; Huffman, Debra E; Leparc, German F; García-Rubio, Luis H
2008-10-27
This paper illustrates the implementation of a new theoretical model for rapid quantitative analysis of the Vis-NIR diffuse reflectance spectra of blood cultures. This new model is based on the photon diffusion theory and Mie scattering theory that have been formulated to account for multiple scattering populations and absorptive components. This study stresses the significance of the thorough solution of the scattering and absorption problem in order to accurately resolve for optically relevant parameters of blood culture components. With advantages of being calibration-free and computationally fast, the new model has two basic requirements. First, wavelength-dependent refractive indices of the basic chemical constituents of blood culture components are needed. Second, multi-wavelength measurements or at least the measurements of characteristic wavelengths equal to the degrees of freedom, i.e. number of optically relevant parameters, of blood culture system are required. The blood culture analysis model was tested with a large number of diffuse reflectance spectra of blood culture samples characterized by an extensive range of the relevant parameters.
ERIC Educational Resources Information Center
Bachore, Mebratu Mulatu
2016-01-01
The main objective of the study was to assess the perception of teachers and learners on the nature of practice, the type and the causes of academic cheating (dishonesty) in Hawassa University. The study was basically a survey which employed both qualitative and quantitative approaches to gather data. The subjects were 20 instructors and 60…
ERIC Educational Resources Information Center
Aghababaeian, Parinaz; Moghaddam, Shams Aldin Hashemi; Nateghi, Faezeh; Faghihi, Alireza
2017-01-01
This study investigated the changes in public school social studies textbooks for the general period of Iran (fourth and fifth grades) based on the emphasis on Facione's critical thinking skills over the past three decades. In this study, content analysis using both qualitative and quantitative methods was used to evaluate changes in the textbooks. For this purpose,…
Selecting the most appropriate inferential statistical test for your quantitative research study.
Bettany-Saltikov, Josette; Whittaker, Victoria Jane
2014-06-01
To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.
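The flow-chart style of test selection the paper describes can be sketched as a small decision aid; the mapping below is a common textbook heuristic based on outcome type, number of groups, pairing, and normality, not the authors' exact chart:

```python
def choose_test(outcome, groups, paired, normal):
    """Map basic design features to a commonly recommended inferential
    test. A simplified decision aid for illustration only, not a
    substitute for statistical advice."""
    if outcome == "categorical":
        return "McNemar's test" if paired else "Chi-squared test"
    # Continuous outcome from here on.
    if groups == 2:
        if paired:
            return "Paired t-test" if normal else "Wilcoxon signed-rank test"
        return "Independent t-test" if normal else "Mann-Whitney U test"
    # Three or more groups.
    if paired:
        return "Repeated-measures ANOVA" if normal else "Friedman test"
    return "One-way ANOVA" if normal else "Kruskal-Wallis test"

# e.g. two independent groups, continuous, non-normal data:
test = choose_test("continuous", 2, False, False)
```

In practice the normality judgment itself requires care (inspection of distributions, sample size), which is precisely the kind of consideration the paper's flow charts are meant to support.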
Calabrese, Edward J
2013-11-01
The most common quantitative feature of the hormetic-biphasic dose response is its modest stimulatory response, which at maximum is only 30-60% greater than control values, an observation that is consistently independent of biological model, level of organization (i.e., cell, organ, or individual), endpoint measured, chemical/physical agent studied, or mechanism. This quantitative feature suggests an underlying "upstream" mechanism common across biological systems, and therefore basic and general. Hormetic dose response relationships represent an estimate of the peak performance of integrative biological processes that are allometrically based. Hormetic responses reflect either direct stimulatory responses or overcompensation responses to damage induced by relatively low doses of chemical or physical agents. The integration of the hormetic dose response within an allometric framework provides, for the first time, an explanation for both the generality and the quantitative features of the hormetic dose response. Copyright © 2013 Elsevier Ltd. All rights reserved.
Quantitative Modeling of Earth Surface Processes
NASA Astrophysics Data System (ADS)
Pelletier, Jon D.
This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
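As one example of the kind of core equation and solution technique such a textbook emphasizes, linear hillslope diffusion, dz/dt = kappa * d2z/dx2, can be solved with an explicit finite-difference scheme. A minimal sketch with illustrative parameter values (not taken from the book):

```python
def diffuse_hillslope(z, dx=1.0, dt=0.1, kappa=0.01, steps=1000):
    """Explicit finite-difference solution of linear hillslope
    diffusion dz/dt = kappa * d2z/dx2 on a 1-D profile with fixed
    endpoints. Stable when kappa*dt/dx**2 <= 0.5."""
    z = list(z)
    r = kappa * dt / dx ** 2
    for _ in range(steps):
        new = z[:]  # endpoints stay fixed (boundary condition)
        for i in range(1, len(z) - 1):
            new[i] = z[i] + r * (z[i + 1] - 2 * z[i] + z[i - 1])
        z = new
    return z

# A sharp ridge gradually smooths toward the fixed boundaries:
profile = diffuse_hillslope([0, 0, 1.0, 0, 0], steps=100)
```

More sophisticated models in this area replace the linear flux law with nonlinear, slope-dependent transport, but the numerical machinery is the same.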
A Method for Label-Free, Differential Top-Down Proteomics.
Ntai, Ioanna; Toby, Timothy K; LeDuc, Richard D; Kelleher, Neil L
2016-01-01
Biomarker discovery in translational research has relied heavily on labeled and label-free quantitative bottom-up proteomics. Here, we describe a new approach to biomarker studies that utilizes high-throughput top-down proteomics and is the first to offer whole-protein characterization and relative quantitation within the same experiment. Using yeast as a model, we report procedures for a label-free approach to quantify the relative abundance of intact proteins ranging from 0 to 30 kDa in two different states. In this chapter, we describe the integrated methodology for the large-scale profiling and quantitation of the intact proteome by liquid chromatography-mass spectrometry (LC-MS) without the need for metabolic or chemical labeling. This recent advance for quantitative top-down proteomics is best implemented with a robust and highly controlled sample preparation workflow before data acquisition on a high-resolution mass spectrometer, and the application of a hierarchical linear statistical model to account for the multiple levels of variance contained in quantitative proteomic comparisons of samples for basic and clinical research.
ERIC Educational Resources Information Center
Wong, Arch Chee Keen
2015-01-01
This study examines the lived experiences of students as expressed in their reflections on their experiences of learning at Ambrose University in Calgary. It uses quantitative outcomes-related data from the National Survey of Student Engagement and the Theological School Survey of Student Engagement to illuminate qualitative data obtained through…
ERIC Educational Resources Information Center
Dunlop, David Livingston
The purpose of this study was to use an information theoretic memory model to quantitatively investigate classification sorting and recall behaviors of various groups of students. The model provided theorems for the determination of information theoretic measures from which inferences concerning mental processing were made. The basic procedure…
Student Perceptions of Learning in a Web-Based Tutorial.
ERIC Educational Resources Information Center
Brescia, William; McAuley, Sean
This case study used both quantitative and qualitative methods to investigate students' perceptions of learning using a Web-based tutorial. Students participated in a Web-based tutorial to learn basic HTML as part of a graduate-level Web design course. Four of five students agreed to participate in the survey and interviews. After completing the…
Is the Class Schedule the Only Difference between Morning and Afternoon Shift Schools in Mexico?
ERIC Educational Resources Information Center
Cardenas Denham, Sergio
2009-01-01
Double-shift schooling has been implemented in Mexico for several decades as a strategy to achieve universal access to basic education. This study provides evidence on the existence of social inequalities related to the implementation of this schooling model. Using quantitative data from several databases including the National Census, the…
The Influence of Culture on Strategic Decision Making in Japan and China
2011-09-01
literature, with classic studies arguing that military dictatorships are actually …prone than non-dictatorships. Recent research, however, contends this hypothesis, with quantitative analysis showing that states with strong civilian… dictatorship naturally creates conditions where the military position and solutions have precedence over civilian leadership. By that basic definition, any
The Teaching of the Code of Ethics and Standard Practices for Texas Educator Preparation Programs
ERIC Educational Resources Information Center
Davenport, Marvin; Thompson, J. Ray; Templeton, Nathan R.
2015-01-01
The purpose of this descriptive quantitative research study was to answer three basic informational questions: (1) To what extent ethics training, as stipulated in Texas Administrative Code Chapter 247, was included in the EPP curriculum; (2) To what extent Texas public universities with approved EPP programs provided faculty opportunities for…
An Overview of Lewis Basicity and Affinity Scales
ERIC Educational Resources Information Center
Laurence, Christian; Graton, Jerome; Gal, Jean-Francois
2011-01-01
The impossibility of establishing a universal scale of Lewis basicity does not prevent the determination of the quantitative behavior of Lewis bases, thanks to scales constructed against particular Lewis acids: BF₃, 4-FC₆H₄OH, I₂, Li⁺, Na⁺, K⁺, Al…
Basic Human Needs; A Framework for Action.
ERIC Educational Resources Information Center
McHale, John; McHale, Magda Cordell
The report presents quantitative assessments of basic human needs in the areas of food, health, education, shelter, and clothing and considers how these needs may be met in ways harmonious with environmental and developmental objectives. The target group consists of those who are below or just below the poverty line. The book is presented in six…
Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?
Happ, Mary Beth
2010-01-01
This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973
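The data-transformation step discussed above, "quantitizing" qualitative codes into counts and proportions of respondents, can be sketched in a few lines; the respondents and codes below are hypothetical:

```python
def quantitize(coded_interviews):
    """Convert per-respondent qualitative codes into (count, proportion)
    pairs -- a basic quantitizing step for mixed-methods integration.
    Each respondent is counted at most once per code."""
    counts = {}
    for codes in coded_interviews.values():
        for code in set(codes):
            counts[code] = counts.get(code, 0) + 1
    n = len(coded_interviews)
    return {code: (c, c / n) for code, c in counts.items()}

# Hypothetical coded interview data:
interviews = {
    "R1": ["stigma", "cost"],
    "R2": ["cost"],
    "R3": ["stigma", "side effects", "cost"],
}
summary = quantitize(interviews)
```

The resulting counts are the kind of verbal-count information that, as the paper notes, can then be combined with quantitative results or displayed visually.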
Understanding quantitative research: part 1.
Hoe, Juanita; Hoare, Zoë
This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.
Sawan, Mouna; Jeon, Yun-Hee; Chen, Timothy F
2018-03-01
Psychotropic medicines are commonly used in nursing homes, despite marginal clinical benefits and association with harm in the elderly. Organizational culture is proposed as a factor explaining the high-level use of psychotropic medicines. Schein describes three levels of culture: artifacts, espoused values, and basic assumptions. This integrative review aimed to investigate the facets and role of organizational culture in the use of psychotropic medicines in nursing homes. Five databases were searched for qualitative, quantitative, and mixed method empirical studies up to 13 February 2017. Articles were included if they examined an aspect of organizational culture according to Schein's theory and the use of psychotropic medicines in nursing homes for the management of behavioral and sleep disturbances in residents. Article screening and data extraction were performed independently by one reviewer and checked by the research team. The integrative review method, an approach similar to the method of constant comparison analysis was utilized for data analysis. Twenty-four studies met the inclusion criteria: 13 used quantitative methods, 9 used qualitative methods, 1 was quasi-qualitative, and 1 used mixed methods. Included studies were found to only address two aspects of organizational culture in relation to the use of psychotropic medicines: artifacts and espoused values. No studies addressed the basic assumptions, the unsaid taken-for-granted beliefs, which provide explanations for in/consistencies between the ideal use of psychotropic medicines and the actual use of psychotropic medicines. Previous studies suggest that organizational culture influences the use of psychotropic medicines in nursing homes; however, what is known is descriptive of culture only at the surface level, that is the artifacts and espoused values. Hence, future research that explains the impact of the basic assumptions of culture on the use of psychotropic medicines is important.
Using Statistics to Lie, Distort, and Abuse Data
ERIC Educational Resources Information Center
Bintz, William; Moore, Sara; Adams, Cheryll; Pierce, Rebecca
2009-01-01
Statistics is a branch of mathematics that involves organization, presentation, and interpretation of data, both quantitative and qualitative. Data do not lie, but people do. On the surface, quantitative data are basically inanimate objects, nothing more than lifeless and meaningless symbols that appear on a page, calculator, computer, or in one's…
ERIC Educational Resources Information Center
Atkinson, Maxine P.; Czaja, Ronald F.; Brewster, Zachary B.
2006-01-01
Sociologists can make meaningful contributions to quantitative literacy by teaching sociological research skills in sociology classes, including introductory courses. We report on the effectiveness of requiring a research module in a large introductory class. The module is designed to teach both basic research skills and to increase awareness of…
Quantiprot - a Python package for quantitative analysis of protein sequences.
Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold
2017-07-17
The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of this approach is that quantitative properties define a multidimensional solution space in which sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python that provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or from physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in a sequence, calculates the distribution of n-grams, and computes the Zipf's law coefficient. We propose three main fields of application for the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by a model can be compared with actually observed sequences.
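The abstract above names n-gram distributions and Zipf's-law coefficients among the quantitative characteristics. As a minimal illustration of the idea (generic code, not Quantiprot's actual API), the sketch below counts overlapping n-grams in a sequence and estimates a Zipf coefficient as the least-squares slope of log-frequency versus log-rank:

```python
from collections import Counter
import math

def ngram_counts(seq, n=2):
    """Count overlapping n-grams in a sequence."""
    return Counter(seq[i:i + n] for i in range(len(seq) - n + 1))

def zipf_slope(counts):
    """Least-squares slope of log(frequency) vs. log(rank), a simple
    estimate of the Zipf's-law coefficient (typically negative)."""
    freqs = sorted(counts.values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

seq = "MKVLAAGIVKVLAAGMKVLA"   # toy sequence, not real data
counts = ngram_counts(seq, 2)
print(counts.most_common(3))
print(round(zipf_slope(counts), 3))
```

Real packages add many more descriptors (recurrence, determinism, physico-chemical profiles); the point here is only that such features are computed directly from the sequence, with no alignment step.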
Quantitative approach for defining basic color terms and color category best exemplars.
Fider, Nicole; Narens, Louis; Jameson, Kimberly A; Komarova, Natalia L
2017-08-01
A new method is presented that identifies basic color terms (BCTs) from color-naming data. A function is defined that measures how well a term is understood by a communicating population. BCTs are then separated from other color terms by a threshold value applied to this function. A new mathematical algorithm is proposed and analyzed for determining the best exemplar associated with each BCT. Using data provided by the World Color Survey, comparisons are made between the paper's methods and those from other studies. These comparisons show that the paper's new definition of "basicness" mostly agrees with the typical definition found in the color categorization literature, which was originally due to Kay and colleagues. The new definition, unlike the typical one, has the advantage of not relying on syntactic or semantic features of languages or color lexicons. This permits the methodology developed to be generalizable and applied to other category domains for which a construct of "basicness" could have an important role.
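The paper's method scores how well each color term is understood by a communicating population and applies a threshold to that score to separate BCTs from non-basic terms. A toy sketch of the thresholding idea (the scoring function here is a simplified consensus measure, not the paper's actual definition, and the naming counts are hypothetical):

```python
from collections import defaultdict

def term_scores(naming):
    """naming: {chip: {term: count}}. Score each term by the mean fraction
    of respondents choosing it on chips where it is the modal choice."""
    scores = defaultdict(list)
    for chip, counts in naming.items():
        total = sum(counts.values())
        modal = max(counts, key=counts.get)
        scores[modal].append(counts[modal] / total)
    return {t: sum(v) / len(v) for t, v in scores.items()}

def basic_terms(naming, threshold=0.7):
    """Terms whose consensus score clears the threshold count as 'basic'."""
    return {t for t, s in term_scores(naming).items() if s >= threshold}

# Hypothetical naming counts for three color chips, ten respondents each
naming = {
    "chip1": {"red": 9, "pink": 1},
    "chip2": {"red": 8, "orange": 2},
    "chip3": {"green": 6, "teal": 4},
}
print(basic_terms(naming))  # "red" has high consensus; "green" does not
```

Because the criterion is computed from naming behavior alone, it needs no syntactic or semantic facts about the language, which is the generalizability the abstract emphasizes.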
Grass Grows, the Cow Eats: A Simple Grazing Systems Model with Emergent Properties
ERIC Educational Resources Information Center
Ungar, Eugene David; Seligman, Noam G.; Noy-Meir, Imanuel
2004-01-01
We describe a simple, yet intellectually challenging model of grazing systems that introduces basic concepts in ecology and systems analysis. The practical exercise is suitable for high-school and university curricula with a quantitative orientation and requires only basic skills in mathematics and spreadsheet use. The model is based on Noy-Meir's (1975)…
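A grazing model of this family balances plant growth against animal intake. A minimal spreadsheet-style numerical version, with logistic grass growth and saturating intake (the functional forms are the standard textbook choices and the parameter values are illustrative, not taken from the article), can be integrated with Euler steps:

```python
def simulate(V_init, H, r=0.8, K=500.0, cmax=30.0, Vk=100.0,
             dt=0.1, days=200):
    """Euler integration of dV/dt = r*V*(1 - V/K) - H*cmax*V/(V + Vk),
    where V is grass biomass and H is stocking density.
    Parameter values are illustrative only."""
    V = V_init
    for _ in range(int(days / dt)):
        growth = r * V * (1 - V / K)          # logistic grass growth
        intake = H * cmax * V / (V + Vk)      # saturating consumption
        V = max(V + dt * (growth - intake), 0.0)
    return V

# Light grazing settles at high biomass; heavy grazing can crash the sward,
# the kind of emergent behavior the title refers to.
print(round(simulate(250, H=1.0), 1))
print(round(simulate(250, H=8.0), 1))
```

Students can reproduce the same iteration in a spreadsheet, one row per time step, and explore how stocking density H moves the system between persistence and collapse.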
Quantitative Hyperspectral Reflectance Imaging
Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.
2008-01-01
Hyperspectral imaging is a non-destructive optical analysis technique that can, for instance, be used to obtain information from cultural heritage objects that is unavailable with conventional colour or multi-spectral photography. The technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation, and to study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction, and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). Because a wavelength-tunable narrow-bandwidth light source is used, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions of interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document where different types of ink have been applied or where one ink shows various degrees of degradation. The quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects on artificial samples and original documents exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831
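Extracting a mean spectral reflectance curve from a user-defined region of interest, as described above, reduces to averaging the spectra of the ROI pixels in the calibrated data cube. A minimal sketch (the array shapes are assumptions for illustration, not the instrument's actual data format):

```python
import numpy as np

def roi_mean_reflectance(cube, mask):
    """cube: (H, W, bands) calibrated reflectance values in [0, 1];
    mask: (H, W) boolean region of interest.
    Returns the mean spectrum and per-band standard deviation over the ROI."""
    pixels = cube[mask]               # (n_roi_pixels, bands)
    return pixels.mean(axis=0), pixels.std(axis=0)

# Toy 4x4-pixel cube with 70 bands, ROI in the center
cube = np.random.default_rng(0).random((4, 4, 70))
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
mean_spec, std_spec = roi_mean_reflectance(cube, mask)
print(mean_spec.shape)  # one mean reflectance value per band
```

Comparing such curves between regions (e.g., two inks, or aged versus fresh paper) is the quantitative counterpart of the visual comparison of spectral images.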
Quantitation of Indoleacetic Acid Conjugates in Bean Seeds by Direct Tissue Hydrolysis 1
Bialek, Krystyna; Cohen, Jerry D.
1989-01-01
Gas chromatography-selected ion monitoring-mass spectral analysis using [13C6]indole-3-acetic acid (IAA) as an internal standard provides an effective means for quantitation of IAA liberated during direct strong basic hydrolysis of bean (Phaseolus vulgaris L.) seed powder, provided that extra precautions are undertaken to exclude oxygen from the reaction vial. Direct seed powder hydrolysis revealed that the major portion of amide IAA conjugates in bean seeds is not extractable by aqueous acetone, the solvent used commonly for IAA conjugate extraction from seeds and other plant tissues. Strong basic hydrolysis of plant tissue can be used to provide new information on IAA content. PMID:16666783
Differential Prediction of Academic Achievement in Elementary and Junior High School by Sex.
ERIC Educational Resources Information Center
Lewis, J. C.
This study examined differences in predicting achievement by sex on the Iowa Tests of Basic Skills (ITBS) from the verbal, quantitative, and nonverbal scores on the Cognitive Abilities Test (CogAT). The sample (n=10,000) consisted of all students in Grades 2, 5, and 8 who completed both tests in fall 1984. Examinations of means and standard…
The Impact of a School-Based Cultural Awareness Program on Students' Ethnic Identity and Self-Esteem
ERIC Educational Resources Information Center
Braswell, Charley Alexandria
2011-01-01
The purpose of this quantitative study was to examine the influences of a school-based cultural awareness program on ethnic identity and self-esteem in fifth grade early adolescents. The development and implementation of a school-based cultural awareness program was intended to offer students a basic foundation for the development and/or…
ERIC Educational Resources Information Center
de la Fuente, Maria J.
2006-01-01
Framed under a cognitive approach to task-based L2 learning, this study used a pedagogical approach to investigate the effects of three vocabulary lessons (one traditional and two task-based) on acquisition of basic meanings, forms and morphological aspects of Spanish words. Quantitative analysis performed on the data suggests that the type of…
ERIC Educational Resources Information Center
Efande, Lyonga John
2015-01-01
This study investigates the relationship between the expansion of secondary Technical Education and the acquisition of technical skills by students. Technical Vocational Education and Training (TVET) has been expanding quantitatively each year without enough attention being paid to the adverse effects on quality and the acquisition of the…
Callings, Work Role Fit, Psychological Meaningfulness and Work Engagement among Teachers in Zambia
ERIC Educational Resources Information Center
Rothmann, Sebastiaan; Hamukang'andu, Lukondo
2013-01-01
Our aim in this study was to investigate the relationships among a calling orientation, work role fit, psychological meaningfulness and work engagement of teachers in Zambia. A quantitative approach was followed and a cross-sectional survey was used. The sample (n = 150) included 75 basic and 75 secondary school teachers in the Choma district of…
ERIC Educational Resources Information Center
Pritchard, Jan Teena
2013-01-01
The most basic and fundamental skill for academic success is the ability to read. The purpose of this 1-group pretest and posttest pre-experimental quantitative study was to investigate how a unique instructional approach, called "curriculum in motion" with an emphasis on therapeutic martial arts and Brain Gym exercises influenced…
Diffraction enhanced x-ray imaging for quantitative phase contrast studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agrawal, A. K.; Singh, B., E-mail: balwants@rrcat.gov.in; Kashyap, Y. S.
2016-05-23
Conventional X-ray imaging based on absorption contrast permits limited visibility of features with small density and thickness variations. For imaging of weakly absorbing materials, or materials of similar densities, a phase contrast imaging technique called diffraction enhanced imaging has been designed and developed at the imaging beamline of Indus-2, RRCAT, Indore. The technique provides improved visibility of interfaces and shows high contrast in the image for small density or thickness gradients in the bulk. This paper presents the basic principle, instrumentation, and analysis methods of this technique. Initial results of quantitative phase retrieval carried out on various samples are also presented.
Statistical thermodynamics unveils the dissolution mechanism of cellobiose.
Nicol, Thomas W J; Isobe, Noriyuki; Clark, James H; Shimizu, Seishi
2017-08-30
In the study of the cellulose dissolution mechanism, opinion is still divided. Here, the solution interaction components of the most prominent hypotheses for the driving force of cellulose dissolution were evaluated quantitatively. Combining a rigorous statistical thermodynamic theory with cellobiose solubility data in the presence of chloride salts whose cations progress along the Hofmeister series (KCl, NaCl, LiCl and ZnCl2), we have shown that cellobiose solubilization is driven by the preferential accumulation of salts around the solutes, which is stronger than cellobiose hydration. Yet, contrary to the classical chaotropy hypothesis, increasing salt concentration leads to cellobiose dehydration in the presence of the strongest solubilizer, ZnCl2. However, thanks to cellobiose dehydration, the cellobiose-salt interaction still remains preferential despite weakening salt accumulation. Based on these insights, the previous hypotheses based on hydrophobicity and polymer charging have also been evaluated quantitatively. Our present study thus paves the way towards identifying, for the first time, the basic driving forces of cellulose solubilization in a quantitative manner. When combined with unit additivity methods, this quantitative information could lead to a full understanding of cellulose solubility.
NASA Technical Reports Server (NTRS)
Dum, C. T.
1990-01-01
Particle simulation experiments were used to study the basic physical ingredients needed for building a global model of foreshock wave phenomena. In particular, the generation of Langmuir waves by a gentle bump-on-tail electron distribution is analyzed. It is shown that, with appropriately designed simulation experiments, quasi-linear theory can be quantitatively verified for parameters corresponding to the electron foreshock.
[Access to primary healthcare services: still a way to go].
Mendes, Antônio da Cruz Gouveia; Miranda, Gabriella Morais Duarte; Figueiredo, Karla Erika Gouveia; Duarte, Petra Oliveira; Furtado, Betise Mery Alencar Sousa Macau
2012-11-01
This study evaluates accessibility to the Basic Units of the Family Health Strategy (ESF-UB) and Traditional Basic Units (UB-T) in the city of Recife in 2009. Data were collected through three instruments: a roadmap for systematic observation of the units and questionnaires for users and unit professionals. This is a descriptive cross-sectional study using a quantitative approach; 1,180 users, 61 doctors and 56 nurses were interviewed. The results showed good bonds with and recognition by users, for whom primary healthcare is the access portal to the health system. In the comparison between ESF-UB and UB-T, evaluations were always more favorable to the family healthcare strategy, though with relatively insignificant differences. The overall result revealed widespread dissatisfaction with the difficulty of obtaining drugs and taking tests, and also with the waiting times and access to specialized care. This points to organizational problems that may constitute barriers limiting users' accessibility to basic healthcare services.
QUANTITATIVE MAGNETIC RESONANCE IMAGING OF ARTICULAR CARTILAGE AND ITS CLINICAL APPLICATIONS
Li, Xiaojuan; Majumdar, Sharmila
2013-01-01
Cartilage is one of the most essential tissues for healthy joint function and is compromised in degenerative and traumatic joint diseases. There have been tremendous advances during the past decade using quantitative MRI techniques as a non-invasive tool for evaluating cartilage, with a focus on assessing cartilage degeneration during osteoarthritis (OA). In this review, after a brief overview of cartilage composition and degeneration, we discuss techniques that grade and quantify morphologic changes as well as the techniques that quantify changes in the extracellular matrix. The basic principles, in vivo applications, advantages and challenges for each technique are discussed. Recent studies using the OA Initiative (OAI) data are also summarized. Quantitative MRI provides non-invasive measures of cartilage degeneration at the earliest stages of joint degeneration, which is essential for efforts towards prevention and early intervention in OA. PMID:24115571
Wilkerson, L; Abelmann, W H
1993-03-01
The Harvard-MIT Program in Health Sciences and Technology (HST) is a flexible, preclinical curriculum, taught by members of the faculties of both Harvard University and the Massachusetts Institute of Technology, that stresses a rigorous, scientific, quantitative approach, small classes (usually fewer than 50 students), and student-faculty interaction. The program is aimed at students with strong backgrounds in quantitative and biological sciences who are interested in careers as physician-scientists. The first 234 students of the program, who graduated between 1975 and 1985, were asked to participate in a 1990 follow-up study by completing a four-page questionnaire and submitting curricula vitae and lists of publications, if available. Data were analyzed quantitatively and qualitatively. Of the 234 graduates, 211 (90%) responded. Sixty-three (30%) had received both MD and PhD degrees. The graduates were twice as likely to describe their primary professional roles as academic than as clinical practice; 94 held full-time faculty positions at 50 medical schools. The 154 (73%) in research spent an average of 51% of their time on this activity. According to the 179 graduates (85%) who stated that they would choose HST again, the most frequently mentioned reasons were the quantitative approach that emphasized integration of basic science and clinical practice (49%) and the small class size (37%). The HST MD curriculum, with its emphasis on basic science and research experience, has been successful in preparing carefully selected students for careers as physician-scientists, without necessarily requiring the completion of a PhD degree.
Strategies for Revising Judgment: How (and How Well) People Use Others' Opinions
ERIC Educational Resources Information Center
Soll, Jack B.; Larrick, Richard P.
2009-01-01
A basic issue in social influence is how best to change one's judgment in response to learning the opinions of others. This article examines the strategies that people use to revise their quantitative estimates on the basis of the estimates of another person. The authors note that people tend to use 2 basic strategies when revising estimates:…
NASA Astrophysics Data System (ADS)
Batyaev, V. F.; Belichenko, S. G.; Bestaev, R. R.
2016-04-01
This work is devoted to a quantitative comparison of different inorganic scintillators for use in neutron-radiation inspection systems. Such systems can be based on the tagged neutron (TN) method and have significant potential in applications such as detection of explosives, drugs and mines, identification of chemical warfare agents, and assay of nuclear materials and human body composition [1]-[3]. The elemental composition of an inspected object is determined via spectrometry of gamma rays from the object bombarded by neutrons, which are tagged by an alpha-detector built into the neutron generator. This creates the task of finding a quantitative indicator of the object identification quality (via elemental composition) as a function of the basic parameters of the γ-detectors, such as their efficiency and energy and time resolutions, which in turn are largely determined by the detector's scintillator. We address this task for a set of four scintillators often used in studies of the TN method, namely BGO, LaBr3, LYSO and NaI(Tl), whose basic parameters are well known [4]-[7].
The Basic Shelf Experience: a comprehensive evaluation.
Dewolfe, Judith A; Greaves, Gaye
2003-01-01
The Basic Shelf Experience is a program designed to assist people living on limited incomes to make better use of their food resources. The purpose of this research was to learn if the Basic Shelf Experience program helps such people to 1. utilize food resources more effectively and 2. cope, through group support, with poverty-associated stressors that influence food security. Both quantitative and qualitative methods were used to evaluate the program objectives. Participants completed a questionnaire at the beginning and end of the six-week program. The questionnaire asked about their food access, food security, and feelings about themselves. Participants returned for a focus group discussion and completed the questionnaire again three months after the program ended. The focus group was designed to elicit information about perceived changes, if any, attributed to the program. Forty-two people completed the questionnaires pre-program and 20 post-program; 17 participated in the three-month follow-up session. While results from quantitative data analysis indicate that program objectives were not met, qualitative data provide evidence that the program did achieve its stated objectives. Our results suggest such programs as the Basic Shelf Experience can assist people living on limited incomes to achieve food security.
Mauk, Kristen L; Li, Pei Ying; Jin, Huilu; Rogers, Julie; Scalzitti, Kristina
The purpose of this study was to present results of a pilot program to educate nurses in China about rehabilitation nursing. A single cohort, pre- and posttest design with an educational intervention. A 3-day basic rehabilitation nursing education program was conducted in Shanghai and Hangzhou by a certified rehabilitation nurse specialist from the United States. The effect of the educational intervention was measured using pre- and posttests for six topic areas. Data were analyzed using descriptive statistics, correlations, and paired samples t tests. Paired samples t tests showed a significant improvement (p < .01) as a result of the educational intervention on all three tests covering the six basic topics. The knowledge of the nurses on topics of basic rehabilitation nursing significantly increased as a result of the educational program. Rehabilitation nurses interested in international travel and developing professional relationships with nurses in China can provide education to promote our specialty practice overseas.
Kida, S; Kato, T
2015-01-01
Psychiatric disorders are caused not only by genetic factors but also by complex factors such as environmental ones. Moreover, environmental factors are rarely quantitated as biological or biochemical indicators, making it extremely difficult to understand the pathological conditions of psychiatric disorders and their underlying pathogenic mechanisms. In addition, researchers often have no option but to perform biological studies on postmortem human brains that display features of psychiatric disorders, resulting in a lack of experimental materials for characterizing the basic biology of these disorders. Against this background, animal, tissue, or cell models that can be used in basic research are indispensable for a biological understanding of the pathogenic mechanisms of psychiatric disorders. In this review, we discuss the importance of microendophenotypes of psychiatric disorders, i.e., phenotypes at the level of molecular dynamics, neurons, synapses, and neural circuits, as targets of basic research on these disorders.
ERIC Educational Resources Information Center
Zegarski, Cassandra Marie
2017-01-01
The benefits of involvement in early childhood education have been researched from many different perspectives to show how children achieve social, emotional, and academic gains when they participate in a pre-kindergarten program. The purpose of this non-experimental quantitative study was to assess how attending a pre-kindergarten program in a…
Nuclear medicine and imaging research (Instrumentation and quantitative methods of evaluation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1989-09-01
This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility.
Methods for Quantitative Creatinine Determination.
Moore, John F; Sharer, J Daniel
2017-04-06
Reliable measurement of creatinine is necessary to assess kidney function, and also to quantitate drug levels and diagnostic compounds in urine samples. The most commonly used methods are based on the Jaffe principle of alkaline creatinine-picric acid complex color formation. However, other compounds commonly found in serum and urine may interfere with Jaffe creatinine measurements. Therefore, many laboratories have modified the basic method to remove or account for these interfering substances. This appendix summarizes the basic Jaffe method, as well as a modified, automated version. Also described is a high performance liquid chromatography (HPLC) method that separates creatinine from contaminants prior to direct quantification by UV absorption. Lastly, a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is described that uses stable isotope dilution to reliably quantify creatinine in any sample. This last approach has been recommended by experts in the field as a means to standardize all quantitative creatinine methods against an accepted reference. © 2017 by John Wiley & Sons, Inc.
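In the stable-isotope-dilution approach, creatinine is quantified from the ratio of analyte to labeled internal-standard peak areas, read off a linear calibration built from standards. A schematic calculation of that back-calculation step (the standard concentrations and area ratios below are hypothetical, not values from the unit):

```python
def fit_calibration(concs, ratios):
    """Least-squares line ratio = a*conc + b from calibration standards."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(ratios) / n
    a = sum((x - mx) * (y - my) for x, y in zip(concs, ratios)) / \
        sum((x - mx) ** 2 for x in concs)
    b = my - a * mx
    return a, b

def quantify(sample_ratio, a, b):
    """Back-calculate concentration from an analyte/IS peak-area ratio."""
    return (sample_ratio - b) / a

# Hypothetical standards: creatinine (mg/dL) vs. peak-area ratio to a
# stable-isotope-labeled internal standard
a, b = fit_calibration([0.5, 1.0, 2.0, 4.0], [0.26, 0.51, 1.02, 2.04])
print(round(quantify(0.76, a, b), 2))  # ≈ 1.49
```

Because the labeled internal standard co-elutes and ionizes like the analyte, the area ratio cancels most matrix effects, which is why this design serves as the reference method.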
Zhang, Chao; Su, Jinghua
2014-01-01
Near infrared spectroscopy (NIRS) has been widely applied in both qualitative and quantitative analysis. There is growing interest in its application to traditional Chinese medicine (TCM), and a review of recent developments in the field is timely. To present an overview of recent applications of NIRS to the identification, classification and analysis of TCM products, the studies reviewed are classified into those involving qualitative and those involving quantitative analysis. In addition, the application of NIRS to the detection of illegal additives and to rapid quality assessment of TCMs by fast inspection is also described. This review covers over 100 studies emphasizing the application of NIRS in different fields. Furthermore, basic analytical principles and specific examples are used to illustrate the feasibility and effectiveness of NIRS in pattern identification. NIRS provides an effective and powerful tool for the qualitative and quantitative analysis of TCM products. PMID:26579382
Culture, risk factors and suicide in rural China: a psychological autopsy case control study.
Zhang, J; Conwell, Y; Zhou, L; Jiang, C
2004-12-01
Previous research on sociocultural factors in Chinese suicide has been largely limited to single case studies or qualitative research with ethnographic methodology. The current study examines the major risk factors, and some culturally unique features, of Chinese rural suicide using a quantitative design. This is a case control study with 66 completed suicides and 66 living controls obtained from psychological autopsy interviews in rural China. Both bivariate analyses and the multiple regression model found that Chinese rural suicide patterns are basically similar to those in most other cultures in the world: the strongest predictors of rural Chinese suicide are the psychopathological, psychological, and physical health variables, followed by social support and negative and stressful life events. Other significant correlates include lower education, poverty, religion, and family disputes. Culture has an important impact on the suicide patterns of a society.
ERIC Educational Resources Information Center
Lahtero, Tapio Juhani; Kuusilehto-Awale, Lea
2013-01-01
This article introduces a quantitative research into how the leadership team members of 49 basic education schools in the city of Vantaa, Finland, experienced the realisation of strategic leadership in their leadership teams' work. The data were collected by a survey of 24 statements, rated on a five-point Likert scale, and analysed with the…
ERIC Educational Resources Information Center
Slisko, Josip; Cruz, Adrian Corona
2013-01-01
There is general agreement that critical thinking is an important element of 21st century skills. Although critical thinking is a complex and controversial concept, many would accept that recognizing and evaluating assumptions is a basic critical-thinking process. When students use a simple mathematical model to reason quantitatively…
NASA Technical Reports Server (NTRS)
Bair, E. K.
1986-01-01
The unbiased selection of the Space Transportation Main Engine (STME) configuration requires that the candidate engines be evaluated against a predetermined set of criteria, properly weighted to emphasize critical requirements defined prior to the actual evaluation. The evaluation and selection process involves the following functions: (1) determining whether a configuration can satisfy basic STME requirements (yes/no); (2) defining the evaluation criteria; (3) selecting the criteria's relative importance, or weighting; (4) determining the weighting sensitivities; and (5) establishing a baseline for engine evaluation. The criteria weightings and sensitivities are cost related and are based on mission models and vehicle requirements. The evaluation process is used as a coarse screen to determine the candidate engines for the parametric studies and as a fine screen to determine the concept(s) for conceptual design. The criteria used for the coarse- and fine-screen evaluation processes are shown. The coarse screen involves verifying that the candidate engines can meet the yes/no screening requirements, followed by a semi-subjective quantitative evaluation. The fine-screen engines have to meet all of the yes/no screening gates and are then subjected to a detailed evaluation or assessment using the quantitative cost evaluation processes. The option exists for recycling a concept through the quantitative portion of the screening, which allows for some degree of optimization. The basic vehicle is a two-stage LOX/HC, LOX/LH2 parallel-burn vehicle capable of placing 150,000 lbs in low Earth orbit (LEO).
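The weighted-criteria step described above amounts to a weighted-sum decision matrix: each surviving candidate gets a score per criterion, and weights encode relative importance. A minimal sketch (the criteria names, weights, and scores are hypothetical, not the actual STME values):

```python
def weighted_score(scores, weights):
    """Weighted sum of criterion scores; weights need not be normalized."""
    total_w = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_w

# Hypothetical criteria, weights, and 0-10 scores for two candidate engines
weights = {"life_cycle_cost": 0.5, "reliability": 0.3, "performance": 0.2}
candidates = {
    "engine_A": {"life_cycle_cost": 7, "reliability": 9, "performance": 6},
    "engine_B": {"life_cycle_cost": 8, "reliability": 6, "performance": 7},
}
ranked = sorted(candidates,
                key=lambda c: weighted_score(candidates[c], weights),
                reverse=True)
print(ranked[0])
```

Weighting sensitivity, function (4) above, can then be explored by perturbing one weight, renormalizing, and checking whether the ranking flips.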
NASA Astrophysics Data System (ADS)
Xin, YANG; Si-qi, WU; Qi, ZHANG
2018-05-01
Beijing, London, Paris and New York are typical world cities, so a comparative study of the four cities' green patterns is important for identifying gaps and advantages and for mutual learning. This paper provides a basis and new ideas for the development of metropolises in China. In the era of big data, API (Application Programming Interface) systems can provide extensive and accurate basic data for studying urban green patterns in different geographical environments at home and abroad. On this basis, the Average Nearest Neighbor, Kernel Density and Standard Ellipse tools in the ArcGIS platform can process and summarize the data, enabling quantitative analysis of green patterns. The paper summarizes the unique features of the four cities' green patterns, and the reasons for their formation, on the basis of numerical comparison.
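The Average Nearest Neighbor statistic used in such analyses compares the observed mean nearest-neighbor distance with the value expected for a random point pattern in the same area. A simplified pure-Python version of the idea (the ArcGIS tool also reports a z-score, omitted here; the points below are hypothetical green-space centroids):

```python
import math

def avg_nearest_neighbor_ratio(points, area):
    """Observed mean nearest-neighbor distance divided by the expected
    distance 0.5*sqrt(area/n) for a random pattern; ratios below 1
    indicate clustering, above 1 dispersion (the same idea as the
    ArcGIS Average Nearest Neighbor tool)."""
    n = len(points)
    def nn_dist(p):
        return min(math.dist(p, q) for q in points if q is not p)
    observed = sum(nn_dist(p) for p in points) / n
    expected = 0.5 * math.sqrt(area / n)
    return observed / expected

# A tight cluster of hypothetical centroids in a 100 x 100 study extent
clustered = [(10, 10), (11, 10), (10, 11), (11, 11)]
print(avg_nearest_neighbor_ratio(clustered, 100 * 100) < 1)  # True
```

Computed per city over comparable extents, such ratios give one of the numerical comparisons the paper relies on.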
DOE Office of Scientific and Technical Information (OSTI.GOV)
Milnes, M.; Baylor, L.C.; Bave, S.
This article offers a basic review of fiber-optic sensing technology as applied to the qualitative or quantitative identification of a chemical sample, and explains how it works.
Watson, Roger
2015-04-01
This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.
NASA Astrophysics Data System (ADS)
Takada, Tohru; Nakamura, Jin; Suzuki, Masaru
All first-year students at the University of Electro-Communications (UEC) take "Basic Physics I", "Basic Physics II" and "Physics Laboratory" as required subjects; Basic Physics I and Basic Physics II are calculus-based physics courses covering mechanics, waves and oscillations, thermal physics and electromagnetics. Physics Laboratory is designed mainly to teach basic experimental technique and technical writing. Although 95% of the students have taken physics in senior high school, they understand it poorly in connection with real experience, which makes the university Physics Laboratory course difficult for them. For this reason, we introduced two ICT (Information and Communication Technology) systems into Physics Laboratory to support students' learning and staff's teaching. Using quantitative data obtained from the ICT systems, we can easily check students' understanding of the physics content and improve physics education.
Chen, Li; Mossa-Basha, Mahmud; Balu, Niranjan; Canton, Gador; Sun, Jie; Pimentel, Kristi; Hatsukami, Thomas S; Hwang, Jenq-Neng; Yuan, Chun
2018-06-01
To develop a quantitative intracranial artery measurement technique to extract comprehensive artery features from time-of-flight MR angiography (MRA). By semiautomatically tracing arteries based on an open-curve active contour model in a graphical user interface, 12 basic morphometric features and 16 basic intensity features for each artery were identified. Arteries were then classified as one of 24 types using prediction from a probability model. Based on the anatomical structures, features were integrated within 34 vascular groups for regional features of vascular trees. Eight 3D MRA acquisitions with intracranial atherosclerosis were assessed to validate this technique. Arterial tracings were validated by an experienced neuroradiologist who checked agreement at bifurcation and stenosis locations. This technique achieved 94% sensitivity and 85% positive predictive value (PPV) for bifurcations, and 85% sensitivity and PPV for stenosis. Up to 1,456 features, such as length, volume, and averaged signal intensity for each artery, as well as vascular group in each of the MRA images, could be extracted to comprehensively reflect the characteristics, distribution, and connectivity of arteries. The length of the M1 segment of the middle cerebral artery extracted by this technique was compared with reviewer-measured results, and the intraclass correlation coefficient was 0.97. A semiautomated quantitative method to trace, label, and measure intracranial arteries from 3D-MRA was developed and validated. This technique can be used to facilitate quantitative intracranial vascular research, such as studying cerebrovascular adaptation to aging and disease conditions. Magn Reson Med 79:3229-3238, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
Bartholomew, K J; Arnold, R; Hampson, R J; Fletcher, D
2017-12-01
This article reports the first study to quantitatively examine the relationships between the demands encountered by athletes that are associated with the organization within which they are operating, cognitive appraisals, and basic psychological need experiences. Three hundred and fifteen high-level British athletes completed a multisection questionnaire which assessed each of the aforementioned constructs. A series of path analyses provided valuable insight into the way in which the three dimensions (i.e., frequency, intensity, and duration) of five organizational stressor categories were evaluated by athletes and, in turn, how such threat or challenge appraisals predicted feelings of need satisfaction and need frustration. Moreover, cognitive stress appraisals were found to mediate the relationship between organizational stressors and psychological need experiences. The role of secondary control appraisals was also explored and found to mediate the relationship between primary cognitive appraisals and basic psychological need experiences. Study limitations, proposed future research directions, and the implications of the findings for applied practitioners are discussed. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Taki, Tsuyoshi; Hasegawa, Jun-ichi
1998-12-01
This paper proposes a basic feature for the quantitative measurement and evaluation of the group behavior of persons. This feature, called the 'dominant region', is a kind of sphere of influence for each person in the group. The dominant region is defined as the region where the person can arrive earlier than anyone else, and can be formulated as a Voronoi region in which the distance function is replaced by a time function. This time function is calculated from a computational model of the person's moving ability. As an application of the dominant region, we present a motion analysis system for soccer games. The purpose of this system is to evaluate teamwork quantitatively based on the movement of all the players in the game. Experiments using motion pictures of actual games suggest that the proposed feature is useful for the measurement and evaluation of group behavior in team sports. This basic feature may also be applied to other team ball games, such as American football, basketball, handball and water polo.
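The dominant region described above can be sketched on a discrete grid. The following minimal illustration assumes a constant-speed motion model; the paper fits a more detailed moving-ability model, and the player names, speeds, and grid size here are hypothetical:

```python
import math

def dominant_regions(players, grid_w, grid_h):
    """Assign each grid cell to the player who can reach it first.

    `players` maps a name to ((x, y), speed). With equal speeds this
    reduces to an ordinary Voronoi partition; unequal speeds give the
    time-based 'dominant region' generalization.
    """
    owner = {}
    for gx in range(grid_w):
        for gy in range(grid_h):
            # arrival time = distance / speed (constant-speed assumption)
            owner[(gx, gy)] = min(
                players,
                key=lambda p: math.hypot(gx - players[p][0][0],
                                         gy - players[p][0][1]) / players[p][1])
    return owner

# Two hypothetical players on a 10 x 1 strip; A is twice as fast as B
players = {"A": ((0.0, 0.0), 2.0), "B": ((9.0, 0.0), 1.0)}
regions = dominant_regions(players, 10, 1)
print(sum(1 for v in regions.values() if v == "A"))
```

With equal speeds the partition coincides with an ordinary Voronoi diagram; giving one player a higher speed shifts the boundary toward the slower player, so the faster player dominates more than half of the strip.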
ROOHOLAMINI, AZADEH; AMINI, MITRA; BAZRAFKAN, LEILA; DEHGHANI, MOHAMMAD REZA; ESMAEILZADEH, ZOHREH; NABEIEI, PARISA; REZAEE, RITA; KOJURI, JAVAD
2017-01-01
Introduction: In recent years curriculum reform and integration have been carried out in many medical schools. The integrated curriculum is a popular concept all over the world. In Shiraz Medical School, the reform was initiated by establishing the horizontal basic science integration model and Early Clinical Exposure (ECE) for undergraduate medical education. The purpose of this study was to provide the data required for the evaluation of this curriculum for undergraduate medical students, using the CIPP program evaluation model. Methods: This is an analytic descriptive, triangulation mixed-method study carried out in Shiraz Medical School in 2012, based on the views of professors of basic sciences courses and of first- and second-year medical students. The study evaluated the quality of the relationship between basic sciences and clinical courses and the method of presenting such courses based on the Context, Input, Process and Product (CIPP) model. The tools for collecting data, both quantitative and qualitative, were questionnaires, content analysis of portfolios, semi-structured interviews and brainstorming sessions. For quantitative data analysis, SPSS software, version 14, was used. Results: In the context evaluation by a modified DREEM questionnaire, 77.75% of the students believed that this educational system encourages them to actively participate in classes. The course schedule and class atmosphere were reported as suitable by 87.81% and 83.86% of students, respectively. In the input domain, measured by a researcher-made questionnaire, the facilities for education were acceptable except for a shortage of cadavers. In the process evaluation, the quality of the integrated module presentation and Early Clinical Exposure (ECE) was good from the students' viewpoint. In the product evaluation, students' brainstorming, students' portfolios and semi-structured interviews with faculty were carried out, showing some positive aspects of integration and some areas that need improvement.
Conclusion: The main advantage of assessing an educational program with the CIPP evaluation model is that the context, input, process and product of the program are viewed and evaluated systematically. This helps the educational authorities to make proper decisions about the program's continuation, cessation or revision based on its weaknesses and strengths. Based on the results of this study, the integrated basic sciences course for undergraduate medical students in Shiraz Medical School is at a desirable level. However, attempts to improve or reform some sections, together with continual evaluation of the program and its accreditation, seem necessary. PMID:28761888
Han, Z Y; Weng, W G
2011-05-15
In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is built on an index system comprising a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; for the quantitative method the outcomes are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and unconfined vapor cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data actually available for the gas pipelines and on the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
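The qualitative index described above combines category indices with weights. A minimal sketch of such a weighted index follows; the category names, scores, and weights are placeholders, not the values derived in the paper:

```python
def qualitative_risk(indices, weights):
    """Weighted qualitative risk value: the sum of each category's index
    score multiplied by its weight. Scores and weights are assumed to be
    normalized to [0, 1] for this illustration."""
    assert set(indices) == set(weights), "every index needs a weight"
    return sum(indices[k] * weights[k] for k in indices)

# Placeholder categories mirroring the paper's index system
score = qualitative_risk(
    {"causation": 0.6, "inherent": 0.4, "consequence": 0.8},
    {"causation": 0.3, "inherent": 0.3, "consequence": 0.4},
)
print(round(score, 2))  # → 0.62
```

In practice the weights would be elicited from expert judgment or an analytic hierarchy process rather than chosen by hand as here.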
Pattern, growth, and aging in aggregation kinetics of a Vicsek-like active matter model
NASA Astrophysics Data System (ADS)
Das, Subir K.
2017-01-01
Via molecular dynamics simulations, we study kinetics in a Vicsek-like phase-separating active matter model. Quantitative results, for an isotropic bicontinuous pattern, are presented on the structure, growth, and aging. These are obtained via the two-point equal-time density-density correlation function, the average domain length, and the two-time density autocorrelation function. Both correlation functions exhibit basic scaling properties, implying self-similarity in the pattern dynamics, for which the average domain size exhibits power-law growth in time. The equal-time correlation has a short-distance behavior that provides reasonable agreement between the corresponding structure-factor tail and the Porod law. The autocorrelation decays as a power law in the average domain size. Apart from these basic similarities, the overall quantitative behavior of the above-mentioned observables is found to be vastly different from that of the corresponding passive limit of the model, which also undergoes phase separation. The functional forms of these observables have been quantified. An exceptionally rapid growth occurs in the active system due to fast coherent motion of the particles, whose mean-squared displacements exhibit multiple scaling regimes, including a long-time ballistic one.
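For readers unfamiliar with the underlying model, one alignment-and-move update of the standard Vicsek rule can be sketched as follows. This is a two-particle, noise-free demo with no periodic boundaries; the phase-separating variant studied in the paper adds further ingredients not shown here:

```python
import math
import cmath

def vicsek_step(pos, theta, r, v, eta=0.0):
    """One update of the standard Vicsek rule: each particle adopts the
    mean heading of all neighbors within radius r (itself included),
    plus a noise term eta, then moves a distance v along that heading."""
    new_theta = []
    for i, (xi, yi) in enumerate(pos):
        # average headings as unit vectors in the complex plane
        s = sum(cmath.exp(1j * theta[j]) for j, (xj, yj) in enumerate(pos)
                if math.hypot(xi - xj, yi - yj) <= r)
        new_theta.append(cmath.phase(s) + eta)
    new_pos = [(x + v * math.cos(t), y + v * math.sin(t))
               for (x, y), t in zip(pos, new_theta)]
    return new_pos, new_theta

pos = [(0.0, 0.0), (0.5, 0.0)]
theta = [0.0, math.pi / 2]           # headings: east and north
pos, theta = vicsek_step(pos, theta, r=1.0, v=0.1)
print(round(theta[0], 4))            # both particles align to pi/4
```

Averaging unit vectors (rather than raw angles) is the standard way to compute a mean direction, since it handles the 2π wrap-around correctly.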
Ribaric, Samo; Kordas, Marjan
2011-06-01
Here, we report on a new tool for teaching cardiovascular physiology and pathophysiology that promotes qualitative as well as quantitative thinking about time-dependent physiological phenomena. Quantification of steady and presteady-state (transient) cardiovascular phenomena is traditionally done by differential equations, but this is time consuming and unsuitable for most undergraduate medical students. As a result, quantitative thinking about time-dependent physiological phenomena is often not extensively dealt with in an undergraduate physiological course. However, basic concepts of steady and presteady state can be explained with relative simplicity, without the introduction of differential equation, with equivalent electronic circuits (EECs). We introduced undergraduate medical students to the concept of simulating cardiovascular phenomena with EECs. EEC simulations facilitate the understanding of simple or complex time-dependent cardiovascular physiological phenomena by stressing the analogies between EECs and physiological processes. Student perceptions on using EEC to simulate, study, and understand cardiovascular phenomena were documented over a 9-yr period, and the impact of the course on the students' knowledge of selected basic facts and concepts in cardiovascular physiology was evaluated over a 3-yr period. We conclude that EECs are a valuable tool for teaching cardiovascular physiology concepts and that EECs promote active learning.
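One of the simplest EECs of the arterial circulation is the two-element Windkessel, with a resistor standing for peripheral resistance and a capacitor for arterial compliance. A minimal Euler-integration sketch follows; the parameter values are illustrative, not physiological fits, and the course described above uses EEC simulation software rather than hand-written code:

```python
def windkessel(q_in, R, C, p0=0.0, dt=0.001, t_end=5.0):
    """Two-element Windkessel analog of arterial pressure:
    C * dP/dt = Q_in - P/R, integrated with the explicit Euler method.
    Returns the pressure P at time t_end for constant inflow q_in."""
    p = p0
    for _ in range(int(t_end / dt)):
        p += dt * (q_in - p / R) / C
    return p

# With constant inflow the transient settles toward the steady state
# P = Q_in * R, approached with time constant R * C.
print(round(windkessel(q_in=5.0, R=20.0, C=0.05), 2))
```

The presteady-state transient decays with time constant RC, which is exactly the kind of time-dependent behavior the EEC approach lets students explore without solving differential equations analytically.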
Mesoscale Modeling, Forecasting and Remote Sensing Research.
Research on remote sensing, cyclonic-scale diagnostic studies, and mesoscale numerical modeling and forecasting is summarized. Mechanisms involved in the release of potential instability are discussed and simulated quantitatively, with particular attention to the convective formulation. The basic mesoscale model is documented, including the equations, boundary conditions, finite differences and initialization through an idealized frontal zone. Results of tests are presented, including a three-dimensional test with real data, tests of convective/mesoscale interaction and tests with a detailed …
Wang, Shuo; Poon, Gregory M K; Wilson, W David
2015-01-01
Biosensor-surface plasmon resonance (SPR) technology has emerged as a powerful label-free approach for the study of nucleic acid interactions in real time. The method provides simultaneous equilibrium and kinetic characterization for biomolecular interactions with low sample requirements and without the need for external probes. A detailed and practical guide for protein-DNA interaction analyses using biosensor-SPR methods is presented. Details of SPR technology and basic fundamentals are described with recommendations on the preparation of the SPR instrument, sensor chips and samples, experimental design, quantitative and qualitative data analyses and presentation. A specific example of the interaction of a transcription factor with DNA is provided with results evaluated by both kinetic and steady-state SPR methods.
NASA Technical Reports Server (NTRS)
Husson, N.; Barbe, A.; Brown, L. R.; Carli, B.; Goldman, A.; Pickett, H. M.; Roche, A. E.; Rothman, L. S.; Smith, M. A. H.
1985-01-01
Several aspects of quantitative atmospheric spectroscopy are considered, using a classification of molecules according to their gas amounts in the stratosphere and upper troposphere, and reviews of quantitative atmospheric high-resolution spectroscopic measurements and field measurement systems are given. Laboratory spectroscopy and spectral analysis and prediction are presented, with a summary of current laboratory spectroscopy capabilities. Spectroscopic data requirements for accurate derivation of atmospheric composition are discussed, with examples given for space-based remote sensing experiments of the atmosphere: the ATMOS (Atmospheric Trace Molecule Spectroscopy) and UARS (Upper Atmosphere Research Satellite) experiments. A review of the basic parameters involved in the data compilations, a summary of information on line parameter compilations already in existence, and a summary of current laboratory spectroscopy studies are used to assess the database.
Methodology for determining the investment attractiveness of construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana
2018-03-01
The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment for a particular project in the evaluation of the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of high-rise facilities, concepts that take into consideration the features of investment in construction and enable quantitative evaluation of investment effectiveness in high-rise construction.
Quantitative MRI and spectroscopy of bone marrow
Ruschke, Stefan; Dieckmeyer, Michael; Diefenbach, Maximilian; Franz, Daniela; Gersing, Alexandra S.; Krug, Roland; Baum, Thomas
2017-01-01
Bone marrow is one of the largest organs in the human body, enclosing adipocytes, hematopoietic stem cells, which are responsible for blood cell production, and mesenchymal stem cells, which are responsible for the production of adipocytes and bone cells. Magnetic resonance imaging (MRI) is the ideal imaging modality to monitor bone marrow changes in healthy and pathological states, thanks to its inherent rich soft-tissue contrast. Quantitative bone marrow MRI and magnetic resonance spectroscopy (MRS) techniques have also been developed in order to quantify changes in bone marrow water-fat composition, cellularity and perfusion in different pathologies, and to assist in understanding the role of bone marrow in the pathophysiology of systemic diseases (e.g. osteoporosis). The present review summarizes a large selection of studies published up to March 2017 on proton-based quantitative MRI and MRS of bone marrow. Some basic knowledge about bone marrow anatomy and physiology is first reviewed. The most important technical aspects of quantitative MR methods measuring bone marrow water-fat composition, fatty acid composition, perfusion, and diffusion are then described. Finally, previous MR studies are reviewed on the application of quantitative MR techniques in both healthy aging and diseased bone marrow affected by osteoporosis, fractures, metabolic diseases, multiple myeloma, and bone metastases. Level of Evidence: 3 Technical Efficacy: Stage 2 J. Magn. Reson. Imaging 2018;47:332–353. PMID:28570033
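A widely used water-fat composition metric in such studies is the proton-density fat fraction (PDFF), computed from separated water and fat signals. A minimal sketch, with signal values made up for illustration:

```python
def fat_fraction(water, fat):
    """Proton-density fat fraction from separated water and fat signals:
    PDFF = fat / (water + fat). Signals are in arbitrary units."""
    return fat / (water + fat)

# A tiny PDFF "map" over a few voxels of (water, fat) signal pairs
voxels = [(90.0, 10.0), (50.0, 50.0), (30.0, 70.0)]
print([round(fat_fraction(w, f), 2) for w, f in voxels])  # → [0.1, 0.5, 0.7]
```

Real PDFF estimation additionally corrects for confounders such as T2* decay and the multi-peak fat spectrum, which this sketch ignores.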
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, R.N.; Cooper, M.D.
1988-06-01
This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 58 refs., 15 figs., 4 tabs.
Cavalot, A L; Palonta, F; Preti, G; Nazionale, G; Ricci, E; Vione, N; Albera, R; Cortesina, G
2001-12-01
The insertion of a voice prosthesis and reconstruction with pectoralis major myocutaneous flaps in patients undergoing total pharyngolaryngectomy is a now universally accepted technique; however, the literature on the subject is scarce. Our study considers 10 patients who underwent total pharyngolaryngectomy and reconstruction with pectoralis major myocutaneous flaps and were fitted with vocal function prostheses, and a control group of 50 subjects treated with total laryngectomy without pectoralis major myocutaneous flaps who were fitted with vocal function prostheses. Specific qualitative and quantitative parameters were compared. The quantitative measurement of voice intensity levels and the evaluation of the harmonics-to-noise ratio showed no statistically significant differences (p > 0.05) between the two study groups at either high- or low-volume speech. In contrast, statistically significant differences were found (p < 0.05) for the fundamental frequency of both the low- and the high-volume voice. For the qualitative analysis, seven parameters were established for evaluation by trained and untrained listeners; on the basis of these parameters, the control group had statistically better voices.
Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M
2017-08-01
Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not (P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not (P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not (P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT (P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.
Gestalt factors modulate basic spatial vision.
Sayim, B; Westheimer, G; Herzog, M H
2010-05-01
Human perception of a stimulus varies depending on the context in which the stimulus is presented. Such contextual modulation has often been explained by two basic neural mechanisms: lateral inhibition and spatial pooling. In the present study, we presented observers with a vernier stimulus flanked by single lines; observers' ability to discriminate the offset direction of the vernier stimulus deteriorated in accordance with both explanations. However, when the flanking lines were part of a geometric shape (i.e., a good Gestalt), this deterioration strongly diminished. These findings cannot be explained by lateral inhibition or spatial pooling. It seems that Gestalt factors play an important role in contextual modulation. We propose that contextual modulation can be used as a quantitative measure to investigate the rules governing the grouping of elements into meaningful wholes.
The performance assessment of undergraduate students in physics laboratory by using guided inquiry
NASA Astrophysics Data System (ADS)
Mubarok, H.; Lutfiyah, A.; Kholiq, A.; Suprapto, N.; Putri, N. P.
2018-03-01
The performance of undergraduate physics students in the basic physics experiment course, which comprises three stages (pre-laboratory, conducting the experiment, and the final report), was explored in this study. The research used a descriptive quantitative approach utilizing the guidebook of the basic physics experiment course. The findings showed that (1) pre-laboratory performance among undergraduate physics students was in the good category (average score = 77.55); this stage assesses students' command of the theory before they perform the experiment. (2) Performance in conducting the experiment was in the good category (average score = 78.33). (3) Performance on the final report was in the moderate category (average score = 73.73), with the biggest weaknesses being how to analyse and discuss the data and how to write the abstract.
Cluster analysis of cognitive performance in elderly and demented subjects.
Giaquinto, S; Nolfe, G; Calvani, M
1985-06-01
Forty-eight elderly normal subjects, 14 demented subjects and 76 young controls were tested for basic cognitive functions. All the tests were quantified and could therefore be subjected to statistical analysis. The results show a difference in the speed of information processing and in memory load between the young controls and the elderly normals, but the two age groups differed in quantitative terms only. Cluster analysis showed that the elderly and the demented formed two distinctly separate groups at the qualitative level, the basic cognitive processes being damaged in the demented group. Age thus appears to be only a risk factor for dementia and not its cause. It is concluded that batteries based on precise and measurable tasks are the most appropriate not only for the study of dementia but also for rehabilitation purposes.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
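The simplest uncertainty-propagation step in such a framework can be sketched with interval probabilities passed through AND/OR gates under the independence assumption. The article itself goes further, using fuzzy and evidence-theory representations and a dependency coefficient, none of which is shown in this sketch:

```python
def and_gate(intervals):
    """Interval AND gate: with independent basic events, the output
    probability interval is the product of the input [low, high] bounds."""
    lo = hi = 1.0
    for a, b in intervals:
        lo *= a
        hi *= b
    return lo, hi

def or_gate(intervals):
    """Interval OR gate: one minus the product of the complements' bounds,
    again assuming independence of the basic events."""
    comp_lo = comp_hi = 1.0
    for a, b in intervals:
        comp_lo *= 1.0 - a
        comp_hi *= 1.0 - b
    return 1.0 - comp_lo, 1.0 - comp_hi

# Two basic events with imprecise likelihoods given as [low, high] intervals
events = [(0.1, 0.2), (0.3, 0.4)]
print([round(x, 2) for x in and_gate(events)])  # → [0.03, 0.08]
print([round(x, 2) for x in or_gate(events)])   # → [0.37, 0.52]
```

The interval output makes the effect of imprecise inputs visible: the top-event likelihood is reported as a range rather than a single crisp probability.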
[Quantitative data analysis for live imaging of bone].
Seno, Shigeto
Because bone is a hard tissue, it has long been difficult to observe the interior of living bone. With recent progress in microscopy and fluorescent probe technology, it has become possible to observe the various activities of the many cell types that make up bone tissue. On the other hand, the growing volume of data and the diversification and complexity of the images make quantitative analysis by visual inspection difficult, and a methodology for microscopic image processing and data analysis has been awaited. In this article, we introduce the research field of bioimage informatics, which lies at the boundary of biology and information science, and then outline basic image processing techniques for the quantitative analysis of live imaging data of bone.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gritsenko, Marina A.; Xu, Zhe; Liu, Tao
Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D
2016-01-01
Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
Hayes Lane, Susan; Serafica, Reimund; Huffman, Carolyn; Cuddy, Alyssa
2016-01-01
In the current healthcare environment, nurses must have a basic understanding of research to lead change and implement evidence-based practice. The purpose of this study was to evaluate the effectiveness of an educational intervention, built on the framework of the Great American Cookie Experiment and delivered through mobile device gaming, in measuring nurses' research knowledge, attitudes, and practice. This multisite quantitative study provides insight into the promotion of research and offers information about best practices in innovative teaching strategies for nurses.
Introduction to metabolomics and its applications in ophthalmology
Tan, S Z; Begley, P; Mullard, G; Hollywood, K A; Bishop, P N
2016-01-01
Metabolomics is the study of endogenous and exogenous metabolites in biological systems, which aims to provide comparative semi-quantitative information about all metabolites in the system. Metabolomics is an emerging and potentially powerful tool in ophthalmology research. It is therefore important for health professionals and researchers involved in the speciality to understand the basic principles of metabolomics experiments. This article provides an overview of the experimental workflow and examples of its use in ophthalmology research from the study of disease metabolism and pathogenesis to identification of biomarkers. PMID:26987591
[Advances in mass spectrometry-based approaches for neuropeptide analysis].
Ji, Qianyue; Ma, Min; Peng, Xin; Jia, Chenxi
2017-07-25
Neuropeptides are an important class of endogenous bioactive substances involved in the function of the nervous system, connecting the brain with other neural and peripheral organs. Mass spectrometry-based neuropeptidomics is designed to study neuropeptides in a large-scale manner and to obtain important molecular information for further understanding the mechanisms of nervous system regulation and the pathogenesis of neurological diseases. This review summarizes the basic strategies for the study of neuropeptides using mass spectrometry, including sample preparation and processing, qualitative and quantitative methods, and mass spectrometry imaging.
Phenotyping for drought tolerance of crops in the genomics era
Tuberosa, Roberto
2012-01-01
Improving crop yield under water-limited conditions is the most daunting challenge faced by breeders. To this end, accurate, relevant phenotyping plays an increasingly pivotal role in the selection of drought-resilient genotypes and, more generally, in a meaningful dissection of the quantitative genetic landscape that underscores the adaptive response of crops to drought. A major and universally recognized obstacle to a more effective translation of the results produced by drought-related studies into improved cultivars is the difficulty of phenotyping properly in a high-throughput fashion in order to identify the quantitative trait loci that govern yield and related traits across different water regimes. This review provides basic principles and a broad set of references useful for the management of phenotyping practices for the study and genetic dissection of drought tolerance and, ultimately, for the release of drought-tolerant cultivars. PMID:23049510
Colored Petri net modeling and simulation of signal transduction pathways.
Lee, Dong-Yup; Zimmer, Ralf; Lee, Sang Yup; Park, Sunwon
2006-03-01
Presented herein is a methodology for quantitatively analyzing the complex signaling network by resorting to colored Petri nets (CPN). The mathematical as well as Petri net models for two basic reaction types were established, followed by the extension to a large signal transduction system stimulated by epidermal growth factor (EGF) in an application study. The CPN models based on the Petri net representation and the conservation and kinetic equations were used to examine the dynamic behavior of the EGF signaling pathway. The usefulness of Petri nets is demonstrated for the quantitative analysis of the signal transduction pathway. Moreover, the trade-offs between modeling capability and simulation efficiency of this pathway are explored, suggesting that the Petri net model can be invaluable in the initial stage of building a dynamic model.
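A Petri net holds tokens in places and fires transitions when all input places are sufficiently marked; the discrete core of that idea can be sketched in a few lines. This is a hypothetical toy (the names EGF, EGFR, and the two reactions are illustrative), not the paper's colored Petri net or kinetic model:

```python
# Minimal discrete Petri net sketch: places hold token counts;
# a transition fires only when every input place has enough tokens.

def fire(marking, transition):
    """Fire one transition: consume input tokens, produce output tokens.
    Returns the marking unchanged if the transition is not enabled."""
    inputs, outputs = transition
    if any(marking.get(p, 0) < n for p, n in inputs.items()):
        return marking
    m = dict(marking)
    for p, n in inputs.items():
        m[p] -= n
    for p, n in outputs.items():
        m[p] = m.get(p, 0) + n
    return m

# Two basic reaction types, analogous to the paper's building blocks:
binding = ({"EGF": 1, "EGFR": 1}, {"EGF:EGFR": 1})   # A + B -> C
conversion = ({"EGF:EGFR": 1}, {"EGF:EGFR*": 1})     # C -> C* (activation)

marking = {"EGF": 2, "EGFR": 1, "EGF:EGFR": 0, "EGF:EGFR*": 0}
marking = fire(marking, binding)
marking = fire(marking, conversion)
```

A colored Petri net extends this by giving tokens typed values and transitions rate expressions, which is what enables the quantitative, dynamic simulation described in the abstract.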
Safavi-Abbasi, Sam; de Oliveira, Jean G; Deshmukh, Pushpa; Reis, Cassius V; Brasiliense, Leonardo B C; Crawford, Neil R; Feiz-Erfan, Iman; Spetzler, Robert F; Preul, Mark C
2010-03-01
The aim of this study was to describe quantitatively the properties of the posterolateral approaches and their combination. Six silicone-injected cadaveric heads were dissected bilaterally. Quantitative data were generated with the Optotrak 3020 system (Northern Digital, Waterloo, Canada) and Surgiscope (Elekta Instruments, Inc., Atlanta, GA), including key anatomic points on the skull base and brainstem. All parameters were measured after the basic retrosigmoid craniectomy and then after combination with a basic far-lateral extension. The clinical results of 20 patients who underwent a combined retrosigmoid and far-lateral approach were reviewed. The change in accessibility to the lower clivus was greatest after the far-lateral extension (mean change, 43.62 ± 10.98 mm²; P = .001). Accessibility to the constant landmarks, Meckel's cave, internal auditory meatus, and jugular foramen did not change significantly between the 2 approaches (P > .05). The greatest change in accessibility to soft tissue between the 2 approaches was to the lower brainstem (mean change, 33.88 ± 5.25 mm²; P = .0001). Total removal was achieved in 75% of the cases. The average postoperative Glasgow Outcome Scale score of patients who underwent the combined retrosigmoid and far-lateral approach improved significantly, compared with the preoperative scores. The combination of the far-lateral and simple retrosigmoid approaches significantly increases the petroclival working area and access to the cranial nerves. However, risk of injury to neurovascular structures and time needed to extend the craniotomy must be weighed against the increased working area and angles of attack.
Correa Shokiche, Carlos; Schaad, Laura; Triet, Ramona; Jazwinska, Anna; Tschanz, Stefan A.; Djonov, Valentin
2016-01-01
Background: Researchers evaluating angiomodulating compounds as part of scientific projects or pre-clinical studies are often confronted with the limitations of applied animal models. Rough, insufficient early-stage compound assessment without reliable quantification of the vascular response contributes, at least partially, to the low rate of transition to the clinic. Objective: To establish an advanced, rapid and cost-effective angiogenesis assay for the precise and sensitive assessment of angiomodulating compounds using zebrafish caudal fin regeneration. It should provide information regarding the angiogenic mechanisms involved and should include qualitative and quantitative data of drug effects in a non-biased and time-efficient way. Approach & Results: Basic vascular parameters (total regenerated area, vascular projection area, contour length, vessel area density) were extracted from in vivo fluorescence microscopy images using a stereological approach. Skeletonization of the vasculature by our custom-made software Skelios provided additional parameters including "graph energy" and "distance to farthest node". The latter gave important insights into the complexity, connectivity and maturation status of the regenerating vascular network. The employment of a reference point (vascular parameters prior to amputation) is unique to the model and crucial for a proper assessment. Additionally, the assay provides exceptional possibilities for correlative microscopy by combining in vivo imaging and morphological investigation of the area of interest. The 3-way correlative microscopy links the dynamic changes in vivo with their structural substrate at the subcellular level. Conclusions: The improved zebrafish fin regeneration model with advanced quantitative analysis and optional 3-way correlative morphology is a promising in vivo angiogenesis assay, well suited for basic research and preclinical investigations. PMID:26950851
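In graph terms, "distance to farthest node" is the eccentricity of a node in the skeletonized vascular network. A minimal breadth-first-search sketch, assuming an unweighted adjacency-list graph (Skelios itself is not public, so this is an illustration of the concept, not its implementation):

```python
from collections import deque

def distance_to_farthest_node(adj, start):
    """BFS eccentricity: hop count from `start` to the most distant
    reachable node in an unweighted skeleton graph."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return max(dist.values())

# Tiny branching network: a path 0-1-2-3 with a side branch 1-4
adj = {0: [1], 1: [0, 2, 4], 2: [1, 3], 3: [2], 4: [1]}
```

As a vascular network matures and gains connecting loops, shortest paths shorten, so this eccentricity tends to decrease, which is consistent with its use as a maturation indicator.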
Phommasone, Koukeo; Althaus, Thomas; Souvanthong, Phonesavanh; Phakhounthong, Khansoudaphone; Soyvienvong, Laxoy; Malapheth, Phatthaphone; Mayxay, Mayfong; Pavlicek, Rebecca L; Paris, Daniel H; Dance, David; Newton, Paul; Lubell, Yoel
2016-02-04
C-Reactive Protein (CRP) has been shown to be an accurate biomarker for discriminating bacterial from viral infections in febrile patients in Southeast Asia. Here we investigate the accuracy of existing rapid qualitative and semi-quantitative tests as compared with a quantitative reference test to assess their potential for use in remote tropical settings. Blood samples were obtained from consecutive patients recruited to a prospective fever study at three sites in rural Laos. At each site, one of three rapid qualitative or semi-quantitative tests was performed, as well as a corresponding quantitative NycoCard Reader II as a reference test. We estimated the sensitivity and specificity of the three tests against a threshold of 10 mg/L and kappa values for the agreement of the two semi-quantitative tests with the results of the reference test. All three tests showed high sensitivity, specificity and kappa values as compared with the NycoCard Reader II. With a threshold of 10 mg/L the sensitivity of the tests ranged from 87-98 % and the specificity from 91-98 %. The weighted kappa values for the semi-quantitative tests were 0.7 and 0.8. The use of CRP rapid tests could offer an inexpensive and effective approach to improve the targeting of antibiotics in remote settings where health facilities are basic and laboratories are absent. This study demonstrates that accurate CRP rapid tests are commercially available; evaluations of their clinical impact and cost-effectiveness at point of care are warranted.
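Sensitivity and specificity against a binary reference threshold reduce to simple ratios over a 2x2 confusion table. A sketch with hypothetical counts (the study's raw counts are not given in the abstract):

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Diagnostic accuracy of a rapid test against a binary reference
    (e.g., quantitative CRP >= 10 mg/L as the positive class)."""
    sensitivity = tp / (tp + fn)  # true positives among reference-positives
    specificity = tn / (tn + fp)  # true negatives among reference-negatives
    return sensitivity, specificity

# Illustrative 2x2 table: 94/100 reference-positives detected,
# 94/100 reference-negatives correctly ruled out.
sens, spec = sensitivity_specificity(tp=94, fp=6, fn=6, tn=94)
```

Values of 0.94 for both would sit inside the 87-98 % and 91-98 % ranges reported above.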
2014-09-01
Publication This work was conducted by the Institute for Defense Analyses (IDA) under contract HQ0034-14-D-0001, Project AI-2-3863, "Multidisciplinary...Initiative (MURI) Program is a Department of Defense (DoD) effort that supports academic research teams to conduct basic research addressing...across the government's historical records of the MURI grants for a quantitative analysis of the program. In addition, IDA conducted interviews with
NASA Astrophysics Data System (ADS)
Baer, E. M.; Whittington, C.; Burn, H.
2008-12-01
The geological sciences are fundamentally quantitative. However, the diversity of students' mathematical preparation and skills makes the successful use of quantitative concepts difficult in introductory level classes. At Highline Community College, we have implemented a one-credit co-requisite course to give students supplemental instruction for quantitative skills used in the course. The course, formally titled "Quantitative Geology," nicknamed "MathPatch," runs parallel to our introductory Physical Geology course. MathPatch teaches the quantitative skills required for the geology class right before they are needed. Thus, students learn only the skills they need and are given opportunities to apply them immediately. Topics include complex-graph reading, unit conversions, large numbers, scientific notation, scale and measurement, estimation, powers of 10, and other fundamental mathematical concepts used in basic geological concepts. Use of this course over the past 8 years has successfully accomplished the goals of increasing students' quantitative skills, success and retention. Students master the quantitative skills to a greater extent than before the course was implemented, and less time is spent covering basic quantitative skills in the classroom. Because the course supports the use of quantitative skills, the large number of faculty that teach Geology 101 are more comfortable in using quantitative analysis, and indeed see it as an expectation of the course at Highline. Also significant, retention in the geology course has increased substantially, from 75% to 85%. Although successful, challenges persist with requiring MathPatch as a supplementary course. One, we have seen enrollments decrease in Geology 101, which may be the result of adding this co-requisite. Students resist mandatory enrollment in the course, although they are not good at evaluating their own need for the course. 
The logistics of utilizing MathPatch in an evening class with fewer and longer class meetings have been challenging. Finally, in order to better serve our students' needs, we began to offer on-line sections of MathPatch; this mode of instruction is not as clearly effective, although it is very popular. Through the new The Math You Need project, we hope to improve the effectiveness of the on-line instruction so it can provide results comparable to the face-to-face sections of this class.
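The unit-conversion and powers-of-10 skills listed above are the kind a geology student applies immediately, for example when converting a plate-motion rate. A hypothetical drill of that sort (not taken from Highline's actual materials):

```python
# Convert a plate-motion rate of 5 cm/year into km per million years,
# a typical chained unit conversion in introductory physical geology.

cm_per_km = 100 * 1000        # 100 cm/m * 1000 m/km
years_per_myr = 1_000_000     # years in one million years (Myr)

rate_cm_per_yr = 5
rate_km_per_myr = rate_cm_per_yr * years_per_myr / cm_per_km
```

The chained-factor layout makes each conversion step explicit, which is exactly the habit such a co-requisite course aims to build.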
Exploring pain pathophysiology in patients.
Sommer, Claudia
2016-11-04
Although animal models of pain have brought invaluable information on basic processes underlying pain pathophysiology, translation to humans is a problem. This Review will summarize what information has been gained by the direct study of patients with chronic pain. The techniques discussed range from patient phenotyping using quantitative sensory testing to specialized nociceptor neurophysiology, imaging methods of peripheral nociceptors, analyses of body fluids, genetics and epigenetics, and the generation of sensory neurons from patients via inducible pluripotent stem cells. Copyright © 2016, American Association for the Advancement of Science.
Dissection of a nuclear localization signal.
Hodel, M R; Corbett, A H; Hodel, A E
2001-01-12
The regulated process of protein import into the nucleus of a eukaryotic cell is mediated by specific nuclear localization signals (NLSs) that are recognized by protein import receptors. This study seeks to decipher the energetic details of NLS recognition by the receptor importin alpha through quantitative analysis of variant NLSs. The relative importance of each residue in two monopartite NLS sequences was determined using an alanine scanning approach. These measurements yield an energetic definition of a monopartite NLS sequence in which a required lysine residue is followed by two other basic residues in the sequence K(K/R)X(K/R). In addition, the energetic contributions of the second basic cluster in a bipartite NLS (approximately 3 kcal/mol) as well as the energy of inhibition of the importin alpha importin-beta-binding domain (approximately 3 kcal/mol) were also measured. These data allow the generation of an energetic scale of nuclear localization sequences based on a peptide's affinity for the importin alpha-importin beta complex. On this scale, a functional NLS has a binding constant of approximately 10 nM, whereas a nonfunctional NLS has a 100-fold weaker affinity of 1 microM. Further correlation between the current in vitro data and in vivo function will provide the foundation for a comprehensive quantitative model of protein import.
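A 100-fold difference in binding constant corresponds to a free-energy difference of RT ln(100), about 2.7 kcal/mol at room temperature, which is consistent with the ~3 kcal/mol contributions quoted in the abstract. A quick numerical check:

```python
import math

R = 1.987e-3  # gas constant in kcal/(mol*K)
T = 298.0     # approximately room temperature, in K

def delta_delta_g(kd_weak, kd_tight):
    """Free-energy difference between two binding affinities:
    ddG = RT * ln(Kd_weak / Kd_tight)."""
    return R * T * math.log(kd_weak / kd_tight)

# Functional NLS (~10 nM) vs. nonfunctional NLS (~1 uM): a 100-fold ratio
ddg = delta_delta_g(kd_weak=1e-6, kd_tight=10e-9)
```

The temperature here is an assumption (298 K); the abstract does not state the measurement temperature.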
GAS CHROMATOGRAPHIC TECHNIQUES FOR THE MEASUREMENT OF ISOPRENE IN AIR
The chapter discusses gas chromatographic techniques for measuring isoprene in air. Such measurement basically consists of three parts: (1) collection of sufficient sample volume for representative and accurate quantitation, (2) separation (if necessary) of isoprene from interfer...
75 FR 77885 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-14
... of federally-funded research and development. Foreign patent applications are filed on selected... applications. Software System for Quantitative Assessment of Vasculature in Three Dimensional Images... three dimensional vascular networks from medical and basic research images. Deregulation of angiogenesis...
Mapping Quantitative Traits in Unselected Families: Algorithms and Examples
Dupuis, Josée; Shi, Jianxin; Manning, Alisa K.; Benjamin, Emelia J.; Meigs, James B.; Cupples, L. Adrienne; Siegmund, David
2009-01-01
Linkage analysis has been widely used to identify from family data genetic variants influencing quantitative traits. Common approaches have both strengths and limitations. Likelihood ratio tests typically computed in variance component analysis can accommodate large families but are highly sensitive to departure from normality assumptions. Regression-based approaches are more robust but their use has primarily been restricted to nuclear families. In this paper, we develop methods for mapping quantitative traits in moderately large pedigrees. Our methods are based on the score statistic which in contrast to the likelihood ratio statistic, can use nonparametric estimators of variability to achieve robustness of the false positive rate against departures from the hypothesized phenotypic model. Because the score statistic is easier to calculate than the likelihood ratio statistic, our basic mapping methods utilize relatively simple computer code that performs statistical analysis on output from any program that computes estimates of identity-by-descent. This simplicity also permits development and evaluation of methods to deal with multivariate and ordinal phenotypes, and with gene-gene and gene-environment interaction. We demonstrate our methods on simulated data and on fasting insulin, a quantitative trait measured in the Framingham Heart Study. PMID:19278016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poulin, Patrick, E-mail: patrick-poulin@videotron.ca; Ekins, Sean; Department of Pharmaceutical Sciences, School of Pharmacy, University of Maryland, 20 Penn Street, Baltimore, MD 21201
A general toxicity of basic drugs is related to phospholipidosis in tissues. Therefore, it is essential to predict the tissue distribution of basic drugs to facilitate an initial estimate of that toxicity. The objective of the present study was to further assess the original prediction method, which used the binding to red blood cells measured in vitro for the unbound drug (RBCu) as a surrogate for tissue distribution, by correlating it to unbound tissue:plasma partition coefficients (Kpu) of several tissues, and finally to predict volume of distribution at steady state (Vss) in humans under in vivo conditions. This correlation method demonstrated inaccurate predictions of Vss for particular basic drugs that did not follow the original correlation principle. Therefore, the novelty of this study is to provide clarity on the actual hypotheses to identify i) the impact of pharmacological mode of action on the generic RBCu-Kpu correlation, ii) additional mechanisms of tissue distribution for the outlier drugs, iii) molecular features and properties that differentiate compounds as outliers in the original correlation analysis in order to facilitate its applicability domain alongside the properties already used so far, and finally iv) a novel and refined correlation method that is superior to what has been previously published for the prediction of human Vss of basic drugs. Applying a refined correlation method after identifying outliers would facilitate the prediction of more accurate distribution parameters as key inputs used in physiologically based pharmacokinetic (PBPK) and phospholipidosis models.
NASA Technical Reports Server (NTRS)
Kazem, Sayyed M.
1992-01-01
Materials and Processes 1 (MET 141) is offered to freshmen by the Mechanical Engineering Department at Purdue University. The goal of MET 141 is to broaden the technical background of students who have not had any college science courses. Hence, applied physics, chemistry, and mathematics are included and quantitative problem solving is involved. In the elementary metallography experiment of this course, the objectives are: (1) introduce the vocabulary and establish outlook; (2) make qualitative observations and quantitative measurements; (3) demonstrate the proper use of equipment; and (4) review basic mathematics and science.
Radiation measurements from polar and geosynchronous satellites
NASA Technical Reports Server (NTRS)
Vonderhaar, T. H.
1973-01-01
During the 1960s, radiation budget measurements from satellites allowed quantitative study of the global energetics of our atmosphere-ocean system. A continuing program is planned, including independent measurement of the solar constant. Thus far, the measurements returned from two basically different types of satellite experiments are in agreement on the long-term global scales where they are most comparable. This fact, together with independent estimates of the accuracy of measurement from each system, shows that the energy exchange between earth and space is now measured better than it can be calculated. Examples of application of the radiation budget data are shown. They can be related to the age-old problem of climate change, to the basic question of the thermal forcing of our circulation systems, and to the contemporary problems of local-area energetics and computer modeling of the atmosphere.
The Obtaining of Oil from an Oil Reservoir.
ERIC Educational Resources Information Center
Dawe, R. A.
1979-01-01
Discusses the mechanics of how an actual oil reservoir works and provides some technical background in physics. An experiment which simulates an oil reservoir and demonstrates quantitatively all the basic concepts of oil reservoir rock properties is also presented. (HM)
75 FR 77882 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-14
... of federally-funded research and development. Foreign patent applications are filed on selected... applications. Software System for Quantitative Assessment of Vasculature in Three Dimensional Images... vascular networks from medical and basic research images. Deregulation of angiogenesis plays a major role...
[Improved device and method for determination of protein digestibility in vitro].
Lipatov, N N; Iudina, S B; Lisitsyn, A B
1994-01-01
A ten-cell device for modelling the enzymatic hydrolysis of food proteins by the acid and basic proteases of the human alimentary canal is described. A new procedure for calculating a quantitative characteristic of protein digestion in vitro is presented.
Basic Proteins of Plant Nuclei during Normal and Pathological Cell Growth
Rasch, Ellen; Woodard, John W.
1959-01-01
Histone proteins were studied by microphotometry of plant tissue sections stained with fast green at pH 8.1. For comparative purposes the Feulgen reaction was used for deoxyribonucleic acid (DNA); the Sakaguchi reaction for arginine; and the Millon reaction for estimates of total protein. Analysis of Tradescantia tissues indicated that amounts of nuclear histone fell into approximate multiples of the gametic (egg or sperm) quantity except in dividing tissues, where amounts intermediate between multiples were found. In differentiated tissues of lily, corn, onion, and broad bean, histones occurred in constant amounts per nucleus, characteristic of the species, as was found also for DNA. Unlike the condition in several animal species, the basic proteins of sperm nuclei in these higher plants were of the histone type; no evidence of protamine was found. In a plant neoplasm, crown gall of broad bean, behavior of the basic nuclear proteins closely paralleled that of DNA. Thus, alterations of DNA levels in tumor tissues were accompanied by quantitatively similar changes in histone levels to maintain the same Feulgen/fast green ratios found in homologous normal tissues. PMID:14436319
Clementi, Massimo; Bagnarelli, Patrizia
2015-10-01
In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, a great many studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physio-pathology of important virus diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important to address these challenges.
Precocious quantitative cognition in monkeys.
Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F
2016-02-01
Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.
Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with a Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability of each basic event is treated as an intuitionistic trapezoidal fuzzy number. In addition, a new approach for calculating the weighting of each expert is introduced to reduce error during the expert elicitation process. An in-depth quantitative analysis of the fuzzy fault tree, including the importance measures of the basic events and the cut sets and the CDE occurrence probability, is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study is provided to illustrate the effectiveness of the proposed method, and suggestions are given for taking preventive measures in advance to avoid CDE accidents. PMID:28793348
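Fault-tree gates over fuzzy probabilities can be sketched by applying the usual AND/OR probability algebra elementwise to a trapezoid's four defining points. This is a simplification of the intuitionistic trapezoidal numbers used in the paper, with hypothetical event values, not the authors' VB program:

```python
# Trapezoidal fuzzy probability represented as (a, b, c, d), a <= b <= c <= d.

def fuzzy_and(p, q):
    """AND gate (independent events): elementwise product."""
    return tuple(x * y for x, y in zip(p, q))

def fuzzy_or(p, q):
    """OR gate (independent events): elementwise 1 - (1-x)(1-y)."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(p, q))

# Two hypothetical basic events feeding an AND gate:
ignition_source = (0.1, 0.2, 0.3, 0.4)
dust_cloud = (0.2, 0.3, 0.4, 0.5)

explosion = fuzzy_and(ignition_source, dust_cloud)
```

The output trapezoid propagates the experts' uncertainty through the tree instead of collapsing it to a single crisp probability.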
Phenotypic Signatures Arising from Unbalanced Bacterial Growth
Tan, Cheemeng; Smith, Robert Phillip; Tsai, Ming-Chi; Schwartz, Russell; You, Lingchong
2014-01-01
Fluctuations in the growth rate of a bacterial culture during unbalanced growth are generally considered undesirable in quantitative studies of bacterial physiology. Under well-controlled experimental conditions, however, these fluctuations are not random but instead reflect the interplay between intra-cellular networks underlying bacterial growth and the growth environment. Therefore, these fluctuations could be considered quantitative phenotypes of the bacteria under a specific growth condition. Here, we present a method to identify “phenotypic signatures” by time-frequency analysis of unbalanced growth curves measured with high temporal resolution. The signatures are then applied to differentiate amongst different bacterial strains or the same strain under different growth conditions, and to identify the essential architecture of the gene network underlying the observed growth dynamics. Our method has implications for both basic understanding of bacterial physiology and for the classification of bacterial strains. PMID:25101949
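Extracting the dominant frequency of growth-rate fluctuations is the first step of any time-frequency signature. A plain DFT sketch of that step on a synthetic signal (the paper uses a full time-frequency analysis, which this does not reproduce):

```python
import math

def dominant_frequency(signal, dt):
    """Return the nonzero frequency (cycles per unit time) with the
    largest discrete-Fourier-transform magnitude."""
    n = len(signal)
    best_k, best_mag = 0, 0.0
    for k in range(1, n // 2 + 1):
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(signal))
        im = sum(-s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(signal))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k / (n * dt)

# Synthetic growth-rate fluctuation: a 0.1-cycles-per-sample oscillation
signal = [math.sin(2 * math.pi * 0.1 * t) for t in range(100)]
freq = dominant_frequency(signal, dt=1.0)
```

A windowed version of the same transform, computed over sliding segments of the growth curve, is what turns this into a time-frequency signature.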
Grinstein, Amir; Kodra, Evan; Chen, Stone; Sheldon, Seth; Zik, Ory
2018-01-01
Individuals must have a quantitative understanding of the carbon footprint tied to their everyday decisions in order to make efficient, sustainable choices. We report research on the innumeracy of individuals as it relates to their carbon footprint. In three studies that varied in scale and sample, respondents estimated the quantity of CO2 released when combusting a gallon of gasoline in comparison to several well-known metrics, including food calories and travel distance. Consistently, respondents estimated the quantity of CO2 from gasoline with significantly less accuracy than the other metrics, while exhibiting a tendency to underestimate CO2. Such a relative absence of carbon numeracy about even a basic consumption habit may limit the effectiveness of environmental policies and campaigns aimed at changing individual behavior. We discuss several caveats as well as opportunities for policy design that could aid the improvement of people's quantitative understanding of their carbon footprint.
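The quantity respondents were asked to estimate can be worked out from basic stoichiometry; the commonly cited answer is roughly 8.9 kg of CO2 per gallon. A back-of-envelope check using approximate figures (the 2.8 kg gallon mass and 87% carbon fraction are rounded typical values, not data from this study):

```python
# Back-of-envelope CO2 from burning one gallon of gasoline.
# Each 12 g of carbon burns to 44 g of CO2 (molar masses of C and CO2).

gallon_mass_kg = 2.8        # approximate mass of one gallon of gasoline
carbon_fraction = 0.87      # gasoline is roughly 87% carbon by mass
co2_per_carbon = 44 / 12    # mass ratio CO2 : C

co2_kg = gallon_mass_kg * carbon_fraction * co2_per_carbon
```

The counterintuitive part, and a plausible source of the underestimation the study reports, is that the CO2 produced weighs about three times as much as the fuel itself, because the oxygen comes from the air.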
NASA Astrophysics Data System (ADS)
Li, Bo; Cai Ren, Fa; Tang, Xiao Ying
2018-03-01
The manufacture of pressure vessels using austenitic stainless steel strain-strengthening technology has become an important technical means of reducing the weight of cryogenic pressure vessels. During strain strengthening of austenitic stainless steel, strain can induce a martensitic phase transformation in the austenite phase. There is a quantitative relationship between the amount of transformed martensitic phase and the basic mechanical properties. The martensitic phase fraction can therefore be obtained by measurement, and the mechanical properties and safety performance evaluated and calculated from it. On this basis, this paper studies the quantitative relationship between strain hardening and the deformation-induced martensite content, and details the mechanism of deformation-induced martensitic transformation in austenitic stainless steel.
A Study on the Basic Criteria for Selecting Heterogeneity Parameters of F18-FDG PET Images.
Forgacs, Attila; Pall Jonsson, Hermann; Dahlbom, Magnus; Daver, Freddie; D DiFranco, Matthew; Opposits, Gabor; K Krizsan, Aron; Garai, Ildiko; Czernin, Johannes; Varga, Jozsef; Tron, Lajos; Balkay, Laszlo
2016-01-01
Textural analysis might give new insights into the quantitative characterization of metabolically active tumors. More than thirty textural parameters have been investigated in former F18-FDG studies already. The purpose of the paper is to declare basic requirements as a selection strategy to identify the most appropriate heterogeneity parameters to measure textural features. Our predefined requirements were: a reliable heterogeneity parameter has to be volume independent, reproducible, and suitable for expressing quantitatively the degree of heterogeneity. Based on this criteria, we compared various suggested measures of homogeneity. A homogeneous cylindrical phantom was measured on three different PET/CT scanners using the commonly used protocol. In addition, a custom-made inhomogeneous tumor insert placed into the NEMA image quality phantom was imaged with a set of acquisition times and several different reconstruction protocols. PET data of 65 patients with proven lung lesions were retrospectively analyzed as well. Four heterogeneity parameters out of 27 were found as the most attractive ones to characterize the textural properties of metabolically active tumors in FDG PET images. These four parameters included Entropy, Contrast, Correlation, and Coefficient of Variation. These parameters were independent of delineated tumor volume (bigger than 25-30 ml), provided reproducible values (relative standard deviation< 10%), and showed high sensitivity to changes in heterogeneity. Phantom measurements are a viable way to test the reliability of heterogeneity parameters that would be of interest to nuclear imaging clinicians.
A Study on the Basic Criteria for Selecting Heterogeneity Parameters of F18-FDG PET Images
Forgacs, Attila; Pall Jonsson, Hermann; Dahlbom, Magnus; Daver, Freddie; D. DiFranco, Matthew; Opposits, Gabor; K. Krizsan, Aron; Garai, Ildiko; Czernin, Johannes; Varga, Jozsef; Tron, Lajos; Balkay, Laszlo
2016-01-01
Textural analysis might give new insights into the quantitative characterization of metabolically active tumors. More than thirty textural parameters have already been investigated in previous F18-FDG studies. The purpose of this paper is to declare basic requirements as a selection strategy for identifying the most appropriate heterogeneity parameters for measuring textural features. Our predefined requirements were that a reliable heterogeneity parameter has to be volume independent, reproducible, and suitable for expressing the degree of heterogeneity quantitatively. Based on these criteria, we compared various suggested measures of homogeneity. A homogeneous cylindrical phantom was measured on three different PET/CT scanners using the commonly used protocol. In addition, a custom-made inhomogeneous tumor insert placed into the NEMA image quality phantom was imaged with a set of acquisition times and several different reconstruction protocols. PET data of 65 patients with proven lung lesions were retrospectively analyzed as well. Four heterogeneity parameters out of 27 were found to be the most attractive for characterizing the textural properties of metabolically active tumors in FDG PET images: Entropy, Contrast, Correlation, and Coefficient of Variation. These parameters were independent of delineated tumor volume (larger than 25–30 ml), provided reproducible values (relative standard deviation < 10%), and showed high sensitivity to changes in heterogeneity. Phantom measurements are a viable way to test the reliability of heterogeneity parameters that would be of interest to nuclear imaging clinicians. PMID:27736888
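Of the four parameters singled out above, Coefficient of Variation and intensity-histogram Entropy are the simplest to reproduce. The following is a minimal generic sketch over a flat array of voxel values, not the study's actual pipeline; the bin count and any GLCM settings are assumptions for illustration:

```python
import numpy as np

def heterogeneity_metrics(voxels, bins=64):
    """Coefficient of variation and histogram entropy of tumor voxel values.

    Generic illustration of two of the heterogeneity parameters named in the
    abstract; the original study's exact binning is not specified here.
    """
    voxels = np.asarray(voxels, dtype=float)
    cov = voxels.std() / voxels.mean()        # coefficient of variation
    hist, _ = np.histogram(voxels, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                              # drop empty bins (0*log 0 := 0)
    entropy = -np.sum(p * np.log2(p))         # Shannon entropy in bits
    return float(cov), float(entropy)

# A perfectly homogeneous region has zero CoV and zero entropy:
cov, ent = heterogeneity_metrics([5.0] * 100)
```

A heterogeneous lesion would show a positive CoV and an entropy that grows with the spread of voxel intensities across bins.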
Kerkhofs, Johan; Geris, Liesbet
2015-01-01
Boolean models have been instrumental in predicting general features of gene networks and more recently also as explorative tools in specific biological applications. In this study we introduce a basic quantitative and a limited time resolution to a discrete (Boolean) framework. Quantitative resolution is improved through the employ of normalized variables in unison with an additive approach. Increased time resolution stems from the introduction of two distinct priority classes. Through the implementation of a previously published chondrocyte network and T helper cell network, we show that this addition of quantitative and time resolution broadens the scope of biological behaviour that can be captured by the models. Specifically, the quantitative resolution readily allows models to discern qualitative differences in dosage response to growth factors. The limited time resolution, in turn, can influence the reachability of attractors, delineating the likely long term system behaviour. Importantly, the information required for implementation of these features, such as the nature of an interaction, is typically obtainable from the literature. Nonetheless, a trade-off is always present between additional computational cost of this approach and the likelihood of extending the model’s scope. Indeed, in some cases the inclusion of these features does not yield additional insight. This framework, incorporating increased and readily available time and semi-quantitative resolution, can help in substantiating the litmus test of dynamics for gene networks, firstly by excluding unlikely dynamics and secondly by refining falsifiable predictions on qualitative behaviour. PMID:26067297
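The two additions described, normalized node values with additive inputs and distinct priority classes governing update order, can be sketched in a few lines. This is a generic toy network, not the published chondrocyte or T-helper model; node names, weights, and thresholds are hypothetical:

```python
def step(state, rules, fast, slow):
    """One update of a semi-quantitative Boolean-style network.

    state: {node: value in [0, 1]}.
    rules: {node: (weights dict, threshold)} giving an additive, normalized
    input sum. Nodes in 'fast' (priority class 1) update first; nodes in
    'slow' (class 2) then update against the already-refreshed values.
    Illustrative sketch only.
    """
    def target(node, s):
        weights, threshold = rules[node]
        total = sum(w * s[src] for src, w in weights.items())
        return 1.0 if total >= threshold else 0.0

    new = dict(state)
    for node in fast:              # priority class 1 updates first...
        new[node] = target(node, state)
    for node in slow:              # ...class 2 sees class-1 results
        new[node] = target(node, new)
    return new

# Toy two-node cascade: B activates only once A has switched on.
rules = {"A": ({"In": 1.0}, 0.5), "B": ({"A": 1.0}, 0.5)}
state = {"In": 1.0, "A": 0.0, "B": 0.0}
state = step(state, rules, fast=["A"], slow=["B"])
```

With a single synchronous class, B would still read the old value of A and lag one step behind; the priority split is what lets the cascade complete within one update, illustrating how limited time resolution changes reachable states.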
Morán, Félix; Olmos, Antonio; Lotos, Leonidas; Predajňa, Lukáš; Katis, Nikolaos; Glasa, Miroslav; Maliogka, Varvara; Ruiz-García, Ana B
2018-01-01
Grapevine Pinot gris virus (GPGV) is a widely distributed grapevine pathogen that has been associated with grapevine leaf mottling and deformation disease. With the aim of better understanding the disease epidemiology and providing efficient control strategies, a specific and quantitative duplex TaqMan real-time RT-PCR assay has been developed. This method has allowed reliable quantitation of the GPGV titer over a range of 30 to 3 × 10^8 transcript copies, with a detection limit of 70 viral copies in plant material. The assay targets a grapevine internal control that reduces the occurrence of false negative results, thus increasing the diagnostic sensitivity of the technique. Viral isolates from Greece, Slovakia and Spain, both associated and not associated with symptoms, have been successfully detected. The method has also been applied to the absolute quantitation of GPGV in its putative transmission vector Colomerus vitis. Moreover, the viral titer present in single mites has been determined. In addition, in the current study a new polymorphism in the GPGV genome, responsible for a shorter movement protein, has been found. A phylogenetic study based on this genomic region has shown high variability among Spanish isolates and points to a different evolutionary origin of this new polymorphism. The methodology developed here opens new possibilities for basic and epidemiological studies as well as for the establishment of efficient control strategies.
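Absolute quantitation in a TaqMan assay is conventionally read off a standard curve of Ct against log10(copy number). A minimal sketch of that conversion follows; the slope and intercept are illustrative textbook values, not the calibration of this GPGV assay:

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Convert a Ct value to template copy number via a linear standard curve.

    Ct = slope * log10(copies) + intercept. A slope of -3.32 corresponds to
    100% amplification efficiency. Values here are illustrative only.
    """
    return 10 ** ((ct - intercept) / slope)

# At 100% efficiency, crossing threshold one cycle earlier means ~2x template:
ratio = copies_from_ct(24.0) / copies_from_ct(25.0)
```

The same curve, built from a dilution series of a synthetic transcript, is what allows titers to be reported as absolute copy numbers rather than relative fold changes.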
Descriptive approaches to landscape analysis
R. Burton Litton Jr.
1979-01-01
Descriptive landscape analyses include various procedures used to document visual/scenic resources. Historic and regional examples of landscape description represent desirable insight for contemporary professional inventory work. Routed and areal landscape inventories are discussed as basic tools. From them, qualitative and quantitative evaluations can be developed...
Quantitative and sensitive detection of prohibited fish drugs by surface-enhanced Raman scattering
NASA Astrophysics Data System (ADS)
Lin, Shi-Chao; Zhang, Xin; Zhao, Wei-Chen; Chen, Zhao-Yang; Du, Pan; Zhao, Yong-Mei; Wu, Zheng-Long; Xu, Hai-Jun
2018-02-01
Abstract not available. Project supported by the National Basic Research Program of China (Grant No. 2014CB745100), the National Natural Science Foundation of China (Grant Nos. 21390202 and 21676015), and the Beijing Higher Education Young Elite Teacher Project.
ERIC Educational Resources Information Center
Glass, Gene V.
1992-01-01
Questions that beginning educational researchers and evaluators will have to answer are discussed concerning: (1) which intellectual tradition to select; (2) the roles of basic and theoretical inquiry; (3) qualitative and quantitative distinctions; (4) the theory-practice relationship; (5) political influences; and (6) the correct emphasis on…
An implementation and analysis of the Abstract Syntax Notation One and the basic encoding rules
NASA Technical Reports Server (NTRS)
Harvey, James D.; Weaver, Alfred C.
1990-01-01
The details of abstract syntax notation one standard (ASN.1) and the basic encoding rules standard (BER) that collectively solve the problem of data transfer across incompatible host environments are presented, and a compiler that was built to automate their use is described. Experiences with this compiler are also discussed which provide a quantitative analysis of the performance costs associated with the application of these standards. An evaluation is offered as to how well suited ASN.1 and BER are in solving the common data representation problem.
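BER represents every value as a tag-length-value (TLV) triple, which is what makes the encoding portable across incompatible host environments. A minimal sketch of the definite-length, short-form encoding of an ASN.1 INTEGER (universal primitive tag 0x02) follows; this is a simplified illustration for small non-negative values, not the compiler described in the paper:

```python
def ber_encode_integer(value: int) -> bytes:
    """Encode a non-negative ASN.1 INTEGER as a BER TLV (short-form length).

    Simplified sketch: assumes the content fits in fewer than 128 octets so
    the single-byte length form applies; negative values are not handled.
    """
    # Minimal-length content octets, with a leading zero octet whenever the
    # high bit would otherwise flag a (two's-complement) negative number.
    n = max(1, (value.bit_length() + 7) // 8)
    content = value.to_bytes(n, "big")
    if content[0] & 0x80:
        content = b"\x00" + content
    return bytes([0x02, len(content)]) + content

encoded = ber_encode_integer(300)   # tag 0x02, length 0x02, content 0x01 0x2C
```

A decoder on any receiving host reads the tag, then the length, then exactly that many content octets, so no agreement on native word size or byte order is needed.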
Komatsu, Setsuko; Takasaki, Hironori
2009-07-01
Genes regulated by gibberellin (GA) during leaf sheath elongation in rice seedlings were identified using the transcriptome approach. mRNA from the basal regions of leaf sheaths treated with GA3 was analyzed by high-coverage gene expression profiling. 33,004 peaks were detected, and 30 transcripts showed significant changes in the presence of GA3. Among these, basic helix-loop-helix transcription factor (AK073385) was significantly upregulated. Quantitative PCR analysis confirmed that expression of AK073385 was controlled by GA3 in a time- and dose-dependent manner. Basic helix-loop-helix transcription factor (AK073385) is therefore involved in the regulation of gene expression by GA3.
Allometric Trajectories and "Stress": A Quantitative Approach.
Anfodillo, Tommaso; Petit, Giai; Sterck, Frank; Lechthaler, Silvia; Olson, Mark E
2016-01-01
The term "stress" is an important but vague term in plant biology. We show situations in which thinking in terms of "stress" is profitably replaced by quantifying distance from functionally optimal scaling relationships between plant parts. These relationships include, for example, the often-cited one between leaf area and sapwood area, which presumably reflects mutual dependence between source and sink tissues and which scales positively within individuals and across species. These relationships seem to be so basic to plant functioning that they are favored by selection across nearly all plant lineages. Within a species or population, individuals that are far from the common scaling patterns are thus expected to perform poorly. For instance, "too little" leaf area (e.g., due to herbivory or disease) per unit of active stem mass would be expected to yield low carbon income per respiratory cost and thus lead to lower growth. We present a framework that allows quantitative study of phenomena traditionally assigned to "stress," without need for recourse to this term. Our approach contrasts with traditional approaches for studying "stress," e.g., revealing that small "stressed" plants are likely in fact well suited to local conditions. We thus offer a quantitative perspective on the study of phenomena often referred to under such terms as "stress," plasticity, adaptation, and acclimation.
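The proposal above, replacing "stress" with distance from an optimal scaling relationship, amounts to measuring each individual's residual from a log-log allometric fit. A minimal sketch with made-up leaf-area/sapwood-area numbers (the data and the naming are hypothetical, not the authors' dataset):

```python
import numpy as np

def scaling_residuals(x, y):
    """Residuals from a log-log allometric fit y ~ a * x^b.

    Individuals with strongly negative residuals (e.g. 'too little' leaf area
    per unit sapwood area) are the candidates traditionally labelled
    'stressed'. Illustrative sketch only.
    """
    lx, ly = np.log(x), np.log(y)
    b, loga = np.polyfit(lx, ly, 1)   # slope and intercept in log space
    return ly - (loga + b * lx)       # signed distance from the scaling line

# Hypothetical plants: the third carries less leaf area than its sapwood
# area predicts, so it falls below the common scaling line.
sapwood = np.array([1.0, 2.0, 3.0, 4.0])
leaf = np.array([10.0, 20.0, 18.0, 40.0])
res = scaling_residuals(sapwood, leaf)
```

The sign and magnitude of a residual then replace the binary "stressed/unstressed" label with a continuous, comparable quantity.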
Bagnasco, Annamaria; Galaverna, Lucia; Aleo, Giuseppe; Grugnetti, Anna Maria; Rosa, Francesca; Sasso, Loredana
2016-01-01
In the literature we found many studies that confirmed our concerns about nursing students' poor maths skills, which directly impact their ability to correctly calculate drug dosages, with very serious consequences for patient safety. The aim of our study was to explore where students had most difficulty and to identify appropriate educational interventions to bridge their mathematical knowledge gaps. This was a quali-quantitative descriptive study that included a sample of 726 undergraduate nursing students. We found that the undergraduate nursing students mainly had difficulty with basic maths principles, and that specific learning interventions are needed to improve both their basic maths skills and their dosage calculation skills. For this purpose, we identified safeMedicate and eDose (Authentic World Ltd.), although these tools are currently available only in English. In the near future we hope to set up a partnership to work together on an Italian version of these tools. Copyright © 2015 Elsevier Ltd. All rights reserved.
Rethinking health numeracy: a multidisciplinary literature review.
Ancker, Jessica S; Kaufman, David
2007-01-01
The purpose of this review is to organize various published conceptions of health numeracy and to discuss how health numeracy contributes to the productive use of quantitative information for health. We define health numeracy as the individual-level skills needed to understand and use quantitative health information, including basic computation skills, ability to use information in documents and non-text formats such as graphs, and ability to communicate orally. We also identify two other factors affecting whether a consumer can use quantitative health information: design of documents and other information artifacts, and health-care providers' communication skills. We draw upon the distributed cognition perspective to argue that essential ingredients for the productive use of quantitative health information include not only health numeracy but also good provider communication skills, as well as documents and devices that are designed to enhance comprehension and cognition.
Self-consistent approach for neutral community models with speciation
NASA Astrophysics Data System (ADS)
Haegeman, Bart; Etienne, Rampal S.
2010-03-01
Hubbell’s neutral model provides a rich theoretical framework to study ecological communities. By incorporating both ecological and evolutionary time scales, it allows us to investigate how communities are shaped by speciation processes. The speciation model in the basic neutral model is particularly simple, describing speciation as a point-mutation event at the birth of a single individual. The stationary species abundance distribution of the basic model, which can be solved exactly, fits empirical distributions of species' abundances surprisingly well. More realistic speciation models have been proposed, such as the random-fission model in which new species appear by splitting up existing species. However, no analytical solution is available for these models, impeding quantitative comparison with data. Here, we present a self-consistent approximation method for neutral community models with various speciation modes, including random fission. We derive explicit formulas for the stationary species abundance distribution, which agree very well with simulations. We expect that our approximation method will be useful to study other speciation processes in neutral community models as well.
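The point-mutation mode of the basic model is easy to simulate directly: at each zero-sum event one individual dies and is replaced either by an offspring of a random surviving resident or, with speciation probability ν, by the founder of a brand-new species. A toy simulation (not the paper's self-consistent approximation; J, ν and the step count are arbitrary illustrative choices):

```python
import random

def neutral_community(J=200, nu=0.1, steps=20000, seed=1):
    """Hubbell's basic neutral model with point-mutation speciation.

    J individuals in a zero-sum community; per step, one random individual
    dies and is replaced by a new species with probability nu, otherwise by
    a copy of a randomly chosen survivor. Returns the species abundance
    distribution as {species_id: count}.
    """
    rng = random.Random(seed)
    community = [0] * J          # start from a single ancestral species
    next_species = 1
    for _ in range(steps):
        i = rng.randrange(J)                     # death
        if rng.random() < nu:                    # point-mutation speciation
            community[i] = next_species
            next_species += 1
        else:                                    # resident's offspring
            community[i] = rng.choice(community[:i] + community[i + 1:])
    abundances = {}
    for s in community:
        abundances[s] = abundances.get(s, 0) + 1
    return abundances

sad = neutral_community()
```

Histogramming `sad` over many runs approximates the stationary species abundance distribution that the basic model predicts analytically, which is the quantity the approximation method targets for the harder random-fission mode.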
Grouping patients for masseter muscle genotype-phenotype studies.
Moawad, Hadwah Abdelmatloub; Sinanan, Andrea C M; Lewis, Mark P; Hunt, Nigel P
2012-03-01
To use various facial classifications, including either/both vertical and horizontal facial criteria, to assess their effects on the interpretation of masseter muscle (MM) gene expression. Fresh MM biopsies were obtained from 29 patients (age, 16-36 years) with various facial phenotypes. Based on clinical and cephalometric analysis, patients were grouped using three different classifications: (1) basic vertical, (2) basic horizontal, and (3) combined vertical and horizontal. Gene expression levels of the myosin heavy chain genes MYH1, MYH2, MYH3, MYH6, MYH7, and MYH8 were recorded using quantitative reverse transcriptase polymerase chain reaction (RT-PCR) and were related to the various classifications. The significance level for statistical analysis was set at P ≤ .05. Using classification 1, none of the MYH genes were found to be significantly different between long face (LF) patients and the average vertical group. Using classification 2, MYH3, MYH6, and MYH7 genes were found to be significantly upregulated in retrognathic patients compared with prognathic and average horizontal groups. Using classification 3, only the MYH7 gene was found to be significantly upregulated in retrognathic LF compared with prognathic LF, prognathic average vertical faces, and average vertical and horizontal groups. The use of basic vertical or basic horizontal facial classifications may not be sufficient for genetics-based studies of facial phenotypes. Prognathic and retrognathic facial phenotypes have different MM gene expressions; therefore, it is not recommended to combine them into one single group, even though they may have a similar vertical facial phenotype.
Epifanova, M V; Chalyi, M E; Krasnov, A O
2017-09-01
To determine the quantitative and qualitative composition of growth factors (PDGF-AA, PDGF-BB, VEGF, VEGF-D, FGF-acid, FGF-basic) and platelets in various modifications of APRP. Blood of 12 male volunteers (control group) and 12 patients with ED was used to prepare APRP and to subsequently determine the concentrations of growth factors. The growth factor concentrations (FGF-acid, FGF-basic, PDGF-AA, PDGF-BB, VEGF, VEGF-D) were determined using a flow cytometry-based xMAP Luminex (Gen-Probe) system. The platelet concentration in APRP obtained by two-stage centrifugation reached 1480 (1120-1644) in the control group and 1232 (956-1502) in patients with ED. The concentrations of growth factors in the samples prepared without preliminary freezing were: PDGF-AA 842 (22-3700), PDGF-BB 2837 (1460-4100), FGF-basic 7.9 (0.28-127), FGF-acid 3.4 (0.14-11), VEGF 19 (4.6-46), VEGF-D 21 (14-38). After thawing, the concentrations of all growth factors in the samples increased. The study findings suggest that the mechanism of erectile function recovery following the use of APRP works through the active substances detected in APRP, i.e. FGF-basic, PDGF-AA, PDGF-BB, VEGF, VEGF-D and FGF-acid. The study also showed that the content of growth factors in APRP after freezing/thawing is higher than in APRP that has not been frozen. This is due to cell membrane destruction at extremely low temperatures during freezing.
Randolph, S E; Craine, N G
1995-11-01
Models of tick-borne diseases must take account of the particular biological features of ticks that contrast with those of insect vectors. A general framework is proposed that identifies the parameters of the transmission dynamics of tick-borne diseases to allow a quantitative assessment of the relative contributions of different host species and alternative transmission routes to the basic reproductive number, R0, of such diseases. Taking the particular case of the transmission of the Lyme borreliosis spirochaete, Borrelia burgdorferi, by Ixodes ticks in Europe, and using the best, albeit still inadequate, estimates of the parameter values and a set of empirical data from Thetford Forest, England, we show that squirrels and the transovarial transmission route make quantitatively very significant contributions to R0. This approach highlights the urgent need for more robust estimates of certain crucial parameter values, particularly the coefficients of transmission between ticks and vertebrates, before we can progress to full models that incorporate seasonality and heterogeneity among host populations for the natural dynamics of transmission of borreliosis and other tick-borne diseases.
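In a framework of this kind, R0 decomposes into additive contributions: one horizontal term per host species plus a transovarial (tick-to-egg) term. The schematic sketch below shows only the shape of that decomposition; every number and the exact form of each term are hypothetical, not the Thetford Forest estimates:

```python
def tick_borne_r0(hosts, tick_survival, transovarial):
    """Schematic R0 as a sum of per-host-species contributions plus a
    transovarial term.

    hosts maps species to (larvae fed per infected-host generation,
    host-to-tick transmission coefficient); each horizontal term is scaled
    by tick survival to the next feed. Illustrative decomposition only: a
    full model would also track nymph/adult stages and seasonality.
    """
    horizontal = sum(larvae * beta * tick_survival
                     for larvae, beta in hosts.values())
    return horizontal + transovarial

# Hypothetical values in which the squirrel term dominates, echoing the
# abstract's finding for the field data:
r0 = tick_borne_r0({"squirrel": (40, 0.05), "mouse": (200, 0.005)},
                   tick_survival=0.5, transovarial=0.2)
```

Comparing the individual terms of the sum is what quantifies each host species' (and each route's) relative contribution to R0.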
Technical manual for basic version of the Markov chain nest productivity model (MCnest)
The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...
User’s manual for basic version of MCnest Markov chain nest productivity model
The Markov Chain Nest Productivity Model (or MCnest) integrates existing toxicity information from three standardized avian toxicity tests with information on species life history and the timing of pesticide applications relative to the timing of avian breeding seasons to quantit...
SCREENING LIFE CYCLE ASSESSMENT OF GASOLINE ADDITIVES
The EPA's ORD is conducting a screening of Life Cycle Assessment (LCA) of selected automotive fuel (i.e., gasoline) systems. Although no specific guidelines exist on how to conduct such a streamlined approach, the basic idea is to use a mix of qualitative and quantitative generi...
[Transmission dynamic model for echinococcosis granulosus: establishment and application].
Yang, Shi-Jie; Wu, Wei-Ping
2009-06-01
A dynamic model of disease can be used to quantitatively describe the pattern and characteristics of disease transmission, predict disease status, and evaluate the efficacy of control strategies. This review summarizes the basic transmission dynamic models of Echinococcus granulosus and their application.
Astronomy: social background of students of the integrated high school
NASA Astrophysics Data System (ADS)
Voelzke, M. R.; Barbosa, J. I. L.
2017-07-01
Astronomy-related content exists at almost all levels of basic education in Brazil and is also frequently disseminated through the mass media. Thus, students form their own explanations of the phenomena studied by this science. This work therefore aims to identify the possible social background of Integrated High School students regarding the term Astronomy. It is basic, descriptive research, for which a quali-quantitative approach was adopted; the data were obtained by means of a survey. The results show that the surveyed students have a social background regarding Astronomy that is fortified, on the one hand, by elements of the respondents' lived experience within the formal space of education and, on the other hand, by elements possibly disseminated through the mass media.
NASA Technical Reports Server (NTRS)
Frye, Robert
1990-01-01
Research at the Environmental Research Lab in support of Biosphere 2 was both basic and applied in nature. One aspect of the applied research involved the use of biological reactors for the scrubbing of trace atmospheric organic contaminants. The research involved a quantitative study of the efficiency of operation of Soil Bed Reactors (SBR) and the optimal operating conditions for contaminant removal. The basic configuration of an SBR is that air is moved through a living soil that supports a population of plants. Upon exposure to the soil, contaminants are either passively adsorbed onto the surface of soil particles, or chemically transformed in the soil to usable compounds that are taken up by the plants or used by microbes as a metabolic energy source and converted to CO2 and water.
Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part II.
Tavakol, Mohsen; Sandars, John
2014-10-01
Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.
Quantitative and qualitative methods in medical education research: AMEE Guide No 90: Part I.
Tavakol, Mohsen; Sandars, John
2014-09-01
Medical educators need to understand and conduct medical education research in order to make informed decisions based on the best evidence, rather than rely on their own hunches. The purpose of this Guide is to provide medical educators, especially those who are new to medical education research, with a basic understanding of how quantitative and qualitative methods contribute to the medical education evidence base through their different inquiry approaches and also how to select the most appropriate inquiry approach to answer their research questions.
Helicopter Pilot Performance for Discrete-maneuver Flight Tasks
NASA Technical Reports Server (NTRS)
Heffley, R. K.; Bourne, S. M.; Hindson, W. S.
1984-01-01
This paper describes a current study of several basic helicopter flight maneuvers. The data base consists of in-flight measurements from instrumented helicopters flown by experienced pilots. The analysis technique is simple enough to apply without automatic data processing, and the results can be used to build quantitative math models of the flight task and some aspects of the pilot control strategy. In addition to describing the performance measurement technique, some results are presented which define the aggressiveness and amplitude of maneuvering for several lateral maneuvers, including turns and sidesteps.
An overview of the nonequilibrium behavior of polymer glasses
NASA Technical Reports Server (NTRS)
Tant, M. R.; Wilkes, G. L.
1981-01-01
It is pointed out that research efforts are at present being directed in two areas, one comprising experimental studies of this phenomenon in various glassy polymer systems and the other involving the development of a quantitative theory capable of satisfactorily predicting aging behavior for a variety of polymer materials under different conditions. Recent work in both these areas is surveyed. The basic principles of nonequilibrium behavior are outlined, with emphasis placed on changes in material properties with annealing below the glass transition temperature. Free volume theory and thermodynamic theory are discussed.
Kiontke, Andreas; Oliveira-Birkmeier, Ariana; Opitz, Andreas
2016-01-01
Over the past decades, electrospray ionization for mass spectrometry (ESI-MS) has become one of the most commonly employed techniques in analytical chemistry, mainly due to its broad applicability to polar and semipolar compounds and the superior selectivity achieved in combination with high-resolution separation techniques. However, the responsiveness of an analytical method also determines its suitability for the quantitation of chemical compounds, and in ESI-MS it can vary significantly among analytes with identical solution concentrations. Therefore, we investigated the ESI-response behavior of 56 nitrogen-containing compounds, including aromatic amines and pyridines, two compound classes of high importance to both synthetic organic chemistry and the pharmaceutical sciences. These compounds are increasingly analyzed with ESI mass spectrometry detection due to their polar, basic character. Signal intensities of the peaks from the protonated molecular ion (MH+) were acquired under different conditions and related to compound properties such as basicity, polarity, volatility and molecular size, exploring their quantitative impact on ionization efficiency. As a result, we found that although the solution basicity of a compound is the main factor initially determining the ESI response of the protonated molecular ion, other factors such as polarity and vaporability become more important under acidic solvent conditions and may nearly outweigh the importance of basicity under these conditions. Moreover, we show that different molecular descriptors may become important when using different types of instruments for such investigations, a fact not detailed so far in the available literature. PMID:27907110
Quantitative RNA-seq analysis of the Campylobacter jejuni transcriptome
Chaudhuri, Roy R.; Yu, Lu; Kanji, Alpa; Perkins, Timothy T.; Gardner, Paul P.; Choudhary, Jyoti; Maskell, Duncan J.
2011-01-01
Campylobacter jejuni is the most common bacterial cause of foodborne disease in the developed world. Its general physiology and biochemistry, as well as the mechanisms enabling it to colonize and cause disease in various hosts, are not well understood, and new approaches are required to understand its basic biology. High-throughput sequencing technologies provide unprecedented opportunities for functional genomic research. Recent studies have shown that direct Illumina sequencing of cDNA (RNA-seq) is a useful technique for the quantitative and qualitative examination of transcriptomes. In this study we report RNA-seq analyses of the transcriptomes of C. jejuni (NCTC11168) and its rpoN mutant. This has allowed the identification of hitherto unknown transcriptional units, and further defines the regulon that is dependent on rpoN for expression. The analysis of the NCTC11168 transcriptome was supplemented by additional proteomic analysis using liquid chromatography-MS. The transcriptomic and proteomic datasets represent an important resource for the Campylobacter research community. PMID:21816880
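Quantitative use of RNA-seq data like that described above rests on normalizing raw read counts for gene length and sequencing depth. The classic RPKM normalization is one minimal way to do that; the gene name and numbers below are illustrative, and the cited study's exact normalization is not specified in the abstract:

```python
def rpkm(counts, gene_lengths_bp, total_mapped_reads):
    """Reads Per Kilobase of transcript per Million mapped reads.

    A basic normalization making expression comparable across genes of
    different lengths and libraries of different depths. Illustrative
    sketch only.
    """
    return {gene: counts[gene]
                  / (gene_lengths_bp[gene] / 1e3)      # per kilobase
                  / (total_mapped_reads / 1e6)         # per million reads
            for gene in counts}

# Hypothetical example: 500 reads on a 1 kb gene in a 2M-read library.
expr = rpkm({"flaA": 500}, {"flaA": 1000}, total_mapped_reads=2_000_000)
```

Comparing such normalized values between the wild type and the rpoN mutant is the kind of operation that lets an RNA-seq experiment delimit a regulon quantitatively.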
Reflectance spectroscopy for evaluating hair follicle cycle
NASA Astrophysics Data System (ADS)
Liu, Caihua; Guan, Yue; Wang, Jianru; Zhu, Dan
2014-02-01
The hair follicle, as a mini-organ with perpetual cycling through telogen, anagen and catagen, provides a valuable experimental model for studying hair and organ regeneration. The transition of the hair follicle from telogen to anagen is a significant sign of successful regeneration. To date, the hair follicle stage has mostly been determined by canonical histological examination or by empirical speculation based on skin color; hardly any method has been proposed to evaluate the hair follicle stage quantitatively. In this work, a commercial optical fiber spectrometer was applied to monitor the diffuse reflectance of mouse skin over the hair follicle cycle, and the change in reflectance was obtained. Histological examination was used to verify the hair follicle stage. In comparison with the histological examination, the skin diffuse reflectance was relatively high for mice with telogen hair follicles; it decreased once hair follicles transited to the anagen stage, and then increased again at the catagen stage. This study provides a new method to quantitatively evaluate the hair follicle stage, and should be valuable for basic and therapeutic investigations of hair regeneration.
Cai, Xiang; Shen, Liguo; Zhang, Meijia; Chen, Jianrong; Hong, Huachang; Lin, Hongjun
2017-11-01
Quantitatively evaluating interaction energy between two randomly rough surfaces is the prerequisite to quantitatively understand and control membrane fouling in membrane bioreactors (MBRs). In this study, a new unified approach to construct rough topographies and to quantify interaction energy between a randomly rough particle and a randomly rough membrane was proposed. It was found that, natural rough topographies of both foulants and membrane could be well constructed by a modified two-variable Weierstrass-Mandelbrot (WM) function included in fractal theory. Spatial differential relationships between two constructed surfaces were accordingly established. Thereafter, a new approach combining these relationships, surface element integration (SEI) approach and composite Simpson's rule was deduced to calculate the interaction energy between two randomly rough surfaces in a submerged MBR. The obtained results indicate the profound effects of surface morphology on interaction energy and membrane fouling. This study provided a basic approach to investigate membrane fouling and interface behaviors. Copyright © 2017 Elsevier Ltd. All rights reserved.
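The two-variable Weierstrass-Mandelbrot function mentioned above has a standard closed form for fractal rough surfaces (e.g. the Yan-Komvopoulos formulation). The sketch below evaluates one surface height from that form; all parameter values are illustrative assumptions, not those fitted to the foulant and membrane topographies in the study:

```python
import math
import random

def wm_surface(x, y, D=2.5, gamma=1.5, M=10, n_max=20, L=1.0, G=1e-3, seed=0):
    """Height z(x, y) of a fractal rough surface from the two-variable
    Weierstrass-Mandelbrot function.

    D: fractal dimension (2 < D < 3); gamma: frequency density parameter;
    M: number of superposed ridges; G: fractal roughness; L: sample length.
    Random phases are fixed by the seed so the surface is reproducible.
    Parameter values are illustrative only.
    """
    rng = random.Random(seed)
    phases = [[rng.uniform(0.0, 2.0 * math.pi) for _ in range(n_max + 1)]
              for _ in range(M)]
    r = math.hypot(x, y)
    theta = math.atan2(y, x)
    scale = L * (G / L) ** (D - 2) * math.sqrt(math.log(gamma) / M)
    z = 0.0
    for m in range(1, M + 1):
        for n in range(n_max + 1):
            phi = phases[m - 1][n]
            arg = (2.0 * math.pi * gamma ** n * (r / L)
                   * math.cos(theta - math.pi * m / M) + phi)
            z += gamma ** ((D - 3) * n) * (math.cos(phi) - math.cos(arg))
    return scale * z

h = wm_surface(0.3, 0.4)   # one surface height; deterministic for a fixed seed
```

Evaluating such heights on a grid for both the particle and the membrane is the kind of construction that lets the surface element integration approach sum interaction energies over two randomly rough topographies.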
Quantitative proteomics in the field of microbiology.
Otto, Andreas; Becher, Dörte; Schmidt, Frank
2014-03-01
Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Takallu, M. A.; Wong, D. T.; Uenking, M. D.
2002-01-01
An experimental investigation was conducted to study the effectiveness of modern flight displays in general aviation cockpits for mitigating Low Visibility Loss of Control and Controlled Flight Into Terrain accidents. A total of 18 General Aviation (GA) pilots holding a private pilot, single-engine land rating, with no instrument training beyond private pilot license requirements, were recruited to evaluate three different display concepts in a fixed-base flight simulator at the NASA Langley Research Center's General Aviation Work Station. Evaluation pilots were asked to continue flight from Visual Meteorological Conditions (VMC) into Instrument Meteorological Conditions (IMC) while performing a series of 4 basic precision maneuvers. During the experiment, relevant pilot/vehicle performance variables, pilot control inputs and physiological data were recorded. Human factors questionnaires and interviews were administered after each scenario. Qualitative and quantitative data were analyzed and the results are presented here. Pilot performance deviations from the established target values (errors) were computed and compared with the FAA Practical Test Standards. The quantitative data indicate that evaluation pilots committed substantially fewer errors when using the Synthetic Vision Systems (SVS) displays than when using conventional instruments. The qualitative data indicate that evaluation pilots perceived themselves to have a much higher level of situation awareness while using the SVS display concept.
Li, Weina; Fedosov, Sergey; Tan, Tianwei; Xu, Xuebing; Guo, Zheng
2014-05-01
To maintain biological functions, thousands of different reactions take place in the human body at physiological pH (7.0) under mild conditions, and their balance is associated with health and disease. Examining the catalytic function of intrinsically occurring molecules, such as amino acids at neutral pH, is therefore of fundamental interest. The natural basic α-amino acids L-lysine, L-arginine, and L-histidine, neutralized to physiological pH as salts, were investigated for their ability to catalyze the Knoevenagel condensation of benzaldehyde and ethyl cyanoacetate. Although neutralization markedly reduced catalytic activity compared with the free base forms, the salts were still capable of efficient catalysis at physiological pH, comparable to porcine pancreatic lipase (PPL), one of the best enzymes for catalyzing the Knoevenagel condensation. In agreement with the fact that the three basic amino acids were well neutralized, the more strongly basic Arg and Lys showed the most pronounced variation in the N-H bending peak in the FTIR spectroscopy study. A study of the ethanol/water system and quantitative kinetic analysis suggested that the microenvironment in the vicinity of the amino acid salts and the protonability/deprotonability of the amine moiety may determine their catalytic activity and mechanism. The best-approximation kinetic study suggested that random binding is the most probable catalytic mechanism for the Knoevenagel condensation catalyzed by the neutralized basic amino acid salts.
34 CFR 668.156 - Approved State process.
Code of Federal Regulations, 2010 CFR
2010-07-01
... student's eligibility for Title IV, HEA program funds must apply to the Secretary for approval of that... than a single standardized test; (3) Tutoring in basic verbal and quantitative skills, if appropriate... program completion. (d) A State process must— (1) Monitor on an annual basis each participating...
John Ralph; Larry L. Landucci
2010-01-01
This chapter will consider the basic aspects and findings of several forms of NMR spectroscopy, including separate discussions of proton, carbon, heteronuclear, and multidimensional NMR. Enhanced focus will be on 13C NMR, because of its qualitative and quantitative importance, followed by NMR's contributions to our understanding of lignin...
Integrated System Test of the Advanced Instructional System (AIS). Final Report.
ERIC Educational Resources Information Center
Lintz, Larry M.; And Others
The integrated system test for the Advanced Instructional System (AIS) was designed to provide quantitative information regarding training time reductions resulting from certain computer managed instruction features. The reliabilities of these features and of support systems were also investigated. Basic computer managed instruction reduced…
Genetic data analysis for plant and animal breeding
USDA-ARS?s Scientific Manuscript database
This book is an advanced textbook covering the application of quantitative genetics theory to analysis of actual data (both trait and DNA marker information) for breeding populations of crops, trees, and animals. Chapter 1 is an introduction to basic software used for trait data analysis. Chapter 2 ...
Turati, Laura; Moscatelli, Marco; Mastropietro, Alfonso; Dowell, Nicholas G; Zucca, Ileana; Erbetta, Alessandra; Cordiglieri, Chiara; Brenna, Greta; Bianchi, Beatrice; Mantegazza, Renato; Cercignani, Mara; Baggi, Fulvio; Minati, Ludovico
2015-03-01
The pool size ratio measured by quantitative magnetization transfer MRI is hypothesized to closely reflect myelin density, but their relationship has so far been confirmed mostly in ex vivo conditions. We investigate the correspondence between this parameter measured in vivo at 7.0 T, with Black Gold II staining for myelin fibres, and with myelin basic protein and beta-tubulin immunofluorescence in a hybrid longitudinal study of C57BL/6 and SJL/J mice treated with cuprizone, a neurotoxicant causing relatively selective myelin loss followed by spontaneous remyelination upon treatment suspension. Our results confirm that pool size ratio measurements correlate with myelin content, with the correlation coefficient depending on strain and staining method, and demonstrate the in vivo applicability of this MRI technique to experimental mouse models of multiple sclerosis. Copyright © 2015 John Wiley & Sons, Ltd.
Rushton, L
1996-01-01
This paper describes basic principles underlying the methodology for obtaining quantitative estimates of benzene exposure in the petroleum marketing and distribution industry. Work histories for 91 cases of leukemia and 364 matched controls (4 per case), identified from a cohort of oil distribution workers followed up to the end of 1992, were obtained, primarily from personnel records. Information on the distribution sites, more than 90% of which were closed at the time of data collection, was obtained from site visits and archive material. Industrial hygiene measurements taken under known conditions were assembled for different tasks. Where measured data were not available, estimates were adjusted using variables known to influence exposure, such as temperature, technology, percentage of benzene in the fuel handled, products handled, number of loads, and job activity. A quantitative estimate of dermal contact and peak exposure was also made. PMID:9118922
Kodra, Evan; Chen, Stone; Sheldon, Seth; Zik, Ory
2018-01-01
Individuals must have a quantitative understanding of the carbon footprint tied to their everyday decisions in order to make efficient, sustainable choices. We report research on individuals' innumeracy regarding their carbon footprint. In three studies that varied in scale and sample, respondents estimated the quantity of CO2 released when combusting a gallon of gasoline in comparison to several well-known metrics, including food calories and travel distance. Consistently, respondents estimated the quantity of CO2 from gasoline with significantly less accuracy than the other metrics, while exhibiting a tendency to underestimate CO2. Such a relative absence of carbon numeracy regarding even a basic consumption habit may limit the effectiveness of environmental policies and campaigns aimed at changing individual behavior. We discuss several caveats as well as opportunities for policy design that could help improve people's quantitative understanding of their carbon footprint. PMID:29723206
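The gasoline-to-CO2 arithmetic that respondents struggled with can be sketched from basic stoichiometry. The octane approximation and the density value below are illustrative assumptions, not figures from the study:

```python
# Back-of-envelope stoichiometry for CO2 from a gallon of gasoline,
# approximating gasoline as pure octane (C8H18). Density and the
# octane approximation are assumptions for illustration only.
M_OCTANE = 114.23   # g/mol, C8H18
M_CO2 = 44.01       # g/mol
DENSITY = 0.74      # kg/L, typical gasoline
GALLON_L = 3.785    # litres per US gallon

fuel_kg = DENSITY * GALLON_L            # ~2.8 kg of fuel per gallon
co2_per_fuel = 8 * M_CO2 / M_OCTANE     # kg CO2 per kg octane burned
co2_kg = fuel_kg * co2_per_fuel
print(round(co2_kg, 1))  # ~8.6 kg, near the EPA figure of ~8.9 kg/gallon
```

The small gap to the official figure comes from gasoline containing heavier components than pure octane.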
Photon-counting-based diffraction phase microscopy combined with single-pixel imaging
NASA Astrophysics Data System (ADS)
Shibuya, Kyuki; Araki, Hiroyuki; Iwata, Tetsuo
2018-04-01
We propose a photon-counting (PC)-based quantitative-phase imaging (QPI) method for use in diffraction phase microscopy (DPM) that is combined with a single-pixel imaging (SPI) scheme (PC-SPI-DPM). This combination of DPM with the SPI scheme overcomes a low optical throughput problem that has occasionally prevented us from obtaining quantitative-phase images in DPM through use of a high-sensitivity single-channel photodetector such as a photomultiplier tube (PMT). The introduction of a PMT allowed us to perform PC with ease and thus solved a dynamic range problem that was inherent to SPI. As a proof-of-principle experiment, we performed a comparison study of analogue-based SPI-DPM and PC-SPI-DPM for a 125-nm-thick indium tin oxide (ITO) layer coated on a silica glass substrate. We discuss the basic performance of the method and potential future modifications of the proposed system.
Focus groups: a useful tool for curriculum evaluation.
Frasier, P Y; Slatt, L; Kowlowitz, V; Kollisch, D O; Mintzer, M
1997-01-01
Focus group interviews have been used extensively in health services program planning, health education, and curriculum planning. However, with the exception of a few reports describing the use of focus groups for a basic science course evaluation and a clerkship's impact on medical students, the potential of focus groups as a tool for curriculum evaluation has not been explored. Focus groups are a valid stand-alone evaluation process, but they are most often used in combination with other quantitative and qualitative methods. Focus groups rely heavily on group interaction, combining elements of individual interviews and participant observation. This article compares the focus group interview with both quantitative and qualitative methods; discusses when to use focus group interviews; outlines a protocol for conducting focus groups, including a comparison of various styles of qualitative data analysis; and offers a case study, in which focus groups evaluated the effectiveness of a pilot preclinical curriculum.
Mars Observer: Mission toward a basic understanding of Mars
NASA Technical Reports Server (NTRS)
Albee, Arden L.
1992-01-01
The Mars Observer Mission will provide a spacecraft platform about Mars from which the entire Martian surface and atmosphere will be observed and mapped by remote sensing instruments for at least 1 Martian year. The scientific objectives for the Mission emphasize qualitative and quantitative determination of the elemental and mineralogical composition of the surface; measurement of the global surface topography, gravity field, and magnetic field; and the development of a synoptic data base of climatological conditions. The Mission will provide basic global understanding of Mars as it exists today and will provide a framework for understanding its past.
Nuclear Reactions in Micro/Nano-Scale Metal Particles
NASA Astrophysics Data System (ADS)
Kim, Y. E.
2013-03-01
Low-energy nuclear reactions in micro/nano-scale metal particles are described based on the theory of Bose-Einstein condensation nuclear fusion (BECNF). The BECNF theory is based on a single basic assumption capable of explaining the observed LENR phenomena; deuterons in metals undergo Bose-Einstein condensation. The BECNF theory is also a quantitative predictive physical theory. Experimental tests of the basic assumption and theoretical predictions are proposed. Potential application to energy generation by ignition at low temperatures is described. Generalized theory of BECNF is used to carry out theoretical analyses of recently reported experimental results for hydrogen-nickel system.
Applying Mixed Methods Techniques in Strategic Planning
ERIC Educational Resources Information Center
Voorhees, Richard A.
2008-01-01
In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…
Abstract Numeric Relations and the Visual Structure of Algebra
ERIC Educational Resources Information Center
Landy, David; Brookes, David; Smout, Ryan
2014-01-01
Formal algebras are among the most powerful and general mechanisms for expressing quantitative relational statements; yet, even university engineering students, who are relatively proficient with algebraic manipulation, struggle with and often fail to correctly deploy basic aspects of algebraic notation (Clement, 1982). In the cognitive tradition,…
The Utility of Single Subject Design Research
ERIC Educational Resources Information Center
Bennett, Kyle D.
2016-01-01
Single subject design (SSD) research is a quantitative approach used to investigate basic and applied research questions. It has been used for decades to examine issues of social importance such as those related to general and special education strategies, therapeutic approaches in mental health, community health practices, safety, and business…
e-Learning: The Student Experience
ERIC Educational Resources Information Center
Gilbert, Jennifer; Morton, Susan; Rowley, Jennifer
2007-01-01
The paper draws on in-depth qualitative comments from student evaluation of an e-learning module on an MSc in Information Technologies and Management, to develop a picture of their perspective on the experience. Questionnaires that yielded some basic quantitative data and a rich seam of qualitative data were administered. General questions on…
Avian life history profiles for use in the Markov chain nest productivity model (MCnest)
The Markov Chain nest productivity model, or MCnest, quantitatively estimates the effects of pesticides or other toxic chemicals on annual reproductive success of avian species (Bennett and Etterson 2013, Etterson and Bennett 2013). The Basic Version of MCnest was developed as a...
This paper evaluates the chemical stability of four arsenosugars using tetramethylammonium hydroxide (TMAOH) as an extraction solvent. This solvent was chosen because of the near quantitative removal of these arsenicals from difficult to extract seafood (oysters and shellfish). ...
Aliakrinskaia, I O
2010-01-01
The basic morphological, ethological, and physiological-biochemical adaptations of Monodonta turbinata to survival in the littoral zone were investigated in this work. Quantitative estimation of myoglobin content in radular tissues of mollusks inhabiting the Mediterranean Sea Basin has been carried out.
The next generation of training for Arabidopsis researchers: bioinformatics and quantitative biology
USDA-ARS?s Scientific Manuscript database
It has been more than 50 years since Arabidopsis (Arabidopsis thaliana) was first introduced as a model organism to understand basic processes in plant biology. A well-organized scientific community has used this small reference plant species to make numerous fundamental plant biology discoveries (P...
The Decomposition of Zinc Carbonate: Using Stoichiometry to Choose between Chemical Formulas
ERIC Educational Resources Information Center
DeMeo, Stephen
2004-01-01
The existence of different stoichiometries in the two basic zinc carbonates helps explain their decomposition behavior and allows students to determine which of the two compound formulas better describes zinc carbonate. The accuracy of the experiment makes it a viable means for students to perform various quantitative measurements.
Understanding the Damped SHM without ODEs
ERIC Educational Resources Information Center
Ng, Chiu-king
2016-01-01
Instead of solving ordinary differential equations (ODEs), the damped simple harmonic motion (SHM) is surveyed qualitatively from basic mechanics and quantitatively by means of a graph of velocity against displacement. In this way, the condition b ≥ √(4mk) for the occurrence of the non-oscillating critical damping and…
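The non-oscillation condition can also be checked directly from the characteristic equation of the standard damped oscillator m·x″ + b·x′ + k·x = 0, without solving the ODE; a minimal sketch with illustrative parameter values:

```python
import math

def is_oscillatory(m, b, k):
    """Roots of m*r**2 + b*r + k = 0 are complex iff b**2 < 4*m*k;
    complex roots give decaying oscillation, real roots do not."""
    return b * b < 4 * m * k

m, k = 1.0, 4.0
b_crit = math.sqrt(4 * m * k)  # critical damping: b = sqrt(4mk) = 4.0 here

print(is_oscillatory(m, 0.5 * b_crit, k))  # True  (underdamped)
print(is_oscillatory(m, b_crit, k))        # False (critical: no oscillation)
print(is_oscillatory(m, 2.0 * b_crit, k))  # False (overdamped)
```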
Teaching Online: Applying Need Theory to the Work-Family Interface
ERIC Educational Resources Information Center
Nicklin, Jessica M.; McNall, Laurel A.; Cerasoli, Christopher P.; Varga, Claire M.; McGivney, R. J.
2016-01-01
Using Warner and Hausdorf's (2009) framework, the authors empirically examined work-life balance and work outcomes among collegiate faculty teaching courses online. Quantitative and qualitative results from 138 online instructors demonstrated that basic psychological need satisfaction was related to higher levels of work-family enrichment, job…
Altitude errors arising from antenna/satellite attitude errors - Recognition and reduction
NASA Technical Reports Server (NTRS)
Godbey, T. W.; Lambert, R.; Milano, G.
1972-01-01
A review is presented of the three basic types of pulsed radar altimeter designs, as well as the source and form of altitude bias errors arising from antenna/satellite attitude errors in each design type. A quantitative comparison of the three systems was also made.
Estimating tree heights from shadows on vertical aerial photographs
Earl J. Rogers
1947-01-01
Aerial photographs are now being applied more and more to practical forestry - especially to forest survey. Many forest characteristics can be recognized on aerial photographs in greater detail than is possible through ground methods alone. The basic need is for tools and methods for interpreting the detail in quantitative terms.
Geary, Nori
2013-02-01
Analysis of the interactive effects of combinations of hormones or other manipulations with qualitatively similar individual effects is an important topic in basic and clinical endocrinology as well as other branches of basic and clinical research related to integrative physiology. Functional, as opposed to mechanistic, analyses of interactions rely on the concept of synergy, which can be defined qualitatively as a cooperative action or quantitatively as a supra-additive effect according to some metric for the addition of different dose-effect curves. Unfortunately, dose-effect curve addition is far from straightforward; rather, it requires the development of an axiomatic mathematical theory. I review the mathematical soundness, face validity, and utility of the most frequently used approaches to supra-additive synergy. These criteria highlight serious problems in the two most common synergy approaches, response additivity and Loewe additivity, which is the basis of the isobole and related response surface approaches. I conclude that there is no adequate, generally applicable, supra-additive synergy metric appropriate for endocrinology or any other field of basic and clinical integrative physiology. I recommend that these metrics be abandoned in favor of the simpler definition of synergy as a cooperative, i.e., nonantagonistic, effect. This simple definition avoids mathematical difficulties, is easily applicable, meets regulatory requirements for combination therapy development, and suffices to advance phenomenological basic research to mechanistic studies of interactions and clinical combination therapy research.
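The two synergy metrics criticized above can be contrasted on hypothetical Hill dose-effect curves; all curve parameters and the "observed" combination effect below are invented for illustration, not drawn from the review:

```python
def hill(dose, emax, ed50, n=1.0):
    """Hypothetical Hill dose-effect curve (illustrative parameters)."""
    return emax * dose**n / (ed50**n + dose**n)

def inverse_hill(effect, emax=100.0, ed50=10.0, n=1.0):
    """Dose of a single agent producing a given effect (inverse Hill)."""
    return ed50 * (effect / (emax - effect)) ** (1.0 / n)

# Two hypothetical agents with identical curves, combined at doses (a, b).
a, b = 5.0, 5.0
effect_a = hill(a, emax=100.0, ed50=10.0)   # ~33.3 for each agent alone
effect_b = hill(b, emax=100.0, ed50=10.0)
observed = 60.0                             # invented combination effect

# Metric 1 -- response additivity: expected effect = E_A(a) + E_B(b).
response_additive = effect_a + effect_b     # ~66.7 > 60 -> "no synergy"

# Metric 2 -- Loewe additivity (isobole): additive when
# a/A_e + b/B_e = 1, with A_e, B_e the single-agent doses producing
# the observed effect; an index < 1 is scored as synergy.
loewe_index = a / inverse_hill(observed) + b / inverse_hill(observed)

print(round(response_additive, 1), round(loewe_index, 2))  # -> 66.7 0.67
# The two metrics disagree on the same data, illustrating the
# inconsistencies discussed in the abstract.
```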
NASA Astrophysics Data System (ADS)
Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai
2018-06-01
Water resources are not only basic natural resources, but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of regional economy and society. Studies of water resources carrying capacity can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Based on the research of different scholars, major problems in the study of water resources carrying capacity were summarized as follows: the definition of water resources carrying capacity is not yet unified; the methods of carrying capacity quantification based on inconsistent definitions are poor in operability; the current quantitative research methods of water resources carrying capacity do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationship among water resources, the economy and society, and the ecological environment. Therefore, it is necessary to develop a better quantitative evaluation method to determine the regional water resources carrying capacity. This paper proposes a new approach to quantifying water resources carrying capacity (that is, through the compilation of the water resources balance sheet) to get a grasp of regional water resources depletion and water environmental degradation (as well as regional water resources stock assets and liabilities), figure out the squeeze of socioeconomic activities on the environment, and discuss the quantitative calculation methods and technical route of water resources carrying capacity which are able to embody the substance of sustainable development.
NASA Astrophysics Data System (ADS)
Cabello, Violeta
2017-04-01
This communication will present the advancement of an innovative analytical framework for the analysis of Water-Energy-Food-Climate Nexus termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools into an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may not be labelled a data-driven but a story-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to MAGIC project, the basic concepts of QST and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study and the limitations to find, access or estimate them will be presented alongside a reflection on the relation between analytical scales and data availability.
Alexander, Kathleen M; Olsen, Janette; Seiger, Cindy; Peterson, Teri S
2016-01-01
Student physical therapists are expected to learn and confidently perform technical skills while integrating nontechnical behavioral and cognitive skills in their examinations and interventions. The purpose of this study was to compare the self-confidence of entry-level doctoral student physical therapists during foundational assessment and musculoskeletal differential diagnosis courses with the students' competencies based on skills examinations. The study used both qualitative and quantitative procedures. Student physical therapists (n=27) participated in a basic assessment course followed by a musculoskeletal differential diagnosis course. The students completed confidence surveys prior to skills examinations in both courses. A random sample of students participated in focus groups, led by a researcher outside the physical therapy department. Student confidence did not correlate with competency scores. At the end of the basic clinical assessment course and the beginning of the differential diagnosis course, students' confidence was significantly below baseline. However, by the end of the differential diagnosis course, student confidence had returned to original baseline levels. Over three semesters, the students lost confidence and then regained confidence in their abilities. Additional experience and practice influenced perceived confidence. However, the regained confidence may have reflected poor self-appraisal skills rather than increased competency.
Pérez-Payá, E; Porcar, I; Gómez, C M; Pedrós, J; Campos, A; Abad, C
1997-08-01
A thermodynamic approach is proposed to quantitatively analyze the binding isotherms of peptides to model membranes as a function of one adjustable parameter, the actual peptide charge in solution z(p)+. The main features of this approach are a theoretical expression for the partition coefficient calculated from the molar free energies of the peptide in the aqueous and lipid phases, an equation proposed by S. Stankowski [(1991) Biophysical Journal, Vol. 60, p. 341] to evaluate the activity coefficient of the peptide in the lipid phase, and the Debye-Hückel equation that quantifies the activity coefficient of the peptide in the aqueous phase. To assess the validity of this approach we have studied, by means of steady-state fluorescence spectroscopy, the interaction of basic amphipathic peptides such as melittin and its dansylcadaverine analogue (DNC-melittin), as well as a new fluorescent analogue of substance P, SP (DNC-SP) with neutral phospholipid membranes. A consistent quantitative analysis of each binding curve was achieved. The z(p)+ values obtained were always found to be lower than the physical charge of the peptide. These z(p)+ values can be rationalized by considering that the peptide charged groups are strongly associated with counterions in buffer solution at a given ionic strength. The partition coefficients theoretically derived using the z(p)+ values were in agreement with those deduced from the Gouy-Chapman formalism. Ultimately, from the z(p)+ values the molar free energies for the free and lipid-bound states of the peptides have been calculated.
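The Debye-Hückel estimate of the aqueous-phase activity coefficient used in this kind of binding analysis can be sketched as follows; the extended form, the charge, and the ionic strength below are illustrative assumptions, not values from the study:

```python
import math

# Extended Debye-Hueckel form for the log10 activity coefficient of an
# ion of charge z at ionic strength I (mol/L); A = 0.509 for water at
# 25 C. The inputs below are illustrative only.
A = 0.509

def log10_gamma(z, ionic_strength):
    s = math.sqrt(ionic_strength)
    return -A * z**2 * s / (1.0 + s)

print(round(log10_gamma(2, 0.1), 3))  # -0.489 for a divalent ion at I = 0.1
```

Lower effective charges z(p)+, as reported for the counterion-associated peptides, give activity coefficients much closer to unity.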
NASA Astrophysics Data System (ADS)
Auricchio, F.; Conti, M.; Lefieux, A.; Morganti, S.; Reali, A.; Sardanelli, F.; Secchi, F.; Trimarchi, S.; Veneziani, A.
2014-10-01
The purpose of this study is to quantitatively evaluate the impact of endovascular repair on aortic hemodynamics. The study addresses the assessment of post-operative hemodynamic conditions of a real clinical case through patient-specific analysis, combining accurate medical image analysis and advanced computational fluid-dynamics (CFD). Although the main clinical concern was firstly directed to the endoluminal protrusion of the prosthesis, the CFD simulations have demonstrated that there are two other important areas where the local hemodynamics is impaired and a disturbed blood flow is present: the first one is the ostium of the subclavian artery, which is partially closed by the graft; the second one is the stenosis of the distal thoracic aorta. Besides the clinical relevance of these specific findings, this study highlights how CFD analyses make it possible to observe important flow effects resulting from the specific features of patient vessel geometries. Consequently, our results demonstrate the potential impact of computational biomechanics not only on the basic knowledge of physiopathology, but also on the clinical practice, thanks to a quantitative extraction of knowledge made possible by merging medical data and mathematical models.
Pedodiversity and Its Significance in the Context of Modern Soil Geography
NASA Astrophysics Data System (ADS)
Krasilnikov, P. V.; Gerasimova, M. I.; Golovanov, D. L.; Konyushkova, M. V.; Sidorova, V. A.; Sorokin, A. S.
2018-01-01
Methodological basics of the study and quantitative assessment of pedodiversity are discussed. It is shown that the application of various indices and models of pedodiversity can be feasible for solving three major issues in pedology: a comparative geographical analysis of different territories, a comparative historical analysis of soil development in the course of landscape evolution, and the analysis of relationships between biodiversity and pedodiversity. Analogous geographic concepts of geodiversity and landscape diversity are also discussed. Certain limitations in the use of quantitative estimates of pedodiversity related to their linkage to the particular soil classification systems and with the initial soil maps are considered. Problems of the interpretation of the results of pedodiversity assessments are emphasized. It is shown that scientific explanations of biodiversity cannot be adequately applied in soil studies. Promising directions of further studies of pedodiversity are outlined. They include the assessment of the functional diversity of soils on the basis of data on their properties, integration with geostatistical methods of evaluation of soil variability, and assessment of pedodiversity on different scales.
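One common quantitative index of the kind discussed above is the Shannon diversity index computed over soil classes; a minimal sketch with invented sample data (the soil names and counts are illustrative, not from the paper):

```python
import math
from collections import Counter

def shannon_index(labels):
    """Shannon diversity H' over class labels, a standard index used
    in pedodiversity assessments (sample data are illustrative)."""
    counts = Counter(labels)
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

soils = ["Podzol"] * 5 + ["Chernozem"] * 3 + ["Gleysol"] * 2
print(round(shannon_index(soils), 3))  # 1.03 for this 3-class sample
```

As the abstract notes, such indices are tied to the soil classification used: regrouping the same profiles into coarser or finer classes changes the value.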
Knowledge and beliefs of Malaysian adolescents regarding cancer.
Al-Naggar, Redhwan Ahmed; Jillson, Irene Anne; Abu-Hamad, Samir; Mumford, William; Bobryshev, Yuri V
2015-01-01
Few studies have explored the knowledge and attitudes of adolescents toward cancer prevention and treatment. This lack of research and its potential utility in the development of new educational initiatives and screening methods, or the reconstruction of existing ones, provided the impetus for this study. The primary research aim was to assess secondary school student knowledge of cancer and determine whether or not they possessed basic knowledge of cancer symptoms, risk factors, and treatments and to determine the relationship between cancer knowledge and key demographic factors. The Management and Science University conducted a cross-sectional study analyzing responses through cross-tabulation with the socio-demographic data collected. The findings of our quantitative analysis suggest that Malaysian youth generally possess a moderate knowledge about cancer. Quantitative analyses found that socioeconomic inequalities and bias in education present as important factors contributing to cancer awareness, prevention, and treatment among Malaysian adolescents. The findings indicate that Malaysian youth generally possess a moderate knowledge about cancer but the current deficiencies in initiatives directed to cancer awareness continue to hinder the improvement in prevention of cancer among Malaysian adolescents.
Alexis, Matamoro-Vidal; Isaac, Salazar-Ciudad; David, Houle
2015-01-01
One of the aims of evolutionary developmental biology is to discover the developmental origins of morphological variation. The discipline has mainly focused on qualitative morphological differences (e.g., presence or absence of a structure) between species. Studies addressing subtle, quantitative variation are less common. The Drosophila wing is a model for the study of development and evolution, making it suitable to investigate the developmental mechanisms underlying the subtle quantitative morphological variation observed in nature. Previous reviews have focused on the processes involved in wing differentiation, patterning and growth. Here, we investigate what is known about how the wing achieves its final shape, and what variation in development is capable of generating the variation in wing shape observed in nature. Three major developmental stages need to be considered: larval development, pupariation, and pupal development. The major cellular processes involved in the determination of tissue size and shape are cell proliferation, cell death, oriented cell division and oriented cell intercalation. We review how variation in temporal and spatial distribution of growth and transcription factors affects these cellular mechanisms, which in turn affects wing shape. We then discuss which aspects of the wing morphological variation are predictable on the basis of these mechanisms. PMID:25619644
Zhang, Dongdong; Liu, Xuejiao; Liu, Yu; Sun, Xizhuo; Wang, Bingyuan; Ren, Yongcheng; Zhao, Yang; Zhou, Junmei; Han, Chengyi; Yin, Lei; Zhao, Jingzhi; Shi, Yuanyuan; Zhang, Ming; Hu, Dongsheng
2017-10-01
Leisure-time physical activity (LTPA) has been suggested to reduce risk of metabolic syndrome (MetS). However, a quantitative comprehensive assessment of the dose-response association between LTPA and incident MetS has not been reported. We performed a meta-analysis of studies assessing the risk of MetS with LTPA. MEDLINE via PubMed and EMBase databases were searched for relevant articles published up to March 13, 2017. Random-effects models were used to estimate the summary relative risk (RR) of MetS with LTPA. Restricted cubic splines were used to model the dose-response association. We identified 16 articles (18 studies including 76,699 participants and 13,871 cases of MetS). We found a negative linear association between LTPA and incident MetS, with a reduction of 8% in MetS risk per 10 metabolic equivalent of task (MET) h/week increment. According to the restricted cubic splines model, risk of MetS was reduced 10% with LTPA performed according to the basic guideline-recommended level of 150 min of moderate PA (MPA) per week (10 MET h/week) versus inactivity (RR=0.90, 95% CI 0.86-0.94). It was reduced 20% and 53% with LTPA at twice (20 MET h/week) and seven times (70 MET h/week) the basic recommended level (RR=0.80, 95% CI 0.74-0.88 and RR=0.47, 95% CI 0.34-0.64, respectively). Our findings provide quantitative data suggesting that any amount of LTPA is better than none and that LTPA substantially exceeding the current LTPA guidelines is associated with an additional reduction in MetS risk. Copyright © 2017. Published by Elsevier Inc.
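The per-increment summary quoted above implies a simple log-linear risk model; a sketch using only the figures given in the abstract:

```python
# Log-linear summary from the abstract: 8% lower MetS risk per
# 10 MET h/week of LTPA, i.e. RR = 0.92 per 10-unit increment.
RR_PER_10_MET = 0.92

def relative_risk(met_h_per_week):
    return RR_PER_10_MET ** (met_h_per_week / 10.0)

print(round(relative_risk(10), 2))  # 0.92 at the basic guideline level
print(round(relative_risk(20), 2))  # 0.85 under the linear summary
# The spline model quoted in the abstract gives 0.90 and 0.80 at the
# same exposures, so the linear per-increment figure is approximate.
```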
Psychopathology of basic stages of schizophrenia in view of formal thought disturbances.
Gross, G; Huber, G
1985-01-01
Psychopathological, nosological, and prognostic aspects of basic stages and basic symptoms, with particular consideration of formal thought disorders, are outlined. In view of the far-reaching overlap of the psychopathological pictures of the pre- and postpsychotic basic stages, a Bonn Scale for the Assessment of Basic Symptoms (BSABS) covering all types of basic stages was constructed. Subjective cognitive thought disorders were recorded in 69% of the patients in pure defective states, in 78% in postpsychotic reversible basic stages, and in 67% in prodromes. In contrast to incoherence of thoughts, including the symptoms of the endogenomorphic-schizophrenic axial syndrome (Berner), these thought disorders are registered only on the basis of the patients' reports and not through observation by the investigator. The difference between subjective and objective thought disorders is presumably conditioned primarily by differences in the degree and only secondarily by the psychopathological quality of the disorders. If the criteria concerning formal thought disorders and affective blunting of the schizophrenic axial syndrome or of the SANS (Andreasen) are fulfilled, the patient as a rule loses the ability to perceive, communicate, and cope with the disorders, and at the same time there is a break from a merely quantitative to a qualitatively abnormal phenomenon. The presence or absence of subjective or objective formal thought disorders at the beginning of the disease had no significant influence on the long-term outcome in the main sample of the Bonn study. Proceeding from the initial psychopathological syndromes, 54% of the female hebephrenics with the most unfavorable long-term prognosis showed incoherence of thoughts in the first 2 years of the illness; in contrast, incoherence was seen in only 16% of the male hebephrenics, for whom the long-term outcome did not differ from that of the whole sample.
This and other data from the Bonn schizophrenia study seem to argue in favor of the assumption that typical incoherence of thoughts may be regarded as a criterion of unfavorable prognosis only when the phenomenon appears within the context of a hebephrenic initial syndrome at the beginning of the schizophrenic disease.
Wojczyńska, A; Leiggener, C S; Bredell, M; Ettlin, D A; Erni, S; Gallo, L M; Colombo, V
2016-10-01
The aim of this study was to qualitatively and quantitatively describe the biomechanics of existing total alloplastic reconstructions of temporomandibular joints (TMJ). Fifteen patients with unilateral or bilateral TMJ total joint replacements and 15 healthy controls were evaluated via dynamic stereometry technology. This non-invasive method combines three-dimensional imaging of the subject's anatomy with jaw tracking. It provides an insight into the patient's jaw joint movements in real time and provides a quantitative evaluation. The patients were also evaluated clinically for jaw opening, protrusive and laterotrusive movements, pain, interference with eating, and satisfaction with the joint replacements. The qualitative assessment revealed that condyles of bilateral total joint replacements displayed similar basic motion patterns to those of unilateral prostheses. Quantitatively, mandibular movements of artificial joints during opening, protrusion, and laterotrusion were all significantly shorter than those of controls. A significantly restricted mandibular range of motion in replaced joints was also observed clinically. Fifty-three percent of patients suffered from chronic pain at rest and 67% reported reduced chewing function. Nonetheless, patients declared a high level of satisfaction with the replacement. This study shows that in order to gain a comprehensive understanding of complex therapeutic measures, a multidisciplinary approach is needed. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Gucciardi, Daniel F; Jackson, Ben
2015-01-01
Fostering individuals' long-term participation in activities that promote positive development such as organised sport is an important agenda for research and practice. We integrated the theories of planned behaviour (TPB) and basic psychological needs (BPN) to identify factors associated with young adults' continuation in organised sport over a 12-month period. Prospective study, including an online psycho-social assessment at Time 1 and an assessment of continuation in sport approximately 12 months later. Participants (N=292) aged between 17 and 21 years (M=18.03; SD=1.29) completed an online survey assessing the theories of planned behaviour and basic psychological needs constructs. Bayesian structural equation modelling (BSEM) was employed to test the hypothesised theoretical sequence, using informative priors for structural relations based on empirical and theoretical expectations. The analyses revealed support for the robustness of the hypothesised theoretical model in terms of the pattern of relations as well as the direction and strength of associations among the constructs derived from quantitative summaries of existing research and theoretical expectations. The satisfaction of basic psychological needs was associated with more positive attitudes, higher levels of perceived behavioural control, and more favourable subjective norms; positive attitudes and perceived behavioural control were associated with higher behavioural intentions; and both intentions and perceived behavioural control predicted sport continuation. This study demonstrated the utility of Bayesian structural equation modelling for testing the robustness of an integrated theoretical model, which is informed by empirical evidence from meta-analyses and theoretical expectations, for understanding sport continuation. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
The Merensky Reef in the Chineisky Pluton (Siberia)? A myth or a reality?
NASA Astrophysics Data System (ADS)
Zhitova, L.; Sharapov, V.; Zhukova, I.
2006-12-01
It is the dream of every geologist to find a `Merensky Reef' in each layered basic intrusion, and scientists have tried various techniques to turn this dream into reality. The most promising approach is probably a combination of physicochemical and computer modeling of layered basic intrusion crystallization together with in situ studies of fluid and melt inclusions. This combination allows us to do the following: 1. To study boundary conditions for separation of a low-density gas phase and salt melt from the crystallizing primary basic melt in a large magma chamber. 2. To determine correct quantitative parameters for the formation of residual fluid-bearing brines extracting high metal concentrations. 3. To compute critical levels for substance differentiation at phase, geochemical and other `barriers' in these basic mantle-crust ore magmatic systems. 4. To model metal extraction, transportation and deposition at these `barriers' for systems of various `silicate melt - residual salt brines' ratios under the conditions of continental lithosphere. Comparison of real and modeled data allows us to conclude whether the formation of a narrow zone of high metal concentration is possible at these critical levels (phase and geochemical `barriers'). The above algorithm has been applied to the Chineisky Pluton (Transbaikal region, Siberia). Fortunately, we have found our own `Merensky Reef', which turned out to be a PGE-enriched marginal zone of the Chineisky Pluton formed due to the specific fluid regime of crystallization! This work was supported by the Ministry for Russian Science and Education, Grant #DSP.2.1.1.702.
Ahlström, Christine; Peletier, Lambertus A; Gabrielsson, Johan
2011-10-09
In this paper we quantitatively evaluate two feedback systems with a focus on rate and extent of tolerance and rebound development. In the two feedback systems, the regulation of turnover of response is governed by one or several moderators. In the basic system, one single moderator inhibits the formation of response. This system has been applied to cortisol secretion and serotonin reuptake inhibition. The basic system has been extended to adequately describe nicotinic acid (NiAc)-induced changes in non-esterified fatty acids (NEFA). In the extended system, the feedback is described by a cascade of moderators where the first inhibits formation of response and the last stimulates loss of response. The objectives of this paper were to analyze these systems from a mathematical/analytical and quantitative point of view and to present simulations with different parameter settings and dosing regimens in order to highlight the intrinsic behaviour of these systems and to present expressions and graphs that are applicable for quantification of rate and extent of tolerance and rebound. The dynamics of the moderators (k(tol)) compared to the dynamics of the response (k(out)), was shown to be important for the behaviour of both systems. For instance, slow dynamics of the moderator compared to the response (k(tol)
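The basic single-moderator system described above is commonly written as a pair of turnover equations, dR/dt = k_in·S(t)/M − k_out·R and dM/dt = k_tol·(R − M). The Euler-integration sketch below uses hypothetical parameter values and a square-wave drug stimulus S(t); it is an illustration of this class of model, not the paper's fitted NiAc/NEFA system.

```python
def simulate(k_in=1.0, k_out=1.0, k_tol=0.1, drug=2.0,
             t_on=5.0, t_off=25.0, t_end=60.0, dt=0.001):
    """Euler sketch of the basic feedback system: one moderator M
    inhibits the formation of response R.  Parameters are hypothetical.
        dR/dt = k_in * S(t) / M - k_out * R
        dM/dt = k_tol * (R - M)
    S(t) is a square-wave drug stimulus on production."""
    R = M = (k_in / k_out) ** 0.5   # baseline steady state: R = M = sqrt(k_in/k_out)
    t, traj = 0.0, []
    while t < t_end:
        S = drug if t_on <= t < t_off else 1.0
        dR = k_in * S / M - k_out * R
        dM = k_tol * (R - M)
        R, M, t = R + dR * dt, M + dM * dt, t + dt
        traj.append((t, R))
    return traj
```

With k_tol << k_out, the response first rises under the stimulus, then partially adapts as the moderator accumulates (tolerance); after washout it undershoots the baseline before slowly recovering (rebound), which is the qualitative behaviour the paper quantifies.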
Han, Xueying; Williams, Sharon R; Zuckerman, Brian L
2018-01-01
The translation of biomedical research from basic knowledge to application has been a priority at the National Institutes of Health (NIH) for many years. Tracking the progress of scientific research and knowledge through the translational process is difficult due to variation in the definition of translational research as well as the identification of benchmarks for the spread and application of biomedical research; quantitatively tracking this process is even more difficult. Using a simple and reproducible method to assess whether publications are translational, we examined NIH R01 behavioral and social science research (BSSR) awards funded between 2008 and 2014 to determine whether there are differences in the percent of translational research publications produced by basic and applied research awards. We also assessed the percent of translational research publications produced by the Clinical and Translational Science Awards (CTSA) program to evaluate whether targeted translational research awards result in increased translational research. We found that 3.9% of publications produced by basic research awards were translational; that the percent of translational research publications produced by applied research awards is approximately double that of basic research awards (7.4%); and that targeted translational research awards from the CTSA program produced the highest percentage of translational research publications (13.4%). In addition, we assessed differences in time to first publication, time to first citation, and publication quality by award type (basic vs. applied), and whether an award (or publication) is translational.
Simultaneous extraction and quantitation of several bioactive amines in cheese and chocolate.
Baker, G B; Wong, J T; Coutts, R T; Pasutto, F M
1987-04-17
A method is described for simultaneous extraction and quantitation of the amines 2-phenylethylamine, tele-methylhistamine, histamine, tryptamine, m- and p-tyramine, 3-methoxytyramine, 5-hydroxytryptamine, cadaverine, putrescine, spermidine and spermine. This method is based on extractive derivatization of the amines with a perfluoroacylating agent, pentafluorobenzoyl chloride, under basic aqueous conditions. Analysis was done on a gas chromatograph equipped with an electron-capture detector and a capillary column system. The procedure is relatively rapid and provides derivatives with good chromatographic properties. Its application to analysis of the above amines in cheese and chocolate products is described.
Measuring the Enzyme Activity of Arabidopsis Deubiquitylating Enzymes.
Kalinowska, Kamila; Nagel, Marie-Kristin; Isono, Erika
2016-01-01
Deubiquitylating enzymes, or DUBs, are important regulators of ubiquitin homeostasis and substrate stability, though the molecular mechanisms of most of the DUBs in plants are not yet understood. As different ubiquitin chain types are implicated in different biological pathways, it is important to analyze the enzyme characteristics when studying a DUB. Quantitative analysis of DUB activity is also important for determining enzyme kinetics and the influence of DUB-binding proteins on enzyme activity. Here, we show methods to analyze DUB activity using immunodetection, Coomassie Brilliant Blue staining, and fluorescence measurement that can be useful for understanding the basic characteristics of DUBs.
Quantifying iron content in magnetic resonance imaging.
Ghassaban, Kiarash; Liu, Saifeng; Jiang, Caihong; Haacke, E Mark
2018-04-25
Measuring iron content has practical clinical indications in the study of diseases such as Parkinson's disease, Huntington's disease, ferritinopathies and multiple sclerosis as well as in the quantification of iron content in microbleeds and oxygen saturation in veins. In this work, we review the basic concepts behind imaging iron using T2, T2*, T2', phase and quantitative susceptibility mapping in the human brain, liver and heart, followed by the applications of in vivo iron quantification in neurodegenerative diseases, iron tagged cells and ultra-small superparamagnetic iron oxide (USPIO) nanoparticles. Copyright © 2018 Elsevier Inc. All rights reserved.
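As a minimal example of the T2*-based quantification mentioned above: iron shortens T2*, so the relaxation rate R2* = 1/T2* rises with iron load and is commonly estimated by a log-linear fit of multi-echo gradient-echo magnitude data. The sketch below uses synthetic signals; echo times and amplitudes are illustrative, not from the review.

```python
import math

def fit_r2star(echo_times_ms, signals):
    """Estimate R2* (in 1/ms) from multi-echo magnitude data via a
    least-squares line through ln(S) versus TE, assuming the
    monoexponential decay S(TE) = S0 * exp(-TE * R2*)."""
    n = len(echo_times_ms)
    xs, ys = echo_times_ms, [math.log(s) for s in signals]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
            sum((x - xbar) ** 2 for x in xs)
    return -slope  # decay rate R2*

# synthetic decay with R2* = 0.05 /ms (i.e. T2* = 20 ms)
tes = [5.0, 10.0, 15.0, 20.0, 30.0]
sig = [100.0 * math.exp(-0.05 * te) for te in tes]
print(round(fit_r2star(tes, sig), 4))  # → 0.05
```

In practice the fitted R2* map is then converted to iron concentration via an empirical calibration for the tissue in question.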
The energy expenditure of normal and pathologic gait.
Waters, R L; Mulroy, S
1999-07-01
Physiological energy expenditure measurement has proven to be a reliable method of quantitatively assessing the penalties imposed by gait disability. The purpose of this review is to outline the basic principles of exercise physiology relevant to human locomotion; detail the energy expenditure of normal walking; and summarize the results of energy expenditure studies performed in patients with specific neurologic and orthopedic disabilities. The magnitude of the disabilities and the patients' capacity to tolerate the increased energy requirements are compared. This paper also will examine the effectiveness of rehabilitation interventions at mitigating the energetic penalties of disability during ambulation.
A disciplined approach to capital: today's healthcare imperative.
Dupuis, Patrick J; Kaufman, Kenneth
2007-07-01
BJC HealthCare's experience exemplifies several basic principles of a finance-based approach to capital. Organizations that adopt this approach look to improve processes first, remove costs second, and spend capital last. Multiyear planning is required to quantitatively identify the profitability and liquidity requirements of strategic initiatives and address essential funding and financing issues.
NASA Astrophysics Data System (ADS)
Liu, Ji-Hua
2018-03-01
Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11472187 and 11602166), the National Basic Research Program of China (Grant No. 2014CB046805), and the Natural Science Foundation of Tianjin, China (Grant No. 16JCYBJC40500).
Qualitative and Quantitative Allocations of Program Funds in a Non-Profit Institution.
ERIC Educational Resources Information Center
Brown, Edward K.
Through a generalized application of the principles of programing-planning-budgeting (PPB), a process was devised for describing the methods of resource allocation in a nonprofit institution. By categorizing pupil service inputs according to basic skills, instruction, and supportive services it became possible to identify meaningful service input…
Educational Forecasting Methodologies: State of the Art, Trends, and Highlights.
ERIC Educational Resources Information Center
Hudson, Barclay; Bruno, James
This overview of both quantitative and qualitative methods of educational forecasting is introduced by a discussion of a general typology of forecasting methods. In each of the following sections, discussion follows the same general format: a number of basic approaches are identified (e.g. extrapolation, correlation, systems modelling), and each…
Quantitative Assay for Starch by Colorimetry Using a Desktop Scanner
ERIC Educational Resources Information Center
Matthews, Kurt R.; Landmark, James D.; Stickle, Douglas F.
2004-01-01
The procedure to produce a standard curve for starch concentration measurement by image analysis, using a color scanner and a computer for data acquisition and color analysis, is described. The color analysis is performed by a Visual Basic program that measures red, green, and blue (RGB) color intensities for pixels within the scanner image.
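A minimal sketch of the same idea in Python (the original program is Visual Basic; the region, pixel values, and standards below are hypothetical): average the RGB values over a scanned well, then fit a least-squares line of a chosen channel against the known standard concentrations.

```python
def mean_rgb(pixels):
    """Mean R, G, B over a region of interest; `pixels` is a list of
    (r, g, b) tuples read from the scanner image (hypothetical here)."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def standard_curve(concentrations, channel_means):
    """Least-squares line (slope, intercept) of one colour channel's
    mean intensity against known starch concentrations; intensity falls
    as the blue starch-iodine complex darkens, so the slope is negative."""
    n = len(concentrations)
    cbar = sum(concentrations) / n
    mbar = sum(channel_means) / n
    slope = sum((c - cbar) * (m - mbar)
                for c, m in zip(concentrations, channel_means)) / \
            sum((c - cbar) ** 2 for c in concentrations)
    return slope, mbar - slope * cbar

# hypothetical standards: red intensity drops linearly with concentration
slope, intercept = standard_curve([0.0, 1.0, 2.0, 3.0],
                                  [200.0, 150.0, 100.0, 50.0])
print(slope, intercept)  # → -50.0 200.0
```

An unknown sample's concentration is then read back from the line as (intensity − intercept) / slope.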
Assessing Motivation To Read. Instructional Resource No. 14.
ERIC Educational Resources Information Center
Gambrell, Linda B.; And Others
The Motivation to Read Profile (MRP) is a public-domain instrument designed to provide teachers with an efficient and reliable way to assess reading motivation qualitatively and quantitatively by evaluating students' self-concept as readers and the value they place on reading. The MRP consists of two basic instruments: the Reading Survey (a…
Genetics, Environment, and Behavior: Implications for Educational Policy.
ERIC Educational Resources Information Center
Ehrman, Lee, Ed.; And Others
The contents of this book, which presents the fruits of one of a series of conferences organized by the National Research Council Committee on Basic Research in Education, includes 12 papers, with discussion and comments: "Introductory Remarks," Ernst W. Caspari; "Quantitative Aspects of Genetics and Environment in the Determination of Behavior,"…
Friulian: The Friulian Language in Education in Italy. Regional Dossiers Series
ERIC Educational Resources Information Center
Petris, Cinzia, Comp.
2014-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Catalan: The Catalan Language in Education in Spain, 2nd Edition. Regional Dossiers Series
ERIC Educational Resources Information Center
Areny, Maria, Comp.; Mayans, Pere, Comp.; Forniès, David, Comp.
2013-01-01
Regional dossiers aim at providing a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B
2010-11-15
The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from storage and filling facility of chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
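Under the usual independence assumption, gate probabilities in a fault tree combine as products (AND gates) and complements of products (OR gates). The sketch below uses purely illustrative basic-event probabilities and gate structure, not the chlor-alkali values from the study.

```python
from functools import reduce

def p_and(probs):
    """AND gate: the output event requires every input event
    (basic events assumed independent)."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def p_or(probs):
    """OR gate: the output event requires at least one input event
    (basic events assumed independent)."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical mini-tree (not the study's chlorine tree): a release
# occurs if the relief valve fails AND the alarm fails, OR the
# transfer hose ruptures.
p_release = p_or([p_and([1e-2, 1e-3]), 1e-5])
print(p_release)  # ≈ 2.0e-5
```

Sensitivity of the top event to each basic event can then be probed by perturbing one probability at a time and recomputing the tree, which is the spirit of the percentage-contribution analysis described.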
A multiscale model on hospital infections coupling macro and micro dynamics
NASA Astrophysics Data System (ADS)
Wang, Xia; Tang, Sanyi
2017-09-01
A multiscale model of hospital infections coupling the micro model of bacterial growth with the macro model describing transmission of the bacteria among patients and health care workers (HCWs) was established to investigate the effects of antibiotic treatment on that transmission. The model was formulated by viewing the transmission rate from infected patients to HCWs and the shedding rate of bacteria from infected patients to the environment as saturated functions of the within-host bacterial load. The equilibria and the basic reproduction number of the coupled system were studied, and the global dynamics of the disease-free equilibrium and the endemic equilibrium were analyzed in detail by constructing two Lyapunov functions. Furthermore, the effects of drug treatment in the within-host model on the basic reproduction number and the dynamics of the coupled model were studied by coupling a pharmacokinetics model with the within-host model. Sensitivity analysis indicated that the growth rate of the bacteria, the maximum drug effect and the dosing interval are the three most sensitive parameters contributing to the basic reproduction number. Thus, adopting "wonder" drugs to decrease the growth rate of the bacteria or to increase the drug's effect is the most effective measure, but changing the dosage regimen is also effective. A quantitative criterion for choosing the best dosage regimen can also be obtained from the numerical results.
NASA Astrophysics Data System (ADS)
Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.
We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using them to predict the sensor activities for test analytes not considered in the training set for the model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict response of an existing sensing film to new target analytes.
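The modelling step can be illustrated with a plain least-squares fit of sensor response against a hypothetical descriptor matrix. The study itself selects descriptor subsets with Genetic Function Approximation; that selection step is not reproduced here, and all data below are invented for illustration.

```python
import numpy as np

# Hypothetical training data: rows are analytes, columns are molecular
# descriptors (e.g. polarizability, hydrogen-bond counts); y holds the
# measured film-resistance changes.  Ordinary least squares stands in
# for the GFA-selected models purely for illustration.
X = np.array([[1.0, 0.2],
              [2.0, 0.1],
              [3.0, 0.4],
              [4.0, 0.3]])
y = np.array([1.2, 2.1, 3.4, 4.3])

Xa = np.hstack([X, np.ones((X.shape[0], 1))])   # append intercept column
coef, *_ = np.linalg.lstsq(Xa, y, rcond=None)

def predict(descriptors):
    """Predict the sensor response for a new analyte's descriptors."""
    return float(np.append(descriptors, 1.0) @ coef)
```

Calling `predict` on descriptor vectors held out of the training set mirrors the external validation of predictive ability described in the abstract.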
A Soft, Wearable Microfluidic Device for the Capture, Storage, and Colorimetric Sensing of Sweat
Koh, Ahyeon; Kang, Daeshik; Xue, Yeguang; Lee, Seungmin; Pielak, Rafal M.; Kim, Jeonghyun; Hwang, Taehwan; Min, Seunghwan; Banks, Anthony; Bastien, Philippe; Manco, Megan C.; Wang, Liang; Ammann, Kaitlyn R.; Jang, Kyung-In; Won, Phillip; Han, Seungyong; Ghaffari, Roozbeh; Paik, Ungyu; Slepian, Marvin J.; Balooch, Guive; Huang, Yonggang; Rogers, John A.
2017-01-01
Capabilities in health monitoring via capture and quantitative chemical analysis of sweat could complement, or potentially obviate the need for, approaches based on sporadic assessment of blood samples. Established sweat monitoring technologies use simple fabric swatches and are limited to basic analysis in controlled laboratory or hospital settings. We present a collection of materials and device designs for soft, flexible and stretchable microfluidic systems, including embodiments that integrate wireless communication electronics, which can intimately and robustly bond to the surface of skin without chemical and mechanical irritation. This integration defines access points for a small set of sweat glands such that perspiration spontaneously initiates routing of sweat through a microfluidic network and set of reservoirs. Embedded chemical analyses respond in colorimetric fashion to markers such as chloride and hydronium ions, glucose and lactate. Wireless interfaces to digital image capture hardware serve as a means for quantitation. Human studies demonstrated the functionality of this microfluidic device during fitness cycling in a controlled environment and during long-distance bicycle racing in arid, outdoor conditions. The results include quantitative values for sweat rate, total sweat loss, pH and concentration of both chloride and lactate. PMID:27881826
Biophysics at the Boundaries: The Next Problem Sets
NASA Astrophysics Data System (ADS)
Skolnick, Malcolm
2009-03-01
The interface between physics and biology is one of the fastest growing subfields of physics. As knowledge of such topics as cellular processes and complex ecological systems advances, researchers have found that progress in understanding these and other systems requires application of more quantitative approaches. Today, there is a growing demand for quantitative and computational skills in biological research and the commercialization of that research. The fragmented teaching of science in our universities still leaves biology outside the quantitative and mathematical culture that is the foundation of physics. This is particularly inopportune at a time when the needs for quantitative thinking about biological systems are exploding. More physicists should be encouraged to become active in research and development in the growing application fields of biophysics, including molecular genetics, biomedical imaging, tissue generation and regeneration, drug development, prosthetics, neural and brain function, kinetics of nonequilibrium open biological systems, metabolic networks, biological transport processes, large-scale biochemical networks and stochastic processes in biochemical systems, to name a few. In addition to moving into basic research in these areas, there is increasing opportunity for physicists in industry, beginning with entrepreneurial roles in taking research results out of the laboratory and extending to the companies that perfect and market the inventions and developments that physicists produce. In this talk we will identify and discuss emerging opportunities for physicists in biophysical and biotechnological pursuits ranging from basic research through development of applications and commercialization of results.
This will include discussion of the roles of physicists in non-traditional areas apart from academia such as patent law, financial analysis and regulatory science and the problem sets assigned in education and training that will enable future biophysicists to fill these roles.
Contrast imaging in mouse embryos using high-frequency ultrasound.
Denbeigh, Janet M; Nixon, Brian A; Puri, Mira C; Foster, F Stuart
2015-03-04
Ultrasound contrast-enhanced imaging can convey essential quantitative information regarding tissue vascularity and perfusion and, in targeted applications, facilitate the detection and measure of vascular biomarkers at the molecular level. Within the mouse embryo, this noninvasive technique may be used to uncover basic mechanisms underlying vascular development in the early mouse circulatory system and in genetic models of cardiovascular disease. The mouse embryo also presents as an excellent model for studying the adhesion of microbubbles to angiogenic targets (including vascular endothelial growth factor receptor 2 (VEGFR2) or αvβ3) and for assessing the quantitative nature of molecular ultrasound. We therefore developed a method to introduce ultrasound contrast agents into the vasculature of living, isolated embryos. This allows freedom in terms of injection control and positioning, reproducibility of the imaging plane without obstruction and motion, and simplified image analysis and quantification. Late gestational stage (embryonic day (E)16.6 and E17.5) murine embryos were isolated from the uterus, gently exteriorized from the yolk sac and microbubble contrast agents were injected into veins accessible on the chorionic surface of the placental disc. Nonlinear contrast ultrasound imaging was then employed to collect a number of basic perfusion parameters (peak enhancement, wash-in rate and time to peak) and quantify targeted microbubble binding in an endoglin mouse model. We show the successful circulation of microbubbles within living embryos and the utility of this approach in characterizing embryonic vasculature and microbubble behavior.
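The basic perfusion parameters named above can be read off a contrast time-intensity curve. The sketch below uses synthetic data and common (but not paper-specific) definitions, taking the wash-in rate as the steepest pre-peak upslope.

```python
def perfusion_parameters(times, intensities, baseline=0.0):
    """Bolus-perfusion metrics from a contrast time-intensity curve:
    peak enhancement (PE), time to peak (TTP), and wash-in rate taken
    here as the maximum pre-peak upslope.  Data below are synthetic."""
    enh = [i - baseline for i in intensities]
    peak = max(enh)
    k = enh.index(peak)
    ttp = times[k]
    wash_in = max(
        (enh[j + 1] - enh[j]) / (times[j + 1] - times[j]) for j in range(k)
    ) if k > 0 else 0.0
    return peak, ttp, wash_in

t = [0, 1, 2, 3, 4, 5, 6]          # seconds after injection (synthetic)
s = [0, 2, 6, 9, 10, 8, 6]         # contrast intensity (arbitrary units)
print(perfusion_parameters(t, s))  # → (10, 4, 4.0)
```

In a real analysis the curve would first be averaged over a region of interest and smoothed or fitted (e.g. to a bolus model) before these parameters are extracted.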
An Eigensystem Realization Algorithm (ERA) for modal parameter identification and model reduction
NASA Technical Reports Server (NTRS)
Juang, J. N.; Pappa, R. S.
1985-01-01
A method, called the Eigensystem Realization Algorithm (ERA), is developed for modal parameter identification and model reduction of dynamic systems from test data. A new approach is introduced in conjunction with the singular value decomposition technique to derive the basic formulation of minimum order realization which is an extended version of the Ho-Kalman algorithm. The basic formulation is then transformed into modal space for modal parameter identification. Two accuracy indicators are developed to quantitatively identify the system modes and noise modes. For illustration of the algorithm, examples are shown using simulation data and experimental data for a rectangular grid structure.
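The ERA steps described above (Hankel matrices of impulse-response samples, singular value decomposition, and a reduced-order realization) can be sketched for a single-input, single-output system as follows; the impulse response is synthetic and the sketch omits the modal transformation and accuracy indicators.

```python
import numpy as np

def era(markov, n):
    """Eigensystem Realization Algorithm (SISO sketch): markov[k] ~ C A^k B
    are impulse-response samples; n is the model order retained after the
    singular value decomposition of the Hankel matrix."""
    m = (len(markov) - 1) // 2
    # Hankel matrix and its one-step-shifted companion
    H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
    H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
    U, s, Vt = np.linalg.svd(H0)
    Un, Vn = U[:, :n], Vt[:n, :]
    S_half = np.diag(np.sqrt(s[:n]))
    S_inv = np.diag(1.0 / np.sqrt(s[:n]))
    A = S_inv @ Un.T @ H1 @ Vn.T @ S_inv   # reduced-order state matrix
    B = (S_half @ Vn)[:, :1]               # input matrix (first column)
    C = (Un @ S_half)[:1, :]               # output matrix (first row)
    return A, B, C

# synthetic impulse response of a single decaying mode, y_k = 0.9**k
y = [0.9 ** k for k in range(11)]
A, B, C = era(y, 1)
print(round(float(A[0, 0]), 6))  # identified mode → 0.9
```

The eigenvalues of A give the identified modal frequencies and damping; truncating the SVD at order n is what separates system modes from noise modes in practice.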
Proteomics Unveils Fibroblast-Cardiomyocyte Lactate Shuttle and Hexokinase Paradox in Mouse Muscles.
Rakus, Dariusz; Gizak, Agnieszka; Wiśniewski, Jacek R
2016-08-05
Quantitative mapping, given in biochemically interpretable units such as mol per mg of total protein, of tissue-specific proteomes is prerequisite for the analysis of any process in cells. We applied label- and standard-free proteomics to characterize three types of striated muscles: white, red, and cardiac muscle. The analysis presented here uncovers several unexpected and novel features of striated muscles. In addition to differences in protein expression levels, the three muscle types substantially differ in their patterns of basic metabolic pathways and isoforms of regulatory proteins. Importantly, some of the conclusions drawn on the basis of our results, such as the potential existence of a "fibroblast-cardiomyocyte lactate shuttle" and the "hexokinase paradox" point to the necessity of reinterpretation of some basic aspects of striated muscle metabolism. The data presented here constitute a powerful database and a resource for future studies of muscle physiology and for the design of pharmaceutics for the treatment of muscular disorders.
Basics of identification measurement technology
NASA Astrophysics Data System (ADS)
Klikushin, Yu N.; Kobenko, V. Yu; Stepanov, P. P.
2018-01-01
None of the available pattern recognition algorithms gives a 100% guarantee, so further scientific work in this direction remains relevant. It is proposed to extend existing pattern recognition technologies through the application of identification measurements. The purpose of the study is to assess the possibility of recognizing images using identification measurement technologies. Problems of pattern recognition are currently solved mainly with neural networks and hidden Markov models. A fundamentally new approach to pattern recognition is proposed, based on the technology of identification signal measurements (IIS). The essence of IIS technology is the quantitative evaluation of the shape of images using special tools and algorithms.
Wagstaff, Jane L; Taylor, Samantha L; Howard, Mark J
2013-04-05
This review aims to illustrate that STD NMR is not simply a method for drug screening and discovery, but has qualitative and quantitative applications that can answer fundamental and applied biological and biomedical questions involving molecular interactions between ligands and proteins. We begin with a basic introduction to the technique of STD NMR and report on recent advances and biological applications of STD, including studies following the interactions of non-steroidal anti-inflammatories, minimum binding requirements for virus infection, and understanding the inhibition of amyloid fibre formation. We expand on this introduction by reporting recent STD NMR studies of live-cell receptor systems, new methodologies using scanning STD, magic-angle spinning STD and approaches to use STD NMR in a quantitative fashion for dissociation constants and group epitope mapping (GEM) determination. We finish by outlining new approaches that have potential to influence future applications of the technique; NMR isotope-editing, heteronuclear multidimensional STD and (19)F STD methods that are becoming more amenable due to the latest NMR equipment technologies.
NASA Astrophysics Data System (ADS)
Fu, Jundong; Zhang, Guangcheng; Wang, Lei; Xia, Nuan
2018-01-01
Based on digital elevation model data in the 1 arc-second format from the Shuttle Radar Topography Mission, geomorphic elements of Shandong Province were automatically extracted and calculated using window analysis and mean change point analysis in a geographic information system (GIS), with the processing scripted in Python. Mean change point analysis determined the best statistical window for quantifying relief amplitude in the study area. According to the Chinese landform classification standard, the landform types of Shandong Province were divided into 8 types: low altitude plain, medium altitude plain, low altitude platform, medium altitude platform, low altitude hills, medium altitude hills, low relief mountain, and medium relief mountain, accounting for the following percentages of Shandong Province's total area: 12.72%, 0.01%, 36.38%, 0.24%, 17.26%, 15.64%, 11.1%, and 6.65%, respectively. The resulting landform classification is basically consistent with the overall terrain of Shandong Province, and the study can provide a quantitative and scientific reference for the classification of landforms in the province.
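The window-analysis step for relief amplitude (maximum minus minimum elevation within a moving window) can be sketched as follows. This is a toy grid, not actual SRTM data; the window size and elevation values are illustrative, and a real workflow would use GIS neighborhood statistics on full 1 arc-second tiles:

```python
import numpy as np

def relief_amplitude(dem, window=3):
    """Relief amplitude (max - min elevation) in a square moving window."""
    rows, cols = dem.shape
    half = window // 2
    out = np.zeros_like(dem, dtype=float)
    for i in range(rows):
        for j in range(cols):
            # Clip the window at the grid edges
            block = dem[max(i - half, 0):i + half + 1,
                        max(j - half, 0):j + half + 1]
            out[i, j] = block.max() - block.min()
    return out

dem = np.array([[100.0, 120.0, 110.0],
                [130.0, 150.0, 140.0],
                [ 90.0, 100.0, 105.0]])
print(relief_amplitude(dem))  # center cell: 150 - 90 = 60
```

Repeating this computation for a series of window sizes and locating the mean change point in the resulting curve is what selects the "best" statistical window mentioned in the abstract.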
Human DAZL, DAZ and BOULE genes modulate primordial germ cell and haploid gamete formation
Kee, Kehkooi; Angeles, Vanessa T; Flores, Martha; Nguyen, Ha Nam; Pera, Renee A Reijo
2009-01-01
The leading cause of infertility in men and women is quantitative and qualitative defects in human germ cell (oocyte and sperm) development. Yet, it has not been possible to examine the unique developmental genetics of human germ cell formation and differentiation due to inaccessibility of germ cells during fetal development. Although several studies have shown that germ cells can be differentiated from mouse and human embryonic stem cells, human germ cells differentiated in these studies generally did not develop beyond the earliest stages [1-8]. Here we used a germ cell reporter to quantitate and isolate primordial germ cells derived from both male and female hESCs. Then, by silencing and overexpressing genes that encode germ cell-specific cytoplasmic RNA-binding proteins (not transcription factors), we modulated human germ cell formation and developmental progression. We observed that human DAZL (Deleted in AZoospermia-Like) functions in primordial germ cell formation, whereas closely-related genes, DAZ and BOULE, promote later stages of meiosis and development of haploid gametes. These results are significant to the generation of gametes for future basic science and potential clinical applications. PMID:19865085
Kratochwill, Thomas R; Levin, Joel R
2014-04-01
In this commentary, we add to the spirit of the articles appearing in the special series devoted to meta- and statistical analysis of single-case intervention-design data. Following a brief discussion of historical factors leading to our initial involvement in statistical analysis of such data, we discuss: (a) the value added by including statistical-analysis recommendations in the What Works Clearinghouse Standards for single-case intervention designs; (b) the importance of visual analysis in single-case intervention research, along with the distinctive role that could be played by single-case effect-size measures; and (c) the elevated internal validity and statistical-conclusion validity afforded by the incorporation of various forms of randomization into basic single-case design structures. For the future, we envision more widespread application of quantitative analyses, as critical adjuncts to visual analysis, in both primary single-case intervention research studies and literature reviews in the behavioral, educational, and health sciences. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
[Development of a program of prevention of drug dependence in school children].
García Lerín, A; Calvo Trujillo, S; Sánchez-Porro Valadés, P
1997-03-15
To promote healthy habits of behaviour among school-children so that they reject drug-taking and learn to identify high-risk situations. Quasi-experimental study. The Amorós private school in Carabanchel, in Madrid's Health District XI. 45 pupils from the eighth year of basic education, aged between 13 and 14. Quantitative and qualitative methods. Quantitative indicators were: the number of students who attended the activities organised, their level of participation, the number of new terms learned, and the increase in knowledge. Qualitative indicators were: an oral poll of class leaders, attainment of objectives, later evaluation of changes in attitude towards drug-takers, and collages produced on the pupils' own initiative after the course. Drug-taking usually starts in the family context, leisure situations and among peers. In this study isolated consumption was also detected. The most commonly consumed drugs were: caffeine, tobacco (mainly Virginia), alcohol occasionally, and cannabis. The type and pattern of drug-taking found is very similar to that of other Spaniards of the same age. We found children who were not drug-takers, but were anxious about this because they "wanted to try out drugs".
Krishnamurthy, Krish
2013-12-01
The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion - thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled to several sub FIDs, and secondly, these sub FIDs are then modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics) or process study (reaction monitoring) or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented. Copyright © 2013 John Wiley & Sons, Ltd.
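The underlying signal model, though not the Bayesian machinery of CRAFT itself, can be sketched: the FID is a sum of decaying complex sinusoids, and once frequencies and decay constants are determined, the amplitudes follow by linear least squares. All parameter values below are illustrative, and the full method also infers the frequencies and decays from the data:

```python
import numpy as np

def fit_amplitudes(fid, t, freqs, decays):
    """Least-squares amplitudes for a sum-of-decaying-sinusoids FID model:
    s(t) = sum_k a_k * exp(-t/T2_k) * exp(2j*pi*f_k*t), with f_k, T2_k fixed.
    """
    basis = np.column_stack([np.exp(-t / T2) * np.exp(2j * np.pi * f * t)
                             for f, T2 in zip(freqs, decays)])
    amps, *_ = np.linalg.lstsq(basis, fid, rcond=None)
    return amps

# Synthesize a two-component FID and recover the amplitudes
t = np.arange(1024) * 1e-3              # 1 ms dwell time (illustrative)
true_amps = [2.0, 0.5]
freqs, decays = [50.0, 120.0], [0.1, 0.2]
fid = sum(a * np.exp(-t / T2) * np.exp(2j * np.pi * f * t)
          for a, f, T2 in zip(true_amps, freqs, decays))
print(np.round(fit_amplitudes(fid, t, freqs, decays).real, 3))  # ≈ [2. 0.5]
```

The frequency-amplitude table that CRAFT emits is, in effect, the `(freqs, amps)` pairing above computed automatically over the digitally filtered and downsampled sub-FIDs.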
Temporal maps and informativeness in associative learning.
Balsam, Peter D; Gallistel, C Randy
2009-02-01
Neurobiological research on learning assumes that temporal contiguity is essential for association formation, but what constitutes temporal contiguity has never been specified. We review evidence that learning depends, instead, on learning a temporal map. Temporal relations between events are encoded even from single experiences. The speed with which an anticipatory response emerges is proportional to the informativeness of the encoded relation between a predictive stimulus or event and the event it predicts. This principle yields a quantitative account of the heretofore undefined, but theoretically crucial, concept of temporal pairing, an account in quantitative accord with surprising experimental findings. The same principle explains the basic results in the cue competition literature, which motivated the Rescorla-Wagner model and most other contemporary models of associative learning. The essential feature of a memory mechanism in this account is its ability to encode quantitative information.
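A minimal sketch of the informativeness idea, with illustrative numbers (not values from the review): a cue's informativeness can be expressed as the ratio of the background event-to-event interval C to the cue-to-event interval T, and acquisition speed scales with this ratio.

```python
def informativeness(c_interval, t_interval):
    """Ratio of the background inter-event interval (C) to the
    cue-event interval (T); larger values mean a more informative cue."""
    return c_interval / t_interval

# A cue that narrows the expected wait from 120 s to 10 s is more
# informative than one that narrows it from 120 s to 60 s
print(informativeness(120, 10) > informativeness(120, 60))  # True
```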
Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper
2018-03-01
Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidates' research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, it is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) administered with CAT were then analyzed. The quantitative MSI analysis results were cross-validated by LC-MS/MS analysis data of the same tissues. The consistency suggests that the approach is able to quickly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.
Curcumin does not switch melanin synthesis towards pheomelanin in B16F10 cells.
Wolnicka-Glubisz, Agnieszka; Nogal, Katarzyna; Żądło, Andrzej; Płonka, Przemysław M
2015-01-01
Melanin, the basic skin pigment present also in the majority of melanomas, has a huge impact on the efficiency of photodynamic, radio- or chemotherapies of melanoma. Moreover, the melanoma cells produce more melanin than normal melanocytes in adjacent skin do. Thus, attention has been paid to natural agents that are safe and effective in suppression of melanogenesis. B16F10 cells were studied by electron paramagnetic resonance (EPR) spectroscopy. The cells were cultured for 24-72 h in RPMI or DMEM with or without curcumin. The results confirmed that curcumin has no significant effect on B16F10 cell viability at concentrations of 1-10 µM. Curcumin at a concentration of 10 µM significantly inhibited their proliferation and stimulated differentiation. We did not stimulate melanogenesis hormonally, but we found a strong increase in melanogenesis in DMEM, containing more L-Tyr, as compared to RPMI. The EPR studies revealed that the effect of curcumin on melanogenesis in RPMI-incubated cells was not significant, and only in DMEM was curcumin able to inhibit melanogenesis. The effect of curcumin was only quantitative, as it did not switch eumelanogenesis towards pheomelanogenesis under any conditions. Interestingly, we observed elevation of production of hydrogen peroxide in DMEM-incubated cells, in parallel to the facilitation of melanogenesis. Curcumin significantly but transiently intensified the already pronounced generation of H2O2 in DMEM. We conclude that the quantitative effect of curcumin on melanogenesis in melanoma is intricate. It depends on the basic melanogenetic efficiency of the cells, and can be observed only in strongly pigmented cells. Qualitatively, curcumin does not switch melanogenesis towards pheomelanogenesis, either in strongly, or in weakly melanized melanoma cells.
Structured learning for robotic surgery utilizing a proficiency score: a pilot study.
Hung, Andrew J; Bottyan, Thomas; Clifford, Thomas G; Serang, Sarfaraz; Nakhoda, Zein K; Shah, Swar H; Yokoi, Hana; Aron, Monish; Gill, Inderbir S
2017-01-01
We evaluated feasibility and benefit of implementing structured learning in a robotics program. Furthermore, we assessed validity of a proficiency assessment tool for stepwise graduation. Teaching cases included robotic radical prostatectomy and partial nephrectomy. Procedure steps were categorized: basic, intermediate, and advanced. An assessment tool ["proficiency score" (PS)] was developed to evaluate ability to safely and autonomously complete a step. Graduation required a passing PS (PS ≥ 3) on three consecutive attempts. PS and validated global evaluative assessment of robotic skills (GEARS) were evaluated for completed steps. Linear regression was utilized to determine postgraduate year/PS relationship (construct validity). Spearman's rank correlation coefficient measured correlation between PS and GEARS evaluations (concurrent validity). Intraclass correlation (ICC) evaluated PS agreement between evaluator classes. Twenty-one robotic trainees participated within the pilot program, completing a median of 14 (2-69) cases each. Twenty-three study evaluators scored 14 (1-60) cases. Over 4 months, 229/294 (78 %) cases were designated "teaching" cases. Residents completed 91 % of possible evaluations; faculty completed 78 %. Verbal and quantitative feedback received by trainees increased significantly (p = 0.002, p < 0.001, respectively). Average PS increased with PGY (post-graduate year) for basic and intermediate steps (regression slopes: 0.402 (p < 0.0001), 0.323 (p < 0.0001), respectively) (construct validation). Overall, PS correlated highly with GEARS (ρ = 0.81, p < 0.0001) (concurrent validity). ICC was 0.77 (95 % CI 0.61-0.88) for resident evaluations. Structured learning can be implemented in an academic robotic program with high levels of trainee and evaluator participation, encouraging both quantitative and verbal feedback. A proficiency assessment tool developed for step-specific proficiency has construct and concurrent validity.
Translational PK/PD of Anti-Infective Therapeutics
Rathi, Chetan; Lee, Richard E.; Meibohm, Bernd
2016-01-01
Translational PK/PD modeling has emerged as a critical technique for quantitative analysis of the relationship between dose, exposure and response of antibiotics. By combining model components for pharmacokinetics, bacterial growth kinetics and concentration-dependent drug effects, these models are able to quantitatively capture and simulate the complex interplay between antibiotic, bacterium and host organism. Fine-tuning of these basic model structures allows to further account for complicating factors such as resistance development, combination therapy, or host responses. With this tool set at hand, mechanism-based PK/PD modeling and simulation allows to develop optimal dosing regimens for novel and established antibiotics for maximum efficacy and minimal resistance development. PMID:27978987
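A minimal sketch of such a model, combining one-compartment pharmacokinetics with capacity-limited bacterial growth and a concentration-dependent (Emax-type) kill term. All parameter values are illustrative assumptions, not any specific published model, and simple Euler integration stands in for a proper ODE solver:

```python
import numpy as np

def simulate(dose=10.0, ke=0.3, kg=1.0, kmax=3.0, ec50=1.0,
             n0=1e6, nmax=1e9, t_end=24.0, dt=0.01):
    """Bacterial count after t_end hours under a single antibiotic dose.

    ke   : first-order drug elimination rate (1/h)
    kg   : maximum bacterial growth rate (1/h)
    kmax : maximum kill rate (1/h); ec50 : concentration at half-maximal kill
    """
    t, n = 0.0, n0
    while t < t_end:
        c = dose * np.exp(-ke * t)           # drug concentration (PK)
        growth = kg * (1 - n / nmax) * n     # capacity-limited growth
        kill = kmax * c / (ec50 + c) * n     # Emax concentration-effect (PD)
        n = max(n + dt * (growth - kill), 1.0)
        t += dt
    return n

# The dosed population is suppressed relative to untreated growth
print(simulate(dose=10.0) < simulate(dose=0.0))  # True
```

Extending the sketch with a resistant subpopulation, repeated dosing, or immune-mediated clearance corresponds to the "fine-tuning" of the basic model structure described above.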
Effects of atmospheric aerosols on scattering reflected visible light from earth resource features
NASA Technical Reports Server (NTRS)
Noll, K. E.; Tschantz, B. A.; Davis, W. T.
1972-01-01
The vertical variations in atmospheric light attenuation under ambient conditions were identified, and a method through which aerial photographs of earth features might be corrected to yield quantitative information about the actual features was provided. A theoretical equation was developed based on the Bouguer-Lambert extinction law and basic photographic theory.
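The extinction law underlying such a correction can be sketched as follows: light traversing an atmospheric path is attenuated exponentially with optical depth, and inverting the law recovers the ground-level radiance from the observed value. The coefficient and path length below are illustrative, not values from the report:

```python
import math

def transmitted(i0, sigma, path_length):
    """Bouguer-Lambert law: I = I0 * exp(-sigma * L)."""
    return i0 * math.exp(-sigma * path_length)

def corrected(i_observed, sigma, path_length):
    """Invert the extinction law to recover the unattenuated radiance."""
    return i_observed * math.exp(sigma * path_length)

i0 = 100.0                                            # surface radiance
i_obs = transmitted(i0, sigma=0.2, path_length=3.0)   # what the camera sees
print(round(corrected(i_obs, 0.2, 3.0), 6))           # 100.0
```

In practice, sigma varies with altitude under ambient aerosol loading, which is why the study measured the vertical variation of attenuation rather than assuming a single coefficient.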
This tutorial reviews some of the screens, icons, and basic functions of the SDMProjectBuilder (SDMPB) that allow a user to identify a watershed of interest that can be used to choose a pour point or 12-digit HUC (HUC-12) for a microbial assessment. It demonstrates how to identif...
The BioScope Initiative: Integrating Technology into the Biology Classroom.
ERIC Educational Resources Information Center
Ashburn, Sarah J.; Eichinger, David C.; Witham, Shelly A.; Cross, Vanessa D.; Krockover, Gerald H.; Pae, Tae-Il; Islam, Samantha; Robinson, J. Paul
2002-01-01
Reports on the quantitative and qualitative assessment of the CD-ROM "Cell Structure and Function" which includes five sections: (1) Basics; (2) Simple Cell; (3) Cell Viewer; (4) Cellular Changes; and (5) Handles. Evaluates the effectiveness of the CD-ROM with the participation of (n=65) students. Applies both qualitative and statistical methods.…
Privatising Public Schooling in Post-Apartheid South Africa: Equity Considerations
ERIC Educational Resources Information Center
Motala, Shireen
2009-01-01
Through an analysis of quantitative and qualitative data on school funding in South Africa, this paper aims to analyse the user fee policy option in public schooling in South Africa. Debate is ongoing about the role of private input into public schooling and whether this practice affects access (and the constitutional right) to basic education,…
ERIC Educational Resources Information Center
Hussmann, Katja; Grande, Marion; Meffert, Elisabeth; Christoph, Swetlana; Piefke, Martina; Willmes, Klaus; Huber, Walter
2012-01-01
Although generally accepted as an important part of aphasia assessment, detailed analysis of spontaneous speech is rarely carried out in clinical practice mostly due to time limitations. The Aachener Sprachanalyse (ASPA; Aachen Speech Analysis) is a computer-assisted method for the quantitative analysis of German spontaneous speech that allows for…
ERIC Educational Resources Information Center
Karsai, Istvan; Kampis, George
2010-01-01
Biology is changing and becoming more quantitative. Research is creating new challenges that need to be addressed in education as well. New educational initiatives focus on combining laboratory procedures with mathematical skills, yet it seems that most curricula center on a single relationship between scientific knowledge and scientific method:…
Physical characteristics of some northern California brush fuels
Clive M. Countryman
1982-01-01
Brush species make up much of the fuel load in forested wildlands. Basic physical and chemical characteristics of these species influence ease of ignition, rate of fire spread, burning time, and fire intensity. Quantitative knowledge of the variations in brush characteristics is essential to progress in fire control and effective use of fire in wildland management....
An Assessment of the Differences Between High and Low Achieving Students. Final Report.
ERIC Educational Resources Information Center
Scott, Ralph; Ford, Jon A.
Primarily a longitudinal and quantitative analysis of achievement functioning, this experiment sought to identify factors which promote or impair the learning of individual children. The 683 Junior High students were each assigned to one of eight groups according to sex, race, and whether their seventh grade Iowa Test of Basic Skills Composite score…
Sorbian: The Sorbian Language in Education in Germany, 2nd Edition. Regional Dossiers Series
ERIC Educational Resources Information Center
Brezan, Beate, Comp.; Nowak, Meto, Comp.
2016-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
ERIC Educational Resources Information Center
Walker, Alastair G. H., Comp.
2015-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Swedish: The Swedish Language in Education in Finland, 2nd Edition. Regional Dossiers Series
ERIC Educational Resources Information Center
Østern, Anna-Lena, Comp.; Harju-Luukkainen, Heidi, Comp.
2013-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Ladin: The Ladin Language in Education in Italy, 2nd Edition. Regional Dossiers Series
ERIC Educational Resources Information Center
Verra, Roland, Comp.
2016-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Galician: The Galician Language in Education in Spain, 2nd Edition. Regional Dossiers Series
ERIC Educational Resources Information Center
Costas, Xosé-Henrique, Comp.; Expósito-Loureiro, Andrea, Comp.
2016-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Welsh: The Welsh Language in Education in the UK, 2nd Edition. Regional Dossiers Series
ERIC Educational Resources Information Center
Jones, Meirion Prys, Comp.; Jones, Ceinwen, Comp.
2014-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
ERIC Educational Resources Information Center
Ó Murchú, Helen, Comp.
2016-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Asturian: The Asturian Language in Education in Spain, 2nd Edition. Regional Dossiers Series
ERIC Educational Resources Information Center
González-Riaño, Xosé Antón, Comp.; Fernández-Costales, Alberto, Comp.
2014-01-01
This regional dossier aims to provide a concise description and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Development and Standardization of the Air Force Officer Qualifying Test Form M.
ERIC Educational Resources Information Center
Miller, Robert E.
Air Force Officer Qualifying Test (AFOQT) Form M was constructed as a replacement for AFOQT Form L in Fiscal Year 1974. The new form serves the same purposes as its predecessor and possesses basically the same characteristics. It yields Pilot, Navigator-Technical, Officer Quality, Verbal, and Quantitative composite scores. Three sets of conversion…
Quantitative Literacy: Geosciences and Beyond
NASA Astrophysics Data System (ADS)
Richardson, R. M.; McCallum, W. G.
2002-12-01
Quantitative literacy seems like such a natural for the geosciences, right? The field has gone from its origin as a largely descriptive discipline to one where it is hard to imagine failing to bring a full range of mathematical tools to the solution of geological problems. Although there are many definitions of quantitative literacy, we have proposed one that is analogous to the UNESCO definition of conventional literacy: "A quantitatively literate person is one who, with understanding, can both read and represent quantitative information arising in his or her everyday life." Central to this definition is the concept that a curriculum for quantitative literacy must go beyond the basic ability to "read and write" mathematics and develop conceptual understanding. It is also critical that a curriculum for quantitative literacy be engaged with a context, be it everyday life, humanities, geoscience or other sciences, business, engineering, or technology. Thus, our definition works both within and outside the sciences. What role do geoscience faculty have in helping students become quantitatively literate? Is it our role, or that of the mathematicians? How does quantitative literacy vary between different scientific and engineering fields? Or between science and nonscience fields? We will argue that successful quantitative literacy curricula must be an across-the-curriculum responsibility. We will share examples of how quantitative literacy can be developed within a geoscience curriculum, beginning with introductory classes for nonmajors (using the Mauna Loa CO2 data set) through graduate courses in inverse theory (using singular value decomposition). We will highlight six approaches to across-the-curriculum efforts from national models: collaboration between mathematics and other faculty; gateway testing; intensive instructional support; workshops for nonmathematics faculty; quantitative reasoning requirement; and individual initiative by nonmathematics faculty.
Digital storage and analysis of color Doppler echocardiograms
NASA Technical Reports Server (NTRS)
Chandra, S.; Thomas, J. D.
1997-01-01
Color Doppler flow mapping has played an important role in clinical echocardiography. Most of the clinical work, however, has been primarily qualitative. Although qualitative information is very valuable, there is considerable quantitative information stored within the velocity map that has not been extensively exploited so far. Recently, many researchers have shown interest in using the encoded velocities to address clinical problems such as quantification of valvular regurgitation, calculation of cardiac output, and characterization of ventricular filling. In this article, we review some basic physics and engineering aspects of color Doppler echocardiography, as well as drawbacks of trying to retrieve velocities from videotape data. Digital storage, which plays a critical role in performing quantitative analysis, is discussed in some detail with special attention to velocity encoding in DICOM 3.0 (medical image storage standard) and the use of digital compression. Lossy compression can considerably reduce file size with minimal loss of information (mostly redundant); this is critical for digital storage because of the enormous amount of data generated (a 10 minute study could require 18 Gigabytes of storage capacity). Lossy JPEG compression and its impact on quantitative analysis have been studied, showing that images compressed at 27:1 using the JPEG algorithm compare favorably with directly digitized video images, the current gold standard. Some potential applications of these velocities in analyzing the proximal convergence zones, mitral inflow, and some areas of future development are also discussed in the article.
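The storage arithmetic implied by the figures in the review can be made explicit (a sketch using only the numbers quoted above: an 18 GB, 10-minute study and the 27:1 JPEG compression ratio it evaluates):

```python
# Data rate and compressed size for a 10-minute color Doppler study
study_gb = 18.0                 # uncompressed study size quoted in the review
duration_s = 10 * 60            # 10-minute acquisition
ratio = 27.0                    # lossy JPEG compression ratio evaluated

rate_mb_s = study_gb * 1024 / duration_s   # uncompressed data rate in MB/s
compressed_gb = study_gb / ratio           # size after 27:1 compression
print(round(rate_mb_s, 1), round(compressed_gb, 2))  # 30.7 0.67
```

At roughly 30 MB/s uncompressed, the case for lossy compression in routine digital archiving is clear: the same study shrinks to well under a gigabyte.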
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko
2017-07-01
Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful to detect human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, these studies did not provide objective, or quantitative, descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm's spots and quantitative comparisons of the sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated in high humidity and high temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases in application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information for the practical applications of the SPERM HY-LITER™ Express kit, which were previously unobtainable. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.
The origins of duality of patterning in artificial whistled languages
Verhoef, Tessa
2012-01-01
In human speech, a finite set of basic sounds is combined into a (potentially) unlimited set of well-formed morphemes. Hockett (1960) placed this phenomenon under the term ‘duality of patterning’ and included it as one of the basic design features of human language. Of the thirteen basic design features Hockett proposed, duality of patterning is the least studied and it is still unclear how it evolved in language. Recent work shedding light on this is summarized in this paper and experimental data is presented. This data shows that combinatorial structure can emerge in an artificial whistled language through cultural transmission as an adaptation to human cognitive biases and learning. In this work the method of experimental iterated learning (Kirby et al. 2008) is used, in which a participant is trained on the reproductions of the utterances the previous participant learned. Participants learn and recall a system of sounds that are produced with a slide whistle. Transmission from participant to participant causes the whistle systems to change and become more learnable and more structured. These findings follow from qualitative observations, quantitative measures and a follow-up experiment that tests how well participants can learn the emerged whistled languages by generalizing from a few examples. PMID:23637710
Cenesthopathy and Subjective Cognitive Complaints: An Exploratory Study in Schizophrenia.
Jimeno, Natalia; Vargas, Martin L
2018-01-01
Cenesthopathy is mainly associated with schizophrenia; however, its neurobiological basis remains unclear. The general objective was to explore clinical correlates of cenesthopathy and subjective cognitive complaints in schizophrenia. Participants (n = 30) meeting DSM-IV criteria for psychotic disorder were recruited from a psychiatry unit and assessed with: the Association for Methodology and Documentation in Psychiatry (AMDP) system, the Positive and Negative Syndrome Scale, the Frankfurt Complaint Questionnaire (FCQ), and the Bonn Scale for the Assessment of Basic Symptoms (BSABS). For quantitative variables, means and Spearman correlation coefficients were calculated. Linear regression following the backward method and principal component analysis with varimax rotation were used. 83.3% of subjects (73.3% male; mean age 31.5 years) presented some type of cenesthopathy; all types of cenesthetic basic symptoms were found. Cenesthetic basic symptoms significantly correlated with the AMDP category "fear and anancasm," FCQ total score, and BSABS cognitive thought disturbances. In the regression analysis only 1 predictor, cognitive thought disturbances, entered the model. In the principal component analysis, a main component which accounted for 22.69% of the variance was found. Cenesthopathy, as assessed with the Bonn Scale (BSABS), is mainly associated with cognitive abnormalities including disturbances of thought initiative and mental intentionality, of receptive speech, and subjective retardation or pressure of thoughts. © 2018 S. Karger AG, Basel.
Implementing online quantitative support modules in an intermediate-level course
NASA Astrophysics Data System (ADS)
Daly, J.
2011-12-01
While instructors typically anticipate that students in introductory geology courses enter a class with a wide range of quantitative ability, we often overlook the fact that this may also be true in upper-level courses. Some students are drawn to the subject and experience success in early courses with an emphasis on descriptive geology, then experience frustration and disappointment in mid- and upper-level courses that are more quantitative. To bolster student confidence in quantitative skills and enhance their performance in an upper-level course, I implemented several modules from The Math You Need (TMYN) online resource with a 200-level geomorphology class. Student facility with basic quantitative skills (rearranging equations, manipulating units, and graphing) was assessed with an online pre- and post-test. During the semester, modules were assigned to complement existing course activities (for example, the module on manipulating units was assigned prior to a lab on measurement of channel area and water velocity, then calculation of discharge). The implementation was designed to be a concise review of relevant skills for students with higher confidence in their quantitative abilities, and to provide a self-paced opportunity for students with less quantitative facility to build skills. This course already includes a strong emphasis on quantitative data collection, analysis, and presentation; in the past, student performance in the course has been strongly influenced by their individual quantitative ability. I anticipate that giving students the opportunity to improve mastery of fundamental quantitative skills will improve their performance on higher-stakes assignments and exams, and will enhance their sense of accomplishment in the course.
Nolan, John P.; Mandy, Francis
2008-01-01
While the term flow cytometry refers to the measurement of cells, the practice of making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using micro-particles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays to study genes, protein function, and molecular assembly continue to appear. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in the basic research and clinical laboratories, as well as in drug development. PMID:16604537
Bigler, Erin D
2015-09-01
Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.
The phylogeny of swimming kinematics: The environment controls flagellar waveforms in sperm motility
NASA Astrophysics Data System (ADS)
Guasto, Jeffrey; Burton, Lisa; Zimmer, Richard; Hosoi, Anette; Stocker, Roman
2013-11-01
In recent years, phylogenetic and molecular analyses have dominated the study of ecology and evolution. However, physical interactions between organisms and their environment, a fundamental determinant of organism ecology and evolution, are mediated by organism form and function, highlighting the need to understand the mechanics of basic survival strategies, including locomotion. Focusing on spermatozoa, we combined high-speed video microscopy and singular value decomposition analysis to quantitatively compare the flagellar waveforms of eight species, ranging from marine invertebrates to humans. We found striking similarities in sperm swimming kinematics between genetically dissimilar organisms, which could not be uncovered by phylogenetic analysis. The emergence of dominant waveform patterns across species is suggestive of biological optimization for flagellar locomotion and points toward environmental cues as drivers of this convergence. These results reinforce the power of quantitative kinematic analysis to understand the physical drivers of evolution and as an approach to uncover new solutions for engineering applications, such as micro-robotics.
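The singular value decomposition step described above can be sketched in a few lines. This is an illustrative toy with synthetic waveform data (not the study's recordings): snapshots of a beating flagellum form a matrix whose leading singular modes capture most of the beat energy, which is how "dominant waveform patterns" are extracted:

```python
import numpy as np

# Rows: time snapshots within a beat; columns: shape along the flagellum.
rng = np.random.default_rng(1)
s_pts = np.linspace(0.0, 1.0, 50)        # arclength positions
t_pts = np.linspace(0.0, 2 * np.pi, 80)  # phase within one beat cycle

# Two constructed waveform modes plus small measurement noise
waveforms = (np.outer(np.sin(t_pts), np.sin(2 * np.pi * s_pts))
             + 0.3 * np.outer(np.cos(t_pts), np.cos(2 * np.pi * s_pts))
             + 0.01 * rng.standard_normal((80, 50)))

U, S, Vt = np.linalg.svd(waveforms, full_matrices=False)
# Fraction of waveform "energy" captured by the first two modes
captured = (S[:2] ** 2).sum() / (S ** 2).sum()
```

For real data, a high `captured` value with very few modes is the quantitative signature of a low-dimensional, conserved beat pattern.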
NASA Technical Reports Server (NTRS)
Smedes, H. W. (Principal Investigator); Root, R. R.; Roller, N. E. G.; Despain, D.
1978-01-01
The author has identified the following significant results. A terrain map of Yellowstone National Park showed plant community types and other classes of ground cover in what is basically a wild land. The map comprised 12 classes, six of which were mapped with accuracies of 70 to 95%. The remaining six classes had spectral reflectances that overlapped appreciably, and hence, those were mapped less accurately. Techniques were devised for quantitatively comparing the recognition map of the park with control data acquired from ground inspection and from analysis of sidelooking radar images, a thermal IR mosaic, and IR aerial photos of several scales. Quantitative analyses were made in ten 40 sq km test areas. Comparison mechanics were performed by computer with the final results displayed on line printer output. Forested areas were mapped by computer using ERTS data for less than 1/4 the cost of the conventional forest mapping technique for topographic base maps.
Hansen, John P
2003-01-01
Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 1, presents basic information about data including a classification system that describes the four major types of variables: continuous quantitative variable, discrete quantitative variable, ordinal categorical variable (including the binomial variable), and nominal categorical variable. A histogram is a graph that displays the frequency distribution for a continuous variable. The article also demonstrates how to calculate the mean, median, standard deviation, and variance for a continuous variable.
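The descriptive statistics named in this Part 1 abstract are straightforward to compute; a minimal sketch with illustrative data (not from the article) for a continuous quantitative variable:

```python
import statistics

# Hypothetical lengths of stay in days (a continuous quantitative variable)
lengths_of_stay = [2.0, 3.0, 3.0, 4.0, 5.0, 7.0, 11.0]

mean = statistics.mean(lengths_of_stay)          # 5.0
median = statistics.median(lengths_of_stay)      # 4.0 (middle of 7 values)
variance = statistics.variance(lengths_of_stay)  # sample variance, n - 1 divisor
stdev = statistics.stdev(lengths_of_stay)        # square root of the variance
```

Note the skew in this toy sample: one long stay pulls the mean above the median, which is exactly the situation where a histogram of the frequency distribution is informative.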
Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira
2014-08-01
This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the detected traces (blood, instruments and clothes) that were found and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution as to how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidences. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
NASA Astrophysics Data System (ADS)
Latief, F. D. E.; Mohammad, I. H.; Rarasati, A. D.
2017-11-01
Digital imaging of a concrete sample using high-resolution tomographic imaging by means of X-ray micro computed tomography (μ-CT) has been conducted to assess the characteristics of the sample's structure. A standard procedure of image acquisition, reconstruction, and image processing for the method, using a particular scanning device (the Bruker SkyScan 1173 High Energy Micro-CT), is elaborated. Qualitative and quantitative analyses were briefly performed on the sample to illustrate the basic capabilities of the system and the bundled software package. Calculations of total VOI volume, object volume, percent object volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation, and total porosity were conducted and analysed. This paper should serve as a brief description of how the device can produce the preferred image quality, as well as of the ability of the bundled software packages to help in performing qualitative and quantitative analysis.
NASA Astrophysics Data System (ADS)
Cederman, L.-E.; Conte, R.; Helbing, D.; Nowak, A.; Schweitzer, F.; Vespignani, A.
2012-11-01
A huge flow of quantitative social, demographic and behavioral data is becoming available that traces the activities and interactions of individuals, social patterns, transportation infrastructures and travel fluxes. Together with innovative computational techniques and methods for modeling social actions in hybrid (natural and artificial) societies, this has caused a qualitative change in the ways we model socio-technical systems. For the first time, society can be studied in a comprehensive fashion that addresses social and behavioral complexity. In other words, we are in a position to envision the development of a large data and computational cyber-infrastructure defining an exploratory of society that provides quantitative anticipatory, explanatory and scenario-analysis capabilities, ranging from emerging infectious diseases to conflict and crime surges. The goal of the exploratory of society is to provide the basic infrastructure embedding the framework of tools and knowledge needed for the design of forecast/anticipatory/crisis-management approaches to socio-technical systems, supporting future decision-making procedures by accelerating the scientific cycle that goes from data generation to predictions.
[Quantitative research on operation behavior of acupuncture manipulation].
Li, Jing; Grierson, Lawrence; Wu, Mary X; Breuer, Ronny; Carnahan, Heather
2014-03-01
To explore a method for the quantitative evaluation of the operation behavior of acupuncture manipulation and to further analyze the behavioral features of professional acupuncture manipulation. Based on basic acupuncture manipulations, the Scale for Operation Behavior of Acupuncture Basic Manipulation was developed, and the Delphi method was adopted to test its validity. Two independent raters used this scale to assess the operation behavior of acupuncture manipulation among 12 acupuncturists and 12 acupuncture novices and to calculate the inter-rater reliability; the differences in the total score of operation behavior between the two groups, as well as the single-step scores for sterilization, needle insertion, needle manipulation, and needle withdrawal, were also compared. The validity of the scale was satisfactory. The inter-rater reliability was 0.768. The total score of operation behavior in the acupuncturist group was significantly higher than that in the acupuncture-novice group (13.80 +/- 1.05 vs 11.03 +/- 2.14, P < 0.01). The scores for needle insertion and needle manipulation in the acupuncturist group were significantly higher than those in the acupuncture-novice group (4.28 +/- 0.91 vs 2.54 +/- 1.51, P < 0.01; 2.56 +/- 0.65 vs 1.88 +/- 0.88, P < 0.05); however, the scores for sterilization and needle withdrawal did not differ between the groups. This scale is suitable for the quantitative evaluation of the operation behavior of acupuncture manipulation. The behavioral features of professional acupuncture manipulation are expressed mainly in needle insertion and needle manipulation, which demand greater skill, coordination, and accuracy.
Arruda, Thomas M; Kumar, Amit; Jesse, Stephen; Veith, Gabriel M; Tselev, Alexander; Baddorf, Arthur P; Balke, Nina; Kalinin, Sergei V
2013-09-24
The application of electric bias across tip-surface junctions in scanning probe microscopy can readily induce surface and bulk electrochemical processes that can be further detected through changes in surface topography, Faradaic or conductive currents, or electromechanical strain responses. However, the basic factors controlling tip-induced electrochemical processes, including the relationship between applied tip bias and the thermodynamics of local processes, remain largely unexplored. Using the model Li-ion reduction reaction on the surface of a Li-ion conducting glass ceramic, we explore the factors controlling Li-metal formation and find surprisingly strong effects of atmosphere and back-electrode composition on the process. We find that reaction processes are highly dependent on the nature of the counter electrode and on environmental conditions. Using a nondepleting Li counter electrode, Li particles could grow significantly larger and faster than with a depleting counter electrode. Significant Li-ion depletion leads to the inability to further reduce Li. Time studies suggest that Li diffusion replenishes the vacant sites after ∼12 h. These studies suggest the feasibility of SPM-based quantitative electrochemical studies under proper environmental controls, extending the concepts of ultramicroelectrodes to the single-digit nanometer scale.
MacPhail, Catherine; Adato, Michelle; Kahn, Kathleen; Selin, Amanda; Twine, Rhian; Khoza, Samson; Rosenberg, Molly; Nguyen, Nadia; Becker, Elizabeth; Pettifor, Audrey
2013-01-01
Women are at increased risk of HIV infection in much of sub-Saharan Africa. Longitudinal and cross-sectional studies have found an association between school attendance and reduced HIV risk. We report feasibility and acceptability results from a pilot of a cash transfer intervention, conditional on school attendance, paid to young women and their families in rural Mpumalanga, South Africa, for the prevention of HIV infection. Twenty-nine young women were randomised to intervention or control, and a cash payment based on school attendance was made over a 2-month period. Quantitative (survey) and qualitative (focus group and interview) data collection was undertaken with young women, parents, teachers and young men in the same school. Qualitative analysis was conducted in Atlas.ti using a framework approach, and basic descriptive analysis of the quantitative data was conducted in Excel. Results indicate that it was both feasible and acceptable to introduce such an intervention among this population in rural South Africa. There was good understanding of the process of randomisation and the aims of the study, although some rumours developed in the study community. We address some of the changes necessary to ensure the acceptability and feasibility of the main trial. PMID:23435698
NASA Astrophysics Data System (ADS)
Hughes, Dirk D.
The primary purpose of this quantitative experimental study is to compare employee learning outcomes for a course of study offered in two formats: explicit and tacit instructor-led training and explicit e-learning operations training. A Kirkpatrick Level 2 course examination is used to establish a pretest knowledge baseline and to measure posttest learning outcomes for each instructional format. A secondary purpose is to compare responses of the two groups using a Kirkpatrick Level 1 customer satisfaction index survey. Several authors reported that the United States electric utility industry would face an employee attrition issue during the 2010 through 2015 period, at the same time that the industry would be experiencing increased demand for electricity. There is now a demand for highly trained power plant operators. A review of the literature yielded few studies comparing instructor-led training and e-based training. Though the Electric Power Research Institute stated that both training modes would be acceptable instruction, the organization did not develop a quantitatively justified recommendation between them. Subjects participated in a basic operations course and chose either the instructor-led or the e-based training course. Results of the study concluded that both instructor-led and e-based training provided significant learning to the participants. The Kirkpatrick Level 1 results indicated significantly better results for instructor-led training. There was not a significant difference in the Kirkpatrick Level 2 results between the two training modalities. Recommendations for future research include quantitative studies, including a Phillips Level 5 study, and qualitative studies, including a more detailed examination of the customer satisfaction survey (Kirkpatrick Level 1).
Basic research in evolution and ecology enhances forensics.
Tomberlin, Jeffery K; Benbow, M Eric; Tarone, Aaron M; Mohr, Rachel M
2011-02-01
In 2009, the National Research Council recommended that the forensic sciences strengthen their grounding in basic empirical research to mitigate criticism and improve accuracy and reliability. For DNA-based identification, this goal was achieved under the guidance of the population genetics community. This effort resulted in DNA analysis becoming the 'gold standard' of the forensic sciences. Elsewhere, we proposed a framework for streamlining research in decomposition ecology, which promotes quantitative approaches to collecting and applying data to forensic investigations involving decomposing human remains. To extend the ecological aspects of this approach, this review focuses on forensic entomology, although the framework can be extended to other areas of decomposition. Published by Elsevier Ltd.
Using Self-Reflection To Increase Science Process Skills in the General Chemistry Laboratory
NASA Astrophysics Data System (ADS)
Veal, William R.; Taylor, Dawne; Rogers, Amy L.
2009-03-01
Self-reflection is a tool of instruction that has been used in the science classroom. Research has shown great promise in using video as a learning tool in the classroom. However, the integration of self-reflective practice using video in the general chemistry laboratory to help students develop process skills has not been done. Immediate video feedback and direct instruction were employed in a general chemistry laboratory course to improve students' mastery and understanding of basic and advanced process skills. Qualitative results and statistical analysis of quantitative data showed that self-reflection significantly helped students develop basic and advanced process skills, yet did not appear to influence general understanding of the science content.
Precision medicine in myasthenia gravis: begin from data precision
Hong, Yu; Xie, Yanchen; Hao, Hong-Jun; Sun, Ren-Cheng
2016-01-01
Myasthenia gravis (MG) is a prototypic autoimmune disease with overt clinical and immunological heterogeneity. MG data are currently far from individually precise, partly due to the rarity and heterogeneity of this disease. In this review, we provide basic insights into MG data precision, including onset age, presenting symptoms, generalization, thymus status, pathogenic autoantibodies, muscle involvement, severity, and response to treatment, based on the literature and our previous studies. Subgroups and quantitative traits of MG are discussed in the sense of data precision. The role of disease registries and the scientific bases of precise analysis are also discussed to ensure better collection and analysis of MG data. PMID:27127759
NASA Astrophysics Data System (ADS)
Elrefai, Ahmed L.; Sasayama, Teruyoshi; Yoshida, Takashi; Enpuku, Keiji
2018-05-01
We studied the magnetization (M-H) curve of immobilized magnetic nanoparticles (MNPs) used for biomedical applications. First, we performed numerical simulation on the DC M-H curve over a wide range of MNPs parameters. Based on the simulation results, we obtained an empirical expression for DC M-H curve. The empirical expression was compared with the measured M-H curves of various MNP samples, and quantitative agreements were obtained between them. We can also estimate the basic parameters of MNP from the comparison. Therefore, the empirical expression is useful for analyzing the M-H curve of immobilized MNPs for specific biomedical applications.
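The abstract does not reproduce the empirical expression itself. As a hedged baseline (an assumption, not the authors' fitted formula), the Langevin model commonly used for the DC M-H curve of non-interacting superparamagnetic nanoparticles can be sketched as:

```python
import math

def langevin(x: float) -> float:
    """L(x) = coth(x) - 1/x, with the small-x limit L(x) ~ x/3."""
    if abs(x) < 1e-6:
        return x / 3.0  # avoids 0/0 at the origin
    return 1.0 / math.tanh(x) - 1.0 / x

def magnetization(H: float, Ms: float, m: float, T: float) -> float:
    """Normalized DC magnetization for particle moment m (A*m^2) at
    temperature T (K); Ms is the saturation magnetization."""
    MU0 = 4e-7 * math.pi        # vacuum permeability (H/m)
    KB = 1.380649e-23           # Boltzmann constant (J/K)
    return Ms * langevin(MU0 * m * H / (KB * T))
```

Fitting such a curve to measured M-H data is one common way to estimate the basic MNP parameters (moment, saturation magnetization) that the abstract mentions extracting from the comparison.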
Dermatology patients’ and their doctors’ representations about adherence
Kemény, Lajos; Csabai, Márta
2015-01-01
The aim of our study was to identify representations about patient adherence among dermatologists (N=40) and their patients (N=153). A combined qualitative-quantitative methodology was applied. Dermatologists identified good doctor-patient relationship, information from the doctor, and background information as the most important determinants of adherence. In patients’ rankings, information from the doctor and understandable communication received the highest scores. Multidimensional scaling arranged patients’ results into four content groups which helped to reveal the structure of the representations. Our results may contribute to the evidence-based confirmation that transparency of views and expectations in doctor-patient communication is a basic determinant of successful adherence. PMID:28352698
BCB Bonding Technology of Back-Side Illuminated CMOS Device
NASA Astrophysics Data System (ADS)
Wu, Y.; Jiang, G. Q.; Jia, S. X.; Shi, Y. M.
2018-03-01
The back-side illuminated CMOS (BSI) sensor is a key device in spaceborne hyperspectral imaging technology. Compared with traditional devices, the path of incident light is simplified and the spectral response is planarized in BSI sensors, which meets the requirements of quantitative hyperspectral imaging applications. Wafer bonding is the basic technology and a key process in the fabrication of BSI sensors. Six-inch bonding of a CMOS wafer to a glass wafer was fabricated based on the low bonding temperature and high stability of BCB. The influence of different BCB thicknesses on bonding strength was studied. Wafer bonding with high strength, high stability and no bubbles was achieved by adjusting the bonding conditions.
NASA Astrophysics Data System (ADS)
Alkorta, Ibon; Elguero, José; Elguero, Eric
2017-11-01
1125 X-ray structures of nitroxide free radicals presenting intermolecular hydrogen bonds have been reported in the Cambridge Structural Database. We report in this paper a qualitative and quantitative analysis of these bonds. The observation in some plots of an excluded region was statistically analyzed using convex hull and kernel smoothing methodologies. A theoretical study at the MP2 level with different basis sets has been carried out, indicating that the nitronyl nitroxide radicals (five electrons) lie just in between nitroso compounds (four electrons) and amine N-oxides (six electrons) as far as hydrogen-bond basicity is concerned.
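The convex hull methodology mentioned above can be illustrated with a self-contained sketch (Andrew's monotone chain, not the authors' code): the hull delimits the occupied region of a scatter plot of hydrogen-bond parameters, and by complement the excluded region.

```python
def convex_hull(points):
    """Return the hull vertices of 2D points in counter-clockwise order
    (Andrew's monotone-chain algorithm)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); <= 0 means no left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]     # endpoints shared, drop duplicates

# Hypothetical (distance, angle)-style scatter: the interior point is
# inside the occupied region and is excluded from the hull.
scatter = [(0, 0), (0, 2), (2, 0), (2, 2), (1, 1)]
hull = convex_hull(scatter)
```

Points falling well outside such a hull (or a hole inside it) are the kind of statistically notable "excluded region" the abstract refers to.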
Quantitative aspects of vibratory mobilization and break-up of non-wetting fluids in porous media
NASA Astrophysics Data System (ADS)
Deng, Wen
Seismic stimulation is a promising technology aimed to mobilize the entrapped non-wetting fluids in the subsurface. The applications include enhanced oil recovery or, alternatively, facilitation of movement of immiscible/partly-miscible gases far into porous media, for example, for CO2 sequestration. This work is devoted to detailed quantitative studies of the two basic pore-scale mechanisms standing behind seismic stimulation: the mobilization of bubbles or drops entrapped in pore constrictions by capillary forces and the break-up of continuous long bubbles or drops. In typical oil-production operations, oil is produced by the natural reservoir-pressure drive during the primary stage and by artificial water flooding at the secondary stage. Capillary forces act to retain a substantial residual fraction of reservoir oil even after water flooding. The seismic stimulation is an unconventional technology that serves to overcome capillary barriers in individual pores and liberate the entrapped oil by adding an oscillatory inertial forcing to the external pressure gradient. According to our study, the effect of seismic stimulation on oil mobilization is highly dependent on the frequencies and amplitudes of the seismic waves. Generally, the lower the frequency and the larger the amplitude, more effective is the mobilization. To describe the mobilization process, we developed two theoretical hydrodynamics-based models and justified both using computational fluid dynamics (CFD). Our theoretical models have a significant advantage over CFD in that they reduce the computational time significantly, while providing correct practical guidance regarding the required field parameters of vibroseismic stimulation, such as the amplitude and frequency of the seismic field. The models also provide important insights into the basic mechanisms governing the vibration-driven two-phase flow in constricted capillaries. 
In a waterflooded reservoir, oil can be recovered most efficiently by forming continuous streams from isolated droplets. The longer the continuous oil phase under a certain pressure gradient, the more easily it overcomes its capillary barrier. However, surface tension between water and oil causes the typically non-wetting oil, constituting the core phase in the channels, to break up at the pore constriction into isolated beads, which inhibits further motion. The break-up thus counteracts the mobilization. We developed a theoretical model that provides an exact quantitative description of the dynamics of the oil-snap-off process. It also formulates a purely geometric criterion that controls, based on pore geometry only, whether the oil core phase stays continuous or disintegrates into droplets. Both the theoretical model and the break-criterion have been validated against CFD simulations. The work completed elucidates the basic physical mechanisms behind the enhanced oil recovery by seismic waves and vibrations. This creates a theoretical foundation for the further development of corresponding field technologies.
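The capillary barrier invoked in both paragraphs is conventionally quantified by the Young-Laplace entry pressure of a pore constriction; the relation below is the textbook form, not the authors' full hydrodynamic model:

```latex
% Young--Laplace entry pressure of a constriction of radius r,
% for interfacial tension \sigma and contact angle \theta:
P_c = \frac{2\sigma\cos\theta}{r}
% A trapped non-wetting ganglion mobilizes when the driving pressure,
% external gradient plus oscillatory inertial forcing, exceeds the
% difference of entry pressures at its front and back menisci:
\Delta P_{\mathrm{drive}} > P_c(r_{\mathrm{front}}) - P_c(r_{\mathrm{back}})
```

Because the entry pressure grows as the constriction radius shrinks, narrow pore throats both trap ganglia (mobilization barrier) and promote snap-off, consistent with the competing mechanisms analyzed above.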
Benner, W.H.
1984-05-08
An oxygen analyzer that identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.
Benner, William H.
1986-01-01
An oxygen analyzer that identifies and classifies microgram quantities of oxygen in ambient particulate matter and quantitates organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen is quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen, obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis, obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.
Towards automation of user interface design
NASA Technical Reports Server (NTRS)
Gastner, Rainer; Kraetzschmar, Gerhard K.; Lutz, Ernst
1992-01-01
This paper suggests an approach to automatic software design in the domain of graphical user interfaces. There are still some drawbacks in existing user interface management systems (UIMS's) which basically offer only quantitative layout specifications via direct manipulation. Our approach suggests a convenient way to get a default graphical user interface which may be customized and redesigned easily in further prototyping cycles.
This tutorial reviews some of the screens, icons, and basic functions of the SDMProjectBuilder (SDMPB) that allow a user to identify an 8-digit HUC (HUC-8) of interest from which a pour point or 12-digit HUC (HUC-12) can be chosen for a microbial assessment. It demonstrates how t...
An analysis of the multiple model adaptive control algorithm. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Greene, C. S.
1978-01-01
Qualitative and quantitative aspects of the multiple model adaptive control (MMAC) method are detailed. The method represents a cascade of a maximum a posteriori probability identifier (basically a bank of Kalman filters) and a bank of linear quadratic regulators. Major qualitative properties of the MMAC method are examined and the principal reasons for unacceptable behavior are explored.
ERIC Educational Resources Information Center
Sulochana, Rosy
2015-01-01
"Access to basic education" continues to be a matter of serious concern in India. While the quantitative expansion of the system appears to be very impressive, the achievement of the goal of universalisation of primary education has still remained elusive. This is because the government continues its celebration through reflecting on…
ERIC Educational Resources Information Center
Davis, Mary M.
2009-01-01
The American Association of Colleges and Universities reports that over 50% of the students entering colleges and universities are academically underprepared; that is, according to Miller and Murray (2005), students "lack basic skills in at least one of the three fundamental areas of reading, writing, and mathematics" (paragraph 4). Furthermore,…
ERIC Educational Resources Information Center
Laptander, Roza Ivanovna, Comp.
2016-01-01
This regional dossier aims to provide a concise description of and basic statistics about minority language education in a specific region of Europe. Aspects that are addressed include features of the education system, recent educational policies, main actors, legal arrangements, and support structures, as well as quantitative aspects, such as the…
Back of the Envelope Reasoning for Robust Quantitative Problem Solving
2007-12-31
limited numeric vocabulary, for example, the Pirahã tribe in Amazonia [Gordon, 2004] and Munduruku [Pica et al., 2004], an Amazonian language... investigation of category structure: 1. Level of categorization: Rosch [1978] identifies three levels of categorization: subordinate, basic-level...Using Common Sense Knowledge to Overcome Brittleness and Knowledge Acquisition Bottlenecks. AI Magazine . Lenhart K. Schubert and Matthew Tong
Shear-Wave Elastography: Basic Physics and Musculoskeletal Applications.
Taljanovic, Mihra S; Gimber, Lana H; Becker, Giles W; Latt, L Daniel; Klauser, Andrea S; Melville, David M; Gao, Liang; Witte, Russell S
2017-01-01
In the past 2 decades, sonoelastography has been progressively used as a tool to help evaluate soft-tissue elasticity and add to information obtained with conventional gray-scale and Doppler ultrasonographic techniques. Recently introduced on clinical scanners, shear-wave elastography (SWE) is considered to be more objective, quantitative, and reproducible than compression sonoelastography with increasing applications to the musculoskeletal system. SWE uses an acoustic radiation force pulse sequence to generate shear waves, which propagate perpendicular to the ultrasound beam, causing transient displacements. The distribution of shear-wave velocities at each pixel is directly related to the shear modulus, an absolute measure of the tissue's elastic properties. Shear-wave images are automatically coregistered with standard B-mode images to provide quantitative color elastograms with anatomic specificity. Shear waves propagate faster through stiffer contracted tissue, as well as along the long axis of tendon and muscle. SWE has a promising role in determining the severity of disease and treatment follow-up of various musculoskeletal tissues including tendons, muscles, nerves, and ligaments. This article describes the basic ultrasound physics of SWE and its applications in the evaluation of various traumatic and pathologic conditions of the musculoskeletal system. © RSNA, 2017.
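The abstract's statement that shear-wave velocity is directly related to the shear modulus is the standard elastography relation G = ρv². A minimal sketch (the tissue density and wave-speed values are illustrative):

```python
def shear_modulus(rho, v_shear):
    """Shear modulus G (Pa) from tissue density rho (kg/m^3)
    and shear-wave speed v (m/s): G = rho * v**2."""
    return rho * v_shear ** 2

# soft tissue (~1000 kg/m^3) with a 3 m/s shear wave -> 9 kPa
print(shear_modulus(1000.0, 3.0))  # 9000.0
```

Because the relation is quadratic, a doubling of shear-wave speed corresponds to a fourfold increase in stiffness.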
Ninomiya, Shinji; Tokumine, Asako; Yasuda, Toru; Tomizawa, Yasuko
2007-01-01
A training system with quantitative evaluation of performance for training perfusionists is valuable for preparation for rare but critical situations. A simulator system, ECCSIM-Lite, for extracorporeal circulation (ECC) training of perfusionists was developed. This system consists of a computer system containing a simulation program of the hemodynamic conditions and the training scenario with instructions, a flow sensor unit, a reservoir with a built-in water level sensor, and an ECC circuit with a soft bag representing the human body. This system is relatively simple, easy to handle, compact, and reasonably inexpensive. Quantitative information is recorded, including the changes in arterial flow by the manipulation of a knob, the changes in venous drainage by handling a clamp, and the change in reservoir level; the time courses of the above parameters are presented graphically. To increase the realism of the training, a numerical-hydraulic circulatory model was applied. Following the instruction and explanation of the scenario in the form of audio and video captions, it is possible for a trainee to undertake self-study without an instructor or a computer operator. To validate the system, a training session was given to three beginners using a simple training scenario; it was possible to record the performance of the perfusion sessions quantitatively. In conclusion, the ECCSIM-Lite system is expected to be useful for perfusion training, since quantitative information about the trainee's performance is recorded and it is possible to use the data for assessment and comparison.
Petroll, W. Matthew; Robertson, Danielle M.
2015-01-01
The optical sectioning ability of confocal microscopy allows high magnification images to be obtained from different depths within a thick tissue specimen, and is thus ideally suited to the study of intact tissue in living subjects. In vivo confocal microscopy has been used in a variety of corneal research and clinical applications since its development over 25 years ago. In this article we review the latest developments in quantitative corneal imaging with the Heidelberg Retinal Tomograph with Rostock Corneal Module (HRT-RCM). We provide an overview of the unique strengths and weaknesses of the HRT-RCM. We discuss techniques for performing 3-D imaging with the HRT-RCM, including hardware and software modifications that allow full thickness confocal microscopy through focusing (CMTF) of the cornea, which can provide quantitative measurements of corneal sublayer thicknesses, stromal cell and extracellular matrix backscatter, and depth dependent changes in corneal keratocyte density. We also review current approaches for quantitative imaging of the subbasal nerve plexus, which require a combination of advanced image acquisition and analysis procedures, including wide field mapping and 3-D reconstruction of nerve structures. The development of new hardware, software, and acquisition techniques continues to expand the number of applications of the HRT-RCM for quantitative in vivo corneal imaging at the cellular level. Knowledge of these rapidly evolving strategies should benefit corneal clinicians and basic scientists alike. PMID:25998608
A soft, wearable microfluidic device for the capture, storage, and colorimetric sensing of sweat.
Koh, Ahyeon; Kang, Daeshik; Xue, Yeguang; Lee, Seungmin; Pielak, Rafal M; Kim, Jeonghyun; Hwang, Taehwan; Min, Seunghwan; Banks, Anthony; Bastien, Philippe; Manco, Megan C; Wang, Liang; Ammann, Kaitlyn R; Jang, Kyung-In; Won, Phillip; Han, Seungyong; Ghaffari, Roozbeh; Paik, Ungyu; Slepian, Marvin J; Balooch, Guive; Huang, Yonggang; Rogers, John A
2016-11-23
Capabilities in health monitoring enabled by capture and quantitative chemical analysis of sweat could complement, or potentially obviate the need for, approaches based on sporadic assessment of blood samples. Established sweat monitoring technologies use simple fabric swatches and are limited to basic analysis in controlled laboratory or hospital settings. We present a collection of materials and device designs for soft, flexible, and stretchable microfluidic systems, including embodiments that integrate wireless communication electronics, which can intimately and robustly bond to the surface of the skin without chemical and mechanical irritation. This integration defines access points for a small set of sweat glands such that perspiration spontaneously initiates routing of sweat through a microfluidic network and set of reservoirs. Embedded chemical analyses respond in colorimetric fashion to markers such as chloride and hydronium ions, glucose, and lactate. Wireless interfaces to digital image capture hardware serve as a means for quantitation. Human studies demonstrated the functionality of this microfluidic device during fitness cycling in a controlled environment and during long-distance bicycle racing in arid, outdoor conditions. The results include quantitative values for sweat rate, total sweat loss, pH, and concentration of chloride and lactate. Copyright © 2016, American Association for the Advancement of Science.
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e., input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, and they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
NASA Astrophysics Data System (ADS)
Le, Du; Wang, Quanzeng; Ramella-Roman, Jessica; Pfefer, Joshua
2012-06-01
Narrow-band imaging (NBI) is a spectrally-selective reflectance imaging technique for enhanced visualization of superficial vasculature. Prior clinical studies have indicated NBI's potential for detection of vasculature abnormalities associated with gastrointestinal mucosal neoplasia. While the basic mechanisms behind the increased vessel contrast (hemoglobin absorption and tissue scattering) are known, a quantitative understanding of the effect of tissue and device parameters has not been achieved. In this investigation, we developed and implemented a numerical model of light propagation that simulates NBI reflectance distributions. This was accomplished by incorporating mucosal tissue layers and vessel-like structures in a voxel-based Monte Carlo algorithm. Epithelial and mucosal layers as well as blood vessels were defined using wavelength-specific optical properties. The model was implemented to calculate reflectance distributions and vessel contrast values as a function of vessel depth (0.05 to 0.50 mm) and diameter (0.01 to 0.10 mm). These relationships were determined for NBI wavelengths of 410 nm and 540 nm, as well as broadband illumination common to standard endoscopic imaging. The effects of illumination bandwidth on vessel contrast were also simulated. Our results provide a quantitative analysis of the effect of absorption and scattering on vessel contrast. Additional insights and potential approaches for improving NBI system contrast are discussed.
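The wavelength dependence of vessel contrast can be caricatured, far more simply than the study's voxel-based Monte Carlo model, with a double-pass Beer-Lambert estimate. The absorption coefficients below are illustrative stand-ins (hemoglobin absorbs much more strongly near 410 nm than near 540 nm), not values from the study:

```python
import math

def vessel_contrast(mu_a_blood, d_vessel):
    """Weber contrast of a vessel against background, assuming the only
    difference is absorption along a double pass through the vessel.
    mu_a_blood in 1/mm, vessel diameter d_vessel in mm; background
    reflectance is normalized to 1."""
    i_vessel = math.exp(-2.0 * mu_a_blood * d_vessel)
    return 1.0 - i_vessel

# same 50-um vessel, illustrative absorption at ~410 nm vs ~540 nm
print(vessel_contrast(30.0, 0.05), vessel_contrast(3.0, 0.05))
```

Even this crude estimate reproduces the qualitative point: the strongly absorbed 410 nm band yields far higher contrast for the same vessel than the 540 nm band.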
Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane
2018-05-01
This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
Determination of fat- and water-soluble vitamins by supercritical fluid chromatography: A review.
Tyśkiewicz, Katarzyna; Dębczak, Agnieszka; Gieysztor, Roman; Szymczak, Tomasz; Rój, Edward
2018-01-01
Vitamins are compounds that take part in all basic functions of an organism and are the subject of numerous studies by different researchers. Two groups of vitamins are distinguished on the basis of their solubility. Chromatography with supercritical CO₂ has found application in the determination, separation, and quantitative analysis of both fat- and water-soluble vitamins. Methods for vitamin separation have developed and improved over the years. Both groups of compounds have been separated by supercritical fluid chromatography on different stationary phases with different detection methods. The main aim of this review is to provide an overview of the vitamin separation studies reported so far. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bovens, M; Csesztregi, T; Franc, A; Nagy, J; Dujourdy, L
2014-01-01
The basic goal in sampling for the quantitative analysis of illicit drugs is to maintain the average concentration of the drug in the material from its original seized state (the primary sample) all the way through to the analytical sample, where the effect of particle size is most critical. The size of the largest particles of different authentic illicit drug materials, in their original state and after homogenisation using manual or mechanical procedures, was measured using a microscope with a camera attachment. The comminution methods employed included pestle and mortar (manual) and various ball and knife mills (mechanical). The drugs investigated were amphetamine, heroin, cocaine and herbal cannabis. It was shown that comminution of illicit drug materials using these techniques reduces the nominal particle size from approximately 600 μm down to between 200 and 300 μm. It was demonstrated that the choice of 1 g increments for the primary samples of powdered drugs and cannabis resin, which were used in the heterogeneity part of our study (Part I), was correct for the routine quantitative analysis of illicit seized drugs. For herbal cannabis we found that the appropriate increment size was larger. Based on the results of this study we can generally state that an analytical sample weight of between 20 and 35 mg of an illicit powdered drug, with an assumed purity of 5% or higher, would be considered appropriate and would generate a sampling RSD in the same region as the analysis RSD for a typical quantitative method of analysis of the most common, powdered, illicit drugs. For herbal cannabis, with an assumed purity of 1% THC (tetrahydrocannabinol) or higher, an analytical sample weight of approximately 200 mg would be appropriate. In Part III we will pull together our homogeneity studies and particle size investigations and use them to devise sampling plans and sample preparations suitable for the quantitative instrumental analysis of the most common illicit drugs.
Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
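The link between particle size, analytical sample weight, and sampling RSD can be sketched with simple binomial particle statistics. This is a rough back-of-the-envelope model, not the study's method: the particle density, the monodisperse-sphere assumption, and the premise that only a fraction of particles carry drug are all illustrative.

```python
import math

def particles_in_sample(sample_mg, d_um, density_g_cm3=1.3):
    """Approximate number of uniform spherical particles of diameter
    d_um (micrometres) in a sample of sample_mg milligrams."""
    d_cm = d_um * 1e-4
    particle_mass_mg = density_g_cm3 * (math.pi / 6.0) * d_cm ** 3 * 1000.0
    return sample_mg / particle_mass_mg

def rsd_sampling(sample_mg, d_um, purity):
    """Poisson-style estimate: if only a fraction `purity` of particles
    carry drug, the relative sampling error scales as 1/sqrt(n_drug)."""
    n_drug = particles_in_sample(sample_mg, d_um) * purity
    return 100.0 / math.sqrt(n_drug)  # percent

# 25 mg sample, 250-um particles (post-comminution), 5% purity
print(round(rsd_sampling(25.0, 250.0, 0.05), 1))
```

Even this crude estimate lands in the single-digit-percent range for a 20-35 mg sample of comminuted powder, consistent with the study's conclusion that such sample weights keep the sampling RSD comparable to the analysis RSD.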
NASA Astrophysics Data System (ADS)
de Sanctis, Luca; Galla, Tobias
2009-04-01
We study the effects of bounded confidence thresholds and of interaction and external noise on Axelrod’s model of social influence. Our study is based on a combination of numerical simulations and an integration of the mean-field master equation describing the system in the thermodynamic limit. We find that interaction thresholds affect the system only quantitatively, but that they do not alter the basic phase structure. The known crossover between an ordered and a disordered state in finite systems subject to external noise persists in models with general confidence threshold. Interaction noise here facilitates the dynamics and reduces relaxation times. We also study Axelrod systems with metric features and point out similarities and differences compared to models with nominal features.
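A minimal version of the Axelrod dynamics with a bounded confidence threshold can be sketched as follows. The ring topology, parameter values, and step count are illustrative choices, not the configuration used in the paper:

```python
import random

def axelrod_step(agents, n_features, threshold=0.0):
    """One update of Axelrod's model: a random agent interacts with a
    ring neighbor with probability equal to their cultural overlap,
    but only if the overlap meets the confidence threshold; on success
    it copies one differing feature from the neighbor."""
    i = random.randrange(len(agents))
    j = (i + random.choice((-1, 1))) % len(agents)
    overlap = sum(a == b for a, b in zip(agents[i], agents[j])) / n_features
    differing = [k for k in range(n_features) if agents[i][k] != agents[j][k]]
    if differing and overlap >= threshold and random.random() < overlap:
        agents[i][random.choice(differing)] = agents[j][random.choice(differing)]

random.seed(1)
F, q, n = 3, 2, 20  # features, traits per feature, agents
agents = [[random.randrange(q) for _ in range(F)] for _ in range(n)]
for _ in range(50000):
    axelrod_step(agents, F, threshold=1 / 3)
print(len({tuple(a) for a in agents}))  # number of surviving cultures
```

Raising `threshold` restricts which pairs may interact at all, which is the kind of bounded-confidence modification whose quantitative (but not qualitative) effect the paper reports.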
Diagnostic value of plasma morphology in patients with coronary heart disease
NASA Astrophysics Data System (ADS)
Malinova, Lidia I.; Sergeeva, Yuliya V.; Simonenko, Georgy V.; Tuchin, Valery V.; Denisova, Tatiana P.
2006-08-01
Blood plasma can be considered a special water system capable of self-organization. Plasma slides produced by wedge dehydration reflect its stereochemical interactions, and their study can be used in diagnostics. 46 patients with coronary heart disease were studied. The main group consisted of men aged 54 to 72 years with stable angina pectoris of functional class II or III by the Canadian classification (n=25). The comparison group consisted of men aged 40-82 who were hospitalized with a diagnosis of acute coronary syndrome. Clinical examination, basic biochemical tests, and functional plasma morphology characteristics were studied. A number of qualitative and quantitative differences in the blood plasma morphology of patients with chronic versus acute forms of coronary disease were revealed.
Purcell, Maureen K.; Getchell, Rodman G.; McClure, Carol A.; Weber, S.E.; Garver, Kyle A.
2011-01-01
Real-time, or quantitative, polymerase chain reaction (qPCR) is quickly supplanting other molecular methods for detecting the nucleic acids of human and other animal pathogens owing to the speed and robustness of the technology. As the aquatic animal health community moves toward implementing national diagnostic testing schemes, it will need to evaluate how qPCR technology should be employed. This review outlines the basic principles of qPCR technology, considerations for assay development, standards and controls, assay performance, diagnostic validation, implementation in the diagnostic laboratory, and quality assurance and control measures. These factors are fundamental for ensuring the validity of qPCR assay results obtained in the diagnostic laboratory setting.
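One of the basic performance metrics behind qPCR assay validation is the amplification efficiency derived from the slope of a standard curve (Ct versus log10 of template copies). A sketch using an invented, perfectly linear dilution series:

```python
def fit_standard_curve(log10_copies, cts):
    """Least-squares slope and intercept of Ct vs log10(copy number)."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
             / sum((x - mx) ** 2 for x in log10_copies))
    return slope, my - slope * mx

def amplification_efficiency(slope):
    """E = 10**(-1/slope) - 1; perfect doubling gives slope ~ -3.32
    and E = 1.0 (i.e. 100% efficiency)."""
    return 10.0 ** (-1.0 / slope) - 1.0

# invented 10-fold dilution series: 10^2 .. 10^6 copies
slope, intercept = fit_standard_curve([2, 3, 4, 5, 6],
                                      [33.2, 29.9, 26.6, 23.3, 20.0])
print(round(slope, 2), round(amplification_efficiency(slope), 2))
```

A slope near −3.3 corresponds to close to 100% efficiency, one of the acceptance criteria typically checked during the diagnostic validation the review describes.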
Fraunhofer line-depth sensing applied to water
NASA Technical Reports Server (NTRS)
Stoertz, G. E.
1969-01-01
An experimental Fraunhofer line discriminator is basically an airborne fluorometer, capable of quantitatively measuring the concentration of fluorescent substances dissolved in water. It must be calibrated against standards and supplemented by ground-truth data on turbidity and on the approximate vertical distribution of the fluorescent substance. Quantitative use requires that it be known in advance what substance is the source of the luminescence emission; qualitative sensing, or mere detection of luminescence, is also possible. The two approaches are fundamentally different, having different purposes, different applications, and different instruments. When used for sensing of Rhodamine WT dye in coastal waters and estuaries, the FLD senses in the spectral region permitting nearly maximum depth of light penetration.
Quantitative experiments to explain the change of seasons
NASA Astrophysics Data System (ADS)
Testa, Italo; Busarello, Gianni; Puddu, Emanuella; Leccia, Silvio; Merluzzi, Paola; Colantonio, Arturo; Moretti, Maria Ida; Galano, Silvia; Zappia, Alessandro
2015-03-01
The science education literature shows that students have difficulty understanding what causes the seasons. Incorrect explanations are often due to a lack of knowledge about the physical mechanisms underlying this phenomenon. To address this, we present a module in which the students engage in quantitative measurements with a photovoltaic panel to explain the changes in the flux of solar radiation at Earth's surface over the year. The activities also provide examples of energy transfers between the incoming radiation and the environment to introduce basic features of Earth's climate. The module was evaluated with 45 secondary school students (aged 17-18) using a pre-/post-test research design. Analysis of students' learning outcomes supports the effectiveness of the proposed activities.
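The quantity such panel measurements track is the flux per unit horizontal area, F = S₀·cos(zenith angle), which varies over the year as the solar declination changes. A sketch using a standard declination approximation (the latitude and the noon-only, clear-sky simplification are illustrative assumptions, not the module's protocol):

```python
import math

def noon_flux(latitude_deg, day_of_year, s0=1361.0):
    """Top-of-atmosphere flux (W/m^2) on a horizontal surface at solar
    noon: F = S0 * cos(zenith). Declination from the common cosine
    approximation; valid for mid latitudes where the sun stays south
    (or north) of the zenith."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    zenith = abs(latitude_deg - decl)
    return s0 * math.cos(math.radians(zenith))

# ~41 N latitude: summer solstice (day 172) vs winter solstice (day 355)
print(round(noon_flux(41.0, 172)), round(noon_flux(41.0, 355)))
```

The factor-of-two difference between the solstices comes entirely from the cosine of the incidence angle, the point the module's measurements are designed to make tangible.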
How to Combine ChIP with qPCR.
Asp, Patrik
2018-01-01
Chromatin immunoprecipitation (ChIP) coupled with quantitative PCR (qPCR) has in the last 15 years become a basic mainstream tool in genomic research. Numerous commercially available ChIP kits, qPCR kits, and real-time PCR systems allow for quick and easy analysis of virtually anything chromatin-related as long as there is an available antibody. However, the highly accurate quantitative dimension added by using qPCR to analyze ChIP samples significantly raises the bar in terms of experimental accuracy, appropriate controls, data analysis, and data presentation. This chapter will address these potential pitfalls by providing protocols and procedures that address the difficulties inherent in ChIP-qPCR assays.
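A common quantitative readout in ChIP-qPCR is "percent of input". A minimal sketch assuming 100% PCR efficiency; the Ct values and the 1% input fraction are made up for illustration:

```python
import math

def percent_input(ct_ip, ct_input, input_fraction=0.01):
    """ChIP enrichment as percent of input chromatin. The measured
    input aliquot is only `input_fraction` of the chromatin, so its Ct
    is first shifted to the value 100% input would give (lower by
    log2(1/fraction)); doubling per cycle is assumed."""
    ct_input_100 = ct_input + math.log2(input_fraction)
    return 100.0 * 2.0 ** (ct_input_100 - ct_ip)

# IP Ct 28.0 vs 1% input Ct 25.0 -> 0.125% of input recovered
print(round(percent_input(28.0, 25.0), 3))
```

Running the same calculation on an IgG (mock) ChIP gives the background level against which specific enrichment is judged, one of the controls the chapter emphasizes.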
Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero
2013-05-06
We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summated to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of which 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. 
For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved the maximum scores for participation as an observer or assistant. Medical students reported that use of an electronic portfolio that provided quantitative feedback on their progress was useful when the number and complexity of targets were appropriate, but not when the portfolio offered only formative evaluations based on reflection. Students felt that use of the e-Portfolio guided their learning process by indicating knowledge gaps to themselves and teachers.
Basic quantitative assessment of visual performance in patients with very low vision.
Bach, Michael; Wilke, Michaela; Wilhelm, Barbara; Zrenner, Eberhart; Wilke, Robert
2010-02-01
A variety of approaches to developing visual prostheses are being pursued: subretinal, epiretinal, via the optic nerve, or via the visual cortex. This report presents a method of comparing their efficacy at genuinely improving visual function, starting at no light perception (NLP). A test battery (a computer program, Basic Assessment of Light and Motion [BaLM]) was developed in four basic visual dimensions: (1) light perception (light/no light), with an unstructured large-field stimulus; (2) temporal resolution, with single versus double flash discrimination; (3) localization of light, where a wedge extends from the center into four possible directions; and (4) motion, with a coarse pattern moving in one of four directions. Two- or four-alternative, forced-choice paradigms were used. The participants' responses were self-paced and delivered with a keypad. The feasibility of the BaLM was tested in 73 eyes of 51 patients with low vision. The light and time test modules discriminated between NLP and light perception (LP). The localization and motion modules showed no significant response for NLP but discriminated between LP and hand movement (HM). All four modules reached their ceilings in the acuity categories higher than HM. BaLM results systematically differed between the very-low-acuity categories NLP, LP, and HM. Light and time yielded similar results, as did localization and motion; still, for assessing the visual prostheses with differing temporal characteristics, they are not redundant. The results suggest that this simple test battery provides a quantitative assessment of visual function in the very-low-vision range from NLP to HM.
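For the four-alternative forced-choice modules, chance performance is 25%, so any quantitative criterion for "above chance" follows from the binomial tail. A sketch (the trial count and 5% significance level are illustrative, not the BaLM's published criteria):

```python
from math import comb

def p_at_least(k, n, p_chance):
    """Probability of k or more correct responses out of n trials
    by guessing alone (binomial upper tail)."""
    return sum(comb(n, i) * p_chance**i * (1 - p_chance)**(n - i)
               for i in range(k, n + 1))

# 4-alternative test, 20 trials: smallest number of correct responses
# that could not plausibly arise from guessing at the 5% level
n = 20
for k in range(n + 1):
    if p_at_least(k, n, 0.25) < 0.05:
        print(k)
        break
```

With 20 trials, 9 or more correct responses (45%) are needed before performance is distinguishable from guessing, which is why forced-choice low-vision tests need a sufficient number of self-paced trials per module.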
The Design of a Quantitative Western Blot Experiment
Taylor, Sean C.; Posch, Anton
2014-01-01
Western blotting is a technique that has been in practice for more than three decades that began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots and this now requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055
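The fold-change calculation from densitometry reduces to normalizing each target band to its lane's loading control before comparing samples. A sketch with made-up band intensities (not data from the cited workflow):

```python
def fold_change(target_a, loading_a, target_b, loading_b):
    """Fold change of a target protein between samples a and b, with
    each densitometry value normalized to its lane's loading control."""
    return (target_b / loading_b) / (target_a / loading_a)

# made-up band intensities: without normalization the change would look
# like 2.5x (3000/1200); normalization corrects it to 2.0x
print(fold_change(1200.0, 800.0, 3000.0, 1000.0))  # 2.0
```

This normalization step is exactly where unequal loading would otherwise masquerade as differential expression, which is why quantitative western blotting puts such weight on the loading control's own linear range.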
Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.
Mantle, M D
2011-09-30
The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.
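The T1 and T2 effects on image quantitation mentioned above are captured, for an idealized spin-echo acquisition, by S = S0·(1 − e^(−TR/T1))·e^(−TE/T2). A sketch (all timing and relaxation values are illustrative):

```python
import math

def spin_echo_signal(s0, tr, te, t1, t2):
    """Idealized spin-echo amplitude: saturation recovery governed by
    T1 over the repetition time TR, times transverse (T2) decay over
    the echo time TE. All times in ms."""
    return s0 * (1.0 - math.exp(-tr / t1)) * math.exp(-te / t2)

# long TR and short TE -> signal close to the true spin density s0,
# i.e. a nearly quantitative (proton-density-weighted) image
print(spin_echo_signal(1.0, 5000.0, 10.0, 1000.0, 80.0))
```

Shortening TR or lengthening TE pulls the measured intensity away from the true spin density, which is precisely the relaxation bias quantitative micro-imaging protocols must correct for.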
Analyzing 7000 texts on deep brain stimulation: what do they tell us?
Ineichen, Christian; Christen, Markus
2015-01-01
The enormous increase in the number of scientific publications in recent decades requires quantitative methods for obtaining a better understanding of topics and developments in various fields. In this exploratory study, we investigate the emergence, trends, and connections of topics within the whole text corpus of the deep brain stimulation (DBS) literature, based on more than 7000 papers (titles and abstracts) published between 1991 and 2014, using a network approach. Taking the co-occurrence of basic terms that represent important topics within DBS as a starting point, we outline the statistics of interconnections between DBS indications, anatomical targets, positive and negative effects, as well as methodological, technological, and economic issues. This quantitative approach confirms known trends within the literature (e.g., the emergence of psychiatric indications). The data also reflect an increased discussion of complex issues such as personality, tied closely to the ethical context, as well as an apparent focus on depression as an important DBS indication, for which the co-occurrence of terms related to negative effects is low both for the indication and for the related anatomical targets. We also discuss consequences of the analysis from a bioethical perspective, i.e., how such a quantitative analysis could uncover hidden subject matters that have ethical relevance. For example, we find that hardware-related issues in DBS are far more robustly connected to an ethical context than impulsivity, concrete side effects, or death/suicide. Our contribution also outlines the methodology of quantitative text analysis, which combines statistical approaches with expert knowledge, and thus serves as an example of how innovative quantitative tools can be made useful for gaining a better understanding of the field of DBS.
PMID:26578908
Publication Trends in Thanatology: An Analysis of Leading Journals.
Wittkowski, Joachim; Doka, Kenneth J; Neimeyer, Robert A; Vallerga, Michael
2015-01-01
To identify important trends in thanatology as a discipline, the authors analyzed over 1,500 articles that appeared in Death Studies and Omega over a 20-year period, coding the category of articles (e.g., theory, application, empirical research), their content focus (e.g., bereavement, death attitudes, end-of-life), and for empirical studies, their methodology (e.g., quantitative, qualitative). In general, empirical research predominates in both journals, with quantitative methods outnumbering qualitative procedures 2 to 1 across the period studied, despite an uptick in the latter methods in recent years. Purely theoretical articles, in contrast, decline in frequency. Research on grief and bereavement is the most commonly occurring (and increasing) content focus of this work, with a declining but still substantial body of basic research addressing death attitudes. Suicidology is also well represented in the corpus of articles analyzed. In contrast, publications on topics such as death education, medical ethics, and end-of-life issues occur with lower frequency, in the latter instances likely due to the submission of such work to more specialized medical journals. Differences in emphasis of Death Studies and Omega are noted, and the analysis of publication patterns is interpreted with respect to overall trends in the discipline and the culture, yielding a broad depiction of the field and some predictions regarding its possible future.
TU-G-303-03: Machine Learning to Improve Human Learning From Longitudinal Image Sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veeraraghavan, H.
‘Radiomics’ refers to studies that extract a large amount of quantitative information from medical imaging studies as a basis for characterizing a specific aspect of patient health. Radiomics models can be built to address a wide range of outcome predictions, clinical decisions, basic cancer biology, etc. For example, radiomics models can be built to predict the aggressiveness of an imaged cancer, cancer gene expression characteristics (radiogenomics), radiation therapy treatment response, etc. Technically, radiomics brings together quantitative imaging, computer vision/image processing, and machine learning. In this symposium, speakers will discuss approaches to radiomics investigations, including: longitudinal radiomics, radiomics combined with other biomarkers (‘pan-omics’), radiomics for various imaging modalities (CT, MRI, and PET), and the use of registered multi-modality imaging datasets as a basis for radiomics. There are many challenges to the eventual use of radiomics-derived methods in clinical practice, including: standardization and robustness of selected metrics, accruing the data required, building and validating the resulting models, registering longitudinal data that often involve significant patient changes, reliable automated cancer segmentation tools, etc. Despite the hurdles, results achieved so far indicate the tremendous potential of this general approach to quantifying and using data from medical images. Specific applications of radiomics to be presented in this symposium will include: the longitudinal analysis of patients with low-grade gliomas; automatic detection and assessment of patients with metastatic bone lesions; image-based monitoring of patients with growing lymph nodes; predicting radiotherapy outcomes using multi-modality radiomics; and studies relating radiomics with genomics in lung cancer and glioblastoma. Learning Objectives: Understanding the basic image features that are often used in radiomic models. Understanding requirements for reliable radiomic models, including robustness of metrics, adequate predictive accuracy, and generalizability. Understanding the methodology behind radiomic-genomic (‘radiogenomics’) correlations. Research supported by NIH (US), CIHR (Canada), and NSERC (Canada)
TU-G-303-04: Radiomics and the Coming Pan-Omics Revolution
DOE Office of Scientific and Technical Information (OSTI.GOV)
El Naqa, I.
‘Radiomics’ refers to studies that extract a large amount of quantitative information from medical imaging studies as a basis for characterizing a specific aspect of patient health. Radiomics models can be built to address a wide range of outcome predictions, clinical decisions, basic cancer biology, etc. For example, radiomics models can be built to predict the aggressiveness of an imaged cancer, cancer gene expression characteristics (radiogenomics), radiation therapy treatment response, etc. Technically, radiomics brings together quantitative imaging, computer vision/image processing, and machine learning. In this symposium, speakers will discuss approaches to radiomics investigations, including: longitudinal radiomics, radiomics combined with other biomarkers (‘pan-omics’), radiomics for various imaging modalities (CT, MRI, and PET), and the use of registered multi-modality imaging datasets as a basis for radiomics. There are many challenges to the eventual use of radiomics-derived methods in clinical practice, including: standardization and robustness of selected metrics, accruing the data required, building and validating the resulting models, registering longitudinal data that often involve significant patient changes, reliable automated cancer segmentation tools, etc. Despite the hurdles, results achieved so far indicate the tremendous potential of this general approach to quantifying and using data from medical images. Specific applications of radiomics to be presented in this symposium will include: the longitudinal analysis of patients with low-grade gliomas; automatic detection and assessment of patients with metastatic bone lesions; image-based monitoring of patients with growing lymph nodes; predicting radiotherapy outcomes using multi-modality radiomics; and studies relating radiomics with genomics in lung cancer and glioblastoma. Learning Objectives: Understanding the basic image features that are often used in radiomic models. Understanding requirements for reliable radiomic models, including robustness of metrics, adequate predictive accuracy, and generalizability. Understanding the methodology behind radiomic-genomic (‘radiogenomics’) correlations. Research supported by NIH (US), CIHR (Canada), and NSERC (Canada)
Accuracy of press reports on gamma-ray astronomy
NASA Astrophysics Data System (ADS)
Schaefer, Bradley E.; Nemiroff, Robert J.; Hurley, Kevin
2000-09-01
Most Americans learn about modern science from press reports, yet such articles have a bad reputation among scientists. We performed a study of 148 news articles on gamma-ray astronomy to quantitatively answer the questions "How accurate are press reports of gamma-ray astronomy?" and "What fraction of the basic claims in the press are correct?" We took all articles on the topic from five news sources (UPI, New York Times, Sky & Telescope, Science News, and five middle-sized city newspapers) for one decade (1987-1996). We found an average rate of roughly one trivial error every two articles, while none of our 148 articles significantly misled the reader or misrepresented the science. This quantitative result is in stark contrast to the nearly universal opinion among scientists that the press frequently butchers science stories. So a major result from our study is that reporters should be rehabilitated into the good graces of astrophysicists, since they are actually doing a good job. For our second question, we rated each story with the probability that its basic new science claim is correct. We found that the average probability over all stories is 70%. Since the reporters and the scientists are both doing good jobs, why is 30% of the science you read in the press wrong? The reason is that news reporting by its nature presents front-line science, and front-line science by its nature has not yet reached reliable conclusions. The combination forces fast-breaking science news to include frequent incorrect ideas that are subsequently identified and corrected. So a second major result from our study is the distinction between textbook science (with reliability near 100%) and the front-line science you read about in the press (with reliability near 70%).
Melvin, Steven D; Petit, Marie A; Duvignacq, Marion C; Sumpter, John P
2017-08-01
The quality and reproducibility of science has recently come under scrutiny, with criticisms spanning disciplines. In aquatic toxicology, behavioural tests are currently an area of controversy, since inconsistent findings have been highlighted and attributed to poor-quality science. The problem likely relates to limitations in our understanding of basic behavioural patterns, which can influence our ability to design statistically robust experiments yielding ecologically relevant data. The present study takes a first step towards understanding baseline behaviours in fish, including how basic choices in experimental design might influence behavioural outcomes and interpretations in aquatic toxicology. Specifically, we explored how fish acclimate to behavioural arenas and how different lengths of observation time affect estimates of basic swimming parameters (i.e., average, maximum, and angular velocity). We performed a semi-quantitative literature review to place our findings in the context of the published literature describing behavioural tests with fish. Our results demonstrate that fish fundamentally change their swimming behaviour over time, and that acclimation and observational timeframes may therefore influence both the ecological relevance and the statistical robustness of behavioural toxicity tests. Our review identified 165 studies describing behavioural responses in fish exposed to various stressors, and revealed that the majority of publications documenting fish behavioural responses report extremely brief acclimation times and observational durations, which helps explain the inconsistencies identified across studies. We recommend that researchers applying behavioural tests with fish, and other species, apply a similar framework to better understand baseline behaviours and the implications of design choices for study outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
Yurt, Kıymet Kübra; Kivrak, Elfide Gizem; Altun, Gamze; Mohamed, Hamza; Ali, Fathelrahman; Gasmalla, Hosam Eldeen; Kaplan, Suleyman
2018-02-26
A quantitative description of a three-dimensional (3D) object based on two-dimensional images can be made using stereological methods. These methods involve unbiased approaches and provide reliable results with quantitative data. The quantitative morphology of the nervous system has been thoroughly researched in this context. In particular, novel design-based stereological approaches have been applied in neuromorphological studies. The main foundations of these methods are systematic random sampling and a 3D approach to structures such as tissues and organs. One key point is that the selected samples should represent the entire structure. Quantification of neurons, i.e., particles, is important for revealing degrees of neurodegeneration and regeneration in an organ or system. One of the most crucial morphometric parameters in biological studies is thus the "number". The disector counting method introduced by Sterio in 1984 is an efficient and reliable solution for particle number estimation. In order to obtain precise results by means of stereological analysis, the items to be counted must be clearly visible in the tissue; if an item cannot be seen, it cannot be analyzed, even with unbiased stereological techniques. Staining and sectioning processes therefore play a critical role in stereological analysis. The purpose of this review is to evaluate current neuroscientific studies using optical and physical disector counting methods and to discuss their definitions and methodological characteristics. Although the efficiency of the optical disector method in light microscopic studies has been demonstrated in recent years, the physical disector method is more easily performed in electron microscopic studies. We also offer readers summaries of some common staining and sectioning methods that can be used with stereological techniques. Copyright © 2018 Elsevier B.V. All rights reserved.
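Disector counts are commonly scaled to a total number estimate with the fractionator formula N = ΣQ⁻ × (1/ssf) × (1/asf) × (1/hsf), where ΣQ⁻ is the number of particles counted in the disectors and ssf, asf, and hsf are the section, area, and height sampling fractions. The sketch below illustrates this standard relationship; the sampling fractions and counts are fabricated for illustration, not taken from the review.

```python
def optical_fractionator_estimate(q_minus, ssf, asf, hsf):
    """Total particle number N = sum(Q-) / (ssf * asf * hsf), where:
    q_minus -- particles counted in the optical disectors (sum of Q-)
    ssf     -- section sampling fraction (e.g., every 10th section -> 0.1)
    asf     -- area sampling fraction (counting-frame area / sampling-grid area)
    hsf     -- height sampling fraction (disector height / section thickness)
    """
    return q_minus / (ssf * asf * hsf)

# Illustrative: 150 particles counted; every 10th section sampled (ssf=0.1),
# frames cover 2% of the section area (asf=0.02), disector height is half
# the section thickness (hsf=0.5).
print(optical_fractionator_estimate(150, ssf=0.1, asf=0.02, hsf=0.5))  # 150000.0
```

Because each fraction is a known sampling probability, the estimate is unbiased regardless of particle size, shape, or orientation, which is the central appeal of design-based stereology.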
Hellman, Therese; Jensen, Irene; Bergström, Gunnar; Brämberg, Elisabeth Björk
2016-01-01
The aim of the study presented in this article was to explore how professionals, without guidelines for implementing interprofessional teamwork, experience the collaboration within team-based rehabilitation for people with back pain and how this collaboration influences their clinical practice. This study employed a mixed methods design. A questionnaire was answered by 383 participants and 17 participants were interviewed. The interviews were analysed using content analysis. The quantitative results showed that the participants were satisfied with their team-based collaboration. Thirty percent reported that staff changes in the past year had influenced their clinical practice, of which 57% reported that these changes had had negative consequences. The qualitative findings revealed that essential features for an effective collaboration were shared basic values and supporting each other. Furthermore, aspects such as having enough time for reflection, staff continuity, and a shared view of the team members’ roles were identified as aspects which influenced the clinical practice. Important clinical implications for nurturing and developing a collaboration in team-based rehabilitation are to create shared basic values and a unified view of all team members’ roles and their contributions to the team. These aspects need to be emphasised on an ongoing basis and not only when the team is formed. PMID:27152534
Identification of Candidate Genes Underlying an Iron Efficiency Quantitative Trait Locus in Soybean
Peiffer, Gregory A.; King, Keith E.; Severin, Andrew J.; May, Gregory D.; Cianzio, Silvia R.; Lin, Shun Fu; Lauter, Nicholas C.; Shoemaker, Randy C.
2012-01-01
Prevalent on calcareous soils in the United States and abroad, iron deficiency is among the most common and severe nutritional stresses in plants. In soybean (Glycine max) commercial plantings, the identification and use of iron-efficient genotypes has proven to be the best form of managing this soil-related plant stress. Previous studies conducted in soybean identified a significant iron efficiency quantitative trait locus (QTL) explaining more than 70% of the phenotypic variation for the trait. In this research, we identified candidate genes underlying this QTL through molecular breeding, mapping, and transcriptome sequencing. Introgression mapping was performed using two related near-isogenic lines in which a region located on soybean chromosome 3 required for iron efficiency was identified. The region corresponds to the previously reported iron efficiency QTL. The location was further confirmed through QTL mapping conducted in this study. Transcriptome sequencing and quantitative real-time polymerase chain reaction identified two genes encoding transcription factors within the region that were significantly induced in soybean roots under iron stress. The two induced transcription factors were identified as homologs of the subgroup Ib basic helix-loop-helix (bHLH) genes that are known to regulate the strategy I response in Arabidopsis (Arabidopsis thaliana). Resequencing of these differentially expressed genes unveiled a significant deletion within a predicted dimerization domain. We hypothesize that this deletion disrupts the Fe-DEFICIENCY-INDUCED TRANSCRIPTION FACTOR (FIT)/bHLH heterodimer that has been shown to induce known iron acquisition genes. PMID:22319075
NASA Technical Reports Server (NTRS)
Sawada, H.; Sakakibara, S.; Sato, M.; Kanda, H.; Karasawa, T.
1984-01-01
A quantitative evaluation method for the suction effect of a suction plate on side walls is explained. Wind tunnel tests show that the wall interference is basically described by the sum of the wall interference in the case of two-dimensional flow and the interference of the side walls.
NASA Astrophysics Data System (ADS)
El-Zahry, Marwa R.; Lendl, Bernhard
2018-03-01
A simple, fast, and sensitive surface-enhanced Raman spectroscopy (SERS) method for the quantitative determination of the fluoroquinolone antibiotic ofloxacin (OFX) is presented. The stability behavior of OFX was also investigated by monitoring its SERS spectra after various degradation processes. Acidic, basic, and oxidative forced degradation processes were applied at different time intervals and followed by the SERS method, using silver nanoparticles (Ag NPs) as the SERS substrate. The Ag NP colloids were prepared by reduction of silver nitrate using polyethylene glycol (PEG) as a reducing and stabilizing agent. Validation tests were performed in accordance with International Conference on Harmonization (ICH) guidelines. A calibration curve with a correlation coefficient of R = 0.9992 was constructed relating the OFX concentration range (100-500 ng/ml) to the SERS intensity of the 1394 cm-1 band. The LOD and LOQ values were calculated to be 23.5 ng/ml and 72.6 ng/ml, respectively. The developed method was applied successfully to the quantitation of OFX in different pharmaceutical dosage forms. Kinetic parameters, including the rate constant of degradation of the studied antibiotic, were calculated.
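The ICH approach to detection limits from calibration data uses LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation of the regression and S is the calibration slope. A minimal sketch of that calculation follows; the intensity values are fabricated for illustration and are not the paper's data.

```python
# Fit a linear calibration by least squares, then apply the ICH formulas
# LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope, with sigma taken as the
# standard deviation of the regression residuals.
def lod_loq(concs, intensities):
    n = len(concs)
    mx = sum(concs) / n
    my = sum(intensities) / n
    sxx = sum((x - mx) ** 2 for x in concs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(concs, intensities))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(concs, intensities)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5  # residual std dev
    return 3.3 * sigma / slope, 10.0 * sigma / slope

concs = [100, 200, 300, 400, 500]    # ng/ml (illustrative levels)
intens = [210, 395, 610, 790, 1005]  # fabricated SERS counts at 1394 cm-1
lod, loq = lod_loq(concs, intens)
print(round(lod, 1), round(loq, 1))  # 16.8 51.0
```

With real calibration data, σ can instead be taken from the standard deviation of blank responses; the 3.3 and 10 factors are the ICH conventions for detection and quantitation, respectively.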
Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin
2016-04-19
Recent advances in mass spectrometers which have yielded higher resolution and faster scanning speeds have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled, parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performances of PRM and MS1-based assays in Q-Exactive were compared, and the MRM assay in QTRAP 6500 was also compared. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification and also showed greater flexibility in postacquisition assay refinement than the MRM assay in QTRAP 6500. We present a workflow for convenient PRM data processing using Skyline software which is free of charge. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for evaluation of large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics study.
Problem-based learning on quantitative analytical chemistry course
NASA Astrophysics Data System (ADS)
Fitri, Noor
2017-12-01
This research applies the problem-based learning method to quantitative analytical chemistry, the so-called "Analytical Chemistry II" course, especially in relation to essential oil analysis. The learning outcomes of this course include understanding of lectures, the skill of applying course materials, and the ability to identify, formulate, and solve chemical analysis problems. Study groups play an important role in improving students' learning ability and in completing independent and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course improves students' knowledge, skills, abilities, and attitudes. Students are not only skilled at solving problems in analytical chemistry, especially essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled at working with computer programs and able to understand materials and problems in English.
Inequality and School reform in Bahia, Brazil
NASA Astrophysics Data System (ADS)
Reiter, Bernd
2009-07-01
This article compares public and community schools in Salvador, the state capital of Bahia, Brazil. Based on quantitative data analysis and qualitative research conducted on-site during three research trips in 2001, 2003 and 2005, the author finds that Brazil's extreme inequality and the associated concentration of state power in a few hands stand in the way of an effective reform. In 1999, the state of Bahia started to reform its basic education cycle, but the author's research shows that Bahian elites use access to basic education to defend their inherited privilege. The analysis of community schools further demonstrates that inequality also blocks effective community and parental involvement in school management, as schools tend to distance themselves from neighbourhoods portrayed as poor and black, and thus "dangerous".
Positron Emission Tomography: Principles, Technology, and Recent Developments
NASA Astrophysics Data System (ADS)
Ziegler, Sibylle I.
2005-04-01
Positron emission tomography (PET) is a nuclear medical imaging technique for quantitative measurement of physiologic parameters in vivo (an overview of principles and applications can be found in [P.E. Valk, et al., eds. Positron Emission Tomography. Basic Science and Clinical Practice. 2003, Springer: Heidelberg]), based on the detection of small amounts of positron-emitter-labelled biologic molecules. Various radiotracers are available for neurological, cardiological, and oncological applications in the clinic and in research protocols. This overview describes the basic principles, technology, and recent developments in PET, followed by a section on the development of a tomograph with avalanche photodiodes dedicated to small animal imaging as an example of efforts in the domain of high-resolution tomographs.
Research progress in Asia on methods of processing laser-induced breakdown spectroscopy data
NASA Astrophysics Data System (ADS)
Guo, Yang-Min; Guo, Lian-Bo; Li, Jia-Ming; Liu, Hong-Di; Zhu, Zhi-Hao; Li, Xiang-You; Lu, Yong-Feng; Zeng, Xiao-Yan
2016-10-01
Laser-induced breakdown spectroscopy (LIBS) has attracted much attention in terms of both scientific research and industrial application. An important branch of LIBS research in Asia, the development of data processing methods for LIBS, is reviewed. First, the basic principle of LIBS and the characteristics of spectral data are briefly introduced. Next, two aspects of research on and problems with data processing methods are described: i) the basic principles of data preprocessing methods are elaborated in detail on the basis of the characteristics of spectral data; ii) the performance of data analysis methods in qualitative and quantitative analysis of LIBS is described. Finally, a direction for future development of data processing methods for LIBS is also proposed.
Economies of scale and asset values in power production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Considine, T.J.
While innovative trading tools have become an increasingly important aspect of the electricity business, the future of any firm in the industry boils down to the basic bread-and-butter issue of generating power at competitive costs. While buying electricity from power pools at spot prices instead of generating power to service load may be profitable for some firms in the short run, the need to efficiently utilize existing plants in the long run remains. These competitive forces will force the closure of many inefficient plants. As firms close plants and re-evaluate their generating asset portfolios, the basic structure of the industry will change. This article presents some quantitative analysis that sheds light on this unfolding transformation.
Melo, E Correa
2003-08-01
The author describes the reasons why evaluation processes should be applied to the Veterinary Services of Member Countries, either for trade in animals and animal products and by-products between two countries, or for establishing essential measures to improve the Veterinary Service concerned. The author also describes the basic elements involved in conducting an evaluation process, including the instruments for doing so. These basic elements centre on the following: designing a model, or desirable image, against which a comparison can be made; establishing a list of processes to be analysed and defining the qualitative and quantitative mechanisms for this analysis; and establishing a multidisciplinary evaluation team and developing a process for standardising the evaluation criteria.
Cor, M Ken
Interpreting results from quantitative research can be difficult when measures of concepts are constructed poorly, which can limit measurement validity. Social science steps for defining concepts, guidelines for limiting construct-irrelevant variance when writing self-report questions, and techniques for conducting basic item analysis are reviewed to inform the design of instruments to measure social science concepts in pharmacy education research. Based on a review of the literature, four main recommendations emerge: (1) employ a systematic process of conceptualization to derive nominal definitions; (2) write exact and detailed operational definitions for each concept; (3) when creating self-report questionnaires, write statements and select scales that avoid introducing construct-irrelevant variance (CIV); and (4) use basic item analysis results to inform instrument revision. Employing these recommendations will strengthen arguments supporting measurement validity, which in turn supports the defensibility of interpretations of study findings. An example from pharmacy education research is used to contextualize the concepts introduced. Copyright © 2017 Elsevier Inc. All rights reserved.
Basic versus applied research: Julius Sachs (1832-1897) and the experimental physiology of plants.
Kutschera, Ulrich
2015-01-01
The German biologist Julius Sachs was the first to introduce controlled, accurate, quantitative experimentation into the botanical sciences, and is regarded as the founder of modern plant physiology. His seminal monograph Experimental-Physiologie der Pflanzen (Experimental Physiology of Plants) was published 150 y ago (1865), when Sachs was employed as a lecturer at the Agricultural Academy in Poppelsdorf/Bonn (now part of the University). This book marks the beginning of a new era of basic and applied plant science. In this contribution, I summarize the achievements of Sachs and outline his lasting legacy. In addition, I show that Sachs was one of the first biologists who integrated bacteria, which he considered to be descendants of fungi, into the botanical sciences and discussed their interaction with land plants (degradation of wood etc.). This "plant-microbe-view" of green organisms was extended and elaborated by the laboratory botanist Wilhelm Pfeffer (1845-1920), so that the term "Sachs-Pfeffer-Principle of Experimental Plant Research" appears to be appropriate to characterize this novel way of performing scientific studies on green, photoautotrophic organisms (embryophytes, algae, cyanobacteria).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galan, Brandon R.; Wiedner, Eric S.; Helm, Monte L.
Nickel(II) complexes containing chelating N-heterocyclic carbene-phosphine ligands ([NiL2](BPh4)2, for which L = [MeIm(CH2)2PR2]) have been synthesized to study how this class of ligand affects the electrochemical properties compared with the nickel bis-diphosphine analogues. The nickel complexes were synthesized and characterized by X-ray crystallography and electrochemical methods. Based on the half-wave potentials (E1/2), substitution of an NHC for one of the phosphines in a diphosphine ligand results in shifts in potential of 0.6 V to 1.2 V more negative than the corresponding nickel bis-diphosphine complexes. These quantitative results highlight the substantial effect that NHC ligands can have upon the electronic properties of the metal complexes. BRG, JCL, and AMA acknowledge the support of the US Department of Energy, Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. MLH acknowledges the support of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science, Office of Basic Energy Sciences. Pacific Northwest National Laboratory is operated by Battelle for the US Department of Energy.
Cognitive and neural components of the phenomenology of agency.
Morsella, Ezequiel; Berger, Christopher C; Krieger, Stephen C
2011-06-01
A primary aspect of the self is the sense of agency – the sense that one is causing an action. In the spirit of recent reductionistic approaches to other complex, multifaceted phenomena (e.g., working memory; cf. Johnson & Johnson, 2009), we attempt to unravel the sense of agency by investigating its most basic components, without invoking high-level conceptual or 'central executive' processes. After considering the high-level components of agency, we examine the cognitive and neural underpinnings of its low-level components, which include basic consciousness and subjective urges (e.g., the urge to breathe when holding one's breath). Regarding urges, a quantitative review revealed that certain inter-representational dynamics (conflicts between action plans, as when holding one's breath) reliably engender fundamental aspects both of the phenomenology of agency and of 'something countering the will of the self'. The neural correlates of such dynamics, for both primordial urges (e.g., air hunger) and urges elicited in laboratory interference tasks, are entertained. In addition, we discuss the implications of this unique perspective for the study of disorders involving agency.
Precise pooling and dispensing of microfluidic droplets towards micro- to macro-world interfacing
Brouzes, Eric; Carniol, April; Bakowski, Tomasz; Strey, Helmut H.
2014-01-01
Droplet microfluidics possesses unique properties such as the ability to carry out multiple independent reactions without dispersion of samples in microchannels. We seek to extend the use of droplet microfluidics to a new range of applications by enabling its integration into workflows based on traditional technologies, such as microtiter plates. Our strategy consists of developing a novel method to manipulate, pool, and deliver a precise number of microfluidic droplets. To this aim, we present a basic module that combines droplet trapping with an on-chip valve. We quantitatively analyzed the trapping efficiency of the basic module in order to optimize its design. We also demonstrate the integration of the basic module into a multiplex device that can deliver 8 droplets per cycle. This device will have great impact on low-throughput droplet applications that require interfacing with macroscale technologies. The micro-to-macro interface is particularly critical in microfluidic applications aimed at sample preparation and has not been rigorously addressed in this context. PMID:25485102
Analysis of the YouTube videos on basic life support and cardiopulmonary resuscitation.
Tourinho, Francis Solange Vieira; de Medeiros, Kleyton Santos; Salvador, Pétala Tuani Candido De Oliveira; Castro, Grayce Loyse Tinoco; Santos, Viviane Euzébia Pereira
2012-01-01
To analyze videos on the YouTube video-sharing site, noting which points addressed in the videos relate to CPR and BLS, based on the 2010 Guidelines of the American Heart Association (AHA). This was an exploratory, quantitative and qualitative study performed on the YouTube sharing site, using as keywords the Portuguese expressions equivalent to the Medical Subject Headings (MeSH) terms "Cardiopulmonary Resuscitation" and "Basic Life Support", for videos that focused on basic life support. The two searches returned a total of 260 videos. After applying the exclusion criteria, 61 videos remained. These were mostly posted by individuals and belong to the Education category. Moreover, most of the videos, despite being added to the site after the publication of the 2010 AHA Guidelines, still followed the older 2005 guidelines. Although the video-sharing site YouTube is widely used today, it lacks videos about CPR and BLS that comply with the most recent AHA recommendations, which may negatively influence the population that uses it.
Brown, Ted
2010-01-01
In this review, 39 articles published in the American Journal of Occupational Therapy in 2008 and 2009 that were categorized in the practice area of children and youth were examined using content analysis. The most frequent type of research published was basic research, which accounted for 38.5% (n=15) of the 39 studies on the topic. Instrument development and testing and effectiveness studies were the next two most frequently noted research approaches, accounting for 25.6% (n=10) and 20.5% (n=8) of the studies, respectively. Among the 8 effectiveness studies, the level-of-evidence distribution was as follows: Level I, 3; Level III, 2; Level IV, 1; and Level V, 2. Quantitative studies were the predominant research paradigm, used in 76.9% (n=30) of the studies.
Epidemic spreading on adaptively weighted scale-free networks.
Sun, Mengfeng; Zhang, Haifeng; Kang, Huiyan; Zhu, Guanghu; Fu, Xinchu
2017-04-01
We introduce three modified SIS models on scale-free networks that take into account variable population size, nonlinear infectivity, adaptive weights, behavior inertia, and time delay, so as to better characterize the actual spread of epidemics. We develop new mathematical methods and techniques to study the dynamics of the models, including the basic reproduction number and the global asymptotic stability of the disease-free and endemic equilibria. We show that the disease-free equilibrium cannot undergo a Hopf bifurcation. We further analyze the effects of local information about diseases and of various immunization schemes on epidemic dynamics. We also perform stochastic network simulations, which yield quantitative agreement with the deterministic mean-field approach.
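The role of the basic reproduction number on a scale-free network can be illustrated with the classic heterogeneous (degree-block) mean-field SIS calculation. The sketch below uses illustrative parameters, a degree range of 2-100, and an exponent of 2.5 — all assumptions for demonstration, not values from the paper, whose modified models additionally include adaptive weights, nonlinear infectivity, and delays.

```python
# Degree-block mean-field SIS model on a scale-free degree distribution.
# All parameters below are hypothetical, chosen only for illustration.
ks = range(2, 101)
weights = {k: k ** -2.5 for k in ks}          # degree distribution ~ k^(-2.5)
Z = sum(weights.values())
pk = {k: w / Z for k, w in weights.items()}   # normalized P(k)

kmean = sum(k * pk[k] for k in ks)            # <k>
k2mean = sum(k * k * pk[k] for k in ks)       # <k^2>

beta, gamma = 0.05, 0.4                       # infection / recovery rates
R0 = beta * k2mean / (gamma * kmean)          # heterogeneous mean-field threshold

# Euler-iterate the mean-field equations for the infected fraction per degree class.
rho = {k: 0.01 for k in ks}                   # initial infected fraction
dt = 0.1
for _ in range(3000):
    # theta: probability that a randomly followed link points to an infected node
    theta = sum(k * pk[k] * rho[k] for k in ks) / kmean
    rho = {k: rho[k] + dt * (beta * k * (1 - rho[k]) * theta - gamma * rho[k])
           for k in ks}
prevalence = sum(pk[k] * rho[k] for k in ks)
print(f"R0 = {R0:.2f}, endemic prevalence = {prevalence:.3f}")
```

Above the threshold R0 = 1 the infected fraction settles at a nonzero endemic level, and high-degree classes saturate at much higher prevalence than low-degree ones — the behavior that targeted immunization schemes exploit.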
Hoste, H; Torres-Acosta, J F J; Quijada, J; Chan-Perez, I; Dakheel, M M; Kommuru, D S; Mueller-Harvey, I; Terrill, T H
2016-01-01
Interactions between host nutrition and feeding behaviour are central to understanding the pathophysiological consequences of infections of the digestive tract with parasitic nematodes. The manipulation of host nutrition provides useful options to control gastrointestinal nematodes as a component of an integrated strategy. Focussed mainly on the Haemonchus contortus infection model in small ruminants, this chapter (1) illustrates the relationship between quantitative (macro- and micro-nutrients) and qualitative (plant secondary metabolites) aspects of host nutrition and nematode infection, and (2) shows how basic studies aimed at addressing some generic questions can help to provide solutions, despite the considerable diversity of epidemiological situations and breeding systems. Copyright © 2016 Elsevier Ltd. All rights reserved.
An Online Risk Monitor System (ORMS) to Increase Safety and Security Levels in Industry
NASA Astrophysics Data System (ADS)
Zubair, M.; Rahman, Khalil Ur; Hassan, Mehmood Ul
2013-12-01
The main idea of this research is to develop an Online Risk Monitor System (ORMS) based on Living Probabilistic Safety Assessment (LPSA). The article highlights the essential features and functions of ORMS. The basic models and modules, such as the Reliability Data Update Model (RDUM), running-time update, redundant system unavailability update, Engineered Safety Features (ESF) unavailability update, and general system update, are described in this study. ORMS not only provides quantitative analysis but also highlights qualitative aspects of risk measures. ORMS is capable of automatically updating the online risk models and reliability parameters of equipment. ORMS can support the decision-making process of operators and managers in nuclear power plants.
The Fraunhofer line discriminator: An airborne fluorometer
NASA Technical Reports Server (NTRS)
Stoertz, G. E.
1969-01-01
An experimental Fraunhofer Line Discriminator (FLD) can differentiate and measure solar-stimulated luminescence when viewed against a background of reflected light. Key elements are two extremely sensitive photomultipliers, two glass-spaced Fabry-Perot filters having a bandwidth less than 1 A, and an analog computer. As in conventional fluorometers, concentration of a fluorescent substance is measured by comparison with standards. Quantitative use is probably accurate only at low altitudes but detection of luminescent substances should be possible from any altitude. Applications of the present FLD include remote sensing of fluorescent dyes used in studies of current dynamics. The basic technique is applicable to detection of oil spills, monitoring of pollutants, and sensing over land areas.
NASA Astrophysics Data System (ADS)
Köhler, Reinhard
2014-12-01
We have long been used to the domination of qualitative methods in modern linguistics. Indeed, qualitative methods have advantages such as ease of use and wide applicability to many types of linguistic phenomena. However, this should not overshadow the fact that a great part of human language is amenable to quantification. Moreover, qualitative methods may lead to over-simplification by employing a rigid yes/no scale. When the variability and vagueness of human language must be taken into account, qualitative methods prove inadequate and give way to quantitative methods [1, p. 11]. In addition to such advantages as exactness and precision, quantitative concepts and methods make it possible to find laws of human language just like those in the natural sciences. These laws are fundamental elements of linguistic theories in the spirit of the philosophy of science [2,3]. Theorization of this type is what quantitative linguistics [1,4,5] is devoted to. The review of Cong and Liu [6] has provided an informative and insightful survey of linguistic complex networks as a young field of quantitative linguistics, including the basic concepts and measures, the major lines of research with linguistic motivation, and suggestions for future research.
Li, Caixia; Tan, Xing Fei; Lim, Teck Kwang; Lin, Qingsong; Gong, Zhiyuan
2016-04-13
Omic approaches have been increasingly used in the zebrafish model for holistic understanding of molecular events and mechanisms of tissue functions. However, plasma is rarely used for omic profiling because of the technical challenges in collecting sufficient blood. In this study, we employed two mass spectrometric (MS) approaches for a comprehensive characterization of zebrafish plasma proteome, i.e. conventional shotgun liquid chromatography-tandem mass spectrometry (LC-MS/MS) for an overview study and quantitative SWATH (Sequential Window Acquisition of all THeoretical fragment-ion spectra) for comparison between genders. 959 proteins were identified in the shotgun profiling with estimated concentrations spanning almost five orders of magnitudes. Other than the presence of a few highly abundant female egg yolk precursor proteins (vitellogenins), the proteomic profiles of male and female plasmas were very similar in both number and abundance and there were basically no other highly gender-biased proteins. The types of plasma proteins based on IPA (Ingenuity Pathway Analysis) classification and tissue sources of production were also very similar. Furthermore, the zebrafish plasma proteome shares significant similarities with human plasma proteome, in particular in top abundant proteins including apolipoproteins and complements. Thus, the current study provided a valuable dataset for future evaluation of plasma proteins in zebrafish.
NASA Astrophysics Data System (ADS)
Mehrishal, Seyedahmad; Sharifzadeh, Mostafa; Shahriar, Korosh; Song, Jae-Jon
2017-04-01
In relation to the shearing of rock joints, the precise and continuous evaluation of asperity interlocking, dilation, and basic friction properties has been the most important task in the modeling of shear strength. In this paper, in order to investigate these controlling factors, two types of limestone joint samples were prepared and constant normal load (CNL) direct shear tests were performed on these joints under various shear conditions. One set of samples was travertine and the other was onyx marble; slickensided surfaces, surfaces ground to #80, and rough surfaces were tested. Direct shear experiments conducted on the slickensided and ground surfaces of limestone indicated that, with increasing applied normal stress under different shearing rates, the basic friction coefficient decreased. Moreover, in the shear tests under constant normal stress and shearing rate, the basic friction coefficient remained constant for the different contact sizes. The second series of direct shear experiments in this research was conducted on tension joint samples to evaluate the effect of surface roughness on the shear behavior of rough joints. This paper deals with dilation and roughness interlocking using a method that characterizes the surface roughness of the joint based on a fundamental combined surface roughness concept. The application of stress-dependent basic friction and quantitative roughness parameters in the continuous modeling of the shear behavior of rock joints is an important aspect of this research.
Representation matters: quantitative behavioral variation in wild worm strains
NASA Astrophysics Data System (ADS)
Brown, Andre
Natural genetic variation in populations is the basis of genome-wide association studies, an approach that has been applied in large studies of humans to study the genetic architecture of complex traits including disease risk. Of course, the traits you choose to measure determine which associated genes you discover (or miss). In large-scale human studies, the measured traits are usually taken as a given during the association step because they are expensive to collect and standardize. Working with the nematode worm C. elegans, we do not have the same constraints. In this talk I will describe how large-scale imaging of worm behavior allows us to develop alternative representations of behavior that vary differently across wild populations. The alternative representations yield novel traits that can be used for genome-wide association studies and may reveal basic properties of the genotype-phenotype map that are obscured if only a small set of fixed traits are used.
Structures, performance, benefit, cost study. [gas turbine engines
NASA Technical Reports Server (NTRS)
Feder, E.
1981-01-01
Aircraft engine structures were studied to identify the advanced structural technologies that would provide the most benefits to future aircraft operations. A series of studies identified engine systems with the greatest potential for improvements. Based on these studies, six advanced generic structural concepts were selected and conceptually designed. The benefits of each concept were quantitatively assessed in terms of thrust specific fuel consumption, weight, cost, maintenance cost, fuel burned and direct operating cost plus interest. The probability of success of each concept was also determined. The concepts were ranked and the three most promising were selected for further study which consisted of identifying and comprehensively outlining the advanced technologies required to develop these concepts for aircraft engine application. Analytic, fabrication, and test technology developments are required. The technology programs outlined emphasize the need to provide basic, fundamental understanding of technology to obtain the benefit goals.
Kristanti, Martina Sinta; Setiyarini, Sri; Effendy, Christantie
2017-01-17
Palliative care in Indonesia is problematic because of cultural and socio-economic factors. In Indonesia, the family is an integral part of the caregiving process in inpatient and outpatient settings. However, most families are not adequately prepared to deliver basic care to their sick family member. This research is a pilot project aiming to evaluate how basic skills training (BST) given to family caregivers could enhance the quality of life (QoL) of palliative care cancer patients in Indonesia. The study is a prospective quantitative study with a pre- and post-test design. Thirty family caregivers of cancer patients were trained in basic skills, including showering, washing hair, assisting with fecal and urinary elimination, oral care, and feeding at the bedside. Patients' QoL was measured at baseline and 4 weeks after training using the EORTC QLQ-C30. Hypothesis testing was done using the related-samples Wilcoxon signed-rank test. A paired t-test and one-way ANOVA were used to determine in which subgroups the intervention was more significant. The intervention showed a significant change in patients' global health status/QoL, emotional and social functioning, pain, fatigue, dyspnea, insomnia, appetite loss, constipation, and financial hardship. Male patients showed a significant effect on global health status/QoL (p = 0.030); female patients showed significant effects on dyspnea (p = 0.050) and constipation (p = 0.038). Younger patients showed a significant effect on global health status/QoL (p = 0.002). Patients between 45 and 54 years old showed a significant effect on the financial item (p = 0.039). Caregivers between 45 and 54 years old had a significant effect on patients' dyspnea (p = 0.031). Basic skills training for family caregivers produced changes in some aspects of QoL of palliative cancer patients. The intervention shows promise in maintaining the QoL of cancer patients, considering the socio-economic and cultural challenges in the provision of palliative care in Indonesia.
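The related-samples Wilcoxon signed-rank test used for such pre/post comparisons can be sketched briefly. The scores below are hypothetical, chosen only to show the mechanics of ranking paired differences; they are not data from the study.

```python
# Wilcoxon signed-rank statistic for paired pre/post scores (hypothetical data).
def wilcoxon_signed_rank(pre, post):
    """Return (W, n): W = min(W+, W-) over the n nonzero-difference pairs."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]   # drop zero differences
    order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(order):                     # assign average ranks to tied |d|
        j = i
        while j < len(order) and abs(diffs[order[j]]) == abs(diffs[order[i]]):
            j += 1
        for idx in order[i:j]:
            ranks[idx] = (i + j + 1) / 2.0    # mean of ranks i+1 .. j
        i = j
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    w_minus = sum(r for r, d in zip(ranks, diffs) if d < 0)
    return min(w_plus, w_minus), len(diffs)

# Hypothetical global QoL scores for 8 patients before and after the training.
pre_scores  = [50, 55, 60, 45, 70, 65, 40, 58]
post_scores = [62, 60, 72, 50, 68, 75, 55, 58]
W, n = wilcoxon_signed_rank(pre_scores, post_scores)
print(f"W = {W}, n = {n}")   # compare W to the critical value for n pairs
```

The statistic W is compared against the tabulated critical value for n pairs (or a normal approximation for large n); a small W indicates that the differences are predominantly in one direction.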
Accuracy of Press Reports in Astronomy
NASA Astrophysics Data System (ADS)
Schaefer, B. E.; Hurley, K.; Nemiroff, R. J.; Branch, D.; Perlmutter, S.; Schaefer, M. W.; Consolmagno, G. J.; McSween, H.; Strom, R.
1999-12-01
Most Americans learn about modern science from press reports, although such articles have a bad reputation among scientists. We have performed a study of 403 news articles on three topics (gamma-ray astronomy, supernovae, and Mars) to quantitatively answer the questions 'How accurate are press reports of astronomy?' and 'What fraction of the basic science claims in the press are correct?' We took all articles on these topics from five news sources (UPI, NYT, S&T, SN, and 5 newspapers) for one decade (1987-1996). All articles were evaluated for a variety of errors, ranging from the fundamental to the trivial. For 'trivial' errors, S&T and SN were virtually perfect, while the various newspapers averaged roughly one trivial error every two articles. For meaningful errors, we found that none of our 403 articles significantly misled the reader or misrepresented the science. So a major result of our study is that reporters should be rehabilitated into the good graces of astronomers, since they are actually doing a good job. For our second question, we rated each story with the probability that its basic new science claim is correct. We found that the average probability over all stories is 70%, regardless of source, topic, importance, or quoted pundit. How do we reconcile the finding that the press does not make significant errors with the finding that the basic science presented is 30% wrong? The reason is that the nature of news reporting is to present front-line science, and the nature of front-line science is that reliable conclusions have not yet been reached. So a second major result of our study is to make the distinction between textbook science (with reliability near 100%) and the front-line science you read in the press (with reliability near 70%).
Trends in fluorescence imaging and related techniques to unravel biological information.
Haustein, Elke; Schwille, Petra
2007-09-01
Optical microscopy is among the most powerful tools that the physical sciences have ever provided biology. It is indispensable for basic lab work as well as for cutting-edge research, as the visual monitoring of life processes still belongs to the most compelling evidence for a multitude of biomedical applications. Along with the rapid development of new probes and methods for the analysis of laser-induced fluorescence, optical microscopy has over the past years experienced a vast increase in both new techniques and novel combinations of established methods to study biological processes with unprecedented spatial and temporal precision. On the one hand, major technical advances have significantly improved spatial resolution. On the other hand, life scientists are moving toward three- and even four-dimensional cell biology and biophysics, involving time as a crucial coordinate to quantitatively understand living specimens. Monitoring the whole cell or tissue in real time, rather than producing snapshot-like two-dimensional projections, will enable more physiological and, thus, more clinically relevant experiments, whereas an increase in temporal resolution facilitates monitoring fast nonperiodic processes as well as the quantitative analysis of characteristic dynamics.
NASA Astrophysics Data System (ADS)
Mano, Kazune; Tanigawa, Shohei; Hori, Makoto; Yokota, Daiki; Wada, Kenji; Matsunaka, Toshiyuki; Morikawa, Hiroyasu; Horinaka, Hiromichi
2016-07-01
Fatty liver is a disease caused by the excess accumulation of fat in the human liver. The early diagnosis of fatty liver is very important, because fatty liver is a major marker linked to metabolic syndrome. We previously proposed the ultrasonic velocity change imaging method to diagnose fatty liver, exploiting the fact that the temperature dependence of ultrasonic velocity differs between water and fat. To diagnose the stage of fatty liver, we carried out a feasibility study of the quantitative assessment of fat content in the human liver using our ultrasonic velocity change imaging method. Experimental results showed that the fat content in a tissue-mimicking phantom containing lard was determined from its ultrasonic velocity change in the flat temperature region formed by a circular warming ultrasonic transducer with an acoustic lens of appropriate focal length. By considering the results of our simulation using a thermal diffusion equation, we determined whether this method could be applied to fatty liver assessment under the condition that the tissue exhibits the thermal relaxation effect caused by blood flow.
NASA Astrophysics Data System (ADS)
Brun, Christophe
2017-05-01
This paper is the second part of a study of a katabatic jet along a convexly curved slope with a maximum angle of about 35.5°. Large-Eddy Simulation (LES) is performed with a special focus on the outer-layer shear of the katabatic jet. In the first part, a basic statistical quantitative analysis of the flow was performed. Here, a qualitative and quantitative description of vortical structures is used to gain insight into the present 3-D turbulent flow. It is shown that Görtler vortices oriented in the streamwise downslope direction develop in the shear layer. They spread with a characteristic mushroom shape in the vertical direction up to a height of about 100 m. They play a major role in local turbulent mixing in the ground-surface boundary layer. The present curved-slope configuration constitutes a realistic model for alpine orography. This paper provides a procedure based on local turbulence anisotropy to track Görtler vortices in in situ measurements, which has not previously been proposed in the literature.
Liu, Yang; Wilson, W David
2010-01-01
Surface plasmon resonance (SPR) technology with biosensor surfaces has become a widely used tool for the study of nucleic acid interactions without any labeling requirements. The method provides simultaneous kinetic and equilibrium characterization of the interactions of biomolecules as well as of small molecule-biopolymer binding. SPR monitors molecular interactions in real time and provides significant advantages over optical or calorimetric methods for systems with strong binding coupled to small spectroscopic signals and/or reaction heats. A detailed and practical guide for nucleic acid interaction analysis using SPR-biosensor methods is presented. Details of the SPR technology and basic fundamentals are described, with recommendations on the preparation of the SPR instrument, sensor chips, and samples, as well as extensive information on experimental design, quantitative and qualitative data analysis, and presentation. A specific example of the interaction of a minor-groove-binding agent with DNA is evaluated by both kinetic and steady-state SPR methods to illustrate the technique. Since molecules that bind cooperatively to specific DNA sequences are attractive for many applications, a cooperative small molecule-DNA interaction is also presented.
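The kinetic and steady-state analyses mentioned here rest on the standard 1:1 (Langmuir) interaction model. A minimal sketch follows; the rate constants and Rmax are hypothetical values chosen for illustration, not numbers from the chapter.

```python
import math

# 1:1 kinetic model underlying SPR sensorgram analysis (hypothetical constants).
kon, koff = 1.0e5, 1.0e-2        # association (M^-1 s^-1) and dissociation (s^-1) rates
KD = koff / kon                   # equilibrium dissociation constant: 1e-7 M
Rmax = 100.0                      # saturation response in resonance units (RU)

def response(C, t):
    """Association-phase response R(t) at analyte concentration C (M)."""
    req = Rmax * C / (C + KD)     # steady-state response at this concentration
    kobs = kon * C + koff         # observed exponential rate of the sensorgram
    return req * (1.0 - math.exp(-kobs * t))

# Steady-state check: half-maximal response occurs at C = KD.
print(f"KD = {KD:.1e} M; R(C=KD, t->inf) = {response(KD, 1e6):.1f} RU")
```

Kinetic fits extract kon and koff from the observed rate kobs = kon*C + koff across a concentration series, while the steady-state route fits Req versus C; agreement between KD = koff/kon and the steady-state KD is a standard self-consistency check.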
Fluctuations and Noise in Stochastic Spread of Respiratory Infection Epidemics in Social Networks
NASA Astrophysics Data System (ADS)
Yulmetyev, Renat; Emelyanova, Natalya; Demin, Sergey; Gafarov, Fail; Hänggi, Peter; Yulmetyeva, Dinara
2003-05-01
For the analysis of the complexity of epidemic and disease dynamics, it is necessary to understand the basic principles and notions of its spreading in media with long-time memory. Here we consider the problem from a theoretical and practical viewpoint, presenting quantitative evidence confirming the existence of stochastic long-range memory and robust chaos in a real time series of infections of the human upper respiratory tract. In this work we present a new statistical method for analyzing the epidemic spread of grippe and acute respiratory tract infections by means of the theory of discrete non-Markov stochastic processes. We use the results of our recent theory (Phys. Rev. E 65, 046107 (2002)) to study statistical effects of memory in real data series describing the epidemic dynamics of human acute respiratory tract infections and grippe. The obtained results demonstrate the possibility of a strict quantitative description of the regular and stochastic components in the epidemic dynamics of social networks, taking into account time discreteness and the effects of statistical memory.
NASA Astrophysics Data System (ADS)
Döveling, Katrin
2015-04-01
In an age of rising impact of online communication in social network sites (SNS), emotional interaction is neither limited nor restricted by time or space. Bereavement extends to the anonymity of cyberspace. What role does virtual interaction in SNS play in dealing with the basic human emotion of grief caused by the loss of a beloved person? The analysis laid out in this article provides answers in light of an interdisciplinary perspective on online bereavement. Relevant lines of research are scrutinized. After laying out the theoretical spectrum for the study, hypotheses based on a prior in-depth qualitative content analysis of 179 postings in three different German online bereavement platforms are proposed and tested in a quantitative content analysis (2127 postings from 318 users). Emotion-regulation patterns in SNS, as well as similarities and differences in the online bereavement of children, adolescents, and adults, are revealed. Large-scale quantitative findings on central motives, patterns, and restorative effects of online shared bereavement in regulating distress, fostering personal empowerment, and engendering meaning are presented. The article closes with implications for further analysis of memorialization practices.
Quantitative comparison of a human cancer cell surface proteome between interphase and mitosis.
Özlü, Nurhan; Qureshi, Mohammad H; Toyoda, Yusuke; Renard, Bernhard Y; Mollaoglu, Gürkan; Özkan, Nazlı E; Bulbul, Selda; Poser, Ina; Timm, Wiebke; Hyman, Anthony A; Mitchison, Timothy J; Steen, Judith A
2015-01-13
The cell surface is the cellular compartment responsible for communication with the environment. The interior of mammalian cells undergoes dramatic reorganization when cells enter mitosis. These changes are triggered by activation of the CDK1 kinase and have been studied extensively. In contrast, very little is known of the cell surface changes during cell division. We undertook a quantitative proteomic comparison of cell surface-exposed proteins in human cancer cells that were tightly synchronized in mitosis or interphase. Six hundred and twenty-eight surface and surface-associated proteins in HeLa cells were identified; of these, 27 were significantly enriched at the cell surface in mitosis and 37 in interphase. Using imaging techniques, we confirmed the mitosis-selective cell surface localization of protocadherin PCDH7, a member of a family with anti-adhesive roles in embryos. We show that PCDH7 is required for development of full mitotic rounding pressure at the onset of mitosis. Our analysis provided basic information on how cell cycle progression affects the cell surface. It also provides potential pharmacodynamic biomarkers for anti-mitotic cancer chemotherapy. © 2014 The Authors.
Quantitative descriptive analysis of Italian polenta produced with different corn cultivars.
Zeppa, Giuseppe; Bertolino, Marta; Rolle, Luca
2012-01-30
Polenta is a porridge-like dish, generally made by mixing cornmeal with salt water and stirring constantly while cooking over a low heat. It can be eaten plain, straight from the pan, or topped with various foods (cheeses, meat, sausages, fish, etc.). It is most popular in northern Italy but can also be found in Switzerland, Austria, Croatia, Argentina and other countries in Eastern Europe and South America. Despite this diffusion, there are no data concerning the sensory characteristics of this product. A research study was therefore carried out to define the lexicon for a sensory profile of polenta and relationships with corn cultivars. A lexicon with 13 sensory parameters was defined and validated before references were determined. After panel training, the sensory profiles of 12 autochthonous maize cultivars were defined. The results of this research highlighted that quantitative descriptive analysis can also be used for the sensory description of polenta, and that the defined lexicon can be used to describe the sensory qualities of polenta for both basic research, such as maize selection, and product development. Copyright © 2011 Society of Chemical Industry.
Larsen, C R; Grantcharov, T; Aggarwal, R; Tully, A; Sørensen, J L; Dalsgaard, T; Ottesen, B
2006-09-01
Safe realistic training and unbiased quantitative assessment of technical skills are required for laparoscopy. Virtual reality (VR) simulators may be useful tools for training and assessing basic and advanced surgical skills and procedures. This study aimed to investigate the construct validity of the LapSimGyn VR simulator, and to determine the learning curves of gynecologists with different levels of experience. For this study, 32 gynecologic trainees and consultants (juniors or seniors) were allocated into three groups: novices (0 advanced laparoscopic procedures), intermediate level (>20 and <60 procedures), and experts (>100 procedures). All performed 10 sets of simulations consisting of three basic skill tasks and an ectopic pregnancy program. The simulations were carried out on 3 days within a maximum period of 2 weeks. Assessment of skills was based on time, economy of movement, and error parameters measured by the simulator. The data showed that expert gynecologists performed significantly and consistently better than intermediate and novice gynecologists. The learning curves differed significantly between the groups, showing that experts start at a higher level and more rapidly reach the plateau of their learning curve than do intermediate and novice groups of surgeons. The LapSimGyn VR simulator package demonstrates construct validity on both the basic skills module and the procedural gynecologic module for ectopic pregnancy. Learning curves can be obtained, but to reach the maximum performance for the more complex tasks, 10 repetitions do not seem sufficient at the given task level and settings. LapSimGyn also seems to be flexible and widely accepted by the users.
Artificial intelligence approaches for rational drug design and discovery.
Duch, Włodzisław; Swaminathan, Karthikeyan; Meller, Jarosław
2007-01-01
Pattern recognition, machine learning and artificial intelligence approaches play an increasingly important role in rational drug design, screening and identification of candidate molecules and studies on quantitative structure-activity relationships (QSAR). In this review, we present an overview of basic concepts and methodology in the fields of machine learning and artificial intelligence (AI). An emphasis is put on methods that enable an intuitive interpretation of the results and facilitate gaining an insight into the structure of the problem at hand. We also discuss representative applications of AI methods to docking, screening and QSAR studies. The growing trend to integrate computational and experimental efforts in that regard and some future developments are discussed. In addition, we comment on a broader role of machine learning and artificial intelligence approaches in biomedical research.
NASA Astrophysics Data System (ADS)
Valente, Diego; Savkar, Amit; Mokaya, Fridah; Wells, James
The Force Concept Inventory (FCI) has been analyzed and studied in various ways with regard to students' understanding of basic physics concepts. We present normalized learning gains and effect size calculations of FCI scores, taken in the context of large-scale classes at a 4-year public university with course instruction that incorporates elements of Just-In-Time teaching and active learning components. In addition, we present a novel way of using the FCI pre- and post-tests as predictors of students' performance on midterm and final exams. Utilizing a taxonomy table of physics concepts, we examine student performance broken down by topic, while also examining possible correlations between FCI post-test scores and other course assessments. College of Liberal Arts and Sciences (CLAS), UConn.
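The normalized learning gain and effect size mentioned in the abstract follow standard definitions (Hake's normalized gain and Cohen's d). The sketch below shows the arithmetic with made-up scores, not the study's data.

```python
from statistics import mean, stdev

def normalized_gain(pre: float, post: float) -> float:
    """Hake's normalized gain: fraction of the possible improvement
    realized, for scores on a 0-100 scale."""
    return (post - pre) / (100.0 - pre)

def cohens_d(group_a: list, group_b: list) -> float:
    """Cohen's d effect size: difference of means over the pooled SD."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# A class moving from a 40% pre-test mean to a 70% post-test mean
# realizes half of the available gain.
print(normalized_gain(40, 70))  # 0.5
```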
The solar power satellite concepts: The past decade and the next decade
NASA Technical Reports Server (NTRS)
Kraft, C. C., Jr.
1979-01-01
Results of studies on the solar power satellite concept are summarized. The basic advantages are near continuous access to sunlight and freedom from atmospheric effects and cloud cover. The systems definition studies consider photovoltaic and thermal energy conversion systems and find both to be technically feasible, with the photovoltaic approach preferred. A microwave test program is under way which will provide quantitative data on critical parameters, including beam forming and steering accuracy. Ballistic and winged launch vehicles are defined for the transportation of construction materials, with the shuttle expected to provide low cost transportation to and from space. A reference system is outlined for evaluating the concept in terms of environmental and other considerations. Preliminary estimates of natural resource requirements and energy payback intervals are encouraging.
Effects of Pressure on Stability of Biomolecules in Solutions Studied by Neutron Scattering
NASA Astrophysics Data System (ADS)
Bellissent-Funel, Marie-Claire; Appavou, Marie-Sousai; Gibrat, Gabriel
Studies of the pressure dependence of protein structure and dynamics contribute not only to basic knowledge of biological molecules but also have considerable practical relevance, as in food sterilization and pharmacy. Conformational changes induced by pressure, as well as its effects on protein stability, have mostly been studied by optical techniques (optical absorption, fluorescence, phosphorescence) and by NMR. Most optical techniques used so far give information related to the local environment of the probe used (fluorescent or phosphorescent tryptophan). Small angle neutron scattering and quasi-elastic neutron scattering provide essential complementary information to the optical data, giving quantitative data on the change of conformation of soluble globular proteins such as bovine pancreatic trypsin inhibitor (BPTI) and on the mobility of protons belonging to the protein surface residues.
Application of Risk-Based Inspection method for gas compressor station
NASA Astrophysics Data System (ADS)
Zhang, Meng; Liang, Wei; Qiu, Zeyang; Lin, Yang
2017-05-01
According to the complex process and lots of equipment, there are risks in gas compressor station. At present, research on integrity management of gas compressor station is insufficient. In this paper, the basic principle of Risk Based Inspection (RBI) and the RBI methodology are studied; the process of RBI in the gas compressor station is developed. The corrosion loop and logistics loop of the gas compressor station are determined through the study of corrosion mechanism and process of the gas compressor station. The probability of failure is calculated by using the modified coefficient, and the consequence of failure is calculated by the quantitative method. In particular, we addressed the application of a RBI methodology in a gas compressor station. The risk ranking is helpful to find the best preventive plan for inspection in the case study.
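The core RBI calculation described above combines a probability of failure with a consequence of failure to rank equipment for inspection. A minimal sketch, assuming simple 1-5 scoring scales; the equipment names and scores are invented for illustration, not taken from the study:

```python
# Illustrative RBI ranking: probability of failure (PoF) and consequence
# of failure (CoF) on 1-5 scales; risk = PoF x CoF.
equipment = {
    "inlet scrubber":   (2, 3),
    "compressor unit":  (3, 5),
    "discharge cooler": (4, 2),
}

risk = {name: pof * cof for name, (pof, cof) in equipment.items()}
ranking = sorted(risk, key=risk.get, reverse=True)

for name in ranking:
    print(f"{name}: risk {risk[name]}")
# compressor unit: risk 15
# discharge cooler: risk 8
# inlet scrubber: risk 6
```

In a real RBI program the PoF score would come from corrosion-mechanism analysis and modification coefficients, and the CoF from a quantitative consequence model, as the abstract outlines.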
In vitro three-dimensional cancer metastasis modeling: Past, present, and future
NASA Astrophysics Data System (ADS)
Wei-jing, Han; Wei, Yuan; Jiang-rui, Zhu; Qihui, Fan; Junle, Qu; Li-yu, Liu
2016-01-01
Metastasis is the leading cause of most cancer deaths, as opposed to dysregulated cell growth of the primary tumor. Molecular mechanisms of metastasis have been studied for decades and the findings have evolved our understanding of the progression of malignancy. However, most of the molecular mechanisms fail to address the causes of cancer and its evolutionary origin, demonstrating an inability to find a solution for complete cure of cancer. After being a neglected area of tumor biology for quite some time, recently several studies have focused on the impact of the tumor microenvironment on cancer growth. The importance of the tumor microenvironment is gradually gaining attention, particularly from the perspective of biophysics. In vitro three-dimensional (3-D) metastatic models are an indispensable platform for investigating the tumor microenvironment, as they mimic the in vivo tumor tissue. In 3-D metastatic in vitro models, static factors such as the mechanical properties, biochemical factors, as well as dynamic factors such as cell-cell, cell-ECM interactions, and fluid shear stress can be studied quantitatively. With increasing focus on basic cancer research and drug development, the in vitro 3-D models offer unique advantages in fundamental and clinical biomedical studies. Project supported by the National Basic Research Program of China (Grant No. 2013CB837200), the National Natural Science Foundation of China (Grant No. 11474345), and the Beijing Natural Science Foundation, China (Grant No. 7154221).
NASA Astrophysics Data System (ADS)
Webster, Nathan A. S.; Pownceby, Mark I.; Madsen, Ian C.; Studer, Andrew J.; Manuel, James R.; Kimpton, Justin A.
2014-12-01
Effects of basicity, B (CaO:SiO2 ratio) on the thermal range, concentration, and formation mechanisms of silico-ferrite of calcium and aluminum (SFCA) and SFCA-I iron ore sinter bonding phases have been investigated using an in situ synchrotron X-ray diffraction-based methodology with subsequent Rietveld refinement-based quantitative phase analysis. SFCA and SFCA-I phases are the key bonding materials in iron ore sinter, and improved understanding of the effects of processing parameters such as basicity on their formation and decomposition may assist in improving efficiency of industrial iron ore sintering operations. Increasing basicity significantly increased the thermal range of SFCA-I, from 1363 K to 1533 K (1090 °C to 1260 °C) for a mixture with B = 2.48, to ~1339 K to 1535 K (1066 °C to 1262 °C) for a mixture with B = 3.96, and to ~1323 K to 1593 K (1050 °C to 1320 °C) at B = 4.94. Increasing basicity also increased the amount of SFCA-I formed, from 18 wt pct for the mixture with B = 2.48 to 25 wt pct for the B = 4.94 mixture. Higher basicity of the starting sinter mixture will, therefore, increase the amount of SFCA-I, considered to be the more desirable of the two phases. Basicity did not appear to significantly influence the formation mechanism of SFCA-I. It did, however, affect the formation mechanism of SFCA, with the decomposition of SFCA-I coinciding with the formation of a significant amount of additional SFCA in the B = 2.48 and 3.96 mixtures but only a minor amount in the highest basicity mixture. In situ neutron diffraction enabled characterization of the behavior of magnetite after melting of SFCA, which produced a magnetite plus melt phase assemblage.
The effect of social interactions in the primary consumption life cycle of motion pictures
NASA Astrophysics Data System (ADS)
Hidalgo R, César A.; Castro, Alejandra; Rodriguez-Sickert, Carlos
2006-04-01
We develop a 'basic principles' model which accounts for the primary life cycle consumption of films as a social coordination problem in which information transmission is governed by word of mouth. We fit the analytical solution of such a model to aggregated consumption data from the film industry and derive a quantitative estimator of its quality based on the structure of the life cycle.
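The abstract does not reproduce the authors' model, but word-of-mouth-driven consumption of this kind is commonly sketched with a Bass-type diffusion curve, used here as an illustrative stand-in; all parameters are invented.

```python
import math

def cumulative_audience(t: float, p: float, q: float, m: float) -> float:
    """Bass diffusion curve: p = external influence (e.g. advertising),
    q = word-of-mouth strength, m = total potential audience."""
    e = math.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

p, q, m = 0.03, 0.4, 1.0  # illustrative parameters, not fitted values
weekly = [cumulative_audience(t, p, q, m) for t in range(11)]
attendance = [b - a for a, b in zip(weekly, weekly[1:])]
peak_week = attendance.index(max(attendance))
# strong word of mouth (q >> p) pushes the attendance peak away from week 0
```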
NASA Technical Reports Server (NTRS)
Kuehl, H.
1947-01-01
The basic principles of the control of TL engines are developed on the basis of a quantitative investigation of the behavior of these engines under various operating conditions, with particular consideration of the simplifications permissible in each case. Various possible means of control of jet engines are suggested and are illustrated by schematic designs.
The Quantitative Preparation of Future Geoscience Graduate Students
NASA Astrophysics Data System (ADS)
Manduca, C. A.; Hancock, G. S.
2006-12-01
Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways.
Calculus, calculus-based physics, chemistry, statistics, programming and linear algebra were viewed as important course preparation for a successful graduate experience. A set of recommendations for departments and for new community resources includes ideas for infusing quantitative reasoning throughout the undergraduate experience and mechanisms for learning from successful experiments in both geoscience and mathematics. A full list of participants, summaries of the meeting discussion and recommendations are available at http://serc.carleton.edu/quantskills/winter06/index.html. These documents, crafted by a small but diverse group can serve as a starting point for broader community discussion of the quantitative preparation of future geoscience graduate students.
Li, Mengshan; Zhang, Huaijing; Chen, Bingsheng; Wu, Yan; Guan, Lixin
2018-03-05
The pKa value of drugs is an important parameter in drug design and pharmacology. In this paper, an improved particle swarm optimization (PSO) algorithm was proposed based on the population entropy diversity. In the improved algorithm, when the population entropy was higher than the set maximum threshold, the convergence strategy was adopted; when the population entropy was lower than the set minimum threshold the divergence strategy was adopted; when the population entropy was between the maximum and minimum threshold, the self-adaptive adjustment strategy was maintained. The improved PSO algorithm was applied in the training of radial basis function artificial neural network (RBF ANN) model and the selection of molecular descriptors. A quantitative structure-activity relationship model based on RBF ANN trained by the improved PSO algorithm was proposed to predict the pKa values of 74 kinds of neutral and basic drugs and then validated by another database containing 20 molecules. The validation results showed that the model had a good prediction performance. The absolute average relative error, root mean square error, and squared correlation coefficient were 0.3105, 0.0411, and 0.9685, respectively. The model can be used as a reference for exploring other quantitative structure-activity relationships.
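A minimal sketch of the entropy-based strategy switch described above, assuming a one-dimensional search space on [0, 1]; the thresholds and binning are illustrative, since the paper's actual values and update rules are not given in the abstract.

```python
import math

def population_entropy(positions, bins=10, lo=0.0, hi=1.0):
    """Shannon entropy of particle positions over a binned 1-D search space."""
    counts = [0] * bins
    for x in positions:
        idx = min(int((x - lo) / (hi - lo) * bins), bins - 1)
        counts[idx] += 1
    n = len(positions)
    return -sum(c / n * math.log(c / n) for c in counts if c)

def choose_strategy(entropy, h_min=0.5, h_max=1.8):
    """Strategy switch from the abstract; thresholds here are illustrative."""
    if entropy > h_max:
        return "convergence"   # swarm too scattered: pull toward the best
    if entropy < h_min:
        return "divergence"    # swarm collapsed: re-inject diversity
    return "self-adaptive"

# A fully collapsed swarm has zero entropy, triggering divergence.
print(choose_strategy(population_entropy([0.5] * 20)))  # divergence
```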
HyphArea--automated analysis of spatiotemporal fungal patterns.
Baum, Tobias; Navarro-Quezada, Aura; Knogge, Wolfgang; Douchkov, Dimitar; Schweizer, Patrick; Seiffert, Udo
2011-01-01
In phytopathology, quantitative measurements are rarely used to assess crop plant disease symptoms. Instead, qualitative valuation by eye is often the method of choice. In order to close the gap between subjective human inspection and objective quantitative results, an automated analysis system capable of recognizing and characterizing the growth patterns of fungal hyphae in micrograph images was developed. This system should enable the efficient screening of different host-pathogen combinations (e.g., barley-Blumeria graminis, barley-Rhynchosporium secalis) using different microscopy technologies (e.g., bright field, fluorescence). An image segmentation algorithm was developed for gray-scale image data that achieved good results with several microscope imaging protocols. Furthermore, adaptability towards different host-pathogen systems was obtained by using a classification that is based on a genetic algorithm. The developed software system was named HyphArea, since the quantification of the area covered by a hyphal colony is the basic task and prerequisite for all further morphological and statistical analyses in this context. By means of a typical use case, the utilization and basic properties of HyphArea could be demonstrated. It was possible to detect statistically significant differences between the growth of an R. secalis wild-type strain and a virulence mutant. Copyright © 2010 Elsevier GmbH. All rights reserved.
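HyphArea's basic task, quantifying the area covered by a colony, reduces to per-pixel classification followed by an area fraction. A minimal threshold-based sketch; the fixed threshold and the dark-is-hyphae convention are assumptions for illustration, not HyphArea's actual segmentation algorithm.

```python
def colony_area_fraction(image, threshold=128):
    """Fraction of pixels classified as hyphae in a gray-scale micrograph.

    `image` is a list of rows of 0-255 intensities; pixels darker than
    `threshold` count as colony (illustrative convention only).
    """
    hyphal = total = 0
    for row in image:
        for px in row:
            total += 1
            if px < threshold:
                hyphal += 1
    return hyphal / total

micrograph = [
    [200, 200,  40,  40],
    [200,  30,  30, 200],
    [200, 200, 200, 200],
]
print(colony_area_fraction(micrograph))  # 4 of 12 pixels are dark -> 1/3
```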
Ko, Sangjin; Park, Wanju
2018-06-02
The study investigated whether neurofeedback training can normalize the excessive high-beta and low alpha waves indicative of hyperarousal, and subsequently improve autonomous regulation based on the self-determination theory in alcohol use disorders. A nonequivalent control group pretest-posttest design was used. Data were collected using self-report questionnaires from 36 Korean inpatients who met the Alcohol Use Disorder Identification Test in Korea criteria. Data were collected from quantitative electroencephalography to assess alpha (8-12 Hz) and high-beta (21-30 Hz) waves for hyperarousal. The questionnaires included Basic Psychological Need Satisfaction scales that assessed autonomy, competence, and relatedness, and the Alcohol Abstinence Self-Efficacy Scale and Treatment Self-Regulation Questionnaire. The experimental group underwent 10 sessions of neurofeedback training over four weeks. Data were analyzed using the chi-squared, Mann-Whitney U, and Wilcoxon signed-rank tests. In the experimental group, the alpha wave was increased in 15 of 19 sites and high-beta waves were decreased in 15 of 19 sites, but this difference was not significant. However, high-beta waves were increased in 15 of 19 sites in the control group, with seven sites (Fz, Cz, Pz, Fp2, F4, C4, and P4) showing significant increases. The experimental group showed a significant increase in basic psychological need satisfaction, alcohol abstinence self-efficacy, and self-regulation compared with the control group. Neurofeedback training is recommended for improving autonomous regulation in alcohol use disorder as a nursing intervention. However, for significantly attenuating hyperarousal through brain wave correction, it may be necessary to increase the number of neurofeedback sessions. Copyright © 2018. Published by Elsevier B.V.
Chu, Felicia W.; vanMarle, Kristy; Geary, David C.
2016-01-01
One hundred children (44 boys) participated in a 3-year longitudinal study of the development of basic quantitative competencies and the relation between these competencies and later mathematics and reading achievement. The children's preliteracy knowledge, intelligence, executive functions, and parental educational background were also assessed. The quantitative tasks assessed a broad range of symbolic and nonsymbolic knowledge and were administered four times across 2 years of preschool. Mathematics achievement was assessed at the end of each of 2 years of preschool, and mathematics and word reading achievement were assessed at the end of kindergarten. Our goals were to determine how domain-general abilities contribute to growth in children's quantitative knowledge and to determine how domain-general and domain-specific abilities contribute to children's preschool mathematics achievement and kindergarten mathematics and reading achievement. We first identified four core quantitative competencies (e.g., knowledge of the cardinal value of number words) that predict later mathematics achievement. The domain-general abilities were then used to predict growth in these competencies across 2 years of preschool, and the combination of domain-general abilities, preliteracy skills, and core quantitative competencies were used to predict mathematics achievement across preschool and mathematics and word reading achievement at the end of kindergarten. Both intelligence and executive functions predicted growth in the four quantitative competencies, especially across the first year of preschool. A combination of domain-general and domain-specific competencies predicted preschoolers' mathematics achievement, with a trend for domain-specific skills to be more strongly related to achievement at the beginning of preschool than at the end of preschool. 
Preschool preliteracy skills, sensitivity to the relative quantities of collections of objects, and cardinal knowledge predicted reading and mathematics achievement at the end of kindergarten. Preliteracy skills were more strongly related to word reading, whereas sensitivity to relative quantity was more strongly related to mathematics achievement. The overall results indicate that a combination of domain-general and domain-specific abilities contribute to development of children's early mathematics and reading achievement. PMID:27252675
Knee Instability and Basic and Advanced Function Decline in Knee Osteoarthritis.
Sharma, Leena; Chmiel, Joan S; Almagor, Orit; Moisio, Kirsten; Chang, Alison H; Belisle, Laura; Zhang, Yunhui; Hayes, Karen W
2015-08-01
Manifestations of instability in knee osteoarthritis (OA) include low overall knee confidence, low confidence that the knees will not buckle, buckling, and excessive motion during gait. Confidence and buckling may particularly influence activity choices, contributing to events leading to disability. Buckling is more likely to affect advanced than basic functional tasks. In this prospective longitudinal study, we tested the hypothesis that overall knee confidence, buckling confidence, buckling, and frontal plane motion during gait are associated with advanced 2-year function outcomes in persons with knee OA. Persons with knee OA were queried about overall knee confidence (higher score = worse confidence), buckling confidence, and knee buckling, and underwent quantitative gait analysis to quantify varus-valgus excursion and angular velocity. Physical function was assessed using the Late-Life Function and Disability Instrument Basic and Advanced Lower Extremity Domain scores. Logistic regression was used to evaluate the relationship between baseline instability measures and baseline-to-2-year function outcome, adjusting for potential confounders. The sample was comprised of 212 persons (mean age 64.6 years, 76.9% women). Buckling was significantly associated with poor advanced function outcome (adjusted odds ratio [OR] 2.08, 95% confidence interval [95% CI] 1.03-4.20) but not basic function outcome. Overall knee confidence was significantly associated with advanced outcome (adjusted OR 1.65, 95% CI 1.01-2.70), while associations between buckling confidence and both outcomes approached significance. Neither varus-valgus excursion nor angular velocity during gait was associated with either outcome. Knee buckling and low knee confidence were each associated with poor 2-year advanced function outcomes. Current treatment does not address these modifiable factors; interventions to address them may improve outcome in knee OA. © 2015, American College of Rheumatology.
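The adjusted odds ratios and confidence intervals reported above come from logistic regression. The sketch below shows the standard back-transformation from a coefficient and its standard error; the coefficient and standard error are chosen to roughly reproduce the reported buckling OR and are illustrative, not the study's fitted values.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio with 95% CI from a logistic-regression coefficient."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Illustrative inputs approximating the reported buckling result:
# OR 2.08, 95% CI 1.03-4.20.
or_, lo, hi = odds_ratio_ci(0.732, 0.358)
```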
Jackson, Charlotte; Mangtani, Punam; Hawker, Jeremy; Olowokure, Babatunde; Vynnycky, Emilia
2014-01-01
School closure is a potential intervention during an influenza pandemic and has been investigated in many modelling studies. To systematically review the effects of school closure on influenza outbreaks as predicted by simulation studies. We searched Medline and Embase for relevant modelling studies published by the end of October 2012, and handsearched key journals. We summarised the predicted effects of school closure on the peak and cumulative attack rates and the duration of the epidemic. We investigated how these predictions depended on the basic reproduction number, the timing and duration of closure and the assumed effects of school closures on contact patterns. School closures were usually predicted to be most effective if they caused large reductions in contact, if transmissibility was low (e.g. a basic reproduction number <2), and if attack rates were higher in children than in adults. The cumulative attack rate was expected to change less than the peak, but quantitative predictions varied (e.g. reductions in the peak were frequently 20-60% but some studies predicted >90% reductions or even increases under certain assumptions). This partly reflected differences in model assumptions, such as those regarding population contact patterns. Simulation studies suggest that school closure can be a useful control measure during an influenza pandemic, particularly for reducing peak demand on health services. However, it is difficult to accurately quantify the likely benefits. Further studies of the effects of reactive school closures on contact patterns are needed to improve the accuracy of model predictions.
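A toy SIR model illustrates the mechanism the reviewed simulations rely on: temporarily cutting contact during a closure window flattens the epidemic peak more than it reduces the cumulative attack rate. All parameters below are illustrative, not drawn from any reviewed study.

```python
def sir_run(r0, gamma=0.25, days=300, i0=1e-4,
            close_start=None, close_end=None, contact_cut=0.0):
    """Discrete-time SIR; a school closure scales transmission by
    (1 - contact_cut) between close_start and close_end (in days)."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for day in range(days):
        beta = r0 * gamma
        if close_start is not None and close_start <= day < close_end:
            beta *= 1.0 - contact_cut
        new_inf = beta * s * i
        s, i, r = s - new_inf, i + new_inf - gamma * i, r + gamma * i
        peak = max(peak, i)
    return r, peak  # cumulative attack rate, peak prevalence

base_attack, base_peak = sir_run(1.8)
closed_attack, closed_peak = sir_run(1.8, close_start=30, close_end=90,
                                     contact_cut=0.4)
# The 40% contact cut lowers the peak; the final attack rate falls less,
# matching the qualitative pattern the review describes.
```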
Segmentation of human brain using structural MRI.
Helms, Gunther
2016-04-01
Segmentation of human brain using structural MRI is a key step of processing in imaging neuroscience. The methods have undergone a rapid development in the past two decades and are now widely available. This non-technical review aims at providing an overview and basic understanding of the most common software. Starting with the basis of structural MRI contrast in brain and imaging protocols, the concepts of voxel-based and surface-based segmentation are discussed. Special emphasis is given to the typical contrast features and morphological constraints of cortical and sub-cortical grey matter. In addition to the use for voxel-based morphometry, basic applications in quantitative MRI, cortical thickness estimations, and atrophy measurements as well as assignment of cortical regions and deep brain nuclei are briefly discussed. Finally, some fields for clinical applications are given.
Meeting Report: Tissue-based Image Analysis.
Saravanan, Chandra; Schumacher, Vanessa; Brown, Danielle; Dunstan, Robert; Galarneau, Jean-Rene; Odin, Marielle; Mishra, Sasmita
2017-10-01
Quantitative image analysis (IA) is a rapidly evolving area of digital pathology. Although not a new concept, the quantification of histological features on photomicrographs used to be cumbersome, resource-intensive, and limited to specialists and specialized laboratories. Recent technological advances like highly efficient automated whole slide digitizer (scanner) systems, innovative IA platforms, and the emergence of pathologist-friendly image annotation and analysis systems mean that quantification of features on histological digital images will become increasingly prominent in pathologists' daily professional lives. The added value of quantitative IA in pathology includes confirmation of equivocal findings noted by a pathologist, increasing the sensitivity of feature detection, quantification of signal intensity, and improving efficiency. There is no denying that quantitative IA is part of the future of pathology; however, there are also several potential pitfalls when trying to estimate volumetric features from limited 2-dimensional sections. This continuing education session on quantitative IA offered a broad overview of the field; a hands-on toxicologic pathologist experience with IA principles, tools, and workflows; a discussion on how to apply basic stereology principles in order to minimize bias in IA; and finally, a reflection on the future of IA in the toxicologic pathology field.
Quantitative modelling in cognitive ergonomics: predicting signals passed at danger.
Moray, Neville; Groeger, John; Stanton, Neville
2017-02-01
This paper shows how to combine field observations, experimental data and mathematical modelling to produce quantitative explanations and predictions of complex events in human-machine interaction. As an example, we consider a major railway accident. In 1999, a commuter train passed a red signal near Ladbroke Grove, UK, into the path of an express. We use the Public Inquiry Report, 'black box' data, and accident and engineering reports to construct a case history of the accident. We show how to combine field data with mathematical modelling to estimate the probability that the driver observed and identified the state of the signals, and checked their status. Our methodology can explain the SPAD ('Signal Passed At Danger'), generate recommendations about signal design and placement and provide quantitative guidance for the design of safer railway systems' speed limits and the location of signals. Practitioner Summary: Detailed ergonomic analysis of railway signals and rail infrastructure reveals problems of signal identification at this location. A record of driver eye movements measures attention, from which a quantitative model for signal placement and permitted speeds can be derived. The paper is an example of how to combine field data, basic research and mathematical modelling to solve ergonomic design problems.
Fractography: determining the sites of fracture initiation.
Mecholsky, J J
1995-03-01
Fractography is the analysis of fracture surfaces. Here, it refers to quantitative fracture surface analysis (FSA) in the context of applying the principles of fracture mechanics to the topography observed on the fracture surface of brittle materials. The application of FSA is based on the principle that encoded on the fracture surface of brittle materials is the entire history of the fracture process. It is our task to develop the skills and knowledge to decode this information. There are several motivating factors for applying our knowledge of FSA. The first and foremost is that there is specific, quantitative information to be obtained from the fracture surface. This information includes the identification of the size and location of the fracture initiating crack or defect, the stress state at failure, the existence, or not, of local or global residual stress, the existence, or not, of stress corrosion and a knowledge of local processing anomalies which affect the fracture process. The second motivating factor is that the information is free. Once a material is tested to failure, the encoded information becomes available. If we decide to observe the features produced during fracture then we are rewarded with much information. If we decide to ignore the fracture surface, then we are left to guess and/or reason as to the cause of the failure without the benefit of all of the possible information available. This paper addresses the application of quantitative fracture surface analysis to basic research, material and product development, and "trouble-shooting" of in-service failures. First, the basic principles involved will be presented. Next, the methodology necessary to apply the principles will be presented. Finally, a summary of the presentation will be made showing the applicability to design and reliability.
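A minimal sketch of the kind of quantitative information FSA yields: given the size of the fracture-initiating flaw measured on the fracture surface, the Griffith-Irwin relation estimates the stress at failure. The toughness, flaw size, and geometric factor below are hypothetical illustration values, not data from the paper.

```python
import math

def failure_stress(k_ic, flaw_radius, y=1.26):
    """Estimate the stress at failure (MPa) from the size of the
    fracture-initiating flaw, via sigma_f = K_Ic / (Y * sqrt(c)).

    k_ic        fracture toughness in MPa*m^0.5
    flaw_radius equivalent flaw radius c in metres
    y           geometric factor (depends on flaw shape and location;
                the value here is an assumption for illustration)
    """
    return k_ic / (y * math.sqrt(flaw_radius))

# Hypothetical brittle glass part: K_Ic ~ 0.75 MPa*m^0.5, 50 micron flaw
sigma_f = failure_stress(0.75, 50e-6)
```

The inverse calculation is the fractographer's routine: measure the flaw on the surface, infer the stress the part actually saw at failure, and compare it with the design stress.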
Klin, Ami; Shultz, Sarah; Jones, Warren
2014-01-01
Efforts to determine and understand the causes of autism are currently hampered by a large disconnect between recent molecular genetics findings that are associated with the condition and the core behavioral symptoms that define the condition. In this perspective piece, we propose a systems biology framework to bridge that gap between genes and symptoms. The framework focuses on basic mechanisms of socialization that are highly-conserved in evolution and are early-emerging in development. By conceiving of these basic mechanisms of socialization as quantitative endophenotypes, we hope to connect genes and behavior in autism through integrative studies of neurodevelopmental, behavioral, and epigenetic changes. These changes both lead to and are led by the accomplishment of specific social adaptive tasks in a typical infant's life. However, based on recent research that indicates that infants later diagnosed with autism fail to accomplish at least some of these tasks, we suggest that a narrow developmental period, spanning critical transitions from reflexive, subcortically-controlled visual behavior to interactional, cortically-controlled and social visual behavior be prioritized for future study. Mapping epigenetic, neural, and behavioral changes that both drive and are driven by these early transitions may shed a bright light on the pathogenesis of autism. PMID:25445180
Emotional reactivity: Beware its involvement in traffic accidents.
M'bailara, Katia; Atzeni, Thierry; Contrand, Benjamin; Derguy, Cyrielle; Bouvard, Manuel-Pierre; Lagarde, Emmanuel; Galéra, Cédric
2018-04-01
Reducing risk attributable to traffic accidents is a public health challenge. Research into risk factors in the area is now moving towards identification of the psychological factors involved, particularly emotional states. The aim of this study was to evaluate the link between emotional reactivity and responsibility in road traffic accidents. We hypothesized that the more one's emotional reactivity is disturbed, the greater the likelihood of being responsible for a traffic accident. This case-control study was based on a sample of 955 drivers injured in a motor vehicle crash. Responsibility levels were determined with a standardized method adapted from the quantitative Robertson and Drummer crash responsibility instrument. Emotional reactivity was assessed with the MATHYS. Hierarchical cluster analysis discriminated four distinctive driver emotional reactivity profiles: basic emotional reactivity (54%), mild emotional hyper-reactivity (29%), emotional hyper-reactivity (11%) and emotional hypo-reactivity (6%). Drivers who demonstrated emotional hypo-reactivity had a 2.3-fold greater risk of being responsible for a traffic accident than those with basic emotional reactivity. Drivers' responsibility in traffic accidents depends on their emotional status, which can alter their abilities, modify their behavior and thus increase their propensity to exhibit risk behavior and cause traffic accidents.
Nkosi, Z Z; Asah, F; Pillay, P
2011-10-01
Nurses are exposed to changing demands in technology as they execute their patient-related duties in the workplace. Integration of Information Technology (IT) in healthcare systems improves the quality of care provided. Nursing students with prior exposure to computers tend to have a positive attitude towards IT. A descriptive study design using a quantitative approach and a structured questionnaire was used to measure the nurses' attitudes towards computer usage. A census of 45 post-basic first-year nursing management students participated in this study. The students demonstrated a positive attitude towards the use of a computer, but access to and use of computers and IT was limited, and nurses in clinics had no access to IT. A lack of computer skills was identified as a factor that hinders access to IT. Nursing students agreed that computer literacy should be included in the curriculum to allow them to become independent computer users. The Department of Health should provide IT in all health-care facilities and also train all health-care workers to use IT. Given the positive attitudes expressed by the students, nurse managers need to create a conducive environment to ensure that this positive attitude continues.
Musharraf, Syed Ghulam; Ul Arfeen, Qamar; Ul Haq, Faraz; Khatoon, Aliya; Azher Ali, Rahat
2017-10-01
Methyltestosterone is a synthetic testosterone derivative commonly used for the treatment of testosterone deficiency in males and one of the anabolic steroids whose use is banned by the World Anti-Doping Agency (WADA). This study presents a simple, cost-effective and rapid stability-indicating assay for densitometric quantification of methyltestosterone in pharmaceutical formulation. The developed method employed pre-coated TLC plates with mobile phase hexane:acetone (6.5:3.5 v/v). The limit of detection and limit of quantitation were found to be 2.06 and 6.24 ng/spot, respectively. A stress degradation study of methyltestosterone was conducted by applying various stress conditions such as hydrolysis under acidic, basic and neutral conditions, heating in anhydrous conditions and exposure to light. Methyltestosterone was found to be susceptible to photodegradation and to acidic and basic hydrolysis. Degraded products were well resolved with significantly different Rf values. The acid degradation product was identified as 17,17-dimethyl-18-norandrosta-4,13(14)-dien-3-one through spectroscopic methods. The developed method is repeatable, selective and accurate for quantification of methyltestosterone and can be employed for routine analysis. The reactivity of methyltestosterone under the applied stress conditions was also explained by quantum chemical calculations.
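LOD and LOQ figures like those reported are conventionally derived from a linear calibration curve (the ICH approach: 3.3·σ/S and 10·σ/S, where σ is the residual standard deviation and S the slope). A sketch with made-up calibration data, not the paper's own measurements:

```python
def lod_loq(conc, resp):
    """LOD = 3.3*sigma/slope and LOQ = 10*sigma/slope from an ordinary
    least-squares calibration line; sigma is the residual standard
    deviation about the fit."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(resp) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, resp))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
    sigma = (sum(e * e for e in resid) / (n - 2)) ** 0.5
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Made-up densitometric calibration: amount (ng/spot) vs peak area
conc = [10.0, 20.0, 40.0, 80.0, 160.0]
resp = [21.0, 39.0, 82.0, 158.0, 321.0]
lod, loq = lod_loq(conc, resp)
```

By construction LOQ/LOD = 10/3.3 ≈ 3.03, which is the ratio visible in the reported 2.06 and 6.24 ng/spot values.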
Steinman, Gary
2013-07-01
The amounts of at least three biochemical factors are more often abnormal in autistic people than in neurologically normal ones. They include insulin-like growth factor, anti-myelin basic protein, and serotonin. This may explain why processes initiated in utero which hinder normal neurogenesis, especially myelination, continue after delivery. Quantitation of these parameters may make it possible to calculate an autism index, anticipating at birth which children will ultimately develop overt autism.
Quantitative Uncertainty Assessment and Numerical Simulation of Micro-Fluid Systems
2005-04-01
flow at Sandia, that was supported by the Laboratory Directed Research and Development program, and by the Dept. of Energy, Office of Basic Energy ...finite energy. θ is used to denote the random nature of the corresponding quantity. Being symmetrical and positive definite, REE has all its...Laboratory Directed Research and Development Program at Sandia National Laboratories, funded by the U.S. Department of Energy. Support was also provided
Beyond detection: nuclear physics with a webcam in an educational setting
NASA Astrophysics Data System (ADS)
Pallone, A.; Barnes, P.
2016-09-01
Basic understanding of nuclear science enhances our daily-life experience in many areas, such as the environment, medicine, electric power generation, and even politics. Yet typical school curricula do not provide for experiments that explore the topic. We present a means by which educators can use the ubiquitous webcam and inexpensive sources of radiation to lead their students in a quantitative exploration of radioactivity, radiation, and the applications of nuclear physics.
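The quantitative part of such a classroom exploration typically comes down to Poisson counting statistics: a background-subtracted count rate and its uncertainty. A small sketch with invented classroom-style numbers (the webcam itself is just the event counter):

```python
import math

def net_count_rate(gross, t_gross, background, t_bkg):
    """Background-subtracted count rate and its 1-sigma uncertainty,
    assuming Poisson statistics for both counting intervals.

    gross/background are raw counts; t_gross/t_bkg are live times (s).
    """
    rate = gross / t_gross - background / t_bkg
    sigma = math.sqrt(gross / t_gross ** 2 + background / t_bkg ** 2)
    return rate, sigma

# Hypothetical session: 1200 webcam-detected events in 600 s with the
# source present, 300 events in 600 s of background
rate, sigma = net_count_rate(1200, 600.0, 300, 600.0)
```

Repeating the measurement at several source-detector distances or absorber thicknesses turns this into the standard quantitative exercises (inverse-square law, attenuation).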
Detecting and Targeting Oncogenic Myc in Breast Cancer
2006-06-01
expression in lung tumor samples. Real-time quantitative PCR amplification was conducted using the SYBR Green assay in the ABI PRISM 7900-HT (Applied...Methylation-sensitive sequence-specific DNA binding by the c-Myc basic region. Science 1991;251:186-9. 37. Perini G, Diolaiti D, Porro A, et al. In...using nuclear magnetic resonance and circular dichroism. We show that several Myc NTD polypeptides are largely disordered in solution, which is
NASA Technical Reports Server (NTRS)
Claus, R. O.; Bennett, K. D.; Jackson, B. S.
1986-01-01
The application of fiber-optical time domain reflectometry (OTDR) to nondestructive quantitative measurements of distributed internal strain in graphite-epoxy composites, using optical fiber waveguides imbedded between plies, is discussed. The basic OTDR measurement system is described, together with the methods used to imbed optical fibers within composites. Measurement results, system limitations, and the effect of the imbedded fiber on the integrity of the host composite material are considered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
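The core operation of such a generalized least-squares evaluation, combining correlated estimates of one quantity into a single value with a defensible uncertainty, can be sketched as follows. This is a generic GLS illustration with invented numbers, not FERRET's actual input format or algorithm.

```python
import numpy as np

def combine_estimates(values, cov):
    """Least-squares combination of correlated estimates of a single
    quantity: weights w = C^-1 * 1, estimate = (w . v) / sum(w),
    variance of the estimate = 1 / sum(w)."""
    v = np.asarray(values, dtype=float)
    c = np.asarray(cov, dtype=float)
    w = np.linalg.solve(c, np.ones_like(v))   # C^-1 applied to the ones vector
    total = w.sum()
    return float(w @ v / total), float(total ** -0.5)

# Two hypothetical cross-section results with a correlated systematic
# component encoded in the off-diagonal covariance term
est, unc = combine_estimates([10.2, 9.8],
                             [[0.25, 0.05],
                              [0.05, 0.16]])
```

The combined uncertainty is smaller than either input uncertainty, and the off-diagonal term is what makes this a proper treatment of correlations rather than a naive weighted average.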
Absorbable energy monitoring scheme: new design protocol to test vehicle structural crashworthiness.
Ofochebe, Sunday M; Enibe, Samuel O; Ozoegwu, Chigbogu G
2016-05-01
In vehicle crashworthiness design optimization, detailed system evaluations capable of producing reliable results are typically achieved through high-order numerical computational (HNC) models such as the dynamic finite element model, the mesh-free model, etc. However, the application of these models, especially during optimization studies, is challenged by their inherent high demand on computational resources, the conditional stability of the solution process, and the lack of knowledge of a viable parameter range for detailed optimization studies. The absorbable energy monitoring scheme (AEMS) presented in this paper suggests a new design protocol that attempts to overcome such problems in the evaluation of vehicle structures for crashworthiness. The implementation of the AEMS involves studying the crash performance of vehicle components at various absorbable energy ratios based on a 2DOF lumped-mass-spring (LMS) vehicle impact model. This allows for prompt prediction of useful parameter values in a given design problem. The application of the classical one-dimensional LMS model in vehicle crash analysis is further improved in the present work by developing a critical load matching criterion, which allows for quantitative interpretation of the results of the abstract model in a typical vehicle crash design. The adequacy of the proposed AEMS for preliminary vehicle crashworthiness design is demonstrated in this paper; however, its extension to full-scale design-optimization problems involving full vehicle models with greater structural detail requires further theoretical development.
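The 2DOF LMS idealization behind such a scheme can be sketched as two masses, a front crush zone and a passenger compartment, coupled by linear springs and striking a rigid barrier. The masses, stiffnesses, and impact speed below are illustrative assumptions; the paper's absorbable-energy ratios and critical load matching criterion are not reproduced here.

```python
# 2DOF lumped-mass-spring impact sketch: m1 (crush zone) contacts a
# rigid barrier through spring k1; m2 (compartment) trails m1 through
# spring k2. Semi-implicit (symplectic) Euler; values illustrative.
def simulate_impact(m1=300.0, m2=1000.0, k1=4.0e5, k2=8.0e5,
                    v0=15.0, dt=1e-5, t_end=0.3):
    """Return the peak elastic energy (J) stored in each spring."""
    x1 = x2 = 0.0            # displacements toward the barrier (m)
    v1 = v2 = v0
    e1 = e2 = 0.0
    t = 0.0
    while t < t_end:
        f_barrier = -k1 * x1 if x1 > 0.0 else 0.0   # contact force only
        f_couple = k2 * (x2 - x1)                   # compartment pushes m1
        a1 = (f_barrier + f_couple) / m1
        a2 = -f_couple / m2
        v1 += a1 * dt
        v2 += a2 * dt
        x1 += v1 * dt
        x2 += v2 * dt
        e1 = max(e1, 0.5 * k1 * max(x1, 0.0) ** 2)
        e2 = max(e2, 0.5 * k2 * (x2 - x1) ** 2)
        t += dt
    return e1, e2

e_front, e_coupling = simulate_impact()
kinetic = 0.5 * (300.0 + 1000.0) * 15.0 ** 2   # initial kinetic energy (J)
```

Varying the stiffness ratio k1/k2 and comparing the peak spring energies against the initial kinetic energy is the kind of cheap parameter sweep that a scheme like AEMS uses to bracket viable designs before any HNC run.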
Spectroscopic and DFT Study of RhIII Chloro Complex Transformation in Alkaline Solutions.
Vasilchenko, Danila B; Berdyugin, Semen N; Korenev, Sergey V; O'Kennedy, Sean; Gerber, Wilhelmus J
2017-09-05
The hydrolysis of [RhCl6]3- in NaOH-water solutions was studied by spectrophotometric methods. The reaction proceeds via successive substitution of chloride with hydroxide to quantitatively form [Rh(OH)6]3-. Ligand substitution kinetics was studied in an aqueous 0.434-1.085 M NaOH matrix in the temperature range 5.5-15.3 °C. Transformation of [RhCl6]3- into [RhCl5(OH)]3- was found to be the rate-determining step with activation parameters of ΔH‡ = 105 ± 4 kJ mol-1 and ΔS‡ = 59 ± 10 J K-1 mol-1. The coordinated hydroxo ligand(s) induces rapid ligand substitution to form [Rh(OH)6]3-. By simulating ligand substitution as a dissociative mechanism, using density functional theory (DFT), we can now explain the relatively fast and slow kinetics of chloride substitution in basic and acidic matrices, respectively. Moreover, the DFT calculated activation energies corroborated experimental data that the kinetic stereochemical sequence of [RhCl6]3- hydrolysis in an acidic solution proceeds as [RhCl6]3- → [RhCl5(H2O)]2- → cis-[RhCl4(H2O)2]-. However, DFT calculations predict that in a basic solution the trans route of substitution, [RhCl6]3- → [RhCl5(OH)]3- → trans-[RhCl4(OH)2]3-, is kinetically favored.
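Activation parameters like these translate into an observable rate constant through the Eyring equation. The sketch below plugs in the reported ΔH‡ = 105 kJ mol-1 and ΔS‡ = 59 J K-1 mol-1 at a temperature inside the studied range; the resulting rate constant is our own arithmetic, not a value quoted by the paper.

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(K*mol)

def eyring_rate(dH, dS, T):
    """Eyring equation: k = (kB*T/h) * exp(dS/R) * exp(-dH/(R*T)).

    dH in J/mol, dS in J/(K*mol), T in K; returns k in s^-1 for a
    (pseudo-)first-order step.
    """
    return (KB * T / H) * math.exp(dS / R) * math.exp(-dH / (R * T))

# Reported activation parameters for [RhCl6]3- -> [RhCl5(OH)]3-
k_10C = eyring_rate(105e3, 59.0, 283.15)
k_15C = eyring_rate(105e3, 59.0, 288.15)
```

The rate constant comes out on the order of 1e-4 s^-1 near 10 °C, i.e. a slow first substitution on the timescale of minutes to hours, consistent with it being rate-determining.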
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laskin, Julia; Yang, Zhibo
2011-12-01
We present a first study of the energetics and dynamics of dissociation of deprotonated peptides using time- and collision-energy resolved surface-induced dissociation (SID) experiments. SID of four model peptides: RVYIHPF, HVYIHPF, DRVYIHPF, and DHVYIHPF was studied using a specially designed Fourier transform ion cyclotron resonance mass spectrometer (FT-ICR MS) configured for studying ion-surface collisions. Energy and entropy effects for the overall decomposition of the precursor ion were deduced by modeling the time- and collision energy-resolved survival curves using an RRKM based approach developed in our laboratory. The results were compared to the energetics and dynamics of dissociation of the corresponding protonated species. We demonstrate that acidic peptides are less stable in the negative mode because of the low threshold associated with the kinetically hindered loss of H2O from [M-H]- ions. Comparison between the two basic peptides indicates that the lower stability of the [M-H]- ion of RVYIHPF as compared to HVYIHPF towards fragmentation is attributed to the differences in fragmentation mechanisms. Specifically, the threshold energy associated with losses of NH3 and NHCNH from RVYIHPF is lower than the barrier for backbone fragmentation that dominates gas-phase decomposition of HVYIHPF. The results provide a first quantitative comparison between the energetics and dynamics of dissociation of [M+H]+ and [M-H]- ions of acidic and basic peptides.
Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia
2013-05-30
In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.
Das, Rahul K; Crick, Scott L; Pappu, Rohit V
2012-02-17
Basic region leucine zippers (bZIPs) are modular transcription factors that play key roles in eukaryotic gene regulation. The basic regions of bZIPs (bZIP-bRs) are necessary and sufficient for DNA binding and specificity. Bioinformatic predictions and spectroscopic studies suggest that unbound monomeric bZIP-bRs are uniformly disordered as isolated domains. Here, we test this assumption through a comparative characterization of conformational ensembles for 15 different bZIP-bRs using a combination of atomistic simulations and circular dichroism measurements. We find that bZIP-bRs have quantifiable preferences for α-helical conformations in their unbound monomeric forms. This helicity varies from one bZIP-bR to another despite a significant sequence similarity of the DNA binding motifs (DBMs). Our analysis reveals that intramolecular interactions between DBMs and eight-residue segments directly N-terminal to DBMs are the primary modulators of bZIP-bR helicities. We test the accuracy of this inference by designing chimeras of bZIP-bRs to have either increased or decreased overall helicities. Our results yield quantitative insights regarding the relationship between sequence and the degree of intrinsic disorder within bZIP-bRs, and might have general implications for other intrinsically disordered proteins. Understanding how natural sequence variations lead to modulation of disorder is likely to be important for understanding the evolution of specificity in molecular recognition through intrinsically disordered regions (IDRs). Copyright © 2011 Elsevier Ltd. All rights reserved.
Heterogeneous population dynamics and scaling laws near epidemic outbreaks.
Widder, Andreas; Kuehn, Christian
2016-10-01
In this paper, we focus on the influence of heterogeneity and stochasticity of the population on the dynamical structure of a basic susceptible-infected-susceptible (SIS) model. First we prove that, upon a suitable mathematical reformulation of the basic reproduction number, the homogeneous system and the heterogeneous system exhibit a completely analogous global behaviour. Then we consider noise terms to incorporate the fluctuation effects and the random import of the disease into the population, and analyse the influence of heterogeneity on warning signs for critical transitions (or tipping points). This theory shows that one may be able to anticipate whether a bifurcation point is close before it happens. We use numerical simulations of a stochastic fast-slow heterogeneous population SIS model and show that various aspects of heterogeneity have crucial influences on the scaling laws that are used as early-warning signs for the homogeneous system. Thus, although the basic structural qualitative dynamical properties are the same for both systems, the quantitative features for epidemic prediction are expected to change, and care has to be taken to interpret potential warning signs for disease outbreaks correctly.
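The homogeneous backbone of such analyses is the scalar SIS equation di/dt = beta*i*(1-i) - gamma*i, whose long-run prevalence is 0 for R0 = beta/gamma ≤ 1 and 1 - 1/R0 above threshold. A minimal deterministic sketch (the paper's heterogeneous and stochastic structure is not reproduced here):

```python
def simulate_sis(r0, gamma=1.0, i0=0.01, dt=0.01, t_end=200.0):
    """Integrate di/dt = beta*i*(1-i) - gamma*i with beta = r0*gamma
    by forward Euler; returns the prevalence at t_end."""
    beta = r0 * gamma
    i, t = i0, 0.0
    while t < t_end:
        i += (beta * i * (1.0 - i) - gamma * i) * dt
        t += dt
    return i

i_endemic = simulate_sis(2.0)   # approaches the endemic level 1 - 1/R0
i_extinct = simulate_sis(0.8)   # below threshold: infection dies out
```

The transcritical bifurcation at R0 = 1 is exactly the tipping point near which the paper's stochastic early-warning signs (such as growing variance) are studied.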
Hydrolysis of Indole-3-Acetic Acid Esters Exposed to Mild Alkaline Conditions 1
Baldi, Bruce G.; Maher, Barbara R.; Cohen, Jerry D.
1989-01-01
Ester conjugates of indole-3-acetic acid are hydrolyzed easily in basic solutions; however, quantitative data have not been available on the relationship between pH and rate of hydrolysis of the known ester conjugates. The use of basic conditions during extraction or purification of IAA by several laboratories suggested that a more systematic analysis of this process was needed. In this report we present data indicating: (a) that measurable hydrolysis of IAA-glucose (from standard solutions) and IAA-esters (from maize kernel extracts) occurs with only a few hours of treatment at pH 9 or above; (b) that the lability of some ester conjugates is even greater than that of IAA-glucose; and (c) that ester hydrolysis of standard compounds, IAA-glucose and IAA-p-nitrophenol, occurs in the 'three-phase extraction system' proposed by Liu and Tillberg ([1983] Physiol Plant 57: 441-447). These data indicate that the potential for problems with inadvertent hydrolysis of ester conjugates of IAA exists even at moderate pH values and in the multiphase system where exposure to basic conditions was thought to be limited. PMID:16667049
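The pH dependence the authors warn about follows from base-catalyzed, pseudo-first-order kinetics: the observed rate constant scales with [OH-], so each pH unit raises the hydrolysis rate tenfold. The second-order rate constant below is a hypothetical value chosen only to show the shape of the effect, not a measured constant for any IAA ester.

```python
import math

def fraction_hydrolyzed(k_oh, pH, hours):
    """Base-catalyzed pseudo-first-order hydrolysis:
    k_obs = k_OH * [OH-], with [OH-] = 10**(pH - 14) M at 25 C,
    and fraction hydrolyzed = 1 - exp(-k_obs * t)."""
    k_obs = k_oh * 10.0 ** (pH - 14.0)   # per hour
    return 1.0 - math.exp(-k_obs * hours)

K_OH = 5.0e4   # hypothetical second-order constant, M^-1 h^-1
f_ph7 = fraction_hydrolyzed(K_OH, 7.0, 3.0)   # small loss in 3 h
f_ph9 = fraction_hydrolyzed(K_OH, 9.0, 3.0)   # substantial loss in 3 h
```

With these illustrative numbers, a 3 h extraction at pH 9 loses most of the conjugate while pH 7 loses only a percent or two, which is the practical point of the report: even "moderate" basicity matters over typical work-up times.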
The Effect of Moisture on the Hydrolysis of Basic Salts.
Shi, Xiaoyang; Xiao, Hang; Chen, Xi; Lackner, Klaus S
2016-12-19
A great deal of information exists concerning the hydration of ions in bulk water. Much less noticeable, but equally ubiquitous, is the hydration of ions holding on to several water molecules in nanoscopic pores or in natural air at low relative humidity. Such hydration of ions at a high ratio of ions to water molecules (up to 1:1) is essential in determining the energetics of many physical and chemical systems. Herein, we present a quantitative analysis of the energetics of ion hydration in nanopores based on molecular modeling of a series of basic salts associated with different numbers of water molecules. The results show that the degree of hydrolysis of basic salts in the presence of a few water molecules is significantly different from that in bulk water. The reduced availability of water molecules promotes the hydrolysis of divalent and trivalent basic ions (S2-, CO32-, SO32-, HPO42-, SO42-, PO43-), which produces lower valent ions (HS-, HCO3-, HSO3-, H2PO4-, HSO4-, HPO42-) and OH- ions. However, reducing the availability of water inhibits the hydrolysis of monovalent basic ions (CN-, HS-). This finding sheds some light on a vast number of chemical processes in the atmosphere and on solid porous surfaces. The discovery has wide potential applications, including designing efficient absorbents for acidic gases.
Computer Aided Enzyme Design and Catalytic Concepts
Frushicheva, Maria P.; Mills, Matthew J. L.; Schopf, Patrick; Singh, Manoj K.; Warshel, Arieh
2014-01-01
Gaining a deeper understanding of enzyme catalysis is of great practical and fundamental importance. Over the years it has become clear that despite advances made in experimental mutational studies, a quantitative understanding of enzyme catalysis will not be possible without the use of computer modeling approaches. While we believe that electrostatic preorganization is by far the most important catalytic factor, convincing the wider scientific community of this may require the demonstration of effective rational enzyme design. Here we make the point that the main current advances in enzyme design are basically advances in directed evolution and that computer aided enzyme design must involve approaches that can reproduce catalysis in well-defined test cases. Such an approach is provided by the empirical valence bond method. PMID:24814389
What can posturography tell us about vestibular function?
NASA Technical Reports Server (NTRS)
Black, F. O.
2001-01-01
Patients with balance disorders want answers to the following basic questions: (1) What is causing my problem? and (2) What can be done about my problem? Information to fully answer these questions must include status of both sensory and motor components of the balance control systems. Computerized dynamic posturography (CDP) provides quantitative assessment of both sensory and motor components of postural control along with how the sensory inputs to the brain interact. This paper reviews the scientific basis and clinical applications of CDP. Specifically, studies describing the integration of vestibular inputs with other sensory systems for postural control are briefly summarized. Clinical applications, including assessment, rehabilitation, and management are presented. Effects of aging on postural control along with prevention and management strategies are discussed.
Johnson, Heath E; Haugh, Jason M
2013-12-02
This unit focuses on the use of total internal reflection fluorescence (TIRF) microscopy and image analysis methods to study the dynamics of signal transduction mediated by class I phosphoinositide 3-kinases (PI3Ks) in mammalian cells. The first four protocols cover live-cell imaging experiments, image acquisition parameters, and basic image processing and segmentation. These methods are generally applicable to live-cell TIRF experiments. The remaining protocols outline more advanced image analysis methods, which were developed in our laboratory for the purpose of characterizing the spatiotemporal dynamics of PI3K signaling. These methods may be extended to analyze other cellular processes monitored using fluorescent biosensors. Copyright © 2013 John Wiley & Sons, Inc.
Ecott, Cheryl L; Critchfield, Thomas S
2004-01-01
Basic researchers, but not most applied researchers, have assumed that the behavior-decelerating effects of noncontingent reinforcement result at least partly from adventitious reinforcement of competing behaviors. The literature contains only sketchy evidence of these effects because few noncontingent reinforcement studies measure alternative behaviors. A laboratory model is presented in which concurrent schedules of contingent reinforcement were used to establish a "target" and an "alternative" behavior. Imposing noncontingent reinforcement decreased target behavior rates and increased alternative behavior rates, outcomes that were well described by the standard quantitative account of alternative reinforcement, the generalized matching law. These results suggest that adventitious reinforcement of alternative behaviors can occur during noncontingent reinforcement interventions, although the range of conditions under which this occurs remains to be determined in future studies. As an adjunct to applied studies, laboratory models permit easy measurement of alternative behaviors and parametric manipulations needed to answer many research questions. PMID:15529885
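The "standard quantitative account" invoked here, the generalized matching law, predicts how response allocation shifts when alternative reinforcement is added: B1/B2 = b(R1/R2)^a. The sensitivity and bias values below are illustrative, and treating noncontingent reinforcement as reinforcement credited to the alternative behavior is a simplification of the study's analysis.

```python
def target_share(r_target, r_alt, sensitivity=0.9, bias=1.0):
    """Generalized matching law, B1/B2 = bias * (R1/R2)**sensitivity,
    expressed as the target behavior's share of total responding."""
    ratio = bias * (r_target / r_alt) ** sensitivity
    return ratio / (1.0 + ratio)

# Baseline: target earns 60 reinforcers/hr vs 20/hr for the alternative.
# "Noncontingent" delivery modeled as raising the alternative to 60/hr.
share_before = target_share(60.0, 20.0)
share_after = target_share(60.0, 60.0)
```

The drop in the target's share when the alternative's reinforcement rate rises is the matching-law reading of the study's result: noncontingent reinforcement decelerates the target behavior partly by strengthening competing behavior.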
Rikabi, Sarah; French, Anna; Pinedo-Villanueva, Rafael; Morrey, Mark E; Wartolowska, Karolina; Judge, Andrew; MacLaren, Robert E; Mathur, Anthony; Williams, David J; Wall, Ivan; Birchall, Martin; Reeve, Brock; Atala, Anthony; Barker, Richard W; Cui, Zhanfeng; Furniss, Dominic; Bure, Kim; Snyder, Evan Y; Karp, Jeffrey M; Price, Andrew; Carr, Andrew; Brindley, David A
2014-01-01
There has been a large increase in basic science activity in cell therapy and a growing portfolio of cell therapy trials. However, the number of industry products available for widespread clinical use does not match this magnitude of activity. We hypothesize that the paucity of engagement with the clinical community is a key contributor to the lack of commercially successful cell therapy products. To investigate this, we launched a pilot study to survey clinicians from five specialities and to determine what they believe to be the most significant barriers to cellular therapy clinical development and adoption. Our study shows that the main concerns among this group are cost-effectiveness, efficacy, reimbursement, and regulation. Addressing these concerns can best be achieved by ensuring that future clinical trials are conducted to adequately answer the questions of both regulators and the broader clinical community. PMID:25383173
Quantitative phosphoproteomic analysis of early seed development in rice (Oryza sativa L.).
Qiu, Jiehua; Hou, Yuxuan; Tong, Xiaohong; Wang, Yifeng; Lin, Haiyan; Liu, Qing; Zhang, Wen; Li, Zhiyong; Nallamilli, Babi R; Zhang, Jian
2016-02-01
Rice (Oryza sativa L.) seed serves as a major food source for over half of the global population. Though it has long been recognized that phosphorylation plays an essential role in rice seed development, the phosphorylation events and dynamics of this process have remained largely unknown. Here, we report the first large-scale identification of rice seed phosphoproteins and phosphosites using a quantitative phosphoproteomic approach. Thorough proteomic studies in pistils and in seeds at 3 and 7 days after pollination resulted in the successful identification of 3885, 4313 and 4135 phosphopeptides, respectively. A total of 2487 proteins were differentially phosphorylated among the three stages, including Kip related protein 1, Rice basic leucine zipper factor 1, Rice prolamin box binding factor and numerous other master regulators of rice seed development. Moreover, differentially phosphorylated proteins may be extensively involved in the biosynthesis and signaling pathways of phytohormones such as auxin, gibberellin, abscisic acid and brassinosteroid. Our results strongly indicate that protein phosphorylation is a key mechanism regulating cell proliferation and enlargement, phytohormone biosynthesis and signaling, grain filling and grain quality during rice seed development. Overall, the current study enhances our understanding of the rice phosphoproteome and provides novel insight into the regulatory mechanism of rice seed development.
Ochs, Matthias; Mühlfeld, Christian
2013-07-01
The growing awareness of the importance of accurate morphometry in lung research has recently motivated the publication of guidelines set forth by a combined task force of the American Thoracic Society and the European Respiratory Society (20). This official ATS/ERS Research Policy Statement provides general recommendations on which stereological methods are to be used in quantitative microscopy of the lung. However, to integrate stereology into a particular experimental study design, investigators are left with the problem of how to implement this in practice. Specifically, different animal models of human lung disease require the use of different stereological techniques and may determine the mode of lung fixation, tissue processing, preparation of sections, and other things. Therefore, the present companion articles were designed to allow a short practically oriented introduction into the concepts of design-based stereology (Part 1) and to provide recommendations for choosing the most appropriate methods to investigate a number of important disease models (Part 2). Worked examples with illustrative images will facilitate the practical performance of equivalent analyses. Study algorithms provide comprehensive surveys to ensure that no essential step gets lost during the multistage workflow. Thus, with this review, we hope to close the gap between theory and practice and enhance the use of stereological techniques in pulmonary research.
NASA Astrophysics Data System (ADS)
Galanzha, Ekaterina I.; Tuchin, Valery V.; Chowdhury, Parimal; Zharov, Vladimir P.
2004-08-01
Digital transmission microscopy is an informative, vessel-noninvasive, simple and readily available method for studying and measuring lymph microvessel function in vivo, and the rat mesentery can be used as a promising in vivo animal model of lymph microvessels. This imaging system allowed visualization of the entire lymphangion (with input and output valves), its wall, the lymphatic valves, lymph flow, and single cells in flow; it yielded new basic information on lymph microcirculation and quantitative data on lymphatic function, including indexes of phasic contractions and valve function and quantitative parameters of lymph-flow velocity. The rat mesentery is a good model for creating different types of lymphedema in acute and chronic experiments. The data obtained revealed that significant edema started immediately after lymph node dissection in one-half of cases and was accompanied by lymphatic disturbances. The greatest degree of edema was found after 1 week. After 4 weeks, the degree of edema sometimes decreased, but functional lymphatic disturbances progressed. Nicotine had a significant, direct, dose-dependent effect on microlymphatic function upon acute local application, but the same dose had no effect on microcirculation in chronic intoxication. Despite yielding interesting data, transmittance microscopy had some limitations when applied to microcirculation studies; these problems could be solved by applying an integrated measuring technique.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1995-06-01
The basic relationships between stress and strain under cyclic conditions of loading are not at present well understood. It would seem that information of this type is vital for a fundamental approach to understand the fatigue behavior of dynamically loaded structures. In this paper, experimental and computational methods are utilized to study the fatigue behavior of a thin aluminum cantilever plate subjected to dynamic loading. The studies are performed by combining optomechanical and finite element methods. The cantilever plate is loaded periodically by excitation set at a fixed amplitude and at a specific resonance frequency of the plate. By continuously applying this type of loading and using holographic interferometry, the behavior of the plate during a specific period of time is investigated. Quantitative information is obtained from laser vibrometry data which are utilized by a finite element program to calculate strains and stresses assuming a homogeneous and isotropic material and constant strain elements. It is shown that the use of experimental and computational hybrid methodologies allows identification of different zones of the plate that are fatigue critical. This optomechanical approach proves to be a viable tool for understanding of fatigue behavior of mechanical components and for performing optimization of structures subjected to fatigue conditions.
Stem cell behavior on tailored porous oxide surface coatings.
Lavenus, Sandrine; Poxson, David J; Ogievetsky, Nika; Dordick, Jonathan S; Siegel, Richard W
2015-07-01
Nanoscale surface topographies are known to have a profound influence on cell behavior, including cell guidance, migration, morphology, proliferation, and differentiation. In this study, we have observed the behavior of human mesenchymal stem cells cultured on a range of tailored porous SiO2 and TiO2 nanostructured surface coatings fabricated via glancing angle electron-beam deposition. By controlling the physical vapor deposition angle during fabrication, we could control systematically the deposited coating porosity, along with associated topographic features. Immunocytochemistry and image analysis quantitatively revealed the number of adherent cells, as well as their basic cellular morphology, on these surfaces. Signaling pathway studies showed that even with subtle changes in nanoscale surface structures, the behavior of mesenchymal stem cells was strongly influenced by the precise surface structures of these porous coatings. Copyright © 2015 Elsevier Ltd. All rights reserved.
Motor Events during Healthy Sleep: A Quantitative Polysomnographic Study
Frauscher, Birgit; Gabelia, David; Mitterling, Thomas; Biermayr, Marlene; Bregler, Deborah; Ehrmann, Laura; Ulmer, Hanno; Högl, Birgit
2014-01-01
Study Objectives: Many sleep disorders are characterized by increased motor activity during sleep. In contrast, studies on motor activity during physiological sleep are largely lacking. We quantitatively investigated a large range of motor phenomena during polysomnography in physiological sleep. Design: Prospective polysomnographic investigation. Setting: Academic referral sleep laboratory. Participants: One hundred healthy sleepers age 19-77 y were strictly selected from a representative population sample by a two-step screening procedure. Interventions: N/A. Measurements and Results: Polysomnography according to American Academy of Sleep Medicine (AASM) standards was performed, and quantitative normative values were established for periodic limb movements in sleep (PLMS), high frequency leg movements (HFLM), fragmentary myoclonus (FM), neck myoclonus (NM), and rapid eye movement (REM)-related electromyographic (EMG) activity. Thirty-six subjects had a PLMS index > 5/h, 18 had a PLMS index > 15/h (90th percentile: 24.8/h). Thirty-three subjects had HFLM (90th percentile: four sequences/night). All subjects had FM (90th percentile 143.7/h sleep). Nine subjects fulfilled AASM criteria for excessive FM. Thirty-five subjects had NM (90th percentile: 8.8/h REM sleep). For REM sleep, different EMG activity measures for the mentalis and flexor digitorum superficialis muscles were calculated: the 90th percentile for phasic mentalis EMG activity for 30-sec epochs according to AASM recommendation was 15.6%, and for tonic mentalis EMG activity 2.6%. Twenty-five subjects exceeded the recently proposed phasic mentalis cutoff of 11%. None of the subjects exceeded the tonic mentalis cutoff of 9.6%. Conclusion: Quantification of motor phenomena is a basic prerequisite to develop normative values, and is a first step toward a more precise description of the various motor phenomena present during sleep. 
Because rates of motor events were unexpectedly high even in physiological sleep, the future use of normative values for both research and clinical routine is essential. Citation: Frauscher B; Gabelia D; Mitterling T; Biermayr M; Bregler D; Ehrmann L; Ulmer H; Högl B. Motor events during healthy sleep: a quantitative polysomnographic study. SLEEP 2014;37(4):763-773. PMID:24744455
Fractals in the neurosciences, Part II: clinical applications and future perspectives.
Di Ieva, Antonio; Esteban, Francisco J; Grizzi, Fabio; Klonowski, Wlodzimierz; Martín-Landrove, Miguel
2015-02-01
It has been ascertained that the human brain is a complex system that can be studied at multiple scales, from neurons and microcircuits to macronetworks. The brain is characterized by a hierarchical organization that gives rise to its high topological and functional complexity. Over the last decades, fractal geometry has been shown to be a universal tool for the analysis and quantification of the geometric complexity of natural objects, including the brain. The fractal dimension has been identified as a quantitative parameter for the evaluation of the roughness of neural structures, the estimation of time series, and the description of patterns, and is thus able to discriminate different states of the brain across its entire physiopathological spectrum. Fractal-based computational analyses have been applied to the neurosciences, particularly in the field of clinical neurosciences including neuroimaging and neuroradiology, neurology and neurosurgery, psychiatry and psychology, and neuro-oncology and neuropathology. After a review of the basic concepts of fractal analysis and its main applications to the basic neurosciences in Part I of this series, here we review the main applications of fractals to the clinical neurosciences for a holistic approach towards a fractal geometry model of the brain. © The Author(s) 2013.
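The box-counting estimator is the most common way to compute the fractal dimension mentioned above: count the boxes N(s) of side s that intersect the structure, then take the slope of log N(s) versus log(1/s). A minimal sketch on a synthetic binary image (the clinical studies cited use more refined estimators and real imaging data):

```python
import numpy as np

def box_count(img, s):
    """Count s-by-s boxes containing at least one nonzero pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, s):
        for j in range(0, w, s):
            if img[i:i+s, j:j+s].any():
                count += 1
    return count

# Synthetic test object: a diagonal line in a 256x256 image,
# whose box-counting dimension should come out close to 1.
img = np.zeros((256, 256), dtype=bool)
idx = np.arange(256)
img[idx, idx] = True

sizes = [2, 4, 8, 16, 32]
counts = [box_count(img, s) for s in sizes]

# Fractal dimension = slope of log N(s) vs log(1/s).
D, _ = np.polyfit(np.log([1.0 / s for s in sizes]), np.log(counts), 1)
print(f"estimated box-counting dimension: {D:.2f}")
```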
N-terminal region of myelin basic protein reduces fibrillar amyloid-β deposition in Tg-5xFAD mice.
Ou-Yang, Ming-Hsuan; Xu, Feng; Liao, Mei-Chen; Davis, Judianne; Robinson, John K; Van Nostrand, William E
2015-02-01
Alzheimer's disease is a progressive neurodegenerative disorder that is characterized by extensive deposition of fibrillar amyloid-β (Aβ) in the brain. Previously, myelin basic protein (MBP) was identified to be a potent inhibitor to Aβ fibril formation, and this inhibitory activity was localized to the N-terminal residues 1-64, a fragment designated MBP1. Here, we show that the modest neuronal expression of a fusion protein of the biologically active MBP1 fragment and the enhanced green fluorescent protein (MBP1-EGFP) significantly improved the performance of spatial learning memory in Tg-5xFAD mice, a model of pathologic Aβ accumulation in brain. The levels of insoluble Aβ and fibrillar amyloid were significantly reduced in bigenic Tg-5xFAD/Tg-MBP1-EGFP mice. Quantitative stereological analysis revealed that the reduction in amyloid was because of a reduction in the size of fibrillar plaques rather than a decrease in plaque numbers. The current findings support previous studies showing that MBP1 inhibits Aβ fibril formation in vitro and demonstrate the ability of MBP1 to reduce Aβ pathology and improve behavioral performance. Copyright © 2015 Elsevier Inc. All rights reserved.
Basic versus applied research: Julius Sachs (1832–1897) and the experimental physiology of plants
Kutschera, Ulrich
2015-01-01
The German biologist Julius Sachs was the first to introduce controlled, accurate, quantitative experimentation into the botanical sciences, and is regarded as the founder of modern plant physiology. His seminal monograph Experimental-Physiologie der Pflanzen (Experimental Physiology of Plants) was published 150 y ago (1865), when Sachs was employed as a lecturer at the Agricultural Academy in Poppelsdorf/Bonn (now part of the University). This book marks the beginning of a new era of basic and applied plant science. In this contribution, I summarize the achievements of Sachs and outline his lasting legacy. In addition, I show that Sachs was one of the first biologists who integrated bacteria, which he considered to be descendants of fungi, into the botanical sciences and discussed their interaction with land plants (degradation of wood etc.). This “plant-microbe-view” of green organisms was extended and elaborated by the laboratory botanist Wilhelm Pfeffer (1845–1920), so that the term “Sachs-Pfeffer-Principle of Experimental Plant Research” appears to be appropriate to characterize this novel way of performing scientific studies on green, photoautotrophic organisms (embryophytes, algae, cyanobacteria). PMID:26146794
Lombardi, A M
2017-09-18
Stochastic models provide quantitative evaluations of earthquake occurrence. A basic component of this type of model is the uncertainty in defining the main features of an intrinsically random process. Even if, at a very basic level, any attempt to distinguish between types of uncertainty is questionable, a usual way to deal with this topic is to separate epistemic uncertainty, due to lack of knowledge, from aleatory variability, due to randomness. In the present study this problem is addressed in the narrow context of short-term modeling of earthquakes and, specifically, of ETAS modeling. By means of an application of a specific version of the ETAS model to the seismicity of Central Italy, recently struck by a sequence with a main event of Mw 6.5, the aleatory and epistemic (parametric) uncertainties are separated and quantified. The main result of the paper is that the parametric uncertainty of the ETAS-type model adopted here is much lower than the aleatory variability in the process. This result points out two main aspects: an analyst has a good chance of setting up an ETAS-type model, but may retrospectively describe and forecast earthquake occurrences with still limited precision and accuracy.
Diffusion and Localization of Relative Strategy Scores in The Minority Game
NASA Astrophysics Data System (ADS)
Granath, Mats; Perez-Diaz, Alvaro
2016-10-01
We study the equilibrium distribution of relative strategy scores of agents in the asymmetric phase (α ≡ P/N ≳ 1) of the basic Minority Game using sign payoff, with N agents holding two strategies over P histories. We formulate a statistical model that makes use of the gauge freedom with respect to the ordering of an agent's strategies to quantify the correlation between the attendance and the distribution of strategies. The relative score x ∈ ℤ of the two strategies of an agent is described in terms of a one-dimensional random walk with asymmetric jump probabilities, leading either to a static, asymmetric exponential distribution centered at x = 0 for fickle agents or to diffusion with a positive or negative drift for frozen agents. In terms of the scaled coordinates x/√N and t/N the distributions are uniquely given by α and are in quantitative agreement with direct simulations of the game. As the model avoids the reformulation in terms of a constrained minimization problem, it can be used for arbitrary payoff functions with little calculational effort and provides a transparent and simple formulation of the dynamics of the basic Minority Game in the asymmetric phase.
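The exponential score distribution for fickle agents can be illustrated with a toy version of the random walk described above. A hedged sketch, assuming a fixed bias toward the origin (p_toward = 0.6 is an arbitrary choice) rather than the game-derived jump probabilities:

```python
import numpy as np

# Toy 1D random walk with jump probabilities biased toward the origin,
# standing in for a fickle agent's relative strategy score. The bias
# p_toward is an assumed value, not derived from the Minority Game.
rng = np.random.default_rng(0)
p_toward = 0.6
x = 0
samples = []
for _ in range(200_000):
    if x == 0:
        x += rng.choice([-1, 1])       # unbiased step away from zero
    elif rng.random() < p_toward:
        x -= int(np.sign(x))           # step toward the origin
    else:
        x += int(np.sign(x))           # step away from the origin
    samples.append(x)

# The stationary distribution of |x| decays geometrically, i.e.
# exponentially, as the abstract describes for fickle agents.
abs_x = np.abs(samples)
```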
NASA Astrophysics Data System (ADS)
Century, Daisy Nelson
This probing study focused on alternative and traditional assessments and their comparative impacts on students' attitudes and science learning outcomes. Four basic questions were asked: What type of science learning stemming from the instruction can best be assessed by a traditional paper-and-pencil test? What type can best be assessed by alternative assessment? What are the differences in the types of learning outcomes that can be assessed by paper-and-pencil tests and by alternative assessments? Is there a difference in students' attitudes toward learning science when outcomes are assessed by alternative means rather than traditional means? A mixed methodology involving quantitative and qualitative techniques was utilized; however, the study was essentially a case study. Quantitative data analysis included content achievement and attitude results, to which non-parametric statistics were applied. Analysis of qualitative data was done as a case study utilizing pre-set protocols, resulting in a narrative-summary style of report. These outcomes were combined to produce conclusions. The study revealed that the traditional method yielded more concrete cognitive content learning than the alternative assessment, while the alternative assessment yielded more psychomotor, cooperative-learning and critical-thinking skills. With both methods the students' attitudes toward science were positive, with no significant differences favoring either group. The quantitative finding of no statistically significant differences suggests that, at a minimum, there is no loss in the use of alternative assessment methods, in this instance performance testing.
Adding the results from the qualitative analysis to this suggests (1) that class groups were more satisfied when alternative methods were employed, and (2) that the two assessment methodologies are complementary to each other, and thus should probably be used together to produce maximum benefit.
Quantitative analysis of eosinophil chemotaxis tracked using a novel optical device -- TAXIScan.
Nitta, Nao; Tsuchiya, Tomoko; Yamauchi, Akira; Tamatani, Takuya; Kanegasaki, Shiro
2007-03-30
We have previously reported the development of an optically accessible, horizontal chemotaxis apparatus in which the migration of cells in a channel from a start line can be traced at time-lapse intervals using a CCD camera (JIM 282, 1-11, 2003). To obtain statistical data on migrating cells, we have developed quantitative methods to calculate various parameters of the chemotaxis process, employing human eosinophils and CXCL12 as the model cell and model chemoattractant, respectively. Median values of the velocity and directionality of each cell within an experimental period could be calculated from the migratory pathway data obtained from time-lapse images, and the data were expressed as a Velocity-Directionality (VD) plot. This plot is useful for quantitatively analyzing multiple migrating cells exposed to a given chemoattractant and can distinguish chemotaxis from random migration. Moreover, precise observation of cell migration revealed that each cell had a different lag period before starting chemotaxis, indicating variation in cell sensitivity to the chemoattractant. Thus, the lag time of each cell before migration and the time course of the increase in the ratio of migrating cells at early stages could be calculated. We also graphed the decrease in the ratio of still-migrating cells at later stages by calculating the duration of migration of each cell. These graphs could distinguish different chemotactic motion patterns of eosinophils in response to a range of chemoattractants: PGD(2), fMLP, CCL3, CCL5 and CXCL12. Finally, we compared parameters of eosinophils from normal volunteers, allergy patients and asthma patients and found a significant difference in the response to PGD(2). The quantitative methods described here are applicable to image data obtained with any combination of cells and chemoattractants and are useful not only for basic studies of chemotaxis but also for diagnosis and drug screening.
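The per-cell velocity and directionality statistics described above can be sketched as follows. The function name, the synthetic track, and the directionality definition (net displacement over total path length) are assumptions for illustration; the paper's exact formulas may differ.

```python
import numpy as np

def track_stats(xy, dt):
    """Median speed and directionality for one cell track.

    xy: (T, 2) array of positions from time-lapse images;
    dt: time-lapse interval in seconds.
    """
    steps = np.diff(xy, axis=0)                   # per-frame displacements
    step_len = np.linalg.norm(steps, axis=1)
    velocity = np.median(step_len / dt)           # median speed
    net = np.linalg.norm(xy[-1] - xy[0])          # net displacement
    directionality = net / step_len.sum()         # 1.0 = perfectly straight
    return velocity, directionality

# Illustrative track: mostly straight motion with small detours.
track = np.array([[0, 0], [1, 0], [2, 0.5], [3, 0.5], [4, 1]], float)
v, d = track_stats(track, dt=30.0)
print(f"velocity = {v:.4f} um/s, directionality = {d:.2f}")
```

Near-straight chemotactic tracks give directionality close to 1, while random migration pushes it toward 0, which is how a VD plot separates the two regimes.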
Korennoy, F I; Gulenkin, V M; Gogin, A E; Vergne, T; Karaulov, A K
2017-12-01
In 1977, Ukraine experienced a local epidemic of African swine fever (ASF) in the Odessa region. A total of 20 settlements were affected during the course of the epidemic, including both large farms and backyard households. Thanks to timely interventions, virus circulation was successfully eradicated within 6 months, with no additional outbreaks. A detailed report of the outbreak investigation has been publicly available since 2014. The report contains quantitative data that allow study of the ASF-spread dynamics over the course of the epidemic. In our study, we used this historical epidemic to estimate the basic reproductive number of the ASF virus both within and between farms. The basic reproductive number (R0) represents the average number of secondary infections caused by one infectious unit during its infectious period in a susceptible population. Calculations were made under an assumption of exponential initial growth by fitting an approximating curve to the initial segments of the epidemic curves. R0 within farms and between farms was estimated at 7.46 (95% confidence interval: 5.68-9.21) and 1.65 (1.42-1.88), respectively; the corresponding daily transmission rates were estimated at 1.07 (0.81-1.32) and 0.09 (0.07-0.10). These estimates based on historical data are consistent with those using data generated by the recent epidemic currently affecting eastern Europe. Such results contribute to the published knowledge on ASF transmission dynamics under natural conditions and could be used to model and predict the spread of ASF in affected and non-affected regions and to evaluate the effectiveness of different control measures. © 2016 Blackwell Verlag GmbH.
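The estimation approach described above, fitting an exponential to the initial segment of the epidemic curve, can be sketched as follows. The conversion R0 = 1 + r·D (D = mean infectious period) is one simple SIR-type approximation, and the case counts and D below are illustrative assumptions, not the Ukrainian data.

```python
import numpy as np

# Illustrative initial segment of an epidemic curve (assumed values).
days = np.arange(6)
cases = np.array([2, 3, 5, 8, 13, 21])   # cumulative cases, assumed

# Exponential-growth fit in log space: log(cases) = log(c0) + r*day.
# np.polyfit returns coefficients highest-degree first: [slope, intercept].
r, log_c0 = np.polyfit(days, np.log(cases), 1)

D = 5.0            # assumed mean infectious period in days
R0 = 1.0 + r * D   # simple SIR-type conversion from growth rate to R0
print(f"growth rate r = {r:.3f}/day, R0 = {R0:.2f}")
```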
Smith, Eric G.
2015-01-01
Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. 
The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
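The core estimation step above, dividing the change in the treatment effect estimate between the nested models by the estimated degree of confounding amplification, can be sketched with plain arithmetic. All numbers are illustrative assumptions, not results from the paper, and this sketch omits the adjustment for any association between the introduced variable(s) and outcome that the method also requires.

```python
# Illustrative numbers only; one plausible reading of the division step.
est_base = 1.30        # effect estimate, base propensity score model
est_amplified = 1.42   # estimate after adding a strong exposure predictor
amplification = 1.50   # estimated confounding-amplification factor (assumed)

# The change between the nested models is attributed to the extra
# amplification of residual confounding introduced between them.
residual_confounding = (est_amplified - est_base) / (amplification - 1.0)
print(f"estimated residual confounding: {residual_confounding:.2f}")
```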
A New Look at the Role of Thiolate Ligation in Cytochrome P450
Yosca, Timothy H.; Ledray, Aaron P.; Ngo, Joanna; Green, Michael T.
2017-01-01
Protonated ferryl (or iron(IV)hydroxide) intermediates have been characterized in several thiolate-ligated heme proteins that are known to catalyze C-H bond activation. The basicity of the ferryl intermediates in these species has been proposed to play a critical role in facilitating this chemistry, allowing hydrogen abstraction at reduction potentials below those that would otherwise lead to oxidative degradation of the enzyme. In this contribution, we discuss the events that led to the assignment and characterization of the unusual iron(IV)hydroxide species, highlighting experiments that provided a quantitative measure of the ferryl basicity, the iron(IV)hydroxide pKa. We then turn to the importance of the iron(IV)hydroxide state, presenting a new way of looking at the role of thiolate ligation in these systems. PMID:28091754
NASA Technical Reports Server (NTRS)
Reisel, John R.; Laurendeau, Normand M.
1994-01-01
Laser-induced fluorescence (LIF) has been applied to the quantitative measurement of nitric oxide (NO) in premixed, laminar, high-pressure flames. Their chemistry was also studied using three current kinetics schemes to determine the predictive capabilities of each mechanism with respect to NO concentrations. The flames studied were low-temperature (1600 K < T < 1850 K) C2H6/O2/N2 and C2H6/O2/N2 flames, and high-temperature (2100 K < T < 2300 K) C2H6/O2/N2 flames. Laser-saturated fluorescence (LSF) was initially used to measure the NO concentrations. However, while the excitation transition was well saturated at atmospheric pressure, the fluorescence behavior was basically linear with respect to laser power at pressures above 6 atm. Measurements and calculations demonstrated that the variation of the fluorescence quenching rate is negligible for LIF measurements of NO at a given pressure. Therefore, linear LIF was used to perform quantitative measurements of NO concentration in these high-pressure flames. The transportability of a calibration factor from one set of flame conditions to another was also investigated by considering changes in the absorption and quenching environment for different flame conditions. The feasibility of performing LIF measurements of [NO] in turbulent flames was studied; the single-shot detection limit was determined to be 2 ppm.
Magnetic resonance imaging traits in siblings discordant for Alzheimer disease.
Cuenco, Karen T; Green, Robert C; Zhang, J; Lunetta, Kathryn; Erlich, Porat M; Cupples, L Adrienne; Farrer, Lindsay A; DeCarli, Charles
2008-07-01
Magnetic resonance imaging (MRI) can aid clinical assessment of brain changes potentially correlated with Alzheimer disease (AD). MRI traits may improve our ability to identify genes associated with AD outcomes. We evaluated semi-quantitative MRI measures as endophenotypes for genetic studies by assessing their association with AD in families from the Multi-Institutional Research in Alzheimer Genetic Epidemiology (MIRAGE) Study. Discordant siblings from multiple ethnicities were ascertained through a single affected proband. Semi-quantitative MRI measures were obtained for each individual. The association between continuous/ordinal MRI traits and AD was analyzed using generalized estimating equations. Medical history and Apolipoprotein E (APOE) ε4 status were evaluated as potential confounders. Comparisons of 214 affected and 234 unaffected subjects from 229 sibships revealed that general cerebral atrophy, white matter hyperintensities (WMH), and mediotemporal atrophy differed significantly between groups (each at P < .0001) and varied by ethnicity. Age at MRI and duration of AD confounded all associations between AD and MRI traits. Among unaffected sibs, the presence of at least one APOE ε4 allele and MRI infarction was associated with more WMH after adjusting for age at MRI. The strong association between MRI traits and AD suggests that MRI traits may be informative endophenotypes for basic and clinical studies of AD. In particular, WMH may be a marker of vascular disease that contributes to AD pathogenesis.
Current and Future X-ray Studies of High-Redshift AGNs and the First Supermassive Black Holes
NASA Astrophysics Data System (ADS)
Brandt, Niel
2016-01-01
X-ray observations of high-redshift AGNs at z = 4-7 have played a critical role in understanding the physical processes at work in these objects as well as their basic demographics. Since 2000, Chandra and XMM-Newton have provided new X-ray detections for more than 120 such objects, and well-defined samples of z > 4 AGNs now allow reliable X-ray population studies. Once luminosity effects are considered, the basic X-ray continuum properties of most high-redshift AGNs appear remarkably similar to those of local AGNs, although there are some notable apparent exceptions (e.g., highly radio-loud quasars). Furthermore, the X-ray absorption found in some objects has been used as a diagnostic of outflowing winds and circumnuclear material. Demographically, the X-ray data now support an exponential decline in the number density of luminous AGNs above z ~ 3, and quantitative space-density comparisons for optically selected and X-ray selected quasars indicate basic statistical agreement. The current X-ray discoveries point the way toward the future breakthroughs that will be possible with, e.g., Athena and the X-ray Surveyor. These missions will execute powerful blank-field surveys to elucidate the demographics of the first growing supermassive black holes (SMBHs), including highly obscured systems, up to z ~ 10. They will also carry out complementary X-ray spectroscopic and variability investigations of high-redshift AGNs by targeting the most-luminous z = 7-10 quasars found in wide-field surveys by, e.g., Euclid, LSST, and WFIRST. X-ray spectroscopic and variability studies of the X-ray continuum and reflection signatures will help determine Eddington ratios and disk/corona properties; measuring these will clarify how the first quasars grew so quickly. Furthermore, absorption line/edge studies will reveal how outflows from the first SMBHs influenced the growth of the first galaxies. I will suggest some efficient observational strategies for Athena and the X-ray Surveyor.
Understanding molecular structure from molecular mechanics.
Allinger, Norman L
2011-04-01
Molecular mechanics gives us a well-known model of molecular structure. It is less widely recognized that valence bond theory gives us structures which offer a direct interpretation of molecular mechanics formulations and parameters. The electronic effects well known in physical organic chemistry can be directly interpreted in terms of valence bond structures, and hence quantitatively calculated and understood. The basic theory is outlined in this paper, and examples of the effects and their interpretation in illustrative cases are presented.
Towards A Topological Framework for Integrating Semantic Information Sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joslyn, Cliff A.; Hogan, Emilie A.; Robinson, Michael
2014-09-07
In this position paper we argue for the role that topological modeling principles can play in providing a framework for sensor integration. While used successfully with standard (quantitative) sensors, we are developing this methodology in new directions to make it appropriate specifically for semantic information sources, including keyterms, ontology terms, and other general Boolean, categorical, ordinal, and partially ordered data types. We illustrate the basics of the methodology in an extended use case/example, and discuss the path forward.
A Quantitative Analysis of Factors Affecting Retention of Female Aviators in U.S. Naval Aviation
2012-09-01
various reasons and is controlled by either US law or military regulations, and it equates to a basic quid pro quo scenario. The Navy offers the...Fitzgerald (2005) used time dependent modeling to investigate the effects of sexual harassment on turnover in the military. They discovered that females...exposed to sexual harassment were likely to leave either the job or organization to escape it, depending on the extent of the perceived threat
Sympathetic Cooling of Lattice Atoms by a Bose-Einstein Condensate
2010-08-13
average out to zero net change in momentum. This type of cooling is the basis for techniques such as Zeeman slowing and magneto-optical traps. On a more basic level, an excited...cause stimulated emission of a second excitation. A quantitative explanation requires the use of the density fluctuation operator. This operator
Means and Method for Measurement of Drilling Fluid Properties
NASA Astrophysics Data System (ADS)
Lysyannikov, A.; Kondrashov, P.; Pavlova, P.
2016-06-01
The paper addresses the problem of creating a new design of a device for determining the rheological parameters of drilling fluids, and states the basic requirements the device must meet. The key quantitative parameters that define the developed device are provided. An algorithm is presented for determining the yield-point coefficient from the rheological Shvedov-Bingham model at relative cup rotation speeds of 300 and 600 rpm with the investigated drilling fluid.
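As context for the 300 and 600 rpm readings mentioned in the abstract, a minimal sketch of the conventional two-speed Bingham-plastic calculation used with rotational viscometers (the standard field convention, not the paper's own algorithm; the function and argument names are illustrative):

```python
def bingham_parameters(theta300, theta600):
    """Compute Bingham-plastic parameters from two-speed viscometer
    dial readings (conventional field calculation).

    theta300, theta600: dial readings at 300 and 600 rpm.
    Returns (plastic_viscosity, yield_point); with conventional dial
    units these come out in cP and lbf/100 ft^2 respectively.
    """
    # Plastic viscosity: difference between the two dial readings.
    plastic_viscosity = theta600 - theta300
    # Yield point: 300 rpm reading minus the plastic viscosity.
    yield_point = theta300 - plastic_viscosity
    return plastic_viscosity, yield_point

# Example with hypothetical dial readings:
pv, yp = bingham_parameters(theta300=32, theta600=52)
print(pv, yp)  # 20 12
```

The two-speed form works because the Bingham model is linear in shear rate, so two readings suffice to fix both the slope (plastic viscosity) and the intercept (yield point).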