Mixed Methods in CAM Research: A Systematic Review of Studies Published in 2012
Bishop, Felicity L.; Holmes, Michelle M.
2013-01-01
Background. Mixed methods research uses qualitative and quantitative methods together in a single study or a series of related studies. Objectives. To review the prevalence and quality of mixed methods studies in complementary medicine. Methods. All studies published in the top 10 integrative and complementary medicine journals in 2012 were screened. The quality of mixed methods studies was appraised using a published tool designed for mixed methods studies. Results. 4% of papers (95 out of 2349) reported mixed methods studies, 80 of which met criteria for applying the quality appraisal tool. The most popular formal mixed methods design was triangulation (used by 74% of studies), followed by embedded (14%), sequential explanatory (8%), and finally sequential exploratory (5%). Quantitative components were generally of higher quality than qualitative components; when quantitative components involved RCTs they were of particularly high quality. Common methodological limitations were identified. Most strikingly, none of the 80 mixed methods studies addressed the philosophical tensions inherent in mixing qualitative and quantitative methods. Conclusions and Implications. The quality of mixed methods research in CAM can be enhanced by addressing philosophical tensions and improving reporting of (a) analytic methods and reflexivity (in qualitative components) and (b) sampling and recruitment-related procedures (in all components). PMID:24454489
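The prevalence figure and design breakdown reported in this abstract can be reproduced with a few lines of arithmetic; all numbers in this sketch are taken directly from the abstract itself.

```python
# All counts below are taken directly from the abstract above.
total_papers = 2349
mixed_methods_papers = 95
appraised_studies = 80

prevalence_pct = 100.0 * mixed_methods_papers / total_papers
assert round(prevalence_pct) == 4  # matches the reported "4% of papers"

# Reported shares (percent) of formal designs among the 80 appraised studies.
design_shares = {
    "triangulation": 74,
    "embedded": 14,
    "sequential explanatory": 8,
    "sequential exploratory": 5,
}
# Each share is rounded independently, so the total can differ from 100.
print(round(prevalence_pct, 2), sum(design_shares.values()))  # → 4.04 101
```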
Code of Federal Regulations, 2011 CFR
2011-04-01
... post office address of, and must be countersigned by, an authorized attorney, agent, or official... pharmacologically related drugs. (c) Description of dosage form and quantitative composition. (ii) Scientific... quantitative statement of composition. Reasonable alternatives for any listed component may be specified. (ii...
Quantitating the Absorption, Partitioning and Toxicity of Hydrocarbon Components of JP-8 Jet Fuel
2007-08-24
[Truncated DTIC report record; only fragments survive extraction.] Recoverable content: references to presentations at the AFOSR Jet Fuel Toxicology Workshop (Tucson, AZ, 2004 and 2005), including Basak SC, Riviere JE, Baynes RE, Xia XR, Gute BD, "A hierarchical QSAR ..." and "Theoretical descriptor based QSARs in predicting skin penetration of ..."; performing organization: Center for Chemical Toxicology Research and Pharmacokinetics, College of Veterinary ...
Drew, Peter; Tippett, Vivienne; Devenish, Scott
2018-05-01
The objective of this review is to develop an aggregated synthesis of qualitative and quantitative data on occupational violence (OV) mitigation interventions for Emergency Service Workers (ESW), to cultivate useful conclusions and recommendations for paramedic occupational safety and policy development. Emergency Service Worker is a broad term encompassing all elements of community-based emergency support and includes paramedics, firefighters, and police. The objective of the quantitative component of this review is to quantify the effectiveness of OV mitigation interventions for ESW. The objective of the qualitative component of this review is to explore the perceptions and experiences of ESW on the effectiveness of OV mitigation interventions. This review seeks to address the following questions.
Keeping It Real, Keeping Them Interested and Keeping It in Their Minds
ERIC Educational Resources Information Center
Howley, Peter P.
2008-01-01
As part of many universities' Business degrees, students will undertake an introductory statistics course. Lecturers need to help these students appreciate and recognise the value of possessing quantitative skills and to learn and apply such skills. Three components to teaching that address these aims as well as the interdependence of these…
2013-01-01
Background Qualitative research methods are increasingly used within clinical trials to address broader research questions than can be addressed by quantitative methods alone. These methods enable health professionals, service users, and other stakeholders to contribute their views and experiences to evaluation of healthcare treatments, interventions, or policies, and influence the design of trials. Qualitative data often contribute information that is better able to reform policy or influence design. Methods Health services researchers, including trialists, clinicians, and qualitative researchers, worked collaboratively to develop a comprehensive portfolio of standard operating procedures (SOPs) for the West Wales Organisation for Rigorous Trials in Health (WWORTH), a clinical trials unit (CTU) at Swansea University, which has recently achieved registration with the UK Clinical Research Collaboration (UKCRC). Although the UKCRC requires a total of 25 SOPs from registered CTUs, WWORTH chose to add an additional qualitative-methods SOP (QM-SOP). Results The qualitative methods SOP (QM-SOP) defines good practice in designing and implementing qualitative components of trials, while allowing flexibility of approach and method. Its basic principles are that: qualitative researchers should be contributors from the start of trials with qualitative potential; the qualitative component should have clear aims; and the main study publication should report on the qualitative component. Conclusions We recommend that CTUs consider developing a QM-SOP to enhance the conduct of quantitative trials by adding qualitative data and analysis. We judge that this improves the value of quantitative trials, and contributes to the future development of multi-method trials. PMID:23433341
Addressing Negative Math Attitudes with Service-Learning
ERIC Educational Resources Information Center
Henrich, Allison; Sloughter, J. McLean; Anderson, Jeffrey; Bahuaud, Eric
2016-01-01
In this paper, we share the results of our recent study of a quantitative literacy course with a service-learning component. Our study aims to answer the question: How did student attitudes shift as a result of participating in this course? We present and analyze statistics from pre- and post-surveys in five classes (N = 78) taught by two…
Women of Color Navigating the Academy: The Discursive Power of Professionalism
ERIC Educational Resources Information Center
Castro, Corinne
2012-01-01
This project examines the professional experiences of women of color faculty to uncover less visible mechanisms of inequality in the academy. It is a mixed-methods study with both qualitative and quantitative components. I address the limitations of past research by revealing how, despite the relative successes of women of color in academia,…
Chen, Yongsheng; Persaud, Bhagwant
2014-09-01
Crash modification factors (CMFs) for road safety treatments are developed as multiplicative factors that are used to reflect the expected changes in safety performance associated with changes in highway design and/or the traffic control features. However, current CMFs have methodological drawbacks. For example, variability with application circumstance is not well understood, and, as important, correlation is not addressed when several CMFs are applied multiplicatively. These issues can be addressed by developing safety performance functions (SPFs) with components of crash modification functions (CM-Functions), an approach that includes all CMF related variables, along with others, while capturing quantitative and other effects of factors and accounting for cross-factor correlations. CM-Functions can capture the safety impact of factors through a continuous and quantitative approach, avoiding the problematic categorical analysis that is often used to capture CMF variability. There are two formulations to develop such SPFs with CM-Function components - fully specified models and hierarchical models. Based on sample datasets from two Canadian cities, both approaches are investigated in this paper. While both model formulations yielded promising results and reasonable CM-Functions, the hierarchical model was found to be more suitable in retaining homogeneity of first-level SPFs, while addressing CM-Functions in sub-level modeling. In addition, hierarchical models better capture the correlations between different impact factors. Copyright © 2014 Elsevier Ltd. All rights reserved.
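The contrast drawn in this abstract, fixed multiplicative CMFs versus a crash modification function whose effect varies continuously with application circumstance, can be sketched as follows. The functional form and every coefficient value here are invented for illustration and are not taken from the study.

```python
import math

# Conventional practice: predicted crashes = base SPF estimate times the
# product of crash modification factors (CMFs), which assumes independence.
def apply_cmfs(base_prediction, cmfs):
    product = 1.0
    for factor in cmfs:
        product *= factor
    return base_prediction * product

# CM-Function style: the treatment effect varies continuously with an
# application circumstance x (e.g., traffic volume), instead of being a
# single fixed factor. Coefficients here are invented for illustration.
def cm_function(x, beta0=-0.2, beta1=0.05):
    return math.exp(beta0 + beta1 * x)  # positive multiplicative adjustment

base = 3.0  # crashes/year from a hypothetical base SPF
fixed = apply_cmfs(base, [0.9, 0.8])   # two fixed CMFs applied multiplicatively
varying = base * cm_function(4.0)      # adjustment depends on circumstance x = 4
print(round(fixed, 3), round(varying, 3))  # → 2.16 3.0
```

The exponential form keeps the adjustment positive for any x, which is one common choice for continuous crash modification functions; the paper's actual formulations may differ.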
ERIC Educational Resources Information Center
Marcoux, Helene Elizabeth
A study with both quantitative and qualitative components was conducted with undergraduate teaching faculty at Kansas State University for fall 1999 to spring 2001 to study faculty role in addressing cheating at the collegiate level and faculty awareness of the University's honor system and cheating policies. Data were gathered through a variety…
Electrical Components for Marine Renewable Energy Arrays: A Techno-Economic Review
Collin, Adam J.; Nambiar, Anup J.; Bould, David; ...
2017-11-27
This paper presents a review of the main electrical components that are expected to be present in marine renewable energy arrays. The review is put in context by appraising the current needs of the industry and identifying the key components required in both device and array-scale developments. For each component, electrical, mechanical and cost considerations are discussed, with quantitative data collected during the review made freely available for use by the community via an open access online repository. Here, this data collection updates previous research and addresses gaps specific to emerging offshore technologies, such as marine and floating wind, and provides a comprehensive resource for the techno-economic assessment of offshore energy arrays.
Erwin, Deborah O; Johnson, Virginia A; Feliciano-Libid, Luisa; Zamora, Dulce; Jandorf, Lina
2005-01-01
Latino immigrants are at higher risk of death from breast and cervical cancer, necessitating effective cancer education interventions. Qualitative and quantitative information was obtained from Latinos from Arkansas and New York City through focus groups and questionnaires. Findings were analyzed using the PEN-3 model. The results demonstrate a mechanism for creating a culturally competent program, Esperanza y Vida, through progressively analyzing the findings to define the key perceptions, enablers, and nurturers, then applying this information to construct program components to address appropriate health behavior and cultural components that address the specific needs of a diverse Latino population. Finding a systematic approach to incorporating and embracing sociocultural perspectives and constructs may effectively appeal to diverse Latino immigrants in the development of a cancer education intervention.
Stress Intensity Factors for Cracked Metallic Structures Under Rapid Thermal Loading
1987-10-01
[Truncated DTIC report documentation page; only fragments survive extraction.] Recoverable content: performing organizations Flight Dynamics Laboratory (AFWAL/FIBFC), APTECH Engineering Services, and Air Force Wright Aeronautical Laboratories; sponsor: DOD SBIR Program Office, contract F33615-86-C-3217. Substantive fragment: "... flawed components requires the application of fracture mechanics wherein crack tip stress intensity factors are used to provide a quantitative means ..."
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Kopasakis, George
2010-01-01
This paper covers the propulsion system component modeling and controls development of an integrated mixed compression inlet and turbojet engine that will be used for an overall vehicle Aero-Propulso-Servo-Elastic (APSE) model. Using previously created nonlinear component-level propulsion system models, a linear integrated propulsion system model and loop shaping control design have been developed. The design includes both inlet normal shock position control and jet engine rotor speed control for a potential supersonic commercial transport. A preliminary investigation of the impacts of the aero-elastic effects on the incoming flow field to the propulsion system is discussed; however, the focus here is on developing a methodology for the propulsion controls design that prevents unstart in the inlet and minimizes the thrust oscillation experienced by the vehicle. Quantitative Feedback Theory (QFT) specifications and bounds, and aspects of classical loop shaping, are used in the control design process. Model uncertainty is incorporated in the design to address possible error in the system identification mapping of the nonlinear component models into the integrated linear model.
A hierarchical-multiobjective framework for risk management
NASA Technical Reports Server (NTRS)
Haimes, Yacov Y.; Li, Duan
1991-01-01
A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. United into the framework are the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.
Calibration and filtering strategies for frequency domain electromagnetic data
Minsley, Burke J.; Smith, Bruce D.; Hammack, Richard; Sams, James I.; Veloski, Garret
2010-01-01
Techniques for processing frequency-domain electromagnetic (FDEM) data that address systematic instrument errors and random noise are presented, improving the ability to invert these data for meaningful earth models that can be quantitatively interpreted. A least-squares calibration method, originally developed for airborne electromagnetic datasets, is implemented for a ground-based survey in order to address systematic instrument errors, and new insights are provided into the importance of calibration for preserving spectral relationships within the data that lead to more reliable inversions. An alternative filtering strategy based on principal component analysis, which takes advantage of the strong correlation observed in FDEM data, is introduced to help address random noise in the data without imposing somewhat arbitrary spatial smoothing.
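The principal-component filtering idea described here, exploiting strong cross-frequency correlation to separate signal from random noise, can be sketched on synthetic data. The survey geometry, noise level, and the choice to retain a single component are all assumptions for this rank-1 toy example, not details from the paper.

```python
import numpy as np

# Synthetic stand-in for a multi-frequency FDEM survey: responses at 6
# frequencies that are strongly correlated along the line, plus random noise.
rng = np.random.default_rng(0)
n_stations, n_freqs = 200, 6
signal = np.outer(np.sin(np.linspace(0.0, 3.0 * np.pi, n_stations)),
                  np.linspace(1.0, 2.0, n_freqs))
noisy = signal + 0.3 * rng.standard_normal((n_stations, n_freqs))

# PCA via SVD on mean-centered data: keep the leading component(s), which
# carry the correlated signal, and discard the rest as uncorrelated noise.
# Retaining k = 1 component is an assumption tied to this rank-1 example.
mean = noisy.mean(axis=0)
U, s, Vt = np.linalg.svd(noisy - mean, full_matrices=False)
k = 1
filtered = (U[:, :k] * s[:k]) @ Vt[:k] + mean

rms_before = float(np.sqrt(np.mean((noisy - signal) ** 2)))
rms_after = float(np.sqrt(np.mean((filtered - signal) ** 2)))
print(f"RMS misfit to true signal: before={rms_before:.3f} after={rms_after:.3f}")
```

Unlike spatial smoothing, this operates across frequencies at each station, so sharp lateral features in the data are preserved as long as they are coherent across the frequency channels.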
NASA Astrophysics Data System (ADS)
Paul, Regina J.
This study examined the success rate of IEEIA inservice training and attempted to identify key variables that influenced successful implementation of the IEEIA curriculum. The study used both quantitative and qualitative methods to obtain in-depth data. The total sample consisted of 251 participants; 132 usable surveys were returned, resulting in a 53% response rate. The quantitative phase of the study consisted of a nine-page survey. The survey was designed to determine the effectiveness of inservice teacher training for the implementation of IEEIA, teachers' implementation, and their perceptions of the effectiveness of the inservice, the impact using IEEIA had on students and themselves, and barriers that prevented complete implementation. Additional analyses examined the relationships between use of the approach and the variables of length and type of training, and the types of support teachers received. The second phase analyzed both comments written by the respondents on their surveys and eight teacher interviews. The research found that teachers perceived their workshops to be moderately to very effective in helping them develop skills related to IEEIA and for teaching them how to implement it with students. Analyses revealed that teachers who received extended training or attended multiple inservices tended to use IEEIA more than teachers who did not. However, the number of years the teachers had been using the approach had a stronger influence on addressing the action components. Over half of the teachers had used the approach. Support after the inservice was important to implementation. The component of having students conduct an actual issue investigation was addressed the most. Fewer teachers addressed the final component of action by having their students resolve the issues they investigated. However, the teachers who fully implemented IEEIA had students who were active in their communities.
Teachers perceived using the approach resulted in positive impacts for students, themselves, and their communities.
Goujon, Nicolas; Devine, Alexandra; Baker, Sally M; Sprunt, Beth; Edmonds, Tanya J; Booth, Jennifer K; Keeffe, Jill E
2014-01-01
A review of existing measurement instruments was conducted to examine their suitability to measure disability prevalence and assess quality of life, protection of disability rights and community participation by people with disabilities, specifically within the context of development programs in low and middle-income countries. From a search of PubMed and the grey literature, potentially relevant measurement instruments were identified and examined for their content and psychometric properties, where possible. Criteria for inclusion were: based on the WHO's International Classification of Functioning Disability and Health (ICF), used quantitative methods, suitable for population-based studies of disability inclusive development, in English, and published after 1990. Characteristics of existing instruments were analysed according to components of the ICF and quality of life domains. Ten instruments were identified and reviewed according to the criteria listed above. Each version of instruments was analysed separately. Only three instruments included a component on quality of life. Domains from the ICF that were addressed by some but not all instruments included the environment, technology and communication. The measurement instruments reviewed covered the range of elements required to measure disability-inclusion within development contexts. However, no single measurement instrument has the capacity to measure both disability prevalence and changes in quality of life according to contemporary disability paradigms. The review of measurement instruments supports the need for developing an instrument specifically intended to measure disability inclusive practice within development programs. Implications for Rehabilitation: Surveys and tools are needed to plan disability inclusive development. Existing measurement tools to determine prevalence of disability, wellbeing, rights and access to the community were reviewed.
No single validated tool exists that is suitable for population-based studies, uses quantitative methods, and applies the components of the ICF to measure prevalence of disability, well-being of people with disability, and their access to their communities. A measurement tool that reflects the UNCRPD and addresses all components of the ICF is needed to assist in disability inclusive development, especially in low and mid resource countries.
NASA Astrophysics Data System (ADS)
Goldman, Daphne; Ben-Zvi Assaraf, Orit; Shemesh, Julia
2014-05-01
While importance of environmental ethics, as a component of sustainable development, in preparing engineers is widely acknowledged, little research has addressed chemical engineers' environmental concerns. This study aimed to address this void by exploring chemical engineering students' values regarding human-nature relationships. The study was conducted with 247 3rd-4th year chemical engineering students in Israeli Universities. It employed the New Ecological Paradigm (NEP)-questionnaire to which students added written explanations. Quantitative analysis of NEP-scale results shows that the students demonstrated moderately ecocentric orientation. Explanations to the NEP-items reveal diverse, ambivalent ideas regarding the notions embodied in the NEP, strong scientific orientation and reliance on technology for addressing environmental challenges. Endorsing sustainability implies that today's engineers be equipped with an ecological perspective. The capacity of Higher Education to enable engineers to develop dispositions about human-nature interrelationships requires adaptation of curricula towards multidisciplinary, integrative learning addressing social-political-economic-ethical perspectives, and implementing critical-thinking within the socio-scientific issues pedagogical approach.
New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks
NASA Astrophysics Data System (ADS)
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazard, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on such infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Because these networks are existing structures, their state over time is also important. If networks are close to chloride sources, deterioration may be a major issue. Information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies have been created analytically for deteriorating networks subjected to seismicity, i.e. earthquakes. A bridge component model has been constructed for these methodologies. The bridge fragilities, which were constructed from data, required a deeper level of analysis as these were relevant for specific structures. Furthermore, the network effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems. Additionally, such a method handles many kinds of system and component problems with singular or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size.
Special network topologies may be more or less computationally difficult, and the resolution of the network also has large effects. To take advantage of some types of topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and at times subjective. To address this, algorithms must be automated and reliable. These hierarchical structures may indicate the structure of the network itself. This risk analysis methodology has been expanded to larger networks using such automated hierarchical structures. Component importance is the most important objective of such network analysis; however, it may only provide the information of which bridges to inspect or repair earliest, and little else. High correlations influence such component importance measures in a negative manner. Additionally, a regional approach is not appropriately modelled. To investigate a more regional view, group importance measures based on hierarchical structures have been created. Such structures may also be used to create regional inspection/repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make both component-level and regional optimal decisions using information from both network function and the further effects of infrastructure deterioration.
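As a minimal illustration of the network-level reliability quantities this abstract discusses, the sketch below enumerates the component failure states of a toy four-node lifeline network and computes the probability that a source community stays connected to a sink. The topology and failure probabilities are invented for illustration; real applications rely on S-RDA, bounding methods, or Monte Carlo sampling on far larger systems.

```python
import itertools

# Toy lifeline network (invented for illustration): nodes are communities,
# edges are bridges with independent failure probabilities.
edges = {("A", "B"): 0.10, ("B", "C"): 0.20, ("A", "C"): 0.30, ("C", "D"): 0.15}

def connected(up_edges, source="A", sink="D"):
    """Depth-first search over the surviving edges."""
    frontier, seen = [source], {source}
    while frontier:
        node = frontier.pop()
        for u, v in up_edges:
            for nxt in ((v,) if u == node else (u,) if v == node else ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return sink in seen

# Exact enumeration of all 2^4 component states (feasible only for small
# networks; large systems need the scalable methods named above).
p_connect = 0.0
for alive_states in itertools.product([True, False], repeat=len(edges)):
    prob, up = 1.0, []
    for (edge, p_fail), alive in zip(edges.items(), alive_states):
        prob *= (1.0 - p_fail) if alive else p_fail
        if alive:
            up.append(edge)
    if connected(up):
        p_connect += prob
print(round(p_connect, 4))  # → 0.7786
```

The exponential growth in the number of states with network size is exactly why the dissertation's hierarchical decomposition into smaller, densely connected subnetworks matters computationally.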
Hispanic nurses' experiences of bias in the workplace.
Moceri, Joane T
2014-01-01
The continuing issue of health inequity for Hispanics highlights the importance of retaining Hispanic nurses in the workplace. This article describes the use of short answers such as "Describe the bias you experienced" and "If a patient refused care, what was the reason given?" to increase understandings about bias through the descriptions of Hispanic nurses. In this study, bias was defined as those implicit negative stereotypes and attitudes that negatively affect judgments about, evaluations of, and actions toward others. For this qualitative component of a descriptive study employing both qualitative and quantitative methods, 111 Hispanic nurses responded to open-ended questions about experiences of bias that were included with a survey tool and demographic questionnaire. Three themes emerged: being overlooked and undervalued, having to prove competency, and living with "only-ness." Respect was an overarching concept. The written descriptions of bias provided depth and understanding to the quantitative findings. Nurse leaders are well positioned to develop and implement strategies to more effectively support Hispanic nurses and to promote nonbiased interactions in the workplace. Retaining Hispanic nurses is a vital component to address issues of health inequity for Hispanic patients.
NASA Astrophysics Data System (ADS)
Reed, D. E.; Lyford, M.; Schmidt, L. O.; Bowles-Terry, M.
2012-12-01
Climate change education presents many challenges to college educators due to the interdisciplinary nature of the issue as well as its social and political context and implications. This presents multiple barriers to learning for the student, both because it is difficult to address all scientific components in one course and because many students have strong preconceived feelings or beliefs about climate change. A further barrier to learning for non-science majors is that very often the number of required science courses is low, and a highly complex issue such as climate change is difficult to address in introductory science courses. To attempt to address these issues, a course for non-science majors, Life Science 1002, Discovering Science, at the University of Wyoming was created as an interdisciplinary and integrated science course that includes a lecture component as well as weekly lab and discussion sections. Our previous work has shown a clear change in the reference sources used by non-science majors when referring to complex topics; namely, students increase their use of scientific journals when they are shown how to use scientific journals, and students also report a correlated decrease in non-peer-reviewed sources (i.e., radio, newspapers, TV). We seek to expand on this work by using pre- and post-topic student surveys in the course at the University of Wyoming to directly measure student performance in different components of the course. The course has enrollment between 120 and 130 students, with nearly equal distribution between grade levels and a wide sampling of non-science majors or undeclared majors. For this work we will use a non-quantitative survey of students to find out which part of the course (lecture, lab or discussion) is most effective for student learning.
Further, a quantitative analysis will correlate characteristics of the student body (class standing, major, gender, background, and personal beliefs) to help predict which students achieved the best results, and a non-quantitative analysis will determine which sections of the student body had difficulties and why. This work will show other higher education instructors both the methodology and the results from this study of an interdisciplinary course on climate change. While this work is limited in focusing on only one introductory course, the large number of students and the diversity of those students allow for a study of which factors in the course are best for student learning.
Application Of Interferometry To Optical Components And Systems Evaluation
NASA Astrophysics Data System (ADS)
Houston, Joseph B., Jr.
1982-05-01
Interferometry provides opticians and lens designers with the ability to evaluate optical components and systems quantitatively. A variety of interferometers and interferometric test procedures have evolved over the past several decades. This evolution has stimulated an ever-increasing amount of interest in using a new generation of instrumentation and computer software for solving cost and schedule problems both in the shop and at field test sites. Optical engineers and their customers continue to gain confidence in their abilities to perform several operations such as assure component quality, analyze and optimize lens assemblies, and accurately predict end-item performance. In this paper, a set of typical test situations are addressed and some standard instrumentation is described, as a means of illustrating the special advantages of interferometric testing. Emphasis will be placed on the proper application of currently available hardware and some of the latest proven techniques.
Probabilistic Characterization of Adversary Behavior in Cyber Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyers, C A; Powers, S S; Faissol, D M
2009-10-08
The objective of this SMS effort is to provide a probabilistic characterization of adversary behavior in cyber security. This includes both quantitative (data analysis) and qualitative (literature review) components. A set of real LLNL email data was obtained for this study, consisting of several years' worth of unfiltered traffic sent to a selection of addresses at ciac.org. The email data were subjected to three interrelated analyses: a textual study of the header data and subject matter, an examination of threats present in message attachments, and a characterization of the maliciousness of embedded URLs.
Prescott, Jeffrey William
2013-02-01
The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve the sensitivity, specificity, accuracy, and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation, and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD), and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.
USDA-ARS?s Scientific Manuscript database
The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...
Nuclear physics: quantitative single-cell approaches to nuclear organization and gene expression.
Lionnet, T; Wu, B; Grünwald, D; Singer, R H; Larson, D R
2010-01-01
The internal workings of the nucleus remain a mystery. A list of component parts exists, and in many cases their functional roles are known for events such as transcription, RNA processing, or nuclear export. Some of these components exhibit structural features in the nucleus, regions of concentration or bodies that have given rise to the concept of functional compartmentalization--that there are underlying organizational principles to be described. In contrast, a picture is emerging in which transcription appears to drive the assembly of the functional components required for gene expression, drawing from pools of excess factors. Unifying this seemingly dual nature requires a more rigorous approach, one in which components are tracked in time and space and correlated with onset of specific nuclear functions. In this chapter, we anticipate tools that will address these questions and provide the missing kinetics of nuclear function. These tools are based on analyzing the fluctuations inherent in the weak signals of endogenous nuclear processes and determining values for them. In this way, it will be possible eventually to provide a computational model describing the functional relationships of essential components.
A hybrid spatial-spectral denoising method for infrared hyperspectral images using 2DPCA
NASA Astrophysics Data System (ADS)
Huang, Jun; Ma, Yong; Mei, Xiaoguang; Fan, Fan
2016-11-01
The traditional noise reduction methods for 3-D infrared hyperspectral images typically operate independently in either the spatial or spectral domain, and such methods overlook the relationship between the two domains. To address this issue, we propose a hybrid spatial-spectral method in this paper to link both domains. First, principal component analysis and bivariate wavelet shrinkage are performed in the 2-D spatial domain. Second, 2-D principal component analysis transformation is conducted in the 1-D spectral domain to separate the basic components from the detail components. The energy distribution of noise is unaffected by orthogonal transformation; therefore, the signal-to-noise ratio of each component is used as a criterion to determine whether a component should be protected from over-denoising or denoised with certain 1-D denoising methods. This study implements the 1-D wavelet shrinking threshold method based on Stein's unbiased risk estimator, and quantitative results on publicly available datasets demonstrate that our method achieves better denoising performance than other state-of-the-art methods.
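The per-component SNR criterion described above can be sketched in a few lines; this is a hedged illustration, assuming a pixels-by-bands data matrix, with plain soft-thresholding standing in for the paper's SURE-based wavelet shrinkage (the function name, threshold values, and noise-floor estimate are all invented for illustration):

```python
import numpy as np

def snr_guided_denoise(X, snr_keep_db=15.0, lam=0.5):
    """Decorrelate spectra with a PCA-style orthogonal transform, keep
    high-SNR components untouched, and soft-threshold the low-SNR ones.

    X: (n_pixels, n_bands) hyperspectral data, one spectrum per row."""
    mu = X.mean(axis=0)
    Xc = X - mu
    cov = Xc.T @ Xc / (len(X) - 1)
    w, V = np.linalg.eigh(cov)                    # eigenvalues, ascending
    scores = Xc @ V                               # component "images"
    noise_var = np.median(w)                      # crude noise-floor estimate
    snr_db = 10.0 * np.log10(np.maximum(w, 1e-12) / noise_var)
    for k in np.where(snr_db < snr_keep_db)[0]:   # denoise only weak components
        s = scores[:, k]
        scores[:, k] = np.sign(s) * np.maximum(np.abs(s) - lam, 0.0)
    return scores @ V.T + mu

# Toy demonstration: rank-1 "spectra" plus band noise
rng = np.random.default_rng(1)
clean = np.outer(rng.standard_normal(200), rng.standard_normal(20))
noisy = clean + 0.3 * rng.standard_normal(clean.shape)
denoised = snr_guided_denoise(noisy)
```

Because orthogonal transforms preserve the noise energy, the SNR of each component score is a meaningful gate: the high-variance signal component is passed through, while near-noise-floor components are shrunk.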
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uneri, A; De Silva, T; Goerres, J
Purpose: Intraoperative x-ray radiography/fluoroscopy is commonly used to qualitatively assess delivery of surgical devices (e.g., spine pedicle screws) but can fail to reliably detect suboptimal placement (e.g., breach of adjacent critical structures). We present a method wherein prior knowledge of the patient and surgical components is leveraged to match preoperative CT and intraoperative radiographs for quantitative assessment of 3D pose. The method presents a new means of operating room quantitative quality assurance (ORQA) that could improve quality and safety and reduce the frequency of revision surgeries. Methods: The algorithm (known-component registration, KC-Reg) uses patient-specific preoperative CT and parametrically defined surgical component models within a robust 3D-2D registration method to iteratively optimize gradient similarity using the covariance matrix adaptation evolution strategy. Advances from previous work address key challenges to clinical translation: i) absolving the need for offline geometric calibration of the C-arm; and ii) solving multiple component bodies simultaneously, thereby allowing QA in a single step (e.g., a spinal construct with 4–20 screws), rather than sequential QA of each component. Performance was tested in a spine phantom with 10 pedicle screws, and first results from clinical studies are reported. Results: Phantom experiments demonstrated median target registration error (TRE) of (1.0±0.3) mm at the screw tip and (0.7°±0.4°) in angulation. The simultaneous multi-body registration approach improved TRE from the previous (sequential) method by 42%, reduced outliers, and fits into the natural workflow. Initial application of KC-Reg in clinical data shows TRE of (2.5±4.5) mm and (4.7°±0.5°). Conclusion: The KC-Reg algorithm offers a potentially valuable method for quantitative QA of the surgical product, using radiographic systems that are already within the surgical arsenal.
For spine surgery, the method offers a near-real-time independent check on the quality of the surgical product, facilitating immediate revision if necessary and potentially avoiding postoperative morbidity and/or revision surgery. Gerhard Kleinszig and Sebastian Vogt are employees of Siemens Healthcare.
Winters, J M
1995-01-01
A perspective is offered on rehabilitation engineering educational strategies, with a focus on the bachelor's and master's levels. Ongoing changes in engineering education are summarized, especially as related to the integration of design and computers throughout the curriculum; most of these changes positively affect rehabilitation engineering training. The challenge of identifying long-term "niches" for rehabilitation engineers within a changing rehabilitation service delivery process is addressed. Five key training components are identified and developed: core science and engineering knowledge, synthesized open-ended problem-solving skill development, hands-on design experience, rehabilitation breadth exposure, and a clinical internship. Two unique abilities are identified that help demarcate the engineer from other providers: open-ended problem-solving skills that include quantitative analysis when appropriate, and objective quantitative evaluation of human performance. Educational strategies for developing these abilities are addressed. Finally, a case is made for training "hybrid" engineer/therapists, in particular bachelor-level engineers who go directly to graduate school to become certified orthotists/prosthetists or physical/occupational therapists, pass the RESNA-sponsored assistive technology service provision exam along the way, and then later in life obtain a professional engineer's license and an engineering master's degree.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Jicheng; Gaffrey, Matthew J.; Qian, Wei-Jun
Protein cysteine thiols play a crucial role in redox signaling, regulation of enzymatic activity and protein function, and maintaining redox homeostasis in living systems. The unique chemical reactivity of thiol groups makes cysteine susceptible to oxidative modifications by reactive oxygen and nitrogen species to form a broad array of reversible and irreversible protein post-translational modifications (PTMs). The reversible modifications in particular are one of the major components of redox signaling and are involved in regulation of various cellular processes under physiological and pathological conditions. The biological significance of these redox PTMs in health and diseases has been increasingly recognized. Herein, we review the recent advances of quantitative proteomic approaches for investigating redox PTMs in complex biological systems, including the general considerations of sample processing, various chemical or affinity enrichment strategies, and quantitative approaches. We also highlight a number of redox proteomic approaches that enable effective profiling of redox PTMs for addressing specific biological questions. Although some technological limitations remain, redox proteomics is paving the way towards a better understanding of redox signaling and regulation in human health and diseases.
Probabilistic modeling of discourse-aware sentence processing.
Dubey, Amit; Keller, Frank; Sturt, Patrick
2013-07-01
Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.
Delgado Reyes, Lourdes M; Bohache, Kevin; Wijeakumar, Sobanawartiny; Spencer, John P
2018-04-01
Motion artifacts are often a significant component of the measured signal in functional near-infrared spectroscopy (fNIRS) experiments. A variety of methods have been proposed to address this issue, including principal components analysis (PCA), correlation-based signal improvement (CBSI), wavelet filtering, and spline interpolation. The efficacy of these techniques has been compared using simulated data; however, our understanding of how these techniques fare when dealing with task-based cognitive data is limited. Brigadoi et al. compared motion correction techniques in a sample of adult data measured during a simple cognitive task. Wavelet filtering showed the most promise as an optimal technique for motion correction. Given that fNIRS is often used with infants and young children, it is critical to evaluate the effectiveness of motion correction techniques directly with data from these age groups. This study addresses that problem by evaluating motion correction algorithms implemented in HOMER2. The efficacy of each technique was compared quantitatively using objective metrics related to the physiological properties of the hemodynamic response function (HRF). Results showed that targeted PCA (tPCA), spline, and CBSI retained a higher number of trials. These techniques also performed well in direct head-to-head comparisons with the other approaches using quantitative metrics. The CBSI method corrected many of the artifacts present in our data; however, this approach sometimes produced unstable HRFs. The targeted PCA and spline methods proved to be the most robust, performing well across all comparison metrics. When compared head to head, tPCA consistently outperformed spline. We conclude, therefore, that tPCA is an effective technique for correcting motion artifacts in fNIRS data from young children.
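Of the techniques compared above, CBSI has a particularly compact closed form (following Cui et al.'s formulation): it assumes task-evoked HbO and HbR are anticorrelated while motion drives them in the same direction, and rebuilds both from their rescaled difference. A minimal numpy sketch with invented toy signals:

```python
import numpy as np

def cbsi(hbo, hbr):
    """Correlation-based signal improvement: suppress the common (motion)
    component shared by HbO and HbR, keeping their anticorrelated part.
    alpha rescales HbR to HbO's amplitude; corrected HbR is -HbO/alpha."""
    alpha = np.std(hbo) / np.std(hbr)
    x = (hbo - alpha * hbr) / 2.0
    return x, -x / alpha

# Toy example: one slow hemodynamic cycle plus a shared motion spike
t = np.linspace(0.0, 10.0, 500)
hrf = np.sin(2 * np.pi * 0.1 * t)                 # toy hemodynamic response
motion = np.zeros_like(t)
motion[200:210] = 4.0                             # spike common to both signals
hbo = hrf + motion
hbr = -0.5 * hrf + motion                         # anticorrelated HbR + artifact
hbo_c, hbr_c = cbsi(hbo, hbr)
```

By construction the corrected signals are perfectly anticorrelated, and the shared spike is strongly attenuated; the instability the abstract mentions arises when the anticorrelation assumption fails for real data.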
Gravitropic responses of plants in the absence of a complicating G-force (6-IML-1)
NASA Technical Reports Server (NTRS)
Brown, Allan H.
1992-01-01
On the Earth it is patently impossible to measure any tropistic, physiologic, or morphogenic reactions to environmental stimuli without taking into account our planet's gravitational influence on the time course of the test subject's response. It follows that all published reports of quantitative measurements of such responses must have been contaminated by an additional gravity-dependent component which probably was not trivial. Our research effort has as its principal scientific objective the acquisition of experimental data from tests in a microgravity environment that will address a number of basic questions about plants' gravitropic responses to the perception of transversely applied g forces in the hypogravity range, from essentially zero to unit g. Comparable tests on Earth, but in the same flight hardware, referred to as the Gravitational Plant Physiology Facility (GPPF), will provide 1 g data for various useful comparisons. Four specific scientific questions are addressed.
Liu, Wenjing; Song, Qingqing; Yan, Yu; Liu, Yao; Li, Peng; Wang, Yitao; Tu, Pengfei; Song, Yuelin; Li, Jun
2018-08-03
Although far from perfect, it is practical to assess the quality of a given herbal medicine (HM) through simultaneous determination of a panel of components. However, the confidence of quantitative outcomes from LC-MS/MS platforms is limited by several technical barriers, such as chemical degradation, polarity range, concentration span, and identity misrecognition. Herein, we attempted to circumvent these obstacles by integrating several fit-for-purpose techniques, including online extraction (OLE), serially coupled reversed phase LC-hydrophilic interaction liquid chromatography (RPLC-HILIC), tailored multiple reaction monitoring (MRM), and relative response vs. collision energy curve (RRCEC) matching. Confidence-enhanced quantitative analysis of Cistanche salsa (Csa), a well-known psammophytic species and tonic herbal medicine, was conducted as a proof of concept. The OLE module was deployed to prevent chemical degradation, in particular E/Z-configuration transformation of phenylethanoid glycosides. Satisfactory retention was achieved for each analyte regardless of polarity because of successive passage through RPLC and HILIC columns. Optimum MRM parameters for the minor components, alongside deliberately suboptimal ones for the abundant ingredients, ensured that all contents fell within the linear ranges. The unequivocal assignment of the captured signals was achieved by matching retention times, ion transitions, and, more importantly, RRCECs between authentic compounds and suspect peaks. Diverse validation assays demonstrated the newly developed method to be reliable. In particular, the distribution of mannitol rather than galactitol was disclosed even though these isomers showed identical retention times and ion transitions. The contents of 21 compounds of interest were definitively determined in Csa as well as two analogous species, and the quantitative patterns varied greatly not only among different species but also among different Csa samples.
Together, the fortification of OLE-RPLC-HILIC-tailored MRM with RRCEC matching could fully address the demands from confidence-enhanced quantitative analysis of HMs. Copyright © 2018 Elsevier B.V. All rights reserved.
Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter
2017-02-01
Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.
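As a concrete illustration of the feature-extraction step discussed above, a few standard first-order (histogram) radiomics features can be computed from a segmented region of interest. This is a minimal sketch; the 32-bin discretization, the feature subset, and the function name are arbitrary illustrative choices, and real pipelines would follow a standardized extraction protocol:

```python
import numpy as np

def first_order_features(roi, bins=32):
    """First-order (histogram) features of a segmented region of interest."""
    vals = np.asarray(roi, dtype=float).ravel()
    hist, _ = np.histogram(vals, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                                   # drop empty bins, avoid log(0)
    return {
        "mean": vals.mean(),
        "std": vals.std(),
        "skewness": ((vals - vals.mean()) ** 3).mean() / vals.std() ** 3,
        "entropy": -(p * np.log2(p)).sum(),        # histogram entropy, in bits
    }

rng = np.random.default_rng(0)
roi = rng.normal(100.0, 15.0, size=(16, 16))       # hypothetical intensity values
feats = first_order_features(roi)
```

Note that the bin count directly changes the entropy value, which illustrates the review's point about feature dependence on extraction settings.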
Estimated long-term outdoor air pollution concentrations in a cohort study
NASA Astrophysics Data System (ADS)
Beelen, Rob; Hoek, Gerard; Fischer, Paul; Brandt, Piet A. van den; Brunekreef, Bert
Several recent studies associated long-term exposure to air pollution with increased mortality. An ongoing cohort study, the Netherlands Cohort Study on Diet and Cancer (NLCS), was used to study the association between long-term exposure to traffic-related air pollution and mortality. Following a previous exposure assessment study in the NLCS, we improved the exposure assessment methods. Long-term exposure to nitrogen dioxide (NO2), nitrogen oxide (NO), black smoke (BS), and sulphur dioxide (SO2) was estimated. Exposure at each home address (N = 21,868) was considered as a function of a regional, an urban, and a local component. The regional component was estimated using inverse distance weighted interpolation of measurement data from regional background sites in a national monitoring network. Regression models with urban concentrations as dependent variables, and number of inhabitants in different buffers and land use variables, derived with a Geographic Information System (GIS), as predictor variables, were used to estimate the urban component. The local component was assessed using a GIS and a digital road network with linked traffic intensities. Traffic intensity on the nearest road and on the nearest major road, and the sum of traffic intensities in a buffer of 100 m around each home address, were assessed. Further, a quantitative estimate of the local component was derived. The regression models to estimate the urban component explained 67%, 46%, 49%, and 35% of the variances of NO2, NO, BS, and SO2 concentrations, respectively. Overall regression models incorporating the regional, urban, and local components explained 84%, 44%, 59%, and 56% of the variability in concentrations for NO2, NO, BS, and SO2, respectively. We were able to develop an exposure assessment model using GIS methods and traffic intensities that explained a large part of the variation in outdoor air pollution concentrations.
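The regional-component step above (inverse distance weighted interpolation of background-site measurements to each home address) can be sketched as follows; a minimal illustration in which the coordinates, concentrations, and the distance power are invented:

```python
import numpy as np

def idw(xy_stations, values, xy_targets, power=2.0, eps=1e-12):
    """Inverse distance weighted interpolation.

    xy_stations: (n, 2) background-site coordinates
    values:      (n,)   measured concentrations (e.g. NO2)
    xy_targets:  (m, 2) estimation points (e.g. home addresses)
    """
    # pairwise distances between every target and every station
    d = np.linalg.norm(xy_targets[:, None, :] - xy_stations[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)          # nearer stations get larger weights
    return (w * values).sum(axis=1) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
no2 = np.array([20.0, 30.0, 25.0])        # hypothetical background NO2 levels
homes = np.array([[5.0, 5.0]])            # equidistant from all three stations
estimate = idw(stations, no2, homes)
print(estimate)                           # → [25.] (plain average here)
```

Because the example home is equidistant from all stations, the weights are equal and IDW reduces to a plain average; in general, nearby stations dominate the estimate.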
Quantitative risk analysis of oil storage facilities in seismic areas.
Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto
2005-08-31
Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e., the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
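Crossing a fragility curve with PSHA output, as described above, amounts to a probability-weighted sum over intensity levels. A minimal sketch assuming a lognormal fragility and a discretized hazard curve (all parameter values are invented and are not from the study):

```python
import math

def lognormal_fragility(im, theta, beta):
    """P(failure | intensity measure = im) for a lognormal fragility curve.
    theta: median capacity; beta: logarithmic standard deviation."""
    z = math.log(im / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def seismic_failure_probability(ims, annual_probs, theta, beta):
    """Cross a discretized hazard curve (annual probability per intensity bin)
    with the component fragility to get an annual failure probability."""
    return sum(lognormal_fragility(im, theta, beta) * p
               for im, p in zip(ims, annual_probs))

# Hypothetical hazard: annual probability of PGA (in g) falling in each bin
ims = [0.1, 0.2, 0.4, 0.8]
annual_probs = [1e-2, 3e-3, 8e-4, 1e-4]
pf = seismic_failure_probability(ims, annual_probs, theta=0.5, beta=0.4)
print(f"{pf:.2e}")
```

The resulting annual failure probability then feeds the consequence analysis as the initiating-event frequency for loss-of-containment scenarios.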
Cuervo, Adriana; Valdés, Lorena; Salazar, Nuria; de los Reyes-Gavilán, Clara G; Ruas-Madiedo, Patricia; Gueimonde, Miguel; González, Sonia
2014-06-11
Several studies have addressed the use of dietary fibers in the modulation of intestinal microbiota; however, information about other highly correlated components in foods, such as polyphenols, is scarce. The aim of this work was to explore the association between the intake of fibers and polyphenols from a regular diet and fecal microbiota composition in 38 healthy adults. Food intake was recorded using an annual food frequency questionnaire (FFQ). Quantification of microbial populations in feces was performed by quantitative PCR. A negative association was found between the intake of pectins and flavanones from oranges and the levels of Blautia coccoides and Clostridium leptum. By contrast, white bread, providing hemicellulose and resistant starch, was directly associated with Lactobacillus. Because some effects on intestinal microbiota attributed to isolated fibers or polyphenols might be modified by other components present in the same food, future research should be focused on diet rather than individual compounds.
Some Epistemological Considerations Concerning Quantitative Analysis
ERIC Educational Resources Information Center
Dobrescu, Emilian
2008-01-01
This article presents the author's address at the 2007 "Journal of Applied Quantitative Methods" ("JAQM") prize awarding festivity. The festivity was included in the opening of the 4th International Conference on Applied Statistics, November 22, 2008, Bucharest, Romania. In the address, the author reflects on three theses that…
The Use of Mixed Methods for Therapeutic Massage Research
Porcino, Antony Joseph; Verhoef, Marja J.
2010-01-01
Mixed methods research is the integration of quantitative and qualitative components in a research project. Whether you are reading or designing a mixed methods research project, it is important to be familiar with both qualitative and quantitative research methods and the specific purposes for which they are brought together in a study: triangulation, complementarity, expansion, initiation, or development. In addition, decisions need to be made about the sequencing and the priority or importance of each qualitative and quantitative component relative to the other components, and the point or points at which the various qualitative and quantitative components will be integrated. Mixed methods research is increasingly being recognized for its ability to bring multiple points of view to a research project, taking advantage of the strengths of each of the quantitative and qualitative components to explain or resolve complex phenomena or results. This ability becomes critical when complex healing systems such as therapeutic massage are being studied. Complex healing systems may have multiple physiologic effects, often reflected in changes throughout the patient’s body. Additionally, the patient’s experience of the treatment may be an important outcome. PMID:21589698
Nonlocal low-rank and sparse matrix decomposition for spectral CT reconstruction
NASA Astrophysics Data System (ADS)
Niu, Shanzhou; Yu, Gaohang; Ma, Jianhua; Wang, Jing
2018-02-01
Spectral computed tomography (CT) has been a promising technique in research and clinics because of its ability to produce improved energy resolution images with narrow energy bins. However, the narrow energy bin image is often affected by serious quantum noise because of the limited number of photons used in the corresponding energy bin. To address this problem, we present an iterative reconstruction method for spectral CT using nonlocal low-rank and sparse matrix decomposition (NLSMD), which exploits the self-similarity of patches that are collected in multi-energy images. Specifically, each set of patches can be decomposed into a low-rank component and a sparse component, and the low-rank component represents the stationary background over different energy bins, while the sparse component represents the rest of the different spectral features in individual energy bins. Subsequently, an effective alternating optimization algorithm was developed to minimize the associated objective function. To validate and evaluate the NLSMD method, qualitative and quantitative studies were conducted by using simulated and real spectral CT data. Experimental results show that the NLSMD method improves spectral CT images in terms of noise reduction, artifact suppression and resolution preservation.
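The low-rank/sparse split at the heart of the method can be illustrated with a simple alternating scheme: singular-value soft-thresholding for the stationary background and elementwise soft-thresholding for the bin-specific features. This is a generic RPCA-style sketch under invented penalty values, not the paper's exact NLSMD objective or optimizer:

```python
import numpy as np

def low_rank_sparse_split(M, lam=0.3, tau=2.0, n_iter=50):
    """Alternate between singular-value soft-thresholding (low-rank part L)
    and elementwise soft-thresholding of the residual (sparse part S)."""
    S = np.zeros_like(M)
    for _ in range(n_iter):
        U, sig, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U * np.maximum(sig - tau, 0.0)) @ Vt          # shrink singular values
        R = M - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)  # shrink entries
    return L, S

# Rank-1 "background" shared across energy bins plus one bin-specific spike
M = np.ones((20, 15))
M[3, 4] += 5.0
L, S = low_rank_sparse_split(M)
```

In the paper's setting the matrix rows would be matched patches across energy bins; here the shared background lands in L while the localized spike is absorbed by S.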
A Systematic Review of Quantitative Resilience Measures for Water Infrastructure Systems
Shin, Sangmin; Lee, Seungyub; Judi, David; ...
2018-02-07
Over the past few decades, the concept of resilience has emerged as an important consideration in the planning and management of water infrastructure systems. Accordingly, various resilience measures have been developed for the quantitative evaluation and decision-making of systems. There are, however, numerous considerations and no clear choice of which measure, if any, provides the most appropriate representation of resilience for a given application. This study provides a critical review of quantitative approaches to measure the resilience of water infrastructure systems, with a focus on water resources and distribution systems. A compilation of 11 criteria evaluating 21 selected resilience measures addressing major features of resilience is developed using the Axiomatic Design process. Existing gaps of resilience measures are identified based on the review criteria. The results show that resilience measures have generally paid less attention to cascading damage to interrelated systems, rapid identification of failure, physical damage of system components, and time variation of resilience. Concluding the paper, improvements to resilience measures are recommended. The findings contribute to our understanding of gaps and provide information to help further improve resilience measures of water infrastructure systems.
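One widely used family of measures that such reviews compare integrates a normalized system-performance curve over the observation window (the "resilience triangle" style). A minimal sketch with invented performance values; it deliberately ignores the time-variation and cascading-damage aspects the review flags as gaps:

```python
def resilience_index(performance, dt=1.0):
    """Area under the normalized performance curve Q(t) in [0, 1], divided
    by the observation window; 1.0 means no loss of service."""
    total_time = dt * (len(performance) - 1)
    # trapezoidal integration of Q(t)
    area = sum((performance[i] + performance[i + 1]) * dt / 2.0
               for i in range(len(performance) - 1))
    return area / total_time

# Hypothetical disruption at t=2 h with full restoration by t=5 h
q = [1.0, 1.0, 0.4, 0.6, 0.8, 1.0, 1.0]
print(round(resilience_index(q), 3))   # → 0.8
```

Deeper drops or slower restoration shrink the area under Q(t) and hence the index, which is what makes it a convenient scalar for comparing recovery strategies.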
ERIC Educational Resources Information Center
Francis, Leslie J.; Croft, Jennifer S.; Pyke, Alice; Robbins, Mandy
2012-01-01
This essay discusses the design of the quantitative component of the "Young People's Attitudes to Religious Diversity" project, conceived by Professor Robert Jackson within the Warwick Religions and Education Research Unit, and presents some preliminary findings from the data. The quantitative component followed and built on the…
Bishop, Felicity L
2015-02-01
To outline some of the challenges of mixed methods research and illustrate how they can be addressed in health psychology research. This study critically reflects on the author's previously published mixed methods research and discusses the philosophical and technical challenges of mixed methods, grounding the discussion in a brief review of methodological literature. Mixed methods research is characterized as having philosophical and technical challenges; the former can be addressed by drawing on pragmatism, the latter by considering formal mixed methods research designs proposed in a number of design typologies. There are important differences among the design typologies, which provide diverse examples of designs that health psychologists can adapt for their own mixed methods research. There are also similarities; in particular, many typologies explicitly orient to the technical challenges of deciding on the respective timing of qualitative and quantitative methods and the relative emphasis placed on each method. Characteristics, strengths, and limitations of different sequential and concurrent designs are identified by reviewing five mixed methods projects, each conducted for a different purpose. Adapting formal mixed methods designs can help health psychologists address the technical challenges of mixed methods research and identify the approach that best fits the research questions and purpose. This does not obviate the need to address the philosophical challenges of mixing qualitative and quantitative methods. Statement of contribution What is already known on this subject? Mixed methods research poses philosophical and technical challenges. Pragmatism is a popular approach to the philosophical challenges, while diverse typologies of mixed methods designs can help address the technical challenges. Examples of mixed methods research can be hard to locate when component studies from mixed methods projects are published separately. What does this study add?
Critical reflections on the author's previously published mixed methods research illustrate how a range of different mixed methods designs can be adapted and applied to address health psychology research questions. The philosophical and technical challenges of mixed methods research should be considered together and in relation to the broader purpose of the research. © 2014 The British Psychological Society.
A Methodology for Quantifying Certain Design Requirements During the Design Phase
NASA Technical Reports Server (NTRS)
Adams, Timothy; Rhodes, Russel
2005-01-01
A methodology for developing and balancing quantitative design requirements for safety, reliability, and maintainability has been proposed. Conceived as the basis of a more rational approach to the design of spacecraft, the methodology would also be applicable to the design of automobiles, washing machines, television receivers, or almost any other commercial product. Heretofore, it has been common practice to start by determining the requirements for reliability of elements of a spacecraft or other system to ensure a given design life for the system. Next, safety requirements are determined by assessing the total reliability of the system and adding redundant components and subsystems necessary to attain safety goals. As thus described, common practice leaves the maintainability burden to chance; therefore, there is no control of recurring costs or of the responsiveness of the system. The means that have been used in assessing maintainability have been oriented toward determining the logistical sparing of components so that the components are available when needed. The process established for developing and balancing quantitative requirements for safety (S), reliability (R), and maintainability (M) derives and integrates NASA's top-level safety requirements and the controls needed to obtain program key objectives for safety and recurring cost (see figure). Being quantitative, the process conveniently uses common mathematical models. Even though the process is shown as being worked from the top down, it can also be worked from the bottom up. This process uses three math models: (1) the binomial distribution (greater-than-or-equal-to case), (2) reliability for a series system, and (3) the Poisson distribution (less-than-or-equal-to case). The zero-fail case for the binomial distribution approximates the commonly known exponential distribution or "constant failure rate" distribution. Either model can be used. 
The binomial distribution was selected for modeling flexibility because it conveniently addresses both the zero-fail and failure cases. The failure case is typically used for unmanned spacecraft as with missiles.
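The three mathematical models named in the abstract can be sketched directly. The functions below are minimal implementations of the stated distributions, not code from the NASA report; the series-system formula assumes independent components, and all numbers are illustrative.

```python
from math import comb, exp, factorial

def binomial_at_least(k, n, p):
    # "greater-than-or-equal-to" case: P(X >= k) for X ~ Binomial(n, p)
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def series_reliability(reliabilities):
    # a series system works only if every (independent) component works
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def poisson_at_most(k, lam):
    # "less-than-or-equal-to" case: P(X <= k) for X ~ Poisson(lam)
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

# Zero-fail binomial case: P(no failures in n trials) = (1 - p)**n, which
# approximates the constant-failure-rate (exponential) model exp(-n*p)
# when the per-trial failure probability p is small.
zero_fail = (1 - 0.001) ** 100   # close to exp(-0.1)
```

As the abstract notes, either the zero-fail binomial or the exponential form can be used; for p = 0.001 over 100 trials the two agree to roughly four decimal places.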
Design and analysis of quantitative differential proteomics investigations using LC-MS technology.
Bukhman, Yury V; Dharsee, Moyez; Ewing, Rob; Chu, Peter; Topaloglou, Thodoros; Le Bihan, Thierry; Goh, Theo; Duewel, Henry; Stewart, Ian I; Wisniewski, Jacek R; Ng, Nancy F
2008-02-01
Liquid chromatography-mass spectrometry (LC-MS)-based proteomics is becoming an increasingly important tool in characterizing the abundance of proteins in biological samples of various types and across conditions. Effects of disease or drug treatments on protein abundance are of particular interest for the characterization of biological processes and the identification of biomarkers. Although state-of-the-art instrumentation is available to make high-quality measurements and commercial software is available to process the data, the complexity of the technology and data presents challenges for bioinformaticians and statisticians. Here, we describe a pipeline for the analysis of quantitative LC-MS data. Key components of this pipeline include experimental design (sample pooling, blocking, and randomization) as well as deconvolution and alignment of mass chromatograms to generate a matrix of molecular abundance profiles. An important challenge in LC-MS-based quantitation is to be able to accurately identify and assign abundance measurements to members of protein families. To address this issue, we implement a novel statistical method for inferring the relative abundance of related members of protein families from tryptic peptide intensities. This pipeline has been used to analyze quantitative LC-MS data from multiple biomarker discovery projects. We illustrate our pipeline here with examples from two of these studies, and show that the pipeline constitutes a complete workable framework for LC-MS-based differential quantitation. Supplementary material is available at http://iec01.mie.utoronto.ca/~thodoros/Bukhman/.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Halls, Benjamin R.; Meyer, Terrence R.; Kastengren, Alan L.
2015-01-01
The complex geometry and large index-of-refraction gradients that occur near the point of impingement of binary liquid jets present a challenging environment for optical interrogation. A simultaneous quadruple-tracer x-ray fluorescence and line-of-sight radiography technique is proposed as a means of distinguishing and quantifying individual liquid component distributions prior to, during, and after jet impact. Two different pairs of fluorescence tracers are seeded into each liquid stream to maximize their attenuation ratio for reabsorption correction and differentiation of the two fluids during mixing. This approach for instantaneous correction of x-ray fluorescence reabsorption is compared with a more time-intensive approach of using stereographic reconstruction of x-ray attenuation along multiple lines of sight. The proposed methodology addresses the need for a quantitative measurement technique capable of interrogating optically complex, near-field liquid distributions in many mixing systems of practical interest involving two or more liquid streams.
Qualitative research and the profound grasp of the obvious.
Hurley, R E
1999-01-01
OBJECTIVE: To discuss the value of promoting coexistent and complementary relationships between qualitative and quantitative research methods as illustrated by presentations made by four respected health services researchers who described their experiences in multi-method projects. DATA SOURCES: Presentations and publications related to the four research projects, which described key substantive and methodological areas that had been addressed with qualitative techniques. PRINCIPAL FINDINGS: Sponsor interest in timely, insightful, and reality-anchored evidence has provided a strong base of support for the incorporation of qualitative methods into major contemporary policy research studies. In addition, many issues may be suitable for study only with qualitative methods because of their complexity, their emergent nature, or because of the need to revisit and reexamine previously untested assumptions. CONCLUSION: Experiences from the four projects, as well as from other recent health services studies with major qualitative components, support the assertion that the interests of sponsors in the policy realm and pressure from them suppress some of the traditional tensions and antagonisms between qualitative and quantitative methods. PMID:10591276
Shuey, Megan M; Drees, Kevin P; Lindner, Daniel L; Keim, Paul; Foster, Jeffrey T
2014-03-01
White-nose syndrome is a fungal disease that has decimated bat populations across eastern North America. Identification of the etiologic agent, Pseudogymnoascus destructans (formerly Geomyces destructans), in environmental samples is essential to proposed management plans. A major challenge is the presence of closely related species, which are ubiquitous in many soils and cave sediments and often present in high abundance. We present a dual-probe real-time quantitative PCR assay capable of detecting and differentiating P. destructans from closely related fungi in environmental samples from North America. The assay, based on a single nucleotide polymorphism (SNP) specific to P. destructans, is capable of rapid low-level detection from various sampling media, including sediment, fecal samples, wing biopsy specimens, and skin swabs. This method is a highly sensitive, high-throughput method for identifying P. destructans, other Pseudogymnoascus spp., and Geomyces spp. in the environment, providing a fundamental component of research and risk assessment for addressing this disease, as well as other ecological and mycological work on related fungi.
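Quantitative PCR assays of this kind typically report genome copies via a standard curve relating the threshold cycle (Ct) to log copy number. A minimal sketch of that conversion follows; the slope and intercept are illustrative placeholders, not this assay's calibration values.

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    # Standard curve: Ct = slope * log10(copies) + intercept.
    # A slope of -3.32 corresponds to 100% amplification efficiency;
    # both parameter values here are hypothetical.
    return 10 ** ((ct - intercept) / slope)
```

Each 3.32-cycle decrease in Ct then corresponds to a tenfold increase in template copies, which is what makes low-level detection quantifiable.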
2016-09-15
Investigative Questions: This research quantitatively addresses the impact of the proposed benefits of a fully 3D printed satellite architecture on the subsystems of a CubeSat. The objective is to bring a quantitative analysis to the discussion of a fully 3D printed satellite architecture, allowing manufacturers to quantitatively address what impact the architecture would have on the subsystems of a CubeSat.
Wang, Li-Li; Zhang, Yun-Bin; Sun, Xiao-Ya; Chen, Sui-Qing
2016-05-08
To establish a quantitative analysis of multi-components by a single marker (QAMS) method for quality evaluation and to validate its feasibility through the simultaneous quantitative assay of four main components in Linderae Reflexae Radix. Four main components, pinostrobin, pinosylvin, pinocembrin, and 3,5-dihydroxy-2-(1-p-mentheneyl)-trans-stilbene, were selected as analytes to evaluate the quality by RP-HPLC coupled with a UV detector. The method was evaluated by comparing the quantitative results between the external standard method and QAMS on different HPLC systems. The results showed no significant differences in the quantitative results for the four components of Linderae Reflexae Radix determined by the external standard method and QAMS (RSD <3%). The contents of the four analytes (pinosylvin, pinocembrin, pinostrobin, and Reflexanbene I) in Linderae Reflexae Radix were determined against the single marker pinosylvin. The spectra were determined on Shimadzu LC-20AT and Waters e2695 HPLC systems equipped with three different columns.
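The QAMS calculation reduces to a relative correction factor measured once from reference standards of the marker and each analyte; afterward only the marker's standard is needed at assay time. A minimal sketch with hypothetical peak areas and concentrations (the relation is the generic QAMS formula, not data from this study):

```python
def relative_correction_factor(area_marker, conc_marker, area_analyte, conc_analyte):
    # f = (A_s / c_s) / (A_i / c_i), determined once from reference standards
    return (area_marker / conc_marker) / (area_analyte / conc_analyte)

def qams_concentration(area_analyte, f_analyte, area_marker, conc_marker):
    # c_i = f_i * A_i * c_s / A_s : only the single marker's standard is
    # required when assaying a sample
    return f_analyte * area_analyte * conc_marker / area_marker

# Hypothetical calibration run: marker (e.g. pinosylvin) plus one analyte standard
f = relative_correction_factor(area_marker=1200.0, conc_marker=0.05,
                               area_analyte=800.0, conc_analyte=0.04)

# Hypothetical sample run: quantify the analyte from its own peak area,
# the marker's peak area, and the marker's known concentration
c = qams_concentration(area_analyte=600.0, f_analyte=f,
                       area_marker=1500.0, conc_marker=0.05)
```

With these made-up numbers the analyte's correction factor is 1.2 and its computed concentration 0.024 (in the marker's concentration units).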
Hoste, H; Torres-Acosta, J F J; Quijada, J; Chan-Perez, I; Dakheel, M M; Kommuru, D S; Mueller-Harvey, I; Terrill, T H
2016-01-01
Interactions between host nutrition and feeding behaviour are central to understanding the pathophysiological consequences of infections of the digestive tract with parasitic nematodes. The manipulation of host nutrition provides useful options to control gastrointestinal nematodes as a component of an integrated strategy. Focussed mainly on the Haemonchus contortus infection model in small ruminants, this chapter (1) illustrates the relationship between quantitative (macro- and micro-nutrients) and qualitative (plant secondary metabolites) aspects of host nutrition and nematode infection, and (2) shows how basic studies aimed at addressing some generic questions can help to provide solutions, despite the considerable diversity of epidemiological situations and breeding systems. Copyright © 2016 Elsevier Ltd. All rights reserved.
Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A
2012-04-01
Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramér's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramér's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. 
Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. © Health Research and Educational Trust.
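The comparison of component proportions above rests on a Pearson chi-square test of a 2×2 table. A self-contained sketch with illustrative counts (not the study's data), including the Cramér's V and odds-ratio effect sizes the abstract reports:

```python
from math import sqrt

def chi_square_2x2(a, b, c, d):
    # rows: article type; columns: methodological component present/absent
    n = a + b + c + d
    # Pearson chi-square without continuity correction
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    cramers_v = sqrt(chi2 / n)        # min(rows-1, cols-1) = 1 for a 2x2 table
    odds_ratio = (a * d) / (b * c)
    return chi2, cramers_v, odds_ratio

# Hypothetical counts: component present in 30/100 articles of one type
# versus 50/100 of the other
chi2, v, odds = chi_square_2x2(30, 70, 50, 50)
```

For larger tables the degrees of freedom and the Cramér's V normalization change; the 2×2 shortcut used here applies only to two-group, two-outcome comparisons.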
Status of the Flooding Fragility Testing Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, C. L.; Savage, B.; Bhandari, B.
2016-06-01
This report provides an update on research addressing nuclear power plant component reliability under flooding conditions. The research includes use of the Component Flooding Evaluation Laboratory (CFEL) where individual components and component subassemblies will be tested to failure under various flooding conditions. The resulting component reliability data can then be incorporated with risk simulation strategies to provide a more thorough representation of overall plant risk. The CFEL development strategy consists of four interleaved phases. Phase 1 addresses design and application of CFEL with water rise and water spray capabilities allowing testing of passive and active components including fully electrified components.more » Phase 2 addresses research into wave generation techniques followed by the design and addition of the wave generation capability to CFEL. Phase 3 addresses methodology development activities including small scale component testing, development of full scale component testing protocol, and simulation techniques including Smoothed Particle Hydrodynamic (SPH) based computer codes. Phase 4 involves full scale component testing including work on full scale component testing in a surrogate CFEL testing apparatus.« less
An anthropomorphic phantom for quantitative evaluation of breast MRI.
Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo
2011-02-01
In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. 
This phantom provides a platform for the optimization and standardization of breast MRI imaging protocols for lesion detection and characterization.
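T1 estimation from inversion-recovery data of the kind described can be sketched with a magnitude signal model and a grid search. Ordinary least squares here is a simple stand-in for the maximum-likelihood fit the authors used, and all inversion times and parameter values are synthetic.

```python
import numpy as np

def ir_signal(ti, m0, t1):
    # magnitude inversion-recovery signal model |M0 * (1 - 2*exp(-TI/T1))|
    return np.abs(m0 * (1.0 - 2.0 * np.exp(-ti / t1)))

# Hypothetical inversion times (ms) and noiseless measurements with T1 = 250 ms
ti = np.array([50.0, 100.0, 200.0, 400.0, 800.0, 1600.0, 3200.0])
meas = ir_signal(ti, m0=1.0, t1=250.0)

# Grid search over T1: for each candidate, the best-fitting M0 follows in
# closed form because the magnitude model is linear in M0 (for M0 > 0)
t1_grid = np.linspace(50.0, 1000.0, 1901)
best = None
for t1 in t1_grid:
    basis = np.abs(1.0 - 2.0 * np.exp(-ti / t1))
    m0 = (basis @ meas) / (basis @ basis)
    err = np.sum((meas - m0 * basis) ** 2)
    if best is None or err < best[0]:
        best = (err, t1, m0)
_, t1_hat, m0_hat = best
```

With noisy data the squared-error objective is no longer the exact likelihood for magnitude (Rician) MRI signals, which is why maximum-likelihood estimators are preferred in practice.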
The SAM framework: modeling the effects of management factors on human behavior in risk analysis.
Murphy, D M; Paté-Cornell, M E
1996-08-01
Complex engineered systems, such as nuclear reactors and chemical plants, have the potential for catastrophic failure with disastrous consequences. In recent years, human and management factors have been recognized as frequent root causes of major failures in such systems. However, classical probabilistic risk analysis (PRA) techniques do not account for the underlying causes of these errors because they focus on the physical system and do not explicitly address the link between components' performance and organizational factors. This paper describes a general approach for addressing the human and management causes of system failure, called the SAM (System-Action-Management) framework. Beginning with a quantitative risk model of the physical system, SAM expands the scope of analysis to incorporate first the decisions and actions of individuals that affect the physical system. SAM then links management factors (incentives, training, policies and procedures, selection criteria, etc.) to those decisions and actions. The focus of this paper is on four quantitative models of action that describe this last relationship. These models address the formation of intentions for action and their execution as a function of the organizational environment. Intention formation is described by three alternative models: a rational model, a bounded rationality model, and a rule-based model. The execution of intentions is then modeled separately. These four models are designed to assess the probabilities of individual actions from the perspective of management, thus reflecting the uncertainties inherent to human behavior. The SAM framework is illustrated for a hypothetical case of hazardous materials transportation. This framework can be used as a tool to increase the safety and reliability of complex technical systems by modifying the organization, rather than, or in addition to, re-designing the physical system.
Yoshida, Mitsuhiro; Mochizuki, Tomohiro; Urayama, Syun-Ichi; Yoshida-Takashima, Yukari; Nishi, Shinro; Hirai, Miho; Nomaki, Hidetaka; Takaki, Yoshihiro; Nunoura, Takuro; Takai, Ken
2018-01-01
Previous studies on marine environmental virology have primarily focused on double-stranded DNA (dsDNA) viruses; however, it has recently been suggested that single-stranded DNA (ssDNA) viruses are more abundant in marine ecosystems. In this study, we performed a quantitative viral community DNA analysis to estimate the relative abundance and composition of both ssDNA and dsDNA viruses in offshore upper bathyal sediment from Tohoku, Japan (water depth = 500 m). The estimated dsDNA viral abundance ranged from 3 × 10⁶ to 5 × 10⁶ genome copies per cm³ sediment, similar to the range of fluorescence-based direct virus counts. In contrast, the estimated ssDNA viral abundance ranged from 1 × 10⁸ to 3 × 10⁹ genome copies per cm³ sediment, indicating that ssDNA viral populations represent 96.3–99.8% of the benthic total DNA viral assemblages. In the ssDNA viral metagenome, most of the identified viral sequences were associated with ssDNA viral families such as Circoviridae and Microviridae. Principal component analysis of the ssDNA viral sequence components from the sedimentary ssDNA viral metagenomic libraries found that the viral communities at different depths at the study site all exhibited similar profiles compared with those in deep-sea sediments at other reference sites. Our results suggest that deep-sea benthic ssDNA viruses have been significantly underestimated by conventional direct virus counts and that their contributions to deep-sea benthic microbial mortality and geochemical cycles should be further addressed by such a new quantitative approach. PMID:29467725
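The headline percentage follows from simple ratio arithmetic on the genome-copy estimates; combining the reported range endpoints brackets the per-sample 96.3–99.8% figure.

```python
def ssdna_fraction(ssdna_copies, dsdna_copies):
    # fraction of total DNA viral genome copies contributed by ssDNA viruses
    return ssdna_copies / (ssdna_copies + dsdna_copies)

# Reported ranges (genome copies per cm^3 sediment): ssDNA 1e8-3e9, dsDNA 3e6-5e6
low = ssdna_fraction(1e8, 5e6)    # least favorable pairing, ~95.2%
high = ssdna_fraction(3e9, 3e6)   # most favorable pairing, ~99.9%
```

The published 96.3–99.8% range comes from per-sample pairings of the two estimates, so it sits inside the bracket computed from the endpoints alone.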
Reproducibility of Quantitative Structural and Physiological MRI Measurements
2017-08-09
Journal article; dates covered: January 2015 – July 2017. Performing organization: USAF School of Aerospace Medicine, Aeromedical Research Dept/FHOH, 2510 Fifth St.
A further component analysis for illicit drugs mixtures with THz-TDS
NASA Astrophysics Data System (ADS)
Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui
2009-07-01
A new method for quantitative analysis of mixtures of illicit drugs with THz time-domain spectroscopy was proposed and verified experimentally. The traditional method requires fingerprint spectra of all the pure chemical components. In practice, only the objective components in a mixture and their absorption features are known, so a more practical technique for detection and identification is needed. Our new method for quantitative inspection of mixtures of illicit drugs is based on the derivative spectrum. In this method, the ratio of objective components in a mixture can be obtained on the assumption that all objective components in the mixture and their absorption features are known, while the unknown components are not needed. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for our experiment. The experimental results verified the effectiveness of the method, suggesting that it could be an effective approach for quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world applications of quantitative illicit drug analysis and could be an effective method in the fields of security and pharmaceutical inspection.
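The idea behind the derivative-spectrum method can be illustrated with synthetic Gaussian absorption features: differentiation suppresses a smooth baseline contributed by unknown components, so a least-squares fit against the known components' derivative spectra recovers their ratio. All spectra below are invented shapes, not measured methamphetamine or flour data.

```python
import numpy as np

freq = np.linspace(0.2, 2.6, 500)                 # THz axis (illustrative)

def peak(center, width, height):
    return height * np.exp(-((freq - center) / width) ** 2)

# Invented absorption fingerprints standing in for the known objective components
drug = peak(1.2, 0.05, 1.0) + peak(1.8, 0.07, 0.6)
adulterant = peak(0.9, 0.20, 0.4)

# Mixture: 30% drug, 70% adulterant, plus a smooth baseline from unknown
# components that a direct spectral fit would misattribute
mixture = 0.3 * drug + 0.7 * adulterant + (0.3 + 0.2 * freq)

def d2(y):
    # second derivative: a linear baseline differentiates away
    return np.gradient(np.gradient(y, freq), freq)

# Fit the mixture's derivative spectrum with the known derivative spectra only
A = np.column_stack([d2(drug), d2(adulterant)])
coef, *_ = np.linalg.lstsq(A, d2(mixture), rcond=None)
ratio = coef[0] / coef.sum()                      # recovered fraction of drug
```

With a linear baseline the second derivative eliminates the unknown contribution essentially exactly; real baselines are only approximately smooth, so the recovery is approximate in practice.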
Synthesising quantitative and qualitative research in evidence-based patient information.
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-03-01
Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. 
A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review.
Highly Reproducible Label Free Quantitative Proteomic Analysis of RNA Polymerase Complexes*
Mosley, Amber L.; Sardiu, Mihaela E.; Pattenden, Samantha G.; Workman, Jerry L.; Florens, Laurence; Washburn, Michael P.
2011-01-01
The use of quantitative proteomics methods to study protein complexes has the potential to provide in-depth information on the abundance of different protein components as well as their modification state in various cellular conditions. To interrogate protein complex quantitation using shotgun proteomic methods, we have focused on the analysis of protein complexes using label-free multidimensional protein identification technology and studied the reproducibility of biological replicates. For these studies, we focused on three highly related and essential multi-protein enzymes, RNA polymerase I, II, and III from Saccharomyces cerevisiae. We found that label-free quantitation using spectral counting is highly reproducible at the protein and peptide level when analyzing RNA polymerase I, II, and III. In addition, we show that peptide sampling does not follow a random sampling model, and we show the need for advanced computational models to predict peptide detection probabilities. In order to address these issues, we used the APEX protocol to model the expected peptide detectability based on whole cell lysate acquired using the same multidimensional protein identification technology analysis used for the protein complexes. Neither method was able to predict the peptide sampling levels that we observed using replicate multidimensional protein identification technology analyses. In addition to the analysis of the RNA polymerase complexes, our analysis provides quantitative information about several RNAP associated proteins including the RNAPII elongation factor complexes DSIF and TFIIF. Our data shows that DSIF and TFIIF are the most highly enriched RNAP accessory factors in Rpb3-TAP purifications and demonstrate our ability to measure low level associated protein abundance across biological replicates. In addition, our quantitative data supports a model in which DSIF and TFIIF interact with RNAPII in a dynamic fashion in agreement with previously published reports. 
PMID:21048197
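Spectral counting converts counts of peptide-spectrum matches into relative protein abundance. One common normalization is the NSAF (normalized spectral abundance factor), sketched below as a generic illustration rather than the exact measure used in this study.

```python
def nsaf(spectral_counts, lengths):
    # NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j): spectral counts are
    # length-normalized, then scaled so abundances sum to 1
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Two hypothetical proteins of equal length with a 1:2 spectral-count ratio
abundances = nsaf([10, 20], [100, 100])
```

Length normalization matters because longer proteins yield more tryptic peptides and therefore more spectra at equal molar abundance.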
Liver DCE-MRI Registration in Manifold Space Based on Robust Principal Component Analysis.
Feng, Qianjin; Zhou, Yujia; Li, Xueli; Mei, Yingjie; Lu, Zhentai; Zhang, Yu; Feng, Yanqiu; Liu, Yaqin; Yang, Wei; Chen, Wufan
2016-09-29
A technical challenge in the registration of dynamic contrast-enhanced magnetic resonance (DCE-MR) imaging in the liver is intensity variations caused by contrast agents. Such variations lead to the failure of the traditional intensity-based registration method. To address this problem, a manifold-based registration framework for liver DCE-MR time series is proposed. We assume that liver DCE-MR time series are located on a low-dimensional manifold and determine intrinsic similarities between frames. Based on the obtained manifold, the large deformation of two dissimilar images can be decomposed into a series of small deformations between adjacent images on the manifold through gradual deformation of each frame to the template image along the geodesic path. Furthermore, manifold construction is important in automating the selection of the template image, which is an approximation of the geodesic mean. Robust principal component analysis is performed to separate motion components from intensity changes induced by contrast agents; the components caused by motion are used to guide registration in eliminating the effect of contrast enhancement. Visual inspection and quantitative assessment are further performed on clinical dataset registration. Experiments show that the proposed method effectively reduces movements while preserving the topology of contrast-enhancing structures and provides improved registration performance.
Test Driven Development of a Parameterized Ice Sheet Component
NASA Astrophysics Data System (ADS)
Clune, T.
2011-12-01
Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability for scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need to have simple, non-redundant closed-form expressions to compare against the results obtained from the implementation as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex and the "degree of loss" estimates imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
Huntley, Alyson L; King, Anna J L; Moore, Theresa H M; Paterson, Charlotte; Persad, Raj; Sharp, Debbie; Evans, Maggie
2017-01-01
To present a methodological exemplar of integrating findings from a quantitative and qualitative review on the same topic to provide insight into components of care that contribute to supportive care that is acceptable to men with prostate cancer. Men with prostate cancer are likely to live a long time with the disease, experience side effects from treatment and therefore have ongoing supportive care needs. Quantitative and qualitative reviews have been published but the findings have yet to be integrated. Integration of quantitative and qualitative synthesized evidence. Two previously published systematic reviews. Synthesized evidence on supportive care for men with prostate cancer was integrated from two previously published systematic reviews: a narrative quantitative review and a qualitative review with thematic synthesis. These two streams of synthesized evidence were synthesized using concurrent narrative summary. Data from both reviews were used to develop a set of propositions from which a summary of components of care likely to contribute to supportive care acceptable to men with prostate cancer was identified. Nine propositions were developed which covered men's supportive care focusing on the role of health professionals. These propositions were used to compose nine components of care likely to lead to supportive care that is acceptable to men with prostate cancer. Some of these components are no/low cost, such as developing a more empathic personalized approach, but more specific approaches need further investigation in randomized controlled trials, for example, online support. This methodological exemplar demonstrates the integration of quantitative and qualitative synthesized data to determine components of care likely to lead to provision of supportive care acceptable to men with prostate cancer. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
Quantitating Organoleptic Volatile Phenols in Smoke-Exposed Vitis vinifera Berries.
Noestheden, Matthew; Thiessen, Katelyn; Dennis, Eric G; Tiet, Ben; Zandberg, Wesley F
2017-09-27
Accurate methods for quantitating volatile phenols (i.e., guaiacol, syringol, 4-ethylphenol, etc.) in smoke-exposed Vitis vinifera berries prior to fermentation are needed to predict the likelihood of perceptible smoke taint following vinification. Reported here is a complete, cross-validated analytical workflow to accurately quantitate free and glycosidically bound volatile phenols in smoke-exposed berries using liquid-liquid extraction, acid-mediated hydrolysis, and gas chromatography-tandem mass spectrometry. The reported workflow addresses critical gaps in existing methods for volatile phenols that impact quantitative accuracy, most notably the effect of injection port temperature and the variability in acid-mediated hydrolytic procedures currently used. Addressing these deficiencies will help the wine industry make accurate, informed decisions when producing wines from smoke-exposed berries.
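Quantitation of volatile phenols such as guaiacol by GC-MS/MS ultimately reduces to a standard curve: a least-squares line fit to calibration points, inverted to convert a sample's response (analyte/internal-standard area ratio) into a concentration. A generic sketch, with calibration points invented for illustration rather than taken from the reported workflow:

```python
# Standard-curve quantitation: fit ratio = m * concentration + b to the
# calibration series, then invert the line for an unknown sample.

def fit_line(xs, ys):
    """Ordinary least squares fit of y = m*x + b; returns (m, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

# Hypothetical calibration: guaiacol concentration (ug/L) vs. area ratio.
conc = [1.0, 5.0, 10.0, 25.0, 50.0]
ratio = [0.021, 0.102, 0.199, 0.504, 1.001]

m, b = fit_line(conc, ratio)
sample_ratio = 0.35
sample_conc = (sample_ratio - b) / m  # invert the calibration line
```

In practice the curve would be built per analyte and per hydrolysis condition, since the abstract notes that hydrolytic variability affects quantitative accuracy.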
NASA Astrophysics Data System (ADS)
Ke, Jingtang; Pryputniewicz, Ryszard J.
Various papers on the state of the art in laser and optoelectronic technology in industry are presented. Individual topics addressed include: wavelength compensation for holographic optical element, optoelectronic techniques for measurement and inspection, new optical measurement methods in Western Europe, applications of coherent optics at ISL, imaging techniques for gas turbine development, the Rolls-Royce experience with industrial holography, panoramic holocamera for tube and borehole inspection, optical characterization of electronic materials, optical strain measurement of rotating components, quantitative interpretation of holograms and specklegrams, laser speckle technique for hydraulic structural model test, study of holospeckle interferometry, common path shearing fringe scanning interferometer, and laser interferometry applied to nondestructive testing of tires.
Respectful Modeling: Addressing Uncertainty in Dynamic System Models for Molecular Biology.
Tsigkinopoulou, Areti; Baker, Syed Murtuza; Breitling, Rainer
2017-06-01
Although there is still some skepticism in the biological community regarding the value and significance of quantitative computational modeling, important steps are continually being taken to enhance its accessibility and predictive power. We view these developments as essential components of an emerging 'respectful modeling' framework which has two key aims: (i) respecting the models themselves and facilitating the reproduction and update of modeling results by other scientists, and (ii) respecting the predictions of the models and rigorously quantifying the confidence associated with the modeling results. This respectful attitude will guide the design of higher-quality models and facilitate the use of models in modern applications such as engineering and manipulating microbial metabolism by synthetic biology. Copyright © 2016 Elsevier Ltd. All rights reserved.
An analysis of the process and results of manual geocode correction
McDonald, Yolanda J.; Schwind, Michael; Goldberg, Daniel W.; Lampley, Amanda; Wheeler, Cosette M.
2018-01-01
Geocoding is the science and process of assigning geographical coordinates (i.e. latitude, longitude) to a postal address. The quality of the geocode can vary dramatically depending on several variables, including incorrect input address data, missing address components, and spelling mistakes. A dataset with a considerable number of geocoding inaccuracies can potentially result in an imprecise analysis and invalid conclusions. There has been little quantitative analysis of the amount of effort (i.e. time) to perform geocoding correction, and how such correction could improve geocode quality type. This study used a low-cost and easy to implement method to improve geocode quality type of an input database (i.e. addresses to be matched) through the processes of manual geocode intervention, and it assessed the amount of effort to manually correct inaccurate geocodes, reported the resulting match rate improvement between the original and the corrected geocodes, and documented the corresponding spatial shift by geocode quality type resulting from the corrections. Findings demonstrated that manual intervention of geocoding resulted in a 90% improvement of geocode quality type, took 42 hours to process, and the spatial shift ranged from 0.02 to 151,368 m. This study provides evidence to inform research teams considering the application of manual geocoding intervention that it is a low-cost and relatively easy process to execute. PMID:28555477
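The spatial shift between an original and a manually corrected geocode, as tabulated in the study above, is a great-circle distance between two coordinate pairs. A minimal sketch using the haversine formula; the coordinates below are made up for illustration (the study reports shifts ranging from 0.02 to 151,368 m):

```python
import math

# Great-circle distance between an original geocode and its manual
# correction, via the haversine formula on a spherical Earth model.

def haversine_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two (lat, lon) points in degrees."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical: a geocode snapped to an area centroid vs. the corrected point.
shift = haversine_m(35.0844, -106.6504, 35.0901, -106.6291)
```

Summarizing these per-record shifts by geocode quality type reproduces the kind of spatial-shift analysis the study describes.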
NASA Astrophysics Data System (ADS)
Burn, H. E.; Wenner, J. M.; Baer, E. M.
2011-12-01
The quantitative components of introductory geoscience courses can pose significant barriers to students. Many academic departments respond by stripping courses of their quantitative components or by attaching prerequisite mathematics courses [PMC]. PMCs cause students to incur additional costs and credits and may deter enrollment in introductory courses; yet, stripping quantitative content from geoscience courses masks the data-rich, quantitative nature of geoscience. Furthermore, the diversity of math skills required in geoscience and students' difficulty with transferring mathematical knowledge across domains suggest that PMCs may be ineffective. Instead, this study explores an alternative strategy -- to remediate students' mathematical skills using online modules that provide students with opportunities to build contextual quantitative reasoning skills. The Math You Need, When You Need It [TMYN] is a set of modular online student resources that address mathematical concepts in the context of the geosciences. TMYN modules are online resources that employ a "just-in-time" approach - giving students access to skills and then immediately providing opportunities to apply them. Each module places the mathematical concept in multiple geoscience contexts. Such an approach illustrates the immediate application of a principle and provides repeated exposure to a mathematical skill, enhancing long-term retention. At the same time, placing mathematics directly in several geoscience contexts better promotes transfer of learning by using similar discourse (words, tools, representations) and context that students will encounter when applying mathematics in the future. This study uses quantitative and qualitative data to explore the effectiveness of TMYN modules in remediating students' mathematical skills. Quantitative data derive from ten geoscience courses that used TMYN modules during the fall 2010 and spring 2011 semesters; none of the courses had a PMC. 
In all courses, students completed a pretest, the assigned modules, and a posttest. Success in remediation was measured using normalized gain scores, which measure the change in score divided by the maximum possible increase: (posttest-pretest)/(1-pretest). To compare across courses, normalized gain scores were standardized. Additional analysis included disaggregating normalized gain scores by quartiles based on pretest scores. The results were supplemented by qualitative data from faculty interviews and information provided by faculty on a web form upon completion of the course. Results suggest TMYN modules remediate mathematical skills effectively, and that normalized gains tend to be higher for students in the lower quartiles on the pretest. Students indicate finding the modules helpful, though sometimes difficult. Faculty interview data triangulate these findings and provide further evidence that online, modularized remediation is an effective alternative to assigning prerequisite mathematical courses to remediate mathematical skills.
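The normalized gain formula quoted above is straightforward to compute. A short sketch with invented student scores (expressed as fractions of the maximum), chosen so that lower-pretest students show larger gains, consistent with the reported pattern:

```python
# Normalized gain: the fraction of the possible improvement actually
# achieved, g = (posttest - pretest) / (1 - pretest), scores in [0, 1].

def normalized_gain(pre, post):
    """Return the normalized gain for one student."""
    if pre >= 1.0:
        return 0.0  # perfect pretest leaves no room to improve
    return (post - pre) / (1.0 - pre)

# Hypothetical students: (pretest, posttest), ordered by pretest quartile.
scores = [(0.30, 0.72), (0.50, 0.75), (0.80, 0.88)]
gains = [normalized_gain(pre, post) for pre, post in scores]
```

Disaggregating such gains by pretest quartile, as the study does, separates remediation effect from prior preparation.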
Woods, Cindy; Carlisle, Karen; Larkins, Sarah; Thompson, Sandra Claire; Tsey, Komla; Matthews, Veronica; Bailie, Ross
2017-01-01
Continuous Quality Improvement is a process for raising the quality of primary health care (PHC) across Indigenous PHC services. In addition to clinical auditing using plan, do, study, and act cycles, engaging staff in a process of reflecting on systems to support quality care is vital. The One21seventy Systems Assessment Tool (SAT) supports staff to assess systems performance in terms of five key components. This study examines quantitative and qualitative SAT data from five high-improving Indigenous PHC services in northern Australia to understand the systems used to support quality care. High-improving services selected for the study were determined by calculating quality of care indices for Indigenous health services participating in the Audit and Best Practice in Chronic Disease National Research Partnership. Services that reported continuing high improvement in quality of care delivered across two or more audit tools in three or more audits were selected for the study. Precollected SAT data (from annual team SAT meetings) are presented longitudinally using radar plots for quantitative scores for each component, and content analysis is used to describe strengths and weaknesses of performance in each systems' component. High-improving services were able to demonstrate strong processes for assessing system performance and consistent improvement in systems to support quality care across components. Key strengths in the quality support systems included adequate and orientated workforce, appropriate health system supports, and engagement with other organizations and community, while the weaknesses included lack of service infrastructure, recruitment, retention, and support for staff and additional costs. Qualitative data revealed clear voices from health service staff expressing concerns with performance, and subsequent SAT data provided evidence of changes made to address concerns. 
Learning from the processes and strengths of high-improving services may be useful as we work with services striving to improve the quality of care provided in other areas.
Shuey, Megan M.; Drees, Kevin P.; Lindner, Daniel L.; Keim, Paul
2014-01-01
White-nose syndrome is a fungal disease that has decimated bat populations across eastern North America. Identification of the etiologic agent, Pseudogymnoascus destructans (formerly Geomyces destructans), in environmental samples is essential to proposed management plans. A major challenge is the presence of closely related species, which are ubiquitous in many soils and cave sediments and often present in high abundance. We present a dual-probe real-time quantitative PCR assay capable of detecting and differentiating P. destructans from closely related fungi in environmental samples from North America. The assay, based on a single nucleotide polymorphism (SNP) specific to P. destructans, is capable of rapid low-level detection from various sampling media, including sediment, fecal samples, wing biopsy specimens, and skin swabs. This method is a highly sensitive, high-throughput method for identifying P. destructans, other Pseudogymnoascus spp., and Geomyces spp. in the environment, providing a fundamental component of research and risk assessment for addressing this disease, as well as other ecological and mycological work on related fungi. PMID:24375140
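Real-time quantitative PCR assays like the one above are typically read out by converting a cycle threshold (Ct) to a starting template quantity through a log-linear standard curve. A generic sketch; the curve parameters and Ct value are illustrative, not from the assay described:

```python
# qPCR readout: log10(quantity) is linear in Ct, so quantity is recovered
# as 10 ** ((ct - intercept) / slope). A slope near -3.32 corresponds to
# approximately 100% amplification efficiency (doubling per cycle).

def quantity_from_ct(ct, slope, intercept):
    """Return starting copy number implied by a Ct on the standard curve."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical standard curve fitted to a dilution series.
slope, intercept = -3.32, 38.0  # intercept = Ct of a single starting copy

copies = quantity_from_ct(31.36, slope, intercept)
efficiency = 10 ** (-1 / slope) - 1  # per-cycle amplification efficiency
```

For a dual-probe SNP assay, the same conversion is applied per probe channel, and the ratio of the two signals discriminates the target species from near relatives.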
NASA Astrophysics Data System (ADS)
Love, Curtis Clinton
New hybrid educational programs are evolving to challenge traditional definitions of distance education. One such program is the Integrated Science (IS) program of The University of Alabama's Center for Communication and Educational Technology (CCET), which was developed to address concerns about scientific illiteracy in middle school education. IS relies on a multilayered use of communication technologies (primarily videotape and e-mail) for delivery of student instruction, as a delivery vehicle for curriculum materials, and as a feedback mechanism. The IS program serves to enhance classroom science instruction by providing professionally developed videotaped educational lectures and curriculum materials used by classroom science teachers. To date, such hybrid forms of distance education have seldom been examined. Using both qualitative and quantitative methodologies, this study examines 64 IS classrooms visited from October 1992 to April 1995 by researchers at the Institute for Communication Research at The University of Alabama. Detailed qualitative information was gathered from each classroom by student, teacher, and administrator interviews; focus groups; questionnaires; and recording observations of classroom activity. From the reports of the site visits, key components of the IS classroom experience thought to be predictors of the success of the program for individual classrooms are identified. Exemplars of both positive and negative components are provided in narrative form. A model is posited to describe the potential relationships between the various components and their impact on the overall success of the IS program in an individual classroom. Quantitative assessments were made of the 21 key variables identified in the qualitative data that appeared to enhance the likelihood of success for the IS program in an individual classroom. 
Accounting for 90% of the variance in the regression model, the factor with the greatest predictive potential for success of Integrated Science was "how effective the teacher was in using classroom management skills." The results suggest that despite extensive research and curriculum development, use of sophisticated communication technologies, high video production standards, and expertise of IS video instructors, ultimately the classroom teacher plays the most critical role in determining a class's success and in achieving the goals of the Integrated Science program.
1997-11-01
and Quantitative PCR PRINCIPAL INVESTIGATOR: Indra Poola, Ph.D. CONTRACTING ORGANIZATION: Howard university Washington, DC 20059 REPORT DATE... Howard University Washington, DC 20059 8. PERFORMING ORGANIZATION REPORT NUMBER 8. SPONSORING f MONITORING AGENCY NAME(S) AND ADDRESS(ES) U.S...Stollar J.A. Hanover B.L. Vallee C.B. Hirschberg November 21, 1997 Indra Poola Dept. of Pharmacology Howard University School of Medicine 520 W
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-18
... lending test rating, and the other addresses the quantitative consideration given to a certain type of... activities would receive when examiners considered them, not the quantitative consideration that those... quantitative) in the institution's CRA evaluation. As a result, such financial institutions may have been...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-25
... Comprehensive Quantitative Impact Study.'' DATES: You should submit comments by March 26, 2010. ADDRESSES... requesting approval of the following new information collection: Title: Basel Comprehensive Quantitative... quantitative impact study (QIS) to assess the impact of the proposed revisions that were published by the Basel...
The Eudaimonic and Hedonic Components of Happiness: Qualitative and Quantitative Findings
ERIC Educational Resources Information Center
Fave, Antonella Delle; Brdar, Ingrid; Freire, Teresa; Vella-Brodrick, Dianne; Wissing, Marie P.
2011-01-01
This paper illustrates a new project developed by a cross-country team of researchers, with the aim of studying the hedonic and eudaimonic components of happiness through a mixed method approach combining both qualitative and quantitative analyses. Data were collected from 666 participants in Australia, Croatia, Germany, Italy, Portugal, Spain,…
Kragelund Nielsen, Karoline; Damm, Peter; Bygbjerg, Ib C; Kapur, Anil
2018-04-21
An estimated 87.6% of hyperglycaemia in pregnancy cases are in low and middle income countries (LMICs). The aim of this study is to review the evidence on barriers and facilitators to programmes and services addressing hyperglycaemia in pregnancy in LMICs. A systematic review and narrative synthesis was conducted based on searches in PubMed. A total of 23 qualitative and quantitative studies were included. Barriers and facilitators exist at the health system level, individual level and social and societal levels and are often interacting. At the health system level they relate to capacity in terms of human and material resources; availability of feasible and appropriate guidelines; organisational management and referral pathways. Individual level barriers and facilitators include knowledge; risk perception; illness beliefs; financial condition; work obligations; concerns for the baby and hardship associated with services. At the social and societal level important factors are: perceptions and norms related to women's roles, mobility and health; the knowledge and support of women's social network; and structural aspects. Numerous factors influence programmes and services addressing hyperglycaemia in pregnancy in LMICs. Thus, several components are needed to ensure detection, treatment and follow-up of women with hyperglycaemia in pregnancy. Copyright © 2018 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adem Esmail, Blal, E-mail: blal.ademesmail@unitn.it; Geneletti, Davide
Watershed investments, whose main aim is to secure water for cities, represent a promising opportunity for large-scale sustainability transitions in the near future. If properly designed, they promote activities in the watershed that enhance ecosystem services while protecting nature and biodiversity, as well as achieving other societal goals. In this paper, we build on the concepts of ecosystem services and boundary work, to develop and test an operative approach for designing and assessing the impact of watershed investments. The approach is structured to facilitate negotiations among stakeholders. Its strategic component includes setting the agenda; defining investment scenarios; and assessing the performance of watershed investments as well as planning for a follow-up. Its technical component concerns data processing; tailoring spatially explicit ecosystem service models; hence their application to design a set of “investment portfolios”, generate future land use scenarios, and model impacts on selected ecosystem services. A case study illustrates how the technical component can be developed in a data scarce context in sub-Saharan Africa in a way that is functional to support the steps of the strategic component. The case study addresses soil erosion and water scarcity-related challenges affecting Asmara, a medium-sized city in Eritrea, and considers urban water security and rural poverty alleviation as two illustrative objectives, within a ten-year planning horizon. The case study results consist of spatially explicit data (investment portfolio, land use scenario, impact on ecosystem services), which were aggregated to quantitatively assess the performance of different watershed investment scenarios, in terms of changes in soil erosion control.
By addressing stakeholders' concerns of credibility, saliency, and legitimacy, the approach is expected to facilitate negotiation of objectives, definition of scenarios, and assessment of alternative watershed investments and, ultimately, to contribute to implementing adaptive watershed management.
Gender Differences in Motor Skills of the Overarm Throw
Gromeier, Michael; Koester, Dirk; Schack, Thomas
2017-01-01
In this cross-sectional study, the qualitative and quantitative throwing performance of male and female athletes (6 to 16 years of age) was analyzed. The goal of this study was to assess whether there were gender based qualitative and quantitative differences in throwing performance of young athletes, throughout three different age bands (childhood, pubescence, and adolescence). Furthermore, we explored whether all components of the throwing movement are equally affected by gender differences. Focus was placed on five essential components of action: trunk, forearm, humerus, stepping, and backswing. Therefore, children and adolescents (N = 96) were invited to throw three times from three different distances, while aiming at a target placed at shoulder height. The participants were aspiring athletes, competitive in the sport handball. For analyzing the quality of movement the component approach of Halverson and Roberton (1984) was used. The throwing accuracy was noted and used to evaluate the quantitative performance of the throwing movement. Throughout three different age bands, no statistically significant difference was found between genders in throwing accuracy, i.e., quantitative performance. Regarding the qualitative evaluation of the throwing movement, male and female athletes differed significantly. The component approach yielded higher scores for male than for female participants. As expected, with increasing age qualitative and quantitative performance of male and female athletes improved. These results suggest that there are gender-specific differences in qualitative throwing performance, but not necessarily in quantitative throwing performance. Exploration shows that differences in the qualitative throwing performance were seen in specific components of action. Male and female athletes demonstrated similar movement patterns in humerus and forearm actions, but differed in trunk, stepping, and backswing actions. PMID:28261142
Yang, Guang; Zhao, Xin; Wen, Jun; Zhou, Tingting; Fan, Guorong
2017-04-01
An analytical approach including fingerprint, quantitative analysis and rapid screening of anti-oxidative components was established and successfully applied for the comprehensive quality control of Rhizoma Smilacis Glabrae (RSG), a well-known Traditional Chinese Medicine with the homology of medicine and food. Thirteen components were tentatively identified based on their retention behavior, UV absorption and MS fragmentation patterns. Chemometric analysis based on coulometric array data was performed to evaluate the similarity and variation between fifteen batches. Eight discriminating components were quantified using single-compound calibration. The unit responses of those components in coulometric array detection were calculated and compared with those of several compounds reported to possess antioxidant activity, and four of them were tentatively identified as main contributors to the total anti-oxidative activity. The main advantage of the proposed approach was that it realized simultaneous fingerprint, quantitative analysis and screening of anti-oxidative components, providing comprehensive information for quality assessment of RSG. Copyright © 2017 Elsevier B.V. All rights reserved.
Li, Yixian; Qi, Lehua; Song, Yongshan; Chao, Xujiang
2017-06-01
The components of carbon/carbon (C/C) composites have significant influence on the thermal and mechanical properties, so a quantitative characterization of components is necessary to study the microstructure of C/C composites and, further, to improve their macroscopic properties. Considering that the extinction crosses of the pyrocarbon matrix have distinctive motion features, polarized light microscope (PLM) video is used to characterize C/C composites quantitatively because it contains sufficient dynamic and structural information. The optical flow method is then introduced to compute the optical flow field between adjacent frames and to segment the components of C/C composites from the PLM images by image processing. Meanwhile, the matrix with different textures is re-segmented by the length difference of motion vectors, and the component fraction of each component and the extinction angle of the pyrocarbon matrix are calculated directly. Finally, the C/C composites are successfully characterized in terms of carbon fiber, pyrocarbon, and pores by a series of image processing operators based on PLM video, and the errors of the component fractions are less than 15%. © 2017 Wiley Periodicals, Inc.
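The re-segmentation step described above keys on the length of the per-pixel motion vectors from the optical flow field. A toy sketch of that classification idea, with thresholds and flow vectors invented for illustration (the actual pipeline operates on dense flow fields from PLM video):

```python
import math

# Classifying pixels by motion-vector magnitude: static regions (fibers,
# pores) show near-zero flow between frames, while the pyrocarbon matrix's
# moving extinction crosses produce longer vectors, with different textures
# separated by how far the crosses move.

def classify_flow(flow, static_thresh=0.5, texture_thresh=2.0):
    """Label each flow vector as static, low-texture, or high-texture."""
    labels = []
    for vx, vy in flow:
        m = math.hypot(vx, vy)  # motion-vector length in pixels
        if m < static_thresh:
            labels.append("static")        # carbon fiber or pore
        elif m < texture_thresh:
            labels.append("low-texture")   # slowly moving extinction cross
        else:
            labels.append("high-texture")  # fast-moving extinction cross
    return labels

# Hypothetical flow vectors (pixels/frame) for three sample pixels.
flow = [(0.1, 0.0), (1.2, 0.9), (2.5, 1.8)]
labels = classify_flow(flow)
```

Component fractions then follow by counting pixels per label and dividing by the total.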
Mapping carcass and meat quality QTL on Sus Scrofa chromosome 2 in commercial finishing pigs
Heuven, Henri CM; van Wijk, Rik HJ; Dibbits, Bert; van Kampen, Tony A; Knol, Egbert F; Bovenhuis, Henk
2009-01-01
Quantitative trait loci (QTL) affecting carcass and meat quality located on SSC2 were identified using variance component methods. A large number of traits involved in meat and carcass quality was detected in a commercial crossbred population: 1855 pigs sired by 17 boars from a synthetic line, which were homozygous (A/A) for IGF2. Using combined linkage and linkage disequilibrium mapping (LDLA), several QTL significantly affecting loin muscle mass, ham weight and ham muscles (outer ham and knuckle ham) and meat quality traits, such as Minolta-L* and -b*, ultimate pH and Japanese colour score were detected. These results agreed well with previous QTL studies involving SSC2. Since our study was carried out on crossbreds, different QTL may be segregating in the parental lines. To address this question, we compared models with a single QTL-variance component with models allowing for separate sire and dam QTL-variance components. The same QTL were identified using a single QTL variance component model compared to a model allowing for separate variances, with minor differences with respect to QTL location. However, the variance component method made it possible to detect QTL segregating in the paternal line (e.g. HAMB), the maternal lines (e.g. Ham) or in both (e.g. pHu). Combining association and linkage information among haplotypes slightly improved the significance of the QTL compared to an analysis using linkage information only. PMID:19284675
Rigour in quantitative research.
Claydon, Leica Sarah
2015-07-22
This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.
2010-01-01
Background Quantitative models of biochemical and cellular systems are used to answer a variety of questions in the biological sciences. The number of published quantitative models is growing steadily thanks to increasing interest in the use of models as well as the development of improved software systems and the availability of better, cheaper computer hardware. To maximise the benefits of this growing body of models, the field needs centralised model repositories that will encourage, facilitate and promote model dissemination and reuse. Ideally, the models stored in these repositories should be extensively tested and encoded in community-supported and standardised formats. In addition, the models and their components should be cross-referenced with other resources in order to allow their unambiguous identification. Description BioModels Database http://www.ebi.ac.uk/biomodels/ is aimed at addressing exactly these needs. It is a freely-accessible online resource for storing, viewing, retrieving, and analysing published, peer-reviewed quantitative models of biochemical and cellular systems. The structure and behaviour of each simulation model distributed by BioModels Database are thoroughly checked; in addition, model elements are annotated with terms from controlled vocabularies as well as linked to relevant data resources. Models can be examined online or downloaded in various formats. Reaction network diagrams generated from the models are also available in several formats. BioModels Database also provides features such as online simulation and the extraction of components from large scale models into smaller submodels. Finally, the system provides a range of web services that external software systems can use to access up-to-date data from the database. Conclusions BioModels Database has become a recognised reference resource for systems biology. 
It is being used by the community in a variety of ways; for example, it is used to benchmark different simulation systems, and to study the clustering of models based upon their annotations. Model deposition to the database today is advised by several publishers of scientific journals. The models in BioModels Database are freely distributed and reusable; the underlying software infrastructure is also available from SourceForge https://sourceforge.net/projects/biomodels/ under the GNU General Public License. PMID:20587024
Trade study comparing specimen chamber servicing methods for the Space Station Centrifuge Facility
NASA Technical Reports Server (NTRS)
Calvisi, Michael L.; Sun, Sidney C.
1991-01-01
The Specimen Chamber Service Unit, a component of the Space Station Centrifuge Facility, must provide a clean enclosure on a continuing basis for the facility's plant, rodent and primate specimens. The specimen chambers can become soiled and require periodic servicing to maintain a clean environment for the specimens. Two methods of servicing the specimen chambers are discussed: washing the chambers with an on-board washer, or disposing of the soiled chambers and replacing them with clean ones. These issues are addressed by developing several servicing options, using either cleaning or replacement as the method of providing clean specimen chambers, and then evaluating each option against a set of established quantitative and qualitative criteria. Disposing of and replacing the specimen chambers is preferable to washing them.
Tadmor, Brigitta; Tidor, Bruce
2005-09-01
Progress in the life sciences, including genome sequencing and high-throughput experimentation, offers an opportunity for understanding biology and medicine from a systems perspective. This 'new view', which complements the more traditional component-based approach, involves the integration of biological research with approaches from engineering disciplines and computer science. The result is more than a new set of technologies. Rather, it promises a fundamental reconceptualization of the life sciences based on the development of quantitative and predictive models to describe crucial processes. To achieve this change, learning communities are being formed at the interface of the life sciences, engineering and computer science. Through these communities, research and education will be integrated across disciplines and the challenges associated with multidisciplinary team-based science will be addressed.
Implementation of the Hammersmith Infant Neurological Exam in a High-Risk Infant Follow-Up Program
Maitre, Nathalie L; Chorna, Olena; Romeo, Domenico M; Guzzetta, Andrea
2017-01-01
Background High-Risk Infant Follow-Up (HRIF) programs provide early identification and referral for treatment of neurodevelopmental delays and impairments. In these programs, a standardized neurological exam is a critical component of evaluation for clinical and research purposes. Implementation To address primary challenges of provider educational diversity and standardized documentation, we designed an approach to training and implementation of the Hammersmith Infant Neurological Exam (HINE) with pre-course materials, a workshop model and adaptation of the electronic medical record. Conclusions Provider completion and documentation of a neurologic exam were evaluated before and after HINE training. Standardized training and implementation of the HINE in a large HRIF is feasible and effective and allows for quantitative evaluation of neurological findings and developmental trajectories. PMID:27765470
[The role of the Aedes aegypti vector in the epidemiology of dengue in Mexico].
Fernández-Salas, I; Flores-Leal, A
1995-01-01
The role of Aedes aegypti (Linnaeus) in the epidemiology of dengue fever in Mexico is discussed here on the basis of the vectorial capacity model. Comments on the advantages and disadvantages of each model component at the time of field determinations are also presented. Emphasis is placed on the impact of sampling and method bias on the results of vectorial capacity studies. The paper also addresses the need to increase vector biology knowledge as an input for epidemiological work to explain and predict dengue fever outbreaks. Comments on potential entomological variables not considered by the quantitative model are included. Finally, we elaborate on the introduction of Aedes albopictus (Skuse) in Mexico as a new risk factor and on its implications for the understanding of dengue fever transmission in Mexico.
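The vectorial capacity model referred to above has a standard closed form (the Garrett-Jones formulation). The sketch below shows how the model's components combine; the function name and the parameter values are purely illustrative assumptions, not field data from the paper.

```python
import math

def vectorial_capacity(m, a, p, n, b=1.0):
    """Garrett-Jones vectorial capacity: expected number of potentially
    infective bites arising, per case, per day.

    m: vector density relative to hosts
    a: human-biting rate (bites per mosquito per day)
    p: daily survival probability of the vector
    n: extrinsic incubation period (days)
    b: vector competence (proportion of bites that transmit); optional
    """
    # 1 / -ln(p) approximates the vector's infective life expectancy
    return (m * a ** 2 * p ** n * b) / (-math.log(p))

# Hypothetical illustrative values, not estimates for Aedes aegypti:
C = vectorial_capacity(m=10, a=0.5, p=0.9, n=10, b=0.5)
```

Each factor in the numerator corresponds to one of the "model components" whose field determination the paper discusses; bias in any one of them (e.g. survival p, which enters both as p^n and through the log) propagates multiplicatively into C.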
Weaver, A J; Flannelly, L T; Flannelly, K J
2001-09-01
All articles published between 1991 and 1997 in the Journal of Gerontological Nursing and Geriatric Nursing were classified as qualitative research, quantitative research, or non-research. Of the 784 articles reviewed, 5.1% mentioned religion or spirituality. Research articles (7.7%) were more likely than non-research articles (2.8%) to address religion and spirituality. No statistical difference was found between the percentage of qualitative (10.7%) and quantitative (6.8%) studies addressing religious and spiritual factors. The percentage of quantitative studies including religious and spiritual variables was found to be higher than that found by systematic reviews of the research literature in various health professions.
The Practices of Critical Thinking Component and Its Impact in Malaysian Nurses Health Education
ERIC Educational Resources Information Center
Abdullah, Abdul Ghani Kanesan; Alzaidiyeen, Naser Jamil; Yee, Ng Mooi
2010-01-01
The purpose of this research is to study the impact of the critical thinking component in the health education curriculum of nurses for patients with different health needs. Data for this research were gathered using mixed quantitative and qualitative approaches. For the quantitative approach, 84 student nurses were selected randomly to…
Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong
2017-12-01
Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
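Differential evolution itself is a generic population-based optimiser; the abstract does not give the authors' implementation, so the following is a minimal DE/rand/1/bin sketch minimising a toy objective that stands in for the wavelength-selection error. All names and parameter settings (population size, F, CR) are assumptions.

```python
import random

def differential_evolution(fitness, dim, bounds, pop_size=20,
                           F=0.8, CR=0.9, generations=100, seed=0):
    """Minimal DE/rand/1/bin minimiser (a sketch, not the authors' code)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    scores = [fitness(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # pick three distinct individuals other than i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantee at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    trial.append(min(max(v, lo), hi))  # clamp to bounds
                else:
                    trial.append(pop[i][j])
            s = fitness(trial)
            if s <= scores[i]:  # greedy selection
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda k: scores[k])
    return pop[best], scores[best]

# Toy objective: the sphere function stands in for quantitation error
x, err = differential_evolution(lambda v: sum(t * t for t in v),
                                dim=3, bounds=(-5, 5))
```

In the THz setting the candidate vector would instead encode which wavelengths are retained, and the fitness would be the resulting calibration error; that mapping is not specified in the abstract.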
Quantitation without Calibration: Response Profile as an Indicator of Target Amount.
Debnath, Mrittika; Farace, Jessica M; Johnson, Kristopher D; Nesterova, Irina V
2018-06-21
Quantitative assessment of biomarkers is essential in numerous contexts, from decision-making in clinical situations to food quality monitoring to interpretation of life-science research findings. However, appropriate quantitation techniques are not as widely addressed as detection methods. One of the major challenges in biomarker quantitation is the need for a calibration to correlate a measured signal to a target amount. This step complicates the methodologies and makes them less sustainable. In this work we address the issue via a new strategy: relying on the position of the response profile, rather than on an absolute signal value, for assessment of a target's amount. To enable this capability we develop a target-probe binding mechanism based on a negative cooperativity effect. A proof-of-concept example demonstrates that the model is suitable for quantitative analysis of nucleic acids over a wide concentration range. The general principles of the platform will be applicable to a variety of biomarkers such as nucleic acids, proteins, peptides, and others.
Brotto, Lori A; Dunkley, Cara R; Breckon, Erin; Carter, Jeanne; Brown, Carl; Daniluk, Judith; Miller, Dianne
2017-10-03
Sexual health is an integral component of quality of life for cancer survivors, and is often negatively impacted by treatment. Geographic limitations often prohibit survivors from accessing sexual health programs designed to address their needs. This study examined the efficacy of an online, 12-week psychoeducational program, which included elements of mindfulness meditation, for sexual difficulties in survivors of colorectal or gynecologic cancer. Complete pre- and postintervention data were available for 46 women (mean age 55.0, SD 9.6) and 15 men (mean age 59.7, SD 6.8). Women experienced significant improvements in sex-related distress (p < .001), sexual function (p < .001 and p < .01), and mood (p < .001); these results were maintained at six months follow-up. Men's improvement in desire was not significant (p = .06), whereas intercourse satisfaction was (p < .05) immediately after the program, but not at follow-up. In order to more fully explore women's experiences, interviews were carried out with six participants and analyzed using narrative inquiry. Women shared a feeling of renewed hope for regaining their sex lives, and expressed that they would have valued an interactive component to the program. These findings suggest that an online, unidirectional psychoeducational program is feasible, and may be effective for women survivors of gynecologic and colorectal cancer, but further work is needed to ensure that online interventions address the sexual health needs of male survivors.
Shao, Xueguang; Yu, Zhengliang; Ma, Chaoxiong
2004-06-01
An improved method is proposed for the quantitative determination of multicomponent overlapping chromatograms based on a known transmutation method. To overcome the main limitation of the transmutation method, namely the oscillation generated in the transmutation process, two techniques were adopted: wavelet transform smoothing and cubic spline interpolation for reducing data points; a new criterion was also developed. By using the proposed algorithm, the oscillation can be suppressed effectively, and quantitative determination of the components in both simulated and experimental overlapping chromatograms is successfully obtained.
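The paper's wavelet smoothing step is not specified in detail; as a hedged illustration, the simplest instance of wavelet-transform smoothing is a one-level Haar transform that zeroes small detail coefficients (treated as oscillation/noise) before inverting. The function name and threshold are hypothetical.

```python
def haar_smooth(signal, threshold):
    """One-level Haar wavelet smoothing sketch: transform, zero small
    detail coefficients, then invert. Expects an even-length signal."""
    pairs = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(pairs)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(pairs)]
    # details below the threshold are treated as oscillation and removed
    detail = [d if abs(d) > threshold else 0.0 for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])  # inverse Haar step
    return out
```

A real implementation would use several decomposition levels and a smoother wavelet, but the principle (suppressing small high-frequency coefficients while keeping the large-scale chromatogram shape) is the same.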
Jiang, Shun-Yuan; Sun, Hong-Bing; Sun, Hui; Ma, Yu-Ying; Chen, Hong-Yu; Zhu, Wen-Tao; Zhou, Yi
2016-03-01
This paper aims to explore a comprehensive assessment method combining traditional Chinese medicinal material specifications with quantitative quality indicators. Seventy-six samples of Notopterygii Rhizoma et Radix were collected on the market and at producing areas. Traditional commercial specifications were described and assigned, and 10 chemical components and volatile oils were determined for each sample. Cluster analysis, Fisher discriminant analysis and correspondence analysis were used to establish the relationship between the traditional qualitative commercial specifications and the quantitative chemical indices, in order to comprehensively evaluate the quality of the medicinal materials and to classify commercial grade and quality grade quantitatively. A herb quality index (HQI) combining traditional commercial specifications and chemical components for quantitative grade classification was established, and corresponding discriminant functions were derived for precise determination of the quality grade and sub-grade of Notopterygii Rhizoma et Radix. The results showed that notopterol, isoimperatorin and volatile oil were the major components for determining chemical quality, and their dividing values were specified for every grade and sub-grade of the commercial materials of Notopterygii Rhizoma et Radix. Accordingly, the essential relationship between traditional medicinal indicators, qualitative commercial specifications and quantitative chemical composition indicators can be examined by K-means clustering, Fisher discriminant analysis and correspondence analysis, which provides a new method for comprehensive quantitative evaluation of traditional Chinese medicine quality that integrates traditional commodity specifications and modern quantitative chemical indices. Copyright© by the Chinese Pharmaceutical Association.
Multivariate proteomic profiling identifies novel accessory proteins of coated vesicles
Antrobus, Robin; Hirst, Jennifer; Bhumbra, Gary S.; Kozik, Patrycja; Jackson, Lauren P.; Sahlender, Daniela A.
2012-01-01
Despite recent advances in mass spectrometry, proteomic characterization of transport vesicles remains challenging. Here, we describe a multivariate proteomics approach to analyzing clathrin-coated vesicles (CCVs) from HeLa cells. siRNA knockdown of coat components and different fractionation protocols were used to obtain modified coated vesicle-enriched fractions, which were compared by stable isotope labeling of amino acids in cell culture (SILAC)-based quantitative mass spectrometry. 10 datasets were combined through principal component analysis into a “profiling” cluster analysis. Overall, 136 CCV-associated proteins were predicted, including 36 new proteins. The method identified >93% of established CCV coat proteins and assigned >91% correctly to intracellular or endocytic CCVs. Furthermore, the profiling analysis extends to less well characterized types of coated vesicles, and we identify and characterize the first AP-4 accessory protein, which we have named tepsin. Finally, our data explain how sequestration of TACC3 in cytosolic clathrin cages causes the severe mitotic defects observed in auxilin-depleted cells. The profiling approach can be adapted to address related cell and systems biological questions. PMID:22472443
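Principal component analysis of the combined SILAC datasets is the core of the profiling method. As a generic illustration (not the authors' pipeline), the leading principal component of mean-centred profile data can be extracted by power iteration on the sample covariance matrix; all names here are assumptions.

```python
def first_principal_component(data, iters=100):
    """Power-iteration sketch of the leading PC of mean-centred data.
    `data` is a list of equal-length numeric rows (e.g. ratio profiles)."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    x = [[row[j] - means[j] for j in range(d)] for row in data]
    # sample covariance matrix of the centred data
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]  # renormalise each iteration
    return v
```

In practice one would use a linear-algebra library and retain several components before clustering proteins by their positions in PC space, as the profiling approach does.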
Self-contained microfluidic systems: a review.
Boyd-Moss, Mitchell; Baratchi, Sara; Di Venere, Martina; Khoshmanesh, Khashayar
2016-08-16
Microfluidic systems enable rapid diagnosis, screening and monitoring of diseases and health conditions using small amounts of biological samples and reagents. Despite these remarkable features, conventional microfluidic systems rely on bulky expensive external equipment, which hinders their utility as powerful analysis tools outside of research laboratories. 'Self-contained' microfluidic systems, which contain all necessary components to facilitate a complete assay, have been developed to address this limitation. In this review, we provide an in-depth overview of self-contained microfluidic systems. We categorise these systems based on their operating mechanisms into three major groups: passive, hand-powered and active. Several examples are provided to discuss the structure, capabilities and shortcomings of each group. In particular, we discuss the self-contained microfluidic systems enabled by active mechanisms, due to their unique capability for running multi-step and highly controllable diagnostic assays. Integration of self-contained microfluidic systems with the image acquisition and processing capabilities of smartphones, especially those equipped with accessory optical components, enables highly sensitive and quantitative assays, which are discussed. Finally, the future trends and possible solutions to expand the versatility of self-contained, stand-alone microfluidic platforms are outlined.
Improving the lifetime in optical microtraps by using elliptically polarized dipole light
NASA Astrophysics Data System (ADS)
Garcia, Sébastien; Reichel, Jakob; Long, Romain
2018-02-01
Tightly focused optical dipole traps induce vector light shifts ("fictitious magnetic fields") which complicate their use for single-atom trapping and manipulation. The problem can be mitigated by adding a larger, real magnetic field, but this solution is not always applicable; in particular, it precludes fast switching to a field-free configuration. Here we show that this issue can be addressed elegantly by deliberately adding a small elliptical polarization component to the dipole trap beam. In our experiments with single 87Rb atoms laser-cooled in a chopped trap, we observe improvements up to a factor of 11 of the trap lifetime compared to the standard, seemingly ideal linear polarization. This effect results from a modification of heating processes via spin-state diffusion in state-dependent trapping potentials. We develop Monte Carlo simulations of the evolution of the atom's internal and motional states and find that they agree quantitatively with the experimental data. The method is general and can be applied in all experiments where the longitudinal polarization component is non-negligible.
Sinkiewicz, Daniel; Friesen, Lendra; Ghoraani, Behnaz
2017-02-01
Cortical auditory evoked potentials (CAEPs) are used to evaluate cochlear implant (CI) patients' auditory pathways, but the CI device produces an electrical artifact that obscures the relevant information in the neural response. There are currently multiple methods that attempt to recover the neural response from the contaminated CAEP, but no gold standard that can quantitatively confirm their effectiveness. To address this crucial shortcoming, we develop a wavelet-based method to quantify the amount of artifact energy in the neural response. In addition, a novel technique for extracting the neural response from single-channel CAEPs is proposed. The new method uses matching pursuit (MP)-based feature extraction to represent the contaminated CAEP in a feature space, and support vector machines (SVMs) to classify the components as normal hearing (NH) or artifact. The NH components are combined to recover the neural response without artifact energy, as verified using the evaluation tool. Although it needs some further evaluation, this approach is a promising method of electrical artifact removal from CAEPs. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
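Matching pursuit is a standard greedy decomposition: at each step, the dictionary atom most correlated with the current residual is selected and subtracted. The sketch below illustrates the idea on a toy orthonormal dictionary; the authors' actual dictionary and feature space are not given in the abstract, so everything here is an assumption.

```python
def matching_pursuit(signal, dictionary, n_atoms):
    """Greedy matching-pursuit sketch over unit-norm atoms."""
    residual = list(signal)
    selection = []
    for _ in range(n_atoms):
        best, best_coef = None, 0.0
        for k, atom in enumerate(dictionary):
            # correlation of this atom with the current residual
            coef = sum(r * a for r, a in zip(residual, atom))
            if abs(coef) > abs(best_coef):
                best, best_coef = k, coef
        if best is None:  # residual orthogonal to every atom
            break
        selection.append((best, best_coef))
        residual = [r - best_coef * a
                    for r, a in zip(residual, dictionary[best])]
    return selection, residual
```

In the CAEP setting, the selected (atom, coefficient) pairs form the feature representation that an SVM could then label as neural-response-like or artifact-like.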
FIRST INFRARED BAND STRENGTHS FOR AMORPHOUS CO2, AN OVERLOOKED COMPONENT OF INTERSTELLAR ICES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerakines, Perry A.; Hudson, Reggie L., E-mail: Reggie.Hudson@NASA.gov
2015-08-01
Solid carbon dioxide (CO2) has long been recognized as a component of both interstellar and solar system ices, but a recent literature search has revealed significant qualitative and quantitative discrepancies in the laboratory spectra on which the abundances of extraterrestrial CO2 are based. Here we report new infrared (IR) spectra of amorphous CO2-ice along with band intensities (band strengths) of four mid-IR absorptions, the first such results in the literature. A possible thickness dependence for amorphous-CO2 IR band shapes and positions also is investigated, and the three discordant reports of amorphous CO2 spectra in the literature are addressed. Applications of our results are discussed with an emphasis on laboratory investigations and results from astronomical observations. A careful comparison with earlier work shows that the IR spectra calculated from several databases for CO2 ices, all ices being made near 10 K, are not for amorphous CO2, but rather for crystalline CO2 or crystalline-amorphous mixtures.
Aiemjoy, Kristen; Stoller, Nicole E; Gebresillasie, Sintayehu; Shiferaw, Ayalew; Tadesse, Zerihun; Sewnet, Tegene; Ayele, Bezuayehu; Chanyalew, Melsew; Callahan, Kelly; Stewart, Aisha; Emerson, Paul M; Lietman, Thomas M; Keenan, Jeremy D; Oldenburg, Catherine E
2016-10-01
Face cleanliness is a core component of the SAFE (Surgery, Antibiotics, Facial cleanliness, and Environmental improvements) strategy for trachoma control. Understanding knowledge, attitudes, and behaviors related to face washing may be helpful for designing effective interventions for improving facial cleanliness. In April 2014, a mixed methods study including focus groups and a quantitative cross-sectional study was conducted in the East Gojjam zone of the Amhara region of Ethiopia. Participants were asked about face washing practices, motivations for face washing, use of soap (which may reduce bacterial load), and fly control strategies. Overall, both knowledge and reported practice of face washing was high. Participants reported they knew that washing their own face and their children's faces daily was important for hygiene and infection control. Although participants reported high knowledge of the importance of soap for face washing, quantitative data revealed strong variations by community in the use of soap for face washing, ranging from 4.4% to 82.2% of households reporting using soap for face washing. Cost and forgetfulness were cited as barriers to the use of soap for face washing. Keeping flies from landing on children was a commonly cited motivator for regular face washing, as was trachoma prevention. Interventions aiming to improve facial cleanliness for trachoma prevention should focus on habit formation (to address forgetfulness) and address barriers to the use of soap, such as reducing cost. Interventions that focus solely on improving knowledge may not be effective for changing face-washing behaviors.
Rigoard, P; Nivole, K; Blouin, P; Monlezun, O; Roulaud, M; Lorgeoux, B; Bataille, B; Guetarni, F
2015-03-01
One of the major challenges of neurostimulation is to address the back pain component in patients suffering from refractory chronic back and leg pain. Facing a tremendous expansion of neurostimulation techniques and available devices, implanters and patients can still remain confused, as they need to select the right tool for the right indication. To be able to evaluate and compare patient outcomes objectively, depending on therapeutic strategies, it appears essential to develop a rational and quantitative approach to pain assessment for those who undergo neurostimulation implantation. We developed a touch-screen interface in Poitiers University Hospital and N(3)Lab, called the "Neuro-Pain'T", to detect, record and quantify changes in the surface and intensity of the painful area in an implanted patient over time. The second aim of this software is to analyse the link between the paraesthesia coverage generated by a type of neurostimulation and a potential analgesic effect, measured by pain surface reduction, pain intensity reduction within the painful surface, and local change in the distribution of pain characteristics. The third aim of Neuro-Pain'T is to correlate these clinical parameters with global patient data and functional outcome analysis, via a network database (Neuro-Database), to be able to provide a concise but objective measure of neurostimulation efficacy, summarized by an index called the "RFG Index". This software has been used in more than 190 patients since 2012, leading us to define three clinical parameters grouped as the clinical component of the RFG Index, which might be helpful for assessing neurostimulation efficacy and comparing implanted devices. The Neuro-Pain'T is an original software designed to objectively and quantitatively characterize the reduction of a painful area in a given individual, in terms of intensity, surface and pain typology, in response to a treatment strategy or implantation of an analgesic device.
Because pain is a physical sensation, which integrates a psychological dimension, its assessment justifies the use of multidimensional and global evaluation scales. However, in the context of neurostimulation and comparative clinical trials designed to test the technical efficacy of a given device, a simple, objective and quantitative evaluation tool could help to guide tomorrow's treatment options by transforming personal convictions into a more robust scientific rationale based on data collection and data mining techniques. Copyright © 2014. Published by Elsevier Masson SAS.
Synthesising quantitative and qualitative research in evidence‐based patient information
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-01-01
Background Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence‐based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. Aims This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Methods Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non‐quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg “explain what the test involves”) was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. Results 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. 
Conclusions A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review. PMID:17325406
A general method for assessing brain-computer interface performance and its limitations
NASA Astrophysics Data System (ADS)
Hill, N. Jeremy; Häuser, Ann-Katrin; Schalk, Gerwin
2014-04-01
Objective. When researchers evaluate brain-computer interface (BCI) systems, we want quantitative answers to questions such as: How good is the system’s performance? How good does it need to be? and: Is it capable of reaching the desired level in future? In response to the current lack of objective, quantitative, study-independent approaches, we introduce methods that help to address such questions. We identified three challenges: (I) the need for efficient measurement techniques that adapt rapidly and reliably to capture a wide range of performance levels; (II) the need to express results in a way that allows comparison between similar but non-identical tasks; (III) the need to measure the extent to which certain components of a BCI system (e.g. the signal processing pipeline) not only support BCI performance, but also potentially restrict the maximum level it can reach. Approach. For challenge (I), we developed an automatic staircase method that adjusted task difficulty adaptively along a single abstract axis. For challenge (II), we used the rate of information gain between two Bernoulli distributions: one reflecting the observed success rate, the other reflecting chance performance estimated by a matched random-walk method. This measure includes Wolpaw’s information transfer rate as a special case, but addresses the latter’s limitations including its restriction to item-selection tasks. To validate our approach and address challenge (III), we compared four healthy subjects’ performance using an EEG-based BCI, a ‘Direct Controller’ (a high-performance hardware input device), and a ‘Pseudo-BCI Controller’ (the same input device, but with control signals processed by the BCI signal processing pipeline). Main results. Our results confirm the repeatability and validity of our measures, and indicate that our BCI signal processing pipeline reduced attainable performance by about 33% (21 bits min^-1). Significance.
Our approach provides a flexible basis for evaluating BCI performance and its limitations, across a wide range of tasks and task difficulties.
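Wolpaw's information transfer rate, which the measure above generalises, has a simple closed form for an N-choice selection task. The sketch below is a hedged illustration; the function name and the convention of clamping at-or-below-chance accuracy to zero bits are my own assumptions, not part of the paper.

```python
import math

def wolpaw_itr(n_choices, accuracy, selections_per_min):
    """Wolpaw information transfer rate in bits/min for an N-choice task."""
    n, p = n_choices, accuracy
    if p >= 1.0:
        bits = math.log2(n)
    elif p <= 1.0 / n:
        bits = 0.0  # at or below chance: no information, by convention
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * selections_per_min
```

The formula assumes errors are spread uniformly over the N-1 wrong items and that every trial is a discrete selection, which is exactly the restriction to item-selection tasks that the paper's Bernoulli-based information-gain measure is designed to lift.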
ERIC Educational Resources Information Center
McLafferty, Charles L., Jr.; Slate, John R.; Onwuegbuzie, Anthony J.
2010-01-01
Quantitative research dominates published literature in the helping professions. Mixed methods research, which integrates quantitative and qualitative methodologies, has received a lukewarm reception. The authors address the iterative separation that infuses theory, praxis, philosophy, methodology, training, and public perception and propose a…
Biomathematical modeling of pulsatile hormone secretion: a historical perspective.
Evans, William S; Farhy, Leon S; Johnson, Michael L
2009-01-01
Shortly after the recognition of the profound physiological significance of the pulsatile nature of hormone secretion, computer-based modeling techniques were introduced for the identification and characterization of such pulses. Whereas these earlier approaches defined perturbations in hormone concentration-time series, deconvolution procedures were subsequently employed to separate such pulses into their secretion event and clearance components. Stochastic differential equation modeling was also used to define basal and pulsatile hormone secretion. To assess the regulation of individual components within a hormone network, a method that quantitated approximate entropy within hormone concentration-time series was described. To define relationships within coupled hormone systems, methods including cross-correlation and cross-approximate entropy were utilized. To address some of the inherent limitations of these methods, modeling techniques with which to appraise the strength of feedback signaling between and among hormone-secreting components of a network have been developed. Techniques such as dynamic modeling have been utilized to reconstruct dose-response interactions between hormones within coupled systems. A logical extension of these advances will require the development of mathematical methods with which to approximate endocrine networks exhibiting multiple feedback interactions and subsequently reconstruct their parameters based on experimental data for the purpose of testing regulatory hypotheses and estimating alterations in hormone release control mechanisms.
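Approximate entropy, mentioned above as a regularity measure for hormone concentration-time series, follows Pincus's definition: it compares how often length-m templates match (within tolerance r) against how often length-(m+1) templates do. A minimal sketch, with conventional rather than author-specified parameter defaults:

```python
import math

def approximate_entropy(series, m=2, r=0.2):
    """Pincus approximate entropy: values near zero indicate a regular,
    predictable series; larger values indicate irregularity."""
    n = len(series)

    def phi(dim):
        templates = [series[i:i + dim] for i in range(n - dim + 1)]
        total = 0.0
        for t1 in templates:
            # count templates within Chebyshev distance r (self-match included)
            matches = sum(
                1 for t2 in templates
                if max(abs(a - b) for a, b in zip(t1, t2)) <= r)
            total += math.log(matches / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)
```

In endocrine applications r is typically set as a fraction of the series' standard deviation; a perfectly alternating series scores near zero because knowing m points makes the (m+1)th almost fully predictable.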
Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site
NASA Astrophysics Data System (ADS)
Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.
2012-04-01
The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered. Indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take uncertainty into account in the analysis. In this perspective, a new hazard modeling method is developed and integrated in a program named ALICE. This program integrates mechanical stability analysis through GIS software, taking data uncertainty into account. The method proposes a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, an expert-based approach is still necessary to finalize the maps: it is the only way to take into account some influential factors in slope stability, such as heterogeneity of the geological formations or effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data and so forth, the program classifies hazard zones depending on geotechnics and on different hydrological contexts varying in time.
This communication, realized within the framework of the Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
Quantitative Evaluation of Protein Heterogeneity within Herpes Simplex Virus 1 Particles.
El Bilali, Nabil; Duron, Johanne; Gingras, Diane; Lippé, Roger
2017-05-15
Several virulence genes have been identified thus far in the herpes simplex virus 1 genome. It is also generally accepted that protein heterogeneity among virions further impacts viral fitness. However, linking this variability directly with infectivity has been challenging at the individual viral particle level. To address this issue, we resorted to flow cytometry (flow virometry), a powerful approach we recently employed to analyze individual viral particles, to identify which tegument proteins vary and directly address if such variability is biologically relevant. We found that the stoichiometry of the UL37, ICP0, and VP11/12 tegument proteins in virions is more stable than the VP16 and VP22 tegument proteins, which varied significantly among viral particles. Most interestingly, viruses sorted for their high VP16 or VP22 content yielded modest but reproducible increases in infectivity compared to their corresponding counterparts containing low VP16 or VP22 content. These findings were corroborated for VP16 in short interfering RNA experiments but proved intriguingly more complex for VP22. An analysis by quantitative Western blotting revealed substantial alterations of virion composition upon manipulation of individual tegument proteins and suggests that VP22 protein levels acted indirectly on viral fitness. These findings reaffirm the interdependence of the virion components and corroborate that viral fitness is influenced not only by the genome of viruses but also by the stoichiometry of proteins within each virion. IMPORTANCE The ability of viruses to spread in animals has been mapped to several viral genes, but other factors are clearly involved, including virion heterogeneity. To directly probe whether the latter influences viral fitness, we analyzed the protein content of individual herpes simplex virus 1 particles using an innovative flow cytometry approach.
The data confirm that some viral proteins are incorporated in more controlled amounts, while others vary substantially. Interestingly, this correlates with the VP16 trans-activating viral protein and indirectly with VP22, a second virion component whose modulation profoundly alters virion composition. This reaffirms that not only the presence but also the amount of specific tegument proteins is an important determinant of viral fitness. Copyright © 2017 American Society for Microbiology.
Kellogg, Joshua J; Graf, Tyler N; Paine, Mary F; McCune, Jeannine S; Kvalheim, Olav M; Oberlies, Nicholas H; Cech, Nadja B
2017-05-26
A challenge that must be addressed when conducting studies with complex natural products is how to evaluate their complexity and variability. Traditional methods of quantifying a single metabolite or a small range of metabolites may not capture the full chemical complexity of multiple samples. Different metabolomics approaches were evaluated to discern how they facilitated comparison of the chemical composition of commercial green tea [Camellia sinensis (L.) Kuntze] products, with the goal of capturing the variability of commercially used products and selecting representative products for in vitro or clinical evaluation. Three metabolomics-related methods (untargeted ultraperformance liquid chromatography-mass spectrometry (UPLC-MS), targeted UPLC-MS, and untargeted quantitative 1H NMR) were employed to characterize 34 commercially available green tea samples. Of these methods, untargeted UPLC-MS was most effective at discriminating between green tea, green tea supplement, and non-green-tea products. A method using reproduced correlation coefficients calculated from principal component analysis models was developed to quantitatively compare differences among samples. The results demonstrated the utility of metabolomics employing UPLC-MS data for evaluating similarities and differences between complex botanical products.
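The PCA-based comparison described above can be illustrated with a minimal sketch. The feature matrix below is a random stand-in for UPLC-MS data (the study used 34 green tea products), and the per-sample correlation between a profile and its low-rank reconstruction is only a simple proxy for the reproduced-correlation calculation, not necessarily the exact formulation the study used:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical feature matrix: 34 samples x 200 mass-spectral features.
rng = np.random.default_rng(1)
X = rng.normal(size=(34, 200))

# Fit a PCA model and reconstruct each sample from the first 3 components.
pca = PCA(n_components=3)
scores = pca.fit_transform(X)
X_hat = pca.inverse_transform(scores)

# Per-sample correlation between the original and PCA-reproduced profile:
# samples well described by the model correlate strongly with their reconstruction.
r = np.array([np.corrcoef(X[i], X_hat[i])[0, 1] for i in range(X.shape[0])])
print(r.shape)
```

With real spectral data, comparing these correlations across samples gives a quantitative handle on how similarly the model explains each product.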
Yang, Yongxin; Zhao, Xiaowei; Yu, Shumin; Cao, Suizhong
2015-02-01
Yak (Bos grunniens) is an important natural resource in mountainous regions. To date, few studies have addressed the differences in the protein profiles of yak colostrum and milk. We used quantitative proteomics to compare the protein profiles of whey from yak colostrum and milk. Milk samples were collected from 21 yaks after calving (1 and 28 d). Whey protein profiles were generated through isobaric tag for relative and absolute quantification (iTRAQ)-labelled proteomics. We identified 183 proteins in milk whey; of these, the expression levels of 86 proteins differed significantly between the whey from colostrum and milk. Haemoglobin expression showed the greatest change; its levels were significantly higher in the whey from colostrum than in mature milk whey. Functional analysis revealed that many of the differentially expressed proteins were associated with biological regulation and response to stimuli. Further, eight differentially expressed proteins involved in the complement and coagulation cascade pathway were enriched in milk whey. These findings add to the general understanding of the protein composition of yak milk, suggest potential functions of the differentially expressed proteins, and provide novel information on the role of colostral components in calf survival. © 2014 Society of Chemical Industry.
Applying Mixed Methods Research at the Synthesis Level: An Overview
ERIC Educational Resources Information Center
Heyvaert, Mieke; Maes, Bea; Onghena, Patrick
2011-01-01
Historically, qualitative and quantitative approaches have been applied relatively separately in synthesizing qualitative and quantitative evidence, respectively, in several research domains. However, mixed methods approaches are becoming increasingly popular nowadays, and practices of combining qualitative and quantitative research components at…
Robust red-emission spectra and yields in firefly bioluminescence against temperature changes
NASA Astrophysics Data System (ADS)
Mochizuki, Toshimitsu; Wang, Yu; Hiyama, Miyabi; Akiyama, Hidefumi
2014-05-01
We measured the quantitative spectra of firefly (Photinus pyralis) bioluminescence at various temperatures to investigate the temperature dependence of the luciferin-luciferase reaction at 15-34 °C. The quantitative spectra were decomposed very well into red (1.9 eV), orange (2.0 eV), and green (2.2 eV) Gaussian components. The intensity of the green component was the only temperature-sensitive quantity, decreasing linearly as the temperature increased at pH 7 and 8. We found the quantitative bioluminescence spectra to be robust below 2.0 eV against temperature and other experimental conditions. The revealed robustness of the red emissions should be useful for quantitative applications such as adenosine-5'-triphosphate detection.
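The three-Gaussian decomposition described above can be sketched with standard least-squares fitting. This is a toy example on synthetic data with components placed at the reported red (1.9 eV), orange (2.0 eV), and green (2.2 eV) positions; the amplitudes, widths, and noise level are invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(e, amp, center, width):
    """Single Gaussian component on an energy axis (eV)."""
    return amp * np.exp(-((e - center) ** 2) / (2 * width ** 2))

def three_gaussians(e, a1, c1, w1, a2, c2, w2, a3, c3, w3):
    return (gaussian(e, a1, c1, w1)
            + gaussian(e, a2, c2, w2)
            + gaussian(e, a3, c3, w3))

# Synthetic spectrum with components near the reported red (1.9 eV),
# orange (2.0 eV), and green (2.2 eV) peaks, plus small measurement noise.
energy = np.linspace(1.6, 2.6, 500)
true = three_gaussians(energy, 1.0, 1.9, 0.05, 0.8, 2.0, 0.06, 1.5, 2.2, 0.08)
rng = np.random.default_rng(0)
spectrum = true + rng.normal(0, 0.01, energy.size)

# Initial guesses seeded at the three reported peak positions.
p0 = [1, 1.9, 0.05, 1, 2.0, 0.05, 1, 2.2, 0.05]
popt, _ = curve_fit(three_gaussians, energy, spectrum, p0=p0)
centers = sorted(popt[1::3])
print([round(c, 2) for c in centers])
```

Tracking the fitted amplitude of the green component across temperatures would reproduce the kind of linear trend the abstract reports.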
Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing
2015-03-01
A new quantitative analysis of multi-component with single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal reference substances to investigate how chemical structure, the concentrations of the quantitative components, and the purities of the standard substances influence the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on accuracy. A new method for calculating the relative correction factors by linear regression was established (the linear regression method), which reduced the deviation of the QAMS method from the external standard method from a range of 1.20%±0.02% to 23.29%±3.23% down to a range of 0.10%±0.09% to 8.84%±2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by linear regression were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples, and mostly below 10% for Rf, Rg2, R4 and N-K in all 24 samples (the differences for these 4 constituents were larger because their contents were lower). The results indicated that contents assayed by the new QAMS method can be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the quantitative components assayed by the QAMS method was established for the first time, which ensures high accuracy and can be applied to QAMS methods for other TCMs.
The present study demonstrated the practicability of the QAMS method for quantitative multi-component analysis and for the quality control of TCMs and TCM prescriptions. Copyright © 2014 Elsevier B.V. All rights reserved.
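The linear-regression route to a relative correction factor can be sketched as follows. All concentrations and peak areas below are invented for illustration, and the exact formulation in the study may differ; the core idea is that the factor is the ratio of two calibration slopes rather than a single-point area ratio:

```python
import numpy as np

# Hypothetical calibration series (concentration in ug/mL, peak areas in a.u.).
conc = np.array([5., 10., 20., 40., 80.])
area_ref = np.array([52., 101., 205., 398., 810.])     # internal reference saponin
area_analyte = np.array([33., 66., 130., 262., 520.])  # target analyte

# Slope of area vs. concentration for each compound by least-squares regression
# (this is the "linear regression method" variant of the correction factor).
slope_ref = np.polyfit(conc, area_ref, 1)[0]
slope_analyte = np.polyfit(conc, area_analyte, 1)[0]

# Relative correction factor: ratio of the two calibration slopes.
f = slope_analyte / slope_ref

# With f, the analyte can be quantified from the reference's calibration alone,
# without a dedicated standard for every analyte in every run.
sample_area_analyte = 200.0
c_analyte = sample_area_analyte / (f * slope_ref)
print(round(f, 3), round(c_analyte, 1))
```

Comparing `c_analyte` against a direct external-standard determination is exactly the kind of cross-check the abstract reports for the 24 notoginseng samples.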
Quantitative multiplex detection of pathogen biomarkers
Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I.; Martinez, Jennifer; Grace, Wynne K.
2016-02-09
The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules (e.g., pathogen biomarkers) through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multiple sensor elements per channel.
Quantitative multiplex detection of pathogen biomarkers
Mukundan, Harshini; Xie, Hongzhi; Swanson, Basil I; Martinez, Jennifer; Grace, Wynne K
2014-10-14
The present invention addresses the simultaneous detection and quantitative measurement of multiple biomolecules (e.g., pathogen biomarkers) through either a sandwich assay approach or a lipid insertion approach. The invention can further employ a multichannel structure with multiple sensor elements per channel.
Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang
2013-05-01
To establish a new method for quality evaluation and to validate its feasibility by the simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids (oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine) were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the two approaches. No significant differences were found in the quantitative results for the five alkaloids in the 21 batches determined by the external standard method and QAMS. It is therefore feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
Studies of satellite support to weather modification in the western US region
NASA Technical Reports Server (NTRS)
Cotton, W. R.; Grant, L. O.; Vonderhaar, T. H.
1978-01-01
The applications of meteorological satellite data to both summer and winter weather modification programs are addressed. Appraisals of the capability of satellites to assess seedability, to provide real-time operational support, and to assist in the post-experiment analysis of a seeding experiment led to the incorporation of satellite observing systems as a major component in the Bureau of Reclamation's weather modification activities. Satellite observations are an integral part of the South Park Area cumulus experiment (SPACE), which aims to formulate a quantitative hypothesis for enhancing precipitation from orographically induced summertime mesoscale convective systems (orogenic mesoscale systems). Progress is reported in using satellite observations to assist in classifying the important mesoscale systems, defining their frequency and coverage, and determining their potential area of effect. Satellite studies of severe storms are also covered.
Highlights of NASA's Role in Developing State-of-the-Art Nondestructive Evaluation for Composites
NASA Technical Reports Server (NTRS)
2001-01-01
Since the 1970s, when the promise of composites was being pursued for aeronautics applications, NASA has had programs that addressed the development of NDE methods for composites. These efforts included both microscopic and macroscopic NDE. At the microscopic level, NDE investigations interrogated composites at the submicron to micron level to understand a composite's microstructure. A novel microfocus CT system was developed, as well as the science underlying applications of acoustic microscopy to a composite's component material properties. On the macroscopic scale, NDE techniques were developed that advanced the capabilities to be faster and more quantitative. Techniques such as stiffness imaging, ultrasonic arrays, laser-based ultrasound, advanced acoustic emission, thermography, and novel health monitoring systems were researched. Underlying these methods has been a strong modeling capability that has aided in method development.
In Search of the Physics: The Interplay of Experiment and Computation in Slat Aeroacoustics
NASA Technical Reports Server (NTRS)
Khorrami, Mehdi R.; Choudhari, Meelan; Singer, Bart A.; Lockard, David P.; Streett, Craig L.
2003-01-01
The synergistic use of experiments and numerical simulations can uncover the underlying physics of airframe noise sources. We focus on the high-lift noise component associated with a leading-edge slat; flap side-edge noise is discussed in a companion paper by Streett et al. (2003). The present paper provides an overview of how slat noise was split into subcomponents and analyzed with carefully planned complementary experimental and numerical tests. We consider both tonal and broadband aspects of slat noise. The predicted far-field noise spectra are shown to be in good qualitative (and, to a lesser extent, quantitative) agreement with acoustic array measurements. Although some questions remain unanswered, the success of current airframe noise studies provides ample promise that remaining technical issues can be successfully addressed in the near future.
Potter, Jacqueline N; Hanley, Gregory P; Augustine, Matotopa; Clay, Casey J; Phelps, Meredith C
2013-01-01
Use of automatically reinforced stereotypy as reinforcement has been shown to be successful for increasing socially desirable behaviors in persons with intellectual disabilities (Charlop, Kurtz, & Casey, 1990; Hanley, Iwata, Thompson, & Lindberg, 2000; Hung, 1978). A component analysis of this treatment was conducted with 3 adolescents who had been diagnosed with autism, and then extended by (a) progressively increasing the quantitative and qualitative aspects of the response requirement to earn access to stereotypy, (b) arranging objective measures of client preference for contingent access to stereotypy compared to other relevant treatments for their automatically reinforced stereotypy, and (c) assessing the social validity of this treatment with other relevant stakeholders. Implications for addressing stereotypy and increasing the leisure skills of adolescents with autism are discussed. © Society for the Experimental Analysis of Behavior.
Qualitative methods in environmental health research.
Brown, Phil
2003-01-01
Public health researchers increasingly turn to qualitative methods either on their own or in combination with quantitative methods. Qualitative methods are especially important to community environmental health research, as they provide a way to produce community narratives that give voice to individuals and characterize the community in a full and complex fashion. This article first traces the legacy of qualitative research in environmental health, then uses a case study of the author's experiences studying the Woburn, Massachusetts, childhood leukemia cluster to provide personal and scholarly insights on qualitative approaches. That material then informs a discussion of important components of qualitative methods in environmental health research, including flexible study design, access, trust, empathy, and personal shifts in the researcher's worldview, bias, and the nature of the researcher's roles. A concluding discussion addresses issues in funding policy and research practices. PMID:14594634
Research highlights: microfluidics meets big data.
Tseng, Peter; Weaver, Westbrook M; Masaeli, Mahdokht; Owsley, Keegan; Di Carlo, Dino
2014-03-07
In this issue we highlight a collection of recent work in which microfluidic parallelization and automation have been employed to address the increasing need for large amounts of quantitative data concerning cellular function: correlating microRNA levels with protein expression, increasing the throughput and reducing the noise when studying protein dynamics in single cells, and understanding how signal dynamics encode information. The painstaking dissection of cellular pathways one protein at a time appears to be coming to an end, leading to more rapid discoveries that will inevitably translate to better cellular control, both in producing useful gene products and in treating disease at the individual cell level. From these studies it is also clear that the development of large-scale mutant or fusion libraries, together with automation of microscopy, image analysis, and data extraction, will be key components as microfluidics contributes its strengths to systems biology moving forward.
Theoretical Foundations of Remote Sensing for Glacier Assessment and Mapping
NASA Technical Reports Server (NTRS)
Bishop, Michael P.; Bush, Andrew B. G.; Furfaro, Roberto; Gillespie, Alan R.; Hall, Dorothy K.; Haritashya, Umesh K.; Shroder, John F., Jr.
2014-01-01
The international scientific community is actively engaged in assessing ice sheet and alpine glacier fluctuations at a variety of scales. The availability of stereoscopic, multitemporal, and multispectral satellite imagery from the optical wavelength regions of the electromagnetic spectrum has greatly increased our ability to assess glaciological conditions and map the cryosphere. There are, however, important issues and limitations associated with accurate satellite information extraction and mapping, as well as new opportunities for assessment and mapping that are all rooted in understanding the fundamentals of the radiation transfer cascade. We address the primary radiation transfer components, relate them to glacier dynamics and mapping, and summarize the analytical approaches that permit transformation of spectral variation into thematic and quantitative parameters. We also discuss the integration of satellite-derived information into numerical modeling approaches to facilitate understandings of glacier dynamics and causal mechanisms.
Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He
2017-07-01
This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay. Rhubarb dispensing granules were used as the model drug for this demonstrative study. An ultra-high performance liquid chromatography (UPLC) method was adopted for the simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches was determined in a compound diphenoxylate tablets-induced mouse constipation model; the blood-activating biopotency was determined in an in vitro rat antiplatelet aggregation model; SPSS 22.0 was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood-activating biopotencies. The multi-component simultaneous quantitative analysis showed a great difference in chemical characterization, and certain differences in purgative and blood-activating biopotency, among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood-activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality differences among batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.
Change Mechanisms of Schema-Centered Group Psychotherapy with Personality Disorder Patients
Tschacher, Wolfgang; Zorn, Peter; Ramseyer, Fabian
2012-01-01
Background This study addressed the temporal properties of personality disorders and their treatment by schema-centered group psychotherapy. It investigated the change mechanisms of psychotherapy using a novel method by which psychotherapy can be modeled explicitly in the temporal domain. Methodology and Findings 69 patients were assigned to a specific schema-centered behavioral group psychotherapy, 26 to social skills training as a control condition. The largest diagnostic subgroups were narcissistic and borderline personality disorder. Both treatments offered 30 group sessions of 100 min duration each, at a frequency of two sessions per week. The therapy process was described by components resulting from principal component analysis of patients' session reports, obtained after each session. These patient-assessed components were Clarification, Bond, Rejection, and Emotional Activation. The statistical approach focused on time-lagged associations between components using time-series panel analysis, which provided a detailed quantitative representation of the therapy process. Clarification was found to play a core role in schema-centered psychotherapy, reducing rejection and regulating patients' emotions; this was also a change mechanism linked to therapy outcome. Conclusions/Significance The introduced process-oriented methodology made it possible to highlight the mechanisms by which psychotherapeutic treatment became effective. Additionally, the process models depicted the patterns that differentiated specific diagnostic subgroups. Time-series analysis explores Granger causality, a non-experimental approximation of causality based on temporal sequences. This methodology, resting upon naturalistic data, can explicate mechanisms of action in psychotherapy research and illustrate the temporal patterns underlying personality disorders. PMID:22745811
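The time-lagged associations at the heart of time-series panel analysis can be illustrated with a toy example. The two series below are invented, not the study's data: one stands in for a "Clarification" component constructed to reduce a "Rejection" component one session later:

```python
import numpy as np

# Toy session-report series: x leads y negatively by one time step.
rng = np.random.default_rng(42)
n = 200
x = rng.normal(size=n)
y = np.empty(n)
y[0] = rng.normal()
for t in range(1, n):
    y[t] = -0.6 * x[t - 1] + 0.3 * rng.normal()

def lagged_corr(a, b, lag):
    """Correlation between a[t-lag] and b[t], for lag >= 1."""
    return np.corrcoef(a[:-lag], b[lag:])[0, 1]

# A strong negative lag-1 association of x on y, absent in x's own
# lag-1 autocorrelation, is the elementary pattern that time-series
# panel analysis (and Granger-style reasoning) builds on.
print(round(lagged_corr(x, y, 1), 2))
```

In a full analysis the lagged effect would be estimated jointly with each series' own autoregression, but the cross-lagged correlation conveys the core idea.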
Quantitative Risk - Phases 1 & 2
2013-11-12
ERIC Educational Resources Information Center
Teater, Barbra; Roy, Jessica; Carpenter, John; Forrester, Donald; Devaney, John; Scourfield, Jonathan
2017-01-01
Students in the United Kingdom (UK) are found to lack knowledge and skills in quantitative research methods. To address this gap, a quantitative research methods and statistical analysis curriculum comprising 10 individual lessons was developed, piloted, and evaluated at two universities. The evaluation found that BSW students' (N = 81)…
Development of quantitative risk acceptance criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griesmeyer, J. M.; Okrent, D.
Some of the major considerations for effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed, ranging from a simple acceptance criterion on individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in establishing a framework for the quantitative management of risk.
Abrakova, E L
1982-01-01
The effector component of the immune system was studied in 39 cases of fibroadenomatosis by morphometric and histochemical methods. Signs of immunosuppression (a decreased fraction of free stromal cells and disturbances in their cooperation) were identified in cases of precancerous changes and cancer. A considerable difference in nucleic acid levels between fibroadenomatosis and cancer was established. It is suggested that quantitative morphological study of the effector component of the immune system may provide additional diagnostic criteria for identifying prognostically unfavorable forms of fibroadenomatosis and cancer.
Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer
NASA Astrophysics Data System (ADS)
Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad
2017-04-01
Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving the predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were the optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, the Synthetic Minority Over-sampling Technique (SMOTE) was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
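The configuration the study found optimal (Principal Component Analysis for feature reduction feeding a Random Forest) can be sketched with scikit-learn on synthetic stand-in data; the feature matrix below is generated, not radiomic, and SMOTE, which lives in the separate imbalanced-learn package, is omitted from this sketch:

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Stand-in for a radiomic feature matrix: 112 "patients", 150 features,
# many of them redundant, mimicking the redundancy problem described above.
X, y = make_classification(n_samples=112, n_features=150, n_informative=10,
                           n_redundant=50, random_state=0)

# PCA collapses redundant features before the Random Forest classifier;
# the pipeline keeps PCA fitting inside each cross-validation fold.
model = make_pipeline(PCA(n_components=10),
                      RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(model, X, y, cv=5)
print(round(scores.mean(), 2))
```

For an imbalanced endpoint, SMOTE would be inserted between the PCA and classifier steps using imbalanced-learn's pipeline so that oversampling also stays inside each fold.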
Verhoef, Marja J; Lewith, George; Ritenbaugh, Cheryl; Boon, Heather; Fleishman, Susan; Leis, Anne
2005-09-01
Complementary and alternative medicine (CAM) often consists of whole systems of care (such as naturopathic medicine or traditional Chinese medicine (TCM)) that combine a wide range of modalities to provide individualised treatment. The complexity of these interventions and their potential synergistic effect requires innovative evaluative approaches. Model validity, which encompasses the need for research to adequately address the unique healing theory and therapeutic context of the intervention, is central to whole systems research (WSR). Classical randomised controlled trials (RCTs) are limited in their ability to address this need. Therefore, we propose a mixed methods approach that includes a range of relevant and holistic outcome measures. As the individual components of most whole systems are inseparable, complementary and synergistic, WSR must not focus only on the "active" ingredients of a system. An emerging WSR framework must be non-hierarchical, cyclical, flexible and adaptive, as knowledge creation is continuous, evolutionary and necessitates a continuous interplay between research methods and "phases" of knowledge. Finally, WSR must hold qualitative and quantitative research methods in equal esteem to realize their unique research contribution. Whole systems are complex and therefore no one method can adequately capture the meaning, process and outcomes of these interventions.
CO2 Urban Synthesis and Analysis ("CO2-USA") Network
NASA Astrophysics Data System (ADS)
Lin, J. C.; Hutyra, L.; Loughner, C.; Stein, A. F.; Lusk, K.; Mitchell, L.; Gately, C.; Wofsy, S. C.
2017-12-01
Emissions of carbon associated with cities comprise a large component of the anthropogenic source. A number of cities have announced plans to reduce greenhouse gas emissions, but the scientific knowledge to quantitatively track emissions and assess the efficacy of mitigation is lacking. As the global population increasingly resides in urban regions, scientific knowledge about how much, where, and why a particular city emits carbon becomes increasingly important. To address this gap, researchers have initiated studies of carbon emissions and cycling in several U.S. cities, making it timely to develop a collaborative network to exchange information on community standards and common measurements, facilitate data sharing, and create analysis frameworks and cross-city syntheses. Such a network can catalyze a new generation of researchers and enable collaborations tackling important objectives that are difficult to address in isolation. We describe initial results from an incipient network that focuses initially on U.S. cities with low barriers to entry and entrains a cross-section of U.S. urban centers with varying characteristics: size, population density, vegetation, urban form, infrastructure, development rates, climate, and meteorological patterns. Results will be reported that emerge from an initial workshop covering data harmonization and integration, inventory comparison, stakeholder outreach, network design, inverse modeling, and collaboration.
Predictors of employment for young adults with developmental motor disabilities.
Magill-Evans, Joyce; Galambos, Nancy; Darrah, Johanna; Nickerson, Christy
2008-01-01
To identify the personal, family, and community factors that facilitate or hinder employment for young adults with developmental motor disabilities. Quantitative methods with an embedded qualitative component were used. Seventy-six persons between 20 and 30 years of age (mean = 25, SD = 3.1) with a diagnosis of either cerebral palsy or spina bifida completed questionnaires addressing factors such as depression, and participated in a semi-structured interview that allowed participants to describe their experiences with education, employment, transportation, and other services. Almost half of the participants (n = 35) were not currently employed. Hierarchical regression analyses indicated that gender (females were less likely to be employed), IQ (lower IQ was associated with unemployment), and transportation dependence accounted for 42% of the variance in employment. Themes emerging from content analysis of the interviews supported the findings related to transportation barriers. Social reactions to disability limited employment opportunities, and participants often felt stuck in terms of employment options, with limited opportunities for advancement. Transportation is a significant barrier to employment and innovative solutions are needed. Issues related to gender need to be considered when addressing employment inequities for persons with primarily motor disabilities.
Reisner, Sari L; Mimiaga, Matthew J; Mayer, Kenneth H; Tinsley, Jake P; Safren, Steven A
2008-01-01
Sex work is a significant risk for HIV and sexually transmitted infection (STI) among men who have sex with men (MSM); however, there is a dearth of knowledge about how to reduce risk in this group. MSM sex workers (N = 32) completed a semistructured qualitative interview and a close-ended quantitative assessment. Analyses focused on themes relevant to intervention development. Participants reported an average of 46 male sex partners in the prior 12 months; 31% of participants were HIV-infected. Male sex workers frequently used substances during sex and had elevated levels of psychological distress. Qualitative findings suggest that trauma-informed mental health and substance abuse treatment, ready access to HIV/STI testing and treatment and condoms/informational materials, support groups to address isolation/loneliness, skill-building for risk reduction with sex partners, and paid incentives as add-ons to effective behavior change interventions may be valuable intervention components. Targeting consumers of paid/exchanged sex may assist with changing community norms regarding the practice of transactional sex. Multipronged interventions to decrease sexual risk taking among male sex workers would also benefit from addressing the unique socioeconomic and legal needs of this population.
Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim
2009-01-01
Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...
Zhou, Yan; Cao, Hui
2013-01-01
We propose an augmented classical least squares (ACLS) calibration method for quantitative Raman spectral analysis that is robust against component information loss. Raman spectral signals with low analyte concentration correlations were selected and used as substitutes for unknown quantitative component information during the CLS calibration procedure. The number of selected signals was determined using the leave-one-out root-mean-square error of cross-validation (RMSECV) curve. An ACLS model was built from the augmented concentration matrix and the reference spectral signal matrix. The proposed method was compared with partial least squares (PLS) and principal component regression (PCR) on one example: a data set recorded in an experiment determining analyte concentrations by Raman spectroscopy. A 2-fold cross-validation with a Venetian blinds strategy was used to evaluate the predictive power of the proposed method, and one-way analysis of variance (ANOVA) was used to assess the difference in predictive power between the proposed method and the existing methods. Results indicated that the proposed method is effective at increasing the robustness of the traditional CLS model against component information loss, and that its predictive power is comparable to that of PLS or PCR.
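For readers unfamiliar with the underlying CLS model, it treats each calibration spectrum as a linear mixture, A = CK, where A holds the measured spectra, C the known concentrations, and K the pure-component spectra. A minimal numpy sketch of plain CLS (the toy spectra and function names are illustrative assumptions, not the authors' ACLS implementation):

```python
import numpy as np

def cls_fit(C, A):
    """Least-squares estimate of pure-component spectra K from
    known concentrations C and calibration spectra A (A = C @ K)."""
    K, *_ = np.linalg.lstsq(C, A, rcond=None)
    return K

def cls_predict(K, a):
    """Recover component concentrations of a new spectrum a."""
    c, *_ = np.linalg.lstsq(K.T, a, rcond=None)
    return c

# toy system: two components measured at five wavelengths
K_true = np.array([[1.0, 0.5, 0.2, 0.0, 0.0],
                   [0.0, 0.1, 0.4, 0.8, 1.0]])
C_cal = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5], [0.2, 0.8]])
A_cal = C_cal @ K_true                     # noise-free calibration spectra
K_hat = cls_fit(C_cal, A_cal)
c_new = cls_predict(K_hat, np.array([0.3, 0.7]) @ K_true)
print(np.round(c_new, 3))                  # recovers ~ [0.3, 0.7]
```

The paper's augmentation addresses the case where C is incomplete (component information loss), which plain CLS as sketched here cannot handle.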
'Wasteaware' benchmark indicators for integrated sustainable waste management in cities.
Wilson, David C; Rodic, Ljiljana; Cowing, Michael J; Velis, Costas A; Whiteman, Andrew D; Scheinberg, Anne; Vilches, Recaredo; Masterson, Darragh; Stretz, Joachim; Oelz, Barbara
2015-01-01
This paper addresses a major problem in international solid waste management, which is twofold: a lack of data, and a lack of consistent data to allow comparison between cities. The paper presents an indicator set for integrated sustainable waste management (ISWM) in cities both North and South, to allow benchmarking of a city's performance, comparing cities and monitoring developments over time. It builds on pioneering work for UN-Habitat's solid waste management in the World's cities. The comprehensive analytical framework of a city's solid waste management system is divided into two overlapping 'triangles' - one comprising the three physical components, i.e. collection, recycling, and disposal, and the other comprising three governance aspects, i.e. inclusivity; financial sustainability; and sound institutions and proactive policies. The indicator set includes essential quantitative indicators as well as qualitative composite indicators. This updated and revised 'Wasteaware' set of ISWM benchmark indicators is the cumulative result of testing various prototypes in more than 50 cities around the world. This experience confirms the utility of indicators in allowing comprehensive performance measurement and comparison of both 'hard' physical components and 'soft' governance aspects; and in prioritising 'next steps' in developing a city's solid waste management system, by identifying both local strengths that can be built on and weak points to be addressed. The Wasteaware ISWM indicators are applicable to a broad range of cities with very different levels of income and solid waste management practices. Their wide application as a standard methodology will help to fill the historical data gap. Copyright © 2014 Elsevier Ltd. All rights reserved.
Deschutes estuary feasibility study: hydrodynamics and sediment transport modeling
George, Douglas A.; Gelfenbaum, Guy; Lesser, Giles; Stevens, Andrew W.
2006-01-01
- Provide the completed study to the CLAMP Steering Committee so that a recommendation about the long-term aquatic environment of the basin can be made. The hydrodynamic and sediment transport modeling task developed a number of different model simulations using a process-based morphological model, Delft3D, to help address these goals. Modeling results provide a qualitative assessment of estuarine behavior both prior to dam construction and after various post-dam-removal scenarios. Quantitative data from the model are used in the companion biological assessment and engineering design components of the overall study. Overall, the modeling study found that after dam removal, tidal and estuarine processes are immediately restored, with marine water from Budd Inlet carried into North and Middle Basin on each rising tide and mud flats being exposed with each falling tide. Within the first year after dam removal, tidal processes, along with occasional river floods, act to modify the estuary bed by redistributing sediment through erosion and deposition. The morphological response of the bed is rapid during the first couple of years, then slows as a dynamic equilibrium is reached within three to five years. By ten years after dam removal, the overall hydrodynamic and morphologic behavior of the estuary is similar to that of the pre-dam estuary, with the exception of South Basin, which has been permanently modified by human activities. In addition to a qualitative assessment of estuarine behavior, process-based modeling provides the ability to address specific questions to help inform decision-making. Because predicting future conditions of a complex estuarine environment is fraught with uncertainties, quantitative results in this report are often expressed as ranges of possible outcomes.
Julka, Samir; Cortes, Hernan; Harfmann, Robert; Bell, Bruce; Schweizer-Theobaldt, Andreas; Pursch, Matthias; Mondello, Luigi; Maynard, Shawn; West, David
2009-06-01
A comprehensive multidimensional liquid chromatography system coupled to electrospray ionization mass spectrometry (LCxLC-ESI-MS) was developed for detailed characterization and quantitation of solid epoxy resin components. The two orthogonal modes of separation selected were size exclusion chromatography (SEC) in the first dimension and liquid chromatography at critical conditions (LCCC) in the second dimension. Different components present in the solid epoxy resins were separated and quantitated for the first time based on their functional groups and molecular weight heterogeneity. Coupling LCxLC separations with mass spectrometry enabled the identification of components resolved in the two-dimensional space. Several different functional group families of compounds were separated and identified, including epoxy-epoxy and epoxy-alpha-glycol functional oligomers, and their individual molecular weight ranges were determined. The repeatability obtained ranged from 0.5% for the main product to 21% for oligomers at the 0.4% concentration level.
Multidisciplinary, interdisciplinary, or dysfunctional? Team working in mixed-methods research.
O'Cathain, Alicia; Murphy, Elizabeth; Nicholl, Jon
2008-11-01
Combining qualitative and quantitative methods in a single study-otherwise known as mixed-methods research-is common. In health research these projects can be delivered by research teams. A typical scenario, for example, involves medical sociologists delivering qualitative components and researchers from medicine or health economics delivering quantitative components. We undertook semistructured interviews with 20 researchers who had worked on mixed-methods studies in health services research to explore the facilitators of and barriers to exploiting the potential of this approach. Team working emerged as a key issue, with three models of team working apparent: multidisciplinary, interdisciplinary, and dysfunctional. Interdisciplinary research was associated with integration of data or findings from the qualitative and quantitative components in both the final reports and the peer-reviewed publications. Methodological respect between team members and a principal investigator who valued integration emerged as essential to achieving integrated research outcomes.
Student-Centred Learning (SCL): Roles Changed?
ERIC Educational Resources Information Center
Onurkan Aliusta, Gülen; Özer, Bekir
2017-01-01
This paper addresses the espoused and enacted practices of high school teachers with regard to student-centred learning (SCL). An explanatory mixed-method design, in which the quantitative strand is followed by a qualitative one, is employed. While the quantitative strand aims to explore teachers' perceptions regarding the extent student-centred teacher and…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykstra, Andrew B; St. Brice, Lois; Rodriguez, Jr., Miguel
2014-01-01
Clostridium thermocellum has emerged as a leading bioenergy-relevant microbe due to its ability to solubilize cellulose into carbohydrates, mediated by multi-component membrane-attached complexes termed cellulosomes. To probe microbial cellulose utilization rates, it is desirable to be able to measure the concentrations of saccharolytic enzymes and estimate the total amount of cellulosome present on a mass basis. Current cellulase determination methodologies involve labor-intensive purification procedures and only allow for indirect determination of abundance. We have developed a method using multiple reaction monitoring mass spectrometry (MRM-MS) to simultaneously quantitate both enzymatic and structural components of the cellulosome protein complex in samples ranging in complexity from purified cellulosomes to whole cell lysates, as an alternative to a previously developed enzyme-linked immunosorbent assay (ELISA) method of cellulosome quantitation. The precision of the cellulosome mass concentration in technical replicates is better than 5% relative standard deviation for all samples, indicating high precision for determination of the mass concentration of cellulosome components.
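The precision figure quoted above, relative standard deviation (RSD) across technical replicates, is simply 100 × (sample standard deviation / mean). A small stdlib sketch with made-up replicate values (not data from the study):

```python
import statistics

def percent_rsd(values):
    """Relative standard deviation as a percentage:
    100 * sample standard deviation / mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# hypothetical cellulosome mass concentrations from four technical replicates
replicates = [10.2, 10.0, 9.9, 10.1]
print(round(percent_rsd(replicates), 2))  # 1.28, i.e. well under the 5% criterion
```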
Challenge in Enhancing the Teaching and Learning of Variable Measurements in Quantitative Research
ERIC Educational Resources Information Center
Kee, Chang Peng; Osman, Kamisah; Ahmad, Fauziah
2013-01-01
Statistical analysis is one component that cannot be avoided in a quantitative research. Initial observations noted that students in higher education institution faced difficulty analysing quantitative data which were attributed to the confusions of various variable measurements. This paper aims to compare the outcomes of two approaches applied in…
Zhang, Huakun; Zhu, Bo; Qi, Bao; Gou, Xiaowan; Dong, Yuzhu; Xu, Chunming; Zhang, Bangjiao; Huang, Wei; Liu, Chang; Wang, Xutong; Yang, Chunwu; Zhou, Hao; Kashkush, Khalil; Feldman, Moshe; Wendel, Jonathan F.; Liu, Bao
2014-01-01
Subgenome integrity in bread wheat (Triticum aestivum; BBAADD) makes possible the extraction of its BBAA component to restitute a novel plant type. The availability of such a ploidy-reversed wheat (extracted tetraploid wheat [ETW]) provides a unique opportunity to address whether and to what extent the BBAA component of bread wheat has been modified in phenotype, karyotype, and gene expression during its evolutionary history at the allohexaploid level. We report here that ETW was anomalous in multiple phenotypic traits but maintained a stable karyotype. Microarray-based transcriptome profiling identified a large number of differentially expressed genes between ETW and natural tetraploid wheat (Triticum turgidum), and the ETW-downregulated genes were enriched for distinct Gene Ontology categories. Quantitative RT-PCR analysis showed that gene expression differences between ETW and a set of diverse durum wheat (T. turgidum subsp durum) cultivars were distinct from those characterizing tetraploid cultivars per se. Pyrosequencing revealed that the expression alterations may occur to either only one or both of the B and A homoeolog transcripts in ETW. A majority of the genes showed additive expression in a resynthesized allohexaploid wheat. Analysis of a synthetic allohexaploid wheat and diverse bread wheat cultivars revealed the rapid occurrence of expression changes to the BBAA subgenomes subsequent to allohexaploidization and their evolutionary persistence. PMID:24989045
Mubuuke, AG; Oria, H; Dhabangi, A; Kiguli, S; Sewankambo, NK
2015-01-01
Introduction: To produce health professionals who are oriented towards addressing community priority health needs, the training in medical schools has been transformed to include a component of community-based training. During this period, students spend a part of their training in the communities they are likely to serve upon graduation. They engage and empower local people in the communities to address their health needs during their placements, and at the same time learn from the people. During the community-based component, students are constantly supervised by faculty from the university to ensure that the intended objectives are achieved. The purpose of the present study was to explore student experiences of support supervision from university faculty during their community-based education, research and service (COBERS) placements and to identify ways in which student learning can be improved through improved faculty supervision. Methods: This was a cross-sectional study involving students at the College of Health Sciences, Makerere University, Uganda, who had a community-based component during their training. Data were collected using both questionnaires and focus group discussions. Quantitative data were analyzed using statistical software and thematic approaches were used for the analysis of qualitative data. Results: Most students reported satisfaction with the COBERS supervision; however, junior students were less satisfied with the supervision than the more senior students with more experience of community-based training. Although many supervisors assisted students before departure to COBERS sites, a significant number of supervisors made little follow-up while students were in the community. Incorporating the use of information technology avenues such as emails and Skype sessions was suggested as a potential way of enhancing supervision amidst resource constraints, without faculty physically visiting the sites.
Conclusions: Although many students were satisfied with COBERS supervision, there are still some challenges, mostly seen with the more junior students. Using information technology could be a solution to some of these challenges. PMID:26626014
Slaug, Björn; Schilling, Oliver; Iwarsson, Susanne; Carlsson, Gunilla
2015-09-02
Making the built environment accessible for all regardless of functional capacity is an important goal for public health efforts. Considerable impediments to achieving this goal suggest the need for valid measurements of acccessibility and for greater attention to the complexity of person-environment fit issues. To address these needs, this study aimed to provide a methodological platform, useful for further research and instrument development within accessibility research. This was accomplished by the construction of a typology of problematic person-environment fit constellations, utilizing an existing methodology developed to assess and analyze accessibility problems in the built environment. By means of qualitative review and statistical methods we classified the person-environment fit components covered by an existing application which targets housing accessibility: the Housing Enabler (HE) instrument. The International Classification of Functioning, Disability and Health (ICF) was used as a conceptual framework. Qualitative classification principles were based on conceptual similarities and for quantitative analysis of similarities, Principal Component Analysis was carried out. We present a typology of problematic person-environment fit constellations classified along three dimensions: 1) accessibility problem range and severity 2) aspects of functioning 3) environmental context. As a result of the classification of the HE components, 48 typical person-environment fit constellations were recognised. The main contribution of this study is the proposed typology of person-environment fit constellations. The typology provides a methodological platform for the identification and quantification of problematic person-environment fit constellations. Its link to the globally accepted ICF classification system facilitates communication within the scientific and health care practice communities. 
The typology also highlights how relations between aspects of functioning and physical environmental barriers generate typical accessibility problems, and thereby furnishes a reference point for research oriented to how the built environment may be designed to support activity, participation and health.
Klimek-Turek, A; Sikora, M; Rybicki, M; Dzido, T H
2016-03-04
A new concept of using thin-layer chromatography for sample preparation prior to the quantitative determination of solute(s) by instrumental techniques is presented. Thin-layer chromatography (TLC) is used to completely separate acetaminophen and its internal standard from other components (matrix) and to form a single spot/zone containing them at the solvent front position (after the final stage of thin-layer chromatogram development). The location of the analytes and internal standard in the solvent front zone allows their easy extraction followed by quantitation by HPLC. The extraction procedure for the solute(s) and internal standard can proceed from the whole solute frontal zone or a part of it without loss of accuracy in the quantitative analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
Automated diagnosis of Alzheimer's disease with multi-atlas based whole brain segmentations
NASA Astrophysics Data System (ADS)
Luo, Yuan; Tang, Xiaoying
2017-03-01
Voxel-based analysis is widely used in quantitative analysis of structural brain magnetic resonance imaging (MRI) and automated disease detection, such as for Alzheimer's disease (AD). However, noise at the voxel level may cause low sensitivity to AD-induced structural abnormalities. This can be addressed with a whole-brain structural segmentation approach, which greatly reduces the dimension of the features (the number of voxels). In this paper, we propose an automatic AD diagnosis system that combines such whole-brain segmentations with advanced machine learning methods. We used a multi-atlas segmentation technique to parcellate T1-weighted images into 54 distinct brain regions and extract their structural volumes to serve as the features for principal-component-analysis-based dimension reduction and support-vector-machine-based classification. The relationship between the number of retained principal components (PCs) and the diagnosis accuracy was systematically evaluated, in a leave-one-out fashion, based on 28 AD subjects and 23 age-matched healthy subjects. Our approach yielded good classification results, with 96.08% overall accuracy achieved using the three foremost PCs. In addition, our approach yielded 96.43% specificity, 100% sensitivity, and 0.9891 area under the receiver operating characteristic curve.
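The evaluation scheme described (regional volume features, PCA dimension reduction, classification scored leave-one-out) can be sketched in numpy as follows. A nearest-centroid rule stands in for the support vector machine, and the synthetic two-class data are illustrative assumptions, not study data:

```python
import numpy as np

def loo_accuracy(X, y, n_pcs):
    """Leave-one-out accuracy of a nearest-centroid classifier
    applied to the top n_pcs principal components."""
    correct = 0
    for i in range(len(X)):
        train = np.delete(np.arange(len(X)), i)
        Xtr, ytr = X[train], y[train]
        mu = Xtr.mean(axis=0)
        # principal axes estimated from the training fold only
        _, _, Vt = np.linalg.svd(Xtr - mu, full_matrices=False)
        P = Vt[:n_pcs].T
        Ztr, z = (Xtr - mu) @ P, (X[i] - mu) @ P
        # classify the held-out sample by nearest class centroid in PC space
        cents = {c: Ztr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
        pred = min(cents, key=lambda c: np.linalg.norm(z - cents[c]))
        correct += pred == y[i]
    return correct / len(X)

# synthetic "volume features": two well-separated groups of 10 subjects
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (10, 6)), rng.normal(3, 1, (10, 6))])
y = np.array([0] * 10 + [1] * 10)
print(loo_accuracy(X, y, n_pcs=3))
```

Fitting the PCA inside each leave-one-out fold, rather than once on all subjects, avoids leaking the held-out sample into the dimension reduction.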
NASA Astrophysics Data System (ADS)
Zhu, Lian; Weber, Stephanie; Berry, Joel; Vaidya, Nilesh; Haataja, Mikko; Brangwynne, Clifford
2015-03-01
The nucleolus is a liquid-like membrane-less nuclear body which plays an important role in cell growth and size control. By modulating nucleolar component concentration through RNAi conditions that change C. elegans cell size, we find that nucleoli only assemble above a threshold concentration; moreover, the ripening dynamics of nucleated droplets are consistent with the hypothesis that the assembly of the nucleolus represents an intracellular liquid-liquid phase transition. A key question is how this phase-transition is linked to the primary function of the nucleolus, in transcribing and processing ribosomal RNA. To address this, we characterize the localization of RNA Polymerase I, a key transcriptional enzyme, into nucleolar foci as a function of nucleolar component concentration. Our results suggest that there are a small number of key disordered phosphoproteins that may serve as a link between transcription and assembly. Finally, we present preliminary results using a reduced model system consisting of purified nucleolar proteins to assess the ability of nucleolar proteins to drive liquid-liquid phase separation in vitro. These results lay the foundation for a quantitative understanding of intracellular phase transitions and their impact on biomedically-critical RNA-processing steps.
Bradbury-Jones, Caroline; Breckenridge, Jenna P; Devaney, John; Duncan, Fiona; Kroll, Thilo; Lazenbatt, Anne; Taylor, Julie
2015-12-28
Domestic abuse is a significant public health issue. It occurs more frequently among disabled women than those without a disability and evidence suggests that a great deal of domestic abuse begins or worsens during pregnancy. All women and their infants are entitled to equal access to high quality maternity care. However, research has shown that disabled women who experience domestic abuse face numerous barriers to accessing care. The aim of the study was to identify the priority areas for improving access to maternity services for this group of women; develop strategies for improved access and utilisation; and explore the feasibility of implementing the identified strategies. This multi-method study was the third and final part of a larger study conducted in the UK between 2012 and 2014. The study used a modified concept mapping approach and was theoretically underpinned by Andersen's model of healthcare use. Seven focus group interviews were conducted with a range of maternity care professionals (n = 45), incorporating quantitative and qualitative components. Participants ranked perceived barriers to women's access and utilisation of maternity services in order of priority using a 5-point Likert scale. Quantitative data exploration used descriptive and non-parametric analyses. In the qualitative component of each focus group, participants discussed the barriers and identified potential improvement strategies (and feasibility of implementing these). Qualitative data were analysed inductively using a framework analysis approach. The three most highly ranked barriers to women's access and utilisation of maternity services identified in the quantitative component were: 1) staff being unaware and not asking about domestic abuse and disability; 2) the impact of domestic abuse on women; 3) women's fear of disclosure. The top two priority strategies were: providing information about domestic abuse to all women and promoting non-judgemental staff attitude. 
These were also considered very feasible. The qualitative analysis identified a range of psychosocial and environmental barriers experienced by this group of women in accessing maternity care. Congruent with the quantitative results, the main themes were lack of awareness and fear of disclosure. Key strategies were identified as demystifying disclosure and creating physical spaces to facilitate disclosure. The study supports findings of previous research regarding the barriers that women face in accessing and utilising maternity services, particularly regarding the issue of disclosure. But the study provides new evidence on the perceived importance and feasibility of strategies to address such barriers. This is an important step in ensuring practice-based acceptability and ease with which improvement strategies might be implemented in maternity care settings.
A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis
ERIC Educational Resources Information Center
Katharaki, Maria; Katharakis, George
2010-01-01
In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…
Dual-Enrollment High-School Graduates' College-Enrollment Considerations
ERIC Educational Resources Information Center
Damrow, Roberta J.
2017-01-01
This quantitative study examined college enrollment considerations of dual-enrollment students enrolling at one Wisconsin credit-granting technical college. A combined college-choice theoretical framework guided this quantitative study that addressed two research questions: To what extent, if any, did the number of dual credits predict likelihood…
Quantitative Experiments to Explain the Change of Seasons
ERIC Educational Resources Information Center
Testa, Italo; Busarello, Gianni; Puddu, Emanuella; Leccia, Silvio; Merluzzi, Paola; Colantonio, Arturo; Moretti, Maria Ida; Galano, Silvia; Zappia, Alessandro
2015-01-01
The science education literature shows that students have difficulty understanding what causes the seasons. Incorrect explanations are often due to a lack of knowledge about the physical mechanisms underlying this phenomenon. To address this, we present a module in which the students engage in quantitative measurements with a photovoltaic panel to…
USDA-ARS's Scientific Manuscript database
The genomics revolution provides vital tools to address global food security. Yet to be incorporated into livestock breeding, molecular techniques need to be integrated into a quantitative genetics framework. Within the U.S., with shrinking numbers of faculty holding the requisite skills, the capacity to ...
ERIC Educational Resources Information Center
Carsey, Thomas M.; Harden, Jeffrey J.
2015-01-01
Graduate students in political science come to the discipline interested in exploring important political questions, such as "What causes war?" or "What policies promote economic growth?" However, they typically do not arrive prepared to address those questions using quantitative methods. Graduate methods instructors must…
Engaging Business Students in Quantitative Skills Development
ERIC Educational Resources Information Center
Cronin, Anthony; Carroll, Paula
2015-01-01
In this paper the complex problems of developing quantitative and analytical skills in undergraduate first year, first semester business students are addressed. An action research project, detailing how first year business students perceive the relevance of data analysis and inferential statistics in light of the economic downturn and the…
The calibration of video cameras for quantitative measurements
NASA Technical Reports Server (NTRS)
Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.
1993-01-01
Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.
Pre-Service Teachers' Assessment Literacy and Its Implementation into Practice
ERIC Educational Resources Information Center
Ogan-Bekiroglu, Feral; Suzuk, Erol
2014-01-01
This research study was conducted to address pre-service physics teachers' assessment literacy and its implementation into practice. The research was both quantitative and qualitative in nature. For the quantitative aspect of the study, the researchers determined the participants' assessment literacy after their enrolment with the…
2L-PCA: a two-level principal component analyzer for quantitative drug design and its applications.
Du, Qi-Shi; Wang, Shu-Qing; Xie, Neng-Zhong; Wang, Qing-Yan; Huang, Ri-Bo; Chou, Kuo-Chen
2017-09-19
A two-level principal component predictor (2L-PCA) was proposed based on the principal component analysis (PCA) approach. It can be used to quantitatively analyze the functions of various compounds and peptides and their potential to become useful drugs. One level deals with the physicochemical properties of drug molecules, while the other deals with their structural fragments. The predictor has self-learning and feedback features that automatically improve its accuracy. It is anticipated that 2L-PCA will become a very useful tool for providing timely and useful clues during the process of drug development.
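The two-level idea can be sketched as follows. This is a minimal illustration, not the paper's actual predictor: it assumes two synthetic feature blocks (physicochemical properties and structural-fragment descriptors), reduces each with PCA separately, and concatenates the scores; the self-learning and feedback machinery of 2L-PCA is not reproduced here.

```python
import numpy as np

def pca_scores(X, n_comp=2):
    """Project rows of X onto their top n_comp principal components (via SVD)."""
    Xc = X - X.mean(axis=0)                       # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_comp].T                     # scores in PC space

# Synthetic stand-ins for the two levels (shapes are illustrative only).
rng = np.random.default_rng(1)
n_mols = 50
props = rng.normal(size=(n_mols, 8))    # level 1: physicochemical properties
frags = rng.normal(size=(n_mols, 30))   # level 2: structural-fragment features

# Reduce each level separately, then concatenate the scores into the
# joint descriptor a downstream predictor would consume.
features = np.hstack([pca_scores(props), pca_scores(frags)])
```

Reducing each block on its own keeps the (few) physicochemical properties from being swamped by the (many) fragment features, which is the motivation for a two-level rather than a single pooled PCA.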
Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M
2007-01-01
We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus. PMID:18466597
Roeder, Ingo; Loeffler, Markus; Glauche, Ingmar
2011-04-15
Here we report on an interdisciplinary workshop focusing on the effects of the local growth environment on the regulation of stem cell development. Under the title "Towards a quantitative understanding of stem cell/niche interaction: Experiments, models, and technologies", 33 experts from eight countries discussed current knowledge, new experimental and theoretical results, and innovative measurement technologies. Specifically, the workshop addressed the following questions: What defines a stem cell niche? What are the functional/regulatory characteristics of stem cell-microenvironment interactions? What experimental systems and technologies for quantifying niche function are available? As a consensus result, it was recorded that there is no unique niche architecture across tissues, but that there are generic principles of niche organization guaranteeing proper stem cell function. This functional aspect, as the major defining criterion, leads to the conclusion that stem cells and their niches need to be considered as an inseparable pair, with implications for their experimental assessment: to study either of the two components, the other has to be accounted for. In this context, a number of classical in vitro assays using co-cultures of stem and stroma cells, as well as new, specifically bioengineered culture systems, were discussed with respect to their advantages and disadvantages. Finally, there was general agreement that a comprehensive understanding of niche-mediated stem cell regulation will, due to the complexity of the mechanisms involved, require an interdisciplinary, systems biology approach. In addition to cell and molecular biology, biochemistry, biophysics and bioengineering, bioinformatics and mathematical modeling will also play a major role in the future of this field. Copyright © 2011 Elsevier Inc. All rights reserved.
Mutual interferences and design principles for mechatronic devices in magnetic resonance imaging.
Yu, Ningbo; Gassert, Roger; Riener, Robert
2011-07-01
Robotic and mechatronic devices that work compatibly with magnetic resonance imaging (MRI) are applied in diagnostic MRI, image-guided surgery, neurorehabilitation and neuroscience. MRI-compatible mechatronic systems must address the challenges imposed by the scanner's electromagnetic fields. We have developed objective quantitative evaluation criteria for device characteristics needed to formulate design guidelines that ensure MRI-compatibility based on safety, device functionality and image quality. The mutual interferences between an MRI system and mechatronic devices working in its vicinity are modeled and tested. For each interference, the involved components are listed, and a numerical measure for "MRI-compatibility" is proposed. These interferences are categorized into an MRI-compatibility matrix, with each element representing possible interactions between one part of the mechatronic system and one component of the electromagnetic fields. Based on this formulation, design principles for MRI-compatible mechatronic systems are proposed. Furthermore, test methods are developed to examine whether a mechatronic device indeed works without interferences within an MRI system. Finally, the proposed MRI-compatibility criteria and design guidelines have been applied to an actual design process that has been validated by the test procedures. Objective and quantitative MRI-compatibility measures for mechatronic and robotic devices have been established. Applying the proposed design principles, potential problems in safety, device functionality and image quality can be considered in the design phase to ensure that the mechatronic system will fulfill the MRI-compatibility criteria. New guidelines and test procedures for MRI instrument compatibility provide a rational basis for design and evaluation of mechatronic devices in various MRI applications. 
Designers can apply these criteria and use the tests so that MRI-compatibility results accumulate into an experiential database.
Lucas, Stephen M.; Druss, Benjamin G.
2013-01-01
Stigma against mental illness is a complex construct with affective, cognitive, and behavioral components. Beyond its symbolic value, federal law can only directly address one component of stigma: discrimination. This article reviews three landmark antidiscrimination laws that expanded protections over time for individuals with mental illness. Despite these legislative advances, protections are still not uniform for all subpopulations with mental illness. Furthermore, multiple components of stigma (e.g., prejudice) are beyond the reach of legislation, as demonstrated by the phenomenon of label avoidance; individuals may not seek protection from discrimination because of fear of the stigma that may ensue after disclosing their mental illness. To yield the greatest improvements, antidiscrimination laws must be coupled with antistigma programs that directly address other components of stigma. PMID:23488484
Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy
NASA Astrophysics Data System (ADS)
Jiang, Dejun; Zhao, Shusen; Shen, Jingling
2008-03-01
A method was proposed to quantitatively inspect mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals in the experiment: benzophenone, anthraquinone, pyridoxine hydrochloride and L-ascorbic acid. An illicit drug and a common adulterant, methamphetamine and flour, were then selected for the experiment. The experimental results were in good agreement with the actual content, suggesting that this could be an effective method for quantitative identification of illicit drugs.
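The regression step described in this abstract can be sketched with synthetic data. Here the pure-component absorption spectra (columns of `A`) and the mixing weights are illustrative, not the paper's measurements; the point is only that, when every component's spectrum is known, ordinary least squares recovers the mass fractions from the mixture spectrum.

```python
import numpy as np

# Synthetic illustration: each pure component's absorption spectrum is a
# column of A; the mixture spectrum b is a weighted sum of those columns.
rng = np.random.default_rng(0)
n_freqs, n_components = 200, 3
A = np.abs(rng.normal(size=(n_freqs, n_components)))  # pure-component spectra

true_fractions = np.array([0.5, 0.3, 0.2])
b = A @ true_fractions            # noise-free mixture spectrum

# Least squares recovers the component weights; normalizing them
# gives mass percentages (assumes all components are known).
w, *_ = np.linalg.lstsq(A, b, rcond=None)
mass_pct = 100 * w / w.sum()      # -> approximately [50, 30, 20]
```

With real, noisy spectra one would typically constrain the weights to be non-negative (e.g. non-negative least squares) rather than use plain `lstsq`.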
The operations manual: a mechanism for improving the research process.
Bowman, Ann; Wyman, Jean F; Peters, Jennifer
2002-01-01
The development and use of an operations manual has the potential to improve the capacity of nurse scientists to address the complex, multifaceted issues associated with conducting research in today's healthcare environment. An operations manual facilitates communication, standardizes training and evaluation, and enhances the development and standard implementation of clear policies, processes, and protocols. A 10-year review of methodology articles in relevant nursing journals revealed no attention to this topic. This article will discuss how an operations manual can improve the conduct of research methods and outcomes for both small-scale and large-scale research studies. It also describes the purpose and components of a prototype operations manual for use in quantitative research. The operations manual increases reliability and reproducibility of the research while improving the management of study processes. It can prevent costly and untimely delays or errors in the conduct of research.
Multifunctional wearable devices for diagnosis and therapy of movement disorders.
Son, Donghee; Lee, Jongha; Qiao, Shutao; Ghaffari, Roozbeh; Kim, Jaemin; Lee, Ji Eun; Song, Changyeong; Kim, Seok Joo; Lee, Dong Jun; Jun, Samuel Woojoo; Yang, Shixuan; Park, Minjoon; Shin, Jiho; Do, Kyungsik; Lee, Mincheol; Kang, Kwanghun; Hwang, Cheol Seong; Lu, Nanshu; Hyeon, Taeghwan; Kim, Dae-Hyeong
2014-05-01
Wearable systems that monitor muscle activity, store data and deliver feedback therapy are the next frontier in personalized medicine and healthcare. However, technical challenges, such as the fabrication of high-performance, energy-efficient sensors and memory modules that are in intimate mechanical contact with soft tissues, in conjunction with controlled delivery of therapeutic agents, limit the wide-scale adoption of such systems. Here, we describe materials, mechanics and designs for multifunctional, wearable-on-the-skin systems that address these challenges via monolithic integration of nanomembranes fabricated with a top-down approach, nanoparticles assembled by bottom-up methods, and stretchable electronics on a tissue-like polymeric substrate. Representative examples of such systems include physiological sensors, non-volatile memory and drug-release actuators. Quantitative analyses of the electronics, mechanics, heat-transfer and drug-diffusion characteristics validate the operation of individual components, thereby enabling system-level multifunctionalities.
Fluorescence decay data analysis correcting for detector pulse pile-up at very high count rates
NASA Astrophysics Data System (ADS)
Patting, Matthias; Reisch, Paja; Sackrow, Marcus; Dowler, Rhys; Koenig, Marcelle; Wahl, Michael
2018-03-01
Using time-correlated single photon counting for the purpose of fluorescence lifetime measurements is usually limited in speed due to pile-up. With modern instrumentation, this limitation can be lifted significantly, but some artifacts due to frequent merging of closely spaced detector pulses (detector pulse pile-up) remain an issue to be addressed. We propose a data analysis method correcting for this type of artifact and the resulting systematic errors. It physically models the photon losses due to detector pulse pile-up and incorporates the loss in the decay fit model employed to obtain fluorescence lifetimes and relative amplitudes of the decay components. Comparison of results with and without this correction shows a significant reduction of systematic errors at count rates approaching the excitation rate. This allows quantitatively accurate fluorescence lifetime imaging at very high frame rates.
Critical diversity: Divided or united states of social coordination
Kelso, J. A. Scott; Tognoli, Emmanuelle
2018-01-01
Much of our knowledge of coordination comes from studies of simple, dyadic systems or systems containing large numbers of components. The huge gap ‘in between’ is seldom addressed, empirically or theoretically. We introduce a new paradigm to study the coordination dynamics of such intermediate-sized ensembles with the goal of identifying key mechanisms of interaction. Rhythmic coordination was studied in ensembles of eight people, with differences in movement frequency (‘diversity’) manipulated within the ensemble. Quantitative change in diversity led to qualitative changes in coordination, a critical value separating régimes of integration and segregation between groups. Metastable and multifrequency coordination between participants enabled communication across segregated groups within the ensemble, without destroying overall order. These novel findings reveal key factors underlying coordination in ensemble sizes previously considered too complicated or 'messy' for systematic study and supply future theoretical/computational models with new empirical checkpoints. PMID:29617371
Organizational cultural competence consultation to a mental health institution.
Fung, Kenneth; Lo, Hung-Tat Ted; Srivastava, Rani; Andermann, Lisa
2012-04-01
Cultural competence is increasingly recognized as an essential component of effective mental health care delivery to address diversity and equity issues. Drawing from the literature and our experience in providing cultural competence consultation and training, the paper will discuss our perspective on the foundational concepts of cultural competence and how it applies to a health care organization, including its programs and services. Based on a recent consultation project, we present a methodology for assessing cultural competence in health care organizations, involving mixed quantitative and qualitative methods. Key findings and recommendations from the resulting cultural competence plan are discussed, including core principles, change strategies, and an Organizational Cultural Competence Framework, which may be applicable to other health care institutions seeking such changes. This framework, consisting of eight domains, can be used for organizational assessment and cultural competence planning, ultimately aiming at enhancing mental health care service to the diverse patients, families, and communities.
NASA Astrophysics Data System (ADS)
Huang, Chun-Yi; Chang, Hsin-Wei; Chang, Che-Chen
2018-03-01
Knowledge about the chemical compositions of meso/nanomaterials is fundamental to development of their applications in advanced technologies. Auger electron spectroscopy (AES) is an effective analysis method for the characterization of meso/nanomaterial structures. Although a few studies have reported the use of AES for the analysis of the local composition of these structures, none have explored in detail the validity of the meso/nanoanalysis results generated by the AES instrument. This paper addresses the limitations of AES and the corrections necessary to offset them for this otherwise powerful meso/nanoanalysis tool. The results of corrections made to the AES multi-point analysis of high-density copper-based meso/nanostructures provides major insights into their local chemical compositions and technological prospects, which the primitive composition output of the AES instrument failed to provide.
Maitre, Nathalie L; Chorna, Olena; Romeo, Domenico M; Guzzetta, Andrea
2016-12-01
High-risk infant follow-up programs provide early identification and referral for treatment of neurodevelopmental delays and impairments. In these programs, a standardized neurological examination is a critical component of evaluation for clinical and research purposes. To address primary challenges of provider educational diversity and standardized documentation, we designed an approach to training and implementation of the Hammersmith Infant Neurological Examination with precourse materials, a workshop model, and adaptation of the electronic medical record. Provider completion and documentation of a neurological examination were evaluated before and after Hammersmith Infant Neurological Examination training. Standardized training and implementation of the Hammersmith Infant Neurological Examination in a large high-risk infant follow-up is feasible and effective and allows for quantitative evaluation of neurological findings and developmental trajectories. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Critical strain for Sn incorporation into spontaneously graded Ge/GeSn core/shell nanowires.
Albani, Marco; Assali, Simone; Verheijen, Marcel A; Koelling, Sebastian; Bergamaschini, Roberto; Pezzoli, Fabio; Bakkers, Erik P A M; Miglio, Leo
2018-04-19
We address the role of non-uniform composition, as measured by energy-dispersive x-ray spectroscopy, in the elastic properties of core/shell nanowires for the Ge/GeSn system. In particular, by finite element method simulations and transmission electron diffraction measurements, we estimate the residual misfit strain when a radial gradient in Sn and a Ge segregation at the nanowire facet edges are present. An elastic stiffening of the structure with respect to the uniform one is concluded, particularly for the axial strain component. More importantly, refined predictions linking the strain and the Sn percentage at the nanowire facets enable us to quantitatively determine the maximum compressive strain value allowing for additional Sn incorporation into a GeSn alloy. The progressive incorporation with increasing shell thickness, under constant growth conditions, is specifically induced by the nanowire configuration, where a larger elastic relaxation of the misfit strain takes place.
Imaging the Localized Plasmon Resonance Modes in Graphene Nanoribbons
Hu, F.; Luan, Y.; Fei, Z.; ...
2017-08-14
Here, we report a nanoinfrared (IR) imaging study of the localized plasmon resonance modes of graphene nanoribbons (GNRs) using a scattering-type scanning near-field optical microscope (s-SNOM). By comparing the imaging data of GNRs that are aligned parallel and perpendicular to the in-plane component of the excitation laser field, we observed symmetric and asymmetric plasmonic interference fringes, respectively. Theoretical analysis indicates that the asymmetric fringes are formed due to the interplay between the localized surface plasmon resonance (SPR) mode excited by the GNRs and the propagative surface plasmon polariton (SPP) mode launched by the s-SNOM tip. And with rigorous simulations, we reproduce the observed fringe patterns and address quantitatively the role of the s-SNOM tip on both the SPR and SPP modes. Moreover, we have seen real-space signatures of both the dipole and higher-order SPR modes by varying the ribbon width.
Designing Studies That Would Address the Multilayered Nature of Health Care
Pennell, Michael; Rhoda, Dale; Hade, Erinn M.; Paskett, Electra D.
2010-01-01
We review design and analytic methods available for multilevel interventions in cancer research with particular attention to study design, sample size requirements, and potential to provide statistical evidence for causal inference. The most appropriate methods will depend on the stage of development of the research and whether randomization is possible. Early on, fractional factorial designs may be used to screen intervention components, particularly when randomization of individuals is possible. Quasi-experimental designs, including time-series and multiple baseline designs, can be useful once the intervention is designed because they require few sites and can provide the preliminary evidence to plan efficacy studies. In efficacy and effectiveness studies, group-randomized trials are preferred when randomization is possible and regression discontinuity designs are preferred otherwise if assignment based on a quantitative score is possible. Quasi-experimental designs may be used, especially when combined with recent developments in analytic methods to reduce bias in effect estimates. PMID:20386057
Semi Quantitative MALDI TOF for Antimicrobial Susceptibility Testing in Staphylococcus aureus
2017-08-31
Semi-quantitative MALDI-TOF for antimicrobial susceptibility testing in Staphylococcus aureus. Tucker Maxson, Cheryl L. Taylor-Howell, ... Timothy D. Minogue. Diagnostic Systems Division, United States Army Medical Research Institute of Infectious Disease, Fort Detrick, MD, USA. Running title: Quantitative MALDI for AST in S. aureus. Address correspondence to Timothy D. Minogue, timothy.d.minogue.civ@mail.mil
Understanding Pre-Quantitative Risk in Projects
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
2011-01-01
Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
Shellie, Robert; Marriott, Philip; Morrison, Paul
2004-09-01
The use of gas chromatography (GC)-mass spectrometry (MS), GC-time-of-flight MS (TOFMS), comprehensive two-dimensional GC (GCxGC)-flame ionization detection (FID), and GCxGC-TOFMS is discussed for the characterization of the eight important representative components, including Z-alpha-santalol, epi-alpha-bisabolol, Z-alpha-trans-bergamotol, epi-beta-santalol, Z-beta-santalol, E,E-farnesol, Z-nuciferol, and Z-lanceol, in the oil of west Australian sandalwood (Santalum spicatum). Single-column GC-MS lacks the resolving power to separate all of the listed components as pure peaks and allow precise analytical measurement of individual component abundances. With enhanced peak resolution capabilities in GCxGC, these components are sufficiently well resolved to be quantitated using flame ionization detection, following initial characterization of components by using GCxGC-TOFMS.
NASA Astrophysics Data System (ADS)
Arnold, J.; Gutmann, E. D.; Clark, M. P.; Nijssen, B.; Vano, J. A.; Addor, N.; Wood, A.; Newman, A. J.; Mizukami, N.; Brekke, L. D.; Rasmussen, R.; Mendoza, P. A.
2016-12-01
Climate change narratives for water-resource applications must represent the change signals contextualized by hydroclimatic process variability and uncertainty at multiple scales. Building narratives of plausible change includes assessing uncertainties across GCM structure, internal climate variability, climate downscaling methods, and hydrologic models. Work with this linked modeling chain has dealt mostly with GCM sampling directed separately to either model fidelity (does the model correctly reproduce the physical processes in the world?) or sensitivity (of different model responses to CO2 forcings) or diversity (of model type, structure, and complexity). This leaves unaddressed any interactions among those measures and with other components in the modeling chain used to identify water-resource vulnerabilities to specific climate threats. However, time-sensitive, real-world vulnerability studies typically cannot accommodate a full uncertainty ensemble across the whole modeling chain, so a gap has opened between current scientific knowledge and most routine applications for climate-changed hydrology. To close that gap, the US Army Corps of Engineers, the Bureau of Reclamation, and the National Center for Atmospheric Research are working on techniques to subsample uncertainties objectively across modeling chain components and to integrate results into quantitative hydrologic storylines of climate-changed futures. Importantly, these quantitative storylines are not drawn from a small sample of models or components. Rather, they stem from the more comprehensive characterization of the full uncertainty space for each component. Equally important from the perspective of water-resource practitioners, these quantitative hydrologic storylines are anchored in actual design and operations decisions potentially affected by climate change. 
This talk will describe part of our work characterizing variability and uncertainty across modeling chain components and their interactions using newly developed observational data, models and model outputs, and post-processing tools for making the resulting quantitative storylines most useful in practical hydrology applications.
NASA Astrophysics Data System (ADS)
Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.
2015-08-01
A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples, classified by International Caries Detection and Assessment System levels, was performed at 405 nm excitation. The major differences among the caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by accounting for the system parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
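The grading rule stated in the abstract is a simple threshold classifier on the R/(G + B) ratio, and can be sketched directly. The thresholds come from the abstract; the `component_ratio` helper is a hypothetical stand-in for however mean channel intensities are extracted from an image region.

```python
def component_ratio(r, g, b):
    """Hypothetical helper: image component ratio R/(G + B) from mean
    red, green, and blue channel intensities of a tooth region."""
    return r / (g + b)

def classify_caries(ratio):
    """Grade caries severity from the R/(G + B) ratio, using the
    thresholds reported in the abstract (<0.66, 0.66-1.06, 1.06-1.62, >1.62)."""
    if ratio < 0.66:
        return "sound"
    elif ratio <= 1.06:
        return "early decay"
    elif ratio <= 1.62:
        return "established decay"
    else:
        return "severe decay"

# Example: a region with mean channels R=100, G=60, B=40 has ratio 1.0,
# which falls in the early-decay band.
grade = classify_caries(component_ratio(100.0, 60.0, 40.0))
```

In practice the thresholds would need recalibration for any imaging system with different filters or sensor sensitivities, as the abstract notes.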
On-Going Laboratory Efforts to Quantitatively Address Clay Abundance on Mars
NASA Technical Reports Server (NTRS)
Roush, Ted L.; Bishop, Janice L.; Brown, Adrian J.; Blake, David F.; Bristow, Thomas F.
2012-01-01
Data obtained at visible and near-infrared wavelengths by OMEGA on Mars Express and CRISM on MRO provide definitive evidence for the presence of phyllosilicates and other hydrated phases on Mars. A diverse range of both Fe/Mg-OH and Al-OH-bearing phyllosilicates were identified, including the smectites nontronite, saponite, and montmorillonite. In order to constrain the abundances of these phyllosilicates, spectral analyses of mixtures are needed. We report on our ongoing effort to enable the quantitative evaluation of the abundance of hydrated-hydroxylated silicates when they are contained in mixtures. We include two-component mixtures of hydrated/hydroxylated silicates with each other and with two analogs for other martian materials: pyroxene (enstatite) and palagonitic soil (an alteration product of basaltic glass). For the hydrated-hydroxylated silicates we include saponite and montmorillonite (Mg- and Al-rich smectites). We prepared three size separates of each end-member for study: 20-45, 63-90, and 125-150 µm. As the second phase of our effort, we used scanning electron microscopy imaging and x-ray diffraction to characterize the grain size distribution and structural nature, respectively, of the mixtures. Visible and near-infrared reflectance spectra of the 63-90 µm grain size of the mixture samples are shown in Figure 1. We discuss the results of our measurements of these mixtures.
Skuratovsky, Aleksander; Soto, Robert J; Porter, Marc D
2018-06-19
This paper presents a method for immunometric biomarker quantitation that uses standard flow-through assay reagents and obviates the need for constructing a calibration curve. The approach relies on a nitrocellulose immunoassay substrate with multiple physical addresses for analyte capture, each modified with different amounts of an analyte-specific capture antibody. As such, each address generates a distinctly different readout signal that is proportional to the analyte concentration in the sample. To establish the feasibility of this concept, equations derived from antibody-antigen binding equilibrium were first applied in modeling experiments. Next, nitrocellulose membranes with multiple capture antibody addresses were fabricated for detection of a model analyte, human immunoglobulin G (hIgG), by a heterogeneous sandwich immunoassay using antibody-modified gold nanoparticles (AuNPs) as the immunolabel. Counting the number of colored capture addresses visible to the unassisted eye enabled semiquantitative hIgG determination. We then demonstrated that, by leveraging the localized surface plasmon resonance of the AuNPs, surface-enhanced Raman spectroscopy (SERS) can be used for quantitative readout. By comparing the SERS signal intensities from each capture address with values predicted using immunoassay equilibrium theory, the concentration of hIgG can be determined (∼30% average absolute deviation) without reference to a calibration curve. This work also demonstrates the ability to manipulate the dynamic range of the assay over ∼4 orders of magnitude (from 2 ng mL⁻¹ to 10 μg mL⁻¹). The potential prospects in applying this concept to point-of-need diagnostics are also discussed.
Ferreira, Silvana Margarida Benevides; Yonekura, Tatiana; Ignotti, Eliane; Oliveira, Larissa Bertacchini de; Takahashi, Juliana; Soares, Cassia Baldini
2017-10-01
Individuals in contact with patients who have leprosy have an increased risk of disease exposure, which reinforces the need for chemoprophylactic measures such as the use of rifampicin. The objective of the review was to synthesize the best available evidence regarding the effectiveness of rifampicin chemoprophylaxis for contacts of patients with leprosy, and to synthesize the best available evidence on the experience and acceptability of rifampicin chemoprophylaxis as reported by the contacts and health professionals involved in the treatment of leprosy or Hansen's disease. In the quantitative component, individuals in contact with leprosy patients were included. In the qualitative component, in addition to contacts, health professionals involved in treating leprosy were included. The quantitative component considered rifampicin at any dose, frequency and mode of administration, as well as rifampicin combination regimens, as the intervention. The qualitative component considered the experience and acceptability of rifampicin chemoprophylaxis as the phenomena of interest. The quantitative component considered experimental and observational studies, whereas the qualitative component considered studies that focused on qualitative data, including but not limited to designs such as phenomenology, grounded theory, ethnography and action research. The quantitative component considered studies that reported on outcomes such as the development of clinical leprosy in the contacts of patients who had leprosy, incidence rates, adverse effects and safety/harmful effects of the intervention. A three-step search strategy for published and unpublished literature was used.
The search for published studies included: PubMed, Cumulative Index to Nursing and Allied Health Literature, Cochrane Library, Scopus, Web of Science, National Institute for Health and Clinical Excellence, Latin American and Caribbean Health Sciences Literature; and Google Scholar and EVIPnet for unpublished studies. Studies published from the time of the respective database inception to January 2016 in English, Spanish, Portuguese, Japanese and Chinese were considered. Two reviewers independently assessed the studies for methodological quality using standardized critical appraisal instruments from the Joanna Briggs Institute. Standardized data extraction tools developed by the Joanna Briggs Institute were used to extract quantitative and qualitative data from papers included in the review. Due to clinical and methodological heterogeneity in the interventions of the included studies, no statistical meta-analysis was possible. Quantitative and qualitative research findings are presented in narrative form. Following critical appraisal, eight studies were included in this review, seven quantitative and one qualitative. The reduction in incidence of leprosy, using one dose of rifampicin in the first two years, was 56.5%; in the follow-up period of one to four years, the reduction was 34.9%. The combination of rifampicin and the Bacillus Calmette-Guérin vaccine showed a preventative effect of 80% against the disease. The only controlled clinical trial using two doses of rifampicin was community-based and did not indicate effectiveness of the intervention. The qualitative findings showed social acceptability of rifampicin. Chemoprophylaxis with one dose of rifampicin is found to be effective in preventing contacts of leprosy patients from contracting the disease. Also, there is indication that this strategy is socially accepted.
The effects of r- and K-selection on components of variance for two quantitative traits.
Long, T; Long, G
1974-03-01
The genetic and environmental components of variance for two quantitative characters were measured in the descendants of Drosophila melanogaster populations which had been grown for several generations at densities of 100, 200, 300, and 400 eggs per vial. Populations subject to intermediate densities had a greater proportion of phenotypic variance available for selection than populations from either extreme. Selection on either character would be least effective under pure r-selection, a frequent attribute of selection programs.
Saito, L.; Johnson, B.M.; Bartholow, J.; Hanna, R.B.
2001-01-01
We investigated the effects on the reservoir food web of a new temperature control device (TCD) on the dam at Shasta Lake, California. We followed a linked modeling approach that used a specialized reservoir water quality model to forecast operation-induced changes in phytoplankton production. A food web–energy transfer model was also applied to propagate predicted changes in phytoplankton up through the food web to the predators and sport fishes of interest. The food web–energy transfer model employed a 10% trophic transfer efficiency through a food web that was mapped using carbon and nitrogen stable isotope analysis. Stable isotope analysis provided an efficient and comprehensive means of estimating the structure of the reservoir's food web with minimal sampling and background data. We used an optimization procedure to estimate the diet proportions of all food web components simultaneously from their isotopic signatures. Some consumers were estimated to be much more sensitive than others to perturbations to phytoplankton supply. The linked modeling approach demonstrated that interdisciplinary efforts enhance the value of information obtained from studies of managed ecosystems. The approach exploited the strengths of engineering and ecological modeling methods to address concerns that neither of the models could have addressed alone: (a) the water quality model could not have addressed quantitatively the possible impacts to fish, and (b) the food web model could not have examined how phytoplankton availability might change due to reservoir operations.
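The diet-proportion estimation from isotopic signatures can be illustrated with a minimal two-source, two-isotope mixing model; the delta values below are invented, and the study's optimization handled many food web components simultaneously rather than this reduced case:

```python
import numpy as np

# Hedged sketch of a two-source, two-isotope mixing model of the kind used
# to estimate diet proportions from carbon and nitrogen stable isotope
# signatures. All delta values are invented for illustration.
# Consumer signature = p1*source1 + p2*source2, with p1 + p2 = 1.
sources = np.array([
    [-20.0, 8.0],   # source 1: (delta13C, delta15N)
    [-28.0, 4.0],   # source 2
])
consumer = np.array([-26.0, 5.0])

# Augment with the mass-balance constraint p1 + p2 = 1 and solve by least
# squares (the study's optimization generalizes this to many components).
A = np.vstack([sources.T, np.ones(2)])          # 3 equations, 2 unknowns
b = np.concatenate([consumer, [1.0]])
p, *_ = np.linalg.lstsq(A, b, rcond=None)
print(p.round(3))  # -> [0.25 0.75]
```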
How much separation for LC-MS/MS quantitative bioanalysis of drugs and metabolites?
Tan, Aimin; Fanaras, John C
2018-05-01
LC-MS/MS has been the dominant analytical technology for quantitative bioanalysis of drugs and metabolites for more than two decades. Despite this, a very fundamental question like how much separation is required for LC-MS/MS quantitative bioanalysis of drugs and metabolites has not been adequately addressed. Some think that no or only very limited separation is necessary thanks to the unparalleled selectivity offered by tandem mass spectrometry. Others think that the more separation, the better, because of the potential detrimental impact of matrix effect (ion suppression or enhancement). Still others just use a rule-of-thumb approach by keeping the adjusted retention/capacity factor always between 2 and 5. The purpose of this article is to address this fundamental question through rational thinking together with various real case examples drawn from regulated bioanalytical laboratories.
Pattern Search in Multi-structure Data: A Framework for the Next-Generation Evidence-based Medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sukumar, Sreenivas R; Ainsworth, Keela C
With the advent of personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) together with qualitative descriptions (specialist reports after reading images, biomedical knowledge bases) to predict diagnostic risks is fast emerging. Addressing this need, we pose and address the following questions: (i) How can we jointly analyze both qualitative and quantitative data? (ii) Is the fusion of multi-structure data expected to provide better insights than either of them individually? We present experiments on two biomedical data sets - mammography and traumatic brain studies - to demonstrate architectures and tools for evidence-pattern search.
Challenges in Higher Education Research: The Use of Quantitative Tools in Comparative Analyses
ERIC Educational Resources Information Center
Reale, Emanuela
2014-01-01
Although the value of the comparative perspective for the study of higher education is widely recognised, there is little consensus about the specific methodological approaches. Quantitative tools are relevant for comparative analyses since they are supposed to reduce complexity and to identify and grade similarities…
In an adverse outcome pathway (AOP), the target site dose participates in a molecular initiating event (MIE), which in turn triggers a sequence of key events leading to an adverse outcome (AO). Quantitative AOPs (QAOP) are needed if AOP characterization is to address risk as well...
Leadership Trust in Virtual Teams Using Communication Tools: A Quantitative Correlational Study
ERIC Educational Resources Information Center
Clark, Robert Lynn
2014-01-01
The purpose of this quantitative correlational study was to address leadership trust in virtual teams using communication tools in a small, south-central, family-owned pharmaceutical organization with multiple dispersed locations in the United States. The results of the current research study could assist leaders to develop a communication…
ERIC Educational Resources Information Center
Gilstrap, Donald L.
2013-01-01
In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…
Hou, Zhifei; Sun, Guoxiang; Guo, Yong
2016-01-01
The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation and the composition similarities have been calculated, the LQPM can employ a classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard.
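One plausible reading of the two similarity indices can be sketched numerically; the definitions used here (correlation for the qualitative index, a through-origin regression slope for the quantitative index) and the peak areas are assumptions for illustration, not taken from the paper:

```python
import numpy as np

# Hedged sketch of the contrast between qualitative and quantitative
# fingerprint similarity. A sample with the same peak profile but only
# 70% of the content looks identical to the reference qualitatively,
# while the quantitative index reveals the content difference.
reference = np.array([100.0, 80.0, 60.0, 20.0, 10.0])  # invented peak areas
sample = 0.7 * reference                               # same shape, 70% content

# Qualitative similarity: correlation between the two fingerprints.
qualitative = np.corrcoef(reference, sample)[0, 1]
# Quantitative similarity: slope of a through-origin regression of
# sample on reference (sensitive to absolute content of all components).
quantitative = np.dot(reference, sample) / np.dot(reference, reference)

print(round(qualitative, 3))   # -> 1.0  (identical shape)
print(round(quantitative, 3))  # -> 0.7  (30% lower total content)
```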
Cilia, M.; Fish, T.; Yang, X.; Mclaughlin, M.; Thannhauser, T. W.
2009-01-01
Protein extraction methods can vary widely in reproducibility and in representation of the total proteome, yet there are limited data comparing protein isolation methods. The methodical comparison of protein isolation methods is the first critical step for proteomic studies. To address this, we compared three methods for isolation, purification, and solubilization of insect proteins. The aphid Schizaphis graminum, an agricultural pest, was the source of insect tissue. Proteins were extracted using TCA in acetone (TCA-acetone), phenol, or multi-detergents in a chaotrope solution. Extracted proteins were solubilized in a multiple chaotrope solution and examined using 1-D and 2-D electrophoresis and compared directly using 2-D Difference Gel Electrophoresis (2-D DIGE). Mass spectrometry was used to identify proteins from each extraction type. We were unable to ascribe the differences in the proteins extracted to particular physical characteristics, cell location, or biological function. The TCA-acetone extraction yielded the greatest amount of protein from aphid tissues. Each extraction method isolated a unique subset of the aphid proteome. The TCA-acetone method was explored further for its quantitative reliability using 2-D DIGE. Principal component analysis showed that little of the variation in the data was a result of technical issues, thus demonstrating that the TCA-acetone extraction is a reliable method for preparing aphid proteins for a quantitative proteomics experiment. These data suggest that although the TCA-acetone method is a suitable method for quantitative aphid proteomics, a combination of extraction approaches is recommended for increasing proteome coverage when using gel-based separation techniques. PMID:19721822
NASA Astrophysics Data System (ADS)
Dietrich, Klaus; Brülisauer, Martina; Çağın, Emine; Bertsch, Dietmar; Lüthi, Stefan; Heeb, Peter; Stärker, Ulrich; Bernard, André
2017-06-01
The applications of fluorescence microscopy span medical diagnostics, bioengineering and biomaterial analytics. Full exploitation of fluorescence microscopy is hampered by imperfections in illumination, detection and filtering. Errors mainly stem from real-world components that induce spatial or angular variations in propagation properties along the optical path; these deviations can be addressed through consistent and accurate calibration. For many applications, a uniform signal-to-noise ratio (SNR) over the imaging area is required. Homogeneous SNR can be achieved by quantifying and compensating for the signal bias. We present a method to quantitatively characterize novel reference materials as a calibration reference for biomaterials analytics. The reference materials under investigation comprise thin layers of fluorophores embedded in polymer matrices. These layers are highly homogeneous in their fluorescence response, with cumulative variations not exceeding 1% over the field of view (1.5 x 1.1 mm). An automated and reproducible measurement methodology, enabling sufficient correction for measurement artefacts, is reported. The measurement setup is equipped with an autofocus system, ensuring that the measured film quality is not artificially increased by out-of-focus reduction of the system modulation transfer function. The quantitative characterization method is suitable for analysis of modified biomaterials, especially those with patterned protein decoration. The imaging method presented here can be used to statistically analyze protein patterns, thereby increasing both precision and throughput. Further, the method can be developed to include a reference emitter and detector pair on the image surface of the reference object, in order to provide traceable measurements.
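The bias-compensation idea, using an image of a homogeneous fluorescent reference layer to divide out illumination non-uniformity, can be sketched as a standard flat-field correction; the image values and layout below are synthetic assumptions:

```python
import numpy as np

# Hedged sketch of shading (flat-field) correction using an image of a
# homogeneous fluorescent reference layer. All values are synthetic.
true_scene = np.full((4, 4), 100.0)                 # ideally uniform signal
shading = np.linspace(0.8, 1.2, 16).reshape(4, 4)   # illumination bias

flat = 200.0 * shading   # image of the homogeneous reference layer
raw = true_scene * shading  # biased measurement of the scene

# Normalize the flat field to unit mean, then divide it out.
corrected = raw / (flat / flat.mean())

# Residual non-uniformity drops to numerical noise.
print(corrected.std() / corrected.mean() < 1e-9)  # -> True
```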
Balasubramanian, M; Spencer, A J; Short, S D; Watkins, K; Chrisopoulos, S; Brennan, D S
2016-09-01
The integration of qualitative and quantitative approaches introduces new avenues to bridge the strengths, and address the weaknesses, of both methods. The aim was to develop measure(s) of migrant dentist experiences in Australia through a mixed methods approach. The sequential qualitative-quantitative design involved first the harvesting of data items from a qualitative study, followed by a national survey of migrant dentists in Australia. Statements representing unique experiences in migrant dentists' life stories were deployed in the survey questionnaire, using a five-point Likert scale. Factor analysis was used to examine component factors. Eighty-two statements from 51 participants were harvested from the qualitative analysis. A total of 1,022 of 1,977 migrant dentists (response rate 54.5%) returned completed questionnaires. Factor analysis supported an initial eight-factor solution; further scale development and reliability analysis led to five scales with a final list of 38 life story experience (LSE) items. Three scales were based on home country events: health system and general lifestyle concerns (LSE1; 10 items), society and culture (LSE4; 4 items) and career development (LSE5; 4 items). Two scales covered migrant experiences in Australia: appreciation of the Australian way of life (LSE2; 13 items) and settlement concerns (LSE3; 7 items). The five life story experience scales provide the necessary conceptual clarity and empirical grounding to explore migrant dentist experiences in Australia. Being based on original migrant dentist narrations, these scales have the potential to offer in-depth insights for policy makers and to support future research on dentist migration.
Wachowiak, Roman; Strach, Bogna
2006-01-01
The study takes advantage of currently available, effective physicochemical methods (isolation, crystallization, determination of melting point, TLC, GLC and UV spectrophotometry) for an objective and reliable qualitative and quantitative analysis of frequently abused drugs. The authors determined the conditions for qualitative and quantitative analysis of the active components of secured evidence materials containing amphetamine sulphate, methylamphetamine hydrochloride and 3,4-methylenedioxymethamphetamine hydrochloride (MDMA, Ecstasy), as well as delta(9)-tetrahydrocannabinol (delta(9)-THC) as an active component of cannabis (marihuana, hashish). The usefulness of physicochemical tests of evidence materials for opinionating purposes is subject to a detailed forensic toxicological interpretation.
NASA Astrophysics Data System (ADS)
Quatela, Alessia; Gilmore, Adam M.; Steege Gall, Karen E.; Sandros, Marinella; Csatorday, Karoly; Siemiarczuk, Alex; Yang, Boqian (Ben); Camenen, Loïc
2018-04-01
We investigate the new simultaneous absorbance-transmission and fluorescence excitation-emission matrix method for rapid and effective characterization of the varying components of a mixture. The method uniquely facilitates correction of fluorescence inner-filter effects to yield quantitative fluorescence spectral information that is largely independent of component concentration. This is significant because it allows one to effectively monitor quantitative component changes using multivariate methods and to generate and evaluate spectral libraries. We present the use of this novel instrument in different fields, e.g. tracking changes in complex mixtures such as natural water and wine, as well as monitoring the stability and aggregation of hormones for biotherapeutics.
Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C
2016-02-01
Promising development and expansion of the market for cham-cham, a traditional Indian dairy product, is expected in the coming years with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in the sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out the attributes that govern much of the variation in sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p < 0.05) differences in sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4% of the variation in the sensory data. Factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the attributes of cham-cham that contribute most to its sensory acceptability.
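The PCA step reported above can be sketched with a minimal variance-explained computation; the sensory score matrix here is randomly generated, not the study's data:

```python
import numpy as np

# Hedged sketch of PCA on a panel's sensory scores (samples x attributes),
# reporting the fraction of variance explained by each principal
# component. The scores are invented for illustration.
rng = np.random.default_rng(1)
scores = rng.normal(5.0, 1.0, size=(12, 6))  # 12 samples, 6 attributes

centered = scores - scores.mean(axis=0)      # center each attribute
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)              # variance fractions, descending

print(explained.round(3))                # fraction of variance per component
print(round(float(explained.sum()), 3))  # -> 1.0
```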
Qualitative and quantitative aspects of pain in lateral posterior thoracotomy patients.
Xavier, Thaiza Teixeira; Torres, Gilson de Vasconcelos; da Rocha, Vera Maria
2006-01-01
A descriptive study that set out to compare the qualitative and quantitative characteristics of pain in lateral posterior thoracotomy patients. The sample consisted of 18 individuals with an average age of 44 years. The instruments used were a physiotherapy evaluation form, a numerical pain scale and the McGill Pain Questionnaire. Pain on the numerical scale was rated moderate (5) by both sexes. The McGill descriptors chosen most frequently by the patients were: in the sensorial component, beat, pointed, shock, final and pull; in the affective component, tired, bored, punishing and miserable; and in the evaluative component, flat. The sensorial characteristics of pain were more evident in the male group. No statistically significant differences were observed between men and women in the quantitative pain ratings. On the qualitative side, the same affective pain descriptors predominated for both sexes. Pain intensity was categorized as moderate. No statistically significant difference was observed in post-operative pain following lateral posterior thoracotomy. These data demonstrate the need for an analysis with a larger study group.
The mathematics of cancer: integrating quantitative models.
Altrock, Philipp M; Liu, Lin L; Michor, Franziska
2015-12-01
Mathematical modelling approaches have become increasingly abundant in cancer research. The complexity of cancer is well suited to quantitative approaches as it provides challenges and opportunities for new developments. In turn, mathematical modelling contributes to cancer research by helping to elucidate mechanisms and by providing quantitative predictions that can be validated. The recent expansion of quantitative models addresses many questions regarding tumour initiation, progression and metastases as well as intra-tumour heterogeneity, treatment responses and resistance. Mathematical models can complement experimental and clinical studies, but also challenge current paradigms, redefine our understanding of mechanisms driving tumorigenesis and shape future research in cancer biology.
A Quantitative Approach to Assessing System Evolvability
NASA Technical Reports Server (NTRS)
Christian, John A., III
2004-01-01
When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.
[Content determination of twelve major components in Tibetan medicine Zuozhu Daxi by UPLC].
Qu, Yan; Li, Jin-hua; Zhang, Chen; Li, Chun-xue; Dong, Hong-jiao; Wang, Chang-sheng; Zeng, Rui; Chen, Xiao-hu
2015-05-01
A quantitative analytical method based on ultra-high performance liquid chromatography (UPLC) was developed for simultaneously determining twelve components in the Tibetan medicine Zuozhu Daxi. SIMCA 12.0 software was used for principal component analysis (PCA) and partial least squares discriminant analysis (PLS-DA) of the twelve components in 10 batches from four pharmaceutical factories. An Acquity UPLC BEH C18 column (2.1 mm x 100 mm, 1.7 µm) was adopted at a column temperature of 35 °C and eluted with acetonitrile (A)-0.05% phosphoric acid solution (B) as the mobile phase at a flow rate of 0.3 mL·min(-1). The injection volume was 1 µL. The detection wavelengths were set at 210 nm for alantolactone, isoalantolactone and oleanolic acid; 260 nm for strychnine and brucine; 288 nm for protopine; 306 nm for protopine, resveratrol and piperine; and 370 nm for quercetin and isorhamnetin. The results showed good separation among the index components, with a good linear relationship (R2 = 0.9996) within the selected concentration range. The average sample recovery rates ranged between 99.44% and 101.8%, with RSDs between 0.37% and 1.7%, indicating that the method is rapid and accurate with good repeatability and stability. The PCA and PLS-DA analysis of the sample determination results revealed a great difference among samples from different pharmaceutical factories. The twelve components included in this study contributed significantly to the quantitative determination of the intrinsic quality of Zuozhu Daxi. The UPLC method established for the quantitative determination of the twelve components can provide a scientific basis for the comprehensive quality evaluation of Zuozhu Daxi.
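Two of the routine validation figures quoted above, calibration linearity (R2) and recovery, can be sketched as follows; the concentrations, peak areas, and spike amounts are invented for illustration:

```python
import numpy as np

# Hedged sketch of linearity and recovery checks behind method validation.
# All concentrations and detector responses are invented.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # e.g. micrograms per mL
area = np.array([10.2, 20.1, 40.5, 80.3, 160.9])  # detector response

# Linearity: fit a calibration line and compute R^2.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

# Recovery: amount found in a spiked sample versus the amount added.
added, found = 2.0, 1.99
recovery = 100.0 * found / added

print(r2 > 0.999)          # -> True (strong linearity)
print(round(recovery, 1))  # -> 99.5
```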
2004-09-01
The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my own research, I found it refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified, and a rationale for using each methodological stance is provided.
ERIC Educational Resources Information Center
Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim
2014-01-01
Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…
Quantitative Mixing Measurements of a Supersonic Injection COIL Nozzle with Trip Jets
2008-06-13
Noren, Carrie A.
Air Force Research Laboratory, 3550 Aberdeen Ave SE, Kirtland AFB, NM 87117-5776
Multivariate Quantitative Chemical Analysis
NASA Technical Reports Server (NTRS)
Kinchen, David G.; Capezza, Mary
1995-01-01
Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
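A minimal sketch of the underlying idea: estimating the relative proportions of two components from a measured spectrum by least squares against pure-component reference spectra. The spectra below are synthetic, and the actual technique may use different calibration data:

```python
import numpy as np

# Hedged sketch: infer the relative proportions of two components in a
# mixture from its spectrum, given reference spectra of the pure
# components. All spectra are synthetic.
comp_a = np.array([0.9, 0.5, 0.1, 0.3])  # pure-component reference spectra
comp_b = np.array([0.2, 0.4, 0.8, 0.6])

mixture = 0.6 * comp_a + 0.4 * comp_b    # measured spectrum of a 60:40 blend

# Solve mixture = A @ proportions by least squares.
A = np.column_stack([comp_a, comp_b])
props, *_ = np.linalg.lstsq(A, mixture, rcond=None)
print(props.round(3))  # -> [0.6 0.4]
```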
School Districts-University Partnerships: A College-Readiness Program
ERIC Educational Resources Information Center
Labas, Gladys; Minaya Rowe, Liliana
2010-01-01
This paper reports on how ten school districts and a state university system address a state-funded college readiness program for high school student achievement in mathematics and English. It addresses in qualitative and quantitative detail: (a) the school-university partnership designed to decrease the number of high school students that require…
Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela
2014-04-22
In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We here describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-difference analysis comparing selected indicators across intervention and control facilities/catchment populations over time. To further explain heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives on clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service users, non-users, and policy stakeholders.
In this explanatory design, a comprehensive understanding of the expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights into how and why the investigated interventions produce certain intended and unintended effects, allowing for a more in-depth evaluation.
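The difference-in-difference estimate at the heart of the quantitative component reduces to simple arithmetic on group means; the coverage figures below are invented for illustration:

```python
# Hedged sketch of the difference-in-difference (DiD) estimator: the
# pre-to-post change in the intervention group minus the change in the
# control group. The group means are invented (e.g. % service coverage).
pre_intervention, post_intervention = 40.0, 55.0
pre_control, post_control = 42.0, 47.0

did = (post_intervention - pre_intervention) - (post_control - pre_control)
print(did)  # -> 10.0
```

Under the parallel-trends assumption, the control group's change (5 points) estimates what would have happened without the intervention, so the remaining 10 points are attributed to the intervention.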
Quantitative Genetic Interactions Reveal Layers of Biological Modularity
Beltrao, Pedro; Cagney, Gerard; Krogan, Nevan J.
2010-01-01
In the past, biomedical research has embraced a reductionist approach, primarily focused on characterizing the individual components that comprise a system of interest. Recent technical developments have significantly increased the size and scope of data describing biological systems. At the same time, advances in the field of systems biology have evoked a broader view of how the underlying components are interconnected. In this essay, we discuss how quantitative genetic interaction mapping has enhanced our view of biological systems, allowing a deeper functional interrogation at different biological scales. PMID:20510918
Component-Based and Whole-Mixtures Assessments in Addressing the Unidentified Fraction of Complex Mixtures: Drinking Water as an Example
J. E. Simmons; L. K. Teuschler; C. Gennings; T. F. Speth; S. D. Richardson; R. J. Miltner; M. G. Narotsky; K. D. Schenck; G. Rice
How to Combine ChIP with qPCR.
Asp, Patrik
2018-01-01
Chromatin immunoprecipitation (ChIP) coupled with quantitative PCR (qPCR) has in the last 15 years become a basic mainstream tool in genomic research. Numerous commercially available ChIP kits, qPCR kits, and real-time PCR systems allow for quick and easy analysis of virtually anything chromatin-related, as long as an antibody is available. However, the highly accurate quantitative dimension added by using qPCR to analyze ChIP samples significantly raises the bar in terms of experimental accuracy, appropriate controls, data analysis, and data presentation. This chapter addresses these potential pitfalls by providing protocols and procedures that deal with the difficulties inherent in ChIP-qPCR assays.
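A common quantitative readout for ChIP-qPCR, the percent-input method, can be sketched as follows; the Ct values and the 1% input fraction are illustrative assumptions, and the chapter's own protocols may differ:

```python
import math

# Hedged sketch of the standard percent-input calculation for ChIP-qPCR.
# Ct values are invented; the input sample is assumed to represent 1% of
# the chromatin used in the IP, so its Ct is first adjusted by log2(100).
ct_input_1pct = 24.0   # Ct of the 1% input sample
ct_ip = 28.0           # Ct of the immunoprecipitated sample

ct_input_100pct = ct_input_1pct - math.log2(100)   # adjust to 100% input
percent_input = 100.0 * 2 ** (ct_input_100pct - ct_ip)
print(round(percent_input, 2))  # -> 0.06
```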
Drivers of the OCT market growth in clinical applications
NASA Astrophysics Data System (ADS)
Bouyé, Clémentine; d'Humières, Benoît.
2015-03-01
The development of Optical Coherence Tomography (OCT) systems started in the 1990s. The technology found early application in ophthalmology, and today more than 75% of the OCT market comes from the ophthalmic sector. However, growth of the ophthalmology OCT market has been slowing in recent years. In this article, we present the results of our prospective study on the drivers of OCT market growth in the coming years, based on bibliographical research and interviews with key players. In parallel with its use in ophthalmology, OCT has been developed and tested in new medical domains such as cardiology, dermatology and gastroenterology. OCT addresses key societal challenges such as the diagnosis of coronary artery disease or skin cancer, and there is strong demand for fast, high-resolution, label-free and in vivo imaging tools in these fields. For the last 5 years, R&D efforts have focused on improving the performance and compactness of OCT components and sub-systems. Advances in integrated photonics will enable the miniaturization of components and sub-systems and thus pave the way to point-of-care applications. Moreover, new functionalities of OCT systems are being developed to support more complex diagnoses. OCT will no longer be a simple imaging device; it is on the verge of becoming a quantitative measurement tool. Our study shows that the emergence of new applications, along with improvements in component performance and progress in functional OCT, will drive OCT market growth in the coming years.
Shao, Xi; Lv, Lishuang; Parks, Tiffany; Wu, Hou; Ho, Chi-Tang; Sang, Shengmin
2010-01-01
For the first time, a sensitive reversed-phase HPLC electrochemical array method has been developed for the quantitative analysis of eight major ginger components ([6]-, [8]-, and [10]-gingerol, [6]-, [8]-, and [10]-shogaol, [6]-paradol, and [1]-dehydrogingerdione) in eleven ginger-containing commercial products. The method was validated and highly sensitive, with limits of detection of 7.3–20.2 pg and limits of quantification of 14.5–40.4 pg. Using this method, we quantified the levels of the eight ginger components in eleven different commercial products. Our results showed that both the levels of, and the ratios among, the eight compounds vary greatly across commercial products. PMID:21090746
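The reported limits of detection and quantification are conventionally estimated from the calibration curve; a minimal sketch of the standard ICH-style formulas (LOD = 3.3σ/S, LOQ = 10σ/S) follows. The function name is illustrative, and the authors' exact validation procedure may differ.

```python
def detection_limits(sigma, slope):
    """ICH Q2-style estimates: LOD = 3.3*sigma/S and LOQ = 10*sigma/S,
    where sigma is the standard deviation of the response (e.g. of the
    blank, or of the regression residuals) and S the calibration slope."""
    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    return lod, loq
```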
NASA Technical Reports Server (NTRS)
Vander Velde, W. E.; Carignan, C. R.
1984-01-01
One of the first questions facing the designer of the control system for a large space structure is how many components - actuators and sensors - to specify and where to place them on the structure. This paper presents a methodology which is intended to assist the designer in making these choices. A measure of controllability is defined which is a quantitative indication of how well the system can be controlled with a given set of actuators. Similarly, a measure of observability is defined which is a quantitative indication of how well the system can be observed with a given set of sensors. Then the effect of component unreliability is introduced by computing the average expected degree of controllability (observability) over the operating lifetime of the system, accounting for the likelihood of various combinations of component failures. The problem of component location is resolved by optimizing this performance measure over the admissible set of locations. The variation of this optimized performance measure with the number of actuators (sensors) is helpful in deciding how many components to use.
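For intuition, here is a toy sketch of a controllability measure: for a two-state single-input system x' = Ax + bu, the determinant of the controllability matrix [b, Ab] gives a crude quantitative indication of how well a given actuator placement can control the system. This is a simplified stand-in, not the paper's lifetime-averaged measure; the function names are illustrative.

```python
def mat_vec(A, v):
    """Multiply a small matrix (list of rows) by a vector."""
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def controllability_measure(A, b):
    """For a 2-state single-input system, build the controllability
    matrix [b, Ab] and use |det| as a crude scalar measure of how
    controllable the actuator placement is (zero means uncontrollable)."""
    ab = mat_vec(A, b)
    det = b[0] * ab[1] - b[1] * ab[0]
    return abs(det)
```

For a double integrator (A = [[0,1],[0,0]]), a force actuator (b = [0,1]) yields a nonzero measure, while b = [1,0] is uncontrollable, matching the intuition that actuator placement matters.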
NASA Astrophysics Data System (ADS)
Reed, Krystal Astra
The Advancement via Individual Determination (AVID) program "was designed to provide resources and strategies that enable underrepresented minority students to attend 4-year colleges" (AVID Center, 2013, p. 2). These students are characterized as the forgotten middle in that they have high test scores, average-to-low grades, minority or low socioeconomic status, and will be first-generation college students (AVID, 2011). Research indicates (Huerta, Watt, & Butcher, 2013) that strict adherence to 11 program components supports the success of students enrolled in AVID, and AVID certification depends on districts following those components. Several studies (AVID Center, 2013) have investigated claims about the AVID program through qualitative analyses; however, very few have addressed the program quantitatively. This researcher sought to determine whether differences existed in student achievement and attendance rates between AVID and non-AVID middle schools. To achieve this goal, the researcher compared eighth-grade science and seventh- and eighth-grade mathematics scores from the 2007 to 2011 Texas Assessment of Knowledge and Skills (TAKS) and overall attendance rates in demographically equivalent AVID and non-AVID middle schools. Academic Excellence Indicator System (AEIS) reports from the Texas Education Agency (TEA) were used to obtain 2007 to 2011 TAKS results and attendance information for the selected schools. The results indicated a statistically significant difference between AVID demonstration students and non-AVID students in schools with similar CI. No statistically significant differences were found on any component of the TAKS for AVID economically disadvantaged students. The mean scores indicated an achievement gap between non-AVID and AVID demonstration middle schools.
The findings from the other three research questions indicated no statistically significant differences between AVID and non-AVID student passing rates on the seventh- and eighth-grade TAKS math tests or in overall attendance rates. The mean scores on the eighth-grade TAKS science test revealed some positive results in the academic performance of economically disadvantaged students in non-AVID demonstration middle schools. Specifically, the results indicated that the mean passing percentage of AVID demonstration schools was lower than that of non-AVID middle schools. The TAKS scores showed a small achievement gap between non-AVID and AVID demonstration middle schools.
Ju, Jin Hyun; Shenoy, Sushila A.; Crystal, Ronald G.; Mezey, Jason G.
2017-01-01
Genome-wide expression Quantitative Trait Loci (eQTL) studies in humans have provided numerous insights into the genetics of both gene expression and complex diseases. While the majority of eQTL identified in genome-wide analyses impact a single gene, eQTL that impact many genes are particularly valuable for network modeling and disease analysis. To enable the identification of such broad impact eQTL, we introduce CONFETI: Confounding Factor Estimation Through Independent component analysis. CONFETI is designed to address two conflicting issues when searching for broad impact eQTL: the need to account for non-genetic confounding factors that can lower the power of the analysis or produce broad impact eQTL false positives, and the tendency of methods that account for confounding factors to model broad impact eQTL as non-genetic variation. The key advance of the CONFETI framework is the use of Independent Component Analysis (ICA) to identify variation likely caused by broad impact eQTL when constructing the sample covariance matrix used for the random effect in a mixed model. We show that CONFETI has better performance than other mixed model confounding factor methods when considering broad impact eQTL recovery from synthetic data. We also used the CONFETI framework and these same confounding factor methods to identify eQTL that replicate between matched twin pair datasets in the Multiple Tissue Human Expression Resource (MuTHER), the Depression Genes Networks study (DGN), the Netherlands Study of Depression and Anxiety (NESDA), and multiple tissue types in the Genotype-Tissue Expression (GTEx) consortium. These analyses identified both cis-eQTL and trans-eQTL impacting individual genes, and CONFETI had better or comparable performance to other mixed model confounding factor analysis methods when identifying such eQTL. In these analyses, we were able to identify and replicate a few broad impact eQTL although the overall number was small even when applying CONFETI. 
In light of these results, we discuss the broad impact eQTL that have been previously reported from the analysis of human data and suggest that considerable caution should be exercised when making biological inferences based on these reported eQTL. PMID:28505156
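To make the covariance-construction step concrete, here is a heavily simplified sketch: given per-sample loadings of a few retained components (in CONFETI, those selected by ICA), a sample-by-sample matrix K = L·L^T can serve as the random-effect covariance in a mixed model. This is an illustration of the general idea only, with hypothetical names; it omits CONFETI's actual component selection and normalization.

```python
def covariance_from_components(loadings):
    """Illustrative only: build a sample-by-sample similarity matrix
    K = L L^T from the per-sample loadings of retained components.
    Each element of `loadings` is one component, giving one loading
    per sample."""
    n = len(loadings[0])
    K = [[0.0] * n for _ in range(n)]
    for comp in loadings:
        for i in range(n):
            for j in range(n):
                K[i][j] += comp[i] * comp[j]
    return K
```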
SYSTEMS BIOLOGY MODEL DEVELOPMENT AND APPLICATION
Systems biology models holistically describe, in a quantitative fashion, the relationships between different levels of a biological system. Relationships between individual components of a system are delineated. Systems biology models describe how the components of the system inter...
Ghosh, Debasree; Chattopadhyay, Parimal
2012-06-01
The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R² = 0.8). The results from PCA were statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
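The variance-explained computation underlying PCA can be illustrated in closed form for two attributes: the eigenvalues of the 2×2 covariance matrix give the variance along each principal component. A minimal sketch with illustrative names (the study itself retained six components over many attributes):

```python
def pc1_variance_fraction(xs, ys):
    """Fraction of total variance captured by the first principal
    component of two attributes, via the closed-form eigenvalues of
    the 2x2 sample covariance matrix."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]] from trace and determinant.
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    lam1 = tr / 2 + (tr ** 2 / 4 - det) ** 0.5
    return lam1 / tr
```

Perfectly correlated attributes give a fraction of 1.0; uncorrelated attributes split the variance according to their individual spreads.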
Hou, Zhifei; Sun, Guoxiang; Guo, Yong
2016-01-01
The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating, owing to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation in an immediate detection manner and the composition similarities have been calculated, LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
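To illustrate the distinction between qualitative and quantitative fingerprint similarity, a minimal sketch follows: a cosine (congruence) measure is insensitive to overall concentration, while a total-content ratio captures the quantitative difference. These are generic conventions used for illustration, not the paper's exact linear similarity definitions.

```python
import math

def cosine_similarity(f, g):
    """Qualitative fingerprint similarity: cosine between peak-area
    vectors (unchanged if the whole sample is diluted or concentrated)."""
    dot = sum(a * b for a, b in zip(f, g))
    norm_f = math.sqrt(sum(a * a for a in f))
    norm_g = math.sqrt(sum(b * b for b in g))
    return dot / (norm_f * norm_g)

def content_ratio(f, g):
    """A simple quantitative comparison: total peak area of the sample
    relative to the reference fingerprint."""
    return sum(f) / sum(g)
```

A sample that is an exact 2x concentrate of the reference scores 1.0 on the qualitative measure but 2.0 on the quantitative one, which is why the two measures discriminate differently.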
Watson, Roger
2015-04-01
This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.
Propagating Qualitative Values Through Quantitative Equations
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak
1992-01-01
In most practical problems where traditional numeric simulation is not adequate, one needs to reason about a system with both qualitative and quantitative equations. In this paper, we address the problem of propagating qualitative values, represented as interval values, through quantitative equations. Previous research has produced exponential-time algorithms for approximate solution of the problem. These may not meet the stringent requirements of many real-time applications. This paper advances the state of the art by producing a linear-time algorithm that can propagate a qualitative value through a class of complex quantitative equations exactly, and through arbitrary algebraic expressions approximately. The algorithm was found applicable to a Space Shuttle Reaction Control System model.
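The core operation, propagating interval-valued qualitative values through quantitative equations, can be sketched with naive interval arithmetic. Note that naive evaluation over-approximates when a variable occurs more than once (the dependency problem), which is precisely where exact propagation, as in the paper's linear-time algorithm, is non-trivial. The class and operators below are illustrative.

```python
class Interval:
    """Minimal interval arithmetic for propagating qualitative values
    (represented as [lo, hi] ranges) through quantitative equations."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product interval is bounded by the extreme endpoint products.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))
```

For example, with x in [-1, 2] and y in [3, 4], the product x*y propagates to [-4, 8].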
Behavioral Assembly Required: Particularly for Quantitative Courses
ERIC Educational Resources Information Center
Mazen, Abdelmagid
2008-01-01
This article integrates behavioral approaches into the teaching and learning of quantitative subjects with application to statistics. Focusing on the emotional component of learning, the article presents a system dynamic model that provides descriptive and prescriptive accounts of learners' anxiety. Metaphors and the metaphorizing process are…
Azelaic acid gel 15%: clinical versatility in the treatment of rosacea.
Del Rosso, James Q; Baum, Eric W; Draelos, Zoe Diana; Elewski, Boni E; Fleischer, Alan B; Kakita, Lenore S; Thiboutot, Diane
2006-11-01
There are numerous proposed but contested components involved in the pathophysiology of rosacea, including inflammatory mediators, reactive oxygen species (ROS) released by neutrophils, and microbial components. Ideal comprehensive rosacea management should address these components. Azelaic acid (AzA), a naturally occurring substance, has many proposed mechanisms of action--antimicrobial, anti-inflammatory/antioxidant, and keratinolytic--that address the proposed components of rosacea pathophysiology and has demonstrated efficacy in subtype 2 rosacea. In a roundtable discussion, information leaders discussed the pathophysiology of rosacea and other issues of importance to successful rosacea management, such as skin care regimens, quality of life (QOL), and compliance.
A quantitative framework for the forward design of synthetic miRNA circuits.
Bloom, Ryan J; Winkler, Sally M; Smolke, Christina D
2014-11-01
Synthetic genetic circuits incorporating regulatory components based on RNA interference (RNAi) have been used in a variety of systems. A comprehensive understanding of the parameters that determine the relationship between microRNA (miRNA) and target expression levels is lacking. We describe a quantitative framework supporting the forward engineering of gene circuits that incorporate RNAi-based regulatory components in mammalian cells. We developed a model that captures the quantitative relationship between miRNA and target gene expression levels as a function of parameters, including mRNA half-life and miRNA target-site number. We extended the model to synthetic circuits that incorporate protein-responsive miRNA switches and designed an optimized miRNA-based protein concentration detector circuit that noninvasively measures small changes in the nuclear concentration of β-catenin owing to induction of the Wnt signaling pathway. Our results highlight the importance of methods for guiding the quantitative design of genetic circuits to achieve robust, reliable and predictable behaviors in mammalian cells.
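A toy steady-state version of such a miRNA-target relationship can be written down directly: if each of n target sites contributes a miRNA-dependent degradation term, remaining target expression falls hyperbolically with miRNA level. This is an illustrative Michaelis-like form with hypothetical parameter names, not the fitted model from the paper.

```python
def target_fold_change(mirna, n_sites, k_knockdown, mrna_decay):
    """Illustrative steady state: basal decay competes with n_sites
    miRNA-dependent degradation terms, so the remaining fraction of
    target expression is decay / (decay + n * k * [miRNA])."""
    return mrna_decay / (mrna_decay + n_sites * k_knockdown * mirna)
```

The sketch reproduces the qualitative behavior the framework quantifies: more target sites, or more miRNA, mean stronger knockdown.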
Quantitative Outcomes of a One Health approach to Study Global Health Challenges.
Falzon, Laura C; Lechner, Isabel; Chantziaras, Ilias; Collineau, Lucie; Courcoul, Aurélie; Filippitzi, Maria-Eleni; Laukkanen-Ninios, Riikka; Peroz, Carole; Pinto Ferreira, Jorge; Postma, Merel; Prestmo, Pia G; Phythian, Clare J; Sarno, Eleonora; Vanantwerpen, Gerty; Vergne, Timothée; Grindlay, Douglas J C; Brennan, Marnie L
2018-03-01
Having gained momentum in the last decade, the One Health initiative promotes a holistic approach to address complex global health issues. Before recommending its adoption to stakeholders, however, it is paramount to first compile quantitative evidence of the benefit of such an approach. The aim of this scoping review was to identify and summarize primary research that describes monetary and non-monetary outcomes following adoption of a One Health approach. An extensive literature search yielded a total of 42,167 references, of which 85 were included in the final analysis. The top two biotic health issues addressed in these studies were rabies and malaria; the top abiotic health issue was air pollution. Most studies described collaborations between human and animal (n = 42), or human and environmental disciplines (n = 41); commonly reported interventions included vector control and animal vaccination. Monetary outcomes were commonly expressed as cost-benefit or cost-utility ratios; non-monetary outcomes were described using disease frequency or disease burden measurements. The majority of the studies reported positive or partially positive outcomes. This paper illustrates the variety of health challenges that can be addressed using a One Health approach, and provides tangible quantitative measures that can be used to evaluate future implementations of the One Health approach.
Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane
2018-05-01
This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
Applications of a Conditional Event Algebra to Data Fusion
1988-05-01
Goodman, I. R.
...of the corresponding quantitative description. In effect, this entails choosing both an appropriate relational syntax and a...
NASA Astrophysics Data System (ADS)
Khishfe, Rola; Alshaya, Fahad S.; BouJaoude, Saouma; Mansour, Nasser; Alrudiyan, Khalid I.
2017-02-01
The purpose of this study was to examine students' understandings about the nature of science (NOS) and their arguments in the context of controversial socio-scientific issues (SSI). A total of 74 11th graders in six schools in Saudi Arabia participated in the study. The instrument used was a questionnaire consisting of four scenarios addressing SSI about global warming, genetically modified food, acid rain, and human cloning. The scenarios were followed by questions relating to argumentation and NOS. Quantitative and qualitative measures were employed to analyze the data related to participants' understandings of three NOS aspects (subjective, tentative, and empirical) and their argument components (argument, counterargument, and rebuttal). Results showed no significant correlations between argument components and the NOS aspects. On the other hand, qualitative data showed that participants who generated well-developed arguments across the four SSI also exhibited more informed understandings of the NOS aspects, especially among female participants. Further, the chi-square analyses did not show significant differences in participants' arguments and NOS understandings across the four scenarios. Again, the qualitative data from the questionnaires showed differences in participants' responses to the different scenarios. The results were interpreted in terms of contextual, emotional, and cultural factors. Implications for the teaching of NOS and argumentation are discussed.
NASA Astrophysics Data System (ADS)
Mulder, Christian; Sakorafa, Vasiliki; Burragato, Francesco; Visscher, Henk
2000-06-01
A consensus about the development of freshwater wetlands in relation to time and space is urgently required. Our study aims to address this issue by providing additional data for a fine-scaled comparison of local depositional settings of Greek mires during the Pliocene and Pleistocene. Lignite profiles exhibit phytogenic organic components (macerals) that have been used to investigate the structure of past peat-forming vegetation and its succession series. Organic petrological analysis of lignite samples from the opencast mines of Komanos (Ptolemais) and Choremi (Megalopolis) was carried out to assess water supply, wetland type, nutrient status and vegetation physiognomy. A holistic approach (a study of ecosystems as complete entities) was adopted for a paleoecological reconstruction of the mires. Huminite, liptinite and inertinite were traced by means of their chemical and morphological differences, together with their morphogenic and taphonomic affinities. The problem of combining independent information from different approaches in a multivariate calibration setup was also considered. Linear regression, non-metric multidimensional scaling and one-way analysis of variance tested the occurrence of palynological and petrological proxies. Although lignite formation and deposition are less related to humid periods than expected, the differences among the reconstructed development stages appear to be related to astronomically forced climate fluctuations.
NASA Astrophysics Data System (ADS)
Brouwers, H. J. H.; Gilding, B. H.
2006-02-01
The unsteady process of steam stripping of the unsaturated zone of soils contaminated with volatile organic compounds (VOCs) is addressed. A model is presented. It accounts for the effects of water and contaminants remaining in vapour phase, as well as diffusion and dispersion of contaminants in this phase. The model has two components. The first is a one-dimensional description of the propagation of a steam front in the start-up phase. This is based on Darcy's law and conservation laws of mass and energy. The second component describes the transport of volatile contaminants. Taking the view that non-equilibrium between liquid and vapour phases exists, it accounts for evaporation, transport, and condensation at the front. This leads to a moving-boundary problem. The moving-boundary problem is brought into a fixed domain by a suitable transformation of the governing partial differential equations, and solved numerically. For a broad range of the governing dimensionless numbers, such as the Henry, Merkel and Péclet numbers, computational results are discussed. A mathematical asymptotic analysis supports this discussion. The range of parameter values for which the model is valid is investigated. Diffusion and dispersion are shown to be of qualitative importance, but to have little quantitative effect in the start-up phase.
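The fixed-domain transformation mentioned above is typically a front-fixing (Landau) change of variable; a generic sketch follows, not necessarily the authors' exact formulation.

```latex
% Front-fixing (Landau) transformation for a boundary moving as x = s(t):
%   \xi = x / s(t), mapping the moving domain 0 < x < s(t) onto 0 < \xi < 1.
% For a generic advection--diffusion equation c_t = D c_{xx} - v c_x, the
% chain rule gives, on the fixed domain:
c_t - \frac{\xi\,\dot{s}(t)}{s(t)}\, c_\xi
    = \frac{D}{s(t)^2}\, c_{\xi\xi} - \frac{v}{s(t)}\, c_\xi ,
% where the extra advective term on the left arises from the moving
% coordinate frame and vanishes once the front is stationary.
```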
This report summarizes the discussion, conclusions, and points of consensus of the IWGT Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (QWG) based on a meeting in Foz do Iguaçu, Brazil October 31–November 2, 2013. Topics addressed incl...
ERIC Educational Resources Information Center
Carter, Jackie; Brown, Mark; Simpson, Kathryn
2017-01-01
In British social science degree programmes, methods courses have a bad press, and statistics courses in particular are not well-liked by most students. A nationally-coordinated, strategic investment in quantitative skills training, Q-Step, is an attempt to address the issues affecting the shortage of quantitatively trained humanities and social…
A high throughput geocomputing system for remote sensing quantitative retrieval and a case study
NASA Astrophysics Data System (ADS)
Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting
2011-12-01
The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computation application and one of the research issues of high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study middleware components of the remote sensing Grid - the dynamic Grid workflow - based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed graphical user interface (GUI) tools to compose remote sensing processing Grid workflows, and took aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.
Levasseur, Mélanie; Dubois, Marie-France; Généreux, Mélissa; Menec, Verena; Raina, Parminder; Roy, Mathieu; Gabaude, Catherine; Couturier, Yves; St-Pierre, Catherine
2017-05-25
To address the challenges of the global aging population, the World Health Organization promoted age-friendly communities as a way to foster the development of active aging community initiatives. Accordingly, key components (i.e., policies, services and structures related to the communities' physical and social environments) should be designed to be age-friendly and help all aging adults to live safely, enjoy good health and stay involved in their communities. Although age-friendly communities are believed to be a promising way to help aging Canadians lead healthy and active lives, little is known about which key components best foster positive health, social participation and health equity, and their underlying mechanisms. This study aims to better understand which and how key components of age-friendly communities best foster positive health, social participation and health equity in aging Canadians. Specifically, the research objectives are to: 1) Describe and compare age-friendly key components of communities across Canada; 2) Identify key components best associated with positive health, social participation and health equity of aging adults; and 3) Explore how these key components foster positive health, social participation and health equity. Methods: A mixed-method sequential explanatory design will be used. The quantitative part will involve a survey of Canadian communities and secondary analysis of cross-sectional data from the Canadian Longitudinal Study on Aging (CLSA). The survey will include an age-friendly questionnaire targeting key components in seven domains: physical environment, housing options, social environment, opportunities for participation, community supports and healthcare services, transportation options, communication and information. The CLSA is a large, national prospective study representative of the Canadian aging population designed to examine health transitions and trajectories of adults as they age.
In the qualitative part, a multiple case study will be conducted in five Canadian communities performing best on positive health, social participation and health equity. Building on new and existing collaborations and generating evidence from real-world interventions, the results of this project will help communities to promote age-friendly policies, services and structures which foster positive health, social participation and health equity at a population level.
2013-09-01
[Report documentation residue: U.S. Army Research Laboratory report ARL-TR-6576; figure-list fragments covering estimated angle-of-attack component histories and comparisons of angle-of-attack component estimates for projectile no. 2.]
Assessing the Impact of Faculty-Led Supplemental Instruction on Attrition, GPA, and Graduation Rates
ERIC Educational Resources Information Center
Bryan, Karla Needham
2013-01-01
The purpose of this quasi-experimental, quantitative study was to investigate the effectiveness of supplemental instruction (SI) as a means to address the existing high attrition and low graduation rates evidenced at a rural southern community college. The research problem addressed the high attrition rates in barrier courses and low graduation…
Ding, Shujing; Dudley, Ed; Plummer, Sue; Tang, Jiandong; Newton, Russell P; Brenton, A Gareth
2006-01-01
A reversed-phase high-performance liquid chromatography/electrospray ionisation mass spectrometry (RP-HPLC/ESI-MS) method was developed and validated for the simultaneous determination of ten major active components in Ginkgo biloba extract (bilobalide; ginkgolides A, B and C; quercetin; kaempferol; isorhamnetin; rutin hydrate; quercetin-3-beta-D-glucoside; and quercitrin hydrate), which have not previously been quantified in a single analysis. The ten components exhibit baseline separation in 50 min by C18 chromatography using a water/1:1 (v/v) methanol/acetonitrile gradient. Quantitation was performed using negative ESI-MS in selected ion monitoring (SIM) mode. Good reproducibility and recovery were obtained by this method. The sensitivities of UV detection and of different mass spectrometry modes (full scan, SIM, and selected reaction monitoring (SRM)) were compared, and quantitation both with and without an internal standard was evaluated. The analysis of commercial Ginkgo biloba products showed remarkable variations in the rutin and quercetin content as well as in the terpene lactone contents, although all the products satisfy the conventional quality control method. Copyright 2006 John Wiley & Sons, Ltd.
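The comparison of quantitation with and without an internal standard can be sketched as follows; the peak areas, concentrations and internal-standard responses below are illustrative assumptions, not data from the study:

```python
import numpy as np

# Hypothetical calibration data: peak areas of a standard at known
# concentrations (ug/mL), plus internal-standard (IS) areas per run.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
area = np.array([1.2e4, 2.5e4, 4.9e4, 12.4e4, 24.8e4])
area_is = np.array([5.0e4, 5.1e4, 4.9e4, 5.0e4, 5.2e4])

# External standard method: linear fit of area vs concentration.
slope, intercept = np.polyfit(conc, area, 1)
sample_area = 7.5e4
conc_esm = (sample_area - intercept) / slope

# Internal standard method: fit the analyte/IS area ratio instead,
# which compensates for injection and ionisation variability.
ratio = area / area_is
r_slope, r_intercept = np.polyfit(conc, ratio, 1)
sample_ratio = 7.5e4 / 5.05e4  # sample analyte area / sample IS area
conc_ism = (sample_ratio - r_intercept) / r_slope

print(round(conc_esm, 2), round(conc_ism, 2))
```

With well-behaved data the two approaches converge; the internal-standard fit is preferred when run-to-run response drifts.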
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ken Thomas; Ted Quinn; Jerry Mauck
There are significant developments underway in new types of actuators for power plant active components. Many of these make use of digital technology to provide a wide array of benefits in the performance of the actuators and in reduced burden to maintain them. These new product offerings have gained considerable acceptance in process plants. In addition, they have been used very successfully in conventional power generation. This technology has been proven to deliver the benefits promised and substantiate the claims of improved performance. The nuclear industry has been reluctant to incorporate digital actuator technology into nuclear plant designs due to a number of concerns. These could be summarized as cost, regulatory uncertainty, and a certain comfort factor with legacy analog technology. The replacement opportunity for these types of components represents a decision point for whether to invest in more modern technology that would provide superior operational and maintenance benefits. Yet the application of digital technology has been problematic for the nuclear industry, due to qualification and regulatory issues. With some notable exceptions, the result has been a continuing reluctance to undertake the risks and uncertainties of implementing digital actuator technology when replacement opportunities present themselves. Rather, utilities typically prefer to accept the performance limitations of legacy analog actuator technologies to avoid impacts to project costs and schedules. The purpose of this report is to demonstrate that the benefits of digital actuator technology can be significant in terms of plant performance and that it is worthwhile to address the barriers currently holding back the widespread development and use of this technology. It addresses two important objectives in pursuit of the beneficial use of digital actuator technology for nuclear power plants: 1. To demonstrate the benefits of digital actuator technology over legacy analog technology in both quantitative and qualitative ways. 2. To recognize and address the added difficulty of digital technology qualification, especially in regard to software common cause failure (SCCF), that is introduced by the use of digital actuator technology.
A molecular investigation of soil organic carbon composition across a subalpine catchment
Hsu, Hsiao-Tieh; Lawrence, Corey R.; Winnick, Matthew J.; Bargar, John R.; Maher, Katharine
2018-01-01
The dynamics of soil organic carbon (SOC) storage and turnover are a critical component of the global carbon cycle. Mechanistic models seeking to represent these complex dynamics require detailed SOC compositions, which are currently difficult to characterize quantitatively. Here, we address this challenge by using a novel approach that combines Fourier transform infrared spectroscopy (FT-IR) and bulk carbon X-ray absorption spectroscopy (XAS) to determine the abundance of SOC functional groups, using elemental analysis (EA) to constrain the total amount of SOC. We used this SOC functional group abundance (SOC-fga) method to compare variability in SOC compositions as a function of depth across a subalpine watershed (East River, Colorado, USA) and found a large degree of variability in SOC functional group abundances between sites at different elevations. Soils at a lower elevation are predominantly composed of polysaccharides, while soils at a higher elevation have more substantial portions of carbonyl, phenolic, or aromatic carbon. We discuss the potential drivers of differences in SOC composition between these sites, including vegetation inputs, internal processing and losses, and elevation-driven environmental factors. Although numerical models would facilitate the understanding and evaluation of the observed SOC distributions, quantitative and meaningful measurements of SOC molecular compositions are required to guide such models. Comparison among commonly used characterization techniques on shared reference materials is a critical next step for advancing our understanding of the complex processes controlling SOC compositions.
A Computational Framework for Quantitative Evaluation of Movement during Rehabilitation
NASA Astrophysics Data System (ADS)
Chen, Yinpeng; Duff, Margaret; Lehrer, Nicole; Sundaram, Hari; He, Jiping; Wolf, Steven L.; Rikakis, Thanassis
2011-06-01
This paper presents a novel generalized computational framework for quantitative kinematic evaluation of movement in a rehabilitation clinic setting. The framework integrates clinical knowledge and computational data-driven analysis in a systematic manner. The framework provides three key benefits to rehabilitation: (a) the resulting continuous normalized measure allows the clinician to monitor movement quality on a fine scale and easily compare impairments across participants, (b) the framework reveals the effect of individual movement components on the composite movement performance, helping the clinician decide the training foci, and (c) the evaluation runs in real time, which allows the clinician to constantly track a patient's progress and make appropriate adaptations to the therapy protocol. The creation of such an evaluation is difficult because of the sparse amount of recorded clinical observations, the high dimensionality of movement and high variation in subjects' performance. We address these issues by modeling the evaluation function as a linear combination of multiple normalized kinematic attributes, y = Σ_i w_i φ_i(x_i), and estimating each attribute normalization function φ_i(·) by integrating distributions of idealized and deviated movement. The weights w_i are derived from a therapist's pairwise comparisons using a modified RankSVM algorithm. We have applied this framework to evaluate upper limb movement in stroke survivors with excellent results: the evaluation results are highly correlated to the therapist's observations.
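The weighted-sum evaluation y = Σ_i w_i φ_i(x_i) can be sketched as follows; the attribute names, logistic normalizers and weights are hypothetical stand-ins for the paper's distribution-derived φ_i and RankSVM-derived w_i:

```python
import numpy as np

def evaluate_movement(x, weights, normalizers):
    """Composite movement score y = sum_i w_i * phi_i(x_i).
    x: raw kinematic attribute values; normalizers: callables mapping
    each raw value onto [0, 1], with 1 meaning idealized movement."""
    phi = np.array([f(v) for f, v in zip(normalizers, x)])
    return float(np.dot(weights, phi))

def logistic_normalizer(center, scale):
    # Assumed stand-in for the paper's phi_i: score falls off smoothly
    # as the raw deviation grows past a threshold.
    return lambda v: 1.0 / (1.0 + np.exp((v - center) / scale))

# Three hypothetical attributes, e.g. trajectory error, speed-profile
# deviation, trunk compensation (deviation units are arbitrary).
normalizers = [logistic_normalizer(0.5, 0.10),
               logistic_normalizer(0.3, 0.05),
               logistic_normalizer(0.2, 0.05)]
weights = np.array([0.5, 0.3, 0.2])  # would come from pairwise RankSVM

good = evaluate_movement([0.10, 0.10, 0.05], weights, normalizers)
poor = evaluate_movement([0.90, 0.60, 0.50], weights, normalizers)
print(round(good, 3), round(poor, 3))
```

Because each φ_i is bounded in [0, 1] and the weights sum to one, the composite score stays on a fixed scale, which is what lets clinicians compare impairments across participants.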
Brinton, Jemima; Simmonds, James; Minter, Francesca; Leverington-Waite, Michelle; Snape, John; Uauy, Cristobal
2017-08-01
Crop yields must increase to address food insecurity. Grain weight, determined by grain length and width, is an important yield component, but our understanding of the underlying genes and mechanisms is limited. We used genetic mapping and near isogenic lines (NILs) to identify, validate and fine-map a major quantitative trait locus (QTL) on wheat chromosome 5A associated with grain weight. Detailed phenotypic characterisation of developing and mature grains from the NILs was performed. We identified a stable and robust QTL associated with a 6.9% increase in grain weight. The positive interval leads to 4.0% longer grains, with differences first visible 12 d after fertilization. This grain length effect was fine-mapped to a 4.3 cM interval. The locus also has a pleiotropic effect on grain width (1.5%) during late grain development that determines the relative magnitude of the grain weight increase. Positive NILs have increased maternal pericarp cell length, an effect which is independent of absolute grain length. These results provide direct genetic evidence that pericarp cell length affects final grain size and weight in polyploid wheat. We propose that combining genes that control distinct biological mechanisms, such as cell expansion and proliferation, will enhance crop yields. © 2017 The Authors. New Phytologist © 2017 New Phytologist Trust.
Assessing the future of diffuse optical imaging technologies for breast cancer management
Tromberg, Bruce J.; Pogue, Brian W.; Paulsen, Keith D.; Yodh, Arjun G.; Boas, David A.; Cerussi, Albert E.
2008-01-01
Diffuse optical imaging (DOI) is a noninvasive optical technique that employs near-infrared (NIR) light to quantitatively characterize the optical properties of thick tissues. Although NIR methods were first applied to breast transillumination (also called diaphanography) nearly 80 years ago, quantitative DOI methods employing time- or frequency-domain photon migration technologies have only recently been used for breast imaging (i.e., since the mid-1990s). In this review, the state of the art in DOI for breast cancer is outlined and a multi-institutional Network for Translational Research in Optical Imaging (NTROI) is described, which has been formed by the National Cancer Institute to advance diffuse optical spectroscopy and imaging (DOSI) for the purpose of improving breast cancer detection and clinical management. DOSI employs broadband technology in both near-infrared spectral and temporal signal domains in order to separate absorption from scattering and quantify uptake of multiple molecular probes based on absorption or fluorescence contrast. Additional dimensionality in the data is provided by integrating and co-registering the functional information of DOSI with x-ray mammography and magnetic resonance imaging (MRI), which provide structural information or vascular flow information, respectively. Factors affecting DOSI performance, such as intrinsic and extrinsic contrast mechanisms, quantitation of biochemical components, image formation/visualization, and multimodality co-registration, are under investigation at the ongoing NTROI research sites. One of the goals is to develop standardized DOSI platforms that can be used as stand-alone devices or in conjunction with MRI, mammography, or ultrasound.
This broad-based, multidisciplinary effort is expected to provide new insight regarding the origins of breast disease and practical approaches for addressing several key challenges in breast cancer, including: Detecting disease in mammographically dense tissue, distinguishing between malignant and benign lesions, and understanding the impact of neoadjuvant chemotherapies. PMID:18649477
Danner, Marion; Vennedey, Vera; Hiligsmann, Mickaël; Fauser, Sascha; Gross, Christian; Stock, Stephanie
2016-10-01
In this study, we tested the feasibility of an interviewer-assisted analytic hierarchy process (AHP) in a special patient population with age-related macular degeneration (AMD). One aim was to generate preference weights regarding AMD treatment characteristics. A secondary aim was to explore the consistency of preference judgments and reasons for inconsistency. We generated quantitative importance weights for decision criteria using the matrix multiplication method. A qualitative study component in the form of asking patients to think aloud throughout their judgments was implemented to facilitate understanding of quantitative findings. Consistency ratios were calculated as a measure of logical judgment performance within AHP. If consistency ratios exceeded 0.2, we explored reasons for inconsistency. We interviewed 86 patients and generated preference weights for criteria. Patients rated the injection's effect on visual function the highest (0.44), followed by the frequency of monitoring visits (0.18), approval status (0.13), injection frequency (0.13), and side effects (0.12). Inconsistency in judgments was prevalent at the subcriteria level. Whereas much of the observed inconsistency was due to an excessive use of high/extreme value judgments, these judgments seemed to result from patients reasonably trying to highlight their strong preferences. Our study combines quantitative with qualitative data to explore patients' preference weights and decision processes using the AHP. It suggests that the type of inconsistency observed in judgments of AMD patients mostly results from rational decision making, not from error or lack of understanding. Further research should address which type and extent of inconsistency might be acceptable in different AHP settings.
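The priority weights and consistency ratios described can be sketched as follows, here using the principal-eigenvector method (a common alternative to the matrix multiplication method the study used) on a hypothetical 3x3 judgment matrix, not the study's AMD data:

```python
import numpy as np

# Hypothetical pairwise comparison matrix on the Saaty 1-9 scale:
# A[i, j] = how strongly criterion i is preferred over criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights from the principal eigenvector, normalized to sum 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Consistency ratio: CR = CI / RI with CI = (lambda_max - n) / (n - 1)
# and RI the Saaty random index for an n x n matrix.
n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]
cr = ci / ri

print(np.round(w, 3), round(cr, 3))
```

A CR near zero indicates near-perfectly transitive judgments; the study flagged judgments with CR above 0.2 for qualitative follow-up, a looser cutoff than Saaty's conventional 0.1.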
Redmond, Haley; Thompson, Jonathan E
2011-04-21
In this work we describe and evaluate a simple scheme by which the refractive index (λ = 589 nm) of non-absorbing components common to secondary organic aerosols (SOA) may be predicted from molecular formula and density (g cm(-3)). The QSPR approach described is based on three parameters linked to refractive index: molecular polarizability, the ratio of mass density to molecular weight, and degree of unsaturation. After computing these quantities for a training set of 111 compounds common to atmospheric aerosols, multi-linear regression analysis was conducted to establish a quantitative relationship between the parameters and the accepted value of refractive index. The resulting quantitative relationship can often estimate refractive index to ±0.01 when averaged across a variety of compound classes. A notable exception is alcohols, for which the model consistently underestimates refractive index. Homogeneous internal mixtures can conceivably be addressed through use of either the volume or mole fraction mixing rules commonly used in the aerosol community. Predicted refractive indices reconstructed from chemical composition data presented in the literature generally agree with previous reports of SOA refractive index. Additionally, the predicted refractive indices lie near measured values we report for λ = 532 nm for SOA generated from vapors of α-pinene (R.I. 1.49-1.51) and toluene (R.I. 1.49-1.50). We envision the QSPR method may find use in reconstructing optical scattering of organic aerosols if mass composition data are known. Alternatively, the method described could be incorporated into models of organic aerosol formation/phase partitioning to better constrain organic aerosol optical properties.
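A multi-linear QSPR fit of the kind described can be sketched as follows; the descriptor values and refractive indices are invented for illustration and stand in for the paper's 111-compound training set:

```python
import numpy as np

# Hypothetical training set: rows = compounds, columns = the three QSPR
# descriptors (molecular polarizability, density / molecular-weight
# ratio, degree of unsaturation). Values are illustrative only.
X = np.array([[10.2, 0.0090, 0],
              [12.5, 0.0085, 1],
              [ 8.9, 0.0105, 0],
              [14.1, 0.0080, 2],
              [11.3, 0.0095, 1]])
y = np.array([1.418, 1.442, 1.409, 1.471, 1.436])  # accepted n at 589 nm

# Multi-linear regression: n = b0 + b1*x1 + b2*x2 + b3*x3,
# solved by least squares on a design matrix with an intercept column.
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = A @ coef
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(np.round(coef, 4), round(rmse, 4))
```

The fitted coefficients then predict the refractive index of a new compound from its three descriptors alone, which is what allows reconstruction of SOA optical properties from mass composition data.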
Bauermeister, Anja; Mahnert, Alexander; Auerbach, Anna; Böker, Alexander; Flier, Niwin; Weber, Christina; Probst, Alexander J; Moissl-Eichinger, Christine; Haberer, Klaus
2014-01-01
Bioburden encapsulated in spacecraft polymers (such as adhesives and coatings) poses a potential risk to jeopardize scientific exploration of other celestial bodies. This is particularly critical for spacecraft components intended for hard landing. So far, it remained unclear whether polymers are indeed a source of microbial contamination. In addition, data with respect to survival of microbes during the embedding/polymerization process are sparse. In this study we developed testing strategies to quantitatively examine encapsulated bioburden in five different polymers used frequently and in large quantities on spaceflight hardware. As quantitative extraction of the bioburden from polymerized (solid) materials did not prove feasible, contaminants were extracted from uncured precursors. Cultivation-based analyses revealed <0.1-2.5 colony forming units (cfu) per cm3 polymer, whereas quantitative PCR-based detection of contaminants indicated considerably higher values, despite low DNA extraction efficiency. Results obtained from this approach reflect the most conservative proxy for encapsulated bioburden, as they give the maximum bioburden of the polymers irrespective of any additional physical and chemical stress occurring during polymerization. To address the latter issue, we deployed an embedding model to elucidate and monitor the physiological status of embedded Bacillus safensis spores in a cured polymer. Staining approaches using AlexaFluor succinimidyl ester 488 (AF488), propidium monoazide (PMA) and CTC (5-cyano-2,3-ditolyl tetrazolium chloride) demonstrated that embedded spores retained integrity, germination and cultivation ability even after polymerization of the adhesive Scotch-Weld 2216 B/A. Using the methods presented here, we were able to estimate the worst-case contribution of encapsulated bioburden in different polymers to the bioburden of spacecraft. We demonstrated that spores were not affected by polymerization processes.
Besides Planetary Protection considerations, our results could prove useful for the manufacturing of food packaging, the pharmaceutical industry and implant technology.
Beryllium and titanium cost-adjustment report
NASA Astrophysics Data System (ADS)
Owen, John; Ulph, Eric, Sr.
1991-09-01
This report summarizes cost adjustment factors for beryllium (Be, S200) and titanium (Ti, 6Al-4V) that were derived relative to aluminum (Al, 7075-T6). Aluminum is traditionally the material upon which many of the Cost Analysis Office, Missile Division cost estimating relationships (CERs) are based. The adjustment factors address both research and development and production (Q > 100) quantities. In addition, the factors derived include optical elements, normal structure, and structure with special requirements for minimal microcreep, such as sensor assembly parts and supporting components. Since booster cost per payload pound is an even larger factor in total missile launch costs than was initially presumed, the primary cost driver for all materials compared was the missiles' booster cost per payload pound for both R&D and production quantities. Al and Ti are 1.5 and 2.4 times more dense, respectively, than Be, and the cost to lift the heavier materials results in greater booster expense. In addition, Al and Ti must be 2.1 and 2.8, respectively, times the weight of a Be component to provide equivalent stiffness, based on the example component addressed in the report. These factors also increase booster costs. After review of the relative factors cited above, especially the lower costs for Be when stiffness and booster costs are taken into consideration, affordability becomes an important issue. When this study was initiated, both government and contractor engineers said that Be was the material to be used as a last resort because of its prohibitive cost and extreme toxicity. Although the initial price of Be may lead one to believe that any Be product would be extremely expensive, the total cost of Be used for space applications is actually competitive with or less costly than either Al or Ti. 
Also, the Be toxicity problem has turned out to be a non-issue for purchasers of finished Be components since no machining or grinding operations are required on the finished components. Several new costing techniques are developed which provide quantitative measures of the cost of material stiffness, costs related to payload weight, and costs associated with the relative temperature stability of different materials. In addition, use is made of the Design/Cost Trade Model developed by Applied Research, Inc., to determine the booster cost differential relative to changes in payload weight, and a mirror fabrication cost model, developed by OCA Applied Optics, was used for mirror costing. This report is a summary of an extensive study done by the U.S. Army Strategic Defense Command, Huntsville, Alabama.
Robust LOD scores for variance component-based linkage analysis.
Blangero, J; Williams, J T; Almasy, L
2000-01-01
The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the lod score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust lod score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.
Zhao, Wen-Wen; Wu, Zhi-Min; Wu, Xia; Zhao, Hai-Yu; Chen, Xiao-Qing
2016-10-01
This study determines five naphthoquinones (acetylshikonin, β-acetoxyisovalerylalkannin, isobutylshikonin, β,β'-dimethylacrylalkannin and α-methyl-n-butylshikonin) by quantitative analysis of multi-components with a single marker (QAMS). β,β'-Dimethylacrylalkannin was selected as the internal reference substance, and the relative correction factors (RCFs) of acetylshikonin, β-acetoxyisovalerylalkannin, isobutylshikonin and α-methyl-n-butylshikonin were calculated. The ruggedness of the relative correction factors was then tested on different instruments and columns. Meanwhile, 16 batches of Arnebia euchroma were analyzed by the external standard method (ESM) and QAMS, respectively. The peaks were identified by LC-MS. The ruggedness of the relative correction factors was good, and the analytical results calculated by ESM and QAMS showed no difference. The quantitative method established is feasible and suitable for the quality evaluation of A. euchroma. Copyright© by the Chinese Pharmaceutical Association.
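A minimal sketch of QAMS quantitation, assuming a linear zero-intercept detector response and hypothetical peak areas and concentrations (not the paper's data):

```python
import numpy as np

# Calibration of the single marker (internal reference substance s):
# response factor k_s = area / concentration, zero-intercept fit.
ref_conc = np.array([5.0, 10.0, 20.0, 40.0])       # ug/mL
ref_area = np.array([1.0e5, 2.0e5, 4.1e5, 8.0e5])
k_s = float(np.sum(ref_area * ref_conc) / np.sum(ref_conc**2))

# Relative correction factor for another component k, measured once
# from mixed standards: f = (A_s / C_s) / (A_k / C_k).
A_s_std, C_s_std = 2.0e5, 10.0
A_k_std, C_k_std = 1.5e5, 10.0
f = (A_s_std / C_s_std) / (A_k_std / C_k_std)

# Sample analysis: the marker is quantified from its own calibration;
# the other component follows from the RCF without its own standard,
# since k_k = k_s / f implies C_k = f * A_k / k_s.
A_s_sample, A_k_sample = 3.0e5, 2.4e5
C_s_sample = A_s_sample / k_s
C_k_sample = f * A_k_sample / k_s

print(round(C_s_sample, 1), round(C_k_sample, 1))
```

The practical appeal of QAMS is that only one reference standard needs to be purchased and run routinely; the ruggedness testing on different instruments and columns checks that f stays stable.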
Han, Lide; Yang, Jian; Zhu, Jun
2007-06-01
A genetic model was proposed for simultaneously analyzing the genetic effects of nucleus, cytoplasm, and nuclear-cytoplasmic interaction (NCI), as well as their genotype-by-environment (GE) interaction, for quantitative traits of diploid plants. In the model, the NCI effects were further partitioned into additive and dominance nuclear-cytoplasmic interaction components. Mixed linear model approaches were used for statistical analysis. On the basis of diallel cross designs, Monte Carlo simulations showed that the genetic model was robust for estimating variance components under several situations without specific effects. Random genetic effects were predicted by an adjusted unbiased prediction (AUP) method. Data on four quantitative traits (boll number, lint percentage, fiber length, and micronaire) in Upland cotton (Gossypium hirsutum L.) were analyzed as a worked example to show the effectiveness of the model.
Cryogenic applications of commercial electronic components
NASA Astrophysics Data System (ADS)
Buchanan, Ernest D.; Benford, Dominic J.; Forgione, Joshua B.; Harvey Moseley, S.; Wollack, Edward J.
2012-10-01
We have developed a range of techniques useful for constructing analog and digital circuits for operation in a liquid helium environment (4.2 K), using commercially available low-power components. The challenges encountered in designing cryogenic electronics include finding components that can function usefully in the cold and possess low enough power dissipation so as not to heat the systems they are designed to measure. From design, test, and integration perspectives it is useful for components to operate similarly at room and cryogenic temperatures; however, this is not a necessity. Some of the circuits presented here have been used successfully in the MUSTANG [1] and GISMO [2] cameras to build a complete digital-to-analog multiplexer (which will be referred to as the Cryogenic Address Driver board). Many of the circuit elements described are of a more general nature rather than specific to the Cryogenic Address Driver board, and were studied as part of a more comprehensive approach to addressing a larger set of cryogenic electronic needs.
Cryogenic Applications of Commercial Electronic Components
NASA Technical Reports Server (NTRS)
Buchanan, Ernest D.; Benford, Dominic J.; Forgione, Joshua B.; Moseley, S. Harvey; Wollack, Edward J.
2012-01-01
We have developed a range of techniques useful for constructing analog and digital circuits for operation in a liquid helium environment (4.2 K), using commercially available low-power components. The challenges encountered in designing cryogenic electronics include finding components that can function usefully in the cold and possess low enough power dissipation so as not to heat the systems they are designed to measure. From design, test, and integration perspectives it is useful for components to operate similarly at room and cryogenic temperatures; however, this is not a necessity. Some of the circuits presented here have been used successfully in the MUSTANG and GISMO cameras to build a complete digital-to-analog multiplexer (which will be referred to as the Cryogenic Address Driver board). Many of the circuit elements described are of a more general nature rather than specific to the Cryogenic Address Driver board, and were studied as part of a more comprehensive approach to addressing a larger set of cryogenic electronic needs.
Quantitative organic vapor-particle sampler
Gundel, Lara; Daisey, Joan M.; Stevens, Robert K.
1998-01-01
A quantitative organic vapor-particle sampler for sampling semi-volatile organic gases and particulate components. The sampler employs a macroreticular resin, agglomerates of randomly packed microspheres with a continuous porous structure and particle sizes between 0.05 and 10 µm, as a reversible sorbent for semi-volatile organic gases in an integrated diffusion vapor-particle sampler.
Matsuoka, Sadatoshi; Obara, Hiromi; Nagai, Mari; Murakami, Hitoshi; Chan Lon, Rasmey
2014-07-01
Though Cambodia made impressive gains in immunization coverage between 2000 and 2005, it recognized several health system challenges to greater coverage of immunization and to sustainability. The Global Alliance for Vaccines and Immunization (GAVI) opened a Health System Strengthening (HSS) funding window in 2006. To address the health system challenges, Cambodia has been receiving GAVI HSS funds since October 2007. The major component of the support is performance-based financing (PBF) for maternal, neonatal and child health (MNCH) services. This study examines the impact of the PBF scheme on MNCH services and administrative management in rural Cambodia. Quantitative and qualitative studies were conducted in Kroch Chhmar Operational District (OD), Cambodia. Quantitative analyses were conducted on trends in the numbers of MNCH services provided, and a brief analysis was conducted using qualitative data. After the commencement of the PBF support, the volume of MNCH services was significantly boosted. In addition, strengthened financial and operational management was observed in the study area. However, the quality of the MNCH services was not ensured. Technical assistance, rather than the PBF scheme, was perceived by stakeholders to play a vital role in increasing the quality of the services. To improve the quality of the health services provided, it is better to include indicators on the quality of care in the PBF scheme. Mutual co-operation between PBF models and technical assistance may ensure better service quality while boosting quantity. A robust but feasible data validation mechanism should be in place, as a PBF scheme could incentivize inaccurate reporting. The capacity for financial management should be strengthened in PBF recipient ODs. To address the broader aspects of MNCH, a balanced input of resources and strengthening of all six building blocks of a health system are necessary.
Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2013; all rights reserved.
McMillen, J Curtis; Narendorf, Sarah Carter; Robinson, Debra; Havlicek, Judy; Fedoravicius, Nicole; Bertram, Julie; McNelly, David
2015-01-01
Older youth in out-of-home care often live in restrictive settings and face psychiatric issues without sufficient family support. This paper reports on the development and piloting of a manualized treatment foster care program designed to step down older youth with high psychiatric needs from residential programs to treatment foster care homes. A team of researchers and agency partners set out to develop a treatment foster care model for older youth based on Multidimensional Treatment Foster Care (MTFC). After matching youth by mental health condition and determining for whom randomization would be allowed, 14 youth were randomized to treatment as usual or a treatment foster home intervention. Stakeholders were interviewed qualitatively at multiple time points. Quantitative measures assessed mental health symptoms, days in locked facilities, and employment and educational outcomes. Development efforts led to substantial variations from the MTFC model, and a new model, Treatment Foster Care for Older Youth, was piloted. Feasibility monitoring suggested that it was difficult, but possible, to recruit and randomize youth from residential homes and that foster parents could be recruited to serve them. Qualitative data pointed to some qualified clinical successes. Stakeholders viewed two team roles, the psychiatric nurse and the skills coaches, very highly. However, results also suggested that foster parents and some staff did not tolerate the intervention well and struggled to address the emotion dysregulation issues of the young people they served. Quantitative data demonstrated that the intervention was not keeping youth out of locked facilities. The intervention needed further refinement prior to a broader trial. Intervention development work continued until components were developed to help address emotion regulation problems among fostered youth.
Psychiatric nurses and skills coaches who work with youth in community settings hold promise as important supports for older youth with psychiatric needs.
2014-01-01
Background While statistics is increasingly taught as part of the medical curriculum, it can be an unpopular subject and feedback from students indicates that some find it more difficult than other subjects. Understanding attitudes towards statistics on entry to graduate entry medical programmes is particularly important, given that many students may have been exposed to quantitative courses in their previous degree and hence bring preconceptions of their ability and interest to their medical education programme. The aim of this study therefore is to explore, for the first time, attitudes towards statistics of graduate entry medical students from a variety of backgrounds and focus on understanding the role of prior learning experiences. Methods 121 first year graduate entry medical students completed the Survey of Attitudes toward Statistics instrument together with information on demographics and prior learning experiences. Results Students tended to appreciate the relevance of statistics in their professional life and be prepared to put effort into learning statistics. They had neutral to positive attitudes about their interest in statistics and their intellectual knowledge and skills when applied to it. Their feelings towards statistics were slightly less positive e.g. feelings of insecurity, stress, fear and frustration and they tended to view statistics as difficult. Even though 85% of students had taken a quantitative course in the past, only 24% of students described it as likely that they would take any course in statistics if the choice was theirs. How well students felt they had performed in mathematics in the past was a strong predictor of many of the components of attitudes. Conclusion The teaching of statistics to medical students should start with addressing the association between students’ past experiences in mathematics and their attitudes towards statistics and encouraging students to recognise the difference between the two disciplines. 
Addressing these issues may reduce students’ anxiety and perception of difficulty at the start of their learning experience and encourage students to engage with statistics in their future careers. PMID:24708762
Hannigan, Ailish; Hegarty, Avril C; McGrath, Deirdre
2014-04-04
B.M. Collins; S.L. Stephens
2010-01-01
The complexity inherent in variable, or mixed-severity fire regimes makes quantitative characterization of important fire regime attributes (e.g., proportion of landscape burned at different severities, size and distribution of stand-replacing patches) difficult. As a result, there is ambiguity associated with the term ‘mixed-severity’. We address...
Selective Cutting Impact on Carbon Storage in Fremont-Winema National Forest, Oregon
NASA Astrophysics Data System (ADS)
Huybrechts, C.; Cleve, C. T.
2004-12-01
Management personnel of the Fremont-Winema National Forest in southern Oregon were interested in investigating how selective cutting or fuel load reduction treatments affect forest carbon sinks and, as an ancillary product, fire risk. This study was constructed with the objective of providing this information to the forest administrators, as well as to satisfy a directive to study carbon management, a component of the 2004 NASA Applications Division Program Plan. During the summer of 2004, a request for decision support tools by the forest management was addressed by a NASA-sponsored, student-led, student-run internship group called DEVELOP. This full-time, 10-week program was designed as an introduction to the work done by earth scientists, professional business/client relationships, and the facilities available at NASA Ames. Four college and graduate students from varying educational backgrounds designed the study and implementation plan. The team collected data for five consecutive days in Oregon throughout the Fremont-Winema forest and the surrounding terrain, consisting of soil sampling for underground carbon dynamics, fire model validation, and vegetation map validation. The goal of the carbon management component of the project was to model current carbon levels and then to gauge the effect of fuel load reduction treatments. To study carbon dynamics, MODIS-derived fraction of photosynthetically active radiation (FPAR) maps, regional climate data, and Landsat 5-generated dominant vegetation species and land cover maps were used in conjunction with the NASA Carnegie-Ames-Stanford-Approach (CASA) model. To address fire risk, the dominant vegetation species map was used to estimate fuel load based on species biomass, in conjunction with a mosaic of digital elevation models (DEMs), to create an Anderson-inspired fuel map, a rate-of-spread (meters/minute) map, and a flame length map using ArcMap 9 and FlamMap. 
Fire risk results should be viewed qualitatively, since the maps depict the spatial distribution of the data rather than a quantitative assessment of risk. For the first time, the resource managers at the Fremont-Winema forest will take the value of carbon as a resource into consideration in their decision-making process for the 2005 Fremont-Winema forest management plan.
Sundh, Ingvar
1992-01-01
The molecular size distribution and biochemical composition of the dissolved organic carbon released from natural communities of lake phytoplankton (photosynthetically produced dissolved organic carbon [PDOC]) and subsequently used by heterotrophic bacteria were determined in three lakes differing in trophic status and concentration of humic substances. After incubation of epilimnetic lake water samples with H14CO3- over one diel cycle, the phytoplankton were removed by size-selective filtration. The filtrates, still containing most of the heterotrophic bacteria, were reincubated in darkness (heterotrophic incubation). Differences in the amount and composition of PDO14C between samples collected before the heterotrophic incubation and samples collected afterwards were considered to be a result of bacterial utilization. The PDO14C collected at the start of the heterotrophic incubations always contained both high (>10,000)- and low (<1,000)-molecular-weight (MW) components and sometimes contained intermediate-MW components as well. In general, bacterial turnover rates of the low-MW components were fairly rapid, whereas the high-MW components were utilized slowly or not at all. In the humic lake, the intermediate-MW components accounted for a large proportion of the net PDO14C and were subject to rapid bacterial utilization. This fraction probably consisted almost entirely of polysaccharides of ca. 6,000 MW. Amino acids and peptides, other organic acids, and carbohydrates could all be quantitatively important parts of the low-MW PDO14C that was utilized by the heterotrophic bacteria, but the relative contributions of these fractions differed widely. 
It was concluded that, generally, low-MW components of PDOC are quantitatively much more important to the bacteria than are high-MW components, that PDOC released from phytoplankton does not contain substances of quantitative importance as bacterial substrates in all situations, and that high-MW components of PDOC probably contribute to the buildup of refractory, high-MW dissolved organic carbon in pelagic environments. PMID:16348767
Ulgen, Ayse; Han, Zhihua; Li, Wentian
2003-12-31
We address the question of whether statistical correlations among quantitative traits lead to correlation of linkage results of these traits. Five measured quantitative traits (total cholesterol, fasting glucose, HDL cholesterol, blood pressure, and triglycerides), and one derived quantitative trait (total cholesterol divided by the HDL cholesterol) are used for phenotype correlation studies. Four of them are used for linkage analysis. We show that although correlation among phenotypes partially reflects the correlation among linkage analysis results, the LOD-score correlations are on average low. The most significant peaks found by using different traits do not often overlap. Studying covariances at specific locations in LOD scores may provide clues for further bivariate linkage analyses.
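A minimal sketch of the kind of comparison described above, comparing two genome-wide LOD-score profiles: Pearson correlation of the scores, and whether the most significant peaks coincide. The profiles below are hypothetical, generated only for illustration; they are not the study's data.

```python
import numpy as np

def lod_correlation(lod_a, lod_b):
    """Pearson correlation between two genome-wide LOD-score profiles."""
    return float(np.corrcoef(lod_a, lod_b)[0, 1])

def peaks_overlap(lod_a, lod_b, window=5):
    """True if the most significant peaks of the two scans fall within `window` map positions."""
    return abs(int(np.argmax(lod_a)) - int(np.argmax(lod_b))) <= window

# Hypothetical LOD profiles over 100 map positions for two correlated traits
rng = np.random.default_rng(0)
base = rng.gamma(shape=1.0, scale=0.5, size=100)   # shared signal
lod_hdl = base + rng.gamma(1.0, 0.3, 100)          # trait 1 scan
lod_tg = 0.4 * base + rng.gamma(1.0, 0.5, 100)     # trait 2 scan

r = lod_correlation(lod_hdl, lod_tg)
same_peak = peaks_overlap(lod_hdl, lod_tg)
print(round(r, 2), same_peak)
```

As the abstract notes, correlated phenotypes need not yield overlapping peaks, which is exactly what `peaks_overlap` checks independently of the overall correlation `r`.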
Munoz-Plaza, Corrine E; Parry, Carla; Hahn, Erin E; Tang, Tania; Nguyen, Huong Q; Gould, Michael K; Kanter, Michael H; Sharp, Adam L
2016-08-15
Despite reports advocating for integration of research into healthcare delivery, scant literature exists describing how this can be accomplished. Examples highlighting application of qualitative research methods embedded into a healthcare system are particularly needed. This article describes the process and value of embedding qualitative research as the second phase of an explanatory, sequential, mixed methods study to improve antibiotic stewardship for acute sinusitis. Purposive sampling of providers for in-depth interviews improved understanding of unwarranted antibiotic prescribing and elicited stakeholder recommendations for improvement. Qualitative data collection, transcription and constant comparative analyses occurred iteratively. Emerging themes and sub-themes identified primary drivers of unwarranted antibiotic prescribing patterns and recommendations for improving practice. These findings informed the design of a health system intervention to improve antibiotic stewardship for acute sinusitis. Core components of the intervention are also described. Qualitative research can be effectively applied in learning healthcare systems to elucidate quantitative results and inform improvement efforts.
Predicting soil quality indices with near infrared analysis in a wildfire chronosequence.
Cécillon, Lauric; Cassagne, Nathalie; Czarnes, Sonia; Gros, Raphaël; Vennetier, Michel; Brun, Jean-Jacques
2009-01-15
We investigated the power of near infrared (NIR) analysis for the quantitative assessment of soil quality in a wildfire chronosequence. The effects of wildfire disturbance and of the soil engineering activity of earthworms on soil organic matter quality were first assessed with principal component analysis of NIR spectra. Three soil quality indices were then calculated using an adaptation of the method proposed by Velasquez et al. [Velasquez, E., Lavelle, P., Andrade, M. GISQ, a multifunctional indicator of soil quality. Soil Biol Biochem 2007; 39: 3066-3080], each addressing an ecosystem service provided by soils: organic matter storage, nutrient supply, and biological activity. Partial least squares regression models were developed to test the predictive ability of NIR analysis for these soil quality indices. All models reached coefficients of determination above 0.90 and ratios of performance to deviation above 2.8. This finding provides new opportunities for the monitoring of soil quality using NIR scanning of soil samples.
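The two model-quality statistics reported above can be computed as follows. This is a minimal sketch with illustrative reference and predicted values, not the study's data: R² is one minus the residual sum of squares over the total sum of squares, and RPD (ratio of performance to deviation) is the standard deviation of the reference values divided by the RMSE of prediction.

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - ss_res / ss_tot

def rpd(y_true, y_pred):
    """Ratio of performance to deviation: SD of reference values over RMSE of prediction."""
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return np.std(y_true, ddof=1) / rmse

# Hypothetical reference vs. NIR-predicted soil quality index values
y_ref = np.array([0.12, 0.35, 0.41, 0.58, 0.66, 0.73, 0.81, 0.92])
y_hat = np.array([0.15, 0.33, 0.44, 0.55, 0.68, 0.70, 0.84, 0.90])
print(round(r_squared(y_ref, y_hat), 3), round(rpd(y_ref, y_hat), 2))
```

An RPD above 2.5-3 is commonly taken to indicate a calibration suitable for quantitative prediction, which is why the paper highlights values above 2.8.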
NASA Astrophysics Data System (ADS)
Wang, Yana; Zhou, Zhili; Chen, Mingji; Huang, Yixing; Wang, Changxian; Song, Wei-Li
2018-05-01
Advances in electromagnetic (EM) technology have created a critical demand for EM response materials combining light weight, small thickness, strong reflection loss, and broadband absorption; the last two requirements remain the hardest to achieve because of intrinsic shortcomings of conventional EM absorbers. To address this problem, a unique three-dimensional nitrogen-doped carbon monolith was demonstrated in order to understand the effects of nitrogen doping on dielectric and microwave absorption performance. The chemical components of the nitrogen-doped carbon monoliths were quantitatively determined to fully understand the effects of nanoscale structures on the macroscopic composites. A modified Cole-Cole plot is used to guide the chemical doping and material processing, with the aim of realizing the best matching conditions. The results suggest a universal route to advanced materials with strong, broadband EM absorption.
Burnett, W.C.; Santos, I.R.; Weinstein, Y.; Swarzenski, P.W.; Herut, B.
2007-01-01
Research performed in many locations over the past decade has shown that radon is an effective tracer for quantifying submarine groundwater discharge (SGD). The technique works because both fresh and saline groundwaters acquire radon from the subterranean environment and display activities that are typically orders of magnitude greater than those found in coastal seawaters. However, some uncertainties and unanswered problems remain. We focus here on three components of the mass balance, each of which has some unresolved issues: (1) End-member radon - what to do if groundwater Rn measurements are highly variable? (2) Atmospheric evasion -do the standard gas exchange equations work under high-energy coastal mixing scenarios? And (3) "mixing" losses - are there other significant radon losses (e.g. recharge of coastal waters into the aquifer) besides those attributed to mixing with lower-activity waters offshore? We address these issues using data sets collected from several different types of coastal environment. Copyright ?? 2007 IAHS Press.
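The mass-balance logic described above can be sketched in its generic steady-state form: whatever radon flux is not explained by other sources (the losses to mixing, atmospheric evasion, and decay, minus non-SGD inputs) is attributed to SGD and converted to a water flux via the groundwater end-member concentration. The numbers below are hypothetical, chosen only to show the arithmetic.

```python
def sgd_flux(rn_losses_total, rn_inputs_other, c_groundwater):
    """Steady-state radon balance: the excess Rn flux not explained by other
    sources is attributed to SGD, converted to a water flux by dividing by
    the groundwater end-member Rn concentration."""
    return (rn_losses_total - rn_inputs_other) / c_groundwater

# Hypothetical values (fluxes in dpm m^-2 d^-1; end-member in dpm m^-3)
losses = 12000.0   # mixing loss + atmospheric evasion + decay
inputs = 2000.0    # sediment diffusion, riverine input, production from 226Ra
q = sgd_flux(losses, inputs, c_groundwater=50000.0)
print(q)  # SGD water flux in m^3 m^-2 d^-1
```

Note that each term carries exactly the uncertainties the abstract raises: a variable groundwater end-member changes the denominator, and misestimated evasion or mixing losses change the numerator.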
Allen, Victoria W; Shirasu-Hiza, Mimi
2018-01-01
Despite being pervasive, the control of programmed grooming is poorly understood. We addressed this gap by developing a high-throughput platform that allows long-term detection of grooming in Drosophila melanogaster. In our method, a k-nearest neighbors algorithm automatically classifies fly behavior and finds grooming events with over 90% accuracy in diverse genotypes. Our data show that flies spend ~13% of their waking time grooming, driven largely by two major internal programs. One of these programs regulates the timing of grooming and involves the core circadian clock components cycle, clock, and period. The second program regulates the duration of grooming and, while dependent on cycle and clock, appears to be independent of period. This emerging dual control model in which one program controls timing and another controls duration, resembles the two-process regulatory model of sleep. Together, our quantitative approach presents the opportunity for further dissection of mechanisms controlling long-term grooming in Drosophila. PMID:29485401
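The classification step named above (a k-nearest neighbors vote over per-frame behavioral features) can be sketched as follows. The features and labels here are toy values invented for illustration; the actual feature set used by the platform is not specified in the abstract.

```python
import numpy as np
from collections import Counter

def knn_predict(train_x, train_y, x, k=3):
    """Classify a frame's feature vector by majority vote of its k nearest training frames."""
    dists = np.linalg.norm(train_x - x, axis=1)   # Euclidean distance to every training frame
    nearest = np.argsort(dists)[:k]               # indices of the k closest frames
    return Counter(train_y[i] for i in nearest).most_common(1)[0][0]

# Toy per-frame features (e.g., leg-movement energy vs. body-centroid speed)
train_x = np.array([[0.90, 0.10], [0.80, 0.20], [0.85, 0.15],   # grooming-like frames
                    [0.10, 0.90], [0.20, 0.80], [0.15, 0.85]])  # walking-like frames
train_y = np.array(["groom", "groom", "groom", "walk", "walk", "walk"])

pred = knn_predict(train_x, train_y, np.array([0.82, 0.12]))
print(pred)
```

With well-separated feature clusters like these, the vote is unambiguous; the reported >90% accuracy across genotypes implies the real features separate grooming from other behaviors nearly this cleanly.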
Methods for heat transfer and temperature field analysis of the insulated diesel
NASA Technical Reports Server (NTRS)
Morel, T.; Blumberg, P. N.; Fort, E. F.; Keribar, R.
1984-01-01
Work done during Phase 1 of a three-year program aimed at developing a comprehensive heat transfer and thermal analysis methodology oriented specifically to the design requirements of insulated diesel engines is reported. The technology developed in this program makes possible a quantitative analysis of the low heat rejection concept. The program is comprehensive in that it addresses all the heat transfer issues that are critical to the successful development of the low heat rejection diesel engine: (1) in-cylinder convective and radiative heat transfer; (2) cyclic transient heat transfer in thin solid layers at component surfaces adjacent to the combustion chamber; and (3) steady-state heat conduction in the overall engine structure. The Integral Technologies, Inc. (ITI) program comprises a set of integrated analytical and experimental tasks. A detailed review of the ITI program approach is provided, including the technical issues which underlie it and a summary of the methods that were developed.
Using ERPs for assessing the (sub) conscious perception of noise.
Porbadnigk, Anne K; Antons, Jan-N; Blankertz, Benjamin; Treder, Matthias S; Schleicher, Robert; Moller, Sebastian; Curio, Gabriel
2010-01-01
In this paper, we investigate the use of event-related potentials (ERPs) as a quantitative measure for quality assessment of disturbed audio signals. For this purpose, we ran an EEG study (N=11) using an oddball paradigm, during which subjects were presented with the phoneme /a/ superimposed with varying degrees of signal-correlated noise. Based on this data set, we address the question of to what degree the degradation of the auditory stimuli is reflected at a neural level, even if the disturbance is below the threshold of conscious perception. For those stimuli that are consciously recognized as disturbed, we suggest using the amplitude and latency of the P300 component to assess the level of disturbance. For disturbed stimuli for which the noise is not perceived consciously, we show for two subjects that a classifier based on shrinkage LDA can successfully single out stimuli for which the noise was presumably processed subconsciously.
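The core of the shrinkage LDA classifier mentioned above is a regularized covariance estimate, blending the empirical covariance with a scaled identity so the matrix stays invertible with few trials. The sketch below uses synthetic two-channel "ERP features" and a fixed shrinkage parameter; the actual shrinkage intensity in such work is usually estimated analytically (Ledoit-Wolf style), which is omitted here.

```python
import numpy as np

def shrinkage_cov(X, lam=0.1):
    """Regularized covariance: (1 - lam) * empirical + lam * nu * I, nu = mean eigenvalue."""
    S = np.cov(X, rowvar=False)
    nu = np.trace(S) / S.shape[0]
    return (1 - lam) * S + lam * nu * np.eye(S.shape[0])

def lda_weights(X0, X1, lam=0.1):
    """Binary LDA direction w = Sigma^-1 (mu1 - mu0), with a shrinkage covariance."""
    pooled = np.vstack([X0 - X0.mean(axis=0), X1 - X1.mean(axis=0)])
    S = shrinkage_cov(pooled, lam)
    return np.linalg.solve(S, X1.mean(axis=0) - X0.mean(axis=0))

# Hypothetical single-trial features (e.g., P300 amplitude at two channels)
rng = np.random.default_rng(1)
clean = rng.normal([1.0, 0.5], 0.3, size=(40, 2))   # undisturbed trials
noisy = rng.normal([1.6, 1.1], 0.3, size=(40, 2))   # disturbed trials
w = lda_weights(clean, noisy)
scores_clean = clean @ w
scores_noisy = noisy @ w
print(scores_noisy.mean() > scores_clean.mean())
```

Projecting trials onto `w` and thresholding the resulting score is what lets the classifier "single out" trials whose noise was processed without conscious perception.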
Next Generation Life Support: High Performance EVA Glove
NASA Technical Reports Server (NTRS)
Walsh, Sarah K.
2015-01-01
The objectives of the High Performance EVA Glove task are to develop advanced EVA gloves for future human space exploration missions and generate corresponding standards by which progress may be quantitatively assessed. New technologies and manufacturing techniques will be incorporated into the new gloves to address finger and hand mobility, injury reduction and durability in nonpristine environments. Three prototypes will be developed, each focusing on different technological advances. A robotic assist glove will integrate a powered grasping system into the current EVA glove design to reduce astronaut hand fatigue and hand injuries. A mechanical counter pressure (MCP) glove will be developed to further explore the potential of MCP technology and assess its capability for countering the effects of vacuum or low pressure environments on the body by using compression fabrics or materials to apply the necessary pressure. A gas pressurized glove, incorporating new technologies, will be the most flight-like of the three prototypes. Advancements include the development and integration of aerogel insulation, damage sensing components, dust-repellant coatings, and dust tolerant bearings.
ERIC Educational Resources Information Center
Silver, Edward A.
This paper is a reaction to a plenary address, "Fairness in Dealing: Diversity, Psychology, and Mathematics Education" by Suzanne Damarin (SE 057 179). The issues of intentionality, institutional and instructional practices, identity development, and assessment are addressed in regard to the Quantitative Understanding: Amplifying Student…
ERIC Educational Resources Information Center
Voight, Rebecca W.
2012-01-01
The purpose of this quantitative descriptive study was to explore the perceptions of DNP administrators regarding the extent acupuncture and acupressure are addressed in DNP curricula. Five research questions were addressed: 1. What are the perspectives of DNP program administrators regarding acupuncture and acupressure? 2. What are the…
Addressing Research and Development Gaps for Plasma-Material Interactions with Linear Plasma Devices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rapp, Juergen
Plasma-material interactions in future fusion reactors have been identified as a knowledge gap to be dealt with before any next step device past ITER can be built. The challenges are manifold. They are related to power dissipation, so that the heat fluxes to the plasma-facing components can be kept at technologically feasible levels; maximization of the lifetime of divertor plasma-facing components, to allow steady-state operation in a reactor to reach the required neutron fluence; the tritium inventory (storage) in the plasma-facing components, which can lead to potential safety concerns and reduced fuel efficiency; and the technology of the plasma-facing components itself, which should demonstrate structural integrity under high temperatures and high neutron fluence. While the dissipation of power exhaust can and should be addressed in high power toroidal devices, the interaction of the plasma with the materials can best be addressed in dedicated linear devices, due to their cost effectiveness and ability to address urgent research and development gaps in a more timely manner. However, new linear plasma devices are needed to investigate PMI under fusion reactor conditions and test novel plasma-facing components. Existing linear devices are limited either in their flux, their reactor-relevant plasma transport regimes in front of the target, their fluence, or their ability to test material samples exposed a priori to high neutron fluence. The proposed Material Plasma Exposure eXperiment (MPEX) is meant to address those deficiencies and will be designed to achieve fusion reactor-relevant plasma parameters as well as the ability to expose a priori neutron-activated materials to plasmas.
The SCHEIE Visual Field Grading System
Sankar, Prithvi S.; O’Keefe, Laura; Choi, Daniel; Salowe, Rebecca; Miller-Ellis, Eydie; Lehman, Amanda; Addis, Victoria; Ramakrishnan, Meera; Natesh, Vikas; Whitehead, Gideon; Khachatryan, Naira; O’Brien, Joan
2017-01-01
Objective No method of grading visual field (VF) defects has been widely accepted throughout the glaucoma community. The SCHEIE (Systematic Classification of Humphrey visual fields-Easy Interpretation and Evaluation) grading system for glaucomatous visual fields was created to convey qualitative and quantitative information regarding visual field defects in an objective, reproducible, and easily applicable manner for research purposes. Methods The SCHEIE grading system is composed of a qualitative and quantitative score. The qualitative score consists of designation in one or more of the following categories: normal, central scotoma, paracentral scotoma, paracentral crescent, temporal quadrant, nasal quadrant, peripheral arcuate defect, expansive arcuate, or altitudinal defect. The quantitative component incorporates the Humphrey visual field index (VFI), location of visual defects for superior and inferior hemifields, and blind spot involvement. Accuracy and speed at grading using the qualitative and quantitative components was calculated for non-physician graders. Results Graders had a median accuracy of 96.67% for their qualitative scores and a median accuracy of 98.75% for their quantitative scores. Graders took a mean of 56 seconds per visual field to assign a qualitative score and 20 seconds per visual field to assign a quantitative score. Conclusion The SCHEIE grading system is a reproducible tool that combines qualitative and quantitative measurements to grade glaucomatous visual field defects. The system aims to standardize clinical staging and to make specific visual field defects more easily identifiable. Specific patterns of visual field loss may also be associated with genetic variants in future genetic analysis. PMID:28932621
NASA Astrophysics Data System (ADS)
Fan, Li; Lin, Changhu; Duan, Wenjuan; Wang, Xiao; Liu, Jianhua; Liu, Feng
2015-01-01
An ultrahigh pressure extraction (UPE)-high performance liquid chromatography (HPLC)/diode array detector (DAD) method was established to evaluate the quality of Lonicera japonica Thunb. Ten active components, including neochlorogenic acid, chlorogenic acid, 4-dicaffeoylquinic acid, caffeic acid, rutin, luteoloside, isochlorogenic acid B, isochlorogenic acid A, isochlorogenic acid C, and quercetin, were qualitatively evaluated and quantitatively determined. Scanning electron microscope images elucidated the bud surface microstructure and extraction mechanism. The optimal extraction conditions of the UPE were 60% methanol solution, 400 MPa of extraction pressure, 3 min of extraction time, and 1:30 (g/mL) solid:liquid ratio. Under the optimized conditions, the total extraction yield of 10 active components was 57.62 mg/g. All the components showed good linearity (r² ≥ 0.9994) and recoveries. This method was successfully applied to quantify 10 components in 22 batches of L. japonica samples from different areas. Compared with heat reflux extraction and ultrasonic-assisted extraction, UPE can be considered as an alternative extraction technique for fast extraction of active ingredient from L. japonica.
Ultrastructural analysis of cell component distribution in the apical cell of Ceratodon protonemata
NASA Technical Reports Server (NTRS)
Walker, L. M.; Sack, F. D.
1995-01-01
A distinctive feature of tip-growing plant cells is that cell components are distributed differentially along the length of the cell, although most ultrastructural analyses have been qualitative. The longitudinal distribution of cell components was studied both qualitatively and quantitatively in the apical cell of dark-grown protonemata of the moss Ceratodon. The first 35 micrometers of the apical cell was analyzed stereologically using transmission electron microscopy. There were four types of distributions along the cell's axis, three of them differential: (1) tubular endoplasmic reticulum was evenly distributed, (2) cisternal endoplasmic reticulum and Golgi vesicles were distributed in a tip-to-base gradient, (3) plastids, vacuoles, and Golgi stacks were enriched in specific areas, although the locations of the enrichments varied, and (4) mitochondria were excluded from the tip-most 5 micrometers and evenly distributed throughout the remaining 30 micrometers. This study provides one of the most comprehensive quantitative, ultrastructural analyses of the distribution of cell components in the apex of any tip-growing plant cell. The finding that almost every component had its own spatial arrangement demonstrates the complexity of the organization and regulation of the distribution of components in tip-growing cells.
Jović, Ozren; Smolić, Tomislav; Primožič, Ines; Hrenar, Tomica
2016-04-19
The aim of this study was to investigate the feasibility of FTIR-ATR spectroscopy coupled with a multivariate numerical methodology for qualitative and quantitative analysis of binary and ternary edible oil mixtures. Four pure oils (extra virgin olive oil, high oleic sunflower oil, rapeseed oil, and sunflower oil), as well as their 54 binary and 108 ternary mixtures, were analyzed using FTIR-ATR spectroscopy in combination with principal component and discriminant analysis, partial least-squares regression, and principal component regression. It was found that the composition of all 166 samples can be excellently represented using only the first three principal components, describing 98.29% of total variance in the selected spectral range (3035-2989, 1170-1140, 1120-1100, 1093-1047, and 930-890 cm⁻¹). Factor scores in the 3D space spanned by these three principal components form a tetrahedral-like arrangement: pure oils at the vertices, binary mixtures on the edges, and ternary mixtures on the faces of the tetrahedron. To confirm the validity of the results, we applied several cross-validation methods. Quantitative analysis was performed by minimizing root-mean-square error of cross-validation values with respect to the spectral range, derivative order, and choice of method (partial least-squares or principal component regression), which resulted in excellent predictions for test sets (R² > 0.99 in all cases). Additionally, the experimentally more demanding gas chromatography analysis of fatty acid content was carried out for all specimens, confirming the results obtained by FTIR-ATR coupled with principal component analysis. However, FTIR-ATR provided a considerably better model for prediction of mixture composition than gas chromatography, especially for high oleic sunflower oil.
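The "first three principal components describing over 98% of variance" result can be illustrated with a small sketch: PCA via singular value decomposition of mean-centered spectra, reporting the variance fraction captured by the leading components. The synthetic "spectra" below are built from three latent components plus small noise, purely to mimic the rank structure the study reports; they are not the oil data.

```python
import numpy as np

def pca_explained_variance(X, n=3):
    """Fraction of total variance captured by the first n principal components."""
    Xc = X - X.mean(axis=0)                      # mean-center each variable (wavenumber)
    s = np.linalg.svd(Xc, compute_uv=False)      # singular values
    var = s ** 2                                 # proportional to PC variances
    return var[:n].sum() / var.sum()

# Hypothetical "spectra": 3 latent components mixed across 50 wavenumbers, 40 samples
rng = np.random.default_rng(2)
loadings = rng.normal(size=(3, 50))
scores = rng.normal(size=(40, 3))
X = scores @ loadings + 0.01 * rng.normal(size=(40, 50))

print(round(pca_explained_variance(X, 3), 4))
```

When, as here, mixtures of a few pure end-members dominate the signal, the factor scores of the first components arrange the samples geometrically (the tetrahedral-like pattern the authors describe).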
Scintigraphy of infected total hip arthroplasty (THA): A canine model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merkel, K.D.; Brown, M.L.; Fitzgerald, R.H.
1984-01-01
Differentiating low-grade sepsis from aseptic loosening of an orthopedic prosthesis is difficult. This study was designed to compare the ability of Tc-99m-HMDP, Ga-67, and In-111 leukocytes (WBC) to differentiate low-grade sepsis from aseptic THA component loosening in a canine model. A canine THA was implanted in 14 dogs. Six dogs were given infected femoral components by injecting 10⁵ colony-forming units of Staphylococcus aureus into the femoral canal 60 to 90 seconds prior to cementing. Four dogs had an aseptic loose femoral component, and four dogs had an aseptic tight femoral component (control). At six months all dogs were evaluated with X-ray, lab scintigraphy, and tissue quantitation of each tracer. Diagnosis was confirmed by histology and quantitative microbiology. White blood cell counts and differentials were normal in all dogs, and in only one out of six infected dogs was the sedimentation rate abnormal. X-rays were interpreted as possible infection in five dogs and probable infection in only one dog. In-111 WBC scans were more accurate than sequential Tc-Ga scans (sensitivity 94% vs 61%, specificity 86% vs 71%, accuracy 90% vs 67%). Quantitative counting of gamma camera data and tissue samples demonstrated significantly (P < .01) higher accumulation of In-111 WBC about the infected component than about the loose or control component. No significant difference was demonstrated between the loose and septic components with Tc-HMDP or Ga. These results correlate well with and confirm our clinical data that In-111 WBC scanning is accurate and useful in the workup of the painful orthopedic prosthesis.
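The diagnostic statistics compared above follow directly from the standard confusion-matrix definitions. The counts below are illustrative values chosen only to reproduce the reported percentages for the In-111 WBC scans; they are not the study's actual animal counts.

```python
def sensitivity(tp, fn):
    """True-positive rate: fraction of truly infected components flagged as infected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: fraction of non-infected components correctly cleared."""
    return tn / (tn + fp)

def accuracy(tp, tn, fp, fn):
    """Overall fraction of correct calls."""
    return (tp + tn) / (tp + tn + fp + fn)

# Illustrative counts reproducing sensitivity 94%, specificity 86%, accuracy 90%
print(sensitivity(47, 3), specificity(43, 7), accuracy(47, 43, 7, 3))
```

Note that accuracy blends the two rates according to how many positive and negative cases there are, which is why all three numbers are reported separately when modalities are compared.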
2015-06-29
... requirements for the system’s ground components to the generating capabilities of standard Marine Corps generators). The Navy did not fully address two recommendations (strengthening ... dedicated generators to power the ground control stations). Recommendations: The Navy and Marine Corps should consider the following recommendations in ...
2015 Workplace and Gender Relations Survey of Reserve Component Members: Tabulations of Responses
2016-03-17
Defense Manpower Data Center, Defense Research, Surveys, and Statistics Center (RSSC), 4800 Mark Center Drive, Suite 04E25-01, Alexandria, VA 22350-4000
Raven, J A
2000-01-01
Biochemical studies have complemented ultrastructural and, subsequently, molecular genetic evidence consistent with the Charophyceae being the closest extant algal relatives of the embryophytes. Among the genes used in such molecular phylogenetic studies is that (rbcL) for the large subunit of ribulose bisphosphate carboxylase-oxygenase (RUBISCO). The RUBISCO of the embryophytes is derived, via the Chlorophyta, from that of the cyanobacteria. This clade of the molecular phylogeny of RUBISCO shows a range of kinetic characteristics, especially of CO2 affinities and of CO2/O2 selectivities. The range of these kinetic values within the bryophytes is no greater than in the rest of the embryophytes; this has implications for the evolution of the embryophytes in the high atmospheric CO2 environment of the late Lower Palaeozoic. The differences in biochemistry between charophycean algae and embryophytes can to some extent be related functionally to the structure and physiology of embryophytes. Examples of components of embryophytes which are qualitatively or quantitatively different from those of charophytes are the water-repellent/water-resistant extracellular lipids; the rigid phenolic polymers functional in water-conducting elements, in mechanical support in air, and in UV-B absorption; flavonoid phenolics involved in UV-B absorption and in interactions with other organisms; and the greater emphasis on low-Mr organic acids, retained in the plant as free acids or salts, or secreted to the rhizosphere. The roles of these components are discussed in relation to the environmental conditions at the time of evolution of the terrestrial embryophytes. A significant point about embryophytes is the predominance of nitrogen-free extracellular structural material (a trait shared by most algae) and UV-B screening components, by contrast with analogous components in many other organisms. 
An important question, which has thus far been incompletely addressed, is the extent to which the absence from bryophytes of the biochemical pathways which produce components found only in tracheophytes is the result of evolutionary loss of these functions. PMID:10905612
Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.
2017-01-01
Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.
Quantitation of Molecular Endpoints for the Dose-Response Component of Cancer Risk Assessment
Cancer risk assessment involves the steps of hazard identification, dose-response assessment, exposure assessment, and risk characterization. The rapid advances in the use of molecular biology approaches have had an impact on all four components, but the greatest overall current...
Q and you: The application of Q methodology in recreation research
Whitney Ward
2010-01-01
Researchers have used various qualitative and quantitative methods to deal with subjectivity in studying people's recreation experiences. Q methodology has been the most effective approach for analyzing both qualitative and quantitative aspects of experience, including attitudes or perceptions. The method is composed of two main components--Q sorting and Q factor...
Quantitative Residual Strain Analyses on Strain Hardened Nickel Based Alloy
NASA Astrophysics Data System (ADS)
Yonezawa, Toshio; Maeguchi, Takaharu; Goto, Toru; Juan, Hou
Many papers have reported on the effects of strain hardening by cold rolling, grinding, welding, etc. on the stress corrosion cracking susceptibility of nickel-based alloys and austenitic stainless steels for LWR piping and components. However, the residual strain introduced by cold rolling, grinding, welding, etc. has not been evaluated quantitatively.
Group Projects and Civic Engagement in a Quantitative Literacy Course
ERIC Educational Resources Information Center
Dewar, Jacqueline; Larson, Suzanne; Zachariah, Thomas
2011-01-01
We describe our approach to incorporating a civic engagement component into a quantitative literacy (QL) course and the resulting gains in student learning, confidence, and awareness of local civic issues. We revised the existing QL course by including semester-long group projects involving local community issues that students could investigate…
ERIC Educational Resources Information Center
Flamer, Eric, Sr.
2012-01-01
Establishing college-degree programs for prison inmates is an evidence-based effective instructional strategy in reducing recidivism. Evaluating academic arenas as a resource to improve behavior and levels of functioning within correctional facilities is a necessary component of inmate academic programs. The purpose of this quantitative,…
A Quantitative Description of FBI Public Relations.
ERIC Educational Resources Information Center
Gibson, Dirk C.
1997-01-01
States that the Federal Bureau of Investigation (FBI) had the most successful media relations program of all government agencies from the 1930s to the 1980s. Uses quantitative analysis to show why those media efforts were successful. Identifies themes that typified the verbal component of FBI publicity and the broad spectrum of mass communication…
USDA-ARS?s Scientific Manuscript database
A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...
Dawidowicz, Andrzej L; Szewczyk, Joanna; Dybowski, Michal P
2016-09-07
Quantitative relations between individual constituents of a liquid sample similar to those established by its direct injection can be obtained by applying a polydimethylsiloxane (PDMS) fiber in a headspace solid-phase microextraction (HS-SPME) system containing the examined sample suspended in methyl silica oil. This paper proves that an analogous system, composed of the sample suspension/emulsion in polyethylene glycol (PEG) and a Carbowax fiber, yields quantitative relations between components of the mixture similar to those established by direct analysis, but only for polar constituents. This is demonstrated for essential oil (EO) components of savory, sage, mint and thyme, and for an artificial liquid mixture of polar constituents. The observed differences in quantitative relations between polar constituents estimated by the two procedures are insignificant (Fexp < Fcrit). The presented results indicate that the wider applicability of a system composed of a sample suspended in an oil of the same physicochemical character as the SPME fiber coating strongly depends on the character of the analyte-suspending liquid and analyte-fiber coating interactions. Copyright © 2016 Elsevier B.V. All rights reserved.
Zhang, Qinnan; Zhong, Liyun; Tang, Ping; Yuan, Yingjie; Liu, Shengde; Tian, Jindong; Lu, Xiaoxu
2017-05-31
Cell refractive index, an intrinsic optical parameter, is closely correlated with intracellular mass and concentration. By combining optical phase-shifting interferometry (PSI) and atomic force microscope (AFM) imaging, we constructed a label-free, non-invasive, quantitative single-cell refractive index measurement system, in which the accurate phase map of a single cell was retrieved with the PSI technique and the cell morphology with nanoscale resolution was achieved with AFM imaging. Based on the proposed AFM/PSI system, we obtained quantitative refractive index distributions of a single red blood cell and a Jurkat cell, respectively. Further, the quantitative change of refractive index distribution during daunorubicin (DNR)-induced Jurkat cell apoptosis was presented, and the content changes of intracellular biochemical components were derived. Importantly, these results were consistent with Raman spectral analysis, indicating that the proposed PSI/AFM-based refractive index system is likely to become a useful tool for intracellular biochemical component analysis, and this will facilitate its application for revealing cell structure and pathological state from a new perspective.
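The conversion underlying such combined phase/topography measurements is the standard quantitative-phase relation n_cell = n_medium + Δφ·λ/(2πh), where h is the AFM-measured cell thickness. A minimal sketch (function name and example values are illustrative, not taken from the study):

```python
import numpy as np

def cell_refractive_index(phase, thickness, wavelength, n_medium):
    """Integral cell refractive index from the measured phase delay (rad)
    and the AFM-derived thickness (same length units as wavelength)."""
    return n_medium + phase * wavelength / (2 * np.pi * thickness)

# A 2*pi phase delay through a 4-um-thick cell at 632.8 nm in medium n = 1.334
n = cell_refractive_index(2 * np.pi, 4.0e-6, 632.8e-9, 1.334)
print(round(n, 4))  # -> 1.4922
```

In practice the phase and thickness are per-pixel maps, so the same expression applied element-wise yields the refractive index distribution across the cell.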
He, Fuyuan; Deng, Kaiwen; Zou, Huan; Qiu, Yun; Chen, Feng; Zhou, Honghao
2011-01-01
To study the differences between chromatopharmacokinetics (pharmacokinetics with fingerprint chromatography) and chromatopharmacodynamics (pharmacodynamics with fingerprint chromatography) of Chinese materia medica formulae, and to answer the question of whether the pharmacokinetic parameters of multi-component preparations can be used to guide their medication. On the basis of the four established branches of chromatopharmacology (pharmacology with chromatographic fingerprints), the pharmacokinetics and pharmacodynamics were compared with respect to their mathematical models and parameter definitions. On the basis of quantitative pharmacology, the function expressions and total statistical parameters, such as the total zero moment, total first moment, and total second moment of the pharmacokinetics and pharmacodynamics, were reduced to common expressions and elucidated for single and multiple components in Chinese materia medica formulae. The total quantitative pharmacokinetic (chromatopharmacokinetic) parameters were determined by the pharmacokinetic parameters of each component, whereas the total quantitative pharmacodynamic (chromatopharmacodynamic) parameters were determined by both the pharmacokinetic and the pharmacodynamic parameters of each component. The pharmacokinetic parameters corresponded to the pharmacodynamic parameters with a stable effective coefficient when the constitutive ratio of the components was constant. The effects of Chinese materia medica were controlled by both pharmacokinetic and pharmacodynamic coefficients. The case in which a pharmacokinetic parameter can independently guide clinical medication for a single component is a special one; chromatopharmacokinetic parameters do not apply to multiple-drug combination systems and cannot by themselves solve the chromatopharmacokinetic problems of Chinese materia medica formulae.
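The statistical moments mentioned above can be computed directly from concentration-time data: the zero moment is the AUC, the first moment (normalized by AUC) is the mean residence time, and the second central moment is the variance of residence times. A minimal sketch using the trapezoidal rule, with no extrapolation beyond the last sampling time (variable names are illustrative):

```python
import numpy as np

def _trapz(y, x):
    """Trapezoidal-rule integral (written out to avoid NumPy-version differences)."""
    y, x = np.asarray(y, float), np.asarray(x, float)
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def moments(t, c):
    """Zero moment (AUC), first moment (mean residence time, MRT) and
    second central moment (variance of residence times, VRT) of a
    concentration-time curve."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    auc = _trapz(c, t)                        # zero moment: total exposure
    mrt = _trapz(t * c, t) / auc              # first moment about the origin
    vrt = _trapz(t**2 * c, t) / auc - mrt**2  # spread about the MRT
    return auc, mrt, vrt

# A total (fingerprint-level) zero moment of a multi-component formula is the
# sum of the component AUCs:
# auc_total = sum(moments(t, c_i)[0] for c_i in component_curves)
auc, mrt, vrt = moments([0, 1, 2], [0, 1, 0])
print(auc, mrt, vrt)  # -> 1.0 1.0 0.0
```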
Mapping complex traits as a dynamic system
Sun, Lidan; Wu, Rongling
2017-01-01
Despite increasing emphasis on the genetic study of quantitative traits, we are still far from being able to chart a clear picture of their genetic architecture, given the inherent complexity involved in trait formation. A competing theory for studying such complex traits has emerged by viewing their phenotypic formation as a “system” in which a high-dimensional group of interconnected components act and interact across different levels of biological organization, from molecules through cells to whole organisms. This system is initiated by a machinery of DNA sequences that regulate a cascade of biochemical pathways to synthesize endophenotypes and further assemble these endophenotypes toward the end-point phenotype by virtue of various developmental changes. This review focuses on a conceptual framework for genetic mapping of complex traits by which to delineate the underlying components, interactions and mechanisms that govern the system according to biological principles and to understand how these components function synergistically under the control of quantitative trait loci (QTLs) to comprise a unified whole. This framework is built by a system of differential equations that quantifies how alterations of different components lead to the global change of trait development and function, and provides a quantitative and testable platform for assessing the multiscale interplay between QTLs and development. The method will enable geneticists to shed light on the genetic complexity of any biological system and predict, alter or engineer its physiological and pathological states. PMID:25772476
NASA Astrophysics Data System (ADS)
Sharifi, P.; Jamali, J.; Sadayappan, K.; Wood, J. T.
2018-05-01
A quantitative experimental study of the effects of process parameters on the formation of defects during solidification of high-pressure die cast (HPDC) magnesium alloy components is presented. The parameters studied are slow-stage velocity, fast-stage velocity, intensification pressure, and die temperature. The amounts of various defects are quantitatively characterized. Multiple runs of the commercial casting simulation package, ProCAST™, are used to model the mold-filling and solidification events. Several locations in the component, including knit lines, the last-to-fill region, and the last-to-solidify region, are identified as the critical regions that have a high concentration of defects. The area fractions of total porosity, shrinkage porosity, gas porosity, and externally solidified grains are separately measured. This study shows that the process parameters, through fluid flow and local solidification conditions, play major roles in the formation of defects during the HPDC process.
Otani, Takashi
2017-01-01
The article is an in-depth explanation of qualitative research, an approach increasingly prevalent among today's research communities. After discussing its present spread within the health sciences, the author addresses: 1. Its definition. 2. Its characteristics, as well as its theoretical and procedural background. 3. Its procedures. 4. Differences between qualitative and quantitative approaches. 5. Mixed methods incorporating quantitative research. And in conclusion: 6. The importance of establishing an epistemological perspective in qualitative research.
Quantitative Live-Cell Confocal Imaging of 3D Spheroids in a High-Throughput Format.
Leary, Elizabeth; Rhee, Claire; Wilks, Benjamin T; Morgan, Jeffrey R
2018-06-01
Accurately predicting the human response to new compounds is critical to a wide variety of industries. Standard screening pipelines (including both in vitro and in vivo models) often lack predictive power. Three-dimensional (3D) culture systems of human cells, a more physiologically relevant platform, could provide a high-throughput, automated means to test the efficacy and/or toxicity of novel substances. However, the challenge of obtaining high-magnification, confocal z stacks of 3D spheroids and understanding their respective quantitative limitations must be overcome first. To address this challenge, we developed a method to form spheroids of reproducible size at precise spatial locations across a 96-well plate. Spheroids of variable radii were labeled with four different fluorescent dyes and imaged with a high-throughput confocal microscope. 3D renderings of the spheroid had a complex bowl-like appearance. We systematically analyzed these confocal z stacks to determine the depth of imaging and the effect of spheroid size and dyes on quantitation. Furthermore, we have shown that this loss of fluorescence can be addressed through the use of ratio imaging. Overall, understanding both the limitations of confocal imaging and the tools to correct for these limits is critical for developing accurate quantitative assays using 3D spheroids.
Quantification of terpene trilactones in Ginkgo biloba with a 1H NMR method.
Liang, Tingfu; Miyakawa, Takuya; Yang, Jinwei; Ishikawa, Tsutomu; Tanokura, Masaru
2018-06-01
Ginkgo biloba L. has been used as a herbal medicine in the traditional treatment of insufficient blood flow, memory deficits, and cerebral insufficiency. The terpene trilactone components, the bioactive agents of Ginkgo biloba L., have also been reported to exhibit useful functionality such as anti-inflammatory and neuroprotective effects. Therefore, in the present research, we attempted to quantitatively analyze the terpene trilactone components in Ginkgo biloba leaf extract with quantitative 1H NMR (qNMR) and obtained results almost identical to data reported using HPLC. Application of the qNMR method to the analysis of the terpene trilactone contents in commercial Ginkgo extract products, such as soft gel capsules and tablets, produced the same levels as those noted on package labels. Thus, qNMR is an alternative method for quantification of the terpene trilactone components in commercial Ginkgo extract products.
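qNMR quantitation rests on the proportionality between a signal's integral and the number of protons producing it; with an internal standard of known concentration, the analyte content follows from an integral ratio. A minimal sketch of that standard internal-standard relation (function name and numbers are illustrative, not values from the study):

```python
def qnmr_mass_conc(i_a, n_a, m_a, i_s, n_s, m_s, conc_s):
    """Analyte mass concentration from 1H NMR integrals relative to an
    internal standard: C_a = C_s * (I_a/I_s) * (N_s/N_a) * (M_a/M_s),
    where i = integral, n = protons per signal, m = molar mass, and
    conc_s = mass concentration of the standard."""
    return conc_s * (i_a / i_s) * (n_s / n_a) * (m_a / m_s)

# Hypothetical example: a 1-proton analyte signal integrating to twice the
# standard's 1-proton signal, equal molar masses, standard at 1.0 mg/mL:
print(qnmr_mass_conc(2.0, 1, 100.0, 1.0, 1, 100.0, 1.0))  # -> 2.0
```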
Salisbury, Chris; Thomas, Clare; O'Cathain, Alicia; Rogers, Anne; Pope, Catherine; Yardley, Lucy; Hollinghurst, Sandra; Fahey, Tom; Lewis, Glyn; Large, Shirley; Edwards, Louisa; Rowsell, Alison; Segar, Julia; Brownsell, Simon; Montgomery, Alan A
2015-02-06
To develop a conceptual model for effective use of telehealth in the management of chronic health conditions, and to use this to develop and evaluate an intervention for people with two exemplar conditions: raised cardiovascular disease risk and depression. The model was based on several strands of evidence: a metareview and realist synthesis of quantitative and qualitative evidence on telehealth for chronic conditions; a qualitative study of patients' and health professionals' experience of telehealth; a quantitative survey of patients' interest in using telehealth; and review of existing models of chronic condition management and evidence-based treatment guidelines. Based on these evidence strands, a model was developed and then refined at a stakeholder workshop. Then a telehealth intervention ('Healthlines') was designed by incorporating strategies to address each of the model components. The model also provided a framework for evaluation of this intervention within parallel randomised controlled trials in the two exemplar conditions, and the accompanying process evaluations and economic evaluations. Primary care. The TElehealth in CHronic Disease (TECH) model proposes that attention to four components will offer interventions the best chance of success: (1) engagement of patients and health professionals, (2) effective chronic disease management (including subcomponents of self-management, optimisation of treatment, care coordination), (3) partnership between providers and (4) patient, social and health system context. Key intended outcomes are improved health, access to care, patient experience and cost-effective care. A conceptual model has been developed based on multiple sources of evidence which articulates how telehealth may best provide benefits for patients with chronic health conditions. It can be used to structure the design and evaluation of telehealth programmes which aim to be acceptable to patients and providers, and cost-effective. 
NASA Astrophysics Data System (ADS)
Zhou, Xiaoying
The purpose of this study is to integrate the quantitative environmental performance assessment tools and the theory of multi-objective optimization within the boundary of electronic product systems to support the selection among design alternatives in terms of environmental impact, technical criteria, and economic feasibility. To meet with the requirements that result from emerging environmental legislation targeting electronics products, the research addresses an important analytical methodological approach to facilitate environmentally conscious design and end-of-life management with a life cycle viewpoint. A synthesis of diverse assessment tools is applied on a set of case studies: lead-free solder materials selection, cellular phone design, and desktop display technology assessment. In the first part of this work, an in-depth industrial survey of the status and concerns of the U.S. electronics industry on the elimination of lead (Pb) in solders is described. The results show that the trade-offs among environmental consequences, technology challenges, business risks, legislative compliance and stakeholders' preferences must be explicitly, simultaneously, and systematically addressed in the decision-making process used to guide multi-faceted planning of environmental solutions. In the second part of this work, the convergent optimization of the technical cycle, economic cycle and environmental cycle is addressed in a coherent and systematic way using the application of environmentally conscious design of cellular phones. The technical understanding of product structure, components analysis, and materials flow facilitates the development of "Design for Disassembly" guidelines. A bottom-up disassembly analysis on a "bill of materials" based structure at a micro-operational level is utilized to select optimal end-of-life strategies on the basis of economic feasibility. 
A macro-operational level life cycle model is used to investigate the environmental consequences, linking environmental impact with cellular phone production activities and focusing on the upstream manufacturing and end-of-life life cycle stages. In the last part of this work, the quantitative elicitation of weighting factors facilitates the comparison of trade-offs in the context of a multi-attribute problem. An integrated analytical approach, Integrated Industrial Ecology Function Deployment (I2-EFD), is proposed to assess alternatives at the design phase of a product system and is validated with the assessment of desktop display technologies and lead-free solder alternatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peterson, Mark J; Efroymson, Rebecca Ann; Adams, Marshall
The long-term ecological recovery of an impaired stream in response to an industrial facility's pollution abatement actions and the implications of the biological monitoring effort for environmental management are the subject of this special issue of Environmental Management. This final article focuses on the synthesis of the biological monitoring program's components and methods, the efficacy of various biological monitoring techniques for environmental management, and the lessons learned from the program that might be applicable to the design and application of other programs. The focus of the 25-year program has been on East Fork Poplar Creek, an ecologically impaired stream in Oak Ridge, Tennessee with varied and complex stressors from a Department of Energy facility in its headwaters. Major components of the long-term program included testing and monitoring of invertebrate and fish toxicity, bioindicators of fish health, fish contaminant accumulation, and instream communities (including periphyton, benthic macroinvertebrates, and fish). Key parallel components of the program include water chemistry sampling and data management. Multiple lines of evidence suggested positive ecological responses during three major pollution abatement periods. Based on this case study and the related literature, effective environmental management of impaired streams starts with program design that is consistent across space and time, but also adaptable to changing conditions. The biological monitoring approaches used for the program provided a strong basis for assessments of recovery from remedial actions and of the likely causes of impairment. This case study provides a unique application of multidisciplinary and quantitative techniques to address multiple and complex regulatory and programmatic goals, environmental stressors, and remedial actions.
Kohrt, Brandon A; Jordans, Mark J D; Turner, Elizabeth L; Sikkema, Kathleen J; Luitel, Nagendra P; Rai, Sauharda; Singla, Daisy R; Lamichhane, Jagannath; Lund, Crick; Patel, Vikram
2018-01-01
Non-specialist healthcare providers, including primary and community healthcare workers, in low- and middle-income countries can effectively treat mental illness. However, scaling up mental health services within existing health systems has been limited by barriers such as stigma against people with mental illness. Therefore, interventions are needed to address attitudes and behaviors among non-specialists. Aimed at addressing this gap, REducing Stigma among HealthcAre Providers to ImprovE mental health services (RESHAPE) is an intervention in which social contact with mental health service users is added to training for non-specialist healthcare workers integrating mental health services into primary healthcare. This protocol describes a mixed methods pilot and feasibility study in primary care centers in Chitwan, Nepal. The qualitative component will include key informant interviews and focus group discussions. The quantitative component consists of a pilot cluster randomized controlled trial (c-RCT), which will establish parameters for a future effectiveness study of RESHAPE compared to training as usual (TAU). Primary healthcare facilities (the cluster unit, k = 34) will be randomized to TAU or RESHAPE. The direct beneficiaries of the intervention are the primary healthcare workers in the facilities (n = 150); indirect beneficiaries are their patients (n = 100). The TAU condition is existing mental health training and supervision for primary healthcare workers delivered through the Programme for Improving Mental healthcarE (PRIME) implementing the mental health Gap Action Programme (mhGAP). The primary objective is to evaluate acceptability and feasibility through qualitative interviews with primary healthcare workers, trainers, and mental health service users.
The secondary objective is to collect quantitative information on health worker outcomes including mental health stigma (Social Distance Scale), clinical knowledge (mhGAP), clinical competency (ENhancing Assessment of Common Therapeutic factors, ENACT), and implicit attitudes (Implicit Association Test, IAT), and patient outcomes including stigma-related barriers to care, daily functioning, and symptoms. The pilot and feasibility study will contribute to refining recommendations for implementation of mhGAP and other mental health services in primary healthcare settings in low-resource health systems. The pilot c-RCT findings will inform an effectiveness trial of RESHAPE to advance the evidence-base for optimal approaches to training and supervision for non-specialist providers. ClinicalTrials.gov identifier, NCT02793271.
Complex Genetics of Behavior: BXDs in the Automated Home-Cage.
Loos, Maarten; Verhage, Matthijs; Spijker, Sabine; Smit, August B
2017-01-01
This chapter describes a use case for the genetic dissection and automated analysis of complex behavioral traits using the genetically diverse panel of BXD mouse recombinant inbred strains. Strains of the BXD resource differ widely in terms of gene and protein expression in the brain, as well as in their behavioral repertoire. A large mouse resource opens the possibility of gene-finding studies underlying distinct behavioral phenotypes; however, such a resource poses a challenge in behavioral phenotyping. To address the specifics of large-scale screening we describe: (1) how to assess mouse behavior systematically in a large genetic cohort, (2) how to dissect automation-derived longitudinal mouse behavior into quantitative parameters, and (3) how to map these quantitative traits to the genome, deriving loci underlying aspects of behavior.
MBTH: A novel approach to rapid, spectrophotometric quantitation of total algal carbohydrates
Van Wychen, Stefanie; Long, William; Black, Stuart K.; ...
2016-11-24
A high-throughput and robust application of the 3-methyl-2-benzothiazolinone hydrazone (MBTH) method was developed for carbohydrate determination in microalgae. The traditional phenol-sulfuric acid method to quantify carbohydrates is strongly affected by algal biochemical components and exhibits a highly variable response to microalgal monosaccharides. We present a novel use of the MBTH method to accurately quantify carbohydrates in hydrolyzate after acid hydrolysis of algal biomass, without a need for neutralization. As a result, the MBTH method demonstrated consistent and sensitive quantitation of algae-specific monosaccharides down to 5 µg mL-1 without interference from other acidic algal hydrolyzate components.
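Spectrophotometric quantitation of this kind typically runs through a linear calibration curve built from monosaccharide standards, which is then inverted to convert a sample's absorbance to concentration. A sketch of that step (the absorbance values are invented for illustration, not data from the study):

```python
import numpy as np

# Hypothetical glucose standards (ug/mL) and their MBTH absorbance readings
std_conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
std_abs  = np.array([0.05, 0.10, 0.20, 0.40, 0.80])

slope, intercept = np.polyfit(std_conc, std_abs, 1)  # least-squares line

def absorbance_to_conc(a):
    """Invert the calibration line to get carbohydrate concentration (ug/mL)."""
    return (a - intercept) / slope

print(round(absorbance_to_conc(0.30), 1))  # -> 30.0
```

In a plate-based workflow the same inversion is applied to every well, with the 5 µg/mL standard marking the practical lower end of the calibrated range.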
Using a Genetic Algorithm to Learn Behaviors for Autonomous Vehicles
1992-08-12
Truly autonomous vehicles will require both projective planning and reactive components in order to perform robustly. Projective components are...long time period. This work addresses the problem of creating reactive components for autonomous vehicles. Creating reactive behaviors (stimulus...
Opportunities for Nondestructive Evaluation: Quantitative Characterization (Postprint)
2017-07-01
between the design and the as-manufactured and, ultimately, the as-maintained state. As this occurs with the article of interest being placed in...
Air Force Research Laboratory, Materials and Manufacturing Directorate, Wright-Patterson Air Force Base, OH 45433-7750
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
2002-06-01
Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.
Makrakis, Vassilios; Kostoulas-Makrakis, Nelly
2016-02-01
Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.
How multi segmental patterns deviate in spastic diplegia from typically developed.
Zago, Matteo; Sforza, Chiarella; Bona, Alessia; Cimolin, Veronica; Costici, Pier Francesco; Condoluci, Claudia; Galli, Manuela
2017-10-01
The relationship between gait features and coordination in children with Cerebral Palsy is not sufficiently analyzed yet. Principal Component Analysis can help in understanding motion patterns decomposing movement into its fundamental components (Principal Movements). This study aims at quantitatively characterizing the functional connections between multi-joint gait patterns in Cerebral Palsy. 65 children with spastic diplegia aged 10.6 (SD 3.7) years participated in standardized gait analysis trials; 31 typically developing adolescents aged 13.6 (4.4) years were also tested. To determine if posture affects gait patterns, patients were split into Crouch and knee Hyperextension group according to knee flexion angle at standing. 3D coordinates of hips, knees, ankles, metatarsal joints, pelvis and shoulders were submitted to Principal Component Analysis. Four Principal Movements accounted for 99% of global variance; components 1-3 explained major sagittal patterns, components 4-5 referred to movements on frontal plane and component 6 to additional movement refinements. Dimensionality was higher in patients than in controls (p<0.01), and the Crouch group significantly differed from controls in the application of components 1 and 4-6 (p<0.05), while the knee Hyperextension group in components 1-2 and 5 (p<0.05). Compensatory strategies of children with Cerebral Palsy (interactions between main and secondary movement patterns), were objectively determined. Principal Movements can reduce the effort in interpreting gait reports, providing an immediate and quantitative picture of the connections between movement components. Copyright © 2017 Elsevier Ltd. All rights reserved.
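The decomposition the abstract describes can be illustrated with a minimal sketch (synthetic marker data, not the study's; variable names such as `pm_scores` are mine): PCA via SVD on centered 3D joint coordinates yields "Principal Movement" time courses, and the cumulative variance curve gives the number of components needed to reach 99% of global variance.

```python
import numpy as np

# Illustrative stand-in for gait-lab data: rows = time frames, columns =
# concatenated x/y/z coordinates of body landmarks (sizes are arbitrary).
rng = np.random.default_rng(0)
n_frames, n_coords = 200, 24  # e.g. 8 landmarks x 3 coordinates
X = rng.standard_normal((n_frames, n_coords)) @ rng.standard_normal((n_coords, n_coords))

Xc = X - X.mean(axis=0)                    # center each coordinate over time
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var_explained = S**2 / np.sum(S**2)        # variance fraction per component
cum = np.cumsum(var_explained)
k99 = int(np.searchsorted(cum, 0.99)) + 1  # components covering 99% of variance
pm_scores = Xc @ Vt.T                      # time courses of the Principal Movements
```

Group comparisons like those in the study would then be run on the `pm_scores` (or on per-subject variance shares), one column per Principal Movement.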
Hungry Horse Mitigation; Flathead Lake, 2001-2002 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Barry
2003-06-09
The Confederated Salish and Kootenai Tribes (CSKT) and Montana Fish Wildlife and Parks (MFWP) wrote "Fisheries Mitigation Plan for Losses Attributable to the Construction and Operation of Hungry Horse Dam" in March 1991 to define the fisheries losses, mitigation alternatives and recommendations to protect, mitigate and enhance resident fish and aquatic habitat affected by Hungry Horse Dam. On November 12, 1991, the Northwest Power Planning Council (NPPC) approved the mitigation plan with minor modifications, called for a detailed implementation plan, and amended measures 903(h)(1) through (7). A long-term mitigation plan was submitted in August 1992, was approved by the Council in 1993, and the first contract for this project was signed on November 11, 1993. The problem this project addresses is the loss of habitat, both in quality and quantity, in the interconnected Flathead Lake and River basin resulting from the construction and operation of Hungry Horse Dam. The purpose of the project is to both implement mitigation measures and monitor the biological responses to those measures, including those implemented by Project Numbers 9101903 and 9101904. Goals and objectives of the 1994 Fish and Wildlife Program (Section 10.1) addressed by this project are the rebuilding of weak, but recoverable, native populations injured by the hydropower system to sustainable levels. The project mitigates the blockage of spawning runs by Hungry Horse Dam by restoring and even creating spawning habitats within direct drainages to Flathead Lake. The project also addresses the altered habitat within Flathead Lake resulting from species shifts and consequent dominance of new species that restricts the potential success of mitigation measures. Specific goals of this project are to create and restore habitat and quantitatively monitor changes in fish populations to verify the efficacy of our mitigation measures.
The project consists of three components: monitoring, restoration and research. Monitoring, for example, includes a spring gillnetting series conducted annually in Flathead Lake and builds on an existing data set initiated in 1981. Monitoring of the experimental kokanee reintroduction was a primary activity of this project between 1992 and 1997. Lake trout, whose high densities have precluded successful mitigation of losses of other species in Flathead Lake, have been monitored since 1996 to measure several biological parameters. Results of this work have utility in determining the population status of this key predator in Flathead Lake. The project has also defined the baseline condition of the Flathead Lake fishery in 1992-1993 and has conducted annual lakewide surveys since 1998. The restoration component of the project has addressed several stream channel, riparian, and fish passage problems. The research component of the project began in FY 2000 and measured trophic linkages between M. relicta and other species to assist in predicting the results of potential mitigation strategies. Only Objective 1 in the workplan is funded entirely by Hungry Horse Mitigation funds. Additional funds are drawn from other sources to assist in completion of Objectives 2-6.
ERIC Educational Resources Information Center
Soebari, Titien; Aldridge, Jill M.
2016-01-01
This article reports on the differential effectiveness of a teacher professional development programme for teachers in urban and rural schools in Indonesia. The study employed an embedded mixed methods design that involved the concurrent collection of both quantitative and qualitative data. The quantitative component involved a pre-post design in…
ERIC Educational Resources Information Center
Sobel, Robert M.; Ballantine, David S.; Ryzhov, Victor
2005-01-01
Industrial application of gas chromatography-mass spectrometry (GC-MS) analysis is a powerful technique that could be used to elucidate components of a complex mixture while offering the benefits of high-precision quantitative analysis. The natural wintergreen oil is examined for its phenol concentration to determine the level of refining…
ERIC Educational Resources Information Center
Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.
2007-01-01
A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…
ERIC Educational Resources Information Center
Mitchell-Koch, Jeremy T.; Reid, Kendra R.; Meyerhoff, Mark E.
2008-01-01
An experiment for the undergraduate quantitative analysis laboratory involving applications of visible spectrophotometry is described. Salicylate, a component found in several medications, as well as the active by-product of aspirin decomposition, is quantified. The addition of excess iron(III) to a solution of salicylate generates a deeply…
LABORATORY EVALUATION OF NONSTICK COATINGS TO REDUCE INK ADHESION TO PRINTING PRESS COMPONENTS
The report gives results of a project to identify surface coatings or preparations that might reduce and/or eliminate the need for solvent cleaning of ink-feed-system components of printing equipment. The study was designed to provide qualitative, but not quantitative, results. T...
Pastoral Perspectives of Humor's Use in Ministry Contexts
ERIC Educational Resources Information Center
Young, Jonathan W.
2012-01-01
Results from a qualitative component of a mixed-methods research design are reported regarding the use of humor in pastoral ministry. Thirteen Southern Baptist (SB) pastors were interviewed, from a total of 37 exemplars identified in the quantitative component of the study, regarding their perspectives toward humor in ministry. Results overall…
Components That Affect the Personal Motivation to Implement Campus Safety Protocols
ERIC Educational Resources Information Center
Burt, Ernest, III
2013-01-01
This study examined components that have an effect on crisis response team members' personal motivation to perform campus safety protocols. The research method for this study was a quantitative design. The variables measured were compensation, experience, training, and communication. The motivation sources for this study included instrumental…
The Intercultural Component in Textbooks for Teaching a Service Technical Writing Course
ERIC Educational Resources Information Center
Matveeva, Natalia
2007-01-01
This research article investigates new developments in the representation of the intercultural component in textbooks for a service technical writing course. Through textual analysis, using quantitative and qualitative techniques, I report discourse analysis of 15 technical writing textbooks published during 1993-2006. The theoretical and…
MEMBRANE FILTER PROCEDURE FOR ENUMERATING THE COMPONENT GENERA OF THE COLIFORM GROUP IN SEAWATER
A facile, quantitative, membrane filter procedure (mC) for defining the distribution of coliform populations in seawater according to the component genera was developed. The procedure, which utilizes a series of in situ substrate tests to obviate the picking of colonies for ident...
Project Simu-School Component Washington State University
ERIC Educational Resources Information Center
Glass, Thomas E.
1976-01-01
This component of the project attempts to facilitate planning by furnishing models that manage cumbersome and complex data, supply an objectivity that identifies all relationships between elements of the model, and provide a quantitative model allowing for various forecasting techniques that describe the long-range impact of decisions. (Author/IRT)
The Undergraduate Spanish Major Curriculum: Realities and Faculty Perceptions
ERIC Educational Resources Information Center
Hertel, Tammy Jandrey; Dings, Abby
2014-01-01
This article presents the quantitative and qualitative results of a nationwide survey of Spanish department faculty on the components of their undergraduate Spanish major curriculum and their perceptions of these components, as well as their perceptions of recent Modern Language Association (MLA) reports (2007, 2009) and the reports'…
ERIC Educational Resources Information Center
Hannan, Michael T.
This document is part of a series of chapters described in SO 011 759. Addressing the problems of studying change and the change process, the report argues that sociologists should study coupled changes in qualitative and quantitative outcomes (e.g., marital status and earnings). The author presents a model for sociological studies of change in…
A Step Beyond Simple Keyword Searches: Services Enabled by a Full Content Digital Journal Archive
NASA Technical Reports Server (NTRS)
Boccippio, Dennis J.
2003-01-01
The problems of managing and searching large archives of scientific journal articles can potentially be addressed through data mining and statistical techniques matured primarily for quantitative scientific data analysis. A journal paper could be represented by a multivariate descriptor, e.g., the occurrence counts of a number of key technical terms or phrases (keywords), perhaps derived from a controlled vocabulary (e.g., the American Meteorological Society's Glossary of Meteorology) or bootstrapped from the journal archive itself. With this technique, conventional statistical classification tools can be leveraged to address challenges faced by both scientists and professional societies in knowledge management. For example, cluster analyses can be used to find bundles of "most-related" papers, and address the issue of journal bifurcation (when is a new journal necessary, and what topics should it encompass). Similarly, neural networks can be trained to predict the optimal journal (within a society's collection) in which a newly submitted paper should be published. Comparable techniques could enable very powerful end-user tools for journal searches, all premised on the view of a paper as a data point in a multidimensional descriptor space, e.g.: "find papers most similar to the one I am reading", "build a personalized subscription service, based on the content of the papers I am interested in, rather than preselected keywords", "find suitable reviewers, based on the content of their own published works", etc. Such services may represent the next "quantum leap" beyond the rudimentary search interfaces currently provided to end-users, as well as a compelling value-added component needed to bridge the print-to-digital-medium gap, and help stabilize professional societies' revenue stream during the print-to-digital transition.
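The "paper as a point in descriptor space" idea sketches easily (toy keyword counts and paper names below are invented, not from any archive): represent each paper by its keyword occurrence vector and rank candidates by cosine similarity, the basis for a "find papers most similar to this one" service.

```python
import numpy as np

# Hypothetical controlled vocabulary and per-paper keyword occurrence counts.
vocab = ["lightning", "convection", "radar", "aerosol"]
papers = {
    "paper_A": [5, 3, 0, 0],
    "paper_B": [4, 2, 1, 0],
    "paper_C": [0, 0, 2, 6],
}
names = list(papers)
M = np.array([papers[n] for n in names], dtype=float)

def most_similar(query, M, names):
    """Rank papers by cosine similarity to a query descriptor vector."""
    q = np.asarray(query, dtype=float)
    sims = (M @ q) / (np.linalg.norm(M, axis=1) * np.linalg.norm(q))
    order = np.argsort(-sims)
    return [names[i] for i in order], sims[order]

ranked, scores = most_similar(papers["paper_A"], M, names)
# paper_A matches itself first; paper_B (similar topic mix) ranks above paper_C
```

The same descriptor matrix `M` is what a clustering routine or a journal-classification model would consume.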
Evolution of Bow-Tie Architectures in Biology
Friedlander, Tamar; Mayo, Avraham E.; Tlusty, Tsvi; Alon, Uri
2015-01-01
Bow-tie or hourglass structure is a common architectural feature found in many biological systems. A bow-tie in a multi-layered structure occurs when intermediate layers have much fewer components than the input and output layers. Examples include metabolism where a handful of building blocks mediate between multiple input nutrients and multiple output biomass components, and signaling networks where information from numerous receptor types passes through a small set of signaling pathways to regulate multiple output genes. Little is known, however, about how bow-tie architectures evolve. Here, we address the evolution of bow-tie architectures using simulations of multi-layered systems evolving to fulfill a given input-output goal. We find that bow-ties spontaneously evolve when the information in the evolutionary goal can be compressed. Mathematically speaking, bow-ties evolve when the rank of the input-output matrix describing the evolutionary goal is deficient. The maximal compression possible (the rank of the goal) determines the size of the narrowest part of the network—that is the bow-tie. A further requirement is that a process is active to reduce the number of links in the network, such as product-rule mutations, otherwise a non-bow-tie solution is found in the evolutionary simulations. This offers a mechanism to understand a common architectural principle of biological systems, and a way to quantitate the effective rank of the goals under which they evolved. PMID:25798588
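The rank argument above can be made concrete with a toy linear goal (the matrix values are illustrative, not from the paper): when the input-output matrix describing the goal is rank-deficient, it always factors through a bottleneck layer whose width equals the rank, which is the narrowest the bow-tie can be.

```python
import numpy as np

# Toy input-output goal matrix G; row 2 = 2 x row 1, so G is compressible.
G = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])
r = np.linalg.matrix_rank(G)   # rank-deficient: r = 2 < 3

# An exact factorization G = B @ A through a bottleneck of width r exists:
U, S, Vt = np.linalg.svd(G)
A = np.diag(S[:r]) @ Vt[:r]    # input layer -> bottleneck (r units)
B = U[:, :r]                   # bottleneck -> output layer
assert np.allclose(B @ A, G)   # the r-unit waist reproduces the full goal
```

A full-rank goal admits no such compression, which is why, in the simulations, bow-ties only evolve when the goal matrix is rank-deficient.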
Excitation of the Earth's Chandler wobble by southern oscillation/El Nino, 1900-1979
NASA Technical Reports Server (NTRS)
Chao, B. F.
1985-01-01
The southern oscillation/El Nino (ENSO) is the single most prominent interannual signal in global atmospheric/oceanic fluctuations. The following question is addressed: how important is the angular momentum carried by ENSO in exciting the Earth's Chandler wobble? The question is attacked through a statistical analysis of the coherence spectra (correlation as a function of frequency) between two data sets spanning 1900 to 1979: the southern oscillation index (SOI) time series and the excitation function psi (with x-component psi sub x and y-component psi sub y) of the Chandler wobble derived from the homogeneous ILS (International Latitude Service) polar motion data. The coherence power and phase in the Chandler frequency band (approx. 0.79 to 0.89 cpy) are studied. It is found that, during 1900 to 1979, the coherence between SOI and psi sub x is significant well over the 95% confidence threshold, whereas that between SOI and psi sub y is practically nil. Quantitatively, the coherence study shows that ENSO provides some 20% of the observed Chandler wobble excitation power. Since earlier investigations have shown that the total atmospheric/oceanic variation can account for the Chandler wobble excitation at about the 20% level, the implication is that ENSO may be an important (interannual) part of the atmospheric/oceanic variation that is responsible for the Chandler wobble excitation during 1900 to 1979.
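A coherence-spectrum analysis of this kind can be sketched on synthetic data (the series below are invented, not the SOI or ILS records): two monthly time series sharing a common signal near the Chandler band show high magnitude-squared coherence at that frequency and low coherence elsewhere.

```python
import numpy as np
from scipy.signal import coherence

# Two synthetic 80-year monthly series sharing a sinusoid near 0.84 cycles/year,
# each with independent noise (amplitudes are arbitrary choices).
rng = np.random.default_rng(1)
n = 960                                 # 80 years x 12 months
t = np.arange(n) / 12.0                 # time in years
common = np.sin(2 * np.pi * 0.84 * t)
x = common + 0.5 * rng.standard_normal(n)
y = common + 0.5 * rng.standard_normal(n)

# Welch-averaged magnitude-squared coherence; fs in samples per year.
f, Cxy = coherence(x, y, fs=12.0, nperseg=240)
band = (f >= 0.79) & (f <= 0.89)        # the Chandler frequency band
band_coh = Cxy[band].mean()             # high where the shared signal lives
```

Significance thresholds (like the 95% level cited in the abstract) depend on the number of averaged segments, so `nperseg` trades frequency resolution against statistical confidence.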
Applying Quantitative Approaches to the Formative Evaluation of Antismoking Campaign Messages
Parvanta, Sarah; Gibson, Laura; Forquer, Heather; Shapiro-Luft, Dina; Dean, Lorraine; Freres, Derek; Lerman, Caryn; Mallya, Giridhar; Moldovan-Johnson, Mihaela; Tan, Andy; Cappella, Joseph; Hornik, Robert
2014-01-01
This article shares an in-depth summary of a formative evaluation that used quantitative data to inform the development and selection of promotional ads for the antismoking communication component of a social marketing campaign. A foundational survey provided cross-sectional data to identify beliefs about quitting smoking that campaign messages should target, as well as beliefs to avoid. Pretesting draft ads against quantitative indicators of message effectiveness further facilitated the selection and rejection of final campaign ads. Finally, we consider lessons learned from the process of balancing quantitative methods and judgment to make formative decisions about more and less promising persuasive messages for campaigns. PMID:24817829
Hierarchical Analytical Approaches for Unraveling the Composition of Proprietary Mixtures
The compositions of commercial mixtures, including pesticide inert ingredients, aircraft deicers, aqueous film-forming foam (AFFF) formulations, and, by analogy, fracking fluids, are proprietary. Quantitative analytical methodologies can only be developed for mixture components once their identities are known. Because proprietary mixtures may contain volatile and non-volatile components, a hierarchy of analytical methods is often required for the full identification of all proprietary mixture components.
Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin
2016-11-01
J. Sep. Sci. 2016, 39, 4147-4157 DOI: 10.1002/jssc.201600284 Yinchenhao decoction (YCHD) is a famous Chinese herbal formula recorded in the Shang Han Lun, which was prescribed by Zhongjing Zhang during 150-219 AD. A novel quantitative analysis method was developed, based on ultrahigh performance liquid chromatography coupled with a diode array detector, for the simultaneous determination of 14 main active components in Yinchenhao decoction. Furthermore, the method has been applied to compositional difference analysis of the 14 components in eight normal extraction samples of Yinchenhao decoction, with the aid of hierarchical clustering analysis and similarity analysis. The present research could help hospitals, factories and labs choose the best way to make Yinchenhao decoction with better efficacy. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Shahlaei, Mohsen; Sabet, Razieh; Ziari, Maryam Bahman; Moeinifard, Behzad; Fassihi, Afshin; Karbakhsh, Reza
2010-10-01
Quantitative relationships between molecular structure and methionine aminopeptidase-2 inhibitory activity of a series of cytotoxic anthranilic acid sulfonamide derivatives were discovered. We have demonstrated the detailed application of two efficient nonlinear methods for evaluation of quantitative structure-activity relationships of the studied compounds. Components produced by principal component analysis were used as input to the developed nonlinear models. The performance of the developed models, namely PC-GRNN and PC-LS-SVM, was tested by several validation methods. The resulting PC-LS-SVM model had high statistical quality (R(2)=0.91 and R(CV)(2)=0.81) for predicting the cytotoxic activity of the compounds. Comparison between the predictability of PC-GRNN and PC-LS-SVM indicates that the latter method has a higher ability to predict the activity of the studied molecules. Copyright (c) 2010 Elsevier Masson SAS. All rights reserved.
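The "PCA components as model input" pipeline can be sketched with a simple linear stand-in (principal component regression rather than the paper's PC-GRNN/PC-LS-SVM; the descriptors and activities below are synthetic): project descriptors onto the leading principal components, then fit the activity on the component scores.

```python
import numpy as np

# Synthetic molecular descriptor matrix and simulated activities.
rng = np.random.default_rng(2)
n_mol, n_desc = 40, 10
X = rng.standard_normal((n_mol, n_desc))
w = rng.standard_normal(n_desc)
y = X @ w + 0.1 * rng.standard_normal(n_mol)

# Step 1: PCA of the descriptors (SVD of the centered matrix).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 5
T = Xc @ Vt[:k].T                      # scores on the first k components

# Step 2: fit activity on the component scores (linear least squares here;
# a GRNN or LS-SVM would replace this step in the paper's approach).
beta, *_ = np.linalg.lstsq(T, y - y.mean(), rcond=None)
y_hat = T @ beta + y.mean()
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2)
```

Cross-validated R(CV)(2) values like those quoted would come from repeating the fit with held-out molecules rather than from this in-sample `r2`.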
Microlocalization and Quantitation of Risk Associated Elements in Gleason Graded Prostate Tissue
2007-03-01
Performing organization: Regents of the University of California, Los Angeles, CA 90024. Report date: March 2007.
Rescue of TET2 Haploinsufficiency in Myelodysplastic Syndrome Patients Using Turbo Cosubstrate
2017-07-01
Annual report prepared for the U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702-5012. Abstract fragment: …enhance the activity of TET2 enzymes from MDS patient. Further, we published a manuscript on convenient expression, purification and quantitative liquid…
Mathematical Model of HIF-1 alpha Pathway, Oxygen Transport and Hypoxia
2017-09-01
Air Force Research Laboratory, 711th Human Performance Wing, Wright-Patterson AFB, OH 45433-5707. Abstract fragment: …interpret experimental data in terms of underlying mechanisms. Such experiments, if quantitative, can also be used to calibrate and further parameterize…
Criteria and Planning Guidance for Ex-Plant Harvesting to Support Subsequent License Renewal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Devanathan, Ram; Meyer, Ryan M.
As U.S. nuclear power plants look to subsequent license renewal (SLR) to operate for a 20-year period beyond 60 years, the U.S. Nuclear Regulatory Commission and the industry will be addressing technical issues around the capability of long-lived passive components to meet their functionality objectives. A key challenge will be to better understand likely materials degradation mechanisms in these components and their impacts on component functionality and margins to safety. Research addressing many of the remaining technical gaps in these areas for SLR may greatly benefit from materials sampled from plants (decommissioned or operating). Because of the cost and inefficiency of piecemeal sampling, there is a need for a strategic and systematic approach to sampling materials from structures, systems, and components (SSC) in both operating and decommissioned plants. This document describes a potential approach for sampling (harvesting) materials that focuses on prioritizing materials for sampling using a number of criteria. These criteria are based on an evaluation of technical gaps identified in the literature, research needs to address these technical gaps, and lessons learned from previous harvesting campaigns. The document also describes a process for planning future harvesting campaigns; such a plan would include an understanding of the harvesting priorities, available materials, and the planned use of the materials to address the technical gaps.
A New Approach for Quantitative Evaluation of Ultrasonic Wave Attenuation in Composites
NASA Astrophysics Data System (ADS)
Ni, Qing-Qing; Li, Ran; Xia, Hong
2017-02-01
When ultrasonic waves propagate in composite materials, the propagation behaviors result from the combined effects of various factors, such as material anisotropy and viscoelastic properties, internal microstructure and defects, incident wave characteristics, and interface conditions between composite components. It is essential to clarify how these factors affect the ultrasonic wave propagation and attenuation characteristics, and how they interact with each other. In the present paper, based on a newly developed time-domain finite element analysis code, PZflex, a unique approach for clarifying the detailed influence mechanisms of the aforementioned factors is proposed, in which each attenuation component can be extracted from the overall attenuation and analyzed separately. By taking into consideration the interrelation between individual attenuation components, the variation behavior of each component and the internal dynamic stress distribution against material anisotropy and matrix viscosity are separately and quantitatively evaluated. The detailed analysis of each attenuation component shows that energy dissipation at the interface is a major contributor to ultrasonic wave attenuation, providing up to 68.2% of the overall attenuation, and that each attenuation component is closely related to the material anisotropy and viscoelasticity. The results clarify the correlation between ultrasonic wave propagation characteristics and material viscoelastic properties, which will be useful in the further development of ultrasonic technology for defect detection.
Design and analysis issues in quantitative proteomics studies.
Karp, Natasha A; Lilley, Kathryn S
2007-09-01
Quantitative proteomics is the comparison of distinct proteomes, enabling the identification of protein species that exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure that valid conclusions are drawn from such studies. Approaches to the problems that arise with large datasets are discussed, giving insight into the types of statistical analyses appropriate for the various experimental strategies that quantitative proteomic studies can employ. This review also highlights the importance of a robust experimental design and discusses various issues surrounding the design of experiments. The concepts and examples discussed within show how robust design and analysis lead to confident results and help ensure that quantitative proteomics delivers.
2013-01-01
Background To inform the design of a randomised controlled trial (RCT) of an exercise-based programme for long term stroke survivors, we conducted a mixed methods before-and-after case series with assessment at three time points. We evaluated Action for Rehabilitation from Neurological Injury (ARNI), a personalised, functionally-focussed programme. It was delivered through 24 hours of one-to-one training by an Exercise Professional (EP), plus at least 2 hours weekly unsupervised exercise, over 12-14 weeks. Assessment was by patient-rated questionnaires addressing function, physical activity, confidence, fatigue and health-related quality of life; objective assessment of gait quality and speed; and qualitative individual interviews conducted with participants. Data were collected at baseline, 3 months and 6 months. Fidelity and acceptability were assessed by participant interviews, audit of participant and EP records, and observation of training. Findings Four of six enrolled participants completed the exercise programme. Quantitative data demonstrated little change across the sample, but marked changes on some measures for some individuals. Qualitative interviews suggested that small benefits in physical outcomes could be of great psychological significance to participants. Participant-reported fatigue levels commonly increased, and non-completers said they found the programme too demanding. Most key components of the intervention were delivered, but there were several potentially important departures from intervention fidelity. Discussion The study provided data and experience that are helping to inform the design of an RCT of this intervention. It suggested the need for a broader recruitment strategy; indicated areas that could be explored in more depth in the qualitative component of the trial; and highlighted issues that should be addressed to enhance and evaluate fidelity, particularly in the preparation and monitoring of intervention providers.
The experience illustrates the value of even small sample before-and-after studies in the development of trials of complex interventions. PMID:23941470
A Routine 'Top-Down' Approach to Analysis of the Human Serum Proteome.
D'Silva, Arlene M; Hyett, Jon A; Coorssen, Jens R
2017-06-06
Serum provides a rich source of potential biomarker proteoforms. One of the major obstacles in analysing serum proteomes is detecting lower abundance proteins owing to the presence of hyper-abundant species (e.g., serum albumin and immunoglobulins). Although depletion methods have been used to address this, these can lead to the concomitant removal of non-targeted protein species, and thus raise issues of specificity, reproducibility, and the capacity for meaningful quantitative analyses. Altering the native stoichiometry of the proteome components may thus yield a more complex series of issues than dealing directly with the inherent complexity of the sample. Hence, here we targeted method refinements so as to ensure optimum resolution of serum proteomes via a top down two-dimensional gel electrophoresis (2DE) approach that enables the routine assessment of proteoforms and is fully compatible with subsequent mass spectrometric analyses. Testing included various fractionation and non-fractionation approaches. The data show that resolving 500 µg protein on 17 cm 3-10 non-linear immobilised pH gradient strips in the first dimension followed by second dimension resolution on 7-20% gradient gels with a combination of lithium dodecyl sulfate (LDS) and sodium dodecyl sulfate (SDS) detergents markedly improves the resolution and detection of proteoforms in serum. In addition, well established third dimension electrophoretic separations in combination with deep imaging further contributed to the best available resolution, detection, and thus quantitative top-down analysis of serum proteomes.
NASA Astrophysics Data System (ADS)
Brinkman, Elliot; Seekamp, Erin; Davenport, Mae A.; Brehm, Joan M.
2012-10-01
Community capacity for watershed management has emerged as an important topic for the conservation of water resources. While much of the literature on community capacity has focused primarily on theory construction, there have been few efforts to quantitatively assess community capacity variables and constructs, particularly for watershed management and conservation. This study seeks to identify predictors of community capacity for watershed conservation in southwestern Illinois. A subwatershed-scale survey of residents from four communities located within the Lower Kaskaskia River watershed of southwestern Illinois was administered to measure three specific capacity variables: community empowerment, shared vision and collective action. Principal component analysis revealed key dimensions of each variable. Specifically, collective action was characterized by items relating to collaborative governance and social networks, community empowerment was characterized by items relating to community competency and a sense of responsibility and shared vision was characterized by items relating to perceptions of environmental threats, issues with development, environmental sense of place and quality of life. From the emerging factors, composite measures were calculated to determine the extent to which each variable contributed to community capacity. A stepwise regression revealed that community empowerment explained most of the variability in the composite measure of community capacity for watershed conservation. This study contributes to the theoretical understanding of community capacity by quantifying the role of collective action, community empowerment and shared vision in community capacity, highlighting the need for multilevel interaction to address watershed issues.
Farr, Deeonna E; Brandt, Heather M; Comer, Kimberly D; Jackson, Dawnyéa D; Pandya, Kinjal; Friedman, Daniela B; Ureda, John R; Williams, Deloris G; Scott, Dolores B; Green, Wanda; Hébert, James R
2015-09-01
Increasing the participation of Blacks in cancer research is a vital component of a strategy to reduce racial inequities in cancer burden. Community-based participatory research (CBPR) is especially well-suited to advancing our knowledge of factors that influence research participation to ultimately address cancer-related health inequities. A paucity of literature focuses on the role of structural factors limiting participation in cancer research. As part of a larger CBPR project, we used survey data from a statewide cancer needs assessment of a Black faith community to examine the influence of structural factors on attitudes toward research and the contributions of both structural and attitudinal factors to whether individuals participate in research. Regression analyses and non-parametric statistics were conducted on data from 727 adult survey respondents. Structural factors, such as having health insurance coverage, experiencing discrimination during health care encounters, and locale, predicted belief in the benefits, but not the risks, of research participation. Positive attitudes toward research predicted intention to participate in cancer research. Significant differences in structural and attitudinal factors were found between cancer research participants and non-participants; however, directionality is confounded by the cross-sectional survey design and causality cannot be determined. This study points to a complex interplay of structural and attitudinal factors in research participation, as well as the need for additional quantitative examination of the various types of factors that influence research participation in Black communities.
Cline, James P; Von Dreele, Robert B; Winburn, Ryan; Stephens, Peter W; Filliben, James J
2011-07-01
A non-diffracting surface layer exists at any boundary of a crystal and can comprise a mass fraction of several percent in a finely divided solid. This has led to the long-standing issue of amorphous content in standards for quantitative phase analysis (QPA). NIST standard reference material (SRM) 676a is a corundum (α-Al(2)O(3)) powder, certified with respect to phase purity for use as an internal standard in powder diffraction QPA. The amorphous content of SRM 676a is determined by comparing diffraction data from mixtures with samples of silicon powders that were engineered to vary their specific surface area. Under the (supported) assumption that the thickness of an amorphous surface layer on Si was invariant, this provided a method to control the crystalline/amorphous ratio of the silicon components of 50/50 weight mixtures of SRM 676a with silicon. Powder diffraction experiments utilizing neutron time-of-flight and 25 keV and 67 keV X-ray energies quantified the crystalline phase fractions from a series of specimens. Results from Rietveld analyses, which included a model for extinction effects in the silicon, of these data were extrapolated to the limit of zero amorphous content of the Si powder. The certified phase purity of SRM 676a is 99.02% ± 1.11% (95% confidence interval). This novel certification method permits quantification of amorphous content for any sample of interest, by spiking with SRM 676a.
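The extrapolation step, fitting the apparent phase purity against a proxy for the silicon amorphous content and reading off the zero-amorphous limit, can be illustrated with a short sketch. The numbers below are invented for illustration (chosen to land near the certified value); they are not the SRM 676a certification data.

```python
import numpy as np

# Hypothetical data: apparent SRM 676a phase purity measured against Si
# powders of increasing specific surface area (a proxy for the mass of the
# non-diffracting surface layer). All values are illustrative only.
surface_area = np.array([0.5, 1.0, 2.0, 4.0, 6.0])          # m^2/g
apparent_purity = np.array([98.6, 98.2, 97.4, 95.8, 94.2])  # wt %

# Linear fit; the intercept is the value extrapolated to zero surface
# area, i.e. the limit of zero amorphous content in the Si spike.
slope, intercept = np.polyfit(surface_area, apparent_purity, 1)
print(round(intercept, 2))  # -> 99.0 for this illustrative data set
```

The certified value (99.02% ± 1.11%) came from Rietveld-refined neutron and X-ray data rather than a simple line fit, but the zero-amorphous extrapolation follows this logic.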
NASA Lewis Research Center Futuring Workshop
NASA Technical Reports Server (NTRS)
Boroush, Mark; Stover, John; Thomas, Charles
1987-01-01
On October 21 and 22, 1986, the Futures Group ran a two-day Futuring Workshop on the premises of NASA Lewis Research Center. The workshop had four main goals: to acquaint participants with the general history of technology forecasting; to familiarize participants with the range of forecasting methodologies; to acquaint participants with the range of applicability, strengths, and limitations of each method; and to offer participants some hands-on experience by working through both judgmental and quantitative case studies. Among the topics addressed during this workshop were: information sources; judgmental techniques; quantitative techniques; merger of judgment with quantitative measurement; data collection methods; and dealing with uncertainty.
NASA Astrophysics Data System (ADS)
Jacomassa, Fábio André F.; Pizo, Marco Aurélio
2010-09-01
Although the overlap in fruit diet between birds and bats is low, they sometimes consume and compete for fruits of the same plant species. What is poorly known is how birds and bats compare with each other in the effectiveness of seed dispersal. In this paper we contrasted birds and bats with respect to the quantitative (the amount of fruits removed from plants) and qualitative (germination performance of seeds) components of the seed dispersal of Solanum granuloso-leprosum, a pioneer, small-seeded tree of open areas and forest edges in southern Brazil. We tagged fruits on the plants and monitored their removal by day and night. We compared the final germination percentage and speed of germination of seeds ingested by birds and bats with those of non-ingested, control seeds. While bats removed more fruits than birds, performing better in the quantitative component, birds improved the germination performance of seeds, an aspect of the qualitative component of seed dispersal effectiveness. Although bats are more likely to deposit seeds in highly disturbed sites that favor the recruitment of pioneer plant species, birds frequent forest edges, which are also suitable habitat for S. granuloso-leprosum. We concluded that birds and bats are not 'redundant' seed dispersers for S. granuloso-leprosum because in conjunction they may enhance its recruitment by diversifying the microsites where seeds are deposited, in addition performing different ecological functions in terms of quantity and quality of dispersal.
49 CFR 236.1015 - PTC Safety Plan content requirements and PTC System Certification.
Code of Federal Regulations, 2014 CFR
2014-10-01
... supplied from other business systems needed to execute the braking algorithm, survey data needed for... require submission of a quantitative risk assessment addressing these potential errors. Effective Date...
Collaborative Initiative in Biomedical Imaging to Study Complex Diseases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Weili; Fiddy, Michael A.
2012-03-31
The work reported addressed these topics: Fluorescence imaging; Optical coherence tomography; X-ray interferometer/phase imaging system; Quantitative imaging from scattered fields, Terahertz imaging and spectroscopy; and Multiphoton and Raman microscopy.
Rezeli, Melinda; Végvári, Akos; Ottervald, Jan; Olsson, Tomas; Laurell, Thomas; Marko-Varga, György
2011-12-10
As a proof-of-principle study, a multiple reaction monitoring (MRM) assay was developed for quantitation of proteotypic peptides, representing seven plasma proteins associated with inflammation (complement components and C-reactive protein). The assay development and the sample analysis were performed on a linear ion trap mass spectrometer. We were able to quantify 5 of the 7 target proteins in depleted plasma digests with reasonable reproducibility over a 2 orders of magnitude linear range (RSD≤25%). The assay panel was utilized for the analysis of a small multiple sclerosis sample cohort with 10 diseased and 8 control patients. Copyright © 2011 Elsevier B.V. All rights reserved.
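The two acceptance criteria mentioned, linearity over roughly two orders of magnitude and replicate reproducibility with RSD ≤ 25%, can be checked with a short script. The concentrations and peak areas below are simulated stand-ins, not the assay's data, and the units are assumed.

```python
import numpy as np

rng = np.random.default_rng(3)

# Six calibration levels spanning 2 orders of magnitude, 3 replicate
# injections each (all values simulated; fmol-on-column units assumed).
conc = np.array([1.0, 2.0, 5.0, 10.0, 50.0, 100.0])
areas = np.array([conc * 1e4 * (1 + rng.normal(scale=0.08, size=conc.size))
                  for _ in range(3)])          # replicate peak areas

mean_area = areas.mean(axis=0)
rsd = 100 * areas.std(axis=0, ddof=1) / mean_area  # %RSD per level

r = np.corrcoef(conc, mean_area)[0, 1]             # linearity check
print(round(r, 3), rsd.round(1))
```

In a real MRM workflow the same computation would be run per proteotypic peptide, with levels failing the RSD criterion excluded from the reportable range.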
MBTH: A novel approach to rapid, spectrophotometric quantitation of total algal carbohydrates.
Van Wychen, Stefanie; Long, William; Black, Stuart K; Laurens, Lieve M L
2017-02-01
A high-throughput and robust application of the 3-methyl-2-benzothiazolinone hydrazone (MBTH) method was developed for carbohydrate determination in microalgae. The traditional phenol-sulfuric acid method to quantify carbohydrates is strongly affected by algal biochemical components and exhibits a highly variable response to microalgal monosaccharides. We present a novel use of the MBTH method to accurately quantify carbohydrates in hydrolyzate after acid hydrolysis of algal biomass, without a need for neutralization. The MBTH method demonstrated consistent and sensitive quantitation of algae-specific monosaccharides down to 5 μg mL-1 without interference from other components of the algal acidic hydrolyzate. Copyright © 2016 Elsevier Inc. All rights reserved.
Li, Jie; Zhang, Ji; Jin, Hang; Wang, Yuan-Zhong; Huang, Heng-Yu
2017-01-01
Millions of people are killed by viral hepatitis every year worldwide, yet many of the relevant medicines are too expensive to purchase. Swertia mileensis, a medicinal plant used for hepatitis in the traditional Chinese medicine system, has been gradually vanishing because of overexploitation. To find substitutes for S. mileensis and reduce the cost of drugs for hepatitis patients, the similarity of phytochemical constituents between S. mileensis and three other Swertia species was compared. Ultra-high-performance liquid chromatography (UHPLC) and ultraviolet-vis fingerprints of the four Swertia species were developed. Methanol extracts of the stems and leaves were used as samples to establish the fingerprints. A calibration curve was drawn for quantitative analysis of swertiamarin. The UHPLC data were evaluated statistically using similarity analysis and principal component analysis. The results show a significant difference in the 204-290 nm region of the ultraviolet fingerprint. Swertiamarin, the only common peak, was identified in the chromatographic fingerprints of the four Swertia species. The quantitative analysis suggested that the highest concentration of swertiamarin is in S. davidii. The similarity indexes between different samples were mostly below 0.60. In the principal component analysis, separate points represent not only the distinction among different species but also chemical discrepancies in content between the stems and leaves of the same species. S. angustifolia, S. davidii, and S. punicea are not suitable as substitutes for S. mileensis because of their remarkable differences both overall and in individual plant parts. To address the issues of substitutes and the high cost of drugs, more studies need to be undertaken.
The UHPLC fingerprint method indicated significant differences in the chemical ingredients of the four Swertia plants. Swertiamarin is the only common compound among the four plants, with the highest content found in the leaves of S. davidii. The clear diversity among the four plants was displayed from a comprehensive point of view through similarity analysis and PCA. The UV fingerprint method compensates for the limitation that the UHPLC fingerprint reflects information on secoiridoid glycosides only. Abbreviations used: UHPLC: ultra-high-performance liquid chromatography; UV-vis: ultraviolet-vis; HBV: hepatitis B virus; DNA: deoxyribonucleic acid; PCA: principal component analysis; D-GalN: D-galactosamine; BCG: Bacille Calmette-Guerin; LPS: lipopolysaccharide.
NASA Astrophysics Data System (ADS)
Pratt, Jon R.; Kramar, John A.; Newell, David B.; Smith, Douglas T.
2005-05-01
If nanomechanical testing is to evolve into a tool for process and quality control in semiconductor fabrication, great advances in throughput, repeatability, and accuracy of the associated instruments and measurements will be required. A recent grant awarded by the NIST Advanced Technology Program seeks to address the throughput issue by developing a high-speed AFM-based platform for quantitative nanomechanical measurements. The following paper speaks to the issue of quantitative accuracy by presenting an overview of various standards and techniques under development at NIST and other national metrology institutes (NMIs) that can provide a metrological basis for nanomechanical testing. The infrastructure we describe places firm emphasis on traceability to the International System of Units, paving the way for truly quantitative, rather than qualitative, physical property testing.
Integrated viral clearance strategies-reflecting on the present, projecting to the future.
Roush, David J
2018-01-20
Viral clearance and inactivation are critical steps in ensuring the safety of biological products derived from mammalian cell culture and are a component of an adventitious agent control strategy which spans both upstream and downstream processes. Although these approaches have been sufficient to support the development of biologics to date, the empirical and semi-quantitative nature of the approach leaves some potential gaps. For example, the concept of performing a quantitative risk assessment for the downstream components of virus safety was introduced in ICH Q5A for XMuLV. An ideal future state would be to perform a similar quantitative risk assessment for a range of viruses based on an assessment of potential virus risk in both upstream and downstream processes. This assessment combined with an integrated control strategy (including monitoring) would be extremely beneficial in minimizing potential adventitious agent risks. Significant progress has been achieved towards this goal in the last several years including recent advances in quantification of virus sequences in cell banks (ADVTIG), development of truly modular or generic viral clearance claims for specific unit operations, enhanced controls of upstream media (HTST/nanofiltration) and the use of RVLP for in-process monitoring. The recent shift towards continuous processing has the potential to enhance the criticality of in-line monitoring and the complexity of viral clearance and inactivation (owing to a wide range of potential 'worst case' viral clearance scenarios). However, gaps exist in, firstly, the ability to quantify potential virus risk levels in process streams in real-time, secondly, mechanistic understanding of virus/chromatography media interactions, and thirdly, mechanistic understanding of virus/filter interactions. 
Some new technologies may also need to be developed to allow for real-time confirmation of virus inactivation and clearance to support process development (both batch and continuous) and assessment of the impact of process deviations during manufacturing. This review paper provides an overview of the current state of an overall integrated control strategy for upstream and downstream processing and highlights the investments that could be pursued to achieve the future state of a quantitative virus risk assessment for a range of viruses. One potential approach to address these gaps is the use of data mining from large, comprehensive and diverse data sets to establish heuristic rules for virus detection, clearance and inactivation followed by specific hypothesis-driven experiments for cases that fall outside of the normal paradigm. Once this approach reaches a mature state suitable for implementation, there is an opportunity to update regulatory guidance (e.g. ICH Q5A) accordingly. Copyright © 2018 Elsevier Ltd. All rights reserved.
Grindstaff, Quirinus G.
1992-01-01
Described is a new gas chromatograph-mass spectrometer (GC/MS) system and method for quantitative analysis of reactive chemical compounds. All components of such a GC/MS system external to the oven of the gas chromatograph are programmably temperature controlled to operate at a volatilization temperature specific to the compound(s) sought to be separated and measured.
ERIC Educational Resources Information Center
Wilder, Esther Isabelle
2010-01-01
Quantitative and computer literacy are increasingly recognized as core components of a liberal education in sociology. This study draws on student, faculty, and alumni questionnaires to identify the kinds of quantitative literacy skills that are perceived to be most critical for students enrolled in sociology courses. Respondents at Lehman College…
2010-01-28
Personnel and National Security: A Quantitative Approach. Presentation by the Director General Military Personnel Research and Analysis (DGMPRA), Defence Research and Development Canada, to the MORS Working Group, 25-28 January 2010, Johns Hopkins University Applied Physics Laboratory. Considerations addressed include position categories (hard, generic or "soft", advanced training), language requirements, and the need for military, combat, and/or field…
Catalog of components for electric and hybrid vehicle propulsion systems
NASA Technical Reports Server (NTRS)
Eissler, H. C.
1981-01-01
This catalog of commercially available electric and hybrid vehicle propulsion system components is intended for designers and builders of these vehicles and contains 50 categories of components. These categories include those components used between the battery terminals and the output axle hub, as well as some auxiliary equipment. An index of the components and a listing of the suppliers and their addresses and phone numbers are included.
19 CFR 10.843 - Articles eligible for duty-free treatment.
Code of Federal Regulations, 2010 CFR
2010-04-01
... paragraphs (a) through (j) of this section that are imported directly from Haiti or the Dominican Republic...-shape, and yarns, subject to the applicable quantitative limits set forth in U.S. Note 6(g), Subchapter...-shape, and yarns, without regard to the source of the fabric, fabric components, components knit-to...
19 CFR 10.843 - Articles eligible for duty-free treatment.
Code of Federal Regulations, 2011 CFR
2011-04-01
... paragraphs (a) through (j) of this section that are imported directly from Haiti or the Dominican Republic...-shape, and yarns, subject to the applicable quantitative limits set forth in U.S. Note 6(g), Subchapter...-shape, and yarns, without regard to the source of the fabric, fabric components, components knit-to...
ERIC Educational Resources Information Center
Smith, Claire F.; Hall, Samuel; Border, Scott; Adds, Philip J.; Finn, Gabrielle M.
2015-01-01
There is increasing recognition of multiprofessional learning in anatomy and its role in medical and healthcare professions. This study utilized two components to investigate anatomy interprofessional education (AIPE) in the United Kingdom and Ireland. First, a survey involving qualitative and quantitative components asked Heads of Anatomy to…
Pina, Violeta; Fuentes, Luis J.; Castillo, Alejandro; Diamantopoulou, Sofia
2014-01-01
It is assumed that children’s performance in mathematical abilities is influenced by several factors, such as working memory (WM), verbal ability, intelligence, and socioeconomic status. The present study explored the contribution of those factors to mathematical performance, taking a componential view of both WM and mathematics. We explored the relationships between different WM components (verbal and spatial), using tasks that make differential recruitment of the central executive, and simple and complex mathematical skills in a sample of 102 children in grades 4–6. The main findings point to a relationship between the verbal WM component and complex word arithmetic problems, whereas language and non-verbal intelligence were associated with knowledge of quantitative concepts and arithmetic ability. The spatial WM component was associated with the subtest Series, whereas the verbal component was associated with the subtest Concepts. The results also suggest a positive relationship between parental educational level and children’s performance on Quantitative Concepts. These findings suggest that specific cognitive skills might be trained in order to improve different aspects of mathematical ability. PMID:24847306
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental solution methodologies in the form of computational models, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.
Venkatakrishnan, K; Ecsedy, J A
2017-01-01
Clinical pharmacodynamic evaluation is a key component of the "pharmacologic audit trail" in oncology drug development. We posit that its value can and should be greatly enhanced via application of a robust quantitative pharmacology framework informed by biologically mechanistic considerations. Herein, we illustrate examples of intersectional blindspots across the disciplines of quantitative pharmacology and translational science and offer a roadmap aimed at enhancing the caliber of clinical pharmacodynamic research in the development of oncology therapeutics. © 2016 American Society for Clinical Pharmacology and Therapeutics.
NASA Technical Reports Server (NTRS)
1990-01-01
Project Exodus is an in-depth study to identify and address the basic problems of a manned mission to Mars. The most important problems concern propulsion, life support, structure, trajectory, and finance. Exodus will employ a passenger ship, cargo ship, and landing craft for the journey to Mars. These three major components of the mission design are discussed separately. Within each component the design characteristics of structures, trajectory, and propulsion are addressed. The design characteristics of life support are mentioned only in those sections requiring it.
Targeting Transcription Elongation Machinery for Breast Cancer Therapy
2016-05-01
Annual report, May 2016. Principal Investigator: Zhou. Contracting organization: University of California, Berkeley, Berkeley, CA 94704. …without affecting the Brd4 or P-TEFb molecules. We have employed the CRISPR/Cas9 genome-editing tool to knock out the gene encoding the SEC component AFF4.
ERIC Educational Resources Information Center
Wang, Victor C. X.
2010-01-01
Developing curricula requires instructors to take several factors into account. These factors can be viewed as critical components of curriculum development for career and technical education (CTE) instructors. Without adequately addressing critical components such as curriculum history, curriculum theory, curriculum philosophies, curriculum…
The Academic Library and the Culture for Learning
ERIC Educational Resources Information Center
Hufford, Jon R.
2016-01-01
Several components of a campus culture affect learning, yet assessments regularly neglect some of them. Academic librarians should evaluate how they impact courses and student learning through their support of these neglected components. Assessment goals to address some of the components include measuring the level of support for courses with…
Individual Differences in Dual Task Performance. Final Report.
ERIC Educational Resources Information Center
Lansman, Marcy; Hunt, Earl
This report summarizes the research results and provides a reference. The basic question addressed was, "Is performance on multi-component tasks predicted by performance on the individual components performed separately?" In the first series of experiments, a dual task involving memory and verbal processing components was used to predict a…
40 CFR 1033.645 - Non-OEM component certification program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... needs of your component. (iv) An engineering analysis (including test data in some cases) demonstrating to us that your component will not cause emissions to increase. The analysis must address both low-hour and end-of-useful life emissions. The amount of information required for this analysis is less...
40 CFR 1033.645 - Non-OEM component certification program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... needs of your component. (iv) An engineering analysis (including test data in some cases) demonstrating to us that your component will not cause emissions to increase. The analysis must address both low-hour and end-of-useful life emissions. The amount of information required for this analysis is less...
Code of Federal Regulations, 2010 CFR
2010-01-01
... management plan, and addition of multiple-site components. 921.33 Section 921.33 Commerce and Foreign Trade... management plan, and addition of multiple-site components. (a) Changes in the boundary of a Reserve and major... management plan shall address goals and objectives for all components of the multi-site Reserve and the...
The number processing and calculation system: evidence from cognitive neuropsychology.
Salguero-Alcañiz, M P; Alameda-Bailén, J R
2015-04-01
Cognitive neuropsychology focuses on the concepts of dissociation and double dissociation. The performance of number processing and calculation tasks by patients with acquired brain injury can be used to characterise the way in which the healthy cognitive system manipulates number symbols and quantities. The objective of this study is to determine the components of the numerical processing and calculation system. Participants consisted of 6 patients with acquired brain injuries in different cerebral localisations. We used Batería de evaluación del procesamiento numérico y el cálculo, a battery assessing number processing and calculation. Data was analysed using the difference in proportions test. Quantitative numerical knowledge is independent from number transcoding, qualitative numerical knowledge, and calculation. Recodification is independent from qualitative numerical knowledge and calculation. Quantitative numerical knowledge and calculation are also independent functions. The number processing and calculation system comprises at least 4 components that operate independently: quantitative numerical knowledge, number transcoding, qualitative numerical knowledge, and calculation. Therefore, each one may be damaged selectively without affecting the functioning of another. According to the main models of number processing and calculation, each component has different characteristics and cerebral localisations. Copyright © 2013 Sociedad Española de Neurología. Published by Elsevier Espana. All rights reserved.
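The difference-in-proportions analysis used to establish dissociations between components can be sketched as a standard two-proportion z-test. The accuracy counts below are hypothetical, not the patients' actual scores.

```python
from math import erf, sqrt

def two_proportion_z_test(k1, n1, k2, n2):
    """Two-sided z-test for the difference between two independent
    proportions, using the pooled estimate under H0: p1 == p2."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2*P(Z > |z|)
    return z, p_value

# Hypothetical dissociation: a patient scores 18/20 on a quantitative
# numerical knowledge task but only 5/20 on a transcoding task.
z, p = two_proportion_z_test(18, 20, 5, 20)
print(round(z, 2), round(p, 5))
```

A significant difference in one direction for one patient, and in the opposite direction for another, is the double-dissociation pattern the abstract describes.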
Sonoda, T; Ona, T; Yokoi, H; Ishida, Y; Ohtani, H; Tsuge, S
2001-11-15
Detailed quantitative analysis of lignin monomer composition, comprising p-coumaryl, coniferyl, and sinapyl alcohols and p-coumaraldehyde, coniferaldehyde, and sinapaldehyde, in plants has not been fully achieved, mainly because of artifact formation during the lignin isolation procedure, partial loss of the lignin components inherent in chemical degradative methods, and difficulty in interpreting the complex spectra generally observed for the lignin components. Here we propose a new method to quantify lignin monomer composition in detail by pyrolysis-gas chromatography (Py-GC) using acetylated lignin samples. The lignin acetylation procedure prevents the secondary formation of cinnamaldehydes from the corresponding alcohol forms during pyrolysis, which is otherwise unavoidable to some extent in the conventional Py-GC process. On the basis of the characteristic peaks in the pyrograms of the acetylated samples, lignin monomer compositions in various dehydrogenative polymers (DHPs) used as lignin model compounds were determined, taking even minor components such as cinnamaldehydes into consideration. The compositions observed by Py-GC were in good agreement with the lignin monomer contents supplied during DHP synthesis. The new Py-GC method combined with sample preacetylation allowed accurate quantitative analysis of detailed lignin monomer composition using microgram quantities of extractive-free plant samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Joshua M.
Manufacturing tasks that are deemed too hazardous for workers require the use of automation, robotics, and/or other remote handling tools. The associated hazards may be radiological or nonradiological, and based on the characteristics of the environment and processing, a design may necessitate robotic labor, human labor, or both. Other factors such as cost, ergonomics, maintenance, and efficiency also affect task allocation and other design choices. Handling the tradeoffs of these factors can be complex, and lack of experience can be an issue when trying to determine whether, and which, feasible automation/robotics options exist. To address this problem, we utilize common engineering design approaches adapted for manufacturing system design in hazardous environments. We limit our scope to the conceptual and embodiment design stages, specifically a computational algorithm for concept generation and early design evaluation. For concept generation, we first develop the functional model or function structure for the process, using the common 'verb-noun' format for describing function. A common language, or functional basis, for manufacturing was developed and utilized to formalize function descriptions and guide rules for function decomposition. Potential components for embodiment are also grouped in terms of this functional language and are stored in a database. The properties of each component are given as quantitative and qualitative criteria. Operators are also rated on task-relevant criteria, which are used to address task compatibility. Through the gathering of process requirements and constraints, construction of the component database, and development of the manufacturing basis and rule set, design knowledge is stored and made available for computer use.
Thus, once the higher level process functions are defined, the computer can automate the synthesis of new design concepts through alternating steps of embodiment and function structure updates/decomposition. In the process, criteria guide the allocation of functions to components and operators and help ensure compatibility and feasibility. Through multiple function assignment options and varied function structures, multiple design concepts are created. All of the generated designs are then evaluated on a number of relevant criteria: cost, dose, ergonomics, hazards, efficiency, etc. These criteria are computed using physical properties and parameters of each system, based on the qualities an engineer would use to make evaluations. Nuclear processes such as oxide conversion and electrorefining are utilized to aid algorithm development and provide test cases for the completed program. Through our approach, we capture design knowledge related to manufacturing and other operations in hazardous environments, enabling a computational program to automatically generate and evaluate system design concepts.
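The concept-generation and evaluation loop described above can be sketched in miniature: enumerate one component (or operator) per required function from a database, then rank the resulting concepts by weighted criteria. The component names, criteria values, and weights below are invented for illustration, not the program's actual database.

```python
from itertools import product

# Toy component database: each candidate lists the function it serves and
# scores for a few evaluation criteria (all names and values are invented).
db = {
    "transfer material": [
        {"name": "robot arm", "cost": 8, "dose": 1, "ergonomics": 9},
        {"name": "operator + tongs", "cost": 2, "dose": 6, "ergonomics": 4},
    ],
    "weigh material": [
        {"name": "in-cell balance", "cost": 3, "dose": 2, "ergonomics": 8},
    ],
}

# Weighted-sum evaluation: cost and dose are penalties, dose most heavily.
weights = {"cost": -1.0, "dose": -2.0, "ergonomics": 1.0}

def score(concept):
    """Weighted sum over criteria across all components; higher is better."""
    return sum(w * sum(c[k] for c in concept) for k, w in weights.items())

# Enumerate one component per required function, then rank the concepts.
concepts = list(product(*db.values()))
best = max(concepts, key=score)
print([c["name"] for c in best])
```

With the dose penalty dominating, the fully automated option wins here; shifting the weights toward cost would flip the allocation toward the human operator, which is exactly the tradeoff the algorithm is meant to surface.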
Quantitative Methods for Determining U.S. Air Force Crew Cushion Comfort
2006-09-01
Directorate, Biosciences and Protection Division, Biomechanics Branch, Wright-Patterson AFB OH 45433-7947. [Report documentation page boilerplate removed.] Workstations were constructed utilizing ejection seat mockups and foot pedal assemblies modified to simulate the ACES II seat in the F-16 for long-term flight.
Montanini, R; Freni, F; Rossi, G L
2012-09-01
This paper reports one of the first experimental results on the application of ultrasound activated lock-in vibrothermography for quantitative assessment of buried flaws in complex cast parts. The use of amplitude modulated ultrasonic heat generation allowed selective response of defective areas within the part, as the defect itself is turned into a local thermal wave emitter. Quantitative evaluation of hidden damage was accomplished by estimating independently both the area and the depth extension of the buried flaws, while x-ray 3D computed tomography was used as reference for sizing accuracy assessment. To retrieve the flaw's area, a simple yet effective histogram-based phase image segmentation algorithm with automatic pixel classification has been developed. A clear correlation was found between the thermal (phase) signature measured by the infrared camera on the target surface and the actual mean cross-section area of the flaw. Due to the very fast cycle time (<30 s/part), the method could potentially be applied for 100% quality control of casting components.
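The histogram-based phase-image segmentation described above can be approximated with a standard between-class-variance (Otsu-style) threshold on the phase values. The synthetic image, bin count, and all parameters below are illustrative assumptions, not the authors' algorithm or data:

```python
import numpy as np

def otsu_threshold(phase, bins=64):
    """Pick the threshold maximizing between-class variance of the histogram."""
    hist, edges = np.histogram(phase.ravel(), bins=bins)
    p = hist / hist.sum()                       # bin probabilities
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                           # weight of "background" class
    w1 = 1.0 - w0
    cum_mu = np.cumsum(p * centers)
    mu_t = cum_mu[-1]
    mu0 = cum_mu / np.where(w0 > 0, w0, 1)      # class means (guarded)
    mu1 = (mu_t - cum_mu) / np.where(w1 > 0, w1, 1)
    sigma_b = w0 * w1 * (mu0 - mu1) ** 2        # between-class variance
    return centers[np.argmax(sigma_b)]

# Synthetic phase image: background near 0 rad, a 10x10 defect near 1.5 rad.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 0.1, (64, 64))
img[20:30, 20:30] = rng.normal(1.5, 0.1, (10, 10))
t = otsu_threshold(img)
defect_area_px = int((img > t).sum())           # pixels classified as defective
```

The pixel count above the threshold, scaled by the camera's pixel footprint, would give a flaw-area estimate in physical units.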
Ancestral effect on HOMA-IR levels quantitated in an American population of Mexican origin.
Qu, Hui-Qi; Li, Quan; Lu, Yang; Hanis, Craig L; Fisher-Hoch, Susan P; McCormick, Joseph B
2012-12-01
An elevated insulin resistance index (homeostasis model assessment of insulin resistance [HOMA-IR]) is more commonly seen in the Mexican American population than in European populations. We report quantitative ancestral effects within a Mexican American population, and we correlate ancestral components with HOMA-IR. We performed ancestral analysis in 1,551 participants of the Cameron County Hispanic Cohort by genotyping 103 ancestry-informative markers (AIMs). These AIMs allow determination of the percentage (0-100%) ancestry from three major continental populations, i.e., European, African, and Amerindian. We observed that predominantly Amerindian ancestral components were associated with increased HOMA-IR (β = 0.124, P = 1.64 × 10(-7)). The correlation was more significant in males (Amerindian β = 0.165, P = 5.08 × 10(-7)) than in females (Amerindian β = 0.079, P = 0.019). This unique study design demonstrates how genomic markers for quantitative ancestral information can be used in admixed populations to predict phenotypic traits such as insulin resistance.
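The reported association can be illustrated with an ordinary least-squares fit of a phenotype on an ancestry proportion. The data below are simulated around the abstract's β = 0.124; the sample size, intercept, and noise level are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
amerindian = rng.uniform(0, 1, n)   # Amerindian ancestry proportion per subject

# Simulated phenotype with slope 0.124 (value taken from the abstract).
log_homa = 0.5 + 0.124 * amerindian + rng.normal(0, 0.1, n)

# OLS with an intercept column; beta[1] recovers the ancestry effect.
X = np.column_stack([np.ones(n), amerindian])
beta, *_ = np.linalg.lstsq(X, log_homa, rcond=None)
```

The study's analysis additionally conditions on sex and other covariates, which would appear as extra columns of `X`.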
The role of quantitative safety evaluation in regulatory decision making of drugs.
Chakravarty, Aloka G; Izem, Rima; Keeton, Stephine; Kim, Clara Y; Levenson, Mark S; Soukup, Mat
2016-01-01
Evaluation of safety is a critical component of drug review at the US Food and Drug Administration (FDA). Statisticians are playing an increasingly visible role in quantitative safety evaluation and regulatory decision-making. This article reviews the history and the recent events relating to quantitative drug safety evaluation at the FDA. The article then focuses on five active areas of quantitative drug safety evaluation and the role Division of Biometrics VII (DBVII) plays in these areas, namely meta-analysis for safety evaluation, large safety outcome trials, post-marketing requirements (PMRs), the Sentinel Initiative, and the evaluation of risk from extended/long-acting opioids. This article will focus chiefly on developments related to quantitative drug safety evaluation and not on the many additional developments in drug safety in general.
NASA Astrophysics Data System (ADS)
Bandrowski, D.; Lai, Y.; Bradley, N.; Gaeuman, D. A.; Murauskas, J.; Som, N. A.; Martin, A.; Goodman, D.; Alvarez, J.
2014-12-01
In the field of river restoration sciences there is a growing need for analytical modeling tools and quantitative processes to help identify and prioritize project sites. 2D hydraulic models have become more common in recent years and with the availability of robust data sets and computing technology, it is now possible to evaluate large river systems at the reach scale. The Trinity River Restoration Program is now analyzing a 40 mile segment of the Trinity River to determine priority and implementation sequencing for its Phase II rehabilitation projects. A comprehensive approach and quantitative tool has recently been developed to analyze this complex river system, referred to as 2D-Hydrodynamic Based Logic Modeling (2D-HBLM). This tool utilizes various hydraulic output parameters combined with biological, ecological, and physical metrics at user-defined spatial scales. These metrics and their associated algorithms are the underpinnings of the 2D-HBLM habitat module used to evaluate geomorphic characteristics, riverine processes, and habitat complexity. The habitat metrics are further integrated into a comprehensive Logic Model framework to perform statistical analyses to assess project prioritization. The Logic Model will analyze various potential project sites by evaluating connectivity using principal component methods. The 2D-HBLM tool will help inform management and decision makers by using a quantitative process to optimize desired response variables while balancing important limiting factors in determining the highest priority locations within the river corridor to implement restoration projects. Effective river restoration prioritization starts with well-crafted goals that identify the biological objectives, address underlying causes of habitat change, and recognize that social, economic, and land use limiting factors may constrain restoration options (Beechie et al., 2008).
Applying natural resources management actions, like restoration prioritization, is essential for successful project implementation (Conroy and Peterson, 2013). Evaluating tradeoffs and examining alternatives to improve fish habitat through optimization modeling is not just a trend but a scientific strategy that management should embrace and apply in its decision framework.
1, 2, 3, 4: infusing quantitative literacy into introductory biology.
Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra
2010-01-01
Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.
van der Put, Robert M F; de Haan, Alex; van den IJssel, Jan G M; Hamidi, Ahd; Beurret, Michel
2015-11-27
Due to the rapidly increasing introduction of Haemophilus influenzae type b (Hib) and other conjugate vaccines worldwide during the last decade, reliable and robust analytical methods are needed for the quantitative monitoring of intermediate samples generated during fermentation (upstream processing, USP) and purification (downstream processing, DSP) of polysaccharide vaccine components. This study describes the quantitative characterization of in-process control (IPC) samples generated during the fermentation and purification of the capsular polysaccharide (CPS), polyribosyl-ribitol-phosphate (PRP), derived from Hib. Reliable quantitative methods are necessary for all stages of production; otherwise accurate process monitoring and validation is not possible. Prior to the availability of high performance anion exchange chromatography methods, this polysaccharide was predominantly quantified either with immunochemical methods, or with the colorimetric orcinol method, which shows interference from fermentation medium components and reagents used during purification. In addition to an improved high performance anion exchange chromatography-pulsed amperometric detection (HPAEC-PAD) method, using a modified gradient elution, both the orcinol assay and high performance size exclusion chromatography (HPSEC) analyses were evaluated. For DSP samples, it was found that the correlation between the results obtained by HPAEC-PAD specific quantification of the PRP monomeric repeat unit released by alkaline hydrolysis, and those from the orcinol method was high (R(2)=0.8762), and that it was lower between HPAEC-PAD and HPSEC results. Additionally, HPSEC analysis of USP samples yielded surprisingly comparable results to those obtained by HPAEC-PAD. In the early part of the fermentation, medium components interfered with the different types of analysis, but quantitative HPSEC data could still be obtained, although lacking the specificity of the HPAEC-PAD method.
Thus, the HPAEC-PAD method has the advantage of giving a specific response compared to the orcinol assay and HPSEC, and does not show interference from various components that can be present in intermediate and purified PRP samples. Copyright © 2014 Elsevier Ltd. All rights reserved.
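The method-agreement comparison above (e.g., the reported R² = 0.8762 between HPAEC-PAD and orcinol results) amounts to a squared Pearson correlation between paired quantifications. The concentrations below are illustrative placeholders, not data from the study:

```python
import numpy as np

# Paired PRP quantifications of the same samples by two methods (mg/mL, illustrative).
hpaec   = np.array([0.12, 0.35, 0.50, 0.88, 1.10, 1.45])
orcinol = np.array([0.15, 0.30, 0.58, 0.80, 1.20, 1.38])

# Pearson correlation; its square is the R^2 reported for method agreement.
r = np.corrcoef(hpaec, orcinol)[0, 1]
r2 = r ** 2
```

A high R² shows the methods rank samples consistently; it does not by itself show the absence of a systematic bias, for which a Bland-Altman-style difference analysis would be the usual complement.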
33 CFR 148.105 - What must I include in my application?
Code of Federal Regulations, 2011 CFR
2011-07-01
... each director, or their equivalents. (e) Address for service of documents. The name and address of one... name and address of one individual who may receive other documents. (f) Location and use. The proposed... studies. (1) A copy of each contract that the applicant made for the construction of any component of the...
33 CFR 148.105 - What must I include in my application?
Code of Federal Regulations, 2012 CFR
2012-07-01
... each director, or their equivalents. (e) Address for service of documents. The name and address of one... name and address of one individual who may receive other documents. (f) Location and use. The proposed... studies. (1) A copy of each contract that the applicant made for the construction of any component of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-24
...-0076] Agency Information Collection Activities: Sponsor's Notice of Change of Address, Form I-865...: Sponsor's Notice of Change of Address. (3) Agency form number, if any, and the applicable component of the.... During this 60-day period, USCIS will be evaluating whether to revise the Form I-865. Should USCIS decide...
Contributions of Organic Sources to Atmospheric Aerosol Particle Concentrations and Growth
NASA Astrophysics Data System (ADS)
Russell, L. M.
2017-12-01
Organic molecules are important contributors to aerosol particle mass and number concentrations through primary emissions as well as secondary growth in the atmosphere. New techniques for measuring organic aerosol components in atmospheric particles have improved measurements of this contribution in the last 20 years, including Scanning Transmission X-ray Microscopy Near Edge X-ray Absorption Fine Structure (STXM-NEXAFS), Fourier Transform Infrared spectroscopy (FTIR), and High-Resolution Aerosol Mass Spectrometry (AMS). STXM-NEXAFS individual aerosol particle composition illustrated the variety of morphology of organic components in marine aerosols, the inherent relationships between organic composition and shape, and the links between atmospheric aerosol composition and particles produced in smog chambers. This type of single particle microscopy has also added to size distribution measurements by providing evidence of how surface-controlled and bulk-controlled processes contribute to the growth of particles in the atmosphere. FTIR analysis of organic functional groups is sufficient to distinguish combustion, marine, and terrestrial organic particle sources and to show that each of those types of sources has a surprisingly similar organic functional group composition over four different oceans and four different continents. Augmenting the limited sampling of these off-line techniques with side-by-side inter-comparisons to online AMS provides complementary composition information and consistent quantitative attribution to sources (despite some clear method differences). Single-particle AMS techniques using light scattering and event trigger modes have now also characterized the types of particles found in urban, marine, and ship emission aerosols.
Most recently, by combining with off-line techniques, single particle composition measurements have separated and quantified the contributions of organic, sulfate and salt components from ocean biogenic and sea spray emissions to particles, addressing the persistent question of the sources of cloud condensation nuclei in clean marine conditions.
The Use of Dialectical Behavior Therapy (DBT) in Music Therapy: A Sequential Explanatory Study.
Chwalek, Carolyn M; McKinney, Cathy H
2015-01-01
There are published examples of how dialectical behavior therapy (DBT) and music therapy are effectively being used as separate therapies in the treatment of individuals with a variety of mental health disorders. However, research examining DBT-informed music therapy is limited. The purpose of this study was to determine whether music therapists working in mental health settings are implementing components of DBT in their work, and if so, how and why; and if not, why not and what is their level of interest in such work. We used a sequential explanatory mixed-methods research design implemented in two phases. Phase 1 was a quantitative survey of board-certified music therapists (n=260). Due to a low survey response rate (18%), and to enhance the validity of the findings, Phase 2, an embedded qualitative procedure in the form of interviews with clinicians experienced in the DBT approach, was added to the study. Both survey and interviews inquired about DBT training, use of DBT-informed music therapy, music therapy experiences used to address DBT skills, and experiences of implementing DBT-informed music therapy. Respondents indicating they implement DBT-informed music therapy (38.3%) are using components and adaptations of the standard DBT protocol. Advantages of implementing DBT-informed music therapy were identified, and more than half of the respondents who do not implement DBT in their music therapy practice also perceived this work as at least somewhat important. Disadvantages were also identified and support the need for further research. Components of DBT are used in music therapy and are valued, but there is a lack of empirical evidence to inform, refine, and guide practice. © the American Music Therapy Association 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Quinnell, R.; Thompson, R.; LeBard, R. J.
2013-09-01
Developing quantitative skills, or being academically numerate, is part of the curriculum agenda in science teaching and learning. For many of our students, being asked to 'do maths' as part of 'doing science' leads to disengagement from learning. Notions of 'I can't do maths' speak of a rigidity of mind, a 'standoff', forming a barrier to learning in science that needs to be addressed if we, as science educators, are to offer solutions to the so-called 'maths problem' and to support students as they move from being novice to expert. Moving from novice to expert is complex and we lean on several theoretical frameworks (thinking dispositions, threshold concepts and mindfulness in learning) to characterize this pathway in science, with a focus on quantitative skills. Fluid thinking and application of numeracy skills are required to manipulate experimental data sets and are integral to our science practice; we need to stop students from seeing them as optional 'maths' or 'statistics' tasks within our discipline. Being explicit about the ways those in the discipline think, how quantitative data is processed, and allowing places for students to address their skills (including their confidence) offer some ways forward.
2011-01-01
Background The term 'inequities' refers to avoidable differences rooted in injustice. This review examined whether or not, and how, quantitative studies identifying inequalities in risk factors and health service utilization for asthma explicitly addressed underlying inequities. Asthma was chosen because recent decades have seen strong increases in asthma prevalence in many international settings, and inequalities in risk factors and related outcomes. Methods A review was conducted of studies that identified social inequalities in asthma-related outcomes or health service use in adult populations. Data were extracted on use of equity terms (objective evidence), and discussion of equity issues without using the exact terms (subjective evidence). Results Of the 219 unique articles retrieved, 21 were eligible for inclusion. None used the terms equity/inequity. While all but one article traced at least partial pathways to inequity, only 52% proposed any intervention and 55% of these interventions focused exclusively on the more proximal, clinical level. Conclusions Without more in-depth and systematic examination of inequities underlying asthma prevalence, quantitative studies may fail to provide the evidence required to inform equity-oriented interventions to address underlying circumstances restricting opportunities for health. PMID:21749720
Clementi, Massimo; Bagnarelli, Patrizia
2015-10-01
In the last two decades, development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, a large number of studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physio-pathology of important virus diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized the research strategies in basic and medical virology. In addition and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing related not only to standardization, accuracy, and precision, but also to selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important to address these challenges.
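As one concrete example of the third-generation methods mentioned above, digital PCR converts the fraction of negative partitions into a copy concentration via a Poisson correction. The partition counts and partition volume below are illustrative assumptions, not values from the article:

```python
import math

def dpcr_copies_per_partition(n_positive, n_total):
    """Poisson correction: mean template copies per partition, estimated from
    the fraction of negative partitions (assumes random template distribution)."""
    frac_negative = (n_total - n_positive) / n_total
    return -math.log(frac_negative)

# Illustrative run: 12,000 of 20,000 partitions fluoresce positive.
lam = dpcr_copies_per_partition(12000, 20000)

partition_volume_nl = 0.85                      # assumed chip geometry
copies_per_ul = lam / (partition_volume_nl * 1e-3)
```

The Poisson step is what makes digital PCR quantification absolute: it needs no standard curve, only accurate partition counting and volume.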
[Analysis of chemical constituents of volatile components from Jia Ga Song Tang by GC-MS].
Tan, Qing-long; Xiong, Tian-qin; Liao, Jia-yi; Yang, Tao; Zhao, Yu-min; Lin, Xi; Zhang, Cui-xian
2014-10-01
To analyze the chemical components of volatile components from Jia Ga Song Tang. The volatile oils were extracted by water steam distillation. The chemical components of essential oil were analyzed by GC-MS and quantitatively determined by a normalization method. 103 components were separated and 87 components were identified in the volatile oil of Zingiberis Rhizoma. 58 components were separated and 38 components were identified in the volatile oil of Myristicae Semen. 49 components were separated and 38 components were identified in the volatile oil of Amomi Rotundus Fructus. 89 components were separated and 63 components were identified in the volatile oil of Jia Ga Song Tang. Eucalyptol, β-phellandrene and other terpenes were the main compounds in the volatile oil of Jia Ga Song Tang. Changes in the kinds and content of volatile components can provide evidence for the scientific and rational compatibility of Jia Ga Song Tang.
Data warehouse governance programs in healthcare settings: a literature review and a call to action.
Elliott, Thomas E; Holmes, John H; Davidson, Arthur J; La Chance, Pierre-Andre; Nelson, Andrew F; Steiner, John F
2013-01-01
Given the extensive data stored in healthcare data warehouses, data warehouse governance policies are needed to ensure data integrity and privacy. This review examines the current state of the data warehouse governance literature as it applies to healthcare data warehouses, identifies knowledge gaps, provides recommendations, and suggests approaches for further research. A comprehensive literature search using five databases, journal article title-search, and citation searches was conducted between 1997 and 2012. Data warehouse governance documents from two healthcare systems in the USA were also reviewed. A modified version of nine components from the Data Governance Institute Framework for data warehouse governance guided the qualitative analysis. Fifteen articles were retrieved. Only three were related to healthcare settings, each of which addressed only one of the nine framework components. Of the remaining 12 articles, 10 addressed between one and seven framework components and the remainder addressed none. Each of the two data warehouse governance plans obtained from healthcare systems in the USA addressed a subset of the framework components, and between them they covered all nine. While published data warehouse governance policies are rare, the 15 articles and two healthcare organizational documents reviewed in this study may provide guidance for creating such policies. Additional research is needed in this area to ensure that data warehouse governance policies are feasible and effective. The gap between the development of data warehouses in healthcare settings and formal governance policies is substantial, as evidenced by the sparse literature in this domain.
SARGENT, DANIEL J.; GEIBEL, M.; HAWKINS, J. A.; WILKINSON, M. J.; BATTEY, N. H.; SIMPSON, D. W.
2004-01-01
• Background and Aims The aims of this investigation were to highlight the qualitative and quantitative diversity apparent between nine diploid Fragaria species and produce interspecific populations segregating for a large number of morphological characters suitable for quantitative trait loci analysis. • Methods A qualitative comparison of eight described diploid Fragaria species was performed and measurements were taken of 23 morphological traits from 19 accessions including eight described species and one previously undescribed species. A principal components analysis was performed on 14 mathematically unrelated traits from these accessions, which partitioned the species accessions into distinct morphological groups. Interspecific crosses were performed with accessions of species that displayed significant quantitative divergence and, from these, populations that should segregate for a range of quantitative traits were raised. • Key Results Significant differences between species were observed for all 23 morphological traits quantified and three distinct groups of species accessions were observed after the principal components analysis. Interspecific crosses were performed between these groups, and F2 and backcross populations were raised that should segregate for a range of morphological characters. In addition, the study highlighted a number of distinctive morphological characters in many of the species studied. • Conclusions Diploid Fragaria species are morphologically diverse, yet remain highly interfertile, making the group an ideal model for the study of the genetic basis of phenotypic differences between species through map-based investigation using quantitative trait loci. The segregating interspecific populations raised will be ideal for such investigations and could also provide insights into the nature and extent of genome evolution within this group. PMID:15469944
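The principal components analysis used above to partition accessions into morphological groups can be sketched as an SVD of the centred trait matrix. The simulated 19 × 14 data below (with one shifted group standing in for a morphologically divergent species group) are a placeholder, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)
# Illustrative stand-in for 19 accessions x 14 mathematically unrelated traits.
traits = rng.normal(size=(19, 14))
traits[:7] += 3.0   # shift one group to mimic morphological divergence

# Centre the data, then PCA via SVD; rows of Vt are principal components.
centred = traits - traits.mean(axis=0)
U, s, Vt = np.linalg.svd(centred, full_matrices=False)

scores = centred @ Vt.T                 # accession coordinates on each PC
explained = s**2 / (s**2).sum()         # fraction of variance per component
```

Plotting the first two columns of `scores` is the usual way such analyses display the distinct groups of species accessions.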
Establish an Agent-Simulant Technology Relationship (ASTR)
2017-04-14
for quantitative measures that characterize simulant performance in testing, such as the ability to be removed from surfaces. Component-level ASTRs... Overall Test and Agent-Simulant Technology Relationship (ASTR) process. 1.2 Background. a. Historically, many tests did not develop quantitative... methodology report. Report provides a VX-TPP ASTR for post-decon contact hazard and off-gassing. In the Stryker production verification test (PVT
Engen, Steinar; Saether, Bernt-Erik
2014-03-01
We analyze the stochastic components of the Robertson-Price equation for the evolution of quantitative characters, which enables decomposition of the selection differential into components due to demographic and environmental stochasticity. We show how these two types of stochasticity affect the evolution of multivariate quantitative characters by defining demographic and environmental variances as components of individual fitness. The exact covariance formula for selection is decomposed into three components: the deterministic mean value, as well as stochastic demographic and environmental components. We show that demographic and environmental stochasticity generate random genetic drift and fluctuating selection, respectively. This provides a common theoretical framework for linking ecological and evolutionary processes. Demographic stochasticity can cause random variation in selection differentials independent of fluctuating selection caused by environmental variation. We use this model of selection to illustrate that the effect on the expected selection differential of random variation in individual fitness is dependent on population size, and that the strength of fluctuating selection is affected by how environmental variation affects the covariance in Malthusian fitness between individuals with different phenotypes. Thus, our approach enables us to partition out the effects of fluctuating selection from the effects of selection due to random variation in individual fitness caused by demographic stochasticity. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
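The decomposition described above can be written in Robertson-Price form; the notation below is illustrative and not necessarily the authors' exact formulation:

```latex
% Robertson--Price identity: with relative fitness $w$ and phenotype $z$,
% the selection differential is the fitness--phenotype covariance.
\[
  S \;=\; \operatorname{cov}(w, z)
\]
% Sketch of the three-way split stated in the abstract (symbols illustrative):
% a deterministic mean-selection term plus demographic and environmental
% stochastic terms,
\[
  \operatorname{cov}(w, z)
    \;=\; \operatorname{cov}\!\bigl(\mathbb{E}[w \mid z],\, z\bigr)
    \;+\; \varepsilon_{\mathrm{dem}} \;+\; \varepsilon_{\mathrm{env}},
\]
% where $\varepsilon_{\mathrm{dem}}$, driven by demographic stochasticity,
% scales with $1/N$ and produces random genetic drift, while
% $\varepsilon_{\mathrm{env}}$ reflects fluctuating selection from
% environmental variation shared across individuals.
```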
ERIC Educational Resources Information Center
Psacharopoulos, George
1990-01-01
Replies to commentaries on the relationship of comparative education theory and practice, addressing the beneficiaries of educational planning and policymaking, the planner's role, evaluation criteria for educational planning, choice of discipline, simplification of theory, analytical versus quantitative research, theoretical foundations of…
Geological impacts on nutrition
USDA-ARS?s Scientific Manuscript database
This chapter reviews the nutritional roles of mineral elements, as part of a volume on health implications of geology. The chapter addresses the absorption and post-absorptive utilization of the nutritionally essential minerals, including their physiological functions and quantitative requirements....
AGRICULTURAL NONPOINT SOURCE POLLUTION (AGNPS)
Developed by the USDA Agricultural Research Service, Agricultural Nonpoint Source Pollution (AGNPS) model addresses concerns related to the potential impacts of point and nonpoint source pollution on surface and groundwater quality (Young et al., 1989). It was designed to quantit...
When Should Zero Be Included on a Scale Showing Magnitude?
ERIC Educational Resources Information Center
Kozak, Marcin
2011-01-01
This article addresses an important problem of graphing quantitative data: should one include zero on the scale showing magnitude? Based on a real time series example, the problem is discussed and some recommendations are proposed.
Field Verification of Undercut Criteria and Alternatives for Subgrade Stabilization-Coastal Plain
DOT National Transportation Integrated Search
2012-06-01
The North Carolina Department of Transportation (NCDOT) is progressing toward developing quantitative and systematic : criteria that address the implementation of undercutting as a subgrade stabilization measure. As part of this effort, a : laborator...
Building Strategically Aligned Individualized Education Programs for Transition
ERIC Educational Resources Information Center
Flannery, K. Brigid; Hellemn, Lisa A.
2015-01-01
The Individualized Education Programs (IEPs) for students 16 years of age or above must address specific transition components. Studies to date have focused on the presence and quality of these transition components, yet the alignment of these components and their role in leading the development of the IEP is just as critical. This qualitative…
Family Functioning in Families with a Child with Down Syndrome: A Mixed Methods Approach
ERIC Educational Resources Information Center
Povee, K.; Roberts, L.; Bourke, J.; Leonard, H.
2012-01-01
Background: This study aimed to explore the factors that predict functioning in families with a child with Down syndrome using a mixed methods design. The quantitative component examined the effect of maladaptive and autism-spectrum behaviours on the functioning of the family while the qualitative component explored the impact of having a child…
Wallace, Erik A.; Miller-Cribbs, Julie E.; Duffy, F. Daniel
2013-01-01
Background Future physicians may not be prepared for the challenges of caring for the growing population of poor patients in this country. Given the potential for a socioeconomic “gulf” between physicians and patients and the lack of curricula that address the specific needs of poor patients, resident knowledge about caring for this underserved population is low. Intervention We created a 2-day Resident Academy orientation, before the start of residency training, to improve community knowledge and address resident attitudes toward poor patients through team-based experiential activities. We collected demographic and satisfaction data through anonymous presurveys and postsurveys; t tests and descriptive analysis of the quantitative data were conducted. Qualitative comments from open-ended questions were reviewed, coded, and divided into themes. We also offer information on the cost and replicability of the Academy. Results Residents rated most components of the Academy as “very good” or “excellent.” Satisfaction scores were higher among residents in primary care training programs than among residents in nonprimary care programs for most Academy elements. Qualitative data demonstrated an overall positive effect on resident knowledge and attitudes about community resource availability for underserved patients, and the challenges of poor patients to access high-quality health care. Conclusions The Resident Academy orientation improved knowledge and attitudes of new residents before the start of residency, and residents were satisfied with the experience. The commitment of institutional leaders is essential for success. PMID:24404238
Addressing unmeasured confounding in comparative observational research.
Zhang, Xiang; Faries, Douglas E; Li, Hu; Stamey, James D; Imbens, Guido W
2018-04-01
Observational pharmacoepidemiological studies can provide valuable information on the effectiveness or safety of interventions in the real world, but one major challenge is the existence of unmeasured confounder(s). While many analytical methods have been developed for dealing with this challenge, they appear under-utilized, perhaps due to the complexity and varied requirements for implementation. Thus, there is an unmet need to improve understanding of the appropriate course of action to address unmeasured confounding under a variety of research scenarios. We implemented a stepwise search strategy to find articles discussing the assessment of unmeasured confounding in electronic literature databases. Identified publications were reviewed and characterized by the applicable research settings and the information requirements for implementing each method. We further used this information to develop a best practice recommendation to help guide the selection of appropriate analytical methods for assessing the potential impact of unmeasured confounding. Over 100 papers were reviewed, and 15 methods were identified. We used a flowchart to illustrate the best practice recommendation, which was driven by 2 critical components: (1) availability of information on the unmeasured confounders; and (2) goals of the unmeasured confounding assessment. Key factors for implementation of each method were summarized in a checklist to provide further assistance to researchers in implementing these methods. When assessing comparative effectiveness or safety in observational research, the impact of unmeasured confounding should not be ignored. Instead, we suggest quantitatively evaluating the impact of unmeasured confounding, and we provide a best practice recommendation for selecting appropriate analytical methods. Copyright © 2018 John Wiley & Sons, Ltd.
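The abstract does not name the 15 methods it identified. For a flavor of what a quantitative assessment of unmeasured confounding can look like, here is a minimal sketch of one widely used approach, the E-value of VanderWeele and Ding; the choice of method is ours for illustration and may or may not be among those on the paper's shortlist.

```python
import math

def e_value(rr):
    """E-value (VanderWeele & Ding): the minimum strength of association,
    on the risk-ratio scale, that an unmeasured confounder would need with
    both treatment and outcome to fully explain away an observed risk ratio."""
    if rr < 1:
        rr = 1.0 / rr  # the formula is symmetric for protective effects
    return rr + math.sqrt(rr * (rr - 1))

# An observed RR of 2.0 would require a confounder associated with both
# treatment and outcome at RR >= ~3.41 to be explained away entirely.
observed = e_value(2.0)
```

A large E-value suggests the observed association is robust to plausible unmeasured confounding; a value near 1 means weak confounding could account for it.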
How to attract a nephrology trainee: quantitative questionnaire results.
Lane, Cathie Anne; Healy, Carol; Ho, Maria-Theresa; Pearson, Sallie-Anne; Brown, Mark Ashley
2008-04-01
Global health workforce shortages are being experienced across many specialties. Australia faces a nephrology workforce shortage coupled with increased demand for nephrology services. This study examines issues impacting on the choice of nephrology as a career and identifies factors that can be modified to improve trainee recruitment. This study provides evidence to inform those seeking to address nephrology workforce shortages and, by extrapolation, those of other specialties. In Australia in 2005, a mailed self-administered questionnaire was sent to all basic physician trainees eligible for the clinical component of the Royal Australasian College of Physicians' examination. Trainees were asked about the main influences on career choice, including perceived motivators and detractors surrounding a career in nephrology. Of the 531 doctors surveyed, 222 (42%) responded. Younger respondents and those with previous nephrology experience were more likely to consider nephrology. Perceptions deterring respondents from considering nephrology included inflexible work hours, an absence of positive role models, the perceived restriction of the subspecialty to a hospital-based practice and poor remuneration relative to other specialties. Exposure to a nephrology term in early postgraduate years is an important factor in a decision to undertake nephrology training. During these rotations, trainees need to experience positive role modelling. Effective trainee recruitment strategies should utilize the positive influence of role models, and must consider restructuring workforce and training activities to improve work hour flexibility and remuneration. Negative perceptions, acting as barriers to the pursuit of a career in nephrology, must be addressed and any misinformation corrected.
Observations and Laboratory Data of Planetary Organics
NASA Technical Reports Server (NTRS)
Roush, Ted L.
2002-01-01
Many efforts are underway to search for evidence of prebiotic materials in the outer solar system. Current and planned Mars missions obtain remote sensing observations that can be used to address the potential presence of prebiotic materials. Additional missions to, and continuing earth-based observations of, more distant solar system objects will also provide remote sensing observations that can be used to address the potential presence of prebiotic materials. I will present an overview of on-going observations, associated laboratory investigations of candidate materials, and theoretical modeling of observational data. In the past, the room temperature reflectance spectra of many residues created from HC-bearing gases and solids have been reported. The results of an investigation of what effect temperatures more representative of outer solar system surfaces (50-140K) have on the reflectance spectra of these residues, and the associated interpretations, will be presented. The relatively organic-rich Tagish Lake Meteorite has been suggested as a spectral analog for D-type asteroids. Using a new approach that relies upon iterative use of Hapke theory and Kramers-Kronig analysis, the optical constants of TLM were estimated. The approach and results of the analysis will be presented. Use of optical constants in scattering theories, such as the Hapke theory, provides the ability to determine quantitative estimates of the relative abundances and grain sizes of candidate surface components. This approach has been applied to interpret the reflectance spectra of several outer solar system surfaces. A summary will be provided describing the results of such modeling efforts.
Gopakumar, Gopalakrishna Pillai; Swetha, Murali; Sai Siva, Gorthi; Sai Subrahmanyam, Gorthi R K
2018-03-01
The present paper introduces a focus stacking-based approach for automated quantitative detection of Plasmodium falciparum malaria from blood smear. For the detection, a custom designed convolutional neural network (CNN) operating on a focus stack of images is used. The cell counting problem is addressed as a segmentation problem, and we propose a 2-level segmentation strategy. Use of a CNN operating on a focus stack for the detection of malaria is the first of its kind, and it not only improved the detection accuracy (both in terms of sensitivity [97.06%] and specificity [98.50%]) but also favored processing on cell patches and avoided the need for hand-engineered features. The slide images are acquired with a custom-built portable slide scanner made from low-cost, off-the-shelf components that is suitable for point-of-care diagnostics. The proposed approach of employing sophisticated algorithmic processing together with inexpensive instrumentation can potentially benefit clinicians by enabling malaria diagnosis at the point of care. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
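The CNN and focus-stack machinery cannot be reconstructed from the abstract, but the framing of cell counting as segmentation (separate foreground from background, then split the foreground into individual cells) can be sketched with plain thresholding plus connected-component labelling. Everything below, threshold included, is illustrative only and is not the authors' method.

```python
import numpy as np

def count_cells(img, thresh=0.5):
    """Toy 2-level segmentation: (1) threshold into a foreground mask,
    (2) 4-connected flood-fill labelling to count separate blobs ("cells")."""
    mask = img > thresh
    labels = np.zeros(img.shape, dtype=int)
    count = 0
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                count += 1
                stack = [(i, j)]
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < img.shape[0] and 0 <= x < img.shape[1]
                            and mask[y, x] and labels[y, x] == 0):
                        labels[y, x] = count
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count, labels

img = np.zeros((8, 8))
img[1:3, 1:3] = 1.0   # synthetic "cell" 1
img[5:7, 4:7] = 1.0   # synthetic "cell" 2
n, _ = count_cells(img)
```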
Relational Care for Perinatal Substance Use: A Systematic Review.
Kramlich, Debra; Kronk, Rebecca
2015-01-01
The purpose of this systematic review of the literature is to highlight published studies of perinatal substance use disorder that address relational aspects of various care delivery models to identify opportunities for future studies in this area. Quantitative, qualitative, and mixed-methods studies that included relational variables, such as healthcare provider engagement with pregnant women and facilitation of maternal-infant bonding, were identified using PubMed, Scopus, and EBSCO databases. Key words included neonatal abstinence syndrome, drug, opioid, substance, dependence, and pregnancy. Six studies included in this review identified statistically and/or clinically significant positive maternal and neonatal outcomes thought to be linked to engagement in antenatal care and development of caring relationships with healthcare providers. Comprehensive, integrated multidisciplinary services for pregnant women with substance use disorder aimed at harm reduction are showing positive results. Evidence exists that pregnant women's engagement with comprehensive services facilitated by caring relationships with healthcare providers may improve perinatal outcomes. Gaps in the literature remain; studies have yet to identify the relative contribution of multiple risk factors to adverse outcomes as well as program components most likely to improve outcomes.
A mixed-methods approach to systematic reviews.
Pearson, Alan; White, Heath; Bath-Hextall, Fiona; Salmond, Susan; Apostolo, Joao; Kirkpatrick, Pamela
2015-09-01
There is an increasing number of published single-method systematic reviews that focus on different types of evidence related to a particular topic. As policy makers and practitioners seek clear directions for decision-making from systematic reviews, it is likely that it will be increasingly difficult for them to identify 'what to do' if they are required to find and understand a plethora of syntheses related to a particular topic. Mixed-methods systematic reviews are designed to address this issue and have the potential to produce systematic reviews of direct relevance to policy makers and practitioners. On the basis of the recommendations of the Joanna Briggs Institute International Mixed Methods Reviews Methodology Group in 2012, the Institute adopted a segregated approach to mixed-methods synthesis as described by Sandelowski et al., which consists of separate syntheses of each component method of the review. The Joanna Briggs Institute's mixed-methods synthesis of the findings of the separate syntheses uses a Bayesian approach to translate the findings of the initial quantitative synthesis into qualitative themes, pooling these with the findings of the initial qualitative synthesis.
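The Bayesian pooling step described above is procedural rather than computational in the abstract, but its spirit can be caricatured numerically: treat the qualitative synthesis as a prior on "theme supported" and the quantitative synthesis as binomial evidence. This is a deliberate caricature under invented numbers, not the JBI procedure itself.

```python
# Prior on "theme supported", loosely informed by a qualitative synthesis,
# updated with binomial evidence from a quantitative synthesis (Beta-Binomial).
# All numbers are hypothetical.
a_prior, b_prior = 6.0, 2.0   # qualitative synthesis leans supportive
k, n = 7, 10                  # say 7 of 10 quantitative studies favourable
a_post, b_post = a_prior + k, b_prior + (n - k)
pooled_support = a_post / (a_post + b_post)   # posterior mean probability
```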
Ivy, T M
2007-03-01
Genetic benefits can enhance the fitness of polyandrous females through the high intrinsic genetic quality of females' mates or through the interaction between female and male genes. I used a full diallel cross, a quantitative genetics design that involves all possible crosses among a set of genetically homogeneous lines, to determine the mechanism through which polyandrous female decorated crickets (Gryllodes sigillatus) obtain genetic benefits. I measured several traits related to fitness and partitioned the phenotypic variance into components representing the contribution of additive genetic variance ('good genes'), nonadditive genetic variance (genetic compatibility), as well as maternal and paternal effects. The results reveal a significant variance attributable to both nonadditive and additive sources in the measured traits, and their influence depended on which trait was considered. The lack of congruence in sources of phenotypic variance among these fitness-related traits suggests that the evolution and maintenance of polyandry are unlikely to have resulted from one selective influence, but rather are the result of the collective effects of a number of factors.
Hu, Tina; Cox, Kelly Anne; Nyhof-Young, Joyce
2017-01-01
Background Poverty is a key determinant of health that leads to poor health outcomes. Although most healthcare providers will work with patients experiencing poverty, surveys among healthcare students have reported a curriculum gap in this area. This study aims to introduce and evaluate a novel, student-run interprofessional inner city health educational program that combines both practical and didactic educational components. Methods Students participating in the program answered pre- and post-program surveys. Wilcoxon signed-rank tests and descriptive thematic analysis were used for quantitative and qualitative data, respectively. Results A total of 28 out of 35 participants responded (response rate: 80%). Student knowledge about issues facing underserved populations and resources for underserved populations significantly increased after program participation. Student comfort working with underserved populations also significantly increased after program participation. Valued program elements included workshops, shadowing, and a focus on marginalized populations. Conclusion Interprofessional inner city health educational programs are beneficial for students to learn about poverty intervention and resources, and may represent a strategy to address a gap in the healthcare professional curriculum. PMID:28344718
Cultural psychiatry: research strategies and future directions.
Kirmayer, Laurence J; Ban, Lauren
2013-01-01
This chapter reviews some key aspects of current research in cultural psychiatry and explores future prospects. The first section discusses the multiple meanings of culture in the contemporary world and their relevance for understanding mental health and illness. The next section considers methodological strategies for unpacking the concept of culture and studying the impact of cultural variables, processes and contexts. Multiple methods are needed to address the many different components or dimensions of cultural identity and experience that constitute local worlds, ways of life or systems of knowledge. Quantitative and observational methods of clinical epidemiology and experimental science as well as qualitative ethnographic methods are needed to capture crucial aspects of culture as systems of meaning and practice. Emerging issues in cultural psychiatric research include: cultural variations in illness experience and expression; the situated nature of cognition and emotion; cultural configurations of self and personhood; concepts of mental disorder and mental health literacy; and the prospect of ecosocial models of health and culturally based interventions. The conclusion considers the implications of the emerging perspectives from cultural neuroscience for psychiatric theory and practice. Copyright © 2013 S. Karger AG, Basel.
Anger in the context of postnatal depression: An integrative review.
Ou, Christine H; Hall, Wendy A
2018-05-20
Contrary to social constructions of new motherhood as a joyous time, mothers may experience postnatal depression and anger. Although postnatal depression has been thoroughly studied, the expression of maternal anger in the context of postnatal depression is conceptually unclear. This integrative review investigated the framing of anger in the context of postnatal depression. After undertaking a search of CINAHL, Ovid-Medline, PsycInfo, and Web of Science, we identified qualitative (n = 7) and quantitative (n = 17) papers that addressed maternal anger and postnatal depression. We analyzed the data by developing themes. Our review indicated that anger was a salient mood disturbance for some postnatally depressed women with themes integrated as: (i) anger accompanying depression, (ii) powerlessness as a component of depression and anger, and (iii) anger occurring as a result of expectations being violated. Our findings indicate that anger can coexist with women's postnatal depression. Anger can be expressed toward the self and toward children and family members with negative relationship effects. We recommend that health care providers and researchers consider anger in the context of postnatal mood disturbances. © 2018 Wiley Periodicals, Inc.
Biosimulation of Inflammation and Healing in Surgically Injured Vocal Folds
Li, Nicole Y. K.; Vodovotz, Yoram; Hebda, Patricia A.; Abbott, Katherine Verdolini
2010-01-01
Objectives The pathogenesis of vocal fold scarring is complex and remains to be deciphered. The current study is part of research endeavors aimed at applying systems biology approaches to address the complex biological processes involved in the pathogenesis of vocal fold scarring and other lesions affecting the larynx. Methods We developed a computational agent-based model (ABM) to quantitatively characterize multiple cellular and molecular interactions involved in inflammation and healing in vocal fold mucosa after surgical trauma. The ABM was calibrated with empirical data on inflammatory mediators (eg, tumor necrosis factor) and extracellular matrix components (eg, hyaluronan) from published studies on surgical vocal fold injury in the rat population. Results The simulation results reproduced and predicted trajectories seen in the empirical data from the animals. Moreover, the ABM studies suggested that hyaluronan fragments might be the clinical surrogate of tissue damage, a key variable that in these simulations both is enhanced by and further induces inflammation. Conclusions A relatively simple ABM such as the one reported in this study can provide new understanding of laryngeal wound healing and generate working hypotheses for further wet-lab studies. PMID:20583741
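The calibrated ABM itself is not reproducible from the abstract, but the feedback loop it simulates (tissue damage releases inflammatory mediators, mediators recruit cells, cells clear damage) can be caricatured as a tiny rule-based loop. This is a cartoon in the spirit of such models, with invented rules and constants, not the authors' ABM.

```python
# Toy inflammation-healing loop: damage -> mediator -> cell recruitment
# -> damage clearance. All rules and rates are invented for illustration.
damage, mediator, cells = 100.0, 0.0, 0
history = []
for t in range(200):
    mediator = 0.9 * mediator + 0.1 * damage               # damage drives mediator
    cells = max(0, cells + int(mediator // 10) - cells // 4)  # recruit minus decay
    damage = max(0.0, damage - 0.45 * cells)               # cells clear damage
    history.append(damage)
```

Even this cartoon reproduces the qualitative trajectory the paper describes: an initial surge of mediator and cells, followed by resolution of the damage signal.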
Ecological network analysis on global virtual water trade.
Yang, Zhifeng; Mao, Xufeng; Zhao, Xu; Chen, Bin
2012-02-07
Global water interdependencies are likely to increase with growing virtual water trade. To address the indirect effects of water trade through the global economic circulation, we use ecological network analysis (ENA) to shed light on the complicated system interactions. A global model of virtual water flow among agriculture and livestock production trade in 1995-1999 is also built as the basis for network analysis. Control analysis is used to identify the quantitative control or dependency relations. The utility analysis provides more indicators for describing the mutual relationship between two regions/countries by imitating the interactions in an ecosystem, and distinguishes the beneficiaries and contributors of the virtual water trade system. Results show that control and utility relations can depict the mutual relations in the trade system well, and that directly observable relations differ from integral ones once indirect interactions are considered. This paper offers a new way to depict the interrelations between trade components and can serve as a meaningful start as we continue to use ENA in providing more valuable implications for freshwater study on a global scale.
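The utility analysis mentioned above has a standard numerical core in the ENA literature (Patten's utility analysis): normalise net pairwise flows into a direct utility matrix D, then sum direct and indirect effects via the power series (I - D)^-1. The sketch below uses an invented 3-region flow matrix and a toy normalisation; it illustrates the mechanics, not the paper's actual data.

```python
import numpy as np

# Hypothetical virtual-water flows f[i][j] from region i to region j
f = np.array([[0.0, 4.0, 1.0],
              [2.0, 0.0, 3.0],
              [1.0, 1.0, 0.0]])
T = f.sum(axis=1)                  # toy throughflow normalisation per region
D = (f - f.T) / T[:, None]         # direct utility: normalised net flows
U = np.linalg.inv(np.eye(3) - D)   # integral utility: formally I + D + D^2 + ...
signs = np.sign(U)                 # +/- pattern separates beneficiary/contributor
```

The sign structure of U is what distinguishes mutually beneficial relationships from exploitative ones once indirect pathways are included.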
Multichannel waveguides for the simultaneous detection of disease biomarkers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mukundan, Harshini; Price, Dominique Z; Grace, Wynne K
2009-01-01
The sensor team at the Los Alamos National Laboratory has developed a waveguide-based optical biosensor that has previously been used for the detection of biomarkers associated with diseases such as tuberculosis, breast cancer, anthrax and influenza in complex biological samples (e.g., serum and urine). However, no single biomarker can accurately predict disease. To address this issue, we developed a multiplex assay for the detection of components of the Bacillus anthracis lethal toxin on single mode planar optical waveguides with tunable quantum dots as the fluorescence reporter. This limited ability to multiplex is still insufficient for accurate detection of disease or for monitoring prognosis. In this manuscript, we demonstrate for the first time, the design, fabrication and successful evaluation of a multichannel planar optical waveguide for the simultaneous detection of at least three unknown samples in quadruplicate. We demonstrate the simultaneous, rapid (30 min), quantitative (with internal standard) and sensitive (limit of detection of 1 pM) detection of protective antigen and lethal factor of Bacillus anthracis in complex biological samples (serum) using specific monoclonal antibodies labeled with quantum dots as the fluorescence reporter.
Analysis and Calibration of Sources of Electronic Error in PSD Sensor Response.
Rodríguez-Navarro, David; Lázaro-Galilea, José Luis; Bravo-Muñoz, Ignacio; Gardel-Vicente, Alfredo; Tsirigotis, Georgios
2016-04-29
In order to obtain very precise measurements of the position of agents located at a considerable distance using a sensor system based on position sensitive detectors (PSD), it is necessary to analyze and mitigate the factors that generate substantial errors in the system's response. These sources of error can be divided into electronic and geometric factors. The former stem from the nature and construction of the PSD as well as the performance, tolerances and electronic response of the system, while the latter are related to the sensor's optical system. Here, we focus solely on the electrical effects, since the study, analysis and correction of these are a prerequisite for subsequently addressing geometric errors. A simple calibration method is proposed, which considers PSD response, component tolerances, temperature variations, signal frequency used, signal to noise ratio (SNR), suboptimal operational amplifier parameters, and analog to digital converter (ADC) quantization SNR (SNRQ), etc. Following an analysis of these effects and calibration of the sensor, it was possible to correct the errors, thus rendering the effects negligible, as reported in the results section.
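The calibration idea (characterize the aggregate electronic error, then correct measured positions against known references) can be illustrated in a much-reduced form: the ideal 1-D PSD position equation followed by a polynomial correction fitted to reference positions. All numbers are invented, and the real procedure accounts for many more factors (temperature, SNR, amplifier parameters).

```python
import numpy as np

def psd_position(i1, i2, length=10.0):
    """Ideal 1-D PSD: position (mm) from the two electrode photocurrents."""
    return (length / 2.0) * (i2 - i1) / (i1 + i2)

# Hypothetical calibration: fit a polynomial mapping raw (error-laden)
# position readings to known reference positions, then apply it.
raw = np.array([-4.10, -2.05, 0.02, 1.98, 4.05])  # measured positions (mm)
ref = np.array([-4.00, -2.00, 0.00, 2.00, 4.00])  # reference positions (mm)
coeffs = np.polyfit(raw, ref, deg=3)
corrected = np.polyval(coeffs, raw)               # residual error now small
```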
Current advances in mathematical modeling of anti-cancer drug penetration into tumor tissues.
Kim, Munju; Gillies, Robert J; Rejniak, Katarzyna A
2013-11-18
Delivery of anti-cancer drugs to tumor tissues, including their interstitial transport and cellular uptake, is a complex process involving various biochemical, mechanical, and biophysical factors. Mathematical modeling provides a means through which to understand this complexity better, as well as to examine interactions between contributing components in a systematic way via computational simulations and quantitative analyses. In this review, we present the current state of mathematical modeling approaches that address phenomena related to drug delivery. We describe how various types of models were used to predict spatio-temporal distributions of drugs within the tumor tissue, to simulate different ways to overcome barriers to drug transport, or to optimize treatment schedules. Finally, we discuss how integration of mathematical modeling with experimental or clinical data can provide better tools to understand the drug delivery process, in particular to examine the specific tissue- or compound-related factors that limit drug penetration through tumors. Such tools will be important in designing new chemotherapy targets and optimal treatment strategies, as well as in developing non-invasive diagnosis to monitor treatment response and detect tumor recurrence.
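The simplest member of the model family reviewed here is a one-dimensional diffusion-plus-uptake equation, dC/dt = D d²C/dx² - kC, describing drug concentration falling off with distance from a vessel. Below is a minimal explicit finite-difference sketch with invented parameters; it shows the characteristic penetration profile, not any specific model from the review.

```python
import numpy as np

D_c, k = 1e-6, 1e-3       # diffusivity (cm^2/s), uptake rate (1/s) -- invented
dx, dt, nx = 1e-3, 0.1, 100
C = np.zeros(nx)          # drug concentration along a 1 mm tissue cord
C[0] = 1.0                # vessel wall held at unit concentration
for _ in range(5000):     # explicit scheme; D_c*dt/dx^2 = 0.1 is stable
    lap = np.zeros(nx)
    lap[1:-1] = (C[2:] - 2.0 * C[1:-1] + C[:-2]) / dx**2
    C += dt * (D_c * lap - k * C)
    C[0] = 1.0            # re-impose the boundary condition
```

The resulting profile decays with depth on a length scale of roughly sqrt(D_c/k), which is the sense in which uptake limits penetration through tumor tissue.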
Perret, A; Foray, G; Masenelli-Varlot, K; Maire, E; Yrieix, B
2018-01-01
For insulation applications, boards thinner than 2 cm are being designed, with thermal conductivities lower than 15 mW m^-1 K^-1. This requires binding slightly hydrophobic aerogels, which are highly nanoporous granular materials. To reach this step and ensure insulation board durability at the building scale, it is compulsory to design, characterise and analyse the microstructure at the nanoscale. It is indeed necessary to understand how the solid material is formed from a liquid suspension. This issue is addressed in this paper through wet-STEM experiments carried out in an Environmental Scanning Electron Microscope (ESEM). Latex-surfactant binary blends and latex-surfactant-aerogel ternary systems are studied, with two surfactants of very different chemical structures. Image analysis is used to distinguish the different components and to obtain quantitative morphological parameters that describe the sample architecture. The evolution of these morphological parameters during water evaporation permits a good understanding of the role of the surfactant. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
Türker-Kaya, Sevgi; Huck, Christian W
2017-01-20
Plant cells, tissues and organs are composed of various biomolecules arranged as structurally diverse units, which represent heterogeneity at microscopic levels. Molecular knowledge of those constituents, together with their localization within such complexity, is crucial for both basic and applied plant sciences. In this context, infrared imaging techniques have advantages over conventional methods for investigating heterogeneous plant structures, providing quantitative and qualitative analyses with the spatial distribution of the components. Thus, particularly with the use of proper analytical approaches and sampling methods, these technologies offer significant information for studies on plant classification, physiology, ecology, genetics, pathology and other related disciplines. This review aims to present a general perspective on near-infrared and mid-infrared imaging/microspectroscopy in plant research, and to compare the potential of these methodologies along with their advantages and limitations. With regard to the organization of the document, the first section introduces the respective underlying principles, followed by instrumentation, sampling techniques, sample preparation, measurement, and an overview of spectral pre-processing and multivariate analysis. The last section reviews selected applications from the literature.
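The spectral pre-processing and multivariate analysis mentioned above commonly pair a per-spectrum normalisation such as the standard normal variate (SNV) with principal component analysis. A minimal sketch on synthetic data (the spectra and dimensions are invented; real IR imaging pipelines add baseline correction, derivatives, etc.):

```python
import numpy as np

rng = np.random.default_rng(0)
# 30 synthetic "spectra" over 200 wavenumbers: shared ramp plus noise
spectra = np.linspace(0.0, 1.0, 200) + 0.1 * rng.normal(size=(30, 200))

# Standard normal variate (SNV): centre and scale each spectrum
snv = (spectra - spectra.mean(axis=1, keepdims=True)) \
      / spectra.std(axis=1, keepdims=True)

# PCA via SVD of the column-centred matrix
centered = snv - snv.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S                       # sample scores; rows of Vt are loadings
explained = S**2 / (S**2).sum()      # variance explained per component
```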
NASA Technical Reports Server (NTRS)
Kirstetter, Pierre-Emmanuel; Hong, Y.; Gourley, J. J.; Schwaller, M.; Petersen, W; Zhang, J.
2012-01-01
Characterization of the error associated with satellite rainfall estimates is a necessary component of deterministic and probabilistic frameworks involving spaceborne passive and active microwave measurements for applications ranging from water budget studies to forecasting natural hazards related to extreme rainfall events. We focus here on the error structure of Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) quantitative precipitation estimation (QPE) at the ground. The problem was addressed in a previous paper by comparison of the 2A25 version 6 (V6) product with reference values derived from NOAA/NSSL's ground radar-based National Mosaic and QPE system (NMQ/Q2). The primary contribution of this study is to compare the new 2A25 version 7 (V7) products that were recently released as a replacement for V6. This new version is considered superior over land areas. Several aspects of the two versions are compared and quantified, including rainfall rate distributions, systematic biases, and random errors. All analyses indicate V7 is an improvement over V6.
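Separating systematic bias from random error, as done in the V6/V7 comparison, rests on a simple decomposition: the mean squared error splits into the squared mean bias plus the variance of the error. A toy sketch with invented rain rates:

```python
import numpy as np

ref = np.array([1.0, 2.0, 5.0, 10.0, 20.0])   # ground reference rate (mm/h)
sat = np.array([0.8, 1.9, 4.2, 9.0, 17.5])    # satellite QPE (invented)
err = sat - ref
bias = err.mean()                              # systematic component
rmse = np.sqrt((err**2).mean())
random_err = np.sqrt(rmse**2 - bias**2)        # random component (error std)
```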
NASA Astrophysics Data System (ADS)
Schneiderbauer, Stefan; Zebisch, Marc; Becker, Daniel; Pedoth, Lydia; Renner, Kathrin; Kienberger, Stefan
2016-04-01
Changing climate conditions may have beneficial or adverse effects on the social-ecological systems we live in. In any case, the possible effects result from complex and interlinked physical and social processes embedded in these systems. Traditional research addresses these bio-physical and societal issues separately. Therefore, in general, studies on risks related to climate change are still mono-disciplinary in nature, with an increasing amount of work following a multi-disciplinary approach. The quality and usefulness of the results of such research for policy or decision making in practice may be further limited by study designs that do not appropriately acknowledge the significance of integrating, or at least mixing, qualitative and quantitative information and knowledge. Finally, the acceptance of study results, particularly when they contain some kind of assessment, is often endangered by insufficient and/or late involvement of stakeholders and users. The above-mentioned limitations have often been brought up in the recent past. However, although a certain consensus has been reached in recent years recognising the need to tackle these issues, little progress has been made in terms of implementation within the context of (research) studies. This paper elaborates in detail on the reasons that hamper the application of interdisciplinary (i.e. natural and social science), trans-disciplinary (i.e. co-production of knowledge) and integrative (i.e. combining qualitative and quantitative approaches) work. It is based on experience gained through a number of applied climate change vulnerability studies carried out within the context of various GIZ-financed development cooperation projects, a consultancy project for the German Environment Agency, as well as the workshop series INQUIMUS, which tackles particularly the issues of mixing qualitative and quantitative research approaches.
The potentials and constraints of possible solutions to these limitations are discussed. The conclusions drawn underline the importance of involving stakeholders from the very beginning (i.e. the study design) and of communicating the various uncertainties (at the levels of data and methodologies) as well as the subjective components of the study (i.e. the value system of the assessment) in a transparent way. It is also stated that truly interdisciplinary approaches, though often demanded by programs and research managers, are yet to be supported by breaking up traditional structures in research institutions and administrative departments.
NASA Astrophysics Data System (ADS)
Cornell, Sarah
2015-04-01
It is time to collate a global community database of atmospheric water-soluble organic nitrogen deposition. Organic nitrogen (ON) has long been known to be globally ubiquitous in atmospheric aerosol and precipitation, with implications for air and water quality, climate, biogeochemical cycles, ecosystems and human health. The number of studies of atmospheric ON deposition has increased steadily in recent years, but to date there is no accessible global dataset, for either bulk ON or its major components. Improved qualitative and quantitative understanding of the organic nitrogen component is needed to complement the well-established knowledge base pertaining to other components of atmospheric deposition (cf. Vet et al 2014). Without this basic information, we are increasingly constrained in addressing the current dynamics and potential interactions of atmospheric chemistry, climate and ecosystem change. To see the full picture we need global data synthesis, more targeted data gathering, and models that let us explore questions about the natural and anthropogenic dynamics of atmospheric ON. Collectively, our research community already has a substantial amount of atmospheric ON data. Published reports extend back over a century and now have near-global coverage. However, datasets available from the literature are very piecemeal and too often lack crucially important information that would enable aggregation or re-use. I am initiating an open collaborative process to construct a community database, so we can begin to systematically synthesize these datasets (generally from individual studies at a local and temporally limited scale) to increase their scientific usability and statistical power for studies of global change and anthropogenic perturbation. In drawing together our disparate knowledge, we must address various challenges and concerns, not least about the comparability of analysis and sampling methodologies, and the known complexity of composition of ON. 
We need to discuss and develop protocols that work for diverse research needs. The database will need to be harmonized or merged into existing global N data initiatives. This presentation therefore launches a standing invitation for experts to contribute and share rain and aerosol ON and chemical composition data, and jointly refine the preliminary database structure and metadata requirements for optimal mutual use. Reference: Vet et al. (2014) A global assessment of precipitation chemistry… Atmos Environ 93: 3-100
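As a sketch of what one harmonized record in such a community database might look like, the snippet below defines a minimal Python data structure. All field names and units here are illustrative assumptions; the actual schema and metadata requirements are precisely what the proposed community process would decide.

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class ONDepositionRecord:
    # Illustrative fields only: the real community schema is to be agreed.
    site_id: str
    latitude: float                     # decimal degrees, WGS84
    longitude: float
    sample_start: str                   # ISO 8601 date
    sample_end: str
    matrix: str                         # "precipitation" or "aerosol"
    won_umol_n_per_l: Optional[float]   # water-soluble organic N
    tdn_umol_n_per_l: Optional[float]   # total dissolved N
    analysis_method: str                # e.g. "ON = TDN - DIN by difference"

rec = ONDepositionRecord("site-001", 52.2, 0.12, "2015-01-01", "2015-01-08",
                         "precipitation", 12.5, 40.0,
                         "ON = TDN - DIN by difference")
# Organic fraction of total dissolved N for this hypothetical sample:
on_fraction = rec.won_umol_n_per_l / rec.tdn_umol_n_per_l
print(asdict(rec)["site_id"], on_fraction)
```

Recording the analysis method explicitly is one way to address the comparability concerns raised above, since reported ON is often a by-difference quantity whose meaning depends on how TDN and DIN were measured.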
[HPLC fingerprint of flavonoids in Sophora flavescens and determination of five components].
Ma, Hong-Yan; Zhou, Wan-Shan; Chu, Fu-Jiang; Wang, Dong; Liang, Sheng-Wang; Li, Shao
2013-08-01
A simple and reliable high-performance liquid chromatography method with photodiode array detection (HPLC-DAD) was developed to evaluate the quality of the traditional Chinese medicine Sophora flavescens by establishing a chromatographic fingerprint and simultaneously determining five flavonoids: trifolirhizin, maackiain, kushenol I, kurarinone and sophoraflavanone G. Optimal separation and detection were achieved on an ULTIMATE XB-C18 column (4.6 mm x 250 mm, 5 microm) with an acetonitrile-water gradient and detection at 295 nm. In the chromatographic fingerprint, 13 peaks were selected as characteristic peaks to assess the similarities of samples collected from different origins in China, using the Similarity Evaluation System for Chromatographic Fingerprint of Traditional Chinese Medicine (2004AB) together with principal component analysis (PCA) for data analysis. There were significant differences in the fingerprint chromatograms between S. flavescens and S. tonkinensis. Principal component analysis showed that kurarinone and sophoraflavanone G were the most important components. In quantitative analysis, the five components showed good linearity (R > 0.999) within their linear ranges, and their recoveries were in the range of 96.3% - 102.3%. This study indicates that the combination of quantitative analysis and chromatographic fingerprinting can readily serve as a quality control method for S. flavescens and related traditional Chinese medicinal preparations.
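The two chemometric steps in this abstract, fingerprint similarity evaluation and PCA, can be sketched on a hypothetical peak-area matrix. The numbers below are random placeholders, not the paper's data; only the workflow is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows: samples from different origins; columns: the 13 characteristic peaks.
X = rng.random((10, 13))

# Fingerprint similarity: cosine (congruence) coefficient between two samples.
sim = X[0] @ X[1] / (np.linalg.norm(X[0]) * np.linalg.norm(X[1]))

# PCA via SVD of the mean-centred matrix; loadings on PC1 would indicate
# which peaks (e.g. kurarinone, sophoraflavanone G) drive the variation.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                       # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)      # variance fraction per component
print(sim, explained[0])
```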
Evolutionary multidimensional access architecture featuring cost-reduced components
NASA Astrophysics Data System (ADS)
Farjady, Farsheed; Parker, Michael C.; Walker, Stuart D.
1998-12-01
We describe a three-stage wavelength-routed optical access network, utilizing coarse passband-flattened arrayed-waveguide grating routers. An N-dimensional addressing strategy enables 6912 customers to be bi-directionally addressed with multi-Gb/s data using only 24 wavelengths spaced by 1.6 nm. Coarse wavelength separation allows the use of increased-tolerance WDM components at the exchange and customer premises. The architecture is designed to map onto standard access network topologies, allowing elegant upgradability from legacy PON infrastructures at low cost. Passband-flattening of the routers is achieved through phase apodization.
2nd Generation Reusable Launch Vehicle (2G RLV). Revised
NASA Technical Reports Server (NTRS)
Matlock, Steve; Sides, Steve; Kmiec, Tom; Arbogast, Tim; Mayers, Tom; Doehnert, Bill
2001-01-01
This is a revised final report and addresses all of the work performed on this program. Specifically, it covers vehicle architecture background, definition of six baseline engine cycles, a reliability baseline (Space Shuttle Main Engine QRAS), component-level reliability/performance/cost for the six baseline cycles, and the selection of three cycles for further study. The report further addresses technology improvement selection and component-level reliability/performance/cost for the three selected cycles, as well as risk reduction plans and recommendations for future studies.
New directions in lubrication, materials, wear, and surface interactions - Tribology in the 80's
NASA Technical Reports Server (NTRS)
Loomis, W. R. (Editor)
1985-01-01
New directions in tribology are described. A range of topics is addressed, extending from fundamental research on tribological materials of all kinds and their surface effects, to final technological applications in mechanical components such as bearings, gears, and seals. The general topics addressed include: importance and definition of materials in tribology; future directions of research in adhesion and friction, wear and wear-resistant materials, and liquid lubricants and additives; status and new directions in elastohydrodynamic lubrication and solid lubricants; and tribological materials for mechanical components of the future.
Balancing Petroleum Force Structure/Capabilities between Active and Reserve Components
2013-03-01
Colonel Kenneth R. Hook, United States Army. Project Adviser: Colonel Robert M. Mundell, Department of Command, Leadership, and Management.
Quantitative interpretations of Visible-NIR reflectance spectra of blood.
Serebrennikova, Yulia M; Smith, Jennifer M; Huffman, Debra E; Leparc, German F; García-Rubio, Luis H
2008-10-27
This paper illustrates the implementation of a new theoretical model for rapid quantitative analysis of the Vis-NIR diffuse reflectance spectra of blood cultures. The model is based on photon diffusion theory and Mie scattering theory, formulated to account for multiple scattering populations and absorptive components. The study stresses the importance of thoroughly solving the scattering and absorption problem in order to accurately resolve the optically relevant parameters of blood culture components. While calibration-free and computationally fast, the new model has two basic requirements. First, wavelength-dependent refractive indices of the basic chemical constituents of the blood culture components are needed. Second, multi-wavelength measurements are required, or at least measurements at a number of characteristic wavelengths equal to the degrees of freedom (i.e., the number of optically relevant parameters) of the blood culture system. The model was tested with a large number of diffuse reflectance spectra of blood culture samples spanning an extensive range of the relevant parameters.
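The requirement that the number of measured wavelengths match the degrees of freedom can be illustrated with a deliberately simplified linear sketch. The actual model in the abstract is nonlinear (photon diffusion plus Mie scattering); here hypothetical linear component signatures stand in, only to show why a determined or overdetermined multi-wavelength system lets the component parameters be resolved without calibration standards.

```python
import numpy as np

# Hypothetical spectral signatures of three optically relevant components
# at six wavelengths (columns = components; values are made up).
A = np.array([[1.0, 0.2, 0.1],
              [0.8, 0.4, 0.1],
              [0.5, 0.7, 0.2],
              [0.3, 0.9, 0.3],
              [0.2, 0.6, 0.6],
              [0.1, 0.3, 0.9]])
true_params = np.array([2.0, 0.5, 1.0])
measured = A @ true_params           # noiseless synthetic "spectrum"

# Six wavelengths >= three unknowns, so least squares recovers the
# component parameters directly from the measurement.
est, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(est)
```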
NASA Astrophysics Data System (ADS)
Kemper, Björn; Schmidt, Lisa; Przibilla, Sabine; Rommel, Christina; Vollmer, Angelika; Ketelhut, Steffi; Schnekenburger, Jürgen; von Bally, Gert
2010-04-01
Digital holographic microscopy (DHM) provides label-free quantitative phase contrast with low demands on sample preparation. Nevertheless, for DHM measurements on fixed cells the mounting medium has to be considered, while the phase contrast of living cells may be influenced by the buffer solution used. To quantify these effects, the maximum cell-induced phase contrast and the visibility of the nucleoli were analyzed. A second aim of the study was to identify subcellular components in DHM phase contrast images. Therefore, comparative investigations were performed using bright-field imaging, DHM, and fluorescence microscopy with 4',6-diamidino-2-phenylindole (DAPI) staining, which visualizes DNA-containing cell components. The results demonstrate, for two exemplary tumor cell lines, that DHM phase contrast images of fixed cells in phosphate-buffered saline (PBS) yield cell thickness values comparable to those of living cells. Furthermore, it is shown that in many cases nucleus components can be identified by DHM phase contrast alone.
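The thickness values mentioned above come from the standard DHM relation between optical path delay and cell geometry: for a homogeneous cell, the phase shift is Δφ = 2π·d·(n_cell − n_medium)/λ, so thickness follows by inversion. The numbers below are illustrative, not taken from the study.

```python
import math

def cell_thickness(delta_phi_rad, wavelength_nm, n_cell, n_medium):
    """Cell thickness (nm) from DHM phase contrast, homogeneous-cell model:
    d = delta_phi * lambda / (2 * pi * (n_cell - n_medium))."""
    return delta_phi_rad * wavelength_nm / (2 * math.pi * (n_cell - n_medium))

# Illustrative values: a 2.5 rad phase shift at 532 nm for a cell with
# integral refractive index 1.38 immersed in PBS (n ~ 1.337).
d_nm = cell_thickness(2.5, 532, 1.38, 1.337)
print(d_nm)
```

The same relation explains why the mounting medium matters for fixed cells: a different n_medium changes the phase contrast even when the cell thickness is unchanged.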
Boydell, K M; Everett, B
1992-01-01
Supported housing (as distinct from supportive housing) emphasizes the values of consumer choice; independence; participation; permanence; normalcy; and flexible, ongoing supports. As a model, it has only recently become popular in the literature and therefore little is known of its effectiveness in serving people with long-term psychiatric backgrounds. In 1989, Homeward Projects, a community mental health agency located in Metropolitan Toronto, established a supported housing project. Homeward included an evaluative component in its program from the outset. In order to give equal weight to the tenants' opinions, both quantitative and qualitative methodologies were employed. In the quantitative component, residential milieu, social support, and service delivery were examined. The qualitative component involved an ethnographic study which allowed the tenants to voice their experiences of living in such a setting. Results provided a rich understanding of the model. Overall, the tenants eventually came to describe their house as a home.
NASA Astrophysics Data System (ADS)
Wang, Yu; Akiyama, Hidefumi; Terakado, Kanako; Nakatsu, Toru
2013-08-01
Firefly bioluminescence has attracted great interest because of its high quantum yield and intriguing modifiable colours. Modifications to the structure of the enzyme luciferase can change the emission colour of firefly bioluminescence, and the mechanism of the colour change has been intensively studied by biochemists, structural biologists, optical physicists, and quantum-chemistry theorists. Here, we report on the quantitative spectra of firefly bioluminescence catalysed by wild-type and four site-directed mutant luciferases. While the mutation caused different emission spectra, the spectra differed only in the intensity of the green component (λmax ~ 560 nm). In contrast, the orange (λmax ~ 610 nm) and red (λmax ~ 650 nm) components present in all the spectra were almost unaffected by the modifications to the luciferases and changes in pH. Our results reveal that the intensity of the green component is the unique factor that is influenced by the luciferase structure and other reaction conditions.
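The paper's central observation, that only the green component varies while the orange and red components stay fixed, can be checked by decomposing a spectrum onto fixed spectral components. The sketch below fits three Gaussian bands at the reported peak positions to a synthetic spectrum; band widths and amplitudes are assumptions for illustration.

```python
import numpy as np

wl = np.linspace(450, 750, 301)   # wavelength grid, nm

def gauss(center_nm, width_nm=30.0):
    return np.exp(-0.5 * ((wl - center_nm) / width_nm) ** 2)

# Fixed components at the positions reported in the abstract:
# green ~560 nm, orange ~610 nm, red ~650 nm.
basis = np.column_stack([gauss(560), gauss(610), gauss(650)])

# Synthetic "mutant" spectrum in which only the green amplitude differs.
spectrum = basis @ np.array([0.3, 1.0, 0.8])

# Least-squares decomposition recovers the per-component intensities.
amps, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
print(amps)
```

Comparing the fitted amplitude vectors across wild-type and mutant spectra would then isolate which component responds to the luciferase modification.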
2016-01-01
Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978
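The simulated-mixture evaluation described above can be sketched with a toy inverse-calibration experiment: generate mixture "mass spectral fingerprints" from known component spectra, add random noise, fit a multivariate regression on calibration samples, and validate on an independent set. All spectra and concentrations below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
n_mz, n_comp = 20, 3
S = rng.random((n_mz, n_comp))       # hypothetical pure-component spectra

def simulate(n_samples, noise):
    C = rng.random((n_samples, n_comp))                    # concentrations
    X = C @ S.T + noise * rng.standard_normal((n_samples, n_mz))
    return X, C

# Calibrate on simulated mixtures, then validate on an independent set,
# mirroring the factor study (noise level, number of calibration samples).
X_cal, C_cal = simulate(40, 0.01)
B, *_ = np.linalg.lstsq(X_cal, C_cal, rcond=None)          # calibration
X_val, C_val = simulate(10, 0.01)
rmsep = np.sqrt(np.mean((X_val @ B - C_val) ** 2))         # prediction error
print(rmsep)
```

Re-running the simulation while varying `noise` or the calibration-set size reproduces, in miniature, the sensitivity analysis the paper performs before moving to real DIMS data.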
DOT National Transportation Integrated Search
1996-10-01
This document presents quantitative national estimates of the magnitude of transportation's impacts on the natural environment. It is the most comprehensive compilation of environmental and transportation data to date. This document addresses all p...
Integer programming methods for reserve selection and design
Robert G. Haight; Stephanie A. Snyder
2009-01-01
How many nature reserves should there be? Where should they be located? Which places have highest priority for protection? Conservation biologists, economists, and operations researchers have been developing quantitative methods to address these questions since the 1980s.
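A toy version of the basic reserve-selection problem, choosing the fewest sites so that every species is represented at least once, illustrates the integer-programming formulation these methods address. Real applications use ILP solvers on thousands of sites; this exhaustive sketch with invented site/species data just shows the model.

```python
from itertools import combinations

# Hypothetical occurrence data: species present at each candidate site.
sites = {
    "A": {"frog", "owl"},
    "B": {"owl", "orchid"},
    "C": {"frog", "vole"},
    "D": {"orchid", "vole", "newt"},
    "E": {"newt"},
}
species = set().union(*sites.values())

def min_reserve(sites, species):
    # The set-cover integer program: minimize the number of selected
    # sites subject to every species being covered. Solved here by
    # exhaustive search over increasing subset sizes.
    names = sorted(sites)
    for k in range(1, len(names) + 1):
        for combo in combinations(names, k):
            if set().union(*(sites[s] for s in combo)) >= species:
                return set(combo)

print(min_reserve(sites, species))
```

Priority questions ("which places matter most?") follow from the same model, e.g. by asking how often a site appears in optimal or near-optimal solutions.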
Quantitative interaction proteomics using mass spectrometry.
Wepf, Alexander; Glatter, Timo; Schmidt, Alexander; Aebersold, Ruedi; Gstaiger, Matthias
2009-03-01
We present a mass spectrometry-based strategy for the absolute quantification of protein complex components isolated through affinity purification. We quantified bait proteins via isotope-labeled reference peptides corresponding to an affinity tag sequence and prey proteins by label-free correlational quantification using the precursor ion signal intensities of proteotypic peptides generated in reciprocal purifications. We used this method to quantitatively analyze interaction stoichiometries in the human protein phosphatase 2A network.
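The two quantification steps described, absolute bait quantification against an isotope-labeled tag peptide and label-free relative quantification of preys, reduce to simple intensity ratios. The sketch below shows that arithmetic with invented intensities; the equal-ionization assumption in the second function is a simplification, not the paper's full correlational procedure.

```python
def bait_amount_fmol(light_intensity, heavy_intensity, heavy_spike_fmol):
    """Absolute bait amount from a spiked isotope-labeled reference
    peptide matching the affinity-tag sequence."""
    return heavy_spike_fmol * light_intensity / heavy_intensity

def prey_to_bait_ratio(prey_intensity, bait_intensity):
    """Label-free relative quantification from precursor ion intensities
    of proteotypic peptides (assumes comparable response factors)."""
    return prey_intensity / bait_intensity

# Invented example: 50 fmol heavy spike, bait peptide twice as intense.
bait = bait_amount_fmol(2.0e6, 1.0e6, 50.0)
stoichiometry = prey_to_bait_ratio(5.0e5, 2.0e6)
print(bait, stoichiometry)
```

Interaction stoichiometries, as analyzed for the phosphatase 2A network, then follow by combining the absolute bait amount with the prey-to-bait ratios.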
Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun
2015-02-01
Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes.
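A standard starting point for pooling a performance metric across studies is a random-effects meta-analysis; the DerSimonian-Laird estimator below is one common choice (the review discusses alternatives that behave better when studies are small). The input estimates and variances are invented for illustration.

```python
import math

def dersimonian_laird(estimates, variances):
    """Random-effects pooled estimate of a performance metric reported
    by several studies (DerSimonian-Laird between-study variance)."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Hypothetical within-subject CVs (%) from three small test-retest studies,
# with their estimated variances.
pooled, se, tau2 = dersimonian_laird([10.0, 12.0, 15.0], [1.0, 2.0, 4.0])
print(pooled, se, tau2)
```

With only three studies, the tau-squared estimate is itself very noisy, which is exactly the small-sample problem the review highlights.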
Partial Analysis of Insta-Foam
NASA Technical Reports Server (NTRS)
Chou, L. W.
1983-01-01
Insta-Foam, used as a thermal insulator for the non-critical area of the external tank during the prelaunch phase to minimize icing, is a two-component system. Component A contains polyisocyanates, blowing agents, and stabilizers; Component B contains the polyols, catalysts, blowing agents, stabilizers, and a fire retardant. The blowing agents are Freon 11 and Freon 12, the stabilizers are silicone surfactants, the catalysts are tertiary amines, and the fire retardant is tri-(beta-chloro-isopropyl) phosphate (PCF). High-performance liquid chromatography (HPLC) was used to quantitatively identify the polyols and PCF.
Hungry Horse Mitigation; Flathead Lake, 2003-2004 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Barry; Evarts, Les
2005-06-01
The Confederated Salish and Kootenai Tribes (CSKT) and Montana Fish Wildlife and Parks (MFWP) wrote the "Fisheries Mitigation Plan for Losses Attributable to the Construction and Operation of Hungry Horse Dam" in March 1991 to define the fisheries losses, mitigation alternatives and recommendations to protect, mitigate and enhance resident fish and aquatic habitat affected by Hungry Horse Dam. On November 12, 1991, the Northwest Power Planning Council (NPPC) approved the mitigation plan with minor modifications, called for a detailed implementation plan, and amended measures 903(h)(1) through (7). A long-term mitigation plan was submitted in August 1992, was approved by the Council in 1993, and the first contract for this project was signed on November 11, 1993. The problem this project addresses is the loss of habitat, both in quality and quantity, in the Flathead Lake and River basin resulting from the construction and operation of Hungry Horse Dam. The purpose of the project is to both implement mitigation measures and monitor the biological responses to those measures including those implemented by Project Numbers 9101903 and 9101904. Goals and objectives of the 1994 Fish and Wildlife Program (Section 10.1) addressed by this project are the rebuilding to sustainable levels weak, but recoverable, native populations injured by the hydropower system. The project mitigates the blockage of spawning runs by Hungry Horse Dam by restoring and even creating spawning habitats within direct drainages to Flathead Lake. The project also addresses the altered habitat within Flathead Lake resulting from species shifts and consequent dominance of new species that restricts the potential success of mitigation measures. Specific goals of this project are to create and restore habitat and quantitatively monitor changes in fish populations to verify the efficacy of our mitigation measures. The project consists of three components: monitoring, restoration and research.
Monitoring, for example, includes a spring gillnetting series conducted annually in Flathead Lake and builds on an existing data set initiated in 1981. Monitoring of the experimental kokanee reintroduction was a primary activity of this project between 1992 and 1997. Lake trout, whose high densities have precluded successful mitigation of losses of other species in Flathead Lake, have been monitored since 1996 to measure several biological parameters. Results of this work have utility in determining the population status of this key predator in Flathead Lake. The project has also defined the baseline condition of the Flathead Lake fishery in 1992-1993 and has conducted annual lakewide surveys since 1998. The restoration component of the project has addressed several stream channel, riparian, and fish passage problems, and suppression of non-native fish. The research component of the project began in FY 2000 and measured trophic linkages between M. relicta and other species to assist in predicting the results of our efforts to suppress lake trout. Only Objective 1 in the workplan is funded entirely by Hungry Horse Mitigation funds. Additional funds are drawn from other sources to assist in completion of Objectives 2-8.
Hungry Horse Mitigation : Flathead Lake : Annual Progress Report 2007.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Barry; Evarts, Les
2008-12-22
The Confederated Salish and Kootenai Tribes (CSKT) and Montana Fish Wildlife and Parks (MFWP) wrote the "Fisheries Mitigation Plan for Losses Attributable to the Construction and Operation of Hungry Horse Dam" in March 1991 to define the fisheries losses, mitigation alternatives and recommendations to protect, mitigate and enhance resident fish and aquatic habitat affected by Hungry Horse Dam. On November 12, 1991, the Northwest Power Planning Council (NPPC) approved the mitigation plan with minor modifications, called for a detailed implementation plan, and amended measures 903(h)(1) through (7). A long-term mitigation plan was submitted in August 1992, was approved by the Council in 1993, and the first contract for this project was signed on November 11, 1993. The problem this project addresses is the loss of habitat, both in quality and quantity, in the Flathead Lake and River basin resulting from the construction and operation of Hungry Horse Dam. The purpose of the project is to both implement mitigation measures and monitor the biological responses to those measures including those implemented by Project Numbers 9101903 and 9101904. Goals and objectives of the 1994 Fish and Wildlife Program (Section 10.1) addressed by this project are the rebuilding to sustainable levels weak, but recoverable, native populations injured by the hydropower system. The project mitigates the blockage of spawning runs by Hungry Horse Dam by restoring and even creating spawning habitats within direct drainages to Flathead Lake. The project also addresses the altered habitat within Flathead Lake resulting from species shifts and consequent dominance of new species that restricts the potential success of mitigation measures. Specific goals of this project are to create and restore habitat and quantitatively monitor changes in fish populations to verify the efficacy of our mitigation measures. The project consists of three components: monitoring, restoration and research.
Monitoring, for example, includes a spring gillnetting series conducted annually in Flathead Lake and builds on an existing data set initiated in 1981. Monitoring of the experimental kokanee reintroduction was a primary activity of this project between 1992 and 1997. Lake trout, whose high densities have precluded successful mitigation of losses of other species in Flathead Lake, have been monitored since 1996 to measure several biological parameters. Results of this work have utility in determining the population status of this key predator in Flathead Lake. The project has also defined the baseline condition of the Flathead Lake fishery in 1992-1993 and has conducted annual lakewide surveys since 1998. The restoration component of the project has addressed several stream channel, riparian, and fish passage problems, and suppression of non-native fish. The research component of the project began in FY 2000 and measured trophic linkages between M. relicta and other species to assist in predicting the results of our efforts to suppress lake trout. Only Work Element A in the Statement of Work is funded entirely by Hungry Horse Mitigation funds. Additional funds are drawn from other sources to assist in completion of all remaining Work Elements.
Hungry Horse Mitigation; Flathead Lake, 2004-2005 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Barry; Evarts, Les
2006-06-01
The Confederated Salish and Kootenai Tribes (CSKT) and Montana Fish Wildlife and Parks (MFWP) wrote the "Fisheries Mitigation Plan for Losses Attributable to the Construction and Operation of Hungry Horse Dam" in March 1991 to define the fisheries losses, mitigation alternatives and recommendations to protect, mitigate and enhance resident fish and aquatic habitat affected by Hungry Horse Dam. On November 12, 1991, the Northwest Power Planning Council (NPPC) approved the mitigation plan with minor modifications, called for a detailed implementation plan, and amended measures 903(h)(1) through (7). A long-term mitigation plan was submitted in August 1992, was approved by the Council in 1993, and the first contract for this project was signed on November 11, 1993. The problem this project addresses is the loss of habitat, both in quality and quantity, in the Flathead Lake and River basin resulting from the construction and operation of Hungry Horse Dam. The purpose of the project is to both implement mitigation measures and monitor the biological responses to those measures including those implemented by Project Numbers 9101903 and 9101904. Goals and objectives of the 1994 Fish and Wildlife Program (Section 10.1) addressed by this project are the rebuilding to sustainable levels weak, but recoverable, native populations injured by the hydropower system. The project mitigates the blockage of spawning runs by Hungry Horse Dam by restoring and even creating spawning habitats within direct drainages to Flathead Lake. The project also addresses the altered habitat within Flathead Lake resulting from species shifts and consequent dominance of new species that restricts the potential success of mitigation measures. Specific goals of this project are to create and restore habitat and quantitatively monitor changes in fish populations to verify the efficacy of our mitigation measures. The project consists of three components: monitoring, restoration and research.
Monitoring, for example, includes a spring gillnetting series conducted annually in Flathead Lake and builds on an existing data set initiated in 1981. Monitoring of the experimental kokanee reintroduction was a primary activity of this project between 1992 and 1997. Lake trout, whose high densities have precluded successful mitigation of losses of other species in Flathead Lake, have been monitored since 1996 to measure several biological parameters. Results of this work have utility in determining the population status of this key predator in Flathead Lake. The project has also defined the baseline condition of the Flathead Lake fishery in 1992-1993 and has conducted annual lakewide surveys since 1998. The restoration component of the project has addressed several stream channel, riparian, and fish passage problems, and suppression of non-native fish. The research component of the project began in FY 2000 and measured trophic linkages between M. relicta and other species to assist in predicting the results of our efforts to suppress lake trout. Only Work Element A in the Statement of Work is funded entirely by Hungry Horse Mitigation funds. Additional funds are drawn from other sources to assist in completion of all remaining Work Elements.
Hungry Horse Mitigation : Flathead Lake : Annual Progress Report 2008.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Barry; Evarts, Les
2009-08-06
The Confederated Salish and Kootenai Tribes (CSKT) and Montana Fish Wildlife and Parks (MFWP) wrote the "Fisheries Mitigation Plan for Losses Attributable to the Construction and Operation of Hungry Horse Dam" in March 1991 to define the fisheries losses, mitigation alternatives and recommendations to protect, mitigate and enhance resident fish and aquatic habitat affected by Hungry Horse Dam. On November 12, 1991, the Northwest Power Planning Council (NPPC) approved the mitigation plan with minor modifications, called for a detailed implementation plan, and amended measures 903(h)(1) through (7). A long-term mitigation plan was submitted in August 1992, was approved by the Council in 1993, and the first contract for this project was signed on November 11, 1993. The problem this project addresses is the loss of habitat, both in quality and quantity, in the Flathead Lake and River basin resulting from the construction and operation of Hungry Horse Dam. The purpose of the project is to both implement mitigation measures and monitor the biological responses to those measures including those implemented by Project Numbers 9101903 and 9101904. Goals and objectives of the 1994 Fish and Wildlife Program (Section 10.1) addressed by this project are the rebuilding to sustainable levels weak, but recoverable, native populations injured by the hydropower system. The project mitigates the blockage of spawning runs by Hungry Horse Dam by restoring and even creating spawning habitats within direct drainages to Flathead Lake. The project also addresses the altered habitat within Flathead Lake resulting from species shifts and consequent dominance of new species that restricts the potential success of mitigation measures. Specific goals of this project are to create and restore habitat and quantitatively monitor changes in fish populations to verify the efficacy of our mitigation measures. The project consists of three components: monitoring, restoration and research.
Monitoring, for example, includes a spring gillnetting series conducted annually in Flathead Lake and builds on an existing data set initiated in 1981. Monitoring of the experimental kokanee reintroduction was a primary activity of this project between 1992 and 1997. Lake trout, whose high densities have precluded successful mitigation of losses of other species in Flathead Lake, have been monitored since 1996 to measure several biological parameters. Results of this work have utility in determining the population status of this key predator in Flathead Lake. The project has also defined the baseline condition of the Flathead Lake fishery in 1992-1993 and has conducted annual lakewide surveys since 1998. The restoration component of the project has addressed several stream channel, riparian, and fish passage problems, and suppression of non-native fish. The research component of the project began in FY 2000 and measured trophic linkages between M. relicta and other species to assist in predicting the results of our efforts to suppress lake trout. Only Work Element A in the Statement of Work is funded entirely by Hungry Horse Mitigation funds. Additional funds are drawn from other sources to assist in completion of all remaining Work Elements.
Preil, Simone A R; Kristensen, Lars P; Beck, Hans C; Jensen, Pia S; Nielsen, Patricia S; Steiniche, Torben; Bjørling-Poulsen, Marina; Larsen, Martin R; Hansen, Maria L; Rasmussen, Lars M
2015-10-01
The increased risk of cardiovascular diseases in type 2 diabetes mellitus has been extensively documented, but the origins of the association remain largely unknown. We sought to determine changes in protein expression in arterial tissue from patients with type 2 diabetes mellitus and moreover hypothesized that metformin intake influences the protein composition. We analyzed nonatherosclerotic repair arteries gathered at coronary bypass operations from 30 patients with type 2 diabetes mellitus and from 30 age- and sex-matched nondiabetic individuals. Quantitative proteome analysis was performed on individual arterial samples by isobaric tag for relative and absolute quantitation (iTRAQ) labeling and liquid chromatography-tandem mass spectrometry. The amounts of the basement membrane components, α1-type IV collagen and α2-type IV collagen, γ1-laminin and β2-laminin, were significantly increased in patients with diabetes mellitus. Moreover, the expressions of basement membrane components and other vascular proteins were significantly lower among metformin users when compared with nonusers. Patients treated with or without metformin had similar levels of hemoglobin A1c, cholesterol, and blood pressure. In addition, quantitative histomorphometry showed increased area fractions of collagen-stainable material in tunica intima and media among patients with diabetes mellitus. The distinct accumulation of arterial basement membrane proteins in type 2 diabetes mellitus discloses a similarity between the diabetic macroangiopathy and microangiopathy and suggests a molecular explanation behind the alterations in vascular remodeling, biomechanical properties, and aneurysm formation described in diabetes mellitus. The lower amounts of basement membrane components in metformin-treated individuals are compatible with the hypothesis of direct beneficial drug effects on the matrix composition in the vasculature.
Investigation of the effects of long duration space exposure on active optical system components
NASA Technical Reports Server (NTRS)
Blue, M. D.
1994-01-01
This experiment was exposed to the space environment for 6 years on the Long Duration Exposure Facility (LDEF). It investigated quantitatively the effects of long-duration space exposure on the relevant performance parameters of a representative set of electro-optic system components, including lasers, radiation detectors, filters, modulators, windows, and other related components. It evaluated the results and implications of the measurements that indicate real or suspected degradation mechanisms. This information will be used to establish guidelines for the selection and use of components for space-based electro-optic systems.
The response of Galileo aft cover components to laser radiation
NASA Technical Reports Server (NTRS)
Metzger, J. W.
1982-01-01
The aft region of the Galileo probe will be subjected to severe heat transfer rates dominated by the radiation contributions. To assess the response of several vehicle aft region components to thermal radiation, tests employing a 10 kW CO2 laser were conducted. The experiments evaluated the annulus/aft cover interface, the umbilical feedthrough assembly, and the mortar cover seal assembly. Experimental evidence of the response of the phenolic nylon heatshield and quantitative measures of its effect on gap geometries of several vehicle components were acquired. In addition, qualitative measures of the survivability of the irradiated components were obtained.
Zeng, Dong; Xie, Qi; Cao, Wenfei; Lin, Jiahui; Zhang, Hao; Zhang, Shanli; Huang, Jing; Bian, Zhaoying; Meng, Deyu; Xu, Zongben; Liang, Zhengrong; Chen, Wufan
2017-01-01
Dynamic cerebral perfusion computed tomography (DCPCT) can evaluate hemodynamic information throughout the brain. However, because its protocol requires multiple 3-D image volume acquisitions, DCPCT scanning imposes a high radiation dose on patients, which is of growing concern. To address this issue, in this paper, based on the robust principal component analysis (RPCA, or equivalently the low-rank and sparsity decomposition) model and the DCPCT imaging procedure, we propose a new DCPCT image reconstruction algorithm to improve low-dose DCPCT and perfusion-map quality using a powerful measure of the low-rankness of a tensor, called Kronecker-basis-representation tensor sparsity regularization. For simplicity, the first proposed model is termed tensor-based RPCA (T-RPCA). Specifically, the T-RPCA model views the DCPCT sequential images as a mixture of low-rank, sparse, and noise components to describe the maximum temporal coherence of spatial structure among phases intrinsically in a tensor framework. The low-rank component corresponds to the “background” part with spatial-temporal correlations, e.g., the static anatomical contribution, which is stationary over time in structure, and the sparse component represents the time-varying part with spatial-temporal continuity, e.g., the dynamic perfusion-enhanced information, which is approximately sparse over time. Furthermore, an improved nonlocal patch-based T-RPCA (NL-T-RPCA) model, which describes the 3-D block groups of the “background” in a tensor, is also proposed. The NL-T-RPCA model exploits the intrinsic characteristics underlying the DCPCT images, i.e., nonlocal self-similarity and global correlation. Two efficient algorithms using the alternating direction method of multipliers are developed to solve the proposed T-RPCA and NL-T-RPCA models, respectively.
Extensive experiments with a digital brain perfusion phantom, preclinical monkey data, and clinical patient data demonstrate that the two proposed models achieve greater gains than existing popular algorithms in terms of both quantitative and visual quality evaluations from low-dose acquisitions, especially as low as 20 mAs. PMID:28880164
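The low-rank plus sparse split at the core of the T-RPCA model is easiest to see in the plain matrix setting. Below is a minimal sketch of generic RPCA via an inexact augmented Lagrangian (ADMM-style) iteration; it is not the authors' tensor or Kronecker-basis-representation algorithm, and the default `lam` and `mu` follow common conventions from the RPCA literature.

```python
import numpy as np

def svd_threshold(X, tau):
    # Singular value thresholding: proximal operator of the nuclear norm.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def shrink(X, tau):
    # Soft thresholding: proximal operator of the l1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca(M, lam=None, mu=None, n_iter=200, tol=1e-7):
    """Split M into a low-rank part L and a sparse part S (M ~ L + S)."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * m * n / np.abs(M).sum()
    S = np.zeros_like(M)
    Y = np.zeros_like(M)  # dual variable
    for _ in range(n_iter):
        L = svd_threshold(M - S + Y / mu, 1.0 / mu)
        S = shrink(M - L + Y / mu, lam / mu)
        Y += mu * (M - L - S)
        if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
            break
    return L, S
```

Applied to a phase-by-voxel matrix of dynamic CT frames, `L` would collect the quasi-static anatomical background and `S` the time-varying perfusion signal, in the spirit of the decomposition described above.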
Establishing key components of yoga interventions for musculoskeletal conditions: a Delphi survey
2014-01-01
Background Evidence suggests yoga is a safe and effective intervention for the management of physical and psychosocial symptoms associated with musculoskeletal conditions. However, heterogeneity in the components and reporting of clinical yoga trials impedes both the generalization of study results and the replication of study protocols. The aim of this Delphi survey was to address these issues of heterogeneity, by developing a list of recommendations of key components for the design and reporting of yoga interventions for musculoskeletal conditions. Methods Recognised experts involved in the design, conduct, and teaching of yoga for musculoskeletal conditions were identified from a systematic review, and invited to contribute to the Delphi survey. Forty-one of the 58 experts contacted, representing six countries, agreed to participate. A three-round Delphi was conducted via electronic surveys. Round 1 presented an open-ended question, allowing panellists to individually identify components they considered key to the design and reporting of yoga interventions for musculoskeletal conditions. Thematic analysis of Round 1 identified items for quantitative rating in Round 2; items not reaching consensus were forwarded to Round 3 for re-rating. Results Thirty-six panellists (36/41; 88%) completed the three rounds of the Delphi survey. Panellists provided 348 comments to the Round 1 question. These comments were reduced to 49 items, grouped under five themes, for rating in subsequent rounds. A priori group consensus of ≥80% was reached on 28 items related to five themes concerning defining the yoga intervention, types of yoga practices to include in an intervention, delivery of the yoga protocol, domains of outcome measures, and reporting of yoga interventions for musculoskeletal conditions. Additionally, a priori consensus of ≥50% was reached on five items relating to minimum values for intervention parameters. 
Conclusions Expert consensus has provided a non-prescriptive reference list for the design and reporting of yoga interventions for musculoskeletal conditions. It is anticipated future research incorporating the Delphi guidelines will facilitate high quality international research in this field, increase homogeneity of intervention components and parameters, and enhance the comparison and reproducibility of research into the use of yoga for the management of musculoskeletal conditions. PMID:24942270
Mark A. Finney; Charles W. McHugh; Isaac Grenfell; Karin L. Riley
2010-01-01
Components of a quantitative risk assessment were produced by simulation of burn probabilities and fire behavior variation for 134 fire planning units (FPUs) across the continental U.S. The system uses fire growth simulation of ignitions modeled from relationships between large fire occurrence and the fire danger index Energy Release Component (ERC). Simulations of 10,...
Wang, Xiang-yang; Ma, Xiao-hui; Li, Wei; Chu, Yang; Guo, Jia-hua; Zhou, Shui-ping; Zhu, Yong-hong
2014-09-01
A simple, accurate and reliable method for the simultaneous separation and determination of six active components (protocatechuic acid, chlorogenic acid, caffeic acid, paeoniflorin, ferulic acid and rosmarinic acid) in the traditional Chinese medicinal preparation Cerebralcare Granule(®) (CG) was developed using reverse-phase high-performance liquid chromatography coupled with diode array detection. The chromatographic separation was performed on a Hypersil GOLD C18 column with aqueous formic acid (0.1%, v/v) and acetonitrile as the mobile phase at a flow rate of 0.2 ml/min at 30 °C. Because the UV characteristics of these components differ, detection wavelengths were switched during the run for quantitative analysis. All of the analytes showed good linearity (r > 0.9992). The established method showed good precision, with relative standard deviations (%) for intra-day and inter-day variations of 0.15-1.81 and 0.11-1.98%, respectively. The validated method was successfully applied to the simultaneous determination of the six active components in CG from different batches. © The Author [2013]. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
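The validation figures quoted (linearity r, intra-day and inter-day RSDs) come from routine external-standard arithmetic, which can be sketched as follows; all concentrations and peak areas here are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical calibration data for one analyte (standard concentrations in
# ug/ml vs. peak areas).
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
area = np.array([152.0, 301.0, 748.0, 1495.0, 2988.0])

slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]  # linearity check, cf. r > 0.9992 above

# Precision: relative standard deviation (%) of replicate injections.
replicates = np.array([748.0, 752.0, 745.0, 750.0, 749.0, 747.0])
rsd_percent = 100.0 * replicates.std(ddof=1) / replicates.mean()

# Quantify an unknown sample from its peak area via the calibration line.
unknown_area = 1120.0
unknown_conc = (unknown_area - intercept) / slope
```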
Li, Xiaonan; Ramchiary, Nirala; Dhandapani, Vignesh; Choi, Su Ryun; Hur, Yoonkang; Nou, Ill-Sup; Yoon, Moo Kyoung; Lim, Yong Pyo
2013-01-01
Brassica rapa is an important crop species that produces vegetables, oilseed, and fodder. Although many studies have reported quantitative trait loci (QTL) mapping, the genes governing most of its economically important traits are still unknown. In this study, we report QTL mapping for morphological and yield component traits in B. rapa and comparative map alignment between B. rapa, B. napus, B. juncea, and Arabidopsis thaliana to identify candidate genes and conserved QTL blocks between them. A total of 95 QTL were identified in different crucifer blocks of the B. rapa genome. Through synteny analysis with A. thaliana, B. rapa candidate genes and intronic and exonic single nucleotide polymorphisms in the parental lines were detected from whole-genome resequencing data, a few of which were validated by mapping them to the QTL regions. Semi-quantitative reverse transcriptase PCR analysis showed differences in the expression levels of a few genes in the parental lines. Comparative mapping identified five key major evolutionarily conserved crucifer blocks (R, J, F, E, and W) harbouring QTL for morphological and yield component traits between the A, B, and C subgenomes of B. rapa, B. juncea, and B. napus. The information on the identified candidate genes could be used for breeding B. rapa and other related Brassica species. PMID:23223793
1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology
Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra
2010-01-01
Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills. PMID:20810965
NASA Astrophysics Data System (ADS)
Holford, Karen M.; Eaton, Mark J.; Hensman, James J.; Pullin, Rhys; Evans, Sam L.; Dervilis, Nikolaos; Worden, Keith
2017-04-01
The acoustic emission (AE) phenomenon has many attributes that make it desirable as a structural health monitoring or non-destructive testing technique, including the capability to continuously and globally monitor large structures using a sparse sensor array and with no dependency on defect size. However, AE monitoring is yet to fulfil its true potential, due mainly to limitations in location accuracy and signal characterisation that often arise in complex structures with high levels of background noise. Furthermore, the technique has been criticised for a lack of quantitative results and the large amount of operator interpretation required during data analysis. This paper begins by introducing the challenges faced in developing an AE-based structural health monitoring system and then reviews previous progress made in addressing these challenges. Subsequently, an overview of a novel methodology for automatic detection of fatigue fractures in complex geometries and noisy environments is presented, which combines a number of signal processing techniques to address the current limitations of AE monitoring. The technique was developed for monitoring metallic landing gear components during pre-flight certification testing, and results are presented from a full-scale steel landing gear component undergoing fatigue loading. Fracture onset was successfully identified automatically at 49,000 fatigue cycles prior to final failure (validated by the use of dye penetrant inspection), and the fracture position was located to within 10 mm of the actual location.
Data Warehouse Governance Programs in Healthcare Settings: A Literature Review and a Call to Action
Elliott, Thomas E.; Holmes, John H.; Davidson, Arthur J.; La Chance, Pierre-Andre; Nelson, Andrew F.; Steiner, John F.
2013-01-01
Purpose: Given the extensive data stored in healthcare data warehouses, data warehouse governance policies are needed to ensure data integrity and privacy. This review examines the current state of the data warehouse governance literature as it applies to healthcare data warehouses, identifies knowledge gaps, provides recommendations, and suggests approaches for further research. Methods: A comprehensive literature search using five databases, journal article title searches, and citation searches was conducted between 1997 and 2012. Data warehouse governance documents from two healthcare systems in the USA were also reviewed. A modified version of nine components from the Data Governance Institute Framework for data warehouse governance guided the qualitative analysis. Results: Fifteen articles were retrieved. Only three were related to healthcare settings, each of which addressed only one of the nine framework components. Of the remaining 12 articles, 10 addressed between one and seven framework components and the remainder addressed none. Each of the two data warehouse governance plans obtained from healthcare systems in the USA addressed a subset of the framework components, and between them they covered all nine. Conclusions: While published data warehouse governance policies are rare, the 15 articles and two healthcare organizational documents reviewed in this study may provide guidance for creating such policies. Additional research is needed in this area to ensure that data warehouse governance policies are feasible and effective. The gap between the development of data warehouses in healthcare settings and formal governance policies is substantial, as evidenced by the sparse literature in this domain. PMID:25848561
Left seat command or leadership flight, leadership training and research at North Central Airlines
NASA Technical Reports Server (NTRS)
Foster, G. C.; Garvey, M. C.
1980-01-01
The need for flight leadership training for flight deck crewmembers is addressed. A management grid is also described which provides a quantitative management language against which any number of management behaviors can be measured.
Service-Learning and Mathematics
ERIC Educational Resources Information Center
Roemer, Cynthia Anne
2009-01-01
Contemporary educational theory has given increased attention to service-learning as valuable pedagogy. Ever-changing technological progress and applications demand a quantitatively literate population, supporting the need for experiential activities in mathematics. This study addresses service-learning pedagogy in mathematics through a study of the…
Presidential Address: Empowerment Evaluation.
ERIC Educational Resources Information Center
Fetterman, David
1994-01-01
Empowerment evaluation is the use of evaluation concepts and techniques to foster self-determination, focusing on helping people help themselves. This collaborative evaluation approach requires both qualitative and quantitative methodologies. It is a multifaceted approach that can be applied to evaluation in any area. (SLD)
Junge, Tina; Ahler, Jonas; Knudsen, Hans K; Kristensen, Hanne K
2018-01-01
Background People with dementia may benefit from the effect of physical activity on behavioural and psychological symptoms of dementia. Qualitative synthesis of the importance of physical activity might complement and help clarify quantitative findings on this topic. The purpose of this systematic mixed studies review was to evaluate findings from both quantitative and qualitative methods about the effect and importance of physical activity on behavioural and psychological symptoms of dementia in people with dementia. Methods The systematic literature search was conducted in EMBASE, CINAHL, PubMed, PEDro and PsycINFO. Inclusion criteria were: people with mild to moderate dementia, interventions including physical activity, and outcomes focusing on behavioural and psychological symptoms of dementia or quality of life. To assess the methodological quality of the studies, the AMSTAR and GRADE checklists were applied for the quantitative studies and the CASP qualitative checklist for the qualitative studies. Results A small reduction in depression level and improved mood were seen in some quantitative studies of multi-component physical activity interventions, including walking. Due to high heterogeneity in the quantitative studies, a single summary of the effect of physical activity on behavioural and psychological symptoms of dementia should be interpreted with some caution. Across the qualitative studies, the common themes about the importance of physical activity were its 'socially rewarding' nature, the 'benefits of walking outdoors' and its contribution to 'maintaining self-hood'. Conclusion For people with dementia, there was a small, quantitative effect of multi-component physical activity, including walking, on depression level and mood. People with dementia reported the importance of walking outdoors and of experiencing the social rewards of physical activity in groups, and described physical activity as a means of maintaining self-hood.
A multiplexed system for quantitative comparisons of chromatin landscapes
van Galen, Peter; Viny, Aaron D.; Ram, Oren; Ryan, Russell J.H.; Cotton, Matthew J.; Donohue, Laura; Sievers, Cem; Drier, Yotam; Liau, Brian B.; Gillespie, Shawn M.; Carroll, Kaitlin M.; Cross, Michael B.; Levine, Ross L.; Bernstein, Bradley E.
2015-01-01
Genome-wide profiling of histone modifications can provide systematic insight into the regulatory elements and programs engaged in a given cell type. However, conventional chromatin immunoprecipitation and sequencing (ChIP-seq) does not capture quantitative information on histone modification levels, requires large amounts of starting material, and involves tedious processing of each individual sample. Here we address these limitations with a technology that leverages DNA barcoding to profile chromatin quantitatively and in multiplexed format. We concurrently map relative levels of multiple histone modifications across multiple samples, each comprising as few as a thousand cells. We demonstrate the technology by monitoring dynamic changes following inhibition of P300, EZH2 or KDM5, by linking altered epigenetic landscapes to chromatin regulator mutations, and by mapping active and repressive marks in purified human hematopoietic stem cells. Hence, this technology enables quantitative studies of chromatin state dynamics across rare cell types, genotypes, environmental conditions and drug treatments. PMID:26687680
Space Station Freedom Water Recovery test total organic carbon accountability
NASA Technical Reports Server (NTRS)
Davidson, Michael W.; Slivon, Laurence; Sheldon, Linda; Traweek, Mary
1991-01-01
Marshall Space Flight Center's (MSFC) Water Recovery Test (WRT) addresses the concept of integrated hygiene and potable reuse water recovery systems baselined for Space Station Freedom (SSF). To assess the adequacy of water recovery system designs and the conformance of reclaimed water quality to established specifications, MSFC has initiated an extensive water characterization program. MSFC's goal is to quantitatively account for a large percentage of the organic compounds present in waste and reclaimed hygiene and potable waters from the WRT and in humidity condensate from Spacelab missions. The program is organized into Phases A and B. Phase A's focus is qualitative and semi-quantitative; precise quantitative analyses are not emphasized. Phase B focuses on a near-complete quantitative characterization of all water types. Technical approaches, along with Phase A and partial Phase B investigations on the compositional analysis of Total Organic Carbon (TOC) Accountability, are presented.
Early Foundations for Mathematics Learning and Their Relations to Learning Disabilities.
Geary, David C
2013-02-01
Children's quantitative competencies upon entry into school can have lifelong consequences. Children who start behind generally stay behind, and mathematical skills at school completion influence employment prospects and wages in adulthood. I review the current debate over whether early quantitative learning is supported by (a) an inherent system for representing approximate magnitudes, (b) an attentional-control system that enables explicit processing of quantitative symbols, such as Arabic numerals, or (c) the logical problem-solving abilities that facilitate learning of the relations among numerals. Studies of children with mathematical learning disabilities and difficulties have suggested that each of these competencies may be involved, but to different degrees and at different points in the learning process. Clarifying how and when these competencies facilitate early quantitative learning and developing interventions to address their impact on children have the potential to yield substantial benefits for individuals and for society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cline, J. P.; Von Dreele, R. B.; Winburn, R.
2011-07-01
A non-diffracting surface layer exists at any boundary of a crystal and can comprise a mass fraction of several percent in a finely divided solid. This has led to the long-standing issue of amorphous content in standards for quantitative phase analysis (QPA). NIST standard reference material (SRM) 676a is a corundum (α-Al2O3) powder, certified with respect to phase purity for use as an internal standard in powder diffraction QPA. The amorphous content of SRM 676a is determined by comparing diffraction data from mixtures with samples of silicon powders that were engineered to vary their specific surface area. Under the (supported) assumption that the thickness of an amorphous surface layer on Si was invariant, this provided a method to control the crystalline/amorphous ratio of the silicon components of 50/50 weight mixtures of SRM 676a with silicon. Powder diffraction experiments utilizing neutron time-of-flight and 25 keV and 67 keV X-ray energies quantified the crystalline phase fractions from a series of specimens. Results from Rietveld analyses of these data, which included a model for extinction effects in the silicon, were extrapolated to the limit of zero amorphous content of the Si powder. The certified phase purity of SRM 676a is 99.02% ± 1.11% (95% confidence interval). This novel certification method permits quantification of amorphous content for any sample of interest, by spiking with SRM 676a.
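The spiking arithmetic implied by the last sentence can be made explicit. A sketch, assuming Rietveld phase fractions are normalized over crystalline phases only; the 50/50 mixing ratio and the certified 99.02% purity come from the abstract, while the example sample's 10% amorphous content is hypothetical.

```python
def amorphous_fraction(spike_mass_frac, spike_purity, spike_rietveld_frac):
    """Amorphous content of a sample quantified by spiking with a standard.

    spike_mass_frac     -- weighed-in mass fraction of the standard in the mixture
    spike_purity        -- certified crystalline purity of the standard
                           (0.9902 for SRM 676a)
    spike_rietveld_frac -- Rietveld-refined weight fraction of the standard,
                           normalized over crystalline phases only
    """
    # Total crystalline mass fraction of the whole mixture.
    crystalline_total = spike_mass_frac * spike_purity / spike_rietveld_frac
    amorphous_mixture = 1.0 - crystalline_total
    # Subtract the standard's own amorphous contribution, then rescale to the sample.
    return (amorphous_mixture - spike_mass_frac * (1.0 - spike_purity)) / (1.0 - spike_mass_frac)

# Hypothetical example: in a 50/50 mixture with SRM 676a, a standard that
# refines to ~52.4% of the crystalline material implies a ~10% amorphous sample.
example = amorphous_fraction(0.5, 0.9902, (0.5 * 0.9902) / (0.5 * 0.9902 + 0.5 * 0.90))
```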
Spatially Regularized Machine Learning for Task and Resting-state fMRI
Song, Xiaomu; Panych, Lawrence P.; Chen, Nan-kuei
2015-01-01
Background Reliable mapping of brain function across sessions and/or subjects in task- and resting-state has been a critical challenge for quantitative fMRI studies although it has been intensively addressed in the past decades. New Method A spatially regularized support vector machine (SVM) technique was developed for the reliable brain mapping in task- and resting-state. Unlike most existing SVM-based brain mapping techniques, which implement supervised classifications of specific brain functional states or disorders, the proposed method performs a semi-supervised classification for the general brain function mapping where spatial correlation of fMRI is integrated into the SVM learning. The method can adapt to intra- and inter-subject variations induced by fMRI nonstationarity, and identify a true boundary between active and inactive voxels, or between functionally connected and unconnected voxels in a feature space. Results The method was evaluated using synthetic and experimental data at the individual and group level. Multiple features were evaluated in terms of their contributions to the spatially regularized SVM learning. Reliable mapping results in both task- and resting-state were obtained from individual subjects and at the group level. Comparison with Existing Methods A comparison study was performed with independent component analysis, general linear model, and correlation analysis methods. Experimental results indicate that the proposed method can provide a better or comparable mapping performance at the individual and group level. Conclusions The proposed method can provide accurate and reliable mapping of brain function in task- and resting-state, and is applicable to a variety of quantitative fMRI studies. PMID:26470627
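As a toy illustration of the core idea, combining a voxel's own statistic with a spatially smoothed version of it as SVM features, the sketch below trains a linear SVM (batch Pegasos subgradient updates) on a small labelled subset of a synthetic activation map and classifies the rest. It is not the authors' semi-supervised algorithm; the map, noise level, and all parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "activation map": a square active region in a 32x32 slice plus noise.
truth = np.zeros((32, 32), dtype=bool)
truth[10:22, 10:22] = True
stat = truth.astype(float) + rng.normal(0.0, 0.6, truth.shape)  # voxelwise statistic

# Spatial feature: 3x3 local mean, a crude stand-in for spatial regularization.
pad = np.pad(stat, 1, mode="edge")
smooth = sum(pad[i:i + 32, j:j + 32] for i in range(3) for j in range(3)) / 9.0

X = np.column_stack([stat.ravel(), smooth.ravel(), np.ones(stat.size)])
y = np.where(truth.ravel(), 1.0, -1.0)

# Train a linear SVM by batch Pegasos subgradient descent on 200 labelled
# voxels; the remaining voxels play the role of unlabelled data.
idx = rng.choice(X.shape[0], 200, replace=False)
w = np.zeros(3)
lam = 1e-3
for t in range(1, 501):
    eta = 1.0 / (lam * t)
    violating = y[idx] * (X[idx] @ w) < 1.0  # margin violations
    grad = lam * w - (y[idx][violating, None] * X[idx][violating]).sum(0) / len(idx)
    w -= eta * grad

accuracy = (np.sign(X @ w) == y).mean()
```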
Suman, Arnela; Schaafsma, Frederieke G; Buchbinder, Rachelle; van Tulder, Maurits W; Anema, Johannes R
2017-09-01
Background To reduce the burden of low back pain (LBP) in the Netherlands, a multidisciplinary guideline for LBP has been implemented in Dutch primary care using a multifaceted implementation strategy targeted at health care professionals (HCPs) and patients. The current paper describes the process evaluation of the implementation among HCPs. Methods The strategy aimed to improve multidisciplinary collaboration and communication, and consisted of 7 components. This process evaluation was performed using the Linnan and Steckler framework. Data were collected using a mixed methods approach of quantitative and qualitative data. Results 128 HCPs participated in the implementation study, of which 96 participated in the quantitative and 21 in the qualitative evaluation. Overall dose delivered for this study was 89%, and the participants were satisfied with the strategy, mostly with the multidisciplinary approach, which contributed to the mutual understanding of each other's disciplines and perspectives. While the training sessions did not yield any new information, the strategy created awareness of the guideline and its recommendations, contributing to positively changing attitudes and aiding in improving guideline-adherent behaviour. However, many barriers to implementation still exist, including personal and practical factors, confidence, dependence and distrust issues among the HCPs, as well as policy factors (e.g. reimbursement systems). Conclusions The data presented in this paper have shown that the strategy used to implement the guideline in a Dutch primary care setting was feasible, especially when using a multidisciplinary approach. However, the barriers that were identified should be addressed in future implementation efforts.
How runoff begins (and ends): characterizing hydrologic response at the catchment scale
Mirus, Benjamin B.; Loague, Keith
2013-01-01
Improved understanding of the complex dynamics associated with spatially and temporally variable runoff response is needed to better understand the hydrology component of interdisciplinary problems. The objective of this study was to quantitatively characterize the environmental controls on runoff generation for the range of different streamflow-generation mechanisms illustrated in the classic Dunne diagram. The comprehensive physics-based model of coupled surface-subsurface flow, InHM, is employed in a heuristic mode. InHM has been employed previously to successfully simulate the observed hydrologic response at four diverse, well-characterized catchments, which provides the foundation for this study. The C3 and CB catchments are located within steep, forested terrain; the TW and R5 catchments are located in gently sloping rangeland. The InHM boundary-value problems for these four catchments provide the cornerstones for alternative simulation scenarios designed to address the question of how runoff begins (and ends). Simulated rainfall-runoff events are used to systematically explore the impact of soil-hydraulic properties and rainfall characteristics. This approach facilitates quantitative analysis of both integrated and distributed hydrologic responses at high spatial and temporal resolution over the wide range of environmental conditions represented by the four catchments. The results from 140 unique simulation scenarios illustrate how rainfall intensity/depth, subsurface permeability contrasts, characteristic curve shapes, and topography provide important controls on the hydrologic-response dynamics. The processes by which runoff begins (and ends) are shown, in large part, to be defined by the relative rates of rainfall, infiltration, lateral flow convergence, and storage dynamics within the variably saturated soil layers.
Staneva, Jordanka; Denkova, Pavletta; Todorova, Milka; Evstatieva, Ljuba
2011-01-05
(1)H NMR spectroscopy was used as a method for quantitative analysis of sesquiterpene lactones present in a crude lactone fraction isolated from Arnica montana. Eight main components - tigloyl-, methacryloyl-, isobutyryl- and 2-methylbutyryl-esters of helenalin (H) and 11α,13-dihydrohelenalin (DH) - were identified in the studied sample. The method allows the determination of the total amount of sesquiterpene lactones as well as the quantities of the helenalin-type and 11α,13-dihydrohelenalin-type esters separately. Furthermore, 6-O-tigloylhelenalin (HT, 1), 6-O-methacryloylhelenalin (HM, 2), 6-O-tigloyl-11α,13-dihydrohelenalin (DHT, 5), and 6-O-methacryloyl-11α,13-dihydrohelenalin (DHM, 6) were quantified as individual components. Copyright © 2010 Elsevier B.V. All rights reserved.
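The quantitation logic behind such ¹H NMR assays - scaling an analyte's peak integral against a reference signal of known concentration, normalized by the number of protons contributing to each peak - can be sketched as follows. This is a generic qNMR illustration, not the authors' protocol; all names and numbers are hypothetical:

```python
def qnmr_concentration(i_analyte, n_analyte, i_std, n_std, c_std):
    """Concentration of an analyte from 1H NMR peak integrals.

    i_*: integrated signal areas; n_*: number of protons giving rise
    to each signal; c_std: known concentration of the standard.
    """
    return (i_analyte / i_std) * (n_std / n_analyte) * c_std

# Hypothetical numbers: analyte integral 2.0 over a 1-proton signal,
# standard integral 3.0 over a 3-proton signal at 10 mmol/L.
c = qnmr_concentration(2.0, 1, 3.0, 3, 10.0)  # ≈ 20.0 mmol/L
```

The per-proton normalization is what lets a single internal standard quantify several components at once, as done for the four individual esters above.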
Encephalization and quantitative brain composition in bats in relation to their life-habits.
Pirlot, P; Pottier, J
1977-12-01
A quantitative analysis of the brains of 43 bat species is presented. Eleven brain components were studied. The species were arranged according to seven distinct dietary groups and it was found that the relative development of the principal components is related to those groups. The importance of neocorticalization as a reflection of evolution of all the bats in contrast to specialization in some species is stressed. This work gives a clearer view of Chiropteran progressiveness or primitiveness: the insectivorous forms occupy the least advanced, although most specialized, level; the vampires, the carnivorous species and the flying foxes are at the top of the scale. The importance of behaviour and the relative development of the central nervous system in the hierarchical classification of mammals is stressed.
A classical model for closed-loop diagrams of binary liquid mixtures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schnitzler, J.v.; Prausnitz, J.M.
1994-03-01
A classical lattice model for closed-loop temperature-composition phase diagrams has been developed. It considers the effect of specific interactions, such as hydrogen bonding, between dissimilar components. This van Laar-type model includes a Flory-Huggins term for the excess entropy of mixing. It is applied to several liquid-liquid equilibria of nonelectrolytes, where the molecules of the two components differ in size. The model is able to represent the observed data semi-quantitatively, but in most cases it is not flexible enough to predict all parts of the closed loop quantitatively. The ability of the model to represent different binary systems is discussed. Finally, attention is given to a correction term concerning the effect of concentration fluctuations near the upper critical solution temperature.
Frequency-selective quantitation of short-echo time 1H magnetic resonance spectra
NASA Astrophysics Data System (ADS)
Poullet, Jean-Baptiste; Sima, Diana M.; Van Huffel, Sabine; Van Hecke, Paul
2007-06-01
Accurate and efficient filtering techniques are required to suppress large nuisance components present in short-echo time magnetic resonance (MR) spectra. This paper discusses two powerful filtering techniques used in long-echo time MR spectral quantitation, the maximum-phase FIR filter (MP-FIR) and the Hankel-Lanczos Singular Value Decomposition with Partial ReOrthogonalization (HLSVD-PRO), and shows that they can be applied to their more complex short-echo time spectral counterparts. Both filters are validated and compared through extensive simulations. Their properties are discussed. In particular, the capability of MP-FIR for dealing with macromolecular components is emphasized. Although this property does not make a large difference for long-echo time MR spectra, it can be important when quantifying short-echo time spectra.
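The SVD idea underlying HLSVD-type filtering can be illustrated in a much-simplified form: a large nuisance signal (such as residual water) concentrates in the leading singular components of the Hankel matrix built from the time-domain signal, so subtracting a low-rank reconstruction suppresses it. The sketch below rests on that simplifying assumption; the actual HLSVD-PRO algorithm additionally estimates component frequencies and removes only those near the nuisance resonance:

```python
import numpy as np

def hankel_svd_suppress(fid, rank):
    """Subtract a leading-rank Hankel-SVD reconstruction from a FID.

    Simplified nuisance suppression: the dominant signal concentrates
    in the leading singular components of the Hankel matrix of the
    time-domain data, so removing them suppresses the large peak.
    """
    n = len(fid)
    rows = n // 2
    H = np.array([fid[i:i + n - rows + 1] for i in range(rows)])
    U, s, Vh = np.linalg.svd(H, full_matrices=False)
    H_low = (U[:, :rank] * s[:rank]) @ Vh[:rank]
    # Average the anti-diagonals to map the low-rank matrix back to
    # a time-domain nuisance estimate, then subtract it from the FID.
    nuisance = np.zeros(n, dtype=complex)
    counts = np.zeros(n)
    for i in range(H_low.shape[0]):
        for j in range(H_low.shape[1]):
            nuisance[i + j] += H_low[i, j]
            counts[i + j] += 1
    return np.asarray(fid) - nuisance / counts
```

A single damped complex exponential has an exactly rank-1 Hankel matrix, which is why a small `rank` captures a dominant water-like component well while leaving weaker metabolite signals largely intact.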
NASA Astrophysics Data System (ADS)
Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei
2015-04-01
Edible blend oil is a mixture of vegetable oils. Eligible blend oil can meet the daily human need for two essential fatty acids to achieve balanced nutrition. Each vegetable oil has a different composition, so the vegetable oil contents in edible blend oil determine its nutritional components. A high-precision quantitative method to detect the vegetable oil contents in blend oil is therefore necessary to ensure balanced nutrition. Three-dimensional fluorescence spectroscopy offers high selectivity, high sensitivity, and high efficiency. Efficient extraction and full use of the information in three-dimensional fluorescence spectra improve the accuracy of the measurement. A novel quantitative analysis based on Quasi-Monte-Carlo integration is proposed to improve measurement sensitivity and reduce random error. The partial least squares method is used to solve the nonlinear equations and avoid the effect of multicollinearity. The recovery rates of blend oil mixed from peanut oil, soybean oil, and sunflower oil are calculated to verify the accuracy of the method; they are higher than those of the linear method commonly used for component concentration measurement.
Accelerated life assessment of coating on the radar structure components in coastal environment.
Liu, Zhe; Ming, ZhiMao
2016-07-04
This paper aimed to build an accelerated life test scheme and to establish a quantitative relationship between accelerated life testing in the laboratory and actual service for a coating composed of epoxy primer and polyurethane paint on structural components of a radar operated in the coastal environment of the South China Sea. The accelerated life test scheme was built based on the service environment and a failure analysis of the coating. The quantitative comparison between accelerated life testing and actual service was conducted by comparing the gloss loss, discoloration, chalking, blistering, cracking, and electrochemical impedance spectroscopy of the coating. The main factors leading to coating failure were ultraviolet radiation, temperature, moisture, salt fog, and loads; accordingly, the accelerated life test included ultraviolet radiation, damp heat, thermal shock, fatigue, and salt spray. It was established that one cycle of the accelerated life test was equal to one year of actual service. This provides a precise way for manufacturers to predict the actual service life of newly developed coatings.
High-Content Screening for Quantitative Cell Biology.
Mattiazzi Usaj, Mojca; Styles, Erin B; Verster, Adrian J; Friesen, Helena; Boone, Charles; Andrews, Brenda J
2016-08-01
High-content screening (HCS), which combines automated fluorescence microscopy with quantitative image analysis, allows the acquisition of unbiased multiparametric data at the single cell level. This approach has been used to address diverse biological questions and identify a plethora of quantitative phenotypes of varying complexity in numerous different model systems. Here, we describe some recent applications of HCS, ranging from the identification of genes required for specific biological processes to the characterization of genetic interactions. We review the steps involved in the design of useful biological assays and automated image analysis, and describe major challenges associated with each. Additionally, we highlight emerging technologies and future challenges, and discuss how the field of HCS might be enhanced in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.
Spatiotemporal Characterization of a Fibrin Clot Using Quantitative Phase Imaging
Gannavarpu, Rajshekhar; Bhaduri, Basanta; Tangella, Krishnarao; Popescu, Gabriel
2014-01-01
Studying the dynamics of fibrin clot formation and its morphology is an important problem in biology and has significant impact for several scientific and clinical applications. We present a label-free technique based on quantitative phase imaging to address this problem. Using quantitative phase information, we characterized fibrin polymerization in real-time and present a mathematical model describing the transition from liquid to gel state. By exploiting the inherent optical sectioning capability of our instrument, we measured the three-dimensional structure of the fibrin clot. From this data, we evaluated the fractal nature of the fibrin network and extracted the fractal dimension. Our non-invasive and speckle-free approach analyzes the clotting process without the need for external contrast agents. PMID:25386701
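A standard way to extract a fractal dimension from imaging data of this kind is box counting: count the grid boxes occupied by the structure at successive box sizes and fit the slope of log(count) against log(1/size). A minimal 2-D sketch of that technique (a generic illustration, not the authors' 3-D quantitative-phase pipeline):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary 2-D mask.

    Counts boxes of each size containing at least one occupied pixel,
    then fits log(count) against log(1/size); the slope is the
    box-counting dimension.
    """
    counts = []
    for s in sizes:
        h, w = mask.shape
        # Trim so the grid of s-by-s boxes tiles the mask exactly
        trimmed = mask[:h - h % s, :w - w % s]
        blocks = trimmed.reshape(trimmed.shape[0] // s, s,
                                 trimmed.shape[1] // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)),
                          np.log(counts), 1)
    return slope

# Sanity check: a filled square is 2-dimensional
square = np.ones((64, 64), dtype=bool)
print(round(box_counting_dimension(square), 2))  # → 2.0
```

For a fibrin network the mask would come from thresholding the phase image, and the dimension falls between 1 and 2 per slice (or 2 and 3 in the volumetric case).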
Blending quantitative and qualitative methods in language research and intervention.
Brinton, Bonnie; Fujiki, Martin
2003-05-01
Best practice in speech-language pathology should be informed by current research findings. Traditional research methods are not always geared to address some of the complex, individual questions that arise in clinical intervention, however. Qualitative research methods may provide useful tools for bridging the gap from research to practice. Combinations of qualitative and quantitative procedures may be particularly helpful in sorting out some of the important issues surrounding language intervention in both clinical and research contexts. Examples of research blending qualitative and quantitative methods, as well as the case study of Sid, an 11-year-old boy with specific language impairment, are presented to illustrate how a combination of procedures can be used to enhance language research and intervention.
Nutrition Surveillance. Annual Summary 1982.
ERIC Educational Resources Information Center
Centers for Disease Control (DHHS/PHS), Atlanta, GA.
This report summarizes information, including selected indices of nutritional status, as reported from 28 states and the District of Columbia to the Nutritional Status Surveillance System. This system has two components, one addressing nutritional status among high-risk pediatric populations, and the other addressing nutritional status among…
Yi, YaXiong; Zhang, Yong; Ding, Yue; Lu, Lu; Zhang, Tong; Zhao, Yuan; Xu, XiaoJun; Zhang, YuXin
2016-11-01
We developed a novel quantitative analysis method based on ultra high performance liquid chromatography coupled with diode array detection for the simultaneous determination of the 14 main active components in Yinchenhao decoction. All components were separated on an Agilent SB-C18 column by using a gradient solvent system of acetonitrile/0.1% phosphoric acid solution at a flow rate of 0.4 mL/min for 35 min. Subsequently, linearity, precision, repeatability, and accuracy tests were implemented to validate the method. Furthermore, the method was applied to a compositional difference analysis of the 14 components in eight normal-extraction Yinchenhao decoction samples, accompanied by hierarchical clustering analysis and similarity analysis. All samples fell into three groups based on differing component contents, demonstrating that the extraction method (decocting, refluxing, or ultrasonication) and the extraction solvent (water or ethanol) affected the component profile, which should be relevant to its clinical applications. The results also indicated that a sample prepared by patients at home by water extraction in a casserole was almost the same as one prepared in a stainless-steel kettle, as mostly used in pharmaceutical factories. This research should help patients select the best and most convenient method for preparing Yinchenhao decoction. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sleep Duration and Area-Level Deprivation in Twins.
Watson, Nathaniel F; Horn, Erin; Duncan, Glen E; Buchwald, Dedra; Vitiello, Michael V; Turkheimer, Eric
2016-01-01
We used quantitative genetic models to assess whether area-level deprivation as indicated by the Singh Index predicts shorter sleep duration and modifies its underlying genetic and environmental contributions. Participants were 4,218 adult twin pairs (2,377 monozygotic and 1,841 dizygotic) from the University of Washington Twin Registry. Participants self-reported habitual sleep duration. The Singh Index was determined by linking geocoded addresses to 17 indicators at the census-tract level using data from the Census of Washington State and Census Tract Cartographic Boundary Files from 2000 and 2010. Data were analyzed using univariate and bivariate genetic decomposition and quantitative genetic interaction models that assessed A (additive genetics), C (common environment), and E (unique environment) main effects of the Singh Index on sleep duration and allowed the magnitude of residual ACE variance components in sleep duration to vary with the Index. The sample had a mean age of 38.2 y (standard deviation [SD] = 18), and was predominantly female (62%) and Caucasian (91%). Mean sleep duration was 7.38 h (SD = 1.20) and the mean Singh Index score was 0.00 (SD = 0.89). The heritability of sleep duration was 39% and that of the Singh Index was 12%. The uncontrolled phenotypic regression of sleep duration on the Singh Index showed a significant negative relationship between area-level deprivation and sleep length (b = -0.080, P < 0.001). Every 1-SD increase in the Singh Index was associated with a ∼4.5 min change in sleep duration. For the quasi-causal bivariate model, there was a significant main effect of E (b(0E) = -0.063; standard error [SE] = 0.30; P < 0.05). Residual variance components unique to sleep duration were significant for both A (b(0Au) = 0.734; SE = 0.020; P < 0.001) and E (b(0Eu) = 0.934; SE = 0.013; P < 0.001). Area-level deprivation has a quasi-causal association with sleep duration, with greater deprivation being related to shorter sleep.
As area-level deprivation increases, unique genetic and nonshared environmental residual variance in sleep duration increases. © 2016 Associated Professional Sleep Societies, LLC.
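The classical-twin logic behind ACE heritability estimates like these can be illustrated with Falconer's formulas, which apportion phenotypic variance to additive genetics (A), common environment (C), and unique environment (E) from monozygotic and dizygotic twin correlations. This is a textbook approximation for intuition, not the structural-equation interaction models the study actually fitted; the correlations below are hypothetical:

```python
def falconer_ace(r_mz, r_dz):
    """Falconer's approximation of ACE variance components from
    monozygotic (r_mz) and dizygotic (r_dz) twin correlations."""
    a = 2 * (r_mz - r_dz)   # additive genetic share (heritability)
    c = 2 * r_dz - r_mz     # common-environment share
    e = 1 - r_mz            # unique-environment share (+ error)
    return a, c, e

# Hypothetical correlations: r_MZ = 0.40, r_DZ = 0.21
a, c, e = falconer_ace(0.40, 0.21)  # a ≈ 0.38, c ≈ 0.02, e = 0.60
```

The interaction models in the abstract go one step further by letting these variance components change as a function of a moderator (here, the Singh Index).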
Large scale database scrubbing using object oriented software components.
Herting, R L; Barnes, M R
1998-01-01
Now that case managers, quality improvement teams, and researchers use medical databases extensively, the ability to share and disseminate such databases while maintaining patient confidentiality is paramount. A process called scrubbing addresses this problem by removing personally identifying information while keeping the integrity of the medical information intact. Scrubbing entire databases, containing multiple tables, requires that the implicit relationships between data elements in different tables of the database be maintained. To address this issue we developed DBScrub, a Java program that interfaces with any JDBC compliant database and scrubs the database while maintaining the implicit relationships within it. DBScrub uses a small number of highly configurable object-oriented software components to carry out the scrubbing. We describe the structure of these software components and how they maintain the implicit relationships within the database.
Sillam-Dussès, David; Kalinová, Blanka; Jiros, Pavel; Brezinová, Anna; Cvacka, Josef; Hanus, Robert; Sobotník, Jan; Bordereau, Christian; Valterová, Irena
2009-08-01
GC/MS analysis confirmed that neocembrene is the major component of the trail pheromone in the three species of the termite genus Prorhinotermes (P. simplex, P. canalifrons, P. inopinatus). In addition, EAG and GC-EAD experiments with P. simplex strongly suggest that dodecatrienol is a quantitatively minor component but a qualitatively important component of this trail pheromone. Trail-following bioassays confirmed the two-component nature of the trail pheromone. This is the first report of the use of the GC-EAD for the identification of trail pheromone in termites. These original results underline once again the special phylogenetic status of the Prorhinotermitinae among Rhinotermitidae.
DAnTE: a statistical tool for quantitative analysis of –omics data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep
2008-05-03
DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
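Two of the steps named above, normalization and peptide-to-protein rollup, can be sketched generically: median-center each sample's log abundances so samples are comparable, then summarize each protein from its peptides. This illustrates the concepts only and is not DAnTE's implementation (DAnTE offers several methods for each step):

```python
import numpy as np

def median_normalize(matrix):
    """Subtract each sample's (column's) median log-abundance so
    that samples are directly comparable."""
    return matrix - np.nanmedian(matrix, axis=0, keepdims=True)

def rollup(matrix, protein_ids):
    """Roll peptide rows up to proteins via the median of each
    protein's peptides; NaNs are treated as missing values."""
    return {p: np.nanmedian(
                matrix[[i for i, q in enumerate(protein_ids) if q == p]],
                axis=0)
            for p in sorted(set(protein_ids))}

# Toy data: 3 peptides x 2 samples, peptides 0-1 belong to protein A
peptides = np.array([[5.0, 6.0], [7.0, 8.0], [9.0, 10.0]])
normalized = median_normalize(peptides)
proteins = rollup(normalized, ["A", "A", "B"])
```

Median-based choices are deliberately robust to outlier peptides; DAnTE's ANOVA scheme would then operate on the rolled-up protein matrix.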
ERIC Educational Resources Information Center
Gröschner, Alexander; Seidel, Tina; Kiemer, Katharina; Pehmer, Ann-Kathrin
2015-01-01
For developing professional development (PD) programs, research suggests referring to effective components. In developing a PD program on classroom dialogue, we explored to what extent effective components could be addressed. We conducted a study with two groups. In the "Dialogic Video Cycle" (DVC), six German teachers participated in a…
Precocious quantitative cognition in monkeys.
Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F
2016-02-01
Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.
Hou, Sen; Sun, Lili; Wieczorek, Stefan A; Kalwarczyk, Tomasz; Kaminski, Tomasz S; Holyst, Robert
2014-01-15
Fluorescent double-stranded DNA (dsDNA) molecules labeled at both ends are commonly produced by annealing complementary single-stranded DNA (ssDNA) molecules labeled with fluorescent dyes at the same (3' or 5') end. Because the labeling efficiency of ssDNA is below 100%, the resulting dsDNA molecules carry two dyes, one dye, or none. Existing methods cannot measure the percentage of the doubly-labeled dsDNA component in a fluorescent DNA sample; it is even difficult to distinguish the doubly-labeled component from the singly-labeled one. Accurate measurement of the percentage of the doubly-labeled dsDNA component is a critical prerequisite for quantitative biochemical measurements and has puzzled scientists for decades. We established a fluorescence correlation spectroscopy (FCS) system to measure the percentage of doubly labeled dsDNA (PDL) in the total fluorescent dsDNA pool. The method is based on comparative analysis of the given sample and a reference dsDNA sample prepared by adding a certain amount of unlabeled ssDNA to the original ssDNA solution. From the FCS autocorrelation functions, we obtain the number of fluorescent dsDNA molecules in the focal volume of the confocal microscope and PDL. We also calculate the labeling efficiency of the ssDNA. The method requires a minimal amount of material: the samples have DNA concentrations in the nanomolar range and volumes of tens of microliters. We verified the method by using the restriction enzyme Hind III to cleave the fluorescent dsDNA. The kinetics of the reaction depends strongly on PDL, a critical parameter for quantitative biochemical measurements. Copyright © 2013 Elsevier B.V. All rights reserved.
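Under the abstract's premise - each of the two annealed strands is labeled independently with efficiency p - the expected fractions follow from simple binomial reasoning, which makes the quantity being measured concrete. This is a model of the expected value only, not the FCS measurement itself:

```python
def expected_pdl(p):
    """Expected fraction of doubly labeled dsDNA among all
    *fluorescent* dsDNA, for ssDNA labeling efficiency p.

    Doubly labeled duplexes occur with probability p*p, singly
    labeled with 2*p*(1-p); unlabeled duplexes, (1-p)**2, are
    invisible to fluorescence and so are excluded from the pool.
    """
    doubly = p * p
    fluorescent = p * p + 2 * p * (1 - p)
    return doubly / fluorescent  # simplifies to p / (2 - p)

# Even a 90% labeling efficiency leaves a sizable singly-labeled
# fraction: expected_pdl(0.9) ≈ 0.82
```

This is why PDL must be measured rather than assumed: it never reaches 1 unless labeling is perfect, and reaction kinetics that depend on double labeling (as in the Hind III assay) are biased accordingly.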
Forman, Jane; Creswell, John W; Damschroder, Laura; Kowalski, Christine P; Krein, Sarah L
2008-12-01
Infection control professionals and hospital epidemiologists are accustomed to using quantitative research. Although quantitative studies are extremely important in the field of infection control and prevention, often they cannot help us explain why certain factors affect the use of infection control practices and identify the underlying mechanisms through which they do so. Qualitative research methods, which use open-ended techniques, such as interviews, to collect data and nonstatistical techniques to analyze it, provide detailed, diverse insights of individuals, useful quotes that bring a realism to applied research, and information about how different health care settings operate. Qualitative research can illuminate the processes underlying statistical correlations, inform the development of interventions, and show how interventions work to produce observed outcomes. This article describes the key features of qualitative research and the advantages that such features add to existing quantitative research approaches in the study of infection control. We address the goal of qualitative research, the nature of the research process, sampling, data collection and analysis, validity, generalizability of findings, and presentation of findings. Health services researchers are increasingly using qualitative methods to address practical problems by uncovering interacting influences in complex health care environments. Qualitative research methods, applied with expertise and rigor, can contribute important insights to infection prevention efforts.
Exploring how to increase response rates to surveys of older people.
Palonen, Mira; Kaunonen, Marja; Åstedt-Kurki, Päivi
2016-05-01
To address the special considerations that need to be taken into account when collecting data from older people in healthcare research. An objective of all research studies is to ensure there is an adequate sample size. The final sample size will be influenced by methods of recruitment and data collection, among other factors. There are some special considerations that need to be addressed when collecting data among older people. Quantitative surveys of people aged 60 or over in 2009-2014 were analysed using statistical methods. A quantitative study of patients aged 75 or over in an emergency department was used as an example. A methodological approach to analysing quantitative studies concerned with older people. The best way to ensure high response rates in surveys involving people aged 60 or over is to collect data in the presence of the researcher; response rates are lowest in posted surveys and settings where the researcher is not present when data are collected. Response rates do not seem to vary according to the database from which information about the study participants is obtained or according to who is responsible for recruitment to the survey. Implications for research/practice: To conduct coherent studies with older people, the data collection process should be carefully considered.
Ground tire rubber (GTR) as a component material in concrete mixtures for paving concrete.
DOT National Transportation Integrated Search
2015-02-01
This research was done to investigate if the problems associated with flexibility and temperature sensitivity (expansion and : contraction) in roadway concrete pavements can be addressed by replacing some of the fine or coarse aggregate component : w...