Quantitative Ultrasound for Measuring Obstructive Severity in Children with Hydronephrosis.
Cerrolaza, Juan J; Peters, Craig A; Martin, Aaron D; Myers, Emmarie; Safdar, Nabile; Linguraru, Marius George
2016-04-01
We define sonographic biomarkers for hydronephrotic renal units that can predict the necessity of diuretic nuclear renography. We selected a cohort of 50 consecutive patients with hydronephrosis of varying severity in whom 2-dimensional sonography and diuretic mercaptoacetyltriglycine renography had been performed. A total of 131 morphological parameters were computed using quantitative image analysis algorithms. Machine learning techniques were then applied to identify ultrasound-based safety thresholds that agreed with the t½ for washout. A best-fit model was then derived for each clinically relevant threshold level of t½: 20, 30 and 40 minutes. Receiver operating characteristic curve analysis was performed, and sensitivity, specificity and area under the receiver operating characteristic curve were determined. The improvement obtained by the quantitative imaging method compared to the Society for Fetal Urology grading system and the hydronephrosis index was statistically verified. For the 3 thresholds considered and at 100% sensitivity, the specificities of the quantitative imaging method were 94%, 70% and 74%, respectively. Corresponding areas under the receiver operating characteristic curve were 0.98, 0.94 and 0.94, respectively. The improvement obtained by the quantitative imaging method over the Society for Fetal Urology grade and hydronephrosis index was statistically significant (p <0.05 in all cases). Quantitative imaging analysis of renal sonograms in children with hydronephrosis can identify thresholds of clinically significant washout times with 100% sensitivity, decreasing the number of diuretic renograms needed by up to 62%. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
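The "100% sensitivity threshold" analysis described above can be sketched in a few lines. This is a minimal illustration with synthetic scores and labels (not the authors' model or data), assuming higher scores indicate more severe obstruction: pick the most conservative cutoff that still flags every positive case, then read off the specificity.

```python
import numpy as np

def specificity_at_full_sensitivity(scores, labels):
    """Choose the most conservative threshold that still flags every positive
    case (100% sensitivity) and report the specificity achieved there.
    Higher scores are assumed to indicate more severe obstruction."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    threshold = scores[labels].min()        # lowest-scoring positive case
    predicted = scores >= threshold
    specificity = np.sum(~predicted & ~labels) / np.sum(~labels)
    return threshold, specificity

# Synthetic scores and outcome labels (1 = clinically significant washout time).
scores = [0.9, 0.8, 0.75, 0.7, 0.72, 0.3, 0.2, 0.1]
labels = [1, 1, 1, 1, 0, 0, 0, 0]
thr, spec = specificity_at_full_sensitivity(scores, labels)
print(thr, spec)  # → 0.7 0.75 (one of four negatives exceeds the cutoff)
```

Sweeping this procedure over candidate imaging parameters at each t½ cutoff (20, 30, 40 minutes) reproduces the kind of sensitivity/specificity trade-off the abstract reports.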
A Quantitative Analysis of Cognitive Strategy Usage in the Marking of Two GCSE Examinations
ERIC Educational Resources Information Center
Suto, W. M. Irenka; Greatorex, Jackie
2008-01-01
Diverse strategies for marking GCSE examinations have been identified, ranging from simple automatic judgements to complex cognitive operations requiring considerable expertise. However, little is known about patterns of strategy usage or how such information could be utilised by examiners. We conducted a quantitative analysis of previous verbal…
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine whether engineering modeling is applicable to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 as best mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 as having the lowest priority for further analysis.
Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M
2018-06-05
Background parenchymal uptake (BPU), which refers to the level of Tc-99m sestamibi uptake within normal fibroglandular tissue on molecular breast imaging (MBI), has been identified as a breast cancer risk factor, independent of mammographic density. Prior analyses have used subjective categories to describe BPU. We evaluate a new quantitative method for assessing BPU by testing its reproducibility, comparing quantitative results with previously established subjective BPU categories, and determining the association of quantitative BPU with breast cancer risk. Two nonradiologist operators independently performed region-of-interest analysis on MBI images viewed in conjunction with corresponding digital mammograms. Quantitative BPU was defined as a unitless ratio of the average pixel intensity (counts/pixel) within the fibroglandular tissue versus the average pixel intensity in fat. Operator agreement and the correlation of quantitative BPU measures with subjective BPU categories assessed by expert radiologists were determined. Percent density on mammograms was estimated using Cumulus. The association of quantitative BPU with breast cancer (per one unit BPU) was examined within an established case-control study of 62 incident breast cancer cases and 177 matched controls. Quantitative BPU ranged from 0.4 to 3.2 across all subjects and was on average higher in cases compared to controls (1.4 versus 1.2, p < 0.007 for both operators). Quantitative BPU was strongly correlated with subjective BPU categories (Spearman's r = 0.59 to 0.69, p < 0.0001, for each paired combination of two operators and two radiologists). Interoperator and intraoperator agreement in the quantitative BPU measure, assessed by intraclass correlation, was 0.92 and 0.98, respectively. Quantitative BPU measures showed either no correlation or weak negative correlation with mammographic percent density. 
In a model adjusted for body mass index and percent density, higher quantitative BPU was associated with increased risk of breast cancer for both operators (OR = 4.0, 95% confidence interval (CI) 1.6-10.1, and 2.4, 95% CI 1.2-4.7). Quantitative measurement of BPU, defined as the ratio of average counts in fibroglandular tissue relative to that in fat, can be reliably performed by nonradiologist operators with a simple region-of-interest analysis tool. Similar to results obtained with subjective BPU categories, quantitative BPU is a functional imaging biomarker of breast cancer risk, independent of mammographic density and hormonal factors.
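The quantitative BPU measure defined above (mean counts/pixel in the fibroglandular region of interest divided by the mean in the fat region) reduces to a one-line ratio. The image and ROI masks below are synthetic placeholders, not MBI data:

```python
import numpy as np

def quantitative_bpu(image, fibroglandular_mask, fat_mask):
    """Unitless BPU: mean counts/pixel in the fibroglandular ROI divided by
    mean counts/pixel in the fat ROI."""
    return image[fibroglandular_mask].mean() / image[fat_mask].mean()

# Synthetic 4x4 "MBI" count image with hand-drawn ROI masks.
img = np.array([[12.0, 12, 4, 4],
                [12,   12, 4, 4],
                [ 6,    6, 4, 4],
                [ 6,    6, 4, 4]])
fg = np.zeros_like(img, dtype=bool); fg[:2, :2] = True   # fibroglandular ROI
fat = np.zeros_like(img, dtype=bool); fat[:, 2:] = True  # fat ROI
print(quantitative_bpu(img, fg, fat))  # → 3.0 (12 vs 4 counts/pixel)
```

The simplicity of the ratio is what allows nonradiologist operators to compute it reproducibly with a basic region-of-interest tool.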
Products of combustion of non-metallic materials
NASA Technical Reports Server (NTRS)
Perry, Cortes L.
1995-01-01
The objective of this project is to evaluate methodologies for the qualitative and quantitative determination of the gaseous products of combustion of non-metallic materials of interest to the aerospace community. The goal is to develop instrumentation and analysis procedures which qualitatively and quantitatively identify gaseous products evolved by thermal decomposition and provide NASA a detailed system operating procedure.
Volgushev, Maxim; Malyshev, Aleksey; Balaban, Pavel; Chistiakova, Marina; Volgushev, Stanislav; Wolf, Fred
2008-04-09
The generation of action potentials (APs) is a key process in the operation of nerve cells and the communication between neurons. Action potentials in mammalian central neurons are characterized by an exceptionally fast onset dynamics, which differs from the typically slow and gradual onset dynamics seen in identified snail neurons. Here we describe a novel method of analysis which provides a quantitative measure of the onset dynamics of action potentials. This method captures the difference between the fast, step-like onset of APs in rat neocortical neurons and the gradual, exponential-like AP onset in identified snail neurons. The quantitative measure of the AP onset dynamics, provided by the method, allows us to perform quantitative analyses of factors influencing the dynamics.
Operational models of infrastructure resilience.
Alderson, David L; Brown, Gerald G; Carlyle, W Matthew
2015-04-01
We propose a definition of infrastructure resilience that is tied to the operation (or function) of an infrastructure as a system of interacting components and that can be objectively evaluated using quantitative models. Specifically, for any particular system, we use quantitative models of system operation to represent the decisions of an infrastructure operator who guides the behavior of the system as a whole, even in the presence of disruptions. Modeling infrastructure operation in this way makes it possible to systematically evaluate the consequences associated with the loss of infrastructure components, and leads to a precise notion of "operational resilience" that facilitates model verification, validation, and reproducible results. Using a simple example of a notional infrastructure, we demonstrate how to use these models for (1) assessing the operational resilience of an infrastructure system, (2) identifying critical vulnerabilities that threaten its continued function, and (3) advising policymakers on investments to improve resilience. © 2014 Society for Risk Analysis.
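A minimal sketch of this notion of operational resilience, under the illustrative assumption (not from the source) that the infrastructure is a capacitated flow network and the operator maximizes delivered flow: the function lost when each component is disrupted ranks the critical vulnerabilities.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp maximum flow on a dict-of-dicts capacity graph."""
    residual = {u: dict(vs) for u, vs in capacity.items()}
    for u, vs in capacity.items():
        for v in vs:
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        # Breadth-first search for an augmenting path.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        # Find the bottleneck along the path, then augment.
        bottleneck, v = float("inf"), sink
        while parent[v] is not None:
            bottleneck = min(bottleneck, residual[parent[v]][v])
            v = parent[v]
        v = sink
        while parent[v] is not None:
            residual[parent[v]][v] -= bottleneck
            residual[v][parent[v]] += bottleneck
            v = parent[v]
        flow += bottleneck

# Notional infrastructure: source "s" supplies sink "t" through two routes.
network = {"s": {"a": 10, "b": 5}, "a": {"t": 8}, "b": {"t": 7}, "t": {}}
baseline = max_flow(network, "s", "t")  # delivered function when intact

# Consequence of losing each component = drop in delivered function.
for u, v in [("s", "a"), ("s", "b"), ("a", "t"), ("b", "t")]:
    damaged = {x: dict(ys) for x, ys in network.items()}
    damaged[u][v] = 0
    print((u, v), baseline - max_flow(damaged, "s", "t"))
```

Here the links on the s-a-t route cost 8 units of function when lost, versus 5 for the s-b-t route, so an investment model would prioritize hardening the former.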
Fraysse, François; Milanese, Steven; Thewlis, Dominic
2016-12-01
Load restraint systems in automobile transport utilise tie-down lashings placed over the car's tyres, which are tensioned manually by the operator using a ratchet assembly. This process has been identified as a significant manual handling injury risk. The aim of this study was to gain insight on the current practices associated with tie-down lashings operation, and identify the gaps between current and optimal practice. We approached this with qualitative and quantitative assessments and one numerical simulation to establish: (i) insight into the factors involved in ratcheting; (ii) the required tension to hold the car on the trailer; and (iii) the tension achieved by drivers in practice and associated joint loads. We identified that the method recommended to the drivers was not used in practice. Drivers instead tensioned the straps to the maximum of their capability, leading to over-tensioning and mechanical overload at the shoulder and elbow. We identified the postures and strategies that resulted in the lowest loads on the upper body during ratcheting (using both hands and performing the task with their full body). This research marks the first step towards the development of a training programme aiming at changing practice to reduce injury risks associated with the operation of tie-down lashings in the automobile transport industry. Practitioner Summary: The study investigated current practice associated with the operation of tie-down lashings through qualitative (interviews) and quantitative (biomechanical analysis) methods. Operators tended to systematically over-tension the lashings and consequently overexert, increasing injury risks.
NASA Astrophysics Data System (ADS)
Xue, Fei; Bompard, Ettore; Huang, Tao; Jiang, Lin; Lu, Shaofeng; Zhu, Huaiying
2017-09-01
As the modern power system evolves into a more intelligent and efficient version, i.e. the smart grid, or into the central backbone of the energy internet for free energy interactions, security concerns related to cascading failures have been raised in view of their potentially catastrophic consequences. Research on topological analysis based on complex networks has contributed greatly to revealing structural vulnerabilities of power grids, including cascading failure analysis. However, the existing literature, built on inappropriate modeling assumptions, still cannot distinguish the effects of structure from those of operational state, and so cannot give meaningful guidance for system operation. This paper reveals the interrelation between network structure and operational states in cascading failure and gives a quantitative evaluation that integrates both perspectives. For structural analysis, cascading paths are identified by extended betweenness and quantitatively described by cascading drop and cascading gradient. The operational state along cascading paths is described by loading level. The risk of cascading failure along a specific cascading path can then be quantitatively evaluated by considering these two factors. The maximum cascading gradient over all possible cascading paths serves as an overall metric for the entire power grid with respect to cascading failure. The proposed method is tested and verified on the IEEE 30-bus and IEEE 118-bus systems; the simulation evidence presented in this paper suggests that the proposed model can identify the structural causes of cascading failure and is promising for guiding the protection of system operation in the future.
Advancing the Fork detector for quantitative spent nuclear fuel verification
Vaccaro, S.; Gauld, I. C.; Hu, J.; ...
2018-01-31
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations.
The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.
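A hypothetical go/no-go check in the spirit of this analysis can be sketched as below. The tolerance band, count rates, and assembly identifiers are invented for illustration; they are not the actual EURATOM criterion or campaign data.

```python
def verify_declaration(measured_rate, predicted_rate, rel_tolerance=0.15):
    """Compare a measured Fork neutron count rate against the rate predicted
    from the operator's declaration; flag the assembly when the ratio falls
    outside the +/- rel_tolerance band around 1.0."""
    ratio = measured_rate / predicted_rate
    return ratio, abs(ratio - 1.0) <= rel_tolerance

# Synthetic campaign data: (assembly id, measured n/s, predicted n/s).
campaign = [("A01", 1.02e4, 1.00e4), ("A02", 7.9e3, 1.05e4), ("A03", 9.6e3, 9.8e3)]
for asm, measured, predicted in campaign:
    ratio, ok = verify_declaration(measured, predicted)
    print(asm, round(ratio, 2), "GO" if ok else "NO-GO")
```

In practice the width of the acceptance band would come from the uncertainty evaluation described above, so that false alarms stay rare while genuine declaration discrepancies are still flagged.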
Assessment of Scheduling and Plan Execution of Apollo 14 Lunar Surface Operations
NASA Technical Reports Server (NTRS)
Marquez, Jessica J.
2010-01-01
Although more than forty years have passed since the first Moon landing, there is not yet a comprehensive, quantitative assessment of Apollo extravehicular activities (EVAs). Quantitatively evaluating lunar EVAs will provide a better understanding of the challenges involved in surface operations. This first evaluation of a surface EVA centers on comparing the planned and the as-run timelines, specifically collecting data on discrepancies between estimated and executed durations. Differences were summarized by task category in order to gain insight into the types of surface operation activities that were most challenging. One Apollo 14 EVA was assessed using the described methodology. The selected metrics and task categorizations were effective, and limitations of the process were identified.
Analyses in support of risk-informed natural gas vehicle maintenance facility codes and standards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekoto, Isaac W.; Blaylock, Myra L.; LaFleur, Angela Christine
2014-03-01
Safety standards development for maintenance facilities of liquid and compressed gas fueled large-scale vehicles is required to ensure proper facility design and operation envelopes. Standards development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase I work on existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis of their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest. Finally, scenario analyses were performed using detailed simulations and modeling to estimate the overpressure hazards from the HAZOP-defined scenarios. The results from Phase I will be used to identify significant risk contributors at NGV maintenance facilities, and are expected to form the basis for follow-on quantitative risk analysis work to address specific code requirements and identify effective accident prevention and mitigation strategies.
Driver detention times in commercial motor vehicle operations.
DOT National Transportation Integrated Search
2014-12-01
The purpose of this project was to quantitatively identify detention times in the commercial motor vehicle (CMV) industry. Although there is currently no standard definition, the industry commonly defines detention time as any time drivers hav...
Space Transportation Operations: Assessment of Methodologies and Models
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla
2001-01-01
The systems design process for future space transportation involves understanding multiple variables and their effect on lifecycle metrics. Variables such as technology readiness or potential environmental impact are qualitative, while variables such as reliability, operations costs or flight rates are quantitative. In deciding what new design concepts to fund, NASA needs a methodology that would assess the sum total of all relevant qualitative and quantitative lifecycle metrics resulting from each proposed concept. The objective of this research was to review the state of operations assessment methodologies and models used to evaluate proposed space transportation systems and to develop recommendations for improving them. It was found that, compared to the models available from other sources, the operations assessment methodology recently developed at Kennedy Space Center has the potential to produce a decision support tool that will serve as the industry standard. Towards that goal, a number of areas of improvement in the Kennedy Space Center's methodology are identified.
Spectrophotometric Titration of a Mixture of Calcium and Magnesium.
ERIC Educational Resources Information Center
Fulton, Robert; And Others
1986-01-01
Describes a spectrophotometric titration experiment which uses a manual titration spectrophotometer and manually operated buret, rather than special instrumentation. Identifies the equipment, materials, and procedures needed for the completion of the experiment. Recommends the use of this experiment in introductory quantitative analysis…
Composite Structure with Origami Core
2016-07-19
...spherical linkages using mechanism theory. Precise motions of origami were identified. In the second case, we identified a link between thick panel... operating reversibly by a coupled tension-to-torsion actuation mechanism. Using theory, we quantitatively explain the complementary effects of an increase in... structures. Our research has paved the way for much broader utilization of such structures in the aeronautics and aerospace industries.
Hydrothermal Liquefaction Treatment Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
Hazard analyses were performed to evaluate the modular hydrothermal liquefaction treatment system. The hazard assessment process was performed in two stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis, augmented as necessary by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public. The following hazardous scenarios received increased attention:
• For scenarios involving a release of hazardous material or energy, controls were identified in the What-If analysis table that prevent the occurrence or mitigate the effects of the release.
• For scenarios with significant consequences that could impact personnel outside the immediate operations area, quantitative analyses were performed to determine the potential magnitude of the scenario. A set of "critical controls" that prevent the occurrence or mitigate the effects of these high-consequence events was identified for these scenarios (see Section 4).
Rapid quantification of soilborne pathogen communities in wheat-based long-term field experiments
USDA-ARS?s Scientific Manuscript database
Traditional isolation and quantification of inoculum density is difficult for most soilborne pathogens. Quantitative PCR methods have been developed to rapidly identify and quantify many of these pathogens using a single DNA extract from soil. Rainfed experiments operated continuously for up to 84 y...
ERIC Educational Resources Information Center
Dos Santos, Alves
2013-01-01
The study examined the relationship between recession and alumni contributions to institutions of higher education for operational expenses and capital expenditures that include property, buildings, and equipment. Identifying variables that may decrease alumni contributions is important because decreased state funding for higher education…
Assessment and Evaluation Methods for Access Services
ERIC Educational Resources Information Center
Long, Dallas
2014-01-01
This article serves as a primer for assessment and evaluation design by describing the range of methods commonly employed in library settings. Quantitative methods, such as counting and benchmarking measures, are useful for investigating the internal operations of an access services department in order to identify workflow inefficiencies or…
A Progression of Fraction Schemes Common to Chinese and U.S. Students
ERIC Educational Resources Information Center
Norton, Anderson; Wilkins, Jesse L. M.; Xu, Cong ze
2018-01-01
Through their work on the Fractions Project, Steffe and Olive (2010) identified a progression of fraction schemes that describes students' development toward more and more sophisticated ways of operating with fractions. Although several quantitative studies have affirmed this progression, the question has remained open as to whether it is specific…
Risk Assessment: Evidence Base
NASA Technical Reports Server (NTRS)
Johnson-Throop, Kathy A.
2007-01-01
Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. The human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.
Investigating the strategic antecedents of agility in humanitarian logistics.
L'Hermitte, Cécile; Brooks, Benjamin; Bowles, Marcus; Tatham, Peter H
2017-10-01
This study investigates the strategic antecedents of operational agility in humanitarian logistics. It began by identifying the particular actions to be taken at the strategic level of a humanitarian organisation to support field-level agility. Next, quantitative data (n=59) were collected on four strategic-level capabilities (being purposeful, action-focused, collaborative, and learning-oriented) and on operational agility (field responsiveness and flexibility). Using a quantitative analysis, the study tested the relationship between organisational capacity building and operational agility and found that the four strategic-level capabilities are fundamental building blocks of agility. Collectively they account for 52 per cent of the ability of humanitarian logisticians to deal with ongoing changes and disruptions in the field. This study emphasises the need for researchers and practitioners to embrace a broader perspective of agility in humanitarian logistics. In addition, it highlights the inherently strategic nature of agility, the development of which involves focusing simultaneously on multiple drivers. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.
McKinney, C.W.; Loftin, K.A.; Meyer, M.T.; Davis, J.G.; Pruden, A.
2010-01-01
Although livestock operations are known to harbor elevated levels of antibiotic resistant bacteria, few studies have examined the potential of livestock waste lagoons to reduce antibiotic resistance genes (ARGs). The purpose of this study was to determine the prevalence and examine the behavior of tetracycline [tet(O) and tet(W)] and sulfonamide [sul(I) and sul(II)] ARGs in a broad cross-section of livestock lagoons within the same semiarid western watershed. ARGs were monitored for one year in the water and the settled solids of eight lagoon systems by quantitative polymerase chain reaction. In addition, antibiotic residues and various bulk water quality constituents were analyzed. It was found that the lagoons of the chicken layer operation had the lowest concentrations of both tet and sul ARGs and low total antibiotic concentrations, whereas sul ARGs were highest in the swine lagoons, which generally corresponded to the highest total antibiotic concentrations. A marginal benefit of organic and small dairy operations also was observed compared to conventional and large dairies, respectively. In all lagoons, sul ARGs were observed to be generally more recalcitrant than tet ARGs. Also, positive correlations of various bulk water quality constituents were identified with tet ARGs but not sul ARGs. Significant positive correlations were identified between several metals and tet ARGs, but Pearson's correlation coefficients were mostly lower than those determined between antibiotic residues and ARGs. This study represents a quantitative characterization of ARGs in lagoons across a variety of livestock operations and provides insight into potential options for managing antibiotic resistance emanating from agricultural activities. © 2010 American Chemical Society.
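The qPCR monitoring described above ultimately reports ARG abundances as gene copies inferred from quantification cycles via a log-linear standard curve. As an illustration only (the study's standard-curve parameters are not given in the abstract; the slope, intercept, and 16S-normalization step below are hypothetical), the conversion can be sketched as:

```python
def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    """Convert a qPCR quantification cycle (Cq) to gene copies per reaction
    using a log-linear standard curve: Cq = slope * log10(copies) + intercept.
    The slope and intercept here are illustrative, not the study's values
    (a slope of -3.32 corresponds to 100% amplification efficiency)."""
    return 10 ** ((cq - intercept) / slope)

def normalized_abundance(arg_cq, ref_cq, **kw):
    """ARG copies normalized to a reference gene (e.g. 16S rRNA), a common
    way to compare resistance levels across lagoon samples."""
    return copies_from_cq(arg_cq, **kw) / copies_from_cq(ref_cq, **kw)
```

With these illustrative parameters, a Cq of 38.0 corresponds to a single copy, and each 3.32-cycle decrease corresponds to a tenfold increase in copies.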
Comfort and Accessibility Evaluation of Light Rail Vehicles
NASA Astrophysics Data System (ADS)
Hirasawa, Takayuki; Matsuoka, Shigeki; Suda, Yoshihiro
A quantitative evaluation method for the passenger rooms of light rail vehicles, from the viewpoint of comfort and accessibility, is proposed as the result of physical modeling of the in-vehicle behavior of passengers based on Gibson's ecological psychology approach. The model parameters are identified from experiments on real vehicles at the depot of the Kumamoto municipal transport authority and on the full-scale mockup at the University of Tokyo. The developed model enables quantitative evaluation of the effects of floor lowering achieved by abolishing internal steps at passenger doorways, and of door usage restriction scenarios, from the viewpoint of both passengers and operators in comparison to commuter railway vehicles.
Time-dynamics of the two-color emission from vertical-external-cavity surface-emitting lasers
NASA Astrophysics Data System (ADS)
Chernikov, A.; Wichmann, M.; Shakfa, M. K.; Scheller, M.; Moloney, J. V.; Koch, S. W.; Koch, M.
2012-01-01
The temporal stability of a two-color vertical-external-cavity surface-emitting laser is studied using single-shot streak-camera measurements. The collected data is evaluated via quantitative statistical analysis schemes. Dynamically stable and unstable regions for the two-color operation are identified and the dependence on the pump conditions is analyzed.
Jansen, Sanne M; de Bruin, Daniel M; van Berge Henegouwen, Mark I; Strackee, Simon D; Veelo, Denise P; van Leeuwen, Ton G; Gisbertz, Suzanne S
2017-01-01
Compromised perfusion as a result of surgical intervention causes a reduction of oxygen and nutrients in tissue and therefore decreased tissue vitality. Quantitative imaging of tissue perfusion during reconstructive surgery, therefore, may reduce the incidence of complications. Non-invasive optical techniques allow real-time tissue imaging, with high resolution and high contrast. The objectives of this study are, first, to assess the feasibility and accuracy of optical coherence tomography (OCT), sidestream darkfield microscopy (SDF), laser speckle contrast imaging (LSCI), and fluorescence imaging (FI) for quantitative perfusion imaging and, second, to identify criteria that enable risk prediction of necrosis during gastric tube and free flap reconstruction. This prospective, multicenter, observational in vivo pilot study will assess tissue perfusion using four optical technologies: OCT, SDF, LSCI, and FI in 40 patients: 20 patients who will undergo gastric tube reconstruction after esophagectomy and 20 patients who will undergo free flap surgery. Intra-operative images of gastric perfusion will be obtained directly after reconstruction at four perfusion areas. Feasibility of perfusion imaging will be analyzed per technique. Quantitative parameters directly related to perfusion will be scored per perfusion area, and differences between biologically good versus reduced perfusion will be tested statistically. Patient outcome will be correlated to images and perfusion parameters. Differences in perfusion parameters before and after a bolus of ephedrine will be tested for significance. This study will identify quantitative perfusion-related parameters for an objective assessment of tissue perfusion during surgery. This will likely allow early risk stratification of necrosis development, which will aid in achieving a reduction of complications in gastric tube reconstruction and free flap transplantation. Clinicaltrials.gov registration number NCT02902549.
Dutch Central Committee on Research Involving Human Subjects registration number NL52377.018.15.
Andrzejak, Ralph G.; Hauf, Martinus; Pollo, Claudio; Müller, Markus; Weisstanner, Christian; Wiest, Roland; Schindler, Kaspar
2015-01-01
Background: Epilepsy surgery is a potentially curative treatment option for pharmacoresistant patients. If non-invasive methods alone do not allow delineation of the epileptogenic brain areas, the surgical candidates undergo long-term monitoring with intracranial EEG. Visual EEG analysis is then used to identify the seizure onset zone for targeted resection as a standard procedure. Methods: Despite its great potential to assess the epileptogenicity of brain tissue, quantitative EEG analysis has not yet found its way into routine clinical practice. To demonstrate that quantitative EEG may yield clinically highly relevant information, we retrospectively investigated how post-operative seizure control is associated with four selected EEG measures evaluated in the resected brain tissue and the seizure onset zone. Importantly, the exact spatial location of the intracranial electrodes was determined by coregistration of pre-operative MRI and post-implantation CT, and coregistration with post-resection MRI was used to delineate the extent of tissue resection. Using data-driven thresholding, quantitative EEG results were separated into normally contributing and salient channels. Results: In patients with favorable post-surgical seizure control, a significantly larger fraction of salient channels in three of the four quantitative EEG measures was resected than in patients with unfavorable outcome in terms of seizure control (median over the whole peri-ictal recordings). The same statistics revealed no association with post-operative seizure control when EEG channels contributing to the seizure onset zone were studied. Conclusions: We conclude that quantitative EEG measures provide clinically relevant and objective markers of target tissue, which may be used to optimize epilepsy surgery.
The finding that differentiation between favorable and unfavorable outcome was better for the fraction of salient values in the resected brain tissue than in the seizure onset zone is consistent with growing evidence that spatially extended networks might be more relevant for seizure generation, evolution and termination than a single highly localized brain region (i.e. a “focus”) where seizures start. PMID:26513359
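The data-driven separation of channels into "normally contributing" and "salient", and the subsequent fraction-resected statistic, can be sketched as follows. The abstract does not specify the study's thresholding rule; a median-plus-IQR rule is used here purely as a plausible stand-in:

```python
import numpy as np

def salient_fraction(values, resected_mask, k=1.5):
    """Flag channels whose quantitative EEG measure exceeds
    median + k * IQR as 'salient' (one plausible data-driven threshold;
    the study's exact rule is not given in the abstract), then return
    the fraction of salient channels lying in the resected tissue."""
    v = np.asarray(values, float)
    q1, q3 = np.percentile(v, [25, 75])
    salient = v > np.median(v) + k * (q3 - q1)
    if not salient.any():
        return 0.0
    return float(np.mean(np.asarray(resected_mask)[salient]))
```

Under the study's finding, this fraction would tend to be higher in patients with favorable seizure outcomes than in those with unfavorable outcomes.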
de Gramatica, Martina; Massacci, Fabio; Shim, Woohyun; Turhan, Uğur; Williams, Julian
2017-02-01
We analyze the issue of agency costs in aviation security by combining results from a quantitative economic model with a qualitative study based on semi-structured interviews. Our model extends previous principal-agent models by combining the traditional fixed and varying monetary responses to physical and cognitive effort with nonmonetary welfare and potentially transferable value of employees' own human capital. To provide empirical evidence for the tradeoffs identified in the quantitative model, we have undertaken an extensive interview process with regulators, airport managers, security personnel, and those tasked with training security personnel from an airport operating in a relatively high-risk state, Turkey. Our results indicate that the effectiveness of additional training depends on the mix of "transferable skills" and "emotional" buy-in of the security agents. Principals need to identify on which side of a critical tipping point their agents are to ensure that additional training, with attached expectations of the burden of work, aligns the incentives of employees with the principals' own objectives. © 2016 Society for Risk Analysis.
Yetisen, Ali K; Butt, Haider; Volpatti, Lisa R; Pavlichenko, Ida; Humar, Matjaž; Kwok, Sheldon J J; Koo, Heebeom; Kim, Ki Su; Naydenova, Izabela; Khademhosseini, Ali; Hahn, Sei Kwang; Yun, Seok Hyun
2016-01-01
Analyte-sensitive hydrogels that incorporate optical structures have emerged as sensing platforms for point-of-care diagnostics. The optical properties of the hydrogel sensors can be rationally designed and fabricated through self-assembly, microfabrication or laser writing. The advantages of photonic hydrogel sensors over conventional assay formats include label-free, quantitative, reusable, and continuous measurement capability that can be integrated with equipment-free text or image display. This Review explains the operation principles of photonic hydrogel sensors, presents syntheses of stimuli-responsive polymers, and provides an overview of qualitative and quantitative readout technologies. Applications in clinical samples are discussed, and potential future directions are identified. Copyright © 2015 Elsevier Inc. All rights reserved.
Medical response planning for pandemic flu.
Harrison, Jeffrey P; Bukhari, Syed Zeeshan Haider; Harrison, Richard A
2010-01-01
This quantitative research study evaluates the health care infrastructure necessary to provide medical care in US hospitals during a flu pandemic. These hospitals are identified within the US health care system because they operate airborne infectious isolation rooms. Data were obtained from the 2006 American Hospital Association annual survey. This data file provides essential information on individual US hospitals and identifies the health care capabilities in US communities. Descriptive statistics were evaluated to examine hospitals with the appropriate infrastructure to treat a flu pandemic. In addition, geographic information system software was used to identify geographic areas where essential infrastructure is lacking. The study found 3,341 US hospitals operate airborne infectious isolation rooms, representing 69% of reporting hospitals. The results also indicate that those hospitals with airborne infectious isolation rooms are larger and are located in metropolitan areas. The study has managerial implications associated with local medical disaster response and policy implications on the allocation of disaster resources.
Structures, performance, benefit, cost study. [gas turbine engines
NASA Technical Reports Server (NTRS)
Feder, E.
1981-01-01
Aircraft engine structures were studied to identify the advanced structural technologies that would provide the most benefits to future aircraft operations. A series of studies identified engine systems with the greatest potential for improvements. Based on these studies, six advanced generic structural concepts were selected and conceptually designed. The benefits of each concept were quantitatively assessed in terms of thrust specific fuel consumption, weight, cost, maintenance cost, fuel burned and direct operating cost plus interest. The probability of success of each concept was also determined. The concepts were ranked and the three most promising were selected for further study which consisted of identifying and comprehensively outlining the advanced technologies required to develop these concepts for aircraft engine application. Analytic, fabrication, and test technology developments are required. The technology programs outlined emphasize the need to provide basic, fundamental understanding of technology to obtain the benefit goals.
NASA Technical Reports Server (NTRS)
Clement, W. F.; Allen, R. W.; Heffley, R. K.; Jewell, W. F.; Jex, H. R.; Mcruer, D. T.; Schulman, T. M.; Stapleford, R. L.
1980-01-01
The NASA Ames Research Center proposed a man-vehicle systems research facility to support flight simulation studies which are needed for identifying and correcting the sources of human error associated with current and future air carrier operations. The organization of the research facility is reviewed, and functional requirements and related priorities for the facility are recommended based on a review of potentially critical operational scenarios. Requirements are included for the experimenter's simulation control and data acquisition functions, as well as for the visual field, motion, sound, computation, crew station, and intercommunications subsystems. The related issues of functional fidelity and level of simulation are addressed, and specific criteria for quantitative assessment of various aspects of fidelity are offered. Recommendations for facility integration, checkout, and staffing are included.
Human systems immunology: hypothesis-based modeling and unbiased data-driven approaches.
Arazi, Arnon; Pendergraft, William F; Ribeiro, Ruy M; Perelson, Alan S; Hacohen, Nir
2013-10-31
Systems immunology is an emerging paradigm that aims at a more systematic and quantitative understanding of the immune system. Two major approaches have been utilized to date in this field: unbiased data-driven modeling to comprehensively identify molecular and cellular components of a system and their interactions; and hypothesis-based quantitative modeling to understand the operating principles of a system by extracting a minimal set of variables and rules underlying them. In this review, we describe applications of the two approaches to the study of viral infections and autoimmune diseases in humans, and discuss possible ways by which these two approaches can synergize when applied to human immunology. Copyright © 2012 Elsevier Ltd. All rights reserved.
Measurement Models for Reasoned Action Theory.
Hennessy, Michael; Bleakley, Amy; Fishbein, Martin
2012-03-01
Quantitative researchers distinguish between causal and effect indicators. What are the analytic problems when both types of measures are present in a quantitative reasoned action analysis? To answer this question, we use data from a longitudinal study to estimate the association between two constructs central to reasoned action theory: behavioral beliefs and attitudes toward the behavior. The belief items are causal indicators that define a latent variable index while the attitude items are effect indicators that reflect the operation of a latent variable scale. We identify the issues when effect and causal indicators are present in a single analysis and conclude that both types of indicators can be incorporated in the analysis of data based on the reasoned action approach.
Martínez-Llordella, Marc; Lozano, Juan José; Puig-Pey, Isabel; Orlando, Giuseppe; Tisone, Giuseppe; Lerut, Jan; Benítez, Carlos; Pons, Jose Antonio; Parrilla, Pascual; Ramírez, Pablo; Bruguera, Miquel; Rimola, Antoni; Sánchez-Fueyo, Alberto
2008-08-01
A fraction of liver transplant recipients are able to discontinue all immunosuppressive therapies without rejecting their grafts and are said to be operationally tolerant to the transplant. However, accurate identification of these recipients remains a challenge. To design a clinically applicable molecular test of operational tolerance in liver transplantation, we studied transcriptional patterns in the peripheral blood of 80 liver transplant recipients and 16 nontransplanted healthy individuals by employing oligonucleotide microarrays and quantitative real-time PCR. This resulted in the discovery and validation of several gene signatures comprising a modest number of genes capable of identifying tolerant and nontolerant recipients with high accuracy. Multiple peripheral blood lymphocyte subsets contributed to the tolerance-associated transcriptional patterns, although NK and gammadeltaTCR+ T cells exerted the predominant influence. These data suggest that transcriptional profiling of peripheral blood can be employed to identify liver transplant recipients who can discontinue immunosuppressive therapy and that innate immune cells are likely to play a major role in the maintenance of operational tolerance in liver transplantation.
Study of Gender Differences in Performance at the U.S. Naval Academy and U.S. Coast Guard Academy
2005-06-01
teacher preparation. By using both qualitative and quantitative methods for pre-service teachers, Kelly concludes that most teachers could not identify... [table residue: academic divisions — Engineering; Math/Science: Marine and Environmental Sciences, Math and Computer Science, Operations Research; Social Science: Government] ...(Tabachnik and Findell, 2001). Correlational research is often a good precursor to answering other questions by empirical methods. Correlations measure the...
Numerical study of cold filling and tube deformation in the molten salt receiver
NASA Astrophysics Data System (ADS)
Xu, Tingting; Zhang, Gongchen; Peniguel, Christophe; Liao, Zhirong; Li, Xin; Lu, Jiahui; Wang, Zhifeng
2017-06-01
Molten salt tube cold filling is one way to accelerate the startup of a molten salt Concentrated Solar Power (CSP) plant. This practical operation may induce salt solidification and large thermal stress due to the tube's large temperature difference. This paper presents a quantitative study, through simulation approaches, of cold filling and the thermal stress it induces. Physical mechanisms and safe working criteria are identified under certain conditions.
Fuel Price Effects on Readiness
2014-05-01
While fuel purchases represent only about 2.5 percent of the Department’s budget, there is large variation among the Services. In Fiscal Year (FY...Training pillar were used as the primary quantitative measures of readiness in this paper. Level of training can be measured using operating tempo (OPTEMPO...CA: Naval Post Graduate School, 1989), http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier =ADA219807; Lawrence Goldberg
A systematic review of women's satisfaction and regret following risk-reducing mastectomy.
Braude, Lucy; Kirsten, Laura; Gilchrist, Jemma; Juraskova, Ilona
2017-12-01
This systematic review of quantitative and qualitative studies describes patient satisfaction and regret associated with risk-reducing mastectomies (RRM), and the patient-reported factors associated with these, among women at high risk of developing breast cancer. Studies were identified using Medline, CINAHL, Embase and PsycInfo databases (1995-2016). Data were extracted and crosschecked for accuracy. Article quality was assessed using standardised criteria. Of the 1657 unique articles identified, 30 studies met the inclusion criteria (n=23 quantitative studies, n=3 qualitative studies, n=4 mixed-method studies). Studies included were cross-sectional (n=23) or retrospective (n=7). General satisfaction with RRM, decision satisfaction and aesthetic satisfaction were generally high, although some women expressed regret around their decision and dissatisfaction with their appearance. Factors associated with both patient satisfaction and regret included: post-operative complications, body image changes, psychological distress and perceived inadequacy of information. While satisfaction with RRM was generally high, some women had regrets and expressed dissatisfaction. Future research is needed to further explore RRM, and to investigate current satisfaction trends given the ongoing improvements to surgical and clinical practice. Offering pre-operative preparation, decisional support and continuous psychological input may help to facilitate satisfaction with this complex procedure. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Jen-Chieh; Zhou, Yufeng
2017-03-01
Extracorporeal shock wave lithotripsy (ESWL) has been used widely in the noninvasive treatment of kidney calculi. Fine fragments less than 2 mm in size can be discharged by urination, which determines the success of ESWL. Although ultrasonic and fluorescent imaging are used to localize the calculi, it is challenging to monitor the progress of stone comminution, especially at the late stage of ESWL when fragments spread out as a cloud. The lack of real-time, quantitative evaluation makes this procedure semi-blind, resulting in either under- or over-treatment after the maximum number of pulses permitted by the FDA. The time reversal operator (TRO) method has the ability to detect point-like scatterers, and the number of non-zero eigenvalues of the TRO is equal to the number of scatterers. In this study, the validity of the TRO method for identifying stones was illustrated with both numerical and experimental results for one to two stones of various sizes and locations. Furthermore, the parameters affecting the performance of the TRO method have also been investigated. Overall, the TRO method is effective in identifying the fragments in a stone cluster in real time. Further development of a detection system and evaluation of its performance both in vitro and in vivo during ESWL are necessary for application.
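The core of the TRO (DORT) idea — that the number of significant eigenvalues of the time reversal operator T = K^H K equals the number of well-resolved point scatterers — can be sketched with a synthetic inter-element response matrix. The array geometry, wavelength, and scatterer positions below are arbitrary illustration values, not those of the study:

```python
import numpy as np

def steering(positions, scatterer, k=2 * np.pi / 1.5e-3):
    """Free-space Green's-function response (phase and 1/d decay) from each
    array element to a point scatterer; k assumes a 1.5 mm wavelength."""
    d = np.linalg.norm(positions - scatterer, axis=1)
    return np.exp(1j * k * d) / d

def significant_eigenvalues(K, rel_tol=1e-3):
    """Eigen-decompose the time reversal operator T = K^H K and count
    eigenvalues above rel_tol times the largest one."""
    T = K.conj().T @ K
    w = np.linalg.eigvalsh(T)  # T is Hermitian positive semidefinite
    return int(np.sum(w > rel_tol * w.max()))

# Synthetic example: 32-element linear array, two well-separated scatterers.
elems = np.column_stack([np.linspace(-0.02, 0.02, 32), np.zeros(32)])
g1 = steering(elems, np.array([0.005, 0.06]))
g2 = steering(elems, np.array([-0.008, 0.05]))
K = np.outer(g1, g1) + np.outer(g2, g2)  # inter-element response matrix
```

Here `significant_eigenvalues(K)` recovers the scatterer count (two), while a single-scatterer matrix yields one, mirroring how the method distinguishes one fragment from several.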
Interdependency Assessment of Coupled Natural Gas and Power Systems in Energy Market
NASA Astrophysics Data System (ADS)
Yang, Hongzhao; Qiu, Jing; Zhang, Sanhua; Lai, Mingyong; Dong, Zhao Yang
2015-12-01
Owing to the technological development of natural gas exploration and the increasing penetration of gas-fired power generation, gas and power systems inevitably interact with each other from both physical and economic points of view. In order to effectively assess the two systems' interdependency, this paper proposes a systematic modeling framework and constructs simulation platforms for coupled gas and power systems in an energy market environment. By applying the proposed approach to the Australian national electricity market (NEM) and gas market, the impacts of six types of market and system factors are quantitatively analyzed, including power transmission limits, gas pipeline contingencies, gas pipeline flow constraints, carbon emission constraints, power load variations, and non-electric gas load variations. The important interdependencies and infrastructure weaknesses of the two systems are studied and identified. Our work provides a quantitative basis for grid operators and policy makers to support and guide operation and investment decisions for the electric power and natural gas industries.
Takamatsu, Daiko; Yoneyama, Akio; Asari, Yusuke; Hirano, Tatsumi
2018-02-07
A fundamental understanding of salt concentrations in lithium-ion battery electrolytes during battery operation is important for the optimal operation and design of lithium-ion batteries. However, there are few techniques that can quantitatively characterize salt concentration distributions in the electrolytes during battery operation. In this paper, we demonstrate that in operando X-ray phase imaging can quantitatively visualize the salt concentration distributions that arise in electrolytes during battery operation. From quantitative evaluation of the concentration distributions at steady state, we obtained the salt diffusivities in electrolytes with different initial salt concentrations. Because it imposes no restrictions on samples and offers high temporal and spatial resolution, X-ray phase imaging will be a versatile technique for evaluating the electrolytes, both aqueous and nonaqueous, of many electrochemical systems.
Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lepinski, James
2013-09-30
A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model.
Comprehensive, quantitative risk assessments were conducted on three (3) sites using the QFMEA model: (1) SACROC Northern Platform CO{sub 2}-EOR Site in the Permian Basin, Scurry County, TX, (2) Pump Canyon CO{sub 2}-ECBM Site in the San Juan Basin, San Juan County, NM, and (3) Farnsworth Unit CO{sub 2}-EOR Site in the Anadarko Basin, Ochiltree County, TX. The sites were sufficiently different from each other to test the robustness of the QFMEA model.
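The FMEA-style ranking described above multiplies probability, severity, and detectability scores into a risk priority number (RPN) and sorts failure modes by it. The failure-mode names and scores below are invented for illustration; the QFMEA model's actual scales and cost terms are richer:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    probability: int    # 1 (rare) .. 10 (frequent)
    severity: int       # 1 (minor) .. 10 (fatalities possible)
    detectability: int  # 1 (easily detected early) .. 10 (hard to detect)

    @property
    def rpn(self):
        """Classic FMEA risk priority number used to rank risks."""
        return self.probability * self.severity * self.detectability

# Hypothetical failure modes for a CO2 sequestration project.
modes = [
    FailureMode("Wellbore casing leak", 3, 8, 6),
    FailureMode("Caprock fracture", 2, 9, 7),
    FailureMode("Pipeline rupture", 4, 7, 3),
]
ranked = sorted(modes, key=lambda m: m.rpn, reverse=True)
```

Sorting by RPN answers the model's final question — which risks should be given highest priority for resolution.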
Research on Intelligent Interface in Double-front Work Machines
NASA Astrophysics Data System (ADS)
Kamezaki, Mitsuhiro; Iwata, Hiroyasu; Sugano, Shigeki
This paper proposes a work state identification method for construction machinery that is fully independent of work environmental conditions and operator skill levels. Advanced operated-work machines, which have been designed for complicated tasks, require intelligent systems that can provide the quantitative work analysis needed to determine effective work procedures and that can provide operational and cognitive support for operators. Construction work environments are extremely complicated, however, and this makes state identification, which is a key technology for an intelligent system, difficult. We therefore defined primitive static states (PSS) that are determined using on-off information for the lever inputs and the manipulator loads for each part of the grapple and front, and that are completely independent of the various environmental conditions and variations in operator skill level that can cause an incorrect work state identification. To confirm the usefulness of PSS, we performed experiments with a demolition task using our virtual reality simulator. We confirmed that PSS could robustly and accurately identify the work states and that untrained skills could be easily inferred from the results of PSS-based work analysis. We also confirmed in skill-training experiments that advice information based on PSS-based skill analysis greatly improved operators' work performance. We thus confirmed that PSS can adequately identify work states and are useful for work analysis and skill improvement.
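The PSS idea — mapping binary lever inputs plus per-part load signals to a small set of discrete states — can be sketched as a simple decision rule. The state names, signals, and threshold below are assumptions for illustration; the paper's full PSS definitions are richer:

```python
from enum import Enum, auto

class PSS(Enum):
    """Illustrative primitive static states (the paper defines a larger set)."""
    IDLE = auto()
    REACH = auto()
    GRASP = auto()
    LIFT = auto()

def identify_pss(lever_on, grapple_load, front_load, load_thresh=0.1):
    """Map on-off lever input and normalized per-part manipulator loads to
    a PSS. The threshold value is a hypothetical normalized load level."""
    grasping = grapple_load > load_thresh
    if not lever_on:
        return PSS.IDLE
    if grasping and front_load > load_thresh:
        return PSS.LIFT
    return PSS.GRASP if grasping else PSS.REACH
```

Because the rule uses only on-off inputs and load presence, it is insensitive to the absolute magnitudes that vary with environment and operator skill, which is the property the paper exploits.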
Lin, Yi Hsin; Chang, Yu Hern
2008-04-01
Aviation insurance premiums have become a heavy burden for the airline industry since September 11, 2001. Although the industry must constantly balance its operations between profitability and safety, the reality is that airlines are in the business of making money. Therefore, their ability to reduce cost and manage risk is a key factor for success. Unlike past research, which used subjective judgment methods, this study applied quantitative historical data (1999-2000) and gray relation analysis to identify the primary factors influencing ratemaking for aviation insurance premiums. An empirical study of six airlines in Taiwan was conducted to determine these factors and to analyze the management strategies used to deal with them. Results showed that the loss experience and performance of individual airlines were the key elements associated with aviation insurance premiums paid by each airline. By identifying and understanding the primary factors influencing ratemaking for aviation insurance, airlines will better understand their relative operational strengths and weaknesses, and further help top management identify areas for further improvement. Knowledge of these factors combined with effective risk management strategies, may result in lower premiums and operating costs for airline companies.
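Gray relational analysis, the method named above, ranks candidate factors by their gray relational grade against a reference series (here, the premium). A minimal sketch with the conventional distinguishing coefficient follows; the factor data are invented, not the Taiwanese airline data:

```python
import numpy as np

def grey_relational_grades(reference, factors, zeta=0.5):
    """Gray relational grade of each factor series against a reference
    series after min-max normalization. zeta = 0.5 is the conventional
    distinguishing coefficient. Higher grade = stronger association."""
    norm = lambda x: (np.asarray(x, float) - np.min(x)) / (np.max(x) - np.min(x))
    r = norm(reference)
    # Absolute deviations of each normalized factor from the reference.
    deltas = np.array([np.abs(r - norm(f)) for f in factors])
    dmin, dmax = deltas.min(), deltas.max()
    # Gray relational coefficients, averaged per factor into grades.
    coeffs = (dmin + zeta * dmax) / (deltas + zeta * dmax)
    return coeffs.mean(axis=1)
```

Factors are then sorted by grade to identify the primary influences on the premium, such as the loss experience and performance measures the study found dominant.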
Quantitative and multiplexed detection for blood typing based on quantum dot-magnetic bead assay.
Xu, Ting; Zhang, Qiang; Fan, Ya-Han; Li, Ru-Qing; Lu, Hua; Zhao, Shu-Ming; Jiang, Tian-Lun
2017-01-01
Accurate and reliable blood grouping is essential for safe blood transfusion. However, conventional methods are qualitative and use only single-antigen detection. We overcame these limitations by developing a simple, quantitative, and multiplexed detection method for blood grouping using quantum dots (QDs) and magnetic beads. In the QD fluorescence assay (QFA), blood group A and B antigens were quantified using QD labeling and magnetic beads, and the blood groups were identified according to the R value (calculated from the fluorescence intensity of the dual QD labeling) of the A and B antigens. The optimized performance of the QFA was established by blood typing 791 clinical samples. Quantitative and multiplexed detection of blood group antigens can be completed within 35 min with more than 10^5 red blood cells. When conditions are optimized, the assay performance is satisfactory for weak samples. The coefficients of variation between and within days were less than 10% and the reproducibility was good. The ABO blood groups of 791 clinical samples were identified by the QFA, and the accuracy obtained was 100% compared with the tube test. Receiver-operating characteristic curves revealed that the QFA has high sensitivity and specificity toward clinical samples, and the cutoff points of the R value of the A and B antigens were 1.483 and 1.576, respectively. In this study, we reported a novel quantitative and multiplexed method for the identification of ABO blood groups and presented an effective alternative for quantitative blood typing. This method can be used as an effective tool to improve blood typing and further guarantee clinical transfusion safety.
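Given the ROC-derived cutoffs reported above, ABO classification from the two antigen R values reduces to two threshold tests. The cutoffs below are the values stated in the abstract; the exact definition of R (a dual-QD fluorescence ratio) is paraphrased, so this is a sketch of the decision logic only:

```python
def abo_group(r_a, r_b, cut_a=1.483, cut_b=1.576):
    """Classify an ABO blood group from the R values of the A and B
    antigens, using the ROC cutoff points reported in the study.
    An antigen is taken as present when its R value exceeds its cutoff."""
    has_a, has_b = r_a > cut_a, r_b > cut_b
    if has_a and has_b:
        return "AB"
    if has_a:
        return "A"
    if has_b:
        return "B"
    return "O"
```

The quantitative readout thus yields a group call directly, without the subjective agglutination scoring of the tube test.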
NASA Technical Reports Server (NTRS)
Rey, P. A.; Gourinard, Y.; Cambou, F. (Principal Investigator); Guyader, J. C.; Gouaux, P.; Letoan, T.; Monchant, M.; Donville, B.; Loubet, D.
1973-01-01
The author has identified the following significant results. Significant results of the ARNICA program (February - December 1973) were: (1) The quantitative processing of ERTS-1 data was developed along two lines: the study of geological structures and lineaments of Spanish Catalonia, and the phytogeographical study of the forest region of the Landes of Gascony (France). In both cases it is shown that the ERTS-1 imagery can be used in establishing zonings of equal quantitative interpretation value. (2) In keeping with the operational transfer program proposed in previous reports between exploration of the imagery and charting of the object, a precise data processing method was developed, concerning more particularly the selection of digital equidensity samples, computer display, and rigorous referencing.
Contextual Advantage for State Discrimination
NASA Astrophysics Data System (ADS)
Schmid, David; Spekkens, Robert W.
2018-02-01
Finding quantitative aspects of quantum phenomena which cannot be explained by any classical model has foundational importance for understanding the boundary between classical and quantum theory. It also has practical significance for identifying information processing tasks for which those phenomena provide a quantum advantage. Using the framework of generalized noncontextuality as our notion of classicality, we find one such nonclassical feature within the phenomenology of quantum minimum-error state discrimination. Namely, we identify quantitative limits on the success probability for minimum-error state discrimination in any experiment described by a noncontextual ontological model. These constraints constitute noncontextuality inequalities that are violated by quantum theory, and this violation implies a quantum advantage for state discrimination relative to noncontextual models. Furthermore, our noncontextuality inequalities are robust to noise and are operationally formulated, so that any experimental violation of the inequalities is a witness of contextuality, independently of the validity of quantum theory. Along the way, we introduce new methods for analyzing noncontextuality scenarios and demonstrate a tight connection between our minimum-error state discrimination scenario and a Bell scenario.
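For context, the quantum benchmark against which such noncontextual bounds are compared is the Helstrom bound, a standard result not restated in the abstract above: for two pure states prepared with equal prior probabilities, the maximum success probability of minimum-error discrimination is

```latex
p_{\mathrm{succ}} = \frac{1}{2}\left(1 + \sqrt{1 - |\langle \psi_0 | \psi_1 \rangle|^2}\right).
```

A noncontextuality inequality of the kind described is violated when the observed discrimination success probability exceeds the corresponding noncontextual limit while remaining at or below this quantum value.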
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data
Kümmel, Anne; Panke, Sven; Heinemann, Matthias
2006-01-01
As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example it identifies reactions that are subject to active allosteric or genetic regulation as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
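The second-law feasibility check at the core of NET analysis can be sketched with the standard relation ΔrG = ΔrG°′ + RT·ln Q; a reaction carrying forward flux must have ΔrG < 0. The numbers below are illustrative, not data from the paper.

```python
# Minimal sketch of a NET-style thermodynamic feasibility check, assuming
# concentrations in mol/L and standard transformed Gibbs energies in kJ/mol.
import math

R = 8.314e-3  # gas constant, kJ/(mol*K)
T = 298.15    # temperature, K

def reaction_gibbs(dg0_prime: float, substrates: list, products: list) -> float:
    """Delta_r G = Delta_r G°' + RT * ln(prod[P] / prod[S])."""
    q = 1.0
    for c in products:
        q *= c
    for c in substrates:
        q /= c
    return dg0_prime + R * T * math.log(q)

# Illustrative: a reaction with Delta_r G°' = +5 kJ/mol can still run forward
# if product concentrations are kept low enough by downstream consumption.
dg = reaction_gibbs(5.0, substrates=[1e-3], products=[1e-6])
print(dg < 0)  # thermodynamically feasible in the forward direction
```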
Interdependence of PRECIS Role Operators: A Quantitative Analysis of Their Associations.
ERIC Educational Resources Information Center
Mahapatra, Manoranjan; Biswas, Subal Chandra
1986-01-01
Analyzes associations among different role operators quantitatively by taking input strings from 200 abstracts, each related to subject fields of taxation, genetic psychology, and Shakespearean drama, and subjecting them to the Chi-square test. Significant associations by other differencing operators and connectives are discussed. A schema of role…
Voronovskaja's theorem revisited
NASA Astrophysics Data System (ADS)
Tachev, Gancho T.
2008-07-01
We present a new quantitative variant of Voronovskaja's theorem for the Bernstein operator. This estimate improves the recent quantitative versions of Voronovskaja's theorem for certain Bernstein-type operators obtained by H. Gonska, P. Pitul and I. Rasa in 2006.
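For reference, the classical (non-quantitative) Voronovskaja theorem for the Bernstein operators B_n states that, for f twice differentiable at x ∈ [0, 1],

```latex
\lim_{n \to \infty} n\left[B_n(f;x) - f(x)\right] = \frac{x(1-x)}{2}\, f''(x).
```

Quantitative variants of the kind discussed above bound the rate of this convergence, typically in terms of moduli of continuity of f″.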
Antonelli, Giorgia; Padoan, Andrea; Aita, Ada; Sciacovelli, Laura; Plebani, Mario
2017-08-28
Background The International Standard ISO 15189 is recognized as a valuable guide in ensuring high-quality clinical laboratory services and promoting the harmonization of accreditation programmes in laboratory medicine. Examination procedures must be verified in order to guarantee that their performance characteristics are congruent with the intended scope of the test. The aim of the present study was to propose a practice model for implementing procedures employed for the verification of validated examination procedures already used for at least 2 years in our laboratory, in agreement with the ISO 15189 requirement in Section 5.5.1.2. Methods In order to identify the operative procedure to be used, approved documents were identified, together with the definition of the performance characteristics to be evaluated for the different methods; the examination procedures used in the laboratory were analyzed and checked against the performance specifications reported by manufacturers. Operative flow charts were then identified to compare the laboratory performance characteristics with those declared by manufacturers. Results The choice of performance characteristics for verification was based on approved documents used as guidance and on the specific purpose of the tests undertaken, considering: imprecision and trueness for quantitative methods; diagnostic accuracy for qualitative methods; and imprecision together with diagnostic accuracy for semi-quantitative methods. Conclusions The described approach, which balances technological possibilities, risks and costs while assuring the fundamental component of result accuracy, appears promising as an easily applicable and flexible procedure helping laboratories to comply with the ISO 15189 requirements.
Analyses of Mobilization Manpower Supply and Demand.
1982-03-01
AD-A130 148. Analyses of Mobilization Manpower Supply and Demand. Administrative Sciences Corp, Springfield, VA, March 1982. ASCR134...79-C-0527 for use in identifying and quantifying issues in the CPAM process, and to employ the model for selected quantitative and qualitative analyses...(nurses and corpsmen) to operate on a Commander FX Microcomputer, to be used by the Bureau of Medicine and Surgery to develop inputs for Navy-wide
An academic medical center's response to widespread computer failure.
Genes, Nicholas; Chary, Michael; Chason, Kevin W
2013-01-01
As hospitals incorporate information technology (IT), their operations become increasingly vulnerable to technological breakdowns and attacks. Proper emergency management and business continuity planning require an approach to identify, mitigate, and work through IT downtime. Hospitals can prepare for these disasters by reviewing case studies. This case study details the disruption of computer operations at Mount Sinai Medical Center (MSMC), an urban academic teaching hospital. The events, and MSMC's response, are narrated and the impact on hospital operations is analyzed. MSMC's disaster management strategy prevented computer failure from compromising patient care, although walkouts and time-to-disposition in the emergency department (ED) notably increased. This incident highlights the importance of disaster preparedness and mitigation. It also demonstrates the value of using operational data to evaluate hospital responses to disasters. Quantifying normal hospital functions, just as with a patient's vital signs, may help quantitatively evaluate and improve disaster management and business continuity planning.
Historical data and analysis for the first five years of KSC STS payload processing
NASA Technical Reports Server (NTRS)
Ragusa, J. M.
1986-01-01
General and specific quantitative and qualitative results were identified from a study of actual operational experience while processing 186 science, applications, and commercial payloads for the first 5 years of Space Transportation System (STS) operations at the National Aeronautics and Space Administration's (NASA) John F. Kennedy Space Center (KSC). All non-Department of Defense payloads from STS-2 through STS-33 were part of the study. Historical data and cumulative program experiences from key personnel were used extensively. Emphasis was placed on various program planning and events that affected KSC processing, payload experiences and improvements, payload hardware condition after arrival, services to customers, and the impact of STS operations and delays. From these initial considerations, operational drivers were identified, data for selected processing parameters collected and analyzed, processing criteria and options determined, and STS payload results and conclusions reached. The study showed a significant reduction in time and effort needed by STS customers and KSC to process a wide variety of payload configurations. Also of significance is the fact that even the simplest payloads required more processing resources than were initially assumed. The success to date of payload integration, testing, and mission operations, however, indicates the soundness of the approach taken and the methods used.
Zhe, Gao; Ying-Chun, Wang; Yan-Xu, Chang
2016-01-01
Using a high-performance liquid chromatography method coupled with diode array detection and electrospray ionization tandem mass spectrometry (HPLC-DAD-MSn), qualitative and quantitative analyses of the flavonoids of the stems, leaves, fruits and seeds, and of the anthocyanidins of the fresh fruits, of Nitraria tangutorum were performed. A total of 14 flavonoid components were identified from the seeds of N. tangutorum, including three quercetin derivatives, three kaempferol derivatives, and eight isorhamnetin derivatives. A total of 12, 10, and 7 flavonoid components were identified from the leaves, stems, and fruits of N. tangutorum, respectively; all were also present in the seeds. The total content of flavonoids was highest in the leaves, up to 42.43 mg/g dry weight. A total of 12 anthocyanidin components, belonging to five anthocyanidins, were identified from the fresh fruits of N. tangutorum. The total content of anthocyanidins in fresh fruits was up to 45.83 mg/100 g fresh weight, of which the acylated anthocyanidins accounted for 65.7%. The HPLC-DAD-MSn method can be operated easily, rapidly, and accurately, and is feasible for the qualitative and quantitative analysis of flavone glycosides in N. tangutorum.
Towards Measurement of Confidence in Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh J.; Habli, Ibrahim
2011-01-01
Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurements is systematically collected, quantitative arguments provide far greater benefits than qualitative arguments in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BN). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BN are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations.
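The probabilistic update at a single BN node can be sketched with Bayes' rule: evidence that is much likelier when a GSN goal holds raises confidence in that goal. The numbers and the evidence type are illustrative assumptions, not from the paper.

```python
# Minimal single-node sketch of BN-style confidence updating for a safety goal.
# All probabilities are illustrative, not values from the cited safety case.

def posterior(prior: float, p_e_given_g: float, p_e_given_not_g: float) -> float:
    """P(goal | evidence) via Bayes' rule, for binary goal and evidence nodes."""
    num = p_e_given_g * prior
    den = num + p_e_given_not_g * (1.0 - prior)
    return num / den

# Evidence (e.g. a passed verification activity) observed with likelihood 0.9
# under the goal and 0.1 otherwise raises confidence from 0.7 to about 0.95.
print(round(posterior(0.7, 0.9, 0.1), 3))
```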
Segmentation And Quantification Of Black Holes In Multiple Sclerosis
Datta, Sushmita; Sajja, Balasrinivasa Rao; He, Renjie; Wolinsky, Jerry S.; Gupta, Rakesh K.; Narayana, Ponnada A.
2006-01-01
A technique that involves minimal operator intervention was developed and implemented for identification and quantification of black holes on T1-weighted magnetic resonance images (T1 images) in multiple sclerosis (MS). Black holes were segmented on T1 images based on grayscale morphological operations. False classification of black holes was minimized by masking the segmented images with images obtained from the orthogonalization of T2-weighted and T1 images. Enhancing lesion voxels on postcontrast images were automatically identified and eliminated from being included in the black hole volume. Fuzzy connectivity was used for the delineation of black holes. The performance of this algorithm was quantitatively evaluated on 14 MS patients. PMID:16126416
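The grayscale morphology step can be illustrated with a toy opening (erosion followed by dilation), which suppresses structures smaller than the structuring element. This pure-Python 3×3 sketch is illustrative only; the paper's pipeline is far more involved.

```python
# Toy grayscale morphological opening: a min filter (erosion) followed by a
# max filter (dilation) over a 3x3 neighborhood, with clipped borders.

def filter3x3(img, op):
    """Apply op (min or max) over each pixel's 3x3 neighborhood."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = op(vals)
    return out

def opening(img):
    return filter3x3(filter3x3(img, min), max)

# A single bright pixel (noise) is removed by the opening.
img = [[0] * 5 for _ in range(5)]
img[2][2] = 9
print(opening(img)[2][2])  # 0
```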
NASA Astrophysics Data System (ADS)
Amarasinghe, Pradeep; Liu, An; Egodawatta, Prasanna; Barnes, Paul; McGree, James; Goonetilleke, Ashantha
2016-09-01
A water supply system can be impacted by rainfall reduction due to climate change, thereby reducing its supply potential. This highlights the need to understand the system's resilience, which refers to the ability to maintain service under various pressures (or disruptions). Currently, the concept of resilience has not yet been widely applied in managing water supply systems. This paper proposed three technical resilience indicators to assess the resilience of a water supply system. A case study analysis was undertaken of the Water Grid system of Queensland State, Australia, to showcase how the proposed indicators can be applied to assess resilience. The research outcomes confirmed that the use of resilience indicators is capable of identifying critical conditions in relation to the water supply system operation, such as the maximum allowable rainfall reduction for the system to maintain its operation without failure. Additionally, resilience indicators also provided useful insight regarding the sensitivity of the water supply system to a changing rainfall pattern in the context of climate change, which represents the system's stability when experiencing pressure. The study outcomes will help in the quantitative assessment of resilience and provide improved guidance to system operators to enhance the efficiency and reliability of a water supply system.
TECHNOLOGICAL INNOVATION IN NEUROSURGERY: A QUANTITATIVE STUDY
Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar
2015-01-01
Object Technological innovation within healthcare may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technologically intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical technique. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation respectively. Methods A patent database was searched between 1960 and 2010 using the search terms “neurosurgeon” OR “neurosurgical” OR “neurosurgery”. The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top performing technology cluster was then selected as an exemplar for more detailed analysis of individual patents. Results In all, 11,672 patents and 208,203 publications relating to neurosurgery were identified. The top performing technology clusters over the 50 years were: image guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes and endoscopes. Image guidance and neuromodulation devices demonstrated a highly correlated rapid rise in patents and publications, suggesting they are areas of technology expansion. In-depth analysis of neuromodulation patents revealed that the majority of high performing patents were related to Deep Brain Stimulation (DBS). Conclusions Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery. PMID:25699414
Stool-based biomarkers of interstitial cystitis/bladder pain syndrome.
Braundmeier-Fleming, A; Russell, Nathan T; Yang, Wenbin; Nas, Megan Y; Yaggie, Ryan E; Berry, Matthew; Bachrach, Laurie; Flury, Sarah C; Marko, Darlene S; Bushell, Colleen B; Welge, Michael E; White, Bryan A; Schaeffer, Anthony J; Klumpp, David J
2016-05-18
Interstitial cystitis/bladder pain syndrome (IC) is associated with significant morbidity, yet underlying mechanisms and diagnostic biomarkers remain unknown. Pelvic organs exhibit neural crosstalk by convergence of visceral sensory pathways, and rodent studies demonstrate distinct bacterial pain phenotypes, suggesting that the microbiome modulates pelvic pain in IC. Stool samples were obtained from female IC patients and healthy controls, and symptom severity was determined by questionnaire. Operational taxonomic units (OTUs) were identified by16S rDNA sequence analysis. Machine learning by Extended Random Forest (ERF) identified OTUs associated with symptom scores. Quantitative PCR of stool DNA with species-specific primer pairs demonstrated significantly reduced levels of E. sinensis, C. aerofaciens, F. prausnitzii, O. splanchnicus, and L. longoviformis in microbiota of IC patients. These species, deficient in IC pelvic pain (DIPP), were further evaluated by Receiver-operator characteristic (ROC) analyses, and DIPP species emerged as potential IC biomarkers. Stool metabolomic studies identified glyceraldehyde as significantly elevated in IC. Metabolomic pathway analysis identified lipid pathways, consistent with predicted metagenome functionality. Together, these findings suggest that DIPP species and metabolites may serve as candidates for novel IC biomarkers in stool. Functional changes in the IC microbiome may also serve as therapeutic targets for treating chronic pelvic pain.
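The species-specific qPCR comparison can be sketched with the standard 2^(−ΔΔCt) relative-quantification formula (the Livak method). The Ct values below are illustrative, not data from the study.

```python
# Sketch of relative abundance of a target species in cases vs. controls using
# the 2^(-delta-delta-Ct) method. All Ct values here are made-up examples.

def fold_change(ct_target_case, ct_ref_case, ct_target_ctrl, ct_ref_ctrl):
    """Relative level of a target taxon in cases vs. controls, normalized to a reference."""
    ddct = (ct_target_case - ct_ref_case) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)

# A taxon whose species-specific amplicon crosses threshold 2 cycles later in
# IC samples (same reference) is ~4-fold depleted.
print(fold_change(26.0, 20.0, 24.0, 20.0))  # 0.25
```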
Descriptive and Experimental Analyses of Potential Precursors to Problem Behavior
Borrero, Carrie S.W; Borrero, John C
2008-01-01
We conducted descriptive observations of severe problem behavior for 2 individuals with autism to identify precursors to problem behavior. Several comparative probability analyses were conducted in addition to lag-sequential analyses using the descriptive data. Results of the descriptive analyses showed that the probability of the potential precursor was greater given problem behavior compared to the unconditional probability of the potential precursor. Results of the lag-sequential analyses showed a marked increase in the probability of a potential precursor in the 1-s intervals immediately preceding an instance of problem behavior, and that the probability of problem behavior was highest in the 1-s intervals immediately following an instance of the precursor. We then conducted separate functional analyses of problem behavior and the precursor to identify respective operant functions. Results of the functional analyses showed that both problem behavior and the precursor served the same operant functions. These results replicate prior experimental analyses on the relation between problem behavior and precursors and extend prior research by illustrating a quantitative method to identify precursors to more severe problem behavior. PMID:18468281
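The comparative-probability computation can be sketched as follows: the probability of the precursor in the 1-s interval immediately preceding problem behavior is compared with its unconditional probability. The event streams are assumed to be 1-s binary records, an illustrative data format rather than the study's actual coding scheme.

```python
# Sketch of a lag-sequential comparison: P(precursor at t-1 | problem at t)
# vs. the unconditional P(precursor). Streams are illustrative binary records.

def lag_probability(precursor, problem):
    """Return (conditional, unconditional) probability of the precursor."""
    hits = trials = 0
    for t in range(1, len(problem)):
        if problem[t]:
            trials += 1
            hits += precursor[t - 1]
    p_cond = hits / trials if trials else 0.0
    p_uncond = sum(precursor) / len(precursor)
    return p_cond, p_uncond

precursor = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
problem   = [0, 1, 0, 0, 1, 0, 0, 0, 1, 0]
print(lag_probability(precursor, problem))  # (1.0, 0.3)
```

A conditional probability well above the unconditional one, as here, is the signature of a candidate precursor.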
Pi, Shan; Cao, Rong; Qiang, Jin Wei; Guo, Yan Hui
2018-01-01
Background Diffusion-weighted imaging (DWI) and quantitative apparent diffusion coefficient (ADC) values are widely used in the differential diagnosis of ovarian tumors. Purpose To assess the diagnostic performance of quantitative ADC values in ovarian tumors. Material and Methods PubMed, Embase, the Cochrane Library, and local databases were searched for studies assessing ovarian tumors using quantitative ADC values. We quantitatively analyzed the diagnostic performances for two clinical problems: benign vs. malignant tumors and borderline vs. malignant tumors. We evaluated diagnostic performances by the pooled sensitivity and specificity values and by summary receiver operating characteristic (SROC) curves. Subgroup analyses were used to analyze study heterogeneity. Results From the 742 studies identified in the search results, 16 studies met our inclusion criteria. A total of ten studies evaluated malignant vs. benign ovarian tumors and six studies assessed malignant vs. borderline ovarian tumors. Regarding the diagnostic accuracy of quantitative ADC values for distinguishing between malignant and benign ovarian tumors, the pooled sensitivity and specificity values were 0.91 and 0.91, respectively. The area under the SROC curve (AUC) was 0.96. For differentiating borderline from malignant tumors, the pooled sensitivity and specificity values were 0.89 and 0.79, and the AUC was 0.91. The methodological quality of the included studies was moderate. Conclusion Quantitative ADC values could serve as useful preoperative markers for predicting the nature of ovarian tumors. Nevertheless, prospective trials focused on standardized imaging parameters are needed to evaluate the clinical value of quantitative ADC values in ovarian tumors.
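The per-study computation behind the pooled estimates can be sketched as sensitivity and specificity at an ADC cutoff, with low ADC predicting malignancy. The values below are illustrative, not data from the included studies.

```python
# Sketch of sensitivity/specificity for an ADC cutoff (predict malignant when
# ADC < cutoff). Labels: 1 = malignant, 0 = benign. Illustrative numbers only.

def sens_spec(adc_values, labels, cutoff):
    tp = sum(1 for a, y in zip(adc_values, labels) if y == 1 and a < cutoff)
    fn = sum(1 for a, y in zip(adc_values, labels) if y == 1 and a >= cutoff)
    tn = sum(1 for a, y in zip(adc_values, labels) if y == 0 and a >= cutoff)
    fp = sum(1 for a, y in zip(adc_values, labels) if y == 0 and a < cutoff)
    return tp / (tp + fn), tn / (tn + fp)

adc = [0.8, 0.9, 1.0, 1.5, 1.7, 2.0]  # in units of 1e-3 mm^2/s
lab = [1,   1,   1,   0,   0,   0]
print(sens_spec(adc, lab, cutoff=1.2))  # (1.0, 1.0)
```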
2012-01-01
of target audiences in conflictive areas. Survey research can provide quantitative baselines and trend analyses of key attitudes held by the target...qualitative research, rather than quantitative... The PRT [provincial reconstruction team] conducted missions almost daily during the time frame it was...opposed to national-level polls, which might not be representative of target audiences in conflictive areas). Survey research can provide quantitative
NASA Astrophysics Data System (ADS)
Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng
2017-05-01
As an important part of the national energy supply system, natural gas transmission pipelines can cause serious environmental pollution and loss of life and property in the event of an accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage for reducing the number of gas pipeline operation accidents. Because third party damage accidents are characterized by diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Firstly, the risk sources of third party damage are identified; then the weight of each factor is determined via an improved AHP; finally, the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
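The AHP weighting step can be sketched as deriving factor weights from a pairwise comparison matrix via power iteration, together with the consistency index CI = (λ_max − n)/(n − 1). The 3×3 matrix below is a textbook-style example, not data from the paper.

```python
# Sketch of AHP factor weighting: principal eigenvector of a positive
# reciprocal pairwise-comparison matrix, plus a consistency check.

def ahp_weights(m, iters=100):
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # lambda_max estimate and consistency index CI = (lambda_max - n) / (n - 1)
    lam = sum(sum(m[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return w, ci

m = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w, ci = ahp_weights(m)
print([round(x, 3) for x in w], round(ci, 3))  # dominant factor gets the largest weight
```

A CI well below the usual 0.1 acceptance threshold (after dividing by the random index) indicates the judgments are consistent enough to use.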
Role of man in flight experiment payloads, phase 1. [Spacelab mission planning
NASA Technical Reports Server (NTRS)
Malone, T. B.; Kirkpatrick, M.
1974-01-01
The identification of required data for studies of Spacelab experiment functional allocation, the development of an approach to collecting these data from the payload community, and the specification of analytical methods necessary to quantitatively determine the role of man in specific Spacelab experiments are presented. A generalized Spacelab experiment operation sequence was developed, and the parameters necessary to describe each single function in the sequence were identified. A set of functional descriptor worksheets was also drawn up. The methodological approach to defining the role of man was defined as a series of trade studies using a digital simulation technique. The tradeoff variables identified include scientific crew size, skill mix, and location. An existing digital simulation program suitable for the required analyses was identified and obtained.
Potential protein biomarkers for burning mouth syndrome discovered by quantitative proteomics.
Ji, Eoon Hye; Diep, Cynthia; Liu, Tong; Li, Hong; Merrill, Robert; Messadi, Diana; Hu, Shen
2017-01-01
Burning mouth syndrome (BMS) is a chronic pain disorder characterized by severe burning sensation in normal looking oral mucosa. Diagnosis of BMS remains a challenge to oral healthcare professionals because the method for definite diagnosis is still uncertain. In this study, a quantitative saliva proteomic analysis was performed in order to identify target proteins in BMS patients' saliva that may be used as biomarkers for simple, non-invasive detection of the disease. By using isobaric tags for relative and absolute quantitation labeling and liquid chromatography-tandem mass spectrometry to quantify 1130 saliva proteins between BMS patients and healthy control subjects, we found that 50 proteins were significantly changed in the BMS patients when compared to the healthy control subjects (p ≤ 0.05; 39 up-regulated and 11 down-regulated). Four candidates, alpha-enolase, interleukin-18 (IL-18), kallikrein-13 (KLK13), and cathepsin G, were selected for further validation. Based on enzyme-linked immunosorbent assay measurements, three potential biomarkers, alpha-enolase, IL-18, and KLK13, were successfully validated. The fold changes for alpha-enolase, IL-18, and KLK13 were determined as 3.6, 2.9, and 2.2 (burning mouth syndrome vs. control), and corresponding receiver operating characteristic values were determined as 0.78, 0.83, and 0.68, respectively. Our findings indicate that testing of the identified protein biomarkers in saliva might be a valuable clinical tool for BMS detection. Further validation studies of the identified biomarkers or additional candidate biomarkers are needed to achieve a multi-marker prediction model for improved detection of BMS with high sensitivity and specificity.
Valdés, Pablo A.; Kim, Anthony; Brantsch, Marco; Niu, Carolyn; Moses, Ziev B.; Tosteson, Tor D.; Wilson, Brian C.; Paulsen, Keith D.; Roberts, David W.; Harris, Brent T.
2011-01-01
Extent of resection is a major goal and prognostic factor in the treatment of gliomas. In this study we evaluate whether quantitative ex vivo tissue measurements of δ-aminolevulinic acid–induced protoporphyrin IX (PpIX) identify regions of increasing malignancy in low- and high-grade gliomas beyond the capabilities of current fluorescence imaging in patients undergoing fluorescence-guided resection (FGR). Surgical specimens were collected from 133 biopsies in 23 patients and processed for ex vivo neuropathological analysis: PpIX fluorimetry to measure PpIX concentrations (CPpIX) and Ki-67 immunohistochemistry to assess tissue proliferation. Samples displaying visible levels of fluorescence showed significantly higher levels of CPpIX and tissue proliferation. CPpIX was strongly correlated with histopathological score (nonparametric) and tissue proliferation (parametric), such that increasing levels of CPpIX were identified with regions of increasing malignancy. Furthermore, a large percentage of tumor-positive biopsy sites (∼40%) that were not visibly fluorescent under the operating microscope had levels of CPpIX greater than 0.1 µg/mL, which indicates that significant PpIX accumulation exists below the detection threshold of current fluorescence imaging. Although PpIX fluorescence is recognized as a visual biomarker for neurosurgical resection guidance, these data show that it is quantitatively related at the microscopic level to increasing malignancy in both low- and high-grade gliomas. This work suggests a need for improved PpIX fluorescence detection technologies to achieve better sensitivity and quantification of PpIX in tissue during surgery. PMID:21798847
Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.
Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis
2015-01-01
Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method based on the concepts of task and accident mechanisms for an initial risk assessment by taking into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. By using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified by using those variables included in the European Statistics of Accidents at Work methodology. As main results, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low and the only accident mechanisms with medium risk are accidents when lifting or pushing with physical stress on the musculoskeletal system in tasks involving carrying, and impacts against objects after slipping or stumbling for tasks involving movements. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended.
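The semi-quantitative estimate described above can be sketched as combining a likelihood level, derived from each accident mechanism's share of reported accidents within a task, with an ordinal severity level. The category boundaries and numbers below are illustrative assumptions, not the paper's actual scheme.

```python
# Sketch of a semi-quantitative risk estimate for one (task, accident mechanism)
# pair: likelihood from relative accident frequency, severity on an ordinal
# 1-3 scale. Thresholds and example counts are illustrative only.

def risk_level(count: int, total: int, severity: int) -> str:
    """Map frequency share and severity (1-3) to a low/medium/high risk level."""
    share = count / total
    likelihood = 3 if share > 0.10 else (2 if share > 0.01 else 1)
    score = likelihood * severity
    return "high" if score >= 6 else ("medium" if score >= 3 else "low")

# e.g. physical-stress accidents while lifting: fairly frequent, mostly minor
print(risk_level(count=800, total=11190, severity=2))  # medium
```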
Assessment of and standardization for quantitative nondestructive test
NASA Technical Reports Server (NTRS)
Neuschaefer, R. W.; Beal, J. B.
1972-01-01
Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. This assessment will help determine what useful structural quantitative and qualitative data may be provided, from raw materials to vehicle refurbishment. The assessment considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and presented along with a description of those structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations are provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end item structure, and refurbishment operations.
NASA Technical Reports Server (NTRS)
Mungas, Greg S.; Beegle, Luther W.; Boynton, John E.; Lee, Pascal; Shidemantle, Ritch; Fisher, Ted
2004-01-01
The Camera, Hand Lens, and Microscope Probe (CHAMP) will allow examination of martian surface features and materials (terrain, rocks, soils, samples) on spatial scales ranging from kilometers to micrometers, thus enabling both microscopy and context imaging with high operational flexibility. CHAMP is designed to allow the detailed and quantitative investigation of a wide range of geologic features and processes on Mars, leading to a better quantitative understanding of the evolution of the martian surface environment through time. In particular, CHAMP will provide key data that will help understand the local region explored by the Mars Science Laboratory (MSL) as a potential habitat for life. CHAMP will also support other anticipated MSL investigations, in particular by helping identify and select the highest priority targets for sample collection and analysis by the MSL's analytical suite.
Quantitative Decision Tools and Management Development Programs.
ERIC Educational Resources Information Center
Byars, Lloyd L.; Nunn, Geoffrey E.
This article outlined the current status of quantitative methods and operations research (OR), sketched the strengths of training efforts, isolated weaknesses, and formulated workable criteria for evaluating the success of operations research training programs. A survey of 105 companies revealed that PERT, inventory control theory and linear…
Miao, Jing-Kun; Chen, Qi-Xiong; Bao, Li-Ming; Huang, Yi; Zhang, Juan; Wan, Ke-Xing; Yi, Jing; Wang, Shi-Yi; Zou, Lin; Li, Ting-Yu
2013-09-23
Conventional screening tests for G6PD deficiency use a low cutoff value of 2.10 U/g Hb, which may not be adequate for detecting females with heterozygous deficiency. The aim of the present study was to determine an appropriate cutoff value with increased sensitivity for identifying G6PD-deficient heterozygous females. G6PD activity was analyzed in 51,747 neonates using a semi-quantitative fluorescent spot test. Neonates with suspected G6PD deficiency were further analyzed using a quantitative enzymatic assay and screened for common G6PD mutations. Cutoff values of G6PD activity were estimated using the receiver operating characteristic curve. Our results demonstrated that using 2.10 U/g Hb as a cutoff, the sensitivity of the assay for detecting female neonates with heterozygous G6PD deficiency was 83.3%, compared with 97.6% using 2.55 U/g Hb as a cutoff. The higher cutoff identified 21% (8/38) of the female neonates with partial G6PD deficiency who were not detected at 2.10 U/g Hb. Our study found that higher cutoffs, 2.35 and 2.55 U/g Hb, would increase the assay's sensitivity for identifying male and female G6PD-deficient neonates, respectively. We established a reliable cutoff value of G6PD activity with increased sensitivity in identifying female newborns with partial G6PD deficiency. Copyright © 2013 Elsevier B.V. All rights reserved.
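The cutoff trade-off in this abstract is easy to make concrete: a neonate is flagged as deficient when measured activity falls below the cutoff, so raising the cutoff buys sensitivity at the cost of specificity. The activity values (U/g Hb) below are synthetic, chosen so that heterozygous cases sit between the two cutoffs.

```python
# Sketch of screening-cutoff evaluation. A subject is called deficient when
# activity < cutoff; the synthetic heterozygous cases cluster between the two
# cutoffs, so the higher cutoff catches cases the lower one misses.

def sensitivity_specificity(activities, deficient, cutoff):
    """deficient: booleans giving the true (confirmed) status."""
    tp = sum(1 for a, d in zip(activities, deficient) if d and a < cutoff)
    fn = sum(1 for a, d in zip(activities, deficient) if d and a >= cutoff)
    tn = sum(1 for a, d in zip(activities, deficient) if not d and a >= cutoff)
    fp = sum(1 for a, d in zip(activities, deficient) if not d and a < cutoff)
    return tp / (tp + fn), tn / (tn + fp)

activities = [1.2, 1.9, 2.2, 2.3, 2.4, 3.1, 3.5, 3.8, 4.0, 4.2]
deficient  = [True, True, True, True, True, False, False, False, False, False]

for cutoff in (2.10, 2.55):
    sens, spec = sensitivity_specificity(activities, deficient, cutoff)
    print(f"cutoff {cutoff}: sensitivity {sens:.2f}, specificity {spec:.2f}")
```

With these invented data the 2.10 cutoff misses three of five true cases, while 2.55 catches all five without losing specificity; real data would show the specificity cost.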
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rios Velazquez, E; Parmar, C; Narayan, V
Purpose: To compare the complementary value of quantitative radiomic features to that of radiologist-annotated semantic features in predicting EGFR mutations in lung adenocarcinomas. Methods: Pre-operative CT images of 258 lung adenocarcinoma patients were available. Tumors were segmented using the single-click ensemble segmentation algorithm. A set of radiomic features was extracted using 3D-Slicer. Test-retest reproducibility and unsupervised dimensionality reduction were applied to select a subset of reproducible and independent radiomic features. Twenty semantic annotations were scored by an expert radiologist, describing the tumor, surrounding tissue and associated findings. Minimum-redundancy-maximum-relevance (MRMR) was used to identify the most informative radiomic and semantic features in 172 patients (training set, temporal split). Radiomic, semantic and combined radiomic-semantic logistic regression models to predict EGFR mutations were evaluated in an independent validation dataset of 86 patients using the area under the receiver operating characteristic curve (AUC). Results: EGFR mutations were found in 77/172 (45%) and 39/86 (45%) of the training and validation sets, respectively. Univariate AUCs showed a similar range for both feature types: radiomics median AUC = 0.57 (range: 0.50–0.62); semantic median AUC = 0.53 (range: 0.50–0.64, Wilcoxon p = 0.55). After MRMR feature selection, the best-performing radiomic, semantic, and radiomic-semantic logistic regression models for EGFR mutations showed validation AUCs of 0.56 (p = 0.29), 0.63 (p = 0.063) and 0.67 (p = 0.004), respectively. Conclusion: Quantitative volumetric and textural radiomic features complement the qualitative and semi-quantitative radiologist annotations. The prognostic value of informative qualitative semantic features such as cavitation and lobulation is increased by the addition of quantitative textural features from the tumor region.
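The MRMR step mentioned above balances a feature's relevance to the outcome against its redundancy with features already chosen. The toy sketch below is an illustrative stand-in, not the study's exact configuration: it uses absolute Pearson correlation for both criteria, on synthetic data where the target depends on two signals and one feature is a near-copy of another.

```python
import random

# Toy minimum-redundancy-maximum-relevance (MRMR) selection using |Pearson r|
# as both the relevance and the redundancy measure. Data are synthetic: the
# target is s1 + s2, "s1_copy" is a redundant near-copy of s1, and "noise" is
# irrelevant. Greedy MRMR should cover both signals and skip the near-copy.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def mrmr(features, target, k):
    """Greedily pick k features: high |corr with target|, low |corr with picks|."""
    relevance = {name: abs(pearson(col, target)) for name, col in features.items()}
    selected = [max(relevance, key=relevance.get)]
    while len(selected) < k:
        def score(name):
            redundancy = sum(abs(pearson(features[name], features[s]))
                             for s in selected) / len(selected)
            return relevance[name] - redundancy
        remaining = [n for n in features if n not in selected]
        selected.append(max(remaining, key=score))
    return selected

rng = random.Random(0)
s1 = [rng.gauss(0, 1) for _ in range(300)]
s2 = [rng.gauss(0, 1) for _ in range(300)]
target = [a + b for a, b in zip(s1, s2)]
features = {
    "s1": s1,
    "s1_copy": [v + 0.01 * rng.gauss(0, 1) for v in s1],  # redundant near-copy
    "s2": s2,
    "noise": [rng.gauss(0, 1) for _ in range(300)],
}

picked = mrmr(features, target, 2)
print(picked)  # covers both signals; the near-copy is penalized as redundant
```

Mutual information is the more common choice for both criteria in MRMR; correlation keeps this sketch dependency-free.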
Enhanced vision flight deck technology for commercial aircraft low-visibility surface operations
NASA Astrophysics Data System (ADS)
Arthur, Jarvis J., III; Norman, R. Michael; Kramer, Lynda J.; Prinzel, Lawrence J., III; Ellis, Kyle K. E.; Harrison, Stephanie J.; Comstock, J. Ray
2013-05-01
NASA Langley Research Center and the FAA collaborated in an effort to evaluate the effect of Enhanced Vision (EV) technology display in a commercial flight deck during low visibility surface operations. Surface operations were simulated at the Memphis, TN (FAA identifier: KMEM) airfield during nighttime with 500 ft Runway Visual Range (RVR) in a high-fidelity, full-motion simulator. Ten commercial airline flight crews evaluated the efficacy of various EV display locations and parallax and minification effects. The paper discusses qualitative and quantitative results of the simulation experiment, including the effect of EV display placement on visual attention, measured with non-obtrusive oculometry, and on pilot mental workload. The results demonstrated the potential of EV technology to enhance situation awareness, which depends on the ease of access and location of the displays. Implications and future directions are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jung, Euihan; Hwang, Gwangseok; Chung, Jaehun
2015-01-26
Performance degradation resulting from efficiency droop during high-power operation is a critical problem in the development of high-efficiency light-emitting diodes (LEDs). To resolve the efficiency droop and increase the external quantum efficiency of LEDs, the droop's origin must first be identified. To experimentally investigate the cause of efficiency droop, we used null-point scanning thermal microscopy to quantitatively profile the temperature distribution on the cross section of the epi-layers of an operating GaN-based vertical LED with nanoscale spatial resolution at four different current densities. The movement of the temperature peak towards the p-GaN side as the current density increases suggests that more heat is generated by leakage current than by Auger recombination. We therefore suspect that at higher current densities, current leakage becomes the dominant cause of the droop problem.
Comparative analysis of quantitative efficiency evaluation methods for transportation networks
He, Yuxin; Qin, Jin; Hong, Jian
2017-01-01
An effective evaluation of transportation network efficiency can offer guidance for the optimal control of urban traffic. Based on the introduction and mathematical analysis of three quantitative evaluation methods for transportation network efficiency, this paper compares the information they measure, including network structure, traffic demand, travel choice behavior and other factors that affect network efficiency. Accordingly, the applicability of the various evaluation methods is discussed. Analysis of different transportation network examples shows that the Q-H method reflects well the influence of network structure, traffic demand and user route choice behavior on transportation network efficiency. In addition, the transportation network efficiency measured by this method and Braess's Paradox can be explained in terms of each other, indicating a better evaluation of the real operating condition of a transportation network. Analysis of the network efficiency calculated by the Q-H method also shows that a specific appropriate demand exists for a given transportation network. Meanwhile, under fixed demand, both the critical network structure that guarantees the stability and basic operation of the network and the specific network structure yielding the largest transportation network efficiency can be identified. PMID:28399165
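The Braess connection mentioned above can be made concrete with the classic textbook network (not one of the paper's own examples): 4000 drivers travel from start to end over two routes, each consisting of one fixed 45-minute link and one variable link costing x/100 minutes when x drivers use it.

```python
# Classic Braess paradox, computed directly. Adding a free shortcut between
# the two routes' midpoints worsens the equilibrium travel time for everyone.

N = 4000  # drivers

# Without the shortcut, symmetry splits traffic evenly; each driver needs
# 2000/100 + 45 minutes.
time_without = (N / 2) / 100 + 45

# A zero-cost shortcut joins the two variable links. A variable link costs at
# most 4000/100 = 40 < 45 minutes, so taking both variable links dominates any
# route using a fixed link; at equilibrium every driver needs 40 + 0 + 40.
time_with = N / 100 + 0 + N / 100

print(time_without, time_with)  # 65.0 80.0
```

This is the "paradox" a network-efficiency measure should register: the augmented network is structurally richer but operates worse under user-equilibrium route choice.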
Technological innovation in neurosurgery: a quantitative study.
Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar
2015-07-01
Technological innovation within health care may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technology-intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical techniques. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. The authors searched a patent database for records published between 1960 and 2010 using the Boolean search term "neurosurgeon OR neurosurgical OR neurosurgery." The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top-performing technology cluster was then selected as an exemplar for a more detailed analysis of individual patents. In all, 11,672 patents and 208,203 publications related to neurosurgery were identified. The top-performing technology clusters during these 50 years were image-guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes, and endoscopes. In relation to image-guidance and neuromodulation devices, the authors found a highly correlated rapid rise in the numbers of patents and publications, which suggests that these are areas of technology expansion. An in-depth analysis of neuromodulation-device patents revealed that the majority of well-performing patents were related to deep brain stimulation. Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery.
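The core computation behind the growth-curve correlation is small enough to sketch: count records per year for a cluster, then correlate the patent and publication series. The yearly counts below are invented for illustration.

```python
from collections import Counter

# Sketch of the patent/publication correlation idea: build per-year count
# series for a technology cluster and correlate them. Years are invented.

patent_years = [1995, 1995, 1996, 1997, 1997, 1997, 1998, 1998, 1998, 1998]
paper_years  = [1995, 1996, 1996, 1997, 1997, 1998, 1998, 1998]

def yearly_counts(years, span):
    c = Counter(years)
    return [c.get(y, 0) for y in span]

span = range(1995, 1999)
patents = yearly_counts(patent_years, span)
papers = yearly_counts(paper_years, span)

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

print(patents, papers, round(pearson(patents, papers), 2))
```

A strongly positive correlation between the two series is what the authors read as simultaneous technology development and clinical translation.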
Analysis of rocket engine injection combustion processes
NASA Technical Reports Server (NTRS)
Salmon, J. W.
1976-01-01
A critique is given of the JANNAF subcritical propellant injection/combustion process analysis computer models, together with application of the models to correlation of well-documented hot-fire engine data bases. These programs are the distributed energy release (DER) model for conventional liquid propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies, while the computer analyses provide quantitative data on predictive accuracy. The program comprises three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.
Biological monitoring of Upper Three Runs Creek, Savannah River Plant, Aiken County, South Carolina
DOE Office of Scientific and Technical Information (OSTI.GOV)
Specht, W.L.
1991-10-01
In anticipation of the fall 1988 start-up of effluent discharges into Upper Three Runs Creek by the F/H Area Effluent Treatment Facility of the Savannah River Site, Aiken, SC, a two-and-one-half-year biological study was initiated in June 1987. Upper Three Runs Creek is an intensively studied fourth-order stream known for its high species richness. Designed to assess the potential impact of F/H Area effluent on the creek, the study includes qualitative and quantitative macroinvertebrate stream surveys at five sites, chronic toxicity testing of the effluent, water chemistry, and bioaccumulation analysis. This final report presents the results of both pre-operational and post-operational qualitative and quantitative (artificial substrate) macroinvertebrate studies. Six quantitative and three qualitative studies were conducted prior to the initial release of the F/H ETF effluent, and five quantitative and two qualitative studies were conducted post-operationally.
NASA Astrophysics Data System (ADS)
Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Strohbach, Jens; Förstner, Jochen; Potthast, Roland
2017-12-01
A new backscatter lidar forward operator was developed which is based on the distinct calculation of the aerosols' backscatter and extinction properties. The forward operator was adapted to the COSMO-ART ash dispersion simulation of the Eyjafjallajökull eruption in 2010. While the particle number concentration was provided as a model output variable, the scattering properties of each individual particle type were determined by dedicated scattering calculations. Sensitivity studies were performed to estimate the uncertainties related to the assumed particle properties. Scattering calculations for several types of non-spherical particles required the use of T-matrix routines. Due to the distinct calculation of the backscatter and extinction properties of the model's volcanic ash size classes, the sensitivity studies could be made for each size class individually, which is not the case for forward models based on a fixed lidar ratio. Finally, the forward-modeled lidar profiles were compared to automated ceilometer lidar (ACL) measurements both qualitatively and quantitatively, with the attenuated backscatter coefficient chosen as a suitable physical quantity for comparison. As the ACL measurements were not calibrated automatically, their calibration had to be performed using satellite lidar and ground-based Raman lidar measurements. A slight overestimation of the model-predicted volcanic ash number density was observed. Major requirements for future assimilation of ACL data have been identified, namely, the availability of calibrated lidar measurement data, a scattering database for atmospheric aerosols, a better representation and coverage of aerosols by the ash dispersion model, and further investigation into backscatter lidar forward operators that calculate the backscatter coefficient directly for each individual aerosol type. The introduced forward operator offers the flexibility to be adapted to a multitude of model systems and measurement setups.
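The quantity at the heart of such a forward operator is the attenuated backscatter: the backscatter profile damped by two-way transmission through the extinction profile. A minimal sketch, with invented profile values and a simple rectangle-rule integration:

```python
import math

# Sketch of the core of a backscatter lidar forward operator: given profiles
# of backscatter beta(z) [1/(m sr)] and extinction alpha(z) [1/m] on a range
# grid, compute the attenuated backscatter the instrument would observe.
# Profile values below are invented for illustration.

def attenuated_backscatter(beta, alpha, dz):
    """Apply two-way transmission exp(-2 * optical depth) at each range gate."""
    out, optical_depth = [], 0.0
    for b, a in zip(beta, alpha):
        optical_depth += a * dz  # rectangle-rule integration of extinction
        out.append(b * math.exp(-2.0 * optical_depth))
    return out

dz = 100.0                     # 100 m range gates
beta = [1e-6, 5e-6, 1e-6]      # aerosol layer in the middle gate
alpha = [5e-5, 2.5e-4, 5e-5]   # a lidar ratio of 50 sr is assumed here

att = attenuated_backscatter(beta, alpha, dz)
print(att)  # each gate is attenuated, and deeper gates more strongly
```

The operator described in the abstract computes beta and alpha per particle type from scattering calculations rather than assuming a fixed lidar ratio; the 50 sr ratio above is purely an assumption of this sketch.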
NASA Technical Reports Server (NTRS)
Shearer, C. K.; Eppler, D.; Farrell, W.; Gruener, J.; Lawrence, S.; Pellis, N.; Spudis, P. D.; Stopar, J.; Zeigler, R.; Neal, C.
2016-01-01
The Lunar Exploration Analysis Group (LEAG) was tasked by the Human Exploration Operations Mission Directorate (HEOMD) to establish a Specific Action Team (SAT) to review lunar Strategic Knowledge Gaps (SKGs) within the context of new lunar data and some specific human mission scenarios. Within this review, the SAT was to identify the SKGs that have been fully or partially retired, identify new SKGs resulting from new data and observations, and review quantitative descriptions of measurements that are required to fill knowledge gaps, the fidelity of the measurements needed, and if relevant, provide examples of existing instruments or potential missions capable of filling the SKGs.
NASA Astrophysics Data System (ADS)
Goddard, Braden
The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges which need to be overcome, the largest of these being the challenge of having high-precision active and passive measurements to produce results with acceptably small uncertainties.
Rudnick, Paul A.; Clauser, Karl R.; Kilpatrick, Lisa E.; Tchekhovskoi, Dmitrii V.; Neta, Pedatsur; Blonder, Nikša; Billheimer, Dean D.; Blackman, Ronald K.; Bunk, David M.; Cardasis, Helene L.; Ham, Amy-Joan L.; Jaffe, Jacob D.; Kinsinger, Christopher R.; Mesri, Mehdi; Neubert, Thomas A.; Schilling, Birgit; Tabb, David L.; Tegeler, Tony J.; Vega-Montoto, Lorenzo; Variyath, Asokan Mulayath; Wang, Mu; Wang, Pei; Whiteaker, Jeffrey R.; Zimmerman, Lisa J.; Carr, Steven A.; Fisher, Susan J.; Gibson, Bradford W.; Paulovich, Amanda G.; Regnier, Fred E.; Rodriguez, Henry; Spiegelman, Cliff; Tempst, Paul; Liebler, Daniel C.; Stein, Stephen E.
2010-01-01
A major unmet need in LC-MS/MS-based proteomics analyses is a set of tools for quantitative assessment of system performance and evaluation of technical variability. Here we describe 46 system performance metrics for monitoring chromatographic performance, electrospray source stability, MS1 and MS2 signals, dynamic sampling of ions for MS/MS, and peptide identification. Applied to data sets from replicate LC-MS/MS analyses, these metrics displayed consistent, reasonable responses to controlled perturbations. The metrics typically displayed variations less than 10% and thus can reveal even subtle differences in performance of system components. Analyses of data from interlaboratory studies conducted under a common standard operating procedure identified outlier data and provided clues to specific causes. Moreover, interlaboratory variation reflected by the metrics indicates which system components vary the most between laboratories. Application of these metrics enables rational, quantitative quality assessment for proteomics and other LC-MS/MS analytical applications. PMID:19837981
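The "variations less than 10%" observation above suggests a simple monitoring rule: compute each metric's coefficient of variation (CV) across replicate runs and flag metrics that exceed the threshold. Metric names and values below are invented.

```python
import statistics

# Sketch of replicate-to-replicate variability monitoring: compute each
# metric's coefficient of variation across repeat LC-MS/MS runs and flag
# metrics above a 10% threshold. Values are invented for illustration.

def cv(values):
    """Coefficient of variation: sample standard deviation over the mean."""
    return statistics.stdev(values) / statistics.mean(values)

replicate_metrics = {
    "median_peak_width_s": [18.0, 18.5, 17.8, 18.2],     # stable
    "ms1_median_intensity": [1.0e6, 1.1e6, 0.7e6, 1.4e6],  # drifting source?
}

flagged = {name: round(cv(vals), 3)
           for name, vals in replicate_metrics.items() if cv(vals) > 0.10}
print(flagged)
```

In an interlaboratory setting, the same computation across sites (rather than across replicates) indicates which system components vary the most between laboratories.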
Potential protein biomarkers for burning mouth syndrome discovered by quantitative proteomics
Ji, Eoon Hye; Diep, Cynthia; Liu, Tong; Li, Hong; Merrill, Robert; Messadi, Diana
2017-01-01
Burning mouth syndrome (BMS) is a chronic pain disorder characterized by a severe burning sensation in normal-looking oral mucosa. Diagnosis of BMS remains a challenge for oral healthcare professionals because a method for definitive diagnosis is still lacking. In this study, a quantitative saliva proteomic analysis was performed to identify target proteins in BMS patients' saliva that may serve as biomarkers for simple, non-invasive detection of the disease. Using isobaric tags for relative and absolute quantitation (iTRAQ) labeling and liquid chromatography-tandem mass spectrometry to quantify 1130 salivary proteins between BMS patients and healthy control subjects, we found that 50 proteins were significantly changed in BMS patients compared to healthy controls (p ≤ 0.05; 39 up-regulated and 11 down-regulated). Four candidates, alpha-enolase, interleukin-18 (IL-18), kallikrein-13 (KLK13), and cathepsin G, were selected for further validation. Based on enzyme-linked immunosorbent assay measurements, three potential biomarkers, alpha-enolase, IL-18, and KLK13, were successfully validated. The fold changes for alpha-enolase, IL-18, and KLK13 were determined as 3.6, 2.9, and 2.2 (BMS vs. control), and the corresponding receiver operating characteristic curve values were determined as 0.78, 0.83, and 0.68, respectively. Our findings indicate that testing the identified protein biomarkers in saliva might be a valuable clinical tool for BMS detection. Further validation studies of the identified biomarkers or additional candidate biomarkers are needed to achieve a multi-marker prediction model for improved detection of BMS with high sensitivity and specificity. PMID:28326926
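The ROC values reported above (0.78, 0.83, 0.68) have a simple probabilistic reading: the AUC is the probability that a randomly chosen case shows a higher marker level than a randomly chosen control (the Mann-Whitney statistic). The ELISA-style values below are invented.

```python
# Sketch of the AUC computation behind a single-biomarker ROC value, via the
# Mann-Whitney formulation: count case/control pairs where the case is higher
# (ties count half). Marker levels are invented for illustration.

def auc(cases, controls):
    wins = sum((c > k) + 0.5 * (c == k) for c in cases for k in controls)
    return wins / (len(cases) * len(controls))

cases = [3.1, 2.8, 3.6, 2.2, 3.0]      # e.g. a salivary marker, BMS patients
controls = [1.9, 2.5, 2.1, 2.9, 1.7]   # healthy controls

print(auc(cases, controls))
```

An AUC of 0.5 means the marker carries no discriminating information; swapping the two groups gives 1 minus the AUC.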
Wang, Yan; Zhu, Wenhui; Duan, Xingxing; Zhao, Yongfeng; Liu, Wengang; Li, Ruizhen
2011-04-01
To evaluate intraventricular systolic dyssynchrony in rats with post-infarction heart failure by quantitative tissue velocity imaging combined with synchronous electrocardiography. A total of 60 male SD rats were randomly assigned to 3 groups: a 4-week post-operative group and an 8-week post-operative group (each n=25, with the anterior descending branch of the left coronary artery ligated), and a sham operation group (n=10, with thoracotomy and open pericardium but no ligation of the artery). The time to peak systolic velocity of regional myocardium was measured and an index of left intraventricular dyssynchrony was calculated. All indexes of heart function except the left ventricle index became lower in the post-operative groups as the heart failure worsened. All indexes of dyssynchrony lengthened in the post-operative groups (P<0.05), while the changes in the sham operation group were not significant (P>0.05). Quantitative tissue velocity imaging combined with synchronous electrocardiography can accurately analyse intraventricular systolic dyssynchrony.
Pujol, Laure; Albert, Isabelle; Magras, Catherine; Johnson, Nicholas Brian; Membré, Jeanne-Marie
2015-11-20
In a previous study, a modular process risk model, from raw material reception to final product storage, was built to estimate the risk of a UHT-aseptic line not complying with commercial sterility (Pujol et al., 2015). The present study focused on demonstrating how the model (an updated version with uncertainty and variability separated and a 2nd-order Monte Carlo procedure run) can be used to assess quantitatively the influence of management options. This assessment was done in three steps: pinpoint which process step had the highest influence on the risk, identify which management option(s) could be most effective in controlling and/or reducing the risk, and finally evaluate quantitatively the influence of changing process setting(s) on the risk. For Bacillus cereus, it was identified that during post-process storage in an aseptic tank there was potential air re-contamination due to filter efficiency loss (caused by successive in-place sterilizations after cleaning operations), followed by B. cereus growth. Two options were then evaluated: i) reducing the number of filter sterilizations before filter renewal by one fifth; ii) designing new UHT-aseptic lines without an aseptic tank, i.e., without a storage period between the thermal process and filling. Considering the uncertainty in the model, it was not possible to confirm whether these options had a significant influence on the risk associated with B. cereus. On the other hand, for Geobacillus stearothermophilus, combinations of heat-treatment time and temperature enabling the risk to be controlled or reduced by a factor of ca. 100 were determined; for ease of operational implementation, they are presented graphically in the form of iso-risk curves.
For instance, it was established that a heat treatment of 138°C for 31 s (instead of 138°C for 25 s) enabled a reduction in risk to 18×10⁻⁸ (95% CI = [10; 34]×10⁻⁸), instead of 578×10⁻⁸ (95% CI = [429; 754]×10⁻⁸) initially. In conclusion, a modular risk model, such as the one exemplified here with a UHT-aseptic line, is a valuable tool in process design and operation, bringing definitive quantitative elements into the decision-making process. Copyright © 2015 Elsevier B.V. All rights reserved.
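An iso-risk curve of the kind described pairs heat-treatment times and temperatures that deliver equal lethality. Under the standard log-linear thermal death model, equivalent time scales as t(T) = t_ref · 10^((T_ref − T)/z). The z-value of 10°C below is an illustrative assumption, not the study's fitted model.

```python
# Sketch of an iso-risk (equal-lethality) curve under a log-linear thermal
# death model: holding the F-value constant, the required time at temperature
# T is t_ref * 10**((T_ref - T) / z). The z-value is an assumption here.

def equivalent_time(t_ref, temp_ref, temp, z=10.0):
    return t_ref * 10 ** ((temp_ref - temp) / z)

# Points on the iso-risk curve passing through 138 C for 31 s:
for temp in (135.0, 138.0, 141.0):
    print(temp, round(equivalent_time(31.0, 138.0, temp), 1))
```

Plotting such time-temperature pairs for several target risk levels yields the family of iso-risk curves the authors propose for operational use.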
A composite score associated with spontaneous operational tolerance in kidney transplant recipients.
Danger, Richard; Chesneau, Mélanie; Paul, Chloé; Guérif, Pierrick; Durand, Maxim; Newell, Kenneth A; Kanaparthi, Sai; Turka, Laurence A; Soulillou, Jean-Paul; Houlgatte, Rémi; Giral, Magali; Ramstein, Gérard; Brouard, Sophie
2017-06-01
New challenges in renal transplantation include using biological information to devise a useful clinical test for discerning high- and low-risk patients for individual therapy and ascertaining the best combination and appropriate dosages of drugs. Based on a 20-gene signature from a microarray meta-analysis performed on 46 operationally tolerant patients and 266 renal transplant recipients with stable function, we applied the sparse Bolasso methodology to identify a minimal and robust combination of six genes and two demographic parameters associated with operational tolerance. This composite score of operational tolerance discriminated operationally tolerant patients with an area under the curve of 0.97 (95% confidence interval 0.94-1.00). The score was not influenced by immunosuppressive treatment, center of origin, donor type, or post-transplant lymphoproliferative disorder history of the patients. This composite score of operational tolerance was significantly associated with both de novo anti-HLA antibodies and tolerance loss. It was validated by quantitative polymerase chain reaction using independent samples and demonstrated specificity toward a model of tolerance induction. Thus, our score would allow clinicians to improve follow-up of patients, paving the way for individual therapy. Copyright © 2017 International Society of Nephrology. All rights reserved.
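The sparse Bolasso methodology cited above combines the lasso with bootstrap resampling: variables selected in (essentially) every resample form the robust signature. A self-contained toy sketch, with a small coordinate-descent lasso and synthetic data in which two of six variables carry signal; all of this is illustrative, not the study's implementation.

```python
import random

# Toy Bolasso sketch: fit an L1 (lasso) regression on bootstrap resamples and
# keep only variables selected in every run. The data are synthetic: columns
# 0 and 3 carry signal, the rest are noise.

def lasso(X, y, alpha, sweeps=50):
    """Coordinate descent for (1/(2n))*||y - Xw||^2 + alpha*||w||_1."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    resid = list(y)  # residual y - Xw, maintained incrementally
    col_sq = [sum(row[j] ** 2 for row in X) for j in range(p)]
    for _ in range(sweeps):
        for j in range(p):
            rho = sum(X[i][j] * resid[i] for i in range(n)) + w[j] * col_sq[j]
            soft = max(abs(rho) - n * alpha, 0.0)  # soft-thresholding
            new_wj = (soft if rho > 0 else -soft) / col_sq[j]
            if new_wj != w[j]:
                delta = new_wj - w[j]
                for i in range(n):
                    resid[i] -= delta * X[i][j]
                w[j] = new_wj
    return w

rng = random.Random(2)
n, p = 100, 6
X = [[rng.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [2.0 * row[0] - 1.5 * row[3] + 0.5 * rng.gauss(0, 1) for row in X]

n_boot = 30
counts = [0] * p
for _ in range(n_boot):
    idx = [rng.randrange(n) for _ in range(n)]
    w = lasso([X[i] for i in idx], [y[i] for i in idx], alpha=0.1)
    for j in range(p):
        counts[j] += w[j] != 0.0

stable = [j for j in range(p) if counts[j] == n_boot]
print(stable)  # the signal variables survive every bootstrap run
```

Noise variables occasionally enter a single lasso fit, but requiring selection across all resamples filters them out, which is the intuition behind the minimal six-gene combination.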
A composite score associated with spontaneous operational tolerance in kidney transplant recipients
Danger, Richard; Chesneau, Mélanie; Paul, Chloé; Guérif, Pierrick; Durand, Maxim; Newell, Kenneth A; Kanaparthi, Sai; Turka, Laurence A; Soulillou, Jean-Paul; Houlgatte, Rémi; Giral, Magali; Ramstein, Gérard; Brouard, Sophie
2017-01-01
New challenges in renal transplantation include using biological information to devise a useful clinical test for discerning high- and low-risk patients for individual therapy and ascertaining the best combination and appropriate dosages of drugs. Based on a 20-gene signature from a microarray meta-analysis performed on 46 operationally tolerant patients and 266 renal transplanted recipients with stable function, we applied the sparse Bolasso methodology to identify a minimal and robust combination of six genes and two demographic parameters associated with operational tolerance. This composite score of operational tolerance discriminated operationally tolerant patients with an area under the curve of 0.97 (95% confidence interval 0.94–1.00). The score was not influenced by immunosuppressive treatment, center of origin, donor type, or post-transplant lymphoproliferative disorder history of the patients. This composite score of operational tolerance was significantly associated with both de novo anti-HLA antibodies and tolerance loss. It was validated by quantitative polymerase chain reaction using independent samples and demonstrated specificity toward a model of tolerance induction. Thus, our score would allow clinicians to improve follow-up of patients, paving the way for individual therapy. PMID:28242033
Statistical Evaluation of Causal Factors Associated with Astronaut Shoulder Injury in Space Suits.
Anderson, Allison P; Newman, Dava J; Welsch, Roy E
2015-07-01
Shoulder injuries caused by working inside the space suit are among the most serious and debilitating injuries astronauts encounter. Space suit injuries occur primarily in the Neutral Buoyancy Laboratory (NBL) underwater training facility due to accumulated musculoskeletal stress. We quantitatively explored the underlying causal mechanisms of injury. Logistic regression was used to identify space suit components, training environment variables, and anthropometric dimensions related to an increased propensity for space-suited injury. Two groups of subjects were analyzed: those whose reported shoulder incident is attributable to the NBL or working in the space suit, and those whose shoulder incident began during active duty, meaning working in the suit could be a contributing factor. For both groups, the percentage of training performed in the space suit's planar hard upper torso (HUT) was the most important predictor variable for injury. Frequency of training and recovery time between trainings were also significant metrics. The most relevant anthropometric dimensions were bideltoid breadth, expanded chest depth, and shoulder circumference. Finally, a record of previous injury was found to be a relevant predictor of subsequent injury. The first statistical model correctly identifies 39% of injured subjects, while the second model correctly identifies 68% of injured subjects. A review of the literature suggests this is the first work to quantitatively evaluate the hypothesized causal mechanisms of space-suited shoulder injuries. Although limited in predictive capability, each of the identified variables can be monitored and modified operationally to reduce future impacts on astronaut health.
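The study used logistic regression; a simpler univariate screen of the same flavor is the odds ratio for injury given a binary exposure (say, whether most training used the planar HUT), with a Wald confidence interval. The 2×2 counts below are invented, not the study's data.

```python
import math

# Sketch of a univariate predictor screen: odds ratio with a Wald 95% CI from
# a 2x2 table. Counts are invented for illustration only.

def odds_ratio_ci(a, b, c, d):
    """Table: a=exposed injured, b=exposed uninjured, c/d = unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# e.g. 20/50 injured among mostly-planar-HUT trainees vs 8/50 otherwise
print(odds_ratio_ci(20, 30, 8, 42))
```

A CI excluding 1 flags the variable as a candidate for the multivariable logistic model; the logistic coefficient of a binary predictor is the log of exactly this odds ratio (adjusted for the other covariates).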
Yost, Erin E; Stanek, John; DeWoskin, Robert S; Burgoon, Lyle D
2016-07-19
The United States Environmental Protection Agency (EPA) identified 1173 chemicals associated with hydraulic fracturing fluids, flowback, or produced water, of which 1026 (87%) lack chronic oral toxicity values for human health assessments. To facilitate the ranking and prioritization of chemicals that lack toxicity values, it may be useful to employ toxicity estimates from quantitative structure-activity relationship (QSAR) models. Here we describe an approach for applying the results of a QSAR model from the TOPKAT program suite, which provides estimates of the rat chronic oral lowest-observed-adverse-effect level (LOAEL). Of the 1173 chemicals, TOPKAT was able to generate LOAEL estimates for 515 (44%). To address the uncertainty associated with these estimates, we assigned qualitative confidence scores (high, medium, or low) to each TOPKAT LOAEL estimate, and found 481 to be high-confidence. For 48 chemicals that had both a high-confidence TOPKAT LOAEL estimate and a chronic oral reference dose from EPA's Integrated Risk Information System (IRIS) database, Spearman rank correlation identified 68% agreement between the two values (permutation p-value = 1 × 10⁻¹¹). These results provide support for the use of TOPKAT LOAEL estimates in identifying and prioritizing potentially hazardous chemicals. High-confidence TOPKAT LOAEL estimates were available for 389 of the 1026 hydraulic fracturing-related chemicals that lack chronic oral reference values (RfVs) and oral slope factors (OSFs) from EPA-identified sources, including a subset of chemicals that are frequently used in hydraulic fracturing fluids.
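The Spearman-plus-permutation comparison can be sketched in stdlib Python as follows; the paired values here are simulated stand-ins for the 48 TOPKAT-IRIS pairs, and the permutation test is one-sided for positive rank correlation:

```python
import random

def rankdata(xs):
    """Average ranks (1-based); ties share the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def perm_pvalue(x, y, n_perm=2000, seed=0):
    """One-sided permutation p-value for positive rank correlation."""
    rng = random.Random(seed)
    obs = spearman(x, y)
    hits = sum(1 for _ in range(n_perm)
               if spearman(x, rng.sample(y, len(y))) >= obs)
    return obs, (hits + 1) / (n_perm + 1)

# Hypothetical paired values: TOPKAT LOAEL estimates vs. IRIS reference doses.
rng = random.Random(42)
topkat = [rng.random() for _ in range(48)]
iris = [t + 0.3 * rng.random() for t in topkat]
rho, p = perm_pvalue(topkat, iris)
```

With correlated pairs the observed rho beats essentially every permutation, giving a tiny p-value in the spirit of the paper's 1 × 10⁻¹¹.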
Gandomkar, Ziba; Brennan, Patrick C.; Mello-Thoms, Claudia
2017-01-01
Context: Previous studies showed that the agreement among pathologists in recognition of mitoses in breast slides is fairly modest. Aims: Determining the significantly different quantitative features among easily identifiable mitoses, challenging mitoses, and miscounted nonmitoses within breast slides, and identifying which color spaces capture the differences among groups better than others. Materials and Methods: The dataset contained 453 mitoses and 265 miscounted objects in breast slides. The mitoses were grouped into three categories based on the confidence degree of the three pathologists who annotated them. The mitoses annotated as “probably a mitosis” by the majority of pathologists were considered the challenging category. The miscounted objects were recognized as a mitosis or probably a mitosis by only one of the pathologists. The mitoses were segmented using k-means clustering, followed by morphological operations. Morphological, intensity-based, and textural features were extracted from the segmented area and also from an image patch of 63 × 63 pixels in different channels of eight color spaces. Holistic features describing the mitoses' surrounding cells of each image were also extracted. Statistical Analysis Used: The Kruskal–Wallis H-test followed by the Tukey-Kramer test was used to identify significantly different features. Results: The results indicated that challenging mitoses were smaller and rounder compared to other mitoses. Among the different features, the Gabor textural features differed more than others between challenging mitoses and the easily identifiable ones. Sizes of the nonmitoses were similar to easily identifiable mitoses, but nonmitoses were rounder. The intensity-based features from chromatin channels were the most discriminative features between the easily identifiable mitoses and the miscounted objects. Conclusions: Quantitative features can be used to describe the characteristics of challenging mitoses and miscounted nonmitotic objects.
PMID:28966834
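The Kruskal-Wallis step in the study above can be illustrated with a stdlib-Python sketch; the feature values below are invented (they only mimic the finding that challenging mitoses were smaller), and this minimal H statistic omits the tie correction:

```python
def kruskal_h(*groups):
    """Kruskal-Wallis H statistic: rank all observations together and
    compare rank sums across groups (no tie correction in this sketch)."""
    pooled = [v for g in groups for v in g]
    order = sorted(range(len(pooled)), key=lambda i: pooled[i])
    rank = [0.0] * len(pooled)
    for r, i in enumerate(order, start=1):
        rank[i] = r
    n = len(pooled)
    h, start = 0.0, 0
    for g in groups:
        rsum = sum(rank[start:start + len(g)])
        h += rsum ** 2 / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3.0 * (n + 1)

# Hypothetical area feature (pixels) for the three object categories:
easy = [410, 395, 450, 430, 420, 415]
challenging = [300, 310, 290, 305, 295, 315]   # smaller, per the abstract
miscounted = [400, 390, 425, 410, 405, 435]
h_stat = kruskal_h(easy, challenging, miscounted)
```

A large H (compared against a chi-squared distribution with k−1 degrees of freedom) flags the feature as significantly different across groups, which is then followed by a post hoc test such as Tukey-Kramer.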
NASA Astrophysics Data System (ADS)
Molthan, A.; Fuell, K. K.; Berndt, E.; Schultz, L. A.
2016-12-01
The NASA/SPoRT Program supports the NOAA/JPSS program through the transition of S-NPP VIIRS and CrIS/ATMS products to prepare users for the upcoming JPSS-1/-2 missions. Several multispectral (i.e., RGB) imagery products can be created from VIIRS based on internationally accepted recipes developed by EUMETSAT. Initial transition of a Nighttime Microphysics RGB to operations revealed improved distinction between low clouds and fog compared with legacy satellite imagery, and hence improvement in short-term aviation and public forecasts. An increased number of S-NPP passes at high latitude, combined with other instruments, led to a series of "microphysical" RGBs being introduced to NWS forecasters in Alaska at both local weather offices and regional aviation centers. Forecasters in Alaska also applied VIIRS microphysical RGBs to identify small-scale features such as valley/coastal fog, volcanic ash, and convective precipitation. Further use of a "Dust" RGB in the U.S. southwest led to changes in NWS forecast products due to improvements in detection and monitoring of dust aloft. As multispectral imagery has gained operational acceptance, additional work has begun to develop quantitative products to assist users with their interpretation of RGB imagery. For example, National Center forecasters often use an "Air Mass" RGB to differentiate between possible stratospheric/tropospheric interactions, moist tropical air masses, and cool continental/maritime air masses. Research was done to demonstrate how the NUCAPS CrIS/ATMS infrared retrieved temperature, moisture, and ozone profiles can aid Air Mass RGB imagery interpretation, as well as how these quantitative values are important for anticipating tropical-to-extratropical transition events. In addition, an enhanced stratospheric depth product was developed to identify the dynamic tropopause from the NUCAPS retrieved ozone profiles to aid identification of stratospheric air influence.
Forecasters from National Centers evaluated the NUCAPS profiles as a tool for anticipating extratropical transition during the latter half of the 2016 hurricane season. Examples of multispectral and sounding product impacts in near-realtime operations from VIIRS and CrIS/ATMS are presented here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Specht, W.L.
1991-10-01
In anticipation of the fall 1988 start-up of effluent discharges into Upper Three Runs Creek by the F/H Area Effluent Treatment Facility (ETF) of the Savannah River Site, Aiken, SC, a two-and-one-half-year biological study was initiated in June 1987. Upper Three Runs Creek is an intensively studied fourth-order stream known for its high species richness. Designed to assess the potential impact of F/H area effluent on the creek, the study includes qualitative and quantitative macroinvertebrate stream surveys at five sites, chronic toxicity testing of the effluent, water chemistry, and bioaccumulation analysis. This final report presents the results of both pre-operational and post-operational qualitative and quantitative (artificial substrate) macroinvertebrate studies. Six quantitative and three qualitative studies were conducted prior to the initial release of the F/H ETF effluent, and five quantitative and two qualitative studies were conducted post-operationally.
Chin, Kok-Yong; Low, Nie Yen; Kamaruddin, Alia Annessa Ain; Dewiputri, Wan Ilma; Soelaiman, Ima-Nirwana
2017-01-01
Background Calcaneal quantitative ultrasound (QUS) is a useful tool in osteoporosis screening. However, a QUS device may not be available at all primary health care settings. The osteoporosis self-assessment tool for Asians (OSTA) is a simple algorithm for osteoporosis screening that does not require any sophisticated instruments. This study explored the possibility of replacing QUS with OSTA by determining their agreement in identifying individuals at risk of osteoporosis. Methods A cross-sectional study was conducted to recruit Malaysian men and women aged ≥50 years. Their bone health status was measured using a calcaneal QUS device and OSTA. The association between OSTA and QUS was determined using Spearman's correlation, and their agreement was assessed using Cohen's kappa and receiver operating characteristic curve analysis. Results All QUS indices correlated significantly with OSTA (p<0.05). The agreement between QUS and OSTA was minimal but statistically significant (p<0.05). The performance of OSTA in identifying subjects at risk of osteoporosis according to QUS was poor-to-fair in women (p<0.05), but not statistically significant for men (p>0.05). Changing the cut-off values improved the performance of OSTA in women but not in men. Conclusion The agreement between QUS and OSTA is minimal in categorizing individuals at risk of osteoporosis. Therefore, they cannot be used interchangeably in osteoporosis screening. PMID:29070951
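For illustration, a commonly cited form of the OSTA index is 0.2 × (weight in kg − age in years), truncated to an integer, with lower values indicating higher risk; the agreement statistic is Cohen's kappa. A sketch with hypothetical subjects (the risk labels below are invented, not the study's data):

```python
def osta_index(weight_kg, age_yr):
    """OSTA self-assessment index: 0.2 * (weight - age), truncated to an
    integer; values below -1 are often taken to flag elevated risk.
    (Formula as commonly published; cutoffs vary by population.)"""
    return int(0.2 * (weight_kg - age_yr))

def cohen_kappa(a, b):
    """Cohen's kappa for two raters' categorical labels:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(a)
    cats = sorted(set(a) | set(b))
    po = sum(1 for x, y in zip(a, b) if x == y) / n
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Hypothetical at-risk labels for ten subjects (1 = at risk):
qus_risk  = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]
osta_risk = [1, 0, 1, 0, 0, 0, 1, 0, 1, 0]
kappa = cohen_kappa(qus_risk, osta_risk)
```

A kappa in the 0.2-0.4 band is conventionally read as "minimal to fair" agreement, which is the range the abstract describes.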
Health risk to medical personnel of surgical smoke produced during laparoscopic surgery.
Dobrogowski, Miłosz; Wesolowski, Wiktor; Kucharska, Małgorzata; Paduszyńska, Katarzyna; Dworzyńska, Agnieszka; Szymczak, Wiesław; Sapota, Andrzej; Pomorski, Lech
2015-01-01
During laparoscopic cholecystectomy, the removal of the gall bladder, pyrolysis occurs in the peritoneal cavity. Chemical substances which are formed during this process escape into the operating room through trocars in the form of surgical smoke. The aim of this study was to identify and quantitatively measure a number of selected chemical substances found in surgical smoke and to assess the risk they carry to medical personnel. The study was performed at the Maria Skłodowska-Curie Memorial Provincial Specialist Hospital in Zgierz between 2011 and 2013. Air samples were collected in the operating room during laparoscopic cholecystectomy. A complete qualitative and quantitative analysis of the air samples showed a number of chemical substances present, such as aldehydes, benzene, toluene, ethylbenzene, xylene, ozone, dioxins and others. The concentrations of these substances were much lower than the hygienic standards allowed by the European Union Maximum Acceptable Concentration (MAC). The calculated risk of developing cancer as a result of exposure to surgical smoke during laparoscopic cholecystectomy is negligible. Yet it should be kept in mind that repeated exposure to a cocktail of these substances increases the possibility of developing adverse effects. Many of these compounds are toxic, and may possibly be carcinogenic, mutagenic or genotoxic. Therefore, it is necessary to remove surgical smoke from the operating room in order to protect medical personnel. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
A Quantitative Study Identifying Political Strategies Used by Principals of Dual Language Programs
ERIC Educational Resources Information Center
Girard, Guadalupe
2017-01-01
Purpose. The purpose of this quantitative study was to identify the external and internal political strategies used by principals that allow them to successfully navigate the political environment surrounding dual language programs. Methodology. This quantitative study used descriptive research to collect, analyze, and report data that identified…
Quantitative option analysis for implementation and management of landfills.
Kerestecioğlu, Merih
2016-09-01
The selection of the most feasible strategy for implementation of landfills is a challenging step. Potential implementation options for landfills cover a wide range, from conventional construction contracts to concessions. Montenegro, seeking to improve the efficiency of public services while maintaining affordability, was considering privatisation as a way to reduce public spending on service provision. In this study, to determine the most feasible model for construction and operation of a regional landfill, a quantitative risk analysis was implemented in five steps: (i) development of a global risk matrix; (ii) assignment of qualitative probabilities of occurrence and magnitudes of impact; (iii) determination of the risks to be mitigated, monitored, controlled or ignored; (iv) reduction of the main risk elements; and (v) incorporation of quantitative estimates of probability of occurrence and expected impact for each risk element in the reduced risk matrix. The evaluated scenarios were: (i) construction and operation of the regional landfill by the public sector; (ii) construction and operation of the landfill by the private sector and transfer of ownership to the public sector after a pre-defined period; and (iii) operation of the landfill by the private sector, without ownership. The quantitative risk assessment concluded that introduction of a public private partnership is not the most feasible option, unlike the common belief in several public institutions in developing countries. A management contract for the first years of operation was advised to be implemented, after which a long-term operating contract may follow. © The Author(s) 2016.
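Step (v) reduces to an expected-impact calculation per scenario: the sum of probability × impact over the risk elements, with the lowest total indicating the most feasible option. A sketch with invented scenario names and numbers (not the study's matrix):

```python
# Hypothetical reduced risk matrix: (probability of occurrence, impact in
# arbitrary monetary units) per risk element, for each scenario.
scenarios = {
    "public_build_operate": [(0.3, 10.0), (0.2, 5.0), (0.1, 20.0)],
    "private_build_transfer": [(0.2, 12.0), (0.4, 8.0), (0.1, 25.0)],
    "private_operate_only": [(0.25, 9.0), (0.2, 6.0), (0.15, 15.0)],
}

def expected_risk(elements):
    """Sum of probability x impact over a scenario's risk elements."""
    return sum(p * i for p, i in elements)

# Rank scenarios from lowest to highest expected risk.
ranked = sorted(scenarios, key=lambda s: expected_risk(scenarios[s]))
best = ranked[0]
```

In a real assessment the impacts would carry currency units and the probabilities would come from the qualitative-to-quantitative elicitation described in steps (ii) and (v).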
QTLomics in Soybean: A Way Forward for Translational Genomics and Breeding
Kumawat, Giriraj; Gupta, Sanjay; Ratnaparkhe, Milind B.; Maranna, Shivakumar; Satpute, Gyanesh K.
2016-01-01
Food legumes play an important role in attaining both food and nutritional security, along with sustainable agricultural production, for the well-being of humans globally. The various traits of economic importance in legume crops are complex and quantitative in nature, governed by quantitative trait loci (QTLs). Mapping of quantitative traits is a tedious and costly process; however, a large number of QTLs have been mapped in soybean for various traits, although their utilization in breeding programmes is poorly reported. For their effective use in breeding programmes it is imperative to narrow down the confidence interval of QTLs, to identify the underlying genes, and most importantly to characterize the alleles of these genes for identifying superior variants. In the field of functional genomics, especially in the identification and characterization of genes responsible for quantitative traits, soybean is far ahead of other legume crops. The availability of genic information about quantitative traits is more significant because it is easier and more effective to identify homologs than shared syntenic regions in other crop species. In soybean, genes underlying QTLs have been identified and functionally characterized for phosphorus efficiency, flowering and maturity, pod dehiscence, hard-seededness, α-tocopherol content, soybean cyst nematode, sudden death syndrome, and salt tolerance. Candidate genes have also been identified for many other quantitative traits for which functional validation is required. Using the sequence information of identified genes from soybean, comparative genomic analysis of homologs in other legume crops could discover novel structural variants and useful alleles for functional marker development. The functional markers may be very useful for molecular breeding in soybean and for harnessing the benefit of translational research from soybean to other leguminous crops.
Thus, soybean crop can act as a model crop for translational genomics and breeding of quantitative traits in legume crops. In this review, we summarize current status of identification and characterization of genes underlying QTLs for various quantitative traits in soybean and their significance in translational genomics and breeding of other legume crops. PMID:28066449
1986-03-31
requirements necessary to optimize BAS/DCS operation in worst case environments. 4) Identify the qualitative and quantitative values of equipment which… [extraction-garbled table of medical equipment (defibrillator; surgical sink unit; resuscitator-inhaler; surgical sterilizer) with per-item quantities, weights, and power figures] …transferred, the driving force for transfer is the difference in dry bulb temperatures. During heat transfer between unsaturated air and a wetted
Subsonic Ultra Green Aircraft Research Phase II: N+4 Advanced Concept Development
NASA Technical Reports Server (NTRS)
Bradley, Marty K.; Droney, Christopher K.
2012-01-01
This final report documents the work of the Boeing Subsonic Ultra Green Aircraft Research (SUGAR) team on Task 1 of the Phase II effort. The team consisted of Boeing Research and Technology, Boeing Commercial Airplanes, General Electric, and Georgia Tech. Using a quantitative workshop process, the following technologies, appropriate to aircraft operating in the N+4 2040 timeframe, were identified: liquefied natural gas (LNG), hydrogen, fuel cell hybrids, battery electric hybrids, low energy nuclear reactions (LENR), boundary layer ingestion propulsion (BLI), unducted fans and advanced propellers, and combinations thereof. Technology development plans were developed.
NASA Astrophysics Data System (ADS)
Dumpuri, Prashanth; Clements, Logan W.; Li, Rui; Waite, Jonathan M.; Stefansic, James D.; Geller, David A.; Miga, Michael I.; Dawant, Benoit M.
2009-02-01
Preoperative planning combined with image guidance has shown promise towards increasing the accuracy of liver resection procedures. The purpose of this study was to validate one such preoperative planning tool in four patients undergoing hepatic resection. Preoperative computed tomography (CT) images acquired before surgery were used to identify tumor margins and to plan the surgical approach for resection of these tumors. Surgery was then performed, with intraoperative digitization data acquired by an FDA-approved image-guided liver surgery system (Pathfinder Therapeutics, Inc., Nashville, TN). Within 5-7 days after surgery, post-operative CT image volumes were acquired. Registration of the data within a common coordinate reference was achieved, and preoperative plans were compared to the postoperative volumes. Semi-quantitative comparisons are presented in this work, and preliminary results indicate that significant liver regeneration/hypertrophy may be present in the postoperative CT images. This could challenge pre/post-operative CT volume change comparisons as a means to evaluate the accuracy of preoperative surgical plans.
Lunar Landing Operational Risk Model
NASA Technical Reports Server (NTRS)
Mattenberger, Chris; Putney, Blake; Rust, Randy; Derkowski, Brian
2010-01-01
Characterizing the risk of spacecraft goes beyond simply modeling equipment reliability. Some portions of the mission require complex interactions between system elements that can lead to failure without an actual hardware fault. Landing risk is currently the least characterized aspect of the Altair lunar lander and appears to result from complex temporal interactions between pilot, sensors, surface characteristics and vehicle capabilities rather than hardware failures. The Lunar Landing Operational Risk Model (LLORM) seeks to provide rapid and flexible quantitative insight into the risks driving the landing event and to gauge sensitivities of the vehicle to changes in system configuration and mission operations. The LLORM takes a Monte Carlo based approach to estimate the operational risk of the Lunar Landing Event and calculates estimates of the risk of Loss of Mission (LOM) - Abort Required and is Successful, Loss of Crew (LOC) - Vehicle Crashes or Cannot Reach Orbit, and Success. The LLORM is meant to be used during the conceptual design phase to inform decision makers transparently of the reliability impacts of design decisions, to identify areas of the design which may require additional robustness, and to aid in the development and flow-down of requirements.
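The Monte Carlo approach described above can be sketched as repeated draws over a simple event tree (hazard present, sensor detection, abort outcome); every probability below is a placeholder for illustration, not an Altair or LLORM figure:

```python
import random

def simulate_landing(rng):
    """One Monte Carlo landing trial (all probabilities hypothetical).
    Returns 'success', 'LOM' (abort required and successful), or 'LOC'."""
    hazard = rng.random() < 0.10          # unsafe terrain at the target site
    sensor_ok = rng.random() < 0.98       # hazard-detection sensor works
    if hazard and sensor_ok:
        # Pilot aborts; abort-to-orbit succeeds most of the time.
        return "LOM" if rng.random() < 0.95 else "LOC"
    if hazard and not sensor_ok:
        return "LOC"                      # undetected hazard: vehicle crashes
    return "success"

rng = random.Random(7)
trials = 100_000
counts = {"success": 0, "LOM": 0, "LOC": 0}
for _ in range(trials):
    counts[simulate_landing(rng)] += 1
p_loc = counts["LOC"] / trials
```

The point of the structure, as in the LLORM, is that loss events emerge from interactions (hazard plus sensor plus abort) rather than from any single hardware failure rate.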
Belrhiti, Zakaria; Booth, Andrew; Marchal, Bruno; Verstraeten, Roosmarijn
2016-04-27
District health managers play a key role in the effectiveness of decentralized health systems in low- and middle-income countries. Inadequate management and leadership skills often hamper their ability to improve quality of care and effectiveness of health service delivery. Nevertheless, significant investments have been made in capacity-building programmes based on site-based training, mentoring, and operational research. This systematic review aims to review the effectiveness of site-based training, mentoring, and operational research (or action research) on the improvement of district health system management and leadership. Our secondary objectives are to assess whether variations in composition or intensity of the intervention influence its effectiveness and to identify enabling and constraining contexts and underlying mechanisms. We will search the following databases: MEDLINE, PsycInfo, Cochrane Library, CRD database (DARE), Cochrane Effective Practice and Organisation of Care (EPOC) group, ISI Web of Science, Health Evidence.org, PDQ-Evidence, ERIC, EMBASE, and TRIP. Complementary search will be performed (hand-searching journals and citation and reference tracking). Studies that meet the following PICO (Population, Intervention, Comparison, Outcome) criteria will be included: P: professionals working at district health management level; I: site-based training with or without mentoring, or operational research; C: normal institutional arrangements; and O: district health management functions. We will include cluster randomized controlled trials, controlled before-and-after studies, interrupted time series analysis, quasi-experimental designs, and cohort and longitudinal studies. Qualitative research will be included to contextualize findings and identify barriers and facilitators. Primary outcomes that will be reported are district health management and leadership functions. 
We will assess risk of bias with the Cochrane Collaboration's tools for randomized controlled trials (RCTs) and non-RCT studies, and Critical Appraisal Skills Programme checklists for qualitative studies. We will assess strength of recommendations with the GRADE tool for quantitative studies and the CERQual approach for qualitative studies. Synthesis of quantitative studies will be performed through meta-analysis when appropriate. Best-fit framework synthesis will be used to synthesize qualitative studies. This protocol paper describes a systematic review assessing the effectiveness of site-based training (with or without mentoring programmes or operational research) on the improvement of district health system management and leadership. PROSPERO CRD42015032351.
Crew fatigue safety performance indicators for fatigue risk management systems.
Gander, Philippa H; Mangie, Jim; Van Den Berg, Margo J; Smith, A Alexander T; Mulrine, Hannah M; Signal, T Leigh
2014-02-01
Implementation of Fatigue Risk Management Systems (FRMS) is gaining momentum; however, agreed safety performance indicators (SPIs) are lacking. This paper proposes an initial set of SPIs based on measures of crewmember sleep, performance, and subjective fatigue and sleepiness, together with methods for interpreting them. Data were included from 133 landing crewmembers on 2 long-range and 3 ultra-long-range trips (4-person crews, 3 airlines, 220 flights). Studies had airline, labor, and regulatory support, and underwent independent ethical review. SPIs evaluated preflight and at top of descent (TOD) were: total sleep in the prior 24 h and time awake at duty start and at TOD (actigraphy); subjective sleepiness (Karolinska Sleepiness Scale) and fatigue (Samn-Perelli scale); and psychomotor vigilance task (PVT) performance. Kruskal-Wallis nonparametric ANOVA with post hoc tests was used to identify significant differences between flights for each SPI. Visual and preliminary quantitative comparisons of SPIs between flights were made using box plots and bar graphs. Statistical analyses identified significant differences between flights across a range of SPIs. In an FRMS, crew fatigue SPIs are envisaged as a decision aid alongside operational SPIs, which need to reflect the relevant causes of fatigue in different operations. We advocate comparing multiple SPIs between flights rather than defining safe/unsafe thresholds on individual SPIs. More comprehensive data sets are needed to identify the operational and biological factors contributing to the differences between flights reported here. Global sharing of an agreed core set of SPIs would greatly facilitate implementation and improvement of FRMS.
Controlling robots in the home: Factors that affect the performance of novice robot operators.
McGinn, Conor; Sena, Aran; Kelly, Kevin
2017-11-01
For robots to successfully integrate into everyday life, it is important that they can be effectively controlled by laypeople. However, the task of manually controlling mobile robots can be challenging due to demanding cognitive and sensorimotor requirements. This research explores the effect that the built environment has on the manual control of domestic service robots. In this study, a virtual reality simulation of a domestic robot control scenario was developed. The performance of fifty novice users was evaluated, and their subjective experiences recorded through questionnaires. Through quantitative and qualitative analysis, it was found that untrained operators frequently perform poorly at navigation-based robot control tasks. The study found that passing through doorways accounted for the largest number of collisions, and was consistently identified as a very difficult operation to perform. These findings suggest that homes and other human-orientated settings present significant challenges to robot control. Copyright © 2017 Elsevier Ltd. All rights reserved.
Bomber Deployments: A New Power Projection Strategy
2016-08-21
civilian cargo airlift. The second quantitative analysis will assess the B-52 direct aviation support UTCs containing support equipment. B-52 UTCs...troops and cargo back and forth to the theater of operations. Operation IRAQI FREEDOM tested airlift capabilities when multiple services placed their...the quantitative analysis shows, to move all of the support equipment for one bomber squadron can be expensive and tie up valuable cargo aircraft
Subsistence strategies in Argentina during the late Pleistocene and early Holocene
NASA Astrophysics Data System (ADS)
Martínez, Gustavo; Gutiérrez, María A.; Messineo, Pablo G.; Kaufmann, Cristian A.; Rafuse, Daniel J.
2016-07-01
This paper highlights regional and temporal variation in the presence and exploitation of faunal resources from different regions of Argentina during the late Pleistocene and early Holocene. Specifically, the faunal analysis considered here includes the zooarchaeological remains from all sites older than 7500 14C years BP. We include quantitative information for each reported species (genus, family, or order) and we use the number of identified specimens (NISP per taxon and the NISPtotal by sites) as the quantitative measure of taxonomic abundance. The taxonomic richness (Ntaxatotal and Ntaxaexploited) and the taxonomic heterogeneity or Shannon-Wiener index are estimated in order to consider dietary generalization or specialization, and ternary diagrams are used to categorize subsistence patterns of particular sites and regions. The archaeological database is composed of 78 sites which are represented by 110 stratigraphic contexts. Our results demonstrate that although some quantitative differences between regions are observed, artiodactyls (camelids and deer) were the most frequently consumed animal resource in Argentina. Early hunter-gatherers did not follow a specialized predation strategy in megamammals. A variety in subsistence systems, operating in parallel with a strong regional emphasis is shown, according to specific environmental conditions and cultural trajectories.
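The heterogeneity measure used above is the Shannon-Wiener index H' = −Σ pᵢ ln pᵢ computed over NISP proportions; a stdlib-Python sketch with an invented assemblage (taxa and counts are illustrative, not from the database):

```python
import math

def shannon_wiener(nisp_by_taxon):
    """Shannon-Wiener heterogeneity H' = -sum(p_i * ln p_i), where p_i is
    each taxon's share of the total NISP."""
    total = sum(nisp_by_taxon.values())
    return -sum((n / total) * math.log(n / total)
                for n in nisp_by_taxon.values() if n > 0)

# Hypothetical NISP counts for one site assemblage:
site = {"camelid": 120, "deer": 60, "armadillo": 15, "rhea": 5}
ntaxa = len(site)
h_index = shannon_wiener(site)
h_max = math.log(ntaxa)   # upper bound, reached by a perfectly even assemblage
```

A low H' relative to ln(Ntaxa) indicates an assemblage dominated by a few taxa (dietary specialization); values near the bound indicate a more generalized diet.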
Giera, Brian; Bukosky, Scott; Lee, Elaine; ...
2018-01-23
Here, quantitative color analysis is performed on videos of high-contrast, low-power reversible electrophoretic deposition (EPD)-based displays operated under different applied voltages. The analysis is implemented in open-source software, relies on a color differentiation metric, ΔE*00, derived from digital video, and provides an intuitive relationship between the operating conditions of the devices and their performance. Time-dependent ΔE*00 color analysis reveals color relaxation behavior, recoverability for different voltage sequences, and operating conditions that can lead to optimal performance.
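As an illustration of tracking color difference over frames: the CIEDE2000 (ΔE*00) formula is lengthy, so this sketch substitutes the simpler CIE76 ΔE (Euclidean distance in CIELAB) as a stand-in, with invented per-frame L*a*b* values showing color development followed by partial relaxation:

```python
def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB. A simpler
    stand-in here for the CIEDE2000 (dE*00) metric used in the paper."""
    return sum((x - y) ** 2 for x, y in zip(lab1, lab2)) ** 0.5

# Hypothetical per-frame mean L*a*b* values for a pixel region of the EPD
# display, relative to its initial (cleared) state:
reference = (95.0, 0.0, 0.0)
frames = [(95.0, 0.0, 0.0), (80.0, 5.0, -10.0), (60.0, 10.0, -25.0),
          (62.0, 9.0, -23.0)]   # color develops, then partially relaxes
trace = [delta_e76(f, reference) for f in frames]
peak = max(trace)
relaxed = trace[-1] < peak   # color relaxation after the voltage step
```

Plotting `trace` against time for each applied voltage gives exactly the kind of time-dependent ΔE curve the paper uses to compare operating conditions.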
78 FR 57903 - Notice of Intent To Seek Approval To Renew an Information Collection
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-20
.... The indicators are both quantitative and descriptive. Quantitative information from the most recently... center activities with respect to industrial collaboration. ◦ Conducting a survey of all center... quantitative indicators determined by NSF to analyze the management and operation of the center. ◦...
Normalised quantitative polymerase chain reaction for diagnosis of tuberculosis-associated uveitis.
Barik, Manas Ranjan; Rath, Soveeta; Modi, Rohit; Rana, Rajkishori; Reddy, Mamatha M; Basu, Soumyava
2018-05-01
Polymerase chain reaction (PCR)-based diagnosis of tuberculosis-associated uveitis (TBU) in TB-endemic countries is challenging due to likelihood of latent mycobacterial infection in both immune and non-immune cells. In this study, we investigated normalised quantitative PCR (nqPCR) in ocular fluids (aqueous/vitreous) for diagnosis of TBU in a TB-endemic population. Mycobacterial copy numbers (mpb64 gene) were normalised to host genome copy numbers (RNAse P RNA component H1 [RPPH1] gene) in TBU (n = 16) and control (n = 13) samples (discovery cohort). The mpb64:RPPH1 ratios (normalised value) from each TBU and control sample were tested against the current reference standard i.e. clinically-diagnosed TBU, to generate Receiver Operating Characteristic (ROC) curves. The optimum cut-off value of mpb64:RPPH1 ratio (0.011) for diagnosing TBU was identified from the highest Youden index. This cut-off value was then tested in a different cohort of TBU and controls (validation cohort, 20 cases and 18 controls), where it yielded specificity, sensitivity and diagnostic accuracy of 94.4%, 85.0%, and 89.4% respectively. The above values for conventional quantitative PCR (≥1 copy of mpb64 per reaction) were 61.1%, 90.0%, and 74.3% respectively. Normalisation markedly improved the specificity and diagnostic accuracy of quantitative PCR for diagnosis of TBU. Copyright © 2018 Elsevier Ltd. All rights reserved.
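The cutoff selection described above can be sketched as a scan over candidate thresholds maximizing Youden's J = sensitivity + specificity − 1; the mpb64:RPPH1 ratios below are invented, arranged so that the optimum lands at the paper's 0.011 cutoff:

```python
def youden_cutoff(cases, controls):
    """Pick the ratio cutoff maximizing Youden's J = sens + spec - 1,
    scanning every observed value as a candidate threshold
    (value >= cutoff counts as test-positive)."""
    best_j, best_cut = -1.0, None
    for cut in sorted(set(cases + controls)):
        sens = sum(1 for v in cases if v >= cut) / len(cases)
        spec = sum(1 for v in controls if v < cut) / len(controls)
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Hypothetical normalised mpb64:RPPH1 ratios (TBU cases tend to be higher):
tbu      = [0.05, 0.02, 0.30, 0.011, 0.08, 0.15, 0.04, 0.012]
controls = [0.001, 0.004, 0.0, 0.008, 0.002, 0.010, 0.003, 0.005]
cutoff, j = youden_cutoff(tbu, controls)
```

In practice the cutoff is derived on a discovery cohort, as here, and then held fixed while sensitivity, specificity, and accuracy are measured on an independent validation cohort.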
Azuma, M; Hirai, T; Yamada, K; Yamashita, S; Ando, Y; Tateishi, M; Iryo, Y; Yoneda, T; Kitajima, M; Wang, Y; Yamashita, Y
2016-05-01
Quantitative susceptibility mapping is useful for assessing iron deposition in the substantia nigra of patients with Parkinson disease. We aimed to determine whether quantitative susceptibility mapping is useful for assessing the lateral asymmetry and spatial difference in iron deposits in the substantia nigra of patients with Parkinson disease. Our study population comprised 24 patients with Parkinson disease and 24 age- and sex-matched healthy controls. They underwent 3T MR imaging using a 3D multiecho gradient-echo sequence. On reconstructed quantitative susceptibility mapping, we measured the susceptibility values in the anterior, middle, and posterior parts of the substantia nigra, the whole substantia nigra, and other deep gray matter structures in both hemibrains. To identify the more and less affected hemibrains in patients with Parkinson disease, we assessed the severity of movement symptoms for each hemibrain using the Unified Parkinson's Disease Rating Scale. In the posterior substantia nigra of patients with Parkinson disease, the mean susceptibility value was significantly higher in the more affected than in the less affected hemibrain (P < .05). This value was significantly higher in both the more and less affected hemibrains of patients with Parkinson disease than in controls (P < .05). Asymmetry of the mean susceptibility values was significantly greater for patients than controls (P < .05). Receiver operating characteristic analysis showed that quantitative susceptibility mapping of the posterior substantia nigra in the more affected hemibrain provided the highest power for discriminating patients with Parkinson disease from controls. Quantitative susceptibility mapping is useful for assessing the lateral asymmetry and spatial difference of iron deposition in the substantia nigra of patients with Parkinson disease. © 2016 by American Journal of Neuroradiology.
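One simple way to express lateral asymmetry of paired susceptibility values is a normalized difference index; the sketch below uses invented susceptibility numbers, and the index form is an assumption for illustration, not necessarily the paper's definition:

```python
def asymmetry_index(more_affected, less_affected):
    """Normalized lateral asymmetry for paired susceptibility values:
    (more - less) / (more + less); larger values mean greater asymmetry."""
    return (more_affected - less_affected) / (more_affected + less_affected)

# Hypothetical mean susceptibility (ppb) in the posterior substantia nigra,
# as (more affected hemibrain, less affected hemibrain) pairs:
pd_patients = [(155.0, 120.0), (170.0, 140.0), (160.0, 125.0)]
controls    = [(130.0, 126.0), (128.0, 125.0), (132.0, 127.0)]
ai_pd = [asymmetry_index(m, l) for m, l in pd_patients]
ai_ctrl = [asymmetry_index(m, l) for m, l in controls]
mean_ai_pd = sum(ai_pd) / len(ai_pd)
mean_ai_ctrl = sum(ai_ctrl) / len(ai_ctrl)
```

A group comparison of such indices is the quantitative counterpart of the abstract's finding that asymmetry was significantly greater in patients than controls.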
Biomarkers identified by urinary metabonomics for noninvasive diagnosis of nutritional rickets.
Wang, Maoqing; Yang, Xue; Ren, Lihong; Li, Songtao; He, Xuan; Wu, Xiaoyan; Liu, Tingting; Lin, Liqun; Li, Ying; Sun, Changhao
2014-09-05
Nutritional rickets is a worldwide public health problem; however, current diagnostic methods have shortcomings that limit its accurate diagnosis. To identify urinary biomarkers associated with nutritional rickets and establish a noninvasive diagnostic method, urinary metabonomics analysis by ultra-performance liquid chromatography/quadrupole time-of-flight tandem mass spectrometry and multivariate statistical analysis were employed to investigate the metabolic alterations associated with nutritional rickets in 200 children with or without the disease. The pathophysiological changes and pathogenesis of nutritional rickets were illustrated by the identified biomarkers. By urinary metabolic profiling, 31 biomarkers of nutritional rickets were identified, and five candidate biomarkers for clinical diagnosis were screened and identified by quantitative analysis and receiver operating characteristic curve analysis. Urinary levels of the five candidate biomarkers were measured using mass spectrometry or commercial kits. In the validation step, the combination of phosphate and sebacic acid provided a noninvasive and accurate diagnosis with high sensitivity (94.0%) and specificity (71.2%). Furthermore, on the basis of the pathway analysis of the biomarkers, our urinary metabonomics analysis gives new insight into the pathogenesis and pathophysiology of nutritional rickets.
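The reported sensitivity and specificity follow the standard contingency-table definitions used in receiver operating characteristic analysis. As a minimal sketch only, with made-up biomarker values rather than the study's data (the function name and threshold are illustrative assumptions):

```python
def sens_spec(values, labels, threshold):
    """Sensitivity and specificity for a biomarker where values above
    the threshold are called positive (disease present, label 1)."""
    tp = sum(1 for v, y in zip(values, labels) if y == 1 and v > threshold)
    fn = sum(1 for v, y in zip(values, labels) if y == 1 and v <= threshold)
    tn = sum(1 for v, y in zip(values, labels) if y == 0 and v <= threshold)
    fp = sum(1 for v, y in zip(values, labels) if y == 0 and v > threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative data: biomarker level and case (1) / control (0) status.
vals = [0.9, 0.8, 0.7, 0.45, 0.4, 0.3, 0.2, 0.1]
labs = [1,   1,   1,   0,    1,   0,   0,   0]
se, sp = sens_spec(vals, labs, 0.5)  # se = 0.75, sp = 1.0
```

Sweeping the threshold over all observed values and plotting sensitivity against (1 - specificity) yields the ROC curve from which such cut-offs are chosen.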
Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E
2014-01-01
This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). 
Fully quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
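The optimum QP threshold described above (endocardial flow falling below 50% of mean epicardial flow) reduces to a simple ratio test. A hedged sketch with hypothetical flow values, not the authors' implementation:

```python
def qp_abnormal(endo_flow, epi_flows, ratio=0.5):
    """Flag a segment as ischemic when endocardial flow falls below a
    fixed fraction of the mean epicardial flow (the reported optimum
    QP threshold used a fraction of 0.5)."""
    mean_epi = sum(epi_flows) / len(epi_flows)
    return endo_flow < ratio * mean_epi

# Hypothetical stress flows: endocardial 0.8 vs. epicardial mean 2.0.
flagged = qp_abnormal(0.8, [2.0, 2.2, 1.8])  # 0.8 < 1.0, so flagged
```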
DOT National Transportation Integrated Search
1981-10-01
The objectives of the Systems Operation Studies (SOS) for automated guideway transit (AGT) systems are to develop models for the analysis of system operations, to evaluate performance and cost, and to establish guidelines for the design and operation...
Approximation by the iterates of Bernstein operator
NASA Astrophysics Data System (ADS)
Zapryanova, Teodora; Tachev, Gancho
2012-11-01
We study the degree of pointwise approximation of the iterated Bernstein operators to their limiting operator. We obtain quantitative estimates related to a conjecture of Gonska and Raşa from 2006.
Pulmonary MRA: differentiation of pulmonary embolism from truncation artefact.
Bannas, Peter; Schiebler, Mark L; Motosugi, Utaroh; François, Christopher J; Reeder, Scott B; Nagle, Scott K
2014-08-01
Truncation artefact (Gibbs ringing) causes central signal drop within vessels in pulmonary magnetic resonance angiography (MRA) that can be mistaken for emboli, reducing diagnostic accuracy for pulmonary embolism (PE). We propose a quantitative approach to differentiate truncation artefact from PE. Twenty-eight patients who underwent pulmonary computed tomography angiography (CTA) for suspected PE were recruited for pulmonary MRA. Signal intensity drops within pulmonary arteries that persisted on both arterial-phase and delayed-phase MRA were identified. The percent signal loss between the vessel lumen and central drop was measured. CTA served as the reference standard for presence of pulmonary emboli. A total of 65 signal intensity drops were identified on MRA. Of these, 48 (74%) were artefacts and 17 (26%) were PE, as confirmed by CTA. Truncation artefacts had a significantly lower median signal drop than PE on both arterial-phase (26% [range 12-58%] vs. 85% [range 53-91%]) and delayed-phase MRA (26% [range 11-55%] vs. 77% [range 47-89%]), p < 0.0001 for both. Receiver operating characteristic (ROC) analyses revealed a threshold value of 51% (arterial phase) and 47% signal drop (delayed phase) to differentiate between truncation artefact and PE with 100% sensitivity and greater than 90% specificity. Quantitative signal drop is an objective tool to help differentiate truncation artefact and pulmonary embolism in pulmonary MRA. • Inexperienced readers may mistake truncation artefacts for emboli on pulmonary MRA • Pulmonary emboli have non-uniform signal drop • 51% (arterial phase) and 47% (delayed phase) cut-off differentiates truncation artefact from PE • Quantitative signal drop measurement enables more accurate pulmonary embolism diagnosis with MRA.
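The cut-offs above amount to a percent-signal-drop measurement between the vessel lumen and the central dark region. A minimal sketch with invented signal intensities (the function names and numbers are assumptions, not the authors' code):

```python
def percent_signal_drop(lumen, central):
    """Percent signal loss between vessel lumen and the central dark spot."""
    return 100.0 * (lumen - central) / lumen

def classify(drop, phase="arterial"):
    """Apply the reported cut-offs: 51% (arterial phase), 47% (delayed
    phase). Drops above the cut-off suggest embolism; below, artefact."""
    cutoff = 51.0 if phase == "arterial" else 47.0
    return "PE" if drop > cutoff else "truncation artefact"

# Hypothetical arterial-phase intensities: lumen 1200, central drop 240.
drop = percent_signal_drop(1200.0, 240.0)  # 80.0 % signal loss
verdict = classify(drop)                   # "PE"
```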
Susarla, Srinivas M; Dodson, Thomas B; Lopez, Joseph; Swanson, Edward W; Calotta, Nicholas; Peacock, Zachary S
2015-08-01
Academic promotion is linked to research productivity. The purpose of this study was to assess the correlation between quantitative measures of academic productivity and academic rank among academic oral and maxillofacial surgeons. This was a cross-sectional study of full-time academic oral and maxillofacial surgeons in the United States. The predictor variables were categorized as demographic (gender, medical degree, research doctorate, other advanced degree) and quantitative measures of academic productivity (total number of publications, total number of citations, maximum number of citations for a single article, i10-index [number of publications with ≥ 10 citations], and h-index [number of publications h with ≥ h citations each]). The outcome variable was current academic rank (instructor, assistant professor, associate professor, professor, or endowed professor). Descriptive, bivariate, and multiple regression statistics were computed to evaluate associations between the predictors and academic rank. Receiver operating characteristic curves were computed to identify thresholds for academic promotion. The sample consisted of 324 academic oral and maxillofacial surgeons, of whom 11.7% were female, 40% had medical degrees, and 8% had research doctorates. The h-index was most strongly correlated with academic rank (ρ = 0.62, p < 0.001). H-indexes of ≥ 4, ≥ 8, and ≥ 13 were identified as thresholds for promotion to associate professor, professor, and endowed professor, respectively (p < 0.001). This study found that the h-index was strongly correlated with academic rank among oral and maxillofacial surgery faculty members and thus suggests that promotions committees should consider using the h-index as an additional method to assess research activity.
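The h-index used above has a direct computational definition. A short sketch with illustrative citation counts (not data from the study):

```python
def h_index(citations):
    """Largest h such that at least h publications have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Six papers with 25, 8, 5, 4, 3 and 1 citations give h = 4, which is
# the associate-professor threshold reported above.
h_index([25, 8, 5, 4, 3, 1])  # → 4
```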
The system of technical diagnostics of the industrial safety information network
NASA Astrophysics Data System (ADS)
Repp, P. V.
2017-01-01
This research addresses the safety of industrial information networks. Basic sub-networks ensuring reliable operation of the elements of the industrial Automatic Process Control System were identified. The core tasks of technical diagnostics of industrial information safety were presented. The structure of the technical diagnostics system for information safety was proposed. It includes two parts: a generator of cyber-attacks and a virtual model of the enterprise information network. The virtual model was obtained by scanning a real enterprise network. A new classification of cyber-attacks was proposed. This classification enables one to design an efficient generator of cyber-attack sets for testing the virtual model of the industrial information network. Monte Carlo methods (using Sobol LPτ sequences) and Markov chains were considered as the basis of the cyber-attack generation algorithm. The proposed system also includes a diagnostic analyzer that performs expert functions. The stability factor (Kstab) was selected as an integrative quantitative indicator of network reliability. This factor is determined by the weights of the sets of cyber-attacks that expose vulnerabilities of the network; each weight depends on the frequency and complexity of the cyber-attacks, the degree of damage, and the complexity of remediation. The proposed Kstab is an effective integral quantitative measure of information network reliability.
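The abstract does not give the formula for Kstab. Purely as a hypothetical illustration of a weighted indicator of this kind, under assumed field names and an assumed weighting scheme:

```python
def stability_factor(attack_sets):
    """Hypothetical stability factor: each attack set that exposed a
    vulnerability contributes a weight growing with its frequency,
    damage and remediation cost, discounted by attack complexity; the
    total is mapped onto (0, 1], where 1.0 means no vulnerabilities
    were found. The weighting scheme is an assumption, not the paper's."""
    total = sum(a["freq"] * a["damage"] * a["remediation"] / a["complexity"]
                for a in attack_sets if a["exposed"])
    return 1.0 / (1.0 + total)

# A network where the generated attack set exposed nothing scores 1.0.
secure = stability_factor([{"exposed": False, "freq": 5, "damage": 3,
                            "remediation": 2, "complexity": 4}])
```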
Informational analysis involving application of complex information system
NASA Astrophysics Data System (ADS)
Ciupak, Clébia; Vanti, Adolfo Alberto; Balloni, Antonio José; Espin, Rafael
The aim of the present research is to perform an informational analysis for internal audit involving the application of a complex information system based on fuzzy logic. The analysis has been applied in internal audit involving the integration of the accounting field into the information systems field. Technological advancements can improve the work performed by internal audit. We therefore aim to find, in complex information systems, priorities for the internal audit work of a highly important private institution of higher education. The applied method is quali-quantitative: from the definition of strategic linguistic variables, it was possible to transform them into quantitative values through matrix intersection. By means of a case study, in which data were collected via an interview with the Administrative Pro-Rector, who takes part in the elaboration of the strategic planning of the institution, it was possible to identify the points that must be prioritized in the internal audit work. We emphasize that the priorities were identified when processed in a system (of academic use). From the study we conclude that, starting from these information systems, audit can identify priorities in its work program. Along with the plans and strategic objectives of the enterprise, the internal auditor can define operational procedures that work toward the attainment of the objectives of the organization.
NASA Technical Reports Server (NTRS)
1972-01-01
The results of the space station data flow study are reported. Conceived is a low cost interactive data dissemination system for space station experiment data that includes facility and personnel requirements and locations, phasing requirements and implementation costs. Each of the experiments identified by the operating schedule is analyzed and the support characteristics identified in order to determine data characteristics. Qualitative and quantitative comparison of candidate concepts resulted in a proposed data system configuration baseline concept that includes a data center which combines the responsibility of reprocessing, archiving, and user services according to the various agencies and their responsibility assignments. The primary source of data is the space station complex which provides through the Tracking Data Relay Satellite System (TDRS) and by space shuttle delivery data from experiments in free flying modules and orbiting shuttles as well as from the experiments in the modular space station itself.
Boiler Tube Corrosion Characterization with a Scanning Thermal Line
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Jacobstein, Ronald; Reilly, Thomas
2001-01-01
Wall thinning due to corrosion in utility boiler water wall tubing is a significant operational concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. Unfortunately, ultrasonic inspection is very manpower intense and slow. Therefore, thickness measurements are typically taken over a relatively small percentage of the total boiler wall and statistical analysis is used to determine the overall condition of the boiler tubing. Other inspection techniques, such as electromagnetic acoustic transducer (EMAT), have recently been evaluated, however they provide only a qualitative evaluation - identifying areas or spots where corrosion has significantly reduced the wall thickness. NASA Langley Research Center, in cooperation with ThermTech Services, has developed a thermal NDE technique designed to quantitatively measure the wall thickness and thus determine the amount of material thinning present in steel boiler tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed and accuracy for large structures such as boiler water walls. A theoretical basis for the technique will be presented to establish the quantitative nature of the technique. Further, a dynamic calibration system will be presented for the technique that allows the extraction of thickness information from the temperature data. 
Additionally, the results of the application of this technology to actual water wall tubing samples and in-situ inspections will be presented.
Integrating socio-economic and biophysical data to enhance watershed management and planning
NASA Astrophysics Data System (ADS)
Pirani, Farshad Jalili; Mousavi, Seyed Alireza
2016-09-01
Sustainability has always been considered as one of the main aspects of watershed management plans. In many developing countries, watershed management practices and planning are usually performed by integrating biophysical layers, and other existing layers which cannot be identified as geographic layers are ignored. We introduce an approach to consider some socioeconomic parameters which are important for watershed management decisions. Ganj basin in Chaharmahal-Bakhtiari Province was selected as the case study area, which includes three traditional sanctums: Ganj, Shiremard and Gerdabe Olya. Socioeconomic data including net agricultural income, net ranching income, population and household number, literacy rate, unemployment rate, population growth rate and active population were mapped within traditional sanctums and then were integrated into other biophysical layers. After overlaying and processing these data to determine management units, different quantitative and qualitative approaches were adopted to achieve a practical framework for watershed management planning and relevant plans for homogeneous units were afterwards proposed. Comparing the results with current plans, the area of allocated lands to different proposed operations considering both qualitative and quantitative approaches were the same in many cases and there was a meaningful difference with current plans; e.g., 3820 ha of lands are currently managed under an enclosure plan, while qualitative and quantitative approaches in this study suggest 1388 and 1428 ha to be allocated to this operation type, respectively. Findings show that despite the ambiguities and complexities, different techniques could be adopted to incorporate socioeconomic conditions in watershed management plans. 
This introductory approach will help to enhance watershed management decisions with more attention to societal background and economic conditions, which will presumably motivate local communities to participate in watershed management plans.
Cheng, Yuan; Bian, Wuying; Pang, Xin; Yu, Jiahong; Ahammed, Golam J; Zhou, Guozhi; Wang, Rongqing; Ruan, Meiying; Li, Zhimiao; Ye, Qingjing; Yao, Zhuping; Yang, Yuejian; Wan, Hongjian
2017-01-01
Gene expression analysis in tomato fruit has drawn increasing attention. Quantitative real-time PCR (qPCR) is a routine technique for gene expression analysis. In qPCR work, the reliability of results largely depends on the choice of appropriate reference genes (RGs). Although tomato is a model for fruit biology, few RGs for qPCR analysis in tomato fruit have been developed. In this study, we initially identified the 38 most stably expressed genes based on a tomato transcriptome data set, and their expression stabilities were further determined in a set of tomato fruit samples from four developmental stages (immature, mature green, breaker, and mature red) using qPCR analysis. Two statistical algorithms, geNorm and NormFinder, concordantly determined the superiority of these identified putative RGs. Notably, SlFRG05 (Solyc01g104170), SlFRG12 (Solyc04g009770), SlFRG16 (Solyc10g081190), SlFRG27 (Solyc06g007510), and SlFRG37 (Solyc11g005330) proved to be suitable RGs for studies of tomato fruit development. Further analysis using geNorm indicates that the combined use of SlFRG03 (Solyc02g063070) and SlFRG27 would provide more reliable normalization results in qPCR experiments. The RGs identified in this study will be beneficial for future qPCR analyses of tomato fruit development, as well as for the potential identification of optimal normalization controls in other plant species.
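geNorm's gene-stability measure M, used above to rank candidate RGs, is the average standard deviation of the pairwise log2 expression ratios of a gene against every other candidate; lower M means more stable. A compact sketch with toy expression values (not the study's data):

```python
import math
from statistics import stdev

def genorm_m(expr):
    """expr maps gene -> linear-scale expression values across samples.
    M(gene) = mean, over all other genes, of the sample st.dev. of the
    pairwise log2 expression ratios (the geNorm stability measure)."""
    m = {}
    for g, gv in expr.items():
        sds = [stdev(math.log2(a / b) for a, b in zip(gv, hv))
               for h, hv in expr.items() if h != g]
        m[g] = sum(sds) / len(sds)
    return m

# Toy data: A and B co-vary perfectly; C fluctuates independently.
m = genorm_m({"A": [1, 2, 4], "B": [2, 4, 8], "C": [1, 1, 8]})
```

In geNorm proper, the least stable gene (highest M) is removed and M is recomputed iteratively until the best pair remains.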
Mumtaz, Zubia; Salway, Sarah; Nyagero, Josephat; Osur, Joachim; Chirwa, Ellen; Kachale, Fannie; Saunders, Duncan
2016-01-01
The Government of Malawi is seeking evidence to improve implementation of its flagship quality of care improvement initiative-the Standards Based Management-Recognition for Reproductive Health (SBM-R(RH)). This implementation study will assess the quality of maternal healthcare in facilities where the SBM-R(RH) initiative has been employed, identify factors that support or undermine effectiveness of the initiative and develop strategies to further enhance its operation. Data will be collected in 4 interlinked modules using quantitative and qualitative research methods. Module 1 will develop the programme theory underlying the SBM-R(RH) initiative, using document review and in-depth interviews with policymakers and programme managers. Module 2 will quantitatively assess the quality and equity of maternal healthcare provided in facilities where the SBM-R(RH) initiative has been implemented, using the Malawi Integrated Performance Standards for Reproductive Health. Module 3 will conduct an organisational ethnography to explore the structures and processes through which SBM-R(RH) is currently operationalised. Barriers and facilitators will be identified. Module 4 will involve coordinated co-production of knowledge by researchers, policymakers and the public, to identify and test strategies to improve implementation of the initiative. The research outcomes will provide empirical evidence of strategies that will enhance the facilitators and address the barriers to effective implementation of the initiative. It will also contribute to the theoretical advances in the emerging science of implementation research.
Standard Operational Protocols in professional nursing practice: use, weaknesses and potentialities.
Sales, Camila Balsero; Bernardes, Andrea; Gabriel, Carmen Silvia; Brito, Maria de Fátima Paiva; Moura, André Almeida de; Zanetti, Ariane Cristina Barboza
2018-01-01
The aim was to evaluate the use of Standard Operational Protocols (SOPs) in the professional practice of the nursing team, based on the theoretical framework of Donabedian, and to identify the weaknesses and potentialities arising from their implementation. This was evaluative research with a quantitative approach, performed with nursing professionals working in the health units of a city in São Paulo and composed of two stages: document analysis and subsequent application of a questionnaire to the nursing professionals. A total of 247 nursing professionals participated and reported changes in the way interventions were performed. The main weaknesses were the small number of professionals, inadequate physical structure and lack of materials. Among the potentialities were the standardization of materials and the concern of managers and professionals with patient safety. Reassessment of the SOPs is necessary, as well as the adoption of a strategy of permanent professional education aimed at improving the quality of care provided.
Assessing the status of airline safety culture and its relationship to key employee attitudes
NASA Astrophysics Data System (ADS)
Owen, Edward L.
The need to identify the factors that influence the overall safety environment and compliance with safety procedures within airline operations is substantial. This study examines the relationships between job satisfaction, the overall perception of the safety culture, and compliance with safety rules and regulations of airline employees working in flight operations. A survey questionnaire administered via the internet gathered responses which were converted to numerical values for quantitative analysis. The results were grouped to provide indications of overall average levels in each of the three categories, satisfaction, perceptions, and compliance. Correlations between data in the three sets were tested for statistical significance using two-sample t-tests assuming equal variances. Strong statistical significance was found between job satisfaction and compliance with safety rules and between perceptions of the safety environment and safety compliance. The relationship between job satisfaction and safety perceptions did not show strong statistical significance.
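The two-sample t-test assuming equal variances, mentioned above, pools the two sample variances before forming the t statistic. A minimal sketch with toy numbers (not the survey data):

```python
import math
from statistics import mean, variance

def pooled_t(x, y):
    """Two-sample t statistic with a pooled variance estimate
    (equal-variances assumption); degrees of freedom = len(x) + len(y) - 2."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    return (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# Toy survey scores for two groups; a positive t means the first group
# scored higher on average.
t = pooled_t([4.2, 4.5, 3.9, 4.8], [3.1, 3.6, 3.3, 3.0])
```

The statistic is then compared against the t distribution with the stated degrees of freedom to obtain a p-value.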
ERIC Educational Resources Information Center
Yousef, Darwish Abdulrahamn
2017-01-01
Purpose: This paper aims to investigate the impacts of teaching style, English language and communication and assessment methods on the academic performance of undergraduate business students in introductory quantitative courses such as Statistics for Business 1 and 2, Quantitative Methods for Business, Operations and Production Management and…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... quantitative approaches to determine the levels of taking that would result in a negligible impact to affected species or stocks of marine mammals. The quantitative approach is more appropriate for serious injury and... required a more quantitative approach for assessing what level of removals from a population stock of...
Preliminary Investigation of Impact of Technological Impairment on Trajectory-Based Operations
NASA Technical Reports Server (NTRS)
Sankararaman, Shankar; Roychoudhury, Indranil; Zhang, Xiaoge; Goebel, Kai
2017-01-01
The Next Generation Air Transportation System (NextGen) incorporates collaborative air traffic management and Trajectory-Based Operations (TBO) in order to significantly increase the capacity, efficiency, and predictability of operations in the National Airspace System (NAS), without decreasing safety. This is enabled by airspace users and service providers sharing knowledge about operations that allows prediction of the complete 4D flight trajectory with as little uncertainty as possible. Additionally, new software and hardware technology is critical to reaching NextGen goals, especially with regard to TBO. What if the technologies that are critical for TBO were to be impaired or fail completely? Should there be a malfunction of a piece of the technology, it must be ensured that the whole system does not break down completely or suffer severe impairment. Instead, operations need to be maintained proportionally to the problem and safety needs to be ensured (graceful degradation). This paper proposes a systematic framework to investigate the vulnerability of TBO to technology disruption, and determine the impact of technological impairment on TBO. Two representative technologies are chosen for detailed investigation and the impact of their impairment on the degradation of TBO is illustrated using a weather-related scenario. There are several possible directions of future work. We believe it is desirable to develop methods to quantitatively assess the impact of technological disruption on TBO and to have the simulation tools to validate the impact. The availability of prognostics and health management methods could be leveraged to predict technological failure/disruption, thus predicting how TBO will be affected, and possibly proactively mitigating the impact. It is important to develop large-scale scenarios where the effect of technological impairment is prominent, and to identify methods to quantitatively assess the extent of TBO degradation.
An important goal of such an investigation is the development of failure-resistant, resilient trajectory-based operations. Resilience [14, 15] is the property of a system to "bounce back" and resume at least a significant portion of its functionalities after degradation due to technological impairment(s). A system's resilience includes properties such as "buffering capacity" (quantifying the disruptions the system can absorb or adapt to without a fundamental breakdown in performance or in the system's structure), "flexibility" (the ability to restructure itself in response to external changes or pressures), "margin" (how closely the system is currently operating relative to one or another kind of performance boundary), and "tolerance" (whether the system gracefully degrades as stress/pressure increases, or collapses quickly when pressure exceeds adaptive capacity). Future work needs to focus on quantifying and improving the resilience of TBO, and identifying resilient design solutions for aviation.
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...
33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...
33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...
33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
33 CFR 19.15 - Permits for commercial vessels handling explosives at military installations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... no quantitative restrictions, based on considerations of isolation and remoteness, shall be required... from the Coast Guard for such operations with respect to quantitative or other restrictions imposed by...
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator...
Tahar, Alexandre; Tiedeken, Erin Jo; Clifford, Eoghan; Cummins, Enda; Rowan, Neil
2017-12-15
Contamination of receiving waters with pharmaceutical compounds is of pressing concern. This is the first study to report on the development of a semi-quantitative risk assessment (RA) model for evaluating the environmental threat posed by three EU watch list pharmaceutical compounds, namely diclofenac, 17-beta-estradiol and 17-alpha-ethinylestradiol, to aquatic ecosystems, using Irish data as a case study. The RA model adopts the Irish Environmental Protection Agency Source-Pathway-Receptor concept to define the parameters used to calculate a low, medium or high risk score for each wastewater treatment plant (WWTP) agglomeration, including catchment, treatment, operational and management factors. The model may potentially be used on a national scale to (i) identify WWTPs that pose a particular risk of releasing disproportionately high levels of these pharmaceutical compounds, and (ii) help identify priority locations for introducing or upgrading control measures (e.g. tertiary treatment, source reduction). To assess risks for these substances of emerging concern, the model was applied to 16 urban WWTPs located in different regions of Ireland, which were scored for the three compounds and ranked as low, medium or high risk. As a validation proxy, the case study used the limited monitoring data recorded in the receiving waters of some of these plants. It is envisaged that this semi-quantitative RA approach may help other EU countries investigate and screen for potential risks where limited measured or predicted environmental pollutant concentrations and/or hydrological data are available. The model is semi-quantitative in that other factors, such as the influence of climate change and drug usage or prescription data, will need to be considered in future for estimating and predicting risks. Copyright © 2017 Elsevier B.V. All rights reserved.
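The abstract above describes a weighted, banded Source-Pathway-Receptor score. As a minimal sketch of how such a semi-quantitative scheme can work, the factor names, weights, and band thresholds below are illustrative assumptions, not the authors' calibrated model:

```python
# Hypothetical Source-Pathway-Receptor style scoring sketch.
# Factor names, weights and cut-offs are made up for illustration.

def risk_score(factors, weights):
    """Weighted sum of 1-5 factor ratings for one WWTP agglomeration."""
    return sum(weights[k] * factors[k] for k in weights)

def risk_band(score, low=25, high=50):
    """Bin a score into low/medium/high risk (illustrative thresholds)."""
    if score < low:
        return "low"
    if score < high:
        return "medium"
    return "high"

# Example WWTP rated on four illustrative factors (1 = benign, 5 = severe).
factors = {"catchment_load": 4, "treatment_level": 3,
           "operational": 2, "receptor_sensitivity": 5}
weights = {"catchment_load": 3.0, "treatment_level": 4.0,
           "operational": 2.0, "receptor_sensitivity": 3.0}

score = risk_score(factors, weights)  # 4*3 + 3*4 + 2*2 + 5*3 = 43.0
band = risk_band(score)               # 43.0 falls in the "medium" band
```

Ranking all 16 plants then reduces to sorting the per-plant scores, with the band labels used for screening.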
Hydrothermal Liquefaction Treatment Preliminary Hazard Analysis Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Peter P.; Wagner, Katie A.
A preliminary hazard assessment was completed during February 2015 to evaluate the conceptual design of the modular hydrothermal liquefaction treatment system. The hazard assessment was performed in 2 stages. An initial assessment utilizing Hazard Identification and Preliminary Hazards Analysis (PHA) techniques identified areas with significant or unique hazards (process safety-related hazards) that fall outside of the normal operating envelope of PNNL and warranted additional analysis. The subsequent assessment was based on a qualitative What-If analysis. This analysis was augmented, as necessary, by additional quantitative analysis for scenarios involving a release of hazardous material or energy with the potential for affecting the public.
A Backscatter-Lidar Forward-Operator
NASA Astrophysics Data System (ADS)
Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland
2015-04-01
We have developed a forward-operator capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations on the basis of the same measurement parameter: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component for assimilating backscatter-lidar measurements. As many weather services already maintain networks of backscatter lidars, such data are already acquired operationally. To estimate and quantify errors due to missing or uncertain aerosol information, we started sensitivity studies on several scattering parameters, such as the aerosol size and the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e. applying the backscatter-lidar forward-operator to model output.
Walsh, Declan
2004-01-01
In this study, a hematology/oncology computerized discharge database was qualitatively and quantitatively reviewed using an empirical methodology. The goal was to identify potential patients for admission to a planned acute-care, palliative medicine inpatient unit. Patients were identified by International Classification of Diseases (ICD-9) codes. A large heterogeneous population, comprising up to 40 percent of annual discharges from the Hematology/Oncology service, was identified. If management decided to add an acute-care, palliative medicine unit to the hospital, these are the patients who would benefit. The study predicted a significant change in patient profile, acuity, complexity, and resource utilization in current palliative care services. This study technique predicted the actual clinical load of the acute-care unit when it opened and was very helpful in program development. Our model predicted that 695 patients would be admitted to the acute-care palliative medicine unit in the first year of operation; 655 patients were actually admitted during this time.
Barvkar, Vitthal T; Pardeshi, Varsha C; Kale, Sandip M; Kadoo, Narendra Y; Giri, Ashok P; Gupta, Vidya S
2012-12-07
Flax (Linum usitatissimum L.) seeds are an important source of food and feed due to the presence of various health-promoting compounds, making it a nutritionally and economically important plant. An in-depth analysis of the proteome of the developing flax seed is expected to provide significant information on the regulation and accumulation of such storage compounds. Therefore, a proteomic analysis of seven seed developmental stages (4, 8, 12, 16, 22, 30, and 48 days after anthesis) in the flax variety NL-97 was carried out using a combination of 1D-SDS-PAGE and LC-MSE methods. A total of 1716 proteins were identified, and their functional annotation revealed that a majority of them were involved in primary metabolism, protein destination and storage, and energy. Three carbon assimilatory pathways appeared to operate in flax seeds. Reverse transcription quantitative PCR of 19 selected genes was carried out to understand their roles during seed development. Besides storage proteins, methionine synthase, RuBisCO and S-adenosylmethionine synthetase were highly expressed transcripts, highlighting their importance in flax seed development. Further, the identified proteins were mapped onto developmental seed-specific expressed sequence tag (EST) libraries of flax to obtain transcriptional evidence, and 81% of them had detectable expression at the mRNA level. This study provides new insights into the complex seed developmental processes operating in flax.
33 CFR 154.2020 - Certification and recertification-owner/operator responsibilities.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Procedures,” and in Military Standard MIL-STD-882B for a quantitative failure analysis. For assistance in... quantitative failure analysis is also conducted, the level of safety attained is at least one order of...
Using Risk Assessment Methodologies to Meet Management Objectives
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2015-01-01
Current decision making involves numerous possible combinations of technology elements, safety and health issues, operational aspects and process considerations to satisfy program goals. Identifying potential risk considerations as part of the management decision-making process provides additional tools for making more informed management decisions. Adapting and using risk assessment methodologies can generate new perspectives on various risk and safety concerns that are not immediately apparent. Safety and operational risks can be identified, and final decisions can balance these considerations against cost and schedule risks. Additional assessments can also show the likelihood of event occurrence and event consequence to provide a more informed basis for decision making, as well as cost-effective mitigation strategies. Methodologies available for performing risk assessments range from qualitative identification of risk potential to detailed assessments in which quantitative probabilities are calculated. The methodology used should be based on factors that include: 1) type of industry and industry standards, 2) tasks, tools, and environment, 3) type and availability of data, and 4) industry views and requirements regarding risk and reliability. Risk assessments are a tool for decision makers to understand potential consequences and be in a position to reduce, mitigate or eliminate costly mistakes or catastrophic failures.
Devine, Susan G; Muller, Reinhold; Carter, Anthony
2008-12-01
An exploratory descriptive study was undertaken to identify staff perceptions of the types and sources of occupational health and safety hazards at a remote fly-in-fly-out minerals extraction and processing plant in northwest Queensland. Ongoing focus groups with all sectors of the operation were conducted concurrently with quantitative research studies from 2001 to 2005. Action research processes were used with management and staff to develop responses to identified issues. Staff identified and generated solutions to the core themes of: health and safety policies and procedures; chemical exposures; hydration and fatigue. The Framework for Health Promotion Action was applied to ensure a comprehensive and holistic response to identified issues. Participatory processes using an action research framework enabled a deep understanding of staff perceptions of occupational health and safety hazards in this setting. The Framework for Health Promotion provided a relevant and useful tool to engage with staff and develop solutions to perceived occupational health and safety issues in the workplace.
Forman, Jane; Creswell, John W; Damschroder, Laura; Kowalski, Christine P; Krein, Sarah L
2008-12-01
Infection control professionals and hospital epidemiologists are accustomed to using quantitative research. Although quantitative studies are extremely important in the field of infection control and prevention, often they cannot help us explain why certain factors affect the use of infection control practices or identify the underlying mechanisms through which they do so. Qualitative research methods, which use open-ended techniques, such as interviews, to collect data and nonstatistical techniques to analyze them, provide detailed, diverse insights from individuals, useful quotes that bring realism to applied research, and information about how different health care settings operate. Qualitative research can illuminate the processes underlying statistical correlations, inform the development of interventions, and show how interventions work to produce observed outcomes. This article describes the key features of qualitative research and the advantages that such features add to existing quantitative research approaches in the study of infection control. We address the goal of qualitative research, the nature of the research process, sampling, data collection and analysis, validity, generalizability of findings, and presentation of findings. Health services researchers are increasingly using qualitative methods to address practical problems by uncovering interacting influences in complex health care environments. Qualitative research methods, applied with expertise and rigor, can contribute important insights to infection prevention efforts.
Velasco-Tapia, Fernando
2014-01-01
Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view can be reached by applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate the tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas with respect to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportions of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures).
Quantitative fault tolerant control design for a hydraulic actuator with a leaking piston seal
NASA Astrophysics Data System (ADS)
Karpenko, Mark
Hydraulic actuators are complex fluid power devices whose performance can be degraded in the presence of system faults. In this thesis a linear, fixed-gain, fault tolerant controller is designed that can maintain the positioning performance of an electrohydraulic actuator operating under load with a leaking piston seal and in the presence of parametric uncertainties. Developing a control system tolerant to this class of internal leakage fault is important since a leaking piston seal can be difficult to detect, unless the actuator is disassembled. The designed fault tolerant control law is of low-order, uses only the actuator position as feedback, and can: (i) accommodate nonlinearities in the hydraulic functions, (ii) maintain robustness against typical uncertainties in the hydraulic system parameters, and (iii) keep the positioning performance of the actuator within prescribed tolerances despite an internal leakage fault that can bypass up to 40% of the rated servovalve flow across the actuator piston. Experimental tests verify the functionality of the fault tolerant control under normal and faulty operating conditions. The fault tolerant controller is synthesized based on linear time-invariant equivalent (LTIE) models of the hydraulic actuator using the quantitative feedback theory (QFT) design technique. A numerical approach for identifying LTIE frequency response functions of hydraulic actuators from acceptable input-output responses is developed so that linearizing the hydraulic functions can be avoided. The proposed approach can properly identify the features of the hydraulic actuator frequency response that are important for control system design and requires no prior knowledge about the asymptotic behavior or structure of the LTIE transfer functions. 
A distributed hardware-in-the-loop (HIL) simulation architecture is constructed that enables the performance of the proposed fault tolerant control law to be further substantiated, under realistic operating conditions. Using the HIL framework, the fault tolerant hydraulic actuator is operated as a flight control actuator against the real-time numerical simulation of a high-performance jet aircraft. A robust electrohydraulic loading system is also designed using QFT so that the in-flight aerodynamic load can be experimentally replicated. The results of the HIL experiments show that using the fault tolerant controller to compensate the internal leakage fault at the actuator level can benefit the flight performance of the airplane.
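The LTIE identification step described above estimates a frequency response directly from acceptable input-output records rather than from linearized hydraulic equations. A minimal pure-Python sketch of the underlying idea (divide the DFTs of output and input bin by bin) follows; the function names are hypothetical, and the thesis' actual numerical procedure is more elaborate:

```python
import cmath

# Illustrative-only sketch: empirical frequency response H[k] = Y[k]/U[k]
# from one input-output record. A naive O(n^2) DFT is used for clarity;
# real identification would use windowed/averaged spectra of long records.

def dft(x):
    """Discrete Fourier transform of a real-valued sequence."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

def empirical_frequency_response(u, y):
    """Per-bin ratio of output to input spectra; None where the
    input carries no energy at that bin."""
    U, Y = dft(u), dft(y)
    return [Y[k] / U[k] if abs(U[k]) > 1e-12 else None
            for k in range(len(u))]

# A pure gain of 2 (impulse in, scaled impulse out) yields H[k] = 2 at
# every bin; a delayed response instead shows up as a phase rotation.
H = empirical_frequency_response([1, 0, 0, 0], [2, 0, 0, 0])
```

Magnitude and phase of such estimates over a set of operating points are what a QFT design would then enclose in its plant templates.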
NASA Technical Reports Server (NTRS)
Coppenbarger, Rich; Jung, Yoon; Kozon, Tom; Farrahi, Amir; Malik, Wakar; Lee, Hanbong; Chevalley, Eric; Kistler, Matt
2016-01-01
NASA is collaborating with the FAA and aviation industry to develop and demonstrate new capabilities that integrate arrival, departure, and surface air-traffic operations. The concept relies on trajectory-based departure scheduling and collaborative decision making to reduce delays and uncertainties in taxi and climb operations. The paper describes the concept and benefit mechanisms aimed at improving flight efficiency and predictability while maintaining or improving operational throughput. The potential impact of the technology is studied and discussed through a quantitative analysis of relevant shortfalls at the site identified for initial deployment and demonstration in 2017: Charlotte-Douglas International Airport. Results from trajectory analysis indicate substantial opportunity to reduce taxi delays for both departures and arrivals by metering departures at the gate in a manner that maximizes throughput while adhering to takeoff restrictions due mostly to airspace constraints. Substantial taxi-out delay reduction is shown for flights subject to departure restrictions stemming from traffic flow management initiatives. Opportunities to improve the predictability of taxi, takeoff, and climb operations are examined and their potential impact on airline scheduling decisions and air-traffic forecasting is discussed. In addition, the potential to improve throughput with departure scheduling that maximizes use of available runway and airspace capacity is analyzed.
Cordella, Claire; Dickerson, Bradford C.; Quimby, Megan; Yunusova, Yana; Green, Jordan R.
2016-01-01
Background Primary progressive aphasia (PPA) is a neurodegenerative aphasic syndrome with three distinct clinical variants: non-fluent (nfvPPA), logopenic (lvPPA), and semantic (svPPA). Speech (non-) fluency is a key diagnostic marker used to aid identification of the clinical variants, and researchers have been actively developing diagnostic tools to assess speech fluency. Current approaches reveal coarse differences in fluency between subgroups, but often fail to clearly differentiate nfvPPA from the variably fluent lvPPA. More robust subtype differentiation may be possible with finer-grained measures of fluency. Aims We sought to identify the quantitative measures of speech rate—including articulation rate and pausing measures—that best differentiated PPA subtypes, specifically the non-fluent group (nfvPPA) from the more fluent groups (lvPPA, svPPA). The diagnostic accuracy of the quantitative speech rate variables was compared to that of a speech fluency impairment rating made by clinicians. Methods and Procedures Automatic estimates of pause and speech segment durations and rate measures were derived from connected speech samples of participants with PPA (N=38; 11 nfvPPA, 14 lvPPA, 13 svPPA) and healthy age-matched controls (N=8). Clinician ratings of fluency impairment were made using a previously validated clinician rating scale developed specifically for use in PPA. Receiver operating characteristic (ROC) analyses enabled a quantification of diagnostic accuracy. Outcomes and Results Among the quantitative measures, articulation rate was the most effective for differentiating between nfvPPA and the more fluent lvPPA and svPPA groups. The diagnostic accuracy of both speech and articulation rate measures was markedly better than that of the clinician rating scale, and articulation rate was the best classifier overall. 
Area under the curve (AUC) values for articulation rate were good to excellent for identifying nfvPPA from both svPPA (AUC=.96) and lvPPA (AUC=.86). Cross-validation of accuracy results for articulation rate showed good generalizability outside the training dataset. Conclusions Results provide empirical support for (1) the efficacy of quantitative assessments of speech fluency and (2) a distinct non-fluent PPA subtype characterized, at least in part, by an underlying disturbance in speech motor control. The trend toward improved classifier performance for quantitative rate measures demonstrates the potential for a more accurate and reliable approach to subtyping in the fluency domain, and suggests that articulation rate may be a useful input variable as part of a multi-dimensional clinical subtyping approach. PMID:28757671
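The AUC values reported above have a direct probabilistic reading via the Mann-Whitney formulation: the AUC equals the probability that a randomly chosen case from one group scores higher on the classifier variable (here, articulation rate) than a randomly chosen case from the other group. A small self-contained sketch, with made-up data for illustration:

```python
# Minimal AUC computation via the Mann-Whitney U statistic.
# The rate values below are invented for illustration only.

def auc(scores_pos, scores_neg):
    """Probability that a positive case outscores a negative one;
    ties count one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical articulation rates (syllables/s): fluent vs non-fluent.
fluent = [4.1, 3.8, 4.5, 3.9]
nonfluent = [2.6, 3.9, 2.2]
value = auc(fluent, nonfluent)  # fraction of fluent/non-fluent pairs
                                # ranked in the expected order
```

With perfectly separated groups this returns 1.0; chance-level separation returns about 0.5, which is what the ROC analyses in the study quantify.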
[Quantitative research on operation behavior of acupuncture manipulation].
Li, Jing; Grierson, Lawrence; Wu, Mary X; Breuer, Ronny; Carnahan, Heather
2014-03-01
To explore a method for the quantitative evaluation of the operation behavior of acupuncture manipulation and to further analyze the behavior features of professional acupuncture manipulation. Based on basic acupuncture manipulations, the Scale for Operation Behavior of Acupuncture Basic Manipulation was developed, and the Delphi method was adopted to test its validity. Two independent raters used this scale to assess the operation behavior of acupuncture manipulation among 12 acupuncturists and 12 acupuncture novices and to calculate inter-rater reliability; the differences in the total score of operation behavior between the two groups, as well as the single-step scores, including sterilization, needle insertion, needle manipulation and needle withdrawal, were also compared. The validity of the scale was satisfactory. The inter-rater reliability was 0.768. The total score of operation behavior in the acupuncturist group was significantly higher than that in the acupuncture-novice group (13.80 +/- 1.05 vs 11.03 +/- 2.14, P < 0.01). The scores of needle insertion and needle manipulation in the acupuncturist group were significantly higher than those in the acupuncture-novice group (4.28 +/- 0.91 vs 2.54 +/- 1.51, P < 0.01; 2.56 +/- 0.65 vs 1.88 +/- 0.88, P < 0.05); however, the scores of sterilization and needle withdrawal did not differ between the two groups. This scale is suitable for the quantitative evaluation of the operation behavior of acupuncture manipulation. The behavior features of professional acupuncture manipulation are mainly presented in needle insertion and needle manipulation, which demand superior difficulty, coordination and accuracy.
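The reported inter-rater reliability of 0.768 is an agreement coefficient between the two raters; the abstract does not state which statistic was used, so as an illustration the sketch below computes Cohen's kappa, a common choice for two raters scoring the same items:

```python
# Cohen's kappa: agreement between two raters corrected for the
# agreement expected by chance. Ratings below are invented examples,
# not data from the acupuncture study.

def cohens_kappa(r1, r2):
    """Kappa for two raters over the same categorical items."""
    n = len(r1)
    cats = set(r1) | set(r2)
    # Observed proportion of items on which the raters agree.
    po = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement from each rater's marginal category frequencies.
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

rater1 = ["pass", "pass", "fail", "pass"]
rater2 = ["pass", "fail", "fail", "pass"]
k = cohens_kappa(rater1, rater2)  # observed 0.75 vs chance 0.5
```

Values near 0.77, as in the study, are conventionally read as substantial agreement.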
Methodology for urban rail and construction technology research and development planning
NASA Technical Reports Server (NTRS)
Rubenstein, L. D.; Land, J. E.; Deshpande, G.; Dayman, B.; Warren, E. H.
1980-01-01
A series of transit system visits, organized by the American Public Transit Association (APTA), was conducted in which the system operators identified the most pressing development needs. These varied by property and were reformulated into a series of potential projects. To assist in the evaluation, a data base useful for estimating the present capital and operating costs of various transit system elements was generated from published data. An evaluation model was developed which considered the rate of deployment of the research and development project, potential benefits, development time and cost. An outline of an evaluation methodology that considered benefits other than capital and operating cost savings was also presented. During the course of the study, five candidate projects were selected for detailed investigation; (1) air comfort systems; (2) solid state auxiliary power conditioners; (3) door systems; (4) escalators; and (5) fare collection systems. Application of the evaluation model to these five examples showed the usefulness of modeling deployment rates and indicated a need to increase the scope of the model to quantitatively consider reliability impacts.
Analytical control test plan and microbiological methods for the water recovery test
NASA Technical Reports Server (NTRS)
Traweek, M. S. (Editor); Tatara, J. D. (Editor)
1994-01-01
Qualitative and quantitative laboratory results are important to the decision-making process. In some cases, they may represent the only basis for deciding between two or more given options or processes. Therefore, it is essential that the handling of laboratory samples and the analytical operations employed be performed with a deliberate level of conscientious effort. Reporting erroneous results can lead to faulty interpretations and result in misinformed decisions. This document provides the analytical control specifications that will govern future test procedures related to all Water Recovery Test (WRT) Phase 3 activities to be conducted at the National Aeronautics and Space Administration/Marshall Space Flight Center (NASA/MSFC). It addresses the process that will be used to verify analytical data generated throughout the test period, identifies the responsibilities of key personnel and participating laboratories and the chains of communication to be followed, and ensures that approved methodology and procedures are used during WRT activities. This document does not outline specifics, but provides a minimum guideline by which sampling protocols, analysis methodologies, test site operations, and laboratory operations should be developed.
Temporal system-level organization of the switch from glycolytic to gluconeogenic operation in yeast
Zampar, Guillermo G; Kümmel, Anne; Ewald, Jennifer; Jol, Stefan; Niebel, Bastian; Picotti, Paola; Aebersold, Ruedi; Sauer, Uwe; Zamboni, Nicola; Heinemann, Matthias
2013-01-01
The diauxic shift in Saccharomyces cerevisiae is an ideal model to study how eukaryotic cells readjust their metabolism from glycolytic to gluconeogenic operation. In this work, we generated time-resolved physiological data, quantitative metabolome (69 intracellular metabolites) and proteome (72 enzymes) profiles. We found that the diauxic shift is accomplished by three key events that are temporally organized: (i) a reduction in the glycolytic flux and the production of storage compounds before glucose depletion, mediated by downregulation of phosphofructokinase and pyruvate kinase reactions; (ii) upon glucose exhaustion, the reversion of carbon flow through glycolysis and onset of the glyoxylate cycle operation triggered by an increased expression of the enzymes that catalyze the malate synthase and cytosolic citrate synthase reactions; and (iii) in the later stages of the adaptation, the shutting down of the pentose phosphate pathway with a change in NADPH regeneration. Moreover, we identified the transcription factors associated with the observed changes in protein abundances. Taken together, our results represent an important contribution toward a systems-level understanding of how this adaptation is realized. PMID:23549479
Data from quantitative label free proteomics analysis of rat spleen.
Dudekula, Khadar; Le Bihan, Thierry
2016-09-01
The dataset presented in this work was obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for the extraction of proteins from rat spleen tissue and LC-MS/MS analysis was developed using a urea- and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels; 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); the Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow assessing the nature of the identified proteins and the variability in the quantitative analysis associated with the different sampling strategies, and allow defining a proper number of replicates for future quantitative analyses.
Development of quantitative risk acceptance criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griesmeyer, J. M.; Okrent, D.
Some of the major considerations for the effective management of risk are discussed, with particular emphasis on risks due to nuclear power plant operations. Although there are impacts associated with the rest of the fuel cycle, they are not addressed here. Several previously published proposals for quantitative risk criteria are reviewed. They range from a simple acceptance criterion on the individual risk of death to a quantitative risk management framework. The final section discusses some of the problems in establishing a framework for the quantitative management of risk.
Morgan, Patrick; Nissi, Mikko J; Hughes, John; Mortazavi, Shabnam; Ellerman, Jutta
2017-07-01
Objectives The purpose of this study was to validate T2* mapping as an objective, noninvasive method for the prediction of acetabular cartilage damage. Methods This is the second step in the validation of T2*. In a previous study, we established a quantitative predictive model for identifying and grading acetabular cartilage damage. In this study, the model was applied to a second cohort of 27 consecutive hips to validate the model. A clinical 3.0-T imaging protocol with T2* mapping was used. Acetabular regions of interest (ROI) were identified on magnetic resonance and graded using the previously established model. Each ROI was then graded in a blinded fashion by arthroscopy. Accurate surgical location of ROIs was facilitated with a 2-dimensional map projection of the acetabulum. A total of 459 ROIs were studied. Results When T2* mapping and arthroscopic assessment were compared, 82% of ROIs were within 1 Beck group (of a total 6 possible) and 32% of ROIs were classified identically. Disease prediction based on receiver operating characteristic curve analysis demonstrated a sensitivity of 0.713 and a specificity of 0.804. Model stability evaluation required no significant changes to the predictive model produced in the initial study. Conclusions These results validate that T2* mapping provides statistically comparable information regarding acetabular cartilage when compared to arthroscopy. In contrast to arthroscopy, T2* mapping is quantitative, noninvasive, and can be used in follow-up. Unlike research quantitative magnetic resonance protocols, T2* takes little time and does not require a contrast agent. This may facilitate its use in the clinical sphere.
Cuthbertson, Daniel J; Weickert, Martin O; Lythgoe, Daniel; Sprung, Victoria S; Dobson, Rebecca; Shoajee-Moradie, Fariba; Umpleby, Margot; Pfeiffer, Andreas F H; Thomas, E Louise; Bell, Jimmy D; Jones, Helen; Kemp, Graham J
2014-11-01
Simple clinical algorithms including the fatty liver index (FLI) and lipid accumulation product (LAP) have been developed as surrogate markers for non-alcoholic fatty liver disease (NAFLD), constructed using (semi-quantitative) ultrasonography. This study aimed to validate FLI and LAP as measures of hepatic steatosis, as determined quantitatively by proton magnetic resonance spectroscopy (1H-MRS). Data were collected from 168 patients with NAFLD and 168 controls who had undergone clinical, biochemical and anthropometric assessment. Values of FLI and LAP were determined and assessed both as predictors of the presence of hepatic steatosis (liver fat >5.5%) and of actual liver fat content, as measured by 1H-MRS. The discriminative ability of FLI and LAP was estimated using the area under the receiver operator characteristic curve (AUROC). As FLI can also be interpreted as a predictive probability of hepatic steatosis, we assessed how well calibrated it was in our cohort. Linear regression with prediction intervals was used to assess the ability of FLI and LAP to predict liver fat content. Further validation was provided in 54 patients with type 2 diabetes mellitus. FLI, LAP and alanine aminotransferase discriminated between patients with and without steatosis, with AUROCs of 0.79 (IQR=0.74, 0.84), 0.78 (IQR=0.72, 0.83) and 0.83 (IQR=0.79, 0.88), respectively, although they could not quantitatively predict liver fat. Additionally, the algorithms accurately matched the observed percentages of patients with hepatic steatosis in our cohort. FLI and LAP may be used to identify patients with hepatic steatosis clinically or for research purposes but could not predict liver fat content. © 2014 European Society of Endocrinology.
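For context, a hedged sketch of the fatty liver index follows. The coefficients are reproduced from the original FLI publication (Bedogni et al., 2006) and should be treated as an assumption here rather than as the exact algorithm implemented in the study above:

```python
import math

# FLI sketch with coefficients as reported by Bedogni et al. (2006):
# a logistic function of triglycerides, BMI, GGT and waist circumference.
# Verify against the primary source before any clinical use.

def fatty_liver_index(tg_mg_dl, bmi, ggt_u_l, waist_cm):
    """FLI on a 0-100 scale; values >= 60 are commonly read as
    indicating likely hepatic steatosis."""
    y = (0.953 * math.log(tg_mg_dl) + 0.139 * bmi
         + 0.718 * math.log(ggt_u_l) + 0.053 * waist_cm - 15.745)
    return 100.0 * math.exp(y) / (1.0 + math.exp(y))

# Illustrative input: TG 150 mg/dL, BMI 30, GGT 60 U/L, waist 100 cm.
fli = fatty_liver_index(150, 30, 60, 100)
```

Because FLI is itself a predicted probability times 100, calibration can be checked, as the study does, by comparing these values against the observed steatosis rates.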
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mu, Jun; Institute of Neuroscience and the Collaborative Innovation Center for Brain Science, Chongqing Medical University, Chongqing; Chongqing Key Laboratory of Neurobiology, Chongqing
Purpose: Tuberculous meningitis (TBM) remains one of the most deadly infectious diseases. The pathogen interacts with the host immune system through processes that are largely unknown. Many cellular processes of Mycobacterium tuberculosis (MTB) center on lipid metabolism. To identify lipid metabolism-related proteins, a quantitative proteomic study was performed here to identify differential proteins in the cerebrospinal fluid (CSF) obtained from TBM patients (n = 12) and healthy controls (n = 12). Methods: CSF samples were desalted, concentrated, labelled with isobaric tags for relative and absolute quantitation (iTRAQ™), and analyzed by multi-dimensional liquid chromatography-tandem mass spectrometry (LC-MS/MS). Gene ontology and proteomic phenotyping analysis of the differential proteins were conducted using Database for Annotation, Visualization, and Integrated Discovery (DAVID) Bioinformatics Resources. ApoE and ApoB were selected for validation by ELISA. Results: Proteomic phenotyping showed that 4 differential proteins were involved in lipid metabolism. ELISA showed significantly increased ApoB levels in TBM subjects compared to healthy controls. Area under the receiver operating characteristic curve analysis demonstrated that ApoB levels could distinguish TBM subjects from healthy controls and viral meningitis subjects with 89.3% sensitivity and 92% specificity. Conclusions: CSF lipid metabolism dysregulation, especially elevated expression of ApoB, gives insights into the pathogenesis of TBM. Further evaluation of these findings in larger studies, including anti-tuberculosis medicated and unmedicated patient cohorts with other central nervous system infectious diseases, is required for successful clinical translation. - Highlights: • The first proteomic study on the cerebrospinal fluid of tuberculous meningitis patients using iTRAQ. • Identified 4 differential proteins involved in lipid metabolism.
• Elevated expression of ApoB gives insights into the pathogenesis of TBM.
NASA Astrophysics Data System (ADS)
Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, C.; Jourdain, P.; Magistretti, P. J.
2016-03-01
Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, and is thus likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to its numerical flexibility facilitating parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of disease and to explore the underlying pathophysiological processes.
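The coupling of thickness and refractive index in the quantitative phase signal follows from the standard transmission phase relation; a minimal sketch with hypothetical cell parameters (not values from the abstract):

```python
import math

def quantitative_phase_signal(thickness_nm, n_cell, n_medium, wavelength_nm):
    """Phase retardation (radians) of light crossing a cell in transmission:
    phi = (2*pi / lambda) * (n_cell - n_medium) * thickness.
    Shows why the QPS couples cell thickness and intracellular index."""
    return 2 * math.pi * (n_cell - n_medium) * thickness_nm / wavelength_nm

# A hypothetical cell: 5 um thick, index 1.375, in medium of index 1.337, at 532 nm
phi = quantitative_phase_signal(5000, 1.375, 1.337, 532)
```

Because only the product of thickness and index contrast enters, separating the two requires extra measurements (e.g. a medium exchange), which is one motivation for the decoupling procedures mentioned in the QPM literature.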
Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.
Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L
2017-10-01
The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies however primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: Lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). 
Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
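A hedged sketch of a state-space-style variability measure: if each EEG epoch is reduced to a point in a spectral feature space, background variability can be summarized as the mean step length ("velocity") of the trajectory. The features below are hypothetical toy data, not the study's actual model.

```python
import math

def state_space_velocity(trajectory):
    """Mean Euclidean step length between consecutive points of a spectral
    feature trajectory (one point per EEG epoch). Lower velocity means less
    background variability — the direction associated above with poor outcome."""
    steps = [math.dist(a, b) for a, b in zip(trajectory, trajectory[1:])]
    return sum(steps) / len(steps)

# Hypothetical 2-D band-power features per epoch (e.g. delta and alpha power)
flat = [(1.0, 1.0), (1.0, 1.1), (1.1, 1.1), (1.0, 1.0)]
variable = [(1.0, 1.0), (2.0, 0.5), (0.5, 2.0), (2.0, 2.0)]
low, high = state_space_velocity(flat), state_space_velocity(variable)
```

In this toy setting the invariant EEG yields a much lower velocity than the variable one, mirroring the group difference the study reports.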
miRNA 206 and miRNA 574-5p are highly expressed in coronary artery disease
Zhou, Jianqing; Shao, Guofeng; Chen, Xiaoliang; Yang, Xi; Huang, Xiaoyan; Peng, Ping; Ba, Yanna; Zhang, Lin; Jehangir, Tashina; Bu, Shizhong; Liu, Ningsheng; Lian, Jiangfang
2015-01-01
Coronary artery disease (CAD) is the leading cause of human morbidity and mortality worldwide. Innovative diagnostic biomarkers are a pressing need for this disease. miRNA profiling is an innovative method of identifying biomarkers for many diseases and could prove to be a powerful tool in the diagnosis and treatment of CAD. We performed miRNA microarray analysis from the plasma of three CAD patients and three healthy controls. Subsequently, we performed quantitative real-time PCR (qRT-PCR) analysis of miRNA expression in plasma of another 67 CAD patients and 67 healthy controls. We identified two miRNAs (miR-206 and miR-574-5p) that were significantly up-regulated in CAD patients as compared with healthy controls (P<0.05). The receiver operating characteristic (ROC) curves indicated these two miRNAs had great potential to provide sensitive and specific diagnostic value for CAD. PMID:26685009
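qRT-PCR expression comparisons like the one above are commonly summarized with the 2^-ΔΔCt (Livak) method; a minimal sketch with hypothetical Ct values (the abstract does not give its normalization details):

```python
def fold_change(ct_target_case, ct_ref_case, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ddCt method: normalize the target's Ct
    against a reference gene in each group, then compare groups.
    Values > 1 indicate up-regulation in cases."""
    ddct = (ct_target_case - ct_ref_case) - (ct_target_ctrl - ct_ref_ctrl)
    return 2 ** (-ddct)

# Hypothetical Ct values: the target miRNA amplifies 2 cycles earlier in CAD plasma
print(fold_change(24.0, 18.0, 26.0, 18.0))  # → 4.0
```

Each earlier cycle corresponds to a doubling of starting template, hence a 2-cycle shift maps to a 4-fold difference.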
Key Microbiota Identification Using Functional Gene Analysis during Pepper (Piper nigrum L.) Peeling
Zhang, Jiachao; Hu, Qisong; Xu, Chuanbiao; Liu, Sixin; Li, Congfa
2016-01-01
Pepper pericarp microbiota plays an important role in the pepper peeling process for the production of white pepper. We collected pepper samples at different peeling time points from Hainan Province, China, and used a metagenomic approach to identify changes in the pericarp microbiota based on functional gene analysis. UniFrac distance-based principal coordinates analysis revealed significant changes in the pericarp microbiota structure during peeling, which were attributed to increases in bacteria from the genera Selenomonas and Prevotella. We identified 28 core operational taxonomic units at each time point, mainly belonging to Selenomonas, Prevotella, Megasphaera, Anaerovibrio, and Clostridium genera. The results were confirmed by quantitative polymerase chain reaction. At the functional level, we observed significant increases in microbial features related to acetyl xylan esterase and pectinesterase for pericarp degradation during peeling. These findings offer a new insight into biodegradation for pepper peeling and will promote the development of the white pepper industry. PMID:27768750
A quantitative comparison of leading-edge vortices in incompressible and supersonic flows
DOT National Transportation Integrated Search
2002-01-14
When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that pl...
Quantitative trait loci associated with anthracnose resistance in sorghum
USDA-ARS?s Scientific Manuscript database
With an aim to develop a durable resistance to the fungal disease anthracnose, two unique genetic sources of resistance were selected to create genetic mapping populations to identify regions of the sorghum genome that encode anthracnose resistance. A series of quantitative trait loci were identifi...
Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
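The prioritization framework described above (severity and likelihood arguing for quantitative analysis, modeling difficulty arguing against) can be sketched as a simple composite score; the 1-5 scales and the combining rule below are assumptions for illustration, not the paper's exact metric.

```python
def prioritize(scenarios):
    """Rank hazard scenarios for quantitative analysis: a hypothetical
    composite score severity * likelihood / difficulty, so that severe,
    likely, tractable scenarios come first."""
    def score(s):
        return s["severity"] * s["likelihood"] / s["difficulty"]
    return sorted(scenarios, key=score, reverse=True)

scenarios = [
    {"name": "wake encounter on parallel approach", "severity": 5, "likelihood": 2, "difficulty": 2},
    {"name": "runway incursion", "severity": 5, "likelihood": 1, "difficulty": 4},
    {"name": "minor taxi conflict", "severity": 1, "likelihood": 4, "difficulty": 1},
]
ranked = prioritize(scenarios)
```

Here the severe but hard-to-model incursion scenario drops to the bottom, illustrating how modeling difficulty tempers a severity-only ranking.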
Usefulness of quantitative susceptibility mapping for the diagnosis of Parkinson disease.
Murakami, Y; Kakeda, S; Watanabe, K; Ueda, I; Ogasawara, A; Moriya, J; Ide, S; Futatsuya, K; Sato, T; Okada, K; Uozumi, T; Tsuji, S; Liu, T; Wang, Y; Korogi, Y
2015-06-01
Quantitative susceptibility mapping allows overcoming several nonlocal restrictions of susceptibility-weighted and phase imaging and enables quantification of magnetic susceptibility. We compared the diagnostic accuracy of quantitative susceptibility mapping and R2* (1/T2*) mapping to discriminate between patients with Parkinson disease and controls. For 21 patients with Parkinson disease and 21 age- and sex-matched controls, 2 radiologists measured the quantitative susceptibility mapping values and R2* values in 6 brain structures (the thalamus, putamen, caudate nucleus, pallidum, substantia nigra, and red nucleus). The quantitative susceptibility mapping values and R2* values of the substantia nigra were significantly higher in patients with Parkinson disease (P < .01); measurements in other brain regions did not differ significantly between patients and controls. For the discrimination of patients with Parkinson disease from controls, receiver operating characteristic analysis suggested that the optimal cutoff values for the substantia nigra, based on the Youden Index, were >0.210 for quantitative susceptibility mapping and >28.8 for R2*. The sensitivity, specificity, and accuracy of quantitative susceptibility mapping were 90% (19 of 21), 86% (18 of 21), and 88% (37 of 42), respectively; for R2* mapping, they were 81% (17 of 21), 52% (11 of 21), and 67% (28 of 42). Pair-wise comparisons showed that the areas under the receiver operating characteristic curves were significantly larger for quantitative susceptibility mapping than for R2* mapping (0.91 versus 0.69, P < .05). Quantitative susceptibility mapping showed higher diagnostic performance than R2* mapping for the discrimination between patients with Parkinson disease and controls. © 2015 by American Journal of Neuroradiology.
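The reported sensitivity, specificity and accuracy follow directly from the per-subject counts given in the abstract; a minimal check from a 2x2 confusion table:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from a 2x2 confusion table
    (tp/fn: patients correctly/incorrectly classified; tn/fp: controls)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fn + tn + fp)
    return sens, spec, acc

# The QSM figures above: 19 of 21 patients and 18 of 21 controls correct
sens, spec, acc = diagnostic_metrics(tp=19, fn=2, tn=18, fp=3)
print(round(sens, 2), round(spec, 2), round(acc, 2))  # → 0.9 0.86 0.88
```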
NASA Astrophysics Data System (ADS)
Kajla, Arun; Deshwal, Sheetal; Agrawal, P. N.
2018-05-01
In the present paper we introduce a Durrmeyer variant of Jain operators based on a function ρ (x) where ρ is a continuously differentiable function on [0,∞), ρ (0)=0 and \\inf ρ '(x)≥ a, a >0, x \\in [0,∞) . For these new operators, some indispensable auxiliary results are established first. Then, the degree of approximation with the aid of Ditzian-Totik modulus of smoothness and the rate of convergence for functions whose derivatives are of bounded variation, is obtained. Further, we focus on the study of a Voronovskaja type asymptotic theorem, quantitative Voronovskaya and Grüss-Voronovskaya type theorems.
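For context, the classical Jain operators (of which the paper studies a Durrmeyer-type variant) are usually written as follows; this is a standard recollection from the approximation-theory literature, not taken from the paper, and its notation may differ from the authors'.

```latex
% Classical Jain operators, 0 <= beta < 1:
P_n^{(\beta)}(f;x) = \sum_{k=0}^{\infty} \omega_\beta(k, nx)\, f\!\left(\tfrac{k}{n}\right),
\qquad
\omega_\beta(k,\alpha) = \frac{\alpha\,(\alpha + k\beta)^{k-1}}{k!}\, e^{-(\alpha + k\beta)} .
% For beta = 0 these reduce to the Szasz-Mirakjan operators.
```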
Duan, Chuanren; Cui, Yamin; Zhao, Yi; Zhai, Jun; Zhang, Baoyun; Zhang, Kun; Sun, Da; Chen, Hang
2016-10-01
A genetic marker within the 16S rRNA gene of Faecalibacterium was identified for use in a quantitative PCR (qPCR) assay to detect swine faecal contamination in water. A total of 146,038 bacterial sequences were obtained using 454 pyrosequencing. By comparative bioinformatics analysis of Faecalibacterium sequences with those of numerous swine and other animal species, swine-specific Faecalibacterium 16S rRNA gene sequences were identified, and polymerase chain reaction (PCR) primer sets were designed and tested against faecal DNA samples from swine and non-swine sources. Two PCR primer sets, PFB-1 and PFB-2, showed the highest specificity to swine faecal waste and had no cross-reaction with other animal samples. PFB-1 and PFB-2 amplified 16S rRNA gene sequences from 50 samples of swine with positive ratios of 86 and 90%, respectively. We compared swine-specific Faecalibacterium qPCR assays for the purpose of quantifying the newly identified markers. The limits of quantification (LOQs) of the PFB-1 and PFB-2 markers in environmental water were 6.5 and 2.9 copies per 100 ml, respectively. Of the swine-associated assays tested, PFB-2 was more sensitive in detecting swine faecal waste and quantifying the microbial load. Furthermore, the microbial abundance and diversity of the microbiomes of swine and other animal faeces were estimated using operational taxonomic units (OTUs). Species specificity was demonstrated for the microbial populations present in various animal faeces. Copyright © 2016 Elsevier Ltd. All rights reserved.
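Copy numbers like the LOQ figures above are read off a qPCR standard curve; a hedged sketch of that conversion, with a hypothetical slope and intercept rather than the PFB assays' calibration:

```python
def copies_from_cq(cq, slope=-3.32, intercept=38.0):
    """Convert a qPCR quantification cycle (Cq) to target copy number via a
    standard curve Cq = slope * log10(copies) + intercept. The slope and
    intercept are hypothetical; -3.32 corresponds to ~100% amplification
    efficiency."""
    return 10 ** ((cq - intercept) / slope)

# A late Cq close to the curve's intercept implies only a handful of copies
copies = copies_from_cq(35.0)
```

Under these assumed calibration constants, a Cq of 35 maps to roughly 8 copies, while earlier Cq values map to exponentially larger loads.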
The Vermicelli Handling Test: A Simple Quantitative Measure of Dexterous Forepaw Function in Rats
Allred, Rachel P.; Adkins, DeAnna L.; Woodlee, Martin T.; Husbands, Lincoln C.; Maldonado, Mónica A.; Kane, Jacqueline R.; Schallert, Timothy; Jones, Theresa A.
2008-01-01
Loss of function in the hands occurs with many brain disorders, but there are few measures of skillful forepaw use in rats available to model these impairments that are both sensitive and simple to administer. Whishaw and Coles (1996) previously described the dexterous manner in which rats manipulate food items with their paws, including thin pieces of pasta. We set out to develop a measure of this food handling behavior that would be quantitative, easy to administer, sensitive to the effects of damage to sensory and motor systems of the CNS and useful for identifying the side of lateralized impairments. When rats handle 7 cm lengths of vermicelli, they manipulate the pasta by repeatedly adjusting the forepaw hold on the pasta piece. As operationally defined, these adjustments can be easily identified and counted by an experimenter without specialized equipment. After unilateral sensorimotor cortex (SMC) lesions, transient middle cerebral artery occlusion (MCAO) and striatal dopamine depleting (6-hydroxydopamine, 6-OHDA) lesions in adult rats, there were enduring reductions in adjustments made with the contralateral forepaw. Additional pasta handling characteristics distinguished between the lesion types. MCAO and 6-OHDA lesions increased the frequency of several identified atypical handling patterns. Severe dopamine depletion increased eating time and adjustments made with the ipsilateral forepaw. However, contralateral forepaw adjustment number most sensitively detected enduring impairments across lesion types. Because of its ease of administration and sensitivity to lateralized impairments in skilled forepaw use, this measure may be useful in rat models of upper extremity impairment. PMID:18325597
An Automated Solar Synoptic Analysis Software System
NASA Astrophysics Data System (ADS)
Hong, S.; Lee, S.; Oh, S.; Kim, J.; Lee, J.; Kim, Y.; Lee, J.; Moon, Y.; Lee, D.
2012-12-01
We have developed an automated software system for identifying solar active regions, filament channels, and coronal holes, which are three major solar sources of space weather. Space weather forecasters of the NOAA Space Weather Prediction Center produce solar synoptic drawings on a daily basis to predict solar activities, i.e., solar flares, filament eruptions, high speed solar wind streams, and co-rotating interaction regions, as well as their possible effects on the Earth. As an attempt to emulate this process in a fully automated and consistent way, we developed a software system named ASSA (Automated Solar Synoptic Analysis). When identifying solar active regions, ASSA uses high-resolution SDO HMI intensitygrams and magnetograms as inputs and provides the McIntosh classification and Mt. Wilson magnetic classification of each active region by applying appropriate image processing techniques such as thresholding, morphology extraction, and region growing. At the same time, it also extracts morphological and physical properties of active regions in a quantitative way for the short-term prediction of flares and CMEs. When identifying filament channels and coronal holes, images of the Global H-alpha Network and SDO AIA 193 are used for morphological identification, along with SDO HMI magnetograms for quantitative verification. The output results of ASSA are routinely checked and validated against NOAA's daily SRS (Solar Region Summary) and UCOHO (URSIgram code for coronal hole information). A couple of preliminary scientific results are presented using available output results. ASSA will be deployed at the Korean Space Weather Center and serve its customers in an operational status by the end of 2012.
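One of the image-processing steps named above, thresholding followed by region growing, can be sketched on a toy array; the "intensitygram" and threshold below are made up for illustration, not SDO data or ASSA's actual parameters.

```python
def region_grow(image, seed, threshold):
    """Grow a 4-connected region from a seed over pixels >= threshold —
    the kind of step that, chained with thresholding and morphology,
    delineates a candidate active region. Pure-Python sketch."""
    rows, cols = len(image), len(image[0])
    stack, region = [seed], set()
    while stack:
        r, c = stack.pop()
        if (r, c) in region or not (0 <= r < rows and 0 <= c < cols):
            continue
        if image[r][c] < threshold:
            continue
        region.add((r, c))
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return region

# Toy image: one bright 2x2 blob plus an isolated bright pixel at (2, 4)
img = [
    [0, 0, 0, 0, 0],
    [0, 9, 8, 0, 0],
    [0, 7, 9, 0, 6],
    [0, 0, 0, 0, 0],
]
blob = region_grow(img, seed=(1, 1), threshold=5)
```

Growing from the seed captures only the connected blob; the isolated bright pixel is excluded, which is what distinguishes region growing from plain thresholding.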
Lee, Jiyeong; Joo, Eun-Jeong; Lim, Hee-Joung; Park, Jong-Moon; Lee, Kyu Young; Park, Arum; Seok, AeEun
2015-01-01
Objective Currently, there are a few biological markers to aid in the diagnosis and treatment of depression. However, it is not sufficient for diagnosis. We attempted to identify differentially expressed proteins during depressive moods as putative diagnostic biomarkers by using quantitative proteomic analysis of serum. Methods Blood samples were collected twice from five patients with major depressive disorder (MDD) at depressive status before treatment and at remission status during treatment. Samples were individually analyzed by liquid chromatography-tandem mass spectrometry for protein profiling. Differentially expressed proteins were analyzed by label-free quantification. Enzyme-linked immunosorbent assay (ELISA) results and receiver-operating characteristic (ROC) curves were used to validate the differentially expressed proteins. For validation, 8 patients with MDD including 3 additional patients and 8 matched normal controls were analyzed. Results The quantitative proteomic studies identified 10 proteins that were consistently upregulated or downregulated in 5 MDD patients. ELISA yielded results consistent with the proteomic analysis for 3 proteins. Expression levels were significantly different between normal controls and MDD patients. The 3 proteins were ceruloplasmin, inter-alpha-trypsin inhibitor heavy chain H4 and complement component 1qC, which were upregulated during the depressive status. The depressive status could be distinguished from the euthymic status from the ROC curves for these proteins, and this discrimination was enhanced when all 3 proteins were analyzed together. Conclusion This is the first proteomic study in MDD patients to compare intra-individual differences dependent on mood. This technique could be a useful approach to identify MDD biomarkers, but requires additional proteomic studies for validation. PMID:25866527
Bush, Joseph; Langley, Christopher A; Jenkins, Duncan; Johal, Jaspal; Huckerby, Clair
2017-12-27
The aim of this research was to characterise the breadth and volume of activity conducted by clinical pharmacists in general practice in Dudley Clinical Commissioning Group (CCG), and to provide quantitative estimates of both the savings in general practitioner (GP) time and the financial savings attributable to such activity. This descriptive observational study retrospectively analysed quantitative data collected by Dudley CCG concerning the activity of clinical pharmacists in GP practices during 2015. Over the 9-month period for which data were available, the 5.4 whole time equivalent clinical pharmacists operating in GP practices within Dudley CCG identified 23 172 interventions. Ninety-five per cent of the interventions identified were completed within the study period, saving the CCG in excess of £1 000 000. During the 4 months for which resource allocation data were available, the clinical pharmacists saved 628 GP appointments plus an additional 647 h that GPs currently devote to medication review and the management of repeat prescribing. This research suggests that clinical pharmacists in general practice in Dudley CCG are able to deliver clinical interventions efficiently and in high volume. In doing so, clinical pharmacists were able to generate considerable financial returns on investment. Further work is recommended to examine the effectiveness and cost-effectiveness of clinical pharmacists in general practice in improving outcomes for patients. © 2017 Royal Pharmaceutical Society.
Fuzzy pulmonary vessel segmentation in contrast enhanced CT data
NASA Astrophysics Data System (ADS)
Kaftan, Jens N.; Kiraly, Atilla P.; Bakai, Annemarie; Das, Marco; Novak, Carol L.; Aach, Til
2008-03-01
Pulmonary vascular tree segmentation has numerous applications in medical imaging and computer-aided diagnosis (CAD), including detection and visualization of pulmonary emboli (PE), improved lung nodule detection, and quantitative vessel analysis. We present a novel approach to pulmonary vessel segmentation based on a fuzzy segmentation concept, combining the strengths of both threshold and seed point based methods. The lungs of the original image are first segmented and a threshold-based approach identifies core vessel components with a high specificity. These components are then used to automatically identify reliable seed points for a fuzzy seed point based segmentation method, namely fuzzy connectedness. The output of the method consists of the probability of each voxel belonging to the vascular tree. Hence, our method provides the possibility to adjust the sensitivity/specificity of the segmentation result a posteriori according to application-specific requirements, through definition of a minimum vessel-probability required to classify a voxel as belonging to the vascular tree. The method has been evaluated on contrast-enhanced thoracic CT scans from clinical PE cases and demonstrates overall promising results. For quantitative validation we compare the segmentation results to randomly selected, semi-automatically segmented sub-volumes and present the resulting receiver operating characteristic (ROC) curves. Although we focus on contrast enhanced chest CT data, the method can be generalized to other regions of the body as well as to different imaging modalities.
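The fuzzy connectedness concept named above can be sketched compactly: each voxel receives the strength of its best path to a seed, where a path is only as strong as its weakest affinity link (a max-min criterion). The graph and affinity values below are hypothetical toy data, not the paper's CT affinities.

```python
import heapq

def fuzzy_connectedness(affinity, seeds):
    """Fuzzy connectedness by Dijkstra-style propagation: strength[v] is the
    maximum over paths from any seed of the minimum affinity along the path.
    `affinity` maps directed (node, neighbor) pairs to values in [0, 1]."""
    nodes = {n for edge in affinity for n in edge}
    strength = {n: 0.0 for n in nodes}
    heap = []
    for s in seeds:
        strength[s] = 1.0
        heapq.heappush(heap, (-1.0, s))
    while heap:
        neg, u = heapq.heappop(heap)
        cur = -neg
        if cur < strength[u]:
            continue  # stale entry
        for (a, b), k in affinity.items():
            if a == u:
                cand = min(cur, k)  # path strength = weakest link
                if cand > strength[b]:
                    strength[b] = cand
                    heapq.heappush(heap, (-cand, b))
    return strength

# Toy vessel graph: A is strongly linked to the seed; B's best path runs via A
aff = {("seed", "A"): 0.9, ("A", "B"): 0.3, ("seed", "B"): 0.2}
s = fuzzy_connectedness(aff, seeds=["seed"])
```

The resulting per-voxel strengths are exactly the "probability of belonging to the vascular tree" that the method thresholds a posteriori to trade sensitivity against specificity.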
Mapping of polycrystalline films of biological fluids utilizing the Jones-matrix formalism
NASA Astrophysics Data System (ADS)
Ushenko, Vladimir A.; Dubolazov, Alexander V.; Pidkamin, Leonid Y.; Sakchnovsky, Michael Yu; Bodnar, Anna B.; Ushenko, Yuriy A.; Ushenko, Alexander G.; Bykov, Alexander; Meglinski, Igor
2018-02-01
Utilizing a polarized light approach, we reconstruct the spatial distribution of birefringence and optical activity in polycrystalline films of biological fluids. The Jones-matrix formalism is used for an accessible quantitative description of these types of optical anisotropy. We demonstrate that differentiation of polycrystalline films of biological fluids can be performed based on a statistical analysis of the distribution of rotation angles and phase shifts associated with the optical activity and birefringence, respectively. Finally, practical operational characteristics, such as sensitivity, specificity and accuracy of the Jones-matrix reconstruction of optical anisotropy, were identified with special emphasis on biomedical application, specifically for differentiation of bile films taken from healthy donors and from patients with cholelithiasis.
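As a toy illustration of the Jones-matrix description of the two anisotropy types above (optical activity as a rotation, linear birefringence as a retarder), one can propagate a polarization state through their product; the angle and retardance below are hypothetical, not values from the bile-film measurements.

```python
import cmath
import math

def retarder(delta):
    """Jones matrix of a linear retarder, fast axis horizontal, phase shift delta."""
    return [[1, 0], [0, cmath.exp(1j * delta)]]

def rotator(theta):
    """Jones matrix of optical activity: rotation of the polarization plane by theta."""
    return [[math.cos(theta), -math.sin(theta)],
            [math.sin(theta), math.cos(theta)]]

def apply(m, v):
    """Apply a 2x2 Jones matrix to a Jones vector."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

# Horizontally polarized light through a hypothetical film patch that rotates
# the polarization by 30 degrees and then retards by a quarter wave
out = apply(retarder(math.pi / 2), apply(rotator(math.pi / 6), [1, 0]))
```

Fitting such rotation angles and phase shifts pixel by pixel is what yields the distributions whose statistics the paper uses to separate healthy and cholelithiasis samples.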
DOT National Transportation Integrated Search
1975-07-01
The volume presents the results of the quantitative analyses of the O'Hare ASTC System operations. The operations environments for the periods selected for detailed analysis of the ASDE films and controller communications recording are described. Fol...
Gagliardi, Filippo; Boari, Nicola; Roberti, Fabio; Caputy, Anthony J; Mortini, Pietro
2014-09-01
Comparative anatomical studies have proved to be invaluable in the evaluation of advantages and drawbacks of single approaches to access established target areas. Approach-related exposed areas do not necessarily represent useful areas when performing surgical manoeuvres. Accordingly, the concept of "operability" has recently been introduced as a qualitative assessment of the ability to execute surgical manoeuvres. The authors propose an innovative model for the quantitative assessment of operability, defined as the "operability score" (OS), which can be effectively and easily applied to comparative studies on surgical anatomy. A microanatomical study was conducted on six cadaveric heads. Morphometric measurements were collected and operability scores in selected target points of the surgical field were calculated. As an illustrative example, the operability score was applied to the extradural subtemporal transzygomatic approach (ESTZ). The operability score provides an effective grading system for surgical operability and instrument manipulation capability. It is a useful tool to evaluate, in a single approach, the areas that can be exposed, and to quantify how suitable those areas are for surgical manoeuvres. Copyright © 2014 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
78 FR 64202 - Quantitative Messaging Research
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-28
... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast- track OMB approval... comments. Please submit your comments using only one method and identify that it is for the ``Quantitative...
Zhang, Zhen; Shang, Haihong; Shi, Yuzhen; Huang, Long; Li, Junwen; Ge, Qun; Gong, Juwu; Liu, Aiying; Chen, Tingting; Wang, Dan; Wang, Yanling; Palanga, Koffi Kibalou; Muhammad, Jamshed; Li, Weijie; Lu, Quanwei; Deng, Xiaoying; Tan, Yunna; Song, Weiwu; Cai, Juan; Li, Pengtao; Rashid, Harun or; Gong, Wankui; Yuan, Youlu
2016-04-11
Upland Cotton (Gossypium hirsutum) is one of the most important crops worldwide; it provides natural high-quality fiber for industrial production and everyday use. Next-generation sequencing is a powerful method to identify single nucleotide polymorphism markers on a large scale for the construction of a high-density genetic map for quantitative trait loci mapping. In this research, a recombinant inbred lines population developed from two upland cotton cultivars 0-153 and sGK9708 was used to construct a high-density genetic map through the specific locus amplified fragment sequencing method. The high-density genetic map harbored 5521 single nucleotide polymorphism markers which covered a total distance of 3259.37 cM with an average marker interval of 0.78 cM without gaps larger than 10 cM. In total 18 quantitative trait loci of boll weight were identified as stable quantitative trait loci and were detected in at least three out of 11 environments and explained 4.15-16.70 % of the observed phenotypic variation. In total, 344 candidate genes were identified within the confidence intervals of these stable quantitative trait loci based on the cotton genome sequence. These genes were categorized based on their function through gene ontology analysis, Kyoto Encyclopedia of Genes and Genomes analysis and eukaryotic orthologous groups analysis. This research reported the first high-density genetic map for Upland Cotton (Gossypium hirsutum) with a recombinant inbred line population using single nucleotide polymorphism markers developed by specific locus amplified fragment sequencing. We also identified quantitative trait loci of boll weight across 11 environments and identified candidate genes within the quantitative trait loci confidence intervals. The results of this research would provide useful information for the next-step work including fine mapping, gene functional analysis, pyramiding breeding of functional genes as well as marker-assisted selection.
Katsoulidou, Antigoni; Petrodaskalaki, Maria; Sypsa, Vana; Papachristou, Eleni; Anastassopoulou, Cleo G; Gargalianos, Panagiotis; Karafoulidou, Anastasia; Lazanas, Marios; Kordossis, Theodoros; Andoniadou, Anastasia; Hatzakis, Angelos
2006-02-01
The COBAS TaqMan HIV-1 test (Roche Diagnostics) was compared with the LCx HIV RNA quantitative assay (Abbott Laboratories), the Versant HIV-1 RNA 3.0 (bDNA) assay (Bayer) and the COBAS Amplicor HIV-1 Monitor v1.5 test (Roche Diagnostics), using plasma samples of various viral load levels from HIV-1-infected individuals. In the comparison of TaqMan with LCx, TaqMan identified as positive 77.5% of the 240 samples versus 72.1% identified by LCx assay, while their overall agreement was 94.6% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.91). Similarly, in the comparison of TaqMan with bDNA 3.0, both methods identified 76.3% of the 177 samples as positive, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.95). Finally, in the comparison of TaqMan with Monitor v1.5, TaqMan identified 79.5% of the 156 samples as positive versus 80.1% identified by Monitor v1.5, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.96). In conclusion, the new COBAS TaqMan HIV-1 test showed excellent agreement with other widely used commercially available tests for the quantitation of HIV-1 viral load.
A method for operative quantitative interpretation of multispectral images of biological tissues
NASA Astrophysics Data System (ADS)
Lisenko, S. A.; Kugeiko, M. M.
2013-10-01
A method has been developed for the rapid, quantitative retrieval of spatial distributions of biophysical parameters of a biological tissue from its multispectral image. The method is based on multiple regressions between linearly independent components of the diffuse reflection spectrum of the tissue and the unknown parameters. The possibilities of the method are illustrated by an example of determining biophysical parameters of the skin (concentrations of melanin, hemoglobin and bilirubin, blood oxygenation, and the scattering coefficient of the tissue). Examples of quantitative interpretation of experimental data are presented.
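The core of the method is a regression from spectral components to tissue parameters. A minimal one-component sketch with hypothetical component scores and melanin values (the paper uses multiple regressions over several components):

```python
# Sketch of the regression idea (toy data): a tissue parameter is predicted
# by a linear model on a spectral component score, fit by ordinary least
# squares in closed form.

def fit_line(c, p):
    n = len(c)
    mc, mp = sum(c) / n, sum(p) / n
    w1 = (sum((x - mc) * (y - mp) for x, y in zip(c, p)) /
          sum((x - mc) ** 2 for x in c))
    return mp - w1 * mc, w1  # intercept, slope

scores = [0.1, 0.2, 0.3, 0.4]    # first spectral component per pixel
melanin = [1.2, 1.4, 1.6, 1.8]   # hypothetical melanin concentration
w0, w1 = fit_line(scores, melanin)
print(round(w0, 2), round(w1, 2))  # 1.0 2.0
```

Applied pixel by pixel, such a fitted model yields the spatial parameter maps the abstract describes.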
1979-10-01
GSS-k was assigned in May 1957 to the system of equipments as used in Operation Plumbbob. Quantitative measurements of the EM pulse have been made... quantitative data from the recordings of the SWR waveforms, it was necessary to record other information on the photographs. Figure 35, a typical...Capilla waveforms. The Heef's I'dne and other Gleeson triplet observations confirmed the ground wave positive half cycle and indicated a sharply
Darzins, Susan W; Imms, Christine; Stefano, Marilyn Di; Radia-George, Camilla A
2016-10-01
Evidence supports the validity of the Personal Care Participation Assessment and Resource Tool (PC-PART), but its clinical utility remains unverified. This study aimed to investigate occupational therapists' perceptions of the PC-PART's clinical utility for inpatient rehabilitation. Using mixed methods, occupational therapists who had used the PC-PART as part of a research study in an inpatient rehabilitation setting completed a questionnaire (n = 9) and participated in a focus group (n = 6) to explore their perspectives on its clinical utility. Quantitative data were summarized and qualitative data analyzed using inductive thematic analysis. Quantitative data highlighted both positive and negative aspects of the PC-PART's clinical utility. Five themes emerged from the qualitative data: nature of information gathered; familiarity with the instrument; perceived time and effort; item phrasing, interpretation, and presentation; and external influences on clinical use. The PC-PART was perceived to support gathering of clinically useful information, helpful for intervention and discharge planning. Recommendations for improving some item phrasing, operational definitions, and instructions were identified. Although standardized assessments were valued, their use in routine practice was challenging, requiring a knowledge translation strategy.
Deciphering principles of transcription regulation in eukaryotic genomes
Nguyen, Dat H; D'haeseleer, Patrik
2006-01-01
Transcription regulation has been responsible for organismal complexity and diversity in the course of biological evolution and adaptation, and it is determined largely by the context-dependent behavior of cis-regulatory elements (CREs). Therefore, understanding principles underlying CRE behavior in regulating transcription constitutes a fundamental objective of quantitative biology, yet these remain poorly understood. Here we present a deterministic mathematical strategy, the motif expression decomposition (MED) method, for deriving principles of transcription regulation at the single-gene resolution level. MED operates on all genes in a genome without requiring any a priori knowledge of gene cluster membership, or manual tuning of parameters. Applying MED to Saccharomyces cerevisiae transcriptional networks, we identified four functions describing four different ways that CREs can quantitatively affect gene expression levels. These functions, three of which have extrema in different positions in the gene promoter (short-, mid-, and long-range) whereas the other depends on the motif orientation, are validated by expression data. We illustrate how nature could use these principles as an additional dimension to amplify the combinatorial power of a small set of CREs in regulating transcription. PMID:16738557
Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.
2015-01-01
Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240
Quantitative research on critical thinking and predicting nursing students' NCLEX-RN performance.
Romeo, Elizabeth M
2010-07-01
The concept of critical thinking has been influential in several disciplines. Both education and nursing in general have been attempting to define, teach, and measure this concept for decades. Nurse educators realize that critical thinking is the cornerstone of the objectives and goals for nursing students. The purpose of this article is to review and analyze quantitative research findings relevant to the measurement of critical thinking abilities and skills in undergraduate nursing students and the usefulness of critical thinking as a predictor of National Council Licensure Examination-Registered Nurse (NCLEX-RN) performance. The specific issues that this integrative review examined include assessment and analysis of the theoretical and operational definitions of critical thinking, theoretical frameworks used to guide the studies, instruments used to evaluate critical thinking skills and abilities, and the role of critical thinking as a predictor of NCLEX-RN outcomes. A list of key assumptions related to critical thinking was formulated. The limitations and gaps in the literature were identified, as well as the types of future research needed in this arena. Copyright 2010, SLACK Incorporated.
Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L
2015-08-01
Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Dynamic Redox Regulation of IL-4 Signaling.
Dwivedi, Gaurav; Gran, Margaret A; Bagchi, Pritha; Kemp, Melissa L
2015-11-01
Quantifying the magnitude and dynamics of protein oxidation during cell signaling is technically challenging. Computational modeling provides tractable, quantitative methods to test hypotheses of redox mechanisms that may be simultaneously operative during signal transduction. The interleukin-4 (IL-4) pathway, which has previously been reported to induce reactive oxygen species and oxidation of PTP1B, may be controlled by several other putative mechanisms of redox regulation; widespread proteomic thiol oxidation observed via 2D redox differential gel electrophoresis upon IL-4 treatment suggests more than one redox-sensitive protein implicated in this pathway. Through computational modeling and a model selection strategy that relied on characteristic STAT6 phosphorylation dynamics of IL-4 signaling, we identified reversible protein tyrosine phosphatase (PTP) oxidation as the primary redox regulatory mechanism in the pathway. A systems-level model of IL-4 signaling was developed that integrates synchronous pan-PTP oxidation with ROS-independent mechanisms. The model quantitatively predicts the dynamics of IL-4 signaling over a broad range of new redox conditions, offers novel hypotheses about regulation of JAK/STAT signaling, and provides a framework for interrogating putative mechanisms involving receptor-initiated oxidation.
Dynamic Redox Regulation of IL-4 Signaling
Dwivedi, Gaurav; Gran, Margaret A.; Bagchi, Pritha; Kemp, Melissa L.
2015-01-01
Quantifying the magnitude and dynamics of protein oxidation during cell signaling is technically challenging. Computational modeling provides tractable, quantitative methods to test hypotheses of redox mechanisms that may be simultaneously operative during signal transduction. The interleukin-4 (IL-4) pathway, which has previously been reported to induce reactive oxygen species and oxidation of PTP1B, may be controlled by several other putative mechanisms of redox regulation; widespread proteomic thiol oxidation observed via 2D redox differential gel electrophoresis upon IL-4 treatment suggests more than one redox-sensitive protein implicated in this pathway. Through computational modeling and a model selection strategy that relied on characteristic STAT6 phosphorylation dynamics of IL-4 signaling, we identified reversible protein tyrosine phosphatase (PTP) oxidation as the primary redox regulatory mechanism in the pathway. A systems-level model of IL-4 signaling was developed that integrates synchronous pan-PTP oxidation with ROS-independent mechanisms. The model quantitatively predicts the dynamics of IL-4 signaling over a broad range of new redox conditions, offers novel hypotheses about regulation of JAK/STAT signaling, and provides a framework for interrogating putative mechanisms involving receptor-initiated oxidation. PMID:26562652
USDA-ARS?s Scientific Manuscript database
Obstructive sleep apnea (OSA) is a common heritable disorder displaying marked sexual dimorphism in disease prevalence and progression. Previous genetic association studies have identified a few genetic loci associated with OSA and related quantitative traits, but they have only focused on single et...
Faculty Grading of Quantitative Problems: A Mismatch between Values and Practice
ERIC Educational Resources Information Center
Petcovic, Heather L.; Fynewever, Herb; Henderson, Charles; Mutambuki, Jacinta M.; Barney, Jeffrey A.
2013-01-01
Grading practices can send a powerful message to students about course expectations. A study by Henderson et al. ("American Journal of Physics" 72:164-169, 2004) in physics education has identified a misalignment between what college instructors say they value and their actual scoring of quantitative student solutions. This work identified three…
Kim, Jae Yoon; Moon, Jun-Cheol; Kim, Hyo Chul; Shin, Seungho; Song, Kitae; Kim, Kyung-Hee; Lee, Byung-Moo
2017-01-01
Premise of the study: Positional cloning in combination with phenotyping is a general approach to identify disease-resistance gene candidates in plants; however, it requires several time-consuming steps including population or fine mapping. Therefore, in the present study, we suggest a new combined strategy to improve the identification of disease-resistance gene candidates. Methods and Results: Downy mildew (DM)–resistant maize was selected from five cultivars using a spreader row technique. Positional cloning and bioinformatics tools were used to identify the DM-resistance quantitative trait locus marker (bnlg1702) and 47 protein-coding gene annotations. Eventually, five DM-resistance gene candidates, including bZIP34, Bak1, and Ppr, were identified by quantitative reverse-transcription PCR (RT-PCR) without fine mapping of the bnlg1702 locus. Conclusions: The combined protocol with the spreader row technique, quantitative trait locus positional cloning, and quantitative RT-PCR was effective for identifying DM-resistance candidate genes. This cloning approach may be applied to other whole-genome-sequenced crops or resistance to other diseases. PMID:28224059
Hoon, Lim Siew; Hong-Gu, He; Mackey, Sandra
Paediatric pain management remains a challenge in clinical settings. Parents can contribute to effective and accurate pain assessment and management of their child. No systematic reviews regarding parental involvement in their child's post-operative pain management have been published. To determine the best available evidence regarding parental involvement in managing their children's post-operative pain in the hospital setting. The review considered studies that included parents of all ethnic groups with children aged between 6 and 12 years old who were hospitalised and had undergone surgery of any kind with post-operative surgical or incision site pain where care was provided in acute hospital settings. The phenomena of interest were the experiences of parents in managing their children's post-operative pain. A three-step search strategy was utilised in each component of this review. Major databases searched included MEDLINE, CINAHL, Scopus, ScienceDirect, the Cochrane Library and PubMed, as well as Google Scholar. The search included published studies and papers in English from 1990 to 2009. Each included study was assessed by two independent reviewers using the appropriate appraisal checklists developed by the Joanna Briggs Institute (JBI). Quantitative and qualitative data were extracted from the included papers using standardised data extraction tools from the JBI: the Meta-analysis Statistics Assessment and Review Instrument data extraction tool for descriptive/case series studies and the JBI-Qualitative Assessment and Review Instrument data extraction tool for interpretive and critical research. The five quantitative studies included in this review were not suitable for meta-analysis due to clinical and methodological heterogeneity, and therefore the findings are presented in narrative form. The two qualitative studies were drawn from the same study, so meta-synthesis was not possible; their results are likewise presented in narrative form.
Seven papers were included in this review. The evidence identified topics including: pharmacological and non-pharmacological interventions carried out by parents; the experience of concern, fear, helplessness, anxiety, depression, frustration and lack of support felt by parents during their child's hospitalisation; communication issues and knowledge deficits; need for information by parents to promote effective participation in managing their child's post-operative pain. This review revealed pharmacological and non-pharmacological interventions carried out by parents to alleviate their children's post-operative pain. Obstacles and promoting factors influencing parents' experiences as well as their needs in the process of caring were identified. Parents' roles in their child's surgical pain management should be clarified and their efforts acknowledged, which will encourage parents' active participation in their child's caring process. Nurses should provide guidance, education and support to parents. More studies are needed to examine parents' experiences in caring for their child, investigate the effectiveness of education and guidance provided to parents by the nurses and explore the influence of parents' cultural values and nurses' perceptions of parental participation in their child's care.
Operational Art and Intelligence: What is the Relationship?
1997-05-22
operational art began with the even application of military science and art. Science is knowledge, and art is the creative application of knowledge. In this...intelligence, like operational art, evenly applies science and art. Science, or raw data collection, is synonymous with the gathering of quantitative elements in
Exposure assessment of tetrafluoroethylene and ammonium perfluorooctanoate 1951-2002.
Sleeuwenhoek, Anne; Cherrie, John W
2012-03-01
To develop a method to reconstruct exposure to tetrafluoroethylene (TFE) and ammonium perfluorooctanoate (APFO) in plants producing polytetrafluoroethylene (PTFE) in the absence of suitable objective measurements. These data were used to inform an epidemiological study being carried out to investigate possible risks in workers employed in the manufacture of PTFE and to study trends in exposure over time. For each plant, detailed descriptions of all occupational titles, including tasks and changes over time, were obtained during semi-structured interviews with key plant personnel. A semi-quantitative assessment method was used to assess inhalation exposure to TFE and inhalation plus dermal exposure to APFO. Temporal trends in exposure to TFE and APFO were investigated. In each plant the highest exposures for both TFE and APFO occurred in the polymerisation area. Due to the introduction of control measures, increasing process automation and other improvements, exposures generally decreased over time. In the polymerisation area, the annual decline in exposure to TFE varied by plant from 3.8 to 5.7% and for APFO from 2.2 to 5.5%. A simple method for assessing exposure was developed which used detailed process information and job descriptions to estimate average annual TFE and APFO exposure on an arbitrary semi-quantitative scale. These semi-quantitative estimates are sufficient to identify relative differences in exposure for the epidemiological study and should good data become available, they could be used to provide quantitative estimates for all plants across the whole period of operation. This journal is © The Royal Society of Chemistry 2012
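Given a baseline score and a constant annual percentage decline, the semi-quantitative exposure estimates above form a geometric series. A sketch with hypothetical numbers (the arbitrary semi-quantitative scale is the paper's; the values are not):

```python
# Sketch: reconstructing a semi-quantitative exposure time series from a
# baseline score and a constant annual percentage decline (toy values).

def exposure_series(baseline, annual_decline_pct, years):
    factor = 1.0 - annual_decline_pct / 100.0
    return [round(baseline * factor ** t, 2) for t in range(years)]

# e.g. baseline score 10 on the arbitrary scale, 10% annual decline
print(exposure_series(10.0, 10.0, 4))  # [10.0, 9.0, 8.1, 7.29]
```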
Krishnamurthy, Krish
2013-12-01
The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion - thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled to several sub FIDs, and secondly, these sub FIDs are then modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics) or process study (reaction monitoring) or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented. Copyright © 2013 John Wiley & Sons, Ltd.
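CRAFT models each sub-FID as a sum of decaying sinusoids. A toy sketch of that signal model (not the Bayesian estimator itself, and with hypothetical parameters): when the frequency and decay rate of a component are known, its amplitude follows from a least-squares projection onto the corresponding basis function.

```python
import math

# Toy sketch of the "sum of decaying sinusoids" FID model: project the
# signal onto a known decaying-cosine basis to recover its amplitude.

def amplitude(signal, freq_hz, decay, dt):
    basis = [math.exp(-decay * k * dt) * math.cos(2 * math.pi * freq_hz * k * dt)
             for k in range(len(signal))]
    return sum(s * b for s, b in zip(signal, basis)) / sum(b * b for b in basis)

dt, freq_hz, decay, amp = 0.001, 50.0, 3.0, 2.5
fid = [amp * math.exp(-decay * k * dt) * math.cos(2 * math.pi * freq_hz * k * dt)
       for k in range(512)]
print(round(amplitude(fid, freq_hz, decay, dt), 3))  # 2.5
```

The recovered amplitudes, indexed by frequency, are what populate a frequency-amplitude table of the kind CRAFT produces.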
Han, Qing; Bradshaw, Elizabeth M; Nilsson, Björn; Hafler, David A; Love, J Christopher
2010-06-07
The large diversity of cells that comprise the human immune system requires methods that can resolve the individual contributions of specific subsets to an immunological response. Microengraving is a process that uses a dense, elastomeric array of microwells to generate microarrays of proteins secreted from large numbers of individual live cells (approximately 10^4-10^5 cells/assay). In this paper, we describe an approach based on this technology to quantify the rates of secretion from single immune cells. Numerical simulations of the microengraving process indicated an operating regime between 30 min and 4 h that permits quantitative analysis of the rates of secretion. Through experimental validation, we demonstrate that microengraving can provide quantitative measurements of both the frequencies and the distribution in rates of secretion for up to four cytokines simultaneously released from individual viable primary immune cells. The experimental limits of detection ranged from 0.5 to 4 molecules/s for IL-6, IL-17, IFN-γ, IL-2, and TNF-α. These multidimensional measures resolve the number and intensities of responses by cells exposed to stimuli with greater sensitivity than single-parameter assays for cytokine release. We show that cells from different donors exhibit distinct responses based on both the frequency and magnitude of cytokine secretion when stimulated under different activating conditions. Primary T cells with specific profiles of secretion can also be recovered after microengraving for subsequent expansion in vitro. These examples demonstrate the utility of quantitative, multidimensional profiles of single cells for analyzing the diversity and dynamics of immune responses in vitro and for identifying rare cells from clinical samples.
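A per-cell secretion rate in molecules/s follows from a captured-molecule count divided by the incubation time; converting a measured fluorescence signal to molecules requires a calibration. A toy sketch with a hypothetical calibration factor (not values from the paper):

```python
# Toy estimate of single-cell secretion rate from a microengraving-style
# capture assay: fluorescence -> molecule count via an assumed calibration,
# then divide by the incubation time.

def secretion_rate(fluorescence, molecules_per_au, incubation_s):
    return fluorescence * molecules_per_au / incubation_s

# e.g. 1 h incubation, hypothetical calibration of 90 molecules per unit
print(secretion_rate(80.0, 90.0, 3600.0))  # 2.0
```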
Code of Federal Regulations, 2010 CFR
2010-07-01
... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...
Code of Federal Regulations, 2013 CFR
2013-07-01
... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...
Code of Federal Regulations, 2014 CFR
2014-07-01
... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...
Code of Federal Regulations, 2011 CFR
2011-07-01
... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...
Code of Federal Regulations, 2012 CFR
2012-07-01
... appropriate for a quantitative survey and include consideration of the methods used in other studies performed... basis for any assumptions and quantitative estimates. If you plan to use an entrainment survival rate...
Exploring the mammalian sensory space: co-operations and trade-offs among senses.
Nummela, Sirpa; Pihlström, Henry; Puolamäki, Kai; Fortelius, Mikael; Hemilä, Simo; Reuter, Tom
2013-12-01
The evolution of a particular sensory organ is often discussed with no consideration of the roles played by other senses. Here, we treat mammalian vision, olfaction and hearing as an interconnected whole, a three-dimensional sensory space, evolving in response to ecological challenges. Until now, there has been no quantitative method for estimating how much a particular animal invests in its different senses. We propose an anatomical measure based on sensory organ sizes. Dimensions of functional importance are defined and measured, and normalized in relation to animal mass. For 119 taxonomically and ecologically diverse species, we can define the position of the species in a three-dimensional sensory space. Thus, we can ask questions related to possible trade-off vs. co-operation among senses. More generally, our method allows morphologists to identify sensory organ combinations that are characteristic of particular ecological niches. After normalization for animal size, we note that arboreal mammals tend to have larger eyes and smaller noses than terrestrial mammals. On the other hand, we observe a strong correlation between eyes and ears, indicating that co-operation between vision and hearing is a general mammalian feature. For some groups of mammals we note a correlation, and possible co-operation between olfaction and whiskers.
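The proposed measure normalizes each organ dimension against animal mass to place a species in the three-dimensional sensory space. A sketch using an isometric length scaling (mass^(1/3)) with hypothetical measurements; the paper's exact normalization may differ:

```python
# Sketch: coordinates of a species in a vision/hearing/olfaction "sensory
# space", with organ dimensions normalized by a mass-derived length scale.
# All values are hypothetical.

def sensory_coordinates(mass_kg, eye_mm, ear_mm, nose_mm):
    scale = mass_kg ** (1.0 / 3.0)
    return tuple(round(d / scale, 2) for d in (eye_mm, ear_mm, nose_mm))

print(sensory_coordinates(8.0, 20.0, 14.0, 30.0))  # (10.0, 7.0, 15.0)
```

Correlations between such coordinates across species are what reveal the trade-offs and co-operations the abstract discusses.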
Flight Test of a Head-Worn Display as an Equivalent-HUD for Terminal Operations
NASA Technical Reports Server (NTRS)
Shelton, K. J.; Arthur, J. J., III; Prinzel, L. J., III; Nicholas, S. N.; Williams, S. P.; Bailey, R. E.
2015-01-01
Research, development, test, and evaluation of flight deck interface technologies is being conducted by NASA to proactively identify, develop, and mature tools, methods, and technologies for improving overall aircraft safety of new and legacy vehicles operating in the Next Generation Air Transportation System (NextGen). Under NASA's Aviation Safety Program, one specific area of research is the use of small Head-Worn Displays (HWDs) as a potential equivalent display to a Head-up Display (HUD). Title 14 of the US CFR 91.175 describes a possible operational credit which can be obtained with airplane equipage of a HUD or an "equivalent" display combined with Enhanced Vision (EV). A successful HWD implementation may provide the same safety and operational benefits as current HUD-equipped aircraft but for significantly more aircraft in which HUD installation is neither practical nor possible. A flight test was conducted to evaluate if the HWD, coupled with a head-tracker, can provide an equivalent display to a HUD. Approach and taxi testing was performed on-board NASA's experimental King Air aircraft in various visual conditions. Preliminary quantitative results indicate the HWD tested provided equivalent HUD performance, however operational issues were uncovered. The HWD showed significant potential as all of the pilots liked the increased situation awareness attributable to the HWD's unique capability of unlimited field-of-regard.
Assessment of environments for Mars Science Laboratory entry, descent, and surface operations
Vasavada, Ashwin R.; Chen, Allen; Barnes, Jeffrey R.; Burkhart, P. Daniel; Cantor, Bruce A.; Dwyer-Cianciolo, Alicia M.; Fergason, Robini L.; Hinson, David P.; Justh, Hilary L.; Kass, David M.; Lewis, Stephen R.; Mischna, Michael A.; Murphy, James R.; Rafkin, Scot C.R.; Tyler, Daniel; Withers, Paul G.
2012-01-01
The Mars Science Laboratory mission aims to land a car-sized rover on Mars' surface and operate it for at least one Mars year in order to assess whether its field area was ever capable of supporting microbial life. Here we describe the approach used to identify, characterize, and assess environmental risks to the landing and rover surface operations. Novel entry, descent, and landing approaches will be used to accurately deliver the 900-kg rover, including the ability to sense and "fly out" deviations from a best-estimate atmospheric state. A joint engineering and science team developed methods to estimate the range of potential atmospheric states at the time of arrival and to quantitatively assess the spacecraft's performance and risk given its particular sensitivities to atmospheric conditions. Numerical models are used to calculate the atmospheric parameters, with observations used to define model cases, tune model parameters, and validate results. This joint program has resulted in a spacecraft capable of accessing, with minimal risk, the four finalist sites chosen for their scientific merit. The capability to operate the landed rover over the latitude range of candidate landing sites, and for all seasons, was verified against an analysis of surface environmental conditions described here. These results, from orbital and model data sets, also drive engineering simulations of the rover's thermal state that are used to plan surface operations.
Haggerty, Christopher M.; de Zélicourt, Diane A.; Restrepo, Maria; Rossignac, Jarek; Spray, Thomas L.; Kanter, Kirk R.; Fogel, Mark A.; Yoganathan, Ajit P.
2012-01-01
Background: Virtual modeling of cardiothoracic surgery is a new paradigm that allows for systematic exploration of various operative strategies and uses engineering principles to predict the optimal patient-specific plan. This study investigates the predictive accuracy of such methods for the surgical palliation of single ventricle heart defects. Methods: Computational fluid dynamics (CFD)-based surgical planning was used to model the Fontan procedure for four patients prior to surgery. The objective for each was to identify the operative strategy that best distributed hepatic blood flow to the pulmonary arteries. Post-operative magnetic resonance data were acquired to compare (via CFD) the post-operative hemodynamics with predictions. Results: Despite variations in physiologic boundary conditions (e.g., cardiac output, venous flows) and the exact geometry of the surgical baffle, sufficient agreement was observed with respect to hepatic flow distribution (90% confidence interval: 14 ± 4.3% difference). There was also good agreement of flow-normalized energetic efficiency predictions (19 ± 4.8% error). Conclusions: The hemodynamic outcomes of prospective patient-specific surgical planning of the Fontan procedure are described for the first time, with good quantitative comparisons between preoperative predictions and postoperative simulations. These results demonstrate that surgical planning can be a useful tool for single ventricle cardiothoracic surgery with the ability to deliver significant clinical impact. PMID:22777126
Flight test of a head-worn display as an equivalent-HUD for terminal operations
NASA Astrophysics Data System (ADS)
Shelton, K. J.; Arthur, J. J.; Prinzel, L. J.; Nicholas, S. N.; Williams, S. P.; Bailey, R. E.
2015-05-01
Research, development, test, and evaluation of flight deck interface technologies is being conducted by NASA to proactively identify, develop, and mature tools, methods, and technologies for improving overall aircraft safety of new and legacy vehicles operating in the Next Generation Air Transportation System (NextGen). Under NASA's Aviation Safety Program, one specific area of research is the use of small Head-Worn Displays (HWDs) as a potential equivalent display to a Head-up Display (HUD). Title 14 of the US CFR 91.175 describes a possible operational credit which can be obtained with airplane equipage of a HUD or an "equivalent" display combined with Enhanced Vision (EV). A successful HWD implementation may provide the same safety and operational benefits as current HUD-equipped aircraft but for significantly more aircraft in which HUD installation is neither practical nor possible. A flight test was conducted to evaluate if the HWD, coupled with a head-tracker, can provide an equivalent display to a HUD. Approach and taxi testing was performed on-board NASA's experimental King Air aircraft in various visual conditions. Preliminary quantitative results indicate the HWD tested provided equivalent HUD performance, however operational issues were uncovered. The HWD showed significant potential as all of the pilots liked the increased situation awareness attributable to the HWD's unique capability of unlimited field-of-regard.
Resource theory of non-Gaussian operations
NASA Astrophysics Data System (ADS)
Zhuang, Quntao; Shor, Peter W.; Shapiro, Jeffrey H.
2018-05-01
Non-Gaussian states and operations are crucial for various continuous-variable quantum information processing tasks. To quantitatively understand non-Gaussianity beyond states, we establish a resource theory for non-Gaussian operations. In our framework, we consider Gaussian operations as free operations, and non-Gaussian operations as resources. We define entanglement-assisted non-Gaussianity generating power and show that it is a monotone that is nonincreasing under the set of free superoperations, i.e., concatenation and tensoring with Gaussian channels. For conditional unitary maps, this monotone can be analytically calculated. As examples, we show that the non-Gaussianity of ideal photon-number subtraction and photon-number addition equal the non-Gaussianity of the single-photon Fock state. Based on our non-Gaussianity monotone, we divide non-Gaussian operations into two classes: (i) the finite non-Gaussianity class, e.g., photon-number subtraction, photon-number addition, and all Gaussian-dilatable non-Gaussian channels; and (ii) the diverging non-Gaussianity class, e.g., the binary phase-shift channel and the Kerr nonlinearity. This classification also implies that not all non-Gaussian channels are exactly Gaussian dilatable. Our resource theory enables a quantitative characterization and a first classification of non-Gaussian operations, paving the way towards the full understanding of non-Gaussianity.
Zikmund, T; Kvasnica, L; Týč, M; Křížová, A; Colláková, J; Chmelík, R
2014-11-01
Transmitted light holographic microscopy is particularly used for quantitative phase imaging of transparent microscopic objects such as living cells. The study of the cell is based on extraction of the dynamic data on cell behaviour from the time-lapse sequence of the phase images. However, the phase images are affected by the phase aberrations that make the analysis particularly difficult. This is because the phase deformation is prone to change during long-term experiments. Here, we present a novel algorithm for sequential processing of phase images of living cells in a time-lapse sequence. The algorithm compensates for the deformation of a phase image using weighted least-squares surface fitting. Moreover, it identifies and segments the individual cells in the phase image. All these procedures are performed automatically and applied immediately after obtaining every single phase image. This property of the algorithm is important for real-time cell quantitative phase imaging and instantaneous control of the course of the experiment by playback of the recorded sequence up to the actual time. Such operator's intervention is a forerunner of process automation derived from image analysis. The efficiency of the proposed algorithm is demonstrated on images of rat fibrosarcoma cells using an off-axis holographic microscope. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
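The compensation step described here, weighted least-squares fitting of a polynomial surface to the phase background, can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' algorithm; the image size, polynomial order, and mask threshold are all invented.

```python
import numpy as np

def flatten_phase(phase, weights, order=2):
    """Subtract a weighted least-squares polynomial surface from a
    phase image to compensate slowly varying phase aberrations.
    Background (non-cell) pixels should carry the weight."""
    h, w = phase.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x = xx.ravel() / w
    y = yy.ravel() / h
    # Design matrix of 2-D monomials x**i * y**j up to the given order
    cols = [x**i * y**j for i in range(order + 1)
            for j in range(order + 1 - i)]
    A = np.stack(cols, axis=1)
    sw = np.sqrt(weights.ravel())[:, None]
    coef, *_ = np.linalg.lstsq(A * sw, phase.ravel() * sw.ravel(), rcond=None)
    return phase - (A @ coef).reshape(h, w)

# Synthetic frame: tilted aberration plane plus one "cell" bump
h = w = 64
yy, xx = np.mgrid[0:h, 0:w]
aberration = 0.03 * xx + 0.01 * yy
cell = np.exp(-((xx - 32.0)**2 + (yy - 32.0)**2) / 50.0)
img = aberration + cell
mask = (cell < 0.01).astype(float)   # trust only non-cell pixels
flat = flatten_phase(img, mask)      # bump survives, tilt is removed
```

In a sequential pipeline, the same fit would be re-run on each incoming frame, since the abstract notes the aberration drifts over long experiments.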
Velasco-Tapia, Fernando
2014-01-01
Magmatic processes have usually been identified and evaluated using qualitative or semiquantitative geochemical or isotopic tools based on a restricted number of variables. However, a more complete and quantitative view could be reached applying multivariate analysis, mass balance techniques, and statistical tests. As an example, in this work a statistical and quantitative scheme is applied to analyze the geochemical features of the Sierra de las Cruces (SC) volcanic range (Mexican Volcanic Belt). In this locality, the volcanic activity (3.7 to 0.5 Ma) was dominantly dacitic, but the presence of spheroidal andesitic enclaves and/or diverse disequilibrium features in the majority of lavas confirms the operation of magma mixing/mingling. New discriminant-function-based multidimensional diagrams were used to discriminate tectonic setting. Statistical tests of discordancy and significance were applied to evaluate the influence of the subducting Cocos plate, which seems to be rather negligible for the SC magmas in relation to several major and trace elements. A cluster analysis following Ward's linkage rule was carried out to classify the SC volcanic rocks into geochemical groups. Finally, two mass-balance schemes were applied for the quantitative evaluation of the proportion of the end-member components (dacitic and andesitic magmas) in the comingled lavas (binary mixtures). PMID:24737994
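Ward's linkage rule, used above to group the SC lavas, merges at each step the pair of clusters whose union least increases the within-cluster sum of squares. A self-contained sketch on invented two-oxide compositions (not the paper's data):

```python
def ward_cluster(points, k):
    """Tiny agglomerative clustering with Ward's linkage rule:
    repeatedly merge the cluster pair whose union gives the smallest
    increase in within-cluster sum of squares."""
    # each cluster: (member indices, centroid)
    clusters = [([i], list(p)) for i, p in enumerate(points)]

    def merge_cost(a, b):
        na, nb = len(a[0]), len(b[0])
        d2 = sum((x - y) ** 2 for x, y in zip(a[1], b[1]))
        return na * nb / (na + nb) * d2   # Ward's SSE increase

    while len(clusters) > k:
        best = min(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: merge_cost(clusters[ij[0]], clusters[ij[1]]))
        a, b = clusters[best[0]], clusters[best[1]]
        na, nb = len(a[0]), len(b[0])
        centroid = [(na * x + nb * y) / (na + nb) for x, y in zip(a[1], b[1])]
        clusters = [c for idx, c in enumerate(clusters)
                    if idx not in best] + [(a[0] + b[0], centroid)]
    return [sorted(c[0]) for c in clusters]

# Hypothetical end-member compositions (e.g. SiO2, MgO wt%) plus neighbors
samples = [(63.0, 2.1), (64.5, 1.9), (65.0, 2.0),   # "dacitic" group
           (57.0, 4.8), (56.2, 5.1), (57.5, 5.0)]   # "andesitic" group
groups = sorted(ward_cluster(samples, 2))
```

Real work would use scipy.cluster.hierarchy on standardized multi-element data; the point here is only the merge criterion.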
Protecting solid-state spins from a strongly coupled environment
NASA Astrophysics Data System (ADS)
Chen, Mo; Calvin Sun, Won Kyu; Saha, Kasturi; Jaskula, Jean-Christophe; Cappellaro, Paola
2018-06-01
Quantum memories are critical for solid-state quantum computing devices and a good quantum memory requires both long storage time and fast read/write operations. A promising system is the nitrogen-vacancy (NV) center in diamond, where the NV electronic spin serves as the computing qubit and a nearby nuclear spin as the memory qubit. Previous works used remote, weakly coupled 13C nuclear spins, trading read/write speed for long storage time. Here we focus instead on the intrinsic strongly coupled 14N nuclear spin. We first quantitatively understand its decoherence mechanism, identifying as its source the electronic spin that acts as a quantum fluctuator. We then propose a scheme to protect the quantum memory from the fluctuating noise by applying dynamical decoupling on the environment itself. We demonstrate a factor of 3 enhancement of the storage time in a proof-of-principle experiment, showing the potential for a quantum memory that combines fast operation with long coherence time.
Broadband frequency and angular response of a sinusoidal bull’s eye antenna
NASA Astrophysics Data System (ADS)
Beaskoetxea, U.; Navarro-Cía, M.; Beruete, M.
2016-07-01
A thorough experimental study of the frequency and beaming angle response of a metallic leaky-wave bull’s eye antenna working at 77 GHz with a sinusoidally corrugated profile is presented. The beam scanning property of these antennas as frequency is varied is experimentally demonstrated and corroborated through theoretical and numerical results. From the experimental results the dispersion diagram of the n = -1 and n = -2 space harmonics is extracted, and the operation at different frequency regimes is identified and discussed. In order to show the contribution of each half of the antenna, numerical examples of the near-field behavior are also displayed. Overall, experimental results are in good qualitative and quantitative agreement with theoretical and numerical calculations. Finally, an analysis of the beamwidth as a function of frequency is performed, showing that it can achieve values below 1.5° in a fractional bandwidth of 4% around the operation frequency, which is an interesting frequency-stable broadside radiation.
Schlatter, Rosane Paixão; Matte, Ursula; Polanczyk, Carisi Anne; Koehler-Santos, Patrícia; Ashton-Prolla, Patricia
2015-01-01
This study identifies and describes the operating costs associated with the molecular diagnosis of diseases, such as hereditary cancer. To approximate the costs associated with these tests, data informed by Standard Operating Procedures for various techniques was collected from hospital software and a survey of market prices. Costs were established for four scenarios of capacity utilization to represent the possibility of suboptimal use in research laboratories. Cost description was based on a single site. The results show that only one technique was not impacted by rising costs due to underutilized capacity. Several common techniques were considerably more expensive at 30% capacity, including polymerase chain reaction (180%), microsatellite instability analysis (181%), gene rearrangement analysis by multiplex ligation probe amplification (412%), non-labeled sequencing (173%), and quantitation of nucleic acids (169%). These findings should be relevant for the definition of public policies and suggest that investment of public funds in the establishment of centralized diagnostic research centers would reduce costs to the Public Health System. PMID:26500437
Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting
2014-01-01
This study compared the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative indicators of CEUS for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The CEUS qualitative indicator-generated regression equation contained three indicators, namely enhanced homogeneity, diameter line expansion and peak intensity grading, which demonstrated prediction accuracy for benign and malignant breast tumor lumps of 91.8%; the quantitative indicator-generated regression equation only contained one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for qualitative and quantitative analyses were 91.3% and 75.7%, respectively, which exhibited a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than that of quantitative analysis.
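The ROC comparison above reduces to computing each model's area under the curve, which equals the Mann-Whitney probability that a random malignant case scores above a random benign one. A minimal sketch with invented scores standing in for the qualitative and quantitative regression outputs:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count half)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented predicted probabilities for 4 malignant (1) and 4 benign (0) lumps
labels = [1, 1, 1, 1, 0, 0, 0, 0]
qual  = [0.90, 0.85, 0.80, 0.50, 0.30, 0.20, 0.40, 0.55]  # "qualitative" model
quant = [0.70, 0.40, 0.60, 0.30, 0.50, 0.20, 0.65, 0.45]  # "quantitative" model
```

Comparing two correlated AUCs formally (the paper's Z test) additionally needs a variance estimate, e.g. the DeLong method; this sketch covers only the AUC itself.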
Music and communication in the operating theatre.
Weldon, Sharon-Marie; Korkiakangas, Terhi; Bezemer, Jeff; Kneebone, Roger
2015-12-01
To observe the extent and the detail with which playing music can impact on communication in the operating theatre. According to the cited sources, music is played in 53-72% of surgical operations performed. Noise levels in the operating theatre already exceed World Health Organisation recommendations. There is currently a divide in opinions on the playing of music in operating theatres, with few studies conducted and no policies or guidance provided. An ethnographic observational study of teamwork in operating theatres through video recordings. Quantitative and qualitative data analysis approaches were used. This study was conducted between 2012-2013 in the UK. Video recordings of 20 operations over six months in two operating theatres were captured. The recordings were divided into music and non-music playing cases. Each case was logged using a request/response sequence identified through interactional analysis. Statistical analysis, using a χ² test, explored the difference between the proportion of request repetitions and whether music was playing or not. Further interactional analysis was conducted for each request repetition. Request/response observations (N = 5203) were documented. A chi-square test revealed that repeated requests were five times more likely to occur in cases that played music than those that did not. A repeated request can add 4-68 seconds each to operation time and increase tension due to frustration at ineffective communication. Music played in the operating theatre can interfere with team communication, yet is seldom recognized as a potential safety hazard. Decisions around whether music is played, and around the choice of music and its volume, are determined largely by surgeons. Frank discussions between clinicians, managers, patients and governing bodies should be encouraged for recommendations and guidance to be developed. © 2015 John Wiley & Sons Ltd.
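The χ² comparison of request repetitions with and without music can be sketched on a 2×2 contingency table. The counts below are invented; only the total of 5203 request/response pairs comes from the abstract.

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row = [a + b, c + d]
    col = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row[i] * col[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = music / no music,
# columns = request repeated / not repeated (total 5203 as in the study)
table = [[120, 2500],
         [25, 2558]]
stat = chi_square_2x2(table)
significant = stat > 3.84   # chi-square critical value, df=1, alpha=0.05
```

With these invented counts the repetition rate under music (120/2620) is roughly five times the no-music rate (25/2583), mirroring the reported effect direction.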
Quantifying and managing uncertainty in operational modal analysis
NASA Astrophysics Data System (ADS)
Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.
2018-03-01
Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming a common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state of the art, which is mostly concerned with identification algorithms, this work obtains closed-form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explain the effect of test configuration. Collectively referred to as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data, and remain applicable in typical non-asymptotic situations. They provide a scientific basis for planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.
Fehr, M
2014-09-01
Business opportunities in the household waste sector in emerging economies still evolve around the activities of bulk collection and tipping with an open material balance. This research, conducted in Brazil, pursued the objective of shifting opportunities from tipping to reverse logistics in order to close the balance. To do this, it illustrated how specific knowledge of sorted waste composition and reverse logistics operations can be used to determine realistic temporal and quantitative landfill diversion targets in an emerging economy context. Experimentation constructed and confirmed the recycling trilogy that consists of source separation, collection infrastructure and reverse logistics. The study on source separation demonstrated the vital difference between raw and sorted waste compositions. Raw waste contained 70% biodegradable and 30% inert matter. Source separation produced 47% biodegradable, 20% inert and 33% mixed material. The study on collection infrastructure developed the necessary receiving facilities. The study on reverse logistics identified private operators capable of collecting and processing all separated inert items. Recycling activities for biodegradable material were scarce and erratic. Only farmers would take the material as animal feed. No composting initiatives existed. The management challenge was identified as stimulating these activities in order to complete the trilogy and divert the 47% source-separated biodegradable discards from the landfills. © The Author(s) 2014.
Keeping rail on track: preliminary findings on safety culture in Australian rail.
Blewett, Verna; Rainbird, Sophia; Dorrian, Jill; Paterson, Jessica; Cattani, Marcus
2012-01-01
'Safety culture' is identified in the literature as a critical element of healthy and safe workplaces. How can rail organizations ensure that consistently effective work health and safety cultures are maintained across the diversity of their operations? This paper reports on research that is currently underway in the Australian rail industry aimed at producing a Model of Best Practice in Safety Culture for the industry. Located in rail organizations dedicated to the mining industry as well as urban rail and national freight operations, the research examines the constructs of organizational culture that impact on the development and maintenance of healthy and safe workplaces. The research uses a multi-method approach incorporating quantitative (survey) and qualitative (focus groups, interviews and document analysis) methods along with a participative process to identify interventions to improve the organization and develop plans for their implementation. The research uses as its analytical framework the 10 Platinum Rules, from the findings of earlier research in the New South Wales (Australia) mining industry, Digging Deeper. Data collection is underway at the time of writing and preliminary findings are presented at this stage. The research method may be adapted for use as a form of organizational review of safety and health in organizational culture.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, Myra L.; LaFleur, Chris Bensdotter; Muna, Alice Baca
Safety standards development for maintenance facilities of liquid and compressed natural gas fueled vehicles is required to ensure proper facility design and operating procedures. Standard development organizations are utilizing risk-informed concepts to develop natural gas vehicle (NGV) codes and standards so that maintenance facilities meet acceptable risk levels. The present report summarizes Phase II work for existing NGV repair facility code requirements and highlights inconsistencies that need quantitative analysis into their effectiveness. A Hazard and Operability (HAZOP) study was performed to identify key scenarios of interest using risk ranking. Detailed simulations and modeling were performed to estimate the location and behavior of natural gas releases based on these scenarios. Specific code conflicts were identified, and ineffective code requirements were highlighted and resolutions proposed. These include ventilation rate basis on area or volume, as well as a ceiling offset which seems ineffective at protecting against flammable gas concentrations. ACKNOWLEDGEMENTS The authors gratefully acknowledge Bill Houf (SNL -- Retired) for his assistance with the set-up and post-processing of the numerical simulations. The authors also acknowledge Doug Horne (retired) for his helpful discussions. We would also like to acknowledge the support from the Clean Cities program of DOE's Vehicle Technology Office.
Miri, Andrew; Daie, Kayvon; Burdine, Rebecca D.; Aksay, Emre
2011-01-01
The advent of methods for optical imaging of large-scale neural activity at cellular resolution in behaving animals presents the problem of identifying behavior-encoding cells within the resulting image time series. Rapid and precise identification of cells with particular neural encoding would facilitate targeted activity measurements and perturbations useful in characterizing the operating principles of neural circuits. Here we report a regression-based approach for semiautomatically identifying neurons, based on the correlation of fluorescence time series with quantitative measurements of behavior. The approach is illustrated with a novel preparation allowing synchronous eye tracking and two-photon laser scanning fluorescence imaging of calcium changes in populations of hindbrain neurons during spontaneous eye movement in the larval zebrafish. Putative velocity-to-position oculomotor integrator neurons were identified that showed a broad spatial distribution and diversity of encoding. Optical identification of integrator neurons was confirmed with targeted loose-patch electrical recording and laser ablation. The general regression-based approach we demonstrate should be widely applicable to calcium imaging time series in behaving animals. PMID:21084686
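The identification step, correlating each cell's fluorescence trace with a behavioral regressor, might look like the following sketch on synthetic data; the signal model, correlation threshold, and cell count are assumptions, not the authors' parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
eye_pos = np.cumsum(rng.standard_normal(T))   # stand-in behavioral trace

# Synthetic fluorescence: cells 0 and 3 track eye position, the rest are noise
n_cells = 5
F = rng.standard_normal((n_cells, T))
F[0] += 0.8 * eye_pos
F[3] += -0.6 * eye_pos

def encoding_cells(F, regressor, r_thresh=0.5):
    """Rank cells by |Pearson r| between each fluorescence trace and a
    behavioral regressor; return indices exceeding the threshold."""
    z = (regressor - regressor.mean()) / regressor.std()
    Fz = (F - F.mean(axis=1, keepdims=True)) / F.std(axis=1, keepdims=True)
    r = Fz @ z / len(z)                      # per-cell correlation
    return sorted(int(i) for i in np.where(np.abs(r) > r_thresh)[0])

hits = encoding_cells(F, eye_pos)
```

A fuller regression approach would also include lagged and calcium-kernel-convolved regressors; plain correlation is the simplest member of that family.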
Brossard-Racine, Marie; Mazer, Barbara; Julien, Marilyse; Majnemer, Annette
2012-01-01
In this study we sought to validate the discriminant ability of the Evaluation Tool of Children's Handwriting-Manuscript in identifying children in Grades 2-3 with handwriting difficulties and to determine the percentage of change in handwriting scores that is consistently detected by occupational therapists. Thirty-four therapists judged and compared 35 pairs of handwriting samples. Receiver operating characteristic (ROC) analyses were performed to determine (1) the optimal cutoff values for word and letter legibility scores that identify children with handwriting difficulties who should be seen in rehabilitation and (2) the minimal clinically important difference (MCID) in handwriting scores. Cutoff scores of 75.0% for total word legibility and 76.0% for total letter legibility were found to provide excellent levels of accuracy. A difference of 10.0%-12.5% for total word legibility and 6.0%-7.0% for total letter legibility were found as the MCID. Study findings enable therapists to quantitatively support clinical judgment when evaluating handwriting. Copyright © 2012 by the American Occupational Therapy Association, Inc.
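Optimal cutoff selection from a ROC analysis is commonly done by maximizing Youden's J = sensitivity + specificity − 1. A minimal sketch with invented legibility scores and judgments; note the paper's actual cutoffs were 75.0% (word) and 76.0% (letter).

```python
def best_cutoff(scores, impaired):
    """Choose the legibility cutoff maximizing Youden's J
    (decision rule: score <= cutoff -> refer to rehabilitation)."""
    best = (None, -1.0)
    for c in sorted(set(scores)):
        tp = sum(s <= c and y for s, y in zip(scores, impaired))
        fn = sum(s > c and y for s, y in zip(scores, impaired))
        tn = sum(s > c and not y for s, y in zip(scores, impaired))
        fp = sum(s <= c and not y for s, y in zip(scores, impaired))
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best[1]:
            best = (c, j)
    return best

# Hypothetical word-legibility scores (%) with clinician judgments
# (1 = handwriting difficulty, 0 = typical)
scores   = [60, 68, 72, 74, 76, 80, 85, 90, 95]
impaired = [1,  1,  1,  1,  0,  0,  0,  0,  0]
cutoff, j = best_cutoff(scores, impaired)
```

With perfectly separated synthetic data the optimum is trivial (J = 1); real score distributions overlap, which is why the study also reports the minimal clinically important difference.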
30 CFR 735.18 - Grant application procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Column 5A of Forms OSM-51A and OSM-51B which reports the quantitative Program Management information of... of Form OSM-51C which reports the quantitative Program Management information of the Small Operator... use the application forms and procedures specified by OSM in accordance with Office of Management and...
ERIC Educational Resources Information Center
Caglayan, Günhan
2013-01-01
This study is about prospective secondary mathematics teachers' understanding and sense making of representational quantities generated by algebra tiles, the quantitative units (linear vs. areal) inherent in the nature of these quantities, and the quantitative addition and multiplication operations--referent preserving versus referent…
Knapik, Joseph; Steelman, Ryan
2016-11-01
To identify and analyze articles in which the authors examined risk factors for soldiers during military static-line airborne operations. We searched for articles in PubMed, the Defense Technical Information Center, reference lists, and other sources using the key words airborne, parachuting, parachutes, paratrooper, injuries, wounds, trauma, and musculoskeletal. The search identified 17 684 potential studies. Studies were included if they were written in English, involved military static-line parachute operations, recorded injuries directly from events on the landing zone or from safety or medical records, and provided data for quantitative assessment of injury risk factors. A total of 23 studies met the review criteria, and 15 were included in the meta-analysis. The summary statistic obtained for each risk factor was the risk ratio, which was the ratio of the injury risk in 1 group to that of another (baseline) group. Where data were sufficient, meta-analyses were performed and heterogeneity and publication bias were assessed. Risk factors for static-line parachuting injuries included night jumps, jumps with extra equipment, higher wind speeds, higher air temperatures, jumps from fixed-wing aircraft rather than balloons or helicopters, jumps onto certain types of terrain, being a female paratrooper, greater body weight, not using the parachute ankle brace, smaller parachute canopies, simultaneous exits from both sides of an aircraft, higher heat index, winds from the rear of the aircraft on exit, entanglements, less experience with a particular parachute system, being an enlisted soldier rather than an officer, and jumps involving a greater number of paratroopers. We analyzed and summarized factors that increased the injury risk for soldiers during military static-line parachute operations. Understanding and considering these factors in risk evaluations may reduce the likelihood of injury during parachuting.
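Risk ratios from several studies are typically pooled by fixed-effect inverse-variance weighting of log risk ratios. A sketch with invented counts for a single risk factor (e.g. night jumps); these are not the review's data.

```python
import math

def pooled_risk_ratio(studies):
    """Fixed-effect inverse-variance pooling of log risk ratios.
    Each study: (events_exposed, n_exposed, events_control, n_control)."""
    num = den = 0.0
    for a, n1, c, n0 in studies:
        rr = (a / n1) / (c / n0)
        var = 1 / a - 1 / n1 + 1 / c - 1 / n0   # variance of log RR
        w = 1 / var
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se = math.sqrt(1 / den)
    return math.exp(log_rr), (math.exp(log_rr - 1.96 * se),
                              math.exp(log_rr + 1.96 * se))

# Hypothetical injury counts from three studies (exposed vs control jumps)
studies = [(30, 1000, 15, 1000),
           (45, 2000, 20, 2000),
           (12, 500, 7, 500)]
rr, ci = pooled_risk_ratio(studies)
```

A random-effects model (e.g. DerSimonian-Laird) would add a between-study variance term when heterogeneity is present, which the review assessed separately.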
Influences of operational practices on municipal solid waste landfill storage capacity.
Li, Yu-Chao; Liu, Hai-Long; Cleall, Peter John; Ke, Han; Bian, Xue-Cheng
2013-03-01
The quantitative effects of three operational factors, that is initial compaction, decomposition condition and leachate level, on municipal solid waste (MSW) landfill settlement and storage capacity are investigated in this article via consideration of a hypothetical case. The implemented model for calculating landfill compression displacement is able to consider decreases in compressibility induced by biological decomposition and load dependence of decomposition compression for the MSW. According to the investigation, a significant increase in storage capacity can be achieved by intensive initial compaction, adjustment of decomposition condition and lowering of leachate levels. The quantitative investigation presented aims to encourage landfill operators to improve management to enhance storage capacity. Furthermore, improving initial compaction and creating a preferential decomposition condition can also significantly reduce operational and post-closure settlements, respectively, which helps protect leachate and gas management infrastructure and monitoring equipment in modern landfills.
Wei, Zheng-mao; Du, Xiang-ke; Huo, Tian-long; Li, Xu-bin; Quan, Guang-nan; Li, Tian-ran; Cheng, Jin; Zhang, Wei-tao
2012-03-01
Quantitative T2 mapping has been a widely used method for the evaluation of pathological cartilage properties, and the histological assessment system of osteoarthritis in the rabbit has been published recently. The aim of the study was to investigate the effectiveness of quantitative T2 mapping evaluation for articular cartilage lesions of a rabbit model of anterior cruciate ligament transection (ACLT) osteoarthritis. Twenty New Zealand White (NZW) rabbits were divided into ACLT surgical group and sham operated group equally. The anterior cruciate ligaments of the rabbits in ACLT group were transected, while the joints were closed intactly in sham operated group. Magnetic resonance (MR) examinations were performed on 3.0T MR unit at week 0, week 6, and week 12. T2 values were computed on GE ADW4.3 workstation. All rabbits were killed at week 13, and left knees were stained with Haematoxylin and Eosin. Semiquantitative histological grading was obtained according to the osteoarthritis cartilage histopathology assessment system. Computerized image analysis was performed to quantitate the immunostained collagen type II. The average MR T2 value of whole left knee cartilage in ACLT surgical group ((29.05±12.01) ms) was significantly higher than that in sham operated group ((24.52±7.97) ms) (P=0.024) at week 6. The average T2 value increased to (32.18±12.79) ms in ACLT group at week 12, but remained near the baseline level ((27.66±8.08) ms) in the sham operated group (P=0.03). The cartilage lesion level of left knee in ACLT group was significantly increased at week 6 (P=0.005) and week 12 (P<0.001). T2 values had positive correlation with histological grading scores, but inverse correlation with optical densities (OD) of type II collagen. This study demonstrated the reliability and practicability of quantitative T2 mapping for the cartilage injury of rabbit ACLT osteoarthritis model.
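Quantitative T2 mapping fits a mono-exponential decay S(TE) = S0·exp(−TE/T2) to the multi-echo signal in each voxel. A minimal log-linear sketch; the echo times and the T2 value are invented and the signal is noise-free.

```python
import math

def fit_t2(te_ms, signal):
    """Mono-exponential T2 fit, S(TE) = S0 * exp(-TE/T2),
    via linear least squares on ln(S) against TE."""
    n = len(te_ms)
    y = [math.log(s) for s in signal]
    mx = sum(te_ms) / n
    my = sum(y) / n
    slope = (sum((x - mx) * (v - my) for x, v in zip(te_ms, y))
             / sum((x - mx) ** 2 for x in te_ms))
    return -1.0 / slope   # T2 in ms

# Synthetic 8-echo acquisition with T2 = 29 ms (near the reported means)
te = [10, 20, 30, 40, 50, 60, 70, 80]
sig = [1000 * math.exp(-t / 29.0) for t in te]
t2 = fit_t2(te, sig)
```

On real data the log-linear fit is biased by Rician noise at long echo times, so clinical packages usually use nonlinear least squares with a noise floor term.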
Implementation of noise budgets for civil airports
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bishop, D.E.
1982-01-01
An increasing number of airports are faced with the need for establishing a lid on the noise from aircraft operations and for developing programs for reducing airport noise on a year-to-year basis. As an example, the California Airport Noise Standard acts to impose such programs on a number of airports in California. Any airport faced with the need to establish a quantitative reduction of noise obviously wants to achieve this reduction with the least impact on numbers of operations and reduction in air transportation services to the community. A reduction in noise and an increase in operations usually can be achieved only by encouraging use of the quietest aircraft available and, further, by adding incentives for operating procedures that minimize noise. One approach in administering airport noise reduction is to adopt an airport noise budget. As used in this paper, the noise budget concept implies that quantitative limits on the noise environment and on the noise contributions by major airport users will be established, along with methods for enforcing compliance for those airport users that exceed their budget. Thus, the noise budget provides airport management, and major airport users, with quantitative measures for defining noise goals and for tracking actual progress in achieving such goals.
76 FR 22045 - Fluopicolide; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-20
... increased quantitative susceptibility of rat or rabbit fetuses to in utero or postnatal exposure to... is identified as docket ID number EPA-HQ-OPP-2006-0481. A quantitative reassessment of the BAM risk... fluopicolide and separately, its metabolite, BAM in food as follows: i. Acute exposure. Quantitative acute...
2017-03-13
[Briefing-slide extract; only fragments are recoverable:] Structured interviews with 31 Soldiers examined the link between base camps and Soldier/small unit readiness; these data are guiding the selection of attributes for a future quantitative survey of the link between quality of life (QoL) and readiness.
[New approach for managing microbial risks in food].
Augustin, Jean-Christophe
2015-01-01
The aim of the food legislation is to ensure the protection of human health. Traditionally, the food legislation requires food business operators to apply good hygiene practices and specific procedures to control foodborne pathogens. These regulations allowed reaching a high level of health protection. The improvement of the system will require risk-based approaches. Firstly, risk assessment should allow the identification of high-risk situations where resources should be allocated for a better targeting of risk management. Then, management measures should be adapted to the health objective. In this approach, the appropriate level of protection is converted into food safety and performance objectives on the food chain, i.e., the maximum microbial contamination compatible with the acceptable risk level. When objectives are defined, the food business operators and competent authorities can identify control options to comply with the objectives and establish microbiological criteria to verify compliance with these objectives. This approach, described some 10 years ago and made operative by the development of quantitative risk assessment techniques, is still difficult to use in practice since it requires a commitment of competent authorities to define the acceptable risk and also needs the implementation of sometimes complex risk models.
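The conversion of a protection level into food chain objectives is often expressed with the ICMSF-style inequality H0 − ΣR + ΣI ≤ FSO (initial contamination, minus reductions, plus increases, all in log10 cfu/g). A toy check with invented numbers:

```python
def meets_fso(h0, reductions, increases, fso):
    """Check the ICMSF-style food safety objective inequality
    H0 - sum(R) + sum(I) <= FSO, all terms in log10 cfu/g."""
    return h0 - sum(reductions) + sum(increases) <= fso

# Illustrative numbers: 3-log initial contamination, a 6-log kill step,
# 1 log of regrowth during storage, against an FSO of -2 log cfu/g
compliant = meets_fso(3.0, reductions=[6.0], increases=[1.0], fso=-2.0)
```

The quantitative risk models the abstract mentions go further, treating each term as a distribution rather than a point value, but the budgeting logic is the same.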
Vulnerability of manned spacecraft to crew loss from orbital debris penetration
NASA Technical Reports Server (NTRS)
Williamsen, J. E.
1994-01-01
Orbital debris growth threatens the survival of spacecraft systems from impact-induced failures. Whereas the probability of debris impact and spacecraft penetration may currently be calculated, another parameter of great interest to safety engineers is the probability that debris penetration will cause actual spacecraft or crew loss. Quantifying the likelihood of crew loss following a penetration allows spacecraft designers to identify those design features and crew operational protocols that offer the highest improvement in crew safety for available resources. Within this study, a manned spacecraft crew survivability (MSCSurv) computer model is developed that quantifies the conditional probability of losing one or more crew members, P(loss|pen), following the remote likelihood of an orbital debris penetration into an eight module space station. Contributions to P(loss|pen) are quantified from three significant penetration-induced hazards: pressure wall rupture (explosive decompression), fragment-induced injury, and 'slow' depressurization. Sensitivity analyses are performed using alternate assumptions for hazard-generating functions, crew vulnerability thresholds, and selected spacecraft design and crew operations parameters. These results are then used to recommend modifications to the spacecraft design and expected crew operations that quantitatively increase crew safety from orbital debris impacts.
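If the three penetration-induced hazards are treated as independent, their conditional loss probabilities combine as P(loss|pen) = 1 − Π(1 − p_i). A toy sketch; the independence assumption and the probabilities are illustrative, not MSCSurv's actual hazard-generating functions.

```python
def p_loss_given_pen(hazard_probs):
    """Combine per-hazard conditional crew-loss probabilities under an
    independence assumption: P(loss|pen) = 1 - prod(1 - p_i)."""
    survive = 1.0
    for p in hazard_probs:
        survive *= 1.0 - p
    return 1.0 - survive

# Invented per-hazard conditional probabilities following a penetration
hazards = {"pressure wall rupture": 0.05,
           "fragment-induced injury": 0.10,
           "slow depressurization": 0.02}
p = p_loss_given_pen(hazards.values())
```

A sensitivity analysis of the kind the abstract describes would sweep each entry of the table and observe the change in the combined probability.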
Real-time thermal imaging of solid oxide fuel cell cathode activity in working condition.
Montanini, Roberto; Quattrocchi, Antonino; Piccolo, Sebastiano A; Amato, Alessandra; Trocino, Stefano; Zignani, Sabrina C; Faro, Massimiliano Lo; Squadrito, Gaetano
2016-09-01
Electrochemical methods such as voltammetry and electrochemical impedance spectroscopy are effective for quantifying solid oxide fuel cell (SOFC) operational performance, but not for identifying and monitoring the chemical processes that occur on the electrodes' surface, which are thought to be closely related to SOFC efficiency. Because of their high operating temperature, mechanical failure or cathode delamination is a common shortcoming of SOFCs that severely affects their reliability. Infrared thermography may provide a powerful tool for probing in situ SOFC electrode processes and the materials' structural integrity, but, owing to the typical design of pellet-type cells, complete optical access to the electrode surface is usually prevented. In this paper, a specially designed SOFC is introduced, which allows the temperature distribution to be measured over the entire cathode area while still preserving the electrochemical performance of the device. Infrared images recorded under different working conditions are then processed by means of a dedicated image processing algorithm for quantitative data analysis. Results reported in the paper highlight the effectiveness of infrared thermal imaging in detecting the onset of cell failure during normal operation and in monitoring cathode activity when the cell is fed with different types of fuels.
Global Low Frequency Protein Motions in Long-Range Allosteric Signaling
NASA Astrophysics Data System (ADS)
McLeish, Tom; Rogers, Thomas; Townsend, Philip; Burnell, David; Pohl, Ehmke; Wilson, Mark; Cann, Martin; Richards, Shane; Jones, Matthew
2015-03-01
We present a foundational theory for how allostery can occur as a function of low frequency dynamics without a change in protein structure. Elastic inhomogeneities allow entropic "signalling at a distance." Remarkably, many globular proteins display just this class of elastic structure, in particular those that support allosteric binding of substrates (long-range co-operative effects between the binding sites of small molecules). Through multi-scale modelling of global normal modes we demonstrate negative co-operativity between the two cAMP ligands without change to the mean structure. Crucially, the value of the co-operativity is itself controlled by the interactions around a set of third allosteric "control sites." The theory makes key experimental predictions, validated by analysis of variant proteins by a combination of structural biology and isothermal calorimetry. A quantitative description of allostery as a free energy landscape revealed a protein "design space" that identified the key inter- and intramolecular regulatory parameters that frame CRP/FNR family allostery. Furthermore, by analyzing naturally occurring CAP variants from diverse species, we demonstrate an evolutionary selection pressure to conserve residues crucial for allosteric control. The methodology establishes the means to engineer allosteric mechanisms that are driven by low frequency dynamics.
Integrated payload and mission planning, phase 3. Volume 3: Ground real-time mission operations
NASA Technical Reports Server (NTRS)
White, W. J.
1977-01-01
The payloads tentatively planned to fly on the first two Spacelab missions were analyzed to examine the cost relationships of providing mission operations support from onboard vs the ground-based Payload Operations Control Center (POCC). The quantitative results indicate that use of a POCC, with data processing capability, to support real-time mission operations is the most cost effective case.
On differences of linear positive operators
NASA Astrophysics Data System (ADS)
Aral, Ali; Inoan, Daniela; Raşa, Ioan
2018-04-01
In this paper we consider two different general linear positive operators defined on an unbounded interval and obtain estimates for the differences of these operators in quantitative form. Our estimates involve an appropriate K-functional and a weighted modulus of smoothness. Similar estimates are obtained for the Chebyshev functional of these operators as well. All considerations are based on rearranging the remainder in Taylor's formula. The results obtained are applied to some well-known linear positive operators.
Clevenger, Josh; Chu, Ye; Chavarro, Carolina; Botton, Stephanie; Culbreath, Albert; Isleib, Thomas G; Holbrook, C C; Ozias-Akins, Peggy
2018-01-01
Late leaf spot (LLS; Cercosporidium personatum) is a major fungal disease of cultivated peanut (Arachis hypogaea). A recombinant inbred line population segregating for quantitative field resistance was used to identify quantitative trait loci (QTL) using QTL-seq. High rates of false positive SNP calls using established methods in this allotetraploid crop obscured significant QTLs. To resolve this problem, robust parental SNPs were first identified using polyploid-specific SNP identification pipelines, leading to discovery of significant QTLs for LLS resistance. These QTLs were confirmed over 4 years of field data. Selection with markers linked to these QTLs resulted in a significant increase in resistance, showing that these markers can be immediately applied in breeding programs. This study demonstrates that QTL-seq can be used to rapidly identify QTLs controlling highly quantitative traits in polyploid crops with complex genomes. Markers identified can then be deployed in breeding programs, increasing the efficiency of selection using molecular tools. Key Message: Field resistance to late leaf spot is a quantitative trait controlled by many QTLs. Using polyploid-specific methods, QTL-seq is faster and more cost effective than QTL mapping.
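The QTL-seq statistic at the heart of this approach is the SNP-index (alt-allele read fraction in a bulk) and the delta between bulks. The sketch below illustrates that computation only; the read counts are invented, and a real polyploid pipeline adds the depth filtering and homoeolog-aware SNP calling the study emphasizes.

```python
# Hedged sketch of the QTL-seq SNP-index and delta-SNP-index computation.
# Read counts are illustrative; they are not data from this study.

def snp_index(alt_reads, total_reads):
    """Fraction of reads carrying the alternate allele at one site."""
    if total_reads == 0:
        raise ValueError("no coverage at this site")
    return alt_reads / total_reads

def delta_snp_index(alt_r, tot_r, alt_s, tot_s):
    """Resistant-bulk index minus susceptible-bulk index."""
    return snp_index(alt_r, tot_r) - snp_index(alt_s, tot_s)

# A site where the bulks diverge strongly suggests linkage to the trait:
d = delta_snp_index(alt_r=45, tot_r=50, alt_s=5, tot_s=50)
```

Sites with |delta-SNP-index| near 1 over a contiguous genomic window are the QTL candidates; false-positive SNP calls inflate the background of this statistic, which is why the polyploid-specific calling step matters.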
Lavallée-Adam, Mathieu
2017-01-01
PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. PMID:27010334
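PSEA-Quant uses its own abundance-weighted enrichment statistic; to illustrate the general idea of testing an annotation term for over-representation among a set of proteins, the sketch below uses a plain hypergeometric test instead. Protein counts are invented for the example.

```python
from math import comb

# Hedged illustration of annotation enrichment testing (NOT PSEA-Quant's own
# statistic): a one-sided hypergeometric over-representation test for a
# single annotation term. Counts are illustrative.

def hypergeom_pvalue(N, K, n, k):
    """P(X >= k) when drawing n proteins from N, of which K carry the term."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 1000 quantified proteins, 50 annotated with the term; among the 100
# reproducibly quantified high-abundance proteins, 20 carry the term:
p = hypergeom_pvalue(N=1000, K=50, n=100, k=20)
```

A small p-value flags the term as enriched among the high-abundance set; a tool like PSEA-Quant then corrects such tests for multiple comparisons across all annotations.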
Posada, John A; Patel, Akshay D; Roes, Alexander; Blok, Kornelis; Faaij, André P C; Patel, Martin K
2013-05-01
The aim of this study is to present and apply a quick screening method and to identify the most promising bioethanol derivatives using an early-stage sustainability assessment method that compares a bioethanol-based conversion route to its respective petrochemical counterpart. The method combines, by means of a multi-criteria approach, quantitative and qualitative proxy indicators describing economic, environmental, health and safety, and operational aspects. Of twelve derivatives considered, five were categorized as favorable (diethyl ether, 1,3-butadiene, ethyl acetate, propylene and ethylene), two as promising (acetaldehyde and ethylene oxide) and five as unfavorable derivatives (acetic acid, n-butanol, isobutylene, hydrogen and acetone) for an integrated biorefinery concept. Copyright © 2012 Elsevier Ltd. All rights reserved.
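A multi-criteria screening of this kind typically normalizes each proxy indicator to a common scale and aggregates with weights. The sketch below shows that mechanics only; the indicator names, weights, and values are illustrative assumptions, not the study's actual data or weighting scheme.

```python
# Hedged sketch of a multi-criteria early-stage screening score: each
# indicator is pre-normalized to [0, 1] (higher = more favorable) and
# combined with weights. All names, weights, and values are illustrative.

def screening_score(indicators, weights):
    """Weighted sum of normalized indicators."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[name] * value for name, value in indicators.items())

weights = {"economic": 0.4, "environmental": 0.3,
           "health_safety": 0.2, "operational": 0.1}

ethyl_acetate = {"economic": 0.8, "environmental": 0.7,
                 "health_safety": 0.6, "operational": 0.9}
acetic_acid   = {"economic": 0.3, "environmental": 0.4,
                 "health_safety": 0.5, "operational": 0.4}

scores = {name: screening_score(ind, weights)
          for name, ind in [("ethyl acetate", ethyl_acetate),
                            ("acetic acid", acetic_acid)]}
```

Ranking the twelve derivatives by such a score and thresholding it is one way the favorable/promising/unfavorable categories could be produced.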
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kyoung, S.; Yoo, H.; Ju, H.
2015-03-15
In this paper, the hydrogen delivery capabilities of uranium (U) and zirconium-cobalt (ZrCo) are compared quantitatively in order to find the optimum getter materials for tritium storage. A three-dimensional hydrogen desorption model is applied to two identically designed cylindrical beds with the different materials, and hydrogen desorption simulations are then conducted. The simulation results show superior hydrogen delivery performance and easier thermal management capability for the U bed. This detailed analysis of the hydrogen desorption behaviors of beds with U and ZrCo will help to identify the optimal bed material, bed design, and operating conditions for the storage and delivery system in ITER.
Multipartite Gaussian steering: Monogamy constraints and quantum cryptography applications
NASA Astrophysics Data System (ADS)
Xiang, Yu; Kogias, Ioannis; Adesso, Gerardo; He, Qiongyi
2017-01-01
We derive laws for the distribution of quantum steering among different parties in multipartite Gaussian states under Gaussian measurements. We prove that a monogamy relation akin to the generalized Coffman-Kundu-Wootters inequality holds quantitatively for a recently introduced measure of Gaussian steering. We then define the residual Gaussian steering, stemming from the monogamy inequality, as an indicator of collective steering-type correlations. For pure three-mode Gaussian states, the residual acts as a quantifier of genuine multipartite steering, and is interpreted operationally in terms of the guaranteed key rate in the task of secure quantum secret sharing. Optimal resource states for the latter protocol are identified, and their possible experimental implementation discussed. Our results pin down the role of multipartite steering for quantum communication.
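Schematically, a CKW-type monogamy constraint for a steering measure \(\mathcal{G}\) and the residual defined from it take the form below; the arrow direction is chosen for illustration and the precise form in the paper may differ:

```latex
\mathcal{G}^{A \to (BC)} \;\ge\; \mathcal{G}^{A \to B} + \mathcal{G}^{A \to C},
\qquad
\mathcal{G}^{A:B:C} \;\equiv\; \mathcal{G}^{A \to (BC)} - \mathcal{G}^{A \to B} - \mathcal{G}^{A \to C} \;\ge\; 0 .
```

The nonnegative residual \(\mathcal{G}^{A:B:C}\) is the quantity interpreted operationally as the guaranteed key rate for quantum secret sharing.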
Kim, Minsoo; Kim, Yejin; Kim, Hyosoo; Piao, Wenhua; Kim, Changwon
2016-06-01
An operator decision support system (ODSS) is proposed to support operators of wastewater treatment plants (WWTPs) in making appropriate decisions. The system accounts for water quality (WQ) variations in WWTP influent and effluent and in the receiving water body (RWB). The proposed system comprises two diagnosis modules, three prediction modules, and a scenario-based supporting module (SSM). In the diagnosis modules, the WQs of the WWTP influent and effluent and of the RWB are assessed via multivariate analysis. Three prediction modules based on the k-nearest neighbors (k-NN) method, activated sludge model no. 2d (ASM2d), and the QUAL2E model are used to forecast WQs 3 days in advance. To compare operating alternatives, the SSM tests various predetermined operating conditions in terms of the overall oxygen transfer coefficient (Kla), waste sludge flow rate (Qw), return sludge flow rate (Qr), and internal recycle flow rate (Qir). In the case of unacceptable total phosphorus (TP), the SSM provides appropriate information for chemical treatment. The constructed ODSS was tested using data collected from the Geumho River, the RWB, and S WWTP in Daegu City, South Korea. The results demonstrate the capability of the proposed ODSS to provide WWTP operators with more objective qualitative and quantitative assessments of WWTP and RWB WQs. Moreover, the current study shows that the ODSS, using data collected from the study area, can be used to identify operational alternatives through the SSM at an integrated urban wastewater management level.
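The k-NN prediction module can be sketched as a nearest-neighbor regression over historical operating days: find the k days most similar to today's conditions and average their next-day water quality. The feature choice and all data values below are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch of a k-NN one-step water-quality forecast of the kind an
# ODSS prediction module might use. Features and values are illustrative.

def knn_forecast(history, today, k=2):
    """history: list of (feature_tuple, next_day_value); today: feature_tuple."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(history, key=lambda rec: dist(rec[0], today))[:k]
    return sum(v for _, v in nearest) / k

# Features: (influent TP mg/L, flow in 1000 m3/d); target: next-day effluent TP.
history = [((4.0, 12.0), 0.8), ((6.5, 15.0), 1.4),
           ((4.2, 11.5), 0.9), ((7.0, 16.0), 1.5)]

tp_forecast = knn_forecast(history, today=(4.1, 12.2), k=2)
```

In practice the features would be standardized before computing distances, since flow and concentration live on different scales.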
Sheu, Yahtyng; Zmuda, Joseph M; Boudreau, Robert M; Petit, Moira A; Ensrud, Kristine E; Bauer, Douglas C; Gordon, Christopher L; Orwoll, Eric S; Cauley, Jane A
2011-01-01
Many fractures occur in individuals without osteoporosis as defined by areal bone mineral density (aBMD). Inclusion of other aspects of skeletal strength may be useful in identifying at-risk subjects. We used surrogate measures of bone strength at the radius and tibia measured by peripheral quantitative computed tomography (pQCT) to evaluate their relationships with nonvertebral fracture risk. Femoral neck (FN) aBMD, measured by dual-energy X-ray absorptiometry (DXA), was also included. The study population consisted of 1143 white men aged 69 years and older with pQCT measures at the radius and tibia from the Minneapolis and Pittsburgh centers of the Osteoporotic Fractures in Men (MrOS) study. Principal-components analysis and Cox proportional-hazards modeling were used to identify 21 of 58 pQCT variables with a major contribution to nonvertebral incident fractures. After a mean 2.9 years of follow-up, 39 fractures occurred. Men without incident fractures had significantly greater bone mineral content, cross-sectional area, and indices of bone strength by pQCT than those with fractures. Every SD decrease in 18 of the 21 pQCT parameters was significantly associated with increased fracture risk (hazard ratios ranged from 1.4 to 2.2) independent of age, study site, body mass index (BMI), and FN aBMD. Using the area under the receiver operating characteristic curve (AUC), the combination of FN aBMD with each of three radius strength parameters individually increased fracture prediction over FN aBMD alone (AUC increased from 0.73 to 0.80). Peripheral bone strength measures are associated with fracture risk and may improve our ability to identify older men at high risk of fracture. © 2011 American Society for Bone and Mineral Research.
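The AUC values reported here have a simple rank interpretation: the probability that a randomly chosen fracture case receives a higher risk score than a randomly chosen non-case (the Mann-Whitney estimator). The sketch below computes AUC that way on invented scores; it is not MrOS data.

```python
# Hedged sketch of the Mann-Whitney estimate of the area under the ROC
# curve. Risk scores are illustrative, not study data.

def auc(case_scores, control_scores):
    """P(random case scores higher than random control), ties count half."""
    wins = 0.0
    for c in case_scores:
        for nc in control_scores:
            if c > nc:
                wins += 1.0
            elif c == nc:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))

cases    = [2.1, 1.8, 2.5, 1.9]        # risk scores of men who fractured
controls = [1.0, 1.2, 2.0, 0.9, 1.4]   # risk scores of men who did not
a = auc(cases, controls)
```

Comparing such AUCs between a baseline model (FN aBMD alone) and an augmented model (FN aBMD plus a strength parameter) is exactly the improvement assessment the abstract describes.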
Loiselle, Christopher; Eby, Peter R.; Kim, Janice N.; Calhoun, Kristine E.; Allison, Kimberly H.; Gadi, Vijayakrishna K.; Peacock, Sue; Storer, Barry; Mankoff, David A.; Partridge, Savannah C.; Lehman, Constance D.
2014-01-01
Rationale and Objectives: To test the ability of quantitative measures from preoperative dynamic contrast-enhanced MRI (DCE-MRI) to predict, independently and/or with the Katz pathologic nomogram, which breast cancer patients with a positive sentinel lymph node biopsy will have ≥4 positive axillary lymph nodes upon completion axillary dissection. Methods and Materials: A retrospective review was conducted to identify clinically node-negative invasive breast cancer patients who underwent preoperative DCE-MRI, followed by sentinel node biopsy with positive findings and complete axillary dissection (6/2005 – 1/2010). Clinical/pathologic factors, primary lesion size, and quantitative DCE-MRI kinetics were collected from clinical records and prospective databases. DCE-MRI parameters with univariate significance (p < 0.05) for predicting ≥4 positive axillary nodes were modeled with stepwise regression and compared to the Katz nomogram alone and to a combined MRI-Katz nomogram model. Results: Ninety-eight patients with 99 positive sentinel biopsies met study criteria. Stepwise regression identified DCE-MRI total persistent enhancement and volume-adjusted peak enhancement as significant predictors of ≥4 metastatic nodes. Receiver operating characteristic (ROC) curves demonstrated an area under the curve (AUC) of 0.78 for the Katz nomogram, 0.79 for the DCE-MRI multivariate model, and 0.87 for the combined MRI-Katz model. The combined model was significantly more predictive than the Katz nomogram alone (p = 0.003). Conclusion: Integration of DCE-MRI primary lesion kinetics significantly improved the accuracy of the Katz pathologic nomogram in predicting the presence of metastases in ≥4 nodes. DCE-MRI may help identify sentinel node-positive patients requiring further locoregional therapy. PMID:24331270
The quantitative Faber-Krahn inequality for the Robin Laplacian
NASA Astrophysics Data System (ADS)
Bucur, Dorin; Ferone, Vincenzo; Nitsch, Carlo; Trombetti, Cristina
2018-04-01
We prove a quantitative form of the Faber-Krahn inequality for the first eigenvalue of the Laplace operator with Robin boundary conditions. The asymmetry term involves the square power of the Fraenkel asymmetry, multiplied by a constant depending on the Robin parameter, the dimension of the space and the measure of the set.
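In symbols, the inequality proved has the schematic form below, where \(\mathcal{A}\) denotes the Fraenkel asymmetry; the notation is a standard rendering of the statement in the abstract, not quoted from the paper:

```latex
\lambda_{1,\beta}(\Omega) - \lambda_{1,\beta}(B) \;\ge\; c\bigl(\beta, n, |\Omega|\bigr)\,\mathcal{A}(\Omega)^2,
\qquad
\mathcal{A}(\Omega) = \min_{x}\,\frac{|\Omega \,\triangle\, B_r(x)|}{|\Omega|}, \quad |B_r| = |\Omega|,
```

where \(B\) is a ball with the same measure as \(\Omega\), \(\beta\) is the Robin parameter, and \(n\) the dimension.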
Zhu, Ying; Zhang, Yun-Xia; Liu, Wen-Wen; Ma, Yan; Fang, Qun; Yao, Bo
2015-04-01
This paper describes a nanoliter droplet array-based single-cell reverse transcription quantitative PCR (RT-qPCR) assay method for quantifying gene expression in individual cells. By sequentially printing nanoliter-scale droplets on a microchip using a microfluidic robot, all liquid-handling operations, including cell encapsulation, lysis, reverse transcription, and quantitative PCR with real-time fluorescence detection, can be performed automatically. The inhibition effect of the cell suspension buffer on the RT-PCR assay was comprehensively studied to achieve high-sensitivity gene quantification. The system was applied to the quantitative measurement of the expression level of mir-122 in single Huh-7 cells. A wide distribution of mir-122 expression in single cells, from 3061 copies/cell to 79998 copies/cell, was observed, showing a high level of cell heterogeneity. With the advantages of fully automated liquid handling, a simple system structure, and flexibility in performing multi-step operations, the present method provides a novel liquid-handling mode for single-cell gene expression analysis and has significant potential for transcript identification and rare-cell analysis.
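Absolute copy numbers like "3061 copies/cell" are conventionally obtained by interpolating a measured quantification cycle (Cq) on a standard curve, which is linear in log10(copies). The sketch below shows that arithmetic with invented standards; the paper's actual calibration values are not reproduced here.

```python
import math

# Hedged sketch of absolute quantification from a qPCR standard curve:
# fit Cq = m*log10(copies) + b on standards, then invert for an unknown.
# Standard-curve values are illustrative, not this study's data.

def fit_standard_curve(standards):
    """standards: list of (copies, Cq); ordinary least-squares line."""
    xs = [math.log10(c) for c, _ in standards]
    ys = [cq for _, cq in standards]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    return m, ybar - m * xbar

def copies_from_cq(cq, m, b):
    return 10 ** ((cq - b) / m)

standards = [(1e3, 30.0), (1e4, 26.7), (1e5, 23.4), (1e6, 20.1)]  # slope ~ -3.3
m, b = fit_standard_curve(standards)
copies = copies_from_cq(25.0, m, b)   # copies in one single-cell reaction
```

A slope near -3.32 corresponds to ~100% amplification efficiency, which is why that value is the usual sanity check on the calibration.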
Manned Versus Unmanned Risk and Complexity Considerations for Future Midsized X-Planes
NASA Technical Reports Server (NTRS)
Lechniak, Jason A.; Melton, John E.
2017-01-01
The objective of this work was to identify and estimate the complexity and risks associated with the development and testing of new low-cost, medium-scale X-plane aircraft primarily focused on air transport operations. The piloting modes evaluated for this task were manned, remotely piloted, and unmanned flight research programs. This analysis was conducted early in the data collection period for X-plane concept vehicles, before preliminary designs were complete. Over 50 different aircraft and system topics were used to evaluate the three piloting control modes. Expert group evaluations from a diverse set of pilots, engineers, and other experts at Aeronautics Research Mission Directorate centers within the National Aeronautics and Space Administration provided qualitative reasoning on the many issues surrounding the choice of piloting mode. The group evaluations were rated numerically to evaluate each topic quantitatively and to provide independent criteria for vehicle complexity and risk. An Edwards Air Force Base instruction document was identified that corroborates the effects found in our qualitative and quantitative data. The study showed that a manned aircraft was the best choice to align with test activities for transport aircraft flight research from a low-complexity, low-risk perspective. The study concluded that a manned aircraft option would minimize risk and complexity, improve flight-test efficiency, and bound the cost of the flight-test portion of the program. Several key findings and discriminators between the three modes are discussed in detail.
Farzianpour, Fereshteh; Mohamadi, Efat; Najafpour, Zhila; Yousefinezhadi, Taraneh; Forootan, Sara; Foroushani, Abbas Rahimi
2016-09-01
The presence of high-performing doctors is one of the necessary conditions for providing high-quality services, and various motivations can affect their performance. Recognizing the factors that affect the performance of doctors, an effective force in health care centers, is therefore necessary. The aim of this article was to evaluate the factors influencing the clinical performance of general surgeons at Tehran University of Medical Sciences in 2015. This is a cross-sectional qualitative-quantitative study conducted in three phases: phase I, use of library studies and databases to collect data; phase II, localization of the factors detected in phase I using the Delphi technique; and phase III, prioritization of the factors affecting the performance of doctors using qualitative interviews. Twelve articles were analyzed from 300 abstracts during the evaluation process, and the 23 factors identified were sent to surgeons and their assistants to obtain their opinions. Quantitative analysis of the findings showed that "work qualification" (86.1%) and "managers' and supervisors' style" (50%) had, respectively, the greatest and the least impact on the performance of doctors. Finally, 18 effective factors were identified and prioritized for the performance of general surgeons. The results showed that motivation and performance do not depend on a single operating parameter but on several factors shaped by cultural background. It is therefore necessary to design, implement, and monitor effective interventions based on these key determinants, with cultural background taken into account.
Location identification of closed crack based on Duffing oscillator transient transition
NASA Astrophysics Data System (ADS)
Liu, Xiaofeng; Bo, Lin; Liu, Yaolu; Zhao, Youxuan; Zhang, Jun; Deng, Mingxi; Hu, Ning
2018-02-01
The existence of a closed micro-crack in plates can be detected by using the nonlinear harmonic characteristics of the Lamb wave, but identifying its location is difficult. Considering the transient nonlinear Lamb wave under noise interference, we propose a location identification method for the closed crack based on quantitative measurement of the Duffing oscillator's transient transition in phase space. A sliding short-time window was used to truncate the signal to be detected, and periodic extension of the transient nonlinear Lamb wave was then performed to ensure that the Duffing oscillator has adequate response time to reach a steady state. A transient autocorrelation method was used to reduce missed harmonic detections caused by the randomly varying phase of the nonlinear Lamb wave. Moreover, to overcome the deficiency of phase-trajectory diagrams for quantitative analysis of the Duffing system state, and to eliminate misjudgments caused by harmonic frequency components contained in broadband noise, a logic-operation method for the oscillator state-transition function based on circular zone partition was adopted to establish the mapping between the oscillator transition state and the time-domain information of the nonlinear harmonics. The final state-transition discriminant function of the Duffing oscillator was used as the basis for identifying the harmonics reflected and transmitted from the crack. Chirplet time-frequency analysis was conducted to identify the mode of the generated harmonics and determine the propagation speed. Through these steps, accurate location identification of the closed crack was achieved.
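The detector underlying such methods is commonly the Holmes-type Duffing oscillator, x'' + k x' - x + x^3 = gamma*cos(t) + s(t), driven near its chaotic threshold so that a weak in-band harmonic s(t) can tip it into the large-scale periodic state. The sketch below only integrates this oscillator with a classical Runge-Kutta step; the parameter values are illustrative and not tuned to the paper's Lamb-wave setup.

```python
import math

# Hedged sketch: RK4 integration of a Holmes-type Duffing detector
# x'' + k*x' - x + x**3 = gamma*cos(t) + s(t). Parameters are illustrative.

def duffing_step(x, v, t, dt, k, gamma, signal):
    def acc(x, v, t):
        return -k * v + x - x ** 3 + gamma * math.cos(t) + signal(t)
    k1x, k1v = v, acc(x, v, t)
    k2x, k2v = v + 0.5 * dt * k1v, acc(x + 0.5 * dt * k1x, v + 0.5 * dt * k1v, t + 0.5 * dt)
    k3x, k3v = v + 0.5 * dt * k2v, acc(x + 0.5 * dt * k2x, v + 0.5 * dt * k2v, t + 0.5 * dt)
    k4x, k4v = v + dt * k3v, acc(x + dt * k3x, v + dt * k3v, t + dt)
    x += dt * (k1x + 2 * k2x + 2 * k3x + k4x) / 6
    v += dt * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
    return x, v

x, v, t, dt = 0.0, 0.0, 0.0, 0.01
trajectory = []
for _ in range(20000):
    x, v = duffing_step(x, v, t, dt, k=0.5, gamma=0.825, signal=lambda t: 0.0)
    t += dt
    trajectory.append(x)
```

A detection scheme would replace the zero `signal` with the windowed, periodically extended Lamb-wave segment and classify the oscillator's steady state, as the abstract's circular-zone-partition logic does.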
3D-Reconstruction of recent volcanic activity from ROV-video, Charles Darwin Seamounts, Cape Verdes
NASA Astrophysics Data System (ADS)
Kwasnitschka, T.; Hansteen, T. H.; Kutterolf, S.; Freundt, A.; Devey, C. W.
2011-12-01
As well as providing well-localized samples, Remotely Operated Vehicles (ROVs) produce huge quantities of visual data whose potential for geological data mining has seldom if ever been fully realized. We present a new workflow to derive essential results of field geology such as quantitative stratigraphy and tectonic surveying from ROV-based photo and video material. We demonstrate the procedure on the Charles Darwin Seamounts, a field of small hot spot volcanoes recently identified at a depth of ca. 3500 m southwest of the island of Santo Antao in the Cape Verdes. The Charles Darwin Seamounts feature a wide spectrum of volcanic edifices with forms suggestive of scoria cones, lava domes, tuff rings and maar-type depressions, all of comparable dimensions. These forms, coupled with the highly fragmented volcaniclastic samples recovered by dredging, motivated surveying parts of some edifices down to centimeter scale. ROV-based surveys yielded volcaniclastic samples of key structures linked by extensive coverage of stereoscopic photographs and high-resolution video. Based upon the latter, we present our workflow to derive three-dimensional models of outcrops from a single-camera video sequence, allowing quantitative measurements of fault orientation, bedding structure, grain size distribution and photo mosaicking within a geo-referenced framework. With this information we can identify episodes of repetitive eruptive activity at individual volcanic centers and see changes in eruptive style over time which, despite the proximity of the edifices to each other, are highly variable.
Balance Confidence and Falls in Non-Demented Essential Tremor Patients: The Role of Cognition
Rao, Ashwini K.; Gilman, Arthur; Louis, Elan D.
2014-01-01
Objective: To examine 1) the effect of cognitive ability on balance confidence and falls, 2) the relationship of balance confidence and falls with quantitative measures of gait, and 3) measures that predict falls, in people with Essential Tremor (ET). Design: Cross-sectional study. Setting: General community. Participants: One hundred eighty participants (132 people with ET and 48 controls). People with ET were divided into two groups based on the median score on the modified Mini-Mental State Exam: lower cognitive score (ET-LCS) vs. higher cognitive score (ET-HCS). Interventions: Not applicable. Main Outcome Measures: Activities-specific Balance Confidence-6 (ABC-6) scale and falls in the previous year. Results: ET-LCS had lower ABC-6 scores and a greater number of falls than ET-HCS (p<0.05 for all measures) or controls (p<0.01 for all measures). Quantitative gait measures were significantly correlated with ABC-6 score and falls. Gait speed (p<0.007) and ABC-6 score (p<0.02) were significant predictors of falls. Receiver operating characteristic curve analysis revealed that gait speed < 0.9 m/s and ABC-6 score < 51% were associated with moderate sensitivity and specificity in identifying fallers. Conclusions: People with ET with low cognitive scores have impaired gait and report lower balance confidence and a higher number of falls than their counterparts with higher cognitive scores, and controls. We have identified easily administered assessments (gait speed and the ABC-6 scale) that are associated with falls in ET. PMID:24769121
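The cutoff analysis behind a statement like "gait speed < 0.9 m/s identifies fallers with moderate sensitivity and specificity" reduces to a 2x2 tally at the threshold. The sketch below shows that computation on invented data; the speeds and fall labels are illustrative, not study measurements.

```python
# Hedged sketch: sensitivity and specificity of a "value below threshold is
# test-positive" rule (slow gait flags a faller). Data are illustrative.

def sens_spec(values, is_faller, threshold):
    tp = sum(1 for v, f in zip(values, is_faller) if f and v < threshold)
    fn = sum(1 for v, f in zip(values, is_faller) if f and v >= threshold)
    tn = sum(1 for v, f in zip(values, is_faller) if not f and v >= threshold)
    fp = sum(1 for v, f in zip(values, is_faller) if not f and v < threshold)
    return tp / (tp + fn), tn / (tn + fp)

speeds  = [0.7, 0.85, 1.1, 1.2, 0.95, 0.8, 1.3, 0.88]   # gait speed, m/s
fallers = [True, True, False, False, True, False, False, True]

sensitivity, specificity = sens_spec(speeds, fallers, threshold=0.9)
```

Sweeping the threshold and plotting sensitivity against 1 - specificity yields the ROC curve from which such cutoffs are chosen.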
NASA Astrophysics Data System (ADS)
Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad
2016-09-01
Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has risen dramatically. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern probabilistic methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use a Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event in an epistemic estimation format by combining two elements: the probabilities of the hazard-promoting factors and Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
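Two of the ingredients combined above can be sketched very simply: an event-tree path probability is the product of its branch probabilities, and a sparse-data failure rate can be sharpened by a conjugate Bayesian update as evidence arrives. All probabilities and counts below are illustrative assumptions, not the study's estimates.

```python
# Hedged sketch: event-tree path probability plus a Beta-Binomial Bayesian
# update for a per-lift failure probability. All numbers are illustrative.

def scenario_probability(branch_probs):
    """Probability of one event-tree path = product of its branch probabilities."""
    p = 1.0
    for b in branch_probs:
        p *= b
    return p

# Initiating event -> sling failure -> drop not arrested (illustrative branches)
p_drop = scenario_probability([0.02, 0.1, 0.5])

def beta_update(alpha, beta, failures, trials):
    """Conjugate Beta prior update for a per-trial failure probability."""
    return alpha + failures, beta + (trials - failures)

alpha, beta = 1.0, 49.0                   # vague prior with mean 0.02
alpha, beta = beta_update(alpha, beta, failures=1, trials=120)
posterior_mean = alpha / (alpha + beta)   # prior pulled toward the observed rate
```

A Bayesian network generalizes this by letting the branch probabilities share evidence across scenarios, which is how the paper compensates for the lack of field data.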
Kumar, Ramya; Lahann, Joerg
2016-07-06
The performance of polymer interfaces in biology is governed by a wide spectrum of interfacial properties. With the ultimate goal of identifying design parameters for stem cell culture coatings, we developed a statistical model that describes the dependence of brush properties on surface-initiated polymerization (SIP) parameters. Employing a design of experiments (DOE) approach, we identified operating boundaries within which four gel architecture regimes can be realized, including a new regime of associated brushes in thin films. Our statistical model can accurately predict the brush thickness and the degree of intermolecular association of poly[{2-(methacryloyloxy) ethyl} dimethyl-(3-sulfopropyl) ammonium hydroxide] (PMEDSAH), a previously reported synthetic substrate for feeder-free and xeno-free culture of human embryonic stem cells. DOE-based multifunctional predictions offer a powerful quantitative framework for designing polymer interfaces. For example, model predictions can be used to decrease the critical thickness at which the wettability transition occurs by simply increasing the catalyst quantity from 1 to 3 mol %.
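A design-of-experiments model of this kind starts from main effects estimated on a coded two-level factorial design. The sketch below shows that estimation for a 2x2 design; the factors (catalyst mol %, polymerization time) and the thickness responses are illustrative assumptions, not the study's measurements.

```python
# Hedged sketch of main-effect estimation in a coded (+1/-1) two-level
# full-factorial DOE, of the kind used to map SIP parameters to brush
# properties. Factor labels and responses are illustrative.

def main_effect(levels, response):
    """Average response at the +1 level minus the average at the -1 level."""
    hi = [r for l, r in zip(levels, response) if l == +1]
    lo = [r for l, r in zip(levels, response) if l == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# 2^2 design: factor A = catalyst quantity, factor B = polymerization time
A = [-1, +1, -1, +1]
B = [-1, -1, +1, +1]
thickness_nm = [18.0, 30.0, 25.0, 41.0]   # illustrative brush thicknesses

effect_catalyst = main_effect(A, thickness_nm)
effect_time     = main_effect(B, thickness_nm)
```

Fitting a regression over such effects (plus interactions) is what allows predictions like "increasing catalyst from 1 to 3 mol % shifts the wettability-transition thickness."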
Field quantification of physical exposures of police officers in vehicle operation.
McKinnon, Colin D; Callaghan, Jack P; Dickerson, Clark R
2011-01-01
Mobile police officers perform many of their daily duties in their vehicles. Combined workspace inflexibility and prolonged driving create potential musculoskeletal injury risks. Limited research exists that quantitatively describes the postural and load exposures associated with mobile police work. The purpose of this study was to characterize officer activity during a typical workday and identify opportunities for ergonomic intervention. Digital video of traffic officers (N = 10) was used to classify postures according to work activity. Cumulative time in 10 activities was calculated, and a time history of driver activity documented. Most time (55.5 ± 13.4%) was spent out of the vehicle, and 22.3 ± 10.5% was spent in single-arm driving. Paper documentation and mobile data terminal use were identified as in-car activities that may benefit from targeted interventions. The primary contribution of this study is the characterization of daily mobile police activity and the identification of possible intervention strategies to mitigate physical exposure levels.
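The video-coding step behind figures like "55.5% of time out of the vehicle" amounts to summing classified time intervals per activity. The sketch below shows that bookkeeping; the activity labels and timestamps are illustrative, not coded observations from the study.

```python
# Hedged sketch: cumulative time per activity from timestamped video-coding
# intervals. Labels and times are illustrative.

def cumulative_times(intervals):
    """intervals: list of (start_s, end_s, activity) -> {activity: seconds}."""
    totals = {}
    for start, end, activity in intervals:
        totals[activity] = totals.get(activity, 0.0) + (end - start)
    return totals

shift = [(0, 600, "driving_two_arm"), (600, 900, "driving_single_arm"),
         (900, 1500, "out_of_vehicle"), (1500, 1650, "mobile_data_terminal"),
         (1650, 1800, "paper_documentation")]

totals = cumulative_times(shift)
shift_len = sum(end - start for start, end, _ in shift)
pct_out = 100.0 * totals["out_of_vehicle"] / shift_len
```

Averaging such percentages across the N = 10 officers yields the mean ± SD exposure figures reported.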
Oh, Phil; Borgström, Per; Witkiewicz, Halina; Li, Yan; Borgström, Bengt J; Chrastina, Adrian; Iwata, Koji; Zinn, Kurt R; Baldwin, Richard; Testa, Jacqueline E; Schnitzer, Jan E
2007-03-01
How effectively and quickly endothelial caveolae can transcytose in vivo is unknown, yet critical for understanding their function and potential clinical utility. Here we use quantitative proteomics to identify aminopeptidase P (APP) concentrated in caveolae of lung endothelium. Electron microscopy confirms this and shows that APP antibody targets nanoparticles to caveolae. Dynamic intravital fluorescence microscopy reveals that targeted caveolae operate effectively as pumps, moving antibody within seconds from blood across endothelium into lung tissue, even against a concentration gradient. This active transcytosis requires normal caveolin-1 expression. Whole body gamma-scintigraphic imaging shows rapid, specific delivery into lung well beyond that achieved by standard vascular targeting. This caveolar trafficking in vivo may underscore a key physiological mechanism for selective transvascular exchange and may provide an enhanced delivery system for imaging agents, drugs, gene-therapy vectors and nanomedicines. 'In vivo proteomic imaging' as described here integrates organellar proteomics with multiple imaging techniques to identify an accessible target space that includes the transvascular pumping space of the caveola.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pincus, David; Ryan, Christopher J.; Smith, Richard D.
2013-03-12
Cell signaling systems transmit information by post-translationally modifying signaling proteins, often via phosphorylation. While thousands of sites of phosphorylation have been identified in proteomic studies, the vast majority of sites have no known function. Assigning functional roles to the catalog of uncharacterized phosphorylation sites is a key research challenge. Here we present a general approach to address this challenge and apply it to a prototypical signaling pathway, the pheromone response pathway in Saccharomyces cerevisiae. The pheromone pathway includes a mitogen-activated protein kinase (MAPK) cascade activated by a G-protein coupled receptor (GPCR). We used mass spectrometry-based proteomics to identify sites whose phosphorylation changed when the system was active, and evolutionary conservation to assign priority to a list of candidate MAPK regulatory sites. We made targeted alterations in those sites, and measured the effects of the mutations on pheromone pathway output in single cells. Our work identified six new sites that quantitatively tuned system output. We developed simple computational models to find system architectures that recapitulated the quantitative phenotypes of the mutants. Our results identify a number of regulated phosphorylation events that contribute to adjusting the input-output relationship of this model eukaryotic signaling system. We believe this combined approach constitutes a general means not only to reveal modification sites required to turn a pathway on and off, but also those required for more subtle quantitative effects that tune pathway output. Our results further suggest that relatively small quantitative influences from individual regulatory phosphorylation events endow signaling systems with plasticity that evolution may exploit to quantitatively tailor signaling outcomes.
Visualizing the Critique: Integrating Quantitative Reasoning with the Design Process
ERIC Educational Resources Information Center
Weinstein, Kathryn
2017-01-01
In the age of "Big Data," information is often quantitative in nature. The ability to analyze information through the sifting of data has been identified as a core competency for success in navigating daily life and participation in the contemporary workforce. This skill, known as Quantitative Reasoning (QR), is characterized by the…
Foran, Paula
2016-01-01
The aim of this research was to determine if guided operating theatre experience in the undergraduate nursing curricula enhanced surgical knowledge and understanding of nursing care provided outside this specialist area in the pre- and post-operative surgical wards. Using quantitative analyses, undergraduate nurses were knowledge tested on areas of pre- and post-operative surgical nursing in their final semester of study. As much learning occurs in nurses' first year of practice, participants were re-tested again after their Graduate Nurse Program/Preceptorship year. Participants' results were compared to the model of operating room education they had participated in to determine if there was a relationship between the type of theatre education they experienced (if any) and their knowledge of surgical ward nursing. Findings revealed undergraduate nurses receiving guided operating theatre experience had a 76% pass rate compared to 56% with non-guided or no experience (p < 0.001). Graduates with guided operating theatre experience as undergraduates or graduate nurses achieved a 100% pass rate compared to 53% with non-guided or no experience (p < 0.001). The research informs us that undergraduate nurses achieve greater learning about surgical ward nursing via guided operating room experience as opposed to surgical ward nursing experience alone. Copyright © 2015 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewanjee, M.K.; Fuster, V.; Rao, S.A.
1983-05-01
A noninvasive technique has been developed in the dog model for imaging, with a gamma camera, the platelet deposition on Bjoerk-Shiley mitral valve prostheses early postoperatively. At 25 hours after implantation of the prosthesis and 24 hours after intravenous administration of 400 to 500 microCi of platelets labeled with indium-111, the platelet deposition in the sewing ring and perivalvular cardiac tissue can be clearly delineated in a scintiphotograph. An in vitro technique was also developed for quantitation of visceral microemboli in brain, lungs, kidneys, and other tissues. Biodistribution of the labeled platelets was quantitated, and the tissue/blood radioactivity ratio was determined in 22 dogs in four groups: unoperated normal dogs, sham-operated dogs, prosthesis-implanted dogs, and prosthesis-implanted dogs treated with dipyridamole before and aspirin and dipyridamole immediately after operation. Fifteen to 20% of total platelets were consumed as a consequence of the surgical procedure. On quantitation, we found that platelet deposition on the components of the prostheses was significantly reduced in prosthesis-implanted animals treated with dipyridamole and aspirin when compared with prosthesis-implanted, untreated dogs. All prosthesis-implanted animals considered together had a twofold to fourfold increase in tissue/blood radioactivity ratio in comparison with unoperated and sham-operated animals, an indication that the viscera work as filters and trap platelet microemboli that are presumably produced in the region of the mitral valve prostheses. In the dog model, indium-111-labeled platelets thus provide a sensitive marker for noninvasive imaging of platelet deposition on mechanical mitral valve prostheses, in vitro evaluation of platelet microembolism in viscera, in vitro quantitation of surgical consumption of platelets, and evaluation of platelet-inhibitor drugs.
Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F
2010-01-01
The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis and Critical Control Point (HACCP) method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process performed at our blood center. The data analysis showed that the hazards with the highest RPN values, and thus the greatest impact on the process, are loss of dose and loss of tracking; technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to be in compliance with standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
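The RPN scoring described above can be sketched as follows. The hazard names match those discussed in the abstract, but the 1-10 rating scales and the numeric ratings themselves are hypothetical illustrations, not study data.

```python
# Illustrative Risk Priority Number (RPN) scoring: each hazard is rated on
# assumed 1-10 scales for severity, occurrence, and detectability, and
# RPN = severity * occurrence * detectability. Ratings are hypothetical.

def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """Risk Priority Number: higher values flag higher-priority hazards."""
    return severity * occurrence * detectability

hazards = {
    "loss of dose":              (9, 4, 7),
    "loss of tracking":          (8, 3, 8),
    "manual data transcription": (6, 6, 5),
}

# Rank hazards from highest to lowest RPN so corrective actions can be
# targeted at the most harmful ones first.
ranked = sorted(hazards.items(), key=lambda kv: rpn(*kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: RPN = {rpn(*scores)}")
```

With these example ratings, "loss of dose" (RPN = 252) ranks above "loss of tracking" (192) and "manual data transcription" (180), mirroring the abstract's finding that dose and tracking dominate the risk profile.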
NASA Astrophysics Data System (ADS)
Swartwout, Michael Alden
New paradigms in space missions require radical changes in spacecraft operations. In the past, operations were insulated from competitive pressures of cost, quality and time by system infrastructures, technological limitations and historical precedent. However, modern demands now require that operations meet competitive performance goals. One target for improvement is the telemetry downlink, where significant resources are invested to acquire thousands of measurements for human interpretation. This cost-intensive method is used because conventional operations are not based on formal methodologies but on experiential reasoning and incrementally adapted procedures. Therefore, to improve the telemetry downlink it is first necessary to invent a rational framework for discussing operations. This research explores operations as a feedback control problem, develops the conceptual basis for the use of spacecraft telemetry, and presents a method to improve performance. The method is called summarization, a process to make vehicle data more useful to operators. Summarization enables rational trades for telemetry downlink by defining and quantitatively ranking these elements: all operational decisions, the knowledge needed to inform each decision, and all possible sensor mappings to acquire that knowledge. Summarization methods were implemented for the Sapphire microsatellite; conceptual health management and system models were developed and a degree-of-observability metric was defined. An automated tool was created to generate summarization methods from these models. Methods generated using a Sapphire model were compared against the conventional operations plan. Summarization was shown to identify the key decisions and isolate the most appropriate sensors. Secondly, a form of summarization called beacon monitoring was experimentally verified. Beacon monitoring automates the anomaly detection and notification tasks and migrates these responsibilities to the space segment. 
A set of experiments using Sapphire demonstrated significant cost and time savings compared to conventional operations. Summarization is based on rational concepts for defining and understanding operations. Therefore, it enables additional trade studies that were formerly not possible and also can form the basis for future detailed research into spacecraft operations.
NASA Astrophysics Data System (ADS)
Arnold, J.; Gutmann, E. D.; Clark, M. P.; Nijssen, B.; Vano, J. A.; Addor, N.; Wood, A.; Newman, A. J.; Mizukami, N.; Brekke, L. D.; Rasmussen, R.; Mendoza, P. A.
2016-12-01
Climate change narratives for water-resource applications must represent the change signals contextualized by hydroclimatic process variability and uncertainty at multiple scales. Building narratives of plausible change includes assessing uncertainties across GCM structure, internal climate variability, climate downscaling methods, and hydrologic models. Work with this linked modeling chain has dealt mostly with GCM sampling directed separately to either model fidelity (does the model correctly reproduce the physical processes in the world?) or sensitivity (of different model responses to CO2 forcings) or diversity (of model type, structure, and complexity). This leaves unaddressed any interactions among those measures and with other components in the modeling chain used to identify water-resource vulnerabilities to specific climate threats. However, time-sensitive, real-world vulnerability studies typically cannot accommodate a full uncertainty ensemble across the whole modeling chain, so a gap has opened between current scientific knowledge and most routine applications for climate-changed hydrology. To close that gap, the US Army Corps of Engineers, the Bureau of Reclamation, and the National Center for Atmospheric Research are working on techniques to subsample uncertainties objectively across modeling chain components and to integrate results into quantitative hydrologic storylines of climate-changed futures. Importantly, these quantitative storylines are not drawn from a small sample of models or components. Rather, they stem from the more comprehensive characterization of the full uncertainty space for each component. Equally important from the perspective of water-resource practitioners, these quantitative hydrologic storylines are anchored in actual design and operations decisions potentially affected by climate change. 
This talk will describe part of our work characterizing variability and uncertainty across modeling chain components and their interactions using newly developed observational data, models and model outputs, and post-processing tools for making the resulting quantitative storylines most useful in practical hydrology applications.
Ankle fracture spur sign is pathognomonic for a variant ankle fracture.
Hinds, Richard M; Garner, Matthew R; Lazaro, Lionel E; Warner, Stephen J; Loftus, Michael L; Birnbaum, Jacqueline F; Burket, Jayme C; Lorich, Dean G
2015-02-01
The hyperplantarflexion variant ankle fracture is composed of a posterior tibial lip fracture with posterolateral and posteromedial fracture fragments separated by a vertical fracture line. This infrequently reported injury pattern often includes an associated "spur sign" or double cortical density at the inferomedial tibial metaphysis. The objective of this study was to quantitatively establish the association of the ankle fracture spur sign with the hyperplantarflexion variant ankle fracture. Our clinical database of operative ankle fractures was retrospectively reviewed for the incidence of hyperplantarflexion variant and nonvariant ankle fractures as determined by assessment of injury radiographs, preoperative advanced imaging, and intraoperative observation. Injury radiographs were then evaluated for the presence of the spur sign, and association between the spur sign and variant fractures was analyzed. The incidence of the hyperplantarflexion variant fracture among all ankle fractures was 6.7% (43/640). The spur sign was present in 79% (34/43) of variant fractures and absent in all nonvariant fractures, conferring a specificity of 100% in identifying variant fractures. Positive predictive value and negative predictive value were 100% and 99%, respectively. The ankle fracture spur sign was pathognomonic for the hyperplantarflexion variant ankle fracture. It is important to identify variant fractures preoperatively as patient positioning, operative approach, and fixation construct of variant fractures often differ from those employed for osteosynthesis of nonvariant fractures. Identification of the spur sign should prompt acquisition of advanced imaging to formulate an appropriate operative plan to address the variant fracture pattern. Level III, retrospective comparative study. © The Author(s) 2014.
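As a sanity check, the reported diagnostic figures follow directly from the counts given in the abstract (34 of 43 variant fractures showed the spur sign; none of the 597 nonvariant fractures did):

```python
# 2x2-table diagnostic metrics for the spur sign, using the abstract's
# counts: 43 variant fractures among 640 operative ankle fractures, with
# the spur sign present in 34/43 variants and in 0/597 nonvariants.
tp = 34           # spur sign present, variant fracture
fn = 43 - 34      # variant fractures without the spur sign
fp = 0            # spur sign never observed in nonvariant fractures
tn = 640 - 43     # nonvariant fractures, spur sign absent

sensitivity = tp / (tp + fn)   # 34/43 = 79%
specificity = tn / (tn + fp)   # 100%
ppv = tp / (tp + fp)           # 100%
npv = tn / (tn + fn)           # 597/606, reported as 99%

print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} "
      f"ppv={ppv:.2f} npv={npv:.3f}")
```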
Quantitative shear wave ultrasound elastography: initial experience in solid breast masses.
Evans, Andrew; Whelehan, Patsy; Thomson, Kim; McLean, Denis; Brauer, Katrin; Purdie, Colin; Jordan, Lee; Baker, Lee; Thompson, Alastair
2010-01-01
Shear wave elastography is a new method of obtaining quantitative tissue elasticity data during breast ultrasound examinations. The aims of this study were (1) to determine the reproducibility of shear wave elastography, (2) to correlate the elasticity values of a series of solid breast masses with histological findings and (3) to compare shear wave elastography with greyscale ultrasound for benign/malignant classification. Using the Aixplorer® ultrasound system (SuperSonic Imagine, Aix en Provence, France), 53 solid breast lesions were identified in 52 consecutive patients. Two orthogonal elastography images were obtained of each lesion. Observers noted the mean elasticity values in regions of interest (ROI) placed over the stiffest areas on the two elastography images and a mean value was calculated for each lesion. A sub-set of 15 patients had two elastography images obtained by an additional operator. Reproducibility of observations was assessed between (1) two observers analysing the same pair of images and (2) findings from two pairs of images of the same lesion taken by two different operators. All lesions were subjected to percutaneous biopsy. Elastography measurements were correlated with histology results. After preliminary experience with 10 patients a mean elasticity cut-off value of 50 kilopascals (kPa) was selected for benign/malignant differentiation. Greyscale images were classified according to the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS). BI-RADS categories 1-3 were taken as benign while BI-RADS categories 4 and 5 were classified as malignant. Twenty-three benign lesions and 30 cancers were diagnosed on histology. Measurement of mean elasticity yielded an intraclass correlation coefficient of 0.99 for two observers assessing the same pairs of elastography images. Analysis of images taken by two independent operators gave an intraclass correlation coefficient of 0.80. 
Shear wave elastography versus greyscale BI-RADS performance figures were sensitivity: 97% vs 87%, specificity: 83% vs 78%, positive predictive value (PPV): 88% vs 84%, negative predictive value (NPV): 95% vs 82% and accuracy: 91% vs 83%, respectively. These differences were not statistically significant. Shear wave elastography gives quantitative and reproducible information on solid breast lesions with diagnostic accuracy at least as good as greyscale ultrasound with BI-RADS classification. PMID:21122101
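The 50 kPa decision rule described above can be sketched as a minimal classifier. Averaging two orthogonal acquisitions follows the study protocol; the elasticity readings below are hypothetical examples, not study data.

```python
# Minimal sketch of the study's benign/malignant rule: a lesion's mean
# shear wave elasticity (stiffest-ROI means from two orthogonal images,
# averaged) is compared against the 50 kPa cut-off. Readings are invented.

CUTOFF_KPA = 50.0

def classify(mean_elasticity_kpa: float) -> str:
    """Classify a solid breast lesion by the elasticity threshold."""
    if mean_elasticity_kpa >= CUTOFF_KPA:
        return "suspicious (malignant)"
    return "benign"

# Two orthogonal acquisitions per lesion, averaged as in the protocol.
lesion_readings_kpa = [28.4, 31.0]
mean_value = sum(lesion_readings_kpa) / len(lesion_readings_kpa)
print(classify(mean_value))
```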
Quality of life in patients with cleft lip and palate after operation.
Augsornwan, Darawan; Namedang, Sarakull; Pongpagatip, Sumalee; Surakunprapha, Palakorn
2011-12-01
Cleft lip and cleft palate are the most common craniofacial anomalies. Srinagarind Hospital has 150-200 cases each year. The operating process of care requires continuity of care involving a multidisciplinary team. When the patients go to hospital for an operation they experience pain and limited activity, and the food is very different from normal life. When attending school they suffer speech articulation problems and feel shy and isolated, which has a detrimental effect on their lifestyle and quality of life. The main purpose of the present study was to examine quality of life in patients with cleft lip and palate after operation. The present study is descriptive research using qualitative and quantitative approaches. The studied population were patients aged 8-18 years who were admitted at 3C Ward and the Outpatient Department, Srinagarind Hospital. Thirty-three patients were interviewed for the quantitative approach, and a guideline for in-depth interviews with 15 patients was used for the qualitative approach. Quantitative data were analyzed and presented as frequency, percentage and standard deviation. The qualitative data were analyzed through content analysis. Patients consider their quality of life to be at a high level, but in detail they still worry about self-concept and psychological well-being. From the in-depth interviews, patients would like to get further treatment to minimize their scars as soon as possible. Patients consider their quality of life as high level, but they would like to get further treatment.
ERIC Educational Resources Information Center
Popovich, Karen
2012-01-01
This paper describes the process taken to develop a quantitative-based and Excel™-driven course that combines "BOTH" Management Information Systems (MIS) and Decision Science (DS) modeling outcomes and lays the foundation for upper level quantitative courses such as operations management, finance and strategic management. In addition,…
Bergmann, Tobias; Moore, Carrie; Sidney, John; Miller, Donald; Tallmadge, Rebecca; Harman, Rebecca M; Oseroff, Carla; Wriston, Amanda; Shabanowitz, Jeffrey; Hunt, Donald F; Osterrieder, Nikolaus; Peters, Bjoern; Antczak, Douglas F; Sette, Alessandro
2015-11-01
Here we describe a detailed quantitative peptide-binding motif for the common equine leukocyte antigen (ELA) class I allele Eqca-1*00101, present in roughly 25% of Thoroughbred horses. We determined a preliminary binding motif by sequencing endogenously bound ligands. Subsequently, a positional scanning combinatorial library (PSCL) was used to further characterize binding specificity and derive a quantitative motif involving aspartic acid in position 2 and hydrophobic residues at the C-terminus. Using this motif, we selected and tested 9- and 10-mer peptides derived from the equine herpesvirus type 1 (EHV-1) proteome for their capacity to bind Eqca-1*00101. PSCL predictions were very efficient, with a receiver operating characteristic (ROC) curve performance of 0.877, and 87 peptides derived from 40 different EHV-1 proteins were identified with affinities of 500 nM or higher. Quantitative analysis revealed that Eqca-1*00101 has a narrow peptide-binding repertoire, in comparison to those of most human, non-human primate, and mouse class I alleles. Peripheral blood mononuclear cells from six EHV-1-infected, or vaccinated but uninfected, Eqca-1*00101-positive horses were used in IFN-γ enzyme-linked immunospot (ELISPOT) assays. When we screened the 87 Eqca-1*00101-binding peptides for T cell reactivity, only one Eqca-1*00101 epitope, derived from the immediate-early protein ICP4, was identified. Thus, despite its common occurrence in several horse breeds, Eqca-1*00101 is associated with a narrow binding repertoire and a similarly narrow T cell response to an important equine viral pathogen. Intriguingly, these features are shared with other human and macaque major histocompatibility complex (MHC) molecules with a similar specificity for D in position 2 or 3 in their main anchor motif. PMID:26399241
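The quoted ROC curve performance of 0.877 is an area under the curve (AUC), which equals the probability that a randomly chosen true binder outscores a randomly chosen non-binder (the Mann-Whitney formulation). A short sketch with hypothetical prediction scores:

```python
# ROC AUC via the Mann-Whitney pairwise formulation: the fraction of
# (binder, non-binder) pairs in which the binder receives the higher
# predicted score, with ties counted as half. Scores are hypothetical.
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """AUC = P(score_pos > score_neg) + 0.5 * P(tie)."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return float((pos > neg).mean() + 0.5 * (pos == neg).mean())

binders = [0.9, 0.8, 0.75, 0.6]      # predicted scores, true binders
nonbinders = [0.7, 0.4, 0.3, 0.2]    # predicted scores, non-binders
print(round(roc_auc(binders, nonbinders), 4))  # 15 of 16 pairs correct
```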
Xu, Chao; Fang, Jian; Shen, Hui; Wang, Yu-Ping; Deng, Hong-Wen
2018-01-25
Extreme phenotype sampling (EPS) is a broadly-used design to identify candidate genetic factors contributing to the variation of quantitative traits. By enriching the signals in extreme phenotypic samples, EPS can boost the association power compared to random sampling. Most existing statistical methods for EPS examine the genetic factors individually, although many quantitative traits have multiple genetic factors underlying their variation. It is desirable to model the joint effects of genetic factors, which may increase the power and identify novel quantitative trait loci under EPS. The joint analysis of genetic data in high-dimensional situations requires specialized techniques, e.g., the least absolute shrinkage and selection operator (LASSO). Although there are extensive research and application related to LASSO, the statistical inference and testing for the sparse model under EPS remain unknown. We propose a novel sparse model (EPS-LASSO) with hypothesis test for high-dimensional regression under EPS based on a decorrelated score function. Comprehensive simulations show that EPS-LASSO outperforms existing methods, with stable type I error and FDR control. EPS-LASSO can provide consistent power for both low- and high-dimensional situations compared with the other methods dealing with high-dimensional situations. The power of EPS-LASSO is close to other low-dimensional methods when the causal effect sizes are small and is superior when the effects are large. Applying EPS-LASSO to a transcriptome-wide gene expression study for obesity reveals 10 significant body mass index-associated genes. Our results indicate that EPS-LASSO is an effective method for EPS data analysis, which can account for correlated predictors. The source code is available at https://github.com/xu1912/EPSLASSO. hdeng2@tulane.edu. Supplementary data are available at Bioinformatics online. © The Author (2018). Published by Oxford University Press. All rights reserved. 
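The extreme-phenotype-plus-LASSO design described above can be sketched on simulated data: keep only samples in the phenotype tails, then fit a sparse linear model jointly over all predictors. This is a hedged illustration only; the solver below is a generic coordinate-descent LASSO, not the paper's EPS-LASSO decorrelated-score inference procedure, and all data are simulated.

```python
# Sketch of extreme phenotype sampling (EPS) followed by a joint sparse
# fit. The coordinate-descent solver minimizes
#   (1/2) * ||y - X b||^2 + n * alpha * ||b||_1
# via soft-thresholding; data and parameter choices are illustrative.
import numpy as np

def lasso_cd(X, y, alpha, n_iter=100):
    """Plain coordinate-descent LASSO (no intercept, columns not rescaled)."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding predictor j, then soft-threshold.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            beta[j] = np.sign(rho) * max(abs(rho) - n * alpha, 0.0) / col_sq[j]
    return beta

rng = np.random.default_rng(0)
n, p = 500, 50
X = rng.standard_normal((n, p))
true_beta = np.zeros(p)
true_beta[:3] = [1.0, -0.8, 0.6]          # three causal predictors
y = X @ true_beta + rng.standard_normal(n)

# Extreme phenotype sampling: retain only the lower and upper 20% tails
# of the phenotype, where the genetic signal is enriched.
lo, hi = np.quantile(y, [0.2, 0.8])
keep = (y <= lo) | (y >= hi)

beta_hat = lasso_cd(X[keep], y[keep], alpha=0.2)
selected = np.flatnonzero(np.abs(beta_hat) > 1e-8)
print("selected predictors:", selected)
```

With these settings the three causal predictors carry strong tail-enriched signal and survive the L1 penalty, illustrating how EPS boosts joint-model selection relative to random sampling.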
A framework for organizing and selecting quantitative approaches for benefit-harm assessment.
Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M
2012-11-19
Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. 
The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
A framework for organizing and selecting quantitative approaches for benefit-harm assessment
2012-01-01
Performance of Lung Ultrasound in Detecting Peri-Operative Atelectasis after General Anesthesia.
Yu, Xin; Zhai, Zhenping; Zhao, Yongfeng; Zhu, Zhiming; Tong, Jianbin; Yan, Jianqin; Ouyang, Wen
2016-12-01
The aim of this prospective observational study was to evaluate the performance of lung ultrasound (LUS) in detecting post-operative atelectasis in adult patients under general anesthesia. Forty-six patients without pulmonary comorbidities who were scheduled for elective neurosurgery were enrolled in the study. A total of 552 pairs of LUS clips and thoracic computed tomography (CT) images were ultimately analyzed to determine the presence of atelectasis in 12 prescribed lung regions. The accuracy of LUS in detecting peri-operative atelectasis was evaluated with thoracic CT as the gold standard. Levels of agreement between the two observers for LUS and the two observers for thoracic CT were analyzed using the κ reliability test. The quantitative correlation between LUS aeration scores and the volumetric data of atelectasis on thoracic CT was further evaluated. LUS performed reliably in detecting post-operative atelectasis, with a sensitivity of 87.7%, specificity of 92.1% and diagnostic accuracy of 90.8%. The levels of agreement between the two observers for LUS and for thoracic CT were both satisfactory, with κ coefficients of 0.87 (p < 0.0001) and 0.93 (p < 0.0001), respectively. In patients in the supine position, LUS scores were highly correlated with the atelectasis volume on CT (r = 0.58, p < 0.0001). Thus, LUS provides a fast, reliable and radiation-free method to identify peri-operative atelectasis in adults. Copyright © 2016. Published by Elsevier Inc.
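The headline figures above follow directly from a 2x2 confusion matrix. As an illustrative sketch (the per-region counts below are hypothetical, chosen only so the rounded results land on the reported 87.7%/92.1%/90.8% over 552 pairs; the abstract does not give the actual counts), sensitivity, specificity, accuracy, and Cohen's κ can be computed as:

```python
# Illustrative only: counts are hypothetical, not taken from the study.

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and overall accuracy from 2x2 counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

def cohens_kappa(a, b, c, d):
    """Inter-observer agreement from a 2x2 agreement table:
    a, d = counts where the observers agree; b, c = disagreements."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p_chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

sens, spec, acc = diagnostic_metrics(tp=164, fp=29, fn=23, tn=336)
print(round(sens, 3), round(spec, 3), round(acc, 3))  # 0.877 0.921 0.906
```

The κ helper takes the raw agreement table rather than the marginal rates, which is why it needs all four cells.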
Ishikawa, Akira
2017-11-27
Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.
On the Reproducibility of Label-Free Quantitative Cross-Linking/Mass Spectrometry
NASA Astrophysics Data System (ADS)
Müller, Fränze; Fischer, Lutz; Chen, Zhuo Angel; Auchynnikava, Tania; Rappsilber, Juri
2018-02-01
Quantitative cross-linking/mass spectrometry (QCLMS) is an emerging approach to study conformational changes of proteins and multi-subunit complexes. Distinguishing protein conformations requires reproducibly identifying and quantifying cross-linked peptides. Here we analyzed the variation between multiple cross-linking reactions using bis[sulfosuccinimidyl] suberate (BS3)-cross-linked human serum albumin (HSA) and evaluated how reproducible cross-linked peptides can be identified and quantified by LC-MS analysis. To make QCLMS accessible to a broader research community, we developed a workflow that integrates the established software tools MaxQuant for spectra preprocessing, Xi for cross-linked peptide identification, and finally Skyline for quantification (MS1 filtering). Out of the 221 unique residue pairs identified in our sample, 124 were subsequently quantified across 10 analyses with coefficient of variation (CV) values of 14% (injection replica) and 32% (reaction replica). Thus our results demonstrate that the reproducibility of QCLMS is in line with the reproducibility of general quantitative proteomics and we establish a robust workflow for MS1-based quantitation of cross-linked peptides.
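The replica variability reported above is a coefficient of variation (CV). A minimal sketch, with invented intensity values standing in for one cross-linked peptide's quantitation across replicate runs:

```python
# Hypothetical intensities for one cross-linked residue pair; the study's
# actual values are not given in the abstract.
from statistics import mean, stdev

def cv_percent(values):
    """Coefficient of variation: sample standard deviation as % of the mean."""
    return 100 * stdev(values) / mean(values)

injection_replica = [1.00e6, 1.12e6, 0.95e6, 1.05e6]  # same reaction, re-injected
reaction_replica = [1.00e6, 1.40e6, 0.70e6, 1.20e6]   # independent cross-link reactions

print(round(cv_percent(injection_replica), 1))  # 7.0
print(round(cv_percent(reaction_replica), 1))   # 27.8
```

As in the study, independent reactions show a larger CV than repeated injections of the same reaction, since they add chemical variability on top of instrument variability.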
Nucleophosmin/B23 is a proliferate shuttle protein associated with nuclear matrix.
Yun, Jing-Ping; Chew, Eng Ching; Liew, Choong-Tsek; Chan, John Y H; Jin, Mei-Lin; Ding, Ming-Xiao; Fai, Yam Hin; Li, H K Richard; Liang, Xiao-Man; Wu, Qiu-Liang
2003-12-15
A better understanding of the nucleolar phosphoprotein B23 may help elucidate the functional interrelationship between nuclear organization and gene expression. In the present study, B23 expression was investigated in regenerating hepatocytes at different times (days 0, 1, 2, 3, 4 and 7) after partial hepatectomy in rats, using immunohistochemistry and Western blot analysis. In a second experiment, B23 was identified in regenerating hepatocytes and HepG2 cells (a hepatoblastoma cell line) by immunolabeling and two-dimensional (2-D) gel electrophoresis after sequential extraction with detergents, nuclease, and salt. The results showed that B23 expression in hepatocytes underwent both relocation and quantitative change during liver regeneration. Immunochemical localization showed that B23 moved from the nucleoli of stationary-stage hepatocytes to the nucleoplasm, cytoplasm, mitotic spindles, and mitotic chromosomes of hepatocytes in regenerating livers. Its level increased progressively to a peak at day 3 post-operation and declined gradually to the normal level by day 7. B23 was detected in the nuclear matrix protein (NMP) fraction extracted from regenerating hepatocytes and HepG2 cells, and was identified with an isoelectric point (pI) of 5.1 and a molecular weight of 40 kDa. These results indicate that B23 is a proliferation-associated shuttle protein involved in the cell cycle and cell proliferation and associated with the nuclear matrix. Copyright 2003 Wiley-Liss, Inc.
Conflagration Analysis System II: Bibliography.
1985-04-01
Therefore, it is important to examine both the reinforcement and the supplemental considerations for the quantitative methods for conflagration...and the meaningful quantitative factors for conflagration analysis are determined, the relevant literature will be brought into the mainstream of the... quantitative methods. Fire Development in Multiple Structures From a purely analytical view, the research identified in the literature fire development in
ERIC Educational Resources Information Center
Reid, Jackie; Wilkes, Janelle
2016-01-01
Mapping quantitative skills across the science, technology, engineering and mathematics (STEM) curricula will help educators identify gaps and duplication in the teaching, practice and assessment of the necessary skills. This paper describes the development and implementation of quantitative skills mapping tools for courses in STEM at a regional…
ERIC Educational Resources Information Center
May-Vollmar, Kelly
2017-01-01
Purpose: The purpose of this quantitative correlation study was to identify whether there is a relationship between emotional intelligence and effective leadership practices, specifically with school administrators in Southern California K-12 public schools. Methods: This study was conducted using a quantitative descriptive design, correlation…
Multi-sensor Improved Sea Surface Temperature (MISST) for GODAE
2007-09-30
NAVOCEANO has improved on its methodology to add retrieval error information to the US Navy operational data stream. Quantitative estimates of...hycom.rsmas.miami.edu/ “ POSITIV : Prototype Operational System – ISAR – Temperature Instrumentation for the VOS fleet” CIRA/CSU Joint Hurricane Testbed
Monofluorophosphate is a selective inhibitor of respiratory sulfate-reducing microorganisms.
Carlson, Hans K; Stoeva, Magdalena K; Justice, Nicholas B; Sczesnak, Andrew; Mullan, Mark R; Mosqueda, Lorraine A; Kuehl, Jennifer V; Deutschbauer, Adam M; Arkin, Adam P; Coates, John D
2015-03-17
Despite the environmental and economic cost of microbial sulfidogenesis in industrial operations, few compounds are known as selective inhibitors of respiratory sulfate-reducing microorganisms (SRM), and no study has systematically and quantitatively evaluated the selectivity and potency of SRM inhibitors. Using general, high-throughput assays to quantitatively evaluate inhibitor potency and selectivity in a model sulfate-reducing microbial ecosystem, as well as inhibitor specificity for the sulfate reduction pathway in a model SRM, we screened a panel of inorganic oxyanions. We identified several SRM-selective inhibitors, including selenate, selenite, tellurate, tellurite, nitrate, nitrite, perchlorate, chlorate, monofluorophosphate, vanadate, molybdate, and tungstate. Monofluorophosphate (MFP) was not previously known as a selective SRM inhibitor, but has promising characteristics including low toxicity to eukaryotic organisms, high stability at circumneutral pH, utility as an abiotic corrosion inhibitor, and low cost. MFP remains a potent inhibitor of SRM growing by fermentation, and MFP is tolerated by nitrate- and perchlorate-reducing microorganisms. For SRM inhibition, MFP is synergistic with nitrite and chlorite, and could enhance the efficacy of nitrate or perchlorate treatments. Finally, MFP inhibition is multifaceted: both inhibition of the central sulfate reduction pathway and release of cytoplasmic fluoride ion are implicated in the mechanism of MFP toxicity.
Assessment of physical workload in boiler operations.
Rodrigues, Valéria Antônia Justino; Braga, Camila Soares; Campos, Julio César Costa; Souza, Amaury Paulo de; Minette, Luciano José; Sensato, Guilherme Luciano; Moraes, Angelo Casali de; Silva, Emília Pio da
2012-01-01
Wood-fired boilers are fairly common equipment for steam generation and energy production in small industries. Boiler activities are considered dangerous and heavy, mainly due to the risk of explosions and the lack of mechanization of the process. This study assessed the physical workload to which boiler operators are subjected during the workday. Working conditions were assessed through quantitative and qualitative measurements. A heart rate monitor, a wet-bulb globe thermometer (WBGT), a tape measure and a digital infrared camera were the instruments used to collect the quantitative data. The Nordic Questionnaire and the Painful Areas Diagram were used to relate the boiler operators' health problems to the activity. The study concluded that boiler work may cause body pain of varying intensity, muscle fatigue and illness due to excessive loads and exposure to heat. The research contributed to improving the boiler operator's workplace and working conditions.
Nang, Roberto N; Monahan, Felicia; Diehl, Glendon B; French, Daniel
2015-04-01
Many institutions collect reports in databases to make important lessons learned available to their members. The Uniformed Services University of the Health Sciences collaborated with the Peacekeeping and Stability Operations Institute to conduct a descriptive and qualitative analysis of global health engagements (GHEs) contained in the Stability Operations Lessons Learned and Information Management System (SOLLIMS). This study used a summative qualitative content analysis approach involving six steps: (1) a comprehensive search; (2) a two-stage reading and screening process to identify first-hand, health-related records; (3) qualitative and quantitative data analysis using MAXQDA, a software program; (4) a word cloud to illustrate word frequencies and interrelationships; (5) coding of individual themes and validation of the coding scheme; and (6) identification of relationships in the data and overarching lessons learned. The individual codes with the largest number of coded text segments included planning, personnel, interorganizational coordination, communication/information sharing, and resources/supplies. When compared to the Department of Defense's (DoD's) evolving GHE principles and capabilities, the SOLLIMS coding scheme aligned well with the list of GHE capabilities developed by the Department of Defense Global Health Working Group. The results of this study will inform practitioners of global health and encourage additional qualitative analysis of other lessons-learned databases. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
Catanuto, Giuseppe; Taher, Wafa; Rocco, Nicola; Catalano, Francesca; Allegra, Dario; Milotta, Filippo Luigi Maria; Stanco, Filippo; Gallo, Giovanni; Nava, Maurizio Bruno
2018-03-20
Breast shape is usually defined by qualitative assessment (full, flat, ptotic) or by estimates, such as volume or distances between reference points, that cannot describe it reliably. We quantitatively describe breast shape with two parameters derived from a statistical methodology known as principal component analysis (PCA). We created a heterogeneous dataset of breast shapes acquired with a commercial infrared 3-dimensional scanner, on which PCA was performed. We plotted on a Cartesian plane the two highest values of PCA for each breast (principal components 1 and 2). The methodology was tested on a preoperative and postoperative surgical case, and test-retest was performed by two operators. The first two principal components derived from PCA are able to characterize the shape of the breasts included in the dataset. The test-retest demonstrated that different operators obtain very similar values of PCA. The system is also able to identify major changes between the preoperative and postoperative stages of a two-stage reconstruction. Even minor changes were correctly detected by the system. This methodology can reliably describe the shape of a breast. An expert operator and a newly trained operator can reach similar results in a test-retest validation. Once developed and after further validation, this methodology could be employed as a tool for outcome evaluation, auditing, and benchmarking.
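The core computation described above, projecting each shape onto its first two principal components, can be sketched with the standard library alone if each shape is reduced to two features, so the 2x2 covariance eigenproblem has a closed form. This is a toy illustration under that assumption, not the authors' 3-D scanner pipeline:

```python
# Toy PCA sketch: each "shape" is an invented 2-D feature vector.
import math

def pca_2d(points):
    """Principal components of 2-D feature vectors via the closed-form
    eigendecomposition of the 2x2 sample covariance matrix."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    centered = [(x - mx, y - my) for x, y in points]
    sxx = sum(x * x for x, _ in centered) / (n - 1)
    syy = sum(y * y for _, y in centered) / (n - 1)
    sxy = sum(x * y for x, y in centered) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
    lam1, lam2 = tr / 2 + disc, tr / 2 - disc   # eigenvalues, lam1 >= lam2
    # leading eigenvector; for symmetric [[a, b], [b, c]] it is (b, lam - a)
    v1 = (sxy, lam1 - sxx) if abs(sxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(*v1)
    v1 = (v1[0] / norm, v1[1] / norm)
    v2 = (-v1[1], v1[0])                        # orthogonal second component
    scores = [(x * v1[0] + y * v1[1], x * v2[0] + y * v2[1])
              for x, y in centered]
    return (lam1, lam2), scores

# four hypothetical "shapes"; here they are collinear, so PC2 carries
# essentially no variance
shapes = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
(lam1, lam2), pc_scores = pca_2d(shapes)
print(round(lam1, 3), abs(lam2) < 1e-6)
```

Plotting the per-shape `(PC1, PC2)` score pairs on a Cartesian plane corresponds to the visualization the abstract describes.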
Some effects of adverse weather conditions on performance of airplane antiskid braking systems
NASA Technical Reports Server (NTRS)
Horne, W. B.; Mccarty, J. L.; Tanner, J. A.
1976-01-01
The performance of current antiskid braking systems operating under adverse weather conditions was analyzed in an effort to both identify the causes of locked-wheel skids which sometimes occur when the runway is slippery and to find possible solutions to this operational problem. This analysis was made possible by the quantitative test data provided by recently completed landing research programs using fully instrumented flight test airplanes and was further supported by tests performed at the Langley aircraft landing loads and traction facility. The antiskid system logic for brake control and for both touchdown and locked-wheel protection is described and its response behavior in adverse weather is discussed in detail with the aid of available data. The analysis indicates that the operational performance of the antiskid logic circuits is highly dependent upon wheel spin-up acceleration and can be adversely affected by certain pilot braking inputs when accelerations are low. Normal antiskid performance is assured if the tire-to-runway traction is sufficient to provide high wheel spin-up accelerations or if the system is provided a continuous, accurate ground speed reference. The design of antiskid systems is complicated by the necessity for tradeoffs between tire braking and cornering capabilities, both of which are necessary to provide safe operations in the presence of cross winds, particularly under slippery runway conditions.
Lin, A-S; Chang, S-S; Lin, S-H; Peng, Y-C; Hwu, H-G; Chen, W J
2015-07-01
Schizophrenia patients have higher rates of minor physical anomalies (MPAs) than controls, particularly in the craniofacial region; this difference lends support to the neurodevelopmental model of schizophrenia. Whether MPAs are associated with treatment response in schizophrenia remains unknown. The aim of this case-control study was to investigate whether more MPAs and specific quantitative craniofacial features in patients with schizophrenia are associated with operationally defined treatment resistance. A comprehensive scale, consisting of both qualitatively measured MPAs and quantitative measurements of the head and face, was applied in 108 patients with treatment-resistant schizophrenia (TRS) and in 104 non-TRS patients. Treatment resistance was determined according to the criteria proposed by Conley & Kelly (2001; Biological Psychiatry 50, 898-911). Our results revealed that patients with TRS had higher MPA scores in the mouth region than non-TRS patients, and the two groups also differed in four quantitative measurements (facial width, lower facial height, facial height, and length of the philtrum), after controlling for multiple comparisons using the false discovery rate. Among these dysmorphological measurements, three MPA item types (mouth MPA score, facial width, and lower facial height) and earlier disease onset were further demonstrated to have good discriminant validity in distinguishing TRS from non-TRS patients in a multivariable logistic regression analysis, with an area under the curve of 0.84 and a generalized R² of 0.32. These findings suggest that certain MPAs and craniofacial features may serve as useful markers for identifying TRS at early stages of the illness.
Volumes Learned: It Takes More Than Size to "Size Up" Pulmonary Lesions.
Ma, Xiaonan; Siegelman, Jenifer; Paik, David S; Mulshine, James L; St Pierre, Samantha; Buckler, Andrew J
2016-09-01
This study aimed to review the current understanding and capabilities regarding use of imaging for noninvasive lesion characterization and its relationship to lung cancer screening and treatment. Our review of the state of the art was broken down into questions about the different lung cancer image phenotypes being characterized, the role of imaging and requirements for increasing its value with respect to increasing diagnostic confidence and quantitative assessment, and a review of the current capabilities with respect to those needs. The preponderance of the literature has so far been focused on the measurement of lesion size, with increasing contributions being made to determine the formal performance of scanners, measurement tools, and human operators in terms of bias and variability. Concurrently, an increasing number of investigators are reporting utility and predictive value of measures other than size, and sensitivity and specificity is being reported. Relatively little has been documented on quantitative measurement of non-size features with corresponding estimation of measurement performance and reproducibility. The weight of the evidence suggests characterization of pulmonary lesions built on quantitative measures adds value to the screening for, and treatment of, lung cancer. Advanced image analysis techniques may identify patterns or biomarkers not readily assessed by eye and may also facilitate management of multidimensional imaging data in such a way as to efficiently integrate it into the clinical workflow. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Fundamental quantitative security in quantum key generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yuen, Horace P.
2010-12-15
We analyze the fundamental security significance of the quantitative criteria on the final generated key K in quantum key generation, including the quantum criterion d, the attacker's mutual information on K, and the statistical distance between her distribution on K and the uniform distribution. For operational significance a criterion has to produce a guarantee on the attacker's probability of correctly estimating some portions of K from her measurement, in particular her maximum probability of identifying the whole K. We distinguish between the raw security of K, when the attacker just gets at K before it is used in a cryptographic context, and its composition security, when the attacker may gain further information during its actual use to help get at K. We compare both of these securities of K to those obtainable from conventional key expansion with a symmetric key cipher. It is pointed out that a common belief in the superior security of a quantum generated K is based on an incorrect interpretation of d which cannot be true, and the security significance of d is uncertain. Generally, the quantum key distribution key K has no composition security guarantee and its raw security guarantee from concrete protocols is worse than that of conventional ciphers. Furthermore, for both raw and composition security there is an exponential catch-up problem that would make it difficult to quantitatively improve the security of K in a realistic protocol. Some possible ways to deal with the situation are suggested.
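One of the criteria discussed above, the statistical (total variation) distance between the attacker's distribution on K and the uniform distribution, is straightforward to compute; the toy 3-bit key distribution below is invented purely for illustration:

```python
# Illustrative sketch: total variation distance to the uniform distribution
# over n-bit keys. The biased attacker distribution is made up.

def statistical_distance(p, q):
    """Total variation distance between two distributions on the same support."""
    return 0.5 * sum(abs(p[k] - q[k]) for k in p)

n_bits = 3
keys = range(2 ** n_bits)
uniform = {k: 1 / 2 ** n_bits for k in keys}
# attacker slightly favors key 0 in this made-up example
biased = {k: (0.20 if k == 0 else 0.80 / 7) for k in keys}

d = statistical_distance(biased, uniform)
print(round(d, 4))  # 0.075
```

A distance of 0 would mean the attacker's distribution is exactly uniform; the paper's point is that a small value of such a criterion does not by itself bound the attacker's probability of guessing K correctly.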
Common Distribution Patterns of Marsupials Related to Physiographical Diversity in Venezuela
Ventura, Jacint; Bagaria, Guillem; Sans-Fuentes, Maria Assumpció; Pérez-Hernández, Roger
2014-01-01
The aim of this study is to identify significant biotic regions (groups of areas with similar biotas) and biotic elements (groups of taxa with similar distributions) for the marsupial fauna in a part of northern South America using physiographical areas as Operational Geographical Units (OGUs). We considered Venezuela a good model to elucidate this issue because of its high diversity in landscapes and the relatively vast amount of information available on the geographical distribution of marsupial species. Based on the presence-absence of 33 species in 15 physiographical sub-regions (OGUs) we identified Operational Biogeographical Units (OBUs) and chorotypes using a quantitative analysis that tested statistical significance of the resulting groups. Altitudinal and/or climatic trends in the OBUs and chorotypes were studied using a redundancy analysis. The classification method revealed four OBUs. Strong biotic boundaries separated: i) the xerophytic zone of the Continental coast (OBU I); ii) the sub-regions north of the Orinoco River (OBU III and IV); and those south to the river (OBU II). Eleven chorotypes were identified, four of which included a single species with a restricted geographic distribution. As for the other chorotypes, three main common distribution patterns have been inferred: i) species from the Llanos and/or distributed south of the Orinoco River; ii) species exclusively from the Andes; and iii) species that either occur exclusively north of the Orinoco River or that show a wide distribution throughout Venezuela. Mean altitude, evapotranspiration and precipitation of the driest month, and temperature range allowed us to characterize environmentally most of the OBUs and chorotypes obtained. PMID:24806452
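The presence-absence classification described above rests on a similarity measure between species' distribution profiles across sub-regions. A minimal sketch using Jaccard similarity (a common choice for 0/1 data, assumed here for illustration; the species and sub-regions below are invented, and the paper's own clustering and significance testing are not reproduced):

```python
# Toy sketch: species sharing highly similar presence-absence profiles
# across sub-regions (OGUs) would fall into the same chorotype.

def jaccard(a, b):
    """Jaccard similarity of two presence-absence (0/1) vectors."""
    both = sum(1 for x, y in zip(a, b) if x and y)
    either = sum(1 for x, y in zip(a, b) if x or y)
    return both / either if either else 1.0

# rows: species; columns: 5 hypothetical sub-regions
profiles = {
    "sp_A": [1, 1, 0, 0, 0],
    "sp_B": [1, 1, 1, 0, 0],
    "sp_C": [0, 0, 0, 1, 1],
}
print(round(jaccard(profiles["sp_A"], profiles["sp_B"]), 2))  # 0.67 (similar)
print(round(jaccard(profiles["sp_A"], profiles["sp_C"]), 2))  # 0.0 (disjoint)
```

Feeding such pairwise similarities into a hierarchical clustering, with a significance test on the resulting groups, mirrors the chorotype identification the abstract outlines.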
[Quantitative determination of blood regurgitation via the mitral valve].
Sandrikov, V A
1981-11-01
A method for quantitative determination of blood regurgitation through the mitral valve is considered. A verification experiment on 5 animals, with determination of the correlation coefficient between true and predicted regurgitation, showed it to be 0.855 on average. In addition, observations were made on 621 patients with varying cardiac pathology. A quantitative characterization of blood regurgitation in patients with mitral defects is given. The method can be used not only under operative conditions, but also during catheterization of the cardiac cavities without administering a radiopaque substance.
NASA Astrophysics Data System (ADS)
Martel, Anne L.
2004-04-01
In order to extract quantitative information from dynamic contrast-enhanced MR images (DCE-MRI) it is usually necessary to identify an arterial input function. This is not a trivial problem if there are no major vessels present in the field of view. Most existing techniques rely on operator intervention or use various curve parameters to identify suitable pixels, but these are often specific to the anatomical region or the acquisition method used. They also require the signal from several pixels to be averaged in order to improve the signal-to-noise ratio; however, this introduces errors due to partial volume effects. We have described previously how factor analysis can be used to automatically separate arterial and venous components from DCE-MRI studies of the brain, but although that method works well for single-slice images through the brain when the blood-brain barrier is intact, it runs into problems for multi-slice images with more complex dynamics. This paper describes a factor analysis method that is more robust in such situations and is relatively insensitive to the number of physiological components present in the data set. The technique is very similar to that used to identify spectral end-members from multispectral remote sensing images.
Quality and Control of Water Vapor Winds
NASA Technical Reports Server (NTRS)
Jedlovec, Gary J.; Atkinson, Robert J.
1996-01-01
Water vapor imagery from geostationary satellites such as GOES, Meteosat, and GMS provides synoptic views of dynamical events on a continual basis. Because the imagery represents a non-linear combination of mid- and upper-tropospheric thermodynamic parameters (three-dimensional variations in temperature and humidity), video loops of these image products provide enlightening views of regional flow fields, the movement of tropical and extratropical storm systems, the transfer of moisture between hemispheres and from the tropics to the mid-latitudes, and the dominance of high pressure systems over particular regions of the Earth. Despite the obvious larger scale features, the water vapor imagery contains significant image variability down to the single 8 km GOES pixel. These features can be quantitatively identified and tracked from one time to the next using various image processing techniques. Merrill et al. (1991), Hayden and Schmidt (1992), and Laurent (1993) have documented the operational procedures and capabilities of NOAA and ESOC to produce cloud and water vapor winds. These techniques employ standard correlation and template matching approaches to wind tracking and use qualitative and quantitative procedures to eliminate bad wind vectors from the wind data set. Techniques have also been developed to improve the quality of the operational winds through robust editing procedures (Hayden and Veldon 1991). These quality and control approaches have limitations, are often subjective, and constrain wind variability to be consistent with model-derived wind fields. This paper describes research focused on the refinement of objective quality and control parameters for water vapor wind vector data sets. New quality and control measures are developed and employed to provide a more robust wind data set for climate analysis, data assimilation studies, as well as operational weather forecasting.
The parameters are applicable to cloud-tracked winds as well with minor modifications. The improvement in winds through use of these new quality and control parameters is measured without the use of rawinsonde or modeled wind field data and compared with other approaches.
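The correlation and template-matching step mentioned above can be sketched in one dimension: slide a feature template across the later image and keep the offset with the highest normalized cross-correlation. The tiny arrays are made up for demonstration and stand in for pixel windows of water vapor imagery:

```python
# Illustrative 1-D template matching via normalized cross-correlation (NCC).

def ncc(window, template):
    """Normalized cross-correlation of two equal-length sequences."""
    n = len(template)
    mw, mt = sum(window) / n, sum(template) / n
    num = sum((w - mw) * (t - mt) for w, t in zip(window, template))
    den = (sum((w - mw) ** 2 for w in window) *
           sum((t - mt) ** 2 for t in template)) ** 0.5
    return num / den if den else 0.0

def best_match(image, template):
    """Offset at which the template correlates best with the image."""
    n = len(template)
    scores = [ncc(image[i:i + n], template) for i in range(len(image) - n + 1)]
    return max(range(len(scores)), key=scores.__getitem__)

template = [3, 9, 4]              # feature seen in the earlier image
image = [1, 2, 3, 3, 9, 4, 2, 1]  # later image: the feature has shifted
print(best_match(image, template))  # 3
```

The displacement between the template's original position and the best-match offset, divided by the time between images, gives the wind vector estimate; the quality-control parameters the paper develops would then screen such vectors.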
Multi-Sensor Improved Sea Surface Temperature (MISST) for GODAE
2007-01-01
new data streams. NAVOCEANO has improved on its methodology to add retrieval error information to the US Navy operational data stream. Quantitative ...HYCOM)”: http://hycom.rsmas.miami.edu/ “ POSITIV : Prototype Operational System – ISAR – Temperature Instrumentation for the VOS fleet” CIRA/CSU Joint
Multi-Sensor Improved Sea Surface Temperature (MISST) for GODAE
2008-01-01
its methodology to add retrieval error information to the US Navy operational data stream. Quantitative estimates of reliability are added to...hycom.rsmas.miami.edu/ “ POSITIV : Prototype Operational System – ISAR – Temperature Instrumentation for the VOS fleet” CIRA/CSU Joint Hurricane Testbed project
David Coblentz; Kurt H. Riitters
2005-01-01
The relationship between topography and biodiversity is well documented in the Madrean Archipelago. However, despite this recognition, most biogeographical studies concerning the role of topography have relied primarily on a qualitative description of the landscape. Using an algorithm that operates on a high-resolution digital elevation model we present a quantitative...
ERIC Educational Resources Information Center
Brückner, Sebastian; Pellegrino, James W.
2016-01-01
The Standards for Educational and Psychological Testing indicate that validation of assessments should include analyses of participants' response processes. However, such analyses typically are conducted only to supplement quantitative field studies with qualitative data, and seldom are such data connected to quantitative data on student or item…
Grindstaff, Quirinus G.
1992-01-01
Described is a new gas chromatograph-mass spectrometer (GC/MS) system and method for quantitative analysis of reactive chemical compounds. All components of such a GC/MS system external to the oven of the gas chromatograph are programmably temperature controlled to operate at a volatilization temperature specific to the compound(s) sought to be separated and measured.
ERIC Educational Resources Information Center
Santos, Michael R.; Hu, Aidong; Jordan, Douglas
2014-01-01
The authors offer a classification technique to make a quantitative skills rubric more operational, with the groupings of multiple-choice questions to match the student learning levels in knowledge, calculation, quantitative reasoning, and analysis. The authors applied this classification technique to the mid-term exams of an introductory finance…
Automatic Gleason grading of prostate cancer using quantitative phase imaging and machine learning
NASA Astrophysics Data System (ADS)
Nguyen, Tan H.; Sridharan, Shamira; Macias, Virgilia; Kajdacsy-Balla, Andre; Melamed, Jonathan; Do, Minh N.; Popescu, Gabriel
2017-03-01
We present an approach for automatic diagnosis of tissue biopsies. Our methodology consists of a quantitative phase imaging tissue scanner and machine learning algorithms to process these data. We illustrate the performance by automatic Gleason grading of prostate specimens. The imaging system operates on the principle of interferometry and, as a result, reports on the nanoscale architecture of the unlabeled specimen. We use these data to train a random forest classifier to learn textural behaviors of prostate samples and classify each pixel in the image into different classes. Automatic diagnosis results were computed from the segmented regions. By combining morphological features with quantitative information from the glands and stroma, logistic regression was used to discriminate regions with Gleason grade 3 versus grade 4 cancer in prostatectomy tissue. The overall accuracy of this classification derived from a receiver operating curve was 82%, which is in the range of human error when interobserver variability is considered. We anticipate that our approach will provide a clinically objective and quantitative metric for Gleason grading, allowing us to corroborate results across instruments and laboratories and feed the computer algorithms for improved accuracy.
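The logistic-regression step used above to discriminate Gleason grade 3 from grade 4 regions can be sketched with a plain gradient-descent fit. The two synthetic features and the training settings below are illustrative assumptions, not the authors' actual gland and stroma features.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-descent logistic regression; the bias is folded
    into the weight vector via a constant feature."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(X, w):
    """Probability of the positive class (e.g., grade 4) per region."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
```

Thresholding the predicted probability at different operating points traces out the receiver operating characteristic curve from which an accuracy figure such as the reported 82% can be derived.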
Planning for avian flu disruptions on global operations: a DMAIC case study.
Kumar, Sameer
2012-01-01
The author aims to assess the spread of avian flu, its impact on businesses operating in the USA and overseas, and the measures required for corporate preparedness. The Six Sigma DMAIC process is used to analyze avian flu's impact and how an epidemic could affect large US business operations worldwide. Wal-Mart and Dell Computers were chosen because one specializes in retail and the other in manufacturing. The study identifies avian flu pandemic risks, including failure modes in Wal-Mart's and Dell Computers' global operations. It reveals the factors that reinforce the pandemic's negative impact on company global supply chains, and it also uncovers factors that balance that impact. Avian flu's irregularity affects the research outcomes because its spread could fluctuate with the many factors that come into play. Further, the potential cost to manufacturers and other supply chain partners is relatively unknown; as a relatively new phenomenon, quantitative data were not available to determine immediate costs. In this decade, the avian influenza H5N1 virus has killed millions of poultry in Asia, Europe and Africa. This flu strain can infect and kill humans who come into contact with the virus. An avian influenza H5N1 outbreak could have a devastating effect on global food supply, business services and business operations. The study provides guidance on what global business operation managers can do to prepare for such events, as well as on how avian flu's progression to a pandemic can disrupt such operations. This study raises awareness of avian flu's impact on businesses and humans and highlights the need to create contingency plans for corporate preparedness to avoid incurring losses.
An expert system/ion trap mass spectrometry approach for life support systems monitoring
NASA Technical Reports Server (NTRS)
Palmer, Peter T.; Wong, Carla M.; Yost, Richard A.; Johnson, Jodie V.; Yates, Nathan A.; Story, Michael
1992-01-01
Efforts to develop sensor and control system technology to monitor air quality for life support have resulted in the development and preliminary testing of a concept based on expert systems and ion trap mass spectrometry (ITMS). An ITMS instrument provides the capability to identify and quantitate a large number of suspected contaminants at trace levels through the use of a variety of multidimensional experiments. An expert system provides specialized knowledge for control, analysis, and decision making. The system is intended for real-time, on-line, autonomous monitoring of air quality. The key characteristics of the system, performance data and analytical capabilities of the ITMS instrument, the design and operation of the expert system, and results from preliminary testing of the system for trace contaminant monitoring are described.
System impacts of solar dynamic and growth power systems on space station
NASA Technical Reports Server (NTRS)
Farmer, J. T.; Cuddihy, W. F.; Lovelace, U. M.; Badi, D. M.
1986-01-01
Concepts for the 1990's space station envision an initial operational capability with electrical power output requirements of approximately 75 kW and growth power requirements in the range of 300 kW over a period of a few years. Photovoltaic and solar dynamic power generation techniques are contenders for supplying this power to the space station. A study was performed to identify growth power subsystem impacts on other space station subsystems. Subsystem interactions that might suggest early design changes for the space station were emphasized. Quantitative analyses of the effects of power subsystem mass and projected area on space station controllability and reboost requirements were conducted for a range of growth station configurations. Impacts on space station structural dynamics as a function of power subsystem growth were also considered.
NASA Astrophysics Data System (ADS)
Zhukovskiy, Yu L.; Korolev, N. A.; Babanova, I. S.; Boikov, A. V.
2017-10-01
This article is devoted to predicting residual life based on an estimate of the technical state of an induction motor. The proposed system increases the accuracy and completeness of diagnostics by using an artificial neural network (ANN), and also identifies and predicts faulty states of electrical equipment in dynamics. The outputs of the proposed technical-state estimation system are probability diagrams of the technical state and a quantitative evaluation of the residual life, taking into account electrical, vibrational and indirect parameters as well as detected defects. Based on the evaluation of the technical condition and the prediction of the residual life, a decision is made on changing the operating and maintenance modes of the electric motors.
NASA Astrophysics Data System (ADS)
Armstrong, Julian J.; Leigh, Matthew S.; Walton, Ian D.; Zvyagin, Andrei V.; Alexandrov, Sergey A.; Schwer, Stefan; Sampson, David D.; Hillman, David R.; Eastwood, Peter R.
2003-07-01
We describe a long-range optical coherence tomography system for size and shape measurement of large hollow organs in the human body. The system employs a frequency-domain optical delay line of a configuration that enables the combination of high-speed operation with long scan range. We compare the achievable maximum delay of several delay line configurations, and identify the configurations with the greatest delay range. We demonstrate the use of one such long-range delay line in a catheter-based optical coherence tomography system and present profiles of the human upper airway and esophagus in vivo with a radial scan range of 26 millimeters. Such quantitative upper airway profiling should prove valuable in investigating the pathophysiology of airway collapse during sleep (obstructive sleep apnea).
Distributed photovoltaic systems: Utility interface issues and their present status
NASA Technical Reports Server (NTRS)
Hassan, M.; Klein, J.
1981-01-01
Major technical issues involving the integration of distributed photovoltaics (PV) into electric utility systems are defined and their impacts are described quantitatively. An extensive literature search, interviews, and analysis yielded information about the work in progress and highlighted problem areas in which additional work and research are needed. The findings from the literature search were used to determine whether satisfactory solutions to the problems exist or whether satisfactory approaches to a solution are underway. It was discovered that very few standards, specifications, or guidelines currently exist that will aid industry in integrating PV into the utility system. Specific areas of concern identified are: (1) protection, (2) stability, (3) system unbalance, (4) voltage regulation and reactive power requirements, (5) harmonics, (6) utility operations, (7) safety, (8) metering, and (9) distribution system planning and design.
ERIC Educational Resources Information Center
Ling, Chris D.; Bridgeman, Adam J.
2011-01-01
Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…
Quantitation of absorbed or deposited materials on a substrate that measures energy deposition
Grant, Patrick G.; Bakajin, Olgica; Vogel, John S.; Bench, Graham
2005-01-18
This invention provides a system and method for measuring an energy differential that correlates to a quantitative measurement of the mass of an applied localized material. Such a system and method remains compatible with other methods of analysis, such as quantitating the elemental or isotopic content, identifying the material, or using the material in biochemical analysis.
Lin, Meng-Lay; Patel, Hetal; Remenyi, Judit; Banerji, Christopher R S; Lai, Chun-Fui; Periyasamy, Manikandan; Lombardo, Ylenia; Busonero, Claudia; Ottaviani, Silvia; Passey, Alun; Quinlan, Philip R; Purdie, Colin A; Jordan, Lee B; Thompson, Alastair M; Finn, Richard S; Rueda, Oscar M; Caldas, Carlos; Gil, Jesus; Coombes, R Charles; Fuller-Pace, Frances V; Teschendorff, Andrew E; Buluwela, Laki; Ali, Simak
2015-08-28
The Nuclear Receptor (NR) superfamily of transcription factors comprises 48 members, several of which have been implicated in breast cancer. Most important is estrogen receptor-α (ERα), which is a key therapeutic target. ERα action is facilitated by co-operativity with other NRs, and there is evidence that ERα function may be recapitulated by other NRs in ERα-negative breast cancer. In order to examine the inter-relationships between nuclear receptors, and to obtain evidence for previously unsuspected roles for any NRs, we undertook quantitative RT-PCR and bioinformatics analysis to examine their expression in breast cancer. While most NRs were expressed, bioinformatic analyses differentiated tumours into distinct prognostic groups that were validated by analyzing public microarray data sets. Although ERα and progesterone receptor were dominant in distinguishing prognostic groups, other NRs strengthened these groups. Clustering analysis identified several family members with potential importance in breast cancer. Specifically, RORγ is identified as being co-expressed with ERα, whilst several NRs are preferentially expressed in ERα-negative disease, with TLX expression being prognostic in this subtype. Functional studies demonstrated the importance of TLX in regulating growth and invasion in ERα-negative breast cancer cells.
Selecting quantitative water management measures at the river basin scale in a global change context
NASA Astrophysics Data System (ADS)
Girard, Corentin; Rinaudo, Jean-Daniel; Caballero, Yvan; Pulido-Velazquez, Manuel
2013-04-01
One of the main challenges in the implementation of the Water Framework Directive (WFD) in the European Union is the definition of programmes of measures to reach good status of the European water bodies. In areas where water scarcity is an issue, one of these challenges is the selection of water conservation and capacity expansion measures to ensure minimum environmental in-stream flow requirements. At the same time, the WFD calls for the use of economic analysis to identify the most cost-effective combination of measures at the river basin scale to achieve its objective. In this respect, hydro-economic river basin models, by integrating economic, environmental and hydrological aspects at the river basin scale in a consistent framework, represent a promising approach. This article presents a least-cost river basin optimization model (LCRBOM) that selects the combination of quantitative water management measures needed to meet environmental flows for future scenarios of agricultural and urban demand, taking into account the impact of climate change. The model has been implemented in a case study on a Mediterranean basin in the south of France, the Orb River basin. The basin has been identified as in need of quantitative water management measures in order to reach good status of its water bodies. The LCRBOM has been developed in GAMS, applying Mixed Integer Linear Programming. It is run to select the set of measures that minimizes the total annualized cost of the applied measures while meeting the demands and minimum in-stream flow constraints. For the economic analysis, the programme of measures is composed of water conservation measures on agricultural and urban water demands. It compares them with measures mobilizing new water resources from groundwater, inter-basin transfers and improvements in reservoir operating rules.
The total annual cost of each measure is calculated for each demand unit, considering operation, maintenance and investment costs. The results show that, by combining quantitative water management measures, the flow regime can be improved to better mimic the natural flow regime. However, the acceptability of the higher cost of the programme of measures has not yet been assessed. Other stages, such as stakeholder participation and negotiation processes, are also required to design an acceptable programme of measures. For this purpose, this type of model opens the way to investigating equity issues and the allocation of measures and costs among the stakeholders of the basin. Acknowledgments: The study has been partially supported by the Hérault General Council, the Languedoc-Roussillon Regional Council, the Rhône Mediterranean Corsica Water Agency and the BRGM, as well as the European Community 7th Framework Project GENESIS (n. 226536) on groundwater systems, and the Plan Nacional I+D+I 2008-2011 of the Spanish Ministry of Science and Innovation (subprojects CGL2009-13238-C02-01 and CGL2009-13238-C02-02).
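The least-cost selection logic of such a model can be sketched by brute-force enumeration over a hypothetical measure catalogue. The measure names, costs, and water yields below are invented for illustration; a real LCRBOM poses this as a MILP in GAMS rather than enumerating subsets.

```python
from itertools import combinations

# Hypothetical measures: (name, annualized cost, water yield toward the
# environmental flow target). Values are illustrative only.
MEASURES = [
    ("drip_irrigation",   120.0, 4.0),
    ("urban_leak_repair",  80.0, 2.5),
    ("new_well_field",    150.0, 5.0),
    ("reservoir_rules",    30.0, 1.5),
]

def least_cost_programme(measures, deficit):
    """Return (cost, names) for the cheapest subset of measures whose
    combined yield covers the environmental-flow deficit."""
    best = None
    for k in range(len(measures) + 1):
        for subset in combinations(measures, k):
            gain = sum(m[2] for m in subset)
            cost = sum(m[1] for m in subset)
            if gain >= deficit and (best is None or cost < best[0]):
                best = (cost, [m[0] for m in subset])
    return best
```

Enumeration is exponential in the number of measures, which is why integer programming is the practical formulation at basin scale; the objective and constraints, however, are the same.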
Network of TAMCNS: Identifying Influence Regions Within the GCSS-MC Database
2017-06-01
relationships between objects and provides tools to quantitatively determine objects whose influence impacts other objects or the system as a whole. This methodology identifies the most important TAMCN and provides a list of TAMCNs in order of importance. We also analyze the community and core structure of …
Egorov, Evgeny S; Merzlyak, Ekaterina M; Shelenkov, Andrew A; Britanova, Olga V; Sharonov, George V; Staroverov, Dmitriy B; Bolotin, Dmitriy A; Davydov, Alexey N; Barsova, Ekaterina; Lebedev, Yuriy B; Shugay, Mikhail; Chudakov, Dmitriy M
2015-06-15
Emerging high-throughput sequencing methods for the analyses of complex structure of TCR and BCR repertoires give a powerful impulse to adaptive immunity studies. However, there are still essential technical obstacles for performing a truly quantitative analysis. Specifically, it remains challenging to obtain comprehensive information on the clonal composition of small lymphocyte populations, such as Ag-specific, functional, or tissue-resident cell subsets isolated by sorting, microdissection, or fine needle aspirates. In this study, we report a robust approach based on unique molecular identifiers that allows profiling Ag receptors for several hundred to thousand lymphocytes while preserving qualitative and quantitative information on clonal composition of the sample. We also describe several general features regarding the data analysis with unique molecular identifiers that are critical for accurate counting of starting molecules in high-throughput sequencing applications. Copyright © 2015 by The American Association of Immunologists, Inc.
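The core of UMI-based counting can be sketched in a few lines: reads sharing a (clonotype, UMI) pair are collapsed to one starting molecule, so PCR duplicates do not inflate clonal frequencies. The clonotype and UMI labels below are placeholders; real pipelines also correct UMI sequencing errors, which this sketch omits.

```python
from collections import defaultdict

def count_clonotypes(reads):
    """Collapse reads to starting molecules: each (clonotype, UMI) pair
    counts once, yielding a quantitative clonal composition."""
    umis = defaultdict(set)
    for clonotype, umi in reads:
        umis[clonotype].add(umi)
    return {c: len(s) for c, s in umis.items()}
```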
NASA Astrophysics Data System (ADS)
Buczkowski, M.; Fisz, J. J.
2008-07-01
In this paper the possibility of numerical data modelling in the case of angle- and time-resolved fluorescence spectroscopy is investigated. The asymmetric fluorescence probes are assumed to undergo restricted rotational diffusion in a hosting medium. This process is described quantitatively by the diffusion tensor and the aligning potential. The evolution of the system is expressed in terms of the Smoluchowski equation with an appropriate time-development operator. A matrix representation of this operator is calculated, then symmetrized and diagonalized. The resulting propagator is used to generate a synthetic noisy data set that imitates the results of experimental measurements. The data set serves as the groundwork for the χ2 optimization, performed by a genetic algorithm followed by a gradient search, in order to recover the model parameters: the diagonal elements of the diffusion tensor, the aligning-potential expansion coefficients and the directions of the electronic dipole moments. This whole procedure properly identifies the model parameters, showing that the outlined formalism should be taken into account when analysing real experimental data.
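The diagonalization route to the propagator can be sketched on a toy rate matrix standing in for the matrix representation of the Smoluchowski time-development operator. The 3×3 values are illustrative only, and a general eigendecomposition is used here rather than the symmetrized form described in the abstract.

```python
import numpy as np

# Toy 3-state rate matrix (each column sums to zero, so probability is
# conserved); values are illustrative only.
L = np.array([[-1.0,  0.5,  0.0],
              [ 1.0, -1.0,  0.5],
              [ 0.0,  0.5, -0.5]])

def propagator(L, t):
    """exp(L t) via eigendecomposition: diagonalize once, then the
    propagator at any t is cheap to evaluate."""
    w, V = np.linalg.eig(L)
    return (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V)).real

# Evolve an initial distribution concentrated in state 0.
p0 = np.array([1.0, 0.0, 0.0])
pt = propagator(L, 2.0) @ p0
```

In the paper's setting the propagated populations feed a forward model whose synthetic output is then fitted by the genetic-algorithm-plus-gradient χ2 search.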
Quantitative knowledge acquisition for expert systems
NASA Technical Reports Server (NTRS)
Belkin, Brenda L.; Stengel, Robert F.
1991-01-01
A common problem in the design of expert systems is the definition of rules from data obtained in system operation or simulation. While it is relatively easy to collect data and to log the comments of human operators engaged in experiments, generalizing such information to a set of rules has not previously been a direct task. A statistical method is presented for generating rule bases from numerical data, motivated by an example based on aircraft navigation with multiple sensors. The specific objective is to design an expert system that selects a satisfactory suite of measurements from a dissimilar, redundant set, given an arbitrary navigation geometry and possible sensor failures. The systematic development of a Navigation Sensor Management (NSM) expert system from Kalman filter covariance data is described. The method invokes two statistical techniques: Analysis of Variance (ANOVA) and the ID3 algorithm. The ANOVA technique indicates whether variations of problem parameters give statistically different covariance results, and the ID3 algorithm identifies the relationships between the problem parameters using probabilistic knowledge extracted from a simulation example set. Both are detailed.
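ID3 builds its decision tree by repeatedly splitting on the attribute with the greatest information gain. A minimal sketch of that criterion, on hypothetical example records, is:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, attr, label="class"):
    """ID3's splitting criterion: entropy reduction from partitioning
    `examples` (a list of dicts) on attribute `attr`."""
    base = entropy([e[label] for e in examples])
    remainder = 0.0
    for value in set(e[attr] for e in examples):
        subset = [e[label] for e in examples if e[attr] == value]
        remainder += len(subset) / len(examples) * entropy(subset)
    return base - remainder
```

In the NSM setting the attributes would be problem parameters (e.g., navigation geometry, sensor status) and the class label the preferred measurement suite; the attribute names here are placeholders.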
Sammour, Tarik; Barazanchi, Ahmed W H; Hill, Andrew G
2017-02-01
The aim of this systematic review was to update previous PROSPECT ( http://www.postoppain.org ) review recommendations for the management of pain after excisional haemorrhoidectomy. Randomized studies and reviews published in the English language from July 2006 (end date of last review) to March 2016, assessing analgesic, anaesthetic, and operative interventions pertaining to excisional haemorrhoidectomy in adults, and reporting pain scores, were retrieved from the EMBASE and MEDLINE databases. An additional 464 studies were identified of which 74 met the inclusion criteria. There were 48 randomized controlled trials and 26 reviews. Quantitative analyses were not performed, as there were limited numbers of trials with a sufficiently homogeneous design. Pudendal nerve block, with or without general anaesthesia, is recommended for all patients undergoing haemorrhoidal surgery. Either closed haemorrhoidectomy, or open haemorrhoidectomy with electrocoagulation of the pedicle is recommended as the primary procedure. Combinations of analgesics (paracetamol, non-steroidal anti-inflammatory drugs, and opioids), topical lignocaine and glyceryl trinitrate, laxatives, and oral metronidazole are recommended post-operatively. The recommendations are largely based on single intervention, not multimodal intervention, studies.
Nagaraja, Sridevi; Chen, Lin; DiPietro, Luisa A; Reifman, Jaques; Mitrophanov, Alexander Y
2018-02-20
Pathological scarring in wounds is a prevalent clinical outcome with limited prognostic options. The objective of this study was to investigate whether cellular signaling proteins could be used as prognostic biomarkers of pathological scarring in traumatic skin wounds. We used our previously developed and validated computational model of injury-initiated wound healing to simulate the time courses for platelets, 6 cell types, and 21 proteins involved in the inflammatory and proliferative phases of wound healing. Next, we analysed thousands of simulated wound-healing scenarios to identify those that resulted in pathological (i.e., excessive) scarring. Then, we identified candidate proteins that were elevated (or decreased) at the early stages of wound healing in those simulations and could therefore serve as predictive biomarkers of pathological scarring outcomes. Finally, we performed logistic regression analysis and calculated the area under the receiver operating characteristic curve to quantitatively assess the predictive accuracy of the model-identified putative biomarkers. We identified three proteins (interleukin-10, tissue inhibitor of matrix metalloproteinase-1, and fibronectin) whose levels were elevated in pathological scars as early as 2 weeks post-wounding and could predict a pathological scarring outcome occurring 40 days after wounding with 80% accuracy. Our method for predicting putative prognostic wound-outcome biomarkers may serve as an effective means to guide the identification of proteins predictive of pathological scarring.
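The area under the receiver operating characteristic curve used to score such putative biomarkers can be computed directly from the Mann-Whitney formulation, without constructing the curve. The scores below are illustrative, not the study's data.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive case
    (e.g., pathological scar) scores above a randomly chosen negative
    one, with ties counting half (Mann-Whitney U formulation)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5 means the candidate biomarker is uninformative; values near 1.0 support the kind of 80%-accurate early prediction reported above.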
Lavallée-Adam, Mathieu; Yates, John R
2016-03-24
PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the Web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. Copyright © 2016 John Wiley & Sons, Inc.
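Annotation-enrichment tools of this kind typically rest on a hypergeometric over-representation test; a minimal sketch (not PSEA-Quant's actual statistic, which additionally models protein abundance and reproducibility) is:

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """Hypergeometric tail P(X >= k): probability of drawing at least k
    annotated proteins in a set of n, from a background of N proteins of
    which K carry the annotation."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total
```

Small tail probabilities indicate that the annotation is over-represented among the quantified proteins; in practice the p-values are then corrected for testing many annotations at once.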
He, Dan; Xie, Xiao; Yang, Fan; Zhang, Heng; Su, Haomiao; Ge, Yun; Song, Haiping; Chen, Peng R
2017-11-13
A genetically encoded, multifunctional photocrosslinker was developed for quantitative and comparative proteomics. By bearing a bioorthogonal handle and a releasable linker in addition to its photoaffinity warhead, this probe enables the enrichment of transient and low-abundance prey proteins after intracellular photocrosslinking and prey-bait separation, which can be subject to stable isotope dimethyl labeling and mass spectrometry analysis. This quantitative strategy (termed isoCAPP) allowed a comparative proteomic approach to be adopted to identify the proteolytic substrates of an E. coli protease-chaperone dual machinery DegP. Two newly identified substrates were subsequently confirmed by proteolysis experiments. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Imaging Cerebral Microhemorrhages in Military Service Members with Chronic Traumatic Brain Injury
Liu, Wei; Soderlund, Karl; Senseney, Justin S.; Joy, David; Yeh, Ping-Hong; Ollinger, John; Sham, Elyssa B.; Liu, Tian; Wang, Yi; Oakes, Terrence R.; Riedy, Gerard
2017-01-01
Purpose To detect cerebral microhemorrhages in military service members with chronic traumatic brain injury by using susceptibility-weighted magnetic resonance (MR) imaging. The longitudinal evolution of microhemorrhages was monitored in a subset of patients by using quantitative susceptibility mapping. Materials and Methods The study was approved by the Walter Reed National Military Medical Center institutional review board and is compliant with HIPAA guidelines. All participants underwent two-dimensional conventional gradient-recalled-echo MR imaging and three-dimensional flow-compensated multi-echo gradient-recalled-echo MR imaging (processed to generate susceptibility-weighted images and quantitative susceptibility maps), and a subset of patients underwent follow-up imaging. Microhemorrhages were identified by two radiologists independently. Comparisons of microhemorrhage number, size, and magnetic susceptibility derived from quantitative susceptibility maps between baseline and follow-up imaging examinations were performed by using the paired t test. Results Among the 603 patients, cerebral microhemorrhages were identified in 43 patients, with six excluded for further analysis owing to artifacts. Seventy-seven percent (451 of 585) of the microhemorrhages on susceptibility-weighted images had a more conspicuous appearance than on gradient-recalled-echo images. Thirteen of the 37 patients underwent follow-up imaging examinations. In these patients, a smaller number of microhemorrhages were identified at follow-up imaging compared with baseline on quantitative susceptibility maps (mean ± standard deviation, 9.8 microhemorrhages ± 12.8 vs 13.7 microhemorrhages ± 16.6; P = .019). Quantitative susceptibility mapping–derived quantitative measures of microhemorrhages also decreased over time: −0.85 mm3 per day ± 1.59 for total volume (P = .039) and −0.10 parts per billion per day ± 0.14 for mean magnetic susceptibility (P = .016). 
Conclusion The number of microhemorrhages and quantitative susceptibility mapping–derived quantitative measures of microhemorrhages all decreased over time, suggesting that hemosiderin products undergo continued, subtle evolution in the chronic stage. PMID:26371749
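The paired t test used for the baseline versus follow-up comparisons reduces to a t statistic on per-patient differences; a minimal sketch with invented counts is:

```python
import math

def paired_t(x_baseline, x_followup):
    """Paired t statistic on per-subject differences (as used to compare
    microhemorrhage measures at baseline vs follow-up)."""
    d = [b - f for b, f in zip(x_baseline, x_followup)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)
```

The statistic is compared against a t distribution with n - 1 degrees of freedom to obtain p-values like those reported (P = .019, .039, .016).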
Lorenz, Kim; Cohen, Barak A.
2012-01-01
Quantitative trait loci (QTL) with small effects on phenotypic variation can be difficult to detect and analyze. Because of this a large fraction of the genetic architecture of many complex traits is not well understood. Here we use sporulation efficiency in Saccharomyces cerevisiae as a model complex trait to identify and study small-effect QTL. In crosses where the large-effect quantitative trait nucleotides (QTN) have been genetically fixed we identify small-effect QTL that explain approximately half of the remaining variation not explained by the major effects. We find that small-effect QTL are often physically linked to large-effect QTL and that there are extensive genetic interactions between small- and large-effect QTL. A more complete understanding of quantitative traits will require a better understanding of the numbers, effect sizes, and genetic interactions of small-effect QTL. PMID:22942125
Factors Influencing F/OSS Cloud Computing Software Product Success: A Quantitative Study
ERIC Educational Resources Information Center
Letort, D. Brian
2012-01-01
Cloud Computing introduces a new business operational model that allows an organization to shift information technology consumption from traditional capital expenditure to operational expenditure. This shift introduces challenges from both the adoption and creation vantage points. This study evaluates factors that influence Free/Open Source Software…
Simulation Modeling of a Facility Layout in Operations Management Classes
ERIC Educational Resources Information Center
Yazici, Hulya Julie
2006-01-01
Teaching quantitative courses can be challenging. Similarly, layout modeling and lean production concepts can be difficult to grasp in an introductory OM (operations management) class. This article describes a simulation model developed in PROMODEL to facilitate the learning of layout modeling and lean manufacturing. Simulation allows for the…
DOT National Transportation Integrated Search
2012-03-01
Many organizations have become concerned about the environmental impact of their facilities and : operations. In order to lessen environmental impact, quantitative assessment of practice based on : improvements from a baseline condition is needed. Th...
A Quantitative Analysis of Children's Splitting Operations and Fraction Schemes
ERIC Educational Resources Information Center
Norton, Anderson; Wilkins, Jesse L. M.
2009-01-01
Teaching experiments with pairs of children have generated several hypotheses about students' construction of fractions. For example, Steffe (2004) hypothesized that robust conceptions of improper fractions depend on the development of a splitting operation. Results from teaching experiments that rely on scheme theory and Steffe's hierarchy of…
Increasing Information Sharing Among Independent Police Departments
2009-03-01
the operational level. Robert Fox (Lieutenant, Los Angeles Police Department, assigned to the Joint Regional... The Los Angeles Police Department (LAPD) is the largest independent police department in Los Angeles County. The department is responsible for patrolling... agencies operating in metropolitan areas based on a quantitative analysis of the 46 independent police
Ridde, Valéry
2003-01-01
OBJECTIVE: To gauge the effects of operating the Bamako Initiative in Kongoussi district, Burkina Faso. METHODS: Qualitative and quasi-experimental quantitative methodologies were used. FINDINGS: Following the introduction of fees-for-services in July 1997, the number of consultations for curative care fell over a period of three years by an average of 15.4% at "case" health centres but increased by 30.5% at "control" health centres. Moreover, although the operational results for essential drugs depots were not known, expenditure increased on average 2.7 times more than income and did not keep pace with the decline in the utilization of services. Persons in charge of the management committees had difficulties in releasing funds to ensure access to care for the poor. CONCLUSION: The introduction of fees-for-services had an adverse effect on service utilization. The study district is in a position to bear the financial cost of taking care of the poor and the community is able to identify such people. Incentives must be introduced by the state and be swiftly applied so that the communities agree to a more equitable system and thereby allow access to care for those excluded from services because they are unable to pay. PMID:12973646
[The perception of risk among police officers from different areas of the State of Rio de Janeiro].
Constantino, Patrícia; Ribeiro, Adalgisa Peixoto; Correia, Bruna Soares Chaves
2013-03-01
This article seeks to identify the perception of risk among police officers in the State of Rio de Janeiro based on their areas of operation: Capital, Interior and Baixada Fluminense (BF), by analyzing comparative victimization. It is a cross-sectional study using the triangulation method. The quantitative research investigated 533 police officers in the Capital, 159 in the Interior and 222 in the BF; the qualitative approach included interviews with 17 police chiefs and 15 focus groups in the three areas. The results indicate that officers' perceptions of risk, and the strategies used to minimize it, are characteristics that unite them. Despite its universal nature, risk has differentiated gradations in relation to function and territory of operation. In the Capital there is greater exposure to the risk of confrontation with criminals and less respect for the police from the population, though there is greater operational support from the corporation. Contrary to perception, victimization is related to the territory: 67.8% of police officers were victimized in the Capital last year; 13.7% in the Interior; and 9.7% in the BF. The expectation is that the analyses will provide input for management of technical support and health assistance for police officers, considering the specificities of work in the different areas.
Optical assessment of tumor resection margins in the breast
Brown, J. Quincy; Bydlon, Torre M.; Richards, Lisa M.; Yu, Bing; Kennedy, Stephanie A.; Geradts, Joseph; Wilke, Lee G.; Junker, Marlee; Gallagher, Jennifer; Barry, William; Ramanujam, Nimmi
2011-01-01
Breast conserving surgery, in which the breast tumor and surrounding normal tissue are removed, is the primary mode of treatment for invasive and in situ carcinomas of the breast, conditions that affect nearly 200,000 women annually. Of these nearly 200,000 patients who undergo this surgical procedure, between 20% and 70% may undergo additional surgeries to remove tumor that was left behind in the first surgery, due to the lack of intra-operative tools that can detect whether the boundaries of the excised specimens are free from residual cancer. Optical techniques have many attractive attributes which may make them useful tools for intra-operative assessment of breast tumor resection margins. In this manuscript, we discuss clinical design criteria for intra-operative breast tumor margin assessment, and review optical techniques applied to this problem. In addition, we report on the development and clinical testing of quantitative diffuse reflectance imaging (Q-DRI) as a potential solution to this clinical need. Q-DRI is a spectral imaging tool which has been applied to 56 resection margins in 48 patients at Duke University Medical Center. Clear sources of contrast between cancerous and cancer-free resection margins were identified with the device, and resulted in an overall accuracy of 75% in detecting positive margins. PMID:21544237
Analysis of physical exercises and exercise protocols for space transportation system operation
NASA Technical Reports Server (NTRS)
Coleman, A. E.
1982-01-01
A quantitative evaluation of the Thornton-Whitmore treadmill was made so that informed management decisions regarding the role of this treadmill in operational flight crew exercise programs could be made. Specific tasks to be completed were: The Thornton-Whitmore passive treadmill as an exercise device at one-g was evaluated. Hardware, harness and restraint systems for use with the Thornton-Whitmore treadmill in the laboratory and in Shuttle flights were established. The quantitative and qualitative performance of human subjects on the Thornton-Whitmore treadmill with forces in excess of one-g was evaluated. The performance of human subjects on the Thornton-Whitmore treadmill in weightlessness (onboard Shuttle flights) was also determined.
Multiparametric Quantitative Ultrasound Imaging in Assessment of Chronic Kidney Disease.
Gao, Jing; Perlman, Alan; Kalache, Safa; Berman, Nathaniel; Seshan, Surya; Salvatore, Steven; Smith, Lindsey; Wehrli, Natasha; Waldron, Levi; Kodali, Hanish; Chevalier, James
2017-11-01
To evaluate the value of multiparametric quantitative ultrasound imaging in assessing chronic kidney disease (CKD) using kidney biopsy pathologic findings as reference standards. We prospectively measured multiparametric quantitative ultrasound markers with grayscale, spectral Doppler, and acoustic radiation force impulse imaging in 25 patients with CKD before kidney biopsy and 10 healthy volunteers. Based on all pathologic (glomerulosclerosis, interstitial fibrosis/tubular atrophy, arteriosclerosis, and edema) scores, the patients with CKD were classified into mild (no grade 3 and <2 of grade 2) and moderate to severe (at least 2 of grade 2 or 1 of grade 3) CKD groups. Multiparametric quantitative ultrasound parameters included kidney length, cortical thickness, pixel intensity, parenchymal shear wave velocity, intrarenal artery peak systolic velocity (PSV), end-diastolic velocity (EDV), and resistive index. We tested the difference in quantitative ultrasound parameters among mild CKD, moderate to severe CKD, and healthy controls using analysis of variance, analyzed correlations of quantitative ultrasound parameters with pathologic scores and the estimated glomerular filtration rate (GFR) using Pearson correlation coefficients, and examined the diagnostic performance of quantitative ultrasound parameters in determining moderate CKD and an estimated GFR of less than 60 mL/min/1.73 m² using receiver operating characteristic curve analysis. There were significant differences in cortical thickness, pixel intensity, PSV, and EDV among the 3 groups (all P < .01). Among quantitative ultrasound parameters, the top areas under the receiver operating characteristic curves for PSV and EDV were 0.88 and 0.97, respectively, for determining pathologic moderate to severe CKD, and 0.76 and 0.86 for an estimated GFR of less than 60 mL/min/1.73 m². Moderate to good correlations were found for PSV, EDV, and pixel intensity with pathologic scores and estimated GFR.
The PSV, EDV, and pixel intensity are valuable in determining moderate to severe CKD. The value of shear wave velocity in assessing CKD needs further investigation. © 2017 by the American Institute of Ultrasound in Medicine.
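The diagnostic-performance figures above come from receiver operating characteristic analysis, whose area under the curve can be estimated with the rank-based (Mann-Whitney) formula. A minimal sketch follows; the function name and the toy velocity values are illustrative, not data from the study:

```python
def roc_auc(scores_disease, scores_control):
    """AUC estimated as the probability that a randomly chosen diseased
    case scores higher than a randomly chosen control (ties count 0.5).
    This equals the normalized Mann-Whitney U statistic."""
    wins = 0.0
    for d in scores_disease:
        for c in scores_control:
            if d > c:
                wins += 1.0
            elif d == c:
                wins += 0.5
    return wins / (len(scores_disease) * len(scores_control))

# Hypothetical intrarenal-artery EDV values (cm/s), sign-flipped so a
# higher score means more severe disease (EDV falls as CKD progresses)
moderate_ckd = [-8.0, -9.5, -7.2]
mild_or_normal = [-14.1, -12.3, -15.0, -11.8]
auc = roc_auc(moderate_ckd, mild_or_normal)
```

An AUC near 0.5 means the marker is no better than chance; values approaching 1.0, as reported for EDV here, indicate strong separation between groups.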
Álvarez, María F.; Angarita, Myrian; Delgado, María C.; García, Celsa; Jiménez-Gomez, José; Gebhardt, Christiane; Mosquera, Teresa
2017-01-01
The genetic basis of quantitative disease resistance has been studied in crops for several decades as an alternative to R gene mediated resistance. The most important disease in the potato crop is late blight, caused by the oomycete Phytophthora infestans. Quantitative disease resistance (QDR), as any other quantitative trait in plants, can be genetically mapped to understand the genetic architecture. Association mapping using DNA-based markers has been implemented in many crops to dissect quantitative traits. We used an association mapping approach with candidate genes to identify the first genes associated with quantitative resistance to late blight in Solanum tuberosum Group Phureja. Twenty-nine candidate genes were selected from a set of genes that were differentially expressed during the resistance response to late blight in tetraploid European potato cultivars. The 29 genes were amplified and sequenced in 104 accessions of S. tuberosum Group Phureja from Latin America. We identified 238 SNPs in the selected genes and tested them for association with resistance to late blight. The phenotypic data were obtained under field conditions by determining the area under disease progress curve (AUDPC) in two seasons and in two locations. Two genes were associated with QDR to late blight, a potato homolog of thylakoid lumen 15 kDa protein (StTL15A) and a stem 28 kDa glycoprotein (StGP28). Key message: A first association mapping experiment was conducted in Solanum tuberosum Group Phureja germplasm, which identified among 29 candidates two genes associated with quantitative resistance to late blight. PMID:28674545
Flemming, Kate
2010-01-01
This paper is a report of a Critical Interpretive Synthesis to synthesize quantitative research, in the form of an effectiveness review and a guideline, with qualitative research to examine the use of morphine to treat cancer-related pain. Critical Interpretive Synthesis is a new method of reviewing, developed from meta-ethnography, which integrates systematic review methodology with a qualitative tradition of enquiry. It has not previously been used specifically to synthesize effectiveness and qualitative literature. Data sources. An existing systematic review of quantitative research and a guideline examining the effectiveness of oral morphine to treat cancer pain were identified. Electronic searches of Medline, CINAHL, Embase, PsychINFO, Health Management Information Consortium database and the Social Science Citation Index to identify qualitative research were carried out in May 2008. Qualitative research papers reporting on the use of morphine to treat cancer pain were identified. The findings of the effectiveness research were used as a framework to guide the translation of findings from qualitative research using an integrative grid. A secondary translation of findings from the qualitative research, not specifically mapped to the effectiveness literature, was guided by the framework. Nineteen qualitative papers were synthesized with the quantitative effectiveness literature, producing 14 synthetic constructs. These were developed into four synthesizing arguments which drew on patients', carers' and healthcare professionals' interpretations of the meaning and context of the use of morphine to treat cancer pain. Critical Interpretive Synthesis can be adapted to synthesize reviews of quantitative research into effectiveness with qualitative research and fits into an existing typology of approaches to synthesizing qualitative and quantitative research.
Identification of suitable reference genes for hepatic microRNA quantitation.
Lamba, Vishal; Ghodke-Puranik, Yogita; Guan, Weihua; Lamba, Jatinder K
2014-03-07
MicroRNAs (miRNAs) are short (~22 nt) endogenous RNAs that play important roles in regulating expression of a wide variety of genes involved in different cellular processes. Alterations in microRNA expression patterns have been associated with a number of human diseases. Accurate quantitation of microRNA levels is important for their use as biomarkers and in determining their functions. Real time PCR is the gold standard and the most frequently used technique for miRNA quantitation. Real time PCR data analysis includes normalizing the amplification data to suitable endogenous control/s to ensure that microRNA quantitation is not affected by the variability that is potentially introduced at different experimental steps. U6 (RNU6A) and RNU6B are two commonly used endogenous controls in microRNA quantitation. The present study was designed to investigate inter-individual variability and gender differences in hepatic microRNA expression as well as to identify the best endogenous control/s that could be used for normalization of real-time expression data in liver samples. We used Taqman based real time PCR to quantitate hepatic expression levels of 22 microRNAs along with U6 and RNU6B in 50 human liver samples (25 M, 25 F). To identify the best endogenous controls for use in data analysis, we evaluated the amplified candidates for their stability (least variability) in expression using two commonly used software programs: NormFinder and GeNormplus. Both NormFinder and GeNormplus identified U6 to be among the least stable of all the candidates analyzed, and RNU6B was also not among the top genes in stability. mir-152 and mir-23b were identified to be the two most stable candidates by both NormFinder and GeNormplus in our analysis, and were used as endogenous controls for normalization of hepatic miRNA levels.
Measurements of microRNA stability indicate that U6 and RNU6B are not suitable for use as endogenous controls for normalizing microRNA relative quantitation data in hepatic tissue, and their use can lead to erroneous conclusions.
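The reference-gene screening idea can be sketched by ranking candidate controls by the spread of their Cq values across samples. The study used NormFinder and geNorm, whose model-based and pairwise-variation metrics are more elaborate than this standard-deviation proxy; the Cq values below are invented:

```python
from statistics import stdev

def rank_by_stability(cq_values):
    """Rank candidate endogenous controls by expression stability.

    cq_values: dict mapping gene name -> list of Cq values across samples.
    Standard deviation of Cq is used here as a simple stability proxy;
    the most stable (lowest-SD) candidate comes first.
    """
    return sorted(cq_values, key=lambda gene: stdev(cq_values[gene]))

candidates = {
    "U6":      [20.1, 24.5, 18.9, 23.2],   # hypothetical, highly variable
    "mir-152": [25.0, 25.2, 24.9, 25.1],   # hypothetical, stable
}
ranked = rank_by_stability(candidates)   # most stable candidate first
```

A control that itself varies between samples, as U6 did in this study, silently distorts every normalized measurement, which is why stability screening precedes normalization.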
Quantitative Predictive Models for Systemic Toxicity (SOT)
Models to identify systemic and specific target organ toxicity were developed to help transition the field of toxicology towards computational models. By leveraging multiple data sources to incorporate read-across and machine learning approaches, a quantitative model of systemic ...
Distribution of voids in field concrete.
DOT National Transportation Integrated Search
1978-01-01
This study was intended to evaluate the air void characteristics of concrete in an attempt to identify, quantitatively or semi-quantitatively, different types of voids and to predict their influence on strength and durability. At the outset, it was a...
78 FR 52166 - Quantitative Messaging Research
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-22
... COMMODITY FUTURES TRADING COMMISSION Quantitative Messaging Research AGENCY: Commodity Futures... survey will follow qualitative message testing research (for which CFTC received fast-track OMB approval... message testing research (for which CFTC received fast-track OMB approval) and is necessary to identify...
Pandit, Jaideep J; Tavare, Aniket
2011-07-01
It is important that a surgical list is planned to utilise as much of the scheduled time as possible while not over-running, because this can lead to cancellation of operations. We wished to assess whether, theoretically, the known duration of individual operations could be used quantitatively to predict the likely duration of the operating list. In a university hospital setting, we first assessed the extent to which the current ad-hoc method of operating list planning was able to match the scheduled operating list times for 153 consecutive historical lists. Using receiver operating curve analysis, we assessed the ability of an alternative method to predict operating list duration for the same operating lists. This method uses a simple formula: the sum of individual operation times and a pooled standard deviation of these times. We used the operating list duration estimated from this formula to generate a probability that the operating list would finish within its scheduled time. Finally, we applied the simple formula prospectively to 150 operating lists, 'shadowing' the current ad-hoc method, to confirm the predictive ability of the formula. The ad-hoc method was very poor at planning: 50% of historical operating lists were under-booked and 37% over-booked. In contrast, the simple formula predicted the correct outcome (under-run or over-run) for 76% of these operating lists. The calculated probability that a planned series of operations will over-run or under-run was found useful in developing an algorithm to adjust the planned cases optimally. In the prospective series, 65% of operating lists were over-booked and 10% were under-booked. The formula predicted the correct outcome for 84% of operating lists. A simple quantitative method of estimating operating list duration for a series of operations leads to an algorithm (readily created on an Excel spreadsheet, http://links.lww.com/EJA/A19) that can potentially improve operating list planning.
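The "simple formula" combines the sum of mean operation times with a pooled standard deviation. Under a normal approximation for the total list duration (an assumption of this sketch; the function name and numbers are illustrative, not from the paper), the over-run probability can be computed as:

```python
from math import erf, sqrt

def overrun_probability(mean_durations, sds, scheduled_minutes):
    """Probability that a planned operating list over-runs its slot.

    Assumes the total duration is approximately normal, with mean equal
    to the sum of individual operation means and variance equal to the
    sum of individual variances (a sketch of the pooled-SD approach).
    """
    total_mean = sum(mean_durations)
    total_sd = sqrt(sum(sd ** 2 for sd in sds))
    z = (scheduled_minutes - total_mean) / total_sd
    finish_on_time = 0.5 * (1 + erf(z / sqrt(2)))  # normal CDF at z
    return 1 - finish_on_time

# Three operations booked into a 480-minute (8 h) list
p = overrun_probability([120, 150, 90], [20, 30, 15], 480)
```

Lists whose over-run probability exceeds a chosen threshold can then be re-planned, mirroring the spreadsheet-based algorithm the authors describe.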
An eQTL Analysis of Partial Resistance to Puccinia hordei in Barley
Chen, Xinwei; Hackett, Christine A.; Niks, Rients E.; Hedley, Peter E.; Booth, Clare; Druka, Arnis; Marcel, Thierry C.; Vels, Anton; Bayer, Micha; Milne, Iain; Morris, Jenny; Ramsay, Luke; Marshall, David; Cardle, Linda; Waugh, Robbie
2010-01-01
Background Genetic resistance to barley leaf rust caused by Puccinia hordei involves both R genes and quantitative trait loci. The R genes provide higher but less durable resistance than the quantitative trait loci. Consequently, exploring quantitative or partial resistance has become a favorable alternative for controlling disease. Four quantitative trait loci for partial resistance to leaf rust have been identified in the doubled haploid Steptoe (St)/Morex (Mx) mapping population. Further investigations are required to study the molecular mechanisms underpinning partial resistance and ultimately identify the causal genes. Methodology/Principal Findings We explored partial resistance to barley leaf rust using a genetical genomics approach. We recorded RNA transcript abundance corresponding to each probe on a 15K Agilent custom barley microarray in seedlings from St and Mx and 144 doubled haploid lines of the St/Mx population. A total of 1154 and 1037 genes were identified as being P. hordei-responsive in St and Mx, and as differentially expressed between P. hordei-infected St and Mx, respectively. Normalized ratios from 72 distant-pair hybridisations were used to map the genetic determinants of variation in transcript abundance by expression quantitative trait locus (eQTL) mapping, generating 15685 eQTL from 9557 genes. Correlation analysis identified 128 genes that were correlated with resistance, of which 89 had eQTL co-locating with the phenotypic quantitative trait loci (pQTL). Transcript abundance in the parents and conservation of synteny with rice allowed us to prioritise six genes as candidates for Rphq11, the pQTL of largest effect, and highlight one, a phospholipid hydroperoxide glutathione peroxidase (HvPHGPx), for detailed analysis. Conclusions/Significance The eQTL approach yielded information that led to the identification of strong candidate genes underlying pQTL for resistance to leaf rust in barley and on the general pathogen response pathway.
The dataset will facilitate a systems appraisal of this host-pathogen interaction and, potentially, for other traits measured in this population. PMID:20066049
Wang, Lu; Wei, Chenchen; Deng, Linghui; Wang, Ziqiong; Song, Mengyuan; Xiong, Yao; Liu, Ming
2018-06-01
Hemorrhagic transformation is a serious complication of acute ischemic stroke, which may cause detrimental outcomes and delay the use of anticoagulation therapy. Early prediction and identification of patients at high risk of hemorrhagic transformation before clinical deterioration occurs have become a research priority. To study the value of plasma matrix metalloproteinase-9 in predicting hemorrhagic transformation after ischemic stroke. We searched PubMed, Ovid, Cochrane Library, and 2 other Chinese databases to identify literature published up to September 2017 and performed meta-analysis with STATA (version 12.0, StataCorp LP, College Station, TX). Twelve studies incorporating 1492 participants were included and 7 studies were included in the quantitative statistical analysis. The pooled sensitivity was 85% (95% confidence interval [CI]: 75%, 91%) and the pooled specificity was 79% (95% CI: 67%, 87%). The area under the receiver operating characteristic curve was 0.89 (95% CI: 0.86, 0.91). Significant heterogeneity existed for all estimates (all P values < .05 and I² > 50%). There was no threshold effect, with a P value greater than .05 for the correlation coefficient. Meta-regression and subgroup analysis showed that cut-off value and hemorrhagic subtype contributed to heterogeneity. Deeks' funnel plot indicated no significant publication bias for the 7 quantitative analysis studies. Matrix metalloproteinase-9 has high predictive value for hemorrhagic transformation after acute ischemic stroke. It may be useful to test matrix metalloproteinase-9 to exclude patients at low risk of hemorrhage for precise treatment in future clinical work. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
Quantitative CT Measures of Bronchiectasis in Smokers.
Diaz, Alejandro A; Young, Thomas P; Maselli, Diego J; Martinez, Carlos H; Gill, Ritu; Nardelli, Pietro; Wang, Wei; Kinney, Gregory L; Hokanson, John E; Washko, George R; San Jose Estepar, Raul
2017-06-01
Bronchiectasis is frequent in smokers with COPD; however, there are only limited data on objective assessments of this process. The objective was to assess bronchovascular morphology, calculate the ratio of the diameters of bronchial lumen and adjacent artery (BA ratio), and identify those measurements able to discriminate bronchiectasis. We collected quantitative CT (QCT) measures of BA ratios, peak wall attenuation, wall thickness (WT), wall area, and wall area percent (WA%) at matched fourth through sixth airway generations in 21 ever smokers with bronchiectasis (cases) and 21 never-smoking control patients (control airways). In cases, measurements were collected at both bronchiectatic and nonbronchiectatic airways. Logistic analysis and the area under receiver operating characteristic curve (AUC) were used to assess the predictive ability of QCT measurements for bronchiectasis. The whole-lung and fourth through sixth airway generation BA ratio, WT, and WA% were significantly greater in bronchiectasis cases than control patients. The AUCs for the BA ratio to predict bronchiectasis ranged from 0.90 (whole lung) to 0.79 (fourth-generation). AUCs for WT and WA% ranged from 0.72 to 0.75 and from 0.71 to 0.75. The artery diameters but not bronchial diameters were smaller in bronchiectatic than both nonbronchiectatic and control airways (P < .01 for both). Smoking-related increases in the BA ratio appear to be driven by reductions in vascular caliber. QCT measures of BA ratio, WT, and WA% may be useful to objectively identify and quantify bronchiectasis in smokers. ClinicalTrials.gov; No.: NCT00608764; URL: www.clinicaltrials.gov. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
Balance confidence and falls in nondemented essential tremor patients: the role of cognition.
Rao, Ashwini K; Gilman, Arthur; Louis, Elan D
2014-10-01
To examine (1) the effect of cognitive ability on balance confidence and falls, (2) the relationship of balance confidence and falls with quantitative measures of gait, and (3) measures that predict falls, in people with essential tremor (ET). Cross-sectional study. General community. People with ET (n=132) and control subjects (n=48). People with ET were divided into 2 groups based on the median score on the Modified Mini-Mental State Examination: those with lower cognitive test scores (ET-LCS) and those with higher cognitive test scores (ET-HCS). Not applicable. Six-item Activities of Balance Confidence (ABC-6) Scale and falls in the previous year. Participants with ET-LCS had lower ABC-6 scores and a greater number of falls than those with ET-HCS (P<.05 for all measures) or control subjects (P<.01 for all measures). Quantitative gait measures were significantly correlated with ABC-6 score and falls. Gait speed (P<.007) and ABC-6 score (P<.02) were significant predictors of falls. Receiver operating characteristic curve analysis revealed that gait speed <0.9 m/s and ABC-6 score <51% were associated with moderate sensitivity and specificity in identifying fallers. People with ET-LCS have impaired gait and report lower balance confidence and a higher number of falls than their counterparts (ET-HCS) and than control subjects. We have identified assessments that are easily administered (gait speed, ABC-6 Scale) and are associated with falls in ET. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
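The reported cutoffs (gait speed below 0.9 m/s, ABC-6 below 51%) each define one operating point on the ROC curve. A sketch of how sensitivity and specificity are computed at such a cutoff; the patient data below are invented, only the 0.9 m/s threshold comes from the abstract:

```python
def sens_spec(gait_speeds, fell, cutoff):
    """Sensitivity/specificity of 'gait speed < cutoff' for flagging fallers.

    gait_speeds: speeds in m/s; fell: parallel booleans (True = faller).
    Returns (sensitivity, specificity).
    """
    tp = sum(v < cutoff and f for v, f in zip(gait_speeds, fell))
    fn = sum(v >= cutoff and f for v, f in zip(gait_speeds, fell))
    tn = sum(v >= cutoff and not f for v, f in zip(gait_speeds, fell))
    fp = sum(v < cutoff and not f for v, f in zip(gait_speeds, fell))
    return tp / (tp + fn), tn / (tn + fp)

# Four hypothetical participants: two slow fallers, two fast non-fallers
sens, spec = sens_spec([0.7, 0.8, 1.1, 1.2], [True, True, False, False], 0.9)
```

Moving the cutoff trades sensitivity against specificity, which is why the abstract reports the pair jointly rather than either number alone.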
Kruid, Jan; Fogel, Ronen; Limson, Janice Leigh
2017-05-01
Identifying the most efficient oxidation process to achieve maximum removal of a target pollutant compound forms the subject of much research. There exists a need to develop rapid screening tools to support research in this area. In this work we report on the development of a quantitative assay as a means for identifying catalysts capable of decolourising methylene blue through the generation of oxidising species from hydrogen peroxide. Here, a previously described methylene blue test strip method was repurposed as a quantitative, aqueous-based spectrophotometric assay. From amongst a selection of metal salts and metallophthalocyanine complexes, monitoring of the decolourisation of the cationic dye methylene blue (via Fenton-like and non-Fenton oxidation reactions) by the assay identified the following to be suitable oxidation catalysts: CuSO4 (a Fenton-like catalyst), iron(II)phthalocyanine (a non-Fenton oxidation catalyst), as well as manganese(II) phthalocyanine. The applicability of the method was examined for the removal of bisphenol A (BPA), as measured by HPLC, during parallel oxidation experiments. The order of catalytic activity was identified as FePc > MnPc > CuSO4 for both BPA and MB. The quantitative MB decolourisation assay may offer a rapid method for screening a wide range of potential catalysts for oxidation processes. Copyright © 2017 Elsevier Ltd. All rights reserved.
Quantitative Analysis of a Hybrid Electric HMMWV for Fuel Economy Improvement
2012-05-01
HMMWV of equivalent size. Hybrid vehicle powertrains show improved fuel economy gains due to optimized engine operation and regenerative braking. In... regenerative braking. Validated vehicle models as well as data collected on test tracks are used in the quantitative analysis. The regenerative braking...hybrid electric vehicle, drive cycle, fuel economy, engine efficiency, regenerative braking. 1 Introduction The US Army (Tank Automotive
Exploration Health Risks: Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley
2006-01-01
Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest among the human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short- and long-duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: risks of radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged.
We address the effects of conservative and nonconservative assumptions on the probability results. We discuss the methods necessary to assess mission risks once exploration mission scenarios are characterized. Preliminary efforts have produced results that are commensurate with earlier qualitative estimates of risk probabilities in this and other operational contexts, indicating that our approach may be usefully applied in support of the development of human health and performance standards for long-duration space exploration missions. This approach will also enable mission-specific probabilistic risk assessments for space exploration missions.
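The order-of-magnitude comparison format described above can be sketched in a few lines. The probabilities below are hypothetical placeholders (the abstract reports no numeric values); they only show the ranking mechanics:

```python
import math

# Hypothetical per-mission occurrence probabilities (illustrative only; the
# abstract does not report numeric values).
risk_probabilities = {
    "radiation sickness": 1e-3,
    "cardiac dysrhythmia": 5e-3,
    "renal stone formation": 2e-2,
    "decompression sickness (EVA)": 1e-2,
    "bone fracture": 3e-3,
}

def order_of_magnitude(p: float) -> int:
    """Exponent of the power of ten at or below p."""
    return math.floor(math.log10(p))

# Rank risks highest-first for an order-of-magnitude comparison.
ranked = sorted(risk_probabilities.items(), key=lambda kv: kv[1], reverse=True)
magnitudes = {name: order_of_magnitude(p) for name, p in ranked}
```

Risks sharing an exponent are treated as comparable, which is the point of an order-of-magnitude format: it gauges relative, not absolute, risk.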
Sunderland, John J; Christian, Paul E
2015-01-01
The Clinical Trials Network (CTN) of the Society of Nuclear Medicine and Molecular Imaging (SNMMI) operates a PET/CT phantom imaging program using the CTN's oncology clinical simulator phantom, designed to validate scanners at sites that wish to participate in oncology clinical trials. Since its inception in 2008, the CTN has collected 406 well-characterized phantom datasets from 237 scanners at 170 imaging sites covering the spectrum of commercially available PET/CT systems. The combined and collated phantom data describe a global profile of quantitative performance and variability of PET/CT data used in both clinical practice and clinical trials. Individual sites filled and imaged the CTN oncology PET phantom according to detailed instructions. Standard clinical reconstructions were requested and submitted. The phantom itself contains uniform regions suitable for scanner calibration assessment, lung fields, and 6 hot spheric lesions with diameters ranging from 7 to 20 mm at a 4:1 contrast ratio with primary background. The CTN Phantom Imaging Core evaluated the quality of the phantom fill and imaging and measured background standardized uptake values to assess scanner calibration and maximum standardized uptake values of all 6 lesions to review quantitative performance. Scanner make-and-model-specific measurements were pooled and then subdivided by reconstruction to create scanner-specific quantitative profiles. Different makes and models of scanners predictably demonstrated different quantitative performance profiles including, in some cases, small calibration bias. Differences in site-specific reconstruction parameters increased the quantitative variability among similar scanners, with postreconstruction smoothing filters being the most influential parameter. 
Quantitative assessment of this intrascanner variability over this large collection of phantom data gives, for the first time, estimates of reconstruction variance introduced into trials from allowing trial sites to use their preferred reconstruction methodologies. Predictably, time-of-flight-enabled scanners exhibited less size-based partial-volume bias than non-time-of-flight scanners. The CTN scanner validation experience over the past 5 y has generated a rich, well-curated phantom dataset from which PET/CT make-and-model and reconstruction-dependent quantitative behaviors were characterized for the purposes of understanding and estimating scanner-based variances in clinical trials. These results should make it possible to identify and recommend make-and-model-specific reconstruction strategies to minimize measurement variability in cancer clinical trials. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
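The calibration and lesion measurements described above reduce to simple ratios. A minimal sketch with invented SUV values, assuming the usual convention that a correctly calibrated scanner reads SUV of about 1.0 in the uniform background:

```python
# Illustrative phantom numbers (not the study's data). By convention a correctly
# calibrated scanner reads SUV ~1.0 in the uniform background region.
measured_background_suv = 0.96
calibration_bias_pct = 100.0 * (measured_background_suv - 1.0) / 1.0

# Hot-sphere SUVmax by diameter (mm); the ideal reading is 4.0 at 4:1 contrast.
lesion_suvmax = {7: 1.9, 10: 2.6, 13: 3.1, 17: 3.6, 20: 3.8}
contrast_recovery = {d: suv / 4.0 for d, suv in lesion_suvmax.items()}
# Smaller spheres recover less contrast: the size-based partial-volume bias
# that the abstract notes is smaller on time-of-flight systems.
```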
Executable Architecture of Net Enabled Operations: State Machine of Federated Nodes
2009-11-01
verbal descriptions from operators) of the current Command and Control (C2) practices into model form. In theory these should be Standard Operating Procedures (SOPs) that execute as a thread from start to finish. ...a large quantity of data will be required to ensure that the model reflects the actual processes; the authors recommend that the state machine... The
Benharash, Peyman; Buch, Eric; Frank, Paul; Share, Michael; Tung, Roderick; Shivkumar, Kalyanam; Mandapati, Ravi
2015-01-01
Background: New approaches to ablation of atrial fibrillation (AF) include focal impulse and rotor modulation (FIRM) mapping, and initial results reported with this technique have been favorable. We sought to independently evaluate the approach by analyzing quantitative characteristics of atrial electrograms used to identify rotors and describe acute procedural outcomes of FIRM-guided ablation. Methods and Results: All FIRM-guided ablation procedures (n=24; 50% paroxysmal) at University of California, Los Angeles Medical Center were included for analysis. During AF, unipolar atrial electrograms collected from a 64-pole basket catheter were used to construct phase maps and identify putative AF sources. These sites were targeted for ablation, in conjunction with pulmonary vein isolation in most patients (n=19; 79%). All patients had rotors identified (mean, 2.3±0.9 per patient; 72% in left atrium). The prespecified acute procedural end point was achieved in 12 of 24 (50%) patients: AF termination (n=1), organization (n=3), or >10% slowing of AF cycle length (n=8). Basket electrodes were within 1 cm of 54% of left atrial surface area, and a mean of 31 electrodes per patient showed interpretable atrial electrograms. Offline analysis revealed no differences between rotor and distant sites in dominant frequency or Shannon entropy. Electroanatomic mapping showed no rotational activation at FIRM-identified rotor sites in 23 of 24 patients (96%). Conclusions: FIRM-identified rotor sites did not exhibit quantitative atrial electrogram characteristics expected from rotors and did not differ quantitatively from surrounding tissue. Catheter ablation at these sites, in conjunction with pulmonary vein isolation, resulted in AF termination or organization in a minority of patients (4/24; 17%). Further validation of this approach is necessary. PMID:25873718
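The dominant-frequency and Shannon-entropy metrics used in the offline electrogram analysis can be computed roughly as follows. This is a generic sketch with a synthetic 6 Hz surrogate signal, not the study's pipeline:

```python
import numpy as np

def dominant_frequency(signal, fs):
    """Frequency (Hz) of the largest spectral peak, excluding DC."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(spectrum)]

def shannon_entropy(signal, bins=32):
    """Shannon entropy (bits) of the signal's amplitude histogram."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

fs = 1000.0                              # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
egm = np.sin(2 * np.pi * 6.0 * t)        # 6 Hz surrogate "AF" electrogram
```

The study's negative finding was that these two metrics did not separate rotor sites from distant sites.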
Långsjö, Teemu K; Vasara, Anna I; Hyttinen, Mika M; Lammi, Mikko J; Kaukinen, Antti; Helminen, Heikki J; Kiviranta, Ilkka
2010-01-01
The aim of this study was to undertake a stereological analysis to quantify the dimensions of the collagen network in the repair tissue of porcine joints after they had been subjected to autologous chondrocyte transplantation (ACT). ACT was used to repair cartilage lesions in knee joints of pigs. Electron-microscopic stereology, immunostaining for type II collagen, and quantitative polarized-light microscopy were utilized to study the collagen fibrils in the repair tissue 3 and 12 months after the operation. The collagen volume density (V(V)) was lower in the repair tissue than in normal cartilage at 3 months (20.4 vs. 23.7%) after the operation. The collagen surface density (S(V), 1.5·10(-2) vs. 3.1·10(-2) nm(2)/nm(3)) and V(V) increased with time in the repair tissue (20.4 vs. 44.7%). Quantitative polarized-light microscopy detected a higher degree of collagen parallelism in the repair tissue at 3 months after the operation (55.7 vs. 49.7%). In contrast, 1 year after the operation, fibril parallelism was lower in the repair tissue than in the control cartilage (47.5 vs. 69.8%). Following ACT, V(V) and S(V) increased in the repair tissue with time, reflecting maturation of the tissue. One year after the operation, there was a lower level of fibril organization in the repair tissue than in the control cartilage. Thus, the newly synthesized collagen fibrils in the repair tissue appeared to form a denser network than in the control cartilage, but the fibrils remained more randomly oriented. Copyright © 2010 S. Karger AG, Basel.
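The volume density V(V) reported above is classically estimated in stereology by point counting (the Delesse principle); a sketch with hypothetical counts chosen to echo the 3-month values:

```python
# Point-counting stereology (Delesse principle): the fraction of grid points
# landing on a phase is an unbiased estimator of its volume density V_V.
def volume_density(points_on_phase: int, total_points: int) -> float:
    return points_on_phase / total_points

# Hypothetical counts echoing the reported 3-month values (20.4% vs 23.7%).
vv_repair = volume_density(204, 1000)
vv_control = volume_density(237, 1000)
```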
Hydrophobic ionic liquids for quantitative bacterial cell lysis with subsequent DNA quantification.
Fuchs-Telka, Sabine; Fister, Susanne; Mester, Patrick-Julian; Wagner, Martin; Rossmanith, Peter
2017-02-01
DNA is one of the most frequently analyzed molecules in the life sciences. In this article we describe a simple and fast protocol for quantitative DNA isolation from bacteria based on hydrophobic ionic liquid supported cell lysis at elevated temperatures (120-150 °C) for subsequent PCR-based analysis. From a set of five hydrophobic ionic liquids, 1-butyl-1-methylpyrrolidinium bis(trifluoromethylsulfonyl)imide was identified as the most suitable for quantitative cell lysis and DNA extraction because of limited quantitative PCR inhibition by the aqueous eluate as well as no detectable DNA uptake. The newly developed method was able to efficiently lyse Gram-negative bacterial cells, whereas Gram-positive cells were protected by their thick cell wall. The performance of the final protocol resulted in quantitative DNA extraction efficiencies for Gram-negative bacteria similar to those obtained with a commercial kit, whereas the number of handling steps, and especially the time required, was dramatically reduced.
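Downstream PCR-based quantification typically converts a quantification cycle (Ct) to copy number via a standard curve; a sketch with illustrative slope and intercept values, not parameters from this protocol:

```python
# Standard-curve conversion from quantification cycle (Ct) to DNA copy number:
# Ct = slope * log10(copies) + intercept. Slope and intercept here are
# illustrative (slope ~ -3.32 corresponds to 100% PCR efficiency).
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    return 10 ** ((ct - intercept) / slope)
```

With these assumed parameters, each 3.32-cycle decrease in Ct corresponds to a tenfold increase in template copies.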
NASA Astrophysics Data System (ADS)
Shirley, Rachel Elizabeth
Nuclear power plant (NPP) simulators are proliferating in academic research institutions and national laboratories in response to the availability of affordable, digital simulator platforms. Accompanying the new research facilities is a renewed interest in using data collected in NPP simulators for Human Reliability Analysis (HRA) research. An experiment conducted in The Ohio State University (OSU) NPP Simulator Facility develops data collection methods and analytical tools to improve use of simulator data in HRA. In the pilot experiment, student operators respond to design basis accidents in the OSU NPP Simulator Facility. Thirty-three undergraduate and graduate engineering students participated in the research. Following each accident scenario, student operators completed a survey about perceived simulator biases and watched a video of the scenario. During the video, they periodically recorded their perceived strength of significant Performance Shaping Factors (PSFs) such as Stress. This dissertation reviews three aspects of simulator-based research using the data collected in the OSU NPP Simulator Facility: First, a qualitative comparison of student operator performance to computer simulations of expected operator performance generated by the Information Decision Action Crew (IDAC) HRA method. Areas of comparison include procedure steps, timing of operator actions, and PSFs. Second, development of a quantitative model of the simulator bias introduced by the simulator environment. Two types of bias are defined: Environmental Bias and Motivational Bias. This research examines Motivational Bias--that is, the effect of the simulator environment on an operator's motivations, goals, and priorities. A bias causal map is introduced to model motivational bias interactions in the OSU experiment. Data collected in the OSU NPP Simulator Facility are analyzed using Structural Equation Modeling (SEM). 
Data include crew characteristics, operator surveys, and time to recognize and diagnose the accident in the scenario. These models estimate how the effects of the scenario conditions are mediated by simulator bias, and demonstrate how to quantify the strength of the simulator bias. Third, development of a quantitative model of subjective PSFs based on objective data (plant parameters, alarms, etc.) and PSF values reported by student operators. The objective PSF model is based on the PSF network in the IDAC HRA method. The final model is a mixed effects Bayesian hierarchical linear regression model. The subjective PSF model includes three factors: The Environmental PSF, the simulator Bias, and the Context. The Environmental Bias is mediated by an operator sensitivity coefficient that captures the variation in operator reactions to plant conditions. The data collected in the pilot experiments are not expected to reflect professional NPP operator performance, because the students are still novice operators. However, the models used in this research and the methods developed to analyze them demonstrate how to consider simulator bias in experiment design and how to use simulator data to enhance the technical basis of a complex HRA method. The contributions of the research include a framework for discussing simulator bias, a quantitative method for estimating simulator bias, a method for obtaining operator-reported PSF values, and a quantitative method for incorporating the variability in operator perception into PSF models. The research demonstrates applications of Structural Equation Modeling and hierarchical Bayesian linear regression models in HRA. Finally, the research demonstrates the benefits of using student operators as a test platform for HRA research.
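The mediation structure estimated with SEM (scenario condition influencing performance via simulator bias) can be sketched as two regressions on simulated data; every coefficient and variable name here is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
condition = rng.integers(0, 2, n).astype(float)      # scenario condition (0/1)
bias = 0.8 * condition + rng.normal(0.0, 0.1, n)     # simulator bias (mediator)
time_to_diagnose = 2.0 * bias + 0.5 * condition + rng.normal(0.0, 0.1, n)

def ols_coefs(X, y):
    """Least-squares coefficients with an intercept column prepended."""
    A = np.column_stack([np.ones(len(y))] + list(X))
    return np.linalg.lstsq(A, y, rcond=None)[0]

a = ols_coefs([condition], bias)[1]                    # condition -> bias path
b = ols_coefs([condition, bias], time_to_diagnose)[2]  # bias -> outcome path
indirect_effect = a * b                                # effect mediated by bias
```

The product a*b quantifies how much of the condition's effect is carried through the simulator bias, which is the sense in which the bias "mediates" scenario effects in the text.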
NASA Technical Reports Server (NTRS)
Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.
2014-01-01
This paper describes the quantitative application of the theory of System Health Management and its operational subset, Fault Management, to the selection of abort triggers for a human-rated launch vehicle, the United States' National Aeronautics and Space Administration's (NASA) Space Launch System (SLS). The results demonstrate the efficacy of the theory to assess the effectiveness of candidate failure detection and response mechanisms to protect humans from time-critical and severe hazards. The quantitative method was successfully used on the SLS to aid selection of its suite of abort triggers.
Clinical application of a light-pen computer system for quantitative angiography
NASA Technical Reports Server (NTRS)
Alderman, E. L.
1975-01-01
The paper describes an angiographic analysis system which uses a video disk for recording and playback, a light-pen for data input, minicomputer processing, and an electrostatic printer/plotter for hardcopy output. The method is applied to quantitative analysis of ventricular volumes, sequential ventriculography for assessment of physiologic and pharmacologic interventions, analysis of instantaneous time sequence of ventricular systolic and diastolic events, and quantitation of segmental abnormalities. The system is shown to provide the capability for computation of ventricular volumes and other measurements from operator-defined margins by greatly reducing the tedium and errors associated with manual planimetry.
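Volume computation from operator-defined margins can be illustrated with the shoelace formula for planimetered area plus the classical single-plane area-length approximation; this is a generic sketch, not the system's documented algorithm:

```python
import math

def polygon_area(points):
    """Shoelace formula for the area enclosed by an operator-traced margin."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def area_length_volume(area, long_axis):
    """Single-plane area-length ventricular volume: V = 8*A^2 / (3*pi*L)."""
    return 8.0 * area ** 2 / (3.0 * math.pi * long_axis)
```

As a sanity check, a traced circle of radius r (A = pi*r^2, L = 2r) yields the sphere volume 4/3*pi*r^3.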
Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P
2017-12-05
The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as to quantitatively predict their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.
Understanding Pre-Quantitative Risk in Projects
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
2011-01-01
Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
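For reference, the 5 x 5 risk matrix mentioned above reduces to a product of two ordinal scores; the banding thresholds below are illustrative, since each project defines its own:

```python
# A 5 x 5 risk matrix: each risk is scored as likelihood x consequence (1-5 each).
def risk_score(likelihood: int, consequence: int) -> int:
    assert 1 <= likelihood <= 5 and 1 <= consequence <= 5
    return likelihood * consequence

def risk_band(score: int) -> str:
    """Illustrative banding; real projects define their own thresholds."""
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"
```

The paper's point is that this quantification step comes late: teams act on pre-quantitative judgments well before such scores exist.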
Flowers, Sarah A.; Ali, Liaqat; Lane, Catherine S.; Olin, Magnus; Karlsson, Niclas G.
2013-01-01
Rheumatoid arthritis is a common and debilitating systemic inflammatory condition affecting up to 1% of the world's population. This study aimed to investigate the immunological significance of O-glycans in chronic arthritis at a local and systemic level. O-Glycans released from synovial glycoproteins during acute and chronic arthritic conditions were compared and immune-reactive glycans identified. The sulfated core 1 O-glycan (Galβ1–3GalNAcol) was immune reactive, showing a different isomeric profile in the two conditions. From acute reactive arthritis, three isomers could be sequenced, but in patients with chronic rheumatoid arthritis, only a single 3-Gal sulfate-linked isomer could be identified. The systemic significance of this glycan epitope was investigated using the salivary mucin MUC7 in patients with rheumatoid arthritis and normal controls. To analyze this low abundance glycan, a selected reaction monitoring (SRM) method was developed to differentiate and relatively quantitate the core 1 O-glycan and the sulfated core 1 O-glycan Gal- and GalNAc-linked isomers. The acquisition of highly sensitive full scan linear ion trap MS/MS spectra in addition to quantitative SRM data allowed the 3- and 6-linked Gal isomers to be differentiated. The method was used to relatively quantitate the core 1 glycans from MUC7 to identify any systemic changes in this carbohydrate epitope. A statistically significant increase in sulfation was identified in salivary MUC7 from rheumatoid arthritis patients. This suggests a potential role for this epitope in chronic inflammation. This study was able to develop an SRM approach to specifically identify and relatively quantitate sulfated core 1 isomers and the unsulfated structure. The expansion of this method may afford an avenue for the high throughput investigation of O-glycans. PMID:23457413
Knapik, Joseph; Steelman, Ryan
2016-01-01
Objective: To identify and analyze articles in which the authors examined risk factors for soldiers during military static-line airborne operations. Data Sources: We searched for articles in PubMed, the Defense Technical Information Center, reference lists, and other sources using the key words airborne, parachuting, parachutes, paratrooper, injuries, wounds, trauma, and musculoskeletal. Study Selection: The search identified 17 684 potential studies. Studies were included if they were written in English, involved military static-line parachute operations, recorded injuries directly from events on the landing zone or from safety or medical records, and provided data for quantitative assessment of injury risk factors. A total of 23 studies met the review criteria, and 15 were included in the meta-analysis. Data Extraction: The summary statistic obtained for each risk factor was the risk ratio, which was the ratio of the injury risk in 1 group to that of another (baseline) group. Where data were sufficient, meta-analyses were performed and heterogeneity and publication bias were assessed. Data Synthesis: Risk factors for static-line parachuting injuries included night jumps, jumps with extra equipment, higher wind speeds, higher air temperatures, jumps from fixed-wing aircraft rather than balloons or helicopters, jumps onto certain types of terrain, being a female paratrooper, greater body weight, not using the parachute ankle brace, smaller parachute canopies, simultaneous exits from both sides of an aircraft, higher heat index, winds from the rear of the aircraft on exit, entanglements, less experience with a particular parachute system, being an enlisted soldier rather than an officer, and jumps involving a greater number of paratroopers. Conclusions: We analyzed and summarized factors that increased the injury risk for soldiers during military static-line parachute operations.
Understanding and considering these factors in risk evaluations may reduce the likelihood of injury during parachuting. PMID:28068166
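The summary statistic (risk ratio) and an inverse-variance pooled estimate can be sketched as follows; the study counts below are invented for illustration:

```python
import math

def risk_ratio(a, n1, c, n2):
    """RR = (a/n1) / (c/n2): injury risk in the exposed vs the baseline group."""
    return (a / n1) / (c / n2)

def log_rr_variance(a, n1, c, n2):
    """Approximate variance of ln(RR), used for inverse-variance weighting."""
    return 1 / a - 1 / n1 + 1 / c - 1 / n2

# Two hypothetical night-jump studies:
# (injured_night, jumps_night, injured_day, jumps_day)
studies = [(30, 1000, 15, 1000), (24, 800, 10, 800)]
weight_sum = weighted_log_sum = 0.0
for a, n1, c, n2 in studies:
    w = 1.0 / log_rr_variance(a, n1, c, n2)
    weight_sum += w
    weighted_log_sum += w * math.log(risk_ratio(a, n1, c, n2))
pooled_rr = math.exp(weighted_log_sum / weight_sum)  # fixed-effect pooled RR
```

Pooling on the log scale with inverse-variance weights is the standard fixed-effect approach; a random-effects model would additionally absorb between-study heterogeneity.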
Qualitative and quantitative studies of chemical composition of sandarac resin by GC-MS.
Kononenko, I; de Viguerie, L; Rochut, S; Walter, Ph
2017-01-01
The chemical composition of sandarac resin was investigated qualitatively and quantitatively by gas chromatography-mass spectrometry (GC-MS). Six compounds with labdane and pimarane skeletons were identified in the resin. The obtained mass spectra were interpreted and the mass spectrometric behaviour of these diterpenoids under EI conditions was described. Quantitative analysis using an internal standard revealed that the identified diterpenoids represent only 10-30% of the analysed sample. Sandarac resin from different suppliers was analysed (from Kremer, Okhra, Color Rare, La Marchande de Couleurs, L'Atelier Montessori, Hevea). The analysis of different lumps of resin showed that the chemical composition differs from one lump to another, varying mainly in the relative distributions of the components.
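The internal-standard method scales each analyte's peak area by a known spiked mass of standard; a sketch with invented peak areas for six compounds, chosen to land inside the reported 10-30% range:

```python
# Internal-standard quantification: the analyte mass is estimated from its peak
# area relative to a known spiked mass of internal standard (IS).
def analyte_mass(area_analyte, area_is, mass_is, response_factor=1.0):
    """mass_analyte = (A_analyte / A_IS) * m_IS / RF"""
    return (area_analyte / area_is) * mass_is / response_factor

# Invented peak areas for six diterpenoids vs an IS area of 100 and 1.0 mg IS.
areas = [30.0, 40.0, 25.0, 35.0, 45.0, 25.0]
identified_mass = sum(analyte_mass(a, 100.0, 1.0) for a in areas)
sample_mass = 10.0  # mg of resin analysed (hypothetical)
fraction_identified = identified_mass / sample_mass  # ~0.20, i.e. ~20%
```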
ERIC Educational Resources Information Center
Odion, Segun
2011-01-01
The purpose of this quantitative correlational research study was to examine the relationship between costs of operation and total return on profitability of outsourcing information technology technical support in a two-year period of outsourcing operations. United States of America list of Fortune 1000 companies' chief information officers…
This standard operating procedure (SOP) describes a new, rapid, and relatively inexpensive way to remove a precise area of paint from the substrate of building structures in preparation for quantitative analysis. This method has been applied successfully in the laboratory, as we...
A Model for Resource Allocation Using Operational Knowledge Assets
ERIC Educational Resources Information Center
Andreou, Andreas N.; Bontis, Nick
2007-01-01
Purpose: The paper seeks to develop a business model that shows the impact of operational knowledge assets on intellectual capital (IC) components and business performance and use the model to show how knowledge assets can be prioritized in driving resource allocation decisions. Design/methodology/approach: Quantitative data were collected from 84…
ERIC Educational Resources Information Center
Sipko, Marek M.
2010-01-01
This study evaluated the effectiveness of the U.S. Marine Corps combat operational stress preventive training program to determine whether the program meets the training effectiveness criteria of the Marine Corps. This evaluation entailed both qualitative and quantitative inquiries to answer the subject matter research questions. The…
The Relationship between Homework and Performance in an Introductory Operations Management Course.
ERIC Educational Resources Information Center
Peters, Michael; Kethley, Bryan; Bullington, Kimball
2002-01-01
Homework of 142 students in an operations management course was graded, and performance on a multiple-choice test was compared to that of 188 students without graded homework. The treatment group had lower overall scores. Grading did not affect performance on quantitative questions but had a significant effect on nonquantitative questions. (SK)
DOT National Transportation Integrated Search
2012-03-01
Many organizations have become concerned about the environmental impact of their facilities and operations. In order to lessen environmental impact, quantitative assessment of practice based on improvements from a baseline condition is needed. The Ka...
Quantitative analysis of a spinal surgeon's learning curve for scoliosis surgery.
Ryu, K J; Suh, S W; Kim, H W; Lee, D H; Yoon, Y; Hwang, J H
2016-05-01
The aim of this study was a quantitative analysis of a surgeon's learning curve for scoliosis surgery and the relationship between the surgeon's experience and post-operative outcomes, which has not been previously well described. We investigated the operating time as a function of the number of patients to determine a specific pattern; we analysed factors affecting the operating time and compared intra- and post-operative outcomes. We analysed 47 consecutive patients undergoing scoliosis surgery performed by a single, non-trained scoliosis surgeon. Operating time was recorded for each of the four parts of the procedure: dissection, placement of pedicle screws, reduction of the deformity and wound closure. The median operating time was 310 minutes (interquartile range 277.5 to 432.5). The pattern showed a continuous decreasing trend in operating time until the patient number reached 23 to 25, after which it stabilised with fewer patient-dependent changes. The operating time was more affected by the patient number (r = -0.75) than the number of levels fused (r = 0.59). Blood loss (p = 0.016) and length of stay in hospital (p = 0.012) were significantly less after the operating time stabilised. Post-operative functional outcome scores and the rate of complications showed no significant differences. We describe a detailed learning curve for scoliosis surgery based on a single surgeon's practice, providing useful information for novice scoliosis surgeons and for those responsible for training in spinal surgery. Cite this article: Bone Joint J 2016;98-B:679-85. ©2016 The British Editorial Society of Bone & Joint Surgery.
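The decreasing-then-stable operating-time pattern can be detected programmatically. The times below are invented to mimic the reported shape, with stabilisation near case 23 to 25:

```python
# Invented operating times (minutes) for 30 consecutive cases, mimicking the
# reported decreasing-then-stable learning curve.
times = [480, 465, 450, 440, 430, 420, 410, 400, 392, 384,
         376, 368, 361, 354, 348, 342, 337, 332, 328, 324,
         321, 318, 316, 311, 310, 309, 311, 310, 309, 310]

def plateau_case(times, window=5, tol=5.0):
    """First case (1-based) opening a window whose time range is within tol."""
    for i in range(len(times) - window + 1):
        chunk = times[i:i + window]
        if max(chunk) - min(chunk) <= tol:
            return i + 1
    return None
```

A moving-range criterion is only one plausible operationalisation of "stabilised"; the study does not specify its detection rule.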
Martín-Gamboa, Mario; Iribarren, Diego; Susmozas, Ana; Dufour, Javier
2016-08-01
A novel approach is developed to evaluate quantitatively the influence of operational inefficiency in biomass production on the life-cycle performance of hydrogen from biomass gasification. Vine-growers and process simulation are used as key sources of inventory data. The life cycle assessment of biohydrogen according to current agricultural practices for biomass production is performed, as well as that of target biohydrogen according to agricultural practices optimised through data envelopment analysis. Only 20% of the vineyards assessed operate efficiently, and the benchmarked reduction percentages of operational inputs range from 45% to 73% in the average vineyard. The fulfilment of operational benchmarks avoiding irregular agricultural practices is concluded to improve significantly the environmental profile of biohydrogen (e.g., impact reductions above 40% for eco-toxicity and global warming). Finally, it is shown that this type of bioenergy system can be an excellent replacement for conventional hydrogen in terms of global warming and non-renewable energy demand. Copyright © 2016 Elsevier Ltd. All rights reserved.
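Full data envelopment analysis solves a linear program per decision-making unit; as a hedged stand-in, a single-input, single-output version reduces to ratio efficiency, from which a benchmarked input-reduction percentage follows:

```python
# Simplified single-input, single-output efficiency benchmarking (a stand-in
# for full data envelopment analysis). Vineyard names and numbers are invented:
# each unit's efficiency is its output/input ratio relative to the best ratio.
vineyards = {"A": (100.0, 9.0), "B": (60.0, 4.0), "C": (80.0, 10.0)}  # (input, output)
ratios = {k: out / inp for k, (inp, out) in vineyards.items()}
best = max(ratios.values())
efficiency = {k: r / best for k, r in ratios.items()}
# Benchmarked input reduction for an inefficient unit: (1 - efficiency) * 100 %.
reduction_pct = {k: round((1 - e) * 100, 1) for k, e in efficiency.items()}
```

In the study's multi-input setting the benchmarked reductions (45-73% for the average vineyard) come from the full DEA linear program, not this ratio shortcut.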
Measuring Quantum Coherence with Entanglement.
Streltsov, Alexander; Singh, Uttam; Dhar, Himadri Shekhar; Bera, Manabendra Nath; Adesso, Gerardo
2015-07-10
Quantum coherence is an essential ingredient in quantum information processing and plays a central role in emergent fields such as nanoscale thermodynamics and quantum biology. However, our understanding and quantitative characterization of coherence as an operational resource are still very limited. Here we show that any degree of coherence with respect to some reference basis can be converted to entanglement via incoherent operations. This finding allows us to define a novel general class of measures of coherence for a quantum system of arbitrary dimension, in terms of the maximum bipartite entanglement that can be generated via incoherent operations applied to the system and an incoherent ancilla. The resulting measures are proven to be valid coherence monotones satisfying all the requirements dictated by the resource theory of quantum coherence. We demonstrate the usefulness of our approach by proving that the fidelity-based geometric measure of coherence is a full convex coherence monotone, and deriving a closed formula for it on arbitrary single-qubit states. Our work provides a clear quantitative and operational connection between coherence and entanglement, two landmark manifestations of quantum theory and both key enablers for quantum technologies.
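The paper's fidelity-based geometric measure has a closed single-qubit formula not reproduced here; as a minimal illustration of quantifying coherence with respect to a reference basis, the widely used l1-norm of coherence can be computed directly:

```python
import numpy as np

def l1_coherence(rho):
    """l1-norm of coherence: sum of |off-diagonal| entries of the density matrix."""
    return float(np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho))))

# Maximally coherent qubit state |+><+| and an incoherent (diagonal) mixture.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])
```

Like the measures constructed in the paper, this quantifier vanishes exactly on states diagonal in the reference basis.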
Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun
2017-01-30
Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently-identified proteins were recovered, and the peptide-spectrum-match FDR was recalculated and controlled at a confident level of FDR≤1%, while the protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. A >60% increase in total quantified spectra/peptides was achieved for a spike-in sample set and a public dataset from CPTAC, respectively. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly-altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, which can be readily implemented in a broad range of quantitative proteomics techniques including label-free or labeling approaches. We hypothesize that more quantifiable spectra and peptides in a protein, even including less confident peptides, could help reduce variations and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches.
The list of confidently identified proteins using the standard target-decoy search strategy was fixed and more spectra/peptides with less confidence matched to confident proteins were retrieved. However, the total peptide-spectrum-match false discovery rate (PSM FDR) after retrieval analysis was still controlled at a confident level of FDR≤1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications is negligible by comparison with the improvements in quantitative performance. More quantifiable peptides, lower missing value rate, better quantitative accuracy and precision were significantly achieved for the same protein identifications by this simple strategy. This strategy is theoretically applicable for any quantitative approaches in proteomics and thereby provides more quantitative information, especially on low-abundance proteins. Published by Elsevier B.V.
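The target-decoy FDR control at the core of the strategy can be sketched as accepting the largest score-ranked prefix whose decoy/target ratio stays at or below 1%. This is a generic sketch of target-decoy filtering, not the authors' exact implementation:

```python
# Target-decoy FDR control for peptide-spectrum matches (PSMs): accept the
# largest score-ranked prefix whose decoy/target ratio stays at or below 1%.
def psm_cutoff(psms, fdr_threshold=0.01):
    """psms: iterable of (score, is_decoy); returns the accepted PSMs."""
    ranked = sorted(psms, key=lambda p: p[0], reverse=True)
    decoys = targets = 0
    best = 0
    for i, (score, is_decoy) in enumerate(ranked, start=1):
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        if targets and decoys / targets <= fdr_threshold:
            best = i          # remember the largest passing prefix
    return ranked[:best]
```

The retrieval idea in the abstract amounts to rerunning such a filter over the recovered PSMs of already-confident proteins, so a few decoys may enter while the overall PSM FDR stays at or under 1%.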
Lo, Julia C; Pluyter, Kari R; Meijer, Sebastiaan A
2016-02-01
The aim of this study was to examine individual markers of resilience and obtain quantitative insights into the understanding and implications of variation and expertise levels in train traffic operators' goals and strategic mental models and their impact on performance. The Dutch railways operate one of the world's most heavily utilized railway networks and have been identified as weak in system and organizational resilience. Twenty-two train traffic controllers enacted two scenarios in a human-in-the-loop simulator. Their experience, goals, strategic mental models, and performance were assessed through questionnaires and simulator logs. Goals were operationalized through performance indicators and strategic mental models through train completion strategies. A variation was found between operators for both self-reported primary performance indicators and completion strategies. Further, the primary goal of only 14% of the operators reflected the primary organizational goal (i.e., arrival punctuality). An incongruence was also found between train traffic controllers' self-reported performance indicators and objective performance in a more disrupted condition. The level of experience tends to affect performance differently. There is a gap between primary organizational goals and preferred individual goals. Further, the relatively strong diversity in primary operator goals and strategic mental models indicates weak resilience at the individual level. With recent and upcoming large-scale changes throughout the sociotechnical space of the railway infrastructure organization, the findings are useful to facilitate future railway traffic control and the development of a resilient system. © 2015, Human Factors and Ergonomics Society.
Runtime optimization of an application executing on a parallel computer
None
2014-11-25
Identifying a collective operation within an application executing on a parallel computer; identifying a call site of the collective operation; determining whether the collective operation is root-based; if the collective operation is not root-based: establishing a tuning session and executing the collective operation in the tuning session; if the collective operation is root-based, determining whether all compute nodes executing the application identified the collective operation at the same call site; if all compute nodes identified the collective operation at the same call site, establishing a tuning session and executing the collective operation in the tuning session; and if all compute nodes executing the application did not identify the collective operation at the same call site, executing the collective operation without establishing a tuning session.
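The claim's branching logic reduces to a single decision: tune unless the operation is root-based and the nodes disagree on the call site. A minimal sketch in Python, where `is_root_based` and the per-node `call_sites` list are hypothetical inputs standing in for the runtime's bookkeeping, not any real MPI or tuning API:

```python
def should_tune(is_root_based: bool, call_sites: list) -> bool:
    """Decide whether to establish a tuning session for a collective
    operation, following the decision flow described in the claim.

    call_sites: the call site each compute node reported for the
    collective operation.
    """
    if not is_root_based:
        return True                      # non-root-based: always tune
    # root-based: tune only if every node identified the same call site
    return len(set(call_sites)) == 1
```

If the nodes disagree on the call site of a root-based collective, the operation executes without a tuning session.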
Runtime optimization of an application executing on a parallel computer
Faraj, Daniel A; Smith, Brian E
2014-11-18
Identifying a collective operation within an application executing on a parallel computer; identifying a call site of the collective operation; determining whether the collective operation is root-based; if the collective operation is not root-based: establishing a tuning session and executing the collective operation in the tuning session; if the collective operation is root-based, determining whether all compute nodes executing the application identified the collective operation at the same call site; if all compute nodes identified the collective operation at the same call site, establishing a tuning session and executing the collective operation in the tuning session; and if all compute nodes executing the application did not identify the collective operation at the same call site, executing the collective operation without establishing a tuning session.
Runtime optimization of an application executing on a parallel computer
Faraj, Daniel A.; Smith, Brian E.
2013-01-29
Identifying a collective operation within an application executing on a parallel computer; identifying a call site of the collective operation; determining whether the collective operation is root-based; if the collective operation is not root-based: establishing a tuning session and executing the collective operation in the tuning session; if the collective operation is root-based, determining whether all compute nodes executing the application identified the collective operation at the same call site; if all compute nodes identified the collective operation at the same call site, establishing a tuning session and executing the collective operation in the tuning session; and if all compute nodes executing the application did not identify the collective operation at the same call site, executing the collective operation without establishing a tuning session.
Onoka, Chima A; Hanson, Kara; Mills, Anne
2016-08-01
There has been growing interest in the potential for private health insurance (PHI) and private organisations to contribute to universal health coverage (UHC). Yet evidence from low- and middle-income countries remains very thin. This paper examines the evolution of health maintenance organisations (HMOs) in Nigeria, the nature of the PHI plans and social health insurance (SHI) programmes and their performance, and the implications of their business practices for providing PHI and UHC-related SHI programmes. An embedded case study design was used with multiple subunits of analysis (individual HMOs and the HMO industry) and mixed (qualitative and quantitative) methods, and the study was guided by the structure-conduct-performance paradigm that has its roots in the neo-classical theory of the firm. Quantitative data collection and 35 in-depth interviews were carried out between October 2012 and July 2013. Although HMOs first emerged in Nigeria to supply PHI, their expansion was driven by their role as purchasers in the government's national health insurance scheme that finances SHI programmes, and facilitated by a weak accreditation system. HMOs' characteristics distinguish the market they operate in as monopolistically competitive, and HMOs as multiproduct firms operating multiple risk pools through parallel administrative systems. The considerable product differentiation and consequent risk selection by private insurers promote inefficiencies. Where HMOs and similar private organisations play roles in health financing systems, effective regulatory institutions and mandates must be established to guide their behaviours towards attainment of public health goals and to identify and control undesirable business practices. Lessons are drawn for policy makers and programme implementers, especially in those low- and middle-income countries considering the use of private organisations in their health financing systems. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Siddiqi, A.; Muhammad, A.; Wescoat, J. L., Jr.
2017-12-01
Large-scale, legacy canal systems, such as the irrigation infrastructure in the Indus Basin in Punjab, Pakistan, have been primarily conceived, constructed, and operated with a techno-centric approach. The emerging socio-hydrological approaches provide a new lens for studying such systems to potentially identify fresh insights for addressing contemporary challenges of water security. In this work, using the partial definition of water security as "the reliable availability of an acceptable quantity and quality of water", supply reliability is construed as a partial measure of water security in irrigation systems. A set of metrics is used to quantitatively study the reliability of surface supply in the canal systems of Punjab, Pakistan, using an extensive dataset of 10-daily surface water deliveries over a decade (2007-2016) and of high-frequency (10-minute) flow measurements over one year. The reliability quantification is based on comparison of actual deliveries and entitlements, which are a combination of hydrological and social constructs. The socio-hydrological lens highlights critical issues of how flows are measured, monitored, perceived, and experienced from the perspective of operators (government officials) and users (farmers). The analysis reveals varying levels of reliability (and by extension security) of supply when data are examined across multiple temporal and spatial scales. The results shed new light on the evolution of water security (as partially measured by supply reliability) for surface irrigation in the Punjab province of Pakistan and demonstrate that "information security" (defined as reliable availability of sufficiently detailed data) is vital for enabling water security. It is found that forecasting and management (which are social processes) lead to differences between entitlements and actual deliveries, and there is significant potential to positively affect supply reliability through interventions in the social realm.
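The abstract does not give the exact reliability metrics used; one plausible minimal sketch treats reliability as the fraction of delivery periods in which the actual delivery falls within a relative tolerance of the entitlement (the function name and the 20% tolerance are illustrative assumptions, not the study's definition):

```python
def supply_reliability(actual, entitled, tol=0.2):
    """Fraction of delivery periods in which the actual delivery is
    within a relative tolerance `tol` of the entitlement.

    actual, entitled: per-period volumes (e.g. one value per 10-daily
    delivery period)."""
    ok = sum(1 for a, e in zip(actual, entitled)
             if e > 0 and abs(a - e) / e <= tol)
    return ok / len(entitled)
```

Comparing this ratio across canals and across temporal scales (10-daily vs. sub-hourly) is the kind of multi-scale analysis the abstract describes.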
Liu, Xiulan; Chen, Lizhang; He, Xiang
2012-02-01
To analyze the status quo of quantitative classification in the Hunan Province catering industry, and to discuss countermeasures in depth. In accordance with relevant laws and regulations, and after referring to the "Daily supervision and quantitative scoring sheet" and consulting experts, a checklist of key supervision indicators was made. The implementation of quantitative classification in 10 cities in Hunan Province was studied, and the status quo was analyzed. All 390 catering units implemented quantitative classified management. The larger the catering enterprise, the higher its level of quantitative classification. With the exception of cafeterias, the smaller the catering unit, the more points were deducted, with snack bars and beverage stores deducted the most. For units quantified and classified as C or D, more points were deducted for the procurement and storage of raw materials, operational processing, and other aspects. The quantitative classification system in Hunan Province has relatively wide coverage. There are hidden food safety risks in small catering units, snack bars, and beverage stores. The food hygiene conditions in Hunan Province need to be improved.
Muto, Shunsuke; Rusz, Ján; Tatsumi, Kazuyoshi; Adam, Roman; Arai, Shigeo; Kocevski, Vancho; Oppeneer, Peter M; Bürgler, Daniel E; Schneider, Claus M
2014-01-01
Electron magnetic circular dichroism (EMCD) allows the quantitative, element-selective determination of spin and orbital magnetic moments, similar to its well-established X-ray counterpart, X-ray magnetic circular dichroism (XMCD). As an advantage over XMCD, EMCD measurements are made using transmission electron microscopes, which are routinely operated at sub-nanometre resolution, thereby potentially allowing nanometre-scale magnetic characterization. However, because of the low intensity of the EMCD signal, it has not yet been possible to obtain quantitative information from EMCD signals at the nanoscale. Here we demonstrate a new approach to EMCD measurements that considerably extends the reach of the technique. The statistical analysis introduced here yields robust quantitative EMCD signals. Moreover, we demonstrate that quantitative magnetic information can be routinely obtained using electron beams of only a few nanometres in diameter without imposing any restriction regarding the crystalline order of the specimen.
2006-09-01
Astin, Director (December, 1965) [2] Agranovich, Y. Ya. "The theory of operators with dominant main diagonal. I." Positivity, Volume 2 (1998) pages 153... "A Spherical Harmonics Solution for Radiative Transfer Problems with Reflecting Boundaries and Internal Sources," Journal of Quantitative Spectroscopy... F. Huxley. "A Quantitative Description of Membrane Current and its Application to Conduction and Excitation in Nerve," Journal of Physiology (1952
Gallagher, Anthony G; Henn, Patrick J; Neary, Paul C; Senagore, Anthony J; Marcello, Peter W; Bunting, Brendan P; Seymour, Neal E; Satava, Richard M
2018-05-01
Training in medicine must move to an outcome-based approach. A proficiency-based progression outcome approach to training relies on a quantitative estimation of experienced operator performance. We aimed to develop a method for dealing with atypical expert performances in the quantitative definition of surgical proficiency. In study one, 100 experienced laparoscopic surgeons' performances on virtual reality and box-trainer simulators were assessed for two similar laparoscopic tasks. In study two, 15 experienced surgeons and 16 trainee colorectal surgeons performed one simulated hand-assisted laparoscopic colorectal procedure. Performance scores of experienced surgeons in both studies were standardized (i.e. Z-scores) using the mean and standard deviation (SD). Performances >1.96 SDs from the mean were excluded from proficiency definitions. In study one, 1-5% of surgeons' performances were excluded for being significantly below those of their colleagues. Excluded surgeons made significantly fewer correct incisions (mean = 7 (SD = 2) versus 19.42 (SD = 4.6), P < 0.0001) and a greater proportion of incorrect incisions (mean = 45.71 (SD = 10.48) versus 5.25 (SD = 6.6), P < 0.0001). In study two, one experienced colorectal surgeon's performance was >4 SDs from the mean for time to complete the procedure and >6 SDs for path length. After these exclusions, experienced surgeons' performances were significantly better than trainees' for path length (P = 0.031) and for time (P = 0.002). Objectively assessed atypical expert performances were few. Z-score standardization identified them and produced a more robust quantitative definition of proficiency. © 2018 Royal Australasian College of Surgeons.
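The exclusion procedure described above can be sketched in a few lines; a minimal illustration assuming raw performance scores and the 1.96 SD cutoff from the study (the function name and return shape are ours):

```python
import statistics

def proficiency_benchmark(scores, cutoff=1.96):
    """Standardize scores as Z-scores and drop atypical performances
    (|Z| > cutoff) before defining proficiency as the retained mean.

    Returns (benchmark_mean, excluded_scores)."""
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    kept = [s for s in scores if abs((s - mean) / sd) <= cutoff]
    excluded = [s for s in scores if abs((s - mean) / sd) > cutoff]
    return statistics.mean(kept), excluded
```

With, say, five typical scores near 20 and one outlier of 7, the outlier is flagged and the benchmark is computed from the remaining performances.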
Mu, Jun; Yang, Yongtao; Chen, Jin; Cheng, Ke; Li, Qi; Wei, Yongdong; Zhu, Dan; Shao, Weihua; Zheng, Peng; Xie, Peng
2015-10-30
Tuberculous meningitis (TBM) remains one of the most deadly infectious diseases. The pathogen interacts with the host immune system, a process that is largely unknown. Various cellular processes of Mycobacterium tuberculosis (MTB) center around lipid metabolism. To determine the lipid metabolism related proteins, a quantitative proteomic study was performed here to identify differential proteins in the cerebrospinal fluid (CSF) obtained from TBM patients (n = 12) and healthy controls (n = 12). CSF samples were desalted, concentrated, labelled with isobaric tags for relative and absolute quantitation (iTRAQ™), and analyzed by multi-dimensional liquid chromatography-tandem mass spectrometry (LC-MS/MS). Gene ontology and proteomic phenotyping analyses of the differential proteins were conducted using the Database for Annotation, Visualization, and Integrated Discovery (DAVID) Bioinformatics Resources. ApoE and ApoB were selected for validation by ELISA. Proteomic phenotyping showed that 4 of the differential proteins were involved in lipid metabolism. ELISA showed significantly increased ApoB levels in TBM subjects compared to healthy controls. Area under the receiver operating characteristic curve analysis demonstrated that ApoB levels could distinguish TBM subjects from healthy controls and viral meningitis subjects with 89.3% sensitivity and 92% specificity. CSF lipid metabolism dysregulation, especially elevated expression of ApoB, gives insights into the pathogenesis of TBM. Further evaluation of these findings in larger studies, including anti-tuberculosis medicated and unmedicated patient cohorts with other central nervous system infectious diseases, is required for successful clinical translation. Copyright © 2015 Elsevier Inc. All rights reserved.
Identification of allergens by IgE-specific testing improves outcomes in atopic dermatitis.
Will, Brett M; Severino, Richard; Johnson, Douglas W
2017-11-01
IgE quantitative assaying of allergens (IgEQAA) has long been used by allergists to determine patients' reactivities in allergic rhinitis and asthma, two of the three diagnoses in atopic syndrome. This test operates by measuring the patient's IgE response to different allergens and can identify potential triggers for a patient's symptoms. Despite this, IgEQAA has yet to see the same widespread use in the field of dermatology, specifically in the treatment of patients with atopic dermatitis (AD). The affected body surface area (BSA) at first presentation, IgEQAA classes, and total immunoglobulin E (IgE) concentration were recorded retrospectively for 54 patients with AD. Of the 54 patients observed, 41 (76%) had an abnormally high total IgE concentration. Additionally, nine (17%) of the patients improved significantly after making lifestyle changes. Knowledge of the identified specific antigens can guide patients to make lifestyle modifications that may improve disease outcomes. IgEQAA and avoidance of allergens may help some patients with AD. © 2017 The International Society of Dermatology.
Mbuthia, Jackson Mwenda; Rewe, Thomas Odiwuor; Kahi, Alexander Kigunzu
2015-02-01
This study evaluated pig production practices by smallholder farmers in two distinct production systems geared towards addressing their constraints and prospects for improvement. The production systems evaluated were semi-intensive and extensive and differed in remoteness, market access, resource availability and pig production intensity. Data were collected using structured questionnaires where a total of 102 pig farmers were interviewed. Qualitative and quantitative research methods were employed to define the socioeconomic characteristics of the production systems, understanding the different roles that pigs play, marketing systems and constraints to production. In both systems, regular cash income and insurance against emergencies were ranked as the main reasons for rearing pigs. Marketing of pigs was mainly driven by the type of production operation. Finances, feeds and housing were identified as the major constraints to production. The study provides important parameters and identifies constraints important for consideration in design of sustainable production improvement strategies. Feeding challenges can be improved through understanding the composition and proper utilization of local feed resources. Provision of adequate housing would improve the stocking rates and control mating.
Shieh, Fwu-Shan; Jongeneel, Patrick; Steffen, Jamin D; Lin, Selena; Jain, Surbhi; Song, Wei; Su, Ying-Hsiu
2017-01-01
Identification of viral integration sites has been important in understanding the pathogenesis and progression of diseases associated with particular viral infections. The advent of next-generation sequencing (NGS) has enabled researchers to understand the impact that viral integration has on the host, such as tumorigenesis. Current computational methods to analyze NGS data of virus-host junction sites have been limited in terms of their accessibility to a broad user base. In this study, we developed a software application (named ChimericSeq) that is the first program of its kind to offer a graphical user interface, compatibility with both Windows and Mac operating systems, and optimization for effectively identifying and annotating virus-host chimeric reads within NGS data. In addition, ChimericSeq's pipeline implements custom filtering to remove artifacts and detect reads, with quantitative analytical reporting to provide functional significance to discovered integration sites. The improved accessibility of ChimericSeq through a GUI interface on both Windows and Mac has the potential to expand NGS analytical support to a broader spectrum of the scientific community.
Shieh, Fwu-Shan; Jongeneel, Patrick; Steffen, Jamin D.; Lin, Selena; Jain, Surbhi; Song, Wei
2017-01-01
Identification of viral integration sites has been important in understanding the pathogenesis and progression of diseases associated with particular viral infections. The advent of next-generation sequencing (NGS) has enabled researchers to understand the impact that viral integration has on the host, such as tumorigenesis. Current computational methods to analyze NGS data of virus-host junction sites have been limited in terms of their accessibility to a broad user base. In this study, we developed a software application (named ChimericSeq) that is the first program of its kind to offer a graphical user interface, compatibility with both Windows and Mac operating systems, and optimization for effectively identifying and annotating virus-host chimeric reads within NGS data. In addition, ChimericSeq's pipeline implements custom filtering to remove artifacts and detect reads, with quantitative analytical reporting to provide functional significance to discovered integration sites. The improved accessibility of ChimericSeq through a GUI interface on both Windows and Mac has the potential to expand NGS analytical support to a broader spectrum of the scientific community. PMID:28829778
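ChimericSeq's internal pipeline is not described in detail in these abstracts; as a hedged illustration of the underlying idea only, a read can be flagged as a virus-host chimera when sufficiently long portions of it align to both the viral and the host reference. All names and the minimum-segment threshold below are assumptions:

```python
from collections import defaultdict

def find_chimeric_reads(alignments, min_segment=20):
    """Flag reads with one segment aligned to the viral reference and
    another to the host (a toy model of virus-host junction detection).

    alignments: iterable of (read_id, reference, read_start, read_end),
    where reference is "virus" or "host" in this sketch."""
    by_read = defaultdict(set)
    for read_id, ref, start, end in alignments:
        if end - start >= min_segment:        # ignore short alignments
            by_read[read_id].add(ref)
    return [r for r, refs in by_read.items()
            if "virus" in refs and "host" in refs]
```

A real pipeline would work from BAM/SAM alignments and apply additional artifact filters, as the abstract notes.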
Intracavity widely-tunable quantum cascade laser spectrometer.
Brownsword, Richard A; Weidmann, Damien
2013-01-28
A grating-tuned extended-cavity quantum cascade laser (EC-QCL) operating around 7.6 µm was assembled to provide a tuning range of ~80 cm⁻¹ with output power of up to 30 mW. The EC-QCL output power was shown to be sensitive to the presence of a broadband absorbing gas mixture contained in a 2-cm cell introduced inside the extended laser cavity. In this arrangement, enhanced absorption relative to single-path linear absorption was observed. To describe the observations, the effect of intracavity absorption was included in the QCL rate-equation model. The model qualitatively reproduced the absorption behavior observed. In addition, it allowed quantitative measurements of the mixing ratio of dimethyl carbonate, which was used as a test broadband absorber. A number of alternative data acquisition and reduction methods were identified. As the intracavity absorber modifies the laser threshold current, phase-sensitive detection of the laser threshold current was found to be the most attractive way to determine the mixing ratio of the absorber. The dimethyl carbonate detection limit was estimated to be 1.4 ppmv for a 10-second integration time. Limitations and possible ways of improvement were also identified.
Quantitative analysis of backflow of reversible pump-turbine in generating mode
NASA Astrophysics Data System (ADS)
Liu, K. H.; Zhang, Y. N.; Li, J. W.; Xian, H. Z.
2016-05-01
Significant vibration and pressure fluctuations are usually observed when a pump-turbine is operated at off-design conditions, especially turbine brake and runaway. The root cause of these instability phenomena is the abnormal unsteady flow (especially the backflow) inside the pump-turbine. In the present paper, a numerical simulation method is adopted to investigate the characteristics of the flow inside the whole passage of a pump-turbine with two guide vane openings (6° and 21°) and three operating conditions (turbine, runaway and turbine braking). A quantitative analysis of backflow is performed in both the axial and radial directions, and the generation and development of backflow in the pump-turbine are revealed in great detail.
Characterizing user requirements for future land observing satellites
NASA Technical Reports Server (NTRS)
Barker, J. L.; Cressy, P. J.; Schnetzler, C. C.; Salomonson, V. V.
1981-01-01
An objective procedure was developed for identifying probable sensor and mission characteristics for an operational satellite land observing system. Requirements were systematically compiled, quantified and scored by type of use, from surveys of federal, state, local and private communities. Incremental percent increases in the expected value of data were estimated for critical system improvements. Comparisons with costs permitted selection of a probable sensor system, from a set of 11 options, with the following characteristics: 30-meter spatial resolution in 5 bands and 15 meters in 1 band, spectral bands nominally at Thematic Mapper (TM) bands 1 through 6 positions, and 2-day turnaround for receipt of imagery. Improvements are suggested for both the form of the questions and the procedures for analysis of future surveys, in order to provide a more quantitatively precise definition of sensor and mission requirements.
Comin, Cesar Henrique; Xu, Xiaoyin; Wang, Yaming; Costa, Luciano da Fontoura; Yang, Zhong
2014-12-01
We present an image processing approach to automatically analyze duo-channel microscopic images of muscular fiber nuclei and cytoplasm. Nuclei and cytoplasm play a critical role in determining the health and functioning of muscular fibers, as changes in nuclei and cytoplasm manifest in many diseases such as muscular dystrophy and hypertrophy. Quantitative evaluation of muscle fiber nuclei and cytoplasm is thus of great importance to researchers in musculoskeletal studies. The proposed computational approach consists of image processing steps to segment and delineate cytoplasm and identify nuclei in two-channel images. Morphological operations such as skeletonization are applied to extract the length of the cytoplasm for quantification. We tested the approach on real images and found that it can achieve high accuracy, objectivity, and robustness. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities during their academic training that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps that exist between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
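A toy version of such a Monte Carlo baseline can be sketched in a few lines; the driver names, triangular ranges, and weights below are illustrative placeholders, not the actual ACSI model or the dissertation's dashboard:

```python
import random

def simulate_csi(n_runs=10_000, seed=1):
    """Toy Monte Carlo: a customer satisfaction index modeled as a
    weighted sum of three driver scores, each sampled from a
    triangular distribution (low, mode, high). Returns the simulated
    mean and a 90% interval as a baseline for prediction."""
    random.seed(seed)
    drivers = {                       # (low, mode, high, weight)
        "quality":      (60, 80, 95, 0.5),
        "expectations": (55, 75, 90, 0.3),
        "value":        (50, 70, 88, 0.2),
    }
    outcomes = []
    for _ in range(n_runs):
        score = sum(w * random.triangular(lo, hi, mode)
                    for lo, mode, hi, w in drivers.values())
        outcomes.append(score)
    outcomes.sort()
    return {"mean": sum(outcomes) / n_runs,
            "p05": outcomes[int(0.05 * n_runs)],
            "p95": outcomes[int(0.95 * n_runs)]}
```

Sensitivity analysis then amounts to perturbing one driver's range or weight and observing the shift in the simulated index distribution.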
A quantitative telomeric chromatin isolation protocol identifies different telomeric states
NASA Astrophysics Data System (ADS)
Grolimund, Larissa; Aeby, Eric; Hamelin, Romain; Armand, Florence; Chiappe, Diego; Moniatte, Marc; Lingner, Joachim
2013-11-01
Telomere composition changes during tumourigenesis, aging and in telomere syndromes in a poorly defined manner. Here we develop a quantitative telomeric chromatin isolation protocol (QTIP) for human cells, in which chromatin is cross-linked, immunopurified and analysed by mass spectrometry. QTIP involves stable isotope labelling by amino acids in cell culture (SILAC) to compare and identify quantitative differences in telomere protein composition of cells from various states. With QTIP, we specifically enrich telomeric DNA and all shelterin components. We validate the method characterizing changes at dysfunctional telomeres, and identify and validate known, as well as novel telomere-associated polypeptides including all THO subunits, SMCHD1 and LRIF1. We apply QTIP to long and short telomeres and detect increased density of SMCHD1 and LRIF1 and increased association of the shelterins TRF1, TIN2, TPP1 and POT1 with long telomeres. Our results validate QTIP to study telomeric states during normal development and in disease.
Thoma, Brent; Camorlinga, Paola; Chan, Teresa M; Hall, Andrew Koch; Murnaghan, Aleisha; Sherbino, Jonathan
2018-01-01
Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little research has been done on how to succeed in publishing in this area. We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms "research design," "quantitative," "quantitative methods," and "medical education." A hand search was completed for additional references during the full-text review. Titles/abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and outlined reporting guidelines, or provided recommendations on conducting quantitative research. One hundred articles were reviewed in parallel, with the first 30 used for calibration and the subsequent 70 to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full-text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.
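The calibration step above relies on Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal self-contained sketch of the statistic for two raters' categorical (e.g. include/exclude) decisions:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items:
    kappa = (p_obs - p_exp) / (1 - p_exp), where p_obs is observed
    agreement and p_exp is chance agreement from marginal counts."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_exp = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)
```

Perfect agreement gives kappa = 1; agreement no better than chance gives kappa = 0.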
Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides
USDA-ARS?s Scientific Manuscript database
In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...
Values in Qualitative and Quantitative Research
ERIC Educational Resources Information Center
Duffy, Maureen; Chenail, Ronald J.
2008-01-01
The authors identify the philosophical underpinnings and value-ladenness of major research paradigms. They argue that useful and meaningful research findings for counseling can be generated from both qualitative and quantitative research methodologies, provided that the researcher has an appreciation of the importance of philosophical coherence in…
Detecting Genetic Interactions for Quantitative Traits Using m-Spacing Entropy Measure
Yee, Jaeyong; Kwon, Min-Seok; Park, Taesung; Park, Mira
2015-01-01
A number of statistical methods for detecting gene-gene interactions have been developed in genetic association studies with binary traits. However, many phenotype measures are intrinsically quantitative and categorizing continuous traits may not always be straightforward and meaningful. Association of gene-gene interactions with an observed distribution of such phenotypes needs to be investigated directly without categorization. Information gain based on entropy measure has previously been successful in identifying genetic associations with binary traits. We extend the usefulness of this information gain by proposing a nonparametric evaluation method of conditional entropy of a quantitative phenotype associated with a given genotype. Hence, the information gain can be obtained for any phenotype distribution. Because any functional form, such as Gaussian, is not assumed for the entire distribution of a trait or a given genotype, this method is expected to be robust enough to be applied to any phenotypic association data. Here, we show its use to successfully identify the main effect, as well as the genetic interactions, associated with a quantitative trait. PMID:26339620
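The m-spacing idea can be illustrated with a Vasicek-style nonparametric estimator of differential entropy, which uses spacings of the order statistics rather than any assumed density (the boundary clamping and default m are implementation assumptions in this sketch, not the paper's exact estimator):

```python
import math

def m_spacing_entropy(sample, m=3):
    """Vasicek-style m-spacing estimate of differential entropy:
    H ≈ (1/n) Σ log( n * (x_(i+m) - x_(i-m)) / (2m) ),
    with order-statistic indices clamped at the boundaries.
    Nonparametric: no functional form is assumed for the density."""
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]
        hi = x[min(i + m, n - 1)]
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n
```

Because the estimator depends on the data only through spacings, rescaling the sample by a factor c shifts the estimate by exactly log c, matching the behavior of true differential entropy.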
Okuthe, O S; McLeod, A; Otte, J M; Buyu, G E
2003-09-01
Assessment of livestock production constraints in the smallholder dairy systems of the western Kenya highlands was carried out using both qualitative and quantitative epidemiological methods. Rapid rural appraisals (qualitative) were conducted in rural and peri-urban areas. A cross-sectional survey (quantitative) was then conducted on a random sample of farms in the study area. Diseases, poor communication, lack of marketing of livestock produce, lack of artificial insemination services, and feed and water shortages during the dry season were identified as the major constraints to cattle production in both areas. Tick-borne diseases (especially East Coast fever) were identified as the major constraint to cattle production. Qualitative methods were found to be more flexible and cheaper than the quantitative methods, by a ratio of between 2.0 and 2.19. The two methods were found to complement each other. Qualitative studies could be applied in preliminary studies before initiating more specific follow-up quantitative studies.
Dispatching packets on a global combining network of a parallel computer
Almasi, Gheorghe [Ardsley, NY]; Archer, Charles J [Rochester, MN]
2011-07-19
Methods, apparatus, and products are disclosed for dispatching packets on a global combining network of a parallel computer comprising a plurality of nodes connected for data communications using the network capable of performing collective operations and point to point operations that include: receiving, by an origin system messaging module on an origin node from an origin application messaging module on the origin node, a storage identifier and an operation identifier, the storage identifier specifying storage containing an application message for transmission to a target node, and the operation identifier specifying a message passing operation; packetizing, by the origin system messaging module, the application message into network packets for transmission to the target node, each network packet specifying the operation identifier and an operation type for the message passing operation specified by the operation identifier; and transmitting, by the origin system messaging module, the network packets to the target node.
Operational Hydrological Forecasting During the Iphex-iop Campaign - Meet the Challenge
NASA Technical Reports Server (NTRS)
Tao, Jing; Wu, Di; Gourley, Jonathan; Zhang, Sara Q.; Crow, Wade; Peters-Lidard, Christa D.; Barros, Ana P.
2016-01-01
An operational streamflow forecasting testbed was implemented during the Intense Observing Period (IOP) of the Integrated Precipitation and Hydrology Experiment (IPHEx-IOP) in May-June 2014 to characterize flood predictability in complex terrain. Specifically, hydrological forecasts were issued daily for 12 headwater catchments in the Southern Appalachians using the Duke Coupled surface-groundwater Hydrology Model (DCHM) forced by hourly atmospheric fields and QPFs (Quantitative Precipitation Forecasts) produced by the NASA-Unified Weather Research and Forecasting (NU-WRF) model. Previous-day hindcasts forced by radar-based QPEs (Quantitative Precipitation Estimates) were used to provide initial conditions for present-day forecasts. This manuscript first describes the operational testbed framework and workflow during the IPHEx-IOP, including a synthesis of results. Second, various data assimilation approaches are explored a posteriori (post-IOP) to improve operational (flash) flood forecasting. Although all flood events during the IOP were predicted by the IPHEx operational testbed with lead times of up to 6 h, significant errors of over- and/or under-prediction were identified that could be traced back to the QPFs and subgrid-scale variability of radar QPEs. To improve operational flood prediction, three data-merging strategies were pursued post-IOP: (1) the spatial patterns of QPFs were improved through assimilation of satellite-based microwave radiances into NU-WRF; (2) QPEs were improved by merging raingauge observations with ground-based radar observations using bias-correction methods to produce streamflow hindcasts and an associated uncertainty envelope capturing the streamflow observations; and (3) river discharge observations were assimilated into the DCHM to improve streamflow forecasts using the Ensemble Kalman Filter (EnKF), the fixed-lag Ensemble Kalman Smoother (EnKS), and the Asynchronous EnKF (AEnKF) methods.
Both flood hindcasts and forecasts were significantly improved by assimilating discharge observations into the DCHM. Specifically, Nash-Sutcliffe Efficiency (NSE) values as high as 0.98, 0.71 and 0.99 at 15-min time-scales were attained for three headwater catchments in the inner mountain region, demonstrating that the assimilation of discharge observations at the basin's outlet can reduce the errors and uncertainties in soil moisture at very small scales. Success in operational flood forecasting at lead times of 6, 9, 12 and 15 h was also achieved through discharge assimilation, with NSEs of 0.87, 0.78, 0.72 and 0.51, respectively. Analysis of experiments using various data assimilation system configurations indicates that the optimal assimilation time window depends both on basin properties and on the storm-specific space-time structure of rainfall, and therefore adaptive, context-aware configurations of the data assimilation system are recommended to address the challenges of flood prediction in headwater basins.
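The Nash-Sutcliffe Efficiency used to score these hindcasts and forecasts is straightforward to compute; a minimal sketch with invented discharge values (not data from the campaign):

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of the observations.
    1.0 is a perfect fit; 0.0 means no more skill than predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    sse = np.sum((observed - simulated) ** 2)
    variance = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / variance

obs = np.array([10.0, 12.0, 30.0, 55.0, 41.0, 20.0])  # hypothetical 15-min discharges (m3/s)
sim = np.array([11.0, 13.0, 27.0, 50.0, 44.0, 22.0])
score = nse(obs, sim)
```

Note that NSE is unbounded below: a badly biased forecast can score far less than zero, which is why values near the 0.98-0.99 reported above indicate very close agreement.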
Glasby, Michael A; Tsirikos, Athanasios I; Henderson, Lindsay; Horsburgh, Gillian; Jordan, Brian; Michaelson, Ciara; Adams, Christopher I; Garrido, Enrique
2017-08-01
To compare measurements of motor evoked potential (MEP) latency, elicited either magnetically (mMEP) or electrically (eMEP), and central motor conduction time (CMCT), made pre-operatively in conscious patients using transcranial magnetic stimulation and intra-operatively using transcranial electrical stimulation, before and after successful instrumentation for the treatment of adolescent idiopathic scoliosis. A group initially of 51 patients with adolescent idiopathic scoliosis aged 12-19 years was evaluated pre-operatively in the outpatient department with transcranial magnetic stimulation. The neurophysiological data were then compared statistically with intra-operative responses elicited by transcranial electrical stimulation both before and after successful surgical intervention. MEPs were measured as the cortically evoked compound action potentials of abductor hallucis. Minimum F-waves were measured using conventional nerve conduction methods, the lower motor neuron conduction time was calculated, and this was subtracted from the MEP latency to give the CMCT. Pre-operative testing was well tolerated in our paediatric/adolescent patients. No neurological injury occurred in any patient in this series. There was no significant difference between the mMEP and eMEP latencies seen pre-operatively in conscious patients and intra-operatively in patients under anaesthetic. The calculated quantities mCMCT and eCMCT showed the same statistical correlations as the quantities mMEP and eMEP latency. The congruency of mMEP and eMEP, and of mCMCT and eCMCT, suggests that these measurements may be used comparatively and semi-quantitatively for the comparison of pre-, intra-, and post-operative spinal cord function in spinal deformity surgery.
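The CMCT calculation outlined here is simple arithmetic once the peripheral (lower motor neuron) conduction time is known; one common convention estimates it from the minimum F-wave and M-wave latencies as (F + M − 1)/2 ms, where 1 ms is the assumed turnaround delay at the anterior horn cell. The latency values below are hypothetical, and the study may have used a different peripheral estimate:

```python
def peripheral_conduction_time(f_latency_ms, m_latency_ms):
    """Lower motor neuron conduction time from the minimum F-wave and M-wave
    latencies, using the conventional (F + M - 1)/2 formula."""
    return (f_latency_ms + m_latency_ms - 1.0) / 2.0

def cmct(mep_latency_ms, f_latency_ms, m_latency_ms):
    """Central motor conduction time = MEP latency minus peripheral conduction time."""
    return mep_latency_ms - peripheral_conduction_time(f_latency_ms, m_latency_ms)

# Hypothetical abductor hallucis latencies (ms), chosen only to illustrate the arithmetic
value = cmct(mep_latency_ms=38.0, f_latency_ms=48.0, m_latency_ms=5.0)
```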
Okada, Tohru; Iwano, Shingo; Ishigaki, Takeo; Kitasaka, Takayuki; Hirano, Yasushi; Mori, Kensaku; Suenaga, Yasuhito; Naganawa, Shinji
2009-02-01
The ground-glass opacity (GGO) of lung cancer is identified only subjectively on computed tomography (CT) images as no quantitative characteristic has been defined for GGOs. We sought to define GGOs quantitatively and to differentiate between GGOs and solid-type lung cancers semiautomatically with a computer-aided diagnosis (CAD). High-resolution CT images of 100 pulmonary nodules (all peripheral lung cancers) were collected from our clinical records. Two radiologists traced the contours of nodules and distinguished GGOs from solid areas. The CT attenuation value of each area was measured. Differentiation between cancer types was assessed by a receiver-operating characteristic (ROC) analysis. The mean CT attenuation of the GGO areas was -618.4 +/- 212.2 HU, whereas that of solid areas was -68.1 +/- 230.3 HU. CAD differentiated between solid- and GGO-type lung cancers with a sensitivity of 86.0% and specificity of 96.5% when the threshold value was -370 HU. Four nodules of mixed GGOs were incorrectly classified as the solid type. CAD detected 96.3% of GGO areas when the threshold between GGO and solid areas was 194 HU. Objective definition of GGO area by CT attenuation is feasible. This method is useful for semiautomatic differentiation between GGOs and solid types of lung cancer.
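The -370 HU decision rule reported here amounts to a one-parameter classifier; a minimal sketch with invented nodule attenuations (the threshold is from the abstract, the data are not):

```python
HU_THRESHOLD = -370.0  # from the abstract: below this mean attenuation, call the nodule GGO-type

def classify(mean_hu):
    """Label a nodule by its mean CT attenuation (Hounsfield units)."""
    return "GGO" if mean_hu < HU_THRESHOLD else "solid"

# Invented mean attenuations paired with ground-truth labels; the -400 HU "solid"
# nodule mimics a mixed GGO that the threshold misclassifies.
nodules = [(-620.0, "GGO"), (-410.0, "GGO"), (-700.0, "GGO"),
           (-150.0, "solid"), (-60.0, "solid"), (-400.0, "solid"), (-330.0, "solid")]

tp = sum(1 for hu, truth in nodules if truth == "GGO" and classify(hu) == "GGO")
fn = sum(1 for hu, truth in nodules if truth == "GGO" and classify(hu) == "solid")
tn = sum(1 for hu, truth in nodules if truth == "solid" and classify(hu) == "solid")
fp = sum(1 for hu, truth in nodules if truth == "solid" and classify(hu) == "GGO")

sensitivity = tp / (tp + fn)  # fraction of GGO nodules detected
specificity = tn / (tn + fp)  # fraction of solid nodules correctly rejected
```

Sweeping the threshold over a range of HU values and recording each (sensitivity, 1 - specificity) pair is exactly how the ROC curve in the study would be traced.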
NASA Astrophysics Data System (ADS)
Aubrey, A. D.; Christensen, L. E.; Brockers, R.; Thompson, D. R.
2014-12-01
Requirements for greenhouse gas point source detection and quantification often require high spatial resolution on the order of meters. These applications, which help close the gap in emissions estimate uncertainties, also demand sensing with high sensitivity and in a fashion that accounts for spatiotemporal variability on the order of seconds to minutes. Low-cost vertical takeoff and landing (VTOL) small unmanned aerial systems (sUAS) provide a means to detect and identify the location of point source gas emissions while offering ease of deployment and high maneuverability. Our current fielded gas sensing sUAS platforms are able to provide instantaneous in situ concentration measurements at locations within line of sight of the operator. Recent results from field experiments demonstrating methane detection and plume characterization will be discussed here, including performance assessment conducted via a controlled release experiment in 2013. The logical extension of sUAS gas concentration measurement is quantification of flux rate. We will discuss the preliminary strategy for quantitative flux determination, including intrinsic challenges and heritage from airborne science campaigns, associated with this point source flux quantification. This system approach forms the basis for intelligent autonomous quantitative characterization of gas plumes, which holds great value for applications in commercial, regulatory, and safety environments.
Findings from a study of aspiring nursing student leaders.
Waite, Roberta; McKinney, Nicole S
2015-12-01
Transformational leadership skills are critical for operating effectively in today's healthcare environment. Prelicensure nurses do not often practice these skills in a meaningful way during their undergraduate education. This paper describes quantitative pre-post findings from the Kouzes and Posner Student Leadership Practices Inventory, used to examine students' leadership attributes before and after an 18-month undergraduate leadership program. This was a non-experimental convenience study that used a quantitative pre-post survey design, collecting data from participants and observers using the 360-degree Kouzes and Posner Student Leadership Practices Inventory. The setting was a private university in the northeastern region of the United States. Participants were fourteen junior-level nursing students who took part in the leadership program while completing the required academic courses for their bachelor's degree in nursing. Paired-sample t-tests were used to determine whether there was statistical significance between participants' and observers' perceptions of students' specific leadership behaviors and skills at the onset (pretest) and conclusion (posttest) of the leadership program. Participant and observer scores were positively correlated, and statistical significance was identified in several practice areas. It is important to integrate transformational leadership skills into the undergraduate curriculum, since this supports students' engagement in their own learning and instills foundational knowledge critical to their leadership trajectory. Copyright © 2015 Elsevier Ltd. All rights reserved.
Quantitative analysis of iris parameters in keratoconus patients using optical coherence tomography.
Bonfadini, Gustavo; Arora, Karun; Vianna, Lucas M; Campos, Mauro; Friedman, David; Muñoz, Beatriz; Jun, Albert S
2015-01-01
To investigate the relationship between quantitative iris parameters and the presence of keratoconus. Cross-sectional observational study that included 15 affected eyes of 15 patients with keratoconus and 26 eyes of 26 normal age- and sex-matched controls. Iris parameters (area, thickness, and pupil diameter) of affected and unaffected eyes were measured under standardized light and dark conditions using anterior segment optical coherence tomography (AS-OCT). To identify optimal iris thickness cutoff points to maximize the sensitivity and specificity when discriminating keratoconus eyes from normal eyes, the analysis included the use of receiver operating characteristic (ROC) curves. Iris thickness and area were lower in keratoconus eyes than in normal eyes. The mean thickness at the pupillary margin under both light and dark conditions was found to be the best parameter for discriminating normal patients from keratoconus patients. Diagnostic performance was assessed by the area under the ROC curve (AROC), which had a value of 0.8256 with 80.0% sensitivity and 84.6% specificity, using a cutoff of 0.4125 mm. The sensitivity increased to 86.7% when a cutoff of 0.4700 mm was used. In our sample, iris thickness was lower in keratoconus eyes than in normal eyes. These results suggest that tomographic parameters may provide novel adjunct approaches for keratoconus screening.
Study of SRM Critical Surfaces Using Near Infrared Optical Fiber Spectrometry
NASA Technical Reports Server (NTRS)
Workman, G. L.; Hughes, C.; Arendale, W. A.
1997-01-01
The measurement and control of cleanliness for critical surfaces during manufacturing and in service operations provides a unique challenge in the current thrust for environmentally benign processes. Of particular interest has been work performed in maintaining quality in the production of bondline surfaces in propulsion systems and the identification of possible contaminants which are detrimental to the integrity of the bondline. This work requires an in-depth study of the possible sources of contamination, methodologies to identify contaminants, discrimination between contaminants and chemical species caused by environment, and the effect of particular contaminants on the bondline integrity of the critical surfaces. This paper will provide an introduction to the use of Near Infrared (NIR) optical fiber spectrometry in a nondestructive measurement system for process monitoring and how it can be used to help clarify issues concerning surface chemistry. In a previous conference, experimental results for quantitative measurement of silicone and Conoco HD2 greases, and tape residues on solid rocket motor surfaces were presented. This paper will present data for metal hydroxides and discuss the use of the integrating sphere to minimize the effects of physical properties of the surfaces (such as surface roughness) on the results obtained from the chemometric methods used for quantitative analysis.
Hu, Valerie W.; Addington, Anjene; Hyman, Alexander
2011-01-01
The heterogeneity of symptoms associated with autism spectrum disorders (ASDs) has presented a significant challenge to genetic analyses. Even when associations with genetic variants have been identified, it has been difficult to associate them with a specific trait or characteristic of autism. Here, we report that quantitative trait analyses of ASD symptoms combined with case-control association analyses using distinct ASD subphenotypes identified on the basis of symptomatic profiles result in the identification of highly significant associations with 18 novel single nucleotide polymorphisms (SNPs). The symptom categories included deficits in language usage, non-verbal communication, social development, and play skills, as well as insistence on sameness or ritualistic behaviors. Ten of the trait-associated SNPs, or quantitative trait loci (QTL), were associated with more than one subtype, providing partial replication of the identified QTL. Notably, none of the novel SNPs is located within an exonic region, suggesting that these hereditary components of ASDs are more likely related to gene regulatory processes (or gene expression) than to structural or functional changes in gene products. Seven of the QTL reside within intergenic chromosomal regions associated with rare copy number variants that have been previously reported in autistic samples. Pathway analyses of the genes associated with the QTL identified in this study implicate neurological functions and disorders associated with autism pathophysiology. This study underscores the advantage of incorporating both quantitative traits as well as subphenotypes into large-scale genome-wide analyses of complex disorders. PMID:21556359
Genetic dissection of the maize (Zea mays L.) MAMP response.
Zhang, Xinye; Valdés-López, Oswaldo; Arellano, Consuelo; Stacey, Gary; Balint-Kurti, Peter
2017-06-01
Loci associated with variation in maize responses to two microbe-associated molecular patterns (MAMPs) were identified. MAMP responses were correlated. No relationship between MAMP responses and quantitative disease resistance was identified. Microbe-associated molecular patterns (MAMPs) are highly conserved molecules commonly found in microbes which can be recognized by plant pattern recognition receptors. Recognition triggers a suite of responses including production of reactive oxygen species (ROS) and nitric oxide (NO) and expression changes of defense-related genes. In this study, we used two well-studied MAMPs (flg22 and chitooctaose) to challenge different maize lines to determine whether there was variation in the level of responses to these MAMPs, to dissect the genetic basis underlying that variation, and to understand the relationship between MAMP response and quantitative disease resistance (QDR). Naturally occurring quantitative variation in ROS production, NO production, and defense gene expression levels triggered by MAMPs was observed. A major quantitative trait locus (QTL) associated with variation in the ROS production response to both flg22 and chitooctaose was identified on chromosome 2 in a recombinant inbred line (RIL) population derived from the maize inbred lines B73 and CML228. Minor QTL associated with variation in the flg22 ROS response were identified on chromosomes 1 and 4. Comparison of these results with data previously obtained for variation in QDR and the defense response in the same RIL population did not provide any evidence for a common genetic basis controlling variation in these traits.
Bilić, Petra; Guillemin, Nicolas; Kovačević, Alan; Beer Ljubić, Blanka; Jović, Ines; Galan, Asier; Eckersall, Peter David; Burchmore, Richard; Mrljak, Vladimir
2018-05-15
Idiopathic dilated cardiomyopathy (iDCM) is a primary myocardial disorder with an unknown aetiology, characterized by reduced contractility and ventricular dilation of the left or both ventricles. Naturally occurring canine iDCM was used herein to identify a serum proteomic signature of the disease compared to the healthy state, providing an insight into underlying mechanisms and revealing proteins with biomarker potential. To achieve this, we used a high-throughput label-based quantitative LC-MS/MS proteomics approach and bioinformatics analysis of the in silico inferred interactome protein network created from the initial list of differential proteins. To complement the proteomic analysis, serum biochemical parameters and levels of known biomarkers of cardiac function were measured. Several proteins with biomarker potential were identified, such as inter-alpha-trypsin inhibitor heavy chain H4, microfibril-associated glycoprotein 4 and apolipoprotein A-IV, which were validated using an independent method (Western blotting) and showed high specificity and sensitivity according to the receiver operating characteristic curve analysis. Bioinformatics analysis revealed involvement of different pathways in iDCM, such as complement cascade activation, lipoprotein particle dynamics, elastic fibre formation, GPCR signalling and the respiratory electron transport chain. Idiopathic dilated cardiomyopathy is a severe primary myocardial disease of unknown cause, affecting both humans and dogs. This study is a contribution to canine heart disease research by means of state-of-the-art proteomic and bioinformatic analyses, following a similar approach to human iDCM research. Importantly, we used serum as a non-invasive and easily accessible biological source of information and contributed to the scarce data on biofluid proteome research on this topic. Bioinformatics analysis revealed biological pathways modulated in canine iDCM with potential for further targeted research.
Also, several proteins with biomarker potential have been identified and successfully validated. Copyright © 2018 Elsevier B.V. All rights reserved.
Improved diagnosis of pulmonary emphysema using in vivo dark-field radiography.
Meinel, Felix G; Yaroshenko, Andre; Hellbach, Katharina; Bech, Martin; Müller, Mark; Velroyen, Astrid; Bamberg, Fabian; Eickelberg, Oliver; Nikolaou, Konstantin; Reiser, Maximilian F; Pfeiffer, Franz; Yildirim, Ali Ö
2014-10-01
The purpose of this study was to assess whether the recently developed method of grating-based x-ray dark-field radiography can improve the diagnosis of pulmonary emphysema in vivo. Pulmonary emphysema was induced in female C57BL/6N mice using endotracheal instillation of porcine pancreatic elastase and confirmed by in vivo pulmonary function tests, histopathology, and quantitative morphometry. The mice were anesthetized but breathing freely during imaging. Experiments were performed using a prototype small-animal x-ray dark-field scanner that was operated at 35 kilovolt (peak) with an exposure time of 5 seconds for each of the 10 grating steps. Images were compared visually. For quantitative comparison of signal characteristics, regions of interest were placed in the upper, middle, and lower zones of each lung. Receiver-operating-characteristic statistics were performed to compare the effectiveness of transmission and dark-field signal intensities and the combined parameter "normalized scatter" to differentiate between healthy and emphysematous lungs. A clear visual difference between healthy and emphysematous mice was found for the dark-field images. Quantitative measurements of x-ray dark-field signal and normalized scatter were significantly different between the mice with pulmonary emphysema and the control mice and showed good agreement with pulmonary function tests and quantitative histology. The normalized scatter showed a significantly higher discriminatory power (area under the receiver-operating-characteristic curve [AUC], 0.99) than dark-field (AUC, 0.90; P = 0.01) or transmission signal (AUC, 0.69; P < 0.001) alone did, allowing for an excellent discrimination of healthy and emphysematous lung regions. In a murine model, x-ray dark-field radiography is technically feasible in vivo and represents a substantial improvement over conventional transmission-based x-ray imaging for the diagnosis of pulmonary emphysema.
Assessing the risk posed by natural hazards to infrastructures
NASA Astrophysics Data System (ADS)
Eidsvig, Unni Marie K.; Kristensen, Krister; Vidar Vangelsten, Bjørn
2017-03-01
This paper proposes a model for assessing the risk posed by natural hazards to infrastructures, with a focus on the indirect losses and loss of stability for the population relying on the infrastructure. The model prescribes a three-level analysis with increasing level of detail, moving from qualitative to quantitative analysis. The focus is on a methodology for semi-quantitative analyses to be performed at the second level. The purpose of this type of analysis is to perform a screening of the scenarios of natural hazards threatening the infrastructures, identifying the most critical scenarios and investigating the need for further analyses (third level). The proposed semi-quantitative methodology considers the frequency of the natural hazard, different aspects of vulnerability, including the physical vulnerability of the infrastructure itself, and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale according to pre-defined ranking criteria. The proposed indicators, which characterise conditions that influence the probability of an infrastructure malfunctioning caused by a natural event, are defined as (1) robustness and buffer capacity, (2) level of protection, (3) quality/level of maintenance and renewal, (4) adaptability and quality of operational procedures and (5) transparency/complexity/degree of coupling. Further indicators describe conditions influencing the socio-economic consequences of the infrastructure malfunctioning, such as (1) redundancy and/or substitution, (2) cascading effects and dependencies, (3) preparedness and (4) early warning, emergency response and measures. The aggregated risk estimate is a combination of the semi-quantitative vulnerability indicators, as well as quantitative estimates of the frequency of the natural hazard, the potential duration of the infrastructure malfunctioning (e.g. 
depending on the required restoration effort) and the number of users of the infrastructure. Case studies for two Norwegian municipalities are presented for demonstration purposes, where risk posed by adverse weather and natural hazards to primary road, water supply and power networks is assessed. The application examples show that the proposed model provides a useful tool for screening of potential undesirable events, contributing to a targeted reduction of the risk.
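The indicator-based aggregation described above can be sketched as a simple combination of ranked indicators with quantitative hazard and consequence terms. The ranking scale, indicator values, weights, and aggregation formula below are invented for illustration and are not the paper's exact model:

```python
# Indicators ranked on a relative scale, 1 (best) to 5 (worst), per pre-defined criteria.
vulnerability_indicators = {
    "robustness_and_buffer_capacity": 2,
    "level_of_protection": 3,
    "maintenance_and_renewal": 4,
    "operational_procedures": 2,
    "complexity_and_coupling": 3,
}
consequence_indicators = {
    "redundancy_or_substitution": 4,
    "cascading_effects": 3,
    "preparedness": 2,
    "early_warning_and_response": 3,
}

def mean_rank(indicators):
    """Unweighted average rank; real applications would weight the indicators."""
    return sum(indicators.values()) / len(indicators)

def risk_score(frequency_per_year, duration_days, n_users):
    """Semi-quantitative aggregate: hazard frequency x vulnerability rank x
    consequence rank, scaled by outage duration and the number of users affected."""
    vulnerability = mean_rank(vulnerability_indicators)
    consequence = mean_rank(consequence_indicators) * duration_days * n_users
    return frequency_per_year * vulnerability * consequence

score = risk_score(frequency_per_year=0.1, duration_days=2, n_users=5000)
```

Such a score is only meaningful for ranking scenarios against each other, which matches the screening purpose of the second-level analysis: the highest-scoring scenarios are the candidates for a detailed third-level study.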
Gao, Meng; Wang, Yuesheng; Wei, Huizhen; Ouyang, Hui; He, Mingzhen; Zeng, Lianqing; Shen, Fengyun; Guo, Qiang; Rao, Yi
2014-06-01
A method was developed for the determination of amygdalin and its metabolite prunasin in rat plasma after intragastric administration of Maxing shigan decoction. The analytes were identified by ultra-high performance liquid chromatography-tandem quadrupole time-of-flight mass spectrometry and quantitatively determined by ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry. After purification by liquid-liquid extraction, the qualitative analysis of amygdalin and prunasin in the plasma samples was performed on a Shim-pack XR-ODS III HPLC column (75 mm x 2.0 mm, 1.6 µm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on a Triple TOF 5600 quadrupole time-of-flight mass spectrometer. The quantitative analysis of amygdalin and prunasin in the plasma samples was performed by separation on an Agilent C18 HPLC column (50 mm x 2.1 mm, 1.7 µm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on an AB Q-TRAP 4500 triple quadrupole mass spectrometer with an electrospray ionization (ESI) interface operated in negative ion mode and multiple-reaction monitoring (MRM) mode. The qualitative analysis showed that amygdalin and its metabolite prunasin were detected in the plasma samples. The quantitative analysis showed that the linear range of amygdalin was 1.05-4200 ng/mL with a correlation coefficient of 0.9990, and the linear range of prunasin was 1.25-2490 ng/mL with a correlation coefficient of 0.9970. The method had good precision, with relative standard deviations (RSDs) below 9.20%, and overall recoveries varied from 82.33% to 95.25%. The limits of detection (LODs) of amygdalin and prunasin were 0.50 ng/mL. With good reproducibility, the method is simple, fast and effective for the qualitative and quantitative analysis of amygdalin and prunasin in plasma samples of rats administered Maxing shigan decoction.
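The reported linear ranges and correlation coefficients come from an ordinary calibration-curve fit; a minimal sketch with hypothetical calibration points (the concentrations span the stated amygdalin range, but the detector responses are invented):

```python
import numpy as np

# Hypothetical calibration standards for amygdalin: concentration (ng/mL) vs. peak-area ratio
conc = np.array([1.05, 10.0, 50.0, 250.0, 1000.0, 4200.0])
response = np.array([0.021, 0.198, 1.01, 4.95, 20.1, 84.3])

# Least-squares line through the standards; r is the correlation coefficient
slope, intercept = np.polyfit(conc, response, 1)
r = np.corrcoef(conc, response)[0, 1]

def back_calculate(peak_area_ratio):
    """Invert the calibration line to read a plasma concentration from a measured response."""
    return (peak_area_ratio - intercept) / slope

c = back_calculate(10.0)  # concentration implied by a peak-area ratio of 10.0
```

In practice, bioanalytical calibrations often use weighted regression (e.g. 1/x or 1/x²) so that the low end of a wide range like 1.05-4200 ng/mL is not dominated by the highest standards; the unweighted fit here is only the simplest illustration.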
Identifying thermal breakdown products of thermoplastics.
Guillemot, Marianne; Oury, Benoît; Melin, Sandrine
2017-07-01
Polymers processed to produce plastic articles are subjected to temperatures between 150°C and 450°C or more during overheated processing and breakdowns. Heat-based processing of this nature can lead to emission of volatile organic compounds (VOCs) into the thermoplastic processing shop. In this study, laboratory experiments and qualitative and quantitative emissions measurements in thermoplastic factories were carried out. The first step was to identify the compounds released depending on the nature of the thermoplastic, the temperature and the type of process. A thermal degradation protocol that can extrapolate the laboratory results to industrial scenarios was then developed. The influence of three parameters on the released thermal breakdown products was studied: the sample preparation method (manual cutting, ambient grinding, or cold grinding), the heating rate during thermal degradation (5, 10, 20, and 50°C/min), and the decomposition method (thermogravimetric analysis or pyrolysis). Laboratory results were compared to atmospheric measurements taken at 13 companies to validate the protocol and thereby ensure its representativeness of industrial thermal processing. This protocol was applied to the most commonly used thermoplastics to determine their thermal breakdown products and their thermal behaviour. Emissions data collected by personal exposure monitoring and sampling at the process emission area show airborne concentrations of the detected compounds to be in the range of 0-3 mg/m3 under normal operating conditions. Laser cutting and purging operations generate higher pollution levels, in particular formaldehyde, which was found in some cases at concentrations above the workplace exposure limit.
RFID in healthcare: a Six Sigma DMAIC and simulation case study.
Southard, Peter B; Chandra, Charu; Kumar, Sameer
2012-01-01
The purpose of this paper is to develop a business model to generate quantitative evidence of the benefits of implementing radio frequency identification (RFID) technology, limiting the scope to outpatient surgical processes in hospitals. The study primarily uses the define-measure-analyze-improve-control (DMAIC) approach, and draws on various analytical tools such as work flow diagrams, value stream mapping, and discrete event simulation to examine the effect of implementing RFID technology on improving the effectiveness (quality and timeliness) and efficiency (cost reduction) of outpatient surgical processes. The analysis showed significant estimated annual cost and time savings in carrying out patients' surgical procedures with RFID technology implementation for the outpatient surgery processes in a hospital. This is largely due to the elimination of the non-value-added activities of locating supplies and equipment, as well as the elimination of the "return" loop created by preventable post-operative infections. Several poka-yokes developed using RFID technology were identified to eliminate these two issues, improving patient safety and the cost effectiveness of the operation and helping ensure the success of the outpatient surgical process. Many stakeholders in the hospital environment will be impacted, including patients, physicians, nurses, technicians, administrators and other hospital personnel. Different levels of training of hospital personnel will be required, based on the degree of interaction with the RFID system. Computations of costs and savings will help decision makers understand the benefits and implications of the technology in the hospital environment.
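The time-savings estimate above comes from simulating the process with and without the non-value-added "locating supplies" step. A toy Monte Carlo sketch of that comparison, with entirely hypothetical procedure volumes and search times (not the paper's data):

```python
# Hedged sketch of one "analyze" step: estimate annual staff time spent
# locating supplies/equipment before procedures, which RFID would eliminate.
# All figures (5000 procedures/year, 8 +/- 3 min search) are assumptions.
import random

random.seed(42)

def simulate_annual_locating_time(n_procedures, mean_min=8.0, sd_min=3.0):
    """Total minutes spent locating supplies across all procedures (no RFID)."""
    total = 0.0
    for _ in range(n_procedures):
        total += max(0.0, random.gauss(mean_min, sd_min))
    return total

baseline = simulate_annual_locating_time(5000)   # assumed annual outpatient volume
with_rfid = 0.0                                  # idealized: RFID locates items instantly
hours_saved = (baseline - with_rfid) / 60
print(round(hours_saved, 1))
```

The paper's own model is a full discrete event simulation of the surgical workflow; this fragment only illustrates how eliminating one activity translates into an annual time figure.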
This Standard Operating Procedure (SOP) describes a new, rapid, and relatively inexpensive procedure that grinds the paint samples removed from the substrate and simultaneously quantitatively extracts the Pb from the paint in a single step, in preparation for quantitativ...
1982-01-01
the FAETS Operational Scenario, followed by the FAETS Description and Operation. FAETS Specifications will be given, as well as the definition of the...aircraft, expanded basing, new or improved avionics and new or improved armament. Furthermore, explicit quantitative interdependence between
ERIC Educational Resources Information Center
Sandhu, Navjot; Hussain, Javed; Matlay, Harry
2012-01-01
Purpose: The purpose of this paper is to investigate the entrepreneurship education and training (EET) needs of small family businesses operating in the agricultural sector of the Indian economy. Design/methodology/approach: Quantitative and qualitative data were collected through a survey of 122 agricultural family firms in the Indian state of…
ERIC Educational Resources Information Center
Chang, Su-Chao; Lee, Ming-Shing
2007-01-01
Purpose: The main purpose of this paper is to investigate the relationship among leadership, organizational culture, the operation of the learning organization and employees' job satisfaction. Design/methodology/approach: A quantitative research design was employed. A total of 1,000 questionnaires were mailed out and 134 valid replies were received.…
Classification of cassava genotypes based on qualitative and quantitative data.
Oliveira, E J; Oliveira Filho, O S; Santos, V S
2015-02-02
We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative (continuous) traits. We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura; we evaluated these accessions for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t², and maximum likelihood criteria, we identified five and four groups based on quantitative traits and joint analysis, respectively. The smaller number of groups identified based on joint analysis may be related to the nature of the data. Quantitative data, on the other hand, are more subject to environmental effects on phenotype expression, which can produce differentiation even in the absence of genetic differences, thereby contributing to greater differentiation among accessions. For most of the accessions, the maximum probability of classification was >0.90, independent of the trait analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data implied that analysis of quantitative and qualitative traits in cassava germplasm might explore different genomic regions. On the other hand, when joint analysis was used, the means and ranges of genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when there are several phenotypic traits, such as in the case of genetic resources and breeding programs.
Shilling, Val; Morris, Christopher; Thompson-Coon, Jo; Ukoumunne, Obioha; Rogers, Morwenna; Logan, Stuart
2013-07-01
To review the qualitative and quantitative evidence of the benefits of peer support for parents of children with disabling conditions in the context of health, well-being, impact on family, and economic and service implications. We comprehensively searched multiple databases. Eligible studies evaluated parent-to-parent support and reported on the psychological health and experience of giving or receiving support. There were no limits on the child's condition, study design, language, date, or setting. We sought to aggregate quantitative data; findings of qualitative studies were combined using thematic analysis. Qualitative and quantitative data were brought together in a narrative synthesis. Seventeen papers were included: nine qualitative studies, seven quantitative studies, and one mixed-methods evaluation. Four themes were identified from qualitative studies: (1) shared social identity, (2) learning from the experiences of others, (3) personal growth, and (4) supporting others. Some quantitative studies reported a positive effect of peer support on psychological health and other outcomes; however, this was not consistently confirmed. It was not possible to aggregate data across studies. No costing data were identified. Qualitative studies strongly suggest that parents perceive benefit from peer support programmes, an effect seen across different types of support and conditions. However, quantitative studies provide inconsistent evidence of positive effects. Further research should explore whether this dissonance is substantive or an artefact of how outcomes have been measured. © The Authors. Developmental Medicine & Child Neurology © 2013 Mac Keith Press.
Clark, Renee M; Besterfield-Sacre, Mary E
2009-03-01
We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.
Quantum Dynamical Applications of Salem's Theorem
NASA Astrophysics Data System (ADS)
Damanik, David; Del Rio, Rafael
2009-07-01
We consider the survival probability of a state that evolves according to the Schrödinger dynamics generated by a self-adjoint operator H. We deduce from a classical result of Salem that upper bounds for the Hausdorff dimension of a set supporting the spectral measure associated with the initial state imply lower bounds on a subsequence of time scales for the survival probability. This general phenomenon is illustrated with applications to the Fibonacci operator and the critical almost Mathieu operator. In particular, this gives the first quantitative dynamical bound for the critical almost Mathieu operator.
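For readers unfamiliar with the notation, the quantity at issue has a standard definition (standard quantum-dynamics conventions; the final exponent form is schematic rather than the paper's precise statement):

```latex
% Survival probability of the initial state \psi under the dynamics e^{-itH}:
P(t) \;=\; \bigl|\langle \psi,\, e^{-itH}\psi \rangle\bigr|^{2}
      \;=\; \Bigl|\int_{\mathbb{R}} e^{-itx}\, d\mu_{\psi}(x)\Bigr|^{2},
```

where $\mu_\psi$ is the spectral measure of $H$ associated with $\psi$, so $P(t)$ is the squared Fourier transform of $\mu_\psi$. Salem's theorem controls how fast the Fourier transform of a measure supported on a set of Hausdorff dimension at most $\alpha$ can decay, which is why an upper bound on $\dim_H(\operatorname{supp}\mu_\psi)$ yields lower bounds of the schematic form $P(t_n) \gtrsim t_n^{-\alpha}$ along a subsequence $t_n \to \infty$.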
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leitner, Peter; Heyn, Martin F.; Kernbichler, Winfried
In this paper, the impact of momentum and energy conservation of the collision operator in the kinetic description for Resonant Magnetic Perturbations (RMPs) in a tokamak is studied. The particle conserving differential collision operator of Ornstein-Uhlenbeck type is supplemented with integral parts such that energy and momentum are conserved. The application to RMP penetration in a tokamak shows that energy conservation in the electron collision operator is important for the quantitative description of plasma shielding effects at the resonant surface. On the other hand, momentum conservation in the ion collision operator does not significantly change the results.
Occurrence of invertebrates at 38 stream sites in the Mississippi Embayment study unit, 1996-99
Caskey, Brian J.; Justus, B.G.; Zappia, Humbert
2002-01-01
A total of 88 invertebrate species and 178 genera representing 59 families, 8 orders, 6 classes, and 3 phyla were identified at 38 stream sites in the Mississippi Embayment Study Unit from 1996 through 1999 as part of the National Water-Quality Assessment Program. Sites were selected based on land use within the drainage basins and the availability of long-term streamflow data. Invertebrates were sampled as part of an overall sampling design to provide information related to the status of and trends in water quality in the Mississippi Embayment Study Unit, which includes parts of Arkansas, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. Invertebrate sampling and processing were conducted using nationally standardized techniques developed for the National Water-Quality Assessment Program. These techniques included both a semi-quantitative method, which targeted habitats where invertebrate diversity is expected to be highest, and a qualitative multihabitat method, which samples all available habitat types within a sampling reach. All invertebrate samples were shipped to the USGS National Water-Quality Laboratory (NWQL), where they were processed. Of the 365 taxa identified, 156 were identified with the semi-quantitative method, which involved sampling a known quantity of what was expected to be the richest habitat, woody debris. The qualitative method, which involved sampling all available habitats, identified 345 taxa. The number of organisms identified in the semi-quantitative samples ranged from 74 to 3,295, whereas the number of taxa identified ranged from 9 to 54. The number of organisms identified in the qualitative samples ranged from 42 to 29,634, whereas the number of taxa ranged from 18 to 81. Of all the organisms identified, chironomid taxa were the most frequently identified, and plecopteran taxa were among the least frequently identified.
Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen
2013-01-01
ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy.
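Selecting the "141 differentially expressed proteins by their relative abundance" boils down to thresholding fold changes computed from iTRAQ reporter-ion ratios. A hedged sketch with invented protein ratios (the paper does not state its cutoff here; 0.8 on the log2 scale is an assumption):

```python
# Toy fold-change filter for iTRAQ-style relative quantitation. Protein names
# and ratios are hypothetical examples, not the study's measurements.
import math

ratios = {                      # protein -> treated/control abundance ratio
    "PCK1": 2.6,                # e.g. a gluconeogenesis enzyme, up-regulated
    "GYS1": 1.9,
    "ACTB": 1.02,               # housekeeping protein, essentially unchanged
    "LDHA": 0.48,               # e.g. a glycolytic enzyme, down-regulated
}

def differential(ratios, log2_cutoff=0.8):
    """Keep proteins whose |log2 fold change| meets the cutoff."""
    hits = {}
    for protein, r in ratios.items():
        fc = math.log2(r)
        if abs(fc) >= log2_cutoff:
            hits[protein] = round(fc, 2)
    return hits

print(differential(ratios))
```

Real iTRAQ pipelines also model reporter-ion interference and require consistent ratios across peptides; this fragment shows only the final selection step.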
Smith, Ben J; Zehle, Katharina; Bauman, Adrian E; Chau, Josephine; Hawkshaw, Barbara; Frost, Steven; Thomas, Margaret
2006-04-01
This study examined the use of quantitative methods in Australian health promotion research in order to identify methodological trends and priorities for strengthening the evidence base for health promotion. Australian health promotion articles were identified by hand searching publications from 1992-2002 in six journals: Health Promotion Journal of Australia, Australian and New Zealand Journal of Public Health, Health Promotion International, Health Education Research, Health Education and Behavior, and the American Journal of Health Promotion. The study designs and statistical methods used in articles presenting quantitative research were recorded. Of the 1,025 articles, 591 (57.7%) used quantitative methods. Cross-sectional designs were used in the majority (54.3%) of studies, with pre- and post-test (14.6%) and post-test only (9.5%) the next most common designs. Bivariate statistical methods were used in 45.9% of papers, multivariate methods in 27.1%, and simple numbers and proportions in 25.4%. Few studies used higher-level statistical techniques. While most studies used quantitative methods, the majority were descriptive in nature. The study designs and statistical methods used provided limited scope for demonstrating intervention effects or understanding the determinants of change.
NASA Astrophysics Data System (ADS)
Setiani, C.; Waluya, S. B.; Wardono
2018-03-01
The purposes of this research are: (1) to identify learning quality in Model Eliciting Activities (MEAs) using a Metaphorical Thinking (MT) approach, both qualitatively and quantitatively; and (2) to analyze the mathematical literacy of students based on Self-Efficacy (SE). This research uses a mixed-methods concurrent embedded design with qualitative research as the primary method. The quantitative component used a quasi-experimental, non-equivalent control group design. The population comprises grade VIII students of SMP Negeri 3 Semarang, Indonesia. Quantitative data are examined by conducting a completeness mean test, standard completeness test, mean differentiation test and proportional differentiation test. Qualitative data are analyzed descriptively. The results show that MEAs learning using the MT approach meets good criteria both quantitatively and qualitatively. Students with low self-efficacy can identify problems, but they lack the ability to devise problem-solving strategies for mathematical literacy questions. Students with medium self-efficacy can identify the information provided in problems, but they have difficulty using mathematical symbols to construct representations. Students with high self-efficacy are adept at representing problems as mathematical models and figures using appropriate symbols and tools, so they can easily devise strategies to solve mathematical literacy questions.
QDMR: a quantitative method for identification of differentially methylated regions by entropy
Zhang, Yan; Liu, Hongbo; Lv, Jie; Xiao, Xue; Zhu, Jiang; Liu, Xiaojuan; Su, Jianzhong; Li, Xia; Wu, Qiong; Wang, Fang; Cui, Ying
2011-01-01
DNA methylation plays critical roles in transcriptional regulation and chromatin remodeling. Differentially methylated regions (DMRs) have important implications for development, aging and diseases. Therefore, genome-wide mapping of DMRs across various temporal and spatial methylomes is important in revealing the impact of epigenetic modifications on heritable phenotypic variation. We present a quantitative approach, quantitative differentially methylated regions (QDMR), to quantify methylation differences and identify DMRs from genome-wide methylation profiles by adapting Shannon entropy. QDMR was applied to synthetic methylation patterns and methylation profiles detected by methylated DNA immunoprecipitation microarray (MeDIP-chip) in human tissues/cells. This approach gives a reasonable quantitative measure of methylation difference across multiple samples. A DMR threshold was then determined from a methylation probability model. Using this threshold, QDMR identified 10,651 tissue DMRs related to genes enriched for cell differentiation, including 4,740 DMRs not identified by the method developed by Rakyan et al. QDMR can also measure the sample specificity of each DMR. Finally, the application to methylation profiles detected by reduced representation bisulphite sequencing (RRBS) in mouse showed the platform-free and species-free nature of QDMR. This approach provides an effective tool for the high-throughput identification of potential functional regions involved in epigenetic regulation. PMID:21306990
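The entropy idea behind this kind of approach can be sketched in a few lines: treat a region's methylation levels across samples as a probability distribution; uniform methylation gives maximal entropy, while methylation concentrated in a few samples (a DMR candidate) gives low entropy. This is a simplified illustration of the principle, not the published QDMR formula:

```python
# Simplified Shannon-entropy score for cross-sample methylation specificity.
# The four-tissue methylation levels below are invented for illustration.
import math

def methylation_entropy(levels):
    """Shannon entropy (bits) of methylation levels normalized across samples."""
    total = sum(levels)
    probs = [v / total for v in levels if v > 0]
    return -sum(p * math.log2(p) for p in probs)

uniform_region = [0.8, 0.8, 0.8, 0.8]    # similar methylation in all 4 tissues
tissue_specific = [0.9, 0.05, 0.05, 0.05]  # methylated almost only in one tissue

print(round(methylation_entropy(uniform_region), 3))   # near log2(4) = 2 bits
print(round(methylation_entropy(tissue_specific), 3))  # markedly lower
```

Regions scoring below an entropy threshold would be flagged as candidate DMRs; the published method additionally derives that threshold from a probability model.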
Houssami, Nehmat; Turner, Robin M; Morrow, Monica
2017-09-01
Although there is no consensus on whether pre-operative MRI in women with breast cancer (BC) benefits surgical treatment, MRI continues to be used pre-operatively in practice. This meta-analysis examines the association between pre-operative MRI and surgical outcomes in BC. A systematic review was performed to identify studies reporting quantitative data on pre-operative MRI and surgical outcomes (without restriction by type of surgery received or type of BC) and using a controlled design. Random-effects logistic regression calculated the pooled odds ratio (OR) for each surgical outcome (MRI vs. no-MRI groups), and estimated ORs stratified by study-level age. Subgroup analysis was performed for invasive lobular cancer (ILC). Nineteen studies met eligibility criteria: 3 RCTs and 16 comparative studies that included newly diagnosed BC of any type except for three studies restricted to ILC. Primary analysis (85,975 subjects) showed that pre-operative MRI was associated with increased odds of receiving mastectomy [OR 1.39 (1.23, 1.57); p < 0.001]; similar findings were shown in analyses stratified by study-level median age. Secondary analyses did not find statistical evidence of an effect of MRI on the rates of re-excision, re-operation, or positive margins; however, MRI was significantly associated with increased odds of receiving contralateral prophylactic mastectomy [OR 1.91 (1.25, 2.91); p = 0.003]. Subgroup analysis for ILC did not find any association between MRI and the odds of receiving mastectomy [OR 1.00 (0.75, 1.33); p = 0.988] or the odds of re-excision [OR 0.65 (0.35, 1.24); p = 0.192]. Pre-operative MRI is associated with increased odds of receiving ipsilateral mastectomy and contralateral prophylactic mastectomy as surgical treatment in newly diagnosed BC patients.
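The pooled odds ratios above come from random-effects logistic regression; the classical hand calculation for pooling study-level odds ratios is DerSimonian-Laird inverse-variance weighting, sketched below with invented study data (not the meta-analysis's inputs, and a simpler estimator than the one the authors used):

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling of odds ratios.
# The three (log OR, variance) pairs are hypothetical studies.
import math

studies = [
    (math.log(1.5), 0.04),
    (math.log(1.2), 0.09),
    (math.log(1.6), 0.02),
]

def dersimonian_laird(studies):
    y = [lor for lor, _ in studies]
    w = [1 / v for _, v in studies]          # fixed-effect (inverse-variance) weights
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    df = len(studies) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance estimate
    w_re = [1 / (v + tau2) for _, v in studies]
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = 1 / math.sqrt(sum(w_re))
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci

or_, ci = dersimonian_laird(studies)
print(round(or_, 2), (round(ci[0], 2), round(ci[1], 2)))
```

With homogeneous studies (Q below its degrees of freedom) the between-study variance estimate collapses to zero and the result equals the fixed-effect pooled OR.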
[Methods of quantitative proteomics].
Kopylov, A T; Zgoda, V G
2007-01-01
In modern science, proteomic analysis is inseparable from the other fields of systems biology. Quantitative proteomics commands vast resources and handles an enormous amount of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 different methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use various isotope labels (tags). In this review we consider the most popular and effective methods, employing chemical modification of proteins as well as metabolic and enzymatic isotope labeling.
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
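Dempster's rule of combination, the fusion step named above, multiplies the basic probability masses of two sources over intersecting hypothesis sets and renormalizes away the conflicting mass. A minimal implementation with hypothetical fault hypotheses A and B (the mass values are invented, not the paper's):

```python
# Minimal Dempster's rule of combination over frozenset focal elements.

def dempster_combine(m1, m2):
    """Combine two basic probability assignments; raises on total conflict."""
    combined, conflict = {}, 0.0
    for s1, v1 in m1.items():
        for s2, v2 in m2.items():
            inter = s1 & s2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2          # mass assigned to contradictory sets
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

A, B = frozenset("A"), frozenset("B")
model1 = {A: 0.6, A | B: 0.4}        # diagnostic model 1 mostly blames fault A
model2 = {B: 0.3, A | B: 0.7}        # model 2 gives some weight to fault B
combined = dempster_combine(model1, model2)
print(combined)
```

The mass left on the composite set {A, B} captures exactly the "potentially incomplete inference" the abstract mentions: evidence that supports a fault family without singling out one member.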
Quantitative Reasoning and the Sine Function: The Case of Zac
ERIC Educational Resources Information Center
Moore, Kevin C.
2014-01-01
A growing body of literature has identified quantitative and covariational reasoning as critical for secondary and undergraduate student learning, particularly for topics that require students to make sense of relationships between quantities. The present study extends this body of literature by characterizing an undergraduate precalculus…
Go Figure! Using Quantitative Measures to Enhance Program and Student Success
ERIC Educational Resources Information Center
Frost, Leanne H.; Braun, Gwendolyn K.
2006-01-01
Using quantitative assessment, Montana State University-Billings substantially improved and expanded its developmental education program and learning center during the past five years. Student-centered questions drove the research efforts. By gathering, analyzing and sharing hard data, the department identified unmet student needs, discovered…
Comparative analysis of hospital energy use: pacific northwest and scandinavia.
Burpee, Heather; McDade, Erin
2014-01-01
This study aimed to establish the potential for significant energy reduction in hospitals in the United States by providing evidence of Scandinavian operational precedents with high Interior Environmental Quality (IEQ) and substantially lower energy profiles than comparable U.S. facilities. These facilities set important precedents for design teams seeking operational examples for achieving aggressive energy and interior environmental quality goals. This examination of operational hospitals is intended to offer hospital owners, designers, and building managers a strong case and concrete framework for strategies to achieve exceptionally high performing buildings. Energy efficient hospitals have the potential to significantly impact the U.S.'s overall energy profile, and key stakeholders in the hospital industry need specific, operationally grounded precedents in order to successfully implement informed energy reduction strategies. This study is an outgrowth of previous research evaluating high quality, low energy hospitals that serve as examples for new high performance hospital design, construction, and operation. Through extensive interviews, numerous site visits, the development of case studies, and data collection, this team has established thorough qualitative and quantitative analyses of several contemporary hospitals in Scandinavia and the Pacific Northwest. Many Scandinavian hospitals demonstrate a low energy profile, and when analyzed in comparison with U.S. hospitals, such Scandinavian precedents help define the framework required to make significant changes in the U.S. hospital building industry. Eight hospitals, four Scandinavian and four Pacific Northwest, were quantitatively compared using the Environmental Protection Agency's Portfolio Manager, allowing researchers to answer specific questions about the impact of energy source and architectural and mechanical strategies on energy efficiency in operational hospitals. 
Specific architectural, mechanical, and plant systems make these Scandinavian hospitals more energy efficient than their Pacific Northwest counterparts. More importantly, synergistic systems integration allows for their significant reductions in energy consumption. This quantitative comparison of operational Scandinavian and Pacific Northwest hospitals resulted in compelling evidence of the potential for deep energy savings in the U.S., and allowed researchers to outline specific strategies for achieving such reductions. © 2014 Vendome Group, LLC.
Power Grid Construction Project Portfolio Optimization Based on Bi-level programming model
NASA Astrophysics Data System (ADS)
Zhao, Erdong; Li, Shangqi
2017-08-01
As the main body of power grid operation, county-level power supply enterprises undertake an important mission: guaranteeing the security of power grid operation and safeguarding the social order of electricity use. The optimization of grid construction projects has been a key determinant of the power supply capacity and service level of grid enterprises. Reflecting the actual situation of power grid construction project optimization in county-level power enterprises, and on the basis of qualitative analysis of the projects, this paper builds a bi-level programming model grounded in quantitative analysis. The upper layer of the model expresses the target constraints of the optimal portfolio; the lower layer captures the enterprise's financial restrictions on the size of the project portfolio. Finally, a real example illustrates the operation and the optimization results of the model. By combining qualitative and quantitative analysis, the bi-level programming model improves the accuracy and standardization of grid enterprises' project selection.
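The upper/lower layer split can be illustrated with a toy portfolio problem: the upper level maximizes total benefit, but only over portfolios the lower level accepts as financially feasible. This brute-force enumeration (hypothetical project names, scores, costs, and budget) is a sketch of the structure, not the paper's model or solution method:

```python
# Toy two-layer portfolio selection: upper level maximizes benefit subject to
# the lower level's budget feasibility check. All numbers are invented.
from itertools import combinations

projects = {                 # name: (benefit score, cost in million yuan)
    "substation_upgrade": (9, 40),
    "line_reinforcement": (7, 25),
    "rural_extension":    (6, 30),
    "smart_metering":     (4, 10),
}
BUDGET = 70                  # lower-level financial restriction

def lower_level_feasible(portfolio):
    """Lower layer: accept only portfolios within the enterprise budget."""
    return sum(projects[p][1] for p in portfolio) <= BUDGET

def upper_level_optimize():
    """Upper layer: best total benefit among lower-level-feasible portfolios."""
    best, best_score = (), 0
    names = list(projects)
    for k in range(1, len(names) + 1):
        for portfolio in combinations(names, k):
            if lower_level_feasible(portfolio):
                score = sum(projects[p][0] for p in portfolio)
                if score > best_score:
                    best, best_score = portfolio, score
    return set(best), best_score

chosen, score = upper_level_optimize()
print(chosen, score)
```

Note that the cheapest three projects beat the single highest-scoring project here: a genuine bi-level solver would reach the same kind of trade-off without enumerating every subset.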
Starke, Sandra D; Baber, Chris; Cooke, Neil J; Howes, Andrew
2017-05-01
Road traffic control rooms rely on human operators to monitor and interact with information presented on multiple displays. Past studies have found inconsistent use of available visual information sources in such settings across different domains. In this study, we aimed to broaden the understanding of observer behaviour in control rooms by analysing a case study in road traffic control. We conducted a field study in a live road traffic control room where five operators responded to incidents while wearing a mobile eye tracker. Using qualitative and quantitative approaches, we investigated the operators' workflow using ergonomics methods and quantified visual information sampling. We found that individuals showed differing preferences for viewing modalities and weighting of task components, with a strong coupling between eye and head movement. For the quantitative analysis of the eye tracking data, we propose a number of metrics which may prove useful to compare visual sampling behaviour across domains in future. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Operation of the Computer Software Management and Information Center (COSMIC)
NASA Technical Reports Server (NTRS)
1983-01-01
The major operational areas of the COSMIC center are described. Quantitative data on the software submittals, program verification, and evaluation are presented. The dissemination activities are summarized. Customer services and marketing activities of the center for the calendar year are described. Those activities devoted to the maintenance and support of selected programs are described. A Customer Information system, the COSMIC Abstract Recording System Project, and the COSMIC Microfiche Project are summarized. Operational cost data are summarized.
Application of automation and robotics to lunar surface human exploration operations
NASA Technical Reports Server (NTRS)
Woodcock, Gordon R.; Sherwood, Brent; Buddington, Patricia A.; Bares, Leona C.; Folsom, Rolfe; Mah, Robert; Lousma, Jack
1990-01-01
Major results of a study applying automation and robotics to lunar surface base buildup and operations concepts are reported. The study developed a reference base scenario with specific goals, equipment concepts, robot concepts, activity schedules and buildup manifests. It examined crew roles, contingency cases and system reliability, and proposed a set of technologies appropriate and necessary for effective lunar operations. This paper refers readers to four companion papers for quantitative details where appropriate.
Xu, Fei-Fan; Chen, Jin-Hong; Leung, Gilberto Ka Kit; Hao, Shu-Yu; Xu, Long; Hou, Zong-Gang; Mao, Xiang; Shi, Guang-Zhi; Li, Jing-Sheng; Liu, Bai-Yun
2014-01-01
Post-operative volume of subdural fluid is considered to correlate with recurrence in chronic subdural haematoma (CSDH). Information on the application of computer-assisted volumetric analysis in patients with CSDHs is lacking. To investigate the relationship between haematoma recurrence and longitudinal changes in subdural fluid volume using CT volumetric analysis, fifty-four patients harbouring 64 CSDHs were studied prospectively. The association between recurrence rate and CT findings was investigated. Eleven patients (20.4%) experienced post-operative recurrence. Higher pre-operative (over 120 ml) and/or pre-discharge subdural fluid volumes (over 22 ml) were significantly associated with recurrence; the probabilities of non-recurrence for values below these thresholds were 92.7% and 95.2%, respectively. CSDHs with larger pre-operative (over 15.1 mm) and/or residual (over 11.7 mm) widths also had significantly increased recurrence rates. Bilateral CSDHs were not found to be more likely to recur in this series. On receiver operating characteristic curve analysis, the areas under the curve for the magnitude of changes in subdural fluid volume were greater than those of any single time-point measure of either the width or the volume of the subdural fluid cavity. Close imaging follow-up is important for recurrence prediction in CSDH patients. Using quantitative CT volumetric analysis, strong evidence was provided that changes in residual fluid volume during the 'self-resolution' period can be used as significant radiological predictors of recurrence.
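The ROC comparison above hinges on the AUC, which can be computed directly from the Mann-Whitney interpretation: the probability that a randomly chosen recurrent case has a higher measurement than a randomly chosen non-recurrent one. A sketch with invented fluid volumes (not the study's data):

```python
# AUC via the Mann-Whitney relationship; ties count as half a win.

def roc_auc(pos_scores, neg_scores):
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# hypothetical pre-operative subdural fluid volumes (ml)
recurred = [150, 180, 130, 210]
not_recurred = [60, 95, 110, 140, 80]

auc = roc_auc(recurred, not_recurred)
print(round(auc, 3))
```

An AUC near 1 means a volume threshold can separate the groups well, which is what lets the study quote high non-recurrence probabilities below its cut-offs.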
Reynolds, Alexandra S; Guo, Xiaotao; Matthews, Elizabeth; Brodie, Daniel; Rabbani, Leroy E; Roh, David J; Park, Soojin; Claassen, Jan; Elkind, Mitchell S V; Zhao, Binsheng; Agarwal, Sachin
2017-08-01
Traditional predictors of neurological prognosis after cardiac arrest are unreliable after targeted temperature management. Absence of pupillary reflexes remains a reliable predictor of poor outcome. Diffusion-weighted imaging has emerged as a potential predictor of recovery, and here we compare imaging characteristics to the pupillary exam. We identified 69 patients who had MRIs within seven days of arrest and used a semi-automated algorithm to perform quantitative volumetric analysis of apparent diffusion coefficient (ADC) sequences at various thresholds. Areas under receiver operating characteristic curves (ROC-AUC) were estimated to compare predictive values of quantitative MRI with the pupillary exam at days 3, 5 and 7 post-arrest, for persistence of coma and functional outcomes at discharge. Cerebral Performance Category scores of 3-4 were considered poor outcome. Excluding patients in whom life support was withdrawn, ≥2.8% diffusion restriction of the entire brain at an ADC of ≤650×10⁻⁶ mm²/s was 100% specific and 68% sensitive for failure to wake up from coma before discharge. The ROC-AUC of ADC changes at ≤450×10⁻⁶ mm²/s and ≤650×10⁻⁶ mm²/s were significantly superior in predicting failure to wake up from coma compared to bilateral absence of pupillary reflexes. Among survivors, >0.01% diffusion restriction of the entire brain at an ADC of ≤450×10⁻⁶ mm²/s was 100% specific and 46% sensitive for poor functional outcome at discharge. The ROC curve predicting poor functional outcome at ADC ≤450×10⁻⁶ mm²/s had an AUC of 0.737 (0.574-0.899, p=0.04). Post-anoxic diffusion changes on quantitative brain MRI may aid in predicting persistent coma and poor functional outcomes at hospital discharge. Copyright © 2017 Elsevier B.V. All rights reserved.
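The core imaging feature here, the percentage of brain voxels at or below an ADC threshold, is straightforward to compute once an ADC map and brain mask exist. A hedged, minimal illustration (not the paper's semi-automated pipeline):

```python
import numpy as np

def diffusion_restriction_percent(adc_map, threshold=650.0, brain_mask=None):
    """Percentage of brain voxels whose ADC value (in units of
    1e-6 mm^2/s) lies at or below `threshold`.

    `brain_mask` is a boolean array restricting the computation to brain
    tissue; if omitted, every voxel is counted as brain."""
    adc = np.asarray(adc_map, dtype=float)
    mask = np.ones(adc.shape, dtype=bool) if brain_mask is None else brain_mask
    voxels = adc[mask]
    return 100.0 * np.count_nonzero(voxels <= threshold) / voxels.size
```

Applied to a real scan, the result would be compared against the study's cut-offs (e.g. ≥2.8% at the 650 threshold for failure to wake).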
Claus, Rainer; Lucas, David M.; Stilgenbauer, Stephan; Ruppert, Amy S.; Yu, Lianbo; Zucknick, Manuela; Mertens, Daniel; Bühler, Andreas; Oakes, Christopher C.; Larson, Richard A.; Kay, Neil E.; Jelinek, Diane F.; Kipps, Thomas J.; Rassenti, Laura Z.; Gribben, John G.; Döhner, Hartmut; Heerema, Nyla A.; Marcucci, Guido; Plass, Christoph; Byrd, John C.
2012-01-01
Purpose: Increased ZAP-70 expression predicts poor prognosis in chronic lymphocytic leukemia (CLL). Current methods for accurately measuring ZAP-70 expression are problematic, preventing widespread application of these tests in clinical decision making. We therefore used comprehensive DNA methylation profiling of the ZAP-70 regulatory region to identify sites important for transcriptional control. Patients and Methods: High-resolution quantitative DNA methylation analysis of the entire ZAP-70 gene regulatory region was conducted on 247 samples from patients with CLL from four independent clinical studies. Results: Through this comprehensive analysis, we identified a small area in the 5′ regulatory region of ZAP-70 that showed large variability in methylation in CLL samples but was universally methylated in normal B cells. High correlation with mRNA and protein expression, as well as activity in promoter reporter assays, revealed that within this differentially methylated region, a single CpG dinucleotide and neighboring nucleotides are particularly important in ZAP-70 transcriptional regulation. Furthermore, by using clustering approaches, we identified a prognostic role for this site in four independent data sets of patients with CLL using time to treatment, progression-free survival, and overall survival as clinical end points. Conclusion: Comprehensive quantitative DNA methylation analysis of the ZAP-70 gene in CLL identified important regions responsible for transcriptional regulation. In addition, loss of methylation at a specific single CpG dinucleotide in the ZAP-70 5′ regulatory sequence is a highly predictive and reproducible biomarker of poor prognosis in this disease. This work demonstrates the feasibility of using quantitative ZAP-70 methylation analysis as a clinically applicable prognostic test in CLL. PMID:22564988
Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.
2017-01-01
Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in the mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. An additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains with almost the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures for detecting brain injury. PMID:28966972
NASA Technical Reports Server (NTRS)
Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.
2010-01-01
Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations based on analysis leading to the Ground Operations Project Preliminary Design Review milestone.
Chabaud, Aurore; Eschalier, Bénédicte; Zullian, Myriam; Plan-Paquet, Anne; Aubreton, Sylvie; Saragaglia, Dominique; Descamps, Stéphane; Coudeyre, Emmanuel
2018-05-01
Providing patients with validated information before total hip arthroplasty may help lessen discrepancies between patients' expectations and the surgical result. This study sought to validate an information booklet for candidates for hip arthroplasty by using a mixed qualitative and quantitative approach based on a panel of patients and a sample of healthcare professionals. We developed a booklet in accordance with the standard methods and then conducted focus groups to collect the opinions of a sample of multidisciplinary experts involved in the care of patients with hip osteoarthritis. The number of focus groups and experts was determined according to the data saturation principle. A panel of patients awaiting hip arthroplasty or those in the immediate post-operative period assessed the booklet with self-reporting questionnaires (knowledge, beliefs, and expectations) and semi-structured interviews. All experts and both patient groups validated the booklet in terms of content and presentation. Semi-structured interviews were uninformative, especially for post-operative patients. Reading the booklet significantly (P<0.001) improved the knowledge scores in both groups, with no intergroup differences, but did not affect beliefs in either patient group. Only pre-operative patients significantly changed their expectations. Our mixed qualitative and quantitative approach allowed us to validate a booklet for patients awaiting hip arthroplasty, taking into account the opinions of both patients and healthcare professionals. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Effect of Diffusion Limitations on Multianalyte Determination from Biased Biosensor Response
Baronas, Romas; Kulys, Juozas; Lančinskas, Algirdas; Žilinskas, Antanas
2014-01-01
The optimization-based quantitative determination of multianalyte concentrations from biased biosensor responses is investigated under internal and external diffusion-limited conditions. A computational model of a biocatalytic amperometric biosensor utilizing a mono-enzyme-catalyzed (nonspecific) competitive conversion of two substrates was used to generate pseudo-experimental responses to mixtures of compounds. The influence of possible perturbations of the biosensor signal, due to a white noise- and temperature-induced trend, on the precision of the concentration determination has been investigated for different configurations of the biosensor operation. The optimization method was found to be suitable and accurate enough for the quantitative determination of the concentrations of the compounds from a given biosensor transient response. The computational experiments showed a complex dependence of the precision of the concentration estimation on the relative thickness of the outer diffusion layer, as well as on whether the biosensor operates under diffusion- or kinetics-limited conditions. When the biosensor response is affected by the induced exponential trend, the duration of the biosensor action can be optimized for increasing the accuracy of the quantitative analysis. PMID:24608006
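The inversion step described above, recovering two substrate concentrations from a measured biosensor response by optimization, can be sketched as follows. Everything in this snippet is an assumption for illustration: the cross-sensitivity matrix, the saturating response model, and the grid-search optimizer all stand in for the paper's full reaction-diffusion simulation and its optimization method:

```python
import numpy as np

# Hypothetical cross-sensitivities of two measurement configurations to
# two competitively converted substrates (illustrative values only).
SENSITIVITIES = np.array([[1.0, 0.3],
                          [0.4, 1.0]])

def biosensor_response(concentrations, km=0.5):
    """Toy saturating (Michaelis-Menten-like) steady-state response of
    two configurations to a two-substrate mixture."""
    c = np.asarray(concentrations, dtype=float)
    return SENSITIVITIES @ (c / (km + c))

def estimate_concentrations(measured, grid=None):
    """Least-squares grid search inverting the response model: return the
    (s1, s2) pair whose predicted response best matches `measured`."""
    if grid is None:
        grid = np.linspace(0.0, 2.0, 101)
    best, best_err = (0.0, 0.0), float("inf")
    for s1 in grid:
        for s2 in grid:
            err = float(np.sum((biosensor_response((s1, s2)) - measured) ** 2))
            if err < best_err:
                best, best_err = (s1, s2), err
    return best
```

With two independent measurement configurations the pair is identifiable; with a single scalar reading it generally would not be, which is one reason the paper works with transient (time-resolved) responses.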
Population- and individual-specific regulatory variation in Sardinia.
Pala, Mauro; Zappala, Zachary; Marongiu, Mara; Li, Xin; Davis, Joe R; Cusano, Roberto; Crobu, Francesca; Kukurba, Kimberly R; Gloudemans, Michael J; Reinier, Frederic; Berutti, Riccardo; Piras, Maria G; Mulas, Antonella; Zoledziewska, Magdalena; Marongiu, Michele; Sorokin, Elena P; Hess, Gaelen T; Smith, Kevin S; Busonero, Fabio; Maschio, Andrea; Steri, Maristella; Sidore, Carlo; Sanna, Serena; Fiorillo, Edoardo; Bassik, Michael C; Sawcer, Stephen J; Battle, Alexis; Novembre, John; Jones, Chris; Angius, Andrea; Abecasis, Gonçalo R; Schlessinger, David; Cucca, Francesco; Montgomery, Stephen B
2017-05-01
Genetic studies of complex traits have mainly identified associations with noncoding variants. To further determine the contribution of regulatory variation, we combined whole-genome and transcriptome data for 624 individuals from Sardinia to identify common and rare variants that influence gene expression and splicing. We identified 21,183 expression quantitative trait loci (eQTLs) and 6,768 splicing quantitative trait loci (sQTLs), including 619 new QTLs. We identified high-frequency QTLs and found evidence of selection near genes involved in malarial resistance and increased multiple sclerosis risk, reflecting the epidemiological history of Sardinia. Using family relationships, we identified 809 segregating expression outliers (median z score of 2.97), averaging 13.3 genes per individual. Outlier genes were enriched for proximal rare variants, providing a new approach to study large-effect regulatory variants and their relevance to traits. Our results provide insight into the effects of regulatory variants and their relationship to population history and individual genetic risk.
Hu, Yong; Kwok, Jerry Weilun; Tse, Jessica Yuk-Hang; Luk, Keith Dip-Kei
2014-06-01
Nonsurgical rehabilitation therapy is a commonly used strategy to treat chronic low back pain (LBP). The selection of the most appropriate therapeutic options is still a big challenge in clinical practice. Surface electromyography (sEMG) topography has been proposed as an objective assessment of LBP rehabilitation. The quantitative analysis of dynamic sEMG would provide an objective prognostic tool for LBP rehabilitation. To evaluate the prognostic value of quantitative sEMG topographic analysis and to verify the accuracy of the proposed time-varying topographic parameters for identifying the patients who have a better response to the rehabilitation program. A retrospective study of consecutive patients. Thirty-eight patients with chronic nonspecific LBP and 43 healthy subjects. The accuracy of the time-varying quantitative sEMG topographic analysis for monitoring LBP rehabilitation progress was determined by calculating the corresponding receiver operating characteristic (ROC) curves. The physiologic measure was sEMG during lumbar flexion and extension. Patients who suffered from chronic nonspecific LBP, without a history of back surgery or any medical condition causing acute exacerbation of LBP during the clinical test, were enlisted to perform the clinical test during the 12-week physiotherapy (PT) treatment. LBP patients were classified into two groups, "responding" and "nonresponding", based on clinical assessment. The responding group comprised the LBP patients who began to recover after the PT treatment, whereas the nonresponding group comprised those who did not recover or got worse after the treatment. The results of the time-varying analysis in the responding group were compared with those in the nonresponding group. In addition, the accuracy of the analysis was assessed through ROC curves.
The time-varying analysis showed discrepancies in the root-mean-square difference (RMSD) parameters between the responding and nonresponding groups. The relative area (RA) and relative width (RW) of RMSD at flexion and extension in the responding group were significantly lower than those in the nonresponding group (p<.05). The areas under the ROC curve of RA and RW of RMSD at flexion and extension were greater than 0.7 and were statistically significant. The quantitative time-varying analysis of sEMG topography showed significant difference between the healthy and LBP groups. The discrepancies in quantitative dynamic sEMG topography of LBP group from normal group, in terms of RA and RW of RMSD at flexion and extension, were able to identify those LBP subjects who would respond to a conservative rehabilitation program focused on functional restoration of lumbar muscle. Copyright © 2014 Elsevier Inc. All rights reserved.
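The two discriminative features above, relative area (RA) and relative width (RW) of the root-mean-square difference (RMSD) topography, can be sketched in a few lines. The thresholding rule used here (half of the peak difference between patient and normal maps) is an assumption for illustration, not the paper's exact definition:

```python
import numpy as np

def rmsd_features(patient_map, normal_map, frac=0.5):
    """RA and RW of an RMSD topography: the fraction of map pixels, and
    the fraction of map columns, where |patient - normal| reaches `frac`
    of its peak value."""
    diff = np.abs(np.asarray(patient_map, dtype=float)
                  - np.asarray(normal_map, dtype=float))
    active = diff >= frac * diff.max()
    relative_area = active.mean()                    # share of all pixels
    relative_width = np.any(active, axis=0).mean()   # share of columns touched
    return relative_area, relative_width
```

Per the abstract, lower RA and RW at flexion and extension distinguished the responding group, with ROC areas above 0.7.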
Remenyi, Judit; Banerji, Christopher R.S.; Lai, Chun-Fui; Periyasamy, Manikandan; Lombardo, Ylenia; Busonero, Claudia; Ottaviani, Silvia; Passey, Alun; Quinlan, Philip R.; Purdie, Colin A.; Jordan, Lee B.; Thompson, Alastair M.; Finn, Richard S.; Rueda, Oscar M.; Caldas, Carlos; Gil, Jesus; Coombes, R. Charles; Fuller-Pace, Frances V.; Teschendorff, Andrew E.; Buluwela, Laki; Ali, Simak
2015-01-01
The Nuclear Receptor (NR) superfamily of transcription factors comprises 48 members, several of which have been implicated in breast cancer. Most important is estrogen receptor-α (ERα), which is a key therapeutic target. ERα action is facilitated by co-operativity with other NR and there is evidence that ERα function may be recapitulated by other NRs in ERα-negative breast cancer. In order to examine the inter-relationships between nuclear receptors, and to obtain evidence for previously unsuspected roles for any NRs, we undertook quantitative RT-PCR and bioinformatics analysis to examine their expression in breast cancer. While most NRs were expressed, bioinformatic analyses differentiated tumours into distinct prognostic groups that were validated by analyzing public microarray data sets. Although ERα and progesterone receptor were dominant in distinguishing prognostic groups, other NR strengthened these groups. Clustering analysis identified several family members with potential importance in breast cancer. Specifically, RORγ is identified as being co-expressed with ERα, whilst several NRs are preferentially expressed in ERα-negative disease, with TLX expression being prognostic in this subtype. Functional studies demonstrated the importance of TLX in regulating growth and invasion in ERα-negative breast cancer cells. PMID:26280373
Clinical application of ICF key codes to evaluate patients with dysphagia following stroke
Dong, Yi; Zhang, Chang-Jie; Shi, Jie; Deng, Jinggui; Lan, Chun-Na
2016-01-01
This study aimed to identify and evaluate International Classification of Functioning (ICF) key codes for dysphagia in stroke patients. Thirty patients with dysphagia after stroke were enrolled in our study. To evaluate the ICF dysphagia scale, 6 scales were used as comparisons, namely the Barthel Index (BI), Repetitive Saliva Swallowing Test (RSST), Kubota Water Swallowing Test (KWST), Frenchay Dysarthria Assessment, Mini-Mental State Examination (MMSE), and the Montreal Cognitive Assessment (MoCA). Multiple regression analysis was performed to quantify the relationship between the ICF scale and the comparison scales. In addition, 60 ICF scales were analyzed by the least absolute shrinkage and selection operator (LASSO) method. A total of 21 ICF codes were identified that were closely related to the other scales. These included 13 codes from Body Function, 1 from Body Structure, 3 from Activities and Participation, and 4 from Environmental Factors. A topographic network map with 30 ICF key codes was also generated to visualize their relationships. The number of ICF codes identified is in line with other well-established evaluation methods. The network topographic map generated here could be used as an instruction tool in future evaluations. We also found that attention functions and biting were critical codes on these scales and could be used as treatment targets. PMID:27661012
NASA Astrophysics Data System (ADS)
Reynders, Edwin; Maes, Kristof; Lombaert, Geert; De Roeck, Guido
2016-01-01
Identified modal characteristics are often used as a basis for the calibration and validation of dynamic structural models, for structural control, for structural health monitoring, etc. It is therefore important to know their accuracy. In this article, a method for estimating the (co)variance of modal characteristics that are identified with the stochastic subspace identification method is validated for two civil engineering structures. The first structure is a damaged prestressed concrete bridge for which acceleration and dynamic strain data were measured in 36 different setups. The second structure is a mid-rise building for which acceleration data were measured in 10 different setups. There is a good quantitative agreement between the predicted levels of uncertainty and the observed variability of the eigenfrequencies and damping ratios between the different setups. The method can therefore be used with confidence for quantifying the uncertainty of the identified modal characteristics, also when some or all of them are estimated from a single batch of vibration data. Furthermore, the method is seen to yield valuable insight in the variability of the estimation accuracy from mode to mode and from setup to setup: the more informative a setup is regarding an estimated modal characteristic, the smaller is the estimated variance.
Autoregulation and Virulence Control by the Toxin-Antitoxin System SavRS in Staphylococcus aureus
Wen, Wen; Liu, Banghui; Xue, Lu; Zhu, Zhongliang; Niu, Liwen
2018-01-01
Toxin-antitoxin (TA) systems play diverse physiological roles, such as plasmid maintenance, growth control, and persister cell formation, but their involvement in bacterial pathogenicity remains largely unknown. Here, we have identified a novel type II toxin-antitoxin system, SavRS, and revealed the molecular mechanisms of its autoregulation and virulence control in Staphylococcus aureus. Electrophoretic mobility shift assay and isothermal titration calorimetry data indicated that the antitoxin SavR acted as the primary repressor bound to its own promoter, while the toxin SavS formed a complex with SavR to enhance its ability to bind to the operator site. DNase I footprinting assay identified the SavRS-binding site, containing a short and a long palindrome in the promoter region. Further mutation and DNase I footprinting assays demonstrated that the two palindromes were crucial for DNA binding and transcriptional repression. More interestingly, genetic deletion of the savRS system led to increased hemolytic activity and pathogenicity in a mouse subcutaneous abscess model. We further identified two virulence genes, hla and efb, by real-time quantitative reverse transcription-PCR and demonstrated that SavR and SavRS could directly bind to their promoter regions to repress virulence gene expression. PMID:29440365
ERIC Educational Resources Information Center
Trexler, Grant Lewis
2012-01-01
This dissertation set out to identify effective qualitative and quantitative management tools used by chief financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…
Subjective Quantitative Studies of Human Agency
ERIC Educational Resources Information Center
Alkire, Sabina
2005-01-01
Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…
Quantitative Analysis of Qualitative Information from Interviews: A Systematic Literature Review
ERIC Educational Resources Information Center
Fakis, Apostolos; Hilliam, Rachel; Stoneley, Helen; Townend, Michael
2014-01-01
Background: A systematic literature review was conducted on mixed methods area. Objectives: The overall aim was to explore how qualitative information from interviews has been analyzed using quantitative methods. Methods: A contemporary review was undertaken and based on a predefined protocol. The references were identified using inclusion and…
USDA-ARS?s Scientific Manuscript database
Alfalfa (Medicago sativa L.) is an internationally significant forage crop. Forage yield, lodging resistance and spring vigor are important agronomic traits conditioned by quantitative genetic and environmental effects. The objective of this study was to identify quantitative trait loci (QTL) and mo...
Rapid and potentially portable detection and quantification technologies for foodborne pathogens
USDA-ARS?s Scientific Manuscript database
Introduction Traditional microbial culture methods are able to detect and identify a single specific bacterium, but may require days or weeks and typically do not produce quantitative data. The quest for faster, quantitative results has spurred development of “rapid methods” which usually employ bio...
Simulation and the Development of Clinical Judgment: A Quantitative Study
ERIC Educational Resources Information Center
Holland, Susan
2015-01-01
The purpose of this quantitative pretest posttest quasi-experimental research study was to explore the effect of the NESD on clinical judgment in associate degree nursing students and compare the differences between groups when the Nursing Education Simulation Design (NESD) guided simulation in order to identify educational strategies promoting…
Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared
ERIC Educational Resources Information Center
von Helversen, Bettina; Rieskamp, Jorg
2009-01-01
The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…
QUANTITATIVE PCR ANALYSIS OF HOUSE DUST CAN REVEAL ABNORMAL MOLD CONDITIONS
Indoor mold populations were measured in the dust of homes in Cleveland and Cincinnati, OH, by quantitative PCR (QPCR) and, in Cincinnati, also by culturing. QPCR assays for 82 species (or groups of species) were used to identify and quantify indoor mold populations in moldy home...
In situ health monitoring for bogie systems of CRH380 train on Beijing-Shanghai high-speed railway
NASA Astrophysics Data System (ADS)
Hong, Ming; Wang, Qiang; Su, Zhongqing; Cheng, Li
2014-04-01
Based on the authors' research efforts over the years, an in situ structural health monitoring (SHM) technique taking advantage of guided elastic waves has been developed and deployed via an online diagnosis system. The technique and the system were recently implemented on China's latest high-speed train (CRH380CL) operated on Beijing-Shanghai High-Speed Railway. The system incorporated modularized components including active sensor network, active wave generation, multi-channel data acquisition, signal processing, data fusion, and results presentation. The sensor network, inspired by a new concept—"decentralized standard sensing", was integrated into the bogie frames during the final assembly of CRH380CL, to generate and acquire bogie-guided ultrasonic waves, from which a wide array of signal features were extracted. Fusion of signal features through a diagnostic imaging algorithm led to a graphic illustration of the overall health state of the bogie in a real-time and intuitive manner. The in situ experimentation covered a variety of high-speed train operation events including startup, acceleration/deceleration, full-speed operation (300 km/h), emergency braking, track change, as well as full stop. Mock-up damage affixed to the bogie was identified quantitatively and visualized in images. This in situ testing has demonstrated the feasibility, effectiveness, sensitivity, and reliability of the developed SHM technique and the system towards real-world applications.
NASA Astrophysics Data System (ADS)
Kempton, Willett; Tomić, Jasna
Vehicle-to-grid power (V2G) uses electric-drive vehicles (battery, fuel cell, or hybrid) to provide power for specific electric markets. This article examines the systems and processes needed to tap energy in vehicles and implement V2G. It quantitatively compares today's light vehicle fleet with the electric power system. The vehicle fleet has 20 times the power capacity, less than one-tenth the utilization, and one-tenth the capital cost per prime mover kW. Conversely, utility generators have 10-50 times longer operating life and lower operating costs per kWh. To tap V2G is to synergistically use these complementary strengths and to reconcile the complementary needs of the driver and grid manager. This article suggests strategies and business models for doing so, and the steps necessary for the implementation of V2G. After the initial high-value V2G markets saturate and production costs drop, V2G can provide storage for renewable energy generation. Our calculations suggest that V2G could stabilize large-scale (one-half of US electricity) wind power with 3% of the fleet dedicated to regulation for wind, plus 8-38% of the fleet providing operating reserves or storage for wind. Jurisdictions more likely to take the lead in adopting V2G are identified.
NASA Astrophysics Data System (ADS)
Hakoda, Christopher; Ren, Baiyang; Lissenden, Cliff J.; Rose, Joseph L.
2017-02-01
Thin-film PVDF (polyvinylidene fluoride) transducers are appealing as low-cost, lightweight, durable, and flexible sensors for structural health monitoring applications in aircraft structures. However, due to the relatively low Curie temperature of PVDF, there is a concern that its performance will drop below acceptable levels during elevated-temperature operating conditions. To verify acceptable performance in these environmental operating conditions, temperature history data were collected between 23 and 60 °C. The effect of temperature on the thin-film PVDF was investigated and a temperature-independent damage feature was assessed. The temperature dependence of the signal's peak amplitude was investigated in both the time domain and the spectral domain to obtain two damage features. It was found that the measurement of the incident guided wave by the thin-film PVDF transducer had a temperature dependence that varied with frequency. A third damage feature, the mode ratio, was also calculated in the spectral domain with the goal of defining a damage feature that is temperature independent. A comparison of how well these damage features performed when used to identify a notch in an aluminum plate was made using receiver operating characteristic (ROC) curves and their respective area under the curve values. This result demonstrated that a temperature-independent damage feature can be calculated, to some degree, by using a mode ratio between two modes of similar temperature dependence.
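The mode-ratio feature and the ROC comparison can be sketched as follows. The frequency bands, spectra, and score values are placeholders; the AUC is computed with the standard Mann-Whitney rank equivalence rather than any method specific to this paper:

```python
import numpy as np

def mode_ratio(freqs, spectrum, band_a, band_b):
    """Ratio of the peak spectral amplitude of one guided-wave mode
    (within band_a) to that of another mode (within band_b)."""
    in_a = (freqs >= band_a[0]) & (freqs <= band_a[1])
    in_b = (freqs >= band_b[0]) & (freqs <= band_b[1])
    return spectrum[in_a].max() / spectrum[in_b].max()

def roc_auc(damaged_scores, intact_scores):
    """Area under the ROC curve via the Mann-Whitney rank statistic:
    the probability that a damaged-case score exceeds an intact-case
    score, counting ties as one half."""
    d = np.asarray(damaged_scores, dtype=float)[:, None]
    i = np.asarray(intact_scores, dtype=float)[None, :]
    return ((d > i).sum() + 0.5 * (d == i).sum()) / (d.size * i.size)
```

If the two modes share a similar temperature dependence, their ratio largely cancels that dependence, which is the intuition behind the paper's third damage feature.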
Revenue management of air cargo service in theory and practice
NASA Astrophysics Data System (ADS)
Budiarto, S.; Putro, H. P.; Pradono, P.; Yudoko, G.
2018-05-01
This study examines air cargo service by comparing existing theories from previous research with conditions on the ground. The object of the study is focused on freight forwarders and airport management. The study reviews and summarizes the models and results of previous research to identify issues related to the characteristics of air cargo operational services, and draws on literature review and observation involving airlines, shipping companies, and airport management to explore the gap between prior research and the implementation of air cargo service. The first phase of this study provides an overview of the air cargo industry. The second phase analyzes the characteristic differences between air cargo services and air passenger operating services. The third phase is a bibliographic study of the air cargo operations literature, focusing on studies that use quantitative models from the perspective of the object of study: the optimization of revenue management in air cargo services. Based on the gap between theory and practice, the study identifies new research opportunities related to air cargo revenue management in the form of model development, adding booking-timeline aspects of cargo that can affect the revenue of cargo airline companies and airports.
Petroleum and Water Logistics Operations
2005-06-19
[Fragment of the manual's gum and corrosion testing sections: Existent Gum (7-4); Potential Gum (7-5); Flashpoint; ... the formation of insoluble gums. Quantitative and qualitative tests for corrosion indicate whether products are free of corrosion ...]
Matrix evaluation of science objectives
NASA Technical Reports Server (NTRS)
Wessen, Randii R.
1994-01-01
The most fundamental objective of all robotic planetary spacecraft is to return science data. To accomplish this, a spacecraft is fabricated and built, software is planned and coded, and a ground system is designed and implemented. However, the quantitative analysis required to determine how the collection of science data drives ground system capabilities has received very little attention. This paper defines a process by which science objectives can be quantitatively evaluated. By applying it to the Cassini Mission to Saturn, this paper further illustrates the power of this technique. The results show which science objectives drive specific ground system capabilities. In addition, this process can assist system engineers and scientists in the selection of the science payload during pre-project mission planning; ground system designers during ground system development and implementation; and operations personnel during mission operations.
Quantitative Monitoring of Microbial Species during Bioleaching of a Copper Concentrate.
Hedrich, Sabrina; Guézennec, Anne-Gwenaëlle; Charron, Mickaël; Schippers, Axel; Joulian, Catherine
2016-01-01
Monitoring of the microbial community in bioleaching processes is essential in order to control process parameters and enhance the leaching efficiency. Suitable methods are, however, limited as they are usually not adapted to bioleaching samples and often no taxon-specific assays are available in the literature for these types of consortia. Therefore, our study focused on the development of novel quantitative real-time PCR (qPCR) assays for the quantification of Acidithiobacillus caldus, Leptospirillum ferriphilum, Sulfobacillus thermosulfidooxidans, and Sulfobacillus benefaciens and comparison of the results with data from other common molecular monitoring methods in order to evaluate their accuracy and specificity. Stirred tank bioreactors for the leaching of copper concentrate, housing a consortium of acidophilic, moderately thermophilic bacteria, relevant in several bioleaching operations, served as a model system. The microbial community analysis via qPCR allowed a precise monitoring of the evolution of total biomass as well as abundance of specific species. Data achieved by the standard fingerprinting methods, terminal restriction fragment length polymorphism (T-RFLP) and capillary electrophoresis single strand conformation polymorphism (CE-SSCP), on the same samples followed the same trend as qPCR data. The main added value of qPCR was, however, to provide quantitative data for each species, whereas only relative abundance could be deduced from T-RFLP and CE-SSCP profiles. Additional value was obtained by applying two further quantitative methods which do not require nucleic acid extraction: total cell counting after SYBR Green staining, and metal sulfide oxidation activity measurements via microcalorimetry. Overall, these complementary methods allow for an efficient quantitative microbial community monitoring in various bioleaching operations.
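The absolute-quantification step behind qPCR monitoring can be sketched as follows; this is not the authors' pipeline, and the standard-curve slope/intercept and Ct values below are hypothetical placeholders.

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Absolute quantification from a hypothetical standard curve
    Ct = slope*log10(copies) + intercept; a slope of -3.32 corresponds
    to ~100% amplification efficiency."""
    return 10 ** ((ct - intercept) / slope)

def relative_abundance(ct_by_species):
    """Per-species fraction of the total estimated copy number (illustrative)."""
    counts = {sp: copies_from_ct(ct) for sp, ct in ct_by_species.items()}
    total = sum(counts.values())
    return {sp: n / total for sp, n in counts.items()}

# Hypothetical Ct values for three of the monitored species
cts = {"At. caldus": 22.1, "L. ferriphilum": 20.4, "Sb. thermosulfidooxidans": 25.8}
abund = relative_abundance(cts)
print({sp: round(f, 3) for sp, f in abund.items()})
```

Lower Ct means earlier detection and hence more template; the relative-abundance step mimics what T-RFLP or CE-SSCP profiles provide, while the raw copy numbers are the quantitative extra that qPCR contributes.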
Wu, Jia; Gong, Guanghua; Cui, Yi; Li, Ruijiang
2016-11-01
To predict pathological response of breast cancer to neoadjuvant chemotherapy (NAC) based on quantitative, multiregion analysis of dynamic contrast enhancement magnetic resonance imaging (DCE-MRI). In this Institutional Review Board-approved study, 35 patients diagnosed with stage II/III breast cancer were retrospectively investigated using 3T DCE-MR images acquired before and after the first cycle of NAC. First, principal component analysis (PCA) was used to reduce the dimensionality of the DCE-MRI data with high temporal resolution. We then partitioned the whole tumor into multiple subregions using k-means clustering based on the PCA-defined eigenmaps. Within each tumor subregion, we extracted four quantitative Haralick texture features based on the gray-level co-occurrence matrix (GLCM). The change in texture features in each tumor subregion between pre- and during-NAC was used to predict pathological complete response after NAC. Three tumor subregions were identified through clustering, each with distinct enhancement characteristics. In univariate analysis, all imaging predictors except one extracted from the tumor subregion associated with fast washout were statistically significant (P < 0.05) after correcting for multiple testing, with areas under the receiver operating characteristic (ROC) curve (AUCs) between 0.75 and 0.80. In multivariate analysis, the proposed imaging predictors achieved an AUC of 0.79 (P = 0.002) in leave-one-out cross-validation. This improved upon conventional imaging predictors such as tumor volume (AUC = 0.53) and texture features based on whole-tumor analysis (AUC = 0.65). The heterogeneity of the tumor subregion associated with fast washout on DCE-MRI predicted pathological response to NAC in breast cancer. J. Magn. Reson. Imaging 2016;44:1107-1115. © 2016 International Society for Magnetic Resonance in Medicine.
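A minimal sketch of the GLCM texture computation the abstract mentions (not the study's implementation): a symmetric, normalized co-occurrence matrix for one pixel offset, and the Haralick contrast feature, applied to toy 2x2 "subregions".

```python
def glcm(image, levels, dx=1, dy=0):
    """Symmetric, normalized gray-level co-occurrence matrix for one offset.
    `image` is a list of rows of integer gray levels in [0, levels)."""
    m = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                a, b = image[r][c], image[r2][c2]
                m[a][b] += 1.0
                m[b][a] += 1.0  # count both directions: symmetric GLCM
    total = sum(sum(row) for row in m)
    return [[v / total for v in row] for row in m]

def contrast(p):
    """Haralick contrast: sum over (i,j) of (i-j)^2 * p(i,j)."""
    return sum((i - j) ** 2 * p[i][j]
               for i in range(len(p)) for j in range(len(p)))

uniform = [[1, 1], [1, 1]]  # homogeneous toy subregion
checker = [[0, 1], [1, 0]]  # heterogeneous toy subregion
print(contrast(glcm(uniform, 2)), contrast(glcm(checker, 2)))  # 0.0 1.0
```

Tracking how such features change between the pre- and during-NAC scans, per subregion, is the quantity the study's predictors are built from.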
Moodley, Kuven K; Perani, Daniela; Minati, Ludovico; Della Rosa, Pasquale Anthony; Pennycook, Frank; Dickson, John C; Barnes, Anna; Contarino, Valeria Elisa; Michopoulou, Sofia; D'Incerti, Ludovico; Good, Catriona; Fallanca, Federico; Vanoli, Emilia Giovanna; Ell, Peter J; Chan, Dennis
2015-01-01
Simultaneous PET-MRI is used to compare patterns of cerebral hypometabolism and atrophy in six different dementia syndromes. The primary objective was to conduct an initial exploratory study regarding the concordance of atrophy and hypometabolism in syndromic variants of Alzheimer's disease (AD) and frontotemporal dementia (FTD). The secondary objective was to determine the effect of image analysis methods on determination of atrophy and hypometabolism. PET and MRI data were acquired simultaneously on 24 subjects with six variants of AD and FTD (n = 4 per group). Atrophy was rated visually and also quantified with measures of cortical thickness. Hypometabolism was rated visually and also quantified using atlas- and SPM-based approaches. Concordance was measured using weighted Cohen's kappa. Atrophy-hypometabolism concordance differed markedly between patient groups; kappa scores ranged from 0.13 (nonfluent/agrammatic variant of primary progressive aphasia, nfvPPA) to 0.49 (posterior cortical variant of AD, PCA). Heterogeneity was also observed within groups, with the confidence intervals of kappa scores ranging from 0-0.25 for PCA to 0.29-0.61 for nfvPPA. More widespread MRI and PET changes were identified using quantitative methods than on visual rating. The marked differences in concordance identified in this initial study may reflect differences in the molecular pathologies underlying AD and FTD syndromic variants but also operational differences in the methods used to diagnose these syndromes. The superior ability of quantitative methodologies to detect changes on PET and MRI, if confirmed on larger cohorts, may favor their usage over qualitative visual inspection in future clinical diagnostic practice.
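The weighted Cohen's kappa used above can be sketched in a few lines; the ratings below are hypothetical 4-point grades, and linear weights are assumed (the abstract does not state the weighting scheme).

```python
def weighted_kappa(rater_a, rater_b, n_categories, weight="linear"):
    """Weighted Cohen's kappa for two ordinal ratings of the same items,
    e.g. atrophy vs hypometabolism grades per brain region."""
    n = len(rater_a)
    def w(i, j):
        d = abs(i - j) / (n_categories - 1)
        return d if weight == "linear" else d * d
    # Observed disagreement vs disagreement expected from marginals
    obs = sum(w(a, b) for a, b in zip(rater_a, rater_b)) / n
    pa = [rater_a.count(k) / n for k in range(n_categories)]
    pb = [rater_b.count(k) / n for k in range(n_categories)]
    exp = sum(w(i, j) * pa[i] * pb[j]
              for i in range(n_categories) for j in range(n_categories))
    return 1.0 - obs / exp

# Hypothetical 4-point grades for atrophy vs hypometabolism in six regions
atrophy = [0, 1, 2, 3, 2, 1]
hypomet = [0, 1, 2, 3, 1, 1]
print(round(weighted_kappa(atrophy, hypomet, 4), 3))
```

Perfect agreement yields kappa = 1, chance-level agreement yields 0; the weighting makes near-miss grades count as partial agreement, which matters for ordinal scales like these.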
Interrogating the topological robustness of gene regulatory circuits by randomization
Levine, Herbert; Onuchic, Jose N.
2017-01-01
One of the most important roles of cells is performing their cellular tasks properly for survival. Cells usually achieve robust functionality, for example, cell-fate decision-making and signal transduction, through multiple layers of regulation involving many genes. Despite the combinatorial complexity of gene regulation, its quantitative behavior has been typically studied on the basis of experimentally verified core gene regulatory circuitry, composed of a small set of important elements. It is still unclear how such a core circuit operates in the presence of many other regulatory molecules and in a crowded and noisy cellular environment. Here we report a new computational method, named random circuit perturbation (RACIPE), for interrogating the robust dynamical behavior of a gene regulatory circuit even without accurate measurements of circuit kinetic parameters. RACIPE generates an ensemble of random kinetic models corresponding to a fixed circuit topology, and utilizes statistical tools to identify generic properties of the circuit. By applying RACIPE to simple toggle-switch-like motifs, we observed that the stable states of all models converge to experimentally observed gene state clusters even when the parameters are strongly perturbed. RACIPE was further applied to a proposed 22-gene network of the Epithelial-to-Mesenchymal Transition (EMT), from which we identified four experimentally observed gene states, including the states that are associated with two different types of hybrid Epithelial/Mesenchymal phenotypes. Our results suggest that dynamics of a gene circuit is mainly determined by its topology, not by detailed circuit parameters. Our work provides a theoretical foundation for circuit-based systems biology modeling. We anticipate RACIPE to be a powerful tool to predict and decode circuit design principles in an unbiased manner, and to quantitatively evaluate the robustness and heterogeneity of gene expression. PMID:28362798
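A toy version of the RACIPE idea (an ensemble of random kinetic models over one fixed toggle-switch topology) can be sketched as below; the shifted-Hill form, parameter ranges, and integration settings are illustrative assumptions, not the published ones.

```python
import random

def shifted_hill(x, x0, n, fold):
    """Inhibitory shifted Hill function: 1 with no repressor, 1/fold at
    full repression."""
    h = 1.0 / (1.0 + (x / x0) ** n)
    return h + (1.0 - h) / fold

def steady_state(params, x=1.0, y=1.0, dt=0.05, steps=5000):
    """Forward-Euler relaxation of a mutual-inhibition toggle switch."""
    gx, gy, kx, ky, x0, y0, n, fold = params
    for _ in range(steps):
        dx = gx * shifted_hill(y, y0, n, fold) - kx * x
        dy = gy * shifted_hill(x, x0, n, fold) - ky * y
        x, y = x + dt * dx, y + dt * dy
    return x, y

random.seed(0)
states = []
for _ in range(50):  # ensemble of random kinetic models, one fixed topology
    p = (random.uniform(1, 100), random.uniform(1, 100),    # production rates
         random.uniform(0.1, 1), random.uniform(0.1, 1),    # degradation rates
         random.uniform(1, 50), random.uniform(1, 50),      # Hill thresholds
         random.choice([2, 3, 4]), random.uniform(10, 100)) # Hill n, fold change
    x, y = steady_state(p)
    states.append("X-high" if x > y else "Y-high")
print(sorted(set(states)))  # the gene-state clusters found across the ensemble
```

Classifying each random model's steady state and clustering the results is the step that, in RACIPE, reveals that the attainable gene states are set mainly by circuit topology rather than by particular parameter values.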
Ocean regional circulation model sensitivity to the resolution of the lateral boundary conditions
NASA Astrophysics Data System (ADS)
Pham, Van Sy; Hwang, Jin Hwan
2017-04-01
Dynamical downscaling with nested regional oceanographic models is an effective approach for operational coastal weather forecasting and long-term ocean climate projection. Nesting procedures, however, introduce unwanted errors into dynamic downscaling because of differences in numerical grid sizes and updating steps, and such unavoidable errors restrict the application of Ocean Regional Circulation Models (ORCMs) in both short-term forecasts and long-term projections. The current work identifies the effects of errors induced by computational limitations during nesting procedures on the downscaled results of the ORCMs. The errors are quantitatively evaluated, for each error source and its characteristics, by the Big-Brother Experiment (BBE). The BBE separates the identified errors from each other and quantitatively assesses the amount of uncertainty, employing the same model for both the nesting and the nested simulations. Here we focus on errors arising from the two main aspects of nesting procedures: differences in spatial grids and in temporal updating steps. After running the diverse BBE cases separately, a Taylor diagram was adopted to analyze the results and to suggest an optimization in terms of grid size, updating period and domain size. Key words: lateral boundary condition, error, ocean regional circulation model, Big-Brother Experiment. Acknowledgement: This research was supported by grants from the Korean Ministry of Oceans and Fisheries entitled "Development of integrated estuarine management system" and a National Research Foundation of Korea (NRF) Grant (No. 2015R1A5A 7037372) funded by MSIP of Korea. The authors thank the Integrated Research Institute of Construction and Environmental Engineering of Seoul National University for administrative support.
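The Taylor-diagram comparison mentioned above reduces each model field to three statistics relative to the reference run; a sketch with invented data (not the study's fields):

```python
import math

def taylor_stats(reference, model):
    """Statistics plotted on a Taylor diagram: correlation, the two standard
    deviations, and the centered RMS difference between two fields."""
    n = len(reference)
    mr, mm = sum(reference) / n, sum(model) / n
    sr = math.sqrt(sum((r - mr) ** 2 for r in reference) / n)
    sm = math.sqrt(sum((m - mm) ** 2 for m in model) / n)
    cov = sum((r - mr) * (m - mm) for r, m in zip(reference, model)) / n
    corr = cov / (sr * sm)
    # Law-of-cosines identity linking the three diagram axes
    crms = math.sqrt(max(0.0, sr**2 + sm**2 - 2 * sr * sm * corr))
    return corr, sr, sm, crms

# Invented "Big Brother" reference field vs. a nested ("Little Brother") run
big = [1.0, 2.0, 3.0, 2.5, 1.5]
little = [1.1, 1.9, 3.2, 2.4, 1.4]
corr, sr, sm, crms = taylor_stats(big, little)
print(round(corr, 3), round(crms, 3))
```

In a BBE setting, each nesting configuration (grid ratio, update interval, domain size) yields one such point, so the closest point to the reference marks the recommended configuration.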
Bernarde, Cédric; Keravec, Marlène; Mounier, Jérôme; Gouriou, Stéphanie; Rault, Gilles; Férec, Claude; Barbier, Georges; Héry-Arnaud, Geneviève
2015-01-01
Airway microbiota composition has been clearly correlated with many pulmonary diseases, and notably with cystic fibrosis (CF), an autosomal genetic disorder caused by mutation in the CF transmembrane conductance regulator (CFTR). Recently, a new molecule, ivacaftor, has been shown to re-establish the functionality of the G551D-mutated CFTR, allowing significant improvement in lung function. The purpose of this study was to follow the evolution of the airway microbiota in CF patients treated with ivacaftor, using quantitative PCR and pyrosequencing of 16S rRNA amplicons, in order to identify quantitative and qualitative changes in bacterial communities. Three G551D children were followed up longitudinally over a mean period of more than one year covering several months before and after initiation of ivacaftor treatment. 129 operational taxonomic units (OTUs), representing 64 genera, were identified. There was no significant difference in total bacterial load before and after treatment. Comparison of global community composition found no significant changes in microbiota. Two OTUs, however, showed contrasting dynamics: after initiation of ivacaftor, the relative abundance of the anaerobe Porphyromonas 1 increased (p<0.01) and that of Streptococcus 1 (S. mitis group) decreased (p<0.05), possibly in relation to the anti-Gram-positive properties of ivacaftor. The anaerobe Prevotella 2 correlated positively with the pulmonary function test FEV-1 (r=0.73, p<0.05). The study confirmed the presumed positive role of anaerobes in lung function. Several airway microbiota components, notably anaerobes (obligate or facultative anaerobes), could be valuable biomarkers of lung function improvement under ivacaftor, and could shed light on the pathophysiology of lung disease in CF patients.
Wu, Q-M; Zhao, X-Y; You, H
2016-01-01
Esophagogastric varices (EGV) may develop at any histological stage of primary biliary cirrhosis (PBC). We aimed to establish and validate quantitative fibrosis (qFibrosis) parameters in portal, septal and fibrillar areas as predictors of EGV in PBC patients. PBC patients with liver biopsy, esophagogastroscopy and Second Harmonic Generation (SHG)/Two-photon Excited Fluorescence (TPEF) microscopy images were retrospectively enrolled in this study. qFibrosis parameters in portal, septal and fibrillar areas were acquired by a computer-assisted SHG/TPEF imaging system. Independent predictors were identified using multivariate logistic regression analysis. Among the forty-nine PBC patients with qFibrosis images, twenty-nine with both esophagogastroscopy and qFibrosis data were selected for EGV prognosis analysis, and 44.8% (13/29) of them had EGV. The qFibrosis parameters of collagen percentage and number of crosslinks in the fibrillar area, the number of short/long/thin strings, and the length/width of the strings in the septal area were associated with EGV (p < 0.05). Multivariate logistic analysis showed that a collagen percentage in the fibrillar area ≥ 3.6% was an independent predictor of EGV (odds ratio 6.9; 95% confidence interval 1.6-27.4). The area under the receiver operating characteristic (ROC) curve, diagnostic sensitivity and specificity were 0.9, 100% and 75%, respectively. Collagen percentage in the fibrillar area, as an independent predictor, can reliably predict EGV in PBC patients.
Watanabe, Colin; Cuellar, Trinna L.; Haley, Benjamin
2016-01-01
Incorporating miRNA-like features into vector-based hairpin scaffolds has been shown to augment small RNA processing and RNAi efficiency. Therefore, defining an optimal, native hairpin context may obviate a need for hairpin-specific targeting design schemes, which confound the movement of functional siRNAs into shRNA/artificial miRNA backbones, or large-scale screens to identify efficacious sequences. Thus, we used quantitative cell-based assays to compare separate third generation artificial miRNA systems, miR-E (based on miR-30a) and miR-3G (based on miR-16-2 and first described in this study) to widely-adopted, first and second generation formats in both Pol-II and Pol-III expression vector contexts. Despite their unique structures and strandedness, and in contrast to first and second-generation RNAi triggers, the third generation formats operated with remarkable similarity to one another, and strong silencing was observed with a significant fraction of the evaluated target sequences within either promoter context. By pairing an established siRNA design algorithm with the third generation vectors we could readily identify targeting sequences that matched or exceeded the potency of those discovered through large-scale sensor-based assays. We find that third generation hairpin systems enable the maximal level of siRNA function, likely through enhanced processing and accumulation of precisely-defined guide RNAs. Therefore, we predict future gains in RNAi potency will come from improved hairpin expression and identification of optimal siRNA-intrinsic silencing properties rather than further modification of these scaffolds. Consequently, third generation systems should be the primary format for vector-based RNAi studies; miR-3G is advantageous due to its small expression cassette and simplified, cost-efficient cloning scheme. PMID:26786363
Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.
Kendall, Katherine A
2017-10-01
Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. Level of Evidence: 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.
Embedding the perceptions of people with dementia into quantitative research design.
O'Rourke, Hannah M; Duggleby, Wendy; Fraser, Kimberly D
2015-05-01
Patient perspectives about quality of life are often found in the results of qualitative research and could be applied to steer the direction of future research. The purpose of this paper was to describe how findings from a body of qualitative research on patient perspectives about quality of life were linked to a clinical administrative dataset and then used to design a subsequent quantitative study. Themes from two systematic reviews of qualitative evidence (i.e., metasyntheses) identified what affects quality of life according to people with dementia. Selected themes and their sub-concepts were then mapped to an administrative dataset (the Resident Assessment Instrument 2.0) to determine the study focus, formulate nine hypotheses, and select a patient-reported outcome. A literature review followed to confirm existence of a knowledge gap, identify adjustment variables, and support design decisions. A quantitative study to test the association between conflict and sadness for people with dementia in long-term care was derived from metasynthesis themes. Challenges included (1) mapping broad themes to the administrative dataset; (2) decisions associated with inclusion of variables not identified by people with dementia from the qualitative research; and (3) selecting a patient-reported outcome, when the dataset lacked a valid subjective quality-of-life measure. Themes derived from a body of qualitative research capturing a target population's perspective can be linked to administrative data and used to design a quantitative study. Using this approach, the quantitative findings will be meaningful with respect to the quality of life of the target population.
Shen, Jin-Jing; Gong, Xing-Chu; Pan, Jian-Yang; Qu, Hai-Bin
2017-03-01
Design space approach was applied in this study to optimize the lime milk precipitation process of Lonicera japonica (Jinyinhua) aqueous extract. The evaluation indices for this process were total organic acid purity and the amounts of 6 organic acids obtained per unit mass of medicinal material. Four critical process parameters (CPPs), including drop speed of lime milk, pH value after adding lime milk, settling time and settling temperature, were identified by using the weighted standardized partial regression coefficient method. Quantitative models between process evaluation indices and CPPs were established by a stepwise regression analysis. A design space was calculated by a Monte-Carlo simulation method, and then verified. The verification test results showed that operation within the design space can guarantee the stability of the lime milk precipitation process. The recommended normal operation space is as follows: drop speed of lime milk of 1.00-1.25 mL•min⁻¹, pH value of 11.5-11.7, settling time of 1.0-1.2 h, and settling temperature of 10-20 °C. Copyright© by the Chinese Pharmaceutical Association.
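The Monte-Carlo design-space calculation can be illustrated as follows; the regression model, its coefficients, and the purity specification are entirely hypothetical stand-ins for the fitted models in the study, and only the four CPP ranges echo the abstract.

```python
import random

def purity_model(drop_speed, ph, time_h, temp):
    """Hypothetical regression model for total organic acid purity (%) as a
    function of the four CPPs; coefficients are illustrative only."""
    return 50 + 8 * ph - 2 * drop_speed + 3 * time_h - 0.1 * temp - 0.3 * ph**2

def prob_meeting_spec(ranges, spec=85.0, n=20000, seed=1):
    """Monte Carlo estimate of the probability that operating anywhere in the
    candidate parameter region meets the purity specification."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        params = [rng.uniform(lo, hi) for lo, hi in ranges]
        if purity_model(*params) >= spec:
            hits += 1
    return hits / n

# Candidate normal operating region: drop speed (mL/min), pH, time (h), temp (°C)
region = [(1.00, 1.25), (11.5, 11.7), (1.0, 1.2), (10, 20)]
print(prob_meeting_spec(region))
```

A region qualifies as design space when this probability stays above a chosen reliability level; shrinking or shifting the ranges and re-running the simulation is how the recommended operation space is carved out.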
Commercialization of New Beam Applications
NASA Astrophysics Data System (ADS)
McKeown, Joseph
1996-05-01
The commercialization of electron processing applications is driven by demonstrated technical advantages over current practice. Mature and reliable accelerator technology has permitted more consistent product quality and the development of new processes. However, the barriers to commercial adoption are often not amenable to solution within the laboratory alone. Aspects of the base accelerator technology, plant engineering, production, project management, financing, regulatory control, product throughput and plant operational efficiency all contribute to the business risk. Experiences in building three 10 MeV, 50 kW, IMPELA electron accelerators at approximately 8 M each and achieving cumulative operational availability greater than 98% in commercial environments have identified key parameters defining those aspects. The allowed ranges of these parameters to generate the 1.5 M annual revenue that is typically necessary to support outlays of this scale are presented. Such data have been used in proposals to displace expensive chemicals in the viscose industry, sterilize sewage sludge, detoxify chemically contaminated soils and build radiation service centers for a diversity of applications. The proposals face stiff competition from traditional chemical methods. Quantitative technical and business details of these activities are provided and an attempt is made to establish realistic expectations for the exploitation of electron beam technologies in emerging applications.
NASA Astrophysics Data System (ADS)
Gao, Fang; Rey-de-Castro, Roberto; Wang, Yaoxiong; Rabitz, Herschel; Shuang, Feng
2016-05-01
Many systems under control with an applied field also interact with the surrounding environment. Understanding the control mechanisms has remained a challenge, especially the role played by the interaction between the field and the environment. In order to address this need, here we expand the scope of the Hamiltonian-encoding and observable-decoding (HE-OD) technique. HE-OD was originally introduced as a theoretical and experimental tool for revealing the mechanism induced by control fields in closed quantum systems. The results of open-system HE-OD analysis presented here provide quantitative mechanistic insights into the roles played by a Markovian environment. Two model open quantum systems are considered for illustration. In these systems, transitions are induced by either an applied field linked to a dipole operator or Lindblad operators coupled to the system. For modest control yields, the HE-OD results clearly show distinct cooperation between the dynamics induced by the optimal field and the environment. Although the HE-OD methodology introduced here is considered in simulations, it has an analogous direct experimental formulation, which we suggest may be applied to open systems in the laboratory to reveal mechanistic insights.
Bakas, Spyridon; Akbari, Hamed; Sotiras, Aristeidis; Bilello, Michel; Rozycki, Martin; Kirby, Justin S.; Freymann, John B.; Farahani, Keyvan; Davatzikos, Christos
2017-01-01
Gliomas belong to a group of central nervous system tumors, and consist of various sub-regions. Gold standard labeling of these sub-regions in radiographic imaging is essential for both clinical and computational studies, including radiomic and radiogenomic analyses. Towards this end, we release segmentation labels and radiomic features for all pre-operative multimodal magnetic resonance imaging (MRI) (n=243) of the multi-institutional glioma collections of The Cancer Genome Atlas (TCGA), publicly available in The Cancer Imaging Archive (TCIA). Pre-operative scans were identified in both glioblastoma (TCGA-GBM, n=135) and low-grade-glioma (TCGA-LGG, n=108) collections via radiological assessment. The glioma sub-region labels were produced by an automated state-of-the-art method and manually revised by an expert board-certified neuroradiologist. An extensive panel of radiomic features was extracted based on the manually-revised labels. This set of labels and features should enable i) direct utilization of the TCGA/TCIA glioma collections towards repeatable, reproducible and comparative quantitative studies leading to new predictive, prognostic, and diagnostic assessments, as well as ii) performance evaluation of computer-aided segmentation methods, and comparison to our state-of-the-art method. PMID:28872634
Using borehole flow data to characterize the hydraulics of flow paths in operating wellfields
Paillet, F.; Lundy, J.
2004-01-01
Understanding the flow paths in the vicinity of water well intakes is critical in the design of effective wellhead protection strategies for heterogeneous carbonate aquifers. High-resolution flow logs can be combined with geophysical logs and borehole-wall-image logs (acoustic televiewer) to identify the porous beds, solution openings, and fractures serving as conduits connecting the well bore to the aquifer. Qualitative methods of flow log analysis estimate the relative transmissivity of each water-producing zone, but do not indicate how those zones are connected to the far-field aquifer. Borehole flow modeling techniques can be used to provide quantitative estimates of both transmissivity and far-field hydraulic head in each producing zone. These data can be used to infer how the individual zones are connected with each other, and to the surrounding large-scale aquifer. Such information is useful in land-use planning and the design of well intakes to prevent entrainment of contaminants into water-supply systems. Specific examples of flow log applications in the identification of flow paths in operating wellfields are given for sites in Austin and Faribault, Minnesota. Copyright ASCE 2004.
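The quantitative flow-log model described above can be sketched for a two-zone well; the linear inflow law and the transmissivity/head values are illustrative assumptions, not data from the Austin or Faribault sites.

```python
def ambient_wellbore_head(zones):
    """Zones are (transmissivity, far-field head) pairs. Under ambient
    (non-pumping) conditions net wellbore flow is zero, sum T_i*(h_i - h_w) = 0,
    so h_w is the transmissivity-weighted mean of the zone heads."""
    tsum = sum(t for t, _ in zones)
    return sum(t * h for t, h in zones) / tsum

def zone_flows(zones, h_w):
    """Inflow (+) or outflow (-) for each zone, proportional to head difference."""
    return [t * (h - h_w) for t, h in zones]

# Hypothetical two-zone well: upper fracture (T=5, head 100.2 m),
# lower solution opening (T=15, head 99.8 m)
zones = [(5.0, 100.2), (15.0, 99.8)]
hw = ambient_wellbore_head(zones)
flows = zone_flows(zones, hw)
print(round(hw, 2), [round(q, 2) for q in flows])
```

A nonzero ambient flow pattern like this (inflow from the high-head zone, outflow into the low-head zone) is exactly the cross-connection that matters for wellhead protection, since it can carry contaminants between zones even when the pump is off.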
Carvalho, Paloma Aparecida; Göttems, Leila Bernarda Donato; Pires, Maria Raquel Gomes Maia; de Oliveira, Maria Liz Cunha
2015-01-01
Objective: to evaluate the perception of healthcare professionals about the safety culture in the operating room of a large public hospital, according to the domains of the Safety Attitudes Questionnaire (SAQ). Method: descriptive, cross-sectional and quantitative research, with application of the SAQ to 226 professionals. Descriptive data analysis, instrument consistency and exploratory factor analysis were performed. Results: participants were distributed homogeneously between females (49.6%) and males (50.4%); mean age of 39.6 (SD±9.9) years and length of professional experience of 9.9 (SD±9.2) years. Cronbach's α was 0.84. The six domains proposed in the questionnaire were identified: stress perception (74.5) and job satisfaction (70.7) showed satisfactory results; teamwork climate (59.1) and safety climate (48.9) presented scores below the recommended minimum (75); unit management perceptions (44.5), hospital management perceptions (34.9) and working conditions (41.9) presented the lowest averages. Conclusions: the results showed that, from the perspective of the professionals, there are weaknesses in the values, attitudes, skills and behaviors that determine the safety culture in a healthcare organization. PMID:26625994
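The Cronbach's α reported above for instrument consistency can be computed as follows; the Likert responses are invented, not the study's data.

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency; item_scores is a list of
    per-item lists of respondent scores (same respondent order in each)."""
    k = len(item_scores)
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_var = sum(var(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]  # per-respondent sums
    return k / (k - 1) * (1 - item_var / var(totals))

# Hypothetical responses to three SAQ items from five professionals (1-5 Likert)
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
print(round(cronbach_alpha(items), 2))
```

Values near 1 indicate that the items move together across respondents; the 0.84 reported in the abstract is in the range conventionally taken as good reliability.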
A multi-armed bandit approach to superquantile selection
2017-06-01
Master's thesis in Operations Research, Naval Postgraduate School, June 2017. Thesis Advisor: Roberto S. Szechtman; Second Reader: Michael P. Atkinson. Keywords: decision learning, machine learning, intelligence processing, intelligence cycle, quantitative finance.
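The superquantile (conditional value-at-risk) that the title refers to can be sketched for equally weighted samples using the discrete Rockafellar-Uryasev form; the loss values are invented.

```python
def superquantile(samples, alpha=0.9):
    """Superquantile (CVaR_alpha): the expected value in the worst (1-alpha)
    tail of the distribution, estimated from equally weighted samples."""
    xs = sorted(samples)
    n = len(xs)
    k = int(alpha * n)                 # index of the alpha-quantile
    q = xs[k] if k < n else xs[-1]
    tail_weight = 1.0 - alpha
    # Quantile plus the mean excess above it, rescaled to the tail probability
    excess = sum(x - q for x in xs if x > q) / n
    return q + excess / tail_weight

losses = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(superquantile(losses, alpha=0.8))  # mean of the worst 20%: (9+10)/2 = 9.5
```

In a bandit setting, each arm's payoff samples feed an estimator like this, and selection targets the arm with the best superquantile rather than the best mean, which is what makes the criterion risk-aware.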
Operational Assessment of Tools for Accelerating Leader Development (ALD): Volume 2, Appendices
2009-06-01
Abbreviations: Qual (qualitative), Quant (quantitative), RC (Reserve Component), R&D (research and development), ROTC (Reserve Officer Training Corps). Participants in the Accelerating Leader Development program were asked to complete the pretest, training, and posttest. Published as ARI Research Note 2009-09.
Quantitative Tagless Copurification: A Method to Validate and Identify Protein-Protein Interactions
Shatsky, Maxim; Dong, Ming; Liu, Haichuan; ...
2016-04-20
Identifying protein-protein interactions (PPIs) at an acceptable false discovery rate (FDR) is challenging. Previously we identified several hundred PPIs from affinity purification - mass spectrometry (AP-MS) data for the bacteria Escherichia coli and Desulfovibrio vulgaris. These two interactomes have lower FDRs than any of the nine interactomes proposed previously for bacteria and are more enriched in PPIs validated by other data than the nine earlier interactomes. To more thoroughly determine the accuracy of our or other interactomes and to discover further PPIs de novo, here we present a quantitative tagless method that employs iTRAQ MS to measure the copurification of endogenous proteins through orthogonal chromatography steps. 5273 fractions from a four-step fractionation of a D. vulgaris protein extract were assayed, resulting in the detection of 1242 proteins. Protein partners from our D. vulgaris and E. coli AP-MS interactomes copurify as frequently as pairs belonging to three benchmark data sets of well-characterized PPIs. In contrast, the protein pairs from the nine other bacterial interactomes copurify two- to 20-fold less often. We also identify 200 high confidence D. vulgaris PPIs based on tagless copurification and colocalization in the genome. These PPIs are as strongly validated by other data as our AP-MS interactomes and overlap with our AP-MS interactome for D. vulgaris within 3% of expectation, once FDRs and false negative rates are taken into account. Finally, we reanalyzed data from two quantitative tagless screens of human cell extracts. We estimate that the novel PPIs reported in these studies have an FDR of at least 85% and find that less than 7% of the novel PPIs identified in each screen overlap. Our results establish that a quantitative tagless method can be used to validate and identify PPIs, but that such data must be analyzed carefully to minimize the FDR.
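A common way to score copurification of two proteins across chromatographic fractions is the correlation of their abundance (elution) profiles: genuinely interacting pairs tend to co-elute through orthogonal separation steps. The sketch below illustrates that idea with a plain Pearson correlation; it is not the paper's exact iTRAQ-based statistic, and the data and names are hypothetical:

```python
from statistics import fmean, pstdev

def profile_correlation(a, b):
    """Pearson correlation between two proteins' abundance profiles,
    one value per chromatography fraction, aligned by fraction."""
    mean_a, mean_b = fmean(a), fmean(b)
    covariance = fmean((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    return covariance / (pstdev(a) * pstdev(b))
```

In practice a pipeline like the one described above would compute such a score for every protein pair in every separation step, then calibrate a confidence threshold against benchmark sets of well-characterized PPIs to control the FDR.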